PureSoftware is a leading technology services provider with expertise in software development, digital transformation, and IT consultancy.
Noida
INR 10.0 - 20.0 Lacs P.A.
Hybrid
Full Time
• Monitor and maintain production data pipelines, ETL/ELT workflows, and batch processing jobs in Talend.
• Analyze and fix stored procedures and data flows to troubleshoot and resolve production incidents, ensuring minimal downtime and accurate data delivery.
• Implement alerting and monitoring solutions to proactively detect and resolve data quality or performance issues.
• Collaborate with Data Engineers, BI Engineers, and Architects to ensure smooth deployments and updates.
• Perform root cause analysis (RCA) for recurring issues and drive long-term solutions.
• Develop and maintain runbooks, documentation, and knowledge bases for data operations and incident response.
• Automate repetitive support tasks and improve operational efficiency through scripting and tooling.
• Provide production support coverage on weekends as part of a shift-based role.
• Ensure adherence to data governance, privacy, and security standards in all production support activities.
Interested candidates can share their resume at megha.rawat@puresoftware.com
Noida, Gurugram
INR 9.5 - 19.0 Lacs P.A.
Hybrid
Full Time
• Write and execute test cases ensuring complete coverage of the functionality under test
• Create automated test scripts, mapped to manual test cases, using the frameworks as defined
• Report issues discovered through manual testing or via automated tests and track them to closure
• Verify fixes/new features and analyze the impact to identify the modules that need regression testing
• Set up and configure the test environment, fixtures, and games as required by the test plans or standards
• Create and maintain Modification Documents for submission to regulators
• Estimate testing tasks for projects
• Create test plans in Team Foundation Server/JIRA and monitor progress
• Support the Accredited Test Facility as well as regulators, assisting in testing features and helping maintain their test environments
• Identify potential areas for automation and develop automation scripts
• Identify and resolve product issues/queries with Engineering, Compliance, and regulatory agencies
Qualifications
• 3-6 years of experience in software testing
• 2+ years of hands-on experience in web application testing and API testing
• Proficiency in writing test cases from functional specifications, use cases, and business rules
• Good understanding of Behavior Driven Development on the Selenium-Cucumber framework
• Experience in writing SQL queries
• Proficiency in communication, both verbal and written
• Experience in Casino Gaming/Digital Gaming will be an advantage
• B.Tech/M.Tech in Computer Science or equivalent (preferred)
Gurugram
INR 18.0 - 20.0 Lacs P.A.
Work from Office
Full Time
Job Description
Responsibilities:
• Low-level analysis, quality development, and unit testing
• Communication with BA, QA, and other stakeholders to ensure bug-free end-to-end delivery
• Take long-term ownership of code and services
Mandate:
• Education: B.E., B.Tech, MCA, or other equivalent engineering degree
• Preferred experience: 5-8 years
• Minimum 5 years of relevant experience in .NET or .NET Core server-side programming
• Knowledge of OOP, data structures, REST or gRPC, WCF, multithreading, Kafka, Azure, and the basics of DevOps
• Strong understanding of SQL, Query Store, execution plans, and query optimization
• Design patterns & SOLID principles
• Good communication skills (verbal and written)
• Good to have: exposure to financial markets, Docker, and Kubernetes
Gurugram
INR 8.0 - 10.0 Lacs P.A.
Work from Office
Full Time
Job Description
• Education: Graduate or other equivalent degree
• Experience required: 4-8 years
• Minimum 4 years of relevant experience in Corporate Onboarding and/or ODD is mandatory
• Knowledge of AML/KYC for corporate clients
• SME-level knowledge of locally important regulatory requirements and background legislation
• Good communication skills (verbal and written)
Gurugram
INR 15.0 - 20.0 Lacs P.A.
Hybrid
Full Time
Job Description
Qualifications:
• 4+ years of experience
• B.E., B.Tech, or equivalent
• Telecom & monitoring domain knowledge (preferred)
Technical Skills:
• Technologies required: Java, Angular & Perl
• Technologies preferred: PHP
• Proficient in data structures, OOP concepts, data types, and algorithms
• Good understanding of and hands-on experience with Java, Perl & Angular (11+)
• Proficient in JavaScript, HTML, CSS
• Knowledge of JWT, UI/UX trends, and 12-factor compliance is a must
• Expertise in core Java topics such as file and exception handling, collections, and functional programming
• Good grasp of frameworks such as Spring, Spring Boot, Struts 2, etc.
• REST & SOAP web service knowledge using JSON
• Database knowledge: writing queries, joins, procedures, etc.
• Hands-on experience with testing & creating unit test cases
• Unix shell scripting
• Cloud knowledge (GCP/Azure preferred)
• Experience with DevOps supporting tools: Git, K8s, Docker, YAML, etc.
• Agile & Scrum knowledge
• Monitoring of application operation; knowledge of protocols like SNMP, NetFlow, etc.
• Cassandra developer experience
Key Responsibilities:
• As part of the application team, provide level 3 support for users on a daily basis, and play a privileged role with the IT teams (level 4), ensuring certain operating tasks on IT solutions
• Prioritize and manage workload & issues effectively, ensuring critical issues are addressed promptly
• Analyze complex technical problems, troubleshoot root causes, and implement effective solutions to prevent recurrence
• Collaborate with cross-functional teams, including developers, system administrators, and project managers, to escalate and resolve issues efficiently
• Maintain a strong focus on quality, code reviews, and development best practices/processes
• Write clean, efficient, maintainable, and documented code and tests
• Troubleshoot and debug software problems
Pune
INR 20.0 - 30.0 Lacs P.A.
Hybrid
Full Time
Job Summary:
We are looking for a highly skilled AWS Data Engineer with over 5 years of experience in designing, developing, and maintaining scalable data pipelines on AWS. The ideal candidate will be proficient in data engineering best practices and cloud-native technologies, with hands-on experience in building ETL/ELT pipelines, working with large datasets, and optimizing data architecture for analytics and business intelligence.
Key Responsibilities:
• Design, build, and maintain scalable and robust data pipelines and ETL processes using AWS services (e.g., Glue, Lambda, EMR, Redshift, S3, Athena)
• Collaborate with data analysts, data scientists, and stakeholders to understand data requirements and deliver high-quality solutions
• Implement data lake and data warehouse architectures, ensuring data governance, data quality, and compliance
• Optimize data pipelines for performance, reliability, scalability, and cost
• Automate data ingestion and transformation workflows using Python, PySpark, or Scala
• Manage and monitor data infrastructure, including logging, error handling, alerting, and performance metrics
• Leverage infrastructure-as-code tools like Terraform or AWS CloudFormation for infrastructure deployment
• Ensure security best practices are implemented for data access and storage (IAM, KMS, encryption, etc.)
• Document data processes, architectures, and standards
Required Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field
• Minimum 5 years of experience as a Data Engineer with a focus on AWS cloud services
• Strong experience in building ETL/ELT pipelines using AWS Glue, EMR, Lambda, and Step Functions
• Proficiency in SQL, Python, PySpark, and data modeling techniques
• Experience working with data lakes (S3) and data warehouses (Redshift, Snowflake, etc.)
• Experience with Athena, Kinesis, Kafka, or similar streaming data tools is a plus
• Familiarity with DevOps and CI/CD processes, using tools like Git, Jenkins, or GitHub Actions
• Understanding of data privacy, governance, and compliance standards such as GDPR, HIPAA, etc.
• Strong problem-solving and analytical skills, with the ability to work in a fast-paced environment
Noida
INR 25.0 - 30.0 Lacs P.A.
Remote
Full Time
Job Description
• T-SQL, SSAS, Power BI
• Support existing processes
• End-to-end understanding of the BI environment
Nice to Have
• Azure DevOps (CI/CD pipelines)
• Azure Data Factory
• Knowledge of high-performance document DBs like Cosmos DB, DynamoDB, MongoDB
• Understanding of logical DBA tasks
• SSRS, SSIS
Roles & Responsibilities
• Data analysis: Using data analysis skills to turn data into knowledge that can help a business achieve its goals
• BI tool expertise: Developing, deploying, and maintaining BI tools and interfaces
• Data modeling and ETL: Working with ETL developers to collect, transform, and send data to the warehouse level
• Data warehouse applications: Creating the architecture of a data warehouse system to store corporate information in a unified location
• Data visualization: Building data visualization tools to help a company process and display data easily and efficiently
• Troubleshooting BI tools: Troubleshooting problems that may arise during the production or modeling stage
• Database management: Creating data models to generate business intelligence tools and reports
Gurugram
INR 10.0 - 12.0 Lacs P.A.
Work from Office
Full Time
Job Description
• Education: Graduate or other equivalent degree
• Experience required: 2-6 years
• Minimum 2 years of relevant experience in Corporate Onboarding and/or ODD is mandatory
• Knowledge of AML/KYC for corporate clients
• SME-level knowledge of locally important regulatory requirements and background legislation
• Good communication skills (verbal and written)
Bengaluru
INR 20.0 - 35.0 Lacs P.A.
Work from Office
Full Time
Job Title: Data Scientist (5 positions) / Lead or Manager, Data Scientist (3 positions)
Experience: Data Scientist (8-10 years) / Lead Data Scientist (14+ years)
Job Location: Whitefield, Bangalore
Mode of Working: Hybrid
Interview Process:
• First Round: L1 internal interview
• Second Round: Assessment shared by us, to be completed within 48 hours
• Third Round: Client discussion over the submitted assessment
• Final Round: HR discussion
Preferred Domain: Healthcare Insurance / Insurance Agencies / Health Insurance / Any Insurance
We are looking for a talented Data Scientist to join our growing team. In this role, you will lead efforts to develop, enhance, and optimize advanced AI and machine learning models with a particular focus on Generative AI, Large Language Models (LLMs), LangChain, and prompt engineering. You will oversee the application of statistical modeling techniques to derive insights, build models, and lead research initiatives that push the boundaries of AI technologies.
Key Responsibilities:
• Leadership & Collaboration: Lead a team of data scientists, researchers, and engineers working on high-impact projects related to generative models, NLP, and statistical modeling. Collaborate with cross-functional teams, including engineering, product management, and research, to deliver AI-powered products and solutions.
• Generative AI Development: Spearhead the development and deployment of Generative AI models and algorithms to address complex problems in areas like content generation, conversational AI, and creative automation.
• LLM Implementation & Optimization: Develop, fine-tune, and optimize large language models (LLMs) for diverse applications, ensuring they are robust, scalable, and accurate in real-world scenarios.
• LangChain Integration: Design and integrate LangChain for managing and deploying sophisticated language models, with a focus on complex workflows, multi-agent systems, and real-time applications.
• Prompt Engineering: Lead prompt engineering efforts to optimize AI models' output quality, improve interactions, and enable more effective natural language understanding across a variety of use cases.
• Statistical Modeling: Utilize advanced statistical techniques to analyze and interpret data, build predictive models, and solve business-critical challenges through data-driven insights.
• Research & Innovation: Stay ahead of trends in AI and ML, particularly in NLP, LLMs, and generative models. Drive innovation by exploring cutting-edge techniques and methodologies in the AI space.
• Mentorship & Knowledge Sharing: Mentor junior team members and promote a collaborative, learning-oriented environment. Share knowledge and foster an atmosphere of continuous improvement within the data science team.
• Performance Optimization: Ensure model performance meets or exceeds company and client expectations by identifying areas for improvement, testing new methods, and scaling systems accordingly.
• Ethical AI Development: Advocate for and implement ethical considerations in the development and deployment of AI models, including fairness, transparency, and privacy.
Qualifications:
Required:
• Education: Ph.D. or Master's degree in Computer Science, Data Science, Mathematics, Statistics, or a related field, or equivalent practical experience
• Experience: 8+ years of experience in data science, with at least 2-3 years in a leadership role
• Proven expertise in Generative AI, particularly in areas like content generation, deep learning, and language modeling
• Strong background in Large Language Models (LLMs) such as GPT, T5, BERT, or similar architectures
• Hands-on experience with LangChain for building NLP workflows and pipelines and integrating external systems with LLMs
• Hands-on experience with prompt engineering, including techniques to refine and optimize outputs for various NLP tasks
• Expertise in statistical modeling and quantitative analysis, with the ability to apply techniques to solve real-world problems
Preferred:
• Experience working with transformer models and fine-tuning LLMs for specific tasks
• Expertise in AI model evaluation and metrics (e.g., BLEU, ROUGE, perplexity)
• Background in developing AI-driven products from concept to deployment
• Strong publication record in AI research, particularly in NLP and machine learning
Use Cases (any of them):
• Automated underwriting
• Customer experience enhancement
• Fraud detection
• Predictive analytics
• Accelerated claims processing
• Risk assessment and premium calculation
• Customer profiling
• Customer segmentation
• Credit risk assessment
• Personalized marketing
• Anti-Money Laundering (AML)
• Personalized patient care
• Medical training and simulations
• Medical data analysis
Please share your updated resume at renuka.rathi@puresoftware.com
Bengaluru
INR 20.0 - 30.0 Lacs P.A.
Remote
Full Time
Job Description
• T-SQL, SSAS, Power BI
• Support existing processes
• End-to-end understanding of the BI environment
Nice to Have
• Azure DevOps (CI/CD pipelines)
• Azure Data Factory
• Knowledge of high-performance document DBs like Cosmos DB, DynamoDB, MongoDB
• Understanding of logical DBA tasks
• SSRS, SSIS
Roles & Responsibilities
• Data analysis: Using data analysis skills to turn data into knowledge that can help a business achieve its goals
• BI tool expertise: Developing, deploying, and maintaining BI tools and interfaces
• Data modeling and ETL: Working with ETL developers to collect, transform, and send data to the warehouse level
• Data warehouse applications: Creating the architecture of a data warehouse system to store corporate information in a unified location
• Data visualization: Building data visualization tools to help a company process and display data easily and efficiently
• Troubleshooting BI tools: Troubleshooting problems that may arise during the production or modeling stage
• Database management: Creating data models to generate business intelligence tools and reports
Noida
INR 25.0 - 35.0 Lacs P.A.
Remote
Full Time
Job Title: .NET with Azure Integration
Job Description
• Experience with Azure Functions, Azure Cosmos DB, AKS, ADF, Logic Apps, Azure Event Hubs, APIM, and Front Door
• Strong proficiency in .NET and Azure cloud services
• Understanding of AngularJS and a willingness to learn new technologies
• Excellent problem-solving, analytical, and debugging skills
• Strong communication and interpersonal skills
• Ability to work independently and as part of a team
Roles & Responsibilities
• Development: Contribute to the development of robust and scalable applications using our primary tech stack, including C#, .NET, Azure Functions, and Azure Cosmos DB
• Cross-Platform Compatibility: Demonstrate a solid understanding of AngularJS and be open to learning newer technologies as needed
• Cloud Expertise: Possess a deep understanding of Azure cloud services, including Azure Functions, Azure Cosmos DB, AKS, ADF, Logic Apps, Azure Event Hubs, APIM, and Front Door
• Problem-Solving: Identify and resolve technical challenges effectively, leveraging your problem-solving skills and expertise
• Collaboration: Work collaboratively with cross-functional teams to deliver high-quality solutions that meet business objectives
Mandatory Skills: .NET, .NET Core, C#, Azure Functions, Azure Logic Apps, Azure API Management, Azure Data Factory, SQL
Pune
INR 20.0 - 35.0 Lacs P.A.
Hybrid
Full Time
Job Duties and Responsibilities:
We are looking for a self-starter to join our Data Engineering team. You will work in a fast-paced environment where you will get an opportunity to build and contribute to the full lifecycle development and maintenance of the data engineering platform. With the Data Engineering team, you will get the opportunity to:
• Design and implement data engineering solutions that are scalable, reliable, and secure in the cloud environment
• Understand and translate business needs into data engineering solutions
• Build large-scale data pipelines that can handle big data sets using distributed data processing techniques, supporting the efforts of the data science and data application teams
• Partner with cross-functional stakeholders, including product managers, architects, data quality engineers, and application and Quantitative Science end users, to deliver engineering solutions
• Contribute to defining data governance across the data platform
Basic Requirements:
• A minimum of a BS degree in computer science, software engineering, or a related scientific discipline
• 3+ years of work experience in building scalable and robust data engineering solutions
• Strong understanding of object-oriented programming and proficiency in Python (TDD) and PySpark to build scalable algorithms
• 3+ years of experience in distributed computing and big data processing using the Apache Spark framework, including Spark optimization techniques
• 2+ years of experience with Databricks, Delta tables, Unity Catalog, Delta Sharing, Delta Live Tables (DLT), and incremental data processing
• Experience with Delta Lake and Unity Catalog
• Advanced SQL coding and query optimization experience, including the ability to write analytical and nested queries
• 3+ years of experience in building scalable ETL/ELT data pipelines on Databricks and AWS (EMR)
• 2+ years of experience orchestrating data pipelines using Apache Airflow/MWAA
• Understanding of and experience with AWS services including ADX, EC2, S3
• 3+ years of experience with data modeling techniques for structured/unstructured datasets
• Experience with relational/columnar databases (Redshift, RDS) and interactive querying services (Athena/Redshift Spectrum)
• Passion for healthcare and improving patient outcomes
• Analytical thinking with strong problem-solving skills
• Staying on top of emerging technologies and a willingness to learn
Bonus Experience (optional):
• Experience with an Agile environment
• Experience operating in a CI/CD environment
• Experience building HTTP/REST APIs using popular frameworks
• Healthcare experience
Noida
INR 15.0 - 25.0 Lacs P.A.
Hybrid
Full Time
Job Description
• Extensive experience in Oracle SQL, PL/SQL, and JavaScript; writing object-oriented PL/SQL code for applications demanding type objects and nested tables (minimum 4-5 years)
• Proficient in scripting using Python (2-3 years)
• Exposure to creating use cases, functional specifications, and technical design specifications
• Proficient in the creation of database objects like stored procedures, functions, packages, tables, indexes, triggers, snapshots, etc.
• Good experience in performing data migrations and data loads using ETL tools
• Strong hands-on experience in designing, developing, deploying, and supporting RESTful services
• Experience in job scheduling using CRON utilities, the legacy AT scheduler, Windows Scheduler, and Toad
• Self-motivated and committed team player with excellent communication and interpersonal skills
• Solid written and verbal communication skills; able to articulate the technically complex so it is understood by both technical and non-technical personnel
• Able to develop scalable solutions that can be utilized by current and future channels, e.g. mobile
• Understands the importance of real business engagement in all stages of the IT product lifecycle, working in agile delivery methodologies
Good to have:
• Hands-on experience in web services
• Good experience in automation and knowledge of at least one automation tool
• Development experience in Oracle Application Express
• Integration based on PagerDuty / Slack / Jira / Jira SD / Confluence
• Knowledge of at least one Python web framework (Django/Flask)
• Knowledge of fundamental concepts of databases and programming languages
• Any certification in the Business Analysis domain
• Knowledge of tools such as JIRA, MS Visio, MS Project Plan, the MS Office suite, etc.
Roles & Responsibilities
Candidate will be responsible for developing and maintaining features/enhancements and participating in the complete lifecycle of software solutions based on business needs.
• Analyze, design, develop, troubleshoot, and debug software programs
• Write code, complete programming, and perform testing and debugging of applications
• Build and execute unit tests and unit plans
• Review integration and regression test plans
• Handle varied and complex duties and tasks requiring independent judgment
• Work independently with minimal supervision; collaborate with internal/external team members to maintain and make bug fixes to improve existing capabilities
• Integrate business support systems and automate data flow
• Perform R&D to propose out-of-the-box solutions and utilize emerging technologies
• Demonstrate strong analytical skills, creative ideas, time management skills, and multi-tasking capabilities
• Develop flowcharts, algorithms, and UML diagrams required for design/development
• Track and fully document functional and business specifications; write detailed, universally understood procedures for permanent records and for use in training