
128 Data Testing Jobs - Page 2

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 - 8.0 years

8 - 12 Lacs

Noida

Work from Office

Key Responsibilities:
- Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions.
- SQL-Driven Validation: Utilize advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources.
- ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems.
- Data Quality Assurance: Implement and monitor data quality checks; identify data discrepancies, anomalies, and inconsistencies; and work with development and business teams to resolve issues.
- Performance Testing (Data Focus): Contribute to performance testing of data pipelines and database operations, ensuring optimal query and data load performance.
- Test Data Management: Create and manage robust test data sets for all testing phases, including positive, negative, and edge-case scenarios.
- Defect Management: Identify, document, track, and re-test data defects, collaborating closely with development and data engineering teams for timely resolution.
- Documentation & Reporting: Maintain clear, concise documentation of test plans, test cases, test results, and data quality reports; provide regular status updates to stakeholders.
- Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows and business requirements and to ensure data quality standards are met.
- Process Improvement: Proactively identify opportunities for process improvements in data testing methodologies and tools.
- Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams.

Required Skills & Experience:
- Experience: 5-8 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles.
- SQL Expertise: Advanced proficiency in SQL, including complex queries, subqueries, analytical (window) functions, CTEs, and stored procedures for data validation, reconciliation, and analysis; experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery); strong understanding of database concepts (normalization, indexing, primary/foreign keys, data types).
- Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy).
- Domain Expertise: Strong understanding of and proven experience in the Risk and Finance IT domain, with familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting) and knowledge of financial products, regulations, and risk management concepts.
- Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions.
- Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams.

Mandatory Competencies: QA/QE - QA Automation - ETL Testing; ETL - ETL - Tester; Beh - Communication and collaboration; Database - Sql Server - SQL Packages; Database - PostgreSQL - PostgreSQL.
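To illustrate the kind of SQL-driven source-to-target reconciliation this role centres on, here is a minimal, self-contained sketch. It uses Python's built-in sqlite3 module purely for portability; the table and column names (src_trades, tgt_trades, notional) are hypothetical stand-ins for real source and warehouse tables.

```python
# Minimal source-to-target reconciliation sketch: row counts, aggregate
# (checksum-style) totals, and row-level completeness.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_trades (trade_id INTEGER PRIMARY KEY, notional REAL, ccy TEXT);
    CREATE TABLE tgt_trades (trade_id INTEGER PRIMARY KEY, notional REAL, ccy TEXT);
    INSERT INTO src_trades VALUES (1, 100.0, 'USD'), (2, 250.5, 'EUR'), (3, 75.0, 'INR');
    INSERT INTO tgt_trades VALUES (1, 100.0, 'USD'), (2, 250.5, 'EUR');
""")

# 1. Row-count and aggregate comparison between source and target.
counts = conn.execute("""
    SELECT (SELECT COUNT(*) FROM src_trades)                 AS src_rows,
           (SELECT COUNT(*) FROM tgt_trades)                 AS tgt_rows,
           (SELECT ROUND(SUM(notional), 2) FROM src_trades)  AS src_notional,
           (SELECT ROUND(SUM(notional), 2) FROM tgt_trades)  AS tgt_notional
""").fetchone()
print("counts/sums:", counts)

# 2. Row-level completeness: records present in source but missing in target.
missing = conn.execute("""
    SELECT s.trade_id
    FROM src_trades s
    LEFT JOIN tgt_trades t ON t.trade_id = s.trade_id
    WHERE t.trade_id IS NULL
""").fetchall()
print("missing in target:", missing)
```

In practice the same two queries would be pointed at the real source and target connections, and the counts, sums, and missing-key lists compared automatically as part of the test run.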

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 15 Lacs

Noida

Work from Office

- 5-8 years of experience in Data Testing using SQL.
- Knowledge of Python is a plus.
- Domain expertise in Risk and Finance IT.

Posted 1 month ago

Apply

8.0 - 10.0 years

10 - 15 Lacs

Noida

Work from Office

Key Responsibilities:
- Data Testing Strategy & Execution: Design, develop, and execute comprehensive test plans and test cases for data-centric applications, ETL processes, data warehouses, data lakes, and reporting solutions.
- SQL-Driven Validation: Utilize advanced SQL queries to perform complex data validation, data reconciliation, data integrity checks, and data quality assurance across various financial data sources.
- ETL Testing: Conduct thorough testing of ETL (Extract, Transform, Load) processes, ensuring data is accurately extracted, transformed according to business rules, and loaded correctly into target systems.
- Data Quality Assurance: Implement and monitor data quality checks; identify data discrepancies, anomalies, and inconsistencies; and work with development and business teams to resolve issues.
- Performance Testing (Data Focus): Contribute to performance testing of data pipelines and database operations, ensuring optimal query and data load performance.
- Test Data Management: Create and manage robust test data sets for all testing phases, including positive, negative, and edge-case scenarios.
- Defect Management: Identify, document, track, and re-test data defects, collaborating closely with development and data engineering teams for timely resolution.
- Documentation & Reporting: Maintain clear, concise documentation of test plans, test cases, test results, and data quality reports; provide regular status updates to stakeholders.
- Collaboration: Work effectively with business analysts, data architects, data engineers, and project managers to understand data flows and business requirements and to ensure data quality standards are met.
- Process Improvement: Proactively identify opportunities for process improvements in data testing methodologies and tools.
- Global Team Collaboration: Provide consistent overlap with EST working hours (until noon EST) to facilitate effective communication and collaboration with US-based teams.

Required Skills & Experience:
- Experience: 8-10 years of hands-on experience in Data Quality Assurance, Data Testing, or ETL Testing roles.
- SQL Expertise: Advanced proficiency in SQL, including complex queries, subqueries, analytical (window) functions, CTEs, and stored procedures for data validation, reconciliation, and analysis; experience with various SQL databases (e.g., SQL Server, Oracle, PostgreSQL, MySQL, Snowflake, BigQuery); strong understanding of database concepts (normalization, indexing, primary/foreign keys, data types).
- Data Testing Methodologies: Solid understanding of data warehousing concepts, ETL processes, and various data testing strategies (e.g., source-to-target mapping validation, data transformation testing, data load testing, data completeness, data accuracy).
- Domain Expertise: Strong understanding of and proven experience in the Risk and Finance IT domain, with familiarity with financial data (e.g., trading data, market data, risk metrics, accounting data, regulatory reporting) and knowledge of financial products, regulations, and risk management concepts.
- Analytical & Problem-Solving Skills: Excellent ability to analyze complex data sets, identify root causes of data issues, and propose effective solutions.
- Communication: Strong verbal and written communication skills to articulate data issues and collaborate with diverse teams.

Mandatory Competencies: ETL - ETL - Tester; QA/QE - QA Automation - ETL Testing; Database - PostgreSQL - PostgreSQL; Beh - Communication; Database - Sql Server - SQL Packages.
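As an illustration of the automated completeness and accuracy checks such a role would script, below is a small pytest-style sketch using pandas. The extract is synthetic and deliberately seeded with a NULL and a duplicate key so the checks have defects to flag; all column names are hypothetical.

```python
# Illustrative pytest-style data-quality checks (completeness, uniqueness,
# value-domain accuracy) over a synthetic extract. The sample data
# intentionally contains a NULL and a duplicate so the checks catch them.
import pandas as pd


def load_target_extract() -> pd.DataFrame:
    # Stand-in for reading the loaded target table from the warehouse.
    return pd.DataFrame({
        "position_id": [101, 102, 103, 103],
        "market_value": [1_000.0, 2_500.0, None, 750.0],
        "risk_bucket": ["LOW", "HIGH", "MEDIUM", "HIGH"],
    })


def test_no_missing_market_values():
    df = load_target_extract()
    assert df["market_value"].notna().all(), "market_value contains NULLs"


def test_position_id_is_unique():
    df = load_target_extract()
    dupes = df[df.duplicated("position_id", keep=False)]
    assert dupes.empty, f"duplicate position_ids:\n{dupes}"


def test_risk_bucket_domain():
    df = load_target_extract()
    allowed = {"LOW", "MEDIUM", "HIGH"}
    assert set(df["risk_bucket"]).issubset(allowed)
```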

Posted 1 month ago

Apply

5.0 - 8.0 years

6 - 10 Lacs

Noida

Work from Office

- 5-8 years of experience in Data Testing using SQL.
- Knowledge of Python is a plus.
- Domain expertise in Risk and Finance IT.
- Availability to provide overlap with EST hours (until noon EST).
Mandatory Competencies: ETL - ETL - Tester; Beh - Communication; QA/QE - QA Manual - Test Case Creation, Execution, Planning, Reporting with risks/dependencies.

Posted 1 month ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Noida

Work from Office

- 5-8 years of experience in Data Testing using SQL.
- Knowledge of Python is a plus.
- Domain expertise in Risk and Finance IT.
- Availability to provide overlap with EST hours (until noon EST).
Mandatory Competencies: ETL - ETL - Tester; Beh - Communication; QA/QE - QA Manual - Test Case Creation, Execution, Planning, Reporting with risks/dependencies.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

You will be joining KPMG in India, a professional services firm affiliated with KPMG International Limited since its establishment in August 1993. Leveraging a global network of firms, we provide services to national and international clients across various sectors, with offices in multiple cities across India. As part of the Financial Crimes specialist team, your role will involve providing solutions to BFSI clients through model validation testing for AML risk models and frameworks, sanctions screening, and transaction monitoring systems. We are seeking individuals with advanced data science and analytics skills to support our team in addressing the challenges associated with financial crime.

Your responsibilities will include, but are not limited to, supporting functional SME teams in building data-driven Financial Crimes solutions, conducting statistical testing of screening matching algorithms and risk rating models, validating data models of AML systems, and developing AML models to detect suspicious activities and transactions. You will collaborate with cross-functional teams to analyze data for model development and validation, prepare detailed documentation and reports, and assist in feature engineering for the automation of AML-related investigations.

To qualify for this role, you should have a Bachelor's degree from an accredited university, at least 3 years of hands-on experience in Python with knowledge of frameworks such as Java, FastAPI, Django, Tornado, or Flask, experience with relational and NoSQL databases, proficiency in BI tools such as Power BI and Tableau, and an educational background in Data Science and Statistics. Additionally, expertise in machine learning algorithms and statistical analysis, and familiarity with regulatory guidelines for AML compliance, are essential. Preferred qualifications include experience in AML model validation, statistical testing of risk models, and familiarity with AML technology platforms. Hands-on experience with data analytics tools such as Informatica and Kafka is also desirable.

If you are looking to contribute your advanced analytics skills to combat financial crime and support leading financial institutions in adhering to industry best practices, this role offers you the opportunity to work on challenging projects and develop professionally in a dynamic environment.
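One common statistical check in this kind of AML/risk model validation is the Population Stability Index (PSI), which compares the score distribution used at model build time with recent production scores. The sketch below uses numpy with synthetic beta-distributed scores; the 0.25 threshold is only the usual rule of thumb, not a firm-specific standard.

```python
# Population Stability Index (PSI) -- a common drift/stability metric when
# validating risk-rating or AML screening models. Data here is synthetic.
import numpy as np


def psi(expected: np.ndarray, actual: np.ndarray, bins: int = 10) -> float:
    """PSI between a development-sample score distribution and a recent one."""
    edges = np.histogram_bin_edges(expected, bins=bins)
    exp_pct = np.histogram(expected, bins=edges)[0] / len(expected)
    act_pct = np.histogram(actual, bins=edges)[0] / len(actual)
    # Guard against log(0) / division by zero for empty bins.
    exp_pct = np.clip(exp_pct, 1e-6, None)
    act_pct = np.clip(act_pct, 1e-6, None)
    return float(np.sum((act_pct - exp_pct) * np.log(act_pct / exp_pct)))


rng = np.random.default_rng(42)
dev_scores = rng.beta(2, 5, size=10_000)       # scores from the model build sample
recent_scores = rng.beta(2.3, 5, size=10_000)  # scores from current production
value = psi(dev_scores, recent_scores)
print(f"PSI = {value:.4f}  (rule of thumb: > 0.25 suggests significant drift)")
```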

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As the Lead Quality Assurance Specialist at FM, you will provide consultation at a high technical level to multiple project teams and application support groups. Your role involves analyzing, designing, and executing effective Quality Assurance procedures and defect prevention/detection processes for corporate and client area information systems across various computing environments. You will contribute directly to these activities on large and complex projects and determine the quality of multiple product releases by ensuring they meet user requirements.

Your responsibilities will include identifying and testing detailed business design requirements, developing test plans, creating and maintaining specific test cases for various testing types, participating in code reviews, performing manual and automated functional testing, tracking defects, validating functionality across system releases, and developing automated test scripts. You will also ensure compliance with test plans and procedures, track project artifacts, champion the development of QA practices, perform load, stress, performance, and reliability testing, and implement automated testing tools.

To be successful in this role, you should have 5 to 7 years of relevant experience in a systems quality environment. You must possess knowledge of technical computing environments, test case management tools, test case generation techniques, problem tracking/reporting systems, and current business processes in multiple functional business areas. Additionally, you should have proven abilities in analyzing business requirements, developing test plans, performing various types of testing, and utilizing automated testing tools.

Your technical skills should include proficiency in SQL, Python, C#, and JavaScript, experience with modern testing tools and frameworks, strong skills in SQL and advanced data analysis techniques, expertise in creating and executing comprehensive test strategies and plans, and hands-on experience in data testing methodologies. Excellent analytical ability, communication skills, judgment, and the ability to work effectively with business personnel and IS management are also essential for this role.

The ideal candidate will have a BS or equivalent in Computer Science or a related discipline, along with specific advanced education in software quality assurance. The work location for this position is Bengaluru.

Posted 1 month ago

Apply

2.0 - 5.0 years

2 - 5 Lacs

Bengaluru

Work from Office

- Design and execute test cases for various application functionalities.
- Perform functional, regression, and integration testing.
- Identify, document, and track functional defects.
- Collaborate with developers to resolve functional issues.
- Participate in code reviews and provide feedback on testability.
- Develop and maintain automated functional test scripts.

Database Tester:
- Design and execute test plans and test cases for database functionality, performance, and security.
- Perform data validation, integrity checks, and data consistency testing.
- Write and execute SQL queries to verify data accuracy and completeness.
- Identify, document, and track database-related defects.
- Collaborate with database administrators and developers to resolve database issues.
- Develop and maintain automated database test scripts.
- Monitor database performance and identify potential bottlenecks.

General:
- Contribute to the continuous improvement of testing processes and methodologies.
- Stay up to date with the latest testing technologies and trends.
- Communicate effectively with team members and stakeholders.

Note: Experience in handling/testing trading applications would be an advantage, as would experience in the Treasury and Trading domain.
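For the database-tester side of this role, two representative SQL integrity checks are sketched below: orphaned foreign keys and quantity consistency between trades and their settlements. The in-memory SQLite database and the trades/settlements tables are illustrative stand-ins, not the employer's actual schema.

```python
# Illustrative database-tester checks: referential integrity and value
# consistency between a trades table and its settlement records.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE trades (trade_id INTEGER PRIMARY KEY, instrument TEXT, qty INTEGER);
    CREATE TABLE settlements (settle_id INTEGER PRIMARY KEY, trade_id INTEGER, settled_qty INTEGER);
    INSERT INTO trades VALUES (1, 'GOV-BOND-10Y', 100), (2, 'T-BILL-91D', 50);
    INSERT INTO settlements VALUES (10, 1, 100), (11, 3, 25);  -- trade_id 3 is an orphan
""")

# Orphan settlements: settlement rows pointing at a trade that does not exist.
orphans = conn.execute("""
    SELECT s.settle_id, s.trade_id
    FROM settlements s LEFT JOIN trades t ON t.trade_id = s.trade_id
    WHERE t.trade_id IS NULL
""").fetchall()

# Quantity consistency: total settled quantity must not exceed traded quantity.
over_settled = conn.execute("""
    SELECT t.trade_id
    FROM trades t JOIN settlements s ON s.trade_id = t.trade_id
    GROUP BY t.trade_id, t.qty
    HAVING SUM(s.settled_qty) > t.qty
""").fetchall()

print("orphan settlements:", orphans)
print("over-settled trades:", over_settled)
```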

Posted 1 month ago

Apply

0.0 - 3.0 years

2 - 5 Lacs

Thane

Work from Office

- Outbound calls for lead generation.
- Present projects to potential clients.
- Schedule and coordinate site visits.
- Update CRM and follow up with leads.

Posted 1 month ago

Apply

5.0 - 8.0 years

2 - 5 Lacs

Chennai

Work from Office

Job Title: Data Testers
Experience: 5-8 Years
Location: CHN/BGL/HYD/Pune, Remote

Expectations:
- Proven experience working on large projects in the financial services or investment management domain, preferably with Aladdin or similar portfolio management systems.
- Deep understanding of investment management workflows, including portfolio management, trading, compliance, and risk management.
- Experience in testing data validation scenarios and data ingestion, pipelines, and transformation processes (e.g., ETL).
- Experience working on Snowflake.
- Proficiency with API testing using Codecept/Postman and SOAPUI.
- Database skills (SQL queries) for data migration testing and test data creation.
- Strong knowledge of Python packages.
- Experience working in an Agile environment and implementing CI/CD pipelines using Bamboo/GitLab.
- Proficiency in version control systems like Bitbucket/SVN is required.
- Working on and building data comparison (XML, JSON, Excel, Database) utilities would be advantageous.
- Exposure to AWS/Azure environment(s).

Desirable Skills:
- Deep understanding of investment management workflows, including portfolio management, trading, compliance, and risk management.
- Knowledge of automation tools for deployment and configuration management.
- Experience working on iCEDQ.
- Certifications such as AWS, ISTQB, PMP, or Scrum Master are advantageous.
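A sketch of the kind of lightweight data-comparison utility the posting mentions (JSON/Excel/Database comparisons) is shown below. It is plain Python; the API payload and database rows are inline samples, and the portfolio/nav fields are hypothetical.

```python
# Small data-comparison utility: diff two record sets (e.g. an API JSON
# extract vs. rows pulled from a database) keyed on a business identifier.
import json


def diff_records(source: list[dict], target: list[dict], key: str) -> dict:
    """Return keys missing on either side plus field-level mismatches."""
    src = {r[key]: r for r in source}
    tgt = {r[key]: r for r in target}
    report = {
        "missing_in_target": sorted(src.keys() - tgt.keys()),
        "missing_in_source": sorted(tgt.keys() - src.keys()),
        "mismatches": {},
    }
    for k in src.keys() & tgt.keys():
        changed = {f: (src[k].get(f), tgt[k].get(f))
                   for f in src[k].keys() | tgt[k].keys()
                   if src[k].get(f) != tgt[k].get(f)}
        if changed:
            report["mismatches"][k] = changed
    return report


api_payload = json.loads('[{"portfolio": "P1", "nav": 101.5}, {"portfolio": "P2", "nav": 98.0}]')
db_rows = [{"portfolio": "P1", "nav": 101.5}, {"portfolio": "P3", "nav": 97.2}]
print(json.dumps(diff_records(api_payload, db_rows, key="portfolio"), indent=2))
```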

Posted 1 month ago

Apply

6.0 - 11.0 years

18 - 30 Lacs

Noida, Pune, Bengaluru

Work from Office

Key Responsibilities:
- Design, develop, and execute test cases to validate data ingestion from source systems (SQL, APIs) to the Databricks UAP platform.
- Perform schema validation, data completeness, transformation, and row-level data checks between source and target.
- Utilize SQL extensively for data profiling and validation.
- Leverage PySpark and Pandas for large-scale data comparison and automation.
- Maintain and enhance the Data Test Automation Framework using Python/PySpark for efficient and scalable data testing.
- Participate in daily stand-ups, sprint planning, and retrospectives following Agile practices.
- Manage and track test progress, issues, and risks in JIRA.
- Ensure adherence to QA documentation practices: test plans, test scenarios and test cases, test summary reports, and defect reports.
- Own and drive the defect life cycle, working closely with developers and product teams to ensure timely resolution.
- Collaborate with business analysts and developers to understand data requirements and ensure high test coverage.

Required Skills:
- 4+ years of experience in QA/testing with a strong focus on data validation.
- Strong proficiency in SQL (joins, subqueries, aggregations, data profiling).
- Experience in testing data pipelines ingesting from SQL and APIs to Databricks.
- Hands-on with Python and Pandas for data manipulation.
- Working knowledge of PySpark and familiarity with Spark DataFrames, transformations, and data handling.
- Knowledge of Agile testing processes and QA best practices.
- Familiarity with QA documentation and reporting standards.
- Experience with JIRA for test and defect management.
- Good understanding of the defect life cycle and its role in maintaining software quality.
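Below is a rough sketch of the PySpark source-versus-target comparison this posting describes. It assumes a local pyspark installation; the DataFrames are built inline rather than read from SQL sources or Databricks, and the deal_id/asset_class/notional columns are invented for illustration.

```python
# PySpark source-vs-target comparison: row counts plus row-level differences
# in both directions (schemas must match for exceptAll to apply).
from pyspark.sql import SparkSession

spark = SparkSession.builder.master("local[2]").appName("data-compare").getOrCreate()

source = spark.createDataFrame(
    [(1, "EQ", 100.0), (2, "FX", 250.0), (3, "IR", 75.0)],
    ["deal_id", "asset_class", "notional"],
)
target = spark.createDataFrame(
    [(1, "EQ", 100.0), (2, "FX", 251.0)],
    ["deal_id", "asset_class", "notional"],
)

# Row-count / completeness check.
print("source rows:", source.count(), "target rows:", target.count())

# Rows missing or altered in the target, and unexpected rows in the target.
only_in_source = source.exceptAll(target)
only_in_target = target.exceptAll(source)
only_in_source.show()
only_in_target.show()

spark.stop()
```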

Posted 1 month ago

Apply

7.0 - 10.0 years

9 - 12 Lacs

Noida

Work from Office

- Experience in Data Testing using SQL.
- Knowledge of Python is a plus.
- Domain expertise in Risk and Finance IT.
- Availability to provide overlap with EST hours (until noon EST).
Mandatory Competencies: ETL - ETL - Tester; Beh - Communication and collaboration; QA/QE - QA Manual - Test Case Creation, Execution, Planning, Reporting with risks/dependencies; Data Science and Machine Learning - Python.

Posted 1 month ago

Apply

6.0 - 8.0 years

6 - 8 Lacs

Kolkata, Hyderabad, Chennai

Work from Office

Job Title: Data Migration Testing
Location State: Tamil Nadu, Telangana, West Bengal
Location City: Chennai, Hyderabad, Kolkata
Experience Required: 6 to 8 Year(s)
CTC Range: 6 to 8 LPA
Shift: Day Shift
Work Mode: Onsite
Position Type: C2H
Openings: 2
Company Name: VARITE INDIA PRIVATE LIMITED

About The Client: The client is an Indian multinational technology company specializing in information technology services and consulting. Headquartered in Mumbai, it is a part of the Tata Group and operates in 150 locations across 46 countries.

About The Job:
Work Location: Chennai, TN / Hyderabad, TS / Kolkata, WB
Skill Required: Data Migration Testing, ETL Testing
Experience Range in Required Skills: 6-8 Years
Essential Job Functions: Data Migration Testing, ETL Testing
Qualifications: Data Migration Testing, ETL Testing

How to Apply: Interested candidates are invited to submit their resume using the apply online button on this job post.

About VARITE: VARITE is a global staffing and IT consulting company providing technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada, and India. VARITE is currently a primary and direct vendor to leading corporations in the verticals of Networking, Cloud Infrastructure, Hardware and Software, Digital Marketing and Media Solutions, Clinical Diagnostics, Utilities, Gaming and Entertainment, and Financial Services.

Equal Opportunity Employer: VARITE is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, veteran status, or disability status.

Unlock Rewards: Refer candidates and earn. If you're not available or interested in this opportunity, please pass it along to anyone in your network who might be a good fit and interested in our open positions. VARITE offers a Candidate Referral program, with a one-time referral bonus paid if the referred candidate completes a three-month assignment with VARITE:
- 0-2 years' experience: INR 5,000
- 2-6 years' experience: INR 7,500
- 6+ years' experience: INR 10,000

Posted 1 month ago

Apply

2.0 - 6.0 years

0 Lacs

Maharashtra

On-site

You will be working at AQM Technologies Pvt. Ltd., a company dedicated to providing exceptional testing experiences for all stakeholders. The company, established in 2000, boasts a team of over 1,500 professionals, with 75% being ISTQB/ASTQB certified. AQM excels in various testing domains, including Quality Engineering, User Acceptance Testing, and Test Automation. Notably, more than 40% of the workforce is trained in Digital Testing. The company's strong client relationships are evident in its 95% retention rate, and AQM is globally recognized and certified as a "Great Place to Work" by the Great Place To Work Institute.

As a Lead / STE / TE at AQM Technologies, you will be based in Mumbai in a full-time, on-site role. Your responsibilities will include manual testing of web and mobile applications; creating, executing, uploading, and maintaining test cases and test scenarios on JIRA; identifying, tracking, and reporting defects using JIRA or similar bug-tracking tools; ensuring compliance with banking and financial regulations for secure transactions; and collaborating with developers, business analysts, and product teams in an Agile environment.

To excel in this role, you should have skills in Test Automation and Load & Performance Testing, proficiency in Security Testing and Cyber Security Audit, and experience with User Acceptance Testing, Mobile Application Testing, Quality Engineering, and Data Testing. Strong analytical and problem-solving abilities, excellent communication and leadership skills, and a Bachelor's degree in Computer Science, Engineering, or a related field are essential.

Required skills include proficiency in Manual Testing (Functional, UI, Regression, and End-to-End testing), Bug Tracking & Test Management (JIRA or similar tools), an understanding of APIs (hands-on API testing not required), and 2+ years of experience in web application, mobile, API, and DB testing, test design, test execution, defect life cycle, testing life cycle, functional testing, and defect tools. Experience in using JIRA for test design and test execution status is necessary. Basic knowledge of Figma design for web and mobile journeys is a plus.

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Looking for a Data Tester with DBT (Data Build Tool) experience for a core conversion project - an offshore ETL tester with knowledge of Power BI and DBT Labs.
- Very good SQL concepts and query writing.
- Experience writing simple procedures using T-SQL.
- Hands-on ETL testing experience.
- Knowledge of Power BI and DBT Labs.
- Testing of data loads/intakes/extracts and incremental loads.
- Good manual testing / UI testing.
- Ability to design test cases from requirements, and test planning.
- Very good communication and coordination skills.
- Knowledge of the Banking domain / agency management is an added advantage.
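A minimal sketch of the incremental-load validation mentioned above follows, using pandas. The watermark value and the account_id/balance/updated_at columns are hypothetical; in practice the two frames would be query results from the source system and the warehouse.

```python
# Incremental-load check: verify the delta load picked up exactly the source
# rows changed after the last watermark, with no duplicate keys and matching values.
import pandas as pd

last_watermark = pd.Timestamp("2024-06-01")

source = pd.DataFrame({
    "account_id": [1, 2, 3, 4],
    "balance": [500.0, 750.0, 120.0, 910.0],
    "updated_at": pd.to_datetime(["2024-05-20", "2024-06-03", "2024-06-05", "2024-05-28"]),
})
loaded_delta = pd.DataFrame({
    "account_id": [2, 3],
    "balance": [750.0, 120.0],
    "updated_at": pd.to_datetime(["2024-06-03", "2024-06-05"]),
})

expected_delta = source[source["updated_at"] > last_watermark]

assert len(loaded_delta) == len(expected_delta), "row count mismatch for incremental load"
assert not loaded_delta["account_id"].duplicated().any(), "duplicate keys in delta load"

merged = expected_delta.merge(loaded_delta, on="account_id", suffixes=("_src", "_tgt"))
assert (merged["balance_src"] == merged["balance_tgt"]).all(), "balance mismatch after load"
print("incremental load checks passed")
```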

Posted 1 month ago

Apply

4.0 - 8.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Project description:
We are more than a diversified global financial markets infrastructure and data business. We are dedicated, open-access partners with a commitment to excellence in delivering the services our customers expect from us. With extensive experience, deep knowledge and worldwide presence across financial markets, we enable businesses and economies around the world to fund innovation, manage risk and create jobs. It's how we've contributed to supporting the financial stability and growth of communities and economies globally for more than 300 years. Through a comprehensive suite of trusted financial market infrastructure services - and our open-access model - we provide the flexibility, stability and trust that enable our customers to pursue their ambitions with confidence and clarity. We are headquartered in the United Kingdom, with significant operations in 70 countries across EMEA, North America, Latin America and Asia Pacific. We employ 25,000 people globally, more than half located in Asia Pacific.

Responsibilities:
As a Senior Quality Assurance Engineer, you will be responsible for ensuring the quality and reliability of complex data-driven systems, with a focus on financial services applications. You will work closely with Data Engineers, Business Analysts, and Developers across global teams to validate functionality, accuracy, and performance of software solutions, particularly around data migration from on-premises to cloud platforms.

Key responsibilities include:
- Leading and executing end-to-end test plans, including functional, unit, regression, and back-to-back testing
- Designing test strategies for data migration projects, with a strong focus on Oracle to Cloud transitions
- Verifying data accuracy and transformation logic across multiple environments
- Writing Python-based automated test scripts and utilities for validation
- Participating in Agile ceremonies, collaborating closely with cross-functional teams
- Proactively identifying and documenting defects, inconsistencies, and process improvements
- Contributing to continuous testing and integration practices
- Ensuring traceability between requirements, test cases, and delivered code

Must-have skills (the ideal candidate must demonstrate strong, hands-on experience of at least 7 years in the following areas):
- Data Testing (Oracle to Cloud Migration): Deep understanding of testing strategies related to large-scale data movement and transformation validation between legacy on-premise systems and modern cloud platforms.
- Python Scripting: Proficient in using Python for writing automated test scripts and tools to streamline testing processes.
- Regression Testing: Proven ability to develop and manage comprehensive regression test suites ensuring consistent software performance over releases.
- Back-to-Back Testing: Experience in comparing results between old and new systems or components to validate data integrity post-migration.
- Functional Testing: Skilled in verifying system behavior against functional requirements in a business-critical environment.
- Unit Testing: Capable of writing and executing unit tests for small code components to ensure correctness at the foundational level.

Nice-to-have skills (not required, but a strong plus):
- Advanced Python Development: Experience in building complex QA tools or contributing to CI/CD pipelines using Python.
- DBT (Data Build Tool): Familiarity with DBT for transformation testing and documentation in data engineering workflows.
- Snowflake: Exposure to the Snowflake cloud data warehouse and understanding of its testing and validation mechanisms.
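To make the back-to-back testing idea concrete, here is a small pandas sketch that normalizes and diffs the result of the same business query run on the legacy and migrated platforms. The result sets are inline samples; real connections to Oracle or the cloud target are out of scope here.

```python
# Back-to-back testing sketch: normalize and compare result sets produced by
# the legacy (e.g. Oracle) system and the migrated cloud platform.
import pandas as pd
from pandas.testing import assert_frame_equal


def normalize(df: pd.DataFrame) -> pd.DataFrame:
    """Sort rows and coerce types so ordering/precision differences don't fail the diff."""
    return (df.sort_values(list(df.columns))
              .reset_index(drop=True)
              .astype({"amount": "float64"}))


legacy_result = pd.DataFrame({"trade_id": [2, 1], "amount": [250.50, 100.00]})
cloud_result = pd.DataFrame({"trade_id": [1, 2], "amount": [100.00, 250.50]})

assert_frame_equal(normalize(legacy_result), normalize(cloud_result), check_exact=False)
print("back-to-back comparison passed")
```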

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Hyderabad

Hybrid

Job Summary: We are looking for a Quality Engineer (Data) to ensure the reliability, accuracy, and performance of data pipelines and AI/ML models in our SmartFM platform. This role is essential for delivering trustworthy data and actionable insights to optimize smart building operations.

Roles and Responsibilities:
- Design and implement QA strategies for data pipelines and ML models.
- Test data ingestion and streaming systems (StreamSets, Kafka) for accuracy and completeness.
- Validate data stored in MongoDB, ensuring schema and data integrity.
- Collaborate with Data Engineers to proactively address data quality issues.
- Work with Data Scientists to test and validate ML/DL/LLM/Agentic Workflow models.
- Automate data validation and model testing using tools like Pytest, Great Expectations, and Deepchecks.
- Monitor production pipelines for data drift, model degradation, and performance issues.
- Participate in code reviews and create detailed QA documentation.
- Continuously improve QA processes based on industry best practices.

Required Technical Skills:
- 5-10 years of experience in QA, with a focus on data and ML testing.
- Proficient in SQL for complex data validation.
- Hands-on with StreamSets, Kafka, and MongoDB.
- Python scripting for test automation.
- Familiarity with ML model testing, metrics, and bias detection.
- Experience with cloud platforms (Azure, AWS, or GCP).
- Understanding of Node.js and React-based systems is a plus.
- Experience with QA tools like Pytest, Great Expectations, and Deepchecks.

Additional Qualifications:
- Excellent communication and documentation skills.
- Strong analytical mindset and attention to detail.
- Experience working cross-functionally with Engineers, Scientists, and Product teams.
- Passion for learning new technologies and QA frameworks.
- Domain knowledge in facility management, IoT, or building automation is a plus.
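As an example of how the MongoDB validations above might be automated, here is a pytest sketch using pymongo. It assumes a reachable MongoDB instance; the smartfm database, sensor_events collection, field names, and expected count are all hypothetical.

```python
# Pytest-style checks on ingested documents: completeness against the expected
# message count, required-field presence, and a simple value-range rule.
import pytest
from pymongo import MongoClient

EXPECTED_EVENT_COUNT = 10_000                       # e.g. messages published to Kafka
REQUIRED_FIELDS = {"building_id", "sensor_id", "ts", "value"}


@pytest.fixture(scope="module")
def events():
    client = MongoClient("mongodb://localhost:27017", serverSelectionTimeoutMS=2000)
    return client["smartfm"]["sensor_events"]


def test_ingestion_completeness(events):
    assert events.count_documents({}) == EXPECTED_EVENT_COUNT


def test_required_fields_present(events):
    # Any document missing one of the required fields is a data-quality defect.
    missing = events.count_documents(
        {"$or": [{f: {"$exists": False}} for f in REQUIRED_FIELDS]}
    )
    assert missing == 0


def test_no_negative_readings(events):
    assert events.count_documents({"value": {"$lt": 0}}) == 0
```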

Posted 1 month ago

Apply

4.0 - 8.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

The ideal candidate for the Azure Data Engineer position will have 4-6 years of experience in designing, implementing, and maintaining data solutions on Microsoft Azure. As an Azure Data Engineer at our organization, you will be responsible for designing efficient data architecture diagrams, developing and maintaining data models, and ensuring data integrity, quality, and security. You will also work on data processing, data integration, and building data pipelines to support various business needs.

Your role will involve collaborating with product and project leaders to translate data needs into actionable projects, providing technical expertise on data warehousing and data modeling, and mentoring other developers to ensure compliance with company policies and best practices. You will be expected to maintain documentation, contribute to the company's knowledge database, and actively participate in team collaboration and problem-solving activities.

We are looking for a candidate with a Bachelor's degree in Computer Science, Information Technology, or a related field, along with proven experience as a Data Engineer focusing on Microsoft Azure. Proficiency in SQL and experience with Azure data services such as Azure Data Factory, Azure SQL Database, Azure Databricks, and Azure Synapse Analytics are required. A strong understanding of data architecture, data modeling, data integration, ETL/ELT processes, and data security standards is essential, as are excellent problem-solving, collaboration, and communication skills.

As part of our team, you will have the opportunity to work on exciting projects across various industries like High-Tech, communication, media, healthcare, retail, and telecom. We offer a collaborative environment where you can expand your skills by working with a diverse team of talented individuals. GlobalLogic prioritizes work-life balance and provides professional development opportunities, excellent benefits, and fun perks for its employees.

Join us at GlobalLogic, a leader in digital engineering, where we help brands design and build innovative products, platforms, and digital experiences for the modern world. Headquartered in Silicon Valley, GlobalLogic operates design studios and engineering centers worldwide, serving customers in industries such as automotive, communications, financial services, healthcare, manufacturing, media and entertainment, semiconductor, and technology.

Posted 1 month ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Gurugram

Work from Office

Location: Bangalore/Hyderabad/Pune
Experience level: 8+ Years

About the Role: We are looking for a technical and hands-on Lead Data Engineer to help drive the modernization of our data transformation workflows. We currently rely on legacy SQL scripts orchestrated via Airflow, and we are transitioning to a modular, scalable, CI/CD-driven DBT-based data platform. The ideal candidate has deep experience with DBT and modern data stack design, and has previously led similar migrations, improving code quality, lineage visibility, performance, and engineering best practices.

Key Responsibilities:
- Lead the migration of legacy SQL-based ETL logic to DBT-based transformations
- Design and implement a scalable, modular DBT architecture (models, macros, packages)
- Audit and refactor legacy SQL for clarity, efficiency, and modularity
- Improve CI/CD pipelines for DBT: automated testing, deployment, and code quality enforcement
- Collaborate with data analysts, platform engineers, and business stakeholders to understand current gaps and define future data pipelines
- Own Airflow orchestration redesign where needed (e.g., DBT Cloud/API hooks or airflow-dbt integration)
- Define and enforce coding standards, review processes, and documentation practices
- Coach junior data engineers on DBT and SQL best practices
- Provide lineage and impact analysis improvements using DBT's built-in tools and metadata

Must-Have Qualifications:
- 8+ years of experience in data engineering
- Proven success in migrating legacy SQL to DBT, with visible results
- Deep understanding of DBT best practices, including model layering, Jinja templating, testing, and packages
- Proficient in SQL performance tuning, modular SQL design, and query optimization
- Experience with Airflow (Composer, MWAA), including DAG refactoring and task orchestration
- Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery)
- Familiarity with data testing and CI/CD for analytics workflows
- Strong communication and leadership skills; comfortable working cross-functionally

Nice-to-Have:
- Experience with DBT Cloud or DBT Core integrations with Airflow
- Familiarity with data governance and lineage tools (e.g., dbt docs, Alation)
- Exposure to Python (for custom Airflow operators/macros or utilities)
- Previous experience mentoring teams through modern data stack transitions
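For orientation, a minimal sketch of the Airflow-orchestrated dbt flow this role implies (dbt run followed by dbt test via BashOperator) is shown below. It assumes Airflow 2.4+ and dbt installed on the worker; the project path is hypothetical, and a production setup might use DBT Cloud hooks or an airflow-dbt provider instead.

```python
# Minimal Airflow DAG sketch: run dbt transformations, then dbt tests.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

DBT_DIR = "/opt/analytics/dbt_project"   # hypothetical project location

with DAG(
    dag_id="dbt_transformations",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command=f"cd {DBT_DIR} && dbt run --profiles-dir {DBT_DIR}",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command=f"cd {DBT_DIR} && dbt test --profiles-dir {DBT_DIR}",
    )
    # Tests only run if the transformations succeed.
    dbt_run >> dbt_test
```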

Posted 1 month ago

Apply

6.0 - 10.0 years

6 - 10 Lacs

Greater Noida

Work from Office

SQL Developer:
- Design and implement relational database structures optimized for performance and scalability.
- Develop and maintain complex SQL queries, stored procedures, triggers, and functions.
- Optimize database performance through indexing, query tuning, and regular maintenance.
- Ensure data integrity, consistency, and security across multiple environments.
- Collaborate with cross-functional teams to integrate SQL databases with applications and reporting tools.
- Develop and manage ETL (Extract, Transform, Load) processes for data ingestion and transformation.
- Monitor and troubleshoot database performance issues.
- Automate routine database tasks using scripts and tools.
- Document database architecture, processes, and procedures for future reference.
- Stay updated with the latest SQL best practices and database technologies.
- Data Retrieval: Query large and complex databases to extract relevant data for analysis or reporting.
- Data Transformation: Clean, join, and reshape data using SQL to prepare it for downstream processes like analytics or machine learning.
- Performance Optimization: Write queries that run efficiently, especially when dealing with big data or real-time systems.
- Understanding of Database Schemas: Know how tables relate and how to navigate normalized or denormalized structures.

QE:
- Design, develop, and execute test plans and test cases for data pipelines, ETL processes, and data platforms.
- Validate data quality, integrity, and consistency across various data sources and destinations.
- Automate data validation and testing using tools such as PyTest, Great Expectations, or custom Python/SQL scripts.
- Collaborate with data engineers, analysts, and product managers to understand data requirements and ensure test coverage.
- Monitor data pipelines and proactively identify data quality issues or anomalies.
- Contribute to the development of data quality frameworks and best practices.
- Participate in code reviews and provide feedback on data quality and testability.
- Strong SQL skills and experience with large-scale data sets.
- Proficiency in Python or another scripting language for test automation.
- Experience with data testing tools.
- Familiarity with cloud platforms and data warehousing solutions.
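One way the QE responsibilities above are often automated is with table-driven SQL checks, where each check is a query that must return zero offending rows. The pytest sketch below demonstrates the pattern against an in-memory SQLite table; the orders schema and the specific checks are illustrative only.

```python
# Table-driven data-quality checks: each entry is a SQL query that must
# return no rows. The in-memory SQLite fixture stands in for the warehouse.
import sqlite3

import pytest

CHECKS = [
    ("no_null_customer_ids", "SELECT * FROM orders WHERE customer_id IS NULL"),
    ("no_negative_amounts",  "SELECT * FROM orders WHERE amount < 0"),
    ("no_duplicate_orders",  """SELECT order_id FROM orders
                                GROUP BY order_id HAVING COUNT(*) > 1"""),
]


@pytest.fixture(scope="module")
def conn():
    c = sqlite3.connect(":memory:")
    c.executescript("""
        CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
        INSERT INTO orders VALUES (1, 10, 99.0), (2, 11, 45.5), (3, 12, 12.0);
    """)
    return c


@pytest.mark.parametrize("name,query", CHECKS, ids=[c[0] for c in CHECKS])
def test_data_quality(conn, name, query):
    offending = conn.execute(query).fetchall()
    assert not offending, f"{name} failed, offending rows: {offending}"
```

New checks are added by appending a name/query pair to CHECKS, which keeps the framework-style goal above (a growing library of declarative data-quality rules) without touching the test runner.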

Posted 1 month ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Bengaluru

Work from Office

Location: Bangalore/Hyderabad/Pune
Experience level: 8+ Years

About the Role: We are looking for a technical and hands-on Lead Data Engineer to help drive the modernization of our data transformation workflows. We currently rely on legacy SQL scripts orchestrated via Airflow, and we are transitioning to a modular, scalable, CI/CD-driven DBT-based data platform. The ideal candidate has deep experience with DBT and modern data stack design, and has previously led similar migrations, improving code quality, lineage visibility, performance, and engineering best practices.

Key Responsibilities:
- Lead the migration of legacy SQL-based ETL logic to DBT-based transformations
- Design and implement a scalable, modular DBT architecture (models, macros, packages)
- Audit and refactor legacy SQL for clarity, efficiency, and modularity
- Improve CI/CD pipelines for DBT: automated testing, deployment, and code quality enforcement
- Collaborate with data analysts, platform engineers, and business stakeholders to understand current gaps and define future data pipelines
- Own Airflow orchestration redesign where needed (e.g., DBT Cloud/API hooks or airflow-dbt integration)
- Define and enforce coding standards, review processes, and documentation practices
- Coach junior data engineers on DBT and SQL best practices
- Provide lineage and impact analysis improvements using DBT's built-in tools and metadata

Must-Have Qualifications:
- 8+ years of experience in data engineering
- Proven success in migrating legacy SQL to DBT, with visible results
- Deep understanding of DBT best practices, including model layering, Jinja templating, testing, and packages
- Proficient in SQL performance tuning, modular SQL design, and query optimization
- Experience with Airflow (Composer, MWAA), including DAG refactoring and task orchestration
- Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery)
- Familiarity with data testing and CI/CD for analytics workflows
- Strong communication and leadership skills; comfortable working cross-functionally

Nice-to-Have:
- Experience with DBT Cloud or DBT Core integrations with Airflow
- Familiarity with data governance and lineage tools (e.g., dbt docs, Alation)
- Exposure to Python (for custom Airflow operators/macros or utilities)
- Previous experience mentoring teams through modern data stack transitions

Posted 1 month ago

Apply

5.0 - 9.0 years

12 - 16 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are seeking a skilled ETL Data Tester to join our dynamic team on a 6-month contract. The ideal candidate will focus on implementing ETL processes, creating comprehensive test suites using Python, and validating data quality through advanced SQL queries. The role involves collaborating with Data Scientists, Engineers, and Software teams to develop and monitor data tools, frameworks, and infrastructure changes. Proficiency in Hive QL, Spark QL, and Big Data concepts is essential. The candidate should also have experience with data testing tools such as DBT, iCEDQ, and QuerySurge, along with expertise in Linux/Unix and messaging systems such as Kafka or RabbitMQ. Strong analytical and debugging skills are required, with a focus on continuous automation and integration of data from multiple sources.

Location: Chennai, Ahmedabad, Kolkata, Pune, Hyderabad, Remote

Posted 1 month ago

Apply

10.0 - 15.0 years

1 Lacs

Hyderabad

Remote

Complex Data Testing Engineer / Relational Data Testing Specialist / Big Data Testing Engineer

Posted 1 month ago

Apply