
12 Hadoop Testing Jobs

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the original job portal.

4.0 - 9.0 years

8 - 16 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

We are hiring ETL professionals for our client, to be placed in Bangalore/Chennai/Hyderabad/Pune. Please find the details below.
Experience: 4 to 12 yrs
Work location: Bangalore/Chennai/Hyderabad/Pune
Type: Working under Nityo payroll
Work type: WFO
Skills: Strong DS testing; ETL tester with a combination of Python/SQL/Selenium
Should be available for an F2F interview at the client location.
Notice period: Immediate to 30 days
If you are interested, please share your updated profile to vidhya.b@nityo.com / jagadeesh@nityo.com, or reach Jagadeesh at 8925740451 to learn more about the requirement and the client.
Thanks & Regards,
Vidhya B
Phone: +91 9384850049 | Mobile: +91 8056239878 | www.nityo.com
https://www.linkedin.com/in/vidhya-bhaskaran-1720835/

Posted 1 day ago

Apply

9.0 - 12.0 years

8 - 18 Lacs

Pune, Chennai, Coimbatore

Work from Office

Company Name: Hexaware Technologies
Experience: 9-12 Years
Location: Pune/Chennai/Coimbatore (Hybrid Model)
Interview Mode: F2F
Interview Date: 23rd Aug (Saturday)
Interview Rounds: 2 Rounds
Notice Period: Immediate to 30 days
Job description: ETL Tester with 6+ years of Big Data experience
- Good programming/scripting experience in SQL, PySpark, and Python.
- Collaborate with cross-functional teams to understand project requirements and Big Data system specifications.
- Develop detailed test plans, test cases, and test scripts tailored to Big Data testing needs.
- Execute manual test cases to verify the correctness and completeness of data transformation processes.
- Validate data ingestion and extraction procedures, ensuring data accuracy and consistency.
- Identify and document defects, anomalies, and data quality issues.
- Utilize Big Data technologies such as Hadoop, Spark, Hive, and related ecosystem components for testing.
- Write and execute complex queries in SQL, HiveQL, or other query languages to validate data processing and transformation.
- Perform regression testing to ensure that previously identified defects have been resolved and new changes have not introduced new issues.
- Maintain regression test suites specific to Big Data applications.
- Collaborate closely with developers, data engineers, and data scientists to understand data processing logic and resolve issues.
- Effectively communicate test results, progress, and potential risks to project stakeholders.
- Good in Jira and testing methodologies.
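For candidates preparing for this kind of role, here is a minimal sketch of the source-to-target validation the posting describes (row-count reconciliation plus a null check). The stdlib sqlite3 module stands in for Hive/Spark SQL, and all table and column names are invented for illustration.

```python
import sqlite3

# sqlite3 stands in for a Hive/Spark SQL engine; tables are hypothetical.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src_orders (order_id INTEGER, amount REAL);
CREATE TABLE tgt_orders (order_id INTEGER, amount REAL);
INSERT INTO src_orders VALUES (1, 10.0), (2, 20.5), (3, NULL);
INSERT INTO tgt_orders VALUES (1, 10.0), (2, 20.5), (3, NULL);
""")

# 1. Row-count reconciliation: the target must hold every source row.
src_count = cur.execute("SELECT COUNT(*) FROM src_orders").fetchone()[0]
tgt_count = cur.execute("SELECT COUNT(*) FROM tgt_orders").fetchone()[0]
assert src_count == tgt_count, f"row-count mismatch: {src_count} vs {tgt_count}"

# 2. Null check on a mandatory column surfaces data-quality defects.
null_rows = cur.execute(
    "SELECT COUNT(*) FROM tgt_orders WHERE amount IS NULL"
).fetchone()[0]
print(f"rows={tgt_count}, null_amounts={null_rows}")
```

The same two queries, rewritten in HiveQL against real staging tables, are a typical first pass before deeper transformation checks.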

Posted 3 days ago

Apply

6.0 - 9.0 years

5 - 12 Lacs

Pune, Chennai, Coimbatore

Work from Office

Company Name: Hexaware Technologies
Experience: 6-9 Years
Location: Pune/Chennai/Coimbatore (Hybrid Model)
Interview Mode: F2F
Interview Date: 23rd
Interview Rounds: 2 Rounds
Notice Period: Immediate to 30 days
Job description: ETL Tester with 6+ years of Big Data experience
- Good programming/scripting experience in SQL, PySpark, and Python.
- Collaborate with cross-functional teams to understand project requirements and Big Data system specifications.
- Develop detailed test plans, test cases, and test scripts tailored to Big Data testing needs.
- Execute manual test cases to verify the correctness and completeness of data transformation processes.
- Validate data ingestion and extraction procedures, ensuring data accuracy and consistency.
- Identify and document defects, anomalies, and data quality issues.
- Utilize Big Data technologies such as Hadoop, Spark, Hive, and related ecosystem components for testing.
- Write and execute complex queries in SQL, HiveQL, or other query languages to validate data processing and transformation.
- Perform regression testing to ensure that previously identified defects have been resolved and new changes have not introduced new issues.
- Maintain regression test suites specific to Big Data applications.
- Collaborate closely with developers, data engineers, and data scientists to understand data processing logic and resolve issues.
- Effectively communicate test results, progress, and potential risks to project stakeholders.
- Good in Jira and testing methodologies.

Posted 3 days ago

Apply

7.0 - 11.0 years

0 Lacs

Telangana

On-site

You are a highly skilled and detail-oriented ETL QA - Technical Lead with a solid background in Big Data testing, the Hadoop ecosystem, and SQL validation. Your primary responsibility will be leading end-to-end testing efforts for data/ETL pipelines across various big data platforms, working closely with cross-functional teams in an Agile environment to ensure the quality and integrity of large-scale data solutions.

Key responsibilities:
- Design and implement test strategies for validating large datasets, transformations, and integrations.
- Perform hands-on testing of Hadoop-based data platforms such as HDFS, Hive, and Spark.
- Develop complex SQL queries for data validation and business rule testing.
- Collaborate with developers, product owners, and business analysts in Agile ceremonies.
- Own test planning, test case design, defect tracking, and reporting for assigned modules.
- Identify areas of automation and build reusable QA assets.
- Drive QA best practices and mentor junior QA team members.

Requirements:
- 7-11 years of experience in software testing, with a minimum of 3 years in Big Data/Hadoop testing.
- Strong hands-on experience in testing Hadoop components such as HDFS, Hive, Spark, and Sqoop.
- Proficiency in SQL, especially complex joins, aggregations, and data validation.
- Experience in ETL/Data Warehouse testing and familiarity with data ingestion, transformation, and validation techniques.
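The "complex SQL queries for data validation" this lead role calls for often take the form of a two-way set difference between source and target. Below is a hedged sketch of that pattern using stdlib sqlite3 in place of a warehouse engine; the tables, rows, and the mismatch itself are invented.

```python
import sqlite3

# Illustrative reconciliation: find rows that differ between source and target.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE src (id INTEGER, city TEXT);
CREATE TABLE tgt (id INTEGER, city TEXT);
INSERT INTO src VALUES (1, 'Hyderabad'), (2, 'Chennai'), (3, 'Bengaluru');
INSERT INTO tgt VALUES (1, 'Hyderabad'), (2, 'Chennai'), (3, 'Pune');
""")

# Two-way MINUS: rows in src but not tgt, plus rows in tgt but not src.
# Subqueries keep the EXCEPT/UNION ALL precedence explicit.
diff = cur.execute("""
    SELECT * FROM (SELECT * FROM src EXCEPT SELECT * FROM tgt)
    UNION ALL
    SELECT * FROM (SELECT * FROM tgt EXCEPT SELECT * FROM src)
""").fetchall()
print("mismatched rows:", diff)
```

An empty result means the tables agree row-for-row; here the mutated row for id 3 shows up from both sides, which is exactly the defect signature a tester would log.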

Posted 2 weeks ago

Apply

7.0 - 12.0 years

1 - 1 Lacs

Pune, Chennai, Bengaluru

Work from Office

Role & responsibilities:
- Design and execute test plans for big data applications.
- Perform data validation, data quality, and data integrity tests.
- Automate test cases and develop testing scripts for accurate results.
- Collaborate with development teams to identify and resolve issues in big data processing.
- Analyze test results to ensure data systems meet specified requirements.
- Continuously improve testing processes and methodologies.
- Document testing protocols, outcomes, and system defects.

Preferred candidate profile:
- Proficiency in Hadoop, Spark, and Hive.
- Knowledge of SQL and NoSQL databases.
- Experience with data warehousing and ETL processes.
- GenAI knowledge would be an added advantage, along with vibe coding.

This role involves working closely with data engineers, data analysts, and other stakeholders to ensure the accuracy, efficiency, and reliability of big data systems and applications. Should be able to lead a young team of 5 members. 5 days of working from office.

Posted 1 month ago

Apply

3.0 - 7.0 years

17 - 25 Lacs

Bangalore Rural, Bengaluru

Work from Office

Job Description
We are looking for an energetic, high-performing, and highly skilled Quality Assurance Engineer to help shape our technology and product roadmap. You will be part of the fast-paced, entrepreneurial Enterprise Personalization portfolio focused on delivering the next generation of global marketing capabilities. This team is responsible for global campaign tracking of new account acquisition and bounty payments, and leverages transformational technologies such as SQL, Hadoop, Spark, PySpark, HDFS, MapReduce, Hive, HBase, Kafka & Java.

Focus: Provides domain expertise to engineers on automation, testing, and Quality Assurance (QA) methodologies and processes; crafts and executes test scripts; assists in preparation of test strategies; sets up and maintains test data & environments; and logs results.

Requirements:
- 4-6 years of hands-on software testing experience in developing test cases and test plans, with extensive knowledge of automated testing and architecture.
- Expert knowledge of testing frameworks and test automation design patterns such as TDD, BDD, etc.
- Expertise in developing software test cases for Hive, Spark, and SQL written in PySpark SQL and Scala.
- Hands-on experience with performance and load testing tools such as JMeter, pytest, or similar.
- Experience with industry-standard tools for defect tracking, source code management, test case management, test automation, and other management and monitoring tools.
- Experience working with Agile methodology.
- Experience with a cloud platform (GCP).
- Experience in designing, developing, testing, debugging, and operating resilient distributed systems using Big Data clusters.
- Good sense for software quality, clean code principles, test-driven development, and an agile mindset.
- High engagement, self-organization, strong communication skills, and team spirit.
- Experience with building and adopting new test frameworks.

Bonus skills: Testing machine learning/data mining.

Roles & Responsibilities:
- Responsible for testing and quality assurance of large data processing pipelines using PySpark and SQL.
- Develops and tests software, including ongoing refactoring of code, and drives continuous improvement in code structure and quality.
- Functions as a platform SME who drives quality and automation strategy at the application level, identifies new opportunities, and drives software engineers to deliver the highest-quality code.
- Delivers on capabilities for the portfolio automation strategy and executes against the test and automation strategy defined at the portfolio level.
- Works with engineers to drive improvements in code quality via manual and automated testing.
- Involved in the review of the user story backlog and requirements specifications for completeness and weaknesses in function, performance, reliability, scalability, testability, usability, security, and compliance testing; provides recommendations.
- Plans and defines the testing approach, providing advice on prioritization of testing activity in support of identified risks in project schedules or test scenarios.
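The TDD pattern this posting asks for boils down to: encode the expected behaviour of a transformation as assertions before (or alongside) the code. A framework-free sketch follows; the bounty rule, tiers, and amounts are entirely invented to illustrate the shape of such a test, not the employer's actual logic.

```python
# Hypothetical transformation under test: a bounty rule for newly acquired
# accounts. The tiers and amounts are invented for illustration only.
def bounty_for(account_type: str, approved: bool) -> float:
    """Return the referral bounty for a newly acquired account."""
    if not approved:
        return 0.0
    return {"platinum": 50.0, "gold": 25.0}.get(account_type, 10.0)

# TDD-style checks: state the expected behaviour, then run it.
def test_bounty_rules():
    assert bounty_for("platinum", True) == 50.0
    assert bounty_for("gold", True) == 25.0
    assert bounty_for("basic", True) == 10.0     # default tier
    assert bounty_for("platinum", False) == 0.0  # unapproved accounts pay nothing

test_bounty_rules()
print("all bounty-rule tests passed")
```

Under pytest the `test_` function would be discovered and run automatically; the explicit call here just keeps the sketch self-contained.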

Posted 1 month ago

Apply

8.0 - 13.0 years

24 - 57 Lacs

Bengaluru

Work from Office

Job description:
- Minimum 8-10 years of experience in manual/automation testing.
- Able to develop automation scripts using a keyword-based tool.
- Experience in any OOP language (Java/Python/C#; at least one preferred).
- Strong understanding of OOP concepts.

Posted 2 months ago

Apply

1.0 - 4.0 years

5 - 8 Lacs

Pune, Gurugram

Work from Office

Technical Skills:
- Proficient in SQL and Linux with hands-on experience.
- Good understanding of the Hadoop ecosystem and job scheduling tools like Airflow and Oozie.
- Skilled in writing and executing SQL queries for comprehensive data validation.
- Basic programming knowledge in Python is a plus.
- Experience with S3 buckets and cloud storage workflows is advantageous.

Soft Skills:
- Strong analytical and problem-solving skills with high attention to detail.
- Excellent verbal and written communication abilities.
- Ability to collaborate effectively in a fast-paced Agile/Scrum environment.
- Adaptable and eager to learn new tools, technologies, and processes.

Experience:
- 0-4 years of experience in Big Data testing, focusing on both automated and manual testing for data validation and UI testing.
- Experience in testing Spark job performance, security, and integration across diverse systems.
- Hands-on experience with defect tracking tools such as JIRA or Bugzilla.
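For the "S3 buckets and cloud storage workflows" skill, a common validation step is confirming a file landed intact by comparing checksums of the uploaded and downloaded copies. A stdlib-only sketch, with in-memory bytes standing in for real S3 objects (no AWS calls are made):

```python
import hashlib

# Checksum validation as might run after a file lands in an S3 bucket.
# Plain byte strings stand in for the uploaded/downloaded objects.
def md5_hex(data: bytes) -> str:
    """Hex MD5 digest, comparable to an S3 ETag for single-part uploads."""
    return hashlib.md5(data).hexdigest()

source_bytes = b"order_id,amount\n1,10.0\n2,20.5\n"
landed_bytes = b"order_id,amount\n1,10.0\n2,20.5\n"

src_sum = md5_hex(source_bytes)
dst_sum = md5_hex(landed_bytes)
assert src_sum == dst_sum, "file corrupted in transit"
print("checksums match:", src_sum)
```

In a real pipeline this check would typically run as an Airflow task right after the transfer step, failing the DAG run on a mismatch. Note that multi-part S3 uploads produce composite ETags, so a plain MD5 comparison only holds for single-part objects.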

Posted 2 months ago

Apply

4.0 - 8.0 years

11 - 12 Lacs

Gurugram

Work from Office

Big Data Tester
Requirements:
• Experience: 4-8 years.
• Good knowledge and hands-on experience of Big Data (HDFS, Hive, Kafka) testing. (Must)
• Good knowledge and hands-on experience of SQL. (Must)
• Good knowledge and hands-on experience of Linux. (Must)
• Well versed with QA methodologies.
• Manual + automation will work.
• Knowledge of DBT, AWS, or automation testing will be a plus.

Posted 2 months ago

Apply

3.0 - 6.0 years

8 - 15 Lacs

Gurugram

Work from Office

2-5 years of experience in Big Data testing, focusing on both automated and manual testing. Strong understanding of Hadoop. Proficient in SQL, Unix, and Linux with hands-on experience. Experience in ETL is also required.

Posted 3 months ago

Apply

6.0 - 9.0 years

2 - 6 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Key Responsibilities:
- Perform ETL testing on data pipelines and data warehouse systems.
- Validate data transformations, loads, and extractions from source to target systems.
- Conduct data reconciliation and data quality checks across large datasets.
- Work with Hadoop ecosystem tools such as Hive, Pig, HDFS, Spark, or Sqoop.
- Develop and execute SQL queries to validate business rules, transformations, and data accuracy.
- Automate data validation and testing processes using appropriate scripting tools or frameworks.
- Understand business requirements and create comprehensive test plans and test cases.
- Collaborate with data engineers, developers, and business analysts to resolve data-related issues.
- Document and track test results and defects, and provide root cause analysis.

Required Skills:
- 3-8 years of hands-on experience in ETL testing and Data Warehouse testing.
- Strong understanding of ETL concepts, data integration, and data warehousing.
- Hands-on experience in the Hadoop ecosystem: Hive, Pig, HDFS, Sqoop, Spark, etc.
- Proficient in SQL; able to write complex queries for data validation.
- Familiarity with tools like Informatica, Talend, or other ETL platforms (optional but preferred).
- Knowledge of data modeling concepts: star schema, snowflake schema.
- Experience with test management and defect tracking tools (e.g., JIRA, ALM).
- Good understanding of Agile/Scrum methodologies.

Nice to Have:
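The "data quality checks" bullet typically includes verifying key uniqueness and aggregate totals after a load. A minimal sketch of both checks, again with sqlite3 standing in for the warehouse and an invented staging table containing a deliberate duplicate:

```python
import sqlite3

# Illustrative data-quality checks on a staging table (names are hypothetical).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE stage_txn (txn_id INTEGER, amount REAL);
INSERT INTO stage_txn VALUES (101, 5.0), (102, 7.5), (102, 7.5), (103, 2.0);
""")

# Check 1: business keys must be unique after the load.
dupes = cur.execute("""
    SELECT txn_id, COUNT(*) AS n
    FROM stage_txn GROUP BY txn_id HAVING n > 1
""").fetchall()

# Check 2: the aggregate total feeds a reconciliation against the source system.
total = cur.execute("SELECT SUM(amount) FROM stage_txn").fetchone()[0]
print("duplicate keys:", dupes, "total:", total)
```

A duplicated key like this often points at a re-run of the load job without truncation, which is why the count and the sum are checked together: the duplicate inflates both.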

Posted Date not available

Apply

2.0 - 6.0 years

6 - 12 Lacs

Pune

Work from Office

Hot Job Opening: Big Data QA/Test Engineer | 2-6 Years | Automation + Hadoop Testing
Location: Pune
Experience: 2 to 6 years
Job Type: Full-Time | Immediate Joiners Preferred

About the Role:
We are seeking a passionate and detail-oriented Big Data QA/Test Engineer to join our growing team. In this role, you'll ensure the reliability and accuracy of our Big Data pipelines, working on automated and manual testing of complex data flows and APIs. You will have the opportunity to work with cutting-edge tools like Spark, Airflow, S3, and more!

Key Responsibilities:
- Perform end-to-end testing of Big Data pipelines (manual & automated).
- Validate large datasets using complex SQL queries across various stages of ETL.
- Execute and troubleshoot Spark and Hadoop jobs.
- Work with job scheduling tools like Airflow or Oozie.
- Implement test scripts using automation frameworks like Robot Framework (or similar).
- Handle data validation workflows across S3 buckets and cloud storage.
- Collaborate with developers and data engineers in an Agile/Scrum environment.
- Log and track defects using JIRA/Bugzilla and participate in regular QA status meetings.

Posted Date not available

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies