4716 Hadoop Jobs - Page 4

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

3.0 - 8.0 years

5 - 10 Lacs

Kolkata

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Develop and implement software solutions to meet business requirements.
- Collaborate with cross-functional teams to analyze and address technical issues.
- Conduct code reviews and provide feedback to enhance application performance.
- Stay updated on emerging technologies and trends in application development.
- Assist in troubleshooting and resolving application-related issues.

Professional & Technical Skills:
- Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of data analytics and data processing techniques.
- Experience with cloud-based data platforms such as AWS or Azure.
- Knowledge of programming languages such as Python, Java, or Scala.
- Hands-on experience developing and deploying applications on the Databricks platform.

Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Kolkata office.
- 15 years of full-time education is required.

Posted 7 hours ago

5.0 - 10.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, develop, and maintain ETL processes using Ab Initio and other ETL tools.
- Manage and optimize data pipelines on AWS.
- Write and maintain complex PL/SQL queries for data extraction, transformation, and loading.
- Provide Level 3 support for ETL processes, troubleshooting and resolving issues promptly.
- Collaborate with data architects, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Ensure data quality and integrity through rigorous testing and validation.
- Stay updated with the latest industry trends and technologies in ETL and cloud computing.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Certification in Ab Initio.
- Proven experience with AWS and cloud-based data solutions.
- Strong proficiency in PL/SQL and other ETL tools.
- Experience providing Level 3 support for ETL processes.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.

Preferred Qualifications:
- Experience with other ETL tools such as Informatica, Talend, or DataStage.
- Knowledge of data warehousing concepts and best practices.
- Familiarity with scripting languages (e.g., Python, shell scripting).

Posted 7 hours ago

8.0 - 13.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Education: Bachelor of Engineering
Service Line: Strategic Technology Group

Responsibilities:
Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will work on complex engineering projects, platforms, and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be polyglots. They are go-getters with a drive to solve end-customer challenges, and will spend most of their time designing and coding.
- End-to-end contribution to technology-oriented development projects.
- Providing solutions with minimum system requirements, in Agile mode.
- Collaborate with Power Programmers, the open-source community, and tech user groups.
- Custom development of new platforms and solutions.

Opportunities:
- Work on large-scale digital platforms and marketplaces.
- Work on complex engineering projects using cloud-native architecture.
- Work with innovative Fortune 500 companies on cutting-edge technologies.
- Co-create and develop new products and platforms for our clients.
- Contribute to open source and continuously upskill in the latest technology areas.
- Incubate tech user groups.

Technical and Professional Skills: Big Data: Spark, Scala, Hive, Kafka

Preferred Skills:
- Technology - Big Data - HBase
- Technology - Big Data - Sqoop
- Technology - Functional Programming - Scala
- Technology - Big Data - Data Processing - Spark - SparkSQL

Posted 7 hours ago

5.0 - 8.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Education: Bachelor of Engineering
Service Line: Strategic Technology Group

Responsibilities:
Power Programmer is an important initiative within Global Delivery to develop a team of Full Stack Developers who will work on complex engineering projects, platforms, and marketplaces for our clients using emerging technologies. They will be ahead of the technology curve and will be constantly enabled and trained to be polyglots. They are go-getters with a drive to solve end-customer challenges, and will spend most of their time designing and coding.
- End-to-end contribution to technology-oriented development projects.
- Providing solutions with minimum system requirements, in Agile mode.
- Collaborate with Power Programmers, the open-source community, and tech user groups.
- Custom development of new platforms and solutions.

Opportunities:
- Work on large-scale digital platforms and marketplaces.
- Work on complex engineering projects using cloud-native architecture.
- Work with innovative Fortune 500 companies on cutting-edge technologies.
- Co-create and develop new products and platforms for our clients.
- Contribute to open source and continuously upskill in the latest technology areas.
- Incubate tech user groups.

Technical and Professional Skills: Big Data: Spark, Scala, Hive, Kafka

Preferred Skills:
- Technology - Big Data - Hadoop - Hive
- Technology - Big Data - Sqoop
- Technology - Functional Programming - Scala
- Technology - Big Data - Data Processing - Spark - SparkSQL

Posted 7 hours ago

6.0 - 11.0 years

3 - 7 Lacs

Pune

Work from Office

Experience: 7-9 years
- Experience with AWS services is a must: S3, Lambda, Airflow, Glue, Athena, Lake Formation, Step Functions, etc.
- Experience programming in Java and Python.
- Experience performing data analysis (not data science) on AWS platforms.

Nice to have:
- Experience with big data technologies (Teradata, Snowflake, Spark, Redshift, Kafka, etc.).
- Experience with data management processes on AWS is a huge plus.
- Experience implementing complex ETL transformations on AWS using Glue.
- Familiarity with relational database environments (Oracle, Teradata, etc.), leveraging databases, tables/views, stored procedures, agent jobs, etc.

Posted 7 hours ago

7.0 - 12.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Emirates NBD is one of the largest banks in the Middle East, with over USD 6 Bn in revenues and 14,000 employees globally. The BI function in group finance requires strong alignment with business heads to deliver the best data-driven insights and craft a data-driven strategy for the organization. The BI Engineer works on pioneering business intelligence use cases, realizing the output as engaging, highly intuitive, and crystal-clear visualizations, thereby minimizing the time to insight for senior stakeholders. S/he makes a real difference through their own delivery, aesthetic sense, and ability to cut through the noise straight to the signal.

Responsibilities:
- Work with a team of product owners, translators, data scientists, and data engineers to deliver business impact through data.
- Create innovative and compelling visualizations that make analytical insights easily digestible for relevant stakeholders.
- Develop (Tableau) visualization solutions that are sustainable and embedded into end-to-end analytics solutions, providing stakeholders with continuous, up-to-date analytics insights.
- Implement an agile approach to visualization development, including rapid prototyping and short, frequent development cycles.
- Support hiring, coaching, and training of citizen visualization talent across the Bank.
- Help build understanding of and familiarity with visualization solutions among relevant stakeholders.

Requirements:
- Expert-level user of Tableau and Power BI (including SSRS), with a portfolio of existing data visualizations, ideally built using Tableau / Power BI.
- Expert-level SQL capabilities; able to write optimized queries.
- Proven ability to visualize analytics insights.
- Experience working with large datasets (Hadoop and SQL).
- Technical proficiency in database design and development, data models, and techniques for data mining and segmentation.

Posted 7 hours ago

8.0 - 13.0 years

3 - 7 Lacs

Hyderabad

Work from Office

P1-C3-STS. Seeking a developer with good experience in Athena, Python, Glue, Lambda, DMS, RDS, Redshift, CloudFormation, and other AWS serverless resources.
- Can optimize data models for performance and efficiency.
- Able to write SQL queries to support data analysis and reporting.
- Design, implement, and maintain the data architecture for all AWS data services.
- Work with stakeholders to identify business needs and requirements for data-related projects.
- Design and implement ETL processes to load data into the data warehouse.

Posted 7 hours ago

6.0 - 11.0 years

4 - 8 Lacs

Bengaluru

Work from Office

React Developer

Responsibilities:
- Develop new user-facing features using React.js.
- Build reusable components and front-end libraries for future use.
- Translate user stories and wireframes into high-quality code.
- Create applications that provide fantastic UI/UX and responsive design.
- Integrate apps with third-party APIs and cloud APIs.
- Apply core computer science concepts to improve consumer web apps.
- Profile and improve our front-end performance.
- Design for scalability and adherence to standards.

Required Skills:
- Excellent UI development skills using the React framework.
- Strong in Redux or Flux.
- Strong in JavaScript (ES6 and above).

Posted 7 hours ago

8.0 - 13.0 years

4 - 8 Lacs

Hyderabad

Work from Office

This role will be instrumental in building and maintaining robust, scalable, and reliable data pipelines using Confluent Kafka, ksqlDB, Kafka Connect, and Apache Flink. The ideal candidate will have a strong understanding of data streaming concepts, experience with real-time data processing, and a passion for building high-performance data solutions. This role requires excellent analytical skills, attention to detail, and the ability to work collaboratively in a fast-paced environment.

Essential Responsibilities:
- Design and develop data pipelines for real-time and batch data ingestion and processing using Confluent Kafka, ksqlDB, Kafka Connect, and Apache Flink.
- Build and configure Kafka connectors to ingest data from various sources (databases, APIs, message queues, etc.) into Kafka.
- Develop Flink applications for complex event processing, stream enrichment, and real-time analytics.
- Develop and optimize ksqlDB queries for real-time data transformations, aggregations, and filtering.
- Implement data quality checks and monitoring to ensure data accuracy and reliability throughout the pipeline.
- Monitor and troubleshoot data pipeline performance, identify bottlenecks, and implement optimizations.
- Automate data pipeline deployment, monitoring, and maintenance tasks.
- Stay up to date with the latest advancements in data streaming technologies and best practices.
- Contribute to the development of data engineering standards and best practices within the organization.
- Participate in code reviews and contribute to a collaborative and supportive team environment.
- Work closely with other architects and tech leads in India and the US to create POCs and MVPs.
- Provide regular updates on tasks, status, and risks to the project manager.

Required:
- Bachelor's degree or higher from a reputed university.
- 8 to 10 years of total experience, the majority of it related to ETL/ELT, big data, Kafka, etc.
- Proficiency in developing Flink applications for stream processing and real-time analytics.
- Strong understanding of data streaming concepts and architectures.
- Extensive experience with Confluent Kafka, including Kafka brokers, producers, consumers, and Schema Registry.
- Hands-on experience with ksqlDB for real-time data transformations and stream processing.
- Experience with Kafka Connect and building custom connectors.
- Extensive experience implementing large-scale data ingestion and curation solutions.
- Good hands-on experience with a big data technology stack on any cloud platform.
- Excellent problem-solving, analytical, and communication skills.
- Ability to work independently and as part of a team.

Good to have:
- Experience with Google Cloud.
- Healthcare industry experience.
- Experience with Agile.
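The windowed aggregations this role describes (in ksqlDB or Flink) can be illustrated conceptually in plain Python. This is only a sketch of the tumbling-window idea, not real streaming code; the function name and event shape are invented for illustration.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_secs):
    """Count events per key in fixed, non-overlapping time windows.

    A plain-Python stand-in for what a ksqlDB WINDOW TUMBLING
    aggregation or a Flink keyed tumbling window computes.
    Each event is a (timestamp_seconds, key) pair.
    """
    counts = defaultdict(int)
    for ts, key in events:
        # Align the timestamp down to the start of its window.
        window_start = ts - (ts % window_secs)
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(0, "click"), (3, "click"), (7, "view"), (12, "click")]
print(tumbling_window_counts(events, 10))
# {(0, 'click'): 2, (0, 'view'): 1, (10, 'click'): 1}
```

In a real pipeline the same grouping logic runs continuously over an unbounded Kafka topic, with the framework handling event time, late data, and state.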

Posted 7 hours ago

10.0 - 15.0 years

12 - 16 Lacs

Pune

Work from Office

To be successful in this role, you should meet the following requirements (must have):
- Payments and banking experience is a must.
- Experience implementing and monitoring data governance using standard methodology throughout the data life cycle, within a large organisation.
- Up-to-date knowledge of data governance theory, standard methodology, and practical considerations.
- Knowledge of data governance industry standards and tools.
- 10+ years of overall experience in data governance, encompassing data quality management, master data management, data privacy and compliance, data cataloguing and metadata management, data security, maturity, and lineage.
- Prior experience implementing an end-to-end data governance framework.
- Experience automating data cataloguing, ensuring accurate, consistent metadata and making data easily discoverable and usable.
- Domain experience across the payments and banking lifecycle.
- An analytical mind and an inclination for problem-solving, with attention to detail.
- Ability to effectively navigate and deliver transformation programmes in large global financial organisations, amidst the challenges posed by bureaucracy, globally distributed teams, and local data regulations.
- Strong communication skills, coupled with the ability to present complex information and data.
- A first-class degree in Engineering or a relevant field, with two or more of the following subjects as a major: Mathematics, Computer Science, Statistics, Economics.

The successful candidate will also meet the following requirements (good to have):
- Database types: relational, NoSQL, document. Databases: Oracle, PostgreSQL, BigQuery, Bigtable, MongoDB, Neo4j.
- Experience in conceptual/logical/physical data modeling.
- Experience in Agile methodology and leading Agile delivery, aligned with organisational needs.
- An effective leader as well as a team player, with a strong commitment to quality and efficiency.

Posted 7 hours ago

6.0 - 11.0 years

4 - 8 Lacs

Bengaluru

Work from Office

- Deliver performance-focused backend system solutions, mostly in Java.
- Build and maintain new and existing applications using Java object-oriented software analysis and design.
- Solid understanding of object-oriented programming and data modelling.
- Experience with networking and distributed systems.
- Experience with, and appreciation for, automated testing.
- Experience with cloud compute, virtualisation, and automation, using Kubernetes and AWS.
- Preferably, exposure to open-source applications, e.g. Cassandra and Apache Flink.
- B.S./M.S./Ph.D. in Computer Science or a related field, or equivalent experience.
- Proven experience solving problems in complex domains.
- Proactively identify and manage risks, including assessing and controlling risks of various kinds, and apply this appropriately to diverse situations.
- Displays courage and is always willing to contribute constructive feedback, not being afraid to highlight issues and challenges and to bring alternative solutions to the table.

Posted 7 hours ago

6.0 - 11.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Senior Developer with 8 to 10 years of experience in Python and PySpark, along with hands-on experience with AWS data components such as AWS Glue and Athena. Should also have good knowledge of data warehouse tools to understand the existing system, experience with data lakes, Teradata, and Snowflake, and should be good at Terraform.
- 8-10 years of experience designing and developing Python and PySpark applications.
- Creating or maintaining data lake solutions using Snowflake, Teradata, and other data warehouse tools.
- Good knowledge of and hands-on experience with AWS Glue, Athena, etc.
- Sound knowledge of all data lake concepts; able to work on data migration projects.
- Providing ongoing support and maintenance for applications, including troubleshooting and resolving issues.
- Expertise in practices like Agile, peer reviews, and CI/CD pipelines.

Posted 7 hours ago

6.0 - 11.0 years

9 - 14 Lacs

Hyderabad

Work from Office

We are hiring an Ab Initio ETL Lead with at least 6 years of experience who can not only design and implement ETL solutions but also perform data analysis and production support activities. The ideal candidate will manage the end-to-end development lifecycle of ETL processes while providing leadership on data analysis tasks to ensure accurate and actionable insights. Additionally, the role involves providing production support to maintain the stability and performance of critical data pipelines.

Posted 7 hours ago

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Google BigQuery
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with teams to design and develop innovative solutions for business needs.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead design discussions and provide technical guidance to the team.
- Conduct code reviews and ensure adherence to coding standards.
- Stay updated on industry trends and technologies to drive innovation.

Professional & Technical Skills:
- Proficiency in Google BigQuery.
- Strong understanding of data modeling and database design.
- Experience with cloud-based data warehousing solutions.
- Hands-on experience with ETL processes and data integration.
- Knowledge of SQL and query optimization techniques.

Additional Information:
- The candidate should have a minimum of 5 years of experience with Google BigQuery.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 7 hours ago

4.0 - 9.0 years

6 - 11 Lacs

Hyderabad

Work from Office

Responsibilities:
- Design, develop, and maintain ETL processes using Talend.
- Manage and optimize data pipelines on Amazon Redshift.
- Implement data transformation workflows using dbt (Data Build Tool).
- Write efficient, reusable, and reliable code in PySpark.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and deliver effective solutions.
- Ensure data quality and integrity through rigorous testing and validation.
- Stay updated with the latest industry trends and technologies in data engineering.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- High proficiency in Talend.
- Strong experience with Amazon Redshift.
- Expertise in dbt and PySpark.
- Experience with data modeling, ETL processes, and data warehousing.
- Familiarity with cloud platforms and services.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.

Preferred Qualifications:
- Experience with other data engineering tools and frameworks.
- Knowledge of machine learning frameworks and libraries.

Posted 7 hours ago

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Airflow with PySpark: expertise in designing, developing, and deploying data pipelines using Apache Airflow. The focus is on creating, managing, and monitoring workflows, ensuring data quality, and collaborating with other data teams.

Posted 7 hours ago

15.0 - 20.0 years

10 - 14 Lacs

Chennai

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge sharing and mentoring within the team to enhance overall performance.
- Monitor project progress and make adjustments as necessary to ensure successful delivery.

Professional & Technical Skills:
- Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of data engineering principles and practices.
- Experience with cloud-based data solutions and architectures.
- Familiarity with data integration tools and techniques.
- Ability to analyze and optimize application performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based in Chennai.
- 15 years of full-time education is required.

Posted 7 hours ago

5.0 - 10.0 years

14 - 17 Lacs

Pune

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that address clients' needs.

Primary responsibilities:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need it.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 5+ years of experience in big data: Hadoop, Spark (Scala, Python), HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience using Python to develop a custom framework for generating rules (like a rules engine).
- Developed Python code to gather data from HBase and designed solutions implemented using PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations, and Hive context objects to perform read/write operations.

Preferred technical and professional experience:
- Understanding of DevOps.
- Experience building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
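The "custom framework for generating rules (like a rules engine)" mentioned above can be sketched in a few lines of plain Python. This is a hypothetical minimal shape, not the role's actual framework: each rule is a (predicate, transform) pair applied in order to every record, the same pattern a PySpark job would apply per row or DataFrame.

```python
# Minimal rules-engine sketch (assumed design, for illustration only):
# a rule is a (predicate, transform) pair; rules are applied in order.

def apply_rules(records, rules):
    """Apply each matching rule's transform to every record, in order."""
    out = []
    for rec in records:
        rec = dict(rec)  # copy so the caller's data is not mutated
        for predicate, transform in rules:
            if predicate(rec):
                rec = transform(rec)
        out.append(rec)
    return out

rules = [
    # Flag negative amounts as rejected.
    (lambda r: r["amount"] < 0, lambda r: {**r, "status": "rejected"}),
    # Everything not rejected is ok.
    (lambda r: r.get("status") != "rejected", lambda r: {**r, "status": "ok"}),
]
records = [{"amount": 100}, {"amount": -5}]
print(apply_rules(records, rules))
# [{'amount': 100, 'status': 'ok'}, {'amount': -5, 'status': 'rejected'}]
```

Keeping rules as data (rather than hard-coded branches) is what lets such a framework generate or load rules dynamically.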

Posted 7 hours ago

2.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Seeking a skilled Data Engineer to work on cloud-based data pipelines and analytics platforms. The ideal candidate will have hands-on experience with PySpark and AWS, proficiency in designing data lakes, and experience with modern data orchestration tools.

Posted 7 hours ago

5.0 - 10.0 years

14 - 17 Lacs

Mumbai

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating source-to-target pipelines/workflows and implementing solutions that address clients' needs.

Primary responsibilities:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need it.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 5+ years of experience in big data: Hadoop, Spark (Scala, Python), HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience using Python to develop a custom framework for generating rules (like a rules engine).
- Developed Python code to gather data from HBase and designed solutions implemented using PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations, and Hive context objects to perform read/write operations.

Preferred technical and professional experience:
- Understanding of DevOps.
- Experience building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.

Posted 7 hours ago

15.0 - 20.0 years

10 - 14 Lacs

Pune

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Talend ETL
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in the development process. Your role will be pivotal in driving the success of application projects and fostering a collaborative environment among team members and other departments.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement adjustments as necessary to meet deadlines.

Professional & Technical Skills:
- Must have: proficiency in Talend ETL.
- Good to have: experience with data integration tools and methodologies.
- Strong understanding of data warehousing concepts and practices.
- Experience in performance tuning and optimization of ETL processes.
- Familiarity with cloud-based data solutions and architectures.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Talend ETL.
- This position is based at our Pune office.
- 15 years of full-time education is required.

Posted 7 hours ago

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

JD for Data Engineer (Python): At least 5 to 8 years of experience in AWS and Python programming; able to design, build, test, and deploy code. The candidate should have worked on Lambda-based API development. Should have experience using the following AWS services: AWS SQS, AWS MSK, AWS RDS (Aurora), and Boto3. Very strong SQL knowledge is a must; should be able to understand and build complex queries. He/she will work closely with the enterprise architect and other client teams onsite as needed. Experience building solutions using Kafka is a good value addition (optional).

Posted 7 hours ago

8.0 - 13.0 years

4 - 8 Lacs

Mumbai

Work from Office

Senior Developer with 8 to 10 years of experience in Python and PySpark, along with hands-on experience with AWS data components such as AWS Glue and Athena. Should also have good knowledge of data warehouse tools to understand the existing system, experience with data lakes, Teradata, and Snowflake, and should be good at Terraform.
- 8-10 years of experience designing and developing Python and PySpark applications.
- Creating or maintaining data lake solutions using Snowflake, Teradata, and other data warehouse tools.
- Good knowledge of and hands-on experience with AWS Glue, Athena, etc.
- Sound knowledge of all data lake concepts; able to work on data migration projects.
- Providing ongoing support and maintenance for applications, including troubleshooting and resolving issues.
- Expertise in practices like Agile, peer reviews, and CI/CD pipelines.

Posted 7 hours ago

2.0 - 5.0 years

14 - 17 Lacs

Navi Mumbai

Work from Office

As a Data Engineer at IBM, you'll play a vital role in development and application design, and provide regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working within an agile framework.
- Discover and implement the latest technology trends to build creative solutions.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing.
- Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
- Data engineering skills: strong understanding of ETL pipelines, data modelling, and data warehousing concepts.
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
- Data processing frameworks: knowledge of data processing libraries such as Pandas and NumPy.
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
- Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including cloud storage systems.

Preferred technical and professional experience:
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: detection and prevention tools for company products, platform, and customer-facing systems.

Posted 7 hours ago

8.0 - 13.0 years

4 - 8 Lacs

Mumbai

Work from Office

- 4+ years of experience as a Data Engineer or in a similar role.
- Proficiency in Python, PySpark, and advanced SQL.
- Hands-on experience with big data tools and frameworks (e.g., Spark, Hive).
- Experience with cloud data platforms like AWS, Azure, or GCP is a plus.
- Solid understanding of data modeling, warehousing, and ETL processes.
- Strong problem-solving and analytical skills.
- Good communication and teamwork abilities.

Responsibilities:
- Design, build, and maintain data pipelines that collect, process, and store data from various sources.
- Integrate data from multiple heterogeneous sources such as databases (SQL/NoSQL), APIs, cloud storage, and flat files.
- Optimize data processing tasks to improve execution efficiency, reduce costs, and minimize processing times, especially when working with large-scale datasets in Spark.
- Design and implement data warehousing solutions that centralize data from multiple sources for analysis.
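The heterogeneous-source integration described above can be sketched at toy scale with Python's standard library: one "source" is a CSV export, another is a SQL table, and both land in a single warehouse table. Table names, column names, and the sample data are all invented for illustration; a production version would use Spark or a cloud warehouse instead of in-memory SQLite.

```python
import csv
import io
import sqlite3

def load_csv(text):
    """Parse a CSV export (one heterogeneous source) into typed records."""
    return [{"id": int(r["id"]), "amount": float(r["amount"])}
            for r in csv.DictReader(io.StringIO(text))]

def run_pipeline(csv_text):
    """Combine a CSV source and a SQL source into one warehouse table."""
    db = sqlite3.connect(":memory:")
    # Second source: a pre-existing SQL table with sample rows.
    db.execute("CREATE TABLE orders (id INTEGER, amount REAL)")
    db.executemany("INSERT INTO orders VALUES (?, ?)", [(3, 30.0), (4, 5.5)])
    db.execute("CREATE TABLE warehouse (id INTEGER, amount REAL)")
    # Extract from both sources, then load into the warehouse.
    rows = [(r["id"], r["amount"]) for r in load_csv(csv_text)]
    rows += db.execute("SELECT id, amount FROM orders").fetchall()
    db.executemany("INSERT INTO warehouse VALUES (?, ?)", rows)
    return db.execute("SELECT COUNT(*), SUM(amount) FROM warehouse").fetchone()

print(run_pipeline("id,amount\n1,10.0\n2,20.0"))
# (4, 65.5)
```

The shape is the same at scale: extract from each source into a common schema, then load into one centralized store for analysis.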

Posted 7 hours ago
