Home
Jobs

1901 Data Engineering Jobs - Page 41

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Naukri logo

We are looking for an experienced Analytics Engineer to join Okta's enterprise data team. This analyst will have a strong background in SaaS subscription and product analytics, a passion for providing customer usage insights to internal stakeholders, and experience organizing complex data into consumable data assets. In this role, you will focus on subscription analytics and product utilization insights, partnering with Product, Engineering, Customer Success, and Pricing to implement enhancements and build end-to-end customer subscription insights into new products.

Requirements:
- Experience in customer analytics, product analytics, and go-to-market analytics.
- Experience in the SaaS business and Product domains, as well as Salesforce.
- Proficiency in SQL, ETL tools, GitHub, and data integration technologies, including familiarity with data modeling techniques, database design, and query optimization.
- Experience in data languages such as R and Python; knowledge of data processing frameworks like PySpark is also beneficial.
- Experience working with cloud-based data solutions such as AWS or Google Cloud Platform and cloud-based data warehousing tools such as Snowflake.
- Strong analytical and problem-solving skills to understand complex data problems and provide effective solutions.
- Experience building reports and visualizations to represent data in Tableau or Looker.
- Ability to communicate effectively with stakeholders, work cross-functionally, and communicate with technical and non-technical teams.
- Familiarity with the Scrum operating model and tracking work via a tool such as Jira.
- 6+ years in data engineering, data warehousing, or business intelligence.
- BS in computer science, data science, statistics, mathematics, or a related field.

Responsibilities:
- Engage with Product and Engineering to implement product definitions into subscription and product analytics, building new insights and updating existing key data products.
- Analyze a variety of data sources, structures, and metadata, and develop mappings, transformation rules, aggregations, and ETL specifications.
- Configure scalable and reliable data pipelines to consume, integrate, and analyze large volumes of complex data from different sources to support the growing needs of subscription and product analytics.
- Partner with internal stakeholders to understand user needs and implement user feedback, and develop reporting and dashboards focused on subscription analytics.
- Work closely with other Analytics team members to optimize data self-service, reusability, and performance, and ensure the validity of the source of truth.
- Enhance reusable knowledge of the models and metrics through documentation and use of the data catalog.
- Ensure data security and compliance by implementing appropriate data access controls, encryption, and auditing mechanisms.
- Take ownership of successful completion of project activities.

Nice to Have:
- Experience in data science and AI/ML concepts and techniques.
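The responsibilities above center on rolling raw usage data into consumable aggregates. A minimal sketch of that pattern, using Python's built-in sqlite3 as a stand-in warehouse (the usage_events table and its columns are hypothetical, not Okta's actual schema):

```python
import sqlite3

# In-memory database standing in for a warehouse table of raw product-usage events.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE usage_events (account_id TEXT, product TEXT, events INTEGER)")
conn.executemany(
    "INSERT INTO usage_events VALUES (?, ?, ?)",
    [("acme", "sso", 120), ("acme", "mfa", 30), ("globex", "sso", 45)],
)

# Aggregate raw events into a per-account, per-product usage summary --
# the kind of "consumable data asset" a downstream dashboard would read.
rows = conn.execute(
    """
    SELECT account_id, product, SUM(events) AS total_events
    FROM usage_events
    GROUP BY account_id, product
    ORDER BY account_id, product
    """
).fetchall()
print(rows)
```

A real pipeline would run an equivalent GROUP BY in Snowflake or another warehouse and materialize the result as a shared, documented data asset.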

Posted 3 weeks ago

Apply

5.0 - 7.0 years

14 - 18 Lacs

Hyderabad

Work from Office


Data Engineering & Analytics Team Lead

About BizAcuity: BizAcuity is a Business Intelligence and Data Analytics product engineering and consulting company developing solutions for clients across the globe in various domains, including Gaming, BFSI, Media, and E-Commerce. BizAcuity has developed an AI-powered Software-as-a-Service (SaaS) platform that provides visual and predictive analytics using Machine Learning and Artificial Intelligence.

Job Mode: Onsite/Work from Office | Monday to Friday | Shift 1 (Morning).
Job Location: Hyderabad.

Overview: We are seeking an experienced Team Lead to oversee our data engineering and analytics team, consisting of data engineers, ML engineers, reporting engineers, and data/business analysts. The ideal candidate will drive end-to-end data solutions, from data lake and data warehouse implementations to advanced analytics and AI/ML projects, ensuring timely delivery and quality standards.

Key Responsibilities:
- Lead and mentor a cross-functional team of data professionals, including data engineers, ML engineers, reporting engineers, and data/business analysts.
- Manage the complete lifecycle of data projects, from requirements gathering to implementation and maintenance.
- Develop detailed project estimates and allocate work effectively among team members based on skills and capacity.
- Implement and maintain data architectures, including data lakes, data warehouses, and lakehouse solutions.
- Review team deliverables for quality, adherence to best practices, and performance optimization.
- Hold team members accountable for timelines and quality standards through regular check-ins and performance tracking.
- Translate business requirements into technical specifications and actionable tasks.
- Collaborate with clients and internal stakeholders to understand business needs and define solution approaches.
- Ensure proper documentation of processes, architectures, and code.
Technical Requirements:
- Strong understanding of data engineering fundamentals, including ETL/ELT processes, data modeling, and pipeline development.
- Proficiency in SQL and data warehousing concepts, including dimensional modeling and optimization techniques.
- Experience with big data technologies and distributed computing frameworks.
- Hands-on experience with at least one major cloud provider (AWS, GCP, or Azure) and its data services.
- Knowledge of on-premises data infrastructure setup and maintenance.
- Understanding of data governance, security, and compliance requirements.
- Familiarity with AI/ML workflows and deployment patterns.
- Experience with BI and reporting tools for data visualization and insights delivery.

Management Skills:
- Proven experience leading technical teams of 4+ members.
- Strong project estimation and resource allocation capabilities.
- Excellent code and design review skills.
- Ability to manage competing priorities and deliver projects on schedule.
- Effective communication skills to bridge technical concepts with business objectives.
- Problem-solving mindset with the ability to remove blockers for the team.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5+ years of experience in data engineering or related roles.
- 2+ years of team leadership or management experience.
- Demonstrated success in delivering complex data projects.
- Certification in relevant cloud platforms or data technologies is a plus.

What We Offer:
- Opportunity to lead cutting-edge data projects for diverse clients.
- Professional development and a technical growth path.
- A collaborative work environment that values innovation.
- Competitive salary and benefits package.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

4 - 7 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Job Summary: We are seeking a skilled Data Scientist with experience in both Python and SAS to join our high-performing analytics team. The ideal candidate will be adept at developing predictive models, analyzing large datasets, and delivering actionable insights using modern data science tools and platforms.

Key Responsibilities:
- Design and develop machine learning and statistical models using Python and SAS.
- Conduct data exploration, preprocessing, and analysis on structured and unstructured datasets.
- Use SAS (Base, Advanced, Enterprise Guide, Visual Analytics, or SAS Viya) for reporting, data preparation, and statistical modeling as required.
- Work with large-scale datasets using Python libraries such as Pandas, NumPy, Scikit-learn, and TensorFlow.
- Translate business requirements into technical solutions in collaboration with cross-functional teams.
- Build data pipelines and automate workflows for data analysis and model deployment.
- Present data-driven insights through visualizations using tools such as Seaborn, Plotly, Matplotlib, or SAS VA.
- Document models, code, and methodologies for reproducibility and auditability.

Qualifications:
- 2-10 years of experience in data science, machine learning, or advanced analytics.
- Proficient in Python (Pandas, NumPy, Scikit-learn, TensorFlow, PyTorch, etc.) and SAS (Base, Advanced, DI, VA, or Viya).
- Strong knowledge of SQL and experience working with relational databases.
- Solid foundation in statistical analysis: hypothesis testing, regression, classification, clustering, etc.
- Experience in building, evaluating, and deploying predictive models.
- Excellent communication skills and the ability to convey complex findings to non-technical stakeholders.

Preferred Qualifications:
- Experience with big data tools (Spark, Hive, Hadoop).
- Exposure to MLOps tools (MLflow, Airflow, Docker, etc.).
- Familiarity with cloud platforms (AWS, Azure, GCP).
- Understanding of SAS integration with cloud or open-source tools.
- Experience with NLP, image recognition, or deep learning frameworks.

Location: Pan India - Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad.
Education: Bachelor's degree in Computer Science, Information Technology, or a related field.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

12 - 22 Lacs

Hyderabad, Bengaluru

Hybrid


What we ask:

Experience: 2-6 years of experience in Data Engineering roles.

Technical Skills:
- Proficiency in SQL, Python, and big data technologies (PySpark, Hive, Hadoop).
- Strong understanding of data pipelines.
- Familiarity with data visualization tools.
- Good understanding of ETL pipelines.
- Good experience in data modelling.

Communication Skills:
- Ability to communicate complex technical concepts.
- Strong collaborative and team-oriented mindset.

We would be excited if you have:
- Excellent communication and interpersonal skills.
- Ability to meet deadlines and manage project delivery.
- Excellent report-writing and presentation skills.
- Critical thinking and problem-solving capabilities.

What's in it for you?

A Happy Workplace! We create an environment where everyone feels welcome; we are more than just co-workers, sharing an informal and fun workplace. Our teams are highly adaptive, and our dynamic culture pushes everyone to create success in all dimensions.

Lucrative Packages and Perks: At Indium we recognize your talent and offer competitive salaries better than market standards. In addition to appraisals, rewards, and recognition programs conducted regularly, we have performance bonuses, sign-on bonuses, and joining bonuses to value your contributions and success for yourself and Indium.

Your Health is Our Priority! A healthy and happy workforce is important to us, so we ensure that you and your dependents are covered under our Medical Insurance Policy. From 1:1 counselling sessions for your mental well-being to fun-filled fitness initiatives, we ensure you stay healthy and happy!

Skill Up to Scale Up: We believe in continuous learning as part of our core values, so we provide excellent training initiatives along with access to our mainspring learning platform, Indium Academy, to keep you equipped with the technical skills needed for greater success.
Work-Life Balance: With a flexible hybrid working culture, a 5-day work week, and lots of fun de-stressing initiatives, we create a positive and relaxed environment to work in!

Posted 3 weeks ago

Apply

10.0 - 14.0 years

35 - 40 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office


Experience: 10 to 14 years. Work from office all days. Presales experience required in the Data Engineering, Data Analytics, and BI domains. Job Location: Hyderabad, Indore, Ahmedabad.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

20 - 32 Lacs

Bengaluru

Hybrid


Job Title: Senior Data Engineer
Experience: 9+ Years
Location: Whitefield, Bangalore
Notice Period: Serving or immediate joiners.

Role & Responsibilities:
- Design and implement scalable data pipelines for ingesting, transforming, and loading data from diverse sources and tools.
- Develop robust data models to support analytical and reporting requirements.
- Automate data engineering processes using appropriate scripting languages and frameworks.
- Collaborate with engineers, process managers, and data scientists to gather requirements and deliver effective data solutions.
- Serve as a liaison between engineering and business teams on all data-related initiatives.
- Automate monitoring and alerting for data pipelines, products, and dashboards; provide support for issue resolution, including on-call responsibilities.
- Write optimized and modular SQL queries, including view and table creation as required.
- Define and implement best practices for data validation, ensuring alignment with enterprise standards.
- Manage QA data environments, including test data creation and maintenance.

Qualifications:
- 9+ years of experience in data engineering or a related field.
- Proven experience with Agile software development practices.
- Strong SQL skills and experience working with both RDBMS and NoSQL databases.
- Hands-on experience with cloud-based data warehousing platforms such as Snowflake and Amazon Redshift.
- Proficiency with cloud technologies, preferably AWS.
- Deep knowledge of data modeling, data warehousing, and data lake concepts.
- Practical experience with ETL/ELT tools and frameworks.
- 5+ years of experience in application development using Python, SQL, Scala, or Java.
- Experience working with real-time data streaming and associated platforms.

Note: The professional should be based in Bangalore, as one technical round has to be taken face-to-face at the Bellandur, Bangalore office.
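One responsibility listed here is writing modular SQL, including view creation. A small sketch using Python's built-in sqlite3 (the orders table and revenue_by_region view are illustrative names only) shows why a view helps: reports query a stable name instead of repeating the aggregation everywhere:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "west", 100.0), (2, "west", 50.0), (3, "east", 75.0)])

# The view encapsulates the GROUP BY, so downstream queries and dashboards
# depend on a named, versionable asset rather than copy-pasted SQL.
conn.execute("""
    CREATE VIEW revenue_by_region AS
    SELECT region, SUM(amount) AS revenue
    FROM orders
    GROUP BY region
""")

result = dict(conn.execute("SELECT region, revenue FROM revenue_by_region").fetchall())
print(result)
```

The same pattern applies on Snowflake or Redshift, where views (or materialized views) are the usual unit of modular, reusable SQL.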

Posted 3 weeks ago

Apply

17.0 - 19.0 years

55 - 60 Lacs

Hyderabad

Work from Office


Position Summary: The Software Development Associate Director provides hands-on leadership, management, and thought leadership for a Delivery organization enabling Cigna's Technology teams. This individual will lead a team based in our Hyderabad Innovation Hub to deliver innovative solutions supporting multiple business and technology domains within Cigna, including the Sales & Underwriting, Producer, Service Operations, and Pharmacy business lines, as well as testing and DevOps enablement. The focus of the team is to build innovative go-to-market solutions enabling the business while modernizing our existing asset base to support business growth. The Technology strategy is aligned to our business strategy, and the candidate will not only influence technology direction but also establish our team through recruiting and mentoring employees and vendor resources. This is a hands-on position with visibility to the highest levels of the Cigna Technology team. This leader will focus on enabling innovation using the latest technologies and development techniques, and will foster rapidly building out a scalable delivery organization that aligns with all areas of the Technology team. The ideal candidate will be able to attract and develop talent in a highly dynamic environment.

Job Description & Responsibilities: Provide leadership, vision, and design direction for the quality and development of the US Medical and Health Services Technology teams based at the Hyderabad Innovation Hub (HIH). Work in close coordination with leaders and teams based in the United States, as well as contractors employed by the US Medical and Health Services Technology team based both within and outside of the United States, to deliver products and capabilities in support of Cigna's business lines.
Provide leadership to HIH leaders and teams, ensuring the team meets the following objectives:
- Design, configuration, implementation, application design/development, and quality engineering within the supported technologies and products.
- A hands-on people manager with experience leading agile teams of highly talented technology professionals developing large solutions and internal-facing applications, working closely with developers, quality engineers, technical project managers, principal engineers, and business stakeholders to ensure that application solutions meet business and customer requirements.
- A servant-leader mentality and a history of creating an inclusive environment, fostering diverse views and approaches from the team, and coaching and mentoring them to thrive in a dynamic workplace.
- A history of embracing and incubating emerging technology and open-source products.
- A passion for building highly resilient, scalable, and available platforms, rich reusable foundational capabilities, and a seamless developer experience, while focusing on strategic vision and technology roadmap delivery in an MVP, iterative, fast-paced approach.
- Accountability for driving towards timely decisions, influencing across engineering and delivery teams to meet project timelines while balancing the destination state.
- Ensure engineering solutions align with the Technology strategy and support the applications' requirements.
- Plan and implement procedures that will maximize engineering and operating efficiency for application integration technologies; identify and drive process improvement opportunities.
- Proactive monitoring and management design of supported assets, assuring performance, availability, security, and capacity.
- Maximize the efficiency (operational, performance, and cost) of the application assets.
Experience Required:
- 17 to 19 years of IT and business/industry or equivalent experience preferred, with at least 5 years in a leadership role responsible for the delivery of large-scale projects and programs.
- Leadership, cross-cultural communication, and familiarity with a wide range of technologies and stakeholders.
- Strong emotional intelligence, with the ability to foster collaboration across geographically dispersed teams.

Experience Desired:
- Recognized leader with a proven track record of delivering software engineering initiatives and cross-IT/business initiatives.
- Proven experience leading and managing technical teams, with a passion for developing talent within the team.
- Experience with vendor management in an onshore/offshore model.
- Experience in Healthcare, Pharmacy, and/or Underwriting systems.
- Experience with AWS.

Education and Training Required: B.S. degree in Computer Science, Information Systems, or another related degree; industry certifications such as AWS Solution Architect, PMP, Scrum Master, or Six Sigma Green Belt are also ideal.

Primary Skills:
- Familiarity with most of the following application development technologies: Python, RESTful services, React, Angular, Postgres, and MySQL (relational database management systems).
- Familiarity with most of the following data engineering technologies: Databricks, Spark, PySpark, SQL, Teradata, and multi-cloud environments.
- Familiarity with most of the following cloud and emerging technologies: AWS, LLMs (OpenAI, Anthropic), vector databases (Pinecone, Milvus), graph databases (Neo4j, JanusGraph, Neptune), prompt engineering, and fine-tuning AI models.
- Familiarity with the enterprise software development lifecycle, including production reviews and ticket resolution, navigating freeze/stability periods effectively, total-cost-of-ownership reporting, and updating applications to align with evolving security and cloud standards.
- Familiarity with agile methodology, including SCRUM team leadership or Scaled Agile (SAFe).
- Familiarity with modern delivery practices such as continuous integration, behavior/test-driven development, and specification by example.
- Deep people and matrix management skills, with a heavy emphasis on coaching and mentoring less senior staff, and a strong ability to influence VP-level leaders.
- Proven ability to resolve issues and mitigate risks that could undermine the delivery of critical initiatives.
- Strong written and verbal communication skills, with the ability to interact with all levels of the organization.
- Strong influencing/negotiation skills, interpersonal/relationship management skills, and time and project management skills.

Posted 3 weeks ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office


Design, develop, and implement data solutions using Microsoft Fabric, including data pipelines, data warehouses, and data marts. Develop data pipelines, data transformations, and data workflows using Microsoft Fabric.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

10 - 15 Lacs

Bangalore Rural

Work from Office


A candidate with an understanding of distributed computing and experience with SQL, Spark, and ETL. Experience using databases such as MySQL, PostgreSQL, or Oracle is required, as is AWS- or Data Factory-based ETL on Azure.

Posted 3 weeks ago

Apply

2.0 - 7.0 years

4 - 8 Lacs

Bengaluru

Work from Office


We are looking for experienced Software Engineers who can help design and own building, deploying, and optimizing the streaming infrastructure. This project has a directive from engineering leadership to make Okta a leader in the use of data and machine learning to improve end-user security, and to expand that core competency across the rest of engineering. You will have a sizable impact on the direction, design, and implementation of the solutions to these problems.

Job Duties and Responsibilities:
- Design, implement, and own data-intensive, high-performance, scalable platform components.
- Work with engineering teams, architects, and cross-functional partners on the development, design, and implementation of projects.
- Conduct and participate in design reviews, code reviews, analysis, and performance tuning.
- Coach and mentor engineers to help scale up the engineering organization.
- Debug production issues across services and multiple levels of the stack.

Required Knowledge, Skills, and Abilities:
- 2+ years of software development experience.
- Proficient in at least one backend language and comfortable in more than one, preferably Java or TypeScript, Ruby, GoLang, or Python.
- Experience working with at least one of the database technologies: MySQL, Redis, or PostgreSQL.
- Demonstrable knowledge of computer science fundamentals, with strong API design skills.
- Comfortable working on a geographically distributed extended team.
- Brings the right attitude to the team: ownership, accountability, attention to detail, and customer focus.
- Track record of delivering work incrementally to get feedback and iterating over solutions.
- Comfortable in React or similar front-end UI stacks; if not comfortable yet, willing to learn.

Nice to Have:
- Experience using cloud-based distributed computing technologies, such as:
  - Messaging systems such as Kinesis or Kafka.
  - Data processing systems like Flink, Spark, or Beam.
  - Storage and compute systems such as Snowflake or Hadoop.
  - Coordinators and schedulers like those in Kubernetes, Hadoop, or Mesos.
- Maintained security, encryption, identity management, or authentication infrastructure.
- Leveraged major public cloud providers to build mission-critical, high-volume services.
- Hands-on experience developing data integration applications for large-scale (petabyte-scale) environments, covering both batch and online systems.
- Contributed to the development of distributed systems, or used one or more at high volume or criticality, such as Kafka or Hadoop.
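The streaming systems mentioned (Flink, Spark, Beam) all build on windowed aggregation. A pure-Python sketch of a tumbling (fixed, non-overlapping) window count conveys the core idea without depending on any of those frameworks; the event data below is made up for the example:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_seconds):
    """Group (timestamp, key) events into fixed, non-overlapping windows
    and count occurrences per (window_start, key) pair."""
    counts = defaultdict(int)
    for ts, key in events:
        # Integer division maps each timestamp to the start of its window.
        window_start = (ts // window_seconds) * window_seconds
        counts[(window_start, key)] += 1
    return dict(counts)

# Four events over 13 seconds, bucketed into 5-second windows.
events = [(0, "login"), (3, "login"), (7, "logout"), (12, "login")]
print(tumbling_window_counts(events, 5))
```

Real streaming engines add what this sketch omits: out-of-order event handling via watermarks, state checkpointing, and distributed execution.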

Posted 3 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office


Sr Analytics Engineer

We are looking for an experienced Analytics Engineer to join Okta's enterprise data team. This analyst will have a strong background in SaaS subscription and product analytics, a passion for providing customer usage insights to internal stakeholders, and experience organizing complex data into consumable data assets. In this role, you will focus on subscription analytics and product utilization insights, partnering with Product, Engineering, Customer Success, and Pricing to implement enhancements and build end-to-end customer subscription insights into new products.

Requirements:
- Experience in customer analytics, product analytics, and go-to-market analytics.
- Experience in the SaaS business and Product domains, as well as Salesforce.
- Proficiency in SQL, ETL tools, GitHub, and data integration technologies, including familiarity with data modeling techniques, database design, and query optimization.
- Experience in data languages such as R and Python; knowledge of data processing frameworks like PySpark is also beneficial.
- Experience working with cloud-based data solutions such as AWS or Google Cloud Platform and cloud-based data warehousing tools such as Snowflake.
- Strong analytical and problem-solving skills to understand complex data problems and provide effective solutions.
- Experience building reports and visualizations to represent data in Tableau or Looker.
- Ability to communicate effectively with stakeholders, work cross-functionally, and communicate with technical and non-technical teams.
- Familiarity with the Scrum operating model and tracking work via a tool such as Jira.
- 6+ years in data engineering, data warehousing, or business intelligence.
- BS in computer science, data science, statistics, mathematics, or a related field.

Responsibilities:
- Engage with Product and Engineering to implement product definitions into subscription and product analytics, building new insights and updating existing key data products.
- Analyze a variety of data sources, structures, and metadata, and develop mappings, transformation rules, aggregations, and ETL specifications.
- Configure scalable and reliable data pipelines to consume, integrate, and analyze large volumes of complex data from different sources to support the growing needs of subscription and product analytics.
- Partner with internal stakeholders to understand user needs and implement user feedback, and develop reporting and dashboards focused on subscription analytics.
- Work closely with other Analytics team members to optimize data self-service, reusability, and performance, and ensure the validity of the source of truth.
- Enhance reusable knowledge of the models and metrics through documentation and use of the data catalog.
- Ensure data security and compliance by implementing appropriate data access controls, encryption, and auditing mechanisms.
- Take ownership of successful completion of project activities.

Nice to Have:
- Experience in data science and AI/ML concepts and techniques.

Posted 3 weeks ago

Apply

8.0 - 10.0 years

10 - 14 Lacs

Hyderabad

Work from Office


What you will do: We are seeking a seasoned Principal Data Engineer to lead the design, development, and implementation of our data strategy. The ideal candidate possesses a deep understanding of data engineering principles, coupled with strong leadership and problem-solving skills. As a Principal Data Engineer, you will architect and oversee the development of robust data platforms while mentoring and guiding a team of data engineers.

Roles & Responsibilities:
- Possess strong rapid prototyping skills and quickly translate concepts into working code.
- Provide expert guidance and mentorship to the data engineering team, fostering a culture of innovation and standard methodologies.
- Design, develop, and implement robust data architectures and platforms to support business objectives.
- Oversee the development and optimization of data pipelines and data integration solutions.
- Establish and maintain data governance policies and standards to ensure data quality, security, and compliance.
- Architect and manage cloud-based data solutions, using AWS or other preferred platforms.
- Lead and motivate an impactful data engineering team to deliver exceptional results.
- Identify, analyze, and resolve complex data-related challenges.
- Collaborate closely with business stakeholders to understand data requirements and translate them into technical solutions.
- Stay abreast of emerging data technologies and explore opportunities for innovation.

Basic Qualifications:
- Master's degree and 8 to 10 years of experience (computer science and engineering preferred; other engineering fields considered), OR Bachelor's degree and 10 to 14 years of such experience, OR Diploma and 14 to 18 years of such experience.
- Demonstrated proficiency in using cloud platforms (AWS, Azure, GCP) for data engineering solutions.
- Strong understanding of cloud architecture principles and cost optimization strategies.
- Proficient in Python, PySpark, and SQL.
- Hands-on experience with big data ETL performance tuning.
- Proven ability to lead and develop impactful data engineering teams.
- Strong problem-solving, analytical, and critical thinking skills to address complex data challenges.

Preferred Qualifications:
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with Apache Spark and Apache Airflow.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
- Experience with AWS, GCP, or Azure cloud services.

Professional Certifications:
- AWS Certified Data Engineer preferred.
- Databricks certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

15 - 27 Lacs

Mumbai, Pune, Bengaluru

Work from Office


Role & Responsibilities: Primarily looking for a Data Engineer (AWS) with expertise in processing data pipelines using Databricks and PySpark SQL on cloud distributions like AWS. Must have Databricks; good to have PySpark, Snowflake, and Talend.

Requirements:
- Expertise in processing data pipelines using Databricks Spark SQL on Hadoop distributions like AWS EMR, Databricks, Cloudera, etc.
- Very proficient in large-scale data operations using Databricks, and overall very comfortable using Python.
- Familiarity with AWS compute, storage, and IAM concepts.
- Experience working with S3 Data Lake as the storage tier.
- Any ETL background (Talend, AWS Glue, etc.) is a plus but not required.
- Cloud warehouse experience (Snowflake, etc.) is a huge plus.
- Carefully evaluates alternative risks and solutions before taking action.
- Optimizes the use of all available resources.
- Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.

Skills:
- Hands-on experience with Databricks, Spark SQL, and the AWS Cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
- Experience with shell scripting.
- Exceptionally strong analytical and problem-solving skills.
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
- Strong experience with relational databases and data access methods, especially SQL.
- Excellent collaboration and cross-functional leadership skills.
- Excellent communication skills, both written and verbal.
- Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
- Ability to leverage data assets to respond to complex questions that require timely answers.
- Working knowledge of migrating relational and dimensional databases to the AWS Cloud platform.

Mandatory Skills: Apache Spark, Databricks, Java, Python, Scala, Spark SQL.

Note: Only immediate joiners or candidates serving their notice period. Interested candidates can apply.

Regards,
HR Manager

Posted 3 weeks ago

3.0 - 5.0 years

9 - 15 Lacs

Bengaluru

Remote

Develop applications using the Python programming language across the full software development life cycle (test/deploy). Conduct thorough testing of applications. Deploy applications and provide post-deployment support. Upgrade software programs regularly.
Required Candidate profile: Utilize data. Develop comprehensive documentation for tools, strategies, and data pipelines while ensuring their security. Build applications, design software, write code, and ensure applications are functional. Data Engineering

Posted 3 weeks ago

6.0 - 8.0 years

8 - 10 Lacs

Hyderabad

Work from Office

Overview In this role, we are seeking an Associate Manager Offshore Program & Delivery Management to oversee program execution, governance, and service delivery across DataOps, BIOps, AIOps, MLOps, Data IntegrationOps, SRE, and Value Delivery programs. This role requires expertise in offshore execution, cost optimization, automation strategies, and cross-functional collaboration to enhance operational excellence. Manage and support DataOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy. Assist in real-time monitoring, automated alerting, and self-healing mechanisms to improve system reliability and performance. Contribute to the development and enforcement of governance models and operational frameworks to streamline service delivery and execution roadmaps. Support the standardization and automation of pipeline workflows, report generation, and dashboard refreshes to enhance efficiency. Collaborate with global teams to support Data & Analytics transformation efforts and ensure sustainable, scalable, and cost-effective operations. Assist in proactive issue identification and self-healing automation, enhancing the sustainment capabilities of the PepsiCo Data Estate. Responsibilities Support DataOps and SRE operations, assisting in offshore delivery of DataOps, BIOps, Data IntegrationOps, and related initiatives. Assist in implementing governance frameworks, tracking KPIs, and ensuring adherence to operational SLAs. Contribute to process standardization and automation efforts, improving service efficiency and scalability. Collaborate with onshore teams and business stakeholders, ensuring alignment of offshore activities with business needs. Monitor and optimize resource utilization, leveraging automation and analytics to improve productivity. Support continuous improvement efforts, identifying operational risks and ensuring compliance with security and governance policies. 
Assist in managing day-to-day DataOps activities, including incident resolution, SLA adherence, and stakeholder engagement. Participate in Agile work intake and management processes, contributing to strategic execution within data platform teams. Provide operational support for cloud infrastructure and data services, ensuring high availability and performance. Document and enhance operational policies and crisis management functions, supporting rapid incident response. Promote a customer-centric approach, ensuring high service quality and proactive issue resolution. Assist in team development efforts, fostering a collaborative and agile work environment. Adapt to changing priorities, supporting teams in maintaining focus on key deliverables.
Qualifications
6+ years of technology experience in a global organization, preferably in the CPG industry.
4+ years of experience in Data & Analytics, with a foundational understanding of data engineering, data management, and operations.
3+ years of cross-functional IT experience, working with diverse teams and stakeholders.
1-2 years of leadership or coordination experience, supporting team operations and service delivery.
Strong communication and collaboration skills, with the ability to convey technical concepts to non-technical audiences.
Customer-focused mindset, ensuring high-quality service and responsiveness to business needs.
Experience in supporting technical operations for enterprise data platforms, preferably in a Microsoft Azure environment.
Basic understanding of Site Reliability Engineering (SRE) practices, including incident response, monitoring, and automation.
Ability to drive operational stability, supporting proactive issue resolution and performance optimization.
Strong analytical and problem-solving skills, with a continuous improvement mindset.
Experience working in large-scale, data-driven environments, ensuring smooth operations of business-critical solutions.
Ability to support governance and compliance initiatives, ensuring adherence to data standards and best practices. Familiarity with data acquisition, cataloging, and data management tools. Strong organizational skills, with the ability to manage multiple priorities effectively.
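The real-time monitoring and automated-alerting duties described above usually come down to comparing pipeline run metrics against SLA thresholds. A minimal sketch; the metric names and threshold values are purely illustrative, not any actual SLA:

```python
# Illustrative SLA thresholds: max pipeline latency and data staleness.
SLA = {"max_latency_s": 900, "max_staleness_s": 3600}

def check_run(metrics):
    """Return a list of alert strings for every SLA the run breaches."""
    alerts = []
    if metrics["latency_s"] > SLA["max_latency_s"]:
        alerts.append(f"{metrics['pipeline']}: latency {metrics['latency_s']}s over SLA")
    if metrics["staleness_s"] > SLA["max_staleness_s"]:
        alerts.append(f"{metrics['pipeline']}: data {metrics['staleness_s']}s stale")
    return alerts

runs = [
    {"pipeline": "sales_daily", "latency_s": 1200, "staleness_s": 600},
    {"pipeline": "inventory_hourly", "latency_s": 300, "staleness_s": 200},
]
alerts = [a for r in runs for a in check_run(r)]
print(alerts)  # only sales_daily breaches (latency 1200s > 900s)
```

In production the same check would typically run inside an orchestrator and page on-call rather than print, and "self-healing" means wiring the alert to an automated retry or backfill.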

Posted 3 weeks ago

3.0 - 6.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Purpose of the role
Reporting to the Engineering Data Shared Services DL, and working closely with other Digital Transformation teams, Business Process Owners, Data Owners, and end users, you will:
Be responsible for ensuring consistency of master data in compliance with core business rules
Contribute to defining the data standards and data quality criteria
Manage critical activities in the process
Be the subject matter expert and share knowledge
Responsibilities
Create and enforce Standard, Specific & Design parts for effective data management
Formulate techniques for quality data collection to ensure adequacy, accuracy, and legitimacy of data
Devise and implement efficient and secure procedures for data handling and analysis, with attention to all technical aspects
Support others in the daily use of data systems and ensure adherence to legal and company standards
Assist with reports and data extraction when needed
Monitor and analyze information and data systems and evaluate their performance to discover ways of enhancing them (new technologies, upgrades, etc.)
Troubleshoot data-related problems and authorize maintenance or modifications
Manage all incoming data files
Continually develop data management strategies
Analyze and validate master data during rollouts
Raise incident tickets and work closely with other IT operations teams to resolve MDM issues
Be resilient and strive to take the team to the next level by highlighting roadblocks to management
Critical Challenges
Métiers facing transformation challenges while business continuity must be maintained in Regions
Complex end-to-end data flows with many cross-data dependencies
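Data quality criteria like those above are typically enforced as named rule functions run against each master-data record. A minimal sketch; the field names and the rule set are hypothetical examples, not this employer's actual standards:

```python
import re

# Hypothetical master-data rules: required fields present, code format valid.
RULES = [
    ("missing material description", lambda r: bool(r.get("description"))),
    ("invalid material code", lambda r: bool(re.fullmatch(r"MAT-\d{6}", r.get("code", "")))),
]

def validate(record):
    """Return the names of every rule the record violates."""
    return [name for name, ok in RULES if not ok(record)]

good = {"code": "MAT-000123", "description": "Hex bolt M8"}
bad = {"code": "123", "description": ""}
print(validate(good))  # []
print(validate(bad))   # both rules violated
```

During a rollout, running such checks over the incoming files and raising incident tickets for the violations is exactly the "analyze & validate master data" step the posting describes.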

Posted 3 weeks ago

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Work from Office

Overview As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build & operations and drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications into public cloud environments directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners and business users. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. Responsibilities Be a founding member of the data engineering team. Help to attract talent to the team by networking with your peers, by representing PepsiCo HBS at conferences and other events, and by discussing our values and best practices when interviewing candidates. Own data pipeline development end-to-end, spanning data modeling, testing, scalability, operability and ongoing metrics. Ensure that we build high quality software by reviewing peer code check-ins. Define best practices for product development, engineering, and coding as part of a world class engineering team. Collaborate in architecture discussions and architectural decision making that is part of continually improving and expanding these platforms. Lead feature development in collaboration with other engineers; validate requirements / stories, assess current system capabilities, and decompose feature requirements into engineering tasks. 
Focus on delivering high quality data pipelines and tools through careful analysis of system capabilities and feature requests, peer reviews, test automation, and collaboration with other engineers. Develop software in short iterations to quickly add business value. Introduce new tools / practices to improve data and code quality; this includes researching / sourcing 3rd party tools and libraries, as well as developing tools in-house to improve workflow and quality for all data engineers. Support data pipelines developed by your team through good exception handling, monitoring, and, when needed, by debugging production issues.
Qualifications
6-9 years of overall technology experience that includes at least 5+ years of hands-on software development, data engineering, and systems architecture.
4+ years of experience in SQL optimization and performance tuning.
Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines.
Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
Experience with data profiling and data quality tools like Apache Griffin, Deequ, or Great Expectations.
Current skills in the following technologies:
Python
Orchestration platforms: Airflow, Luigi, Databricks, or similar
Relational databases: Postgres, MySQL, or equivalents
MPP data systems: Snowflake, Redshift, Synapse, or similar
Cloud platforms: AWS, Azure, or similar
Version control (e.g., GitHub) and familiarity with deployment and CI/CD tools.
Fluency with Agile processes and tools such as Jira or Pivotal Tracker.
Experience running and scaling applications on cloud infrastructure and containerized services like Kubernetes is a plus.
Understanding of metadata management, data lineage, and data glossaries is a plus.
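The data quality tools named in this posting (Deequ, Great Expectations, Apache Griffin) all center on declarative expectations evaluated over a dataset. A pure-Python sketch of that idea, with hypothetical column names; the real libraries add profiling, reporting, and scale:

```python
# Declarative expectations over rows, in the spirit of Great Expectations/Deequ.
rows = [
    {"order_id": 1, "qty": 2},
    {"order_id": 2, "qty": 5},
    {"order_id": 3, "qty": 1},
]

def expect_not_null(rows, col):
    """Expectation: no row has a null in the given column."""
    return all(r.get(col) is not None for r in rows)

def expect_between(rows, col, lo, hi):
    """Expectation: every value in the column falls inside [lo, hi]."""
    return all(lo <= r[col] <= hi for r in rows)

results = {
    "order_id not null": expect_not_null(rows, "order_id"),
    "qty in [1, 10]": expect_between(rows, "qty", 1, 10),
}
print(results)
```

In a pipeline, a failed expectation would typically fail the run or quarantine the offending partition before it reaches downstream consumers.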

Posted 3 weeks ago

10.0 - 15.0 years

32 - 35 Lacs

Hyderabad

Work from Office

Responsibilities
Lead the migration and modernization of data platforms, moving applications and pipelines to Azure-based solutions.
Actively contribute to code development in projects and services.
Manage and scale data pipelines from internal and external data sources to support new product launches and ensure high data quality.
Develop automation and monitoring frameworks to capture key metrics and operational KPIs for pipeline performance.
Implement best practices around systems integration, security, performance, and data management.
Collaborate with internal teams, including data science and product teams, to drive solutioning and proof-of-concept (PoC) discussions.
Develop and optimize procedures to transition data into production.
Define and manage SLAs for data products and operational processes.
Prototype and build scalable solutions for data engineering and analytics.
Research and apply state-of-the-art methodologies in data and platform engineering.
Create and maintain technical documentation for knowledge sharing.
Develop reusable packages and libraries to enhance development efficiency.
Qualifications
Bachelor's degree in Computer Science, MIS, Business Management, or a related field
10+ years of experience in Information Technology
4+ years of Azure, AWS, and cloud technologies
Experience in data platform engineering, with a focus on cloud transformation and modernization.
Strong knowledge of Azure services, including Databricks, Azure Data Factory, Synapse Analytics, and Azure DevOps (ADO).
Proficiency in SQL, Python, and Spark for data engineering tasks.
Hands-on experience building and scaling data pipelines in cloud environments.
Experience with CI/CD pipeline management in Azure DevOps (ADO).
Understanding of data governance, security, and compliance best practices.
Experience working in an Agile development environment.
Prior experience in migrating applications from legacy platforms to the cloud.
Knowledge of Terraform or Infrastructure-as-Code (IaC) for cloud resource management.
Familiarity with Kafka, Event Hubs, or other real-time data streaming solutions.
Experience with legacy RDBMS (Oracle, DB2, Teradata).
Background in supporting data science models in production.

Posted 3 weeks ago

11.0 - 15.0 years

35 - 40 Lacs

Hyderabad

Work from Office

Overview
Primary focus will be to lead development work within the Azure Data Lake environment and other related ETL technologies, with responsibility for ensuring on-time and on-budget delivery, satisfying project requirements while adhering to enterprise architecture standards. The role will lead key data lake projects and resources, including innovation-related initiatives (e.g. adoption of technologies like Databricks, Presto, Denodo, Python, Azure Data Factory; database encryption; enabling rapid experimentation, etc.). This role will also have L3 and release management responsibilities for ETL processes.
Responsibilities
Lead delivery of key Enterprise Data Warehouse and Azure Data Lake projects within time and budget
Drive solution design and build to ensure scalability, performance, and reuse of data and other components
Ensure on-time and on-budget delivery which satisfies project requirements, while adhering to enterprise architecture standards
Manage work intake, prioritization, and release timing, balancing demand and available resources
Ensure tactical initiatives are aligned with the strategic vision and business needs
Oversee coordination and partnerships with Business Relationship Managers, Architecture, and IT services teams to develop and maintain EDW and data lake best practices and standards, along with appropriate quality assurance policies and procedures
May lead a team of employee and contract resources to meet build requirements:
o Set priorities for the team to ensure task completion
o Coordinate work activities with other IT services and business teams
o Hold the team accountable for milestone deliverables
o Provide L3 support for existing applications
o Release management
Qualifications
Bachelor's degree in Computer Science, MIS, Business Management, or a related field
11+ years of experience in Information Technology or Business Relationship Management
7+ years of experience in Data Warehouse/Azure Data Lake
3 years of experience in Azure Data Lake
2 years of experience in project management
Technical Skills
Thorough knowledge of data warehousing / data lake concepts
Hands-on experience with tools like Azure Data Factory, Databricks, PySpark, and other data management tools on Azure
Proven experience in managing Data, BI, or Analytics projects
Solutions delivery experience - expertise in the system development lifecycle, integration, and sustainability
Experience in data modeling or database experience
Non-Technical Skills
Excellent remote collaboration skills
Experience working in a matrix organization with diverse priorities
Experience dealing with and managing multiple vendors
Exceptional written and verbal communication skills, along with collaboration and listening skills
Ability to work with agile delivery methodologies
Ability to ideate requirements & design iteratively with business partners without formal requirements documentation
Ability to budget resources and funding to meet project deliverables

Posted 3 weeks ago

4.0 - 8.0 years

15 - 18 Lacs

Lucknow

Work from Office

Urgent Hiring for Data Engineers
Job Location: Lucknow (On-Site)
Experience: 4+ yrs (relevant)
Salary range: 15 - 18 LPA
No. of open positions: 10
Immediate joiners only
Job Overview: We are seeking experienced Data Engineers to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining large-scale data systems and pipelines. You will work closely with our data science team to prepare data for prescriptive and predictive modeling, ensuring high-quality data outputs.
Key Responsibilities:
- Analyze and organize raw data from various sources
- Build and maintain data systems and pipelines
- Prepare data for prescriptive and predictive modeling
- Combine raw information from different sources to generate valuable insights
- Enhance data quality and reliability
Requirements:
- 4+ years of experience as a Data Engineer or in a similar role
- Technical expertise in data models, data mining, and segmentation techniques
- Experience with cloud data technologies (Azure Data Factory, Databricks)
- Knowledge of CI/CD pipelines and Jenkins
- Strong programming skills in Python
- Hands-on experience with SQL databases

Posted 3 weeks ago

5.0 - 10.0 years

10 - 12 Lacs

Noida

Work from Office

Job Title: Data Warehouse Developer II
Location: Noida
Department: IT
Reports To: IT Supervisor/Manager/Director
Direct Reports: No
Job Summary: The Data Warehouse Developer is responsible for designing, developing, maintaining, and supporting data transformation, integration, and analytics solutions across both cloud and on-premises environments. This role also provides 24x7 support for global systems.
Key Responsibilities: Understand and translate business requirements into technical solutions. Develop, test, debug, document, and implement ETL processes. Ensure performance, scalability, reliability, and security of solutions. Work with structured and semi-structured data across multiple platforms. Participate in Agile practices, including daily SCRUM meetings. Collaborate with infrastructure teams, DBAs, and software developers. Adhere to corporate standards for databases, data engineering, and analytics. Provide accurate time estimates, communicate status, and flag risks. Work across the full SDLC (analysis to support) using Agile methodologies. Demonstrate motivation, self-drive, and strong communication skills. Perform other related duties as assigned.
Requirements
Education & Experience: Bachelor's degree or equivalent work experience. 5+ years in software development/data engineering roles. At least 2 years of dedicated data engineering experience preferred.
Technical Skills: Strong experience with data transformations and manipulation. Ability to design data stores for analytics and other needs. Familiarity with traditional and modern data architectures (e.g., data lakes). Hands-on experience with cloud-native data tools (Azure preferred; GCP is a plus). Proficiency in traditional Microsoft ETL tools: SSIS, SSRS, SSAS, Power BI. Experience with Azure Data Factory is a plus.
Soft Skills: Ability to present and document clearly. Self-motivated and independent. Strong partnership and credibility with stakeholders.
Work Environment: Standard office setting.
Use of standard office equipment.
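Working with semi-structured data, as this role requires, often means flattening nested JSON into warehouse-ready rows. A minimal sketch with a hypothetical order payload (field names are illustrative):

```python
import json

# Hypothetical semi-structured order payload, e.g. from an API or event feed.
payload = json.loads("""
{"order_id": 42, "customer": {"id": 7, "country": "IN"},
 "items": [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}]}
""")

# Flatten to one row per line item: the shape a fact table typically wants.
rows = [
    {
        "order_id": payload["order_id"],
        "customer_id": payload["customer"]["id"],
        "country": payload["customer"]["country"],
        "sku": item["sku"],
        "qty": item["qty"],
    }
    for item in payload["items"]
]
print(rows)
```

In an SSIS or Azure Data Factory pipeline the same flattening step would sit between the source extract and the warehouse load.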

Posted 3 weeks ago

5.0 - 7.0 years

32 - 40 Lacs

Bengaluru

Work from Office

Design, develop, and optimize large-scale data processing pipelines using PySpark. Work with various Apache tools and frameworks (like Hadoop, Hive, HDFS, etc.) to ingest, transform, and manage large datasets.

Posted 3 weeks ago

4.0 - 9.0 years

15 - 27 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Mandatory Skills: Java OR Python, SQL OR Oracle
Experience: 4 to 10 years | CTC: up to 28 LPA
4+ years working with advanced statistical methods such as regressions, classifiers, recommenders, anomaly detection, optimization algorithms, tree methods, neural nets, etc.
Required Candidate profile:
2+ years with relational or NoSQL databases
Development and implementation of data mining protocols, architectures, and models, as well as data analysis methodologies used to identify trends in large data sets

Posted 3 weeks ago

3.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

About PhonePe Group PhonePe is India s leading digital payments company with 50 crore (500 Million) registered users and 3.7 crore (37 Million) merchants covering over 99% of the postal codes across India. On the back of its leadership in digital payments, PhonePe has expanded into financial services (Insurance, Mutual Funds, Stock Broking, and Lending) as well as adjacent tech-enabled businesses such as Pincode for hyperlocal shopping and Indus App Store which is India's first localized App Store. The PhonePe Group is a portfolio of businesses aligned with the company's vision to offer every Indian an equal opportunity to accelerate their progress by unlocking the flow of money and access to services. Culture At PhonePe, we take extra care to make sure you give your best at work, Everyday! And creating the right environment for you is just one of the things we do. We empower people and trust them to do the right thing. Here, you own your work from start to finish, right from day one. Being enthusiastic about tech is a big part of being at PhonePe. If you like building technology that impacts millions, ideating with some of the best minds in the country and executing on your dreams with purpose and speed, join us! Responsibilities Primary job is to coordinate with product, compliance and engineering teams to ensure smooth functioning of all CRM channels - PN, sms, VMN, email, Push, RCS, in-app banners & WhatsApp campaigns. Own the business relation with channel partners for PhonePe platforms - PhonePe, Share.Market, Indus Appstore & Pincode Benchmark our capabilities and performance against third party tools and companies globally and share learnings with internal teams Collaborate with product development teams to chart a product roadmap in line with business expectations Work on automating repetitive campaigns with tech ops and CRM ops team. 
Design workflows, triggers, and alerts to minimize manual tasks Develop and present reports on merchandising and CRM performance to stakeholders. Track and measure campaign effectiveness to maximize ROI, delivery & conversions Collaborate with cross-functional teams to create targeted marketing campaigns and promotions Required Experience & Skill set: Engineering/MBA degree from a Tier 1/2 college Proven experience (3+ years) in CRM (or related field such as Martech) in a consumer tech company Should have worked on one of the CRM platforms like Clevertap, Moengage etc. Basic understanding of how data engineering, analytical queries & product teams work in a tech company is a must. Strong analytical ability and logical reasoning Proficiency in Excel, Google Sheets, Slides, and Docs. PhonePe Full Time Employee Benefits (Not applicable for Intern or Contract Roles) Insurance Benefits - Medical Insurance, Critical Illness Insurance, Accidental Insurance, Life Insurance Wellness Program - Employee Assistance Program, Onsite Medical Center, Emergency Support System Parental Support - Maternity Benefit, Paternity Benefit Program, Adoption Assistance Program, Day-care Support Program Mobility Benefits - Relocation benefits, Transfer Support Policy, Travel Policy Retirement Benefits - Employee PF Contribution, Flexible PF Contribution, Gratuity, NPS, Leave Encashment Other Benefits - Higher Education Assistance, Car Lease, Salary Advance Policy Working at PhonePe is a rewarding experience! Great people, a work environment that thrives on creativity, the opportunity to take on roles beyond a defined job description are just some of the reasons you should work with us. Read more about PhonePe on our blog . Life at PhonePe PhonePe in the news

Posted 3 weeks ago

4.0 - 9.0 years

6 - 10 Lacs

Bengaluru

Work from Office

About the Role: We are seeking a skilled and detail-oriented Data Migration Specialist with hands-on experience in Alteryx and Snowflake. The ideal candidate will be responsible for analyzing existing Alteryx workflows, documenting the logic and data transformation steps, and converting them into optimized, scalable SQL queries and processes in Snowflake. The ideal candidate should have solid SQL expertise and a strong understanding of data warehousing concepts. This role plays a critical part in our cloud modernization and data platform transformation initiatives.
Key Responsibilities:
Analyze and interpret complex Alteryx workflows to identify data sources, transformations, joins, filters, aggregations, and output steps.
Document the logical flow of each Alteryx workflow, including inputs, business logic, and outputs.
Translate Alteryx logic into equivalent SQL scripts optimized for Snowflake, ensuring accuracy and performance.
Write advanced SQL queries and stored procedures, and use Snowflake-specific features like Streams, Tasks, Cloning, Time Travel, and Zero-Copy Cloning.
Implement data ingestion strategies using Snowpipe, stages, and external tables.
Optimize Snowflake performance through query tuning, partitioning, clustering, and caching strategies.
Collaborate with data analysts, engineers, and stakeholders to validate transformed logic against expected results.
Handle data cleansing, enrichment, aggregation, and business logic implementation within Snowflake.
Suggest improvements and automation opportunities during migration.
Conduct unit testing and support UAT (User Acceptance Testing) for migrated workflows.
Maintain version control, documentation, and an audit trail for all converted workflows.
Required Skills:
Bachelor's or Master's degree in Computer Science, Information Technology, Data Science, or a related field.
At least 4 years of hands-on experience in designing and developing scalable data solutions using the Snowflake Data Cloud platform.
Extensive experience with Snowflake, including designing and implementing Snowflake-based solutions.
1+ years of experience with Alteryx Designer, including advanced workflow development and debugging.
Strong proficiency in SQL, with 3+ years specifically working with Snowflake or other cloud data warehouses.
Python programming experience focused on data engineering.
Experience with data APIs and batch/stream processing.
Solid understanding of data transformation logic like joins, unions, filters, formulas, aggregations, pivots, and transpositions.
Experience in performance tuning and optimization of SQL queries in Snowflake.
Familiarity with Snowflake features like CTEs, Window Functions, Tasks, Streams, Stages, and External Tables.
Exposure to migration or modernization projects from ETL tools (like Alteryx/Informatica) to SQL-based cloud platforms.
Strong documentation skills and attention to detail.
Experience working in Agile/Scrum development environments.
Good communication and collaboration skills.
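Translating an Alteryx Filter + Summarize pair into warehouse SQL, as this role requires, typically collapses two workflow tools into one WHERE/GROUP BY statement. A sketch with hypothetical table and column names; stdlib sqlite3 stands in for Snowflake here so the snippet runs anywhere (Snowflake-specific features like Streams or Time Travel have no local equivalent):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, status TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", "closed", 100.0), ("North", "open", 50.0), ("South", "closed", 75.0)],
)

# Alteryx: Filter tool (status = 'closed') -> Summarize tool (group by region,
# sum of amount) becomes a single SQL statement in the target warehouse:
rows = conn.execute(
    """
    SELECT region, SUM(amount) AS total_closed
    FROM sales
    WHERE status = 'closed'
    GROUP BY region
    ORDER BY region
    """
).fetchall()
print(rows)  # [('North', 100.0), ('South', 75.0)]
```

Validating the converted SQL against the original workflow's output on the same input, as done here row by row, is the core of the UAT step the posting mentions.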

Posted 3 weeks ago
