Principal Data Engineer (Logistics/Manufacturing)

Experience: 11 - 21 years

Salary: 35 - 80 Lacs

Platform: Naukri

Work Mode: Hybrid

Job Type: Full Time

Job Description

Job Title: Principal Data Engineer, Logistics

Employment Type: Full Time

Experience: 12+ Years

About the Role:

We are looking for a Principal Data Engineer to lead the design and delivery of scalable data solutions using Azure Data Factory and Azure Databricks. This is a consulting-focused role that requires strong technical expertise, stakeholder engagement, and architectural thinking.

You will work closely with business, functional, and technical teams to define data strategies, design robust pipelines, and ensure smooth delivery in an Agile environment.

Responsibilities

  • Collaborate with business and technology stakeholders to gather and understand data needs
  • Translate functional requirements into scalable and maintainable data architecture
  • Design and implement robust data pipelines (a brief sketch follows this list)
  • Lead data modeling, transformation, and performance optimization efforts
  • Ensure data quality, validation, and consistency
  • Participate in Agile ceremonies including sprint planning and backlog grooming
  • Support CI/CD automation for data pipelines and integration workflows
  • Mentor junior engineers and promote best practices in data engineering
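
For illustration only, the sketch below shows the kind of pipeline and data-quality work described above: a small Databricks (PySpark) job that validates, quarantines, and publishes logistics events. All paths, table names, and columns are hypothetical, and a Spark environment with Delta Lake is assumed; this is a sketch, not a prescribed implementation.

```python
# Minimal illustrative sketch only: paths, table names, and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("shipment_events_pipeline").getOrCreate()

# Ingest raw logistics events from a hypothetical landing zone.
raw = spark.read.json("/mnt/landing/shipment_events/")

# Validation: keep records that carry a business key and an event timestamp,
# and quarantine the rest instead of silently dropping them.
valid = raw.filter(F.col("shipment_id").isNotNull() & F.col("event_ts").isNotNull())
rejected = raw.subtract(valid)

# Standardise types and de-duplicate before publishing to the curated zone.
curated = (
    valid
    .withColumn("event_ts", F.to_timestamp("event_ts"))
    .dropDuplicates(["shipment_id", "event_ts"])
)

curated.write.mode("append").format("delta").save("/mnt/curated/shipment_events/")
rejected.write.mode("append").format("delta").save("/mnt/quarantine/shipment_events/")
```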

Must Have

  • 12+ years of IT experience, including at least 5 years in data architecture roles across modern, metadata-driven, cloud-based technologies, with a software engineering mindset
  • Strong analytical and problem-solving skills, including the ability to identify data patterns and perform root-cause analysis to resolve production issues
  • Excellent communication skills, with experience leading client-facing discussions
  • Strong hands-on experience with Azure Data Factory and Databricks, including custom solution design that goes beyond drag-and-drop capabilities for big data workloads
  • Demonstrated proficiency in SQL, Python, and Spark
  • Experience with CI/CD pipelines, version control, and DevOps tools
  • Experience applying dimensional and Data Vault methodologies (see the sketch after this list)
  • Background in working with Agile methodologies and sprint-based delivery
  • Ability to produce clear and comprehensive technical documentation
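
As an illustration of the Data Vault experience referenced above, the sketch below loads a hub table idempotently in PySpark. The object names (staging.shipments, vault.hub_shipment, shipment_id) are hypothetical, and a Databricks/Spark environment with Delta-backed tables is assumed; real implementations will differ.

```python
# Illustrative Data Vault hub load; all object names are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("hub_shipment_load").getOrCreate()

staged = spark.read.table("staging.shipments")

# One row per business key, with a hash key and standard load metadata.
hub_increment = (
    staged
    .select(F.col("shipment_id").cast("string").alias("shipment_bk"))
    .dropDuplicates(["shipment_bk"])
    .withColumn("hub_shipment_hk", F.sha2(F.col("shipment_bk"), 256))
    .withColumn("load_dts", F.current_timestamp())
    .withColumn("record_source", F.lit("staging.shipments"))
)

# Append only business keys not already present, keeping the load idempotent.
existing = spark.read.table("vault.hub_shipment").select("hub_shipment_hk")
new_keys = hub_increment.join(existing, on="hub_shipment_hk", how="left_anti")
new_keys.write.mode("append").saveAsTable("vault.hub_shipment")
```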

Nice to Have

  • Experience with Azure Synapse and Power BI
  • Experience with Microsoft Purview and/or Unity Catalog
  • Understanding of Data Lakehouse and Data Mesh concepts
  • Familiarity with enterprise data governance and quality frameworks
  • Manufacturing experience within the operations domain
