
Principal DevOps Engineer

Experience: 10 years


Posted: 9 hours ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Your IT Future, Delivered. Principal DevOps Engineer

With a global team of 5,600+ IT professionals, DHL IT Services connects people and keeps the global economy running by continuously innovating and creating sustainable digital solutions. We work beyond global borders and push boundaries across all dimensions of logistics. You can leave your mark shaping the technology backbone of the world's biggest logistics company. All our locations have earned the #GreatPlaceToWork certification, reflecting our commitment to an exceptional employee experience.

Digitalization. Simply delivered.

At DHL IT Services, we design, build, and run IT solutions for the whole of DPDHL globally.

The AI & Analytics team builds and runs solutions to get much more value out of our data. We help our business colleagues all over the world with machine learning algorithms, predictive models, and visualizations. We manage more than 46 AI & Big Data applications, 3,000 active users, 87 countries, and up to 100,000,000 daily transactions.

Integrating AI & Big Data into business processes to compete in a data-driven world requires state-of-the-art technology. Our infrastructure, hosted on-prem and in the cloud (Azure and GCP), includes MapR, Airflow, Spark, Kafka, Jupyter, Kubeflow, Jenkins, GitHub, Tableau, Power BI, Synapse (Analytics), Databricks, and further interesting tools.

We like to do everything in an Agile/DevOps way. No more throwing the "problem code" over the wall to support, no silos. Our teams are completely product-oriented, with end-to-end responsibility for the success of our product.

Grow together.

Currently, we are looking for a Principal DevOps Engineer. In this role, you will have the opportunity to design and develop solutions, contribute to roadmaps of Big Data architectures, and provide mentorship and feedback to more junior team members. We are looking for someone to help us manage the petabytes of data we have and turn them into value.
Does that sound a bit like you? Let's talk! Even if you don't tick all the boxes below, we'd love to hear from you; our new department is growing rapidly, and we're looking for many people with a can-do mindset to join us on our digitalization journey. Thank you for considering DHL as the next step in your career; we do believe we can make a difference together!

Ready to embark on the journey? Here’s what we are looking for:

  • University Degree in Computer Science, Information Systems, Business Administration, or related field.
  • 10+ years of IT experience, with 5+ years in Data Engineering.
  • Strong analytical skills related to working with structured, semi-structured, and unstructured datasets.
  • Hands-on experience implementing Data Lake/Big Data projects on on-premises and cloud platforms (preferably Azure/GCP).
  • Hands-on experience with Docker and Kubernetes and related ecosystem tooling on-prem and in public clouds.
  • Hands-on experience with public clouds (preferred: GCP, Azure), with specific focus on using them for Data Lakes.
  • Experience working with Big Data technologies, e.g. Spark, Kafka, HDFS, Hive, and Hadoop distributions (Cloudera or MapR).
  • Experience with streaming platforms/frameworks such as Kafka, Spark-Streaming, Flink.
  • Good programming skills (Java/Scala/Python).
  • Advanced working knowledge of SQL, experience with relational databases and query authoring, and working familiarity with a variety of databases.
  • Proven experience in building and optimizing big data pipelines, architectures and data sets.
  • Experience in performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement.
  • Experience in building processes supporting data transformation, data structures, metadata, dependency management, and workload management.
  • A successful history of manipulating, processing and extracting value from large disconnected datasets.
  • Experience with Git and CI/CD; containerization experience (Docker or OpenShift) is good to have.

You should have:

  • Certifications in some of the core technologies.
  • Ability to collaborate across different teams/geographies/stakeholders/levels of seniority.
  • Customer focus with an eye on continuous improvement.
  • Energetic, enthusiastic and results-oriented personality.
  • Ability to coach other team members; you must be a team player!
  • Strong will to overcome the complexities involved in developing and supporting data pipelines.

Language requirements:

  • English – Fluent spoken and written (C1 level)

An array of benefits for you:

  • Hybrid work arrangements to balance in-office collaboration and home flexibility.
  • Annual Leave: 42 days off apart from Public / National Holidays.
  • Medical insurance: self + spouse + 2 children, with an option to opt for voluntary parental insurance (parents/parents-in-law) at a nominal premium, covering pre-existing diseases.
  • In-house training programs: professional and technical training and certifications.
