Data Engineer (Location: Pan India)

Experience: 5 years

Posted: 2 weeks ago | Platform: LinkedIn

Work Mode: On-site

Job Type: Full Time

Job Description

Job Title: Data Engineer

Experience: 5+ Years

Location: Pan India

Mode: Hybrid


Skill Combination: Python AND AWS AND Databricks AND PySpark AND Elasticsearch

Key Responsibilities:

  • Design, implement, and maintain scalable data pipelines using Databricks, PySpark, and SQL.
  • Develop and optimize ETL processes leveraging services like AWS Glue, GCP DataProc/DataFlow, Azure ADF/ADLF, and Apache Spark.
  • Build, manage, and monitor Airflow DAGs to orchestrate data workflows.
  • Integrate and manage Elasticsearch for data indexing, querying, and analytics.
  • Write advanced SQL queries using window functions and analytics techniques.
  • Design data schemas and models that align with various business domains and use cases.
  • Optimize data warehousing performance and storage using best practices.
  • Ensure data security, governance, and compliance across all environments.
  • Apply data engineering design patterns and frameworks to build robust solutions.
  • Collaborate with Product, Data, and Engineering teams; support executive data needs.
  • Participate in Agile ceremonies and follow DevOps/DataOps/DevSecOps practices.
  • Respond to critical business issues as part of an on-call rotation.
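
The "advanced SQL with window functions" responsibility can be illustrated with a minimal, self-contained sketch. This uses Python's built-in sqlite3 module purely for demonstration; the `orders` table, its columns, and the data are hypothetical, not part of this role:

```python
import sqlite3

# In-memory database with an illustrative orders table (names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE orders (customer TEXT, order_date TEXT, amount REAL);
INSERT INTO orders VALUES
  ('acme',   '2024-01-05', 100.0),
  ('acme',   '2024-01-20', 250.0),
  ('acme',   '2024-02-02',  50.0),
  ('globex', '2024-01-10', 300.0),
  ('globex', '2024-03-01', 120.0);
""")

# Window functions: rank each customer's orders by date and keep a running total.
rows = conn.execute("""
    SELECT customer,
           order_date,
           amount,
           ROW_NUMBER() OVER w AS order_rank,
           SUM(amount)  OVER w AS running_total
    FROM orders
    WINDOW w AS (PARTITION BY customer ORDER BY order_date)
    ORDER BY customer, order_date
""").fetchall()

for row in rows:
    print(row)
```

The same `PARTITION BY ... ORDER BY` pattern carries over to Databricks/Spark SQL, where such queries typically run against warehouse tables rather than an in-memory database.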

Must-Have Skills:

  • Databricks (3+ years): Development and orchestration of data workflows.
  • Python & PySpark (3+ years): Hands-on experience in distributed data processing.
  • Elasticsearch (3+ years): Indexing and querying large-scale datasets.
  • SQL (3+ years): Proficiency in analytical SQL, including window functions.
  • ETL Services: AWS Glue, GCP DataProc/DataFlow, and Azure ADF/ADLF.
  • Airflow: Designing and maintaining data workflows.
  • Data Warehousing: Expertise in performance tuning and optimization.
  • Data Modeling: Understanding of data schemas and business-oriented data models.
  • Data Security: Familiarity with encryption, access control, and compliance standards.
  • Cloud Platforms: AWS (must); GCP and Azure (preferred).

Skills

Python, Databricks, PySpark, Elasticsearch
