GCP Data Engineer

3–8 years

5–15 Lacs

Posted: 2 days ago | Platform: Naukri


Work Mode

Hybrid

Job Type

Full Time

Job Description

Job Title: Senior Data Engineer

Responsibilities:

  • Design, develop, and deploy scalable ETL pipelines using GCP data services, including BigQuery, Cloud Composer (Apache Airflow), and Cloud Storage.
  • Develop, deploy, and manage complex DAGs in Apache Airflow for orchestrating data workflows.
  • Write and optimize complex SQL and PL/SQL queries, stored procedures, and functions for data manipulation, transformation, and analysis.
  • Optimize BigQuery workloads for performance, cost efficiency, and scalability.
  • Develop scripts using Python and shell scripting to support automation, data movement, and transformations.
  • Ensure data quality, integrity, and reliability across all data solutions.
  • Collaborate with cross-functional teams, including data scientists, analysts, and engineers, to understand data requirements and deliver effective solutions.
  • Participate in code reviews and contribute to establishing and maintaining data engineering best practices.
  • Troubleshoot and resolve data pipeline issues in a timely manner.
  • Use version control systems (e.g., Git) to manage code and collaborate on engineering work.
  • Stay updated on the latest trends and technologies in data engineering, cloud computing, and ETL processes.
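To give candidates a concrete feel for the pipeline and data-quality duties above, here is a minimal sketch of an extract-transform-validate step in plain Python. All names (`extract`, `transform`, `validate`, the `order_id`/`amount`/`country` columns) are hypothetical illustrations, not part of this role's codebase; in practice each step would typically run as a Cloud Composer (Airflow) task and load its output into BigQuery.

```python
import csv
import io


def extract(raw_csv: str) -> list[dict]:
    """Parse raw CSV text into a list of row dicts (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(raw_csv)))


def transform(rows: list[dict]) -> list[dict]:
    """Normalise types and formats (the 'transform' step)."""
    return [
        {
            "order_id": row["order_id"].strip(),
            "amount": float(row["amount"]),
            "country": row["country"].strip().upper(),
        }
        for row in rows
    ]


def validate(rows: list[dict]) -> list[dict]:
    """Basic data-quality gate: reject empty keys and negative amounts."""
    bad = [r for r in rows if not r["order_id"] or r["amount"] < 0]
    if bad:
        raise ValueError(f"{len(bad)} row(s) failed data-quality checks")
    return rows


if __name__ == "__main__":
    raw = "order_id,amount,country\nA1,10.5,in\nA2,3.0,us\n"
    clean = validate(transform(extract(raw)))
    print(clean)
```

In an Airflow deployment, each function would map naturally to its own task so that a data-quality failure stops the load before bad rows reach the warehouse.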

Required Skills and Qualifications:

  • 5–8 years of hands-on experience in Data Engineering roles.
  • Mandatory skills:
    • ETL pipeline design and implementation
    • SQL and PL/SQL (complex queries, procedures, and transformations)
    • Google Cloud Platform (GCP): BigQuery, Cloud Composer, Cloud Storage
    • Apache Airflow (designing and deploying complex DAGs)
    • Python for scripting and data processing
    • Shell scripting for automation and orchestration tasks
  • Experience with Informatica is a strong plus.
  • Proven ability to optimize BigQuery for performance and cost.
  • Familiarity with Git and version control best practices.
  • Excellent problem-solving, analytical, and communication skills.
  • Ability to thrive both independently and within a collaborative agile team.
  • Bachelor’s degree in Computer Science, Engineering, or a related field.

What We Offer:

  • A challenging and rewarding role in a dynamic, fast-paced environment.
  • The opportunity to work with cutting-edge technologies on the Google Cloud Platform.
  • A collaborative and supportive team culture.
  • Continuous learning and professional growth opportunities.

Datametica

IT Services and IT Consulting

New York, NY
