Data Engineering Architect (GCP)

Experience: 6 - 10 years

Salary: 20 - 35 Lacs

Posted: 5 hours ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Job Title: Data Engineering Architect

Location:

About the Role

Key Responsibilities

Design & Architecture:

  • Design and implement scalable, reliable, and efficient data architectures for Google products and services.
  • Develop and maintain data models, schemas, and ontologies to support diverse data sources.
  • Evaluate and recommend emerging data technologies to enhance the data infrastructure.
  • Collaborate with product managers, engineers, and researchers to define data requirements and translate them into technical solutions.

Data Processing & Pipelines:

  • Build and optimize batch and real-time data pipelines using GCP services such as Dataflow, Dataproc, Pub/Sub, and Cloud Functions.
  • Implement data quality checks and validation processes to ensure data accuracy and consistency.
  • Define and enforce data governance policies for security and compliance.

Data Storage & Management:

  • Architect scalable data storage solutions using BigQuery, Cloud Storage, and Spanner.
  • Optimize data storage and retrieval for performance and cost-effectiveness.
  • Implement data lifecycle management policies, including retention, archiving, and deletion.

Team Leadership & Mentorship:

  • Provide technical guidance to data engineers and team members.
  • Mentor junior engineers to build expertise in data engineering best practices.
  • Foster a culture of innovation, collaboration, and continuous learning within the team.

Required Qualifications

  • 6–8 years of experience in data engineering or related fields.
  • Strong understanding of data warehousing, data modeling, and ETL/ELT processes.
  • Proven expertise in designing and implementing large-scale data pipelines and architectures.
  • Proficiency in SQL and at least one programming language (Python or Java).
  • Hands-on experience with GCP services: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Spanner.
  • Experience with big data frameworks: Hadoop, Spark, Kafka.
  • Excellent communication, collaboration, and problem-solving skills.

Preferred Qualifications

  • Experience in data governance and quality management.
  • Familiarity with machine learning and data science workflows.
  • Knowledge of Docker, Kubernetes, and cloud-based orchestration.
  • Contributions to open-source projects or communities.
  • Google Cloud Professional Data Engineer certification.

Technical Skills Summary

Mandatory:

  • GCP (BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage)
  • SQL & Python/Java
  • Data modeling, ETL/ELT, data warehousing
  • Batch & real-time data pipelines
  • Big data frameworks: Hadoop, Spark, Kafka

Preferred:

  • Data governance and quality management
  • Machine learning & data science pipelines
  • Docker & Kubernetes

Bean Hr Consulting

Human Resources Consulting

New York
