Posted: 8 hours ago | Platform: LinkedIn


Work Mode: On-site

Job Type: Full Time

Job Description

Key Responsibilities

  • Design and architect scalable, secure, and resilient data platforms leveraging cloud-native technologies (AWS, Snowflake, Redshift).
  • Lead the design and implementation of data pipelines (ETL/ELT) for both batch and real-time processing (see the illustrative sketch after this list).
  • Build and optimize data lakes and data warehouses to support enterprise analytics and AI/ML use cases.
  • Define and implement data modeling standards, metadata management, and master data management (MDM) practices.
  • Architect event-driven and streaming solutions using Kafka, EventBridge, and similar technologies.
  • Ensure compliance with data governance, security, privacy, and regulatory requirements (HIPAA, GDPR, etc.).
  • Drive adoption of best practices in data engineering, cloud architecture, and DevOps for data platforms.
  • Collaborate with business stakeholders, data scientists, engineers, and product teams to deliver data-driven insights.
  • Provide technical leadership, mentoring, and guidance to engineering teams.
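
To give candidates a concrete flavor of the batch side of the pipeline work above, here is a minimal PySpark sketch; the S3 paths, column names, and aggregation are hypothetical and purely illustrative.

    from pyspark.sql import SparkSession, functions as F

    # Hypothetical locations and columns -- placeholders for illustration only.
    RAW_PATH = "s3://example-raw-zone/orders/"
    CURATED_PATH = "s3://example-curated-zone/orders_daily/"

    spark = SparkSession.builder.appName("orders_daily_batch").getOrCreate()

    # Read raw landed events, keep only completed orders, derive a date column.
    orders = (
        spark.read.json(RAW_PATH)
        .filter(F.col("status") == "COMPLETED")
        .withColumn("order_date", F.to_date("created_at"))
    )

    # Aggregate into a curated, analytics-friendly daily summary.
    daily = (
        orders.groupBy("order_date", "region")
        .agg(
            F.sum("amount").alias("total_amount"),
            F.count("*").alias("order_count"),
        )
    )

    # Partitioned Parquet in the curated zone keeps downstream scans cheap.
    daily.write.mode("overwrite").partitionBy("order_date").parquet(CURATED_PATH)

In practice a job like this would be parameterized by run date and scheduled by an orchestrator such as Airflow or Step Functions.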

Required Skills & Experience

  • 10+ years of experience in data architecture, engineering, or related technical roles.
  • Proven expertise in the modern data stack: AWS Glue, Lambda, Kinesis, S3, Redshift/Snowflake.
  • Strong programming and scripting experience with SQL and Python.
  • Hands-on experience with workflow orchestration tools (Airflow, Prefect, Step Functions), illustrated in the sketch after this list.
  • Proficiency in Big Data technologies: Spark, PySpark, Scala.
  • Solid understanding of streaming & event-driven architectures (Kafka, EventBridge/Event Bus).
  • Experience in data modeling, building data lakes/warehouses, and architecting analytical platforms.
  • Strong knowledge of data governance, security, compliance, and MDM practices.
  • Excellent problem-solving, analytical, and system design skills.
  • Exceptional communication, stakeholder management, and leadership capabilities.
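
Workflow orchestration in this role typically means expressing pipelines as DAGs. Below is a minimal Airflow 2.x sketch (2.4+ for the schedule argument); the DAG name, tasks, and schedule are hypothetical.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        # Placeholder: pull the day's batch from a source system.
        pass

    def transform_and_load():
        # Placeholder: clean the batch and load it into the warehouse.
        pass

    with DAG(
        dag_id="daily_orders_elt",      # hypothetical pipeline name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",              # run once per day
        catchup=False,
    ) as dag:
        extract_task = PythonOperator(task_id="extract", python_callable=extract)
        load_task = PythonOperator(
            task_id="transform_and_load", python_callable=transform_and_load
        )
        extract_task >> load_task       # extract runs before transform/load

The same pattern carries over to Prefect flows or Step Functions state machines; the tool changes, the orchestration concept does not.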

Nice-to-Have Skills

  • Exposure to MLOps and supporting AI/ML pipelines.
  • Experience with containerization and orchestration (Docker, Kubernetes).
  • Familiarity with CI/CD pipelines and Infrastructure as Code (Terraform, CloudFormation); a Python-based sketch follows this list.
  • Experience in multi-cloud or hybrid-cloud environments.
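
On the Infrastructure-as-Code side, CloudFormation templates can also be authored in Python via the AWS CDK, which synthesizes CloudFormation. A minimal sketch, assuming CDK v2, with a hypothetical raw-zone bucket for a data lake:

    from aws_cdk import App, RemovalPolicy, Stack
    from aws_cdk import aws_s3 as s3
    from constructs import Construct

    class DataLakeStack(Stack):
        """Hypothetical stack defining one raw-zone bucket for a data lake."""

        def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
            super().__init__(scope, construct_id, **kwargs)
            s3.Bucket(
                self,
                "RawZoneBucket",
                versioned=True,
                encryption=s3.BucketEncryption.S3_MANAGED,
                block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
                removal_policy=RemovalPolicy.RETAIN,  # keep data if the stack is deleted
            )

    app = App()
    DataLakeStack(app, "DataLakeStack")
    app.synth()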

Hash Agile Technologies

Information Technology

San Francisco
