Experience: 5 - 8 years

Salary: 10 - 18 Lacs

Posted: 1 day ago | Platform: Naukri

Work Mode: Hybrid

Job Type: Full Time

Job Description

We are seeking an experienced Data Engineer skilled in building and managing data orchestration pipelines in cloud-native environments. The ideal candidate will have extensive experience with Kubernetes, Airflow, Python, modern observability tools (Grafana and Prometheus), and Google Cloud Platform (GCP). You will design, develop, and maintain data pipelines that support NLP and LLM models, ensuring data quality, scalability, and reliability.


Key Responsibilities:

  • Design and Develop Data Pipelines:

    Create, manage, and optimize data collection and processing pipelines using Airflow, Kubernetes (GKE), and GCP to handle large volumes of text-based social media data (see the illustrative DAG sketch after this list).
  • Cloud Infrastructure Management:

    Implement and maintain cloud infrastructure on GCP, ensuring high availability, scalability, and security of data processing environments.
  • Data Integration:

    Develop robust data integration solutions to aggregate data from various social media platforms and other sources, ensuring data consistency and reliability.
  • NLP and LLM Model Support:

    Work closely with data scientists and machine learning engineers to support the deployment and maintenance of NLP and LLM models in production.
  • Database Management:

    Design, manage, and optimize databases for storage and retrieval of large-scale text data, ensuring efficient data access and query performance.
  • Version Control:

    Utilize Git for version control and collaboration on codebases, ensuring best practices in code management and deployment.
  • Performance Tuning:

    Monitor and improve the performance of data pipelines, identifying and resolving bottlenecks and inefficiencies.
  • Documentation:

    Maintain comprehensive documentation for all data engineering processes, ensuring transparency and knowledge sharing within the team.
  • Collaboration:

    Work collaboratively with cross-functional teams, including data scientists, product managers, and other stakeholders, to understand data requirements and deliver solutions that meet business needs.
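To make the pipeline responsibility concrete, here is a minimal, hypothetical Airflow 2.x DAG of the kind this role would own: it stages raw social-media text, cleans it, and loads it for downstream NLP/LLM use. The task names, bucket paths, and dataset are illustrative placeholders, not part of any existing system.

```python
# Hypothetical sketch of a collect -> clean -> load pipeline on GCP.
# All paths, buckets, and datasets below are placeholders.
from datetime import datetime, timedelta

from airflow.decorators import dag, task


@dag(
    schedule="@hourly",
    start_date=datetime(2024, 1, 1),
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
    tags=["social-media", "nlp"],
)
def social_text_pipeline():
    @task
    def collect(ds=None) -> str:
        # Stage raw posts for the logical date in Cloud Storage
        # (placeholder bucket; a real source would be a platform API or Pub/Sub).
        return f"gs://example-raw-bucket/posts/{ds}.jsonl"

    @task
    def clean(raw_path: str) -> str:
        # Normalize and deduplicate text so downstream NLP/LLM jobs
        # receive consistent input; returns the processed file location.
        return raw_path.replace("raw", "clean")

    @task
    def load(clean_path: str) -> None:
        # Placeholder load step; a real DAG would write to BigQuery here.
        print(f"loading {clean_path} into example_dataset.posts")

    load(clean(collect()))


social_text_pipeline()
```

In production, the collect step would typically read from a platform API or a Pub/Sub subscription, and the load step would target BigQuery, e.g. through the Google provider package for Airflow.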

Requirements:

  • Airflow on GKE Production Experience

    DAG authoring, Helm/Terraform cluster provisioning, autoscaling (KEDA/HPA/GKE Autopilot), and CI/CD of DAGs.
  • Observability & Monitoring

    Hands-on dashboarding in Grafana, metrics via Prometheus/Cloud Monitoring, and definition of SLAs/SLOs for pipelines.
  • Python Expertise

    Advanced Python for data processing, custom Airflow operators/hooks/sensors, and familiarity with Airflow 2.x (see the operator sketch after this list).
  • GCP Core Services

    Daily use of BigQuery, Cloud Storage, Pub/Sub, Secret Manager, IAM/Workload Identity, VPC-SC; infrastructure as code with Terraform.
  • Database & SQL Skills

    Proficiency with relational databases (PostgreSQL).
  • Git & DevOps Practices

    Branching strategies, code reviews, automated testing, and GitOps-style deployments.
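As an illustration of the "custom Airflow operators/hooks/sensors" requirement, below is a small, hypothetical operator that fails a task when a text batch has fewer records than expected. The class name, arguments, and stubbed storage read are assumptions made for the sketch, not an existing API.

```python
# Hypothetical custom Airflow 2.x operator; names and logic are illustrative.
from airflow.models.baseoperator import BaseOperator


class TextBatchQualityOperator(BaseOperator):
    """Fail the task if a text batch contains fewer records than expected."""

    template_fields = ("input_path",)  # allow Jinja templating, e.g. {{ ds }}

    def __init__(self, input_path: str, min_records: int = 1, **kwargs):
        super().__init__(**kwargs)
        self.input_path = input_path
        self.min_records = min_records

    def execute(self, context):
        # In a real pipeline this would read from Cloud Storage or BigQuery;
        # here the count is stubbed to keep the sketch self-contained.
        record_count = self._count_records(self.input_path)
        self.log.info("Found %d records in %s", record_count, self.input_path)
        if record_count < self.min_records:
            raise ValueError(
                f"Expected at least {self.min_records} records, got {record_count}"
            )
        return record_count

    def _count_records(self, path: str) -> int:
        return 0  # placeholder for an actual storage read
```

A DAG would instantiate it like any other operator, e.g. TextBatchQualityOperator(task_id="validate_batch", input_path="gs://example-clean-bucket/posts/{{ ds }}.jsonl", min_records=100), with the templated path resolved at runtime.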

Preferred / Bonus:

  • Prior experience supporting large-scale NLP or LLM workloads.
  • Familiarity with social-media APIs (Twitter/X, Reddit, TikTok, Meta).
  • GCP Professional Data Engineer or Cloud DevOps Engineer certification.

More Jobs at Turbodyne Energy Systems
