We are looking for a hands-on, technically strong Data Operations Lead to head our newly established Data Integration & Operations team in Chennai.
This is a build-and-run role: you'll help define how the team operates while leading day-to-day delivery. The team is part of the global Data & Measure function and is responsible for ensuring that our data products run efficiently, reliably, and consistently across platforms and markets.
You will own the operational layer of our data products including data ingestion, monitoring, deployment pipelines, automation, and support. This role requires deep technical knowledge of Azure and/or GCP, alongside the ability to lead and scale a growing team.
What you'll be doing:
Technical Ownership & Execution
- Lead a team responsible for data integration, ingestion, orchestration, and platform operations
- Build and maintain automated data pipelines using Azure Data Factory, GCP Dataflow/Composer, or equivalent tools
- Define and implement platform-wide monitoring, logging, and alerting
- Manage cloud environments, including access control, security, and deployment automation
Operational Standardisation
- Create and roll out standard operating procedures, runbooks, onboarding guides, and automation patterns
- Ensure repeatable, scalable practices across all supported data products
- Define reusable deployment frameworks and templates for integration
Platform Support & Performance
- Set up and manage SLAs, incident workflows, and escalation models
- Proactively identify and resolve operational risks in cloud-based data platforms
- Partner with development and product teams to ensure seamless transition from build to run
Team Leadership
- Lead and mentor a new, growing team in Chennai
- Shape the team's operating model, priorities, and capabilities
- Act as a subject matter expert and escalation point for technical operations
What you'll need:
Required Skills
- 7+ years in data operations, platform engineering, or data engineering
- Deep, hands-on experience in Azure and/or GCP environments
- Strong understanding of cloud-native data pipelines, architecture, and security
- Skilled in orchestration (e.g. ADF, Dataflow, Airflow), scripting (Python, Bash), and SQL
- Familiarity with DevOps practices, CI/CD, and infrastructure-as-code
- Proven experience managing production data platforms and support
- Ability to design operational frameworks from the ground up
- Demonstrated experience leading technical teams, including task prioritization, mentoring, and delivery oversight
Preferred Skills
- Experience with tools like dbt, Azure Synapse, BigQuery, Databricks, etc.
- Exposure to BI environments (e.g. Power BI, Looker)
- Familiarity with global support models and tiered ticket handling
- Experience with documentation, enablement, and internal tooling