Data Ops Engineer

Experience: 3 - 6 years

Salary: 7 - 15 Lacs

Posted: 2 days ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Job Scope:

Responsible for orchestrating, monitoring, and maintaining reliable and scalable data pipelines across Tanla's CPaaS ecosystem.

Ensure observability, reliability, and data quality across all data platforms through monitoring, alerting, and automated health checks.

Participate in incident detection, troubleshooting, and resolution to minimize downtime and data disruptions.

Implement and maintain data quality frameworks ensuring consistency, accuracy, and availability of business-critical data.

Job Responsibilities:

  • Design, build, and maintain data pipeline orchestration workflows using tools like Apache Airflow or Prefect.
  • Implement monitoring and logging frameworks (Prometheus, Grafana, ELK, Splunk) to ensure proactive alerting and performance visibility.
  • Manage incidents by identifying anomalies, conducting root cause analysis, and coordinating resolutions across teams.
  • Enforce data quality frameworks, including validation checks, schema management, and data freshness monitoring.
  • Collaborate with engineering, product, and analytics teams to ensure seamless data flow and high system reliability.
  • Automate repetitive data operations and create runbooks for faster troubleshooting.
  • Document operational procedures, system health dashboards, and data reliability SLAs/SLOs.
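To make the data-quality responsibilities above concrete, here is a minimal, purely illustrative sketch of the kind of validation and freshness check such a framework performs. The record fields (`message_id`, `channel`, `sent_at`) and the one-hour freshness window are hypothetical examples, not Tanla's actual schema or thresholds.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical required schema for a messaging-event record (illustrative only).
REQUIRED_FIELDS = {"message_id": str, "channel": str, "sent_at": str}

def validate_record(record: dict, max_age: timedelta = timedelta(hours=1)) -> list:
    """Return a list of data-quality issues found in one record (empty = clean)."""
    issues = []
    # Schema validation: every required field must be present with the right type.
    for field, ftype in REQUIRED_FIELDS.items():
        if field not in record:
            issues.append(f"missing field: {field}")
        elif not isinstance(record[field], ftype):
            issues.append(f"wrong type for {field}: expected {ftype.__name__}")
    # Freshness monitoring: flag records older than the allowed window.
    if isinstance(record.get("sent_at"), str):
        try:
            sent = datetime.fromisoformat(record["sent_at"])
        except ValueError:
            issues.append("unparseable sent_at timestamp")
        else:
            if sent.tzinfo is None:
                issues.append("sent_at lacks timezone information")
            elif datetime.now(timezone.utc) - sent > max_age:
                issues.append("stale record: sent_at exceeds freshness window")
    return issues
```

In a production setup, checks like this would typically run as tasks inside an orchestration tool such as Apache Airflow, with failures surfaced through the monitoring stack (Prometheus/Grafana alerts) rather than returned to a caller.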

Qualification and Experience:

  • B.E. / B.Tech / M.Sc. / MCA in Computer Science, Information Systems, or related discipline.
  • 3-5 years of experience in Data Operations / Data Engineering / Data Reliability roles.
  • Proven exposure to pipeline orchestration, monitoring, troubleshooting, and data quality management in large-scale production environments.

Knowledge and Skills:

  • Hands-on experience with orchestration tools (Apache Airflow, Prefect, or equivalent).
  • Strong experience in monitoring & logging using Prometheus, Grafana, ELK, or Splunk.
  • Knowledge of incident management and troubleshooting techniques in data environments.
  • Experience implementing and managing data quality frameworks.
  • Proficiency in SQL and scripting languages (Python, Bash).
  • Familiarity with cloud data environments (AWS, GCP, or Azure).
  • Understanding of CI/CD, version control, and automation principles.
  • Good problem-solving, analytical, and collaboration skills.

Why Join Us?

  • Impactful Work: Play a pivotal role in safeguarding Tanla's assets, data, and reputation in the industry.
  • Tremendous Growth Opportunities: Be part of a rapidly growing company in the telecom and CPaaS space, with opportunities for professional development.
  • Innovative Environment: Work alongside a world-class team in a challenging and fun environment, where innovation is celebrated.

Tanla is an equal opportunity employer. We champion diversity and are committed to creating an inclusive environment for all employees. www.tanla.com

Tanla Platforms | Cloud Communications | Hyderabad
