Data Engineer

Experience

0 years

Salary

0 Lacs

Posted: 6 days ago | Platform: LinkedIn

Work Mode

On-site

Job Type

Full Time

Job Description

About The Role

As a Data Engineer in the Edge of Technology Center, you will play a critical role in designing and implementing scalable data infrastructure to power advanced analytics, AI/ML, and business intelligence. This position demands a hands-on technologist who can architect reliable pipelines, manage real-time event streams, and ensure smooth data operations across cloud-native environments. You will work closely with cross-functional teams to enable data-driven decision-making and innovation across the organization.

Key Responsibilities

  • Design, implement, and maintain robust ETL/ELT pipelines using tools like Argo Workflows or Apache Airflow (see the pipeline sketch after this list).
  • Manage and execute database schema changes with Alembic or Liquibase, ensuring data consistency (a migration sketch also follows this list).
  • Configure and optimize distributed query engines like Trino and AWS Athena for analytics.
  • Deploy and manage containerized workloads on AWS EKS or GCP GKE using Docker, Helmfile, and Argo CD.
  • Build data lakes/warehouses on AWS S3 and implement performant storage using Apache Iceberg.
  • Use Terraform and other IaC tools to automate cloud infrastructure provisioning securely.
  • Develop CI/CD pipelines with GitHub Actions to support rapid and reliable deployments.
  • Architect and maintain Kafka-based real-time event-driven systems using Apicurio and AVRO.
  • Collaborate with product, analytics, and engineering teams to define and deliver data solutions.
  • Monitor and troubleshoot data systems for performance and reliability issues using observability tools (e.g., Prometheus, Grafana).
  • Document data flows and maintain technical documentation to support scalability and knowledge sharing.
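
For illustration, the pipeline responsibility above might be realized as a minimal Apache Airflow DAG along the lines of the sketch below (assuming Airflow 2.4+); the DAG id, task callables, and schedule are hypothetical placeholders, not an actual Encardio pipeline.

# Minimal daily ELT DAG sketch: extract a batch, then load it downstream.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_events(**context):
    # Placeholder: pull the previous day's events from the source system.
    print(f"extracting events for {context['ds']}")


def load_to_warehouse(**context):
    # Placeholder: write the extracted batch to S3 / the analytics warehouse.
    print(f"loading batch for {context['ds']}")


with DAG(
    dag_id="daily_events_elt",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    extract = PythonOperator(task_id="extract_events", python_callable=extract_events)
    load = PythonOperator(task_id="load_to_warehouse", python_callable=load_to_warehouse)

    extract >> load  # run extract first, then load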
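
Similarly, a schema change managed with Alembic could take the shape of the sketch below; the revision ids, table, and column are illustrative only. Adding a nullable column (and backfilling later) avoids locking rewrites on Postgres, which is one common ingredient of a zero-downtime migration.

# Alembic migration sketch: additive, backward-compatible schema change.
import sqlalchemy as sa
from alembic import op

# Revision identifiers used by Alembic (placeholder values).
revision = "a1b2c3d4e5f6"
down_revision = "f6e5d4c3b2a1"
branch_labels = None
depends_on = None


def upgrade() -> None:
    # Add a nullable timestamp column, then index it for query performance.
    op.add_column(
        "sensor_readings",
        sa.Column("ingested_at", sa.DateTime(timezone=True), nullable=True),
    )
    op.create_index("ix_sensor_readings_ingested_at", "sensor_readings", ["ingested_at"])


def downgrade() -> None:
    # Reverse the changes in the opposite order.
    op.drop_index("ix_sensor_readings_ingested_at", table_name="sensor_readings")
    op.drop_column("sensor_readings", "ingested_at")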

Key Deliverables

  • Fully operational ETL/ELT pipelines supporting high-volume, low-latency data processing.
  • Zero-downtime schema migrations with consistent performance across environments.
  • Distributed query engines tuned for large-scale analytics with low query latency.
  • Reliable containerized deployments in Kubernetes using GitOps methodologies.
  • Kafka-based real-time data ingestion pipelines with consistent schema validation (see the producer sketch after this list).
  • Infrastructure deployed and maintained as code using Terraform and version control.
  • Automated CI/CD processes ensuring fast, high-quality code releases.
  • Cross-functional project delivery aligned with business requirements.
  • Well-maintained monitoring dashboards and alerting for proactive issue resolution.
  • Internal documentation and runbooks for operational continuity and scalability.
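
As a sketch of the Kafka deliverable above, an Avro-serialized producer might look like the following, assuming the confluent-kafka Python client and a Confluent-compatible schema registry endpoint (Apicurio exposes one through its ccompat API); the topic, schema, and broker addresses are placeholders.

# Kafka producer sketch with Avro serialization against a schema registry.
from confluent_kafka import Producer
from confluent_kafka.schema_registry import SchemaRegistryClient
from confluent_kafka.schema_registry.avro import AvroSerializer
from confluent_kafka.serialization import MessageField, SerializationContext

SCHEMA = """
{
  "type": "record",
  "name": "SensorReading",
  "fields": [
    {"name": "sensor_id", "type": "string"},
    {"name": "value", "type": "double"}
  ]
}
"""

registry = SchemaRegistryClient({"url": "http://apicurio:8080/apis/ccompat/v7"})
serializer = AvroSerializer(registry, SCHEMA)
producer = Producer({"bootstrap.servers": "kafka:9092"})

reading = {"sensor_id": "s-001", "value": 23.7}
producer.produce(
    topic="sensor-readings",
    key="s-001",
    value=serializer(reading, SerializationContext("sensor-readings", MessageField.VALUE)),
)
producer.flush()  # block until the message is delivered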

Qualifications

Bachelor’s or Master’s degree in Computer Science, Data Science, Engineering, or a related field from a recognized institution.

Technical Skills

  • Orchestration Tools: Argo Workflows, Apache Airflow
  • Database Migration: Alembic, Liquibase
  • SQL Engines: Trino, AWS Athena (query sketch after this list)
  • Containers & Orchestration: Docker, AWS EKS, GCP GKE
  • Data Storage: AWS S3, Apache Iceberg
  • Relational Databases: Postgres, MySQL, Aurora
  • Infrastructure Automation: Terraform (or equivalent IaC tools)
  • CI/CD: GitHub Actions or similar
  • GitOps Tools: Argo CD, Helmfile
  • Event Streaming: Kafka, Apicurio, AVRO
  • Languages: Python, Bash
  • Monitoring: Prometheus, Grafana (preferred)
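
For the SQL engines listed above, a query against Trino from Python could use the trino DB-API client as sketched below; the host, catalog, schema, and table are hypothetical, and Athena access would typically go through boto3 or PyAthena instead.

# Trino query sketch using the trino Python client (DB-API interface).
import trino

conn = trino.dbapi.connect(
    host="trino.internal.example.com",
    port=8080,
    user="data-engineer",
    catalog="iceberg",
    schema="analytics",
)

cur = conn.cursor()
cur.execute(
    """
    SELECT sensor_id, avg(value) AS avg_value
    FROM sensor_readings
    WHERE reading_date >= DATE '2024-01-01'
    GROUP BY sensor_id
    ORDER BY avg_value DESC
    LIMIT 10
    """
)

# Print the top sensors by average reading.
for sensor_id, avg_value in cur.fetchall():
    print(sensor_id, avg_value)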

Soft Skills

  • Strong analytical and problem-solving capabilities in complex technical environments.
  • Excellent written and verbal communication skills to interact with both technical and non-technical stakeholders.
  • Self-motivated, detail-oriented, and proactive in identifying improvement opportunities.
  • Team player with a collaborative approach and eagerness to mentor junior team members.
  • High adaptability to new technologies and dynamic business needs.
  • Effective project management and time prioritization.
  • Strong documentation skills for maintaining system clarity.
  • Ability to translate business problems into data solutions efficiently.

Benefits

  • Competitive salary and benefits package in a globally operating company.
  • Opportunities for professional growth and involvement in diverse projects.
  • Dynamic and collaborative work environment.

Why You'll Love Working With Us

Encardio offers a thriving environment where innovation and collaboration are essential. You'll be part of a diverse team shaping the future of infrastructure globally. Your work will directly contribute to some of the world's most ambitious and ground-breaking engineering projects. Encardio is an equal-opportunity employer committed to diversity and inclusion.

How To Apply

Please submit your CV and a cover letter outlining your suitability for the role to humanresources@encardio.com.
