Posted: 1 week ago
Work from Office
Full Time
We are looking for a Technical Lead, Data Engineering, with strong experience in infrastructure and ETL development. The role requires experience in setting up environments, installing tools, and provisioning cloud services to support ongoing development initiatives.
Design and manage scalable data architectures that meet business needs and performance requirements.
Lead the implementation of data storage solutions, such as data warehouses and data lakes, across hybrid and cloud-based environments (AWS, Azure, or GCP).
Develop and enforce data governance, quality, and security standards to protect sensitive data and ensure compliance.
Develop and optimize complex SQL queries on traditional and cloud-native databases (Postgres, BigQuery, Redshift, MSSQL).
Set up and manage infrastructure tools such as Apache Airflow, Pentaho, and related components.
Handle end-to-end Linux-based server setups, configurations, and performance optimization.
Monitor and troubleshoot data infrastructure and pipeline issues to ensure high availability and reliability.
Conduct performance tuning and optimization of ETL jobs and database systems to support growing data volumes.
Promote best practices in code quality, documentation, and continuous integration/continuous deployment (CI/CD).
Participate in infrastructure planning and contribute to strategic decisions for tool and platform adoption.
Provide hands-on support for tool deployments and environment setups across product teams.
Lead and mentor junior engineers while collaborating with cross-functional teams (network, server, and service providers).
Ensure smooth migration from Windows to Linux-based environments.
Demonstrated experience with DevOps practices and cloud infrastructure services (AWS/GCP/Azure).
Advanced SQL expertise with MPP databases (BigQuery, Redshift) and RDBMS (Postgres, Oracle).
Strong hands-on experience in Linux environments, along with batch and shell scripting.
Experience with Airflow installation, configuration and scaling, and DAG development.
Understanding of infrastructure concepts: system design, server setup, user/role management, and performance tuning.
Familiarity with cloud-based data storage and processing services, big data technologies (e.g., Spark, Hadoop), and containerization (e.g., Docker, Kubernetes).
Proven expertise in designing and implementing large-scale data solutions using Pentaho Data Integration (PDI).
Experience in code versioning tools (Git preferred).
Excellent documentation and communication skills.
Experience with orchestration tools like Control-M, JAMS, Autosys.
Exposure to Hadoop ecosystem (Hive, Spark, Sqoop).
Working knowledge of Python.
Familiarity with digital marketing/AdTech data and tools.
Experience with BI tools like Tableau or Pentaho BA.
WebMD
Location: Thane, Navi Mumbai, Mumbai (All Areas)
Salary: 15.0 - 30.0 Lacs P.A.