Posted: 2 weeks ago
Hybrid | Full Time
We are looking for a skilled Data Engineer with hands-on experience in Airflow, Python, AWS, and Big Data technologies like Spark to join our dynamic team.

Key Responsibilities
- Design and implement data pipelines and workflows using Apache Airflow (a minimal sketch follows this posting)
- Develop robust and scalable data processing applications using Python
- Leverage AWS services (S3, EMR, Lambda, Glue, Redshift, etc.) for data engineering and ETL pipelines
- Work with Big Data technologies like Apache Spark to process large-scale datasets
- Optimize and monitor data pipelines for performance, reliability, and scalability
- Collaborate with Data Scientists, Analysts, and Business teams to understand data needs and deliver solutions
- Ensure data quality, consistency, and governance across all data pipelines
- Document processes, pipelines, and best practices

Mandatory Skills
- Apache Airflow - workflow orchestration and scheduling
- Python - strong programming skills for data engineering
- AWS - hands-on experience with core AWS data services
- Big Data technologies, particularly Apache Spark

Location: Hyderabad (Hybrid)

Please share your resume with +91 9361912009
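The core of the role is orchestrating ETL steps with Airflow and handing heavy processing to Spark or AWS services. Below is a minimal sketch, assuming Airflow 2.4+ with the TaskFlow API and Python 3.9+, of a daily extract-transform-load DAG of the kind described; the DAG name and task bodies are hypothetical placeholders, not anything specified by the employer.

```python
# Illustrative sketch only: a minimal Airflow 2.x TaskFlow DAG.
# DAG id, schedule, and task logic are hypothetical placeholders.
from datetime import datetime

from airflow.decorators import dag, task


@dag(schedule="@daily", start_date=datetime(2024, 1, 1), catchup=False)
def example_etl_pipeline():
    @task
    def extract() -> list[dict]:
        # A real pipeline might pull from S3, an API, or an upstream database.
        return [{"id": 1, "value": 10}, {"id": 2, "value": 20}]

    @task
    def transform(rows: list[dict]) -> list[dict]:
        # Lightweight in-Python transform; large datasets would typically be
        # processed with Spark on EMR or an AWS Glue job instead.
        return [{**row, "value": row["value"] * 2} for row in rows]

    @task
    def load(rows: list[dict]) -> None:
        # Placeholder load step, e.g. COPY into Redshift or write back to S3.
        print(f"Loaded {len(rows)} rows")

    load(transform(extract()))


example_etl_pipeline()
```

Dropping a file like this into the Airflow dags/ folder is enough for the scheduler to pick it up; as the responsibilities above suggest, heavier transforms would usually be pushed to Spark or Glue rather than run inside the worker.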
Kryon Knowledge Works