Posted: 3 hours ago
Work from Office
Full Time
Required Skills
- Python: Strong programming skills for data processing, automation, and workflow orchestration.
- PySpark: Proven experience in big data processing using Spark for distributed computing.
- SQL: Expertise in writing, optimizing, and troubleshooting complex SQL queries for data transformation and reporting.
- ETL: Hands-on experience designing, developing, and maintaining robust ETL pipelines for structured and semi-structured data.

Job Responsibilities
- Design, develop, and maintain ETL pipelines to efficiently ingest, process, and transform large-scale datasets (a minimal PySpark sketch of such a pipeline appears after this list).
- Develop and optimize PySpark jobs for distributed data processing on big data platforms.
- Automate data workflows and data quality checks using Python scripts and libraries.
- Write, tune, and manage complex SQL queries for data extraction, transformation, and analysis.
- Monitor and ensure data pipeline performance, reliability, and accuracy.
- Collaborate with data analysts, engineers, and business stakeholders to gather requirements and deliver scalable data solutions.
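To illustrate the kind of work these responsibilities describe, here is a minimal PySpark ETL sketch: ingest semi-structured JSON, transform and validate it with a simple data-quality check, and write partitioned Parquet for reporting. All paths, column names, and thresholds are hypothetical, chosen only for illustration; they are not part of this posting.

```python
# Minimal ETL sketch: ingest -> transform -> quality check -> load.
# Paths, table names, and columns below are hypothetical examples.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Ingest: read semi-structured JSON events from a (hypothetical) landing zone.
raw = spark.read.json("s3://example-bucket/landing/orders/")

# Transform: cast types, derive columns, drop malformed rows.
orders = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("order_id").isNotNull())
)

# Data-quality check: abort the load if filtering dropped too many rows.
raw_count, clean_count = raw.count(), orders.count()
if raw_count and clean_count / raw_count < 0.95:
    raise ValueError(
        f"Dropped {raw_count - clean_count} of {raw_count} rows; aborting load"
    )

# Load: write date-partitioned Parquet for downstream reporting.
(orders.withColumn("order_date", F.to_date("order_ts"))
       .write.mode("overwrite")
       .partitionBy("order_date")
       .parquet("s3://example-bucket/curated/orders/"))
```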
Consulting Krew
Location: Hyderabad
Salary: 25.0 - 30.0 Lacs P.A.