Posted: 1 month ago
Work from Office
Full Time
Key Responsibilities

Python & PySpark:
- Writing efficient ETL (Extract, Transform, Load) pipelines.
- Implementing data transformations using PySpark DataFrames and RDDs.
- Optimizing Spark jobs for performance and scalability.

Apache Spark:
- Managing distributed data processing.
- Implementing batch and streaming data processing.
- Tuning Spark configurations for efficient resource utilization.

Unix Shell Scripting:
- Automating data workflows and job scheduling.
- Writing shell scripts for file management and log processing.
- Managing cron jobs for scheduled tasks.

Google Cloud Platform (GCP) & BigQuery:
- Designing data warehouse solutions using BigQuery.
- Writing optimized SQL queries for analytics.
- Integrating Spark with BigQuery for large-scale data processing.
Allegis Group