Posted: 1 week ago
On-site
Full Time
Role description
Experience: 5-8 years
Location: Hyderabad
JD: PySpark developer to work on a range of data-driven projects using PySpark, SQL, Python, and Apache Airflow for job scheduling and orchestration on Google Cloud Platform (GCP). In this role, you will be responsible for implementing data pipelines, processing large datasets, writing SQL queries, and ensuring smooth orchestration and automation of jobs using Airflow (a minimal example of such a pipeline is sketched below).
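As a rough illustration of the pipeline work described above, here is a minimal PySpark sketch: read raw data from GCS, aggregate it with a Spark SQL query, and write the curated result back. The bucket paths, app name, and column names are hypothetical placeholders, and the job assumes a Spark environment with the GCS connector configured (as on Dataproc).

from pyspark.sql import SparkSession

# Minimal PySpark job: read raw Parquet from GCS, aggregate with SQL,
# write the curated result back to GCS. Paths and columns are hypothetical.
spark = SparkSession.builder.appName("orders_daily_agg").getOrCreate()

orders = spark.read.parquet("gs://example-bucket/raw/orders/")  # hypothetical bucket
orders.createOrReplaceTempView("orders")

# Aggregate with Spark SQL: one row per order date.
daily = spark.sql("""
    SELECT order_date,
           COUNT(*)    AS order_count,
           SUM(amount) AS revenue
    FROM orders
    GROUP BY order_date
""")

daily.write.mode("overwrite").parquet("gs://example-bucket/curated/orders_daily/")
spark.stop()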
Required Skills & Qualifications
Experience with PySpark for large-scale data processing
Proficiency in SQL for writing complex queries and optimizing database operations
Strong knowledge of Python and experience using libraries such as Pandas and NumPy
Hands-on experience with Apache Airflow for job scheduling, DAG creation, and workflow management (see the DAG sketch after this list)
Experience working with Google Cloud Platform (GCP), including Google Cloud Storage (GCS), BigQuery, Dataflow, and Dataproc
Strong understanding of ETL processes and data pipeline development
Familiarity with version control systems like Git
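To make the Airflow and Dataproc bullets concrete, below is a minimal sketch of a daily DAG that submits the PySpark job above to a Dataproc cluster, assuming Airflow 2.x with the apache-airflow-providers-google package installed. The project, region, cluster, and GCS path are hypothetical placeholders, not part of the posting.

from datetime import datetime, timedelta

from airflow import DAG
from airflow.providers.google.cloud.operators.dataproc import DataprocSubmitJobOperator

# Dataproc job spec: run the PySpark script stored in GCS.
# Project, cluster, and bucket names are hypothetical.
PYSPARK_JOB = {
    "reference": {"project_id": "example-project"},
    "placement": {"cluster_name": "example-cluster"},
    "pyspark_job": {"main_python_file_uri": "gs://example-bucket/jobs/orders_daily_agg.py"},
}

with DAG(
    dag_id="orders_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",   # run once per day
    catchup=False,
    default_args={"retries": 2, "retry_delay": timedelta(minutes=5)},
) as dag:
    submit_pyspark = DataprocSubmitJobOperator(
        task_id="submit_orders_daily_agg",
        job=PYSPARK_JOB,
        region="us-central1",
        project_id="example-project",
    )

On Cloud Composer (GCP's managed Airflow, listed under the mandatory skills below), the same DAG file can be deployed by placing it in the environment's DAG bucket.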
Skills
Mandatory Skills: GCP Storage, GCP BigQuery, GCP Dataproc, GCP Cloud Composer, GCP DMS, Apache Airflow, Java, Python, Scala, GCP Datastream, Google Analytics Hub, GCP Workflows, GCP Dataform, GCP Data Fusion, GCP Pub/Sub, ANSI SQL, GCP Dataflow, Big Data Hadoop Ecosystem
LTIMindtree
Hyderabad, Telangana, India
Experience: 5-8 years
Salary: Not disclosed