Posted: 1 month ago
Platform: Remote
Full Time
Job Overview:
We are looking for an experienced GCP Data Engineer with deep expertise in BigQuery, Dataflow, Dataproc, Pub/Sub, and GCS to build, manage, and optimize large-scale data pipelines. The ideal candidate has a strong background in cloud data storage, real-time data streaming, and workflow orchestration.

Key Responsibilities:

Data Storage & Management:
- Manage Google Cloud Storage (GCS) buckets, set up permissions, and optimize storage for handling large datasets.
- Ensure data security, access control, and lifecycle management.

Data Processing & Analytics:
- Design and optimize BigQuery for data warehousing, querying large datasets, and performance tuning.
- Implement ETL/ELT pipelines for structured and unstructured data.
- Work with Dataproc (Apache Spark, Hadoop) for batch processing of large datasets.

Real-Time Data Streaming:
- Use Pub/Sub to build real-time, event-driven streaming pipelines.
- Implement Dataflow (Apache Beam) pipelines for both real-time and batch data processing.

Workflow Orchestration & Automation:
- Use Cloud Composer (Apache Airflow) to schedule and automate data workflows.
- Build monitoring solutions to ensure data pipeline health and performance.

Cloud Infrastructure & DevOps:
- Use Terraform to provision and manage cloud infrastructure.
- Work with Google Kubernetes Engine (GKE) for container orchestration and managing distributed applications.

Advanced SQL & Data Engineering:
- Write efficient SQL queries for data transformation, aggregation, and analysis.
- Optimize query performance and cost efficiency in BigQuery.
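As an illustration of the streaming work described above, here is a minimal pure-Python sketch of fixed-window aggregation, the pattern a Dataflow (Apache Beam) pipeline typically applies to a Pub/Sub event stream. The event data and window size are hypothetical, and a real pipeline would use the Beam SDK's windowing primitives rather than this hand-rolled version.

```python
from collections import defaultdict

def fixed_windows(events, window_secs=60):
    """Group (timestamp, value) events into fixed windows, summing values per window.

    This mimics, in plain Python, the fixed-window aggregation a Dataflow
    streaming pipeline would perform; a real Beam pipeline would express it
    with WindowInto(FixedWindows(window_secs)) plus a grouped combine.
    """
    windows = defaultdict(int)
    for ts, value in events:
        window_start = ts - (ts % window_secs)  # align timestamp to its window boundary
        windows[window_start] += value
    return dict(sorted(windows.items()))

# Hypothetical click events as (unix_timestamp, click_count) pairs.
events = [(0, 1), (30, 2), (61, 5), (119, 1), (120, 4)]
print(fixed_windows(events))  # sums per 60-second window: {0: 3, 60: 6, 120: 4}
```

The same grouping-by-window-key idea underlies both the streaming and batch cases; Beam adds event-time semantics, triggers, and late-data handling on top of it.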
Required Skills & Qualifications:
- 4-8 years of experience in GCP data engineering
- Strong expertise in BigQuery, Dataflow, Dataproc, Pub/Sub, and GCS
- Experience with SQL and Python or Java for data processing and transformation
- Proficiency in Airflow (Cloud Composer) for scheduling workflows
- Hands-on experience with Terraform for cloud infrastructure automation
- Familiarity with NoSQL databases such as Bigtable for high-scale data handling
- Knowledge of GKE for containerized applications and distributed processing

Preferred Qualifications:
- Experience with CI/CD pipelines for data deployments
- Familiarity with Cloud Functions or Cloud Run for serverless execution
- Understanding of data governance, security, and compliance

Why Join Us?
- Work on cutting-edge GCP data projects in a cloud-first environment
- Competitive salary and career growth opportunities
- Collaborative and innovative work culture
- Exposure to big data, real-time streaming, and advanced analytics
Gloify