Home
Jobs

13 GCP Services Jobs

Filter Interviews
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a Job Alert
Filter
JobPe aggregates job listings for easy access; applications are completed directly on the original job portal.

8.0 - 12.0 years

3 - 8 Lacs

Bengaluru

Remote

Source: Naukri

- Analyze the current GCP setup and DevOps workflows
- Propose and implement improvements in infrastructure, security, and automation
- Build and maintain scalable, secure GKE clusters and CI/CD pipelines

Posted 4 days ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Hyderabad, Pune

Work from Office

Source: Naukri

DevOps Engineer | 5-10 Years | Pune & Hyderabad

We are looking for a skilled DevOps Engineer with 5 to 12 years of experience to join our dynamic team. The ideal candidate will have a strong background in DevOps practices, CI/CD pipeline creation, and experience with GCP services. You will play a crucial role in ensuring smooth development, deployment, and integration processes.

Key Responsibilities:
- CI/CD Pipeline Creation: Design, implement, and manage CI/CD pipelines using GitHub, ensuring seamless integration and delivery of software.
- Version Control: Manage and maintain code repositories using GitHub, following best practices for version control and collaboration.
- Infrastructure as Code: Write and maintain infrastructure as code (IaC) using Terraform/YAML, ensuring consistent and reliable deployment processes.
- GCP Services Management: Utilize Google Cloud Platform (GCP) services to build, deploy, and scale applications; manage and optimize cloud resources.
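As a rough illustration of the Terraform-based IaC this role mentions, a minimal sketch of a GKE cluster definition might look like the following. The project ID, region, and resource names are illustrative placeholders, not details from the posting:

```hcl
# Hypothetical example: provision a small GKE cluster with Terraform.
# Project ID, region, and names are placeholders.
provider "google" {
  project = "example-project-id"
  region  = "asia-south1"
}

resource "google_container_cluster" "primary" {
  name     = "example-gke-cluster"
  location = "asia-south1"

  # Manage node pools separately; drop the default pool on creation.
  remove_default_node_pool = true
  initial_node_count       = 1
}

resource "google_container_node_pool" "default" {
  name       = "default-pool"
  cluster    = google_container_cluster.primary.name
  location   = "asia-south1"
  node_count = 2

  node_config {
    machine_type = "e2-standard-4"
  }
}
```

Keeping the node pool as a separate resource (rather than the cluster's default pool) lets node settings change without recreating the whole cluster.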

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Kolkata

Hybrid

Source: Naukri

Role: GCP Data Engineer
Experience: 4+ years (data engineering background preferred)
Location: Kolkata (face-to-face interview)

Required Skills: GCP data engineering experience, BigQuery, SQL, Cloud Composer/Python, Cloud Functions, Dataproc + PySpark, Python ingestion, Dataflow + Pub/Sub

Job description:
- Implemented and architected solutions on Google Cloud Platform using GCP components.
- Experience with Apache Beam / Google Dataflow / Apache Spark in creating end-to-end data pipelines.
- Experience in some of the following: Python, Hadoop, Spark, SQL, BigQuery, Bigtable, Cloud Storage, Datastore, Spanner, Cloud SQL, Machine Learning.
- Experience programming in Java, Python, etc.
- Expertise in at least two of these technologies: relational databases, analytical databases, NoSQL databases.
- Google Professional Data Engineer / Solution Architect certification is a major advantage.

Skills Required:
- 3+ years of IT or professional services experience in IT delivery or large-scale IT analytics projects.
- Expert knowledge of Google Cloud Platform; other cloud platforms are nice to have.
- Expert knowledge of SQL development.
- Expertise in building data integration and preparation tools using cloud technologies (SnapLogic, Google Dataflow, Cloud Dataprep, Python, etc.).
- Identify downstream implications of data loads/migration (e.g., data quality, regulatory).
- Implement data pipelines to automate the ingestion, transformation, and augmentation of data sources, and provide best practices for pipeline operations.
- Ability to work in a rapidly changing business environment and to enable simplified user access to massive data by building scalable data solutions.
- Advanced SQL writing and experience in data mining (SQL, ETL, data warehouse, etc.) with complex datasets in a business environment.

Interested candidates, please reply with your updated CV to aruna.b@tredence.com.
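The end-to-end pipeline work this posting describes (Dataflow/Apache Beam style: ingest, parse out bad records, aggregate per key) can be sketched in plain Python to show the shape of the stages. This is an illustrative outline of the pattern, not the Beam API itself, and the sample records are invented:

```python
# Illustrative ingest -> transform -> aggregate sketch; a real Dataflow
# job would express the same stages as Apache Beam transforms.
from collections import defaultdict

def parse(lines):
    """Parse raw CSV-like records into (key, value) pairs, skipping bad rows."""
    for line in lines:
        parts = line.strip().split(",")
        if len(parts) != 2:
            continue  # drop malformed records, as a Beam ParDo might
        key, value = parts
        try:
            yield key, float(value)
        except ValueError:
            continue  # drop records with non-numeric values

def aggregate(pairs):
    """Group by key and sum values (akin to beam.CombinePerKey(sum))."""
    totals = defaultdict(float)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

raw = ["a,1.5", "b,2.0", "a,3.5", "bad-row", "b,x"]
result = aggregate(parse(raw))
# result == {"a": 5.0, "b": 2.0}
```

The generator-based `parse` mirrors how streaming pipelines process records one at a time rather than materializing the whole input.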

Posted 1 week ago

Apply

4.0 - 6.0 years

14 - 15 Lacs

Mumbai

Work from Office

Source: Naukri

4+ years of experience in cloud pre-sales and consulting/architecture roles, specifically with GCP. Work with enterprise clients across different verticals. Prepare technical presentations, demos, proof-of-concept solution documentation, proposals, and responses to RFPs/RFIs.

Required candidate profile: Solid understanding of GCP services such as Compute Engine, App Engine, GKE, BigQuery, Cloud Storage, Pub/Sub, IAM, VPC, etc. Ability to perform TCO/ROI analysis and cost optimization for GCP solutions.
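The TCO/ROI analysis this profile calls for boils down to comparing total cost of ownership over an evaluation window against the alternative. A toy sketch, where all figures are made-up placeholders rather than real pricing:

```python
# Toy TCO/ROI comparison for a cloud migration; every number below is
# an illustrative placeholder, not real GCP or on-prem pricing.

def tco(upfront: float, monthly: float, months: int) -> float:
    """Total cost of ownership over the evaluation window."""
    return upfront + monthly * months

def roi(savings: float, cost: float) -> float:
    """Return on investment expressed as a fraction of cost."""
    return (savings - cost) / cost

# Compare a 36-month on-prem baseline against a hypothetical cloud setup.
on_prem = tco(upfront=50_000, monthly=4_000, months=36)  # 194,000
cloud   = tco(upfront=10_000, monthly=3_000, months=36)  # 118,000
result = round(roi(savings=on_prem, cost=cloud), 3)      # 0.644
```

A real analysis would fold in egress, support, licensing, and committed-use discounts, but the comparison structure stays the same.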

Posted 1 week ago

Apply

5.0 - 7.0 years

15 - 25 Lacs

Hyderabad, Pune

Work from Office

Source: Naukri

DevOps Engineer | 5-7 Years | Pune & Hyderabad

We are looking for a skilled DevOps Engineer with 5 to 12 years of experience to join our dynamic team. The ideal candidate will have a strong background in DevOps practices, CI/CD pipeline creation, and experience with GCP services. You will play a crucial role in ensuring smooth development, deployment, and integration processes.

Key Responsibilities:
- CI/CD Pipeline Creation: Design, implement, and manage CI/CD pipelines using GitHub, ensuring seamless integration and delivery of software.
- Version Control: Manage and maintain code repositories using GitHub, following best practices for version control and collaboration.
- Infrastructure as Code: Write and maintain infrastructure as code (IaC) using Terraform/YAML, ensuring consistent and reliable deployment processes.
- GCP Services Management: Utilize Google Cloud Platform (GCP) services to build, deploy, and scale applications; manage and optimize cloud resources.

Posted 1 week ago

Apply

3.0 - 5.0 years

6 - 8 Lacs

Chandigarh

Work from Office

Source: Naukri

Role Overview
We are seeking a talented ETL Engineer to design, implement, and maintain end-to-end data ingestion and transformation pipelines in Google Cloud Platform (GCP). This role collaborates closely with data architects, analysts, and BI developers to ensure high-quality, performant data delivery into BigQuery and downstream Power BI reporting layers.

Key Responsibilities
- Data Ingestion & Landing: Architect and implement landing zones in Cloud Storage for raw data. Manage buckets/objects and handle diverse file formats (Parquet, Avro, CSV, JSON, ORC).
- ETL Pipeline Development: Build and orchestrate extraction, transformation, and loading workflows using Cloud Data Fusion. Leverage Data Fusion Wrangler for data cleansing, filtering, imputation, normalization, type conversion, splitting, joining, sorting, union, pivot/unpivot, and format adjustments.
- Data Modeling: Design and maintain fact and dimension tables using Star and Snowflake schemas. Collaborate on semantic layer definitions to support downstream reporting.
- Load & Orchestration: Load curated datasets into BigQuery across different zones (raw, staging, curated). Develop SQL-based orchestration and transformation within BigQuery (scheduled queries, scripting).
- Performance & Quality: Optimize ETL jobs for throughput, cost, and reliability. Implement monitoring, error handling, and data quality checks.
- Collaboration & Documentation: Work with data analysts and BI developers to understand requirements and ensure data readiness for Power BI. Maintain clear documentation of pipeline designs, data lineage, and operational runbooks.

Required Skills & Experience
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of hands-on experience building ETL pipelines in GCP.
- Proficiency with Cloud Data Fusion, including Wrangler transformations.
- Strong command of SQL, including performance tuning in BigQuery.
- Experience managing Cloud Storage buckets and handling Parquet, Avro, CSV, JSON, and ORC formats.
- Solid understanding of dimensional modeling: fact vs. dimension tables, Star and Snowflake schemas.
- Familiarity with BigQuery data zones (raw, staging, curated) and dataset organization.
- Experience with scheduling and orchestration tools (Cloud Composer, Airflow, or BigQuery scheduled queries).
- Excellent problem-solving skills and attention to detail.

Preferred (Good to Have)
- Exposure to Power BI data modeling and DAX.
- Experience with other GCP services (Dataflow, Dataproc).
- Familiarity with Git, CI/CD pipelines, and infrastructure as code (Terraform).
- Knowledge of Python for custom transformations or orchestration scripts.
- Understanding of data governance best practices and metadata management.
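The fact-vs-dimension modeling this role requires can be sketched as a tiny normalization step: denormalized raw rows are split into a dimension lookup plus a slim fact table keyed by surrogate IDs. Table and field names here are hypothetical:

```python
# Minimal sketch of splitting denormalized rows into a star schema:
# one dimension table (customers) plus a fact table keyed by surrogate id.
# All field names and data are hypothetical.

raw_rows = [
    {"customer": "Acme", "city": "Pune", "amount": 120.0},
    {"customer": "Bolt", "city": "Mumbai", "amount": 80.0},
    {"customer": "Acme", "city": "Pune", "amount": 40.0},
]

dim_customer = {}  # natural key -> surrogate key
fact_sales = []

for row in raw_rows:
    key = (row["customer"], row["city"])
    if key not in dim_customer:
        dim_customer[key] = len(dim_customer) + 1  # assign next surrogate id
    fact_sales.append({"customer_id": dim_customer[key],
                       "amount": row["amount"]})

# dim_customer holds 2 distinct customers; fact_sales keeps 3 slim rows
```

In BigQuery the same split would typically be a MERGE into the dimension table followed by a keyed insert into the fact table, but the surrogate-key idea is identical.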

Posted 2 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centres (Delivery Centres), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centres offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations.
- Ability to analyse data for functional business requirements and interact directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementation the candidate has done; should understand the purpose/KPIs for which the data transformation was done.

Preferred technical and professional experience
- Experience with AEM Core Technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git.
- Knowledge of patterns and good practices to design and develop quality, clean code.
- Knowledge of HTML, CSS, JavaScript, and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools such as JIRA and Confluence.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

1 - 4 Lacs

Noida

Work from Office

Source: Naukri

FinOps Practitioner / Sr. FinOps Practitioner

Skills & Experience
- Manage cost visibility of the public cloud platform for AWS/Azure/GCP.
- Monitor cloud spend and create budget alerts.
- Review, recommend, and facilitate the implementation of a FinOps tool.
- Conduct periodic reports and regular reviews of cloud spend with the (AWS/Azure/GCP) cloud provider.
- Manage cloud commitments (CUDs, Savings Plans, RIs); suggest use of preemptible or spot instances where suitable, and recommend the right cloud services for optimization.
- Act as the bridge between Finance, Product Owners, and Cloud Engineers.
- Advocate FinOps principles in day-to-day operations and instil a FinOps culture among stakeholders.
- Engage with multiple client stakeholders (technology, finance, and procurement) to understand current consumption patterns and run cost.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or another relevant field.
- At least 10 years of experience on public cloud platforms and at least 5 years of exposure to AWS/Azure/GCP billing management.
- Associate- or Professional-level certification in AWS/Azure/GCP is a plus; FinOps Certified Practitioner is an additional advantage.
- Good understanding of AWS/Azure/GCP billing methodology and organization/project structure.
- Good understanding of instance types, storage types, and other AWS/Azure/GCP services.
- Good understanding of cost drivers for cloud resources and of variable cost models.
- Able to consolidate data and deliver aggregate views/reports.
- Moderate verbal and written communication skills to work effectively with technical and non-technical personnel at various levels in the organization and with vendors.
- Experience building BI dashboards (Power BI, Tableau, etc.).
- Experience with platforms like Flexera, CloudHealth, Apptio, or Datadog would be an added advantage.
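A core FinOps decision this role covers, weighing a committed-use discount (CUD) against on-demand pricing, reduces to break-even arithmetic: committed spend is paid whether or not the capacity runs, so it only wins when expected utilization is high enough. The rates below are invented for illustration:

```python
# Illustrative break-even check for a committed-use discount (CUD).
# Rates, hours, and utilization are made-up numbers, not real pricing.

def monthly_cost(rate_per_hour: float, hours: float) -> float:
    return rate_per_hour * hours

def cud_worthwhile(on_demand_rate: float, cud_rate: float,
                   committed_hours: float, expected_hours: float) -> bool:
    """A CUD pays off when committed spend beats on-demand spend for
    the hours the workload is actually expected to run."""
    on_demand = monthly_cost(on_demand_rate, expected_hours)
    committed = monthly_cost(cud_rate, committed_hours)  # paid regardless of use
    return committed < on_demand

# 730-hour month, 30% discount, instance expected to run ~90% of the time.
result = cud_worthwhile(on_demand_rate=0.10, cud_rate=0.07,
                        committed_hours=730, expected_hours=657)
# committed = 51.10 vs on-demand = 65.70, so the commitment wins here
```

The same shape applies to Savings Plans and RIs; only the discount mechanics and commitment granularity differ.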

Posted 3 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Source: Foundit

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centres (Delivery Centres), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centres offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat.

In your role, you will be responsible for:
- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations.
- Ability to analyse data for functional business requirements and interact directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementation the candidate has done; should understand the purpose/KPIs for which the data transformation was done.

Preferred technical and professional experience
- Experience with AEM Core Technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git.
- Knowledge of patterns and good practices to design and develop quality, clean code.
- Knowledge of HTML, CSS, JavaScript, and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools such as JIRA and Confluence.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

0 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Source: Foundit

Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience. In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
In your role, you will be responsible for:
- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative, and able to respond to critical situations.
- Ability to analyse data for functional business requirements and interact directly with the customer.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- Ambitious individual who can work under their own direction towards agreed targets/goals, with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementation the candidate has done; should understand the purpose/KPIs for which the data transformation was done.

Preferred technical and professional experience
- Experience with AEM Core Technologies: OSGi services, Apache Sling, Granite framework, Java Content Repository API, Java 8+, localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git.
- Knowledge of patterns and good practices to design and develop quality, clean code.
- Knowledge of HTML, CSS, JavaScript, and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools such as JIRA and Confluence.

Posted 3 weeks ago

Apply

10 - 18 years

6 - 16 Lacs

Kolkata, Hyderabad, Pune

Work from Office

Source: Naukri

GCP services and infrastructure. Key areas: Cloud Architecture & Design; Cloud Infrastructure Management; Collaboration & Stakeholder Engagement; Monitoring & Troubleshooting.

Posted 1 month ago

Apply

- 1 years

0 - 2 Lacs

Gurugram

Work from Office

Source: Naukri

Seeking a motivated DevOps Intern with basic AWS/GCP knowledge and an interest in DevOps to support CI/CD, automation, and cloud infrastructure tasks.

Required candidate profile: Pursuing or completed a CS/IT degree; basic AWS/GCP, Linux, and Git knowledge; scripting (Bash/Python) a plus; strong problem-solving skills; eager to learn and collaborate.

Posted 1 month ago

Apply

8 - 13 years

5 - 12 Lacs

Chennai, Bengaluru, Hyderabad

Work from Office

Source: Naukri

Role: Python Developer
Experience: 8+ years
Location: Bangalore, Hyderabad, Chennai
Work Mode: Work from Office

Responsibilities:
- Participate effectively in the entire software development life cycle.
- Lead solution design and implementation.
- Design, develop, test, and refine deliverables that meet the objectives.
- Collaborate with US partners on requirement understanding, solution implementation, and deployments.
- Collaborate with the Product Team, Scrum Master, Developers, QA, and other stakeholders as needed.
- Analyze business and technology challenges and suggest solutions.
- Responsible for application development, maintenance, and security requirements.

Essential Qualifications (Technical Skills):
- 8+ years of experience as a Python developer with strong Python programming skills.
- Design, develop, and maintain backend services using FastAPI/Flask.
- Design, develop, and deploy microservices using Python.
- In-depth experience with Python frameworks and tools such as NumPy, Pandas, and PyMongo.
- Implement event-driven architecture and use messaging queues such as Kafka, RabbitMQ, or ActiveMQ for asynchronous processing.
- Ensure efficient multithreading and concurrency in backend processes.
- Design and implement secure REST APIs for consumption by a React application.
- Consume services provided by interface systems.
- Write complex MongoDB queries and perform data aggregation; experience with MongoDB for database management and data retrieval.
- Implement security measures to protect API endpoints.
- Hands-on experience with GCP services, particularly Vertex AI and Document AI.
- Understanding of cloud principles and experience developing applications hosted in a cloud environment.
- Understanding of high availability, scalability, and resilience in software systems.
- Experience with CI/CD technologies such as Gradle, Jenkins, GitHub, Artifactory, Harness, Sonar, OpenShift/Kubernetes, Docker, etc.
- Experience with automated unit testing frameworks (pytest, MagicMock).
- Experience with the agile software development lifecycle.
- Object-oriented design and analysis, programming styles, and design patterns.

Non-Technical:
- Capable of reasoning through problems and developing desired solutions, independently or with others as required.

If interested, please share your updated resume: komalikab@upwardiq.com, nithishb@upwardiq.com

Please fill in the details below:
- Total experience:
- Relevant experience:
- Experience as a Python developer:
- Experience with CI/CD technologies:
- Experience with pytest:
- Experience with Kafka:
- Experience with RabbitMQ:
- Experience with ActiveMQ:
- Experience with GCP services:
- Experience with MongoDB:
- Experience with Python frameworks:
- LinkedIn ID:
- Current company:
- Current CTC:
- Expected CTC:
- Last working day / notice period:
- Notice period (negotiable):
- Current location:
- Preferred location:
- Holding an offer:
- Any pipelines (please describe clearly):
- How soon can you join:
- Reason for leaving previous company:
- Are you okay with the location:
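The multithreading and concurrency requirement in the posting above can be illustrated with a small standard-library sketch using `concurrent.futures`; the `fetch` function is a placeholder for real I/O-bound backend work (a DB query or HTTP call), not code from the role:

```python
# Minimal concurrency sketch with the standard library; "fetch" is a
# stand-in for I/O-bound backend work such as a DB query or HTTP call.
from concurrent.futures import ThreadPoolExecutor

def fetch(item_id: int) -> int:
    # Placeholder for an I/O-bound call; here it just doubles the id.
    return item_id * 2

# Run up to 4 "fetches" concurrently; map preserves input order.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(fetch, range(5)))
```

Threads suit I/O-bound work like this; for CPU-bound backend processing, `ProcessPoolExecutor` has the same interface but sidesteps the GIL.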

Posted 1 month ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies