220 GCP Cloud Jobs - Page 8

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

3.0 - 8.0 years

15 - 30 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid


Salary: 15 to 30 LPA | Experience: 3 to 8 years | Location: Gurgaon (Hybrid) | Notice: Immediate to 30 days | Key Skills: GCP, Cloud, Pub/Sub, Data Engineer

Posted 1 month ago

Apply

4.0 - 8.0 years

8 - 18 Lacs

Hyderabad

Remote


Programmer: Python + SQL Developer (Backend + Data Analytics)

Role: We are looking for a versatile Python + SQL developer who can work across both backend application development and data analytics workflows. You will build scalable APIs, automate data pipelines, and enable data-driven decision-making through analytical scripting and reporting.

Key Responsibilities:
- Develop and maintain backend systems and REST APIs using Python (Flask, FastAPI, or Django).
- Design and optimize relational databases (e.g., PostgreSQL, MySQL) for both application logic and analytics.
- Build and automate ETL workflows for internal data processing and reporting.
- Write complex SQL queries for data extraction, transformation, and aggregation (see the sketch after this listing).
- Generate analytical outputs, dashboards, and insights using Python (pandas, Matplotlib/Plotly).
- Collaborate with cross-functional teams (Product, Ops, Data) to deliver data-backed features and reports.
- Expertise in Excel pivot tables.

Required Skills:
- Proficiency in Python for backend and scripting tasks.
- Strong expertise in SQL and relational databases.
- Experience building APIs and backend services using Flask, FastAPI, or Django.
- Hands-on experience with pandas, NumPy, and other analytics libraries, plus Excel pivot tables.
- Ability to write reusable, well-structured code with good documentation.
- Experience with version control tools like Git.

Job Benefits & Perks: As per general IT industry practices.
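To make the backend-plus-analytics split above concrete, here is a minimal sketch pairing a FastAPI endpoint with a SQL aggregation post-processed in pandas. It assumes a local SQLite database named app.db with an orders table; all table, column, and endpoint names are illustrative, not from the posting.

```python
# Minimal sketch: one reporting endpoint, SQL for the heavy lifting,
# pandas for light post-processing. Names are placeholders.
import sqlite3

import pandas as pd
from fastapi import FastAPI

app = FastAPI()

@app.get("/reports/revenue-by-region")
def revenue_by_region():
    # Aggregation pushed down to SQL; derived column computed in pandas.
    query = """
        SELECT region, SUM(amount) AS revenue, COUNT(*) AS n_orders
        FROM orders
        GROUP BY region
        ORDER BY revenue DESC
    """
    with sqlite3.connect("app.db") as conn:  # hypothetical local DB
        df = pd.read_sql_query(query, conn)
    df["avg_order_value"] = df["revenue"] / df["n_orders"]
    return df.to_dict(orient="records")
```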

Posted 1 month ago

Apply

5 - 10 years

5 - 15 Lacs

Hyderabad, Chennai

Hybrid


Databuzz is hiring a GCP Cloud Engineer (Hyderabad/Chennai), 5+ years, hybrid; immediate joiners preferred. If you are interested, please mail your profile to alekya.chebrolu@databuzzltd.com with the details below.

About Databuzz Ltd: Databuzz is a one-stop shop for data analytics, specializing in Data Science, Big Data, Data Engineering, AI & ML, Cloud Infrastructure, and DevOps. We are an MNC based in both the UK and India, and an ISO 27001 and GDPR compliant company.

Please include: CTC, ECTC, Notice Period/LWD (candidates serving notice period will be preferred).

Position: GCP Cloud Engineer (Hyderabad/Chennai), 5+ years, hybrid.

Mandatory Skills:
- L2-level support across all aspects of Google Cloud administrative capabilities, with knowledge of Denodo capabilities.
- Experience with Linux operating systems in server environments.
- Familiarity with automation using Terraform and Ansible is preferred.
- Knowledge of GitLab integration with Denodo for code promotion and version control.
- Familiarity with Python scripting for Denodo integrations.
- Google Cloud Console capabilities.

Regards,
Alekya
Talent Acquisition Lead
alekya.chebrolu@databuzzltd.com

Posted 1 month ago

Apply

2 - 5 years

10 - 20 Lacs

Bengaluru

Hybrid


What the job involves: You will be joining a fast-growing team of motivated and talented engineers, helping us build and enhance a suite of innovative products that are changing the mobile marketing industry by enabling our clients to measure the effectiveness of their campaigns in a completely novel way. Working closely with our existing team of software engineers, you will contribute to improving our product suite by adding new features to existing systems and helping create new systems that enable new product offerings. You'll work under the mentorship of a lead software engineer who will support you and manage your onboarding and continuous professional development. We operate a blameless culture with a flat organizational structure where all input and ideas are welcome; we make decisions fast and value good ideas over seniority, so everyone on the team can make a real difference in product evolution.

Who you are

Required Skills:
- At least 2-5 years of commercial software engineering experience in Golang, including REST API development and a strong understanding of data structures and concurrency using goroutines.
- Experience with relational databases (e.g., MySQL) is a must; exposure to GraphQL and NoSQL databases (e.g., MongoDB) is an advantage.
- You've worked with microservice architectures with a good appreciation of performance and quality requirements.
- Hands-on experience with Docker containers and Kubernetes.
- Experience working with cloud platforms such as GCP, AWS, or Azure.
- Experience with data engineering technologies such as ELT/ETL workflows, Kafka, and Airflow.

Optional Skills:
- Any experience with C#, Python, or NodeJS.
- Understanding of the adtech landscape and familiarity with mobile advertising measurement solutions is highly desirable.
- Experience using AI-powered coding assistants such as GitHub Copilot.

Soft Skills:
- You enjoy new challenges and gain satisfaction from solving interesting problems in a wide range of areas.
- You care deeply about the quality of your work; you are sensitive to the importance of testing your code thoroughly and maintaining it to a high standard.
- You don't need to be micromanaged; you'll ask for help when you need it, but you can apply initiative to solve problems on your own.
- You are enthusiastic about broadening your skill set; you are willing and able to quickly learn new techniques and technologies.
- You know how to collaborate effectively with a remote, international team.

Posted 1 month ago

Apply

2 - 5 years

10 - 20 Lacs

Bengaluru

Hybrid


What the job involves: You will be joining a fast-growing team of motivated and talented engineers, helping us build and enhance a suite of innovative products that are changing the mobile marketing industry by enabling our clients to measure the effectiveness of their campaigns in a completely novel way. Working closely with our existing team of software engineers, you will contribute to improving our product suite by adding new features to existing systems and helping create new systems that enable new product offerings. You'll work under the mentorship of a lead software engineer who will support you and manage your training and continuous professional development. We operate a blameless culture with a flat organizational structure where all input and ideas are welcome; we make decisions fast and value good ideas over seniority, so everyone on the team can make a real difference in product evolution.

Who you are

Required Skills:
- At least 2-5 years of commercial full-stack software engineering experience, developing backend APIs with Golang or NodeJS and integrating mobile SDKs/web applications with backend APIs and services.
- A passion for developing front-end applications that create great user interfaces; commercial experience with VueJS is a must.
- Experience with relational databases (e.g., MySQL) is a must; exposure to GraphQL and NoSQL databases (e.g., MongoDB) is an advantage.
- You've worked with microservice architectures with a good appreciation of performance and quality requirements.
- Hands-on experience with Docker containers and Kubernetes.
- Experience working with cloud platforms such as GCP, AWS, or Azure.
- Experience with data engineering technologies such as ELT/ETL workflows, Kafka, and Airflow.

Optional Skills:
- Any experience with Angular or React.
- Experience using AI-powered coding assistants such as GitHub Copilot.
- Understanding of the adtech landscape and familiarity with mobile advertising measurement solutions is highly desirable.

Soft Skills:
- You enjoy new challenges and gain satisfaction from solving interesting problems in a wide range of areas.
- You don't need to be micromanaged; you'll ask for help when you need it, but you can apply initiative to solve problems on your own.
- You are enthusiastic about broadening your skill set; you are willing and able to quickly learn new techniques and technologies.
- You care deeply about the quality of your work; you are sensitive to the importance of testing your code thoroughly and maintaining it to a high standard.
- You know how to collaborate effectively with a remote, international team.

Posted 1 month ago

Apply

6 - 10 years

9 - 18 Lacs

Bengaluru

Remote


Hiring: Performance Tester
Experience: 6+ years
- Good understanding of GCP and other cloud platforms
- Strong in Performance Center
- Good skills in the Web-HTTP protocol and APIs
- Very strong in Dynatrace or AppDynamics
- Strong in LoadRunner and JMeter (a rough Python load-probe sketch follows this listing)
- Good communication skills
You must be willing to work a split shift, specifically 10 AM to 2 PM and 6 PM to 10 PM.
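The posting names LoadRunner and JMeter; purely as an illustration of the underlying idea (concurrent HTTP load plus latency percentiles), here is a rough plain-Python probe. The target URL, worker count, and request count are made-up placeholders.

```python
# Rough sketch of a concurrent latency probe, not a substitute for
# LoadRunner/JMeter; all numbers and the URL are placeholders.
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

import requests

URL = "https://example.com/api/health"  # hypothetical endpoint

def timed_get(_):
    start = time.perf_counter()
    requests.get(URL, timeout=10)
    return (time.perf_counter() - start) * 1000  # latency in ms

with ThreadPoolExecutor(max_workers=20) as pool:
    latencies = sorted(pool.map(timed_get, range(200)))

print(f"p50 = {statistics.median(latencies):.1f} ms")
print(f"p95 = {latencies[int(len(latencies) * 0.95)]:.1f} ms")
```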

Posted 1 month ago

Apply

7 - 11 years

8 - 11 Lacs

Hyderabad

Work from Office


Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About The Role

Role Purpose: The purpose of the role is to support process delivery by ensuring the daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Skills: Java, cloud, Kotlin; 7 to 11 years. One of the following combinations:
1. Java, Spring Boot, Kafka, Kotlin, MongoDB, GCP
2. Java, Spring Boot, Kotlin, MongoDB, Cloud
3. Java, Spring Boot, Kafka, MongoDB, GCP Cloud

Responsibilities:
- Handle technical escalations through effective diagnosis and troubleshooting of client queries.
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it in a timely manner to TA & SES.
- Provide product support and resolution to clients by performing question diagnosis and guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with Production Specialists; develop and conduct trainings (triages) within products per target, and inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and any other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Performance parameters:
1. Process: number of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT.
2. Team management: productivity, efficiency, absenteeism.
3. Capability development: triages completed, technical test performance.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA; as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro.
Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 month ago

Apply

7 - 12 years

5 - 15 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


We are hiring a PostgreSQL DBA.

Experience range & skills (mandatory, with no exceptions): 7+ years as a PostgreSQL DBA, with cloud and SQL experience (a small diagnostic sketch follows this listing).
Education: BE/B.Tech/MCA/M.Tech/MSc/MS.

Responsibilities: A day in the life of an Infoscion:
- As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment.
- You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs.
- You will create requirement specifications from business needs, define the to-be processes, and produce detailed functional designs based on requirements.
- You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives.
- You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.

If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
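As one hedged example of routine PostgreSQL DBA work, the sketch below flags long-running queries via the pg_stat_activity system view using psycopg2. Connection details are placeholders, not from the posting.

```python
# Sketch: list sessions whose current query has run for more than
# 5 minutes. Host, database, and credentials are placeholders.
import psycopg2

conn = psycopg2.connect(host="localhost", dbname="appdb",
                        user="dba", password="secret")  # placeholder creds
with conn, conn.cursor() as cur:
    cur.execute("""
        SELECT pid, state, now() - query_start AS runtime, left(query, 80)
        FROM pg_stat_activity
        WHERE state <> 'idle'
          AND now() - query_start > interval '5 minutes'
        ORDER BY runtime DESC
    """)
    for pid, state, runtime, query in cur.fetchall():
        print(pid, state, runtime, query)
```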

Posted 1 month ago

Apply

1 - 3 years

5 - 10 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office


Job Description: Junior Backend Developer

Overview: We are seeking a dedicated and talented Junior Backend Developer with 1-2 years of experience to join our growing team. The ideal candidate will be responsible for building and maintaining server-side logic, ensuring high performance and responsiveness to requests from the front end, and will collaborate with front-end developers to integrate user-facing elements with server-side logic.

Key Responsibilities:
- Develop and maintain server-side applications using backend technologies such as Node.js, Express, and databases.
- Write clean, scalable, and efficient code for backend logic.
- Design and implement APIs to support front-end functionality and integrate with other services.
- Optimize application performance, ensuring fast and responsive interactions.
- Debug and resolve technical issues, ensuring smooth operation of the applications.
- Collaborate with front-end developers to integrate user-facing elements with server-side logic.
- Participate in code reviews, providing and receiving constructive feedback.
- Stay updated with emerging trends and technologies in backend development.
- Assist in the deployment of applications and monitor performance to ensure stability.
- Contribute to the improvement of development processes and best practices.

Required Skills and Qualifications:
- Experience: 1-2 years of professional experience in backend development.
- Backend technologies: strong understanding of Node.js and Express.js or similar backend frameworks.
- Database management: experience with databases such as MongoDB, MySQL, or PostgreSQL, including database design and optimization.
- API development: practical experience designing and building RESTful APIs.
- Version control: proficiency with Git for version control and collaboration.
- Problem-solving: strong analytical and problem-solving skills with attention to detail.
- Communication: excellent verbal and written communication skills to work effectively in a team environment.
- Security best practices: basic understanding of security principles and practices in backend development.
- Testing: familiarity with testing frameworks and tools for backend services, such as Mocha, Chai, or Jest.

Preferred Skills:
- Additional frameworks: experience with other backend frameworks like Django, Flask, or Ruby on Rails is a plus.
- Cloud services: knowledge of cloud services (AWS, Azure, Google Cloud) and deployment processes.
- Containerization: familiarity with containerization tools like Docker and orchestration tools like Kubernetes.
- Agile/Scrum: experience working in an Agile/Scrum development environment.
- DevOps: basic understanding of DevOps practices and tools for continuous integration and deployment (CI/CD).

Education: Bachelor's degree in Computer Science, Information Technology, or a related field, or equivalent work experience.

Posted 1 month ago

Apply

3 - 7 years

6 - 7 Lacs

Chennai

Work from Office


AI/ML Developer

Position Description: The AI/ML Developer will develop, train, validate, and deploy AI and ML models that meet the project's analytical needs, ensuring accuracy, scalability, and efficiency.

Role & responsibilities:
- Data processing & analysis: collect, clean, and preprocess structured and unstructured data; conduct exploratory data analysis to identify trends and patterns.
- Model development: design, train, validate, and fine-tune machine learning models using frameworks such as TensorFlow, PyTorch, scikit-learn, or Hugging Face; develop and optimize deep learning architectures for classification, regression, NLP, or computer vision tasks (a minimal training sketch follows this listing).
- Deployment & integration: deploy models using tools like Docker, FastAPI, Flask, or TensorFlow Serving; integrate models into existing applications, pipelines, or APIs; implement monitoring and performance evaluation of deployed models.
- Documentation & collaboration: write clear and comprehensive documentation for models, data pipelines, and APIs; work collaboratively with data engineers, software developers, and domain experts to ensure alignment with project objectives.
- Innovation & research: stay up to date with the latest AI/ML research and tools; contribute to prototyping and innovation initiatives within the organization.

Deliverables:
- Clean, well-documented code and model scripts.
- Trained and validated AI/ML models with performance reports.
- Data pipelines and preprocessing scripts.
- Deployment-ready models integrated with APIs or platforms.
- Monthly progress reports on development and outcomes.
- A final report documenting model architecture, datasets, results, and recommendations.

Preferred candidate profile:
- Education: Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field.
- Experience: minimum 3 years of experience in machine learning or AI development.
- Proficiency in Python and ML frameworks such as TensorFlow, PyTorch, and scikit-learn.
- Experience with data preprocessing, feature engineering, and model evaluation.
- Experience with cloud services (AWS, Azure, GCP) for AI model deployment; good understanding of data structures, algorithms, and software engineering practices.
- Experience with database systems (SQL, NoSQL) and big data tools (e.g., Spark, Hadoop) is a plus.

Personal qualities:
- Strong analytical and problem-solving skills.
- Excellent programming and debugging abilities.
- Ability to communicate technical concepts to non-technical stakeholders.
- Attention to detail and commitment to reproducible research/code.
- Strong teamwork and a collaborative mindset.
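A minimal sketch of the train/validate/evaluate loop described above, using scikit-learn with its bundled breast-cancer dataset as a stand-in for project data; the model choice and hyperparameters are illustrative.

```python
# Sketch: hold-out split, fit, and evaluation report with scikit-learn.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Precision/recall/F1 per class on the held-out set.
print(classification_report(y_test, model.predict(X_test)))
```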

Posted 1 month ago

Apply

3 - 8 years

15 - 30 Lacs

Pune, Gurugram, Bengaluru

Hybrid


Salary: 15 to 30 LPA | Experience: 3 to 8 years | Location: Gurgaon/Bangalore/Pune/Chennai | Notice: Immediate to 30 days

Key Responsibilities & Skillsets:

Common Skillsets:
- 3+ years of experience in analytics with SAS, PySpark, Python, Spark, SQL, and associated data engineering work.
- Must have experience managing and transforming big data sets using PySpark, Spark-Scala, NumPy, and pandas (a minimal PySpark sketch follows this listing).
- Excellent communication and presentation skills.
- Experience managing Python codebases and collaborating with customers on model evolution.
- Good knowledge of database management and Hadoop/Spark, SQL, Hive, and Python (expertise).
- Superior analytical and problem-solving skills.
- Able to work on a problem independently and prepare client-ready deliverables with minimal or no supervision.
- Good communication skills for client interaction.

Data Management Skillsets:
- Ability to understand data models and identify ETL optimization opportunities; exposure to ETL tools is preferred.
- Strong grasp of advanced SQL functionality (joins, nested queries, and procedures).
- Strong ability to translate functional specifications/requirements into technical requirements.
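A small PySpark sketch of the kind of big-data transformation listed above; the input path, output path, and column names are invented for illustration.

```python
# Sketch: read raw events, aggregate per day and customer, write back
# partitioned by day. Paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("analytics-etl").getOrCreate()

df = spark.read.parquet("s3://bucket/events/")  # placeholder path
daily = (df
         .withColumn("day", F.to_date("event_ts"))
         .groupBy("day", "customer_id")
         .agg(F.count("*").alias("events"),
              F.sum("amount").alias("total_amount")))
daily.write.mode("overwrite").partitionBy("day").parquet("s3://bucket/daily/")
```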

Posted 1 month ago

Apply

1 - 3 years

3 - 6 Lacs

Jewar

Work from Office


Responsibilities: Develop, maintain, and enhance services, APIs, and databases. Design and implement scalable, high-performance, and secure architecture. Optimize database queries and ensure data integrity. Troubleshoot and resolve issues and performance bottlenecks.

Posted 1 month ago

Apply

5 - 9 years

5 - 15 Lacs

Gurugram

Work from Office


DevOps Team Lead, 5-9 years of experience.

Top 3 skills:
- Hands-on experience with CI/CD, Kubernetes (GKE), and Terraform-based GCP deployments.
- Well versed in cloud-based databases, Kafka or other event-based systems, and observability-stack deployments and automation (Grafana, Prometheus, Loki, Tempo, etc.), with a focus on performance monitoring.
- Familiarity with query languages such as SQL and PromQL (see the sketch after this listing).

Roles & Responsibilities:
- Design, build, and maintain infrastructure and tools to enable continuous integration, delivery, and deployment of software products.
- Collaborate with technology owners, developers, testers, and other stakeholders to understand the requirements and specifications of the software projects.
- Monitor, troubleshoot, and optimize the performance and availability of systems and applications.
- Research and evaluate new technologies and methodologies that can improve the efficiency and quality of DevOps processes.

Essential Skills:
- Proficient with common DevOps tools and technologies such as Git, Jenkins, Docker, Kubernetes, Ansible, Prometheus, AWS, and GCP.
- Proven experience managing multiple environments and managing DevOps engineers, or in a similar role in a software development environment.
- Strong problem-solving and troubleshooting skills.
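For the query-language bullet, a hedged sketch of running a PromQL latency query through Prometheus's HTTP query API from Python; the Prometheus address and metric names are placeholders, not from the posting.

```python
# Sketch: p95 request latency over 5 minutes via PromQL, fetched with
# Prometheus's /api/v1/query endpoint. Host and metric are placeholders.
import requests

PROM = "http://prometheus:9090"  # hypothetical in-cluster address
promql = ('histogram_quantile(0.95, '
          'sum(rate(http_request_duration_seconds_bucket[5m])) by (le))')

resp = requests.get(f"{PROM}/api/v1/query",
                    params={"query": promql}, timeout=10)
for sample in resp.json()["data"]["result"]:
    # Each sample is {"metric": {...}, "value": [timestamp, "value"]}.
    print(sample["metric"], sample["value"])
```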

Posted 1 month ago

Apply

4 - 6 years

12 - 20 Lacs

Hyderabad

Work from Office


Role & responsibilities:
- Development of modern applications on the back end and front end.
- Development, implementation, and optimisation of innovative IoT products, web apps, and new features.
- Implement cleaner solutions to problems using recommended system design concepts.
- Technical product design, solution architecture, specifications, and implementation of Livello solutions.
- Work with a cross-functional team to define, build, test, and deploy universal applications.
- Version control with Git is part of your daily work, along with continuous integration.
- Ensure the implementation of technical standards, quality assurance, and best practices.

Preferred candidate profile:
- Bachelor's/Master's degree in Computer Science or a comparable field of study.
- 4-5 years of experience in agile software development.
- Proficiency in Node.js, TypeScript, React.js, React Native/Flutter, MongoDB, and Docker, along with any cloud experience (AWS, GCP, Azure).
- Understanding of software architecture and design patterns.
- Experience with test and deployment automation (GitLab, Fastlane, Jest).
- Experience with GraphQL and Kubernetes as well as state management solutions (Redux, Saga).
- Ability to work cooperatively and independently, analytical and logical thinking, willingness to lead and take on responsibility.
- Strong understanding of OOP concepts and their practical application in software development.
- Fluent in English.

Nice to have:
- Experience with IoT-to-cloud managed services.
- Knowledge of IoT device management and message brokers such as AMQP or MQTT (a minimal MQTT sketch follows this listing).
- Interest in designing dashboards for data visualization.
- Working experience with Nest.js.

Perks and benefits:
- A responsible position in a fast-growing and highly innovative start-up.
- An agile and diverse team with colleagues from all over the world, working with our main office in Germany.
- An English-speaking open work environment with flat hierarchies and short decision-making paths.
- Creative freedom for your own ideas, projects, and personal development.
- Quarterly awards recognizing the hard work and talent within the team.
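For the MQTT item above, a minimal device-telemetry publisher sketch using the paho-mqtt convenience helper; the broker address, topic, and payload fields are hypothetical, not from the posting.

```python
# Sketch: publish one telemetry reading to an MQTT broker using the
# paho-mqtt "single" helper (connects, publishes, disconnects).
import json

import paho.mqtt.publish as publish

payload = json.dumps({"device_id": "fridge-042", "temp_c": 4.7})  # made up
publish.single(
    topic="livello/telemetry",     # hypothetical topic
    payload=payload,
    qos=1,
    hostname="broker.local",       # hypothetical broker address
    port=1883,
)
```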

Posted 1 month ago

Apply

8 - 13 years

15 - 25 Lacs

Gurugram, Chennai, Delhi / NCR

Work from Office


Role & responsibilities: Java Backend Developer with Java, Spring Boot, Spring, and Microservices experience on GCP Cloud; GCP certification preferred.

Posted 1 month ago

Apply

8 - 10 years

27 - 32 Lacs

Hyderabad, Gurugram, Bengaluru

Work from Office


The Team: As a Senior Lead Machine Learning Engineer on the Document Platforms and AI Team, you will play a critical role in building the next generation of data extraction tools, working on cutting-edge ML-powered products and capabilities that power natural language understanding, information retrieval, and data sourcing solutions for the Enterprise Data Organization and our clients. This is an exciting opportunity to shape the future of data transformation and see your work make a real difference, all while having fun in a collaborative and engaging environment. You'll spearhead the development and deployment of production-ready AI products and pipelines, leading by example and mentoring a talented team. This role demands a deep understanding of machine learning principles, hands-on experience with relevant technologies, and the ability to inspire and guide others. You'll be at the forefront of a rapidly evolving field, learning and growing alongside some of the brightest minds in the industry. If you're passionate about AI, driven to make an impact, and thrive in a dynamic and supportive workplace, we encourage you to join us!

The Impact: The Document Platforms and AI team has already delivered breakthrough products and significant business value over the last 3 years. In this role you will develop our next generation of new products while enhancing existing ones, aiming to solve high-impact business problems.

What's in it for you:
- Be part of a global company and build solutions at enterprise scale.
- Collaborate with a highly skilled and technically strong team.
- Contribute to solving high-complexity, high-impact problems.

Responsibilities:
- Build production-ready data acquisition and transformation pipelines from ideation to deployment.
- Be a hands-on problem solver and developer helping to extend and manage the data platforms.
- Apply best practices in data modeling and in building ETL pipelines (streaming and batch) using cloud-native solutions.
- Technical leadership: drive the technical vision and architecture for the extraction project, making key decisions about model selection, infrastructure, and deployment strategies.
- Model development: design, develop, and evaluate state-of-the-art machine learning models for information extraction, leveraging techniques from NLP, computer vision (if applicable), and other relevant domains (see the sketch after this listing).
- Data preprocessing and feature engineering: develop robust pipelines for data cleaning, preprocessing, and feature engineering to prepare data for model training.
- Model training and evaluation: train, tune, and evaluate machine learning models, ensuring high accuracy, efficiency, and scalability.
- Deployment and monitoring: deploy and maintain machine learning models in a production environment, monitoring their performance and ensuring their reliability.
- Research and innovation: stay up to date with the latest advancements in machine learning and NLP, and explore new techniques and technologies to improve the extraction process.
- Collaboration: work closely with product managers, data scientists, and other engineers to understand project requirements and deliver effective solutions.
- Code quality and best practices: ensure high code quality and adherence to best practices for software development.
- Communication: effectively communicate technical concepts and project updates to both technical and non-technical audiences.

What We're Looking For:
- 8-10 years of professional software work experience, with a strong focus on Machine Learning, Natural Language Processing (NLP) for information extraction, and MLOps.
- Expertise in Python and related NLP libraries (e.g., spaCy, NLTK, Transformers, Hugging Face).
- Experience with Apache Spark or other distributed computing frameworks for large-scale data processing.
- AWS/GCP cloud expertise, particularly in deploying and scaling ML pipelines for NLP tasks.
- Solid understanding of the machine learning model lifecycle, including data preprocessing, feature engineering, model training, evaluation, deployment, and monitoring, specifically for information extraction models.
- Experience with CI/CD pipelines for ML models, including automated testing and deployment.
- Docker and Kubernetes experience for containerization and orchestration.
- OOP design patterns, test-driven development, and enterprise system design.
- SQL (any variant; bonus if it is a big data variant).
- Linux OS (e.g., the bash toolset and other utilities).
- Version control experience with Git, GitHub, or Azure DevOps.
- Excellent problem-solving, code review, and debugging skills.
- Software craftsmanship, adherence to Agile principles, and taking pride in writing good code.
- Techniques to communicate change to non-technical people.

Nice to have:
- Core Java 17+, preferably Java 21+, and the associated toolchain.
- Apache Avro.
- Apache Kafka.
- Other JVM-based languages, e.g., Kotlin, Scala.
- C#, in particular .NET Core.
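A minimal sketch of ML-based information extraction as the role describes, using the Hugging Face pipeline API with its default named-entity-recognition model; the sample sentence is invented.

```python
# Sketch: entity extraction with transformers' NER pipeline. The first
# run downloads a default model; aggregation merges sub-word tokens.
from transformers import pipeline

extractor = pipeline("ner", aggregation_strategy="simple")

text = "Acme Corp acquired Beta Ltd for $5 million on 12 March 2024."
for entity in extractor(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```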

Posted 1 month ago

Apply

5 - 8 years

15 - 25 Lacs

Pune

Hybrid


Role & responsibilities:
- Data pipeline development: design, develop, and maintain data pipelines using Google Cloud Platform (GCP) services like Dataflow, Dataproc, and Pub/Sub.
- Data ingestion & transformation: build and implement data ingestion and transformation processes using tools such as Apache Beam and Apache Spark (a minimal Beam sketch follows this listing).
- Data storage management: optimize and manage data storage solutions on GCP, including BigQuery, Cloud Storage, and Cloud SQL.
- Security implementation: implement data security protocols and access controls with GCP's Identity and Access Management (IAM) and Cloud Security Command Center.
- System monitoring & troubleshooting: monitor and troubleshoot data pipelines and storage solutions using GCP's Stackdriver and Cloud Monitoring tools.
- Generative AI systems: develop and maintain scalable systems for deploying and operating generative AI models, ensuring efficient use of computational resources.
- Gen AI capability building: build generative AI capabilities among engineers, covering areas such as knowledge engineering, prompt engineering, and platform engineering.
- Knowledge engineering: gather and structure domain-specific knowledge to be used effectively by large language models (LLMs).
- Prompt engineering: design effective prompts to guide generative AI models, ensuring relevant, accurate, and creative text output.
- Collaboration: work with data experts, analysts, and product teams to understand data requirements and deliver tailored solutions.
- Automation: automate data processing tasks using scripting languages such as Python.
- Best practices: participate in code reviews and contribute to establishing best practices for data engineering within GCP.
- Continuous learning: stay current with GCP service innovations and advancements across core data services (GCS, BigQuery, Cloud Storage, Dataflow, etc.).

Skills and Experience:
- Experience: 5+ years in data engineering or similar roles.
- Proficiency in GCP: expertise in designing, developing, and deploying data pipelines, with strong knowledge of GCP core data services (GCS, BigQuery, Cloud Storage, Dataflow, etc.).
- Generative AI & LLMs: hands-on experience with generative AI models and large language models such as GPT-4, LLAMA3, and Gemini 1.5, with the ability to integrate these models into data pipelines and processes.
- Experience with web scraping.
- Technical skills: strong proficiency in Python and SQL for data manipulation and querying; experience with distributed data processing frameworks like Apache Beam or Apache Spark is a plus.
- Security knowledge: familiarity with data security and access control best practices.
- Collaboration: excellent communication and problem-solving skills, with a demonstrated ability to collaborate across teams.
- Project management: ability to work independently, manage multiple projects, and meet deadlines.
- Preferred knowledge: familiarity with Sustainable Finance, ESG Risk, CSRD, Regulatory Reporting, cloud infrastructure, and data governance best practices.
- Bonus skills: knowledge of Terraform is a plus.

Education:
- Degree: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Experience: 3-5 years of hands-on experience in data engineering.
- Certification: Google Professional Data Engineer.
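A hedged, minimal Apache Beam pipeline of the shape the responsibilities describe; run locally it uses the DirectRunner, and the GCS paths and record layout are placeholders, not from the posting.

```python
# Sketch: read CSV lines, aggregate an amount per user, write results.
# Paths and the two-column record layout are invented for illustration.
import apache_beam as beam

def parse_csv(line):
    user_id, amount = line.split(",")
    return (user_id, float(amount))

with beam.Pipeline() as p:  # DirectRunner by default
    (p
     | "Read" >> beam.io.ReadFromText("gs://bucket/raw/txns.csv")
     | "Parse" >> beam.Map(parse_csv)
     | "SumPerUser" >> beam.CombinePerKey(sum)
     | "Format" >> beam.MapTuple(lambda uid, total: f"{uid},{total}")
     | "Write" >> beam.io.WriteToText("gs://bucket/agg/user_totals"))
```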

Posted 1 month ago

Apply

9 - 14 years

17 - 32 Lacs

Chennai, Bengaluru

Work from Office


Job Title: DevOps Engineer (GCP | Terraform | Jenkins | CI/CD)
Location: Bangalore/Chennai

Job Summary: We are seeking a skilled and motivated DevOps Engineer with hands-on experience in Google Cloud Platform (GCP), Infrastructure as Code using Terraform, Jenkins automation, and CI/CD pipelines. You will play a key role in designing, building, and maintaining scalable, secure, and efficient cloud infrastructure to support continuous delivery and deployment.

Key Responsibilities:
- Design and implement scalable infrastructure on GCP using Terraform.
- Develop, manage, and maintain CI/CD pipelines using Jenkins, Git, and related tools.
- Collaborate with developers, QA, and other IT teams to automate and optimize deployments.
- Manage version control, code integration, and release processes across environments.
- Monitor infrastructure performance, troubleshoot issues, and implement best practices in reliability and security.
- Ensure infrastructure is compliant with internal security and governance requirements.
- Maintain documentation for infrastructure and operational processes.

Required Skills and Qualifications:
- 9+ years of experience in DevOps or cloud engineering roles.
- Strong experience with Google Cloud Platform (GCP) services (e.g., Compute Engine, Cloud Storage, VPC, IAM).
- Proven expertise in Terraform for writing and maintaining infrastructure as code.
- Hands-on experience with Jenkins for pipeline creation, job automation, and integration.
- Deep understanding of CI/CD concepts and best practices.
- Proficiency with Git, shell scripting, and automation tools.
- Working knowledge of Docker and containerization.
- Experience with monitoring/logging (e.g., Stackdriver, Prometheus, Grafana) is a plus.

Preferred Qualifications:
- GCP certification (Associate or Professional Cloud Engineer).
- Experience with Kubernetes and GKE.
- Familiarity with configuration management tools (e.g., Ansible, Chef).
- Knowledge of security best practices in cloud environments.

Soft Skills:
- Strong problem-solving and troubleshooting skills.
- Excellent communication and collaboration abilities.
- Ability to work independently and in a fast-paced, agile environment.

Posted 1 month ago

Apply

6 - 10 years

15 - 20 Lacs

Gurugram, Delhi / NCR

Work from Office


Role & responsibilities:

1. Pipeline development and support
- Design, build, and optimize scalable ETL pipelines on Databricks using PySpark, SQL, and Delta Lake (a minimal sketch follows this listing).
- Work with structured and semi-structured insurance data (policy, claims, actuarial, risk, customer data) from multiple sources.
- Implement data quality checks, governance, and monitoring across pipelines.
- Collaborate with data scientists, actuaries, and business stakeholders to translate analytics requirements into data models.
- Develop and deliver compelling visualizations and dashboards using Databricks SQL, Power BI, Tableau, or similar tools.
- Monitor and troubleshoot pipeline issues, ensuring data integrity and resolving bottlenecks or failures.
- Optimize Databricks clusters for performance and cost efficiency.
- Support ML model deployment pipelines in collaboration with data science teams.
- Document pipelines, workflows, and architecture following best practices.

2. SQL
- Write complex SQL queries to extract, transform, and load (ETL) data for reporting, analytics, or downstream applications.
- Optimize SQL queries for performance, especially when working with large datasets in Snowflake or other relational databases.
- Create and maintain database schemas, tables, views, and stored procedures to support business requirements.

3. Data integration
- Integrate data from diverse sources (e.g., on-premises databases, cloud storage like S3 or Azure Blob, or third-party APIs) into a unified system.
- Ensure data consistency, quality, and availability by implementing data validation and cleansing processes.

4. Good-to-have skills
- Insurance domain experience; candidates with P&C or L&A domain experience will be preferred.
- Team player with strong communication skills.
- Experience with MLflow, feature stores, and model monitoring.
- Hands-on experience with data governance tools (e.g., Unity Catalog, Collibra).
- Familiarity with regulatory and compliance requirements in insurance data.

Skills typically required:
- 5+ years of experience in data engineering, with at least 2+ years hands-on with Databricks and Spark.
- Strong proficiency in PySpark, SQL, Delta Lake, and data modeling.
- Solid understanding of cloud platforms (Azure, AWS, or GCP) and data lake architectures.
- Experience integrating Databricks with BI tools (Power BI, Tableau, Looker) for business-facing dashboards.
- Knowledge of insurance data (L&A, P&C) and industry metrics is highly preferred.
- Familiarity with DevOps tools (Git, CI/CD pipelines) and orchestration tools (Airflow, Databricks Jobs).
- Strong communication skills to explain technical concepts to business stakeholders.
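A small PySpark/Delta sketch of the pipeline pattern above, written in the style of a Databricks notebook (where a SparkSession is provided); the table and column names are illustrative, not from the posting.

```python
# Sketch: read a raw Delta table, apply simple data-quality gates,
# append to a curated table. Table/column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided on Databricks

claims = spark.read.table("raw.claims")  # hypothetical source table
clean = (claims
         .dropDuplicates(["claim_id"])
         .filter(F.col("claim_amount") > 0)      # basic quality check
         .withColumn("ingest_date", F.current_date()))

(clean.write.format("delta")
      .mode("append")
      .saveAsTable("curated.claims"))            # hypothetical target
```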

Posted 1 month ago

Apply

3 - 8 years

3 - 7 Lacs

Bengaluru

Work from Office


Project Role: Application Support Engineer
Project Role Description: Act as software detectives; provide a dynamic service identifying and solving issues within multiple components of critical business systems.
Must-have skills: Google Kubernetes Engine
Good-to-have skills: Kubernetes, Google Cloud Compute Services
A minimum of 3 years of experience is required.
Educational Qualification: 15 years of full-time education

About The Role

Job Summary: We are seeking a motivated and talented GCP & Kubernetes Engineer to join our growing cloud infrastructure team. This role will be a key contributor in building and maintaining our Kubernetes platform, working closely with architects to design, deploy, and manage cloud-native applications on Google Kubernetes Engine (GKE).

Responsibilities:
- Extensive hands-on experience with Google Cloud Platform (GCP) and Kubernetes implementations.
- Demonstrated expertise in operating and managing container orchestration engines such as Docker or Kubernetes (a small triage sketch follows this listing).
- Knowledge of or experience with various Kubernetes tools such as Kubekafka, Kubegres, Helm, Ingress, Redis, Grafana, and Prometheus.
- Proven track record in supporting and deploying various public cloud services.
- Experience in building or managing self-service platforms to boost developer productivity.
- Proficiency in using Infrastructure as Code (IaC) tools like Terraform.
- Skilled in diagnosing and resolving complex issues in automation and cloud environments.
- Advanced experience in architecting and managing highly available, high-performance multi-zonal or multi-regional systems.
- Strong understanding of infrastructure CI/CD pipelines and associated tools.
- Collaborate with internal teams and stakeholders to understand user requirements and implement technical solutions.
- Experience working in GKE and Edge/GDCE environments.
- Assist development teams in building and deploying microservices-based applications in public cloud environments.

Technical skill set:
- Minimum of 3 years of hands-on experience in migrating or deploying GCP cloud-based solutions.
- At least 3 years of experience in architecting, implementing, and supporting GCP infrastructure and topologies.
- Over 3 years of experience with GCP IaC, particularly with Terraform, including writing and maintaining Terraform configurations and modules.
- Experience deploying container-based systems such as Docker or Kubernetes on both private and public clouds (GCP GKE).
- Familiarity with CI/CD tools (e.g., GitHub) and processes.

Certifications:
- GCP ACE certification is mandatory.
- CKA certification is highly desirable.
- HashiCorp Terraform certification is a significant plus.
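A hedged sketch of day-to-day GKE triage with the official Kubernetes Python client: listing pods that are not in the Running phase. It assumes a kubeconfig is already set up for the target cluster.

```python
# Sketch: flag non-Running pods across all namespaces as a first
# triage step. Assumes kubectl-style credentials are available.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() in-cluster
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    if pod.status.phase != "Running":
        print(pod.metadata.namespace, pod.metadata.name, pod.status.phase)
```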

Posted 1 month ago

Apply

4 - 9 years

16 - 31 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Role & responsibilities:
- Execute project-specific development activities in accordance with applicable standards and quality parameters: developing and reviewing code, and setting up the right environment for projects.
- Ensure delivery within schedule by adhering to engineering and quality standards.
- Own and deliver end-to-end projects within GCP for the Payments Data Platform.
- Be available on the support rota one week a month for GCP 24x7 on-call.
- Basic knowledge of payments ISO standards, message types, etc.
- Able to work under pressure on deliverables, P1 violations, and incidents.
- Fluent and clear in communications, written and verbal.
- Able to follow Agile ways of working.
- Must have hands-on experience with Java and GCP; shell scripting and Python knowledge is a plus.
- In-depth knowledge of Java and Spring Boot.
- Experience with GCP Dataflow, Bigtable, BigQuery, etc. (a small BigQuery sketch follows this listing).
- Experience managing large databases.
- Have worked on requirements, design, and development of event-driven and near-real-time data patterns (ingress/egress).
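The role pairs Java/Spring Boot with GCP data services and lists Python as a plus; as a hedged illustration of the BigQuery side, a small aggregation query via the google-cloud-bigquery client. The project, dataset, and column names are placeholders.

```python
# Sketch: daily payment totals by type via the BigQuery client library.
# Credentials come from the environment (e.g., a service account).
from google.cloud import bigquery

client = bigquery.Client()
job = client.query(
    """
    SELECT payment_type, COUNT(*) AS n, SUM(amount) AS total
    FROM `my-project.payments.transactions`   -- placeholder table
    WHERE DATE(event_ts) = CURRENT_DATE()
    GROUP BY payment_type
    """
)
for row in job.result():
    print(row.payment_type, row.n, row.total)
```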

Posted 1 month ago

Apply

4 - 9 years

10 - 14 Lacs

Pune

Hybrid


Job Description

Technical skills. Top skills for this position:
- Google Cloud Platform (Composer, BigQuery, Airflow, Dataproc, Dataflow, GCS).
- Data warehousing knowledge.
- Hands-on experience with the Python language and SQL databases.
- Analytical technical skills: able to predict the consequences of configuration changes (impact analysis), identify root causes that are not obvious, and understand the business requirements.
- Excellent communication with different stakeholders (business, technical, project).
- Good understanding of the overall Big Data and Data Science ecosystem.
- Experience building and deploying containers as services using Swarm/Kubernetes.
- Good understanding of container concepts such as building lean and secure images.
- Understanding of modern DevOps pipelines.
- Experience with streaming data pipelines using Kafka or Pub/Sub (mandatory for Kafka resources).
Good to have: Professional Data Engineer or Associate Data Engineer certification.

Roles and responsibilities:
- Design, build, and manage big data ingestion and processing applications on Google Cloud using BigQuery, Dataflow, Composer, Cloud Storage, and Dataproc (a minimal Composer/Airflow sketch follows this listing).
- Performance tuning and analysis of Spark, Apache Beam (Dataflow), or similar distributed computing tools and applications on Google Cloud.
- Good understanding of Google Cloud concepts, environments, and utilities to design cloud-optimal solutions for machine learning applications.
- Build systems to perform real-time data processing using Kafka, Pub/Sub, Spark Streaming, or similar technologies.
- Manage the development life cycle for agile software development projects.
- Convert proofs of concept into industrialized Machine Learning models (MLOps).
- Provide solutions to complex problems; deliver customer-oriented solutions in a timely, collaborative manner.
- Proactive thinking, planning, and understanding of dependencies.
- Develop and implement robust solutions in test and production environments.
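A minimal Composer/Airflow DAG sketch matching the stack above; the DAG id, schedule, and task body are placeholders standing in for a real ingestion job.

```python
# Sketch: a one-task daily DAG. In Composer this file would live in the
# environment's dags/ folder; the callable below is a stub.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load():
    print("pull from source, land in GCS, load to BigQuery")  # stub

with DAG(
    dag_id="daily_ingest",            # hypothetical DAG id
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load",
                   python_callable=extract_and_load)
```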

Posted 1 month ago

Apply

10 - 20 years

25 - 40 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Hi, hope you are looking for a job change. We have an opening for a GCP Data Architect at an MNC (Pan-India location). I'm sharing the JD with you; please have a look and revert with the details below and an updated resume. Apply only if you can join within 10 days. It is 5 days' work from office. We don't process candidates with long notice periods.

Role: GCP Data Architect
Experience: 10+ years
Mode: Permanent
Work Location: Pan India
Notice Period: Immediate to 10 days
Mandatory Skills: GCP, architecture experience, Big Data, data modelling, BigQuery

Please share the following details:
Full Name (as per Aadhaar card):
Email ID:
Mobile Number:
Alternate No:
Qualification:
Graduation Year:
Regular Course:
Total Experience:
Relevant Experience:
Current Organization:
Working as Permanent Employee:
Payroll Company:
Experience in GCP:
Experience in Architecture:
Experience in GCP Data Architecture:
Experience in Big Data:
Experience in BigQuery:
Experience in Data Management:
Official Notice Period:
Serving Notice Period:
Current Location:
Preferred Location:
Current CTC:
Expected CTC:
CTC Breakup:
PAN Card Number:
Date of Birth:
Any Offer in Hand:
Offered CTC:
LWD:
Can You Join Immediately:
Ready to Work from Office for 5 Days:

Job Description: GCP Data Architect

We are seeking a skilled Data Solution Architect to design solutions and lead the implementation on GCP. The ideal candidate will possess extensive experience in data architecture, solution design, and data management practices.

Responsibilities:
- Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms.
- Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions.
- Develop high-level and detailed data architecture and design documentation.
- Implement data management and data governance strategies, ensuring compliance with industry standards.
- Architect both batch and real-time data solutions, leveraging cloud-native services and technologies.
- Design and manage data pipeline processes for historic data migration and data integration.
- Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables.
- Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies.
- Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively.
- Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.

Requirements:
- Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments.
- Experience with common GCP services such as BigQuery, Dataflow, GCS, service accounts, and Cloud Functions.
- Extremely strong in BigQuery design and development (a small design sketch follows this listing).
- Extensive knowledge and implementation experience in data management, governance, and security frameworks.
- Proven experience creating high-level and detailed data architecture and design documentation.
- Strong aptitude for business analysis to understand domain data requirements.
- Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred.
- Hands-on experience architecting end-to-end data solutions for both batch and real-time designs.
- Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
- Familiarity with Data Fabric and Data Mesh architecture is a plus.
- Excellent verbal and written communication skills.

Regards,
Rejeesh S
Email: rejeesh.s@jobworld.jobs
Mobile: +91 - 9188336668
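As a hedged illustration of one BigQuery design decision an architect makes routinely, the sketch below creates a table partitioned by event date and clustered by customer to bound scan costs; the project, dataset, and column names are invented.

```python
# Sketch: BigQuery DDL with partitioning + clustering, issued through
# the Python client. All identifiers are placeholders.
from google.cloud import bigquery

client = bigquery.Client()
client.query("""
    CREATE TABLE IF NOT EXISTS `my-project.warehouse.events`
    (
      event_id    STRING,
      customer_id STRING,
      event_ts    TIMESTAMP,
      payload     JSON
    )
    PARTITION BY DATE(event_ts)   -- prunes scans to relevant days
    CLUSTER BY customer_id        -- co-locates rows for common filters
""").result()
```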

Posted 1 month ago

Apply

1 - 6 years

4 - 9 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


Role & responsibilities (field sales, cloud products):
1. Primarily a hunter and hustler personality with 1-8 years of experience in the SME & Enterprise segment. Strong enterprise sales background in the solutions/SaaS space, ideally with knowledge of AI/Azure/AWS/GCP/GWS.
2. Sell the Google Cloud products and services to new and existing clients. Identify and properly qualify Cloud opportunities. Present Cloud solutions at the executive level (C-level executives). Lead negotiations and overcome objections for deal closure. Manage complex sales cycles and multiple engagements simultaneously; work with partner sales consultants to discover, identify, and meet customer requirements.
3. Prepare accurate BOQs, sales forecasts, and sales cycle reporting. Provide hand-holding to ensure the success of potential or current clients. Leverage and enhance partner relationships to drive additional value and revenue.
4. Forge strong working relationships with partners. Encourage and develop increased awareness of Microsoft Cloud services among partners. Collaborate with channel partners' executive, sales, and technical teams. Develop and execute successful targeted territory development plans/GTM to help achieve growth and revenue. Monitor and report sales activity within the system.
5. Generate new ARR and long-term TCVs by landing new clients. Create a territory-specific sales strategy aligned to Redington Limited's GTM plans and execute on it. Grow business by signing new partnerships and leveraging existing ones.

Preferred candidate profile: B2B sales specialists with knowledge of GCP/GWS products preferred.

Perks and benefits: Good salary and work-life balance.

Posted 1 month ago

Apply

4 - 8 years

10 - 20 Lacs

Hyderabad

Hybrid


Job Description: We are seeking a highly motivated and experienced ML Engineer/Data Scientist to join our growing ML/GenAI team. You will play a key role in designing, developing, and productionalizing ML applications by evaluating, training, and/or fine-tuning models. You will play a crucial role in developing GenAI-based solutions for our customers. As a senior member of the team, you will take ownership of projects, collaborating with engineers and stakeholders to ensure successful project delivery.

What we're looking for:
- At least 3 years of experience designing and building AI applications for customers and deploying them into production.
- At least 5 years of software engineering experience building secure, scalable, and performant applications for customers.
- Experience with document extraction using AI, conversational AI, vision AI, NLP, or GenAI.
- Design, develop, and operationalize existing ML models by fine-tuning and personalizing them.
- Evaluate machine learning models and perform the necessary tuning.
- Develop prompts that instruct LLMs to generate relevant and accurate responses.
- Collaborate with data scientists and engineers to analyze and preprocess datasets for prompt development, including data cleaning, transformation, and augmentation.
- Conduct thorough analysis to evaluate LLM responses and iteratively modify prompts to improve LLM performance.
- Hands-on customer experience with RAG solutions or fine-tuning of LLM models (a library-free RAG sketch follows this listing).
- Build and deploy scalable machine learning pipelines on GCP or an equivalent cloud platform involving data warehouses, machine learning platforms, dashboards, or CRM tools.
- Experience with the end-to-end steps including, but not limited to, data cleaning, exploratory data analysis, dealing with outliers, handling imbalances, analyzing data distributions (univariate, bivariate, multivariate), transforming numerical and categorical data into features, feature selection, model selection, and model training and deployment.
- Proven experience building and deploying machine learning models in production environments for real-life applications.
- Good understanding of natural language processing, computer vision, or other deep learning techniques.
- Expertise in Python, NumPy, pandas, and various ML libraries (e.g., XGBoost, TensorFlow, PyTorch, scikit-learn, LangChain).
- Familiarity with Google Cloud or any other cloud platform and its machine learning services.
- Excellent communication, collaboration, and problem-solving skills.
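A deliberately library-free sketch of the RAG pattern mentioned above: embed document chunks, retrieve the best match by cosine similarity, then prompt an LLM. The embed() and llm() functions are stubs standing in for real embedding and generation services; the documents are invented.

```python
# Sketch of retrieval-augmented generation with stubbed services.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Stub: a real system would call an embedding model/service here.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    return rng.standard_normal(384)

def llm(prompt: str) -> str:
    # Stub: a real system would call a hosted LLM here.
    return f"[LLM answer based on a prompt of {len(prompt)} chars]"

docs = ["Refunds are processed in 5 days.",
        "Support is open 9am-6pm IST."]          # invented corpus
vecs = np.stack([embed(d) for d in docs])

def answer(question: str) -> str:
    q = embed(question)
    sims = vecs @ q / (np.linalg.norm(vecs, axis=1) * np.linalg.norm(q))
    context = docs[int(np.argmax(sims))]          # top-1 retrieval
    return llm(f"Context: {context}\nQuestion: {question}")

print(answer("How long do refunds take?"))
```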

Posted 1 month ago

Apply