
220 GCP Cloud Jobs - Page 6

JobPe aggregates listings for easy access; you apply directly on the original job portal.

6.0 - 9.0 years

7 - 14 Lacs

Hyderabad

Work from Office


Role Overview: We are seeking a talented and forward-thinking Data Engineer for one of the large financial services GCCs based in Hyderabad. Responsibilities include designing and constructing data pipelines, integrating data from multiple sources, developing scalable data solutions, optimizing data workflows, collaborating with cross-functional teams, implementing data governance practices, and ensuring data security and compliance.

Technical Requirements:
1. Proficiency in ETL, batch, and streaming processing
2. Experience with BigQuery, Cloud Storage, and Cloud SQL
3. Strong programming skills in Python, SQL, and Apache Beam for data processing
4. Understanding of data modeling and schema design for analytics
5. Knowledge of data governance, security, and compliance in GCP
6. Familiarity with machine learning workflows and integration with GCP ML tools
7. Ability to optimize performance within data pipelines

Functional Requirements:
1. Ability to collaborate with Data Operations, Software Engineers, Data Scientists, and Business SMEs to develop data product features
2. Experience in leading and mentoring peers within an existing development team
3. Strong communication skills to craft and communicate robust solutions
4. Proficiency in working with Engineering Leads, Enterprise and Data Architects, and Business Architects to build appropriate data foundations
5. Willingness to work on contemporary data architecture in public and private cloud environments

This role offers a compelling opportunity for a seasoned Data Engineer to drive transformative cloud initiatives within the financial sector, delivering innovative cloud solutions that align with business imperatives and regulatory requirements.

Qualification: Engineering Graduate / Postgraduate

Criteria:
1. Proficient in ETL, Python, and Apache Beam for data processing efficiency
2. Demonstrated expertise in BigQuery, Cloud Storage, and Cloud SQL
3. Strong collaboration skills with cross-functional teams for data product development
4. Comprehensive knowledge of data governance, security, and compliance in GCP
5. Experienced in optimizing performance within data pipelines
6. Relevant experience: 6-9 years

Connect at 9993809253
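As a hedged illustration of the Apache Beam skills this listing asks for, here is a minimal batch pipeline sketch in Python that reads CSV rows from Cloud Storage and loads them into BigQuery. The project, bucket, table, and schema names are assumptions for illustration, not details from the posting.

```python
# Minimal Apache Beam batch pipeline: GCS CSV -> BigQuery.
# Project, bucket, table, and schema below are illustrative placeholders.
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

def parse_csv(line: str) -> dict:
    """Turn one CSV line into a BigQuery-ready row dict."""
    user_id, amount = line.split(",")
    return {"user_id": user_id, "amount": float(amount)}

options = PipelineOptions(
    runner="DirectRunner",            # swap to "DataflowRunner" on GCP
    project="my-project",             # placeholder project ID
    temp_location="gs://my-bucket/tmp",
)

with beam.Pipeline(options=options) as p:
    (
        p
        | "Read" >> beam.io.ReadFromText("gs://my-bucket/input/*.csv")
        | "Parse" >> beam.Map(parse_csv)
        | "Write" >> beam.io.WriteToBigQuery(
            "my-project:analytics.transactions",
            schema="user_id:STRING,amount:FLOAT",
            write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
        )
    )
```

The same pipeline shape extends to streaming by swapping the text source for a Pub/Sub read; the JD's "batch and streaming" requirement covers both modes.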

Posted 2 weeks ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Gurugram

Work from Office


As a Technical Lead, you will lead the development and delivery of the platforms. This includes overseeing the entire product lifecycle from solution design through execution and launch, and building the right team in close collaboration with business and product teams.

Primary Responsibilities:
• Design end-to-end solutions that meet business requirements and align with the enterprise architecture.
• Define the architecture blueprint, including integration, data flow, application, and infrastructure components.
• Evaluate and select appropriate technology stacks, tools, and frameworks.
• Ensure proposed solutions are scalable, maintainable, and secure.
• Collaborate with business and technical stakeholders to gather requirements and clarify objectives.
• Act as a bridge between business problems and technology solutions.
• Guide development teams during the execution phase to ensure solutions are implemented according to design.
• Identify and mitigate architectural risks and issues.
• Ensure compliance with architecture principles, standards, policies, and best practices.
• Document architectures, designs, and implementation decisions clearly and thoroughly.
• Identify opportunities for innovation and efficiency within existing and upcoming solutions.
• Conduct regular performance and code reviews, and provide feedback to development team members to support their professional development.
• Lead proof-of-concept initiatives to evaluate new technologies.

Functional Responsibilities:
• Facilitate daily stand-up meetings, sprint planning, sprint reviews, and retrospectives.
• Work closely with the product owner to prioritize the product backlog and ensure that user stories are well-defined and ready for development.
• Identify and address issues or conflicts that may impact project delivery or team morale.
• Experience with Agile project management tools such as Jira and Trello.

Required Skills:
• Bachelor's degree in Computer Science, Engineering, or a related field.
• 7+ years of experience in software engineering, with at least 3 years in a solution architecture or technical leadership role.
• Proficiency with the AWS or GCP cloud platform.
• Strong implementation knowledge of the JS stack: NodeJS and ReactJS.
• Experience with database engines (MySQL and PostgreSQL), with proven knowledge of database migrations and high-throughput, low-latency use cases.
• Experience with key-value stores like Redis, MongoDB, and similar.
• Preferred knowledge of distributed technologies (Kafka, Spark, Trino, or similar) with proven experience in event-driven data pipelines.
• Proven experience setting up big data pipelines to handle high-volume transactions and transformations.
• Experience with BI tools: Looker, Power BI, Metabase, or similar.
• Experience with data warehouses like BigQuery, Redshift, or similar.
• Familiarity with CI/CD pipelines, containerization (Docker/Kubernetes), and IaC (Terraform/CloudFormation).

Good to Have:
• Certifications such as AWS Certified Solutions Architect, Azure Solutions Architect Expert, TOGAF, etc.
• Experience setting up analytical pipelines using BI tools (Looker, Power BI, Metabase, or similar) and low-level Python tools like Pandas, NumPy, and PyArrow.
• Experience with data transformation tools like dbt, SQLMesh, or similar.
• Experience with data orchestration tools like Apache Airflow, Kestra, or similar.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

3 - 8 Lacs

Bengaluru

Work from Office


CLOUD ADMIN

Core Responsibilities

Cloud Resource Provisioning & Management:
• Execute provisioning and decommissioning activities for cloud resources (compute, storage, network, security groups) per approved service requests or automation triggers, ensuring adherence to approved configurations and naming conventions.
• Maintain accurate tagging and metadata for all provisioned resources to enable visibility, policy enforcement, and proper cost attribution across business units or projects.
• Implement and maintain resource policies, such as auto-shutdown schedules, retention rules, and rightsizing recommendations, to ensure optimal usage and compliance with organizational guardrails.

Operational Maintenance & Monitoring:
• Monitor the health and completion of backup jobs, including VM snapshots, database dumps, and object storage versioning, escalating failures to relevant teams or L3 support.
• Apply and track lifecycle rules for storage buckets, archives, and temporary data to avoid unnecessary storage cost accumulation.
• Perform routine cloud maintenance tasks, including patch scheduling, instance resizing, log file rotation and cleanup, and temporary volume management, per defined maintenance windows and SOPs.

Deliverables

Provisioning & Lifecycle Records:
• Cloud provisioning logs, including timestamps, resource details, request origin, and tagging validation for all resources created or modified.
• Backup job summaries highlighting success/failure status, size, timestamp, and target recovery point, with retention validation.
• Resource deallocation and cost recovery reports listing terminated resources, associated cost savings, and confirmation of associated tag/policy cleanup.

Please share your updated resume at netra.chaubal@bootlabstech.com
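To make the lifecycle-rule and tagging duties above concrete, here is a small sketch using the google-cloud-storage Python client. The bucket name, age threshold, and labels are illustrative assumptions, not values from the posting.

```python
# Sketch: apply a cost-control lifecycle rule and cost-attribution labels
# to a GCS bucket. Bucket name, age threshold, and labels are placeholders.
from google.cloud import storage

client = storage.Client()  # uses Application Default Credentials
bucket = client.get_bucket("example-temp-data")  # hypothetical bucket

# Delete temporary objects older than 30 days to stop cost accumulation.
bucket.add_lifecycle_delete_rule(age=30)

# Label the bucket so spend can be attributed to a business unit.
bucket.labels = {"cost-center": "analytics", "env": "dev"}

bucket.patch()  # persist both changes server-side
print(list(bucket.lifecycle_rules))
```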

Posted 3 weeks ago

Apply

8.0 - 12.0 years

20 - 30 Lacs

Hyderabad

Work from Office


The ideal candidate will have extensive experience with Google Cloud Platform's data services, building scalable data pipelines, and implementing modern data architecture solutions.

Key Responsibilities:
• Design and implement data lake solutions using GCP Storage and Storage Transfer Service
• Develop and maintain ETL/ELT pipelines for data processing and transformation
• Orchestrate complex data workflows using Cloud Composer (managed Apache Airflow)
• Build and optimize BigQuery data models and implement data governance practices
• Configure and maintain Dataplex for unified data management across the organization
• Implement monitoring solutions using Cloud Monitoring to ensure data pipeline reliability
• Create and maintain data visualization solutions using Looker for business stakeholders
• Collaborate with data scientists and analysts to deliver high-quality data products

Required Skills & Experience:
• 8+ years of hands-on experience with GCP data services, including:
  - Cloud Storage and Storage Transfer Service for data lake implementation
  - BigQuery for data warehousing and analytics
  - Cloud Composer for workflow orchestration
  - Dataplex for data management and governance
  - Cloud Monitoring for observability and alerting
• Strong experience with ETL/ELT processes and data pipeline development
• Proficiency in SQL and at least one programming language (Python preferred)
• Experience with Looker or similar BI/visualization tools
• Knowledge of data modeling and dimensional design principles
• Experience implementing data quality monitoring and validation

Preferred Qualifications:
• Google Cloud Professional Data Engineer certification
• Experience with streaming data processing using Dataflow or Pub/Sub
• Knowledge of data mesh or data fabric architectures
• Experience with dbt or similar transformation tools
• Familiarity with CI/CD practices for data pipelines
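Since this role centres on Cloud Composer orchestration, here is a hedged sketch of a minimal Airflow DAG that runs a daily BigQuery transformation, assuming Airflow 2.4+ as on recent Cloud Composer environments. The DAG ID, schedule, SQL, and table names are illustrative assumptions.

```python
# Minimal Cloud Composer (Airflow 2.4+) DAG running a daily BigQuery job.
# DAG id, schedule, SQL, and table names are placeholders.
import pendulum
from airflow import DAG
from airflow.providers.google.cloud.operators.bigquery import (
    BigQueryInsertJobOperator,
)

with DAG(
    dag_id="daily_sales_rollup",
    schedule="@daily",
    start_date=pendulum.datetime(2024, 1, 1, tz="UTC"),
    catchup=False,
) as dag:
    rollup = BigQueryInsertJobOperator(
        task_id="rollup_sales",
        configuration={
            "query": {
                "query": """
                    CREATE OR REPLACE TABLE analytics.daily_sales AS
                    SELECT sale_date, SUM(amount) AS total
                    FROM raw.sales
                    GROUP BY sale_date
                """,
                "useLegacySql": False,
            }
        },
        location="US",  # assumed dataset location
    )
```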

Posted 3 weeks ago

Apply

13.0 - 25.0 years

15 - 30 Lacs

Navi Mumbai, Bengaluru

Work from Office


GCP with Docker

Posted 3 weeks ago

Apply

12.0 - 18.0 years

50 - 55 Lacs

Hyderabad

Work from Office


Extensive, hands-on knowledge of data modelling, data architecture, and data lineage. Broad knowledge of banking and financial products (i.e., international trade, credit). Physical data modelling experience, preferably with GCP cloud.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office


We are looking for a Senior Site Reliability Engineer to join Okta's Workflows SRE team, which is part of our Emerging Products Group (EPG). Okta Workflows is the foundation for secure integration between cloud services. By harnessing the power of the cloud, Okta allows people to quickly integrate different services while still enforcing strong security policies. With Okta Workflows, organizations can implement no-code or low-code workflows quickly, easily, at large scale, and at low total cost. Thousands of customers trust Okta Workflows to help their organizations work faster, boost revenue, and stay secure. If you like to be challenged and have a passion for solving large-scale automation, testing, and tuning problems, we would love to hear from you. The ideal candidate is someone who exemplifies the ethic of "if you have to do something more than once, automate it" and who can rapidly self-educate on new concepts and tools.

What you'll be doing:
• Designing, building, running, and monitoring the global production infrastructure for Okta Workflows and other EPG products.
• Leading and implementing secure, scalable Kubernetes clusters across multiple environments.
• Evangelizing security best practices and leading initiatives/projects to strengthen our security posture for critical infrastructure.
• Responding to production incidents and determining how we can prevent them in the future.
• Triaging and troubleshooting complex production issues to ensure reliability and performance.
• Enhancing automation workflows for patching, vulnerability assessments, and incident response.
• Continuously evolving our monitoring tools and platform.
• Promoting and applying best practices for building scalable and reliable services across engineering.
• Developing and maintaining technical documentation, runbooks, and procedures.
• Supporting a highly available, large-scale Kubernetes and AWS environment as part of an on-call rotation.
• Being a technical SME for a team that designs and builds Okta's production infrastructure, focusing on security at scale in the cloud.

What you'll bring to the role:
• A willingness to go the extra mile: see a problem, fix the problem.
• A passion for encouraging the development of engineering peers and leading by example.
• Experience with Kubernetes deployments in AWS and/or GCP cloud environments.
• Understanding of and familiarity with configuration management tools like Chef, Terraform, or Ansible.
• Expert-level abilities in operational tooling languages such as Go and shell, and use of source control.
• Knowledge of various types of data stores, particularly PostgreSQL, Redis, and OpenSearch.
• Experience with industry-standard security tools like Nessus and OSQuery.
• Knowledge of CI/CD principles, Linux fundamentals, OS hardening, networking concepts, and IP protocols.
• Skill in using Datadog for real-time monitoring and proactive incident detection.
• Strong ability to collaborate with cross-functional teams and promote a security-first culture.

Experience in the following:
• 5+ years of experience running and managing complex AWS or other cloud networking infrastructure, including architecture, security, and scalability.
• 5+ years of experience with Ansible, Chef, and/or Terraform.
• 3+ years of experience in cloud security, including IAM (Identity and Access Management) and/or secure identity management for cloud platforms and Kubernetes.
• 3+ years of experience automating CI/CD pipelines using tools such as Spinnaker or ArgoCD, with an emphasis on integrating security throughout the process.
• Proven experience implementing monitoring and observability solutions such as Datadog or Splunk to enhance security and detect incidents in real time.
• Strong leadership and collaboration skills, with experience working cross-functionally with security engineers and developers to enforce security best practices and policies.
• Strong Linux understanding and experience.
• Strong security background and knowledge.
• BS in Computer Science (or equivalent experience).

Posted 3 weeks ago

Apply

8.0 - 13.0 years

30 - 45 Lacs

Hyderabad

Hybrid


Position: Cloud Architect
Experience: 10-15 Years
Shift Timings: 3:30 PM IST to 12:30 AM IST

Job Description - Cloud Solutions Architect

Skills: Expertise in AWS, Azure, and Google Cloud. Infrastructure as Code (IaC) using tools like Terraform or CloudFormation. Experience in cloud migration and optimization. Knowledge of cloud security, governance, and cost management. Understanding of multi-cloud and hybrid-cloud architectures.

Short JD: As a Cloud Solutions Architect, you will design and implement scalable, secure, and cost-effective cloud solutions for retail IT services. You will lead cloud migration strategies, optimize cloud infrastructure, and work with teams to ensure robust security and governance across cloud environments.

Interested candidates can send a CV to bhavanit@techprojects.com or call 7386945761

Posted 3 weeks ago

Apply

4.0 - 8.0 years

20 - 27 Lacs

Hyderabad

Work from Office


Job Title: AI Engineer (AI-Powered Agents, Knowledge Graphs & MLOps)
Location: Hyderabad
Job Type: Full-time, hands-on Gen AI development on the GCP and Azure stacks

Job Summary: We seek an AI Engineer with deep expertise in building AI-powered agents, designing and implementing knowledge graphs, and optimizing business processes through AI-driven solutions. The role also requires hands-on experience in AI Operations (AI Ops), including continuous integration/deployment (CI/CD), model monitoring, and retraining. The ideal candidate will have experience working with open-source or commercial large language models (LLMs) and be proficient in using platforms like Azure Machine Learning Studio or Google Vertex AI to scale AI solutions effectively.

Key Responsibilities:
• AI Agent Development: Design, build, and deploy AI-powered agents for applications such as virtual assistants, customer service bots, and task automation systems using LLMs and other AI models.
• Knowledge Graph Implementation: Develop and implement knowledge graphs for enterprise data integration, enhancing the retrieval, structuring, and management of large datasets to support decision-making.
• AI-Driven Process Optimization: Collaborate with business units to optimize workflows using AI-driven solutions, automating decision-making processes and improving operational efficiency.
• AI Ops (MLOps): Implement robust AI/ML pipelines that follow CI/CD best practices to ensure continuous integration and deployment of AI models across different environments.
• Model Monitoring and Maintenance: Establish processes for real-time model monitoring, including tracking performance, drift detection, and accuracy of models in production environments.
• Model Retraining and Optimization: Develop automated or semi-automated pipelines for model retraining based on changes in data patterns or model performance; implement processes to ensure continuous improvement and accuracy of AI solutions.
• Cloud and ML Platforms: Utilize platforms such as Azure Machine Learning Studio, Google Vertex AI, and open-source frameworks for end-to-end model development, deployment, and monitoring.
• Collaboration: Work closely with data scientists, software engineers, and business stakeholders to deploy scalable AI solutions that deliver business impact.
• MLOps Tools: Leverage MLOps tools for version control, model deployment, monitoring, and automated retraining to ensure operational stability and scalability of AI systems.
• Performance Optimization: Continuously optimize models for scalability and performance, identifying bottlenecks and improving efficiency.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field.
• 3+ years of experience as an AI Engineer, focusing on AI-powered agent development, knowledge graphs, AI-driven process optimization, and MLOps practices.
• Proficiency in working with large language models (LLMs) such as GPT-3/4, GPT-J, BLOOM, or similar, including both open-source and commercial variants.
• Experience with knowledge graph technologies, including ontology design and graph databases (e.g., Neo4j, AWS Neptune).
• AI Ops/MLOps expertise: hands-on experience with AI/ML CI/CD pipelines, automated model deployment, and continuous model monitoring in production environments; familiarity with model lifecycle management tools such as MLflow, Kubeflow, or similar.
• Strong skills in Python, Java, or similar languages, and proficiency in building, deploying, and monitoring AI models.
• Solid experience in natural language processing (NLP) techniques, including building conversational AI, entity recognition, and text generation models.
• Model monitoring and retraining: expertise in setting up automated pipelines for model retraining, monitoring for drift, and ensuring the continuous performance of deployed models.
• Experience with cloud platforms like Azure Machine Learning Studio, Google Vertex AI, or similar cloud-based AI/ML tools.

Preferred Skills:
• Experience building or integrating conversational AI agents using platforms like Microsoft Bot Framework, Rasa, or Dialogflow.
• Familiarity with AI-driven business process automation and RPA integration using AI/ML models.
• Knowledge of advanced AI-driven process optimization tools and techniques, including AI orchestration for enterprise workflows.
• Experience with containerization technologies (e.g., Docker, Kubernetes) to support scalable AI/ML model deployment.
• Certification as Azure AI Engineer Associate, Google Professional Machine Learning Engineer, or a relevant MLOps certification is a plus.
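Because the listing names MLflow for model lifecycle management, here is a minimal hedged sketch of logging a run and a toy drift signal with MLflow. The experiment name, metric values, and retraining threshold are assumptions for illustration only.

```python
# Sketch: log a model run and a simple drift signal with MLflow.
# Experiment name, metric values, and threshold are illustrative.
import mlflow

mlflow.set_experiment("agent-intent-classifier")  # hypothetical experiment

with mlflow.start_run():
    mlflow.log_param("model", "distilbert-base")   # placeholder model name
    mlflow.log_metric("val_accuracy", 0.91)

    # Toy drift check: compare a live feature mean to the training baseline.
    baseline_mean, live_mean = 0.42, 0.55
    drift = abs(live_mean - baseline_mean)
    mlflow.log_metric("feature_mean_drift", drift)
    if drift > 0.1:  # assumed retraining trigger threshold
        mlflow.set_tag("retraining_needed", "true")
```

In a real pipeline the tag would be picked up by an orchestrator (Kubeflow, Airflow, or similar) to kick off the retraining job the JD describes.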

Posted 3 weeks ago

Apply

3.0 - 5.0 years

10 - 14 Lacs

Pune

Work from Office


Job Title: GCP Data Engineer, AS
Location: Pune, India
Corporate Title: Associate

Role Description: An Engineer is responsible for designing and developing entire engineering solutions to accomplish business goals. Key responsibilities of this role include ensuring that solutions are well architected, with maintainability and ease of testing built in from the outset, and that they can be integrated successfully into the end-to-end business process flow. They will have gained significant experience through multiple implementations and have begun to develop both depth and breadth in several engineering competencies. They have extensive knowledge of design and architectural patterns. They will provide engineering thought leadership within their teams and will play a role in mentoring and coaching less experienced engineers.

What we'll offer you: As part of our flexible scheme, here are just some of the benefits that you'll enjoy:
• Best-in-class leave policy
• Gender-neutral parental leave
• 100% reimbursement under the childcare assistance benefit (gender neutral)
• Sponsorship for industry-relevant certifications and education
• Employee Assistance Program for you and your family members
• Comprehensive hospitalization insurance for you and your dependents
• Accident and term life insurance
• Complimentary health screening for those 35 and above

Your key responsibilities:
• Design, develop, and maintain data pipelines using Python and SQL on GCP.
• Apply Agile methodologies and ETL, ELT, data movement, and data processing skills.
• Work with Cloud Composer to manage and process batch data jobs efficiently.
• Develop and optimize complex SQL queries for data analysis, extraction, and transformation.
• Develop and deploy Google Cloud services using Terraform.
• Implement CI/CD pipelines using GitHub Actions.
• Consume and host REST APIs using Python.
• Monitor and troubleshoot data pipelines, resolving any issues in a timely manner.
• Ensure team collaboration using Jira, Confluence, and other tools.
• Quickly learn new and existing technologies; bring strong problem-solving skills.
• Write advanced SQL and Python scripts.
• Google Cloud Professional Data Engineer certification is an added advantage.

Your skills and experience:
• 6+ years of IT experience as a hands-on technologist.
• Proficient in Python for data engineering and proficient in SQL.
• Hands-on experience with GCP Cloud Composer, Dataflow, BigQuery, Cloud Functions, and Cloud Run; GKE is good to have.
• Hands-on experience hosting and consuming REST APIs.
• Proficient in Terraform (HashiCorp).
• Experienced with GitHub, GitHub Actions, and CI/CD.
• Experience automating ETL testing using Python and SQL.
• Good to have: API knowledge, Bitbucket.

How we'll support you:
• Training and development to help you excel in your career
• Coaching and support from experts in your team
• A culture of continuous learning to aid progression
• A range of flexible benefits that you can tailor to suit your needs
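For the "host REST APIs using Python" responsibility, here is a minimal hedged sketch of a Flask service of the sort that could run on Cloud Run. The endpoint path, payload shape, and port are assumptions for illustration, not details from the posting.

```python
# Sketch: a minimal REST API hosted with Flask.
# Endpoint path and response shape are illustrative placeholders.
from flask import Flask, jsonify

app = Flask(__name__)

@app.route("/pipelines/<name>/status", methods=["GET"])
def pipeline_status(name: str):
    # A real service would look up Composer/BigQuery job state here.
    return jsonify({"pipeline": name, "state": "SUCCESS"})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8080)  # Cloud Run-style port
```

Consuming it from Python is symmetric: `requests.get("http://localhost:8080/pipelines/daily_load/status").json()`.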

Posted 3 weeks ago

Apply

6.0 - 11.0 years

5 - 15 Lacs

Bengaluru

Work from Office


The Senior Database Administrator (DBA) will be responsible for the management, maintenance, and migration of secure databases on cloud and on-prem platforms. The primary goal is to provide an always-available, secure database tier with high performance, covering both backend data and frontend application accessibility for end users. The database administrator should be able to communicate effectively and troubleshoot problems, and should have the sound technical skills and administrative aptitude to design, build, operate, secure, manage, and maintain our company databases. William O'Neil India is part of the O'Neil companies, a family of businesses dedicated to providing industry-leading financial services and information. Our professional teams are passionate about stock market research and the development of services that support all O'Neil brands.

Responsibilities
• Design, develop, install, tune, deploy, secure, migrate, and upgrade DBMS installations
• Monitor database performance; identify and resolve database issues
• Migrate databases across cloud and on-prem platforms with minimal or no downtime
• Provide guidance and support to developers on design, code review, and SQL query performance tuning
• HA/DR setup, replication, database encryption, index and filegroup management
• Monitor database systems for performance and capacity constraints
• Regularly liaise with onshore and offshore managers, developers, operations, and system and database administrators
• Suggest changes and improvements for managing, maintaining, and securing databases
• Explore new database tools and stay apprised of emerging technologies and trends
• Be available for on-call and weekend support as needed

Database Administrator Skills and Qualifications
• Working knowledge of database programming languages (MSSQL/PostgreSQL/MySQL)
• Knowledge of database backup/recovery, security, and performance monitoring standards
• Understanding of database concepts: ACID properties, normal forms, DDL/DML, transaction logging, log shipping, mirroring, high availability, etc.
• Understanding of relational and dimensional data modelling
• Experience developing migration plans to move on-premises databases to the cloud (such as AWS/Azure)
• Excellent communication skills with attention to detail and a problem-solving attitude
• Ability to adapt to new process or technology changes within the organization

Educational and Experience Requirements
• Bachelor's degree in computer science or a related information technology field, with 6 to 8 years of experience in database administration or development
• Proficient knowledge of the operation and maintenance of database technologies (MSSQL, PostgreSQL, MySQL, etc.)
• Working experience with database cloud services (AWS, Azure, GCP), including migration
• Relevant database certifications on cloud technologies such as AWS or Azure
• Knowledge of Windows Server, Linux systems, and cloud environments
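As a hedged illustration of the replication/HA monitoring this role covers, here is a small Python sketch that checks streaming-replication lag on a PostgreSQL primary, assuming PostgreSQL 10+ and psycopg2. The DSN and lag threshold are illustrative assumptions.

```python
# Sketch: check PostgreSQL streaming-replication lag from the primary.
# DSN and the 16 MiB alert threshold are illustrative placeholders.
import psycopg2

conn = psycopg2.connect("host=primary.example dbname=postgres user=monitor")
with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT application_name,
               pg_wal_lsn_diff(pg_current_wal_lsn(), replay_lsn) AS lag_bytes
        FROM pg_stat_replication;
        """
    )
    for name, lag in cur.fetchall():
        status = "OK" if lag is not None and lag < 16 * 1024**2 else "ALERT"
        print(f"{name}: {lag} bytes behind ({status})")
```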

Posted 3 weeks ago

Apply

2.0 years

27 - 42 Lacs

Pune

Work from Office


The Role: We’re hiring a Software Engineer to be part of a team that will deliver these groundbreaking software solutions. In this role, you will collaborate closely with cross-functional teams, including data scientists and product managers, to build intuitive solutions that transform how clients experience AI and ML in the application and elevate their interaction with financial data. Come join us!

What You’ll Do:
• Design and deliver secure, event-driven AI applications that provide responsive, impactful chat experiences powered by LLMs.
• Implement and maintain engineering solutions by writing well-designed, testable code.
• Build scalable and resilient systems with a focus on safety, privacy, and real-time performance.
• Document software functionality, system design, and project plans; this includes clean, readable code with comments.
• Collaborate across Addepar with product teams and other stakeholders to deliver seamless, AI-powered client experiences.

Who You Are:
• Proficient with Python, Java, or similar.
• Experienced with AWS, Azure, or GCP cloud deployment.
• Experienced with streaming data platforms and event-driven architecture.
• Able to write software that processes, aggregates, and computes on top of large amounts of data efficiently.
• Able to engage with all levels of collaborators on a technical level.
• Bring a strong ownership mentality and strive to take on the most important problems.
• Knowledge of front-end development is a plus.

Posted 3 weeks ago

Apply

5.0 - 8.0 years

10 - 17 Lacs

Pune, Chennai, Mumbai (All Areas)

Work from Office


Role & responsibilities: ETL testing; automation in any of Java, Python, or Selenium; cloud (AWS/Azure/GCP); SQL.

Posted 3 weeks ago

Apply

6.0 - 25.0 years

15 - 70 Lacs

Navi Mumbai, Pune, Bengaluru

Work from Office


Navi Mumbai, Pune, Bengaluru | 6-25 years | GCP Cloud, Java, Spring Boot, Microservices, Java Programming, GKE Cluster, J2EE, JEE, SQL, Google Cloud Services, Google Cloud Platforms

Posted 3 weeks ago

Apply

12.0 - 17.0 years

13 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


Designation: Technical Architect

• 12+ years of experience working in Java and relevant technologies.
• Guide customers in designing and creating new architectures.
• Significant software development experience, with expertise in Java and knowledge of the latest Java 9 features.
• Strong knowledge of microservices design patterns and architecture.
• Must have experience with GCP Cloud.
• Excellent knowledge of Spring and Spring Boot, with a proven track record of using Spring Boot to build cloud-native microservices.
• Knowledge of synchronous and event-driven integration patterns between services.
• Experience with multi-threading, collections, etc.
• Thorough experience writing high-quality code with fully automated unit test coverage (JUnit, Mockito, etc.).
• Extensive experience defining and applying design standards appropriate to the solutions.
• Working experience with various CI/CD tools.
• Design of data models for different types of database solutions: Oracle and MongoDB.
• Working experience with web services (REST, SOAP) and/or microservices.
• Experience with Kafka and XML.
• Deep knowledge of OOP, data structures, and algorithms.
• Working knowledge of other DevOps tools, container technologies (Docker, Kubernetes, etc.), and cloud.
• Good knowledge of build tools (like Maven), automated testing (like Cucumber), and building apps that meet all NFRs.
• Understanding of and experience with building GCP cloud-native applications.
• Working experience creating high-performing applications, including profiling and tuning to boost performance.
• Experience with unit testing, TDD/BDD, and Scrum/Agile.
• Understanding of cloud infrastructures and operating procedures.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

17 - 30 Lacs

Noida

Remote


JD - Required skills:
• 5+ years of industry experience in data engineering support and enhancement.
• Proficient in Google Cloud Platform (GCP) services such as Dataflow, BigQuery, Cloud Storage, and Pub/Sub.
• Strong understanding of data pipeline architectures and ETL processes.
• Experience with the Python programming language for data processing.
• Knowledge of SQL and experience with relational databases.
• Familiarity with version control systems like Git.
• Ability to analyze and troubleshoot complex data pipeline issues.
• Software engineering experience in optimizing data pipelines to improve performance and reliability.
• Continuously optimize data pipeline efficiency, reduce operational costs, and reduce the number of issues/failures.
• Automate repetitive tasks in data processing and management.
• Experience in monitoring and alerting for data pipelines.
• Continuously improve data pipeline reliability through analysis and testing.
• Perform SLA-oriented monitoring for critical pipelines, provide suggestions, and, after business approval, implement changes for SLA adherence where needed.
• Monitor the performance and reliability of GCP data pipelines, Informatica ETL workflows, MDM, and Control-M jobs.
• Maintain infrastructure reliability for GCP data pipelines, Informatica ETL workflows, MDM, and Control-M jobs.
• Conduct post-incident reviews and implement improvements for data pipelines.
• Develop and maintain documentation for data pipeline systems and processes.
• Excellent communication and documentation skills.
• Strong problem-solving and analytical skills.
• Open to working in a 24x7 shift.
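Since Pub/Sub and pipeline alerting both appear in the skill list, here is a minimal hedged sketch of publishing a pipeline-failure alert with the google-cloud-pubsub client. The project ID, topic name, and message attributes are illustrative assumptions.

```python
# Sketch: publish a pipeline-failure alert to a Pub/Sub topic.
# Project and topic IDs are illustrative placeholders.
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-project", "pipeline-alerts")

future = publisher.publish(
    topic_path,
    b"daily_load failed",     # message body must be bytes
    pipeline="daily_load",    # attributes are str key/value pairs
    severity="CRITICAL",
)
print("Published message ID:", future.result())
```

A subscriber (e.g., a Cloud Function) could then route such alerts to on-call tooling, matching the monitoring-and-alerting duty above.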

Posted 3 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office


GCP Data Engineer: BigQuery, SQL, Python, Talend ETL programming on GCP or any cloud technology.

Job Description: Experienced GCP data engineer with BigQuery, SQL, Python, and Talend ETL programming skills on GCP or any cloud technology. Good experience building pipelines of GCP components to load data into BigQuery and Cloud Storage buckets. Excellent data analysis skills. Good written and oral communication skills. Self-motivated and able to work independently.
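To illustrate the "load data into BigQuery from Cloud Storage" work this listing describes, here is a minimal hedged sketch using the google-cloud-bigquery client. The project, bucket, and table IDs are placeholders, and schema autodetection is an assumption for brevity.

```python
# Sketch: load CSV files from Cloud Storage into a BigQuery table.
# URIs and table ID are illustrative; autodetect is used for brevity.
from google.cloud import bigquery

client = bigquery.Client(project="my-project")  # placeholder project

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,
    autodetect=True,  # infer schema; define it explicitly in production
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://my-bucket/exports/*.csv",
    "my-project.analytics.transactions",
    job_config=job_config,
)
load_job.result()  # wait for completion

table = client.get_table("my-project.analytics.transactions")
print("Loaded; table now has", table.num_rows, "rows")
```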

Posted 3 weeks ago

Apply

3.0 - 6.0 years

5 - 10 Lacs

Mumbai

Work from Office


Role & responsibilities: L2 SRE / Site Reliability Engineer with GCP cloud experience.
Experience: 3 to 5 years
Location: Mumbai (work from office)
Notice period: Immediate to 15 days

• Proficiency in GCP and monitoring.
• Strong knowledge of Linux and PostgreSQL.
• Experience in database management, troubleshooting, RCA, and application deployment using a cloud platform.
• Ability to create SLA reports and provide on-call support to clients.

Qualification: B.E. / B.Tech / MCA / BCA / B.Sc. IT

Interested candidates, please share your resume with ruvina.m@futurzhr.com along with the details below: total experience, current CTC, expected CTC, current location, and readiness to relocate to Mumbai.

Note: Only female candidates can apply.

Thanks,
Futurz Staffing Solutions Pvt. Ltd.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

1 - 1 Lacs

Chennai

Work from Office


Location: Chennai / DLF IT Park
Notice period: Immediate to 45 days
We are looking for a Senior Java Engineer with Cloud/GCP experience: a strong understanding of APIs, data structures, and optimization; a very good grasp of basic concepts and core Java; and solid skills in Kubernetes and orchestration.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

10 - 14 Lacs

Bengaluru

Work from Office


Project Role: Cloud Platform Engineer
Project Role Description: Designs, builds, tests, and deploys cloud application solutions that integrate cloud and non-cloud infrastructure. Can deploy infrastructure and platform environments, and creates a proof of architecture to test architecture viability, security, and performance.
Must-Have Skills: Infrastructure as Code (IaC)
Good-to-Have Skills: Google Cloud Platform Architecture
Minimum Experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As a GCP Cloud Infrastructure Engineer, you will build and maintain scalable, reliable, and secure cloud infrastructure on Google Cloud Platform, implementing best practices for resource provisioning, monitoring, and cost management.

Roles & Responsibilities:
• Build and maintain scalable, reliable, and secure cloud infrastructure on Google Cloud Platform; implement best practices for resource provisioning, monitoring, and cost management.
• Act as a subject matter expert (SME).
• Collaborate with and manage the team to perform; be responsible for team decisions.
• Engage with multiple teams and contribute to key decisions.
• Provide solutions to problems for the immediate team and across multiple teams.
• Lead the implementation of new cloud technologies.
• Develop and maintain cloud infrastructure-as-code templates.
• Ensure compliance with security and governance policies.

Professional & Technical Skills:
• Must have: proficiency in Infrastructure as Code (IaC); cloud services; GCP; Terraform; Ansible.
• Good to have: experience with Google Cloud Platform architecture.
• Strong understanding of cloud architecture principles.
• Experience in deploying and managing cloud services.
• Knowledge of automation tools like Terraform or Ansible.

Additional Information:
• The candidate should have a minimum of 7.5 years of experience in Infrastructure as Code (IaC).
• This position is based at our Bengaluru office.
• 15 years of full-time education is required.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Mumbai, Bengaluru, Delhi / NCR

Work from Office


Role Overview: We are looking for a Cloud Engineer who can work across the entire web development stack to build robust, scalable, and user-centric applications for our client. You will play a critical role in designing and delivering systems end to end, from sleek, responsive UIs to resilient backend services and APIs. Whether you're just starting your career or bringing seasoned expertise, we're looking for hands-on problem solvers with a passion for clean code and great product experiences.

Responsibilities:
• Design and implement secure, scalable, and cost-optimized cloud infrastructure using AWS/GCP/Azure services.
• Automate infrastructure provisioning and management using Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
• Set up and maintain CI/CD pipelines for smooth and reliable software delivery.
• Monitor system performance, availability, and incident response using modern observability tools (e.g., CloudWatch, Datadog, ELK, Prometheus).
• Ensure robust cloud security by managing IAM policies, encryption, and secrets.
• Collaborate closely with backend engineers, data teams, and DevOps to support deployment and system stability.
• Optimize cloud costs and usage through rightsizing, autoscaling, and resource cleanups (see the sketch after this listing).

Required Skills:
• Hands-on experience with cloud platforms: AWS, Azure, or GCP (preferably AWS).
• Proficiency in IaC tools: Terraform, CloudFormation, or Pulumi.
• Experience with containerization and orchestration: Docker and Kubernetes.
• Strong scripting skills in Bash, Python, or similar.
• Deep understanding of networking, firewalls, load balancing, and VPC setups.
• Experience with CI/CD tools (GitHub Actions, Jenkins, GitLab CI) and Git workflows.
• Familiarity with monitoring and logging stacks (Prometheus, Grafana, ELK, etc.).
• Sound knowledge of cloud security, IAM, and access control best practices.

Nice to Have:
• Exposure to serverless architecture (AWS Lambda, GCP Cloud Functions).
• Experience in multi-cloud or hybrid cloud environments.
• Familiarity with cloud-native database services (e.g., RDS, DynamoDB, Firestore).
• Awareness of compliance frameworks (SOC 2, GDPR, HIPAA) and cloud governance practices.

Educational Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related technical field.

Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
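Here is a small hedged sketch of the cost-cleanup work mentioned above, using boto3 to find unattached EBS volumes, a common reclaim target. The region is an assumption, and deletion is left commented out deliberately.

```python
# Sketch: find unattached EBS volumes, a common cost-cleanup target.
# Region is an assumed placeholder; deletion is commented out on purpose.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")  # assumed region

paginator = ec2.get_paginator("describe_volumes")
pages = paginator.paginate(
    Filters=[{"Name": "status", "Values": ["available"]}]  # not attached
)

for page in pages:
    for vol in page["Volumes"]:
        print(f"Unattached: {vol['VolumeId']} ({vol['Size']} GiB)")
        # After review/approval, reclaim the cost:
        # ec2.delete_volume(VolumeId=vol["VolumeId"])
```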

Posted 3 weeks ago

Apply

10.0 - 14.0 years

35 - 50 Lacs

Hyderabad

Work from Office


Azure DevOps, AWS EKS, Terraform, Python, Kubernetes, SRE

Job Summary: We are seeking an experienced InfraOps Specialist with 10 to 14 years of experience to join our team. The ideal candidate will have expertise in Kubernetes, Azure DevOps, AWS EKS, Elastic Beanstalk, automation, Python, AWS, GCP, SRE, Ansible, and Terraform. This role requires a strong background in Consumer Lending. The work model is hybrid and the shift is day. No travel is required.

Responsibilities:
• Lead the design and implementation of infrastructure solutions using Kubernetes, AWS EKS, and Elastic Beanstalk.
• Oversee the deployment and management of applications using Azure DevOps and Terraform.
• Provide automation solutions using Python and Ansible to streamline operations.
• Ensure the reliability and availability of infrastructure through SRE practices.
• Collaborate with cross-functional teams to support Consumer Lending applications.
• Monitor and optimize cloud infrastructure on AWS and GCP.
• Develop and maintain CI/CD pipelines for efficient software delivery.
• Implement security best practices and compliance standards in cloud environments.
• Troubleshoot and resolve infrastructure issues in a timely manner.
• Document infrastructure configurations and operational procedures.
• Mentor junior team members and provide technical guidance.
• Stay updated with the latest industry trends and technologies.
• Contribute to the continuous improvement of infrastructure processes.

Qualifications:
• Extensive experience with Kubernetes, AWS EKS, and Elastic Beanstalk.
• Strong expertise in Azure DevOps and Terraform.
• Proficiency in automation using Python and Ansible.
• Solid understanding of SRE practices.
• Experience with AWS and GCP cloud platforms.
• Background in the Consumer Lending domain.
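As a hedged illustration of the Python-driven Kubernetes automation this role asks for, here is a minimal health sweep using the official kubernetes Python client. It assumes a valid kubeconfig on the machine running it; everything else uses real client calls.

```python
# Sketch: list pods that are not Running/Succeeded across all namespaces,
# a typical SRE health sweep. Assumes a valid kubeconfig is present.
from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() in a pod
v1 = client.CoreV1Api()

for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    phase = pod.status.phase
    if phase not in ("Running", "Succeeded"):
        print(f"{pod.metadata.namespace}/{pod.metadata.name}: {phase}")
```

The same pattern extends naturally to Ansible-triggered remediation or to exporting the results as metrics.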

Posted 4 weeks ago

Apply

3.0 - 8.0 years

20 - 25 Lacs

Bengaluru

Hybrid


Senior Software Engineer
HUB 2 Building of SEZ Towers, Karle Town Center, Nagavara, Bengaluru, Karnataka, India, 560045
Hybrid - Full-time

Company Description: When you are one of us, you get to run with the best. For decades, we've been helping marketers from the world's top brands personalize experiences for millions of people with our cutting-edge technology, solutions, and services. Epsilon's best-in-class identity gives brands a clear, privacy-safe view of their customers, which they can use across our suite of digital media, messaging, and loyalty solutions. We process 400+ billion consumer actions each day and hold many patents of proprietary technology, including real-time modeling languages and consumer privacy advancements. Thanks to the work of every employee, Epsilon India is now Great Place to Work-Certified™. Epsilon has also been consistently recognized as industry-leading by Forrester, Adweek, and the MRC. Positioned at the core of Publicis Groupe, Epsilon is a global company with more than 8,000 employees around the world. For more information, visit epsilon.com/apac or our LinkedIn page, or see how Epsilon transforms marketing with 1 View, 1 Vision and 1 Voice: https://www.epsilon.com/apac/youniverse

Job Description

About the BU: The Product team forms the crux of our powerful platforms and connects millions of customers to the product magic. This team of innovative thinkers develops and builds products that help Epsilon be a market differentiator. They map the future and set new standards for our products, empowered with industry best practices and ML and AI capabilities. The team passionately delivers intelligent end-to-end solutions and plays a key role in Epsilon's success story.

Why we are looking for you: At Epsilon, we run on our people's ideas. It's how we solve problems and exceed expectations. Our team is now growing, and we are on the lookout for talented individuals who always raise the bar by constantly challenging themselves and are experts in building customized solutions in the digital marketing space.

What you will enjoy in this role: Are you someone who wants to work with cutting-edge technology and enable marketers to create data-driven, omnichannel consumer experiences through data platforms? Then you could be exactly who we are looking for. Apply today and be part of a creative, innovative, and talented team that's not afraid to push boundaries or take risks.

What will you do? We seek Software Engineers with experience building and scaling services in on-premises and cloud environments. As a Senior or Lead Software Engineer on the Epsilon Attribution/Forecasting Product Development team, you will design, implement, and optimize data processing solutions using Scala, Spark, and Hadoop. You will collaborate with cross-functional teams to deploy big data solutions on our on-premises and cloud infrastructure, and build, schedule, and maintain workflows. You will perform data integration and transformation, troubleshoot issues, document processes, communicate technical concepts clearly, and continuously enhance our attribution/forecasting engine. Strong written and verbal communication skills (in English) are required to facilitate work across multiple countries and time zones. A good understanding of Agile methodologies (Scrum) is expected.

Qualifications:
• Strong experience (3-8 years) in the Python or Scala programming language, and extensive experience with Apache Spark for Big Data processing, for designing, developing, and maintaining scalable on-prem and cloud environments, especially on AWS and, as needed, GCP.
• Proficiency in performance tuning of Spark jobs: optimizing resource usage, shuffling, partitioning, and caching for maximum efficiency in Big Data environments.
• In-depth understanding of the Hadoop ecosystem, including HDFS, YARN, and MapReduce.
• Expertise in designing and implementing scalable, fault-tolerant data pipelines with end-to-end monitoring and alerting.
• Hands-on experience with Python for developing infrastructure modules.
• Solid grasp of database systems and the ability to write efficient SQL (RDBMS/warehouse) to handle TBs of data.
• Familiarity with design patterns and best practices for efficient data modelling, partitioning strategies, and sharding for distributed systems, plus experience building, scheduling, and maintaining DAG workflows.
• End-to-end ownership of the definition, development, and documentation of software objectives, business requirements, deliverables, and specifications in collaboration with stakeholders.
• Experience working with Git (or equivalent source control) and a solid understanding of unit and integration test frameworks.
• Ability to collaborate with stakeholders/teams to understand requirements and develop a working solution, and ability to work within tight deadlines and effectively prioritize and execute tasks in a high-pressure environment.
• Ability to mentor junior staff.

Advantageous to have:
• Hands-on experience with Databricks for unified data analytics, including Databricks Notebooks, Delta Lake, and catalogues.
• Proficiency with the ELK (Elasticsearch, Logstash, Kibana) stack for real-time search, log analysis, and visualization.
• Strong background in analytics, including the ability to derive actionable insights from large datasets and support data-driven decision-making.
• Experience with data visualization tools like Tableau, Power BI, or Grafana.
• Familiarity with Docker for containerization and Kubernetes for orchestration.
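To ground the Spark tuning points above (partitioning, caching, shuffle control), here is a small hedged PySpark sketch. The paths, column names, and partition count are illustrative assumptions, not details of Epsilon's attribution engine.

```python
# Sketch: a PySpark job illustrating partitioning and caching concerns.
# Paths, columns, and the partition count are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("attribution-demo")
    .config("spark.sql.shuffle.partitions", "200")  # tune to data volume
    .getOrCreate()
)

events = spark.read.parquet("s3a://example-bucket/events/")  # placeholder

# Repartition by the join/group key to reduce shuffle skew, then cache
# because the DataFrame is reused by the two aggregations below.
events = events.repartition(200, "campaign_id").cache()

clicks = events.filter(F.col("type") == "click").groupBy("campaign_id").count()
spend = events.groupBy("campaign_id").agg(F.sum("cost").alias("total_cost"))

clicks.join(spend, "campaign_id").write.mode("overwrite") \
      .parquet("s3a://example-bucket/attribution/daily/")
```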

Posted 4 weeks ago

Apply

4.0 - 6.0 years

0 - 1 Lacs

Hyderabad

Work from Office


Roles and Responsibilities
• Design, develop, and maintain scalable and efficient cloud infrastructure on Google Cloud Platform (GCP) using Kubernetes Engine, Cloud Run, and other services.
• Collaborate with cross-functional teams to identify business requirements and design solutions that meet those needs.
• Develop automation scripts using Ansible or Terraform to deploy applications on GCP.
• Troubleshoot issues related to application deployment, networking, storage, and compute resources.
• Ensure compliance with security best practices and company policies.

Posted 4 weeks ago

Apply

1.0 - 3.0 years

2 - 3 Lacs

Thanjavur

Remote


Stack Required: React, Node.js, MongoDB, and any cloud. The candidate should be proficient in React, Node.js, MongoDB, and cloud. As a Full Stack Developer, you will be responsible for developing and maintaining web applications across the entire stack.

Required Candidate Profile
Role: Software Development - Other
Employment Type: Full Time, Permanent
Role Category: Software Development
Tamil Nadu candidates most preferred

Posted 4 weeks ago

Apply