
15 Cloud Run Jobs

JobPe aggregates results for easy access, but you apply directly on the original job portal.

4.0 - 8.0 years

3 - 12 Lacs

Hyderabad, Telangana, India

On-site

Source: Foundit

4 to 8 years of experience as a DevOps Engineer.
- We will align with the client's existing agile methodology, terminology and backlog management tool.
- DevOps tools have been chosen (Jenkins, GitHub, etc.) and agreed with developers; some specific tools, e.g. scanning, may need to be selected and acquired.
- Focus is on native GCP services (GKE, no 3rd-party or OSS platforms).
- GCP will be provisioned as code using Terraform.
- The initial dev sandbox for October will have limited platform capabilities (enable development with an MVP).
- Security testing and scanning tools will be decided during discovery.
- Databases will be re-platformed to Cloud SQL.
- The environment will be specified for up to 8 initial microservices (release plan finalized before end of Discovery).
- Design will include multi-zone, not multi-region.

Essential functions: The primary objective of this project is to unify the e-commerce experience into a single, best-of-breed platform that not only caters to current needs but also sets the stage for seamless migration of other e-commerce experiences in the future.

Qualifications: GCP, Kubernetes, Terraform, Cloud Run, Ansible; GKE would be a plus.

Posted 17 hours ago


3.0 - 5.0 years

14 - 20 Lacs

Bengaluru

Work from Office

Source: Naukri

Strong in Python with libraries such as polars, pandas, numpy, scikit-learn, matplotlib, tensorflow, torch, transformers.
• Must have: Deep understanding of modern recommendation systems including two-tower, multi-tower, and cross-encoder architectures
• Must have: Hands-on experience with deep learning for recommender systems using TensorFlow, Keras, or PyTorch
• Must have: Experience generating and using text and image embeddings (e.g., CLIP, ViT, BERT, Sentence Transformers) for content-based recommendations
• Must have: Experience with semantic similarity search and vector retrieval for matching user-item representations
• Must have: Proficiency in building embedding-based retrieval models, ANN search, and re-ranking strategies
• Must have: Strong understanding of user modeling, item representations, and temporal/contextual personalization
• Must have: Experience with Vertex AI for training, tuning, deployment, and pipeline orchestration
• Must have: Experience designing and deploying machine learning pipelines on Kubernetes (e.g., using Kubeflow Pipelines, Kubeflow on GKE, or custom Kubernetes orchestration)
• Should have experience with Vertex AI Matching Engine or deploying Qdrant, FAISS, or ScaNN on GCP for large-scale retrieval
• Should have experience working with Dataproc (Spark/PySpark) for feature extraction, large-scale data prep, and batch scoring
• Should have a strong grasp of cold-start problem solving using metadata and multi-modal embeddings
• Good to have: Familiarity with multi-modal retrieval models combining text, image, and tabular features
• Good to have: Experience building ranking models (e.g., XGBoost, LightGBM, DLRM) for candidate re-ranking
• Must have: Knowledge of recommender metrics (Recall@K, nDCG, HitRate, MAP) and offline evaluation frameworks
• Must have: Experience running A/B tests and interpreting results for model impact
• Should be familiar with real-time inference using Vertex AI, Cloud Run, or TF Serving
• Should understand feature store concepts, embedding versioning, and serving pipelines
• Good to have: Experience with streaming ingestion (Pub/Sub, Dataflow) for updating models or embeddings in near real-time
• Good to have: Exposure to LLM-powered ranking or personalization, or hybrid recommender setups
• Must follow MLOps practices: version control, CI/CD, monitoring, and infrastructure automation

GCP Tools Experience:
- ML & AI: Vertex AI, Vertex Pipelines, Vertex AI Matching Engine, Kubeflow on GKE, AI Platform
- Embedding & Retrieval: Matching Engine, FAISS, ScaNN, Qdrant, GKE-hosted vector DBs (Milvus)
- Storage: BigQuery, Cloud Storage, Firestore
- Processing: Dataproc (PySpark), Dataflow (batch & stream)
- Ingestion: Pub/Sub, Cloud Functions, Cloud Run
- Serving: Vertex AI Online Prediction, TF Serving, Kubernetes-based custom APIs, Cloud Run
- CI/CD & IaC: GitHub Actions, GitLab CI
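This posting centers on embedding-based retrieval and semantic similarity search. Below is a minimal Python sketch of that idea, using Sentence Transformers for text embeddings and brute-force cosine similarity in NumPy; the model name and item catalog are illustrative assumptions, and at production scale an ANN index (FAISS, ScaNN, or Vertex AI Matching Engine, as the posting mentions) would replace the brute-force search.

```python
# Minimal embedding-based retrieval sketch (illustrative only).
# Assumes `sentence-transformers` and `numpy` are installed; the model name
# and item catalog below are made-up examples, not part of the job posting.
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed small text-embedding model

items = [
    "wireless noise-cancelling headphones",
    "stainless steel water bottle",
    "running shoes with cushioned sole",
    "bluetooth mechanical keyboard",
]

# Encode the catalog once; normalized vectors make dot product == cosine similarity.
item_vecs = model.encode(items, normalize_embeddings=True)

def retrieve(query: str, k: int = 2) -> list[tuple[str, float]]:
    """Return the top-k catalog items most similar to the query."""
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = item_vecs @ q                 # cosine similarities
    top = np.argsort(-scores)[:k]          # brute force; swap in FAISS/ScaNN at scale
    return [(items[i], float(scores[i])) for i in top]

if __name__ == "__main__":
    print(retrieve("headphones for travel"))
```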

Posted 2 days ago


4.0 - 6.0 years

5 - 8 Lacs

Gurugram

Work from Office

Source: Naukri

Required Skills:
- Strong expertise in the NestJS framework.
- Proficient in building and managing a microservices architecture.
- Hands-on experience with Apache Kafka for real-time data streaming and messaging.
- Experience with Google Cloud Platform (GCP) services, including but not limited to Cloud Functions, Cloud Run, Pub/Sub, BigQuery, and Kubernetes Engine.
- Familiarity with RESTful APIs, database systems (SQL/NoSQL), and performance optimization.
- Solid understanding of version control systems, particularly Git.

Preferred Skills:
- Knowledge of containerization using Docker.
- Experience with automated testing frameworks and methodologies.
- Understanding of monitoring, logging, and observability tools and practices.

Responsibilities:
- Design, develop, and maintain backend services using NestJS within a microservices architecture.
- Implement robust messaging and event-driven architectures using Kafka.
- Deploy, manage, and optimize applications and services on Google Cloud Platform.
- Ensure high performance, scalability, reliability, and security of backend services.
- Collaborate closely with front-end developers, product managers, and DevOps teams.
- Write clean, efficient, and maintainable code, adhering to best practices and coding standards.
- Perform comprehensive testing and debugging, addressing production issues promptly.

The job is office-based in Gurgaon. The selected candidate needs to have their own laptop.
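The role builds event-driven services around Kafka (in NestJS). As a language-agnostic illustration of the producer/consumer pattern it asks for, here is a minimal Python sketch using confluent-kafka; the broker address, topic name, consumer group, and payload are assumptions for the example, not details from the posting.

```python
# Minimal Kafka produce/consume sketch (illustrative; the role itself uses NestJS).
# Assumes a broker at localhost:9092 and a hypothetical "orders" topic.
import json
from confluent_kafka import Producer, Consumer

producer = Producer({"bootstrap.servers": "localhost:9092"})
producer.produce("orders", json.dumps({"order_id": 42, "status": "created"}).encode())
producer.flush()  # block until the message is delivered

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "order-workers",        # assumed consumer group
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])

msg = consumer.poll(timeout=10.0)       # wait up to 10s for one event
if msg is not None and msg.error() is None:
    event = json.loads(msg.value())
    print("received:", event)
consumer.close()
```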

Posted 1 week ago


4.0 - 9.0 years

10 - 15 Lacs

Pune

Work from Office

Source: Naukri

DevOps Engineer (Google Cloud Platform)

About Us: IntelligentDX is a dynamic and innovative company dedicated to changing the software landscape in the healthcare industry. We are looking for a talented and experienced DevOps Engineer to join our growing team and help us build and maintain our scalable, reliable, and secure cloud infrastructure on Google Cloud Platform.

Job Summary: We are seeking a highly skilled DevOps Engineer with 4 years of hands-on experience, specifically with Google Cloud technologies. The ideal candidate will be responsible for designing, implementing, and maintaining our cloud infrastructure, ensuring the scalability, reliability, and security of our microservices-based software services. You will play a crucial role in automating our development and deployment pipelines, managing cloud resources, and supporting our engineering teams in delivering high-quality applications.

Responsibilities:
- Design, implement, and manage robust, scalable, and secure cloud infrastructure on Google Cloud Platform (GCP).
- Implement and enforce best practices for GCP Identity and Access Management (IAM) to ensure secure access control.
- Deploy, manage, and optimize applications leveraging Google Cloud Run for serverless deployments.
- Configure and maintain Google Cloud API Gateway for efficient and secure API management.
- Implement and monitor security measures across our GCP environment, including network security, data encryption, and vulnerability management.
- Manage and optimize cloud-based databases, primarily Google Cloud SQL, ensuring data integrity, performance, and reliability.
- Lead the setup and implementation of new applications and services within our GCP environment.
- Troubleshoot and resolve issues related to Cross-Origin Resource Sharing (CORS) configurations and other API connectivity problems.
- Provide ongoing API support to development teams, ensuring smooth integration and operation.
- Continuously work on improving the scalability and reliability of our software services, which are built as microservices.
- Develop and maintain CI/CD pipelines to automate software delivery and infrastructure provisioning.
- Monitor system performance, identify bottlenecks, and implement solutions to optimize resource utilization.
- Collaborate closely with development, QA, and product teams to ensure seamless deployment and operation of applications.
- Participate in on-call rotations to provide timely support for critical production issues.

Required Skills & Experience:
- Minimum of 4 years of hands-on experience as a DevOps Engineer with a strong focus on Google Cloud Platform (GCP).
- Proven expertise in GCP services, including:
  - GCP IAM: strong understanding of roles, permissions, service accounts, and best practices.
  - Cloud Run: experience deploying and managing containerized applications.
  - API Gateway: experience in setting up and managing APIs.
  - Security: solid understanding of cloud security principles, network security (VPC, firewall rules), and data protection.
  - Cloud SQL: hands-on experience with database setup, management, and optimization.
- Demonstrated experience with the setup and implementation of cloud-native applications.
- Familiarity with addressing and resolving CORS issues.
- Experience providing API support and ensuring API reliability.
- Deep understanding of microservices architecture and best practices for their deployment and management.
- Strong commitment to building scalable and reliable software services.
- Proficiency in scripting languages (e.g., Python, Bash) and automation tools.
- Experience with Infrastructure as Code (IaC) tools (e.g., Terraform, Cloud Deployment Manager).
- Familiarity with containerization technologies (e.g., Docker, Kubernetes).
- Excellent problem-solving skills and a proactive approach to identifying and resolving issues.
- Strong communication and collaboration abilities.

Preferred Qualifications:
- GCP certification (e.g., Professional Cloud DevOps Engineer, Professional Cloud Architect).
- Experience with monitoring and logging tools (e.g., Cloud Monitoring, Cloud Logging, Prometheus, Grafana).
- Knowledge of other cloud platforms (AWS, Azure) is a plus.
- Experience with Git and CI/CD platforms (e.g., GitLab CI, Jenkins, Cloud Build).

What We Offer:
- Health insurance, paid time off, and professional development opportunities.
- Fun working environment.
- Flattened hierarchy, where everyone has a say.
- Free snacks, games, and happy hour outings.

If you are a passionate DevOps Engineer with a proven track record of building and managing robust systems on Google Cloud Platform, we encourage you to apply!
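The posting calls out troubleshooting CORS configurations behind API Gateway and Cloud Run. As a minimal sketch of what a service-level CORS setup can look like, here is a hypothetical Flask app that adds CORS headers and answers preflight requests; the allowed origin and route are assumptions, and a real policy would be restricted to the actual front-end domains.

```python
# Hypothetical Flask service on Cloud Run that answers CORS preflight requests.
# The allowed origin and route below are assumptions for illustration.
import os
from flask import Flask, jsonify

app = Flask(__name__)
ALLOWED_ORIGIN = "https://app.example.com"  # assumed front-end origin

@app.after_request
def add_cors_headers(resp):
    # Attach CORS headers to every response, including OPTIONS preflights.
    resp.headers["Access-Control-Allow-Origin"] = ALLOWED_ORIGIN
    resp.headers["Access-Control-Allow-Methods"] = "GET, POST, OPTIONS"
    resp.headers["Access-Control-Allow-Headers"] = "Content-Type, Authorization"
    return resp

@app.route("/api/patients", methods=["GET", "OPTIONS"])
def patients():
    # Placeholder payload; a real handler would query Cloud SQL.
    return jsonify({"patients": []})

if __name__ == "__main__":
    # Cloud Run injects the serving port via the PORT environment variable.
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```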

Posted 1 week ago


8.0 - 10.0 years

19 - 34 Lacs

Bengaluru

Work from Office

Source: Naukri

Greetings from TATA Consultancy Services!! Thank you for expressing your interest in exploring a career possibility with the TCS Family.

Role: Python with Microservices Developer
Experience: 8 to 10 years
Interview Location: Bangalore

Key Responsibilities:
- Design, develop, and maintain Python-based microservices that are scalable, efficient, and secure.
- Deploy and manage containerized applications.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Optimize applications for maximum speed and scalability.
- Implement CI/CD pipelines for automated testing and deployment.
- Monitor and troubleshoot production applications to ensure high availability and performance.
- Write clean, maintainable code and ensure adherence to coding standards.
- Stay updated with industry trends and emerging technologies related to microservices and cloud computing.

Requirements:
- Proven experience as a Python developer, specifically in developing microservices.
- Strong understanding of containerization and orchestration (Docker, Kubernetes).
- Experience with Google Cloud Platform, specifically Cloud Run, Cloud Functions, and other related services.
- Familiarity with RESTful APIs and microservices architecture.
- Knowledge of database technologies (SQL and NoSQL) and data modelling.
- Proficiency in version control systems (Git).
- Experience with CI/CD tools and practices.
- Strong problem-solving skills and the ability to work independently and collaboratively.
- Excellent communication skills, both verbal and written.

Preferred Qualifications:
- Experience with cloud platforms is a plus.
- Familiarity with Python frameworks (Flask, FastAPI, Django).
- Understanding of DevOps practices and tools (Terraform, Jenkins).
- Knowledge of monitoring and logging tools (Prometheus, Grafana, Stackdriver).
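Since the role centers on Python microservices deployed to services such as Cloud Run, here is a minimal, hypothetical FastAPI sketch of the kind of service such a role typically builds; the endpoint names and payload are assumptions, and the only Cloud Run-specific detail is honoring the PORT environment variable.

```python
# Minimal hypothetical FastAPI microservice suitable for a Cloud Run container.
# Endpoint names and payloads are illustrative, not taken from the posting.
import os
import uvicorn
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="orders-service")

class Order(BaseModel):
    item: str
    quantity: int

@app.get("/healthz")
def healthz() -> dict:
    # Lightweight liveness/readiness probe.
    return {"status": "ok"}

@app.post("/orders")
def create_order(order: Order) -> dict:
    # A real service would persist to Cloud SQL or Firestore; echo back for the sketch.
    return {"item": order.item, "quantity": order.quantity, "status": "created"}

if __name__ == "__main__":
    # Cloud Run passes the serving port via the PORT environment variable.
    uvicorn.run(app, host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```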

Posted 1 week ago


7.0 - 12.0 years

12 - 22 Lacs

Bengaluru

Hybrid

Source: Naukri

Responsibilities: Design and implement secure architecture on Google Cloud Platform (GCP) using IAM, SDLC, and CI/CD pipelines with Python or Java.

Posted 2 weeks ago


9.0 - 12.0 years

0 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Source: Naukri

- Excellent knowledge of GCP Cloud Run, Cloud Tasks, Cloud Pub/Sub, and Cloud Storage.
- Hands-on Python or Node.js coding for application development.
- Understanding of GCP service choices based on SLAs, scalability, compliance, and integration needs.
- Proficiency in weighing the trade-offs between services (e.g., why Cloud Run with 4 vCPUs over GKE with GPUs).
- Deep understanding of concurrency settings, DB pool strategies, and scaling.
- Proficiency in implementing resilient, cost-optimized, low-latency, enterprise-grade cloud solutions.
- Proficient in suggesting configuration, predictive autoscaling, concurrency, cold start mitigation, failover, etc. for the different GCP services as per business needs.
- Experience in microservices architecture and development; able to configure and build systems, not just stitch them together.
- Must be able to root-cause pipeline latency, scaling issues, errors, and downtime.
- Cross-functional leadership during architecture definition, implementation, and rollout.
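This posting combines Cloud Run with Cloud Tasks and Pub/Sub. As a minimal sketch of one common pattern it implies, here is a hypothetical Flask handler for a Pub/Sub push subscription running on Cloud Run; the route and message fields are assumptions, and concurrency and autoscaling would be tuned on the Cloud Run service itself.

```python
# Hypothetical Cloud Run service receiving Pub/Sub push deliveries.
# The /pubsub route and message payload shape are assumptions for illustration.
import base64
import json
import os
from flask import Flask, request

app = Flask(__name__)

@app.route("/pubsub", methods=["POST"])
def handle_push():
    envelope = request.get_json(silent=True)
    if not envelope or "message" not in envelope:
        return ("Bad Request: no Pub/Sub message", 400)

    message = envelope["message"]
    payload = {}
    if "data" in message:
        # Pub/Sub push bodies carry the payload base64-encoded in message.data.
        payload = json.loads(base64.b64decode(message["data"]).decode("utf-8"))

    print("processing event:", payload)  # real code would do the actual work here
    # Returning 2xx acknowledges the message; non-2xx triggers redelivery.
    return ("", 204)

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=int(os.environ.get("PORT", 8080)))
```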

Posted 2 weeks ago


0.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience. In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
As an Application Developer, you will lead IBM into the future by translating system requirements into the design and development of customized systems in an agile environment. The success of IBM is in your hands as you transform vital business needs into code and drive innovation. Your work will power IBM and its clients globally, collaborating and integrating code into enterprise systems. You will have access to the latest education, tools and technology, and a limitless career path with the world's technology leader. Come to IBM and make a global impact!

IBM's Cloud Services are focused on supporting clients on their cloud journey across any platform to achieve their business goals. This encompasses Cloud Advisory, Architecture, Cloud Native Development, Application Portfolio Migration, Modernization, and Rationalization, as well as Cloud Operations. Cloud Services supports all public/private/hybrid cloud deployments: IBM Bluemix/IBM Cloud/Red Hat/AWS/Azure/Google and client private environments. Cloud Services has the best cloud developer, architect, complex SI, Sys Ops and delivery talent, delivered through our GEO CIC Factory model. As a member of our Cloud Practice you will be responsible for defining and implementing application cloud migration, modernisation and rationalisation solutions for clients across all sectors. You will support mobilisation and help to lead the quality of our programmes and services, liaise with clients and provide consulting services, including:
- Create cloud migration strategies: defining delivery architecture, creating the migration plans, designing the orchestration plans and more.
- Assist in creating and executing migration run books.
- Evaluate source (physical, virtual and cloud) and target workloads.

Required education
Bachelor's Degree

Preferred education
Master's Degree

Required technical and professional expertise
- GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions.
- Cloud data engineers with GCP PDE certification and working experience with GCP.
- Building end-to-end data pipelines in GCP using Pub/Sub, BigQuery, Dataflow, Cloud Workflows/Cloud Scheduler, Cloud Run, Dataproc, Cloud Functions.
- Experience in logging and monitoring of GCP services, and experience in Terraform and infrastructure automation.
- Expertise in the Python coding language.
- Develops data engineering solutions on the Google Cloud ecosystem, and supports and maintains data engineering solutions on the Google Cloud ecosystem.

Preferred technical and professional experience
- Stay updated with the latest trends and advancements in cloud technologies, frameworks, and tools.
- Conduct code reviews and provide constructive feedback to maintain code quality and ensure adherence to best practices.
- Troubleshoot and debug issues, and deploy applications to the cloud platform.
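The required expertise here is end-to-end GCP data pipelines (Pub/Sub, Dataflow, BigQuery) in Python. Here is a minimal, hypothetical Apache Beam sketch of that shape; the project, subscription, table, and schema names are assumptions, and running it on Dataflow would additionally require the Dataflow runner options (project, region, temp location, etc.).

```python
# Minimal Apache Beam sketch: Pub/Sub -> parse JSON -> BigQuery.
# Project, subscription, table, and schema names are assumptions for illustration.
import json
import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

SUBSCRIPTION = "projects/my-project/subscriptions/events-sub"   # assumed
TABLE = "my-project:analytics.events"                           # assumed
SCHEMA = "event_id:STRING,user_id:STRING,ts:TIMESTAMP"          # assumed

def parse_event(raw: bytes) -> dict:
    """Turn a Pub/Sub message body into a BigQuery row dict."""
    return json.loads(raw.decode("utf-8"))

def run() -> None:
    options = PipelineOptions(streaming=True)
    # On Dataflow you would also set runner, project, region, temp_location, etc.
    with beam.Pipeline(options=options) as p:
        (
            p
            | "ReadFromPubSub" >> beam.io.ReadFromPubSub(subscription=SUBSCRIPTION)
            | "ParseJson" >> beam.Map(parse_event)
            | "WriteToBigQuery" >> beam.io.WriteToBigQuery(
                TABLE,
                schema=SCHEMA,
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```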

Posted 3 weeks ago


5.0 - 7.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centres (Delivery Centres), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centres offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. In your role, you will be responsible for:
- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative and able to respond to critical situations.
- Ability to analyse data for functional business requirements and front-face the customer.

Required education
Bachelor's Degree

Preferred education
Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has delivered; should understand the purpose/KPIs for which the data transformation was done.

Preferred technical and professional experience
- Experience with AEM core technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.
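For the BigQuery-on-GCP analyst work this posting describes, here is a minimal sketch using the google-cloud-bigquery Python client; the dataset, table, and query are assumptions for illustration, and credentials come from the environment (Application Default Credentials).

```python
# Minimal BigQuery query sketch using the official Python client.
# The dataset/table and SQL below are assumptions; auth uses Application
# Default Credentials from the environment.
from google.cloud import bigquery

client = bigquery.Client()  # picks up project + credentials from the environment

QUERY = """
    SELECT user_id, COUNT(*) AS events
    FROM `my-project.analytics.events`   -- assumed table
    WHERE event_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 7 DAY)
    GROUP BY user_id
    ORDER BY events DESC
    LIMIT 10
"""

rows = client.query(QUERY).result()  # blocks until the query job completes
for row in rows:
    print(row.user_id, row.events)
```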

Posted 3 weeks ago


15.0 - 18.0 years

45 - 50 Lacs

Noida, Mumbai, Pune

Work from Office

Source: Naukri

Skill & Experience
- 15-18 years of experience in Java is a must, with hands-on experience architecting solutions using cloud-native PaaS services such as databases, messaging, storage and compute in Google Cloud, in a pre-sales capacity.
- Experience in monolith-to-microservices modernization engagements.
- Should have worked on multiple engagements involving application assessment as part of re-factoring/containerization and re-architecting cloud journeys.
- Should have been part of a large digital transformation project.
- Experience building, architecting, designing, and implementing highly distributed global cloud-based systems.
- Experience in network infrastructure, security, data, or application development.
- Experience with structured Enterprise Architecture practices, hybrid cloud deployments, and on-premise-to-cloud migration deployments and roadmaps.
- Architecting microservices/APIs.
- Ability to deliver results and work cross-functionally.
- Ability to engage/influence audiences and identify expansion engagements.
- Certification as a Google Professional Cloud Architect is desirable.
- Experience with an Agile/SCRUM environment; familiar with Agile team management tools (JIRA, Confluence).
- Understand and promote Agile values: FROCC (Focus, Respect, Openness, Commitment, Courage).
- Working with Docker, OpenShift, GKE and Cloud Run.
- Designing databases in Oracle/Cloud SQL/Cloud Spanner.
- Designing software with low operational cost and cloud billing.
- Contributing to building best practices and defining reference architecture.

Posted 3 weeks ago


5.0 - 7.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Source: Foundit

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centres (Delivery Centres), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centres offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. In your role, you will be responsible for:
- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative and able to respond to critical situations.
- Ability to analyse data for functional business requirements and front-face the customer.

Required education
Bachelor's Degree

Preferred education
Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has delivered; should understand the purpose/KPIs for which the data transformation was done.

Preferred technical and professional experience
- Experience with AEM core technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.

Posted 3 weeks ago


5.0 - 7.0 years

0 Lacs

Gurgaon / Gurugram, Haryana, India

On-site

Source: Foundit

Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience. In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your role and responsibilities
In your role, you will be responsible for:
- Skilled in multiple GCP services: GCS, BigQuery, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer, etc.
- Must have Python and SQL work experience; proactive, collaborative and able to respond to critical situations.
- Ability to analyse data for functional business requirements and front-face the customer.

Required education
Bachelor's Degree

Preferred education
Master's Degree

Required technical and professional expertise
- 5 to 7 years of relevant experience working as a technical analyst with BigQuery on the GCP platform.
- Skilled in multiple GCP services: GCS, Cloud SQL, Dataflow, Pub/Sub, Cloud Run, Workflows, Composer, Error Reporting, Logs Explorer.
- Ambitious individual who can work under their own direction towards agreed targets/goals and with a creative approach to work.
- You love collaborative environments that use agile methodologies to encourage creative design thinking and find innovative ways to develop with cutting-edge technologies.
- End-to-end functional knowledge of the data pipeline/transformation implementations the candidate has delivered; should understand the purpose/KPIs for which the data transformation was done.

Preferred technical and professional experience
- Experience with AEM core technologies: OSGi Services, Apache Sling, Granite Framework, Java Content Repository API, Java 8+, Localization.
- Familiarity with build tools such as Jenkins and Maven; knowledge of version control tools, especially Git; knowledge of patterns and good practices to design and develop quality, clean code; knowledge of HTML, CSS, JavaScript and jQuery.
- Familiarity with task management, bug tracking, and collaboration tools like JIRA and Confluence.

Posted 3 weeks ago


5.0 - 8.0 years

20 - 22 Lacs

Chennai

Work from Office

Source: Naukri

- Minimum 5 years of in-depth experience in Java/Spring Boot.
- Minimum 3 years of experience in Angular, with the ability to develop rich UI screens and custom/re-usable components.
- Minimum 2 years of GCP experience working with GCP BigQuery, Google Cloud Storage, Cloud Run, Pub/Sub.
- Minimum 2 years of experience in using CI/CD pipelines like Tekton.
- 1-2 years of experience in deploying Google Cloud services using Terraform.
- Experience mentoring other software engineers and delivering systemic change.
- 5+ years of experience in J2EE.

Posted 3 weeks ago


5.0 - 9.0 years

19 - 25 Lacs

Chennai

Work from Office

Source: Naukri

- 5+ years of experience in Java/J2EE development, including strong object-oriented design principles. ERP implementation experience preferred; MBC and BTP knowledge appreciated.
- Expertise in Java 8 and above, including functional programming concepts.
- Expertise in the Spring platform (Spring MVC, Spring Boot, Spring JDBC, Spring Cloud) and RESTful/SOAP web services.
- In-depth knowledge of GCP services (Cloud Run, Redis, Pub/Sub, Kubernetes, Cloud Scheduler).
- Experience with enterprise SSO technology.

Mandatory Key Skills: Java, functional programming, object-oriented design, SOAP web services, J2EE development, ERP implementation, Java 8, Spring Platform, MBC, BTP, RESTful, GCP services

Posted 4 weeks ago


8.0 - 10.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Source: Foundit

Req ID: 326833

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a GCP & GKE Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Title / Role: GCP & GKE Staff Engineer (Digital Engineering Lead Engineer)

Job Description:
Primary Skill: Cloud-Infrastructure-Google Cloud Platform
Minimum work experience: 8+ years
Total experience: 8+ years
Must have GCP Solution Architect Certification & GKE

Mandatory Skills - Technical Qualification/Knowledge:
- Expertise in assessment, designing and implementing GCP solutions, including aspects like compute, network, storage, identity, security, DR/business continuity strategy, migration, templates, cost optimization, PowerShell, Terraform, Ansible, etc.
- Must have GCP Solution Architect Certification.
- Should have prior experience in executing large, complex cloud transformation programs including discovery, assessment, business case creation, design, build, migration planning and migration execution.
- Should have prior experience in using industry-leading or native discovery, assessment and migration tools.
- Good knowledge of cloud technology, different patterns, deployment methods, and compatibility of applications.
- Good knowledge of GCP technologies and associated components and variations:
  - Anthos Application Platform
  - Compute Engine, Compute Engine Managed Instance Groups, Kubernetes
  - Cloud Storage, Cloud Storage for Firebase, Persistent Disk, Local SSD, Filestore, Transfer Service
  - Virtual Private Cloud (VPC), Cloud DNS, Cloud Interconnect, Cloud VPN Gateway, Network Load Balancing, global load balancing, firewall rules, Cloud Armor
  - Cloud IAM, Resource Manager, Multi-factor Authentication, Cloud KMS
  - Cloud Billing, Cloud Console, Stackdriver
  - Cloud SQL, Cloud Spanner, Cloud Bigtable
  - Cloud Run container services, Kubernetes Engine (GKE), Anthos Service Mesh, Cloud Functions, PowerShell on GCP
- Solid understanding and experience in cloud-computing-based services architecture, technical design and implementation, including IaaS, PaaS, and SaaS.
- Design of clients' cloud environments with a focus mainly on GCP, demonstrating technical cloud architectural knowledge.
- Playing a vital role in the design of production, staging, QA and development cloud infrastructures running in 24x7 environments.
- Delivery of customer cloud strategies, aligned with customers' business objectives and with a focus on cloud migrations and DR strategies.
- Nurture cloud computing expertise internally and externally to drive cloud adoption.
- Should have a deep understanding of IaaS and PaaS services offered on cloud platforms and understand how to use them together to build complex solutions.
- Ensure that all cloud solutions follow security and compliance controls, including data sovereignty.
- Deliver cloud platform architecture documents detailing the vision for how GCP infrastructure and platform services support the overall application architecture, interacting with application, database and testing teams to provide a holistic view to the customer.
- Collaborate with application architects and DevOps to modernize Infrastructure as a Service (IaaS) applications to Platform as a Service (PaaS).
- Create solutions that support a DevOps approach for delivery and operations of services.
- Interact with and advise business representatives of the application regarding functional and non-functional requirements.
- Create proof-of-concepts to demonstrate viability of solutions under consideration.
- Develop enterprise-level conceptual solutions and sponsor consensus/approval for global applications.
- Have a working knowledge of other architecture disciplines including application, database, infrastructure, and enterprise architecture.
- Identify and implement best practices, tools and standards.
- Provide consultative support to the DevOps team for production incidents.
- Drive and support system reliability, availability, scale, and performance activities.
- Evangelize cloud automation and be a thought leader and expert defining standards for building and maintaining cloud platforms.
- Knowledgeable about configuration management such as Chef/Puppet/Ansible.
- Automation skills using CLI scripting in any language (bash, perl, python, ruby, etc.).
- Ability to develop a robust design to meet customer business requirements with scalability, availability, performance and cost effectiveness using GCP offerings.
- Ability to identify and gather requirements to define an architectural solution which can be successfully built and operated on GCP.
- Ability to conclude high-level and low-level design for the GCP platform, which may also include data center design as necessary.
- Capability to provide GCP operations and deployment guidance and best practices throughout the lifecycle of a project.
- Understanding of the significance of the different metrics for monitoring and their threshold values, with the ability to take necessary corrective measures based on the thresholds.
- Knowledge of automation to reduce the number of incidents, or repetitive incidents, is preferred.
- Good knowledge of cloud center operations, monitoring tools and backup solutions.

GKE:
- Set up monitoring and logging to troubleshoot a cluster or debug a containerized application.
- Manage Kubernetes objects: declarative and imperative paradigms for interacting with the Kubernetes API.
- Managing Secrets: managing confidential settings data using Secrets.
- Configure load balancing, port forwarding, or set up firewall or DNS configurations to access applications in a cluster.
- Configure networking for your cluster.
- Hands-on experience with Terraform; ability to write reusable Terraform modules.
- Hands-on Python and Unix shell scripting is required.
- Understanding of CI/CD pipelines in a globally distributed environment using Git, Artifactory, Jenkins, Docker registry.
- Experience with GCP services and writing Cloud Functions.
- Hands-on experience deploying and managing Kubernetes infrastructure with Terraform Enterprise; ability to write reusable Terraform modules.
- Certified Kubernetes Administrator (CKA) and/or Certified Kubernetes Application Developer (CKAD) is a plus.
- Experience using Docker within container orchestration platforms such as GKE.
- Knowledge of setting up Splunk.
- Knowledge of Spark in GKE.

Certification: GCP Solution Architect & GKE

Process/Quality Knowledge:
- Must have clear knowledge of ITIL-based service delivery; ITIL certification is desired.
- Knowledge of quality processes.
- Knowledge of security processes.

Soft Skills:
- Excellent communication skills and the capability to work directly with global customers.
- Strong technical leadership skills to drive solutions.
- Focused on quality/cost/time of deliverables.
- Timely and accurate communication.
- Need to demonstrate ownership of technical issues and engage the right stakeholders for timely resolution.
- Flexibility to learn and lead other technology areas like other public cloud technologies, private cloud, and automation.
- Good reporting skills.
- Willing to work in different time zones as per project requirements.
- Good attitude to work in a team and as an individual contributor, depending on the project and situation.
- Focused, result-oriented and self-motivating.

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at .

NTT DATA endeavors to make accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click . If you'd like more information on your EEO rights under the law, please click . For Pay Transparency information, please click.
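Among the GKE duties above are managing Kubernetes objects and Secrets from Python. Here is a minimal, hypothetical sketch using the official Kubernetes Python client; the namespace and secret name are assumptions, and it authenticates via the local kubeconfig (for example, a context created by gcloud container clusters get-credentials).

```python
# Minimal Kubernetes client sketch: list pods and read a Secret.
# Namespace and secret name are assumptions; auth comes from the local kubeconfig.
import base64
from kubernetes import client, config

config.load_kube_config()  # e.g. a GKE context from `gcloud container clusters get-credentials`
v1 = client.CoreV1Api()

# List pods in a namespace and show their phase.
for pod in v1.list_namespaced_pod(namespace="default").items:
    print(pod.metadata.name, pod.status.phase)

# Read a (hypothetical) Secret and decode one key; Secret data is base64-encoded.
secret = v1.read_namespaced_secret(name="app-credentials", namespace="default")
for key, value in (secret.data or {}).items():
    print(key, "=", base64.b64decode(value).decode("utf-8"))
```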

Posted 4 weeks ago
