
19537 GCP Jobs - Page 8

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

15.0 - 20.0 years

13 - 17 Lacs

Pune

Work from Office

Naukri logo

Project Role: Security Architect
Project Role Description: Define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Document the implementation of the cloud security controls and transition to cloud security-managed operations.
Must-have skills: Security Architecture Design
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As a Security Architect, you will provide enterprise security strategy and design, performing threat modeling to build secure applications and infrastructure for the enterprise (cloud, on-premises, and hybrid models). The role requires a thorough understanding of IT security architecture principles, methodologies, and design patterns; good working knowledge of current IT risks and experience implementing security solutions; experience designing and reviewing security controls for IT infrastructure (cloud and on-premises applications); the ability to assess and evaluate security products against the security design requirements; and the ability to act as a trusted security advisor for various clients.

Roles & Responsibilities:
- Minimum of 8 years of professional experience, preferably with at least 3 years of hands-on involvement in security architecture and threat modeling.
- Demonstrate a profound comprehension of security architecture, capable of creating, assessing, and revising secure solutions that promote scalability, adaptability, and reusability.
- Act as the subject matter expert (SME) responsible for guiding and making security architecture decisions across Accenture client presales, proposal design, and integration within client ecosystems.
- Develop and uphold reusable security architecture and design patterns.
- Create, design, and troubleshoot intricate security implementations, overseeing the development of High-Level Design (HLD) and Low-Level Design (LLD) documents.
- Conduct design and implementation assessments and engage in threat modeling as necessary, adhering to established standards and best practices (e.g., STRIDE, PCI DSS, CSA CCM).
- Experience with cloud architectures and security controls, encompassing network security, Identity and Access Management (IAM), data protection, application security, and logging, among others.
- Proven track record with security frameworks and processes, including CIS, NIST, PCI DSS, CCM, SOC I/II, ISO/IEC 27001, NIST 800-53, OWASP, ISM, etc.
- Support sales leads by serving as a consultant during pre-sales activities, including assessing client requirements, defining project scopes, and preparing proposals and project plans.
- Demonstrate a robust understanding of potential attack vectors and the ability to design and articulate agile security controls to safeguard against them.
- Thorough comprehension of security principles and tools, including certificates, Data Loss Prevention (DLP), Web Application Firewalls (WAF), Security Information and Event Management (SIEM), firewalls, Distributed Denial of Service (DDoS) prevention, Intrusion Detection/Prevention Systems (IDS/IPS), privileged access management, encryption, SSL, VPN, IPSec, TCP/IP, DNS, and web security architecture, among others.

Professional & Technical Skills:
- Strong network and cyber security architecture experience, including architecting and developing security solutions on one or more cloud platforms (AWS, GCP, or Azure) and applying cloud-native security services.
- Cloud security certifications (CCSP, AWS, Azure, Google Cloud, etc.).
- Good to have: industry or academic accreditations/certifications in security, architecture, network security, cloud, or technology disciplines (e.g., CISSP, TOGAF, SABSA, CISM, CCIE).
- Experience in DevSecOps and DevOps is a bonus.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Security Architecture Design.
- This position is based in Pune.
- A 15 years full-time education is required.

Qualification: 15 years full-time education

Posted 13 hours ago

Apply

6.0 - 11.0 years

4 - 8 Lacs

Coimbatore

Work from Office

Naukri logo

The IBM SAP Practice is seeking top talent with experience in SAP HANA BASIS administration and end-to-end brownfield and/or greenfield experience of SAP BASIS activities. As part of our team, you will lead one of the teams responsible for engineering and solutioning the SAP BASIS scope for AMS, HANA implementation, and HANA migration projects.

Your day in the role will include:
- Performing the technical architect role on projects and leading the delivery team to successful outcomes.
- Designing and architecting solutions for S/4 HANA implementations and migrations.
- Performing a consulting role to advise clients on moving their SAP platforms to various cloud service providers.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 6+ years of SAP BASIS experience, of which 3 to 5 years on BSOH or S/4 HANA.
- Experience in designing SAP architectures for BSOH and S/4 HANA projects.
- Experience in SAP version upgrades, OS/DB migrations, and SAP migrations to cloud platforms.
- Experience working with global teams and client technical and infrastructure teams on system setup and BASIS administration activities.
- Well versed in Fiori setup and administration and HANA administration.

Preferred technical and professional experience:
- Knowledge of reference architectures for SAP on Amazon, Azure, and GCP.
- Knowledge of integration of various SAP applications, SaaS products, and SAP-approved third-party applications.
- Knowledge of various integration scenarios on SAP Cloud Platform and APIs to integrate SAP with other applications in the cloud.

Posted 13 hours ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Gurugram

Work from Office

Naukri logo

We are looking for a MuleSoft Developer to join our team, responsible for designing and implementing API-led integrations using the MuleSoft Anypoint Platform. The ideal candidate will develop robust, scalable APIs, manage integrations, and optimize system performance. You will work closely with both business and technical teams to deliver seamless integration solutions and ensure smooth data flow across applications.

Key Responsibilities:
- Design, develop, and maintain integration solutions using the MuleSoft Anypoint Platform.
- Develop, deploy, and manage APIs and web services (REST/SOAP).
- Collaborate with cross-functional teams to understand integration requirements and deliver solutions.
- Ensure optimal performance, scalability, and security of integration solutions.
- Troubleshoot and resolve integration-related issues.

Required Skills & Experience:
- Strong hands-on experience with the MuleSoft Anypoint Platform and API-led connectivity.
- Proficient in developing and managing RESTful and SOAP APIs.
- Knowledge of cloud platforms (AWS, Azure, GCP) and integration patterns.
- Experience with CI/CD processes and automation for MuleSoft applications.
- Strong problem-solving skills and the ability to work in a collaborative environment.

Posted 13 hours ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Naukri logo

Design, develop, and maintain high-performance, scalable Java applications using Java, Spring Boot, and React/Angular. Build REST APIs and SDKs.

Requirements:
- Excellent command of Java, OOP concepts, and Java Collections.
- Excellent command of Spring Boot, Spring, and Hibernate.
- Strong proficiency in Java and related frameworks (e.g., Spring, Hibernate).
- Hands-on experience with REST API and microservices implementation.
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud), including AWS, Docker, and Kubernetes.
- Knowledge of microservices architecture.
- Familiarity with CI/CD pipelines and DevOps practices.
- Excellent communication skills and the ability to work effectively in a fast-paced, collaborative environment.

Posted 13 hours ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Naukri logo

Java + GCP: We are seeking a skilled Java Developer with hands-on experience in Google Cloud Platform (GCP) to design, build, and maintain scalable backend services and cloud-native applications. The ideal candidate will have strong programming expertise in Java, combined with practical experience deploying and managing applications on GCP.

Responsibilities:
- Develop and maintain backend services using Java and Spring Boot.
- Design scalable and reliable microservices architectures.
- Deploy, monitor, and manage applications on GCP using services such as App Engine, Cloud Run, Cloud Functions, GKE, and Pub/Sub.
- Collaborate with cross-functional teams including DevOps, QA, and front-end developers.
- Ensure code quality through unit testing, code reviews, and CI/CD pipelines.
- Optimize cloud infrastructure and applications for performance, cost, and scalability.
- Implement security best practices in cloud-native applications.
- Troubleshoot production issues and perform root cause analysis.

Posted 13 hours ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Naukri logo

Experience:
- 8+ years of data engineering experience.
- 3+ years of experience with cloud platform services (preferably GCP).
- 2+ years of hands-on experience with Pentaho.
- Hands-on experience building and optimizing data pipelines and data sets.
- Hands-on experience with data extraction and transformation tasks, with attention to data security, error handling, and pipeline performance.
- Hands-on experience with relational SQL (Oracle, SQL Server, or MySQL) and NoSQL databases.
- Advanced SQL experience: creating and debugging stored procedures, functions, triggers, and object types in PL/SQL.
- Hands-on experience with programming languages: Java (mandatory), Go, Python.
- Hands-on experience unit testing data pipelines.
- Experience using Pentaho Data Integration (Kettle/Spoon) and debugging issues.
- Experience supporting and working with cross-functional teams in a dynamic environment.

Technical Skills:
- Programming & Languages: Java
- Database Tech: Oracle, Spanner, BigQuery, Cloud Storage
- Operating Systems: Linux
- Good knowledge and understanding of cloud-based ETL frameworks and tools.
- Good understanding and working knowledge of batch and streaming data processing.
- Good understanding of data warehousing architecture.
- Knowledge of open table and file formats (e.g., Delta, Hudi, Iceberg, Avro, Parquet, JSON, CSV).
- Strong analytic skills for working with unstructured datasets.
- Excellent numerical and analytical skills.

Responsibilities:
- Design and develop standard, reusable ETL jobs and pipelines.
- Work with the team to extract data from different sources such as Oracle, cloud storage, and flat files.
- Work with database objects including tables, views, indexes, schemas, stored procedures, functions, and triggers.
- Work with the team to troubleshoot and resolve issues in job logic and performance.
- Write ETL validations based on design specifications for unit testing.
- Work with the BAs and DBAs on requirements gathering, analysis, testing, metrics, and project coordination.

Posted 13 hours ago

Apply

5.0 - 10.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Naukri logo

Design, develop, and maintain high-performance, scalable Java applications using Java, Spring Boot, and React/Angular. Build REST APIs and SDKs.

Requirements:
- Excellent command of Java, OOP concepts, and Java Collections.
- Excellent command of Spring Boot, Spring, and Hibernate.
- Strong proficiency in Java and related frameworks (e.g., Spring, Hibernate).
- Hands-on experience with REST API and microservices implementation.
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud), including AWS, Docker, and Kubernetes.
- Knowledge of microservices architecture.
- Familiarity with CI/CD pipelines and DevOps practices.
- Excellent communication skills and the ability to work effectively in a fast-paced, collaborative environment.

Posted 13 hours ago

Apply

2.0 - 5.0 years

14 - 17 Lacs

Navi Mumbai

Work from Office

Naukri logo

As a Data Engineer at IBM, you'll play a vital role in application development and design, providing regular support and guidance to project teams on complex coding, issue resolution, and execution.

Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements.
- Strive for continuous improvement by testing the built solution and working under an agile framework.
- Discover and implement the latest technology trends to build creative solutions.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience with Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing.
- Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools.
- Data engineering skills: strong understanding of ETL pipelines, data modelling, and data warehousing concepts.
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation.
- Data processing frameworks: knowledge of libraries such as Pandas and NumPy.
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation.
- Cloud platforms: experience working with cloud platforms like AWS, Azure, or GCP, including cloud storage systems.

Preferred technical and professional experience:
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring.
- Partner with the rest of the technology teams, including application development, enterprise architecture, testing services, and network engineering.
- Good to have: experience with detection and prevention tools for company products, platforms, and customer-facing services.

Posted 13 hours ago

Apply

8.0 - 13.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Naukri logo

Java Developer: The Cloud Engineer is responsible for expanding, developing, implementing, and operationalizing our cloud implementation. We are looking for an experienced engineer with extensive knowledge of Java microservices and experience across a broad set of infrastructure and application technologies. The ideal candidate will have a vision for the future of these technologies and the ability to develop and execute the plan to get there.

Responsibilities:
- Migrating from local server environments to GCP-based cloud architecture.
- GCP cloud application intake automation.
- Working with compliance teams to ensure the design meets requirements.
- Monitoring and performance tuning the cloud environment.
- Sharing knowledge and awareness of alternative cloud environments and services.
- Identifying and proposing new work.
- Documenting and recommending best practices and articulating process improvements.
- Developing automation and/or applications.
- Automating data transfer using file transfer technologies such as SFTP.
- Day-to-day process, documentation, KPIs, and reporting.

Basic Qualifications:
- Bachelor's degree.
- Minimum of 3 years in a combination of infrastructure and application security, with cloud experience.

Preferred Qualifications:
- Java microservices.
- Ability to learn fast, adapt to new technology, and keep current with the industry.
- Embrace and focus on emerging technology and evangelize it to the rest of the team.
- Experience with Google Cloud; experience with Cloud Run and similar technologies is a strong plus.
- Experience with the Spring Boot framework.
- Experience supporting complex production application environments.

Mandatory Skills: OpenShift, Kafka, Kubernetes

Posted 13 hours ago

Apply

8.0 - 13.0 years

4 - 8 Lacs

Mumbai

Work from Office

Naukri logo

Requirements:
- 4+ years of experience as a Data Engineer or in a similar role.
- Proficiency in Python, PySpark, and advanced SQL.
- Hands-on experience with big data tools and frameworks (e.g., Spark, Hive).
- Experience with cloud data platforms like AWS, Azure, or GCP is a plus.
- Solid understanding of data modeling, warehousing, and ETL processes.
- Strong problem-solving and analytical skills.
- Good communication and teamwork abilities.

Responsibilities:
- Design, build, and maintain data pipelines that collect, process, and store data from various sources.
- Integrate data from multiple heterogeneous sources such as databases (SQL/NoSQL), APIs, cloud storage, and flat files.
- Optimize data processing tasks to improve execution efficiency, reduce costs, and minimize processing times, especially when working with large-scale datasets in Spark.
- Design and implement data warehousing solutions that centralize data from multiple sources for analysis.

Posted 13 hours ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Naukri logo

Tech stack: GCP Data Fusion, BigQuery, Dataproc, SQL/T-SQL, Cloud Run, Secret Manager, Git, Ansible Tower/Ansible scripts, Jenkins, Java, Python, Terraform, Cloud Composer/Airflow.

Experience and Skills

Must Have:
- Proven (3+ years) hands-on experience designing, testing, and implementing data ingestion pipelines on GCP Data Fusion, CDAP, or similar tools, including ingestion, parsing, and wrangling of CSV, JSON, XML, and other formats from RESTful and SOAP APIs, SFTP servers, etc.
- In-depth understanding of modern data contract best practices, with proven experience (3+ years) independently directing, negotiating, and documenting best-in-class data contracts.
- Java experience (2+ years) in development, testing, and deployment (ideally custom plugins for Data Fusion).
- Proficiency with Continuous Integration (CI), Continuous Delivery (CD), and continuous testing tools, ideally for cloud-based data solutions.
- Experience working in an Agile environment and toolset.
- Strong problem-solving and analytical skills.
- Enthusiastic willingness to learn and develop technical and soft skills rapidly and independently, as needs require.
- Strong organisational and multi-tasking skills.
- Good team player who embraces teamwork and mutual support.

Nice to Have:
- Hands-on experience with Cloud Composer/Airflow, Cloud Run, and Pub/Sub.
- Hands-on development in Python and Terraform.
- Strong SQL skills for data transformation, querying, and optimization in BigQuery, with a focus on cost- and time-effective SQL and on concurrency/data integrity (ideally in the BigQuery dialect).
- Data transformation/ETL/ELT pipeline development, testing, and implementation, ideally in BigQuery.
- Experience working in a DataOps model.
- Experience with Data Vault modelling and usage.
- Proficiency with Git for version control and collaboration.
- Proficiency with CI/CD pipeline design, creation, and maintenance in DevOps tools like Ansible and Jenkins for cloud-based applications (ideally GCP).

Posted 13 hours ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Naukri logo

Responsibilities:
- Design, implement, and manage scalable, secure, and highly available infrastructure on GCP.
- Automate infrastructure provisioning using tools like Terraform or Deployment Manager.
- Build and manage CI/CD pipelines using Jenkins, GitLab CI, or similar tools.
- Manage containerized applications using Kubernetes (GKE) and Docker.
- Monitor system performance and troubleshoot infrastructure issues using tools like Stackdriver, Prometheus, or Grafana.
- Implement security best practices across cloud infrastructure and deployments.
- Collaborate with development and operations teams to streamline release processes.
- Ensure high availability, disaster recovery, and backup strategies are in place.
- Participate in performance tuning and cost optimization of GCP resources.

Requirements:
- Strong hands-on experience with Google Cloud Platform (GCP) services; Harness is an optional skill.
- Proficiency in Infrastructure as Code tools like Terraform or Google Deployment Manager.
- Experience with Kubernetes (especially GKE) and Docker.
- Knowledge of CI/CD tools such as Jenkins, GitHub Actions, GitLab CI, or CircleCI.
- Familiarity with scripting languages (e.g., Bash, Python).
- Experience with logging and monitoring tools (e.g., Stackdriver, Prometheus, ELK, Grafana).
- Understanding of networking, security, and IAM in a cloud environment.
- Strong problem-solving and communication skills.
- Experience in Agile environments and DevOps culture.

Preferred:
- GCP Associate or Professional Cloud DevOps Engineer certification.
- Experience with Helm, ArgoCD, or other GitOps tools.
- Familiarity with other cloud platforms (AWS, Azure) is a plus.
- Knowledge of application performance tuning and cost management on GCP.

Posted 13 hours ago

Apply

2.0 - 7.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP Basis Administration
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are functioning optimally and meeting the needs of the organization. Your role will require you to stay updated on industry trends and best practices to enhance application performance and user experience.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application milestones.
- Application lifecycle management: installations, upgrades, patching, and performance optimization of S/4 HANA suites and other SAP application platforms (e.g., HANA, S/4 suite, SAP Business Suite (ERP, BW/BI, APO, PI/PO, MII, SLT, GRC, BODS, Solution Manager), SAP SaaS solutions (BTP, SSF, Cloud ALM), etc.).
- Perform integration/interface setup of SAP, non-SAP, and third-party cloud applications.
- Application and database performance optimization for SAP and the database.
- Technical configuration related to all SAP applications, databases, integrations, and data archiving servers.
- Perform sizing and configuration of SAP installations on hyperscalers (MS Azure), including backup/restore/recovery.
- Collaborate and orchestrate the work between Cargill teams and application partners.
- Coordinate and manage the platform setup for the SAP application with infrastructure platform teams on Azure.
- Perform system refreshes and client copies.
- Support the integration of solutions with various technologies.
- Raise and perform pre-go-live and post-go-live checks and optimize the systems accordingly.
- Configure backups after installations/upgrades.
- Perform Solution Manager setup for ChaRM/STMS, LMDB, and backbone connections to SAP.

Qualifications:
- Bachelor's degree in computer science, systems analysis, or a related field, or equivalent experience.
- Proven track record of delivering SAP upgrades, migrations, and other Basis-related projects.
- 10-12+ years of hands-on experience in SAP BASIS and HANA installation, upgrades, patching, and migration.
- Exposure to handling upgrades, installation, and migration of SAP technologies such as the S/4 HANA suite, SAP Business Suite (ERP, BW/BI, APO, GRC), SAP integration suites (PO/PI), BODS, BOBJ, monitoring and administration using Solution Manager, SAP SaaS (e.g., Ariba, IBP), and BTP & Cloud ALM.
- Experience with technical evaluations of applications, databases, and integrations.
- Experience with backup/restore/recovery of SAP/Oracle installations, server monitoring, and optimization techniques.
- Experience with SAP HANA, Sybase, MaxDB, MS SQL, and Oracle databases.
- Experience resolving technical issues, with deep problem-solving skills, including issues involving third parties.
- Excellent analytical, technical, and problem-solving skills; a root-cause-eradication mindset; a proactive approach; and receiver/customer centricity.
- Familiarity with ITIL concepts of service management, change management, and root cause analysis, and with ITIL tools such as ServiceNow and BMC Remedy.
- Cloud knowledge and understanding of hyperscaler platform setup for SAP applications (e.g., experience working in public cloud domains such as Microsoft Azure, AWS, and GCP).
- Experience working in a global environment and with virtual teams.
- Good communication skills to interact with cross-functional and global teams.
- Ability to work under pressure and manage multiple priorities effectively.
- Ability to work independently and as part of a team.

Professional & Technical Skills:
- Must-have: proficiency in SAP Basis Administration.
- Strong understanding of system configuration and performance tuning.
- Experience with database management and backup strategies.
- Familiarity with SAP landscape management and transport management.
- Ability to troubleshoot and resolve technical issues efficiently.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP Basis Administration.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full-time education

Posted 13 hours ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Hyderabad

Work from Office

Naukri logo

Requirements:
- Proven working experience in automation testing: 8+ years.
- Karate, Postman, TestNG, Maven, Jenkins, REST/SOAP, Java.
- Architect test frameworks for enterprise-grade APIs.
- Drive shift-left testing strategies with CI/CD integration.
- Automate regression, performance, and contract tests.
- Define QA best practices and ensure test coverage KPIs.
- Test management, Cucumber, Selenium, Java, TestNG.
- Test strategy, test plan, and test case preparation.
- API testing and automation.
- Mentor SDETs and promote a test automation culture.
- AWS cloud work experience.

Responsibilities:
- Take full ownership of test automation and the QA process, including analyzing requirements, writing and executing manual and automated tests, and reporting on test results.
- Identify, log, and track defects, ensuring timely resolution and verification of fixes.
- Gather and report on key metrics related to quality assurance.
- Make informed decisions about when and what to re-test based on defect status and project changes.
- Collaborate with team members, adapting to schedule and scope changes, and maintaining a high standard of quality throughout the development lifecycle.
- Document use cases and functional requirements, and maintain the Requirements Traceability Matrix (RTM).
- Cloud knowledge: familiarity with public cloud infrastructure (AWS/GCP).
- Agile experience: at least 4 years of experience working in Agile/Scrum environments.
- Independence: demonstrated ability to handle all QA activities with minimal supervision.
- UI automation: experience with UI automation frameworks, including making changes and enhancements.

Posted 13 hours ago

Apply

2.0 - 7.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP Governance Risk and Compliance (SAP GRC)
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application requirements are met, overseeing the development process, and providing guidance to team members. You will also engage in problem-solving activities, ensuring that the applications are aligned with business objectives and user needs, while maintaining a focus on quality and efficiency throughout the project lifecycle.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of milestones.
- Design, build, implement, and support SAP security roles, profiles, and authorizations for SAP ECC, S/4HANA, Fiori, and GRC environments.
- Manage SAP security settings; update profiles, roles, permission sets, and object- and field-level access as necessary.
- Perform security hardening activities and ensure all security settings are maintained per the baseline.
- Collaborate with cross-functional teams and provide security guidance.
- Understanding of SAP UCON, Onapsis, and UI masking tools.
- Experience creating Fiori spaces and pages.
- Experience handling non-SAP IDM tools such as SailPoint IIQ.
- Worked on one or two SAP security implementations or upgrades.

Qualifications:
- Bachelor's degree in computer science, systems analysis, or a related field, or equivalent experience.
- Proven track record of delivering SAP upgrades, migrations, and other Basis-related projects.
- 10-12+ years of hands-on experience in SAP BASIS and HANA installation, upgrades, patching, and migration.
- Exposure to handling upgrades, installation, and migration of SAP technologies such as the S/4 HANA suite, SAP Business Suite (ERP, BW/BI, APO, GRC), SAP integration suites (PO/PI), BODS, BOBJ, monitoring and administration using Solution Manager, SAP SaaS (e.g., Ariba, IBP), and BTP & Cloud ALM.
- Experience with technical evaluations of applications, databases, and integrations.
- Experience with backup/restore/recovery of SAP/Oracle installations, server monitoring, and optimization techniques.
- Experience with SAP HANA, Sybase, MaxDB, MS SQL, and Oracle databases.
- Experience resolving technical issues, with deep problem-solving skills, including issues involving third parties.
- Excellent analytical, technical, and problem-solving skills; a root-cause-eradication mindset; a proactive approach; and receiver/customer centricity.
- Familiarity with ITIL concepts of service management, change management, and root cause analysis, and with ITIL tools such as ServiceNow and BMC Remedy.
- Cloud knowledge and understanding of hyperscaler platform setup for SAP applications (e.g., experience working in public cloud domains such as Microsoft Azure, AWS, and GCP).
- Experience working in a global environment and with virtual teams.
- Good communication skills to interact with cross-functional and global teams.
- Ability to work under pressure and manage multiple priorities effectively.
- Ability to work independently and as part of a team.

Professional & Technical Skills:
- Must-have: proficiency in SAP Governance Risk and Compliance (SAP GRC).
- Strong understanding of risk management frameworks and compliance regulations.
- Experience with application design and configuration best practices.
- Ability to analyze complex business requirements and translate them into technical specifications.
- Familiarity with project management methodologies and tools.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP Governance Risk and Compliance (SAP GRC).
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full-time education

Posted 13 hours ago

Apply

2.0 - 7.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Naukri logo

A Cloud Software Developer for OpenShift is responsible for designing, developing, deploying, and maintaining cloud-native applications on Red Hat OpenShift. The role primarily involves working with containers, Kubernetes, and DevOps practices to build scalable, resilient, and secure cloud applications.

Roles & Responsibilities:
- Design and develop cloud-native applications using OpenShift.
- Containerize applications using Docker and deploy them on Kubernetes.
- Implement microservices architecture and ensure scalability.
- Develop applications using languages such as Java, Python, Go, or Node.js.
- Configure and manage OpenShift clusters.
- Develop and manage Operators to automate OpenShift workflows and enable efficient application deployment.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 2 years of industrial experience working with Unix/Linux-based products developed in C, C++, or Go.
- Minimum 2-3 years of experience leading development or support teams and troubleshooting to resolve issues.
- Good development/support experience with various network protocols (Layer 2 - Layer 5) and devices (routers, switches, firewalls, load balancers, VPN, QoS).
- Knowledge of virtualization, operating system internals, and hypervisors (KVM, z/VM, Hyper-V).
- Expertise in translating technical specifications or customer requirements, preparing HLD/LLD, and working closely with team members to translate the specifications/design into product deliverables.
- Good understanding of enterprise servers, firmware, patches, hotfixes, and security configurations.
- Proven operational experience in network operations, including incident, change, and problem management.
- Excellent analytical and problem-solving skills.
- Excellent written and verbal communication skills; ability to effectively communicate product architectures and design proposals and negotiate options at senior management levels.
- Experience working with global teams/partner labs.

Preferred technical and professional experience:
- Solid understanding of systems hardware and architecture.
- Good understanding of operating system internals/kernel (process management, memory management, virtualization, scheduling, I/O (networking and storage), security, etc.).
- Understanding of AI/ML model deployments and the AI lifecycle; hands-on experience deploying AI/ML models on the cloud.

Posted 13 hours ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Pune

Work from Office

Naukri logo

Responsibilities:
- Design, build, test, and deploy Google Cloud data models and transformations in a BigQuery environment (SQL, stored procedures, indexes, clusters, partitions, triggers).
- Deliver a data warehouse and pipelines that follow abstraction and database refactoring best practice in order to support evolutionary development and continual change.
- Protect the solution with appropriate authorization and authentication models, data encryption, and other security components; this will include consumer registration, storage of identification, and change management considerations.
- Review, refine, interpret, and implement business and technical requirements.
- Contribute to ongoing productivity and priorities by refining user stories, epics, and backlogs in Jira.
- Manage code artefacts and CI/CD using tools like Git, Jenkins, and Google Secret Manager.
- Estimate, commit, and deliver requirements to scope, quality, and time expectations.
- Deliver non-functional requirements, IT standards, and developer and support tools to ensure our applications are secure, compliant, scalable, reliable, and cost effective.
- Write well-commented, maintainable, and self-documenting code.
- Fix defects and provide enhancements during the development period, and hand over knowledge, expertise, code, and support responsibilities to the support team.

Essential Experience:
- Expert in database design, development, and administration, with an understanding of relational and dimensional data models (and preferably Data Vault).
- Expertise in on-prem or cloud databases, warehouses, and lakes.
- Excellent understanding of GCP architecture and solution design.
- Proven experience and solid knowledge in developing and optimizing SQL/T-SQL procedures in traditional or cloud databases.
- Coding and development of DDL and DML database components.
- Excellent knowledge of DevOps tools like Ansible, Jenkins, GitHub, Puppet, Chef, etc.
- IT methodology and procedural knowledge: Agile/Scrum, DevOps, and ITIL principles.
- BS/MS degree in Computer Science, Engineering, or a related subject.
- Excellent communication and interpersonal skills in English; proficiency in verbal, listening, and written English is crucial.

Posted 13 hours ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Key Responsibilities:
- Design and implement the backend services and APIs.
- Develop integration frameworks for external systems.
- Implement workflow engine and automation capabilities using Restate.
- Create an event bus and webhook system with Kafka.
- Ensure system security, performance, and reliability.

Required Skills:
- 6 years of experience with TypeScript/Node.js backend development.
- Strong experience with NestJS or similar backend frameworks.
- Experience with database design and ORM frameworks.
- Expert knowledge of API design (REST, GraphQL).
- Experience with Apache Kafka and event-driven architectures.
- Experience with Restate, or willingness to quickly become proficient.
- Strong understanding of authentication and authorization systems.
- Extensive experience with integration patterns and API development.
- Experience with Neo4j or other graph databases.
- Experience with cloud provider APIs (AWS, GCP, Azure).

Desired Skills:
- Experience with CQRS and event sourcing patterns.
- Knowledge of BPM (Business Process Management) systems.
- Experience with service mesh technologies.
- Understanding of compliance and governance frameworks.

Posted 13 hours ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: Data Analyst / Technical Business Analyst

Job Summary: We are looking for a skilled Data Analyst to support a large-scale data migration initiative within the banking and insurance domain. The role involves analyzing, validating, and transforming data from legacy systems to modern platforms, ensuring regulatory compliance, data integrity, and business continuity.

Key Responsibilities:
- Collaborate with business stakeholders, data architects, and IT teams to gather and understand data migration requirements.
- Analyze legacy banking and insurance systems (e.g., core banking, policy admin, claims, CRM) to identify data structures and dependencies.
- Work with large-scale datasets and understand big data architectures (e.g., Hadoop, Spark, Hive) to support scalable data migration and transformation.
- Perform data profiling, cleansing, and transformation using SQL and ETL tools, with the ability to understand and write complex SQL queries and interpret the logic implemented in ETL workflows.
- Develop and maintain data mapping documents and transformation logic specific to financial and insurance data (e.g., customer KYC, transactions, policies, claims).
- Validate migrated data against business rules, regulatory standards, and reconciliation reports.
- Support UAT by preparing test cases and validating migrated data with business users.
- Ensure data privacy and security compliance throughout the migration process.
- Document issues, risks, and resolutions related to data quality and migration.

Required Skills & Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Finance, or a related field.
- 5+ years of experience in data analysis or data migration projects in banking or insurance.
- Strong SQL skills and experience with data profiling and cleansing.
- Familiarity with ETL tools (e.g., Informatica, Talend, SSIS) and data visualization tools (e.g., Power BI, Tableau).
- Experience working with big data platforms (e.g., Hadoop, Spark, Hive) and handling large volumes of structured and unstructured data.
- Understanding of banking and insurance data domains (e.g., customer data, transactions, policies, claims, underwriting).
- Knowledge of regulatory and compliance requirements (e.g., AML, KYC, GDPR, IRDAI guidelines).
- Excellent analytical, documentation, and communication skills.

Preferred Qualifications:
- Experience with core banking systems (e.g., Finacle, Flexcube) or insurance platforms.
- Exposure to cloud data platforms (e.g., AWS, Azure, GCP).
- Experience working in Agile/Scrum environments.
- Certification in Business Analysis (e.g., CBAP, CCBA) or Data Analytics.

Posted 13 hours ago

Apply

10.0 - 15.0 years

4 - 8 Lacs

Pune

Work from Office

Naukri logo

We are looking for a hands-on engineer with strong knowledge of the relevant skills to deliver tasks on time and with quality as part of our team.

Primary Skills: Java, Spring Boot, Microservices, React/Angular, Docker, Kubernetes, CI/CD, Automated Unit Testing, Design and Architecture experience
Secondary Skills: AWS, GCP

Note: Work from office is required a minimum of 3 days a week.

Posted 13 hours ago

Apply

10.0 - 15.0 years

14 - 18 Lacs

Hyderabad

Work from Office

Naukri logo

Responsibilities:
- Design, develop, and maintain scalable data pipelines and systems using DBT and big data technologies.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Implement data models and transformations using DBT.
- Develop and maintain ETL processes to ingest and process large volumes of data from various sources.
- Optimize and troubleshoot data workflows to ensure high performance and reliability.
- Ensure data quality and integrity through rigorous testing and validation.
- Monitor and manage data infrastructure, ensuring security and compliance with best practices.
- Provide technical support and guidance to team members on data engineering best practices.

Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- Strong proficiency in DBT for data modeling and transformations.
- Hands-on experience with big data technologies (e.g., Hadoop, Spark, Kafka).
- Proficient in Python for data processing and automation.
- Experience with SQL and database management.
- Familiarity with data warehousing concepts and best practices.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Preferred Qualifications:
- Experience with cloud platforms (e.g., AWS, Azure, Google Cloud).
- Knowledge of data governance and security practices.
- Certification in relevant technologies (e.g., DBT, big data platforms).

Posted 13 hours ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: SAP Analytics Cloud Development
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process while ensuring alignment with organizational goals. You will also engage in strategic planning and decision-making to enhance application performance and user experience, fostering a collaborative environment that encourages innovation and efficiency.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.

Professional & Technical Skills:
- Must-have: proficiency in SAP Analytics Cloud Development.
- Strong understanding of data modeling and visualization techniques.
- Experience with integrating SAP Analytics Cloud with other SAP solutions.
- Familiarity with agile methodologies and project management tools.
- Ability to analyze business requirements and translate them into technical specifications.

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Analytics Cloud Development.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Qualification: 15 years full-time education

Posted 13 hours ago

Apply

5.0 - 10.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Naukri logo

Seeking a skilled Data Engineer to work on cloud-based data pipelines and analytics platforms. The ideal candidate will have hands-on experience in PySpark and AWS, with proficiency in designing data lakes and working with modern data orchestration tools.

Posted 13 hours ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

Naukri logo

We are looking for a Team Lead to lead a team of 4-5 developers in a dynamic and fast-paced environment. This role requires hands-on experience with modern technologies including cloud, CI/CD, Kafka, Spring Boot, and Oracle SQL. The ideal candidate will be responsible for managing and guiding the team, as well as actively contributing to the development process through coding, building proofs of concept (POCs), and writing detailed MDDs (Model Driven Designs). The candidate must be able to balance leadership and technical expertise to drive the team's success in delivering high-quality solutions.

Responsibilities:
- Lead and manage a team of 4-5 developers, providing guidance and mentorship to ensure timely delivery of high-quality solutions.
- Design, develop, and maintain applications using Spring Boot, cloud services (AWS/Azure/GCP), Oracle SQL, and Kafka for distributed systems.
- Develop proofs of concept (POCs) to evaluate new technologies and solutions, ensuring technical feasibility before implementation.
- Write and maintain detailed technical documentation, including MDDs (Model Driven Designs), to clearly communicate architecture and design decisions.
- Conduct regular code reviews and ensure adherence to best practices, promoting high-quality code, scalability, and maintainability.
- Collaborate with cross-functional teams, including product managers and architects, to translate business requirements into technical solutions.
- Monitor team performance, provide constructive feedback, and foster a culture of continuous learning and professional development.
- Stay current with industry trends and emerging technologies, proposing innovative solutions to improve application performance and development practices.

Posted 13 hours ago

Apply

3.0 - 6.0 years

8 - 11 Lacs

Bengaluru

Work from Office

Naukri logo

As a Delivery Consultant, you will work closely with IBM clients and partners to design, deliver, and optimize IBM Technology solutions that align with your clients' goals. In this role, you will apply your technical expertise to ensure world-class delivery while leveraging your consultative skills, such as problem-solving, issue-/hypothesis-based methodologies, communication, and service orientation. As a member of IBM Technology Expert Labs, a team that is client-focused, courageous, pragmatic, and technical, you'll collaborate with clients to optimize and trailblaze new solutions that address real business challenges. If you are passionate about success, both in your career and in solving clients' business challenges, this role is for you. A 'day in the life' of this opportunity may include, but is not limited to:

- Solving client challenges effectively: understanding clients' main challenges and developing solutions that help them reach true business value by working through the phases of design, development, integration, implementation, migration, and product support with a sense of urgency.
- Agile planning and execution: creating and executing agile plans where you are responsible for installing and provisioning assets, testing, migrating to production, and day-two operations.
- Technical solution workshops: conducting and participating in technical solution workshops.
- Building effective relationships: developing successful relationships at all levels, from engineers to CxOs, with experience navigating challenging debate to reach healthy resolutions.
- Self-motivated problem solver: demonstrating a natural bias towards self-motivation, curiosity, and initiative, in addition to navigating data and people to find answers and present solutions.
- Collaboration and communication: strong collaboration and communication skills as you work across the client, partner, and IBM teams.

Required education: Bachelor's Degree

Required technical and professional expertise:
- Deep understanding of midrange storage systems (Dell EMC PowerStore, Unity, VNX, NetApp FAS, HPE Nimble, 3PAR); installation, configuration, and maintenance of storage arrays, disk shelves, and related hardware; knowledge of storage protocols (FC, iSCSI, NFS, SMB); and familiarity with RAID levels.
- Proficiency in storage management software (Unisphere, System Manager, CLIs); experience with storage provisioning, capacity planning, performance tuning, snapshots, replication, and data migration; and understanding of storage monitoring and alerting tools.
- Basic understanding of operating systems (Windows, Linux, VMware), virtualization concepts (VMware vSphere, Hyper-V), and storage integration with virtualized environments.
- Strong analytical and problem-solving skills for diagnosing and resolving storage issues and performing root cause analysis, plus experience with performance monitoring and capacity planning.
- Basic understanding of networking concepts (TCP/IP, subnetting, VLANs) and knowledge of Fibre Channel and iSCSI networking.
- Ability to create clear documentation, excellent communication skills, and the ability to work in a team.

Preferred technical and professional experience:
- Experience with scripting (PowerShell, Python, Bash) for automation and knowledge of IaC tools (Ansible, Terraform).
- Familiarity with cloud storage and integration with on-premises environments; experience with AWS, Azure, GCP, and hybrid cloud solutions.
- Advanced knowledge of storage performance tuning and experience with performance analysis tools.
- In-depth knowledge of data protection and disaster recovery, experience with backup/recovery software (Veeam, Commvault), and business continuity planning.
- Relevant storage certifications (Dell EMC Proven Professional, NetApp NCDA/NCIE, HPE ASE).
- Understanding of storage security, encryption, and access control.
- Understanding of storage interaction with databases (SQL, Oracle).
- Experience with Brocade or Cisco SAN fabric switches.

Posted 13 hours ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies