7.0 - 10.0 years
20 - 25 Lacs
Pune
Work from Office
Senior DevOps Engineer
Experience: 7 to 10 years | Job category: IT | Job Type: Full-Time | Job Location: Pune | Number of Positions: 5

Job Description: The ideal candidate will have a strong background in cloud infrastructure, automation, and DevOps tools. This role requires expertise in designing, implementing, and managing scalable DevOps solutions to meet diverse project requirements.

Skills: AWS, GCP, Azure

Responsibilities:
- Manage, monitor, and optimize cloud infrastructure across AWS, Azure, and GCP.
- Design and implement CI/CD pipelines using tools such as GitHub Actions, Jenkins, and ArgoCD to streamline deployment processes.
- Develop and manage Infrastructure-as-Code (IaC) using Terraform, ensuring scalable and repeatable infrastructure deployment.
- Work with containerization technologies like Docker and orchestrate environments using platforms such as Kubernetes or other container orchestration tools.
- Automate and manage configurations using tools like Ansible, Chef, or Puppet.
- Support and integrate log monitoring and observability tools such as ELK, Grafana, Splunk, Prometheus, and Elasticsearch.
- Collaborate with cross-functional teams to design and implement infrastructure architectures based on business requirements.
- Troubleshoot and optimize deployment processes and ensure high availability and scalability of systems.
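The Terraform-based Infrastructure-as-Code work this role describes can be illustrated with a small sketch. Terraform accepts JSON-syntax configuration files (`.tf.json`), so a Python script can generate resource definitions programmatically; the bucket, resource label, and tag names below are illustrative and not part of the posting.

```python
import json

def make_s3_bucket_config(bucket_name: str, env: str) -> dict:
    """Build a minimal Terraform JSON-syntax (.tf.json) definition
    for an AWS S3 bucket; names and tags here are illustrative."""
    return {
        "resource": {
            "aws_s3_bucket": {
                "artifact_store": {
                    "bucket": bucket_name,
                    "tags": {"Environment": env, "ManagedBy": "terraform"},
                }
            }
        }
    }

# Writing this dict to e.g. main.tf.json would let `terraform plan` pick it up.
config = make_s3_bucket_config("demo-artifacts", "staging")
print(json.dumps(config, indent=2))
```

Generating `.tf.json` this way is one simple route to repeatable infrastructure definitions when many near-identical resources are needed.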
Posted 1 week ago
1.0 - 5.0 years
18 - 20 Lacs
Chennai
Work from Office
Responsibilities:
- Design, implement, and maintain CI/CD pipelines to facilitate automated build, test, and deployment processes.
- Collaborate with development teams to understand application requirements and architect infrastructure solutions that support scalability, reliability, and performance.
- Configure and manage cloud-based infrastructure (AWS, Azure, Google Cloud, etc.) using Infrastructure-as-Code (IaC) tools like Terraform or CloudFormation.
- Monitor system performance, identify bottlenecks, and take proactive actions to optimize infrastructure and application performance.
- Implement and manage containerization platforms (Docker, Kubernetes) to facilitate application deployment and scaling.
- Maintain and improve configuration management systems (Ansible, Puppet, Chef) for efficient management of infrastructure.
- Implement and manage monitoring and logging solutions to ensure the availability and reliability of systems.
- Collaborate with security teams to implement best practices for infrastructure and application security.
- Automate repetitive tasks to increase efficiency and reduce manual intervention.
- Participate in on-call rotations and incident response to handle system emergencies and ensure high availability.

Qualifications and skills:
- Bachelor's degree (required)
- 1-5 years of related experience (required)
- Collaboration; Continuous Deployment; Continuous Integration; Script Programming; Teamwork
- AWS CloudFormation; Build Automation; Infrastructure as Code (IaC); Red Hat Ansible; Terraform
- 24x7 shift

IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide.
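The build-test-deploy gating that CI/CD pipelines enforce can be sketched in a few lines of Python. The stage names and pass/fail flags below are illustrative and not tied to any specific tool named in the posting.

```python
def run_pipeline(stages):
    """Run (name, fn) stages in order and stop at the first failure,
    mimicking how a CI/CD pipeline gates deployment on earlier stages."""
    results = {}
    for name, stage in stages:
        ok = stage()
        results[name] = "passed" if ok else "failed"
        if not ok:
            break  # downstream stages (e.g. deploy) never run
    return results

stages = [
    ("build", lambda: True),
    ("test", lambda: False),   # a failing test halts the run
    ("deploy", lambda: True),  # never reached
]
result = run_pipeline(stages)
print(result)  # {'build': 'passed', 'test': 'failed'}
```

Real pipelines (Jenkins, GitHub Actions, etc.) express the same ordering and short-circuit behavior declaratively in their own config formats.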
Posted 1 week ago
10.0 - 11.0 years
15 - 16 Lacs
Pune
Work from Office
METRO Global Solution Center IN is looking for an Expert, Profit Recovery Audit to join our dynamic team and embark on a rewarding career journey.
- Provide specialized expertise and advice in a particular field or industry.
- Analyze complex problems and develop effective solutions.
- Collaborate with stakeholders to implement best practices and strategies.
- Conduct research and stay updated on industry trends and advancements.
- Mentor and support team members in their professional development.
- Develop and present reports, recommendations, and technical documentation.
- Ensure compliance with relevant regulations and standards.
Posted 1 week ago
1.0 - 4.0 years
5 - 8 Lacs
Ghaziabad
Work from Office
RINA is currently recruiting for an India Certification Site Operations Coordinator to join its office in Pithampur within the International Certification Division.

Mission: To carry out product audit/verification at the customer site as per the established QA plan.

Key Accountabilities:
- To carry out product audit at identified stages and segregate NC products
- To ensure OK products are duly identified by seal/sticker
- To ensure adequate housekeeping at the work station
- To ensure adequate handling of all measuring equipment and gauges
- To prepare timely reports for the work done
- To have a clear understanding of all relevant documents

Education: High School Diploma/GED in General Studies/Other

Competencies:
- CLIENT INTIMACY - Embrace internal and external client needs, expectations, and requirements to ensure maximum satisfaction
- EARN TRUST - Take everyone's opinion into account and remain open to diversity
- PROMOTE SUSTAINABLE DEVELOPMENT - Promote commitment by keeping promises as a role model
- MANAGE EMOTIONS - Recognise one's own and others' emotions and express and regulate one's reactions
- PIONEER CHANGE - Actively embrace change and benefit from the new circumstances
- BUILD NETWORK - Forge trust relationships, across departments, and outside the organization
- MAKE EFFECTIVE DECISIONS - Structure activities according to priorities, actions, resources and constraints
- ADDRESS THE WAY - Have a big picture of different situations and reinterpret it in a perspective way
- THINK FORWARD - Capitalise on experiences and translate them into action plans for the future

RINA is a multinational company providing a wide range of services in the energy, marine, certification, infrastructure & mobility, industry, research & development sectors. Our business model covers the full process of project development, from concept to completion.
Posted 1 week ago
9.0 - 15.0 years
12 - 13 Lacs
Bengaluru
Work from Office
Job Title: Manager - MEP

Job Description Summary: We are looking for a skilled and knowledgeable MEP Manager to oversee all Mechanical, Electrical, and Plumbing aspects of our hotel infrastructure projects. The ideal candidate will have a strong educational background in Mechanical or Electrical Engineering and proven expertise in delivering high-quality MEP works in hospitality projects.

Location: Chikkamagalur

Job Description:
- Plan, execute, and manage all MEP works for hotel infrastructure projects in line with the overall construction program.
- Review MEP designs, drawings, and technical submittals to ensure compliance with project specifications and industry standards.
- Coordinate with architects, structural engineers, consultants, and contractors to ensure effective integration of MEP systems into the project.
- Supervise installation, testing, and commissioning of HVAC, electrical, plumbing, fire-fighting, and low-voltage systems.
- Monitor MEP work progress, quality, and adherence to the project schedule and budget.
- Identify technical issues and provide timely, practical solutions in coordination with project stakeholders.
- Ensure all MEP works are carried out in compliance with local codes, safety regulations, and quality standards.
- Conduct regular site inspections and track contractor performance against project requirements.
- Organize coordination meetings with consultants, contractors, and vendors to resolve MEP-related issues.
- Maintain accurate and up-to-date documentation including drawings, reports, inspection records, and equipment logs.
- Support procurement and review of MEP materials, equipment, and vendor submittals to ensure technical suitability.
- Work closely with the client and internal teams to ensure that all MEP systems meet the functionality and service expectations of a hotel project.
- Promote energy-efficient solutions and sustainable practices in the design and implementation of MEP systems.
About You:
- Degree in Mechanical or Electrical Engineering from a recognized university.
- Proven experience in managing MEP works for hotel or hospitality infrastructure projects.
- Strong knowledge of HVAC, electrical, plumbing, fire-fighting, and building automation systems.
- Skilled in reading and interpreting technical drawings and specifications.
- Familiar with industry standards, codes, and safety practices related to MEP systems.
- Excellent coordination, team management, and communication skills.
- Strong organizational and problem-solving abilities.
- Ability to provide effective, timely, and reliable support to stakeholders.
- Proficient in MS Office, AutoCAD, and other relevant MEP software tools.

Why join Cushman & Wakefield? As one of the leading global real estate services firms transforming the way people work, shop and live, working at Cushman & Wakefield means you will benefit from:
- Being part of a growing global company;
- Career development and a promote-from-within culture;
- An organisation committed to Diversity and Inclusion.

We're committed to providing work-life balance for our people in an inclusive, rewarding environment. We have a vision of the future, where people simply belong.
Posted 1 week ago
5.0 - 9.0 years
7 - 11 Lacs
Pune
Work from Office
Greetings from Synergy Resource Solutions, a leading HR Management Consultancy. Our client is a global leader in water technology and services for industrial and infrastructure markets. With a focus on solving water scarcity through desalination, water reuse, and zero liquid discharge (ZLD), they have executed over 2,000 projects across 60 countries. Leveraging the experience and know-how gained over 40 years handling some of the most difficult-to-treat waters, they help some of the world's most recognized companies reduce their water and carbon footprint, ultimately reducing water risk.

Position: Assistant Manager
Department: Contracts
Qualifications: B.E., LLB, etc.
Location: Pune
Experience required: 6-8 years

Key job responsibilities:
- Review terms and conditions of different types of domestic and overseas client contracts, NDAs, MOUs, etc., as per company guidelines.
- Identify risk areas and propose alternatives for their mitigation.
- Discuss document finalisation with clients.
- Review and negotiate vendor deviations.
- Identify flow-down conditions from client contracts to sub-orders.
- Prepare contractual letters for projects.
- Assist the contracts HOD in day-to-day work and other activities as may be assigned.

Skills & knowledge required:
- Knowledge of / experience with EPC contracts, Supply contracts, O&M, and BOOT contracts preferred.
- Good communication skills.

If interested, please share your CV with details of total experience, current salary, expected salary and notice period.
Posted 1 week ago
3.0 - 6.0 years
4 - 7 Lacs
Mumbai, Navi Mumbai
Work from Office
Overview: The role is for an Active Directory Engineer in Core Infra IAM operations within Identity & Access Technology Services, responsible for the operation of the Active Directory environment and related technologies, and for the delivery of a tiered admin solution and other global solutions for both on-premises and cloud identity platforms. The successful candidate must possess relevant experience of operating enterprise-scale identity platforms.

Role / Principal Accountabilities: The Core Infra IAM function is responsible for operating all aspects of core infrastructure components relating to Identity and Access Management. The candidate must be a highly self-motivated team player with good oral and written skills and the confidence to present to management. The candidate should also have a good sense of discipline for change control procedures. A flexible work ethic is required, with the ability to form part of a multi-platform global function.

Skills & Experience Required:
- Demonstrate an ability to work well as part of a global team, and on their own when required.
- Strong oral and written communication.
- Enthusiastic, eager and personable.
- Ability to cope well under pressure.

Essential:
- Subject-matter expert knowledge of Active Directory related technologies (i.e., Group Policy, DHCP, DNS, Active Directory enterprise design principles, etc.)
- Subject-matter expert knowledge of Active Directory OU structures and delegation models, and understanding of the impact of moving objects within OU structures.
- Subject-matter expert knowledge of Windows Server technologies.
- Experience in using and developing automation and scripting (PowerShell) to drive efficiencies.

Desirable:
- Enterprise-scale technical experience designing and deploying tiered administrative / Enterprise Access Models.

Extra Notes (if applicable): Please contact us if you are visiting our offices and require any form of personal assistance or physical adaptations to be provided for your appointment.
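The OU-structure work this role calls out often starts with parsing LDAP distinguished names. A minimal sketch of extracting the OU hierarchy from a DN (the DN shown is invented for illustration; the naive comma split here does not handle escaped commas inside attribute values):

```python
def ou_path(distinguished_name: str) -> list:
    """Extract the OU hierarchy (outermost first) from an LDAP
    distinguished name, e.g. when auditing where AD objects live.
    Note: a DN lists the innermost OU first, hence the reversal."""
    parts = [p.strip() for p in distinguished_name.split(",")]
    ous = [p[3:] for p in parts if p.upper().startswith("OU=")]
    return list(reversed(ous))

dn = "CN=svc-backup,OU=Service Accounts,OU=Tier0,DC=corp,DC=example,DC=com"
print(ou_path(dn))  # ['Tier0', 'Service Accounts']
```

In practice this kind of check would run in PowerShell against the directory itself; the sketch only shows the parsing logic.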
A member of staff will be happy to help.
Posted 1 week ago
1.0 - 4.0 years
3 - 6 Lacs
Bengaluru
Work from Office
Job Description: The Executive - Admin is responsible for managing branch renovation projects and ensuring the seamless operation of essential infrastructure systems such as water, sewage, electrical, and civil works. The role includes oversight of licensing and regulatory compliance under various state laws, vendor coordination, preventive maintenance, facility upkeep, and health and safety management across showrooms, service centers, and head office premises.

1. Licensing and Registration
- Apply for, renew, and amend licenses under various state acts including BBMP, KSPCB, KIADB, Panchayath, and the Factory Act.
- Coordinate inspections and documentation with Factory Inspectors.
- Handle lease agreement renewals and maintain updated records.
- Manage e-Manifest tracking for hazardous waste compliance.
2. Regulatory Compliance and Inspections
- Ensure all locations comply with local traffic, environmental, and safety regulations.
- Coordinate statutory inspections and ensure rectification of audit observations.
- Maintain up-to-date knowledge of applicable state and local compliance requirements.
3. Construction, Renovation, and Maintenance
- Supervise civil work and renovation of showrooms, service centers, and offices.
- Oversee installation and maintenance of critical systems including plumbing, electrical, and drainage.
- Ensure work quality, adherence to safety norms, and completion within timelines.
4. Facility Management
- Oversee AMC and procurement for facility infrastructure (furniture, equipment, etc.).
- Maintain DG sets, fire extinguishers, lifts, uniforms, security, and h
Posted 1 week ago
11.0 - 13.0 years
10 - 14 Lacs
Hyderabad
Work from Office
We are always looking upward. And that starts with finding the right talent to help us get there. We are seeking a skilled Cloud Engineer to join our dynamic team. The ideal candidate will have a strong background in cloud computing, with experience in designing, implementing, and managing cloud-based solutions. The Cloud Engineer will work closely with our IT team to ensure the reliability, security, and scalability of our cloud infrastructure.

Responsibilities:
- Design, develop, and deploy modular cloud-based systems.
- Develop and maintain cloud solutions in accordance with best practices.
- Ensure efficient functioning of data storage and process functions in accordance with company security policies and best practices in cloud security.
- Identify, analyze, and resolve infrastructure vulnerabilities and application deployment issues.
- Regularly review existing systems and make recommendations for improvements.
- Interact with clients, provide cloud support, and make recommendations based on client needs.
- Automate tasks using scripting and configuration management tools (e.g., Terragrunt, Terraform).
- Monitor and manage cloud infrastructure performance and security.
- Collaborate with development teams to ensure applications are designed for the cloud.

Required Skills:
- Positive, growth-oriented mindset
- Excellent problem-solving skills and attention to detail
- Robust technical acumen in cloud/on-prem DevSecOps software development and technologies
- Proven experience as a Cloud Engineer or similar role
- Empathetic technical leadership and talent development
- Boundless intellectual curiosity to continually explore how to move better, faster and cheaper
- Collaborative voice for solution integrity utilizing best practices and industry standards
- Experience with cloud platforms such as AWS or Azure
- Strong understanding of cloud security and compliance requirements
- Proficiency in scripting languages (e.g., Python, Bash)
- Experience with CI/CD tools and processes
- Familiarity with containerization technologies (e.g., Docker, Kubernetes)
- Strong communication and collaboration skills
- Passion for building modular, configurable Cloud and DevSecOps solutions utilizing the latest technologies

Required Experience & Education:
- 11-13 years of overall experience in technology
- Proven experience with architecture, design, and development of large-scale enterprise application solutions
- College degree (Bachelor's) in related technical/business areas, or equivalent work experience
- Hands-on experience engineering/developing modern on-prem/public cloud DevSecOps solutions
- A willing-to-learn, go-above-and-beyond attitude
- Experience with modern and legacy development languages

Desired Experience:
- Exposure to AWS
- Experience with serverless architectures and microservices
- Strong knowledge of database management and data integration
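Automating tasks against cloud APIs, one of the scripting responsibilities listed above, commonly means retrying transient failures with exponential backoff. A minimal Python sketch (the attempt counts, delays, and the `flaky` function are illustrative):

```python
import time

def retry(attempts=3, base_delay=0.01):
    """Decorator that retries a flaky call with exponential backoff,
    a common pattern when scripting against cloud APIs that throttle
    or fail transiently. Delays here are tiny for demonstration."""
    def wrap(fn):
        def inner(*args, **kwargs):
            for i in range(attempts):
                try:
                    return fn(*args, **kwargs)
                except Exception:
                    if i == attempts - 1:
                        raise  # out of retries, surface the error
                    time.sleep(base_delay * (2 ** i))
        return inner
    return wrap

calls = {"n": 0}

@retry(attempts=3)
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient API error")
    return "ok"

print(flaky(), "after", calls["n"], "calls")  # ok after 3 calls
```

Production scripts usually also cap the total delay and retry only on specific error types rather than any `Exception`.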
Posted 1 week ago
8.0 - 10.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Job Title: CloudOps & InfoSecurity Lead

Job Description: We are seeking a skilled CloudOps & InfoSecurity Lead to oversee our cloud operations and information security initiatives. The ideal candidate will be responsible for designing, implementing, and managing cloud infrastructures across multiple environments while ensuring compliance with security protocols. You will collaborate closely with developers, operations teams, and other stakeholders to enhance our cloud capabilities and security posture.

Key Responsibilities:
- Lead the design and implementation of cloud infrastructure solutions using industry best practices.
- Develop and enforce cloud security policies and access controls to safeguard sensitive data.
- Monitor and manage cloud environments, ensuring availability, performance, and security.
- Conduct regular security assessments and audits to identify vulnerabilities and implement remediation plans.
- Collaborate with DevOps teams to integrate security into the software development lifecycle.
- Provide guidance and support to cross-functional teams on cloud and security best practices.
- Stay updated on the latest trends and technologies in cloud computing and information security.

Skills:
- Extensive experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Strong understanding of cloud architecture and service models (IaaS, PaaS, SaaS).
- Proficiency in security frameworks and best practices, such as ISO 27001, NIST, or CIS.
- Experience with identity and access management (IAM) solutions.
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Knowledge of scripting languages, such as Python or Bash, to automate tasks.
- Excellent problem-solving skills and the ability to work under pressure.

Tools:
- Cloud management platforms (AWS Management Console, Azure Portal, Google Cloud Console).
- Security tools (firewalls, intrusion detection/prevention systems, vulnerability scanners).
- Configuration management tools (Ansible, Terraform, Chef).
- Monitoring and logging tools (CloudWatch, Datadog, Splunk).
- CI/CD tools (Jenkins, GitLab CI, CircleCI).

Join us to lead our efforts in cloud operations and information security, ensuring the integrity and security of our systems in a rapidly evolving technological landscape.
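The security assessments this role describes often include scanning IAM policies for over-broad grants. A minimal sketch against an AWS-style policy document (the policy shown is invented for illustration, not a real finding):

```python
def overly_broad_statements(policy: dict) -> list:
    """Return indices of statements in an AWS-style IAM policy that
    allow every action ('*'), a common finding in security reviews."""
    flagged = []
    for i, stmt in enumerate(policy.get("Statement", [])):
        actions = stmt.get("Action", [])
        if isinstance(actions, str):
            actions = [actions]  # Action may be a string or a list
        if stmt.get("Effect") == "Allow" and "*" in actions:
            flagged.append(i)
    return flagged

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow", "Action": "s3:GetObject", "Resource": "*"},
        {"Effect": "Allow", "Action": "*", "Resource": "*"},
    ],
}
print(overly_broad_statements(policy))  # [1]
```

A real scanner would also flag service-level wildcards like `s3:*` and wildcard resources; this checks only the most severe case.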
Posted 1 week ago
7.0 - 12.0 years
12 - 16 Lacs
Gurugram
Work from Office
About the Role:
Grade Level (for internal use): 10

The Team: The TechOps team is responsible for cloud infrastructure provisioning and maintenance, in addition to providing high-quality technical support across a wide suite of products within the PVR business segment. The TechOps team works closely with a highly competent Client Services team and the core project teams to resolve client issues and improve the platform. Our work helps ensure that all products are provided a high-quality service, maintaining client satisfaction. The team is responsible for owning and maintaining our cloud-hosted apps.

The Impact: This is an extremely critical role that helps drive a positive client experience by creating and maintaining high availability of business-critical services/applications.

What's in it for you: The role provides the successful candidate with:
- Opportunity to interact and engage with senior technology and operations users
- Work on the latest in technology like AWS, Terraform, Datadog, Splunk, Grafana, etc.
- Work in an environment which allows for complete ownership and scalability

What We're Looking For

Basic Required Qualifications:
- Total 7+ years of experience required, with at least 4+ years in infrastructure provisioning and maintenance using IaC in AWS.
- Building (and supporting) AWS infrastructure as code to support our hosted offering.
- Continuous improvement of infrastructure components, cloud security, and reliability of services.
- Operational support for cloud infrastructure including incident response and maintenance.
- The candidate needs to be an experienced technical resource (Java, Python, Oracle, PL/SQL, Unix) with a strong understanding of ITIL standards such as incident and problem management.
- Ability to understand complex release dependencies and manage them automatically by writing relevant automations.
- Drive and take responsibility for support and monitoring tools.
- Should have exposure to hands-on fault diagnosis, resolution, knowledge sharing and delivery in a high-pressure, client-focused environment.
- Extensive experience of working on mission-critical systems.
- Involve in and drive RCA for repetitive incidents and provide solutions.
- Driving excellent levels of service to the business, effective management & technology strategy development, and ownership through defined process.
- Good knowledge of SDLC, agile methodology, CI/CD and deployment tools like GitLab, GitHub, ADO.
- Knowledge of networks, databases, storage, management systems, services frameworks, and cloud technologies.

Additional Preferred Qualifications:
- Keen problem solver with an analytical nature and excellent problem-solving skillset.
- Able to work flexible hours, including some weekends and possibly public holidays, to meet service level agreements.
- Excellent communication skills, both written and verbal, with the ability to represent complex technical issues/concepts to non-tech stakeholders.

About S&P Global Market Intelligence: At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.

What's In It For You

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People:

Our Values: Integrity, Discovery, Partnership

At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here. ---- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ---- IFTECH103.1 - Middle Management Tier I (EEO Job Group)
Posted 1 week ago
1.0 - 6.0 years
19 - 22 Lacs
Hyderabad
Work from Office
Overview

Main Purpose of the Role: The Cloud Solution Architect will contribute to the design, implementation, and support of cloud-native applications and applications transitioning from on-premises to PepsiCo's public cloud. This role is part of PepsiCo's Cloud and SAP Infra Engineering organization. The candidate will assist the Cloud Solution Architects in developing cloud solutions that align with PepsiCo's infrastructure and platform guidelines. Working in collaboration with Enterprise Solution Architecture, Infrastructure Engineering, and Capability Teams, the Cloud Solution Engineer will focus on learning and supporting cloud deployments while ensuring compliance with PepsiCo's standards for security, networking, and infrastructure best practices.

Other Relevant Scope/Measures: This role will support a multi-year Cloud First Strategy and Cloud Transformation for PepsiCo, including cloud adoption at scale with Cloud Velocity.

Responsibilities / Accountabilities:
- Support the development, improvement, and maintenance of cloud-native technology standards under the guidance of senior architects.
- Assist in delivering cloud solutions by contributing to design reviews and proposals that comply with PepsiCo's platform standards and security guidelines.
- Help implement and validate architecture components across areas such as identity and access management, networking, security, and data integration.
- Participate in creating and maintaining cloud solution patterns, learning to integrate industry best practices for security and performance.
- Conduct application portfolio analysis to assist in identifying applications and workloads eligible for cloud migration.
- Collaborate on application modernization projects under the direction of senior architects.
- Contribute to cloud migration efforts, including infrastructure, workload, data, and application movement to cloud IaaS and PaaS environments.
- Gain exposure to automation tools such as Infrastructure as Code (IaC), CI/CD processes, and cloud configuration management.
- Work closely with the Cloud Engineering team and application/business teams to help ensure cloud environments meet functional requirements.
- Learn how to prepare architectural documentation and present solutions by supporting senior architects.
- Develop secure, scalable, and reliable cloud environments across core domains such as networking, data, and applications.

Qualifications / Experience:
- Bachelor's degree.
- 1+ years of experience in IT architecture, infrastructure, or cloud strategy.
- Foundational knowledge of public cloud platforms and interest in learning about migration patterns, tools, and best practices.

Mandatory Technical Skills:
- Basic knowledge of cloud platforms like Azure, AWS, or Google Cloud and their services (e.g., compute, networking, storage).
- Exposure to cloud application development solutions (e.g., IaaS, serverless, API management), container technologies (e.g., Docker or Kubernetes), or CI/CD pipelines (e.g., Azure DevOps).
- Familiarity with IT systems and application lifecycle concepts, and a willingness to learn advanced cloud solution designs.
- Strong technical aptitude and the ability to quickly learn new tools and emerging cloud technologies.

Mandatory Non-Technical Skills:
- Good communication and interpersonal skills to collaborate within a team and across departments.
- Good collaboration and partnership skills to foster key relationships with other technology, application, and business teams.
- Troubleshooting skills with an eagerness to learn problem-resolution techniques for technology issues.
- Ability to adapt to a fast-paced work environment while maintaining attention to detail.
- Proficiency in documentation and translating technical ideas into easily understandable concepts.

Soft skills:
- Eagerness to understand business needs and translate them into cloud-enabled solutions with mentorship from senior team members.
Ability to work on assigned tasks with guidance while gradually gaining independence in solving problems and designing solutions. Collaborative mindset to build relationships with team members, technology leaders, and business stakeholders.
Posted 1 week ago
12.0 - 17.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Overview Accountable for the technical management of the PepsiCo Global template solution. Accountable for the deployment of SAP upgrades for the Sector, partnering with global and DevSecOps team. Accountable for E2E delivery for all technical components related to S4 migration project for multiple sectors in parallel and ensured robust architecture and process to avoid any impact to BAU Oversee the basis team for high-quality installation, configuration and maintenance of systems throughout the entire lifecycle from pursuit/estimation to deployment/hypercare. Responsibilities Drive the end to end integration and connectivity of applications across the S/4 platform and external legacy applications. Lead Technical Delivery for S4 migration in all sectors (Tech Design, Test Support, TCO, Systems Management). Coordinate the PGT technical project activities for the entire project lifecycle from pursuit/estimation to deployment/Hypercare and liaise with the other project streams to ensure technical readiness at each stage of the project. Support System Management deployment, responsible for coordinating amongst members of different teams to address and resolve any issues or problems, escalating to vendors, partners and additional support required. Partner with other IT Orgs and cloud partners/vendors to plan and deliver comprehensive technical support for the project, understanding and predicting the needs of project execution. Accountable for the technical management of the PepsiCo Global template solution. Accountable for the deployment of SAP upgrades for the Sector, partnering with global and DevSecOps team. 
Qualifications Minimum of 12 years of related IT work experience, including significant experience managing teams (5+ years) and large-scale technical projects. 10+ years of SAP experience with cross-functional knowledge in application development, middleware, system administration, and infrastructure; S/4 & HANA experience a plus. Minimum of 5 years of strong hands-on development experience. Experience working in ECC 6.0 or S/4 versions. Ability to work collaboratively across project teams. Experience in the CPG industry preferred. Proficient in written & spoken English. Knowledge of integration points between business processes. Solves highly complex problems within their work team. Knowledge of and experience with the following tools, disciplines, and processes are preferred: SAP Solution Manager, SAP ChaRM, ITIL tools & processes, HP Service Manager, quality management tools, CPG industry experience, and project implementation and production support experience.
Posted 1 week ago
5.0 - 10.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Overview Enterprise Data Operations Analyst. Job Overview: As an Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse. Responsibilities Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies. Governs data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. 
Provides and/or supports data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Supports assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools. Contributes to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, consumer privacy-by-design principles (PII management), and all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create source-to-target mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization. 
Qualifications 5+ years of overall technology experience that includes at least 2+ years of data modeling and systems architecture. 2+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools. 2+ years of experience developing enterprise data models. Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as PowerBI).
Posted 1 week ago
8.0 - 13.0 years
8 - 13 Lacs
Hyderabad
Work from Office
Overview As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse. Responsibilities Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies. Governs data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provides and/or supports data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. 
Supports assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools. Contributes to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, consumer privacy-by-design principles (PII management), and all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create source-to-target mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization. Qualifications 8+ years of overall technology experience that includes at least 4+ years of data modeling and systems architecture. 3+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience developing enterprise data models. 
Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as PowerBI).
Posted 1 week ago
14.0 - 19.0 years
16 - 20 Lacs
Hyderabad
Work from Office
Overview Design, development, and maintenance of the Cloud Marketplace Platform and cloud infrastructure automation for different OS platforms and DB technologies across PepsiCo enterprise Azure, AWS, and private cloud environments using Ansible. Responsibilities Develop Ansible playbooks and roles for automating OS offerings for Windows/Linux and DB offerings, including Oracle, SQL Server, and NoSQL technologies in the cloud (both IaaS and PaaS), using Ansible and Terraform. Develop services and tools (using Ansible, Terraform, Azure Workbooks, and Microsoft Forms) to provide an excellent user experience incorporating self-service, automated reporting, collaboration, isolation where appropriate, and a fully auditable platform. Design end-to-end (full-stack) automation solutions for PepsiCo Cloud Marketplace Essential Services (ESS) integration and offerings, Azure and AWS infrastructure foundation, and cloud services including tagging, cloud onboarding, and policies. Develop Azure CI/CD pipelines, branching standards, and coding standards. Develop, deploy, and maintain Terraform modules, PowerShell scripts, Terraform schemas, and ARM templates. Operationalize automation solutions. Collaborate/partner with enterprise IT services teams, InfoSec, application teams, CAVO teams, digital platform teams, and sector application teams, as well as managed service providers and cloud providers. Modernize and automate PepsiCo infrastructure services provisioning to provide faster provisioning capabilities and improve supportability. Responsible for Ansible Automation Platform (AAP) and AWX environments, including developing Execution Environments (EE) as needed for different applications. Qualifications 14+ years of automation and software development experience, including Ansible and Terraform. 8+ years of experience administering Ansible infrastructure. 5+ years of experience creating Terraform modules and config files. 5+ years of experience automating IT infrastructure: compute, network, storage, and databases. 
5+ years of experience with Azure and AWS IaaS/PaaS services. 5+ years of experience with Azure DevOps (ADO), including version control.
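Self-service tooling of the kind described above often sits in front of Terraform. One common pattern is generating a `terraform.tfvars.json` file (a format Terraform natively reads for variable values) from a user request; a minimal Python sketch follows, where the variable names (`environment`, `vm_size`, `tags`) are hypothetical module inputs:

```python
# Sketch: render a terraform.tfvars.json payload from a self-service request.
# Terraform reads *.tfvars.json natively; the variable names used here are
# hypothetical module inputs, not an actual PepsiCo schema.
import json

def render_tfvars(request: dict) -> str:
    allowed_envs = {"dev", "qa", "prod"}
    env = request["environment"]
    if env not in allowed_envs:  # guardrail: reject unknown environments
        raise ValueError(f"environment must be one of {sorted(allowed_envs)}")
    tfvars = {
        "environment": env,
        "vm_size": request.get("vm_size", "Standard_B2s"),  # sensible default
        "tags": {
            "owner": request["owner"],
            "cost_center": request["cost_center"],
            "managed_by": "terraform",
        },
    }
    return json.dumps(tfvars, indent=2, sort_keys=True)

print(render_tfvars({"environment": "dev", "owner": "app-team",
                     "cost_center": "CC-1234"}))
```

Validating and defaulting inputs in the wrapper, rather than inside Terraform, keeps the marketplace auditable: every provisioned environment carries the same enforced tags.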
Posted 1 week ago
5.0 - 10.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Overview The main objective of this role is to ensure stability for the platform enabling PepsiCo's digital transformation in workforce automation. This environment is a centralized platform used to create, monitor, and audit any robotic process automation (bots) being deployed globally within PepsiCo. This role is the technical support and sustain lead for the Intelligent Automation (IA) infrastructure and the bots running in that infrastructure. It will provide technical leadership for geographically dispersed resources, working closely with Automation Development, the Citizen Development community, and key business contacts to sustain and expand the operational infrastructure, providing a centrally supported and standardized environment. Responsibilities Responsible for the upkeep of the RPA (Robotic Process Automation) infrastructure, and provide support to the hosted applications. Be the subject matter expert for the Intelligent Automation / UiPath infrastructure, which includes UiPath, Windows servers, and Azure Cloud. Maintain knowledge of advanced diagnostic and troubleshooting techniques, solid analytical skills, and experience in RPA administration/configuration, refresh/upgrade projects, and enhancements. Proactively manage the Automation / UiPath environment and ensure that unplanned outages are reduced or eliminated by helping provide root cause for issues. Lead the on-call management, coverage schedule, and executions when needed to assist with complex and critical incidents. Provide technical support in collaboration with global resources and maintain operational stability through automation and adherence to standards. Responsible for the capacity management of RPA environments through continuous operational reviews and continuous collaboration with the infrastructure and development teams. Regularly evaluate the operational procedures to keep the environment in compliance with PepsiCo control standards. 
Qualifications 5+ years of production support experience. 3-5 years of Windows experience. 1-2 years of UiPath development experience (good to have). 2-3 years of UiPath infrastructure experience. 2+ years of Azure cloud experience. 2+ years of ITSM process (incident, change, problem) experience. Bachelor's degree in Information Technology, Engineering, Computer Science, a related field, or equivalent experience. Proficiency with Unix scripting. Training & advanced certifications on automation tools (good to have). Experience in an IT programming development environment, ideally in RPA tools or test automation tools (UiPath, AA, Blue Prism, QTP, OpenSpan, WinAuto, etc.). Hands-on development experience in any of the programming languages/platforms: .NET, VB, VC++, C++, J2EE. Experience in the Microsoft stack is highly preferred. Experience in VBScript, JavaScript (AngularJS, NodeJS), Python, Perl, Bash & PowerShell is highly desired. Strong in design principles and modular programming techniques. Strong in requirements gathering and analysis (ability to work with a structured and methodical approach, combined with an inquiring mind). Ability to understand the business process and create process flow diagrams. Knowledge of Microsoft packages such as MS Excel (VBA scripting), Visio, Access, and Word. Problem-solving of issues that arise in the day-to-day running of IPA processes, providing timely responses and solutions as required. Advanced troubleshooting and problem analysis proficiency. Exceptional organizational skills with an ability to manage multiple priorities in a fast-paced, dynamic environment. Leadership skills to direct remote resources. Ability to communicate with both technical and management resources. Preferred to have .NET/Java developer certifications. Creating and maintaining solution documentation.
Posted 1 week ago
4.0 - 9.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Overview As an Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse. Responsibilities Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies. Governs data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provides and/or supports data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. 
Supports assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools. Contributes to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, consumer privacy-by-design principles (PII management), and all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create source-to-target mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization. Qualifications 4+ years of overall technology experience that includes at least 2+ years of data modeling and systems architecture. 
2+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools. 2+ years of experience developing enterprise data models. Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as PowerBI).
Posted 1 week ago
8.0 - 13.0 years
18 - 22 Lacs
Hyderabad
Work from Office
Overview Enterprise Data Operations Sr Analyst (L08). Job Overview: As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse. Responsibilities Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies. Governs data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. 
Provides and/or supports data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. Supports assigned project contractors (both on- & off-shore), orienting new contractors to standards, best practices, and tools. Contributes to project cost estimates, working with senior members of the team to evaluate the size and complexity of the changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, consumer privacy-by-design principles (PII management), and all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security feature implementation performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create source-to-target mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization. 
Qualifications 8+ years of overall technology experience that includes at least 4+ years of data modeling and systems architecture. 3+ years of experience with Data Lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience developing enterprise data models. Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as PowerBI). Does the person hired for this job need to be based in a PepsiCo office, or can they be remote? Employee must be based in a PepsiCo office. Primary Work Location: Hyderabad HUB-IND
Posted 1 week ago
10.0 - 15.0 years
35 - 40 Lacs
Hyderabad
Work from Office
Overview The role will lead and support all cloud-first projects leveraging IaaS or PaaS. The role will join the "Cloud Acceleration and Valued Office", which was established in 2021, and will lead the migration of all on-premise datacenter applications and infrastructure to the public cloud. As these are cloud-first projects, they require support in the form of a cloud concierge type of service: shepherding projects through various review/approval processes and bringing together team members to assess what cloud services will be required and plan for build-out. The role of the cloud concierge is to help sift through options and select cloud services that are the most robust and cost-effective for the project, without over-engineering a solution and increasing its cost. This role will lead cloud migrations and help develop support models to ensure a smooth transition to sustain teams. It will require validation of cost for the proposed implementation, confirmation/testing of high availability/disaster recovery, and reviews to ensure data is secured properly. It will drive cloud projects globally and work across multiple time zones, with accountability for accurate reporting on project SLAs and KPIs. Accountabilities Work with relevant customer leaders to define and create specific implementation plans. Manage aspects of cloud project implementation, including initiation, execution, and delivery. Engage with multiple vendor/partner teams to ensure minimal risk/business disruption. Review/assess the implementation approach, potential risks, costs, etc. Responsibilities The Cloud Application Migration Product Owner will focus on migrating on-premise applications to the cloud in a methodical, agile way that enables migration at the scale and speed required to enable our digital transformation. You will define and lead cloud migration projects spanning multiple business units/sectors while ensuring a seamless transition from an on-prem to cloud operating environment. 
Work with relevant sector leaders to define and create sector-specific application migration plans. Define and follow the migration playbook, with emphasis on capturing the advantages of operating in the cloud for migrated applications. Partner with Cloud Engineering, Cloud Ops, and other teams in executing migration projects to ensure minimal risk and business disruption. Provide deep cloud migration expertise covering infrastructure, application architectures, cloud capabilities, security, etc. Scripting/automation mindset and skills to automate routine tasks and extracts of data out of Azure for KPI and other metric reporting. Qualifications 10+ years of cloud solution architecture skills (Azure or AWS). Experience with private, hybrid, or public cloud technology. Experience migrating large-scale business applications from DC to cloud or cloud to cloud. Good understanding of migration frameworks and processes, with extensive experience working on lift-and-shift migrations. Scripting experience (must be fluent in a scripting language such as bash or Python). Detail-oriented self-starter capable of working independently. Minimum of 5 years of experience with data centers, consolidation, relocation, migration, and technology refresh/modernization projects. Application knowledge and compute & database experience. Experience with SAFe Agile methodologies (Azure DevOps). Minimum of a Bachelor's degree in computer science or equivalent experience.
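The scripting/KPI-reporting duty above can be sketched in a few lines of Python. This is a hedged illustration only: the inventory fields (`name`, `status`) are hypothetical and do not correspond to any actual Azure API shape:

```python
# Sketch: compute simple migration-wave KPIs from an exported server inventory.
# The inventory fields used here are hypothetical, not an Azure API schema.
from collections import Counter

def migration_kpis(inventory: list[dict]) -> dict:
    """Percent migrated plus a count of servers per migration status."""
    by_status = Counter(server["status"] for server in inventory)
    total = len(inventory)
    migrated = by_status.get("migrated", 0)
    return {
        "total_servers": total,
        "pct_migrated": round(100 * migrated / total, 1) if total else 0.0,
        "by_status": dict(by_status),
    }

inventory = [
    {"name": "app01", "status": "migrated"},
    {"name": "app02", "status": "in_progress"},
    {"name": "db01", "status": "migrated"},
    {"name": "web01", "status": "not_started"},
]
print(migration_kpis(inventory))
```

In practice the inventory would come from an exported report or a cloud API, and the resulting KPI dictionary would feed a dashboard or recurring status email.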
Posted 1 week ago
8.0 - 13.0 years
11 - 15 Lacs
Hyderabad
Work from Office
Overview PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences, and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo's global business scale to enable business insights, advanced analytics, and new product development. PepsiCo's Data Management and Operations team is tasked with the responsibility of developing quality data collection processes, maintaining the integrity of our data foundations, enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation, and increasing awareness about available data while democratizing access to it across the company. As a data engineering lead, you will be the key technical expert overseeing PepsiCo's data product build & operations and drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be empowered to create & lead a strong team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. 
Ideally, the candidate should be flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon the coverage requirements of the job. The candidate can work with their immediate supervisor to change the work schedule on a rotational basis depending on product and project requirements. Responsibilities Provide leadership and management to a team of data engineers, managing processes and their flow of work, vetting their designs, and mentoring them to realize their full potential. Act as a subject matter expert across different digital projects. Oversee work with internal clients and external partners to structure and store data into unified taxonomies and link them together with standard identifiers. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance. Responsible for implementing best practices around systems integration, security, performance, and data management. Empower the business by creating value through increased adoption of data, data science, and the business intelligence landscape. Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners. Develop and optimize procedures to productionalize data science models. Define and manage SLAs for data products and processes running in production. Support large-scale experimentation done by data scientists. Prototype new approaches and build solutions at scale. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. 
Create and audit reusable packages or libraries. Qualifications 8+ years of overall technology experience, including at least 4+ years of hands-on software development, data engineering, and systems architecture. 4+ years of experience with data lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience in SQL optimization and performance tuning, and development experience in programming languages like Python, PySpark, Scala, etc. 2+ years of cloud data engineering experience in Azure. Fluent with Azure cloud services; Azure certification is a plus. Experience in Azure Log Analytics. Experience with integration of multi-cloud services with on-premises technologies. Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, or Snowflake. Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools. Experience with statistical/ML techniques is a plus. Experience with building solutions in the retail or supply chain space is a plus. Understanding of metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI). BA/BS in Computer Science, Math, Physics, or other technical fields. 
The candidate must be flexible to work an alternative schedule: either a traditional work week from Monday to Friday, or Tuesday to Saturday, or Sunday to Thursday, depending upon product and project coverage requirements of the job. Candidates are expected to be in the office at the assigned location at least 3 days a week, and the days at work need to be coordinated with the immediate supervisor. Skills, Abilities, Knowledge: Excellent communication skills, both verbal and written, along with the ability to influence and demonstrate confidence in communications with senior-level management. Proven track record of leading and mentoring data teams. Strong change manager; comfortable with change, especially that which arises through company growth. Ability to understand and translate business requirements into data and technical requirements. High degree of organization and ability to manage multiple, competing projects and priorities simultaneously. Positive and flexible attitude to enable adjusting to different needs in an ever-changing environment. Strong leadership, organizational, and interpersonal skills; comfortable managing trade-offs. Fosters a team culture of accountability, communication, and self-management. Proactively drives impact and engagement while bringing others along. Consistently attains/exceeds individual and team goals. Ability to lead others without direct authority in a matrixed environment.
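For candidates weighing the pipeline responsibilities described in this listing, here is a minimal sketch of the extract-transform-load pattern in plain Python (standard library only; the table, columns, and sample data are hypothetical, invented purely for illustration, not drawn from any PepsiCo system):

```python
import csv
import io
import sqlite3

# Hypothetical raw extract: daily sales records as CSV text.
RAW = """region,sku,units,price
North,SKU-1,10,2.5
North,SKU-2,4,3.0
South,SKU-1,7,2.5
"""

def extract(raw_text):
    """Parse CSV text into a list of dicts (the 'extract' step)."""
    return list(csv.DictReader(io.StringIO(raw_text)))

def transform(rows):
    """Cast types and derive a revenue column (the 'transform' step)."""
    return [
        (r["region"], r["sku"], int(r["units"]), int(r["units"]) * float(r["price"]))
        for r in rows
    ]

def load(records, conn):
    """Write curated records into a warehouse-style table (the 'load' step)."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS sales (region TEXT, sku TEXT, units INT, revenue REAL)"
    )
    conn.executemany("INSERT INTO sales VALUES (?, ?, ?, ?)", records)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW)), conn)
total = conn.execute("SELECT SUM(revenue) FROM sales").fetchone()[0]
print(total)  # 10*2.5 + 4*3.0 + 7*2.5 = 54.5
```

In production, tools named in the listing (Azure Data Factory, Databricks, PySpark) play the roles these three functions sketch, with orchestration, monitoring, and scale added on top.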
Posted 1 week ago
2.0 - 7.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Overview PepsiCo operates in an environment undergoing immense and rapid change. Big data and digital technologies are driving business transformation that is unlocking new capabilities and business innovations in areas like eCommerce, mobile experiences, and IoT. The key to winning in these areas is being able to leverage enterprise data foundations built on PepsiCo's global business scale to enable business insights, advanced analytics, and new product development. PepsiCo's Data Management and Operations team is tasked with developing quality data collection processes, maintaining the integrity of our data foundations, and enabling business leaders and data scientists across the company to have rapid access to the data they need for decision-making and innovation. The team maintains a predictable, transparent, global operating rhythm that ensures always-on access to high-quality data for stakeholders across the company; is responsible for day-to-day data collection, transportation, maintenance/curation of, and access to the PepsiCo corporate data asset; works cross-functionally across the enterprise to centralize data and standardize it for use by business, data science, or other stakeholders; and increases awareness about available data while democratizing access to it across the company. As a data engineer, you will be the key technical expert building PepsiCo's data products to drive a strong vision. You'll be empowered to create data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help develop very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. 
You will work closely with process owners, product owners, and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. Responsibilities Act as a subject matter expert across different digital projects. Oversee work with internal clients and external partners to structure and store data into unified taxonomies and link them together with standard identifiers. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance. Responsible for implementing best practices around systems integration, security, performance, and data management. Empower the business by creating value through increased adoption of data, data science, and the business intelligence landscape. Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners. Develop and optimize procedures to productionalize data science models. Define and manage SLAs for data products and processes running in production. Support large-scale experimentation done by data scientists. Prototype new approaches and build solutions at scale. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. Create and audit reusable packages or libraries. Qualifications 4+ years of overall technology experience, including at least 3+ years of hands-on software development, data engineering, and systems architecture. 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools. 
3+ years of experience in SQL optimization and performance tuning, and development experience in programming languages like Python, PySpark, Scala, etc. 2+ years of cloud data engineering experience in Azure. Fluent with Azure cloud services; Azure certification is a plus. Experience in Azure Log Analytics.
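Both data engineering listings call out data-quality tooling (Deequ, Great Expectations). As a hedged, plain-Python sketch of what those tools automate at scale, the rule set and field names below are hypothetical, chosen only to show the completeness/uniqueness/range checks such frameworks express declaratively:

```python
# Illustrative data-quality checks in plain Python; the rules and field
# names are hypothetical, mirroring what Deequ or Great Expectations
# automate over real tables.

def profile(rows, column):
    """Return basic completeness/uniqueness stats for one column."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v not in (None, "")]
    return {
        "completeness": len(non_null) / len(values) if values else 0.0,
        "distinct": len(set(non_null)),
    }

def validate(rows):
    """Run a small rule set and collect failures instead of raising."""
    failures = []
    if profile(rows, "order_id")["completeness"] < 1.0:
        failures.append("order_id must never be null")
    if profile(rows, "order_id")["distinct"] != len(rows):
        failures.append("order_id must be unique")
    if any(float(r["amount"]) < 0 for r in rows):
        failures.append("amount must be non-negative")
    return failures

batch = [
    {"order_id": "A1", "amount": "10.0"},
    {"order_id": "A2", "amount": "3.5"},
    {"order_id": "A2", "amount": "-1.0"},  # duplicate id and negative amount
]
print(validate(batch))  # flags the uniqueness and non-negativity rules
```

The design choice worth noting: validation collects failures rather than raising on the first one, so a pipeline can report all quality issues for a batch in a single pass.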
Posted 1 week ago
5.0 - 7.0 years
22 - 27 Lacs
Hyderabad
Work from Office
Overview We are seeking a Cloud Platform Enablement Engineer to join our team, supporting cloud adoption, automation, and governance within PepsiCo's cloud ecosystem. This role is customer-facing, working closely with enabling teams to facilitate a smooth cloud journey by providing guidance, best practices, security compliance, and continuous service excellence. The ideal candidate will have strong technical expertise in cloud services (Azure, AWS), automation, and infrastructure provisioning, combined with excellent communication and collaboration skills to support teams in their cloud adoption process. Responsibilities Cloud Enablement & Customer Support: Act as a trusted advisor, helping teams navigate their cloud adoption journey. Provide hands-on support for cloud resource provisioning, optimization, and best practices. Assist teams in resolving cloud-related challenges, ensuring a seamless experience. Automation & Infrastructure as Code (IaC): Develop and maintain automation scripts and Infrastructure as Code (Terraform, ARM, CloudFormation, etc.). Implement automated guardrails to ensure compliance with cloud governance policies. Optimize cloud provisioning workflows to improve efficiency and reduce manual effort. Cloud Governance & Compliance: Ensure cloud resources align with security and compliance frameworks. Work closely with FinOps, Security, and Governance teams to enforce tagging, cost controls, and security best practices. Assist in NSG reviews, access management, and workload assessments. Cross-Team Collaboration & Enablement: Partner with Cloud Architects, DevOps, and FinOps teams to streamline cloud adoption. Provide training, documentation, and knowledge sharing to enhance cloud capabilities across teams. Drive continuous improvement in cloud onboarding and self-service experiences. Qualifications 5-7 years of hands-on experience in cloud engineering, cloud operations, or cloud enablement roles. 
Hands-on experience with Azure or AWS. Strong understanding of customer-facing technical roles and the ability to facilitate cloud adoption. Proficiency in Infrastructure as Code (IaC) tools like Terraform, ARM, or Ansible. Strong knowledge of networking (NSG, VPC, firewalls, DNS, etc.) in cloud environments. Experience with automation and scripting (Python, PowerShell, Bash). Familiarity with cloud security, compliance, and governance frameworks. Excellent communication skills with the ability to translate technical concepts for non-technical stakeholders. Preferred: Certifications in Azure (AZ-104, AZ-305) or AWS (Solutions Architect). Experience with FinOps and cloud cost management tools. Knowledge of CI/CD pipelines and DevOps methodologies.
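To make the "automated guardrails" and "enforce tagging" responsibilities above concrete, here is a minimal Python sketch of a tag-compliance audit. The resource records are mocked as plain dicts and the required-tag policy is invented for illustration; a real guardrail would pull inventory from the Azure/AWS APIs or a Terraform plan JSON:

```python
# Hypothetical tagging guardrail: flag resources missing mandated tags.
# Resources are mocked as dicts; in practice they would come from the
# cloud provider's inventory API or a parsed Terraform plan.

REQUIRED_TAGS = {"owner", "cost-center", "environment"}  # example policy

def missing_tags(resource):
    """Return the set of required tags absent from one resource."""
    return REQUIRED_TAGS - set(resource.get("tags", {}))

def audit(resources):
    """Map non-compliant resource names to their missing tags."""
    return {
        r["name"]: sorted(missing_tags(r))
        for r in resources
        if missing_tags(r)
    }

inventory = [
    {"name": "vm-web-01",
     "tags": {"owner": "team-a", "cost-center": "cc1", "environment": "prod"}},
    {"name": "storage-logs", "tags": {"owner": "team-b"}},
]
print(audit(inventory))  # only the under-tagged resource is reported
```

A check like this typically runs in a CI/CD pipeline against the planned infrastructure change, failing the build before a non-compliant resource is ever provisioned.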
Posted 1 week ago
5.0 - 10.0 years
14 - 19 Lacs
Hyderabad
Work from Office
Overview As an Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse while satisfying project requirements. The role will advocate Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse. Responsibilities Complete conceptual, logical, and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies. Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construction of database objects, for baseline and investment-funded projects, as assigned. Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting. 
Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools. Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development. Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework. Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse. Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations. Drive collaborative reviews of design, code, data, and security-feature implementation performed by data engineers to drive data product development. Assist with data planning, sourcing, collection, profiling, and transformation. Create source-to-target mappings for ETL and BI developers. Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical and data lakes; data streaming (consumption/production); data in transit. Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing. Partner with the Data Governance team to standardize their classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders. Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization. Qualifications 5+ years of overall technology experience, including at least 2+ years of data modeling and systems architecture. 
Around 2+ years of experience with data lake infrastructure, data warehousing, and data analytics tools. 2+ years of experience developing enterprise data models. Experience in building solutions in the retail or supply chain space. Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models). Experience with integration of multi-cloud services (Azure) with on-premises technologies. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus. Experience with metadata management, data lineage, and data glossaries is a plus. Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI).
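As a toy illustration of the dimensional modeling this analyst role describes, here is a one-fact, one-dimension star schema built in SQLite via Python's standard library. Table and column names are invented for the example; a production model would target an MPP warehouse like Synapse, Teradata, or Snowflake:

```python
import sqlite3

# Toy star schema: one dimension (dim_product) and one fact (fact_sales),
# joined on a surrogate key. Names and data are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY,   -- surrogate key
    sku TEXT NOT NULL,                 -- natural/business key
    category TEXT NOT NULL
);
CREATE TABLE fact_sales (
    product_key INTEGER NOT NULL REFERENCES dim_product(product_key),
    sale_date TEXT NOT NULL,
    units INTEGER NOT NULL,
    revenue REAL NOT NULL
);
""")
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(1, "SKU-1", "beverages"), (2, "SKU-2", "snacks")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(1, "2024-01-01", 10, 25.0),
                  (2, "2024-01-01", 4, 12.0),
                  (1, "2024-01-02", 7, 17.5)])

# Typical analytic query: measures rolled up by a dimension attribute.
rows = conn.execute("""
    SELECT d.category, SUM(f.revenue)
    FROM fact_sales f JOIN dim_product d USING (product_key)
    GROUP BY d.category ORDER BY d.category
""").fetchall()
print(rows)  # [('beverages', 42.5), ('snacks', 12.0)]
```

The surrogate key on the dimension is the "extensible philosophy" in miniature: new attributes (brand, pack size) can be added to `dim_product` later without touching the fact table or rewriting existing queries.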
Posted 1 week ago
6.0 - 11.0 years
13 - 17 Lacs
Hyderabad
Work from Office
Overview As a member of the data engineering team, you will be the key technical expert developing and overseeing PepsiCo's data product build and operations, and you will drive a strong vision for how data engineering can proactively create a positive impact on the business. You'll be an empowered member of a team of data engineers who build data pipelines into various source systems, rest data on the PepsiCo Data Lake, and enable exploration and access for analytics, visualization, machine learning, and product development efforts across the company. As a member of the data engineering team, you will help lead the development of very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. You will work closely with process owners, product owners, and business users. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. Responsibilities Active contributor to code development in projects and services. Manage and scale data pipelines from internal and external data sources to support new product launches and drive data quality across data products. Build and own the automation and monitoring frameworks that capture metrics and operational KPIs for data pipeline quality and performance. Responsible for implementing best practices around systems integration, security, performance, and data management. Empower the business by creating value through increased adoption of data, data science, and the business intelligence landscape. Collaborate with internal clients (data science and product teams) to drive solutioning and POC discussions. Evolve the architectural capabilities and maturity of the data platform by engaging with enterprise architects and strategic internal and external partners. 
Develop and optimize procedures to productionalize data science models. Define and manage SLAs for data products and processes running in production. Support large-scale experimentation done by data scientists. Prototype new approaches and build solutions at scale. Research state-of-the-art methodologies. Create documentation for learnings and knowledge transfer. Create and audit reusable packages or libraries. Qualifications 6+ years of overall technology experience, including at least 4+ years of hands-on software development, data engineering, and systems architecture. 4+ years of experience with data lake infrastructure, data warehousing, and data analytics tools. 4+ years of experience in SQL optimization and performance tuning, and development experience in programming languages like Python, PySpark, Scala, etc. 2+ years of cloud data engineering experience in Azure. Fluent with Azure cloud services; Azure certification is a plus. Experience with integration of multi-cloud services with on-premises technologies. Experience with data modeling, data warehousing, and building high-volume ETL/ELT pipelines. Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations. Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets. Experience with at least one MPP database technology such as Redshift, Synapse, or Snowflake. Experience with running and scaling applications on cloud infrastructure and containerized services like Kubernetes. Experience with version control systems like GitHub and deployment & CI tools. Experience with Azure Data Factory, Azure Databricks, and Azure Machine Learning tools. Experience with statistical/ML techniques is a plus. Experience with building solutions in the retail or supply chain space is a plus. Understanding of metadata management, data lineage, and data glossaries is a plus. 
Working knowledge of agile development, including DevOps and DataOps concepts. Familiarity with business intelligence tools (such as Power BI). BA/BS in Computer Science, Math, Physics, or other technical fields.
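The "SQL optimization and performance tuning" requirement that recurs across these listings can be illustrated with SQLite's query planner, which makes the classic effect of adding an index visible. The table and query are synthetic; warehouse engines (Synapse, Snowflake, Redshift) expose analogous plan inspection with different tuning levers:

```python
import sqlite3

# Show the planner switching from a full scan to an index lookup after
# CREATE INDEX. Data is synthetic, generated for the demonstration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, float(i)) for i in range(1000)])

QUERY = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

def plan(conn, query):
    """Return the planner's description of how the query will run."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + query))

before = plan(conn, QUERY)   # full table scan, e.g. 'SCAN orders'
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(conn, QUERY)    # index lookup, e.g. 'SEARCH orders USING INDEX ...'
print(before)
print(after)
```

Reading the plan before and after a schema change, rather than relying on wall-clock timing alone, is the habit that transfers directly to tuning the larger engines named in the qualifications.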
Posted 1 week ago