
71 EventBridge Jobs - Page 2

Set up a job alert
JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 12.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

We are looking for a skilled .NET with AWS Developer to join our team. The ideal candidate has a strong background in .NET Core, C#, and AWS services, along with experience developing and integrating applications using CI/CD pipelines. In this role, you will be involved in the full development lifecycle, from requirements analysis to deployment and maintenance.

Responsibilities include developing and integrating requirements using a CI/CD pipeline with GitHub, participating in the full development lifecycle, serving as a technical expert on projects, writing technical specifications, supporting and maintaining software functionality, evaluating new technologies, analyzing and revising code, participating in software design meetings, and consulting with end users.

Required skills: proficiency in .NET Core, C#, and the AWS SDK; experience with NoSQL databases such as MongoDB and AWS DynamoDB; working knowledge of JIRA, the Microsoft .NET Framework and its supported programming languages; and a strong understanding of AWS services such as EC2, ECS, Lambda, SNS, SQS, EventBridge, DynamoDB, and CloudWatch. You should also have experience in backend development using C# and .NET Core, and in version control using Git with Copilot.

Preferred skills: UI development experience with Angular 8+ and working experience with Confluence, Lucid portal, and ServiceNow. Tools and technologies include GitHub Desktop, Visual Studio Code, Visual Studio IDE (Professional 2022 with GitHub Copilot), and Teams and Outlook for communication. Soft skills required: strong communication skills in English and the ability to complete tasks within the team's estimates.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

As a Tech Lead with over 5 years of experience in TypeScript and Node, you will lead the design and development of high-quality APIs using technologies such as REST and OpenAPI, as well as API gateways like Kong. Your expertise in AWS services such as serverless, Lambda, Aurora, and API Gateway will be instrumental in creating efficient and scalable solutions. Experience with authentication protocols like OIDC, OAuth2, and JWT is essential, along with proficiency in relational databases like MySQL and PostgreSQL. Programmatic testing using tools like Jest will also be part of your responsibilities.

At GlobalLogic, we prioritize a culture of caring where people come first. You will experience an inclusive environment that fosters acceptance, belonging, and meaningful connections with your teammates, managers, and leaders. Continuous learning and development are key aspects of your journey at GlobalLogic, with numerous opportunities to enhance your skills and advance your career. You will have the chance to work on interesting and impactful projects that challenge your problem-solving skills and creativity, enabling you to contribute to cutting-edge solutions that shape the world. We believe in the importance of work-life balance and flexibility, offering various career areas, roles, and work arrangements. As part of a high-trust organization, integrity is fundamental to our values; by joining GlobalLogic, you are becoming part of a trustworthy, reliable, and ethical company that values truthfulness, candor, and integrity in all aspects of its operations.

GlobalLogic, a Hitachi Group Company, is a digital engineering partner to leading global companies, driving innovation and transformation through intelligent products, platforms, and services. Since 2000, we have been at the forefront of the digital revolution, collaborating with clients to redefine industries and create innovative digital experiences. Join us and be part of a team that engineers impact and shapes the future of digital technology.

Posted 2 weeks ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Noida, India

Work from Office

1. Design and manage cloud-based systems on AWS.
2. Develop and maintain backend services and APIs using Java.
3. Basic knowledge of SQL and the ability to write SQL queries.
4. Good hands-on experience with Dockerfiles and multi-stage Docker builds.
5. Implement containerization using Docker and orchestration with ECS/Kubernetes.
6. Monitor and troubleshoot cloud infrastructure and application performance.
7. Collaborate with cross-functional teams to integrate systems seamlessly.
8. Document system architecture, configurations, and operational procedures.

Strong hands-on knowledge needed: ECS, ECR, NLB, ALB, ACM, IAM, S3, Lambda, RDS, KMS, API Gateway, Cognito, CloudFormation.

Good to have: experience with AWS CDK for infrastructure as code; AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Developer); Python.

Mandatory Competencies: Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - Other Databases - PostgreSQL; Beh - Communication; DevOps/Configuration Mgmt - Cloud Platforms - AWS

Posted 2 weeks ago

Apply

4.0 - 7.0 years

6 - 9 Lacs

Noida

Work from Office

Key Responsibilities: Develop and maintain responsive web applications using the Angular framework. Integrate front-end applications with AWS backend services. Collaborate with UX/UI designers and backend developers in Agile teams. Create engaging, interactive web interfaces using HTML, CSS, and JavaScript. Optimize web performance and ensure cross-browser compatibility. Integrate APIs and backend systems to enable seamless data flow.

Required Skills: Strong proficiency in Angular and TypeScript. Experience with RESTful APIs and integration with AWS services. Knowledge of HTML, CSS, and JavaScript. Knowledge of version control systems like Git. Background in financial applications is a plus.

Mandatory Competencies: User Interface - Other User Interfaces - JavaScript; DevOps/Configuration Mgmt - Git; Beh - Communication and collaboration; User Interface - Angular - Angular Components and Design Patterns; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; UX - Adobe XD; Agile - SCRUM; User Interface - HTML - HTML/CSS; User Interface - Other User Interfaces - TypeScript

Posted 2 weeks ago

Apply

5.0 - 9.0 years

10 - 15 Lacs

Noida

Work from Office

1. C#, Microsoft SQL Server or Azure SQL, Azure Cosmos DB, Azure Service Bus, Azure Function Apps, Auth0, WebSockets
2. Strong development experience in C# and .NET Core technologies built up across a range of different projects
3. Experience developing APIs that conform as closely as possible to REST principles in terms of resources, sub-resources, responses, and error handling
4. Experience of API design and documentation using OpenAPI 3.x YAML (Swagger)
5. Some familiarity with AWS, especially Elasticsearch, would be beneficial but is not mandatory
6. Azure certifications an advantage
7. HTML5, Angular 14 or later, Node.js, CSS

Mandatory Competencies: Programming Language - .Net Full Stack - Angular; Programming Language - .Net - .NET Core; Programming Language - .Net Full Stack - HTML CSS; Beh - Communication and collaboration; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Cloud - Azure - Serverless (Function App, Logic App); Programming Language - Other Programming Language - C#; Middleware - API Middleware - Microservices; User Interface - Other User Interfaces - Node.js

Posted 2 weeks ago

Apply

6.0 - 7.0 years

6 - 11 Lacs

Noida

Work from Office

Responsibilities:
Data Architecture: Develop and maintain the overall data architecture, ensuring scalability, performance, and data quality.
AWS Data Services: Expertise in AWS data services such as AWS Glue, S3, SNS, SES, DynamoDB, Redshift, CloudFormation, CloudWatch, IAM, DMS, and EventBridge Scheduler.
Data Warehousing: Design and implement data warehouses on AWS, leveraging AWS Redshift or other suitable options.
Data Lakes: Build and manage data lakes on AWS using S3 and other relevant services.
Data Pipelines: Design and develop efficient data pipelines to extract, transform, and load data from various sources.
Data Quality: Implement data quality frameworks and best practices to ensure data accuracy, completeness, and consistency.
Cloud Optimization: Optimize data engineering solutions for performance, cost-efficiency, and scalability on AWS.
Team Leadership: Mentor and guide data engineers, ensuring they adhere to best practices and meet project deadlines.

Qualifications: Bachelor's degree in computer science, engineering, or a related field. 6-7 years of experience in data engineering roles, with a focus on AWS cloud platforms. Strong understanding of data warehousing and data lake concepts. Proficiency in SQL and at least one programming language (Python/PySpark). Good to have: experience with big data technologies like Hadoop, Spark, and Kafka. Knowledge of data modeling and data quality best practices. Excellent problem-solving, analytical, and communication skills. Ability to work independently and as part of a team.

Preferred Qualifications: Certifications in AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect - Data.

Mandatory Competencies: Big Data - PySpark; Data on Cloud - Azure Data Lake (ADL); Beh - Communication and collaboration; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Cloud - Azure - Azure Data Factory (ADF), Azure Databricks, Azure Data Lake Storage, Event Hubs, HDInsight; Cloud - AWS - TensorFlow on AWS, AWS Glue, AWS EMR, Amazon Data Pipeline, AWS Redshift; Database - SQL Server - SQL Packages; Data Science and Machine Learning - Python
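The pipeline duties above center on extract-transform-load stages. As a hedged sketch only (in a real deployment these stages would be AWS Glue or PySpark jobs reading from S3 and writing to Redshift; all function names here are illustrative, not from the posting), the stage structure might look like:

```python
def run_pipeline(extract, transforms, load):
    """Minimal ETL skeleton: pull rows, apply each transform in order,
    then hand the result to the load step."""
    rows = extract()
    for transform in transforms:
        rows = [transform(row) for row in rows]
    return load(rows)

# Illustrative stand-in stages; a real pipeline would read from S3/Glue
# catalogs and write to Redshift instead of passing dicts around.
def extract_orders():
    return [{"amount": "10"}, {"amount": "32"}]

def cast_amount(row):
    # Transform step: coerce the string field to an integer.
    return {**row, "amount": int(row["amount"])}

def load_rows(rows):
    return rows
```

Composing the stages, `run_pipeline(extract_orders, [cast_amount], load_rows)` yields the extracted rows with `amount` cast to `int`.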

Posted 2 weeks ago

Apply

4.0 - 8.0 years

7 - 11 Lacs

Noida

Work from Office

We are seeking a dedicated and proactive Support Manager to lead our Maintenance and Support Team and ensure timely resolution of client issues. The ideal candidate will be responsible for managing daily support operations, maintaining service quality, and acting as the primary point of escalation for all production-critical issues and defects.

Key Responsibilities: Resource management (coverage, availability, capability). Oversee support team performance and ticket resolution timelines. Manage escalations and ensure customer satisfaction. Collaborate with other support/dev teams to resolve recurring issues. Monitor KPIs and prepare regular support performance reports. Act as the primary escalation point. Identify, document, and mitigate Risks, Assumptions, Issues, and Dependencies (RAID) for the project. Drive improvements in support processes and tools.

Requirements: Proven experience in technical application maintenance and support projects and in a production support leadership role. Strong understanding of RAID management and issue escalation handling. Strong leadership, problem-solving, and communication skills. Familiarity with support tools (e.g., Jira, ServiceNow). Ability to work effectively under pressure in a fast-paced environment. Good to have: technical knowledge or hands-on experience in Java, Spring Boot, .NET, Python, Unix/Linux systems, and AWS.

Mandatory Competencies: App Support - L1, L2, L3 Support; BA - Project Management; Programming Language - Java - Core Java (Java 8+); Programming Language - .Net Full Stack - JavaScript; Beh - Communication and collaboration; Operating System - Linux; Operating System - Unix; Middleware - API Middleware - Microservices; Data Science and Machine Learning - Python; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate

Posted 2 weeks ago

Apply

7.0 - 11.0 years

13 - 18 Lacs

Noida

Work from Office

Must-Have Skills: Expertise in AWS CDK, AWS services (Lambda, ECS, S3), and PostgreSQL DB management. Strong understanding of serverless architecture and event-driven design (SNS, SQS). Nice to have: knowledge of multi-account AWS setups and security best practices (IAM, VPC, etc.); experience with cost optimization strategies in AWS. Mandatory Competencies: Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - Other Databases - PostgreSQL
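The event-driven design this listing asks about typically means SNS fanning out to SQS queues that trigger Lambda. As an illustrative sketch only (the handler and field names are hypothetical, not from the posting), a minimal Python handler for an SQS-triggered Lambda might unwrap messages like this:

```python
import json

def handle_sqs_event(event: dict) -> list:
    """Collect payloads from an SQS-triggered Lambda event.

    When SNS fans out to SQS, each SQS record body carries an SNS
    envelope whose "Message" field holds the real payload; plain
    SQS sends carry the payload directly in the body.
    """
    payloads = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])
        # Unwrap the SNS envelope if the queue is subscribed to a topic.
        payload = json.loads(body["Message"]) if "Message" in body else body
        payloads.append(payload)
    return payloads
```

In practice the handler would also delete or dead-letter failed records; this sketch only shows the envelope-unwrapping pattern.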

Posted 2 weeks ago

Apply

5.0 - 9.0 years

8 - 12 Lacs

Noida

Work from Office

Must-Have Skills: Expertise in AWS CDK, AWS services (Lambda, ECS, S3), and PostgreSQL DB management. Strong understanding of serverless architecture and event-driven design (SNS, SQS). Nice to have: knowledge of multi-account AWS setups and security best practices (IAM, VPC, etc.); experience with cost optimization strategies in AWS. Mandatory Competencies: Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - Other Databases - PostgreSQL; Beh - Communication and collaboration; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Development Tools and Management - CI/CD; Cloud - AWS - ECS

Posted 2 weeks ago

Apply

4.0 - 5.0 years

5 - 9 Lacs

Noida

Work from Office

Responsibilities:
Data Architecture: Develop and maintain the overall data architecture, ensuring scalability, performance, and data quality.
AWS Data Services: Expertise in AWS data services such as AWS Glue, S3, SNS, SES, DynamoDB, Redshift, CloudFormation, CloudWatch, IAM, DMS, and EventBridge Scheduler.
Data Warehousing: Design and implement data warehouses on AWS, leveraging AWS Redshift or other suitable options.
Data Lakes: Build and manage data lakes on AWS using S3 and other relevant services.
Data Pipelines: Design and develop efficient data pipelines to extract, transform, and load data from various sources.
Data Quality: Implement data quality frameworks and best practices to ensure data accuracy, completeness, and consistency.
Cloud Optimization: Optimize data engineering solutions for performance, cost-efficiency, and scalability on AWS.

Qualifications: Bachelor's degree in computer science, engineering, or a related field. 4-5 years of experience in data engineering roles, with a focus on AWS cloud platforms. Strong understanding of data warehousing and data lake concepts. Proficiency in SQL and at least one programming language (Python/PySpark). Good to have: experience with big data technologies like Hadoop, Spark, and Kafka. Knowledge of data modeling and data quality best practices. Excellent problem-solving, analytical, and communication skills. Ability to work independently and as part of a team.

Preferred Qualifications: Certifications in AWS Certified Data Analytics - Specialty or AWS Certified Solutions Architect - Data.

Mandatory Competencies: Big Data - PySpark; Beh - Communication and collaboration; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - SQL Server - SQL Packages; Data Science and Machine Learning - Python

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Noida

Work from Office

Strong experience in Java 1.8 or above. Strong experience in AWS cloud. Experience developing front-end screens with the Angular framework. Knowledge of multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, JSON, jQuery). Experience with databases. Ability to pick up new technologies. Willingness to learn and understand the business domain. Ability to meet client needs without sacrificing deadlines and quality. Ability to work effectively within a global team. Excellent communication and teamwork skills.

Mandatory Competencies: Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc.; Beh - Communication; Programming Language - Java - Core Java (Java 8+); Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Database - Database Programming - SQL

Posted 2 weeks ago

Apply

5.0 - 10.0 years

6 - 11 Lacs

Noida

Work from Office

5+ years of experience in data engineering with a strong focus on AWS services. Proven expertise in: Amazon S3 for scalable data storage; AWS Glue for ETL and serverless data integration using Amazon S3, DataSync, and EMR; Redshift for data warehousing and analytics. Proficiency in SQL, Python, or PySpark for data processing. Experience with data modeling, partitioning strategies, and performance optimization. Familiarity with orchestration tools like AWS Step Functions, Apache Airflow, or Glue Workflows. Strong understanding of data lake and data warehouse architectures. Excellent problem-solving and communication skills.

Mandatory Competencies: Beh - Communication; ETL - AWS Glue; Big Data - PySpark; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Cloud - AWS - AWS S3, S3 Glacier, AWS EBS; Cloud - AWS - TensorFlow on AWS, AWS Glue, AWS EMR, Amazon Data Pipeline, AWS Redshift; Programming Language - Python - Python Shell; Database - Database Programming - SQL

Posted 2 weeks ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Pune

Work from Office

Key Responsibilities: Design and develop scalable applications using Python and AWS services. Debug and resolve production issues across complex distributed systems. Architect solutions aligned with business strategies and industry standards. Lead and mentor a team of India-based developers and guide their career development. Ensure technical deliverables meet the highest standards of quality and performance. Research and integrate emerging technologies and processes into the development strategy. Document solutions in compliance with SDLC standards using defined templates. Assemble large, complex datasets based on functional and non-functional requirements. Handle operational issues and recommend improvements to the technology stack. Facilitate end-to-end platform integration across enterprise-level applications.

Required Skills:
Technical Skills: Python; Data Engineering; Debugging & Troubleshooting; System Integration
Cloud & Architecture: AWS (EC2, EKS, Glue, Lambda, S3, EMR, RDS, API Gateway); Step Functions, CloudFront, EventBridge, ARRFlow, Airflow (MWAA), Quicksight
Tools & Processes: Terraform, CI/CD pipelines; SDLC, Documentation Templates

Qualifications: 10+ years of software development experience, preferably in financial/trading applications. 5+ years of people management and mentoring experience. Proven track record in technical leadership and architecture planning. Expertise in developing applications using Python and the AWS stack. Strong grasp of Terraform and automated CI/CD processes. Exceptional multitasking and prioritization capabilities.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have at least 10 years of experience and be proficient in setting up, configuring, and integrating API gateways in AWS. Your expertise should include API frameworks, XML/JSON, REST, and data protection across software design, build, test, and documentation. Experience with AWS services such as Lambda, S3, CloudFront (CDN), SQS, SNS, EventBridge, API Gateway, Glue, and RDS is required, and you should be able to articulate and implement projects using these services effectively. Your role will involve improving business processes through effective integration solutions.

Location: Bangalore, Chennai, Pune, Mumbai, Noida
Notice Period: Immediate joiners only

If you meet the requirements above, apply by submitting the application form with your full name, email, phone, cover letter, and CV/resume (PDF, DOC, or DOCX formats accepted).
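API Gateway integrations like those this role describes usually front Lambda functions via proxy integration, where the function must return a status code, headers, and a JSON-string body. As a hedged, minimal sketch (the handler and query-parameter names are illustrative assumptions, not from the posting):

```python
import json

def lambda_handler(event, context=None):
    """Minimal API Gateway Lambda proxy-integration handler.

    Proxy integration delivers query parameters under
    "queryStringParameters" and expects a dict with
    statusCode/headers/body, where body is a JSON string.
    """
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"greeting": f"hello {name}"}),
    }
```

The `or {}` guard matters because API Gateway sends `queryStringParameters` as `null` (Python `None`) when the request has no query string.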

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

We have an exciting and rewarding opportunity for you to take your software engineering career to the next level. As a Software Engineer III at JPMorgan Chase within the Consumer & Community Banking team, you serve as a seasoned member of an agile team to design and deliver trusted, market-leading technology products in a secure, stable, and scalable way. You are responsible for carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives.

You will execute creative software solutions, design, development, and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions or break down technical problems. Your role includes creating secure, high-quality production code and maintaining algorithms that run synchronously with appropriate systems. You will produce architecture and design artifacts for complex applications while being accountable for ensuring design constraints are met by software code development. Additionally, you will gather, analyze, synthesize, and develop visualizations and reporting from large, diverse data sets in service of continuous improvement of software applications and systems. You will also build microservices that run on the bank's internal cloud and the public cloud platform (AWS) and collaborate with teams in multiple regions and time zones. Participation in scrum team stand-ups, code reviews, and other ceremonies, contributing to task completion and blocker resolution within your team, is expected.

Required qualifications, capabilities, and skills include formal training or certification in software engineering concepts and 3+ years of applied experience in Java, AWS, and Terraform. You should have experience with technologies such as Java 11/17, Spring/Spring Boot, Kafka, and relational/non-relational databases such as Oracle, Cassandra, DynamoDB, and Postgres. A minimum of 3 years of hands-on experience building secure microservices on the public cloud (AWS) is required. Hands-on experience with AWS services like EKS, Fargate, SQS/SNS/EventBridge, Lambda, S3, EBS, DynamoDB/Aurora Postgres DB, and Terraform scripts is essential, as is experience with DevOps concepts for automated build and deployment.

Preferred qualifications, capabilities, and skills include familiarity with modern front-end technologies and exposure to cloud technologies.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Gurugram

Work from Office

AWS/DevOps Analyst: the analyst should have more than 6 years of IT experience.
Experience setting up and maintaining ECS solutions.
Experience designing and building AWS solutions with VPC, EC2, WAF, ECS, ALB, IAM, KMS, ACM, Secrets Manager, S3, CloudFront, etc.
Experience with SNS, SQS, EventBridge.
Experience setting up and maintaining RDS, Aurora DB, Postgres DB, DynamoDB, Redis.
Experience setting up AWS Glue jobs and AWS Lambda.
Experience setting up CI/CD using Azure DevOps.
Experience with source code management in GitHub.
Experience building and maintaining cloud-native applications.
Experience with container technologies like Docker.
Experience configuring logging and monitoring solutions like CloudWatch and OpenSearch.
Build, release, and manage system configuration using Terraform and Terragrunt.
Ensure necessary system security using best-in-class security solutions.
Recommend process and architecture improvements.
Ability to troubleshoot distributed systems.

Interpersonal skills required: strong communication and collaboration skills; the ability to be a team player; good analytical and problem-solving skills; understanding of Agile methodologies; the ability to train others in procedural and technical topics.

Mandatory Skills: Cloud App Dev Consulting. Experience: 5-8 years.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

18 - 22 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Job Title: AWS Data Engineer
Experience Required: 5+ years
Interested? Send your resume to aditya.rao@estrel.ai and include your updated resume, current CTC, expected CTC, notice period/availability (immediate joiners only), and LinkedIn profile.

Job Overview: We are seeking a skilled and experienced Data Engineer with a minimum of 5 years of experience in Python-based data engineering solutions, real-time data processing, and AWS cloud technologies. The ideal candidate will have hands-on expertise in designing, building, and maintaining scalable data pipelines, implementing best practices, and working within CI/CD environments.

Key Responsibilities: Design and implement scalable, robust data pipelines using Python and frameworks like Pytest and PySpark. Work extensively with AWS services such as AWS CDK, S3, Lambda, DynamoDB, EventBridge, Kinesis, CloudWatch, AWS Glue, and Lake Formation. Implement data governance and data security protocols, including handling of sensitive data and encryption practices. Develop microservices and APIs using FastAPI, GraphQL, and Pydantic. Design and maintain solutions for real-time streaming and event-driven architecture. Follow SDLC best practices, ensuring code quality through TDD (test-driven development) and robust documentation. Use GitLab for version control and manage deployment pipelines with CI/CD. Collaborate with cross-functional teams to align data architecture and services with business objectives.

Required Skills: Proficiency in Python 3.6+. Experience with the Python frameworks Pytest and PySpark. Strong knowledge of AWS tools and services. Experience with FastAPI, GraphQL, and Pydantic. Expertise in real-time data processing, eventing, and microservices. Good understanding of data governance, security, and Lake Formation. Familiarity with GitLab, CI/CD pipelines, and TDD. Strong problem-solving and analytical skills. Excellent communication and team collaboration skills.

Preferred Qualifications: AWS certification(s) (e.g., AWS Certified Data Analytics - Specialty, Solutions Architect). Experience with DataZone, data cataloging, or metadata management tools. Experience in high-compliance industries (e.g., finance, healthcare) is a plus.
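For the event-driven side of this role, producers publish custom events to an EventBridge bus via the PutEvents API, whose entries follow a fixed shape. As an illustrative sketch only (the source and detail-type strings are hypothetical; no AWS call is made here), an entry for boto3's `put_events` could be built like this:

```python
import json
from datetime import datetime, timezone

def build_event_entry(source, detail_type, detail, bus_name="default"):
    """Shape one entry for the EventBridge PutEvents API.

    EventBridge requires Detail to be a JSON string; Time may be a
    datetime, which boto3 serializes. This only builds the dict and
    performs no AWS call.
    """
    return {
        "Time": datetime.now(timezone.utc),
        "Source": source,
        "DetailType": detail_type,
        "Detail": json.dumps(detail),
        "EventBusName": bus_name,
    }
```

The resulting dict would be passed as `boto3.client("events").put_events(Entries=[entry])`; rules on the bus then match on `Source` and `DetailType` to route the event.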

Posted 2 weeks ago

Apply

8.0 - 10.0 years

12 - 18 Lacs

Zirakpur

Work from Office

AWS services (Lambda, Glue, S3, DynamoDB, EventBridge, AppSync, OpenSearch); Terraform; Python; React/Vite; unit testing (Jest, Pytest); software development lifecycle.

Posted 2 weeks ago

Apply

12.0 - 17.0 years

14 - 19 Lacs

Bengaluru

Work from Office

YOUR IMPACT: We are seeking a highly skilled and experienced Level 3 Site Reliability Engineer (SRE) to join our Cloud Operations team. This role is critical in driving advanced engineering initiatives to ensure infrastructure reliability, scalability, and automation across multi-cloud environments. As an L3 SRE, you will lead complex cloud support operations, troubleshoot infrastructure as code, implement observability frameworks, and guide junior SREs while helping shape future architectural direction. This role demands hands-on expertise in AWS, Azure, or GCP, advanced scripting, and deep observability integration, contributing directly to uptime, automation maturity, and strategic improvements to cloud infrastructure.

WHAT YOU NEED TO SUCCEED:
Cloud Infrastructure & Architecture: Architect and maintain scalable, resilient systems across AWS, Azure, and GCP. Lead cloud adoption and migration strategies while ensuring minimal disruption and high reliability. Implement security and governance controls including VPC, Security Groups, Route 53, ACM, and Security Hub. Perform deep infrastructure troubleshooting and root cause analysis, especially with IaC-based deployments.
Infrastructure as Code (IaC) & Configuration Management: Design and manage infrastructure using Terraform, Terragrunt, and CloudFormation. Oversee configuration management using tools like AWS SSM, SaltStack, and Packer. Review and remediate issues within Git-based CI/CD workflows for IaC and service deployment.
Observability & Monitoring: Build and maintain monitoring/alerting pipelines using CloudWatch, EventBridge, SNS, and Hund.io. Develop custom observability tooling for end-to-end visibility and proactive issue detection. Lead incident response and contribute to post-incident reviews and reliability reports.
Automation, Scripting & CI/CD: Develop and maintain automation tools using Bash, Python, Ruby, or PHP. Integrate deployment pipelines into secure, scalable CI/CD processes. Automate vulnerability assessments and compliance scans to ISO 27001 standards.
Containerization & Microservices Support: Lead container platform deployments using EKS, ECS, ECR, and Fargate. Guide engineering teams in Kubernetes resource optimization and troubleshooting.
Database & Storage Management: Provide advanced operational support for RDS, PostgreSQL, and Elasticsearch. Monitor database performance and ensure availability across distributed systems.
Mentorship & Strategy: Mentor L1 and L2 SREs on technical tasks and troubleshooting best practices. Contribute to cloud architecture planning, operational readiness, and process improvements. Help define and track key performance indicators (KPIs) related to system uptime, MTTR, and automation coverage.

WHAT YOU BRING (QUALIFICATIONS & SKILLS): 7-12 years of experience in Site Reliability Engineering or DevOps roles. Advanced expertise in multi-cloud environments (AWS, Azure, GCP). Strong Linux and Windows administration background (Fedora, Debian, Microsoft). Proficiency in Terraform, Terragrunt, CloudFormation, and configuration management tools. Hands-on with monitoring tools like CloudWatch, SNS, EventBridge, and third-party integrations. Advanced scripting skills in Python, Bash, Ruby, or PHP. Knowledge of container platforms including EKS, ECS, and Fargate. Familiarity with vulnerability management, ISO 27001, and audit-readiness practices.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Maharashtra

On-site

If you are a software engineering leader ready to take the reins and drive impact, we've got an opportunity just for you. As a Director of Software Engineering at JPMorgan Chase within the Asset and Wealth Management LOB, you lead a data technology area and drive impact within teams, technologies, and deliveries. Utilize your in-depth knowledge of software, applications, technical processes, and product management to drive multiple complex initiatives, while serving as a primary decision maker for your teams and a driver of engineering innovation and solution delivery. The current role focuses on delivering data solutions for some of the Wealth Management businesses.

Job responsibilities: Leads engineering and delivery of data and analytics solutions. Makes decisions that influence teams' resources, budget, tactical operations, and the execution and implementation of processes and procedures. Carries governance accountability for coding decisions, control obligations, and measures of success such as cost of ownership and maintainability. Delivers technical solutions that can be leveraged across multiple businesses and domains. Influences and collaborates with peer leaders and senior stakeholders across the business, product, and technology teams. Champions the firm's culture of diversity, equity, inclusion, and respect.

Required qualifications, capabilities, and skills: Experience managing data solutions across a large, global consumer community in the financial services domain. Experience hiring, developing, and leading cross-functional teams of technologists. Experience handling multiple global stakeholders across business, technology, and product. Appreciation of the data product: modeling, sourcing, quality, lineage, discoverability, access management, visibility, purging, etc. Experience researching and upgrading to the latest technologies in the continuously evolving data ecosystem. Practical hybrid cloud-native experience, preferably AWS. Experience using current technologies such as GraphQL, Glue, Spark, Snowflake, SNS, SQS, Kinesis, Lambda, ECS, EventBridge, QlikSense, etc. Experience with the Java and/or Python programming languages. Expertise in Computer Science, Computer Engineering, Mathematics, or a related technical field.

Preferred qualifications, capabilities, and skills: Comfortable being hands-on as required to drive solutions and solve challenges for the team. Exposure to and appreciation of the continuously evolving data science space. Exposure to the Wealth Management business.

Posted 2 weeks ago

Apply

2.0 - 7.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Your Role
- Experience in Enterprise Data Management Consolidation (EDMCS), Enterprise Profitability & Cost Management Cloud Services (EPCM), and Oracle Integration Cloud (OIC).
- Full life cycle Oracle EPM Cloud implementation experience.
- Experience in creating forms, OIC integrations, and complex business rules.
- Understand dependencies and interrelationships between various components of Oracle EPM Cloud.
- Keep abreast of the Oracle EPM roadmap and key functionality to identify opportunities to enhance current processes within the entire Financials ecosystem.
- Proven ability to collaborate with internal clients in an agile manner, leveraging design-thinking approaches.
- Collaborate with FP&A to facilitate the planning, forecasting, and reporting process for the organization.
- Create and maintain system documentation, both functional and technical.

Your Profile
- Proven ability to collaborate with internal clients in an agile manner, leveraging design-thinking approaches.
- Experience with Python and AWS Cloud (Lambda, Step Functions, EventBridge, etc.) is preferred.

What you'll love about Capgemini
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work.
Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band. Also get to participate in internal sports events, yoga challenges, or marathons. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges. About Capgemini
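For candidates brushing up on the preferred Python + AWS skills above: a minimal sketch of an AWS Lambda handler reacting to an EventBridge event. The routing logic and return values are illustrative assumptions, not part of any real system; EventBridge does deliver events with envelope fields such as "source", "detail-type", and "detail".

```python
import json

def handler(event, context):
    """Minimal AWS Lambda handler for an EventBridge event (sketch).

    EventBridge invokes the function with the event envelope as a dict;
    this handler just routes on the standard "source" and "detail-type"
    fields. The "refresh-started" action is a hypothetical placeholder.
    """
    source = event.get("source", "")
    detail = event.get("detail", {})
    if source == "aws.scheduler" or event.get("detail-type") == "Scheduled Event":
        # Periodic trigger: kick off a (hypothetical) forecasting refresh.
        return {"status": "refresh-started"}
    # Anything else: echo what arrived so it is easy to inspect in CloudWatch logs.
    return {"status": "ignored", "source": source, "detail": json.dumps(detail)}
```

In practice the function would be wired to an EventBridge rule (or schedule) via infrastructure-as-code, with `context` supplying invocation metadata.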

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 27 Lacs

Pune

Remote

Work Hours: Partial overlap with US PST

Key Responsibilities
- Rapidly prototype MVPs and innovative ideas within tight timelines.
- Own end-to-end development and deployment of applications in non-production AWS environments.
- Collaborate with cross-functional teams to deliver scalable web solutions.

Technical Expertise
1. Front-End Development: Proficiency in React, AWS S3, and AWS CloudFront. Experience building medium to large websites (15-20+ pages).
2. Back-End & Serverless Architecture: Strong understanding of microservices architecture. Experience with the AWS serverless stack: Lambda (Node.js), Cognito, API Gateway, EventBridge, Step Functions. Familiarity with AWS Aurora MySQL and DynamoDB (preferred but not mandatory).
3. DevOps & CI/CD: Proficiency in AWS SAM, CloudFormation, or AWS CDK. Experience with AWS CodePipeline or equivalent tools (e.g., GitHub Actions).

Requirements
- 5-10 years of software development experience.
- Minimum 3 years of hands-on experience with React and AWS technologies.
- Fast learner with the ability to adapt in a dynamic environment.
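The EventBridge piece of the serverless stack mentioned above works by matching incoming events against declarative patterns. As a rough, simplified sketch of that matching idea (real EventBridge supports more operators such as prefix and numeric matching; this only covers value lists and nested fields):

```python
def matches(pattern, event):
    """Simplified EventBridge-style event pattern matching (sketch).

    Each pattern field maps either to a list of acceptable values or to a
    nested pattern dict, which is matched recursively against the event.
    """
    for key, allowed in pattern.items():
        value = event.get(key)
        if isinstance(allowed, dict):
            # Nested pattern: the event must carry a matching nested object.
            if not isinstance(value, dict) or not matches(allowed, value):
                return False
        elif value not in allowed:  # allowed is a list of acceptable values
            return False
    return True
```

A rule like `{"source": ["aws.s3"], "detail": {"eventName": ["PutObject"]}}` would then route only S3 PutObject events to a target such as a Lambda function or Step Functions state machine.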

Posted 3 weeks ago

Apply

5.0 - 10.0 years

12 - 16 Lacs

Hyderabad, Pune, Chennai

Work from Office

Your Role
- 5+ years of experience implementing and supporting the following Enterprise Planning & Budgeting Cloud Services (EPBCS) modules: Financials, Workforce, Capital, and Projects.
- Experience in Enterprise Data Management Consolidation (EDMCS), Enterprise Profitability & Cost Management Cloud Services (EPCM), and Oracle Integration Cloud (OIC).
- 1+ full life cycle Oracle EPM Cloud implementation.
- Experience in creating forms, OIC integrations, and complex business rules.
- Understand dependencies and interrelationships between various components of Oracle EPM Cloud.
- Keep abreast of the Oracle EPM roadmap and key functionality to identify opportunities to enhance current processes within the entire Financials ecosystem.
- Proven ability to collaborate with internal clients in an agile manner, leveraging design-thinking approaches.
- Collaborate with FP&A to facilitate the planning, forecasting, and reporting process for the organization.
- Create and maintain system documentation, both functional and technical.

Your Profile
- Proven ability to collaborate with internal clients in an agile manner, leveraging design-thinking approaches.
- Experience with Python and AWS Cloud (Lambda, Step Functions, EventBridge, etc.) is preferred.

What you'll love about Capgemini
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group. You will also get personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.
We're committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging at Capgemini. You are valued for who you are, and you can bring your original self to work. Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band. Also get to participate in internal sports events, yoga challenges, or marathons. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges. About Capgemini

Location: Hyderabad, Pune, Chennai, Bengaluru, Mumbai

Posted 3 weeks ago

Apply

6.0 - 10.0 years

6 - 7 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Experience: 6+ years | Expected Notice Period: 15 Days | Shift: (GMT+05:30) Asia/Kolkata (IST) | Opportunity Type: Remote (New Delhi, Bengaluru, Mumbai)

Technical Lead

What you'll own
- Leading the re-architecture of Zoom's database foundation with a focus on scalability, query performance, and vector-based search support
- Replacing or refactoring the current in-house object store and metadata database with a modern, high-performance elastic solution
- Collaborating closely with core platform engineers and AI/search teams to ensure seamless integration and zero disruption to existing media workflows
- Designing an extensible system that supports object-style relationships across millions of assets, including LLM-generated digital asset summaries, time-coded video metadata, AI-generated tags, and semantic vectors
- Driving end-to-end implementation: schema design, migration tooling, performance benchmarking, and production rollout, all on aggressive timelines

Skills & Experience We Expect
We're looking for candidates with 7-10 years of hands-on engineering experience, including 3+ years in a technical leadership role. Your experience should span the following core areas:

System Design & Architecture (3-4 yrs)
- Strong hands-on experience with the Java/JVM stack (GC tuning) and Python in production environments
- Led system-level design for scalable, modular AWS microservices architectures
- Designed high-throughput, low-latency media pipelines capable of scaling to billions of media records
- Familiar with multitenant SaaS patterns, service decomposition, and elastic scale-out/in models
- Deep understanding of infrastructure observability, failure handling, and graceful degradation

Database & Metadata Layer Design (3-5 yrs)
- Experience redesigning or implementing object-style metadata stores used in MAM/DAM systems
- Strong grasp of schema-less models for asset relationships, time-coded metadata, and versioned updates
- Practical experience with DynamoDB, Aurora, PostgreSQL, or similar high-scale databases
- Comfortable evaluating trade-offs between memory, query latency, and write throughput

Semantic Search & Vectors (1-3 yrs)
- Implemented vector search using systems like Weaviate, Pinecone, Qdrant, or Faiss
- Able to design hybrid (structured + semantic) search pipelines for similarity and natural-language use cases
- Experience tuning vector indexes for performance, memory footprint, and recall
- Familiar with the basics of embedding-generation pipelines and how they are used for semantic search and similarity-based retrieval
- Worked with MLOps teams to deploy ML inference services (e.g., FastAPI/Docker + GPU-based EC2 or SageMaker endpoints)
- Understands the limitations of recognition models (e.g., OCR, face/object detection, logo recognition), even if not directly building them

Media Asset Workflow (2-4 yrs)
- Deep familiarity with broadcast and OTT formats: MXF, IMF, DNxHD, ProRes, H.264, HEVC
- Understanding of proxy workflows in video post-production
- Experience with the digital asset lifecycle: ingest, AI metadata enrichment, media transformation, S3 cloud archiving
- Hands-on experience managing time-coded metadata (e.g., subtitles, AI tags, shot changes) in media archives

Cloud-Native Architecture (AWS) (3-5 yrs)
- Strong hands-on experience with ECS, Fargate, Lambda, S3, DynamoDB, Aurora, SQS, EventBridge
- Experience building serverless or service-based compute models for elastic scaling
- Familiarity with managing multi-region deployments, failover, and IAM configuration
- Built cloud-native CI/CD deployment pipelines with event-driven microservices and queue-based workflows

Frontend Collaboration & React App Integration (2-3 yrs)
- Worked closely with React-based frontend teams, especially on desktop-style web applications
- Familiar with component-based design systems, REST/GraphQL API integration, and optimizing media-heavy UI workflows
- Able to guide frontend teams on data modeling, caching, and efficient rendering of large asset libraries
- Experience with Electron for desktop apps

Skills: MAM, App integration
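As a rough illustration of the hybrid (structured + semantic) search idea this role calls for: filter candidates on structured metadata, then rank the survivors by embedding similarity. The field names and linear scan below are illustrative only; a production system would delegate the vector step to an index such as Faiss or Qdrant.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

def hybrid_search(assets, query_vec, asset_type=None, top_k=3):
    """Rank assets by embedding similarity after a structured filter (sketch).

    Each asset is a dict with a structured "type" field and a semantic
    "embedding" vector; both field names are hypothetical.
    """
    candidates = [a for a in assets if asset_type is None or a["type"] == asset_type]
    ranked = sorted(candidates,
                    key=lambda a: cosine(a["embedding"], query_vec),
                    reverse=True)
    return ranked[:top_k]
```

The same two-stage shape (pre-filter, then approximate nearest-neighbor ranking) is what most vector databases expose natively via metadata filters on their similarity queries.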

Posted 3 weeks ago

Apply

7.0 - 12.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Your Job
As a Data Engineer, you will be part of a team that designs, develops, and delivers data pipelines and data analytics solutions for Koch Industries. Koch Industries is a privately held global organization with over 120,000 employees around the world, with subsidiaries involved in manufacturing, trading, and investments. Koch Global Solution India (KGSI) is being developed in India to extend its IT operations and act as a hub for innovation in the IT function. As KGSI rapidly scales up its operations in India, its employees will get opportunities to carve out a career path for themselves within the organization. This role will have the opportunity to join on the ground floor and will play a critical part in helping build out the Koch Global Solution (KGS) over the next several years. Working closely with global colleagues will provide significant international exposure.

Our Team
The enterprise data and analytics team at Georgia-Pacific is focused on creating an enterprise capability around data engineering solutions for operational and commercial data, as well as helping businesses develop, deploy, manage, and monitor data pipelines and analytics solutions for manufacturing, operations, supply chain, and other key areas.

What You Will Do
- ETL Solutions: Design, implement, and manage large-scale ETL solutions using the AWS technology stack, including EventBridge, Lambda, Glue, Step Functions, Redshift, and CloudWatch.
- Data Pipeline Management: Design, develop, enhance, and debug existing data pipelines to ensure seamless operations.
- Data Modeling: Proven experience in designing and developing data models.
- Best Practices Implementation: Develop and implement best practices to ensure high data availability, computational efficiency, cost-effectiveness, and data quality within Snowflake and AWS environments.
- Enhancement: Build and enhance data products, processes, functionalities, and tools to streamline all stages of data lake implementation and analytics solution development, including proofs of concept, prototypes, and production systems.
- Production Support: Provide ongoing support for production data pipelines, ensuring high availability and performance.
- Issue Resolution: Monitor, troubleshoot, and resolve issues within data pipelines and ETL processes promptly.
- Automation: Develop and implement scripts and tools to automate routine tasks and enhance system efficiency.

Who You Are (Basic Qualifications)
- Bachelor's degree in Computer Science, Engineering, or a related IT field, with at least 7+ years of experience in software development.
- 5+ years of hands-on experience designing, implementing, and managing large-scale ETL solutions using the AWS technology stack, including EventBridge, Lambda, Glue, Step Functions, Redshift, and CloudWatch.
- Primary skill set: SQL, S3, AWS Glue, PySpark, Python, Lambda, columnar DB (Redshift), AWS IAM, Step Functions, Git, Terraform, CI/CD.
- Good to have: Experience with the MSBI stack, including SSIS, SSAS, and SSRS.

What Will Put You Ahead
- In-depth knowledge of the entire suite of services in the AWS data services platform.
- Strong coding experience using Python and PySpark.
- Experience designing and implementing data models.
- Cloud data analytics/engineering certification.

Who We Are
At Koch, employees are empowered to do what they do best to make life better. Learn how we help employees unleash their potential while creating value for themselves and the company. Additionally, everyone has individual work and personal needs. We seek to enable the best work environment that helps you and the business work together to produce superior results.
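For a feel of the transform stage in pipelines like those described above, here is a deliberately tiny pure-Python sketch: drop incomplete rows, then normalize keys and types. The field names ("plant", "output_tons") are invented for illustration; a real Glue/PySpark job would express the same cleansing logic over a DynamicFrame or DataFrame instead of a Python list.

```python
def transform(records):
    """Toy ETL transform step: drop incomplete rows and normalize fields.

    Rows missing the (hypothetical) "plant" or "output_tons" fields are
    filtered out; surviving rows get a normalized join key and numeric type.
    """
    cleaned = []
    for rec in records:
        if rec.get("plant") and rec.get("output_tons") is not None:
            cleaned.append({
                "plant": rec["plant"].strip().upper(),     # normalize key for joins
                "output_tons": float(rec["output_tons"]),  # enforce numeric type
            })
    return cleaned
```

In an event-driven setup, an EventBridge rule or schedule would trigger the job, with CloudWatch capturing the counts of dropped versus emitted rows for data-quality monitoring.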

Posted 4 weeks ago

Apply



Featured Companies