
13218 Kafka Jobs - Page 11

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 years

0 Lacs

Andhra Pradesh

On-site

Job Title: Java Full Stack Developer with Angular
Location: Any Location
Experience Required: 8+ years

Job Summary: We are looking for a highly skilled and experienced Full Stack Java Developer with Angular exposure and 8+ years of hands-on experience in developing scalable, high-performance enterprise applications. The ideal candidate should have strong proficiency in Java, Spring Boot, Microservices Architecture, RESTful APIs, and JUnit, along with exposure to Angular as a front-end framework.

Roles and Responsibilities: Design, develop, and maintain scalable Java/Spring Boot microservices and enterprise applications. Write efficient, reusable, and testable code following best practices. Perform unit and integration testing using JUnit or similar frameworks. Develop RESTful APIs and integrate with front-end systems and third-party services. Strong exposure to the Angular front-end framework. Collaborate with cross-functional teams including front-end developers, QA, DevOps, and product managers. Implement CI/CD pipelines for continuous integration and deployment. Build and manage containerized applications using Docker and Kubernetes. Participate in code reviews, architecture discussions, and technical documentation. Monitor application performance and troubleshoot production issues. Stay updated with emerging technologies and suggest improvements to the existing tech stack. Follow Agile/Scrum processes for sprint planning, daily standups, and retrospectives.

Primary Skills: Strong expertise in Java 17+, Spring Boot, Microservices, and exposure to the Angular framework. Good hands-on coding experience; strong problem-solving and debugging skills. Hands-on experience in building and consuming RESTful APIs, Swagger. Proficiency in SQL (PostgreSQL, Oracle). Solid experience with unit testing frameworks: JUnit. Version control with Git. CI/CD tools: GitHub Actions. Experience with cloud platforms: GCP. Knowledge of Docker and Kubernetes for containerized deployments. Good attitude, communication skills, and team collaboration. Use case understanding. Experience with Jira.

Secondary Skills: Familiarity with JavaScript frameworks: Angular, TypeScript. Experience with messaging systems: Apache Kafka. Exposure to Agile/Scrum methodologies. Basic understanding of security practices (OAuth2, JWT, etc.).

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.
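For context on the kind of work this role describes, here is a minimal, hypothetical sketch of a Spring Boot REST endpoint; the class names, route, and payload are illustrative assumptions, not details from the posting.

```java
// Hypothetical sketch: a minimal Spring Boot REST controller of the kind this role builds.
// Assumes spring-boot-starter-web (Java 17+) is on the classpath; names are illustrative only.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
public class OrderServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(OrderServiceApplication.class, args);
    }
}

@RestController
class OrderController {
    // GET /orders/{id} returns a simple JSON payload; a real service would call a repository layer.
    @GetMapping("/orders/{id}")
    public OrderDto getOrder(@PathVariable("id") long id) {
        return new OrderDto(id, "CONFIRMED");
    }
}

// Java 17 record: serialized to JSON automatically by Spring's Jackson integration.
record OrderDto(long id, String status) {}
```

In a setup like this, a JUnit test would typically exercise the controller with MockMvc or WebTestClient, matching the unit-testing expectations listed above.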

Posted 1 day ago

Apply

7.0 years

0 Lacs

Andhra Pradesh

On-site

Strong background in designing and developing custom applications and solutions on the ServiceNow platform using both low-code (e.g., Flow Designer, ServiceNow Studio) and pro-code (e.g., JavaScript, AngularJS) approaches. Collaborate with business analysts and stakeholders to gather requirements and translate them into technical specifications. Configure and customize ServiceNow modules, including ITSM, ITOM, HRSD, and others, to meet business needs. Develop and maintain workflows, UI policies, business rules, client scripts, and other platform components. Design and implement integrations with third-party systems using REST/SOAP APIs, Kafka, MID Server, and other integration tools. Automate business processes and tasks using ServiceNow's Flow Designer and other automation capabilities. Proficient in managing data access and integration between scoped applications and global database tables within the ServiceNow platform. Good implementation knowledge and experience with ServiceNow modules; should have implemented ITSM along with other modules such as ITOM, CSM, or HRSD.

Experience: 7 years
Preferred location: Hyderabad, India
Preferred certifications: ServiceNow Certified Application Developer (gold standard), ServiceNow Certified Implementation Specialist; ServiceNow Citizen Developer training is a must-have.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Posted 1 day ago

Apply

0 years

0 Lacs

Andhra Pradesh

On-site

Strong background in designing and developing custom applications and solutions on the ServiceNow platform using both low-code (e.g., Flow Designer, ServiceNow Studio) and pro-code (e.g., JavaScript, AngularJS) approaches. Collaborate with business analysts and stakeholders to gather requirements and translate them into technical specifications. Configure and customize ServiceNow modules, including ITSM, ITOM, HRSD, and others, to meet business needs. Develop and maintain workflows, UI policies, business rules, client scripts, and other platform components. Design and implement integrations with third-party systems using REST/SOAP APIs, Kafka, MID Server, and other integration tools. Automate business processes and tasks using ServiceNow's Flow Designer and other automation capabilities. Proficient in managing data access and integration between scoped applications and global database tables within the ServiceNow platform. Good implementation knowledge and experience with ServiceNow modules; should have implemented ITSM along with other modules such as ITOM, CSM, or HRSD.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Posted 1 day ago

Apply

0 years

0 Lacs

Andhra Pradesh

On-site

Primary Skills: Java, Spring Boot, RESTful APIs, Microservices, Swagger; good hands-on coding experience; strong problem-solving and debugging skills; PostgreSQL, Oracle, JUnit, Git, GitHub Actions, GCP, Docker, Kubernetes; good attitude and communication skills; team collaboration; use case understanding; experience with Jira.

Secondary Skills: Angular, TypeScript, Kafka, Agile methodologies.

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Reference # 320068BR
Job Type: Full Time

Your role
Do you want to join a team that blazes a trail for technology modernisation in banking? Do you aspire to work with colleagues who productionise microservice architecture concepts into enterprise-scale solutions? Can you drive modernization and reshape traditional platforms and systems? Our team is undertaking an ambitious programme of wholesale platform renewal & product evolution underpinned by genuine digitalisation. We are looking for a talented engineer to act as a technical expert for our fledgling team in Pune, India. If you are motivated to build our next-generation Post-Trade & Custody ecosystem based on services and integration in modern programming paradigms, we look forward to receiving your application.

Our Stack
CI/CD: GitLab. Cloud: Azure. Platform: Linux, Windows, Mainframe. Configuration management: Ansible. Scripts: Bash, PowerShell, Azure CLI. Programming languages: Java, C#, Kotlin. Container-based architecture and deployments (Docker/Azure Kubernetes).

Your team
You will work in a highly collaborative environment with colleagues from globally diverse backgrounds and skillsets coming together to solve challenging problems as a team. The position is within our Trade and Transfer Assets stream, closely collaborating with Agile delivery units distributed across the globe. Our teams design, deliver and operate state-of-the-art financial systems that offer best-in-class services to the bank’s clients. Our development pods work across a multitude of development languages in a model of coexistence whilst we transform, modernize and evolve our post-trade service platform. We are genuine believers that diversity brings more varied experience, expertise, and working methods to improve the way we engineer and deliver solutions.

Your expertise
We are looking for a DevOps engineer with Azure experience who is passionate and ready to develop state-of-the-art technology solutions for our digital platforms. This job will present a variety of challenges on a daily basis: you will need to understand the business needs and, based on your creative thinking, develop and design solutions in Azure with Kafka and DevOps capabilities, implementing them according to DevOps practices. This job is for someone who is excited to work with cutting-edge technologies and motivated to work with large amounts of complex cloud-based data. To apply for this job as an Azure Cloud Developer with Kafka you should bring: 7+ years of experience in designing and developing Azure (with Kubernetes) solutions; strong knowledge and experience of working with Kafka; comfort working with large amounts of data; knowledge of technologies such as Docker and Kubernetes; essential DevOps skills; good Postgres DB knowledge; Microsoft Azure expertise and certification is a plus. Your profile may also show above-and-beyond elements such as: a positive attitude, willingness to learn and desire to improve the environment around you; knowledge of virtualization and containerization; a track record as an engineer working in a globally distributed team; on-the-job examples of working in a fast-paced Agile environment.

About Us
UBS is the world’s largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors.
We have a presence in all major financial centers in more than 50 countries. How We Hire We may request you to complete one or more assessments during the application process. Learn more Join us At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We’re dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible. Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That’s why collaboration is at the heart of everything we do. Because together, we’re more than ourselves. We’re committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. Disclaimer / Policy Statements UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.

Posted 1 day ago

Apply

9.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Reference # 320587BR
Job Type: Full Time

Your role
Are you looking for a challenge? Are you passionate about technology? As a Senior Java full stack developer, you will work on one of the largest and most complex CRM application builds. You will work closely with Product Management, Software Development, Data Engineering and other teams to develop scalable and innovative CRM solutions. Your role will be accountable for design/implementation of technical solutions within WMA and timely delivery of projects following an agile/Scrum SDLC. You will: analyze and understand business and technical stories, write code, implement automated tests, contribute to release and iteration planning and develop the working practices of the team; capture detailed technical requirements based on business requirements; support the business in resolving high-priority defects and deploying fixes to production systems; contribute widely to establishing and promoting best SDLC practices and proactively investigate and propose new technologies for use within the department; act as a mentor for more junior staff in the team.

Your team
Are you an enthusiastic technology professional? Are you excited about seeking an enriching career, working for one of the finest financial institutions in the world? If so, you are the right person for this role. We are seeking technology and domain experts to join our CRM development team. We are responsible for WMA (Wealth Management Americas) client-facing technology applications. You’ll be working in the WMA CRM crew, focusing on building applications used by financial advisors. Our team is dedicated to creating innovative solutions that drive our organization's success. We foster a collaborative and supportive environment, where you can grow and excel in your role.

Your expertise
9+ years of experience in technology and in developing enterprise-class applications using Java, Unix/Linux, Spring, Hibernate, Maven, JMS, Oracle, SQL and RDBMS; experience building on the Kafka platform and hands-on knowledge of Kafka infrastructure and scale; hands-on expertise developing UI using Java, React JS, HTML and CSS; deep understanding of Java and JEE internals (class loading, memory management, transaction management, etc.); excellent knowledge of relational databases, SQL and ORM technologies (WebFlux); strong middleware skills in building RESTful/SOA services in Java using OpenAPI/Swagger, Apigee or API Connect, along with RDBMS and NoSQL skills for persistence; proficiency with build and CI tooling like Maven/Gradle/Git; some experience with Unix OS and shell scripting; some understanding of Azure cloud-based solutions; experience in unit/integration testing.

About Us
UBS is the world’s largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.

How We Hire
We may request you to complete one or more assessments during the application process. Learn more

Join us
At UBS, we know that it's our people, with their diverse skills, experiences and backgrounds, who drive our ongoing success. We’re dedicated to our craft and passionate about putting our people first, with new challenges, a supportive team, opportunities to grow and flexible working options when possible.
Our inclusive culture brings out the best in our employees, wherever they are on their career journey. We also recognize that great work is never done alone. That’s why collaboration is at the heart of everything we do. Because together, we’re more than ourselves. We’re committed to disability inclusion and if you need reasonable accommodation/adjustments throughout our recruitment process, you can always contact us. Disclaimer / Policy Statements UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.

Posted 1 day ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Full Stack Developer
Job Location: Hyderabad
Notice Period: 15 Days

Role Overview:
· Play a crucial role in driving the Company's mission to simplify and innovate construction management.
· Collaborate with diverse clients worldwide, helping them transform complex workflows.
· Thrive in a fast-paced, tech-driven environment that encourages continuous learning and growth.
· Advance your career by delivering real impact on large-scale infrastructure and construction projects.

Key Responsibilities:
· We are looking for a tech enthusiast with a knack for full stack development, eager to dive into code and bring ideas to life.
· Own features from brainstorming to deployment, handling everything from database architecture to front-end performance.
· Optimize and Scale: Ensure that our platform is high-performing, scalable, and future-proof. You will be part of laying the groundwork for big, exciting growth.
· Collaborate & Conquer: Work closely with our design, product, and AI teams to integrate machine learning and automation features into our platform, pushing the boundaries of what tech can do in construction.
· Write clean, efficient, and maintainable code, with a track record to show for it.

Required Qualifications:
· Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field. Equivalent practical experience may be acceptable with a strong portfolio and leadership track record.
· 3+ years of experience with either the MEAN (MongoDB, Express, Angular, Node.js) or MERN (MongoDB, Express, React, Node.js) stack.
· Hands-on experience in designing and building scalable, secure full-stack applications in a microservices or monolithic architecture.
· Strong proficiency in Angular 15+, RxJS, NgRx (or other state management libraries).
· Solid understanding of TypeScript, JavaScript, HTML5, and CSS3.
· Experience building responsive and cross-browser applications.
· Familiarity with Angular CLI, lazy loading, routing, and component-based architecture.
· Proficiency in MongoDB, its query syntax, and the aggregation framework.
· Knowledge of Mongoose (ODM). Understanding of schema design, indexing, and performance tuning.

Nice-to-have:
· Experience with GraphQL, Socket.IO, or WebRTC.
· Understanding of Server-Side Rendering (SSR) using Next.js (for MERN) or Angular Universal (for MEAN).
· Knowledge of Redis, Kafka, or other message queues.
· Familiarity with multi-tenant architecture or SaaS product engineering.

What We Offer:
· Grow with purpose: Accelerate your career with hands-on learning and expert mentorship.
· Culture that empowers: Join a team where your ideas matter and diversity is celebrated.
· Perks that matter: Enjoy flexible work options and benefits designed to support your work-life balance.
· Make a real impact: Work on advanced solutions that simplify construction and help build smarter cities and communities worldwide.

Posted 1 day ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

8+ years of experience in development in the Ecom/Retail domain. Experience in User Management, Catalog / Browse / Search, Promotions, Offers & Pricing, Cart & Checkout, Payments, Order Capture and post-processing. Deep understanding of standards and formats like REST, GraphQL, JSON, XML, etc. Good knowledge of message brokers like Kafka, Kinesis, SQS, Pub/Sub, etc., and ways in which message brokering can be optimized with parallel processing. Sound knowledge of CDNs and their capabilities, caching (client and server side), and databases (SQL / NoSQL). Experience in web technologies like HTML / CSS, React / TypeScript / JavaScript and GraphQL. Advanced knowledge of JavaScript (ES6+) and modern frontend tooling (Webpack, Babel, etc.) will be an added advantage. Solid track record and experience working hands-on with eCommerce platforms like Shopify (recommended), HCL Commerce, ATG or SFCC. Exposure to packaged business components like Builder for CMS, Bloomreach for Search, payment gateways, and international or cross-border shipping will be an added advantage. Exposure to event-driven composable commerce architecture with an inherent knowledge of cloud systems (AWS). Should be aware of Node.js-based service implementations and capabilities in the relevant CSP. Good grasp of CI/CD, release mechanisms, working with high-velocity Scrum-based teams, branching strategies, and Git / Bitbucket.

Posted 1 day ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site

8+ years of experience in development in the Ecom/Retail domain. Experience in User Management, Catalog / Browse / Search, Promotions, Offers & Pricing, Cart & Checkout, Payments, Order Capture and post-processing. Deep understanding of standards and formats like REST, GraphQL, JSON, XML, etc. Good knowledge of message brokers like Kafka, Kinesis, SQS, Pub/Sub, etc., and ways in which message brokering can be optimized with parallel processing. Good knowledge of CDNs and their capabilities, caching (client and server side), and databases (SQL / NoSQL). Experience in web technologies like HTML / CSS, React / TypeScript / JavaScript and GraphQL. Advanced knowledge of JavaScript (ES6+) and modern frontend tooling (Webpack, Babel, etc.) will be an added advantage. Solid track record and experience working hands-on with eCommerce platforms like Shopify (recommended), HCL Commerce, ATG or SFCC. Exposure to packaged business components like Builder for CMS, Bloomreach for Search, payment gateways, and international or cross-border shipping will be an added advantage. Exposure to event-driven composable commerce architecture with an inherent knowledge of cloud systems (AWS). Should be aware of Node.js-based service implementations and capabilities in the relevant CSP. Good grasp of CI/CD, release mechanisms, working on high-velocity Scrum-based teams, branching strategies, and Git / Bitbucket.

Posted 1 day ago

Apply

0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description: Senior Data Streaming Engineer
Build and maintain a real-time, file-based streaming data platform leveraging open-source technologies. The ideal candidate will have experience with Kubernetes (K8s), Apache Kafka, and Java multithreading, and will be responsible for:
Developing a highly performant, scalable streaming architecture optimized for high throughput and low memory overhead
Implementing auto-scaling solutions to support variable data loads efficiently
Integrating reference data enrichment workflows using Snowflake
Ensuring system reliability and real-time processing across distributed environments
Collaborating with cross-functional teams to deliver robust, cloud-native data solutions
Building scalable and optimized ETL/ELT workflows leveraging Azure Data Factory (ADF) and Apache Spark within Databricks
Skills: Azure, Kafka, Java, Kubernetes
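As a rough illustration of the kind of component this role describes, below is a minimal, hypothetical sketch of a multithreaded Kafka consumer in Java that hands records to a small worker pool; the broker address, topic, group ID, and processing step are assumptions, not details from the posting.

```java
// Hypothetical sketch: poll Kafka on one thread and process records on a worker pool,
// keeping per-poll memory bounded. Uses the standard org.apache.kafka:kafka-clients API;
// topic and group names are illustrative only.
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.List;
import java.util.Properties;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class StreamingWorker {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "file-stream-enrichment");
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false");
        props.put(ConsumerConfig.MAX_POLL_RECORDS_CONFIG, "500"); // cap batch size to limit memory use

        ExecutorService workers = Executors.newFixedThreadPool(4);
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("incoming-files"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Placeholder processing; a real pipeline would enrich the record
                    // (e.g., with reference data from Snowflake) before writing downstream.
                    workers.submit(() -> System.out.printf("offset=%d key=%s%n",
                            record.offset(), record.key()));
                }
                // Commits after dispatching the batch; a production pipeline would commit
                // only after workers finish to avoid losing in-flight records on failure.
                consumer.commitSync();
            }
        }
    }
}
```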

Posted 1 day ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

8+ years of experience in development in the Ecom/Retail domain. Experience in User Management, Catalog / Browse / Search, Promotions, Offers & Pricing, Cart & Checkout, Payments, Order Capture and post-processing. Deep understanding of standards and formats like REST, GraphQL, JSON, XML, etc. Good knowledge of message brokers like Kafka, Kinesis, SQS, Pub/Sub, etc., and ways in which message brokering can be optimized with parallel processing. Good knowledge of CDNs and their capabilities, caching (client and server side), and databases (SQL / NoSQL). Experience in web technologies like HTML / CSS, React / TypeScript / JavaScript and GraphQL. Advanced knowledge of JavaScript (ES6+) and modern frontend tooling (Webpack, Babel, etc.) will be an added advantage. Solid track record and experience working hands-on with eCommerce platforms like Shopify (recommended), HCL Commerce, ATG or SFCC. Exposure to packaged business components like Builder for CMS, Bloomreach for Search, payment gateways, and international or cross-border shipping will be an added advantage. Exposure to event-driven composable commerce architecture with an inherent knowledge of cloud systems (AWS). Should be aware of Node.js-based service implementations and capabilities in the relevant CSP. Good grasp of CI/CD, release mechanisms, working on high-velocity Scrum-based teams, branching strategies, and Git / Bitbucket.

Posted 1 day ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

8+ years of experience in development in the Ecom/Retail domain. Experience in User Management, Catalog / Browse / Search, Promotions, Offers & Pricing, Cart & Checkout, Payments, Order Capture and post-processing. Deep understanding of standards and formats like REST, GraphQL, JSON, XML, etc. Good knowledge of message brokers like Kafka, Kinesis, SQS, Pub/Sub, etc., and ways in which message brokering can be optimized with parallel processing. Sound knowledge of CDNs and their capabilities, caching (client and server side), and databases (SQL / NoSQL). Experience in web technologies like HTML / CSS, React / TypeScript / JavaScript and GraphQL. Advanced knowledge of JavaScript (ES6+) and modern frontend tooling (Webpack, Babel, etc.) will be an added advantage. Solid track record and experience working hands-on with eCommerce platforms like Shopify (recommended), HCL Commerce, ATG or SFCC. Exposure to packaged business components like Builder for CMS, Bloomreach for Search, payment gateways, and international or cross-border shipping will be an added advantage. Exposure to event-driven composable commerce architecture with an inherent knowledge of cloud systems (AWS). Should be aware of Node.js-based service implementations and capabilities in the relevant CSP. Good grasp of CI/CD, release mechanisms, working with high-velocity Scrum-based teams, branching strategies, and Git / Bitbucket.

Posted 1 day ago

Apply

0 years

0 Lacs

Andhra Pradesh, India

On-site

Collaborate with Product Owners and stakeholders to understand business requirements. Good experience in Apache Kafka, Python, Tableau and MSBI (SSIS, SSRS). Kafka integration with Python and data loading processes. Analyse data from key source systems and design suitable solutions that transform the data from source to target. Provide support to the stakeholders and scrum team throughout the development lifecycle and respond to any design queries. Support testing and implementation, and review solutions to ensure functional and data assurance requirements are met. Passionate about data and delivering high-quality, data-led solutions. Able to influence stakeholders, build strong business relationships and communicate in a clear, concise manner. Experience working with SQL or any big data technologies is a plus (Hadoop, Hive, HBase, Scala, Spark, etc.). Good with Control-M, Git, and CI/CD pipelines. Good team player with a strong team ethos.
Skills Required: MSBI SSIS, MS SQL Server, Kafka, Airflow, ANSI SQL, Shell Script, Python, Scala, HDFS.

Posted 1 day ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Hyderabad

Work from Office

The Team: We are looking for a highly motivated, enthusiastic, and skilled engineering lead for Commodity Insights. We strive to deliver solutions that are sector-specific, data-rich, and hyper-targeted for evolving business needs. Our software development leaders are involved in the full product life cycle, from design through release. The resource would be joining a strong, innovative team working on the content management platforms which support a large revenue stream for S&P Commodity Insights. Working very closely with the Product Owner and Development Manager, teams are responsible for the development of user enhancements and maintaining good technical hygiene. The successful candidate will assist in the design, development, release and support of content platforms. Skills required include ReactJS, Spring Boot, RESTful microservices, AWS services (S3, ECS, Fargate, Lambda, etc.), CSS / HTML, AJAX / JSON, XML and SQL (PostgreSQL/Oracle). The candidate should be aware of GenAI and LLM models like OpenAI and Claude, should be enthusiastic about prompt building related to GenAI and business-related prompts, and should be able to develop and optimize prompts for AI models to improve accuracy and relevance. The candidate must be able to work well with a distributed team, demonstrate an ability to articulate technical solutions for business requirements, have experience with content management/packaging solutions, and embrace a collaborative approach for the implementation of solutions.

Responsibilities: Lead and mentor a team through all phases of the software development lifecycle, adhering to agile methodologies (analyze, design, develop, test, debug, and deploy). Ensure high-quality deliverables and foster a collaborative environment. Be proficient with the use of developer tools supporting the CI/CD process, including configuring and executing automated pipelines to build and deploy software components. Actively contribute to team planning and ceremonies and commit to team agreements and goals. Ensure code quality and security by understanding vulnerability patterns, running code scans, and being able to remediate issues. Mentor junior developers. Make sure that code review tasks on all user stories are added and completed on time. Perform reviews and integration testing to assure the quality of project development efforts. Design database schemas, conceptual data models, UI workflows and application architectures that fit into the enterprise architecture. Support the user base, assisting with tracking down issues and analyzing feedback to identify product improvements. Understand and commit to the culture of S&P Global: the vision, purpose and values of the organization.

Basic Qualifications: 10+ years of experience in an agile team development role, delivering software solutions using Scrum. Java, J2EE, JavaScript, CSS/HTML, AJAX. ReactJS, Spring Boot, Microservices, RESTful services, OAuth. XML, JSON, data transformation. SQL and NoSQL databases (Oracle, PostgreSQL). Working knowledge of Amazon Web Services (Lambda, Fargate, ECS, S3, etc.). Experience with GenAI or LLM models like OpenAI and Claude is preferred. Experience with agile workflow tools (e.g. VSTS, JIRA). Experience with source code management tools (e.g. git), build management tools (e.g. Maven) and continuous integration/delivery processes and tools (e.g.
Jenkins, Ansible) Self-starter able to work to achieve objectives with minimum direction Comfortable working independently as well as in a team Excellent verbal and written communication skills Preferred Qualifications: Analysis of business information patterns, data analysis and data modeling Working with user experience designers to deliver end-user focused benefits realization Familiar with containerization (Docker, Kubernetes) Messaging/queuing solutions (Kafka, etc.) Familiar with application security development/operations best practices (including static/dynamic code analysis tools)
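Purely for illustration of the AWS Lambda piece of the stack listed above, here is a minimal, hypothetical Java handler; the class name, event shape, and purpose are assumptions, not details from the posting.

```java
// Hypothetical sketch: a minimal AWS Lambda handler in Java (aws-lambda-java-core),
// of the kind deployed alongside the ECS/Fargate services mentioned above.
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

import java.util.Map;

public class ContentTagHandler implements RequestHandler<Map<String, String>, String> {
    @Override
    public String handleRequest(Map<String, String> event, Context context) {
        // A real handler might enrich a content item or call a prompt/LLM service;
        // here it just echoes the id to show the wiring.
        String id = event.getOrDefault("contentId", "unknown");
        context.getLogger().log("processing content " + id);
        return "processed:" + id;
    }
}
```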

Posted 1 day ago

Apply

4.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Profile Description
We’re seeking someone to design, develop, test, and deploy code following Agile principles. The candidate is expected to communicate well with global teams, adapt rapidly and learn fast.

CDRR Technology
The Cybersecurity organization's mission is to create an agile, adaptable organization with the skills and expertise needed to defend against increasingly sophisticated adversaries. This will be achieved by maintaining sound capabilities to identify and protect our assets, proactively assessing threats and vulnerabilities and detecting events, ensuring resiliency through our ability to respond to and recover from incidents, and building awareness and increasing vigilance while continually developing our cyber workforce.

Non-Financial Risk Technology
Non-Financial Risk Technology provides operational controls, support for sustainable investing, business continuity planning, and surveillance capabilities to enhance the firm’s resilience to threats and fraudulent behavior.

Software Engineering
This is a Director-level position that develops and maintains software solutions that support business needs. Morgan Stanley is an industry leader in financial services, known for mobilizing capital to help governments, corporations, institutions, and individuals around the world achieve their financial goals. At Morgan Stanley India, we support the Firm’s global businesses, with critical presence across Institutional Securities, Wealth Management, and Investment Management, as well as in the Firm’s infrastructure functions of Technology, Operations, Finance, Risk Management, Legal and Corporate & Enterprise Services. Morgan Stanley has been rooted in India since 1993, with campuses in both Mumbai and Bengaluru. We empower our multi-faceted and talented teams to advance their careers and make a global impact on the business. For those who show passion and grit in their work, there’s ample opportunity to move across the businesses. Interested in joining a team that’s eager to create, innovate and make an impact on the world? Read on…

Responsibilities
What you’ll do in the role: The candidate will work on existing and new initiatives within the ECRR suite of applications. Initiatives include customizations for evolving business needs as well as architectural and infrastructure improvements. This role will be responsible for activities including designing, developing, testing, and deploying code following Agile principles. The candidate is expected to communicate well with global teams, adapt rapidly and learn fast. Sound judgment will be required to understand complex business use cases, design appropriate solutions, develop the underlying implementation and leverage the team's strengths for application success. Financial domain knowledge and an understanding of AML and customer risk ranking quantitative methodologies is an advantage. Our current technology stack includes Java, Spring Boot, REST, AngularJS, Kafka, MQ and DB2. The team makes extensive use of open-source, state-of-the-art Java technologies and incorporates agile methodologies including Scrum, Test Driven Development, Continuous Integration and Continuous Delivery in its development processes. We are looking for a strong hands-on technologist who is passionate about technology, has strong experience developing Java-based systems, is proactive and is an excellent team player. The work involves handling large volumes (~36 million records a day) of customer and transaction data in the daily/weekly batch process, multi-processing/multi-threading enhancements to meet daily business SLA timelines, and real-time REST services providing sub-second responses.

Skills Required
What you’ll bring to the role: 4+ years of strong hands-on experience working on Core Java, concurrency and databases using Spring Boot REST services with JSON, Apache Kafka, IBM MQ, SOAP and XML. Strong experience developing distributed n-tier applications with distributed messaging layers, caching mechanisms and data flows. Strong database design skills including SQL, procedures, and query tuning. Service design concepts, object-oriented and functional development concepts. Agile development methodologies. DevOps tools and methodologies. Excellent interpersonal skills and a professional approach. Strong oral and written communication skills and the ability to interface with business stakeholders. Ability to lead multiple initiatives/projects in parallel with the right sense of prioritization. Ability to lead and mentor junior team members.

Skills desired: Microservices concepts and high-quality software architecture and design methodologies. Exposure to cloud technologies. Ability to modernize the ecosystem in an iterative approach. Creating Architecture Design Review documents. Linux and shell scripts. Angular/React UI is a plus.

What You Can Expect From Morgan Stanley
We are committed to maintaining the first-class service and high standard of excellence that have defined Morgan Stanley for over 89 years. Our values - putting clients first, doing the right thing, leading with exceptional ideas, committing to diversity and inclusion, and giving back - aren’t just beliefs, they guide the decisions we make every day to do what's best for our clients, communities and more than 80,000 employees in 1,200 offices across 42 countries. At Morgan Stanley, you’ll find an opportunity to work alongside the best and the brightest, in an environment where you are supported and empowered. Our teams are relentless collaborators and creative thinkers, fueled by their diverse backgrounds and experiences. We are proud to support our employees and their families at every point along their work-life journey, offering some of the most attractive and comprehensive employee benefits and perks in the industry. There’s also ample opportunity to move about the business for those who show passion and grit in their work. To learn more about our offices across the globe, please copy and paste https://www.morganstanley.com/about-us/global-offices into your browser. Morgan Stanley is an equal opportunities employer. We work to provide a supportive and inclusive environment where all individuals can maximize their full potential. Our skilled and creative workforce is comprised of individuals drawn from a broad cross section of the global communities in which we operate and who reflect a variety of backgrounds, talents, perspectives, and experiences. Our strong commitment to a culture of inclusion is evident through our constant focus on recruiting, developing, and advancing individuals based on their skills and talents.
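As an illustration of the Spring Boot plus Kafka combination named in this stack, here is a minimal, hypothetical listener sketch; the topic, group ID, and class names are assumptions, and broker settings are assumed to live in application properties.

```java
// Hypothetical sketch: a Spring Boot Kafka listener of the kind such a batch/streaming
// stack might use. Assumes spring-kafka and spring-boot-starter on the classpath and
// spring.kafka.bootstrap-servers configured; names are illustrative only.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@SpringBootApplication
public class TransactionIngestApplication {
    public static void main(String[] args) {
        SpringApplication.run(TransactionIngestApplication.class, args);
    }
}

@Component
class TransactionListener {
    // concurrency = "4" runs four listener containers so partitions are consumed in parallel,
    // one common way to work through tens of millions of messages within a batch SLA.
    @KafkaListener(topics = "customer-transactions", groupId = "risk-ranking", concurrency = "4")
    public void onMessage(String payload) {
        // Placeholder: a real consumer would deserialize JSON and update risk-scoring state.
        System.out.println("received " + payload);
    }
}
```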

Posted 1 day ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

ServiceNow_ITSM_TSOM_FSM_CMDB_ITOM_TSM_OMT_TNI
1. Knowledge/experience of working on different ServiceNow modules like ITSM (IT Service Management), TSOM (Telecommunications Service Operations Management), FSM (Field Service Management), CMDB (Configuration Management Database), Inventory, ITOM (Enterprise IT Operations Management), TSM (Telecommunications Service Management), OMT (Order Management), TNI (Telecommunications Network Inventory), etc.
2. Good knowledge/experience of Products & Service Catalog, Client Scripts, server-side scripting, Business Rules and other ServiceNow scripting capabilities.
3. Configuration/customization/implementation of the ServiceNow system followed by efficient integration using workflows.
4. Experience integrating external systems with ServiceNow using integration solutions, including ODBC, REST, SOAP, LDAP, SSL, Kafka, etc.
5. Deep understanding of the ServiceNow data model and structures, proposing and enforcing coding standards.
6. Good experience in Agile ways of working (Scrum, user stories, sprint model) and JIRA & Confluence tools.
Location - Across all Infosys DCs

Posted 1 day ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Pune, Gurugram, Pune/Gurgaon

Work from Office

Must have: Java 8, Spring Boot, Docker, Kubernetes, Microservices. Experience with Azure or AWS. Hands-on Oracle PL/SQL and Unix. CI/CD pipelines; unit, integration and component testing.
Nice to have: Telecom knowledge (OSS SOM product knowledge). Knowledge of Kafka. Knowledge of TMF.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Delhi, India

On-site

Opportunity for Sr. Java Developer with Node.js (Hybrid)
Locations: Gurugram, Pune, Bangalore, Chennai
Experience: 5-8 years
5+ years of experience in software development. Strong proficiency in Java and microservices architecture. Experience migrating applications from Node.js to Java microservices (ability to read and understand Node.js code). Hands-on experience with Spring Boot and RESTful APIs. Familiarity with database technologies (SQL/NoSQL). Experience with containerization (Docker). Knowledge of cloud platforms (AWS).
Bonus Skills: Exposure to AWS Lambda (not mandatory but a bonus). Experience with Liquibase for database management. Familiarity with Spring Feign Client for service-to-service communication.
Good-to-Have Skills: Experience with DevOps tools (CI/CD, Jenkins, Terraform). Familiarity with event-driven architecture (Kafka, RabbitMQ). Exposure to performance tuning and optimization techniques.
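For illustration of the Spring Cloud OpenFeign pattern mentioned above, here is a minimal, hypothetical client sketch; the service name, URL property, and DTO are assumptions, not details from the posting.

```java
// Hypothetical sketch of a declarative Feign client for service-to-service calls.
// Assumes spring-cloud-starter-openfeign on the classpath; names and URL are illustrative.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.openfeign.EnableFeignClients;
import org.springframework.cloud.openfeign.FeignClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;

@SpringBootApplication
@EnableFeignClients
public class GatewayApplication {
    public static void main(String[] args) {
        SpringApplication.run(GatewayApplication.class, args);
    }
}

// Feign generates the HTTP implementation of this interface at runtime, so no manual
// HTTP client code is needed for the downstream call.
@FeignClient(name = "user-service", url = "${user.service.url:http://localhost:8081}")
interface UserClient {
    @GetMapping("/users/{id}")
    UserDto findById(@PathVariable("id") long id);
}

// Deserialized from the downstream service's JSON response.
record UserDto(long id, String name) {}
```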

Posted 1 day ago

Apply

6.0 years

0 Lacs

Mohali district, India

On-site

Job Title: DevOps/MLOps Expert Location: Mohali (On-Site) Employment Type: Full-Time Experience: 6 + years Qualification: B.Tech CSE About the Role We are seeking a highly skilled DevOps/MLOps Expert to join our rapidly growing AI-based startup building and deploying cutting-edge enterprise AI/ML solutions. This is a critical role that will shape our infrastructure, deployment pipelines, and scale our ML operations to serve large-scale enterprise clients. As our DevOps/MLOps Expert , you will be responsible for bridging the gap between our AI/ML development teams and production systems, ensuring seamless deployment, monitoring, and scaling of our ML-powered enterprise applications. You’ll work at the intersection of DevOps, Machine Learning, and Data Engineering in a fast-paced startup environment with enterprise-grade requirements. Key Responsibilities MLOps & Model Deployment • Design, implement, and maintain end-to-end ML pipelines from model development to production deployment • Build automated CI/CD pipelines specifically for ML models using tools like MLflow, Kubeflow, and custom solutions • Implement model versioning, experiment tracking, and model registry systems • Monitor model performance, detect drift, and implement automated retraining pipelines • Manage feature stores and data pipelines for real-time and batch inference • Build scalable ML infrastructure for high-volume data processing and analytics Enterprise Cloud Infrastructure & DevOps • Architect and manage cloud-native infrastructure with focus on scalability, security, and compliance • Implement Infrastructure as Code (IaC) using Terraform , CloudFormation , or Pulumi • Design and maintain Kubernetes clusters for containerized ML workloads • Build and optimize Docker containers for ML applications and microservices • Implement comprehensive monitoring, logging, and alerting systems • Manage secrets, security, and enterprise compliance requirements Data Engineering & Real-time Processing • Build and maintain large-scale data pipelines using Apache Airflow , Prefect , or similar tools • Implement real-time data processing and streaming architectures • Design data storage solutions for structured and unstructured data at scale • Implement data validation, quality checks, and lineage tracking • Manage data security, privacy, and enterprise compliance requirements • Optimize data processing for performance and cost efficiency Enterprise Platform Operations • Ensure high availability (99.9%+) and performance of enterprise-grade platforms • Implement auto-scaling solutions for variable ML workloads • Manage multi-tenant architecture and data isolation • Optimize resource utilization and cost management across environments • Implement disaster recovery and backup strategies • Build 24x7 monitoring and alerting systems for mission-critical applications Required Qualifications Experience & Education • 4-8 years of experience in DevOps/MLOps with at least 2+ years focused on enterprise ML systems • Bachelor’s/Master’s degree in Computer Science, Engineering, or related technical field • Proven experience with enterprise-grade platforms or large-scale SaaS applications • Experience with high-compliance environments and enterprise security requirements • Strong background in data-intensive applications and real-time processing systems Technical Skills Core MLOps Technologies • ML Frameworks : TensorFlow, PyTorch, Scikit-learn, Keras, XGBoost • MLOps Tools : MLflow, Kubeflow, Metaflow, DVC, Weights & Biases • Model Serving : 
TensorFlow Serving, PyTorch TorchServe, Seldon Core, KFServing • Experiment Tracking : MLflow, Neptune.ai, Weights & Biases, Comet DevOps & Cloud Technologies • Cloud Platforms : AWS, Azure, or GCP with relevant certifications • Containerization : Docker, Kubernetes (CKA/CKAD preferred) • CI/CD : Jenkins, GitLab CI, GitHub Actions, CircleCI • IaC : Terraform, CloudFormation, Pulumi, Ansible • Monitoring : Prometheus, Grafana, ELK Stack, Datadog, New Relic Programming & Scripting • Python (advanced) - primary language for ML operations and automation • Bash/Shell scripting for automation and system administration • YAML/JSON for configuration management and APIs • SQL for data operations and analytics • Basic understanding of Go or Java (advantage) Data Technologies • Data Pipeline Tools : Apache Airflow, Prefect, Dagster, Apache NiFi • Streaming & Real-time : Apache Kafka, Apache Spark, Apache Flink, Redis • Databases : PostgreSQL, MongoDB, Elasticsearch, ClickHouse • Data Warehousing : Snowflake, BigQuery, Redshift, Databricks • Data Versioning : DVC, LakeFS, Pachyderm Preferred Qualifications Advanced Technical Skills • Enterprise Security : Experience with enterprise security frameworks, compliance (SOC2, ISO27001) • High-scale Processing : Experience with petabyte-scale data processing and real-time analytics • Performance Optimization : Advanced system optimization, distributed computing, caching strategies • API Development : REST/GraphQL APIs, microservices architecture, API gateways Enterprise & Domain Experience • Previous experience with enterprise clients or B2B SaaS platforms • Experience with compliance-heavy industries (finance, healthcare, government) • Understanding of data privacy regulations (GDPR, SOX, HIPAA) • Experience with multi-tenant enterprise architectures Leadership & Collaboration • Experience mentoring junior engineers and technical team leadership • Strong collaboration with data science teams , product managers , and enterprise clients • Experience with agile methodologies and enterprise project management • Understanding of business metrics , SLAs , and enterprise ROI Growth Opportunities • Career Path : Clear progression to Lead DevOps Engineer or Head of Infrastructure • Technical Growth : Work with cutting-edge enterprise AI/ML technologies • Leadership : Opportunity to build and lead the DevOps/Infrastructure team • Industry Exposure : Work with Government & MNCs enterprise clients and cutting-edge technology stacks Success Metrics & KPIs Technical KPIs • System Uptime : Maintain 99.9%+ availability for enterprise clients • Deployment Frequency : Enable daily deployments with zero downtime • Performance : Ensure optimal response times and system performance • Cost Optimization : Achieve 20-30% annual infrastructure cost reduction • Security : Zero security incidents and full compliance adherence Business Impact • Time to Market : Reduce deployment cycles and improve development velocity • Client Satisfaction : Maintain 95%+ enterprise client satisfaction scores • Team Productivity : Improve engineering team efficiency by 40%+ • Scalability : Support rapid client base growth without infrastructure constraints Why Join Us Be part of a forward-thinking, innovation-driven company with a strong engineering culture. Influence high-impact architectural decisions that shape mission-critical systems. Work with cutting-edge technologies and a passionate team of professionals. 
Competitive compensation, flexible working environment, and continuous learning opportunities. How to Apply Please submit your resume and a cover letter outlining your relevant experience and how you can contribute to Aaizel Tech Labs’ success. Send your application to hr@aaizeltech.com , bhavik@aaizeltech.com or anju@aaizeltech.com.

Posted 1 day ago

Apply

2.0 - 6.0 years

3 - 7 Lacs

Gurugram

Work from Office

We are looking for a PySpark Developer who loves solving complex problems across a full spectrum of technologies. You will help ensure our technological infrastructure operates seamlessly in support of our business objectives.
Responsibilities: Develop and maintain data pipelines implementing ETL processes. Take responsibility for Hadoop development and implementation. Work closely with a data science team implementing data analytic pipelines. Help define data governance policies and support data versioning processes. Maintain security and data privacy, working closely with the Data Protection Officer internally. Analyse a vast number of data stores and uncover insights.
Skillset Required: Ability to design, build and unit test applications in PySpark. Experience with Python development and Python data transformations. Experience with SQL scripting on one or more platforms: Hive, Oracle, PostgreSQL, MySQL, etc. In-depth knowledge of Hadoop, Spark, and similar frameworks. Strong knowledge of data management principles. Experience with normalizing/de-normalizing data structures and developing tabular, dimensional and other data models. Knowledge of YARN, clusters, executors and cluster configuration. Hands-on work with different file formats like JSON, Parquet, CSV, etc. Experience with the CLI on Linux-based platforms. Experience analysing current ETL/ELT processes and defining and designing new ones. Experience analysing business requirements in a BI/Analytics context and designing data models to transform raw data into meaningful insights. Knowledge of data visualization is good to have. Experience in processing large amounts of structured and unstructured data, including integrating data from multiple sources.

Posted 1 day ago

Apply

2.0 - 6.0 years

6 - 10 Lacs

Gurugram

Work from Office

We're looking for a Big Data Engineer who can find creative solutions to tough problems. As a Big Data Engineer, you'll create and manage our data infrastructure and tools, including collecting, storing, processing and analyzing our data and data systems. You know how to work quickly and accurately, using the best solutions to analyze mass data sets, and you know how to get results. You'll also make this data easily accessible across the company and usable in multiple departments.
Skillset Required: Bachelor's degree or higher in Computer Science or a related field. A solid track record of data management showing your flawless execution and attention to detail. Strong knowledge of and experience with statistics. Programming experience, ideally in Python, Spark, Kafka or Java, and a willingness to learn new programming languages to meet goals and objectives. Experience in C, Perl, JavaScript or other programming languages is a plus. Knowledge of data cleaning, wrangling, visualization and reporting, with an understanding of the best, most efficient use of associated tools and applications to complete these tasks. Experience in MapReduce is a plus. Deep knowledge of data mining, machine learning, natural language processing, or information retrieval. Experience processing large amounts of structured and unstructured data, including integrating data from multiple sources. Experience with machine learning toolkits including H2O, SparkML or Mahout. A willingness to explore new alternatives or options to solve data mining issues, and to utilize a combination of industry best practices, data innovations and your experience to get the job done. Experience in production support and troubleshooting. You find satisfaction in a job well done and thrive on solving head-scratching problems.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job purpose
Be a part of the development team in India. Work on the different technologies used globally. Transfer an existing MES solution to a modern hybrid-cloud microservice environment. Develop and support the production site the same way.
Duties and responsibilities
Develop MES software for lens manufacturing. Implement systems functions, controls and algorithms. Implement unit tests to achieve reasonable test coverage. Work within an agile project with Azure DevOps. Manage the deployment of new or changed versions of the software solution into edge environments. Work directly with various technologies (MSSQL/MongoDB/Kafka/Docker) to diagnose issues across your running services. Create and maintain training material and documentation for the services you deploy and own. Report new bugs/issues and resolve them at the code level.
Qualifications
A successfully completed degree in computer science/business informatics or equivalent work experience. Minimum 5 years of experience in software development.
Skill Set
Must have: Using modern IDEs (IntelliJ IDEA, Eclipse, Visual Studio). Advanced programming experience in Java. Knowledge of the Linux environment. Very good written and spoken English. Very good teamworking skills as well as a customer- and service-oriented way of thinking and working. First experiences in working with international teams. Experience in writing UI in React.
Should have: Basic knowledge of HTML, JS, CSS. Basic knowledge of using Git for source code versioning. Experience in software development of enterprise software. Experience in Quarkus or Java Spring.
Preferred: Usage of Azure DevOps for CI/CD. Experience in developing distributed systems or using microservice architectures. Experience in MongoDB. Experience in Kafka.
Other Attributes
Good communication skills in English; command of the German language is a plus. Willingness to learn and teach.
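As a small illustration of the Java/Kafka part of this stack, here is a minimal, hypothetical producer sketch; the broker address, topic, key, and event payload are assumptions, not details from the posting.

```java
// Hypothetical sketch: a minimal Kafka producer in plain Java (org.apache.kafka:kafka-clients).
// Broker and topic are illustrative; a real MES service would publish structured
// machine/lot events rather than a hard-coded string.
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

import java.util.Properties;

public class LensEventPublisher {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.ACKS_CONFIG, "all"); // wait for full acknowledgement for durability

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // Key by lot id so all events for a lot land on the same partition (per-lot ordering).
            ProducerRecord<String, String> record = new ProducerRecord<>(
                    "lens-production-events", "lot-42", "{\"step\":\"coating\",\"status\":\"done\"}");
            producer.send(record, (metadata, ex) -> {
                if (ex != null) {
                    ex.printStackTrace();
                } else {
                    System.out.printf("sent to %s-%d@%d%n",
                            metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
            producer.flush();
        }
    }
}
```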

Posted 1 day ago

Apply

0 years

40 - 45 Lacs

Bengaluru, Karnataka, India

On-site

About Us
We are on a mission to create India's largest fully automated financial inclusion organization, offering a range of financial services including micro-loans to serve the vast underserved middle/lower-income segment. Recognized as one of the Top 10 Google Launchpad-backed AI/ML tech startups, you will experience firsthand the challenges and opportunities of building and scaling our business. Collaborate with brilliant minds driven by the goal of solving macro issues related to financial inclusion. Our services span over 17,000 pin codes in India, having positively impacted over 5.5 million users. Our user profile ranges from micro-entrepreneurs and small retailers to blue/grey-collar workers and salaried employees across various sectors. As part of our team, you'll manage petabytes of data and contribute to organizational growth by deriving and applying data-driven insights, alongside opportunities to innovate and patent AI/ML technologies.
What Can You Expect?
Ownership of the company's success through ESOPs for high performers. Market-leading competitive salaries (in the 90th percentile). An open culture that encourages expressing opinions freely. Opportunities to learn from industry experts. A chance to positively impact billions of lives by enhancing financial inclusion. Be part of our journey to re-imagine solutions, delivering world-class, best-of-breed services to delight our customers and make a significant impact on the FinTech industry.
Roles & Responsibilities
Develop and extend our backend platform, processing terabytes of data to deliver unique, personalized financial experiences. Collaborate directly with tech-focused founding team members and IIT graduates with expertise in designing scalable and robust system architectures. Design systems from scratch with scalability and security front of mind. Demonstrate deep knowledge of design patterns in Java, data structures, and algorithms. Monitor and optimize MySQL database queries for peak performance. Experience with tools like Scala, Kafka, Bigtable, and BigQuery is beneficial but not mandatory. Mentor junior team members by providing regular feedback and conducting code reviews.
Skills: data structures, MySQL, BigQuery, Bigtable, HLD, design patterns, Kafka, Scala, fintech, algorithms, Java, architecture

Posted 1 day ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About Zscaler
Serving thousands of enterprise customers around the world, including 45% of Fortune 500 companies, Zscaler (NASDAQ: ZS) was founded in 2007 with a mission to make the cloud a safe place to do business and a more enjoyable experience for enterprise users. As the operator of the world’s largest security cloud, Zscaler accelerates digital transformation so enterprises can be more agile, efficient, resilient, and secure. The pioneering, AI-powered Zscaler Zero Trust Exchange™ platform, found in our SASE and SSE offerings, protects thousands of enterprise customers from cyberattacks and data loss by securely connecting users, devices, and applications in any location. Named a Best Workplace in Technology by Fortune and others, Zscaler fosters an inclusive and supportive culture that is home to some of the brightest minds in the industry. If you thrive in an environment that is fast-paced and collaborative, and you are passionate about building and innovating for the greater good, come make your next move with Zscaler.

Our Engineering team built the world’s largest cloud security platform from the ground up, and we keep building. With more than 100 patents and big plans for enhancing services and increasing our global footprint, the team has made us and our multitenant architecture today's cloud security leader, with more than 15 million users in 185 countries. Bring your vision and passion to our team of cloud architects, software engineers, security experts, and more who are enabling organizations worldwide to harness speed and agility with a cloud-first strategy.

We are seeking a visionary and dynamic Staff Engineer to join our ZPA ControlPath team. Reporting to the Senior Manager, you'll be responsible for:
Working closely with Principal Engineers, collaborating on architectural changes and product designs, thereby making impactful changes.
Building and operating high-scale systems.
Overseeing the software development lifecycle to deliver quality products and aligning technical solutions with product and user needs.

What We're Looking For (Minimum Qualifications)
6+ years of experience in Java coding in a highly distributed, enterprise-scale environment.
Experience being on call, dealing with cloud incidents, and writing RCAs.
Working knowledge of cloud infrastructure services on AWS/Azure.
Great mentor and coach.
Bachelor's or Master's degree in computer science, or equivalent experience.

What Will Make You Stand Out (Preferred Qualifications)
Experience building full CI/CD systems leveraging Kubernetes for microservices.
Experience building reliable and extensible data tiers for large-scale web services (Postgres and Redis); a minimal cache-aside sketch follows this listing.
Experience building web service frameworks, with logging and analytics experience using Druid, Kafka, and OpenSearch.

At Zscaler, we are committed to building a team that reflects the communities we serve and the customers we work with. We foster an inclusive environment that values all backgrounds and perspectives, emphasizing collaboration and belonging. Join us in our mission to make doing business seamless and secure.

Benefits
Our Benefits program is one of the most important ways we support our employees. Zscaler proudly offers comprehensive and inclusive benefits to meet the diverse needs of our employees and their families throughout their life stages, including:
Various health plans
Time off plans for vacation and sick time
Parental leave options
Retirement options
Education reimbursement
In-office perks, and more!
By applying for this role, you adhere to applicable laws, regulations, and Zscaler policies, including those related to security and privacy standards and guidelines. Zscaler is committed to providing equal employment opportunities to all individuals. We strive to create a workplace where employees are treated with respect and have the chance to succeed. All qualified applicants will be considered for employment without regard to race, color, religion, sex (including pregnancy or related medical conditions), age, national origin, sexual orientation, gender identity or expression, genetic information, disability status, protected veteran status, or any other characteristic protected by federal, state, or local laws. See more information by clicking on the Know Your Rights: Workplace Discrimination is Illegal link.

Pay Transparency
Zscaler complies with all applicable federal, state, and local pay transparency rules.

Zscaler is committed to providing reasonable support (called accommodations or adjustments) in our recruiting processes for candidates who are differently abled, have long-term conditions, mental health conditions or sincerely held religious beliefs, or who are neurodivergent or require pregnancy-related support.
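One of the preferred qualifications in the listing above is building data tiers on Postgres and Redis. Below is a minimal cache-aside sketch, assuming the Jedis client and a plain JDBC connection; the connection strings, table, and key naming are illustrative assumptions, not Zscaler's actual design.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;

import redis.clients.jedis.Jedis;

public class TenantConfigStore {

    public String loadTenantConfig(String tenantId) throws Exception {
        String cacheKey = "tenant:" + tenantId; // hypothetical key scheme

        try (Jedis redis = new Jedis("localhost", 6379)) {
            String cached = redis.get(cacheKey);
            if (cached != null) {
                return cached; // cache hit: skip the database entirely
            }

            // Cache miss: read the row from Postgres, then populate the cache.
            try (Connection db = DriverManager.getConnection(
                         "jdbc:postgresql://localhost:5432/appdb", "app", "secret"); // assumed credentials
                 PreparedStatement ps = db.prepareStatement(
                         "SELECT config_json FROM tenant_config WHERE tenant_id = ?")) {
                ps.setString(1, tenantId);
                try (ResultSet rs = ps.executeQuery()) {
                    if (!rs.next()) {
                        return null; // unknown tenant
                    }
                    String config = rs.getString("config_json");
                    redis.setex(cacheKey, 300, config); // cache for 5 minutes
                    return config;
                }
            }
        }
    }
}
```

A production data tier would add connection pooling, cache invalidation on writes, and metrics; the sketch only shows the read path.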

Posted 1 day ago

Apply

15.0 years

0 Lacs

India

Remote

About Us
MyRemoteTeam, Inc is a fast-growing distributed workforce enabler, helping companies scale with top global talent. We empower businesses by providing world-class software engineers, operations support, and infrastructure to help them grow faster and better.

Job Title: AWS Cloud Architect
Experience: 15+ Years

Mandatory Skills
✔ 15+ years in Java full stack (Spring Boot, Microservices, ReactJS)
✔ Cloud architecture: AWS EKS, Kubernetes, API gateways (APIGEE/Tyk)
✔ Event streaming: Kafka, RabbitMQ
✔ Database mastery: PostgreSQL (performance tuning, scaling)
✔ DevOps: GitLab CI/CD, Terraform, Grafana/Prometheus
✔ Leadership: technical mentoring, decision-making

About the Role
We are seeking a highly experienced AWS Cloud Architect with 15+ years of expertise in full-stack Java development, cloud-native architecture, and large-scale distributed systems. The ideal candidate will be a technical leader capable of designing, implementing, and optimizing high-performance cloud applications across on-premise and multi-cloud environments (AWS). This role requires deep hands-on skills in Java, microservices, Kubernetes, Kafka, and observability tools, along with a strong architectural mindset to drive innovation and mentor engineering teams.

Key Responsibilities
✅ Cloud-Native Architecture & Leadership:
Lead the design, development, and deployment of scalable, fault-tolerant cloud applications (AWS EKS, Kubernetes, serverless).
Define best practices for microservices, event-driven architecture (Kafka), and API management (APIGEE/Tyk).
Architect hybrid cloud solutions (on-premise + AWS/GCP) with security, cost optimization, and high availability.

✅ Full-Stack Development:
Develop backend services using Java, Spring Boot, and PostgreSQL (performance tuning, indexing, replication).
Build modern frontends with ReactJS (state management, performance optimization).
Design REST/gRPC APIs and event-driven systems (Kafka, SQS).

✅ DevOps & Observability:
Manage Kubernetes (EKS) clusters, Helm charts, and GitLab CI/CD pipelines.
Implement Infrastructure as Code (IaC) using Terraform/CloudFormation.
Set up monitoring (Grafana, Prometheus), logging (ELK), and alerting for production systems.

✅ Database & Performance Engineering:
Optimize PostgreSQL for high throughput, replication, and low-latency queries.
Troubleshoot database bottlenecks, caching (Redis), and connection pooling.
Design data migration strategies (on-premise → cloud).

✅ Mentorship & Innovation:
Mentor junior engineers and conduct architecture reviews.
Drive POCs on emerging tech (service mesh, serverless, AI/ML integrations).
Collaborate with the CTO and architects on long-term technical roadmaps.
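As a concrete, purely illustrative companion to the event-driven responsibilities above, here is a minimal listener sketch using Spring for Apache Kafka, as it might appear inside a Spring Boot microservice. The topic and group id are hypothetical, and the broker settings are assumed to come from the application's configuration (e.g. application.yml), which is not shown.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class PaymentEventListener {

    // Hypothetical topic and group id; real values would be externalized configuration.
    @KafkaListener(topics = "payment-events", groupId = "billing-service")
    public void onPaymentEvent(String payload) {
        // A real service would deserialize the payload and update PostgreSQL,
        // ideally idempotently so redelivered records are safe to reprocess.
        System.out.println("received payment event: " + payload);
    }
}
```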

Posted 1 day ago

Apply