
49 Kafka Messaging Jobs - Page 2

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

5.0 - 8.0 years

8 - 12 Lacs

Hyderabad

Work from Office

S&P Dow Jones Indices is seeking a Python/Big Data developer to be a key player in the implementation and support of data platforms for S&P Dow Jones Indices. This role requires a seasoned technologist who contributes to application development and maintenance, actively evaluates new products and technologies, and builds solutions that streamline business operations. The candidate must be delivery-focused with solid financial applications experience, and will assist in day-to-day support and operations functions, design, development, and unit testing.

Responsibilities and Impact:
- Lead the design and implementation of EMR Spark workloads using Python, including data access from relational databases and cloud storage technologies.
- Implement powerful new functionality using Python, PySpark, AWS, and Delta Lake.
- Independently produce optimal designs for business use cases and implement them using big data technologies.
- Enhance existing functionality in Oracle/Postgres procedures and functions.
- Performance-tune existing Spark jobs.
- Implement new functionality in Python, Spark, and Hive.
- Collaborate with cross-functional teams to support data-driven initiatives.
- Mentor junior team members and promote best practices.
- Respond to technical queries from the operations and product management teams.

What We're Looking For - Basic Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or Engineering, or equivalent work experience.
- 5-8 years of IT experience in application support or development.
- Hands-on experience writing effective and scalable Python programs.
- Deep understanding of OOP concepts and development models in Python.
- Knowledge of popular Python libraries/ORM libraries and frameworks.
- Exposure to unit testing frameworks such as pytest.
- Good understanding of Spark architecture, as the system involves data-intensive operations.
- Solid work experience in Spark performance tuning.
- Experience with or exposure to the Kafka messaging platform.
- Experience with build tools such as Maven and PyBuilder.
- Exposure to AWS offerings such as EC2, RDS, EMR, Lambda, S3, and Redis.
- Hands-on experience with at least one relational database (Oracle, Sybase, SQL Server, PostgreSQL).
- Hands-on experience with SQL queries and writing stored procedures and functions.
- A strong willingness to learn new technologies.
- Excellent communication skills, with strong verbal and writing proficiency.

Additional Preferred Qualifications:
- Proficiency in building data analytics solutions on AWS Cloud.
- Experience with microservice and serverless architecture implementation.
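For context, here is a minimal sketch of the kind of EMR Spark/Delta Lake workload this role describes: reading from a relational database over JDBC and writing a partitioned Delta table to cloud storage. The bucket, table names, and connection details are illustrative placeholders, not taken from the posting.

```python
# Minimal sketch of a PySpark + Delta Lake load job (all names are placeholders).
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder
    .appName("index-data-load")
    # Delta Lake support (assumes the delta-spark package is on the cluster)
    .config("spark.sql.extensions", "io.delta.sql.DeltaSparkSessionExtension")
    .config("spark.sql.catalog.spark_catalog",
            "org.apache.spark.sql.delta.catalog.DeltaCatalog")
    .getOrCreate()
)

# Pull reference data from a relational database over JDBC (placeholder URL/credentials).
prices = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://db-host:5432/indices")
    .option("dbtable", "public.daily_prices")
    .option("user", "reader").option("password", "***")
    .load()
)

# Light transformation, then write a partitioned Delta table to S3 (placeholder bucket).
daily = prices.withColumn("trade_date", F.to_date("trade_ts"))
(daily.write.format("delta")
      .mode("overwrite")
      .partitionBy("trade_date")
      .save("s3://example-bucket/curated/daily_prices"))
```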

Posted 2 months ago

Apply

3.0 - 6.0 years

10 - 17 Lacs

Pune

Remote

Kafka/MSK on Linux:
- In-depth understanding of Kafka broker configurations, ZooKeeper, and connectors.
- Understanding of Kafka topic design and creation.
- Good knowledge of replication and high availability for Kafka systems.
- ElasticSearch/OpenSearch.
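For readers unfamiliar with the topic-design and replication points above, here is an illustrative sketch of creating a topic with HA-oriented settings. It assumes the confluent-kafka Python client; the topic name, partition count, retention, and broker list are placeholders.

```python
# Illustrative only: creating a topic with replication/HA-oriented settings.
from confluent_kafka.admin import AdminClient, NewTopic

admin = AdminClient({"bootstrap.servers": "broker1:9092,broker2:9092,broker3:9092"})

topic = NewTopic(
    "orders",                      # placeholder topic name
    num_partitions=6,
    replication_factor=3,          # survives the loss of up to two brokers
    config={
        "min.insync.replicas": "2",   # with acks=all, writes need 2 in-sync replicas
        "cleanup.policy": "delete",
        "retention.ms": str(7 * 24 * 60 * 60 * 1000),  # 7 days
    },
)

# create_topics() is asynchronous and returns {topic_name: future}; wait for the result.
for name, future in admin.create_topics([topic]).items():
    try:
        future.result()
        print(f"created topic {name}")
    except Exception as exc:
        print(f"failed to create {name}: {exc}")
```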

Posted 2 months ago

Apply

6.0 - 11.0 years

12 - 30 Lacs

Hyderabad

Work from Office

Proficient in Java 8 and Kafka. Must have experience with JUnit test cases. Strong in Spring Boot, Microservices, SQL, ActiveMQ, and RESTful APIs.

Posted 2 months ago

Apply

10.0 - 15.0 years

12 - 16 Lacs

Pune, Bengaluru

Work from Office

We are seeking a talented and experienced Kafka Architect with experience migrating to Google Cloud Platform (GCP) to join our team. As a Kafka Architect, you will be responsible for designing, implementing, and managing our Kafka infrastructure to support our data processing and messaging needs, while also leading the migration of our Kafka ecosystem to GCP. You will work closely with our engineering and data teams to ensure seamless integration and optimal performance of Kafka on GCP.

Responsibilities:
- Discovery, analysis, planning, design, and implementation of Kafka deployments on GKE, with a specific focus on migrating Kafka from AWS to GCP.
- Design, architect, and implement scalable, high-performance Kafka architectures and clusters to meet our data processing and messaging requirements.
- Lead the migration of our Kafka infrastructure from on-premises or other cloud platforms to GCP.
- Conduct thorough discovery and analysis of existing Kafka deployments on AWS.
- Develop and implement best practices for Kafka deployment, configuration, and monitoring on GCP.
- Develop a comprehensive migration strategy for moving Kafka from AWS to GCP.
- Collaborate with engineering and data teams to integrate Kafka into our existing systems and applications on GCP.
- Optimize Kafka performance and scalability on GCP to handle large volumes of data and high throughput.
- Plan and execute the migration, ensuring minimal downtime and data integrity.
- Test and validate the migrated Kafka environment to ensure it meets performance and reliability standards.
- Ensure Kafka security on GCP by implementing authentication, authorization, and encryption mechanisms.
- Troubleshoot and resolve issues related to Kafka infrastructure and applications on GCP.
- Ensure seamless data flow between Kafka and other data sources/sinks.
- Implement monitoring and alerting mechanisms to ensure the health and performance of Kafka clusters.
- Stay up to date with Kafka developments and GCP services to recommend and implement new features and improvements.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred).
- Proven experience as a Kafka Architect or in a similar role, with a minimum of [5] years of experience.
- Deep knowledge of Kafka internals and the Kafka ecosystem, including Kafka Connect, Kafka Streams, and KSQL.
- In-depth knowledge of Apache Kafka architecture, internals, and ecosystem components.
- Proficiency in scripting and automation for Kafka management and migration.
- Hands-on experience with Kafka administration, including cluster setup, configuration, and tuning.
- Proficiency in the Kafka APIs, including Producer, Consumer, Streams, and Connect.
- Strong programming skills in Java, Scala, or Python.
- Experience with Kafka monitoring and management tools such as Confluent Control Center, Kafka Manager, or similar.
- Solid understanding of distributed systems, data pipelines, and stream processing.
- Experience leading migration projects to GCP, including migrating Kafka workloads.
- Familiarity with GCP services such as Google Kubernetes Engine (GKE), Google Cloud Storage, Google Cloud Pub/Sub, and BigQuery.
- Excellent communication and collaboration skills.
- Ability to work independently and manage multiple tasks in a fast-paced environment.
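As a point of reference for the authentication/encryption requirement above, below is a hedged sketch of a Kafka client configured for SASL_SSL using the confluent-kafka Python library. The endpoint, SASL mechanism, credentials, and topic are placeholders, not a prescription for this role.

```python
# Sketch of a producer configured for TLS encryption plus SASL authentication.
# All connection details and credentials are placeholders.
from confluent_kafka import Producer

conf = {
    "bootstrap.servers": "kafka.example.internal:9093",
    "security.protocol": "SASL_SSL",          # TLS in transit
    "sasl.mechanisms": "SCRAM-SHA-512",        # or PLAIN/OAUTHBEARER, depending on the cluster
    "sasl.username": "svc-producer",
    "sasl.password": "***",
    "ssl.ca.location": "/etc/kafka/ca.pem",    # CA used to verify the broker certificate
    "acks": "all",
    "enable.idempotence": True,
}

producer = Producer(conf)
producer.produce("audit-events", key=b"user-42", value=b'{"action": "login"}')
producer.flush()
```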

Posted 2 months ago

Apply

3.0 - 10.0 years

22 - 26 Lacs

Hyderabad

Work from Office

Skillsoft is the global leader in eLearning, trusted by the world's leading organizations, including 65% of the Fortune 500. Our 100,000+ courses, videos, and books are accessed over 100 million times every month, across more than 100 countries. At Skillsoft, we believe knowledge is the fuel for innovation and innovation is the fuel for business growth. Join us in our quest to democratize learning and help individuals unleash their edge.

Are you ready to shape the future of learning through cutting-edge AI? As a Principal AI/Machine Learning Engineer at Skillsoft, you'll dive into the heart of innovation, crafting intelligent systems that empower millions worldwide. From designing generative AI solutions to pioneering agentic workflows, you'll collaborate with multiple teams to transform knowledge into a catalyst for growth, unleashing your edge while helping others do the same. Join us in redefining eLearning for the world's leading organizations!

Responsibilities:
- Hands-on AI/ML software engineering
- Prompt engineering, agentic workflow development, and testing
- Work with product owners to understand requirements and guide new features
- Collaborate to identify new feature impacts
- Evaluate new AI/ML technology advancements and socialize findings
- Research, prototype, and select appropriate COTS products, and develop in-house AI/ML technology
- Consult with external partners to review and guide development and integration of AI technology
- Collaborate with teams to design and guide AI development and enhancements
- Document designs and implementations to ensure consistency and alignment with standards
- Create documentation, including system and sequence diagrams
- Create appropriate data pipelines for AI/ML training and inference
- Analyze, curate, cleanse, and preprocess data
- Utilize and apply generative AI to increase productivity for yourself and the organization
- Periodically explore new technologies and design patterns with proofs of concept
- Participate in developing best practices and improving operational processes
- Present research and work to socialize and share knowledge across the organization
- Contribute to patentable AI innovations

Environment, Tools & Technologies:
- Agile/Scrum
- Operating systems: Mac, Linux
- JavaScript, Node.js, Python
- PyTorch, TensorFlow, Keras, OpenAI, Anthropic, and friends
- LangChain, LangGraph, etc.
- APIs: GraphQL, REST
- Docker, Kubernetes
- Amazon Web Services (AWS), MS Azure
- SQL: Postgres RDS
- NoSQL: Cassandra, Elasticsearch (vector DB)
- Messaging: Kafka, RabbitMQ, SQS
- Monitoring: Prometheus, ELK
- GitHub, IDE (your choice)

Skills & Qualifications (8+ years of experience):
- Experience with LLMs and fine-tuning models
- Development experience, including unit testing
- Design and documentation experience for new APIs, data models, and service interactions
- Familiarity with, and the ability to explain: system and API security techniques; data privacy concerns; microservices architecture; vertical vs. horizontal scaling; generative AI, NLP, DNNs, auto-encoders, etc.

Attributes for Success:
- Proactive, independent, adaptable
- Collaborative team player
- Customer-service minded with an ownership mindset
- Excellent analytic and communication skills
- Ability and desire to coach and mentor other developers
- Passionate, curious, open to new ideas, and able to research and learn new technologies
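As a small illustration of the prompt-engineering side of this role, here is a toy sketch using the OpenAI Python client named in the tools list. The model name, prompt, and helper function are illustrative assumptions only, not part of the posting.

```python
# Toy prompt-engineering sketch: summarize a course description with a system prompt.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You are a learning-content assistant. Summarize the course description "
    "in three bullet points aimed at a busy professional."
)

def summarize_course(description: str) -> str:
    # Hypothetical helper; the model name below is a placeholder.
    response = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": description},
        ],
        temperature=0.2,
    )
    return response.choices[0].message.content
```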

Posted 2 months ago

Apply

5.0 - 10.0 years

10 - 18 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Role & Responsibilities:
- Administer and maintain Apache Kafka clusters, including installation, upgrades, configuration, and performance tuning.
- Design and implement Kafka topics, partitions, replication, and consumer groups.
- Ensure high availability and scalability of Kafka infrastructure in production environments.
- Monitor Kafka health and performance using tools like Prometheus, Grafana, Confluent Control Center, etc.
- Implement and manage security configurations such as SSL/TLS, authentication (Kerberos/SASL), and access control.
- Collaborate with development teams to design and configure Kafka-based integrations and data pipelines.
- Perform root cause analysis of production issues and ensure timely resolution.
- Create and maintain documentation for Kafka infrastructure and configurations.

Required Skills:
- Strong expertise in Kafka administration, including hands-on experience with open-source and/or Confluent Kafka.
- Experience with Kafka ecosystem tools (Kafka Connect, Kafka Streams, Schema Registry).
- Proficiency in Linux-based environments and scripting (Bash, Python).
- Experience with monitoring/logging tools and Kafka performance optimization.
- Ability to work independently and proactively manage Kafka environments.
- Familiarity with DevOps tools and CI/CD pipelines (e.g., Jenkins, Git, Ansible).

Preferred Skills:
- Experience with managed Kafka services on cloud platforms (AWS, GCP, or Azure).
- Knowledge of messaging alternatives like RabbitMQ, Pulsar, or ActiveMQ.
- Working knowledge of Docker and Kubernetes for Kafka deployment.
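To make the monitoring responsibility concrete, here is one possible way to spot-check consumer-group lag with the confluent-kafka Python client (an assumption; the posting does not name a client). The group, topic, broker, and partition count are placeholders.

```python
# Spot-check consumer lag for a group: compare committed offsets to end offsets.
from confluent_kafka import Consumer, TopicPartition

consumer = Consumer({
    "bootstrap.servers": "broker1:9092",
    "group.id": "payments-service",   # the group whose lag we want to inspect
    "enable.auto.commit": False,
})

partitions = [TopicPartition("payments", p) for p in range(6)]
committed = consumer.committed(partitions, timeout=10)

for tp in committed:
    low, high = consumer.get_watermark_offsets(tp, timeout=10)
    # If nothing has been committed yet, tp.offset is negative; fall back to the full range.
    lag = high - tp.offset if tp.offset >= 0 else high - low
    print(f"partition {tp.partition}: committed={tp.offset} end={high} lag={lag}")

consumer.close()
```

In practice this kind of check usually runs behind Prometheus exporters or Confluent Control Center dashboards, as the listing suggests; the script above is only a quick manual equivalent.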

Posted 2 months ago

Apply

3.0 - 5.0 years

7 - 12 Lacs

Bengaluru

Work from Office

As a Software Engineer, you will be responsible for designing, developing, and maintaining cloud-native applications and platforms. You will work on cloud security, microservices, container orchestration, and API-driven architectures while implementing best practices in software design and system scalability.

You have:
- Bachelor's or Master's degree in Electronics, Computer Science, Electrical Engineering, or a related field with 8+ years of work experience.
- Experience in container orchestration using Kubernetes, Helm, and OpenShift.
- Experience with API Gateway, Kafka Messaging, and Component Life Cycle Management.
- Expertise in the Linux platform, including Linux Containers, Namespaces, and CGroups.
- Experience in a scripting language (Perl/Python) and CI/CD tools (Jenkins, Git, Helm, and Ansible).

It would be nice if you also had:
- Familiarity with open-source PaaS environments like OpenShift.
- Experience with evolutionary architecture and microservices development.

You will design and develop software components based on cloud-native principles and leading PaaS platforms. You will implement scalable, secure, and resilient microservices and cloud-based applications. You will develop APIs and integrate with API gateways, message brokers (Kafka), and containerized environments. You will apply design patterns, domain-driven design (DDD), component-based architecture, and evolutionary architecture principles. You will lead the end-to-end development of features and EPICs, ensuring high performance and scalability. You will define and implement container management strategies, leveraging Kubernetes, OpenShift, and Helm.

Posted 2 months ago

Apply

6.0 - 10.0 years

15 - 30 Lacs

Pune

Work from Office

It's a pleasure speaking with you. Further to our discussion, please find the role details below for your reference. Please let us know your interest, along with your updated CV and the details below, to proceed further. Note: Work from Office.

Role: Java Developer (Spring Boot, Microservices, Kafka, AWS)
Skill: Java Backend Developer (Core Java + Microservices + Spring Boot + AWS); Java/Microservices/Spring Boot + AWS + Kafka should be strong.

What you will do:
• Design components by translating product requirements, break down the project into tasks, and provide accurate estimates.
• Take responsibility for the analysis, design, development, and delivery of software solutions.
• Plan, design, and develop technical solutions and alternatives to meet business requirements in adherence with MasterCard standards, processes, and best practices.
• Independently come up with different solutions and extensible low-level designs. Write modular, extensible, readable, and performant code.
• Choose the right data structures, tools, and tech stacks, and be able to do high-level design with guidance.
• Research new frameworks and technologies, assist with prototyping and proofs of concept, and participate in code reviews.
• Collaborate with teams by contributing to the shared vision and working closely with cross-functional stakeholders.
• Take ownership of the APIs you (and/or your team) build, and support them for as long as they are live in production.
• Be accountable for the full systems development life cycle, including creating high-quality requirements documents, use cases, designs, and other technical artifacts, including but not limited to detailed test strategy/test design, performance benchmarking, release rollout and deployment plans, contingency/back-out plans, feasibility studies, cost and time analysis, and detailed estimates.
• Research and evaluate current and upcoming technologies and frameworks. Participate in PoCs (proofs of concept) and help the department with the selection of vendor solutions, technologies, methodologies, and frameworks.

All About You:
• Excellent communication skills, with the ability to communicate with all levels of management.
• Ability to build rapport and relationships.
• A record of successful delivery of software applications as an individual.
• A problem solver with a solution-seeking approach.
• Knowledgeable, possessing the technical knowledge, market knowledge, and other specialized knowledge of the team's problem domain.
• Hands-on experience in building complex, highly scalable, and performant systems.
• Solid understanding of and hands-on experience with Java, J2EE, Spring, Spring Security, Spring Boot, and RESTful web services.
• Solid understanding of and experience with integrating web services.
• Exposure to building cloud-ready applications (microservices).
• Exposure to Test-Driven Development, Continuous Delivery, and Continuous Integration.
• Must be high-energy, detail-oriented, and proactive, and able to function under pressure in an independent environment.
• A high degree of initiative and self-motivation to drive results.
• Strong communication skills, both verbal and written, and strong relationship, collaboration, and organizational skills.
• Willingness and ability to learn, take on challenging opportunities, and work as a member of a matrix-based, diverse, and geographically distributed project team.
• Bachelor's degree in a computer-related discipline (BE/BTech).
• 6-10 years of work experience.

Thanks & Regards,
Sateesh Botcha
Talent Champion
IFIN Global Group
m: +91 (0) 7032009838
email: sateesh.b@ifinglobalgroup.com
https://www.linkedin.com/in/satish-botcha-99a466225

Posted 3 months ago

Apply

8 - 13 years

20 - 30 Lacs

Gurugram

Remote

Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale Java applications using Spring Boot.
- Collaborate with cross-functional teams to identify requirements and implement solutions.
- Ensure high availability, scalability, performance, security, and reliability of the application.
- Participate in code reviews and ensure adherence to coding standards.
- Troubleshoot complex issues related to application deployment on the AWS cloud platform.

Desired Candidate Profile:
- 8-13 years of experience in software development with expertise in the Java programming language.
- Bachelor's degree (B.Tech/B.E.) from a reputed institution, in any specialization.
- Strong understanding of architecture design patterns such as Microservices Architecture.
- Proficiency in technologies like AngularJS, the Kafka messaging system, and Elasticsearch.

Posted 3 months ago

Apply

8 - 13 years

20 - 30 Lacs

Pune, Gurugram

Hybrid

Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale Java applications using Spring Boot.
- Collaborate with cross-functional teams to identify requirements and implement solutions.
- Ensure high availability, scalability, performance, security, and reliability of the application.
- Participate in code reviews and ensure adherence to coding standards.
- Troubleshoot complex issues related to application deployment on the AWS cloud platform.

Desired Candidate Profile:
- 8-13 years of experience in software development with expertise in the Java programming language.
- Bachelor's degree (B.Tech/B.E.) from a reputed institution, in any specialization.
- Strong understanding of architecture design patterns such as Microservices Architecture.
- Proficiency in technologies like AngularJS, the Kafka messaging system, and Elasticsearch.

Posted 3 months ago

Apply

3 - 8 years

8 - 18 Lacs

Gurugram

Remote

Kafka/MSK on Linux:
- In-depth understanding of Kafka broker configurations, ZooKeeper, and connectors.
- Understanding of Kafka topic design and creation.
- Good knowledge of replication and high availability for Kafka systems.
- ElasticSearch/OpenSearch.

Posted 3 months ago

Apply

5 - 7 years

9 - 18 Lacs

Gurugram

Work from Office

Job Responsibilities:
- Develop, test, and maintain robust and scalable backend applications using Go (Golang).
- Design and implement RESTful APIs and gRPC services.
- Work with microservices architecture and containerization using Docker.
- Optimize application performance and ensure high availability and scalability.
- Collaborate with frontend developers, DevOps engineers, and other stakeholders.
- Implement and maintain CI/CD pipelines for seamless deployments.
- Write clean, maintainable, and well-documented code.
- Troubleshoot, debug, and upgrade existing applications.
- Ensure security best practices and code quality standards.
- Good understanding of concurrency/multi-threading patterns, memory-efficient implementations, and Go profiling and debugging methods.

Desired Skills and Experience:
- Strong experience in Go (Golang) programming.
- Proficiency in building microservices and APIs.
- Knowledge of database systems such as PostgreSQL, MySQL, MongoDB, or Redis.
- Experience with Docker, Kubernetes, and cloud platforms (AWS).
- Understanding of event-driven architecture and message brokers (Kafka, RabbitMQ, etc.).
- Familiarity with version control tools like Git.
- Experience with testing frameworks and best practices in Go.
- Knowledge of authentication and authorization mechanisms (OAuth, JWT, etc.).
- Ability to work in an Agile/Scrum development environment.
- Web services/backend testing: SOAP UI, Postman.
- Build and deployment on test environments.
- Working experience on Unix is a plus.

Posted 3 months ago

Apply

7.0 - 11.0 years

20 - 25 Lacs

Bengaluru

Work from Office

We are seeking a passionate and skilled Python Backend Data Engineer to design, develop, and maintain a hybrid data platform across multi-cloud and on-premises environments. You will collaborate with AI engineers, product managers, and software engineers to bring data-driven products and features to life.

Job description:
- Build high-performing, scalable, enterprise-grade applications.
- Manage Big Data application development across the full software development lifecycle.
- Cloud platforms experience (AWS, GCP, Azure).
- Experience with CI/CD pipelines for data engineering workflows.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Must have: Python 3, FastAPI, Spark, Iceberg.
- RESTful web services, Kafka messaging service.
- Experience with the Gen AI ecosystem/tools is a plus.
- Experience in data wrangling using Pandas/Polaris, NumPy, SQL, etc.
- NoSQL and SQL databases (Postgres/MySQL, Mongo with GridFS).
- Modern authorization mechanisms (JSON Web Token, OAuth2).

Preferred technical and professional experience:
- Object-oriented analysis and design (DDD, microservices).
- Understanding of cloud platforms (AWS, GCP, Azure) is a plus.
- Any exposure to BI tools is a plus.
- Problem-solving: excellent analytical and problem-solving skills, with the ability to think critically and creatively.
- Communication: strong interpersonal and communication skills in English, with the ability to work effectively in a collaborative team environment.
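A minimal sketch of the FastAPI-plus-Kafka combination this listing asks for, assuming the confluent-kafka client and Pydantic v2; the endpoint, topic, broker address, and payload shape are illustrative placeholders.

```python
# Minimal FastAPI service that publishes incoming events to Kafka (placeholder names).
import json

from fastapi import FastAPI
from pydantic import BaseModel
from confluent_kafka import Producer

app = FastAPI()
producer = Producer({"bootstrap.servers": "localhost:9092"})


class Event(BaseModel):
    source: str
    payload: dict


@app.post("/events")
def publish_event(event: Event):
    # Fire-and-forget publish; poll(0) services delivery callbacks without blocking.
    producer.produce("platform-events", value=json.dumps(event.model_dump()).encode())
    producer.poll(0)
    return {"status": "queued"}
```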

Posted Date not available

Apply

2.0 - 6.0 years

5 - 10 Lacs

Bengaluru

Work from Office

We are seeking a passionate and skilled Python Backend Data Engineer to design, develop, and maintain a hybrid data platform across multi-cloud and on-premises environments. You will collaborate with AI engineers, product managers, and software engineers to bring data-driven products and features to life.

Job description:
- Build high-performing, scalable, enterprise-grade applications.
- Manage Big Data application development across the full software development lifecycle.
- Cloud platforms experience (AWS, GCP, Azure).
- Experience with CI/CD pipelines for data engineering workflows.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Must have: Python 3, FastAPI, Spark, Iceberg.
- RESTful web services, Kafka messaging service.
- Experience with the Gen AI ecosystem/tools is a plus.
- Experience in data wrangling using Pandas/Polaris, NumPy, SQL, etc.
- NoSQL and SQL databases (Postgres/MySQL, Mongo with GridFS).
- Modern authorization mechanisms (JSON Web Token, OAuth2).

Preferred technical and professional experience:
- Object-oriented analysis and design (DDD, microservices).
- Understanding of cloud platforms (AWS, GCP, Azure) is a plus.
- Any exposure to BI tools is a plus.
- Problem-solving: excellent analytical and problem-solving skills, with the ability to think critically and creatively.
- Communication: strong interpersonal and communication skills in English, with the ability to work effectively in a collaborative team environment.

Posted Date not available

Apply

1.0 - 3.0 years

4 - 8 Lacs

Bengaluru

Work from Office

We are seeking a passionate and skilled Python Backend Data Engineer to design, develop, and maintain a hybrid data platform across multi-cloud and on-premises environments. You will collaborate with AI engineers, product managers, and software engineers to bring data-driven products and features to life.

Job description:
- Build high-performing, scalable, enterprise-grade applications.
- Manage Big Data application development across the full software development lifecycle.
- Cloud platforms experience (AWS, GCP, Azure).
- Experience with CI/CD pipelines for data engineering workflows.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Must have: Python 3, FastAPI, Spark, Iceberg.
- RESTful web services, Kafka messaging service.
- Experience with the Gen AI ecosystem/tools is a plus.
- Experience in data wrangling using Pandas/Polaris, NumPy, SQL, etc.
- NoSQL and SQL databases (Postgres/MySQL, Mongo with GridFS).
- Modern authorization mechanisms (JSON Web Token, OAuth2).

Preferred technical and professional experience:
- Object-oriented analysis and design (DDD, microservices).
- Understanding of cloud platforms (AWS, GCP, Azure) is a plus.
- Any exposure to BI tools is a plus.
- Problem-solving: excellent analytical and problem-solving skills, with the ability to think critically and creatively.
- Communication: strong interpersonal and communication skills in English, with the ability to work effectively in a collaborative team environment.

Posted Date not available

Apply

8.0 - 10.0 years

19 - 20 Lacs

Mumbai

Work from Office

9-10+ years of experience. Strong Java skills are a must, along with hands-on experience with cache frameworks like Redis/Elastic, messaging frameworks like Kafka and MQ, and Observability & Telemetry. Exposure to Data & Analytics patterns, Data Lakes, and Distributed Enterprise Data will be an advantage.

Posted Date not available

Apply

6.0 - 10.0 years

15 - 22 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Seeking Java Microservices Developer (6–9 yrs) to modernize legacy apps using Java, Spring Boot, REST APIs. Work on microservices, DB optimization, Docker, CI/CD, API docs, testing, and Agile delivery. Groovy/Grails knowledge a plus.

Posted Date not available

Apply

2.0 - 4.0 years

6 - 12 Lacs

Mumbai

Work from Office

About the Role
We are seeking a passionate Spring Boot Java Developer with 2-4 years of professional experience to join our engineering team. You will be responsible for developing and maintaining high-performance, scalable applications using modern Java frameworks. The ideal candidate should be comfortable working with microservices architecture and building REST APIs for high-volume, high-availability systems.

Key Responsibilities:
- Design, develop, and maintain Spring Boot microservices.
- Implement business logic in Java using Lombok, JPA, and Hibernate.
- Integrate and work with MySQL and PostgreSQL databases.
- Apply caching strategies for performance optimization.
- Work with Kafka and RabbitMQ for messaging and event-driven architectures.
- Use Swagger/OpenAPI for API documentation and develop REST APIs.
- Collaborate with cross-functional teams to understand requirements and deliver robust solutions.
- Ensure high code quality using version control (Git) and code review best practices.
- Troubleshoot production issues and perform root cause analysis.
- Write unit and integration tests to maintain software reliability.
- Work in an Agile/Scrum environment for iterative development.

Required Skills & Experience:
- 2-4 years of hands-on Java development experience.
- Strong understanding of Spring Boot, JPA/Hibernate, and microservices architecture.
- Proficiency with Lombok to simplify Java code.
- Solid experience with MySQL and PostgreSQL.
- Experience with Kafka and RabbitMQ for asynchronous processing.
- Knowledge of caching frameworks (e.g., Redis, Ehcache).
- Experience with Git for version control.
- Exposure to API documentation tools like Swagger/OpenAPI.
- Good problem-solving skills and the ability to write clean, maintainable code.
- Experience with product development for large data volumes and high-availability systems is a strong plus.

Posted Date not available

Apply

7.0 - 12.0 years

8 - 18 Lacs

Bengaluru, Delhi/NCR, Mumbai (All Areas)

Work from Office

7+ years’ experience (3+ in Kafka – Apache, Confluent, MSK – & RabbitMQ) with strong skills in monitoring, optimization, and incident resolution. Proficient in brokers, connectors, Zookeeper/KRaft, schema registry, and middleware performance metrics.

Posted Date not available

Apply

8.0 - 13.0 years

9 - 14 Lacs

Bengaluru

Work from Office

You have:
- Bachelor's or Master's degree in Electronics, Computer Science, Electrical Engineering, or a related field with 8+ years of work experience.
- Experience in container orchestration using Kubernetes, Helm, and OpenShift.
- Experience with API Gateway, Kafka Messaging, and Component Life Cycle Management.
- Expertise in the Linux platform, including Linux Containers, Namespaces, and CGroups.
- Experience in a scripting language (Perl/Python) and CI/CD tools (Jenkins, Git, Helm, and Ansible).

It would be nice if you also had:
- Familiarity with open-source PaaS environments like OpenShift.
- Experience with evolutionary architecture and microservices development.

You will autonomously perform tasks with a moderate level of guidance, within guidelines and policies. You will design and develop software components based on cloud-native principles and leading PaaS platforms. You will implement scalable, secure, and resilient microservices and cloud-based applications. You will develop APIs and integrate with API gateways, message brokers (Kafka), and containerized environments. You will lead the end-to-end development of features and EPICs, ensuring high performance and scalability. You will define and implement container management strategies, leveraging Kubernetes, OpenShift, and Helm.

Posted Date not available

Apply

5.0 - 10.0 years

5 - 15 Lacs

Bengaluru

Work from Office

J2EE, Spring Boot, RESTful API, RDBMS, CI/CD, SQL, TDD, MQ, Kafka

Posted Date not available

Apply

5.0 - 10.0 years

5 - 15 Lacs

Mumbai

Work from Office

J2EE, Spring Boot, RESTful API, RDBMS, CI/CD, SQL, TDD, MQ, Kafka

Posted Date not available

Apply

5.0 - 10.0 years

5 - 15 Lacs

Pune

Work from Office

J2EE, Spring Boot, RESTful API, RDBMS, CI/CD, SQL, TDD, MQ, Kafka

Posted Date not available

Apply

5.0 - 10.0 years

5 - 15 Lacs

Chennai

Work from Office

J2EE, Spring Boot, RESTful API, RDBMS, CI/CD, SQL, TDD, MQ, Kafka

Posted Date not available

Apply
Page 2 of 2