
42 Kafka Cluster Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

15 - 20 Lacs

Noida, Gurugram

Work from Office

Manage end-to-end activities including installation, configuration, and maintenance of Kafka clusters; manage topics configured in Kafka; and ensure maximum uptime. Monitor the performance of producer and consumer threads interacting with Kafka. Required candidate profile: Kafka certification; hands-on experience managing large Kafka clusters and installations; ability to monitor the performance of producer and consumer threads interacting with Kafka.

Posted 2 months ago


5.0 - 10.0 years

5 - 10 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities: Confluent Kafka Administrator / Technology Lead with 5+ years of experience. Required skills & experience: hands-on experience in Kafka cluster management; proficiency with Kafka Connect; knowledge of Cluster Linking and MirrorMaker; experience setting up Kafka clusters from scratch; experience with Terraform/Ansible scripts; ability to install and configure the Confluent Platform; understanding of rebalancing, Schema Registry, and REST Proxies; familiarity with RBAC (Role-Based Access Control) and ACLs (Access Control Lists). Interested candidates, share your updated resume at recruiter.wtr26@walkingtree.in.

Posted 2 months ago


6.0 - 11.0 years

12 - 30 Lacs

Hyderabad

Work from Office

Proficient in Java 8 and Kafka. Must have experience with JUnit test cases. Strong in Spring Boot, Microservices, SQL, ActiveMQ, and RESTful APIs.

Posted 2 months ago


4.0 - 8.0 years

27 - 42 Lacs

Hyderabad

Work from Office

Job Summary
We are looking for an experienced Infra Dev Specialist with 4 to 8 years of experience to join our team. The ideal candidate will have expertise in KSQL, Kafka Schema Registry, Kafka Connect, and Kafka. This role involves working in a hybrid model with day shifts and does not require travel. The candidate will play a crucial role in developing and maintaining our infrastructure to ensure seamless data flow and integration.

Responsibilities
Develop and maintain infrastructure solutions using KSQL, Kafka Schema Registry, Kafka Connect, and Kafka. Oversee the implementation of data streaming and integration solutions to ensure high availability and performance. Provide technical support and troubleshooting for Kafka-related issues to minimize downtime and ensure data integrity. Collaborate with cross-functional teams to design and implement scalable and reliable data pipelines. Monitor and optimize the performance of Kafka clusters to meet the demands of the business. Ensure compliance with security and data governance policies while managing Kafka infrastructure. Implement best practices for data streaming and integration to enhance system efficiency. Conduct regular reviews and updates of the infrastructure to align with evolving business needs. Provide training and support to team members on Kafka-related technologies and best practices. Develop and maintain documentation for infrastructure processes and configurations. Participate in code reviews and contribute to the continuous improvement of the development process. Stay updated with the latest trends and advancements in Kafka and related technologies.

Qualifications
Strong experience in KSQL, Kafka Schema Registry, Kafka Connect, and Kafka. Solid understanding of data streaming and integration concepts. Proven track record of troubleshooting and resolving Kafka-related issues. Expertise in designing and implementing scalable data pipelines. Knowledge of security and data governance practices in managing Kafka infrastructure. Proficiency in monitoring and optimizing Kafka cluster performance. Experience providing technical support and training to team members. Skilled in developing and maintaining infrastructure documentation. Excellent communication and collaboration skills. Proactive approach to problem-solving and continuous improvement. Ability to work effectively in a hybrid work model.

Certifications Required
Certified Apache Kafka Developer

Posted 2 months ago


4.0 - 7.0 years

8 - 13 Lacs

Pune

Work from Office

Responsibilities: * Monitor Kafka clusters for performance & availability * Manage Kafka broker instances & replication strategies * Collaborate with dev teams on data pipeline design & implementation. Benefits: food allowance, health insurance, provident fund, annual bonus.

Posted 2 months ago


5.0 - 10.0 years

6 - 11 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Role & responsibilities: Kafka Administrator with around 5+ years of experience. Required skills & experience: hands-on experience in Kafka cluster management; proficiency with Kafka Connect; knowledge of Cluster Linking and MirrorMaker; experience setting up Kafka clusters from scratch; experience with Terraform/Ansible scripts; ability to install and configure the Confluent Platform; understanding of rebalancing, Schema Registry, and REST Proxies; familiarity with RBAC (Role-Based Access Control) and ACLs (Access Control Lists). Share your updated resume at recruiter.wtr26@walkingtree.in.

Posted 2 months ago


5.0 - 10.0 years

10 - 18 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Role & responsibilities Administer and maintain Apache Kafka clusters , including installation, upgrades, configuration, and performance tuning. Design and implement Kafka topics, partitions, replication , and consumer groups. Ensure high availability and scalability of Kafka infrastructure in production environments. Monitor Kafka health and performance using tools like Prometheus, Grafana, Confluent Control Center , etc. Implement and manage security configurations such as SSL/TLS, authentication (Kerberos/SASL), and access control. Collaborate with development teams to design and configure Kafka-based integrations and data pipelines . Perform root cause analysis of production issues and ensure timely resolution. Create and maintain documentation for Kafka infrastructure and configurations. Required Skills: Strong expertise in Kafka administration , including hands-on experience with open-source and/or Confluent Kafka . Experience with Kafka ecosystem tools (Kafka Connect, Kafka Streams, Schema Registry). Proficiency in Linux-based environments and scripting (Bash, Python). Experience with monitoring/logging tools and Kafka performance optimization. Ability to work independently and proactively manage Kafka environments. Familiarity with DevOps tools and CI/CD pipelines (e.g., Jenkins, Git, Ansible). Preferred Skills: Experience with cloud platforms (AWS, GCP, or Azure) Kafka services. Knowledge of messaging alternatives like RabbitMQ, Pulsar, or ActiveMQ . Working knowledge of Docker and Kubernetes for Kafka deployment.
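As a hedged illustration of the security configurations this role manages, the sketch below builds a client-side SASL_SSL configuration as a plain dictionary. Broker hosts, the CA path, and the credentials are placeholders (not from any listing), and the key names follow the common librdkafka-style convention.

```python
# Hypothetical client security settings for a SASL_SSL-protected cluster.
# Hostnames, file paths, and credentials are placeholders.
client_config = {
    "bootstrap.servers": "broker1:9093,broker2:9093,broker3:9093",
    "security.protocol": "SASL_SSL",
    "sasl.mechanism": "SCRAM-SHA-512",
    "sasl.username": "app-user",        # placeholder principal
    "sasl.password": "change-me",       # placeholder secret
    "ssl.ca.location": "/etc/kafka/ca.pem",
}

def redact(config):
    """Return a copy of a client config safe for logging, secrets masked."""
    secret_keys = {"sasl.password", "ssl.key.password"}
    return {k: ("***" if k in secret_keys else v) for k, v in config.items()}

print(redact(client_config)["sasl.password"])  # prints ***
```

Redacting before logging is a small habit that matters in practice: Kafka client configs are frequently dumped into startup logs, and masking secrets keeps them out of log aggregators.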

Posted 2 months ago


5.0 - 10.0 years

16 - 19 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Detailed job description - Skill set: proven experience as a Kafka Developer; knowledge of Kafka schemas and use of the Schema Registry; strong knowledge of Kafka and other big data technologies; best practices to optimize the Kafka ecosystem based on use case and workload; knowledge of Kafka clustering and its fault-tolerance model supporting high availability; strong fundamentals in Kafka client configuration and troubleshooting. Responsibilities: design and implement data pipelines using Apache Kafka; develop and maintain Kafka-based data pipelines; monitor and optimize Kafka clusters; troubleshoot and resolve issues related to Kafka and data processing; ensure data security and compliance with industry standards; create and maintain documentation for Kafka configurations and processes; implement best practices for Kafka architecture and operations. Mandatory skills (only 2 or 3): Kafka Developer.

Posted 2 months ago


3 - 8 years

8 - 18 Lacs

Gurugram

Remote

Kafka/MSK, Linux. In-depth understanding of Kafka broker configurations, ZooKeeper, and connectors. Understanding of Kafka topic design and creation. Good knowledge of replication and high availability for Kafka systems. ElasticSearch/OpenSearch.
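The topic-design and replication knowledge asked for here boils down to a few invariants. As a minimal sketch (the numeric defaults are illustrative, not from the listing), the checks below encode the usual rules: a topic's replication factor cannot exceed the broker count, and min.insync.replicas must not exceed the replication factor.

```python
def validate_topic(partitions, replication_factor, min_insync_replicas, brokers):
    """Basic sanity checks for a Kafka topic design.

    Rules reflected here:
    - the replication factor cannot exceed the number of brokers;
    - min.insync.replicas must be <= the replication factor (for durable
      acks=all writes it is typically replication_factor - 1 or lower).
    """
    errors = []
    if replication_factor > brokers:
        errors.append("replication factor exceeds broker count")
    if min_insync_replicas > replication_factor:
        errors.append("min.insync.replicas exceeds replication factor")
    if partitions < 1:
        errors.append("need at least one partition")
    return errors

# A common HA layout: 3 brokers, RF=3, min.insync.replicas=2 -> no errors.
assert validate_topic(12, 3, 2, brokers=3) == []
# RF=4 on a 3-broker cluster is rejected.
assert "replication factor exceeds broker count" in validate_topic(6, 4, 2, brokers=3)
```

The RF=3 / min.insync.replicas=2 combination is the usual high-availability compromise: one broker can fail without blocking acks=all producers.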

Posted 3 months ago


7.0 - 12.0 years

8 - 18 Lacs

Bengaluru, Delhi/NCR, Mumbai (All Areas)

Work from Office

7+ years’ experience (3+ in Kafka – Apache, Confluent, MSK – and RabbitMQ), with strong skills in monitoring, optimization, and incident resolution. Proficient in brokers, connectors, ZooKeeper/KRaft, Schema Registry, and middleware performance metrics.

Posted Date not available


6.0 - 11.0 years

13 - 17 Lacs

Pune

Work from Office

We are looking for a skilled professional with experience leading the deployment and operational management of high-performance Apache Kafka clusters. The ideal candidate will have a strong background in IT Services & Consulting, with expertise in managing large-scale data processing systems. Roles and Responsibilities: Design, deploy, and manage high-performance Apache Kafka clusters to meet business requirements. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and implement automated testing frameworks to ensure system reliability and scalability. Troubleshoot complex issues related to Kafka cluster performance and availability. Provide technical guidance and mentorship to junior team members on Kafka technologies. Ensure compliance with industry standards and best practices for data security and governance. Job Requirements: Strong understanding of Apache Kafka architecture and ecosystem. Experience with distributed messaging queues and real-time data processing. Proficiency in programming languages such as Java or Python. Excellent problem-solving skills and ability to work under pressure. Strong communication and collaboration skills. Ability to design and implement scalable solutions for large-scale data processing systems.

Posted Date not available


8.0 - 13.0 years

30 - 45 Lacs

Pune

Remote

Mandate skillsets- Debezium, Oracle connector (LogMiner)., Kafka. Review the current Debezium deployment architecture, including Oracle connector configuration, Kafka integration, and downstream consumers. Analyze Oracle database setup for CDC compatibility (e.g., redo log configuration, supplemental logging, privileges). Evaluate connector performance, lag, and error handling mechanisms. Identify bottlenecks, misconfigurations, or anti-patterns in the current implementation. Provide a detailed report with findings, best practices, and actionable recommendations. Optionally, support implementation of recommended changes and performance tuning. Experience: Strong hands-on experience with Debezium, especially the Oracle connector (LogMiner). Deep understanding of Oracle internals relevant to CDC: redo logs, SCNs, archive log mode, supplemental logging. Proficiency with Apache Kafka and Kafka ecosystem tools. Experience with monitoring and debugging Debezium connectors in production environments. Ability to analyze logs, metrics, and connector configurations to identify root causes of issues. Strong documentation and communication skills for delivering technical assessments
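The review scope above centers on the connector configuration itself. As a hedged illustration only (host, credentials, and table list are placeholders, and exact property names vary across Debezium versions), a typical Oracle/LogMiner connector registration payload looks roughly like this:

```python
import json

# Sketch of a Debezium Oracle (LogMiner) connector registration payload,
# as it would be POSTed to the Kafka Connect REST API. All connection
# details below are placeholders.
connector = {
    "name": "oracle-cdc-connector",
    "config": {
        "connector.class": "io.debezium.connector.oracle.OracleConnector",
        "database.hostname": "oracle-host",   # placeholder
        "database.port": "1521",
        "database.user": "c##dbzuser",        # CDC user with LogMiner privileges
        "database.password": "change-me",     # placeholder
        "database.dbname": "ORCLCDB",
        "topic.prefix": "oracle",
        "table.include.list": "INVENTORY.CUSTOMERS",
        "log.mining.strategy": "online_catalog",
    },
}

payload = json.dumps(connector, indent=2)
```

A review of the kind described would check exactly these fields against the database side: that the CDC user has the required privileges, that supplemental logging covers the tables in table.include.list, and that the mining strategy matches the redo-log setup.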

Posted Date not available


7.0 - 11.0 years

15 - 20 Lacs

Bengaluru

Work from Office

Hiring for a Middleware Admin in Bangalore with 7+ years of experience in the below skills. Must have: RabbitMQ/Kafka clusters; monitor health, troubleshoot latency/downtime, and optimize performance; automate ops with Shell/Python, Ansible, Terraform, CI/CD. Required candidate profile: has implemented security (SASL/Kerberos, SSL/TLS, RBAC) and compliance controls; immediate joiner; strong communication; ready to work from the client office 5 days every week.

Posted Date not available


5.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

JD: Kafka. Design and develop Kafka / Kafka Connect data pipelines and integration solutions. Implement and manage Kafka producers, consumers, topics, and partitions. Integrate Kafka with external systems (e.g., databases, APIs, cloud services). Optimize Kafka performance for throughput, latency, and reliability. Collaborate with DevOps, data engineering, and application teams. Monitor and troubleshoot Kafka clusters and streaming applications. Ensure data security, compliance, and governance in Kafka implementations. Maintain documentation for Kafka configurations, schemas, and processes. Mandatory skills: Kafka integration. Experience: 5-8 years.

Posted Date not available


10.0 - 15.0 years

35 - 40 Lacs

Pune

Work from Office

Experience Required : 10+ years overall, with 5+ years in Kafka infrastructure management and operations. Must have successfully deployed and maintained Kafka clusters in production environments, with proven experience in securing, monitoring, and scaling Kafka for enterprise-grade data streaming. Overview : We are seeking an experienced Kafka Administrator to lead the deployment, configuration, and operational management of Apache Kafka clusters supporting real-time data ingestion pipelines. The role involves ensuring secure, scalable, and highly available Kafka infrastructure for streaming flow records into centralized data platforms. Role & responsibilities Architect and deploy Apache Kafka clusters with high availability. Implement Kafka MirrorMaker for cross-site replication and disaster recovery readiness. Integrate Kafka with upstream flow record sources using IPFIX-compatible plugins. Configure Kafka topics, partitions, replication, and retention policies based on data flow requirements. Set up TLS/SSL encryption, Kerberos authentication, and access control using Apache Ranger. Monitor Kafka performance using Prometheus, Grafana, or Cloudera Manager and ensure proactive alerting. Perform capacity planning, cluster upgrades, patching, and performance tuning. Ensure audit logging, compliance with enterprise security standards, and integration with SIEM tools. Collaborate with solution architects and Kafka developers to align infrastructure with data pipeline needs. Maintain operational documentation, SOPs, and support SIT/UAT and production rollout activities. Preferred candidate profile Proven experience in Apache Kafka, Kafka Connect, Kafka Streams, and Schema Registry. Strong understanding of IPFIX, nProbe Cento, and network flow data ingestion. Hands-on experience with Apache Spark (Structured Streaming) and modern data lake or DWH platforms. Familiarity with Cloudera Data Platform, HDFS, YARN, Ranger, and Knox. 
Deep knowledge of data security protocols, encryption, and governance frameworks. Excellent communication, documentation, and stakeholder management skills.
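The capacity-planning duty mentioned above is often reduced to a back-of-the-envelope sizing formula. As a rough sketch under stated assumptions (the 20% overhead factor and the example throughput numbers are illustrative, not from the listing), a retention-based topic's disk footprint can be estimated like this:

```python
def retention_disk_gb(msgs_per_sec, avg_msg_bytes, retention_hours,
                      replication_factor, overhead=1.2):
    """Rough disk estimate (GiB) for a retention-based Kafka topic.

    Every byte is stored once per replica; `overhead` (an assumed 20%
    cushion) covers index files, log segments awaiting deletion, and
    burst traffic above the average rate.
    """
    raw_bytes = msgs_per_sec * avg_msg_bytes * retention_hours * 3600
    return raw_bytes * replication_factor * overhead / 1024**3

# Example: 10,000 msgs/s of 1 KiB records, kept for 72 hours, RF=3.
estimate = retention_disk_gb(10_000, 1024, 72, 3)
```

With those inputs the estimate lands near 8.9 TiB across the cluster, which is why retention policy, compression, and partition spread are planned together rather than in isolation.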

Posted Date not available


10.0 - 13.0 years

30 - 40 Lacs

Pune

Work from Office

Experience Required : 10+ years overall, with 5+ years in Kafka infrastructure management and operations. Must have successfully deployed and maintained Kafka clusters in production environments, with proven experience in securing, monitoring, and scaling Kafka for enterprise-grade data streaming. Overview : We are seeking an experienced Kafka Administrator to lead the deployment, configuration, and operational management of Apache Kafka clusters supporting real-time data ingestion pipelines. The role involves ensuring secure, scalable, and highly available Kafka infrastructure for streaming flow records into centralized data platforms. Role & responsibilities Architect and deploy Apache Kafka clusters with high availability. Implement Kafka MirrorMaker for cross-site replication and disaster recovery readiness. Integrate Kafka with upstream flow record sources using IPFIX-compatible plugins. Configure Kafka topics, partitions, replication, and retention policies based on data flow requirements. Set up TLS/SSL encryption, Kerberos authentication, and access control using Apache Ranger. Monitor Kafka performance using Prometheus, Grafana, or Cloudera Manager and ensure proactive alerting. Perform capacity planning, cluster upgrades, patching, and performance tuning. Ensure audit logging, compliance with enterprise security standards, and integration with SIEM tools. Collaborate with solution architects and Kafka developers to align infrastructure with data pipeline needs. Maintain operational documentation, SOPs, and support SIT/UAT and production rollout activities. Preferred candidate profile Proven experience in Apache Kafka, Kafka Connect, Kafka Streams, and Schema Registry. Strong understanding of IPFIX, nProbe Cento, and network flow data ingestion. Hands-on experience with Apache Spark (Structured Streaming) and modern data lake or DWH platforms. Familiarity with Cloudera Data Platform, HDFS, YARN, Ranger, and Knox. 
Deep knowledge of data security protocols, encryption, and governance frameworks. Excellent communication, documentation, and stakeholder management skills.

Posted Date not available


10.0 - 13.0 years

30 - 40 Lacs

Pune

Work from Office

Experience Required : 10+ years overall, with 5+ years in Kafka-based data streaming development. Must have delivered production-grade Kafka pipelines integrated with real-time data sources and downstream analytics platforms. Overview : We are looking for a Kafka Developer to design and implement real-time data ingestion pipelines using Apache Kafka. The role involves integrating with upstream flow record sources, transforming and validating data, and streaming it into a centralized data lake for analytics and operational intelligence. Role & responsibilities Develop Kafka producers to ingest flow records from upstream systems such as flow record exporters (e.g., IPFIX-compatible probes). Build Kafka consumers to stream data into Spark Structured Streaming jobs and downstream data lakes. Define and manage Kafka topic schemas using Avro and Schema Registry for schema evolution. Implement message serialization, transformation, enrichment, and validation logic within the streaming pipeline. Ensure exactly once processing, checkpointing, and fault tolerance in streaming jobs. Integrate with downstream systems such as HDFS or Parquet-based data lakes, ensuring compatibility with ingestion standards. Collaborate with Kafka administrators to align topic configurations, retention policies, and security protocols. Participate in code reviews, unit testing, and performance tuning to ensure high-quality deliverables. Document pipeline architecture, data flow logic, and operational procedures for handover and support. Preferred candidate profile Proven experience in developing Kafka producers and consumers for real-time data ingestion pipelines. Strong hands-on expertise in Apache Kafka, Kafka Connect, Kafka Streams, and Schema Registry. Proficiency in Apache Spark (Structured Streaming) for real-time data transformation and enrichment. Solid understanding of IPFIX, NetFlow, and network flow data formats; experience integrating with nProbe Cento is a plus. 
Experience with Avro, JSON, or Protobuf for message serialization and schema evolution. Familiarity with Cloudera Data Platform components such as HDFS, Hive, YARN, and Knox. Experience integrating Kafka pipelines with data lakes or warehouses using Parquet or Delta formats. Strong programming skills in Scala, Java, or Python for stream processing and data engineering tasks. Knowledge of Kafka security protocols including TLS/SSL, Kerberos, and access control via Apache Ranger. Experience with monitoring and logging tools such as Prometheus, Grafana, and Splunk. Understanding of CI/CD pipelines, Git-based workflows, and containerization (Docker/Kubernetes)
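The exactly-once and checkpointing requirement above usually comes down to making redelivered records harmless. As a minimal, broker-free sketch (the tuple-based record shape and the in-memory `applied` set are simplifications; a real pipeline persists this state atomically with its output), at-least-once delivery becomes effectively exactly-once by deduplicating on (partition, offset):

```python
def process_batch(records, applied, sink):
    """Apply records idempotently; `records` is an iterable of
    (partition, offset, value) tuples, `applied` the set of keys
    already processed, `sink` the downstream side effect."""
    for partition, offset, value in records:
        key = (partition, offset)
        if key in applied:        # redelivery after a crash or rebalance
            continue
        sink.append(value)        # the side effect we must not repeat
        applied.add(key)

applied, sink = set(), []
batch = [(0, 41, "a"), (0, 42, "b")]
process_batch(batch, applied, sink)
process_batch(batch, applied, sink)   # a redelivered batch is a no-op
```

Spark Structured Streaming's checkpointing plays the role of `applied` here: offsets and state are committed together so that replays after failure do not duplicate output.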

Posted Date not available

Page 2 of 2