9.0 - 14.0 years
15 - 18 Lacs
navi mumbai
Work from Office
We are hiring an experienced Messaging Systems Specialist (Kafka & RabbitMQ) with 8+ years of expertise in administration, architecture, and performance tuning. The role involves deploying, managing, and securing messaging systems on Kubernetes and VMs. Required Candidate profile: a professional with 8+ years of experience in Kafka & RabbitMQ administration, strong Kubernetes/Docker skills, Kafka (Confluent) certification, and proven ability in performance tuning and scaling.
Posted 3 days ago
2.0 - 6.0 years
0 Lacs
maharashtra
On-site
As a dedicated L1 Kafka Administrator at Ashnik, you will be an integral part of our 24/7 managed services team. Your primary responsibility will be to provide first-level support for a large Kafka cluster, ensuring its smooth operation and promptly addressing any issues that may arise. We are looking for a proactive individual with strong problem-solving skills and a keen interest in Kafka administration.

Preferred Qualifications:
- 4 to 5 years of industry experience
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 2 to 3 years of relevant experience

Key Responsibilities:
- Monitor Kafka clusters for performance, availability, and security.
- Perform routine operational checks to ensure the health of Kafka brokers, topics, and partitions.
- Respond to alerts and incidents, troubleshoot issues, and escalate to L2 support as needed.
- Assist in the deployment and configuration of Kafka brokers and other components.
- Maintain and update documentation related to Kafka operations and incident management.
- Work closely with the L2 support team to understand ongoing issues and learn from resolutions.
- Ensure compliance with service level agreements (SLAs) and operational processes.
- Participate in shift rotations to provide 24/7 support coverage.
- Provide initial triage and troubleshooting for Kafka-related issues and incidents.

Requirements:
- Basic understanding of Confluent Kafka architecture and its components.
- Familiarity with Linux/Unix operating systems and basic system administration tasks.
- Experience with monitoring and logging tools (e.g., Prometheus, Grafana, Kibana).
- Strong problem-solving skills and the ability to work under pressure.
- Excellent communication and teamwork skills.
- Willingness to work in a 24/7 shift environment, including nights, weekends, and holidays.
- Experience with any messaging system (e.g., RabbitMQ, ActiveMQ).
- Basic knowledge of scripting languages (e.g., Bash, Python) for automation tasks.
- Understanding of networking concepts and protocols.
- Prior experience in a similar support role, preferably in a managed services environment.

Join Ashnik and embark on a journey where you can establish yourself as a thought leader in the open-source space. Engage in transformative projects that shape the future of enterprise technology and broaden your technical and strategic skill set through diverse, hands-on engagements. Collaborate with industry pioneers and contribute directly to high-impact digital transformation projects for leading enterprises. Take advantage of benefits that promote well-being, flexibility, and growth while accelerating your career in a dynamic and supportive environment.

About Ashnik: Founded in 2009, Ashnik is a leading provider of enterprise open-source solutions across Southeast Asia and India. We empower organizations to modernize their infrastructure, accelerate innovation, and drive digital transformation with comprehensive offerings in Solutions, Services, Support, and Subscriptions. Partnering with global open-source leaders, we are committed to enabling success through open, scalable, and future-ready technology solutions.
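The routine partition health checks described in this role can be sketched as a small triage helper. The metadata shape below is a simplified assumption for illustration, not the actual Kafka AdminClient API:

```python
def find_unhealthy_partitions(partitions):
    """Flag partitions that are offline or under-replicated.

    `partitions` is a list of dicts with a simplified, assumed shape:
    {"topic": str, "partition": int, "leader": int | None,
     "replicas": [broker ids], "isr": [broker ids]}  (isr = in-sync replicas)
    """
    offline, under_replicated = [], []
    for p in partitions:
        name = f"{p['topic']}-{p['partition']}"
        if p["leader"] is None:                    # no leader: partition is offline
            offline.append(name)
        elif len(p["isr"]) < len(p["replicas"]):   # some replicas lag behind
            under_replicated.append(name)
    return {"offline": offline, "under_replicated": under_replicated}


sample = [
    {"topic": "orders", "partition": 0, "leader": 1, "replicas": [1, 2, 3], "isr": [1, 2, 3]},
    {"topic": "orders", "partition": 1, "leader": 2, "replicas": [1, 2, 3], "isr": [2]},
    {"topic": "audit", "partition": 0, "leader": None, "replicas": [1, 2], "isr": []},
]
print(find_unhealthy_partitions(sample))
# {'offline': ['audit-0'], 'under_replicated': ['orders-1']}
```

In a real deployment the same signal comes from broker metrics such as `UnderReplicatedPartitions` and `OfflinePartitionsCount`, typically scraped into Prometheus and alerted on in Grafana.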
Posted 1 week ago
2.0 - 6.0 years
0 Lacs
karnataka
On-site
As a Kafka Developer & Administrator at Translab.io, you will be an integral part of our team, utilizing your 2+ years of experience to develop, manage, and administer Kafka clusters efficiently. Your expertise in Change Data Capture (CDC) using Debezium, deployment, performance tuning, and integration will be crucial in ensuring seamless data streaming and high availability of services.

Your responsibilities will include implementing and managing CDC processes, handling Kafka deployment, optimizing cluster performance, collaborating with cross-functional teams, and contributing to the design of streaming architectures. Troubleshooting and resolving issues related to Kafka clusters, pipelines, and integrations will also be part of your role.

To excel in this position, you must possess hands-on experience in Kafka development and administration, along with a strong understanding of CDC concepts, Debezium implementation, cluster deployment, and monitoring. Product development experience, as well as proficiency in PySpark and Python, will be advantageous. A good grasp of distributed systems, data streaming architectures, and the ability to work collaboratively in a fast-paced environment are essential.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Engineering, or a related field. Strong problem-solving skills, excellent communication, and teamwork abilities will further complement your profile as a successful Kafka Developer & Administrator at Translab.io.
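The CDC setup mentioned above is typically driven by registering a Debezium source connector with Kafka Connect. A minimal sketch of such a connector definition follows; the connector name, database host, and credentials are placeholders, not real values:

```python
def debezium_postgres_connector(name, host, dbname, user, password):
    """Build a minimal Debezium PostgreSQL source-connector definition.

    In a real setup this dict would be serialized to JSON and POSTed to
    Kafka Connect's REST API at /connectors; here we only construct it.
    """
    return {
        "name": name,
        "config": {
            "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
            "database.hostname": host,
            "database.port": "5432",
            "database.dbname": dbname,
            "database.user": user,
            "database.password": password,
            "topic.prefix": name,       # prefix for the change-event topics
            "plugin.name": "pgoutput",  # PostgreSQL logical-decoding plugin
        },
    }


conn = debezium_postgres_connector(
    "inventory-cdc", "db.example.com", "inventory", "cdc_user", "secret")
print(conn["name"])  # inventory-cdc
```

Once registered, the connector streams row-level changes from the database's write-ahead log into Kafka topics named after the configured prefix.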
Posted 2 weeks ago
5.0 - 10.0 years
15 - 25 Lacs
Navi Mumbai
Work from Office
We are hiring a RabbitMQ Admin with strong expertise in Kafka, messaging systems, and performance monitoring. This role involves managing and optimizing enterprise messaging infrastructure in a banking environment. Required Candidate profile: an experienced Messaging Admin with hands-on Kafka & RabbitMQ skills, certified in Confluent Kafka, adept at ensuring high-performance message delivery, troubleshooting issues, and securing middleware systems.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
As a Kafka Administrator at our Merchant Ecommerce platform in Noida Sector 62, you will be responsible for managing, maintaining, and optimizing our distributed, multi-cluster Kafka infrastructure in an on-premise environment. You should have a deep understanding of Kafka internals, Zookeeper administration, and performance tuning to ensure operational excellence in high-throughput, low-latency production systems. Experience with API gateway operations (specifically Kong) and observability tooling is considered a plus.

Your key responsibilities will include:
- Managing multiple Kafka clusters with high-availability Zookeeper setups
- Providing end-to-end operational support, including deployment, configuration, and health monitoring of Kafka brokers and Zookeeper nodes
- Conducting capacity planning, partition strategy optimization, and topic lifecycle management
- Implementing backup and disaster recovery processes with defined RPO/RTO targets
- Enforcing security configurations such as TLS encryption, authentication (SASL, mTLS), and ACL management
- Optimizing Kafka producer and consumer performance to meet low-latency, high-throughput requirements
- Planning and executing Kafka and Zookeeper upgrades and patching with minimal/zero downtime
- Integrating Kafka with monitoring platforms like Prometheus, Grafana, or similar tools
- Defining and enforcing log retention and archival policies in line with compliance requirements

Additionally, you will be responsible for integrating Kafka metrics and logs with centralized observability and logging tools, creating dashboards and alerts to monitor Kafka consumer lag, partition health, and broker performance, and collaborating with DevOps/SRE teams to ensure visibility into Kafka services. You will also be involved in applying CIS benchmarks, performing automated security scans across Kafka nodes, managing secret and certificate rotation, supporting regular vulnerability assessments, and ensuring timely remediation.

To be successful in this role, you should have:
- 3+ years of hands-on Kafka administration experience in production environments
- A strong understanding of Kafka internals and Zookeeper management
- Experience in Kafka performance tuning, troubleshooting, and security mechanisms
- Proficiency in monitoring and logging tools and scripting skills for operational automation

Preferred qualifications include experience with API gateways, Kubernetes-based environments, compliance standards, security hardening practices, and Infrastructure as Code (IaC) tools. In return, we offer you a mission-critical role in managing large-scale real-time data infrastructure, a flexible work environment, opportunities for growth, and access to modern observability and automation tools.
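Retention and archival policies of the kind this role enforces are usually expressed through per-topic settings such as `retention.ms` and `retention.bytes`. A small sketch converting a human-readable policy into topic config overrides; the policy numbers are illustrative, not from this posting:

```python
def retention_config(days=None, gib=None):
    """Translate a retention policy into Kafka topic config overrides.

    Kafka deletes log segments once either limit (time or size) is
    exceeded, whichever is hit first.
    """
    cfg = {}
    if days is not None:
        cfg["retention.ms"] = str(days * 24 * 60 * 60 * 1000)   # days -> milliseconds
    if gib is not None:
        cfg["retention.bytes"] = str(gib * 1024 ** 3)           # GiB -> bytes
    return cfg


# e.g. a hypothetical 7-day / 50 GiB compliance policy for an audit topic
print(retention_config(days=7, gib=50))
# {'retention.ms': '604800000', 'retention.bytes': '53687091200'}
```

The resulting overrides would typically be applied with `kafka-configs.sh --alter --entity-type topics` or via the AdminClient.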
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
haryana
On-site
As a Kafka Administrator at our Merchant Ecommerce platform, located in Noida Sector 62, you will be responsible for managing, maintaining, and optimizing our distributed, multi-cluster Kafka infrastructure in an on-premise environment. Your role will require a deep understanding of Kafka internals, Zookeeper administration, performance tuning, and operational excellence in high-throughput, low-latency production systems. Additionally, experience with API gateway operations (specifically Kong) and observability tooling would be advantageous.

Your key responsibilities will include managing multiple Kafka clusters with high-availability Zookeeper setups, conducting end-to-end operational support, capacity planning, implementing backup and disaster recovery processes, enforcing security configurations, optimizing Kafka producer and consumer performance, planning and executing upgrades and patching, integrating with monitoring platforms, defining log retention and archival policies, monitoring Kafka metrics and logs, collaborating on security and compliance measures, and supporting regular vulnerability assessments.

You will be expected to have at least 3 years of hands-on Kafka administration experience in production environments, a strong understanding of Kafka internals and Zookeeper management, experience with performance tuning and troubleshooting, familiarity with security mechanisms like TLS/mTLS, ACLs, and SASL, proficiency with monitoring and logging tools, and scripting skills for operational automation. Experience with API gateways, Kubernetes-based environments, compliance standards, security hardening practices, and IaC tools would be a plus.

In return, we offer you a mission-critical role in managing large-scale real-time data infrastructure, a flexible work environment, opportunities for growth, a supportive team, and access to modern observability and automation tools.
Posted 1 month ago
5.0 - 8.0 years
3 - 6 Lacs
Navi Mumbai
Work from Office
Required Details:
1. Total IT Exp:
2. Exp in Kafka:
3. Exp in Kafka Connect, Schema Registry, Kafka Streams:
4. Exp in Kafka cluster:
5. Current CTC:
6. Expected CTC:
7. Notice Period/LWD:
8. Current Location:
9. Willing to relocate to Navi Mumbai:
10. Willing to work on alternate Saturdays:

Job Title: Kafka Administrator (5+ Years Experience)
Location: CBD Belapur, Navi Mumbai
Job Type: [Full-time]
Experience Required: 5+ Years
Educational Qualification: B.E / B.Tech / BCA / B.Sc-IT / MCA / M.Sc-IT / M.Tech

Job Summary: We are looking for a skilled and experienced Kafka Administrator with a minimum of 5 years of experience in managing Apache Kafka environments. The ideal candidate will be responsible for the deployment, configuration, monitoring, and maintenance of Kafka clusters to ensure system scalability, reliability, and performance.

Key Responsibilities:
- Install, configure, and maintain Apache Kafka clusters in production and development environments.
- Monitor Kafka systems using appropriate tools and proactively respond to issues.
- Set up Kafka topics, manage partitions, and define data retention policies.
- Perform upgrades and patch management for Kafka and its components.
- Collaborate with application teams to ensure seamless Kafka integration.
- Troubleshoot and resolve Kafka-related production issues.
- Develop and maintain scripts for automation of routine tasks.
- Ensure security, compliance, and data governance for Kafka infrastructure.
- Maintain documentation and operational runbooks.

Required Skills:
- Strong experience with Apache Kafka and its ecosystem (Kafka Connect, Schema Registry, Kafka Streams).
- Proficiency in Kafka cluster monitoring and performance tuning.
- Experience with tools such as Prometheus, Grafana, and the ELK stack.
- Solid knowledge of Linux/Unix system administration.
- Hands-on experience with scripting languages like Bash and Python.
- Familiarity with DevOps tools (Ansible, Jenkins, Git).
- Experience with cloud-based Kafka deployments (e.g., Confluent Cloud, AWS MSK) is a plus.

Qualification Criteria: Candidates must hold at least one of the following degrees:
- B.E (Bachelor of Engineering)
- B.Tech (Bachelor of Technology)
- BCA (Bachelor of Computer Applications)
- B.Sc-IT (Bachelor of Science in Information Technology)
- MCA (Master of Computer Applications)
- M.Sc-IT (Master of Science in Information Technology)
- M.Tech (Master of Technology)

Preferred Certifications (Not Mandatory):
- Confluent Certified Administrator for Apache Kafka (CCAAK)
- Linux and Cloud Administration Certifications (RHCSA, AWS, Azure)
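Topic setup of the kind listed in the responsibilities above usually starts with sizing the partition count from throughput targets. A rough rule-of-thumb sketch; the throughput figures below are assumptions for illustration, not measurements:

```python
import math

def partitions_needed(target_mb_s, per_partition_produce_mb_s,
                      per_partition_consume_mb_s):
    """Rule-of-thumb partition sizing: pick enough partitions that
    neither the produce nor the consume side becomes the bottleneck."""
    produce = math.ceil(target_mb_s / per_partition_produce_mb_s)
    consume = math.ceil(target_mb_s / per_partition_consume_mb_s)
    return max(produce, consume)


# e.g. 300 MB/s target, with measured throughput of roughly 25 MB/s
# produce and 40 MB/s consume per partition (hypothetical numbers)
print(partitions_needed(300, 25, 40))  # 12
```

In practice the result is then rounded up for headroom, since adding partitions later changes key-to-partition mapping for keyed topics.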
Posted 1 month ago
7.0 - 9.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an APIGEE Administrator to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Role: APIGEE Administrator

Responsibilities:
1. Designing and developing API proxies, implementing security policies (e.g., OAuth, JWT), and creating API product bundles.
2. Supporting users and administering Apigee OPDK; integrating APIs with various systems and backend services.
3. Participating in and contributing to the migration to Apigee X; planning and executing API migrations between different Apigee environments.
4. Automation of platform processes.
5. Implementing security measures such as authentication, authorization, and mitigation, as well as managing traffic and performance optimization.
6. On-call support: identifying and resolving API-related issues, providing support to developers and consumers, and ensuring high availability.
7. Implementing architecture, including tests/CI-CD/monitoring/alerting/resilience/SLAs/documentation.
8. Collaborating with development teams, product owners, and other stakeholders to ensure seamless API integration and adoption.

Requirements:
1. Bachelor's degree (Computer Science/Information Technology/Electronics & Communication/Information Science/Telecommunications).
2. 7+ years of work experience in the IT industry and strong knowledge of implementing/designing solutions using software application technologies.
3. Good knowledge and experience of the Apigee OPDK platform and troubleshooting.
4. Experience in AWS administration (EC2, Route53, CloudTrail, AWS WAF, CloudWatch, EKS, AWS Systems Manager).
5. Good hands-on experience in Red Hat Linux administration and shell scripting.
6. Strong understanding of API design principles and best practices.
7. Kubernetes administration, GitHub, Cassandra administration, Google Cloud.
8. Familiarity with managing Dynatrace.

Desirable:
- Jenkins
- Proxy API development
- Kafka administration based on SaaS (Confluent)
- Knowledge of Azure
- ELK

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.

NTT DATA endeavors to make its application process accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status.
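The traffic-management responsibility above (rate limiting at the API gateway) is often modeled as a token bucket. The sketch below is a simplified illustration of that idea, not Apigee's actual SpikeArrest or Quota policy implementation:

```python
class TokenBucket:
    """Simplified token-bucket rate limiter: refills `rate` tokens per
    second up to a burst `capacity`; each allowed request costs 1 token."""

    def __init__(self, rate, capacity):
        self.rate = rate
        self.capacity = capacity
        self.tokens = capacity  # start with a full bucket
        self.last = 0.0

    def allow(self, now):
        # Refill based on elapsed time, capped at capacity.
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False


bucket = TokenBucket(rate=2, capacity=2)  # 2 requests/sec, burst of 2
decisions = [bucket.allow(t) for t in [0.0, 0.1, 0.2, 1.2]]
print(decisions)  # [True, True, False, True]
```

A gateway applies the same check per API key or per client IP before forwarding the request to the backend.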
Posted 1 month ago
8.0 - 10.0 years
25 - 27 Lacs
Bengaluru
Work from Office
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Office (Bengaluru)
Placement Type: Full-time Permanent Position
(*Note: This is a requirement for one of Uplers' clients - SoHo Dragon)

What do you need for this opportunity?
Must-have skills: Kafka, Kafka Administration, Kafka Confluent, Linux

SoHo Dragon is looking for:

Job Description: We are seeking an experienced professional with 8-15 years of IT experience to join our team. The ideal candidate should possess expertise in Kafka architecture and operations, along with a strong understanding of Confluent-specific tools, Linux/Unix systems, networking, and security practices.

Key Responsibilities & Requirements:
- Kafka Architecture & Operations: Deep understanding of core Kafka components, including brokers, topics, partitions, producers, and consumers. Ability to create, configure, and manage Kafka topics and partitions.
- Confluent Ecosystem: Proficiency in Confluent-specific tools such as Control Centre, Schema Registry, ksqlDB, and Kafka Connect.
- Linux/Unix Expertise: Strong command over Linux/Unix systems, including shell scripting and system monitoring.
- Networking Knowledge: Understanding of network configurations, protocols, and security best practices to ensure efficient and secure Kafka operations.
- Programming Skills: Knowledge of Java programming and JVM tuning, as Kafka is built on Java.
- Automation & Scripting: Proficiency in scripting languages like Python or Bash for automation and management tasks.
- Monitoring & Logging: Experience with monitoring tools such as Prometheus, Grafana, and Confluent Control Centre to track Kafka performance. Familiarity with logging frameworks like Log4j for troubleshooting and maintaining Kafka logs.
- Security Practices: Implementation of security measures, including SSL/TLS encryption, Kerberos authentication, and access control lists (ACLs) to safeguard Kafka data.
- Integration Expertise: Experience in integrating Kafka with various systems and data sources, including databases, data lakes, and cloud services.
- Capacity Planning: Ability to plan and scale Kafka clusters to handle dynamic workloads while ensuring high availability.
- Backup & Recovery: Knowledge of backup and recovery strategies to protect data and ensure business continuity in case of failures (a frequent task in T&S).

Preferred Qualification:
- Confluent Certification: Preference for candidates holding the Confluent Certified Administrator for Apache Kafka (CCAAK) certification.

Note: This role is a 6-month contractual position with the possibility of extension based on performance. The work location is Bangalore - Eco World, Bellandur, and it requires on-site presence at the office.
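Consumer lag, the central health signal behind the monitoring duties above, is simply the gap between each partition's log-end offset and the consumer group's committed offset. A minimal sketch with made-up offset values:

```python
def consumer_lag(end_offsets, committed_offsets):
    """Per-partition and total lag for one consumer group.

    Both arguments map (topic, partition) -> offset; a partition with no
    committed offset is treated as lagging from offset 0.
    """
    per_partition = {
        tp: end - committed_offsets.get(tp, 0)
        for tp, end in end_offsets.items()
    }
    return per_partition, sum(per_partition.values())


# Hypothetical offsets for a two-partition topic
end = {("payments", 0): 1_500, ("payments", 1): 2_000}
committed = {("payments", 0): 1_480}
per_partition, total = consumer_lag(end, committed)
print(total)  # 2020
```

In production the same numbers come from `kafka-consumer-groups.sh --describe` or exporters feeding Prometheus, and a steadily growing total is the usual trigger for an alert.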
Posted 2 months ago
5.0 - 10.0 years
15 - 20 Lacs
Noida, Gurugram
Work from Office
Manage end-to-end activities including installation, configuration, and maintenance of Kafka clusters; manage topics configured in Kafka; and ensure maximum uptime of Kafka. Monitor the performance of producer and consumer threads interacting with Kafka. Required Candidate profile: Kafka certification; must have hands-on experience in managing large Kafka clusters and installations; ability to monitor the performance of producer and consumer threads interacting with Kafka.
Posted 2 months ago
8.0 - 13.0 years
20 - 27 Lacs
Bengaluru
Work from Office
Senior Developer - Kafka
Experience: 8 - 20 Years
Salary: INR 25-28 Lacs per annum
Preferred Notice Period: Within 30 Days
Shift: 10:00 AM to 7:00 PM IST
Opportunity Type: Onsite (Bengaluru)
Placement Type: Permanent
(*Note: This is a requirement for one of Uplers' clients)

Must-have skills: Kafka, Kafka Administration, Kafka Confluent, Linux

SoHo Dragon (one of Uplers' clients) is looking for a Senior Developer - Kafka who is passionate about their work, eager to learn and grow, and committed to delivering exceptional results. If you are a team player with a positive attitude and a desire to make a difference, then we want to hear from you.

Job Description: We are seeking an experienced professional with 8-15 years of IT experience to join our team. The ideal candidate should possess expertise in Kafka architecture and operations, along with a strong understanding of Confluent-specific tools, Linux/Unix systems, networking, and security practices.

Key Responsibilities & Requirements:
- Kafka Architecture & Operations: Deep understanding of core Kafka components, including brokers, topics, partitions, producers, and consumers. Ability to create, configure, and manage Kafka topics and partitions.
- Confluent Ecosystem: Proficiency in Confluent-specific tools such as Control Centre, Schema Registry, ksqlDB, and Kafka Connect.
- Linux/Unix Expertise: Strong command over Linux/Unix systems, including shell scripting and system monitoring.
- Networking Knowledge: Understanding of network configurations, protocols, and security best practices to ensure efficient and secure Kafka operations.
- Programming Skills: Knowledge of Java programming and JVM tuning, as Kafka is built on Java.
- Automation & Scripting: Proficiency in scripting languages like Python or Bash for automation and management tasks.
- Monitoring & Logging: Experience with monitoring tools such as Prometheus, Grafana, and Confluent Control Centre to track Kafka performance. Familiarity with logging frameworks like Log4j for troubleshooting and maintaining Kafka logs.
- Security Practices: Implementation of security measures, including SSL/TLS encryption, Kerberos authentication, and access control lists (ACLs) to safeguard Kafka data.
- Integration Expertise: Experience in integrating Kafka with various systems and data sources, including databases, data lakes, and cloud services.
- Capacity Planning: Ability to plan and scale Kafka clusters to handle dynamic workloads while ensuring high availability.
- Backup & Recovery: Knowledge of backup and recovery strategies to protect data and ensure business continuity in case of failures (a frequent task in T&S).

Preferred Qualification:
- Confluent Certification: Preference for candidates holding the Confluent Certified Administrator for Apache Kafka (CCAAK) certification.

Note: This role is a 6-month contractual position with the possibility of extension based on performance. The work location is Bangalore - Eco World, Bellandur, and it requires on-site presence at the office.

How to apply for this opportunity (easy 3-step process):
1. Click on Apply and register or log in on our portal.
2. Upload your updated resume and complete the screening form.
3. Increase your chances of being shortlisted and meet the client for the interview!

About Our Client: We are a full-service software application development company that focuses on portals, document management, collaboration, business intelligence, CRM tools, cloud technology, and data. Much of the work done for our clients is based on the Microsoft application stack of business tools.

About Uplers: Our goal is to make hiring and getting hired reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant product and engineering job opportunities and progress in their careers. (Note: There are many more opportunities apart from this on the portal.)
So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 2 months ago