15.0 - 20.0 years
10 - 14 Lacs
Pune
Work from Office
We help our customers free up time and space to become an Autonomous Digital Enterprise that conquers the opportunities ahead, and we are relentless in the pursuit of innovation! BMC is looking for an experienced DevOps Engineer to join us and design, develop, and implement complex applications using the latest technologies.

Here is how, through this exciting role, YOU will contribute to BMC's and your own success:
- Design and implement critical product deployment features with high quality, scalability, and variability in mind.
- Guide and mentor the team for technical excellence and help resolve technical impediments.
- Collaborate with cross-functional teams to deliver robust, production-grade software applications.
- Drive DevOps practices, CI/CD pipelines, and deployment automation across Kubernetes/OpenShift environments.
- Troubleshoot and resolve technical issues efficiently across multiple environments and releases.
- Lead technical execution within an Agile development environment and manage scrum team activities.
- Coordinate with multiple product teams for effective problem isolation and resolution.
- Continuously explore and learn new technologies and integrations relevant to the product.

To ensure you're set up for success, you will bring the following skillset & experience:
- 12-15+ years of experience in commercial-grade software development with strong object-oriented programming skills.
- Strong hands-on experience with Kubernetes, OpenShift, Helm, Docker, and shell scripting.
- Good knowledge of Linux operating systems and relational databases such as Postgres and Oracle.
- Experience with CI/CD tools and build systems like Jenkins, Maven, and Makefiles.
- Proficiency in at least one programming language such as C, Java, Python, or Perl.
- Strong communication skills and experience working in Agile environments.

Whilst these are nice to have, our team can help you develop in the following skills:
- Experience with microservice architecture and development.
- Familiarity with technologies like Kafka, Elasticsearch, Zookeeper, Redis, and VictoriaMetrics.
- Exposure to cloud platforms such as AWS or Oracle Cloud Infrastructure (OCI).
- Experience with source control (Git) and project tracking tools like JIRA and TestTrack.
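The Kubernetes/OpenShift troubleshooting duties above often start with checking whether a rollout is healthy. A minimal, illustrative Python sketch of that check (the output format, pod names, and health rule are assumptions for illustration, not BMC tooling):

```python
# Illustrative sketch: parse "kubectl get pods"-style output and decide
# whether a rollout looks healthy (every pod Running and fully ready).

def rollout_healthy(kubectl_output: str) -> bool:
    lines = kubectl_output.strip().splitlines()[1:]  # skip the header row
    for line in lines:
        name, ready, status = line.split()[:3]
        ok, total = ready.split("/")
        if status != "Running" or ok != total:
            return False
    return True

pods = """\
NAME            READY   STATUS    RESTARTS
web-7f9c-abc    2/2     Running   0
web-7f9c-def    1/2     Running   3
"""
print(rollout_healthy(pods))  # second pod is not fully ready
```

A real check would query the Kubernetes API rather than scrape CLI output, but the decision rule is the same.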
Posted 1 day ago
3.0 - 7.0 years
14 - 19 Lacs
Vadodara
Work from Office
Title and Summary: Senior Software Engineer (SDET)

Overview
The MDES team is looking for a Senior Software Development Engineer who can develop microservices-based enterprise applications using the Java/J2EE stack, as well as portals used by customer care, end users, customer representatives, etc. The ideal candidate is passionate about designing and developing high-quality code that is highly scalable, operable, and highly available.

Role
- Develop (code) enterprise applications with quality, within schedule, and within estimated effort.
- Assist the Lead Engineer in low-level design.
- Provide estimates for assigned tasks.
- Write and execute unit and integration test cases.
- Provide accurate status of tasks.
- Perform peer reviews and mentor junior team members.
- Comply with the organization's processes and policies and protect the organization's intellectual property. Also participate in organization-level process improvement and knowledge sharing.

All About You
Essential knowledge, skills & attributes:
- Hands-on experience with core Java, Spring Boot, Spring (MVC, IoC, AOP, Security), SQL, RDBMS (Oracle and Postgres), web services (JSON and SOAP), Kafka, and Zookeeper.
- Hands-on experience developing microservice applications and deploying them on a public cloud such as Google, AWS, or Azure.
- Hands-on experience with the IntelliJ/Eclipse/MyEclipse IDEs.
- Hands-on experience writing JUnit test cases and working with Maven/Ant/Gradle and Git.
- Knowledge of design patterns.
- Experience working with Agile methodologies.
- Strong logical and analytical skills, design skills, and the ability to articulate and present thoughts clearly and precisely in English (written and verbal).
- Knowledge of security concepts (e.g. authentication, authorization, confidentiality) and protocols, and their usage in enterprise applications.

Additional/Desirable capabilities:
- Experience working in the payments application domain.
- Hands-on experience with tools like Mockito, JBehave, Jenkins, Bamboo, Confluence, and Rally.
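The posting pairs Kafka with JSON web services; the glue between them is usually a (de)serialization step on each event. A minimal sketch of that round trip in Python (the event fields are hypothetical, not from any MDES schema):

```python
import json

# Illustrative sketch: the kind of JSON (de)serialization a Kafka
# producer/consumer pair might apply to an event payload before it
# goes onto / comes off a topic. Field names here are invented.

def serialize(event: dict) -> bytes:
    # sort_keys gives a canonical byte form, useful for keyed topics
    return json.dumps(event, sort_keys=True).encode("utf-8")

def deserialize(raw: bytes) -> dict:
    return json.loads(raw.decode("utf-8"))

event = {"type": "TOKEN_PROVISIONED", "pan_ref": "ref-123", "version": 1}
raw = serialize(event)
assert deserialize(raw) == event  # lossless round trip
print(raw)
```

Production systems often swap JSON for Avro or Protobuf with a schema registry, but the serialize/deserialize contract is the same shape.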
Posted 1 day ago
4.0 - 7.0 years
10 - 15 Lacs
Vadodara
Work from Office
Title and Summary: Senior Software Engineer

Overview
The MDES team is looking for a Senior Software Development Engineer who can develop microservices-based enterprise applications using the Java/J2EE stack, as well as portals used by customer care, end users, customer representatives, etc. The ideal candidate is passionate about designing and developing high-quality code that is highly scalable, operable, and highly available.

Role
- Develop (code) enterprise applications with quality, within schedule, and within estimated effort.
- Assist the Lead Engineer in low-level design.
- Provide estimates for assigned tasks.
- Write and execute unit and integration test cases.
- Provide accurate status of tasks.
- Perform peer reviews and mentor junior team members.
- Comply with the organization's processes and policies and protect the organization's intellectual property. Also participate in organization-level process improvement and knowledge sharing.

All About You
Essential knowledge, skills & attributes:
- Hands-on experience with core Java, Spring Boot, Spring (MVC, IoC, AOP, Security), SQL, RDBMS (Oracle and Postgres), web services (JSON and SOAP), Kafka, and Zookeeper.
- Hands-on experience developing microservice applications and deploying them on a public cloud such as Google, AWS, or Azure.
- Hands-on experience with the IntelliJ/Eclipse/MyEclipse IDEs.
- Hands-on experience writing JUnit test cases and working with Maven/Ant/Gradle and Git.
- Knowledge of design patterns.
- Experience working with Agile methodologies.
- Strong logical and analytical skills, design skills, and the ability to articulate and present thoughts clearly and precisely in English (written and verbal).
- Knowledge of security concepts (e.g. authentication, authorization, confidentiality) and protocols, and their usage in enterprise applications.

Additional/Desirable capabilities:
- Experience working in the payments application domain.
- Hands-on experience with tools like Mockito, JBehave, Jenkins, Bamboo, Confluence, and Rally.
Posted 1 day ago
5.0 - 7.0 years
3 - 5 Lacs
Hyderabad, India
Hybrid
Job Purpose
Designs, develops, and implements Java applications to support business requirements. Follows approved life-cycle methodologies, creates design documents, writes code, and performs unit and functional testing of software. Contributes to the overall architecture and standards of the group, acts as an SME, and plays a software governance role.

Key Activities / Outputs
• Work closely with business analysts to analyse and understand the business requirements and business case, in order to produce simple, cost-effective, and innovative solution designs.
• Implement the designed solutions in the required development language (typically Java) in accordance with Vitality Group standards, processes, tools, and frameworks.
• Test the quality of produced software thoroughly through participation in code reviews, the use of static code analysis tools, the creation and execution of unit tests, functional regression tests, load tests, and stress tests, and by evaluating the performance metrics collected on the software.
• Participate in feasibility studies, proofs of concept, JAD sessions, and estimation and costing sessions; evaluate and review programming methods, tools, and standards; etc.
• Maintain the system in production and provide support in the form of query resolution and defect fixes.
• Prepare the necessary technical documentation, including payload definitions, class diagrams, activity diagrams, ERDs, operational and support documentation, etc.
• Drive the skills development of team members: coaching for performance and career development, recruitment, staff training, performance management, etc.

Technical Skills or Knowledge
Java, Object Orientation, Spring, Hibernate, JUnit, SOA, SOAP, REST, Microservices, Docker, Data Modelling, UML, SQL, Architectural Styles, Liferay 7 (web), Kotlin (Android), Swift (iOS)

Preferred Technical Skills (would be advantageous)
Kafka, Zookeeper, Zuul, Eureka, Obsidian, Elasticsearch, Kibana, Fluentd

This position is a hybrid role based in Hyderabad, which requires you to be in the office on Tuesday, Wednesday, and Thursday.
Posted 2 days ago
8.0 - 13.0 years
20 - 35 Lacs
Gurugram
Remote
Kafka Developer (7+ Years Experience)

Position Overview
We are seeking a highly skilled Kafka Developer with 7+ years of experience in designing, developing, and deploying real-time data streaming solutions. The ideal candidate will have strong expertise in Apache Kafka, distributed systems, and event-driven architecture, along with proficiency in Java/Scala/Python.

Key Responsibilities
- Design, develop, and optimize Kafka-based real-time data pipelines and event-driven solutions.
- Implement and maintain Kafka producers, consumers, and stream-processing applications.
- Build and configure Kafka connectors, Schema Registry, and KSQL/Kafka Streams for data integration.
- Manage Kafka clusters, on-premises and in the cloud (AWS MSK, Confluent Cloud, Azure Event Hubs).
- Ensure high availability, scalability, and reliability of Kafka infrastructure.
- Troubleshoot and resolve issues related to Kafka performance, lag, replication, offsets, and throughput.
- Implement security best practices (SSL/TLS, SASL, Kerberos, RBAC).
- Collaborate with cross-functional teams (data engineers, architects, DevOps, business stakeholders).
- Write unit/integration tests and ensure code quality and performance tuning.
- Document solutions and best practices and share knowledge within the team.

Required Skills & Experience
- 7+ years of software development experience, with at least 4+ years in Kafka development.
- Strong hands-on experience with the Apache Kafka APIs (Producer, Consumer, Streams, Connect).
- Proficiency in Java or Scala (Python/Go is a plus).
- Solid understanding of event-driven and microservices architecture.
- Experience with serialization formats (Avro, Protobuf, JSON).
- Strong knowledge of distributed systems concepts (partitioning, replication, consensus).
- Experience with the Confluent Platform and ecosystem tools (Schema Registry, REST Proxy, Control Center).
- Exposure to cloud-based Kafka (AWS MSK, Confluent Cloud, Azure Event Hubs, GCP Pub/Sub).
- Hands-on with CI/CD, Docker, Kubernetes, and monitoring tools (Prometheus, Grafana, Splunk).
- Strong problem-solving skills with the ability to troubleshoot latency, lag, and cluster performance issues.
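The "lag" this posting keeps returning to has a precise definition: for each partition, lag is the broker's log-end offset minus the consumer group's committed offset. A tiny sketch of that calculation (offset numbers are invented; real values come from the Kafka consumer/admin APIs):

```python
# Illustrative sketch: consumer lag per partition. A lag of 0 means the
# group is fully caught up on that partition; a growing lag means the
# consumers are falling behind the producers.

def lag_by_partition(end_offsets: dict, committed: dict) -> dict:
    return {p: end_offsets[p] - committed.get(p, 0) for p in end_offsets}

end_offsets = {0: 1500, 1: 980, 2: 2100}   # broker log-end offsets
committed   = {0: 1500, 1: 900, 2: 1800}   # group's committed offsets
print(lag_by_partition(end_offsets, committed))  # {0: 0, 1: 80, 2: 300}
```

Tools like Burrow or `kafka-consumer-groups.sh` report exactly this quantity, aggregated per group and topic.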
Posted 3 days ago
5.0 - 8.0 years
7 - 11 Lacs
Pune
Work from Office
Position Overview
We are seeking a skilled and experienced Senior PySpark Developer with expertise in Apache Spark, Spark Batch, and Spark Streaming to join our dynamic team. The ideal candidate will design, develop, and maintain high-performance, scalable applications for processing large-scale data in batch and real-time environments.

Required Skills and Qualifications
Experience: 7+ years of professional experience in PySpark development.
Technical Skills:
- Strong proficiency in PySpark.
- Deep understanding of Apache Spark architecture, distributed computing concepts, and parallel processing.
- Proven experience in building and optimizing complex ETL pipelines using PySpark.
- Experience with the various Spark components.
- Expertise in Spark Batch for large-scale data processing and analytics.
- Experience with Spark Streaming for real-time data processing and streaming pipelines.
- Familiarity with distributed computing concepts and big data frameworks.

Mandatory Skills: PySpark. Experience: 5-8 years.
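The core idea behind the Spark Streaming work described here is windowed aggregation over event time. A plain-Python sketch of a tumbling-window count, the operation a Structured Streaming job would express with `groupBy(window(...))` (the events and window size are invented for illustration):

```python
from collections import defaultdict

# Illustrative sketch (plain Python, not PySpark): count events per key
# inside non-overlapping 60-second windows keyed by window start time.

def tumbling_counts(events, window_s=60):
    counts = defaultdict(int)
    for ts, key in events:                 # ts = event time, epoch seconds
        start = ts - (ts % window_s)       # window the event falls into
        counts[(start, key)] += 1
    return dict(counts)

events = [(5, "click"), (42, "click"), (61, "click"), (70, "view")]
print(tumbling_counts(events))
# {(0, 'click'): 2, (60, 'click'): 1, (60, 'view'): 1}
```

Spark distributes the same computation across partitions and adds watermarking for late data; the windowing arithmetic is unchanged.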
Posted 3 days ago
3.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Job Description:
- Stand up and administer on-premise Kafka clusters.
- Architect and create reference architectures and Kafka implementation standards.
- Provide expertise in Kafka brokers, Zookeeper, Kafka Connect, Schema Registry, KSQL, REST Proxy, and Kafka Control Center.
- Ensure optimum performance, high availability, and stability of solutions.
- Create topics, set up redundancy clusters, deploy monitoring tools and alerts, and apply good knowledge of best practices.
- Create stubs for producers, consumers, and consumer groups to help onboard applications from different languages/platforms.
- Provide administration and operations of the Kafka platform, such as provisioning, access lists, and Kerberos and SSL configurations.
- Use automation tools for provisioning, such as Docker, Jenkins, and GitLab.
- Perform data-related benchmarking, performance analysis, and tuning.
- Strong skills in in-memory applications, database design, and data integration.
- Take part in design and capacity review meetings to provide suggestions on Kafka usage.
- Solid knowledge of monitoring tools and fine-tuning of alerts on Splunk, Prometheus, and Grafana.
- Set up security on Kafka.
- Provide naming conventions, backup & recovery, and problem-determination strategies for projects.
- Monitor, prevent, and troubleshoot security-related issues.
- Provide strategic vision in engineering solutions that touch the messaging-queue aspect of the infrastructure.

QUALIFICATIONS
- Demonstrated proficiency and experience in the design, implementation, monitoring, and troubleshooting of Kafka messaging infrastructure.
- Hands-on experience with recovery in Kafka.
- 2 or more years of experience in developing/customizing messaging-related monitoring tools/utilities.
- Good scripting knowledge/experience with one or more tools (e.g. Chef, Ansible, Terraform).
- Good programming knowledge/experience with one or more languages (e.g. Java, Node.js, Python).
- Considerable experience in implementing Kerberos security.
- Support a 24x7 model and be available for rotational on-call work.
- Competent working in one or more environments highly integrated with an operating system.
- Experience implementing and administering/managing technical solutions in major, large-scale system implementations.
- Strong critical-thinking skills to evaluate alternatives and present solutions that are consistent with business objectives and strategy.
- Ability to manage tasks independently and take ownership of responsibilities.
- Ability to learn from mistakes and apply constructive feedback to improve performance.
- Ability to adapt to a rapidly changing environment.
- Proven leadership abilities, including effective knowledge sharing, conflict resolution, facilitation of open discussions, fairness, and appropriate levels of assertiveness.
- Ability to communicate highly complex technical information clearly and articulately to all levels and audiences.
- Willingness to learn new technologies/tools and train your peers.
- Proven track record of automation.
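Topic creation and producer stubs, mentioned above, both hinge on how a record key maps to a partition. A sketch of that mapping in Python; note the hedge in the comments: Kafka's default partitioner actually uses murmur2, and `zlib.crc32` is only a stand-in so the idea is visible without extra dependencies:

```python
import zlib

# Illustrative sketch: sticky key-to-partition mapping. Kafka's default
# partitioner hashes the key with murmur2; crc32 is used here purely as
# a stand-in hash. The invariant shown is the real one: the same key
# always lands on the same partition, preserving per-key ordering.

def partition_for(key: bytes, num_partitions: int) -> int:
    return zlib.crc32(key) % num_partitions

p1 = partition_for(b"order-1001", 6)
p2 = partition_for(b"order-1001", 6)
assert p1 == p2   # a given key is always routed to one partition
print(p1)
```

This is also why increasing a topic's partition count after the fact breaks key locality: the modulus changes, so existing keys may map to new partitions.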
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Database Cloud Engineer at Salesforce, you will play a crucial role in ensuring the reliability, scalability, and performance of our vast cloud database infrastructure. Your responsibilities will involve architecting and operating resilient, secure, and performant database environments across public cloud platforms such as AWS and GCP. Collaborating across various teams, you will deliver cloud-native reliability solutions at a massive scale, contributing to one of the largest SaaS platforms globally.

The CRM Database Sustaining Engineering team is a fast-paced and dynamic global team responsible for delivering and supporting databases and their cloud infrastructure to meet the substantial growth needs of the business. In this role, you will work closely with other engineering teams to deliver innovative solutions in an agile, dynamic environment. Collaboration with the Application, Systems, Network, Database, and Storage teams is key to your success. As part of the global team, you will be engaged in 24x7 support responsibilities within Europe, requiring occasional flexibility in working hours to align globally. You will be immersed in managing Salesforce cloud databases running on cutting-edge cloud technology and ensuring their reliability.

Job Requirements:
- Bachelor's degree in Computer Science or Engineering, or equivalent experience.
- Minimum of 8+ years of experience as a Database Engineer or in a similar role.
- Expertise in database and SQL performance tuning for relational databases.
- Knowledge of and hands-on experience with the Postgres database is a plus.
- Deep knowledge of at least two relational databases, including Oracle, PostgreSQL, and MySQL.
- Working knowledge of cloud platforms like AWS or GCP is highly desirable.
- Experience with cloud technologies such as Docker, Spinnaker, Terraform, Helm, Jenkins, and Git. Exposure to Zookeeper fundamentals and Kubernetes is highly desirable.
- Proficiency in SQL and at least one procedural language like Python, Go, or Java. A basic understanding of C is preferred.
- Strong problem-solving skills and experience with production incident management and root cause analysis.
- Experience with mission-critical distributed systems services and supporting database production infrastructure with 24x7x365 responsibilities.
- Exposure to a fast-paced environment with a large-scale cloud infrastructure setup.
- Excellent communication skills and attention to detail, with a proactive and self-starting approach.

Preferred Qualifications:
- Hands-on DevOps experience, including CI/CD pipelines and container orchestration like Kubernetes, EKS, or GKE.
- Cloud-native DevOps experience with CI/CD, EKS/GKE, and cloud deployments.
- Familiarity with distributed coordination systems such as Apache Zookeeper.
- Deep understanding of distributed systems, availability design patterns, and database internals.
- Expertise in monitoring and alerting using tools like Grafana, Argus, or similar.
- Automation experience with tools like Spinnaker, Helm, and Infrastructure as Code frameworks.
- Ability to drive technical projects from ideation to execution with minimal supervision.
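SQL performance tuning, the first hard requirement above, usually begins with comparing query plans before and after adding an index. A self-contained sketch using SQLite (stdlib) rather than Oracle or Postgres; the table and query are hypothetical:

```python
import sqlite3

# Illustrative sketch: the before/after plan check at the heart of SQL
# tuning, shown on SQLite because it ships with Python. Oracle and
# Postgres expose the same idea via EXPLAIN PLAN / EXPLAIN ANALYZE.

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")

def plan(sql: str) -> str:
    rows = con.execute("EXPLAIN QUERY PLAN " + sql)
    return " ".join(row[3] for row in rows)   # row[3] is the plan detail

q = "SELECT total FROM orders WHERE customer_id = 42"
print(plan(q))   # without an index: a full table scan
con.execute("CREATE INDEX idx_orders_cust ON orders(customer_id)")
print(plan(q))   # now a search using idx_orders_cust
```

The same discipline scales up: measure the plan, change one thing (index, statistics, rewrite), and measure again.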
Posted 4 weeks ago
8.0 - 11.0 years
35 - 37 Lacs
Kolkata, Ahmedabad, Bengaluru
Work from Office
Dear Candidate,

We are looking for a Big Data Developer to build and maintain scalable data processing systems. The ideal candidate will have experience handling large datasets and working with distributed computing frameworks.

Key Responsibilities:
- Design and develop data pipelines using Hadoop, Spark, or Flink.
- Optimize big data applications for performance and reliability.
- Integrate various structured and unstructured data sources.
- Work with data scientists and analysts to prepare datasets.
- Ensure data quality, security, and lineage across platforms.

Required Skills & Qualifications:
- Experience with the Hadoop ecosystem (HDFS, Hive, Pig) and Apache Spark.
- Proficiency in Java, Scala, or Python.
- Familiarity with data ingestion tools (Kafka, Sqoop, NiFi).
- Strong understanding of distributed computing principles.
- Knowledge of cloud-based big data services (e.g., EMR, Dataproc, HDInsight).

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
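The "distributed computing principles" behind Hadoop and Spark reduce to a map/shuffle/reduce shape. A toy sketch on in-memory "partitions" (the data is invented; a real job would run the same logic across cluster nodes):

```python
from collections import Counter
from functools import reduce

# Illustrative sketch: word count in map/reduce form. Each partition is
# counted locally (map), then the partial counts are merged (reduce) --
# the same shape Hadoop MapReduce and Spark distribute across machines.

partitions = [["a", "b", "a"], ["b", "c"], ["a"]]

mapped = [Counter(p) for p in partitions]        # map: per-partition counts
total = reduce(lambda x, y: x + y, mapped)       # reduce: merge the counts
print(dict(total))  # {'a': 3, 'b': 2, 'c': 1}
```

In PySpark this whole sketch is `rdd.map(lambda w: (w, 1)).reduceByKey(add)`; the local-then-merged structure is what makes it parallelizable.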
Posted 1 month ago
2.0 - 4.0 years
7 - 12 Lacs
Bengaluru
Work from Office
YOUR IMPACT:
OpenText is the market leader in Enterprise Information Management platforms and applications. As a Software Engineer you will utilize your knowledge and experience to perform systems analysis, research, maintenance, troubleshooting, and other programming activities. You will become a member of one of our Agile-based project teams, focusing on product development. It is an exciting opportunity to design and implement solutions for enterprise-level systems. OpenText Business Network is a modern cloud platform that helps manage the full data lifecycle, from information capture and exchange to integration and governance. Business Network solutions establish the necessary digital backbone for streamlined connectivity, secure collaboration, and real-time business intelligence across an expanding network of internal systems, cloud applications, trading partner systems, and connected devices.

WHAT THE ROLE OFFERS:
- Meet with the software development team to discuss project definitions and goals.
- Analyze user and system requirements for the software product.
- Prepare high-level and low-level design documents and be involved in the development of software.
- Provide hands-on technical leadership and guidance to junior engineers.
- Participate in and drive design sessions in the team.
- Write excellent code following industry best practices; write efficient code based on feature specifications.
- Design software database architecture.
- Test and debug software applications; validate the functionality and security of the application.
- Respond promptly and professionally to bug reports and customer escalations.
- Perform the job with minimal assistance.
- Development, deployment, and support in production.
- Work with Quality Assurance to transfer knowledge, develop the test strategy, and validate the test plan.
- Technical risk assessment, problem solving, and early risk notification; report progress and status to the manager and/or Scrum master.
- Troubleshoot production issues within the defined SLAs.

WHAT YOU NEED TO SUCCEED:
- Bachelor's degree (Computer Science preferred) with 2+ years of experience in software development.
- Ability to take direction and work with minimal supervision.
- Ability to work in a deadline-driven environment and respond creatively to pressure.
- Ability to work on multiple projects simultaneously.
- Excellent analytical skills.
- Hands-on expertise in OOP and Java/J2EE.
- Experience working on at least one application server (Tomcat, BEA WebLogic, IBM WebSphere, JBoss).
- Experience in database design and strong knowledge of SQL/PLSQL.
- Experience in designing/development (design patterns) and testing of enterprise-class systems.
- Experience in UI design and development using technologies like Angular/React.
- Experience in Spring, Spring Boot, Hibernate, RESTful services, and microservices.
- Strong communication skills: verbal, written, and listening.
- Excellent troubleshooting skills.

Desired Skills:
- Experience in Hibernate, JSP/Servlets, JMS, XSLT, XML basics, XQuery, Kotlin.
- Experience in Solr/Zookeeper is an added advantage.
- Experience in REST security like OAuth and JWT will be an added advantage.
Posted 1 month ago
5.0 - 10.0 years
9 - 13 Lacs
Kannur
Work from Office
Role Purpose
Required Skills:
- 5+ years of experience in system administration, application development, infrastructure development, or related areas.
- 5+ years of experience with programming in languages like JavaScript, Python, PHP, Go, Java, or Ruby.
- 3+ years of experience reading, understanding, and writing code in the same.
- 3+ years of mastery of infrastructure automation technologies (like Terraform, CodeDeploy, Puppet, Ansible, Chef).
- 3+ years of expertise in container/container-fleet-orchestration technologies (like Kubernetes, OpenShift, AKS, EKS, Docker, Vagrant, etcd, Zookeeper).
- 5+ years of cloud and container-native Linux administration/build/management skills.

Key Responsibilities:
- Hands-on design, analysis, development, and troubleshooting of highly distributed, large-scale production systems and event-driven, cloud-based services.
- Primarily Linux administration, managing a fleet of Linux and Windows VMs as part of the application solutions.
- Involvement in pull requests for site reliability goals.
- Advocate IaC (Infrastructure as Code) and CaC (Configuration as Code) practices within Honeywell HCE.
- Ownership of reliability, uptime, system security, cost, operations, capacity, and performance analysis.
- Monitor and report on service level objectives for given application services. Work with the business, technology teams, and product owners to establish key service level indicators.
- Ensure the repeatability, traceability, and transparency of our infrastructure automation.
- Support on-call rotations for operational duties that have not been addressed with automation.
- Support healthy software development practices, including complying with the chosen software development methodology (Agile or alternatives) and building standards for code reviews, work packaging, etc.
- Create and maintain monitoring technologies and processes that improve visibility into our applications' performance and business metrics and keep operational workload in check.
- Partner with security engineers and develop plans and automation to respond aggressively and safely to new risks and vulnerabilities.
- Develop, communicate, collaborate on, and monitor standard processes to promote the long-term health and sustainability of operational development tasks.
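Monitoring and reporting on service level objectives, as the responsibilities above require, rests on simple error-budget arithmetic. A sketch with invented numbers (the 99.9% SLO and 30-day window are illustrative, not Honeywell's actual targets):

```python
# Illustrative sketch: error-budget math behind an availability SLO.
# With a 99.9% SLO over 30 days the budget is the allowed downtime;
# "burn" is the fraction of it that observed bad minutes have consumed.

def error_budget_minutes(slo: float, window_days: int) -> float:
    return (1 - slo) * window_days * 24 * 60

budget = error_budget_minutes(0.999, 30)
print(round(budget, 1))        # 43.2 minutes of allowed downtime
print(round(20 / budget, 2))   # 20 bad minutes burn 0.46 of the budget
```

Alerting on the burn *rate* (how fast the budget is being consumed) rather than raw error counts is the standard SRE refinement of this calculation.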
Posted 1 month ago
5.0 - 10.0 years
7 - 17 Lacs
Gurugram
Work from Office
Job Title: Kafka Integration Specialist

Job Description:
We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience in Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages like Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with Zookeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESB.
- Knowledge of cloud platforms like AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools like Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats like Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.
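"Failover strategies for critical data pipelines" usually means that a message that fails processing is quarantined rather than lost. A minimal dead-letter sketch in Python (the batch, handler, and list-based dead-letter queue are stand-ins; a real pipeline would publish failures to a dedicated dead-letter topic):

```python
# Illustrative sketch: dead-letter handling for a consumer batch.
# Failing messages are captured with their error instead of aborting
# the whole batch or being silently dropped.

def process_batch(messages, handler):
    delivered, dead_letter = [], []
    for msg in messages:
        try:
            delivered.append(handler(msg))
        except Exception as exc:
            dead_letter.append((msg, str(exc)))   # quarantine, don't lose
    return delivered, dead_letter

ok, dlq = process_batch(["10", "x", "7"], int)
print(ok)    # the two messages that parsed
print(dlq)   # the one that did not, with its error
```

Operationally, the dead-letter topic then gets its own alerting and a replay path once the underlying defect is fixed.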
Posted 1 month ago
8.0 - 12.0 years
0 Lacs
Hyderabad, Telangana
On-site
Salesforce is the global leader in customer relationship management (CRM) software, pioneering the shift to cloud computing. Today, Salesforce delivers the next generation of social, mobile, and cloud technologies to help companies revolutionize the way they sell, service, market, and innovate, enabling them to become customer-centric organizations. As the fastest-growing enterprise software company in the top 10, Salesforce has been recognized as the World's Most Innovative Company by Forbes and as one of Fortune's 100 Best Companies to Work For.

The CRM Database Sustaining Engineering Team at Salesforce is responsible for deploying and managing some of the largest and most trusted databases globally. Customers rely on this team to ensure the safety and high availability of their data. As a Database Cloud Engineer at Salesforce, you will have a mission-critical role in ensuring the reliability, scalability, and performance of Salesforce's extensive cloud database infrastructure. You will contribute to powering one of the largest Software as a Service (SaaS) platforms globally. We are seeking engineers with a DevOps mindset and deep expertise in databases to architect and operate secure, resilient, and high-performance database environments across public cloud platforms such as AWS and GCP. Collaboration across various domains, including systems, storage, networking, and applications, is essential to deliver cloud-native reliability solutions at a massive scale.

The CRM Database Sustaining Engineering team is a dynamic and fast-paced global team that delivers and supports databases and cloud infrastructure to meet the evolving needs of the business. In this role, you will collaborate with other engineering teams to deliver innovative solutions in an agile and dynamic environment. As part of the global team, you will engage in 24/7 support responsibilities within Europe, requiring occasional flexibility in working hours to align globally. You will be responsible for the reliability of Salesforce's cloud database, running on cutting-edge cloud technology.

**Job Requirements:**
- Bachelor's in Computer Science or Engineering, or equivalent experience.
- Minimum of 8+ years of experience as a Database Engineer or in a similar role.
- Expertise in database and SQL performance tuning in at least one relational database.
- Knowledge of and hands-on experience with the Postgres database is advantageous.
- Broad and deep knowledge of at least two relational databases, including Oracle, PostgreSQL, and MySQL.
- Working knowledge of cloud platforms such as AWS or GCP is highly desirable.
- Experience with cloud technologies like Docker, Spinnaker, Terraform, Helm, Jenkins, Git, etc. Exposure to Zookeeper fundamentals and Kubernetes is highly desirable.
- Proficiency in SQL and at least one procedural language such as Python, Go, or Java, with a basic understanding of C.
- Excellent problem-solving skills and experience with production incident management and root cause analysis.
- Experience with mission-critical distributed systems services, including supporting database production infrastructure with 24x7x365 support responsibilities.
- Exposure to a fast-paced environment with a large-scale cloud infrastructure setup.
- Strong speaking, listening, and writing skills, attention to detail, and a proactive, self-starting approach.

**Preferred Qualifications:**
- Hands-on DevOps experience, including CI/CD pipelines and container orchestration (Kubernetes, EKS/GKE).
- Cloud-native DevOps experience (CI/CD, EKS/GKE, cloud deployments).
- Familiarity with distributed coordination systems like Apache Zookeeper.
- Deep understanding of distributed systems, availability design patterns, and database internals.
- Monitoring and alerting expertise using tools like Grafana, Argus, or similar.
- Automation experience with tools like Spinnaker, Helm, and Infrastructure as Code frameworks.
- Ability to drive technical projects from idea to execution with minimal supervision.
Posted 1 month ago
4.0 - 8.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Job Summary: We are looking for a skilled Apache Solr Engineer to design, implement, and maintain scalable and high-performance search solutions. The ideal candidate will have hands-on experience with Solr/SolrCloud, strong analytical skills, and the ability to work in cross-functional teams to deliver efficient search functionalities across enterprise or customer-facing applications. Experience: 4–8 years Roles and Responsibilities Key Responsibilities: Design, develop, and maintain enterprise-grade search solutions using Apache Solr and SolrCloud . Develop and optimize search indexes and schema based on use cases like product search, document search, or order/invoice search. Integrate Solr with backend systems, databases and APIs. Implement full-text search , faceted search , auto-suggestions , ranking , and relevancy tuning . Optimize search performance, indexing throughput, and query response time. Ensure data consistency and high availability using SolrCloud and Zookeeper (cluster coordination & configuration management). Monitor search system health and troubleshoot issues in production. Collaborate with product teams, data engineers, and DevOps teams for smooth delivery. Stay up to date with new features of Apache Lucene/Solr and recommend improvements. Required Skills & Qualifications: Strong experience in Apache Solr & SolrCloud Good understanding of Lucene , inverted index , analyzers , tokenizers , and search relevance tuning . Proficient in Java or Python for backend integration and development. Experience with RESTful APIs , data pipelines, and real-time indexing. Familiarity with Zookeeper , Docker , Kubernetes (for SolrCloud deployments). Knowledge of JSON , XML , and schema design in Solr. Experience with log analysis , performance tuning , and monitoring tools like Prometheus/Grafana is a plus. Exposure to e-commerce or document management search use cases is an advantage. 
Preferred Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Experience with Elasticsearch or other search technologies is a plus.
- Working knowledge of CI/CD pipelines and cloud platforms (Azure).
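The Lucene concepts the listing above names (inverted index, analyzers, tokenizers) can be illustrated with a minimal sketch. This is a toy model, not how Solr is implemented: a real analyzer chain applies configurable tokenizers and token filters, and postings carry positions and scoring statistics.

```python
import re
from collections import defaultdict

def tokenize(text):
    """Lowercase and split on non-alphanumerics -- a toy stand-in for a
    Lucene analyzer chain (tokenizer plus token filters)."""
    return [t for t in re.split(r"[^a-z0-9]+", text.lower()) if t]

def build_inverted_index(docs):
    """Map each term to the set of document ids that contain it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in tokenize(text):
            index[term].add(doc_id)
    return index

def search(index, query):
    """AND-match all query terms by intersecting posting sets."""
    postings = [index.get(t, set()) for t in tokenize(query)]
    return set.intersection(*postings) if postings else set()

docs = {
    1: "Red running shoes",
    2: "Blue running jacket",
    3: "Red winter jacket",
}
index = build_inverted_index(docs)
print(sorted(search(index, "red jacket")))  # -> [3]
```

The inverted index is why term lookups are fast regardless of corpus size: a query touches only the posting lists of its terms, never the documents themselves.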
Posted 1 month ago
5.0 - 10.0 years
3 - 6 Lacs
Noida
Work from Office
We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in a programming language such as Java, Python, or Scala.
- Experience with distributed systems and data-streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.
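The partitioning skill named above comes down to one idea: a record's key is hashed to pick its partition, so all records with the same key stay in order on one partition. The sketch below uses CRC32 as a simplified stand-in for Kafka's actual default hash (murmur2), purely to illustrate the mechanism.

```python
import zlib

def partition_for(key: bytes, num_partitions: int) -> int:
    """Pick a partition from the record key. Kafka's default partitioner
    uses murmur2; CRC32 here is a simplified stand-in. The point is that
    a stable hash mod the partition count sends the same key to the same
    partition every time, preserving per-key ordering."""
    return zlib.crc32(key) % num_partitions

# Records with the same key always land on the same partition...
assert partition_for(b"order-42", 6) == partition_for(b"order-42", 6)

# ...but the mapping depends on the partition count, which is why
# adding partitions to a keyed topic can break per-key ordering.
before = {k: partition_for(k, 6) for k in (b"a", b"b", b"c")}
after = {k: partition_for(k, 12) for k in (b"a", b"b", b"c")}
print(before, after)
```

This is also why replication and partitioning interact: each partition, not each topic, is the unit of ordering, replication, and consumer parallelism.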
Posted 1 month ago
5.0 - 10.0 years
3 - 6 Lacs
Pune
Work from Office
We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in a programming language such as Java, Python, or Scala.
- Experience with distributed systems and data-streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.
Posted 1 month ago
3.0 - 8.0 years
18 - 22 Lacs
Navi Mumbai, Mumbai (All Areas)
Work from Office
1. Education: B.E./B.Tech/MCA in Computer Science
2. Experience: Must have 7+ years of relevant experience in database administration.
3. Mandatory Skills/Knowledge:
- Technically sound in multiple distributions such as Cloudera, Confluent, and open-source Kafka.
- Technically sound in Kafka and ZooKeeper.
- Well versed in capacity planning and performance tuning.
- Expertise in implementing security across the Hadoop ecosystem (Ranger, Kerberos, SSL).
- Expertise in DevOps tools such as Ansible, Nagios, shell scripting, Python, Jenkins, Git, and Maven to implement automation.
- Able to monitor, debug, and provide RCA for any service failure.
- Knowledge of network infrastructure, e.g., TCP/IP, DNS, firewalls, routers, load balancers.
- Creative analytical and problem-solving skills.
- Provide RCAs for critical and recurring incidents.
- Provide on-call service coverage within a larger group.
- Good aptitude in multi-threading and concurrency concepts.
4. Preferred Skills/Knowledge:
- Expert knowledge of database administration and architecture.
- Hands-on experience with operating system commands.

Kindly share CVs at snehal.sankade@outworx.com
Posted 1 month ago
5.0 - 10.0 years
3 - 6 Lacs
Ahmedabad
Work from Office
Job Title: Kafka Integration Specialist

Job Description: We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in a programming language such as Java, Python, or Scala.
- Experience with distributed systems and data-streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.
Posted 1 month ago
5.0 - 10.0 years
10 - 13 Lacs
Noida, Gurugram, Bengaluru
Hybrid
This role is for a client of ours. Prior contract work experience is preferred.

Role Type: Contract
Contract Duration: 6 months (extendable)
Location: Gurgaon/Noida/Bangalore
Work Mode: Hybrid
Max Budget: 1.1 Lac/month (depending on candidate)

Required Candidate Profile: 5+ years of experience in Java, especially in Order and Execution Management and trading systems; SDLC, MySQL, Spring.
Posted 1 month ago
5.0 - 10.0 years
3 - 6 Lacs
Chennai
Work from Office
Job Title: Kafka Integration Specialist

Job Description: We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in a programming language such as Java, Python, or Scala.
- Experience with distributed systems and data-streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.
Posted 1 month ago
4.0 - 9.0 years
12 - 16 Lacs
Bengaluru
Work from Office
As a Fortune 50 company with more than 400,000 team members worldwide, Target is an iconic brand and one of America's leading retailers. Joining Target means promoting a culture of mutual care and respect and striving to make the most meaningful and positive impact. Becoming a Target team member means joining a community that values different voices and lifts each other up. Here, we believe your unique perspective is important, and you'll build relationships by being authentic and respectful.

Overview about TII: At Target, we have a timeless purpose and a proven strategy, and that hasn't happened by accident. Some of the best minds from different backgrounds come together at Target to redefine retail in an inclusive learning environment that values people and delivers world-class outcomes. That winning formula is especially apparent in Bengaluru, where Target in India operates as a fully integrated part of Target's global team and has more than 4,000 team members supporting the company's global strategy and operations.

SENIOR ENGINEER, KAFKA STREAMING PLATFORM

Here's a smattering of approaches important to us and the technologies we use:
- Everything we do is as-code in version control. We don't like clicking buttons or doing things manually.
- All development or infra config changes go through a pull-request process, so you'll always have a say to thumbs up or down things you catch.
- Everything should have test cases, and they go through a continuous integration process.
- We understand the importance of logs and metrics, so having visibility into the things you need to see to do your job isn't an issue. And if you need to add more metrics or see more logs, it's within our control to improve that.
- We try to own as much of the platform as we reasonably can. You don't need to rely on teams outside our own to improve the stack or change the way we do things.
Kafka/Streaming Stack:
- Code: Spring Boot (Java/Kotlin), RESTful APIs, Golang
- Platform: Apache Kafka 2.x, TAP, GCP, Ansible, Terraform, Docker, Vela
- Alerting/Monitoring: Grafana, Kibana, ELK stack

As a Senior Engineer on Target's Streaming Platform team, you'll:
- Help build out the Kafka/Streaming capability in India
- Write and deploy code that enhances the Kafka platform
- Design infrastructure solutions that support automation, self-provisioning, product health, security/compliance, resiliency, and a zero-call aspiration, and that are focused on the Guest/Team Member experience
- Troubleshoot and resolve platform operational issues

Requirements:
- 4+ years of experience developing in JVM-based languages (e.g., Java/Kotlin)
- Ability to apply skills to solve problems, and aptitude to learn additional technologies or go deeper in an area
- Good basic programming/infrastructure skills, with the ability to quickly gather the skills necessary to accomplish the task at hand
- Intermediate knowledge and skills associated with infrastructure-based technologies
- Works across the team to recommend solutions that are in accordance with accepted testing frameworks
- Experience with modern platforms and CI/CD stacks (e.g., GitHub, Vela, Docker)
- Highly productive, self-starting, and self-motivated
- Passionate about staying current with new and evolving technologies

Desired:
- 4+ years of experience developing high-quality applications and/or supporting critical enterprise platforms
- Experience with Kafka, containers (K8s), and ZooKeeper; has worked with any one of the major public cloud providers (GCP/AWS/Azure)
- Familiarity with Golang and microservices architecture is a big plus
- Participate in day-to-day support requests by performing admin tasks
- Install and maintain standard Kafka components: Control Center, ZooKeeper, and brokers
- Strong understanding of infrastructure/software and how these systems are secured, analyzed, and investigated
- Is a contact point for their team and is able to help answer questions for other groups and/or management
- Partner with teams to prioritize and improve services throughout the software development lifecycle
- Personal or professional experience contributing to open-source projects
- Innovative mindset; willingness to push new ideas into the company

Useful Links:
- Life at Target: https://india.target.com/
- Benefits: https://india.target.com/life-at-target/workplace/benefits
- Culture: https://india.target.com/life-at-target/belonging
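The consumer-group administration the listing above mentions rests on how a group coordinator spreads topic partitions across consumers during a rebalance. The sketch below is a simplified round-robin assignment under stated assumptions: Kafka's real RoundRobinAssignor also handles multi-topic subscriptions and sticky variants exist to minimize partition movement.

```python
from collections import defaultdict

def round_robin_assign(partitions, consumers):
    """Distribute topic partitions across a consumer group, round-robin --
    a simplified version of what Kafka's group coordinator does during a
    rebalance. Each partition goes to exactly one consumer in the group,
    which is what caps a group's parallelism at the partition count."""
    assignment = defaultdict(list)
    for i, p in enumerate(sorted(partitions)):
        assignment[consumers[i % len(consumers)]].append(p)
    return dict(assignment)

# 6 partitions over 2 consumers: each gets 3, no partition is shared.
a = round_robin_assign(range(6), ["c1", "c2"])
print(a)  # -> {'c1': [0, 2, 4], 'c2': [1, 3, 5]}

# With more consumers than partitions, the extras sit idle.
b = round_robin_assign(range(2), ["c1", "c2", "c3"])
print(b)  # -> {'c1': [0], 'c2': [1]}
```

This is why "creating and managing topics, partitions, and consumer groups" are listed together: partition count is a capacity decision that directly bounds consumer-side throughput.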
Posted 1 month ago
5.0 - 9.0 years
3 - 6 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in a programming language such as Java, Python, or Scala.
- Experience with distributed systems and data-streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.
Posted 1 month ago
5.0 - 10.0 years
7 - 17 Lacs
Bengaluru
Work from Office
Job Title: Kafka Integration Specialist

We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in a programming language such as Java, Python, or Scala.
- Experience with distributed systems and data-streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.
Posted 1 month ago
5.0 - 10.0 years
3 - 6 Lacs
Mumbai
Work from Office
We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in a programming language such as Java, Python, or Scala.
- Experience with distributed systems and data-streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.
Posted 1 month ago
5.0 - 10.0 years
3 - 6 Lacs
Kolkata
Work from Office
We are seeking a highly skilled Kafka Integration Specialist to join our team. The ideal candidate will have extensive experience in designing, developing, and integrating Apache Kafka solutions to support real-time data streaming and distributed systems.

Key Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Required Skills:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in a programming language such as Java, Python, or Scala.
- Experience with distributed systems and data-streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESBs.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.

Nice to Have:
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4+ years of hands-on experience in Kafka integration projects.
Posted 1 month ago