
37 Apache Kafka Jobs

JobPe aggregates these listings for easy access; applications are submitted directly on the original job portal.

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office


The Platform Data Engineer will be responsible for designing and implementing robust data platform architectures, integrating diverse data technologies, and ensuring scalability, reliability, performance, and security across the platform. The role involves setting up and managing infrastructure for data pipelines, storage, and processing, developing internal tools to enhance platform usability, implementing monitoring and observability, collaborating with software engineering teams for seamless integration, and driving capacity planning and cost optimization initiatives.

Posted 1 day ago

Apply

6.0 - 7.0 years

11 - 14 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Location: Remote / Pan India, Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune. Notice Period: Immediate. iSource Services is hiring for one of their clients for the position of Java Kafka Developer.
We are seeking a highly skilled and motivated Confluent Certified Developer for Apache Kafka to join our growing team. The ideal candidate will possess a deep understanding of Kafka architecture, development best practices, and the Confluent platform. You will be responsible for designing, developing, and maintaining scalable and reliable Kafka-based data pipelines and applications. Your expertise will be crucial in ensuring the efficient and robust flow of data across our organization.
Responsibilities: Develop Kafka producers, consumers, and stream processing applications. Implement Kafka Connect connectors and configure Kafka clusters. Optimize Kafka performance and troubleshoot related issues. Utilize Confluent tools like Schema Registry, Control Center, and ksqlDB. Collaborate with cross-functional teams and ensure compliance with data policies.
Qualifications: Bachelor's degree in Computer Science or a related field. Confluent Certified Developer for Apache Kafka certification. Strong programming skills in Java/Python. In-depth Kafka architecture and Confluent platform experience. Experience with cloud platforms and containerization (Docker, Kubernetes) is a plus. Experience with data warehousing and data lake technologies. Experience with CI/CD pipelines and DevOps practices. Experience with Infrastructure as Code tools such as Terraform or CloudFormation.
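By way of illustration, the producer side of the work described above can be sketched with the plain Java Kafka client. The broker address, topic name, key, and payload below are placeholder assumptions, not details taken from the listing.

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class OrderEventProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // hypothetical broker address
        props.put("key.serializer", StringSerializer.class.getName());
        props.put("value.serializer", StringSerializer.class.getName());
        props.put("acks", "all");                            // wait for full acknowledgement for durability

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // "orders" topic, key, and JSON payload are illustrative placeholders
            ProducerRecord<String, String> record =
                new ProducerRecord<>("orders", "order-123", "{\"status\":\"CREATED\"}");
            producer.send(record, (metadata, exception) -> {
                if (exception != null) {
                    exception.printStackTrace();
                } else {
                    System.out.printf("Sent to %s-%d@%d%n",
                        metadata.topic(), metadata.partition(), metadata.offset());
                }
            });
        }
    }
}
```

Setting acks=all trades a little latency for stronger delivery guarantees, which matches the "reliable Kafka-based data pipelines" emphasis in the listing.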

Posted 4 days ago

Apply

7.0 - 9.0 years

11 - 12 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


Location: Remote / Pan India, Hyderabad, Ahmedabad, Pune, Chennai, Kolkata. Notice Period: Immediate. iSource Services is hiring for one of their clients for the position of RoR Engineer.
About the Role: An RoR Engineer is responsible for maintaining all the applications, i.e. the primary back-end application API, the order admin tool, the eCommerce application based on Solidus, and various supporting services used by our fulfilment partners and by web and mobile customer-facing applications.
Roles & Responsibilities: Primary technology: Ruby on Rails. Monitoring the #escalated-support and #consumer-eng Slack channels and addressing any issues that require technical assistance. Monitoring logs (via Rollbar / Datadog) and resolving any errors. Monitoring Sidekiq's job morgue and addressing any dead jobs. Maintaining libraries in all applications with security updates. Understanding security requirements and scope. Must have knowledge of databases like MySQL, PostgreSQL, SQLite. Good knowledge of deploying applications on servers. 7 years in RoR and 3 years in AngularJS.

Posted 4 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Mumbai, Pune, Chennai

Work from Office


We are seeking a proficient and results-oriented Java/Spring Boot Developer with hands-on experience in building scalable and high-performance systems.
Job Description: Expert in Core Java (Java 11 or 17), J2EE, Spring Boot, JUnit. Should have knowledge of Node.js, Maven, GitHub/Bitbucket. Experience with RESTful services, RabbitMQ, ActiveMQ, JSON, GraphQL, Apache Kafka and PostgreSQL is a plus. Excellent problem-solving/troubleshooting skills on Java/J2EE technologies.
Roles & Responsibilities: Participate in system design discussions, planning and performance tuning. Write clean, testable and scalable code following industry best practices. Resolve technical issues through debugging, research, and investigation. Work in an Agile/Scrum development lifecycle and participate in daily stand-ups and sprint planning. Complete assigned tasks within the given timelines. Continuously learn new technologies.
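As a small, hedged illustration of the Spring Boot and REST skills this posting emphasizes, a minimal self-contained endpoint might look like the following; the class name, path, and response shape are assumed for the example.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;

@SpringBootApplication
@RestController
public class OrderServiceApplication {

    public static void main(String[] args) {
        SpringApplication.run(OrderServiceApplication.class, args);
    }

    // Hypothetical read endpoint; a real service would delegate to a repository/service layer.
    @GetMapping("/orders/{id}")
    public String getOrder(@PathVariable String id) {
        return "{\"id\":\"" + id + "\",\"status\":\"CREATED\"}";
    }
}
```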

Posted 4 days ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Thane

Work from Office


Position Purpose: In the context of development of applications for the Compliance domain of BNPP, the developer will be part of a team of developers, align with the local team lead, take ownership, and deliver quality for all the user stories worked upon. We are looking for a highly skilled backend developer with strong experience in Java 8+, Spring Boot and microservices. The candidate should be comfortable designing and developing scalable backend solutions with NoSQL databases like MongoDB.
Direct Responsibilities: Design and develop backend services using Java 8+, Spring Boot and JUnit. Build and maintain robust RESTful APIs. Integrate with MongoDB and ensure performance and security. Ensure coding standards are followed. Ensure collaboration, good rapport and teamwork with ISPL and Paris team members.
Contributing Responsibilities: Take ownership of and commit to quality deliverables within estimated timelines, avoiding global schedule shift. Participate in code reviews and the documentation process. Contribute to continuous improvement in development practices, processes and code quality. Participate in project meetings: fine-tuning, daily stand-up, retrospective. Collaborate with team members, with the ability to collect, analyze, synthesize and present information in a clear, concise and precise way.
Technical & Behavioural Competencies: Expert in Java 8+ and Spring Boot. RESTful API and microservices architecture. Hands-on experience with MongoDB. Apache Kafka for messaging. JUnit and Spring Boot testing frameworks, and code quality tools like Sonar. API gateways like Apigee and authentication strategies. Clean coding practices. Maven and Swagger tools. Good to have: familiarity with payment systems or related compliance-driven systems; knowledge of Docker, Kubernetes and CI/CD pipelines using GitLab; Angular 2+ and TypeScript, including knowledge of PrimeNG and/or Material UI; experience with integrated AI tools and efficient prompting; knowledge of web security principles (OWASP, two-factor authentication, encryption, etc.); knowledge of hexagonal architecture, event-oriented architecture and DDD.
Specific Qualifications: Experience in Linux, DevOps, IntelliJ, GitLab (CI/CD pipelines), Cloud Object Storage, Kafka.
Behavioural Skills: Ability to collaborate / teamwork. Attention to detail / rigour. Communication skills, oral and written. Ability to deliver / results-driven.
Transversal Skills: Analytical ability. Ability to develop and adapt a process.
Education Level: Bachelor's degree or equivalent. Experience Level: At least 3 years.
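Since the role centres on Spring Boot services backed by MongoDB, the sketch below shows one common way such a service models a collection with Spring Data MongoDB; the document, its fields, and the repository method are hypothetical and not taken from the posting.

```java
import java.util.List;

import org.springframework.data.annotation.Id;
import org.springframework.data.mongodb.core.mapping.Document;
import org.springframework.data.mongodb.repository.MongoRepository;

// Hypothetical compliance-case document; collection and field names are placeholders.
@Document(collection = "compliance_cases")
class ComplianceCase {
    @Id
    private String id;
    private String customerId;
    private String status;
    // getters/setters omitted to keep the sketch short
}

// Spring Data derives the query from the method name at runtime.
interface ComplianceCaseRepository extends MongoRepository<ComplianceCase, String> {
    List<ComplianceCase> findByStatus(String status);
}
```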

Posted 4 days ago

Apply

10.0 - 20.0 years

25 - 40 Lacs

Gurugram, Bengaluru

Hybrid


Greetings from BCforward INDIA TECHNOLOGIES PRIVATE LIMITED.
Contract to Hire (C2H) role. Location: Gurgaon / Bengaluru. Payroll: BCforward. Work Mode: Hybrid.
JD Skills: Java; Apache Kafka; AWS; Spring; microservices; event-driven architecture. Deep knowledge of Java, Spring and Kafka, with good hands-on coding and analytical skills.
Please share your updated resume, PAN card soft copy, passport-size photo and UAN history. Interested applicants can share an updated resume to g.sreekanth@bcforward.com.
Note: Looking for immediate joiners, or within 30 days at most. All the best.

Posted 4 days ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Pune

Work from Office


Expertise in the following areas: Java, Spring MVC, Spring Boot, Docker, MariaDB, MongoDB, NoSQL, Maven, JUnit, Mockito, SAML, XML, object-oriented design and development, Apache Ant, relational databases (MySQL), Hibernate, HTML, J2EE 1.6+, PostgreSQL.

Posted 5 days ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Hyderabad

Work from Office


Monday to Friday (WFO). Timings: 9 am to 6 pm.
Desired Skills & Expertise: Strong experience and mathematical understanding in one or more of Natural Language Understanding, Computer Vision, Machine Learning, and Optimization. Proven track record in effectively building and deploying ML systems using frameworks such as PyTorch, TensorFlow, Keras, scikit-learn, etc. Expertise in modular, typed, and object-oriented Python programming. Proficiency with core data science languages (such as Python, R, Scala), and familiarity and flexibility with data systems (e.g., SQL, NoSQL, knowledge graphs). Experience with financial data analysis, time series forecasting, and risk modeling. Knowledge of financial regulations and compliance requirements in the fintech industry. Familiarity with cloud platforms (AWS, GCP, or Azure) and containerization technologies (Docker, Kubernetes). Understanding of blockchain technology and its applications in fintech. Experience with real-time data processing and streaming analytics (e.g., Apache Kafka, Apache Flink). Excellent communication skills with a desire to work in multidisciplinary teams. Ability to explain complex technical concepts to non-technical stakeholders.

Posted 6 days ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad

Work from Office


Design, develop, and maintain scalable microservices using Java, Kotlin, and Spring Boot. Build and optimize data models and queries in MongoDB. Integrate and manage Apache Kafka for real-time data streaming and messaging. Implement CI/CD pipelines using Jenkins.
Required Candidate Profile: 6+ years of experience in backend development with Java/Spring Boot. Experience with Jenkins for CI/CD automation. Familiarity with AWS services (EC2, S3, Lambda, RDS, etc.) or OpenShift for container orchestration.
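For the Kafka integration mentioned above, a Spring Boot microservice would typically consume events with spring-kafka roughly as sketched below; the topic, group id, and payload handling are illustrative assumptions.

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Service;

@Service
public class InventoryEventListener {

    // Topic and group id are hypothetical; spring-kafka wires the consumer from application properties.
    @KafkaListener(topics = "inventory-events", groupId = "inventory-service")
    public void onEvent(String payload) {
        // Deserialize and hand off to the domain layer; kept trivial for the sketch.
        System.out.println("Received inventory event: " + payload);
    }
}
```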

Posted 1 week ago

Apply

6.0 - 9.0 years

16 - 22 Lacs

Hyderabad, Pune, Chennai

Work from Office


Full Stack Developer - Java / Angular / Spring Boot / Kotlin / Kafka
We are seeking a talented Full Stack Developer experienced in Java, Kotlin, Spring Boot, Angular, and Apache Kafka to join our dynamic engineering team. The ideal candidate will design, develop, and maintain end-to-end web applications and real-time data processing solutions, leveraging modern frameworks and event-driven architectures.
Location: Offshore. Timings: Until US EST noon hours. Experience: 4-6 years.
Key Responsibilities: Design, develop, and maintain scalable web applications using Java, Kotlin, Spring Boot, and Angular. Build and integrate RESTful APIs and microservices to connect frontend and backend components. Develop and maintain real-time data pipelines and event-driven features using Apache Kafka. Collaborate with cross-functional teams (UI/UX, QA, DevOps, Product) to define, design, and deliver new features. Write clean, efficient, and well-documented code following industry best practices and coding standards. Participate in code reviews, provide constructive feedback, and ensure code quality and consistency. Troubleshoot and resolve application issues, bugs, and performance bottlenecks in a timely manner. Optimize applications for maximum speed, scalability, and security. Stay updated with the latest industry trends, tools, and technologies, and proactively suggest improvements. Participate in Agile/Scrum ceremonies and contribute to continuous integration and delivery pipelines.
Required Qualifications: Experience with cloud-based technologies and deployment (Azure, GCP). Familiarity with containerization (Docker, Kubernetes) and microservices architecture. Proven experience as a Full Stack Developer with hands-on expertise in Java, Kotlin, Spring Boot, and Angular (Angular 2+). Strong understanding of object-oriented and functional programming principles. Experience designing and implementing RESTful APIs and integrating them with frontend applications. Proficiency in building event-driven and streaming applications using Apache Kafka. Experience with SQL and NoSQL database systems and ORM frameworks (e.g., Hibernate, JPA). Familiarity with version control systems (Git) and CI/CD pipelines. Good understanding of HTML5, CSS3, JavaScript, and TypeScript. Experience with Agile development methodologies and working collaboratively in a team environment. Excellent problem-solving, analytical, and communication skills.
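On the event-driven side of such a stack, publishing from a Spring Boot service is commonly done through KafkaTemplate, as in the minimal sketch below; the topic name and event payload are assumptions made for the example.

```java
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.stereotype.Service;

@Service
public class CheckoutEventPublisher {

    private final KafkaTemplate<String, String> kafkaTemplate;

    public CheckoutEventPublisher(KafkaTemplate<String, String> kafkaTemplate) {
        this.kafkaTemplate = kafkaTemplate;
    }

    // "checkout-events" is a placeholder topic; serializers come from application properties.
    public void publishCheckoutCompleted(String orderId) {
        kafkaTemplate.send("checkout-events", orderId,
            "{\"orderId\":\"" + orderId + "\",\"status\":\"COMPLETED\"}");
    }
}
```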

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Pune, Hinjewadi

Work from Office


Job Summary: Synechron is seeking an experienced and technically proficient Senior PySpark Data Engineer to join our data engineering team. In this role, you will be responsible for developing, optimizing, and maintaining large-scale data processing solutions using PySpark. Your expertise will support our organization's efforts to leverage big data for actionable insights, enabling data-driven decision-making and strategic initiatives.
Software Requirements. Required Skills: Proficiency in PySpark; familiarity with Hadoop ecosystem components (e.g., HDFS, Hive, Spark SQL); experience with Linux/Unix operating systems; data processing tools like Apache Kafka or similar streaming platforms. Preferred Skills: Experience with cloud-based big data platforms (e.g., AWS EMR, Azure HDInsight); knowledge of Python (beyond PySpark), Java or Scala relevant to big data applications; familiarity with data orchestration tools (e.g., Apache Airflow, Luigi).
Overall Responsibilities: Design, develop, and optimize scalable data processing pipelines using PySpark. Collaborate with data engineers, data scientists, and business analysts to understand data requirements and deliver solutions. Implement data transformations, aggregations, and extraction processes to support analytics and reporting. Manage large datasets in distributed storage systems, ensuring data integrity, security, and performance. Troubleshoot and resolve performance issues within big data workflows. Document data processes, architectures, and best practices to promote consistency and knowledge sharing. Support data migration and integration efforts across varied platforms.
Strategic Objectives: Enable efficient and reliable data processing to meet organizational analytics and reporting needs. Maintain high standards of data security, compliance, and operational durability. Drive continuous improvement in data workflows and infrastructure.
Performance Outcomes & Expectations: Efficient processing of large-scale data workloads with minimum downtime. Clear, maintainable, and well-documented code. Active participation in team reviews, knowledge transfer, and innovation initiatives.
Technical Skills (by Category). Programming Languages: Required: PySpark (essential); Python (needed for scripting and automation). Preferred: Java, Scala. Databases/Data Management: Required: Experience with distributed data storage (HDFS, S3, or similar) and data warehousing solutions (Hive, Snowflake). Preferred: Experience with NoSQL databases (Cassandra, HBase). Cloud Technologies: Required: Familiarity with deploying and managing big data solutions on cloud platforms such as AWS (EMR), Azure, or GCP. Preferred: Cloud certifications. Frameworks and Libraries: Required: Spark SQL, Spark MLlib (basic familiarity). Preferred: Integration with streaming platforms (e.g., Kafka), data validation tools. Development Tools and Methodologies: Required: Version control systems (e.g., Git), Agile/Scrum methodologies. Preferred: CI/CD pipelines, containerization (Docker, Kubernetes). Security Protocols: Optional: Basic understanding of data security practices and compliance standards relevant to big data management.
Experience Requirements: 7+ years of experience in big data environments with hands-on PySpark development. Proven ability to design and implement large-scale data pipelines. Experience working with cloud and on-premises big data architectures. Preference for candidates with domain-specific experience in finance, banking, or related sectors. Candidates with substantial related experience and strong technical skills in big data, even from different domains, are encouraged to apply.
Day-to-Day Activities: Develop, test, and deploy PySpark data processing jobs to meet project specifications. Collaborate in multi-disciplinary teams during sprint planning, stand-ups, and code reviews. Optimize existing data pipelines for performance and scalability. Monitor data workflows, troubleshoot issues, and implement fixes. Engage with stakeholders to gather new data requirements, ensuring solutions are aligned with business needs. Contribute to documentation, standards, and best practices for data engineering processes. Support the onboarding of new data sources, including integration and validation.
Decision-Making Authority & Responsibilities: Identify performance bottlenecks and propose effective solutions. Decide on appropriate data processing approaches based on project requirements. Escalate issues that impact project timelines or data integrity.
Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field; equivalent experience considered. Relevant certifications are preferred: Cloudera, Databricks, AWS Certified Data Analytics, or similar. Commitment to ongoing professional development in data engineering and big data technologies. Demonstrated ability to adapt to evolving data tools and frameworks.
Professional Competencies: Strong analytical and problem-solving skills, with the ability to model complex data workflows. Excellent communication skills to articulate technical solutions to non-technical stakeholders. Effective teamwork and collaboration in a multidisciplinary environment. Adaptability to new technologies and emerging trends in big data. Ability to prioritize tasks effectively and manage time in fast-paced projects. Innovation mindset, actively seeking ways to improve data infrastructure and processes.

Posted 1 week ago

Apply

3.0 - 6.0 years

20 - 30 Lacs

Bengaluru

Work from Office

Naukri logo

Skills Required: Problem Solving, Python, Shell Scripting
Location: Bangalore, Karnataka
Desirable Skills: Apache Kafka, Apache Pulsar, Ansible, GitHub

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 32 Lacs

Noida, Hyderabad, Delhi / NCR

Hybrid

Naukri logo

Role: Java Backend Developer. Location: Greater Noida. Experience: 7+ years. Notice Period: Immediate to 30 days.
Must Have: Java, Microservices, Spring Boot, AWS, Kafka, DevOps.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Bengaluru

Work from Office

Naukri logo

About this role: Wells Fargo is seeking a highly motivated, dynamic individual for a Lead Software Engineer role within Core Banking Deposits in Enterprise Data & Platforms/Consumer Technology. This position will play a key role in modernizing the Transaction Processing platform that will provide invariant capabilities for the enterprise. Experience in Cloud, DevSecOps, Domain Driven Design and Architecture is foundational and key to the role. This role is a key liaison with various internal teams like engineering & delivery, platform operations, product, risk and compliance. The position will be part of the engineering/delivery team and drive technology transformation and resiliency efforts for transaction processing capabilities. This individual will be a technical expert in the design and development of very complex applications/capabilities within the transaction processing ecosystem: analyze complex business requirements, design and/or redesign existing applications, provide direction to the application development/engineering team, and create/maintain any/all system interface artifacts and any other compliance policy and procedure documents. This position will act as the lead in ensuring that all non-functional requirements (as required by the enterprise) have been documented, tested and implemented successfully in the production environment, ensuring production availability and stability as the number one priority.
In this role, you will: Lead complex technology initiatives including those that are companywide with broad impact. Act as a key participant in developing standards and companywide best practices for engineering complex and large-scale technology solutions for technology engineering disciplines. Design, code, test, debug, and document for projects and programs. Review and analyze complex, large-scale technology solutions for tactical and strategic business objectives, enterprise technological environment, and technical challenges that require in-depth evaluation of multiple factors, including intangibles or unprecedented technical factors. Make decisions in developing standard and companywide best practices for engineering and technology solutions requiring understanding of industry best practices and new technologies, influencing and leading the technology team to meet deliverables and drive new initiatives. Collaborate and consult with key technical experts, senior technology team, and external industry groups to resolve complex technical issues and achieve goals. Lead projects and teams, or serve as a peer mentor.
Required Qualifications: 5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.
Desired Qualifications: Bachelor's degree or higher in computer science or engineering. Experience with Agile Scrum (daily stand-up, sprint planning and sprint retrospective meetings) and/or Kanban. Familiarity with event-driven architecture. 3+ years working with configuration and monitoring technologies such as Ansible, Grafana, Elastic, Splunk, Prometheus, GitHub, Maven. 5+ years of Java / J2EE / Spring / Spring Boot experience. 5+ years of experience developing enterprise applications using open-source technologies such as APIs, microservices, REST, SOAP, IBM MQ, Apache Kafka, Swagger, etc. 5+ years of experience with any relational and/or NoSQL/document databases like Cassandra, MongoDB, Oracle or Postgres. 5+ years of experience with Test Driven Development (TDD), unit testing, integration testing, API testing, performance testing, and functional testing. 3+ years of experience supporting enterprise-level complex applications and platforms in production. 1+ years of experience with any of the cloud technologies such as AWS, Azure, OpenShift, Pivotal Cloud Foundry, Kubernetes, Docker, Terraform.
Job Expectations: Lead, design, and develop complex core banking systems. Deliver Gen AI solutions to improve efficiency. Provide domain and technical expertise to the team. Support production issues and research.

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Hyderabad

Work from Office


Grade Level (for internal use): 10
Market Intelligence. The Role: Senior Full Stack Developer, Grade Level 10.
The Team: You will work with a team of intelligent, ambitious, and hard-working software professionals. The team is responsible for the architecture, design, development, quality, and maintenance of the next-generation financial data web platform. Other responsibilities include transforming product requirements into technical design and implementation. You will be expected to participate in the design review process, write high-quality code, and work with a dedicated team of QA Analysts and Infrastructure Teams.
The Impact: Market Intelligence is seeking a Software Developer to create software design, development, and maintenance for data processing applications. This person would be part of a development team that manages and supports the internal and external applications supporting the business portfolio. This role expects a candidate to handle any data processing or big data application development. We have teams made up of people that learn how to work effectively together while working with the larger group of developers on our platform.
What's in it for you: Opportunity to contribute to the development of a world-class Platform Engineering team. Engage in a highly technical, hands-on role designed to elevate team capabilities and foster continuous skill enhancement. Be part of a fast-paced, agile environment that processes massive volumes of data, ideal for advancing your software development and data engineering expertise while working with a modern tech stack. Contribute to the development and support of Tier-1, business-critical applications that are central to operations. Gain exposure to and work with cutting-edge technologies including AWS Cloud, EMR and Apache NiFi. Grow your career within a globally distributed team, with clear opportunities for advancement and skill development.
Responsibilities: Design and develop applications, components, and common services based on development models, languages and tools, including unit testing, performance testing and monitoring, and implementation. Support business and technology teams as necessary during design, development and delivery to ensure scalable and robust solutions. Build data-intensive applications and services to support and enhance fundamental financials in appropriate technologies (C#, .NET Core, Databricks, Spark, Python, Scala, NiFi, SQL). Build data modeling, achieve performance tuning and apply data architecture concepts. Develop applications adhering to secure coding practices and industry-standard coding guidelines, ensuring compliance with security best practices (e.g., OWASP) and internal governance policies. Implement and maintain CI/CD pipelines to streamline build, test, and deployment processes; develop comprehensive unit test cases and ensure code quality. Provide operations support to resolve issues proactively and with utmost urgency. Effectively manage time and multiple tasks. Communicate effectively, especially in writing, with the business and other technical groups.
What We're Looking For. Basic Qualifications: Bachelor's/Master's degree in Computer Science, Information Systems or equivalent. Minimum 5 to 8 years of strong hands-on development experience in C#, .NET Core, cloud-native, and MS SQL Server backend development. Proficiency with object-oriented programming. Advanced SQL programming skills. Preferred experience or familiarity with tools and technologies such as OData, Grafana, Kibana, big data platforms, Apache Kafka, GitHub, AWS EMR, Terraform, and emerging areas like AI/ML and GitHub Copilot. Highly recommended skill set in Databricks, Spark, and Scala technologies. Understanding of database performance tuning in large datasets. Ability to manage multiple priorities efficiently and effectively within specific timeframes. Excellent logical, analytical and communication skills are essential, with strong verbal and writing proficiencies. Knowledge of Fundamentals or the financial industry highly preferred. Experience in conducting application design and code reviews. Proficiency with the following technologies: object-oriented programming; programming languages (C#, .NET Core); cloud computing; database systems (SQL, MS SQL). Nice to have: NoSQL (Databricks, Spark, Scala, Python), scripting (Bash, Scala, Perl, PowerShell).
Preferred Qualifications: Hands-on experience with cloud computing platforms including AWS, Azure, or Google Cloud Platform (GCP). Proficient in working with Snowflake and Databricks for cloud-based data analytics and processing.
Benefits: Health & Wellness: health care coverage designed for the mind and body. Continuous Learning: access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: it's not just about you; S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families. Beyond the Basics: from retail discounts to referral incentive awards, small perks can make a big difference.

Posted 1 week ago

Apply

10.0 - 15.0 years

12 - 18 Lacs

Maharashtra

Work from Office

Naukri logo

Staff Software Engineers are the technology leaders of our highest impact projects. Your high energy is contagious, you actively collaborate with others across the engineering organization, and you seek to learn as much as you like to teach. You personify the notion of constant improvement as you work with your team and the larger engineering group to build software that delivers on our mission. You use your extraordinary technical competence to ensure a high bar for excellence while you mentor other engineers on their own path towards craftsmanship. You are most likely T-shaped, with broad knowledge across many technologies plus strong skills in a specific area. Staff Software Engineers embrace the opportunity to represent HMH in industry groups and open-source communities.
Area of Responsibility: You will be working on the HMH Assessment Platform, part of the HMH Educational Online/Digital Learning Platform. The Assessment team builds a highly scalable and available platform. The platform is built using a microservices architecture: Java microservices backend, React JavaScript UI frontend, REST APIs, Postgres database, AWS cloud technologies, AWS Kafka, Kubernetes or Mesos orchestration, DataDog for logging/monitoring/alerting, Concourse CI or Jenkins, Maven, etc.
Responsibilities: Be the technical lead for feature development in a team of 5-10 engineers, influencing the technical direction of the overall engineering organization. Decompose business objectives into valuable, incrementally releasable user features, accurately estimating the effort to complete each. Contribute code to feature development efforts, demonstrating to others efficient design, delivery and testing patterns and techniques. Strive for high-quality outcomes; continuously look for ways to improve team productivity and product reliability, performance, and security. Develop the talents and abilities of peers and colleagues. Create a memorable legacy as you progress toward your personal and professional objectives. Foster your personal and professional development, continually seeking assignments that challenge you.
Skills & Experience: Successful candidates must demonstrate an appropriate combination of: 10+ years of experience as a software engineer. 3+ years of experience as a Staff or lead software engineer. Bachelor's degree in computer science or a STEM field. A portfolio of thought leadership and individual technical accomplishments. Full understanding of Agile software development methodologies and practices. Strong communication skills, both verbal and written.
Extensive experience working with technologies and concepts such as: behavior-driven or test-driven development; JVM-based languages such as Java and Scala; development frameworks such as Spring Boot; asynchronous programming concepts, including event processing; database technologies such as SQL, Postgres/MySQL, AWS Aurora DBs, Redshift, Liquibase or Flyway; NoSQL technologies such as Redis, MongoDB and Cassandra; streaming technologies such as Apache Kafka, Apache Spark or Amazon Kinesis; unit-testing frameworks such as JUnit; performance testing frameworks such as Gatling; architectural concepts such as microservices and separation of concerns; expert knowledge of class-based, object-oriented programming and design patterns; development tools such as GitHub, Jira, Jenkins, Concourse, and Maven; cloud technologies such as AWS and Azure; data center operating technologies such as Kubernetes, Apache Mesos, Apache Aurora, and Terraform, and container services such as Docker and Kubernetes; monitoring and operational data analysis practices and tools such as DataDog, Splunk and ELK.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Noida

Work from Office


Java Technical Lead - Enterprise Solutions & AI Integration Specialist
About the Role: We are seeking an experienced Senior Java Developer to join our team in building modern enterprise applications with AI capabilities. You will work on mission-critical systems involving real-time data processing, automated workflows, and intelligent business solutions. This role offers the opportunity to work with cutting-edge technologies while developing scalable, cloud-native applications.
Required Experience: 5+ years of experience in Java development. 5+ years of experience with Spring Boot / Spring Cloud and microservices. Strong experience with Java 11/17, Spring Boot / Spring Cloud, Apache Kafka, containerization (Docker, Kubernetes), and AI/ML integration.
Key Responsibilities: Design and implement scalable microservices using Spring Boot. Build robust error processing pipelines using Apache Kafka. Integrate with the Zendesk API for automated ticket management. Implement AI-powered error classification and resolution. Create and maintain CI/CD pipelines. Write clean, maintainable, and well-tested code. Mentor junior developers and conduct code reviews.
Technical Skills Required: Java 11/17, Spring Boot 3.x, Apache Kafka, Docker & Kubernetes, Maven/Gradle, JUnit/Mockito, Git, Jenkins. AI/ML: experience with AI/ML frameworks, integration with AI services, ML model deployment, Natural Language Processing.
Required Qualifications: Bachelor's/Master's degree in Computer Science or a related field. Strong understanding of distributed systems. Experience with high-throughput message processing. Solid understanding of RESTful architecture. Experience with agile development methodologies.
Preferred Qualifications: Experience with Zendesk API integration. Knowledge of error management systems. Experience with AI/ML model integration. Understanding of ITIL practices. Experience with cloud platforms (AWS/Azure/GCP).
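The error-pipeline responsibility above can be pictured as a small Kafka Streams topology that routes failing events to a dedicated topic; the topic names and the filtering rule below are illustrative assumptions, not the employer's actual design.

```java
import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class ErrorRoutingStream {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "error-router");      // hypothetical app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> events = builder.stream("application-logs");
        // Route messages containing "ERROR" to a dedicated topic for downstream classification.
        events.filter((key, value) -> value != null && value.contains("ERROR"))
              .to("error-events");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```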

Posted 2 weeks ago

Apply

8.0 - 10.0 years

35 - 40 Lacs

Bengaluru

Work from Office


Job Responsibilities: Collaborate with Product and Engineering stakeholders to design and build platform services that meet key product and infrastructure requirements. Produce detailed designs for platform-level services. Evaluate software and products against business requirements and turn business requirements into robust technical solutions fitting into corporate standards and strategy. Design and implement microservices with thoughtfully defined APIs. Be conversant with frameworks and architectures: Spring Boot, Spring Cloud, Spring Batch, messaging frameworks (like Kafka), and microservice architecture. Work with other areas of the technology team to realize end-to-end solutions and estimation for delivery proposals. Have a sound understanding of Java concepts and of the technologies in the various architecture tiers (presentation, middleware, data access and integration) to propose solutions using Java and open-source technologies. Design modules that are scalable, reusable, modular and secure. Clearly communicate design decisions, roadblocks and timelines to key stakeholders. Adhere to all industry best practices and standards for the Agile/Scrum frameworks adopted by the organization, including but not limited to daily stand-ups, grooming, planning, retrospectives, sprint reviews, demos, and analytics via systems (JIRA) administration to directly support initiatives set by Product Management and the organization at large. Actively participate in production stabilization and lead system software improvements along with team members.
Technical Skills: The candidate should have at least 8+ years of total experience in IT software development/design architecture. 3+ years of experience as an architect building distributed, highly available and scalable, microservice-based cloud-native architecture. Experience in one or more open-source Java frameworks such as Spring Boot, Spring Batch, Quartz, Spring Cloud, Spring Security, BPM, etc. Experience in a single-page web application framework like Angular. Experience with at least one messaging system (Apache Kafka (required), RabbitMQ). Experience with at least one RDBMS (MySQL, PostgreSQL, Oracle). Experience with at least one document-oriented DB (MongoDB, preferably Couchbase DB). Experience with a NoSQL DB like Elasticsearch. Proficient in creating design documents (LLD documents with UML). Good exposure to design patterns, microservices architecture design patterns and 12-factor applications. Experience working with observability/monitoring frameworks (Prometheus/Grafana, ELK) along with any APM tool. Ability to conceptualize end-to-end system components across a wide range of technologies and translate them into architectural design patterns for implementation. Knowledge of security systems like OAuth 2, Keycloak and SAML. Familiarity with source code version control systems like Git/SVN. Experience using, designing, and building REST/gRPC/GraphQL/web service APIs. Production experience with container orchestration (Docker, Kubernetes, CI/CD) and maintaining production environments. Good understanding of public clouds (GCP, AWS, etc.). Good exposure to API gateways and config servers. Familiar with OWASP. Experience in Telecom BSS (Business Support System) for CRM components is an added advantage. Immediate joiner / 30 days.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

20 - 25 Lacs

Udaipur

Work from Office


Required Skills: Expert in Python, with knowledge of at least one Python web framework (such as Django, Flask, etc., depending on your technology stack). Familiarity with some ORM (Object Relational Mapper) libraries. Able to integrate multiple data sources and databases into one system. Understanding of the threading limitations of Python and of multi-process architecture. Good understanding of server-side templating languages (such as Jinja2, Mako, etc., depending on your technology stack). Basic understanding of front-end technologies, such as JavaScript, HTML5, and CSS3. Understanding of accessibility and security compliance. Knowledge of user authentication and authorization between multiple systems, servers, and environments. Understanding of fundamental design principles behind a scalable application. Familiarity with event-driven programming in Python. Understanding of the differences between multiple delivery platforms, such as mobile vs desktop, and optimizing output to match the specific platform. Able to create database schemas that represent and support business processes. Design, develop, and maintain microservices using Python to ensure high performance and scalability. Collaborate with cross-functional teams to define and implement microservices architecture best practices. Design, implement, and maintain systems that utilize queueing services for asynchronous communication. Integrate and configure queueing services like RabbitMQ or Apache Kafka within the application architecture. Strong unit test and debugging skills. Proficient understanding of code versioning tools. Work collaboratively with the design team to understand end-user requirements, to provide technical solutions and for the implementation of new software features. Knowledge of the application deployment process and server setup.
Responsibilities: Develop reusable, testable, and efficient code. Implement moderately complex applications and features following the underlying architectural decisions. Collaborate with team members to follow established development guidelines. Integrate and manage data storage solutions with a focus on execution. Design and implement low-latency, high-availability, and performant applications.

Posted 3 weeks ago

Apply

4.0 - 5.0 years

8 - 16 Lacs

Hyderabad

Hybrid


Role & Responsibilities: We are seeking a Senior Java Developer with strong experience in Spring Boot, AWS, Apache Kafka, and React JS to join our fast-growing development team. The ideal candidate will have a solid background in designing scalable microservices, hands-on cloud deployment, and BPM integration using Groovy scripts and interceptors.
Key Responsibilities: Design, develop, and maintain scalable Java microservices using Spring Boot. Work with AWS services (EC2, Lambda, S3, Glue, EKS) to deploy and manage applications in a cloud environment. Develop and manage Kafka producers and consumers, and handle topic/partition configurations. Design and optimize PostgreSQL schemas and complex queries. Collaborate with frontend developers to integrate APIs with the React JS UI. Write JUnit test cases and ensure code coverage with tools like JaCoCo and SonarQube. Implement JWT-based API security standards. Build and maintain CI/CD pipelines and participate in DevOps processes. Integrate business workflows using BPMN, Groovy scripting, and event listeners. Monitor and troubleshoot using Prometheus, Grafana, and centralized logging tools. Mentor junior developers and collaborate in Agile/Scrum ceremonies.
Required Skills: Java (13+), Spring Boot, REST APIs. Apache Kafka (topics, partitions, producer/consumer APIs). AWS (EC2, Lambda, S3, Glue, EKS). Docker, Kubernetes. PostgreSQL (schema design, indexing, optimization). JUnit, JaCoCo, SonarQube. JWT, API security. React JS. Git, Agile/Scrum. BPM tools, Groovy scripting, event listeners, interceptors. Monitoring tools: Prometheus, Grafana.
Interested candidates can share their resume to sarvani.j@ifinglobalgroup.com
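To illustrate the producer/consumer work called out above, here is a minimal consumer poll loop using the plain Java client; the broker address, topic, and group id are placeholder assumptions.

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PaymentEventConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // placeholder broker
        props.put("group.id", "payment-processor");          // hypothetical consumer group
        props.put("key.deserializer", StringDeserializer.class.getName());
        props.put("value.deserializer", StringDeserializer.class.getName());

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("payment-events"));    // placeholder topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Partition and offset are available per record for topic/partition-aware handling.
                    System.out.printf("partition=%d offset=%d value=%s%n",
                        record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}
```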

Posted 3 weeks ago

Apply

8.0 - 10.0 years

40 - 45 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office


Roles & Responsibilities:
Data Engineering Leadership & Strategy: Lead and mentor a team of data engineers, fostering a culture of technical excellence and collaboration. Define and implement data engineering best practices, standards, and processes.
Data Pipeline Architecture & Development: Design, build, and maintain scalable, robust, and efficient data pipelines for ingestion, transformation, and loading of data from various sources. Optimize data pipelines for performance, reliability, and cost-effectiveness. Implement data quality checks and monitoring systems to ensure data integrity. Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver high-quality data solutions.
Cloud-Based Data Infrastructure: Design, implement, and manage cloud-based data infrastructure using platforms like AWS, Azure, or GCP. Leverage cloud services (e.g., data lakes, data warehouses, serverless computing) to build scalable and cost-effective data solutions. Leverage open-source tools such as Airbyte, Mage AI, and similar. Ensure data security, governance, and compliance within the cloud environment.
Data Modeling & Warehousing: Design and implement data models to support business intelligence, reporting, and analytics. Optimize data warehouse performance for efficient querying and reporting.
Collaboration & Communication: Collaborate effectively with cross-functional teams including product managers, software engineers, and business stakeholders.
Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 8+ years of proven experience in data engineering, with at least 3+ years in a lead role. Expertise in building and maintaining data pipelines using tools such as Apache Spark, Apache Kafka, Apache Beam, or similar. Proficiency in SQL and one or more programming languages like Python, Java, or Scala. Hands-on experience with cloud-based data platforms (AWS, Azure, GCP) and services.
Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote. Work Timings: 2.30 pm - 11.30 pm IST.
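One common shape for the ingestion pipelines described here is Spark Structured Streaming reading from Kafka and landing data in a lake. The sketch below uses Spark's Java API purely for illustration (the listing also accepts Java); the broker, topic, and paths are assumptions, and it presumes the spark-sql-kafka connector is on the classpath and the job is launched via spark-submit.

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQuery;

public class KafkaIngestJob {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("kafka-ingest")                      // hypothetical job name
                .getOrCreate();

        // Read a Kafka topic as a streaming DataFrame; broker and topic are placeholders.
        Dataset<Row> events = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092")
                .option("subscribe", "raw-events")
                .load()
                .selectExpr("CAST(key AS STRING)", "CAST(value AS STRING)");

        // Land the stream as Parquet files, with checkpointing for fault tolerance.
        StreamingQuery query = events.writeStream()
                .format("parquet")
                .option("path", "/data/raw-events")           // illustrative output path
                .option("checkpointLocation", "/checkpoints/raw-events")
                .start();

        query.awaitTermination();
    }
}
```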

Posted 3 weeks ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Bengaluru

Work from Office


Roles and Responsibilities: Implement the design and architecture of complex web applications using the Angular framework, ensuring adherence to best practices and architectural principles. Collaborate closely with product managers, UX/UI designers, and development teams to translate business requirements into technical specifications and architectural designs. Define and implement scalable and maintainable front-end architecture, including component-based architecture, state management, and data flow patterns. Provide technical guidance and mentorship to development teams, promoting code quality, performance optimization, and maintainability. Conduct code reviews and architectural reviews to ensure compliance with established standards and design guidelines. Evaluate and recommend tools, libraries, and frameworks to enhance productivity and efficiency in Angular development. Stay current with industry trends and emerging technologies related to front-end development, and incorporate them into our architectural roadmap. Drive continuous improvement initiatives to streamline development processes, increase development velocity, and elevate overall product quality.
Preferred Skills: Knowledge of continuous integration. Excellent teamwork and communication abilities. Excellent organizational and time management abilities. Effective scrum master experience. Good to have knowledge of API design using SwaggerHub. Good to have knowledge of the SignalR API for web functionality implementation and data broadcasting.
Skill Requirements: Bachelor/Master of Engineering or equivalent in Computers/Electronics and Communication with 8+ years of experience. Proven experience as a Software Architect, Solution Architect, or Senior Full Stack Developer, or in web application development. Hands-on experience in C# and ASP.NET development. Expert-level proficiency in the Angular framework and its ecosystem (Angular CLI, RxJS, Angular Material and related technologies). Expert-level proficiency in designing and implementing microservices-based applications, with a strong understanding of microservices design principles, patterns, and best practices. Architect-level cloud certification is recommended. Deep knowledge of front-end development technologies such as HTML5, CSS3, JavaScript/TypeScript, and RESTful APIs. Experience with state management libraries (e.g., NgRx, Redux) and reactive programming concepts. Strong understanding of software design principles, design patterns, and architectural styles, with a focus on building scalable and maintainable front-end architectures. Excellent communication and collaboration skills, with the ability to effectively convey technical concepts to non-technical stakeholders. Experience working in Agile/Scrum development environments and familiarity with DevOps practices is a plus. Experience working in multiple cloud environments: Azure, AWS web services and GCP. Experience in developing and consuming web services (gRPC). Strong knowledge of RESTful APIs, HTTP protocols, JSON, XML and microservices using serverless cloud technologies. Design, implementation and integration of data storage solutions like databases, key-value stores and blob stores. User authentication and authorization between multiple systems, servers, and environments. Management of the hosting environment and deployment of update packages. Excellent analytical and problem-solving abilities. Strong understanding of object-oriented programming. Strong unit test and debugging skills. Proficient understanding of code versioning tools such as Git and SVN. Hands-on experience with PostgreSQL databases. Knowledge of Azure IoT, MQTT, Apache Kafka, Kubernetes and Docker is a plus. Experience with version control systems such as Git and SVN. Good understanding of Agile-based software development and the software delivery process. Experience in requirements management tools like Polarion (preferable) or any other requirements management system. Excellent communication and collaboration abilities, with the capacity to work effectively in cross-functional teams.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

5 - 8 Lacs

Chennai

Work from Office

Naukri logo

Responsibilities - What you'll do: Engineer, test, document and manage GCP Dataproc, Dataflow and Vertex AI services used in high-performance data processing pipelines and machine learning. Help developers optimize data processing jobs using Spark, Python, and Java. Collaborate with development teams to integrate data processing pipelines with other cloud services and applications. Utilize Terraform and Tekton for infrastructure as code (IaC) and CI/CD pipelines, ensuring efficient deployment and management.
Good to have: Experience with Spark for large-scale data processing. Solid understanding of and experience with GitHub for version control and collaboration. Experience with Terraform for infrastructure management and Tekton for continuous integration and deployment. Experience with Apache NiFi for data flow automation. Knowledge of Apache Kafka for real-time data streaming. Familiarity with Google Cloud Pub/Sub for event-driven systems and messaging. Familiarity with Google BigQuery.
Mandatory Key Skills: Python, Java, Google Cloud Pub/Sub, Apache Kafka, BigQuery, CI/CD*, Machine Learning*, Spark*

Posted 3 weeks ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Chennai

Work from Office

Naukri logo

Roles and Responsibilities: Implement the design and architecture of complex web applications using the Angular framework, ensuring adherence to best practices and architectural principles. Collaborate closely with product managers, UX/UI designers, and development teams to translate business requirements into technical specifications and architectural designs. Define and implement scalable and maintainable front-end architecture, including component-based architecture, state management, and data flow patterns. Provide technical guidance and mentorship to development teams, promoting code quality, performance optimization, and maintainability. Conduct code reviews and architectural reviews to ensure compliance with established standards and design guidelines. Evaluate and recommend tools, libraries, and frameworks to enhance productivity and efficiency in Angular development. Stay current with industry trends and emerging technologies related to front-end development, and incorporate them into our architectural roadmap. Drive continuous improvement initiatives to streamline development processes, increase development velocity, and elevate overall product quality.
Preferred Skills: Knowledge of continuous integration. Excellent teamwork and communication abilities. Excellent organizational and time management abilities. Effective scrum master experience. Good to have knowledge of API design using SwaggerHub. Good to have knowledge of the SignalR API for web functionality implementation and data broadcasting.
Skill Requirements: Bachelor/Master of Engineering or equivalent in Computers/Electronics and Communication with 8+ years of experience. Proven experience as a Software Architect, Solution Architect, or Senior Full Stack Developer, or in web application development. Hands-on experience in C# and ASP.NET development. Expert-level proficiency in the Angular framework and its ecosystem (Angular CLI, RxJS, Angular Material and related technologies). Expert-level proficiency in designing and implementing microservices-based applications, with a strong understanding of microservices design principles, patterns, and best practices. Architect-level cloud certification is recommended. Deep knowledge of front-end development technologies such as HTML5, CSS3, JavaScript/TypeScript, and RESTful APIs. Experience with state management libraries (e.g., NgRx, Redux) and reactive programming concepts. Strong understanding of software design principles, design patterns, and architectural styles, with a focus on building scalable and maintainable front-end architectures. Excellent communication and collaboration skills, with the ability to effectively convey technical concepts to non-technical stakeholders. Experience working in Agile/Scrum development environments and familiarity with DevOps practices is a plus. Experience working in multiple cloud environments: Azure, AWS web services and GCP. Experience in developing and consuming web services (gRPC). Strong knowledge of RESTful APIs, HTTP protocols, JSON, XML and microservices using serverless cloud technologies. Design, implementation and integration of data storage solutions like databases, key-value stores and blob stores. User authentication and authorization between multiple systems, servers, and environments. Management of the hosting environment and deployment of update packages. Excellent analytical and problem-solving abilities. Strong understanding of object-oriented programming. Strong unit test and debugging skills. Proficient understanding of code versioning tools such as Git and SVN. Hands-on experience with PostgreSQL databases. Knowledge of Azure IoT, MQTT, Apache Kafka, Kubernetes and Docker is a plus. Experience with version control systems such as Git and SVN. Good understanding of Agile-based software development and the software delivery process. Experience in requirements management tools like Polarion (preferable) or any other requirements management system. Excellent communication and collaboration abilities, with the capacity to work effectively in cross-functional teams.
Keywords: RxJS, Angular Material, microservices, DevOps, Git, SVN, PostgreSQL, Azure IoT, MQTT, Apache Kafka, Kubernetes, Docker, CI/CD, Angular CLI*, Swagger Hub*, SignalR API*, C#*, ASP.NET development*

Posted 3 weeks ago

Apply

5.0 - 7.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Roles and Responsibilities: Responsible for programming and testing of cloud applications. Integration of user-facing elements developed by front-end developers with server-side logic. Optimization of the application for maximum speed and scalability. Design and implementation of data storage solutions. Writing reusable, testable, and efficient code. Design, code, test, debug, and document software according to the functional requirements. Participate as a team member in fully agile Scrum deliveries. Provide a Low Level Design document for the components. Work collaboratively in an Agile/Scrum team environment. Test-driven development based on unit tests.
Preferred Skills: Good to have knowledge of API design using SwaggerHub. Good to have knowledge of the SignalR API for web functionality implementation and data broadcasting. Good to have knowledge of cloud and CI/CD. Knowledge of continuous integration. Excellent teamwork and communication abilities. Excellent organizational and time management abilities. Effective scrum master experience.
Skill Requirements: Bachelor/Master of Engineering or equivalent in Computers/Electronics and Communication with 5-7 years of experience. Hands-on experience in web application development using Angular. Hands-on experience in C# and ASP.NET development. Developer-level cloud application certification is recommended. Proficiency in designing and implementing microservices-based applications, with a strong understanding of microservices design principles, patterns, and best practices. Experience working in multiple cloud environments: Azure, AWS web services and GCP. Experience in developing and consuming web services (gRPC). Strong knowledge of RESTful APIs, HTTP protocols, JSON, XML and microservices using serverless cloud technologies. Integration of data storage solutions like databases, key-value stores and blob stores. User authentication and authorization between multiple systems, servers, and environments. Management of the hosting environment and deployment of update packages. Excellent analytical and problem-solving abilities. Strong understanding of object-oriented programming. Basic understanding of front-end technologies, such as JavaScript, TypeScript, HTML5, and CSS. Strong unit test and debugging skills. Proficient understanding of code versioning tools such as Git and SVN. Hands-on experience with PostgreSQL databases. Knowledge of Azure IoT, MQTT, Apache Kafka, Kubernetes and Docker is a plus. Experience with version control systems such as Git and SVN. Good understanding of Agile-based software development and the software delivery process. Experience in requirements management tools like Polarion (preferable) or any other requirements management system. Excellent communication and collaboration abilities, with the capacity to work effectively in cross-functional teams.
Keywords: RESTful APIs, HTTP protocols, JSON, XML, blob stores, JavaScript, TypeScript, HTML5, CSS, Git, PostgreSQL Database, Azure IoT, MQTT, Apache Kafka, Kubernetes, Docker, C#*, ASP.NET development*, Azure*, AWS web-services*, GCP*

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
