Home
Jobs

9913 Kafka Jobs - Page 36

Set up a Job Alert
JobPe aggregates job listings so they are easy to find and compare, but you apply directly on the original job portal.

8.0 - 10.0 years

14 - 20 Lacs

Hyderabad

Work from Office

Hands-on programming: Core Java (good syntax) – 4/5. Problem solving / logical ability and writing optimised programs – 4/5. Communication: candidate should be fluent and clear – 4/5. Core Java – 4/5 (preferably Java 11, 17) – 3/5. Required candidate profile: minimum 5 years of experience in Java; hands-on experience with Spring Boot, microservices, and RESTful APIs; any cloud platform – 3/5 (preferably Azure); AKS and Docker; Kafka.

Posted 5 days ago

Apply

10.0 years

0 Lacs

Andhra Pradesh, India

On-site

10 years of hands-on experience as a Spark data developer with Java, Apache Spark, Kafka, Hive, Hadoop, HDFS, and Scala. Architect big data processing pipelines using Spark; design real-time and batch data integrations; optimize Spark jobs for performance and reliability; guide developers on functional programming and fault tolerance; lead data engineering best practices and governance. Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills. Banking domain knowledge is a must.
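For illustration only (not part of the original posting), a minimal PySpark sketch of the kind of batch aggregation job described above; the table and column names (transactions, account_id, amount, txn_date) are assumptions.

```python
# Minimal sketch: a PySpark batch job aggregating a partitioned Hive table.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = (
    SparkSession.builder
    .appName("daily-transaction-aggregates")
    .enableHiveSupport()          # read/write Hive tables on HDFS
    .getOrCreate()
)

# Read a partitioned Hive table and keep one day of data.
txns = spark.table("transactions").where(F.col("txn_date") == "2024-01-01")

# Aggregate per account; repartitioning keeps shuffle sizes predictable.
daily = (
    txns.repartition("account_id")
        .groupBy("account_id")
        .agg(F.sum("amount").alias("total_amount"),
             F.count("*").alias("txn_count"))
)

daily.write.mode("overwrite").saveAsTable("daily_account_aggregates")
spark.stop()
```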

Posted 5 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of Software Engineer. In this role, you will: carry out software design, Scala and Spark development, and automated testing of new and existing components in an Agile, DevOps and dynamic environment. Promote development standards, code reviews, mentoring and knowledge sharing. Provide production support and troubleshooting. Implement the tools and processes, handling performance, scale, availability, accuracy and monitoring. Liaise with BAs to ensure that requirements are correctly interpreted and implemented. Participate in regular planning and status meetings. Provide input to the development process through involvement in sprint reviews and retrospectives, and input into system architecture and design.

Requirements To be successful in this role, you should meet the following requirements: Scala development and design using Scala 2.10+ or Java development and design using Java 1.8+. Experience with most of the following technologies: Apache Hadoop, Scala, Apache Spark, Spark Streaming, YARN, Kafka, Hive, Python, ETL frameworks, MapReduce, SQL and RESTful services. Sound working knowledge of the Unix/Linux platform. Hands-on experience building data pipelines using Hadoop components: Hive, Spark, Spark SQL. Experience with industry-standard version control tools (Git, GitHub), automated deployment tools (Ansible and Jenkins) and requirement management in JIRA. Understanding of big data modelling using relational and non-relational techniques. Experience debugging code issues and communicating the findings to the development team and architects. Experience with time-series/analytics databases such as Elasticsearch. Experience with scheduling tools such as Airflow and Control-M. Understanding or experience of cloud design patterns. Exposure to DevOps and Agile project methodologies such as Scrum and Kanban. Experience developing Hive QL and UDFs for analysing semi-structured and structured datasets.

You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India
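As a small illustration of the Hive QL/UDF requirement above (illustrative only, and in Python even though the posting targets Scala/Java), a minimal sketch that registers a UDF and calls it from Spark SQL; the table name (accounts) and column (account_number) are assumptions.

```python
# Illustrative only: register a Python UDF and call it from Spark SQL / Hive QL.
from pyspark.sql import SparkSession
from pyspark.sql.types import StringType

spark = SparkSession.builder.appName("udf-example").enableHiveSupport().getOrCreate()

def mask_account(account_number):
    # Keep only the last four characters of an account identifier.
    return "****" + account_number[-4:] if account_number else None

# Register the function so it is usable from SQL by name.
spark.udf.register("mask_account", mask_account, StringType())

spark.sql("SELECT mask_account(account_number) AS masked_account FROM accounts").show()
```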

Posted 5 days ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Description Position at Wind River. Location: Chennai. Why Choose Wind River? In a world increasingly driven by software innovation, Wind River is pioneering the technologies to accelerate the digital transformation of our customers with a new generation of Mission Critical AI Systems in an AI-first world with the most exacting standards for safety, security, performance, and reliability. Success will be determined by our ability to innovate with velocity and sell at the solutions level. Wind River’s impact spans critical infrastructure domains such as telecommunications, including 5G; industrial (automation, sustainable energy, robotics, mining); connected healthcare and medical devices; automotive (connected and self-driving vehicles); and aerospace & defense. We were recognized by VDC Research in July 2020 as #1 in Edge Compute OS Platforms, overtaking Microsoft as the overall commercial leader. Wind River regularly wins industry recognitions for excellence in IoT security, cloud and edge computing, as well as 8 consecutive years as a “Top Work Place”. If you’re passionate about amplifying your impact on the world, in a caring, respectful culture with a growth mindset, come join us and help lead the way into the future of the intelligent edge! Skills Experience in designing, developing, and delivering cloud-native software. Advanced knowledge of microservices, data access, event sourcing and stream processing. Experience working with large-scale microservice architecture backends using message brokers, data pipelines, and several data sources. Experience developing software using TypeScript, Node.js, Docker, Kafka, Redis, Auth, and various unit-testing frameworks. Experience with frontend web design, CSS, Angular, HTML, JavaScript. Experience with DBMS - Postgres, MongoDB, SQL. Experience with Docker, containers, Helm charts, Kubernetes, Vault. Experience with OpenAPI/Swagger. Must have experience with AWS; good to have: GCP, Azure, vSphere, OpenStack. Experience with Git, Jira, code review tools. Experience with Python, Golang. Nice to have experience with Jenkins, Terraform, Cloudify, Groovy scripting. Thorough experience of working in software development, including the application of best practices and SOLID design principles. Excellent communication skills, both written and verbal. Experience supporting junior engineers to develop best practices in software development. Experience with various Agile SDLC approaches – Scrum, Kanban. Benefits Workplace flexibility: hybrid work. Medical insurance: group medical insurance coverage, plus an additional shared-cost medical benefit in the form of reimbursements. Employee Assistance Program. Vacation and time off: employees are eligible for various types of paid time off; additional time off for birthday, volunteering, and wedding. Wellness benefits through Unmind. Carrot (family-forming support).

Posted 5 days ago

Apply

7.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Staff Software Engineer - Backend Developer Key Responsibilities: Design and develop solutions to address complex business workflows and user requirements. Deliver seamless and intuitive user experiences by understanding user personas. Take end-to-end ownership of subsystems, including design, coding, testing, integration, deployment, and maintenance. Write clean, high-quality code and take accountability for assigned tasks. Identify and resolve performance bottlenecks for improved system efficiency. Mentor and guide junior engineers to ensure adherence to coding standards and best practices. Collaborate with management, product teams, QA, and UI/UX designers for feature development and delivery. Maintain a focus on quality and timely delivery in a fast-paced start-up environment. Skills & Qualifications: Bachelor’s or Master’s degree in Computer Science or a related field. 7+ years of experience in backend development with Java. Strong proficiency in Java and object-oriented programming (5+ years). Hands-on experience with Java frameworks such as Spring Boot, Hibernate (4+ years). Experience with building and consuming RESTful APIs (3+ years). Proficiency in RDBMS and NoSQL databases such as MySQL, PostgreSQL, MongoDB (3+ years). Experience with cloud platforms like AWS, Azure, or Google Cloud (3+ years). Experience with messaging systems like Kafka or RabbitMQ (3+ years). Proficiency in DevOps tools like Docker, Kubernetes, Jenkins, CI/CD pipelines (3+ years). Strong problem-solving, debugging, and performance optimization skills. Ability to work in cross-functional teams and communicate effectively.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Key Responsibilities: Design and develop solutions to address complex business workflows and user requirements. Deliver seamless and intuitive user experiences by understanding user personas. Take end-to-end ownership of subsystems, including design, coding, testing, integration, deployment, and maintenance. Write clean, high-quality code and take accountability for assigned tasks. Identify and resolve performance bottlenecks for improved system efficiency. Mentor and guide junior engineers to ensure adherence to coding standards and best practices. Collaborate with management, product teams, QA, and UI/UX designers for feature development and delivery. Maintain a focus on quality and timely delivery in a fast-paced startup environment. Skills & Qualifications: Bachelor’s or Master’s degree in Computer Science or a related field. 5+ years of experience in backend development with Java. Strong proficiency in Java and object-oriented programming (5+ years). Hands-on experience with Java frameworks such as Spring Boot, Hibernate (4+ years). Experience with building and consuming RESTful APIs (3+ years). Proficiency in RDBMS and NoSQL databases such as MySQL, PostgreSQL, MongoDB (3+ years). Experience with cloud platforms like AWS, Azure, or Google Cloud (3+ years). Experience with messaging systems like Kafka or RabbitMQ (3+ years). Proficiency in DevOps tools like Docker, Kubernetes, Jenkins, CI/CD pipelines (3+ years). Strong problem-solving, debugging, and performance optimization skills. Ability to work in cross-functional teams and communicate effectively.

Posted 5 days ago

Apply

5.0 years

0 Lacs

Udaipur, Rajasthan, India

On-site

For a quick response, please fill out the form Job Application Form 34043 - Data Scientist - Senior I - Udaipur: https://docs.google.com/forms/d/e/1FAIpQLSeBy7r7b48Yrqz4Ap6-2g_O7BuhIjPhcj-5_3ClsRAkYrQtiA/viewform 3–5 years of experience in Data Engineering or similar roles. Strong foundation in cloud-native data infrastructure and scalable architecture design. Build and maintain reliable, scalable ETL/ELT pipelines using modern cloud-based tools. Design and optimize Data Lakes and Data Warehouses for real-time and batch processing. Ingest, transform, and organize large volumes of structured and unstructured data. Collaborate with analysts, data scientists, and backend engineers to define data needs. Monitor, troubleshoot, and improve pipeline performance, cost-efficiency, and reliability. Implement data validation, consistency checks, and quality frameworks. Apply data governance best practices and ensure compliance with privacy and security standards. Use CI/CD tools to deploy workflows and automate pipeline deployments. Automate repetitive tasks using scripting, workflow tools, and scheduling systems. Translate business logic into data logic while working cross-functionally. Strong in Python and familiar with libraries like pandas and PySpark. Hands-on experience with at least one major cloud provider (AWS, Azure, GCP). Experience with ETL tools like AWS Glue, Azure Data Factory, GCP Dataflow, or Apache NiFi. Proficient with storage systems like S3, Azure Blob Storage, GCP Cloud Storage, or HDFS. Familiar with data warehouses like Redshift, BigQuery, Snowflake, or Synapse. Experience with serverless computing like AWS Lambda, Azure Functions, or GCP Cloud Functions. Familiar with data streaming tools like Kafka, Kinesis, Pub/Sub, or Event Hubs. Proficient in SQL, with knowledge of relational (PostgreSQL, MySQL) and NoSQL (MongoDB, DynamoDB) databases. Familiar with big data frameworks like Hadoop or Apache Spark. Experience with orchestration tools like Apache Airflow, Prefect, GCP Workflows, or ADF Pipelines. Familiarity with CI/CD tools like GitLab CI, Jenkins, Azure DevOps. Proficient with Git, GitHub, or GitLab workflows. Strong communication, collaboration, and problem-solving mindset. Bonus points: experience with data observability or monitoring tools; contributions to internal data platform development; comfort working in data mesh or distributed data ownership environments; experience building data validation pipelines with Great Expectations or similar tools.
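For illustration of the pipeline-orchestration skills listed above (not part of the original posting), a minimal Apache Airflow DAG sketch; the DAG id, schedule, and the placeholder extract/transform/load functions are assumptions.

```python
# Illustrative only: a minimal Airflow DAG for an extract/transform/load workflow.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw data from the source system")

def transform():
    print("clean, validate, and reshape the data")

def load():
    print("write curated data to the warehouse")

with DAG(
    dag_id="example_etl_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load  # enforce strict ordering
```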

Posted 5 days ago

Apply

5.0 - 9.0 years

11 - 12 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled Java Full Stack Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using Java and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in Java programming, experience with modern frontend frameworks, and a passion for full-stack development. Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. 5+ years of experience in full-stack development, with a strong focus on Java. Java Full Stack Developer Roles & Responsibilities: Develop scalable web applications using Java (Spring Boot) for the backend and React/Angular for the frontend. Implement RESTful APIs to facilitate communication between frontend and backend. Design and manage databases using MySQL, PostgreSQL, Oracle, or MongoDB. Write complex SQL queries and procedures, and perform database optimization. Build responsive, user-friendly interfaces using HTML, CSS, JavaScript, and frameworks like Bootstrap, React, or Angular, with Node.js and Python integration. Integrate APIs with frontend components. Participate in designing microservices and modular architecture. Apply design patterns and object-oriented programming (OOP) concepts. Write unit and integration tests using JUnit, Mockito, Selenium, or Cypress. Debug and fix bugs across full-stack components. Use Git, Jenkins, Docker, and Kubernetes for version control, continuous integration, and deployment. Participate in code reviews, automation, and monitoring. Deploy applications on AWS, Azure, or Google Cloud platforms. Use Elastic Beanstalk, EC2, S3, or Cloud Run for backend hosting. Work in Agile/Scrum teams, attend daily stand-ups, sprints, and retrospectives, and deliver iterative enhancements. Document code, APIs, and configurations. Collaborate with QA, DevOps, Product Owners, and other stakeholders. Must-Have Skills: Java Programming: deep knowledge of the Java language, its ecosystem, and best practices. Frontend Technologies: proficiency in HTML, CSS, JavaScript, and modern frontend frameworks like React or Angular. Backend Development: expertise in developing and maintaining backend services using Java, Spring, and related technologies. Full Stack Development: experience in both frontend and backend development, with the ability to work across the entire application stack. Soft Skills: Problem-Solving: ability to analyze complex problems and develop effective solutions. Communication Skills: strong verbal and written communication skills to effectively collaborate with cross-functional teams. Analytical Thinking: ability to think critically and analytically to solve technical challenges. Time Management: capable of managing multiple tasks and deadlines in a fast-paced environment. Adaptability: ability to quickly learn and adapt to new technologies and methodologies. Interview Mode: face-to-face for candidates residing in Hyderabad; Zoom for other states. Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034. Time: 2–4 PM.

Posted 5 days ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate, We are hiring a Go Developer to build high-performance microservices and backend systems. The role requires proficiency in Golang and a deep understanding of concurrent programming and cloud-native architecture. Key Responsibilities: Develop scalable services and APIs using Go. Optimize systems for performance and concurrency. Build event-driven systems using Kafka, NATS, or gRPC. Work with CI/CD pipelines and containerized environments. Participate in system design and architecture discussions. Required Skills & Qualifications: Proficient in Golang, its standard library, and concurrency patterns. Experience with Docker, Kubernetes, and REST/gRPC services. Knowledge of cloud infrastructure (AWS/GCP). Familiar with Prometheus, Grafana, and distributed tracing tools. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Reddy, Delivery Manager, Integra Technologies

Posted 5 days ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate, We are hiring a Rust Developer to build secure, high-performance systems. This role is ideal for developers who enjoy working close to the system level, with a focus on performance and safety. Key Responsibilities: Write scalable and efficient code using Rust. Develop backend services, blockchain integrations, or embedded systems. Optimize for memory safety and performance without garbage collection. Build and maintain CI pipelines and testing frameworks. Collaborate with DevOps to ensure smooth deployments. Required Skills & Qualifications: Proficient in Rust and its ecosystem (Cargo, Crates.io). Familiar with Actix, Rocket, Tokio, or async runtimes. Understanding of the ownership model, lifetimes, and traits. Experience with low-latency systems or blockchain tech is a plus. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa, Delivery Manager, Integra Technologies

Posted 5 days ago

Apply

8.0 - 11.0 years

45 - 50 Lacs

Noida, Kolkata, Chennai

Work from Office

Dear Candidate, We are hiring a Scala Developer to work on scalable data pipelines, distributed systems, and backend services. This role is perfect for candidates passionate about functional programming and big data. Key Responsibilities: Develop data-intensive applications using Scala. Work with frameworks like Akka, Play, or Spark. Design and maintain scalable microservices and ETL jobs. Collaborate with data engineers and platform teams. Write clean, testable, and well-documented code. Required Skills & Qualifications: Strong in Scala, functional programming, and JVM internals. Experience with Apache Spark, Kafka, or Cassandra. Familiar with SBT, Cats, or Scalaz. Knowledge of CI/CD, Docker, and cloud deployment tools. Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa, Delivery Manager, Integra Technologies

Posted 5 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Who We Are Zinnia is the leading technology platform for accelerating life and annuities growth. With innovative enterprise solutions and data insights, Zinnia simplifies the experience of buying, selling, and administering insurance products. All of which enables more people to protect their financial futures. Our success is driven by a commitment to three core values: be bold, team up, deliver value – and that we do. Zinnia has over $180 billion in assets under administration, serves 100+ carrier clients, 2,500 distributors and partners, and over 2 million policyholders. Who You Are A senior leader who brings deep expertise in building financial applications using PeopleSoft or similar technologies and thrives in a collaborative, cross-functional environment. You are passionate about enterprise platforms, team leadership, and delivering high-impact solutions that power core financial and operational systems. You lead with a combination of strategic vision and hands-on knowledge of financial applications (such as PeopleSoft FSCM, Oracle Cloud ERP, SAP S/4HANA Finance). You are energized by complex problem-solving and creating scalable solutions in partnership with internal stakeholders and executive leadership. You thrive in transforming systems, leading teams, and pushing continuous improvements in enterprise application landscapes. What You’ll Do Lead the PeopleSoft applications strategy across the organization, focusing on finance and operations modules including GL, AP, AR, Cash Management, Purchasing, and Asset Management. Own the roadmap, planning, and execution of initiatives related to upgrades, patching, performance tuning, integrations, and business-driven enhancements in PeopleSoft. Act as a strategic advisor to executive leadership (including the CTO and CIO) on system architecture, platform capabilities, modernization efforts, and investment opportunities. Oversee a team of developers, analysts, and administrators, ensuring high levels of technical execution, service availability, and functional alignment with business goals. Partner with finance, operations, and IT leaders to translate business needs into scalable PeopleSoft solutions, balancing innovation, stability, and compliance. Serve as the primary escalation point for PeopleSoft-related issues and lead efforts to resolve complex functional and technical challenges. Drive governance, prioritization, and delivery of enhancements and initiatives through strong program management and agile practices. Monitor, evaluate, and optimize system performance including SQL query tuning, App Engine processes, reports, and integrations. Ensure compliance, audit readiness, and robust documentation across environments. Champion best practices in security configuration, data management, and testing processes across the PeopleSoft ecosystem. Develop and maintain executive-level reporting to track project progress, system health, user satisfaction, and roadmap alignment. Stay current on industry trends and technology advancements in enterprise platforms to drive continuous improvement and innovation. Represent the team and platform strategy in executive and governance forums, articulating value, progress, and future vision.
What You’ll Need 10+ years of progressive experience in enterprise application platforms, with deep hands-on and leadership experience in PeopleSoft FSCM modules or similar competitor solutions Proven success in leading financial applications development, support, and upgrade initiatives in a complex, regulated enterprise environment. Strong leadership experience, including managing geographically distributed teams and cross-functional collaboration with senior stakeholders. Strong understanding of REST based APIs, Kafka, databases to manage upstream and downstream processes. Solid understanding of systems integration, application performance tuning, and functional support across finance modules. Experience leading Agile/Scrum development cycles, including sprint planning, retrospectives, and iterative delivery. Strong decision-making, analytical thinking, and problem-solving capabilities with a focus on quality and results. Excellent communication skills, with a proven ability to present complex technical ideas to non-technical stakeholders. Bachelor's degree in Computer Science, Information Systems, or a related field; advanced degree preferred. Preferred: Experience with scheduling tools such as UC4/Atomic, and domain knowledge of the insurance or financial services industry. WHAT’S IN IT FOR YOU? We’re looking for the best and brightest innovators in the industry to join our team. At Zinnia, you collaborate with smart, creative professionals who are dedicated to delivering cutting-edge technologies, deeper data insights, and enhanced services to transform how insurance is done. Visit our website at www.zinnia.com for more information. Apply by completing the online application on the careers section of our website. We are an Equal Opportunity employer committed to a diverse workforce. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability

Posted 5 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

JOB DESCRIPTION: Position: Go Developer. Experience: 3–5+ years. Location: Pune. Role Summary: As a Go Developer, you will be responsible for building scalable, secure backend services and middleware components to support the payment gateway infrastructure. You'll work closely with Rust developers, system architects, and DevOps to build event-driven, high-availability services. Required Skills: Application Security, Infrastructure Security, PCI-DSS, ISO 27001, RBI Cybersecurity Guidelines, SIEM (e.g., Splunk, Wazuh), WAF, Firewalls, Intrusion Detection Systems, TLS Certificates, HSMs, Secrets Management, Secure APIs, Python Scripting, Bash Scripting. Key Responsibilities: • Develop and maintain robust APIs, authentication layers, and service integrations. • Build and scale middleware for transactions, settlements, and data reconciliation. • Implement caching, rate limiting, and logging mechanisms. • Integrate with card networks, UPI, IMPS, NEFT, and bank APIs. • Write unit, integration, and load tests for reliability and performance validation. • Collaborate with frontend and mobile teams to define backend contracts. Required Skills: • 3 to 5+ years of experience developing Go-based applications. • Solid understanding of concurrency, goroutines, and channels. • Experience with REST/gRPC APIs, microservices, and event-driven architectures. • Familiarity with Docker, Kubernetes, PostgreSQL, Redis, and Kafka/RabbitMQ. • Experience with monitoring and observability tools (e.g., Prometheus, Grafana). • Previous experience in fintech, banking, or payment platforms is a plus.

Posted 5 days ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Clearwater Analytics’ mission is to become the world’s most trusted and comprehensive technology platform for investment reporting, accounting, and analytics. With our team, you will partner with the most sophisticated and innovative institutional investors around the world. If you are infectiously passionate about what you do, intensely committed to clients, and driven by continuous innovation and improvement... We want you to apply! A career in Software Development will provide you with the opportunity to participate in all phases of the software development lifecycle, including design, implementation, testing and deployment of quality software. With the use of advanced technology, you and your team will work in an agile environment producing designs and code that our customers will use every day. Responsibilities: Developing quality software that is used by some of the world's largest technology firms, fixed income asset managers, and custodian banks. Participating in Agile meetings to contribute development strategies and the product roadmap. Owning critical processes that are highly available and scalable. Producing tremendous feature enhancements and reacting quickly to emerging technologies. Encouraging collaboration and stimulating creativity. Helping mentor entry-level developers. Contributing to design and architectural decisions. Providing leadership and expertise to our ever-growing workforce. Testing and validating, in development and production, code that they own, deploy, and monitor. Understanding, responding to, and addressing customer issues with empathy and in a timely manner. Can independently move a major feature or service through an entire lifecycle of design, development, deployment, and maintenance. Deep knowledge in multiple teams' domains; broad understanding of CW systems. Creates documentation of system requirements and behavior across domains. Willingly takes on unowned and undesirable work that helps team velocity and quality. Is in touch with client needs and understands their usage. Consulted on quality, scaling and performance requirements before development on new features begins. Understands, finds, and proposes solutions for systemic problems. Leads the technical breakdown of deliverables and capabilities into features and stories. Expert in unit testing techniques and design for testability; contributes to automated system testing requirements and design. Improves code quality and architecture to ensure testability and maintainability. Understands, designs, and tests for impact/performance on dependencies and adjacent components and services. Builds and maintains code in the context and awareness of the larger system. Helps less experienced engineers troubleshoot and solve problems. Active in mentoring and training of others inside and outside their division. Requirements: Strong problem-solving skills. Experience with an object-oriented or functional language. Bachelor’s degree in Computer Science or a related field. 7+ years of professional experience in industry-leading programming languages (Java/Python). Background in SDLC and Agile practices. Experience in monitoring production systems. Experience with Machine Learning. Experience working with cloud platforms (AWS/Azure/GCP). Experience working with messaging systems such as Cloud Pub/Sub, Kafka, or SQS/SNS. Must be able to communicate (speak, read, comprehend, write) in English.
Desired Experience or Skills: Ability to build scalable backend services (Microservices, polyglot storage, messaging systems, data processing pipelines). Possess strong analytical skills, with excellent problem-solving abilities in the face of ambiguity. Excellent written and verbal skills. Ability to contribute to software design documentation, presentation, sequence diagrams and present complex technical designs in a concise manner. Professional experience in building distributed software systems, specializing in big data and NoSQL database technologies (Hadoop, Spark, DynamoDB, HBase, Hive, Cassandra, Vertica). Ability to work with relational and NoSQL databases Strong problem-solving skills. Strong organizational, interpersonal, and communication skills. Detail oriented. Motivated, team player.

Posted 5 days ago

Apply

6.0 - 11.0 years

30 - 37 Lacs

Bengaluru

Work from Office

Job Title: Senior Java Developer Location: Bangalore (candidates must be based in Bangalore) Open Positions: 3 positions with 5 to 8 years of experience; 1 position with 8+ years of experience. Key Requirements: Strong experience in Core Java, Spring Boot, Spring Batch. Expertise in REST APIs, Kafka, JUnit, Maven/Gradle. Hands-on experience with microservices architecture. Good knowledge of PostgreSQL. Working experience with AWS, Jenkins, and Git. Good to Have: Domain experience in payment systems. Knowledge of payment processes: authorization, settlement/reconciliation, credit/debit cards, gift cards. Experience in legacy modernization projects. Familiarity with Agile methodology. Note: Candidates from Bangalore preferred. Immediate joiners or candidates with short notice periods are preferred.

Posted 5 days ago

Apply

3.0 - 6.0 years

2 - 7 Lacs

Salem

Work from Office

Job Title: Senior Microservices Engineer / Microservices Developer Job Overview: We are looking for a seasoned Microservices Engineer who will design, build, and maintain scalable, secure, and high-performance microservice architectures. You'll work closely with product owners, solution architects, DevOps, and QA teams to deliver cloud-native and containerized applications. Responsibilities: Design, develop, and deploy microservices-based applications using best practices and patterns. Write clean, testable, and efficient API services (REST). Collaborate across teams to define requirements, participate in architecture discussions, and conduct code reviews. Implement CI/CD pipelines and DevOps best practices (e.g., Jenkins, Terraform, Docker, Kubernetes). Monitor, troubleshoot, and optimize microservices performance in production environments. Document service designs, API contracts, and deployment processes. Mentor and guide junior developers in microservices design, testing, and deployment. Required Qualifications & Skills: Bachelor's degree in CS or a related field (or equivalent experience). 3+ years in software development focused on microservices architecture. Strong programming proficiency in Python, Django, AWS API Gateway, and Lambda. Solid experience with RESTful API design, event-driven patterns, and message brokers (e.g., Kafka, RabbitMQ). Proficient in Docker and orchestration tools (Kubernetes, ECS/EKS). Familiarity with the AWS cloud platform. Understanding of SQL and NoSQL databases, caching layers, and data management in distributed systems. Solid grasp of distributed systems concepts: service discovery, resilience patterns, circuit breakers, eventual consistency. Excellent problem-solving skills and communication abilities; experience with agile methodologies. Company: Mukesh Buildtech is an innovative stealth startup focused on a new marketplace. We are building a cutting-edge platform that leverages advanced technologies to provide unparalleled user experiences. Join our dynamic team and be a part of our exciting journey from the ground up. Mukesh Buildtech Private Limited is backed by the strategic guidance of Mukesh & Associates (www.mukeshassociates.com). If you're interested, please share your CV with us at sumathi@mukeshassociates.com
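As a rough sketch of the event-driven pattern mentioned above (illustrative only, not part of the posting), a minimal Kafka consumer using the kafka-python client; the topic, consumer group, and broker address are placeholder assumptions.

```python
# Illustrative only: a thin event-driven consumer loop with kafka-python.
import json
from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "orders",                               # topic name (placeholder)
    bootstrap_servers="localhost:9092",     # broker address (placeholder)
    group_id="order-service",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
    enable_auto_commit=True,
)

for message in consumer:
    event = message.value
    # Hand the event off to domain logic; keep the consumer loop thin.
    print(f"processing order {event.get('order_id')} from partition {message.partition}")
```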

Posted 5 days ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We are looking for a Software Engineer with development experience in Java, the Spring Boot framework, development of REST APIs, PostgreSQL (RDBMS) and strong unit testing skills. In This Role, You Will Develop and maintain Java-based microservices and applications. Develop REST APIs using the Spring Boot framework. Work with Object Relational Mapping (ORM) frameworks like Hibernate/JPA. Demonstrate proficiency with RDBMS systems (PostgreSQL), creating data models, indexing, and writing optimal queries. Demonstrate proficiency in event-oriented architecture, with working knowledge of Kafka or similar technologies. Be proficient in version control systems, particularly GitHub. Troubleshoot, debug, and optimize cloud-based applications. To Be Successful, You Will Have Bachelor's Degree in Computer Science or Computer Engineering. 3–5 years of related experience. Excellent communication skills (both written and verbal). Proven skills in implementing unit testing (JUnit). Experience working with IoT datasets using frameworks like Kafka. Understanding of addressing cybersecurity OWASP best practices in software applications. Experience working with data transport formats like Google Protobuf and Avro, with Docker and Kubernetes, and familiarity with DevOps methodologies. AMETEK, Inc. is a leading global provider of industrial technology solutions serving a diverse set of attractive niche markets with annual sales over $7.0 billion. AMETEK is committed to making a safer, sustainable, and more productive world a reality. We use differentiated technology solutions to solve our customers’ most complex challenges. We employ 21,000 colleagues, in 35 countries, that are grounded by our core values: Ethics and Integrity, Respect for the Individual, Inclusion, Teamwork, and Social Responsibility. AMETEK (NYSE:AME) is a component of the S&P 500. Visit www.ametek.com for more information.

Posted 5 days ago

Apply

7.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

FanCode is India’s premier sports destination committed to giving fans a highly personalised experience across content and merchandise for a wide variety of sports. Founded by sports industry veterans Yannick Colaco and Prasana Krishnan in March 2019, FanCode has over 100 million users. It has partnered with domestic, international sports leagues and associations across multiple sports. In content, FanCode offers interactive live streaming for sports with industry-first subscription formats like Match, Bundle and Tour Passes, along with monthly and annual subscriptions at affordable prices. Through FanCode Shop, it also offers fans a wide range of sports merchandise for sporting teams, brands and leagues across the world. Technology @ FanCode We have one mission: Create a platform for all sports fans. Built by sports fans for sports fans, we cover Sports Live Video Streaming, Live Scores & Commentary, Video On Demand, Player Analytics, Fantasy Research, News, and e-Commerce. We’re at the beginning of our story and growing at an incredible pace. Our tech stack is hosted on AWS and GCP, leveraging Amazon EC2, CloudFront, Lambda, API Gateway, Google Compute Engine, Cloud Functions, and Google Cloud Storage. We use a microservices-based architecture built on Java, Node.js , Python, PHP, Redis, MySQL, Cassandra, and Elasticsearch to serve product features. Our data-driven team utilizes Python and other big data technologies for Machine Learning and Predictive Analytics, along with Kafka, Spark, Redshift, and BigQuery to keep improving FanCode's performance. Your Role As the Director of DevOps at FanCode, you will lead and shape the vision, strategy, and execution of our Core Infra team. This team is responsible for maintaining a stable, secure, and scalable environment that empowers our talented developers to innovate and deliver exceptional experiences to sports fans globally. Key Responsibilities Strategic Leadership: Develop and execute the DevOps strategy, ensuring alignment with FanCode's overall business objectives. Shape and communicate the vision for cloud-native infrastructure, driving scalability, reliability, and performance. Mentor and lead the DevOps team, fostering a culture of innovation, collaboration, and continuous learning. Infrastructure and Automation: Oversee the deployment of resilient Cloud Architectures using Infrastructure as Code (IaC) tools like Terraform and Ansible. Design and implement tools for service-oriented architecture, including service discovery, config management, and container orchestration. Lead the development of a Compute Orchestration Platform using EC2 and GCE, driving automation and self-service infrastructure. Be hands-on with Kubernetes (GKE/EKS), container orchestration, and scalable infrastructure solutions. CI/CD, Performance, and Security: Strategize and implement CI/CD pipelines using tools like Jenkins, ArgoCD, and Github Actions to optimize deployment workflows. Champion best practices for networking and security at scale, ensuring compliance and data integrity. Implement monitoring solutions using DataDog, NewRelic, CloudWatch, ELK Stack, and Prometheus/Grafana for proactive performance management. Collaboration and Cross-Functional Alignment: Collaborate with Engineering, QA, Product, and Data Science teams to streamline product development and release cycles. Promote knowledge sharing and infrastructure best practices across all technical teams, ensuring consistent standards. 
Innovation and Continuous Improvement: Stay abreast of the latest DevOps trends and technologies, driving continuous improvement initiatives. Evaluate and recommend cutting-edge tools and practices to enhance FanCode's infrastructure and processes. Must Haves: 7+ years of relevant experience in DevOps, with at least 3 years in a leadership role. Strong experience with AWS and GCP cloud platforms, with hands-on expertise in Infrastructure as Code (IaC). Proficiency in scripting languages like Python or Bash. Deep hands-on expertise with Kubernetes (GKE/EKS) and container orchestration. Proven ability to lead by example, getting hands-on when needed while driving a strong DevOps culture. Strong background in CI/CD pipeline automation, performance monitoring, and cloud security best practices. Excellent troubleshooting skills, with the ability to dive deep into system-level issues. Excellent communication and leadership skills, with the ability to influence stakeholders at all levels. Good to Haves: Experience with CI/CD tools like Jenkins, ArgoCD, and Github Actions. Knowledge of monitoring solutions such as DataDog, NewRelic, CloudWatch, ELK Stack, Prometheus/Grafana. Hands-on experience with DevOps automation tools like Terraform and Ansible. AWS, GCP, or Azure certification(s). Previous experience in fast-paced startup environments. Passion for sports and a desire to impact millions of sports fans globally. Dream Sports is India’s leading sports technology company with brands such as Dream11 , the world’s largest fantasy sports platform, FanCode , a premier digital sports platform that personalizes content and commerce for all sports fans, DreamSetGo , a sports experiences platform, and DreamPay , a payment solutions provider. It has founded the Dream Sports Foundation to help and champion sportspeople and is an active member of the Federation of Indian Fantasy Sports , the nodal body for the Fantasy Sports industry in India. Founded in 2008 by Harsh Jain and Bhavit Sheth, Dream Sports is always working on its mission to ‘Make Sports Better’ and is located in Mumbai. Dream Sports has been featured as a ‘Great Places to Work’ by the Great Place to Work Institute for four consecutive years. It is also the only sports tech company among India’s best companies to work for in 2021. For more information: https://dreamsports.group/ About FanCode: FanCode is India’s premier digital sports destination committed to giving all fans a highly personalized experience across Content, Community, and Commerce. Founded by sports industry veterans Yannick Colaco and Prasana Krishnan in March 2019, FanCode offers interactive live streaming, sports fan merchandise (FanCode Shop), fast interactive live match scores, in-depth live commentary, fantasy sports data and statistics (Fantasy Research Hub), expert fantasy tips, sports news and much more. FanCode has partnered with both domestic and international sports leagues and associations across multiple sports such as three of the top American Leagues - MLB, NFL, and NBA, FIVB, West Indies Cricket Board, Bangladesh Premier League, Caribbean Premier League, Bundesliga, and I-League. Dream Sports India’s leading Sports Technology company is the parent company of FanCode with brands such as Dream11 also in its portfolio. FanCode has already amassed over 2 crore+ app installs and won the “Best Sports Startup” award at FICCI India Sports Awards 2019. 
Get the FanCode App: iOS | Android Website: www.fancode.com FanCode Shop : www.shop.fancode.com Dream Sports is India’s leading sports technology company with 250 million users, housing brands such as Dream11 , the world’s largest fantasy sports platform, FanCode , India’s digital sports destination, and DreamSetGo , a sports experiences platform. Dream Sports is based in Mumbai and has a workforce of close to 1,000 ‘Sportans’. Founded in 2008 by Harsh Jain and Bhavit Sheth, Dream Sports’ vision is to ‘Make Sports Better’ for fans through the confluence of sports and technology. For more information: https://dreamsports.group/

Posted 5 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must-have skills: Java Standard Edition, Spring Boot, Apache Kafka, Cucumber (Software). Good-to-have skills: NA. Minimum 3 years of experience is required. Educational Qualification: 15 years of full-time education. Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. You will work on developing innovative solutions to enhance business operations and user experience. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Develop and implement software solutions using Java Standard Edition. - Collaborate with team members to design and build applications. - Conduct code reviews and provide feedback to improve code quality. - Troubleshoot and debug applications to ensure optimal performance. - Stay updated on industry trends and technologies to enhance development processes. Professional & Technical Skills: - Must-Have Skills: proficiency in Java Standard Edition, Apache Kafka, Spring Boot, Cucumber (Software). - Need a hands-on senior developer with sound knowledge of OOP concepts; expertise in Core Java, Spring, JUnit, Mockito and Cucumber; Kafka knowledge; a good understanding of build tools like Maven and Ant; working experience with CI/CD pipelines; and knowledge of version control tools like Git. - Strong understanding of software development principles and best practices. - Experience in developing and deploying applications using the Spring Boot framework. - Knowledge of testing frameworks and tools like JUnit for ensuring code quality. - Familiarity with Agile methodologies and continuous integration/continuous deployment (CI/CD) pipelines. Additional Information: - The candidate should have a minimum of 3 years of experience in Java Standard Edition. - This position is based at our Pune office. - A 15 years full-time education is required.

Posted 5 days ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Role: Application Developer. Project Role Description: Design, build and configure applications to meet business process and application requirements. Must-have skills: Java Standard Edition. Good-to-have skills: Spring Boot, Apache Kafka, Cucumber (Software). Minimum 3 years of experience is required. Educational Qualification: 15 years of full-time education. Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with team members to develop innovative solutions and ensure seamless application functionality. Roles & Responsibilities: - Expected to perform independently and become an SME. - Required active participation/contribution in team discussions. - Contribute to providing solutions to work-related problems. - Develop and implement high-quality software solutions. - Collaborate with cross-functional teams to analyze user needs and design efficient applications. - Conduct code reviews and provide feedback to enhance code quality. - Troubleshoot and debug applications to optimize performance. - Stay updated on industry trends and technologies to drive continuous improvement. Professional & Technical Skills: - Must-Have Skills: proficiency in Java Standard Edition, Spring Boot, Cucumber (Software), Apache Kafka. - Core Java, Spring, Mockito, Cucumber, Kafka, Maven, React.js. - Expertise in Java, Spring, PL/SQL, SQL, multithreading and unit testing; exposure to working with TeamCity, JIRA and Git; should have a basic understanding of equities derivatives. - Strong understanding of object-oriented programming principles. - Experience in developing and maintaining Java applications. - Knowledge of the software development lifecycle and agile methodologies. - Familiarity with RESTful web services and microservices architecture. Additional Information: - The candidate should have a minimum of 3 years of experience in Java Standard Edition. - This position is based at our Pune office. - A 15 years full-time education is required.

Posted 5 days ago

Apply

4.0 - 9.0 years

14 - 22 Lacs

Pune

Work from Office

Responsibilities: Design, develop, test and maintain scalable Python applications using Scrapy, Selenium and Requests. Implement anti-bot systems and data pipeline solutions with Airflow and Kafka. Share your CV at recruitment@fortitudecareer.com. Flexi working and work from home available.
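As a small illustration of the stack named above (not part of the original posting), a minimal sketch that fetches a page with Requests and publishes the result to Kafka with kafka-python; the URL, topic name, and record shape are assumptions.

```python
# Illustrative only: fetch a page with Requests and publish it to a Kafka topic.
import json
import requests
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",                      # broker (placeholder)
    value_serializer=lambda d: json.dumps(d).encode("utf-8"),
)

response = requests.get("https://example.com/products", timeout=10)
response.raise_for_status()

record = {
    "url": response.url,
    "status": response.status_code,
    "body": response.text[:1000],   # truncate payload for the example
}
producer.send("scraped-pages", value=record)  # downstream Airflow/Spark jobs consume this topic
producer.flush()
```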

Posted 5 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Big Data Engineer (Apache Spark) Experience: 5-8 Years Location: Bengaluru/Hyderabad We are seeking a skilled and detail-oriented Big Data Engineer with strong expertise in Apache Spark to join our dynamic data team. The ideal candidate will have proven experience in designing and implementing large-scale data processing solutions and a strong understanding of distributed computing concepts. Key Responsibilities Design and develop scalable data processing solutions using Apache Spark (Core, SQL, Streaming, MLlib). Build, optimize, and maintain robust data pipelines and ETL processes for structured and unstructured data. Collaborate with data scientists, analysts, and software engineers to integrate Spark-based data products into production workflows. Ensure data quality, integrity, and security throughout the data lifecycle. Monitor, troubleshoot, and improve the performance of Spark jobs in production environments. Integrate Spark with cloud platforms such as AWS, Azure, or GCP for scalable deployment. Required Skills and Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field. 3+ years of hands-on experience with Apache Spark in a large-scale data environment. Strong proficiency in Scala, Python, or Java (preference for Scala). Experience with data storage technologies such as HDFS, Hive, HBase, S3, etc. Familiarity with SQL, Kafka, Airflow, and NoSQL databases. Understanding of distributed systems and parallel computing concepts. Experience with containerization and orchestration tools (e.g., Docker, Kubernetes) is a plus. Preferred Qualifications Certification in Big Data or Spark (e.g., Databricks Certified Developer). Experience working in a DevOps/CI-CD environment. Knowledge of data warehousing concepts and tools like Snowflake or Redshift.
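For illustration of the Spark Streaming plus Kafka combination listed above (not part of the original posting), a minimal Structured Streaming sketch in PySpark; the broker address, topic, and output paths are assumptions, and the spark-sql-kafka connector package must be available on the cluster.

```python
# Illustrative only: Spark Structured Streaming reading a Kafka topic to Parquet.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("kafka-stream-ingest").getOrCreate()

# Each Kafka record arrives with binary key/value columns.
events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "localhost:9092")
    .option("subscribe", "clickstream")
    .load()
    .selectExpr("CAST(value AS STRING) AS json", "timestamp")
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/bronze/clickstream")
    .option("checkpointLocation", "/data/checkpoints/clickstream")
    .outputMode("append")
    .start()
)
query.awaitTermination()
```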

Posted 5 days ago

Apply

0 years

0 Lacs

Anupgarh, Rajasthan, India

On-site

34638BR Bangalore - Campus Job Description Role: AWS Data Specialist. 8+ years of experience with Managed Apache Kafka and Databricks on AWS, DynamoDB, AWS Glue, and AWS pipeline development. FHIR skills – all are must-have skills. Qualifications: BE. Range of Years of Experience: minimum 5, maximum 8.

Posted 5 days ago

Apply

0 years

0 Lacs

Anupgarh, Rajasthan, India

On-site

34639BR Bangalore - Campus Job Description Role: AWS Data Specialist. 8+ years of experience with Managed Apache Kafka and Databricks on AWS, DynamoDB, AWS Glue, and AWS pipeline development. FHIR skills – all are must-have skills. Qualifications: BE. Range of Years of Experience: minimum 5, maximum 8.

Posted 5 days ago

Apply

6.0 - 10.0 years

30 - 35 Lacs

Bengaluru

Work from Office

We are seeking an experienced PySpark Developer / Data Engineer to design, develop, and optimize big data processing pipelines using Apache Spark and Python (PySpark). The ideal candidate should have expertise in distributed computing, ETL workflows, data lake architectures, and cloud-based big data solutions. Key Responsibilities: Develop and optimize ETL/ELT data pipelines using PySpark on distributed computing platforms (Hadoop, Databricks, EMR, HDInsight). Work with structured and unstructured data to perform data transformation, cleansing, and aggregation. Implement data lake and data warehouse solutions on AWS (S3, Glue, Redshift), Azure (ADLS, Synapse), or GCP (BigQuery, Dataflow). Optimize PySpark jobs for performance tuning, partitioning, and caching strategies. Design and implement real-time and batch data processing solutions. Integrate data pipelines with Kafka, Delta Lake, Iceberg, or Hudi for streaming and incremental updates. Ensure data security, governance, and compliance with industry best practices. Work with data scientists and analysts to prepare and process large-scale datasets for machine learning models. Collaborate with DevOps teams to deploy, monitor, and scale PySpark jobs using CI/CD pipelines, Kubernetes, and containerization. Perform unit testing and validation to ensure data integrity and reliability. Required Skills & Qualifications: 6+ years of experience in big data processing, ETL, and data engineering. Strong hands-on experience with PySpark (Apache Spark with Python). Expertise in SQL, DataFrame API, and RDD transformations. Experience with big data platforms (Hadoop, Hive, HDFS, Spark SQL). Knowledge of cloud data processing services (AWS Glue, EMR, Databricks, Azure Synapse, GCP Dataflow). Proficiency in writing optimized queries, partitioning, and indexing for performance tuning. Experience with workflow orchestration tools like Airflow, Oozie, or Prefect. Familiarity with containerization and deployment using Docker, Kubernetes, and CI/CD pipelines. Strong understanding of data governance, security, and compliance (GDPR, HIPAA, CCPA, etc.). Excellent problem-solving, debugging, and performance optimization skills.
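As a brief illustration of the tuning techniques named above, partitioning, caching, and broadcast joins (not part of the original posting), a minimal PySpark sketch; the S3 paths and column names are assumptions.

```python
# Illustrative only: common PySpark tuning techniques on placeholder datasets.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("pyspark-tuning-demo").getOrCreate()

orders = spark.read.parquet("s3://bucket/orders/")        # large fact data (placeholder path)
countries = spark.read.parquet("s3://bucket/countries/")  # small dimension (placeholder path)

# Cache a dataset that is reused by several downstream aggregations.
orders_2024 = orders.where(F.col("year") == 2024).cache()

# Broadcast the small side so the join avoids a full shuffle.
enriched = orders_2024.join(F.broadcast(countries), on="country_code")

# Control output layout: repartition by date and write date-partitioned Parquet.
(enriched
 .repartition("order_date")
 .write.mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://bucket/curated/orders_enriched/"))
```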

Posted 5 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies