2.0 - 7.0 years
5 - 15 Lacs
Pune
Work from Office
Role & responsibilities
We have an urgent requirement for a Java + Camunda role.
Skillset: Java + Spring Boot + Camunda + Kafka
Grade P2: 2 to 4 years' experience
Grade P3: 4 to 7 years' experience
Thanks & Regards,
Sushma Patil, HR Coordinator, 92700 05035
Posted 6 hours ago
3.0 - 8.0 years
8 - 15 Lacs
Pune
Hybrid
Role: Developer
Location: Pune (Hybrid)
Excellent communication skills required.
NP: Immediate joiners to 1 month (only candidates currently serving their notice period should apply)
Exp: 3 to 9 yrs
Mandatory skills (must appear in the roles and responsibilities): Data Platform, Java, Python, Spark, Kafka, Cloud technologies (Azure / AWS), Databricks
Interested candidates, share your resume at dipti.bhaisare@in.experis.com
Posted 6 hours ago
2.0 - 7.0 years
4 - 8 Lacs
Ahmedabad
Work from Office
Travel Designer Group
Founded in 1999, Travel Designer Group has consistently achieved remarkable milestones in a relatively short span of time. While we embody the agility, growth mindset, and entrepreneurial energy typical of start-ups, we bring with us over 24 years of deep-rooted expertise in the travel trade industry. As a leading global travel wholesaler, we serve as a vital bridge connecting hotels, travel service providers, and an expansive network of travel agents worldwide. Our core strength lies in sourcing, curating, and distributing high-quality travel inventory through our award-winning B2B reservation platform, RezLive.com. This enables travel trade professionals to access real-time availability and competitive pricing to meet the diverse needs of travelers globally. Our expanding portfolio includes innovative products such as:
* Rez.Tez
* Affiliate.Travel
* Designer Voyages
* Designer Indya
* RezRewards
* RezVault
With a presence in over 32 countries and a growing team of 300+ professionals, we continue to redefine travel distribution through technology, innovation, and a partner-first approach.
Website: https://www.traveldesignergroup.com/
Profile: ETL Developer
ETL Tools (any 1): Talend / Apache NiFi / Pentaho / AWS Glue / Azure Data Factory / Google Dataflow
Workflow & Orchestration (any 1, good to have but not mandatory): Apache Airflow / dbt (Data Build Tool) / Luigi / Dagster / Prefect / Control-M
Programming & Scripting: SQL (advanced); Python (mandatory); Bash/Shell (mandatory); Java or Scala (optional, for Spark)
Databases & Data Warehousing: MySQL / PostgreSQL / SQL Server / Oracle (mandatory); Snowflake, Amazon Redshift, Google BigQuery, Azure Synapse Analytics, MongoDB / Cassandra (all good to have)
Cloud & Data Storage (any 1-2): AWS S3 / Azure Blob Storage / Google Cloud Storage (mandatory); Kafka / Kinesis / Pub/Sub
Interested candidates can also share their resume at shivani.p@rezlive.com
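For candidates gauging what this ETL role involves: a minimal extract-transform-load sketch in Python (mandatory for the role) against SQLite as a stand-in relational database. All table and column names here are hypothetical illustrations, not the employer's actual schema.

```python
import sqlite3

def run_etl(conn):
    """Extract raw booking rows, normalize city names, drop rows with
    missing amounts, and load the result into a clean table."""
    conn.execute("CREATE TABLE IF NOT EXISTS bookings_clean (id INTEGER, city TEXT, amount REAL)")
    rows = conn.execute("SELECT id, city, amount FROM bookings_raw").fetchall()
    # Transform: trim whitespace, title-case city names, reject null amounts
    cleaned = [(i, c.strip().title(), a) for (i, c, a) in rows if a is not None]
    conn.executemany("INSERT INTO bookings_clean VALUES (?, ?, ?)", cleaned)
    conn.commit()
    return len(cleaned)

# Demo with an in-memory database
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE bookings_raw (id INTEGER, city TEXT, amount REAL)")
conn.executemany("INSERT INTO bookings_raw VALUES (?, ?, ?)",
                 [(1, " ahmedabad ", 120.0), (2, "PUNE", 80.5), (3, "delhi", None)])
loaded = run_etl(conn)
```

Production ETL tools listed above (Talend, NiFi, Glue, etc.) wrap the same extract/transform/load shape in scheduling, monitoring, and connectors.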
Posted 6 hours ago
7.0 - 12.0 years
18 - 32 Lacs
Ahmedabad
Remote
Participate in Agile standups, design discussions, and pair programming sessions. Write clean, testable, and maintainable code in Node.js and TypeScript. Review peer code and collaborate with the team. Take full ownership of solutions from design through deployment.
Required Candidate profile: 7+ yrs of experience in software development with expertise in Node.js, TypeScript, and ReactJS. Strong in backend systems. Familiar with Java, Spring, REST/SOAP, TDD. Follows Agile & test-driven development.
Posted 7 hours ago
3.0 - 8.0 years
5 - 10 Lacs
Pune
Work from Office
Shift: (GMT+05:30) Asia/Kolkata (IST)
What do you need for this opportunity? Must-have skills: Python, Node.js
Position Overview: We are looking for a highly skilled Senior Backend Developer to join our team. The ideal candidate will bring extensive expertise in backend systems, cloud-native applications, and microservices, along with a strong track record of building scalable systems. If you are passionate about developing robust architectures and driving technical innovation, we'd love to hear from you.
Responsibilities:
- Design, develop, and maintain backend systems and cloud-native applications.
- Architect and implement scalable microservices using Go, Node.js, or Spring Boot.
- Leverage AWS cloud services to build, deploy, and monitor applications.
- Optimise systems for high availability, scalability, and performance.
- Work with Kafka, Redis, and Spark to manage real-time data pipelines and caching mechanisms.
- Design database solutions using MySQL and NoSQL technologies for efficient data storage and retrieval.
- Collaborate with cross-functional teams to integrate payment gateways and ensure seamless transaction processing (experience desirable).
- Contribute to the architectural design of systems to meet eCommerce and high-scale system demands.
- Write and maintain clean, reusable code with Python (desirable but not mandatory).
- Drive best practices for CI/CD pipelines and automated deployments.
- Mentor junior engineers and actively contribute to the team's technical growth.
Required Qualifications:
- 3-6 years of experience in software engineering, with a focus on backend development and microservices architecture.
- Proficiency in one or more of the following: Go, Node.js, Python, or Spring Boot.
- Deep understanding of AWS services (e.g., S3, RDS, Lambda, EC2).
- Proven experience in designing systems for scale and high performance.
- Hands-on experience with Kafka, Redis, Spark, and other distributed technologies.
- Strong knowledge of MySQL and NoSQL databases.
- Experience with system architecture design and implementation.
- Familiarity with e-commerce platforms is highly desirable.
- Experience with payment gateway integration is a plus.
- Strong problem-solving skills and the ability to work in fast-paced environments.
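One pattern behind the "Redis for caching" requirement above is cache-aside: check the cache, fall back to the database on a miss, and populate the cache with a TTL. A minimal sketch in Python, with a plain dict standing in for Redis (all names and data are hypothetical):

```python
import time

class CacheAside:
    """Cache-aside read path: check cache first, fall back to the data
    store on a miss, and store the loaded value with a TTL."""
    def __init__(self, ttl_seconds=60):
        self.ttl = ttl_seconds
        self._cache = {}   # stand-in for Redis: key -> (value, expires_at)
        self.misses = 0

    def get(self, key, loader):
        entry = self._cache.get(key)
        if entry is not None and entry[1] > time.monotonic():
            return entry[0]                 # cache hit
        self.misses += 1
        value = loader(key)                 # fall back to the "database"
        self._cache[key] = (value, time.monotonic() + self.ttl)
        return value

db = {"user:1": {"name": "Asha"}}           # stand-in for MySQL
cache = CacheAside(ttl_seconds=60)
first = cache.get("user:1", db.get)         # miss: loads from db
second = cache.get("user:1", db.get)        # hit: served from cache
```

With a real Redis client the dict operations become GET/SETEX calls, but the control flow is the same.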
Posted 7 hours ago
7.0 - 12.0 years
40 - 45 Lacs
Noida
Hybrid
Expected Notice Period: 30 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Hybrid (Noida)
What do you need for this opportunity? Must-have skills: GCP, AWS, Docker, Jenkins, Apache, ELK, Jira, PHP, Java, Kafka, Microservices, MySQL
Responsibilities:
- Conceptualize and develop prototypes quickly.
- Research, design, and build highly reliable, available, and scalable platforms.
- Build reusable components as libraries, utilities, and services, and promote reuse.
- Work closely with our engineering managers, product managers, strategists, and team members to develop Agri-Tech products.
- Take complete ownership of the service or services your team is responsible for.
- Design, develop, and maintain new and existing code following coding standards, best practices, and frameworks.
- Lead by example; mentor and guide team members on everything from structured problem solving to development of best practices.
- Implement continuous deployment to ship code every day, once a day.
- Attend daily stand-ups and any other scheduled meetings.
- Contribute to or lead group discussions and coach junior team members.
- Own large technical deliverables and execute in an exemplary way.
- Manage tasks using JIRA and communicate status to tech leads and managers.
- Create and groom a tech-specific backlog.
- Drive the technical roadmap of the team in collaboration with Engineering and Product.
- Support production releases and investigate issues, if needed.
- Evangelize emerging technologies/applications, find opportunities to integrate them into operations, and coach others on the new technologies.
Requirements:
- Substantial experience in building complex and scalable solutions.
- Experience leading multi-engineer projects and mentoring junior engineers.
- 7+ years of programming experience with Java, including object-oriented design.
- Strong object-oriented design skills, ability to apply design patterns, and an uncanny ability to design intuitive module and class-level interfaces.
- Comprehensive operational experience, including optimisations, deployments, and tuning servers such as Apache, MySQL, Tomcat, and Solr.
- Strong in coding, data structures, algorithms, and problem solving.
- Experience designing for performance, scalability, availability, and security.
- Strong desire to build, sense of ownership, urgency, and drive.
- Expertise in delivering high-quality and innovative applications.
- Experience communicating with users, other technical teams, and senior management to collect requirements, describe software product features and product strategy, and influence outcomes in technical decision-making.
- Excellent written communication and verbal agility are strong assets.
- Quickly adapt to new development environments and changing business requirements.
- Demonstrated ability to mentor other software developers in all aspects of their engineering skill sets.
- Track record of building and delivering mission-critical, 24x7 production software systems.
- Performance optimisation knowledge is a must-have.
- Ability to do code reviews for the team.
- Strong and deep professional experience designing and implementing web applications, especially developing and consuming microservices.
- Experience using git to manage code bases, branching, merging, etc.
- Experience in microservices architecture.
- Experience in performance tuning on MySQL, PostgreSQL, and MongoDB.
Skills/Knowledge:
- Strong collaboration skills.
- Deep expertise with any or a combination of programming languages: Java & PHP, or any object-oriented, high-level open-source language with strong programming constructs.
- Outstanding attention to detail and adherence to deadlines.
- Ability to work effectively, both independently and as a member of a team.
- Distributed systems architecture, component modeling, data flow, and scaling and managing large volumes of data.
- Articulating system requirements, problem comprehension, and identifying high-level building blocks.
- Ability to handle multiple tasks in a fast-paced environment.
- Ability to "think outside the box" while identifying problems and developing creative solutions.
- Should have worked in a microservices architecture.
- Experience with release building and deployment software, such as Jenkins, preferred but not required.
- Experience with Docker and cloud infrastructure such as GCP, AWS, etc.
- Expertise with log-analysis tools such as Splunk or the ELK stack.
- Knowledge of queueing implementations such as Kafka, RabbitMQ, or SQS.
- Experience in at least one cloud environment, such as AWS or GCP.
- Able to write modular and functionally complete object-oriented code: NFR implementation, abstractions, separation of concerns, concurrency & thread safety, extensibility, hooks, etc.
Posted 7 hours ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad, Bengaluru
Work from Office
What you'll achieve
As a Senior Software Engineer - IT, you will design and implement microservices in Python, integrate with the database, write optimized SQL queries, implement test frameworks, and automate workflows. The Messaging Platform automation team offers PaaS capability to the customer, providing RabbitMQ and Kafka services as cloud service offerings. This team is responsible for providing automation features, automating various platform administrative functions to improve operational efficiency while enabling quick turnaround time.
You will:
- Design, develop, and maintain Python-based applications and RESTful web services
- Automate workflows and backend processes using Python
- Apply object-oriented programming (OOP) principles to create reusable, modular code
- Write and maintain unit and integration tests using pytest, unittest, etc.
- Work with relational databases and integrate with microservices
Take the first step towards your dream career. Every Dell Technologies team member brings something unique to the table. Here's what we are looking for with this role:
Essential Requirements
- 5-8 years of strong experience with Python and the Django/Flask/FastAPI frameworks
- Experience building and consuming REST APIs
- Strong understanding and application of object-oriented programming (OOP) and test-driven development (TDD)
- Familiarity with CI/CD tools and practices
- Proficiency in writing optimized SQL queries and database design
- Excellent debugging and troubleshooting skills in complex environments
Desirable Requirements
- Knowledge of containerization tools (e.g., Docker)
- Familiarity with cloud platforms like AWS, Azure, or GCP
- Experience with message queues (RabbitMQ, Kafka, etc.)
- Exposure to frontend technologies (HTML/CSS/JS)
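A flavor of the TDD-friendly service code this role describes: a framework-agnostic handler for a hypothetical "create queue" automation endpoint, written so it can be unit-tested without a running server or message broker. The endpoint, payload shape, and names are illustrative assumptions, not Dell's actual API.

```python
import json

def create_queue_handler(existing_names):
    """Return a request handler for a hypothetical 'create queue' endpoint.
    Kept framework-agnostic so pytest/unittest can exercise it directly;
    a FastAPI or Flask route would simply delegate to it."""
    def handle(payload: str):
        data = json.loads(payload)
        name = data.get("name", "").strip()
        if not name:
            return {"status": 400, "error": "queue name is required"}
        if name in existing_names:
            return {"status": 409, "error": f"queue '{name}' already exists"}
        existing_names.add(name)          # stand-in for the RabbitMQ/Kafka admin call
        return {"status": 201, "queue": name}
    return handle

handle = create_queue_handler(existing_names={"orders"})
created = handle('{"name": "payments"}')   # new queue
conflict = handle('{"name": "orders"}')    # duplicate
```

Separating the handler logic from the web framework is one common way to satisfy the posting's TDD requirement.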
Posted 7 hours ago
4.0 - 9.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
What do you need for this opportunity? Must-have skills: Configuring hardware firewalls, OOP, Prometheus, VPN, DNS, Git CLI, Terraform, Azure, Docker, Kafka, Python, Shell Scripting, Ubuntu
About the Team
In this role you'll work closely with a team of 4-6 other people who are responsible for the design, development, deployment, and triage of the systems and tools required to support the operation and rollout of our computer vision software at thousands of remote locations. The team wears multiple shared hats, spanning network engineering and security, release management, tooling development, and cloud, on-prem, and colocation infrastructure administration and design. You will help contribute to an existing set of DevOps practices, the software implementation, troubleshooting, enhancement, and integration with other teams and potentially clients. This is a small but nimble team which must adjust quickly to the changing landscape and is a critical component of many new projects.
In this role you will:
- Play the role of a senior engineer who can take end-to-end ownership and accountability of a task and guide it through to the finish line with little or no hand-holding
- Work with two mindsets, balancing work between an operational task list and a development-focused Kanban board
- Provide insight and advice to other team members with regard to software development and design
- Develop, review, test, and troubleshoot code, as well as debug networking and software errors in a Linux environment
- Work US daytime hours and collaborate with team members and coworkers through video meetings
- Receive support escalations from other teams related to automation or system administration
- Manage infrastructure and patching for our Data Science teams, which run software such as Dkube, CVAT, etc.
Technical Expertise
- 4+ years of experience with Python and shell scripting
- 3+ years of experience with Git CLI, Git branching strategies, release management, and merge conflict resolution
- 3+ years of professional experience with container orchestration and administration (K3S, AKS, Rancher, Docker, etc.)
- Strong with Ubuntu/Debian-based Linux distros; comfortable resolving network and filesystem issues
- 3+ years of professional experience with CI/CD and writing build/test/deployment automation, preferably with GitHub Actions or GitLab CI
- 2+ years of professional experience with Infrastructure as Code (preferably Azure with Terraform/Pulumi/ARM/Bicep)
- Working knowledge of DNS and tools used to troubleshoot related issues
Nice to have:
- Strong OOP and software engineering experience with Python (Golang and Rust are a bonus)
- Knowledge of Kafka
- Experience deploying and maintaining observability stacks such as ELK, Prometheus, Jaeger, etc.
- Understanding of how to query relational and NoSQL databases
- Configuring hardware firewalls (Sophos, Cisco, Fortinet, Juniper, etc.)
- Experience troubleshooting VPNs, working with SSH tunnels, and Cloudflare Warp
*THIS IS NOT A REMOTE ROLE
Posted 8 hours ago
5.0 - 10.0 years
16 - 19 Lacs
Mumbai
Work from Office
Greetings from R2R Consults!
Location: Mumbai (Parel)
Working Mode: 5 days working (WFO)
Reporting To: COE Application Development
Role Objective
We are looking for an enthusiastic application developer to join our technology team. Your primary focus will be to learn the codebase, gather user data, and respond to requests from the lead. To ensure success as an Application Developer, you should have a good working knowledge of basic programming languages, the ability to learn new technology quickly, and the ability to work in a cross-functional team environment.
Job Responsibilities
- Requirement gathering from the stakeholders and analysis
- Build the solution as per the requirement
- Develop and manage in-house applications independently
- Develop features across multiple subsystems within our applications, including collaboration in requirements definition, prototyping, design, coding, testing, and deployment
- Understand how our applications operate, how they are structured, and how customers use them
- Participate in interactions with stakeholders as a technical expert for product subsystems
- Investigate, analyze, and make recommendations to management regarding technology improvements, upgrades, and modifications
- Rewrite the legacy application using a modern framework
Measures of Success (KRAs)
- Requirement analysis
- Develop code/solutions on multiple platforms for the assigned tasks
- Quality code delivery following coding conventions
- Contribute to solution design
- Proven ability to take initiative and dive into new areas of technology
- Excellent communication and teamwork skills
- Full-stack experience is a plus
Job Specifications
Minimum Education: MCA, BE, B.Tech., BCA, B.Sc. (IT), M.Sc. (IT)
Minimum/Relevant Experience: 5-7 years of experience
Technical Skills
- 5+ years of experience as a Software Engineer with a proven track record in system analysis, design, and implementation
- Web-based application development experience using N-tier and microservice architectures
- Strong OOP skills and Java server-side development for large-scale systems
- Proficiency in Spring, Spring Boot, Hibernate, and SOA principles
- Strong understanding of concurrency (multithreading, multiprocessing, parallelism, memory management)
- Practical knowledge of distributed systems such as Kafka, Cassandra, PostgreSQL
- Familiarity with MVP, MVVM, and MVC design patterns
- Unit testing using NUnit, MSTest, or equivalents
- Distributed system design experience
- Hands-on experience with Docker, Kubernetes, Helm charts
- Full SDLC experience for Java applications
- Familiarity with Agile methodologies and best practices
- Ability to work independently with high-quality deliverables
- Software tools like JIRA, version control (Git), and CI/CD pipelines
- Bonus: Angular, NATS, MQTT, IoT technologies
Key Interactions
Internal Stakeholders: Technology and Business Teams
External Stakeholders: Tech Teams and Vendors
If interested, please share an updated resume at gunjan@r2rconsults.com or via WhatsApp at 7439380585.
Regards,
Gunjan Upadhyay
Posted 8 hours ago
6.0 - 11.0 years
18 - 22 Lacs
Pune, Chennai, Bengaluru
Work from Office
Role & responsibilities
5-8 years of experience; the primary skills required are Java Spring Boot, API development, and microservices.
- Programming experience in Java Spring Boot
- API-driven development: solid experience in RESTful and microservices development
- Strong understanding of data structures and algorithms
- Preferred experience in UI technologies such as Angular or React
- Experience working with ORM frameworks such as Entity, Hibernate, Dapper
- Strong relational database experience in Oracle, MS SQL, or Postgres
- Good experience in queuing or streaming engines such as Kafka
- Unit testing / TDD: experience with continuous integration and delivery, automated testing, and tools such as NUnit, JUnit
- Experience with Docker, Git, SonarQube, Checkmarx, OpenShift, and other deployment tools for CI/CD
- Experience using tools such as Jira, GitLab, Swagger, Postman, SOAP UI, ServiceNow
- Basic understanding of JavaScript, HTML, CSS
- Good to have: AWS or Azure cloud
Posted 8 hours ago
10.0 - 14.0 years
37 - 50 Lacs
Noida, Gurugram
Work from Office
Primary Responsibilities:
- Understand the business requirements and translate them into technical functional deliverables
- Work with the business and other systems to design the solution
- Be a subject matter expert for the functionality
- Solve the problem with the right technology
- Work with the Sr. Solution Architect on the roadmap and Agile delivery
- Be a creative and innovative thinker with attention to detail
- Apply design principles, including SOLID principles and clean architecture
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Business Acumen:
- Understanding business requirements: ability to translate business needs into technical solutions
- Cost management: balancing technical requirements with budget constraints
- Stakeholder management: engaging with stakeholders to align technical solutions with business goals
- Strategic thinking: aligning architectural decisions with long-term business strategies
Required Qualifications:
- B.Tech / MCA / M.Sc / M.Tech
- 8+ years of experience
- System design and architecture: expertise in designing scalable, reliable, and secure systems; knowledge of architectural patterns (e.g., microservices, event-driven architecture, SOA)
- Cloud computing: proficiency in cloud platforms like AWS, Azure, or Google Cloud; experience with cloud-native services, deployment, and cost optimization
- Programming and development: strong understanding of programming languages (e.g., Java, Python, etc.); familiarity with frameworks, APIs, and development tools
- Integration and middleware: experience with system integration, middleware, and APIs; knowledge of tools like REST, SOAP, GraphQL, and message brokers (e.g., Kafka, RabbitMQ)
- Database management: proficiency in relational (SQL) and non-relational (NoSQL) databases; understanding of data modeling and database optimization
- DevOps and CI/CD: familiarity with DevOps practices, tools (e.g., Jenkins, Git, Docker, Kubernetes), and CI/CD pipelines
- Security: knowledge of security best practices, encryption, and compliance standards (e.g., HIPAA)
- Enterprise tools and platforms: experience with BPM platforms (e.g., Camunda)
Preferred Qualifications:
Certifications (optional but valuable):
- Cloud certifications: AWS Certified Solutions Architect, Microsoft Azure Solutions Architect, or Google Cloud Professional Architect
- Enterprise architecture: TOGAF (The Open Group Architecture Framework)
- Security certifications: CISSP, CISM, or similar
Soft Skills:
- Problem-solving: ability to analyze complex problems and design effective solutions
- Communication: strong verbal and written communication skills to interact with stakeholders, developers, and business teams
- Leadership: ability to lead technical teams and guide them in implementing architectural solutions
- Collaboration: working effectively with cross-functional teams, including business analysts, developers, and project managers
- Adaptability: staying updated with emerging technologies and adapting to new tools and methodologies
Posted 8 hours ago
6.0 - 11.0 years
11 - 21 Lacs
Udaipur, Jaipur, Bengaluru
Work from Office
Data Architect
Kadel Labs is a leading IT services company delivering top-quality technology solutions since 2017, focused on enhancing business operations and productivity through tailored, scalable, and future-ready solutions. With deep domain expertise and a commitment to innovation, we help businesses stay ahead of technological trends. As a CMMI Level 3 and ISO 27001:2022 certified company, we ensure best-in-class process maturity and information security, enabling organizations to achieve their digital transformation goals with confidence and efficiency.
Role: Data Architect
Experience: 6-10 Yrs
Location: Udaipur, Jaipur, Bangalore
Domain: Telecom
Job Description: We are seeking an experienced Telecom Data Architect to join our team. In this role, you will be responsible for designing comprehensive data architecture and technical solutions specifically for telecommunications industry challenges, leveraging TM Forum frameworks and modern data platforms. You will work closely with customers and technology partners to deliver data solutions that address complex telecommunications business requirements, including customer experience management, network optimization, revenue assurance, and digital transformation initiatives.
Key Responsibilities:
- Design and articulate enterprise-scale telecom data architectures incorporating TM Forum standards and frameworks, including SID (Shared Information/Data Model), TAM (Telecom Application Map), and eTOM (enhanced Telecom Operations Map)
- Develop comprehensive data models aligned with TM Forum guidelines for telecommunications domains such as Customer, Product, Service, Resource, and Partner management
- Create data architectures that support telecom-specific use cases, including customer journey analytics, network performance optimization, fraud detection, and revenue assurance
- Design solutions leveraging Microsoft Azure and Databricks for telecom data processing and analytics
- Conduct technical discovery sessions with telecom clients to understand their OSS/BSS architecture, network analytics needs, customer experience requirements, and digital transformation objectives
- Design and deliver proofs of concept (POCs) and technical demonstrations showcasing modern data platforms solving real-world telecommunications challenges
- Create comprehensive architectural diagrams and implementation roadmaps for telecom data ecosystems spanning cloud, on-premises, and hybrid environments
- Evaluate and recommend appropriate big data technologies, cloud platforms, and processing frameworks based on telecom-specific requirements and regulatory compliance needs
- Design data governance frameworks compliant with telecom industry standards and regulatory requirements (GDPR, data localization, etc.)
- Stay current with the latest advancements in data technologies, including cloud services, data processing frameworks, and AI/ML capabilities
- Contribute to the development of best practices, reference architectures, and reusable solution components for accelerating proposal development
Required Skills:
- 6-10 years of experience in data architecture, data engineering, or solution architecture roles, with at least 5 years in the telecommunications industry
- Deep knowledge of TM Forum frameworks, including SID (Shared Information/Data Model), eTOM, and TAM, and their practical implementation in telecom data architectures
- Demonstrated ability to estimate project efforts, resource requirements, and implementation timelines for complex telecom data initiatives
- Hands-on experience building data models and platforms aligned with TM Forum standards and telecommunications business processes
- Strong understanding of telecom OSS/BSS systems, network management, customer experience management, and revenue management domains
- Hands-on experience with data platforms including Databricks and Microsoft Azure in telecommunications contexts
- Experience with modern data processing frameworks such as Apache Kafka, Spark, and Airflow for real-time telecom data streaming
- Proficiency in the Azure cloud platform and its data services, with an understanding of telecom-specific deployment requirements
- Knowledge of system monitoring and observability tools for telecommunications data infrastructure
- Experience implementing automated testing frameworks for telecom data platforms and pipelines
- Familiarity with telecom data integration patterns, ETL/ELT processes, and data governance practices specific to telecommunications
- Experience designing and implementing data lakes, data warehouses, and machine learning pipelines for telecom use cases
- Proficiency in programming languages commonly used in data processing (Python, Scala, SQL), with telecom domain applications
- Understanding of telecommunications regulatory requirements and data privacy compliance (GDPR, local data protection laws)
- Excellent communication and presentation skills, with the ability to explain complex technical concepts to telecom stakeholders
- Strong problem-solving skills and the ability to think creatively to address telecommunications industry challenges
- Good to have: TM Forum certifications or other telecommunications industry certifications
- Relevant data platform certifications, such as Databricks or Azure Data Engineer, are a plus
- Willingness to travel as required
Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.
Visit us:
https://kadellabs.com/
https://in.linkedin.com/company/kadel-labs
https://www.glassdoor.co.in/Overview/Working-at-Kadel-Labs-EI_IE4991279.11,21.htm
Posted 8 hours ago
5.0 - 9.0 years
12 - 14 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
We are seeking a skilled ETL Data Tester to join our dynamic team on a 6-month contract. The ideal candidate will focus on implementing ETL processes, creating comprehensive test suites using Python, and validating data quality through advanced SQL queries. The role involves collaborating with Data Scientists, Engineers, and Software teams to develop and monitor data tools, frameworks, and infrastructure changes. Proficiency in HiveQL, Spark SQL, and Big Data concepts is essential. The candidate should also have experience in data testing tools like DBT, iCEDQ, and QuerySurge, along with expertise in Linux/Unix and messaging systems such as Kafka or RabbitMQ. Strong analytical and debugging skills are required, with a focus on continuous automation and integration of data from multiple sources.
Location: Remote, Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
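To give a concrete sense of the "validating data quality" work above: a minimal Python reconciliation check of the kind an ETL tester might automate (row counts match, no duplicate keys, no nulls in required columns). The schema and data are hypothetical; tools like iCEDQ or QuerySurge run comparable checks at scale.

```python
def validate_load(source_rows, target_rows, key="id"):
    """Basic post-load reconciliation: compare row counts, detect duplicate
    keys in the target, and flag null values. Returns a list of issues;
    an empty list means the load passed."""
    issues = []
    if len(source_rows) != len(target_rows):
        issues.append(f"row count mismatch: {len(source_rows)} vs {len(target_rows)}")
    keys = [r[key] for r in target_rows]
    if len(keys) != len(set(keys)):
        issues.append("duplicate keys in target")
    for r in target_rows:
        if any(v is None for v in r.values()):
            issues.append(f"null value in row {r[key]}")
    return issues

src = [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}]
good = validate_load(src, [{"id": 1, "amt": 10}, {"id": 2, "amt": 20}])  # clean load
bad = validate_load(src, [{"id": 1, "amt": None}])                       # short + null
```

In practice the row lists would come from SQL queries against the source and target systems rather than literals.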
Posted 8 hours ago
5.0 - 9.0 years
12 - 16 Lacs
Mumbai, New Delhi, Bengaluru
Work from Office
The role involves hands-on experience with data testing, data integration, and supporting data quality in big data environments. Key responsibilities include selecting and integrating data tools and frameworks, providing technical guidance for software engineers, and collaborating with data scientists, data engineers, and other stakeholders. This role requires implementing ETL processes, monitoring performance, advising on infrastructure, and defining data retention policies. Candidates should be proficient in Python, advanced SQL, HiveQL, and Spark SQL, with hands-on experience in data testing tools like DBT, iCEDQ, QuerySurge, Denodo, or Informatica. Strong experience with NoSQL, Linux/Unix, and messaging systems (Kafka or RabbitMQ) is also required. Additional responsibilities include troubleshooting, debugging, UAT with business users in Agile environments, and automating tests to increase coverage and efficiency.
Location: Chennai, Hyderabad, Pune, Kolkata, Ahmedabad, Remote
Posted 8 hours ago
3.0 - 6.0 years
15 - 25 Lacs
Bengaluru
Work from Office
Job summary Join our dynamic team as a Team Member, where you will leverage your expertise in REST API, RabbitMQ, Kafka, PostgreSQL, Quarkus, Java 8, and Advanced Java. This hybrid role offers the opportunity to work on innovative projects that drive our company's success. With a focus on collaboration and technical excellence, you will contribute to impactful solutions that enhance our services. Responsibilities Develop and maintain robust REST API solutions to support seamless integration across platforms. Implement messaging solutions using RabbitMQ and Kafka to ensure reliable and efficient data processing. Design and optimize PostgreSQL databases to enhance data storage and retrieval performance. Utilize Quarkus to build high-performance, scalable applications that meet business requirements. Write clean, efficient, and maintainable code in Java 8 and Advanced Java to deliver high-quality software solutions. Collaborate with cross-functional teams to gather and analyze requirements, ensuring alignment with project goals. Participate in code reviews and provide constructive feedback to enhance code quality and team performance. Troubleshoot and resolve technical issues, ensuring minimal disruption to business operations. Contribute to the continuous improvement of development processes and practices. Stay updated with the latest industry trends and technologies to drive innovation within the team. Ensure adherence to best practices in software development, including security and performance optimization. Document technical specifications and project progress to facilitate knowledge sharing and collaboration. Engage in regular team meetings and discussions to foster a collaborative and supportive work environment. Qualifications Possess a strong understanding of REST API development and integration. Demonstrate proficiency in RabbitMQ and Kafka for messaging solutions. Have experience in designing and managing PostgreSQL databases. Show expertise in using Quarkus for application development. Be skilled in Java 8 and Advanced Java programming. Exhibit excellent problem-solving and analytical skills. Display effective communication and teamwork abilities.
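Per-key ordering is what makes Kafka suited to the messaging work described above: the producer's default partitioner routes every record with the same key to the same partition. A minimal Python sketch of that routing idea (the real Java client hashes key bytes with murmur2; the FNV-1a hash here is just a self-contained, deterministic stand-in):

```python
def assign_partition(key: str, num_partitions: int) -> int:
    """Simplified stand-in for Kafka's default key-based partitioner.

    The actual client uses murmur2 over the key bytes; FNV-1a (32-bit)
    is used here only so the example runs with no dependencies.
    """
    h = 2166136261
    for b in key.encode("utf-8"):
        h = ((h ^ b) * 16777619) & 0xFFFFFFFF  # FNV-1a step, kept to 32 bits
    return h % num_partitions

# Records sharing a key always land on the same partition, which is
# what preserves per-key ordering for consumers.
orders = ["order-42", "order-7", "order-42", "order-99", "order-7"]
placements = [assign_partition(k, 6) for k in orders]
```

Because the mapping is a pure function of the key, both occurrences of "order-42" above are guaranteed to be placed on the same partition.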
Posted 8 hours ago
8.0 - 13.0 years
20 - 35 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Kafka Admin - Architecture
Posted 9 hours ago
5.0 - 7.0 years
15 - 18 Lacs
Bengaluru
Work from Office
We are seeking a highly skilled GCP Data Engineer with experience in designing and developing data ingestion frameworks, real-time processing solutions, and data transformation frameworks using open-source tools. The role involves operationalizing open-source data-analytic tools for enterprise use, ensuring adherence to data governance policies, and performing root-cause analysis on data-related issues. The ideal candidate should have a strong understanding of cloud platforms, especially GCP, with hands-on expertise in tools such as Kafka, Apache Spark, Python, Hadoop, and Hive. Experience with data governance and DevOps practices, along with GCP certifications, is preferred.
Posted 9 hours ago
8.0 - 13.0 years
85 - 90 Lacs
Noida
Work from Office
About the Role We are looking for a Staff Engineer - Real-time Data Processing to design and develop highly scalable, low-latency data streaming platforms and processing engines. This role is ideal for engineers who enjoy building core systems and infrastructure that enable mission-critical analytics at scale. You'll work on solving some of the toughest data engineering challenges in healthcare. A Day in the Life Architect, build, and maintain a large-scale real-time data processing platform. Collaborate with data scientists, product managers, and engineering teams to define system architecture and design. Optimize systems for scalability, reliability, and low-latency performance. Implement robust monitoring, alerting, and failover mechanisms to ensure high availability. Evaluate and integrate open-source and third-party streaming frameworks. Contribute to the overall engineering strategy and promote best practices for stream and event processing. Mentor junior engineers and lead technical initiatives. What You Need 8+ years of experience in backend or data engineering roles, with a strong focus on building real-time systems or platforms. Hands-on experience with stream processing frameworks like Apache Flink, Apache Kafka Streams, or Apache Spark Streaming. Proficiency in Java, Scala, Python, or Go for building high-performance services. Strong understanding of distributed systems, event-driven architecture, and microservices. Experience with Kafka, Pulsar, or other distributed messaging systems. Working knowledge of containerization tools like Docker and orchestration tools like Kubernetes. Proficiency in observability tools such as Prometheus, Grafana, and OpenTelemetry. Experience with cloud-native architectures and services (AWS, GCP, or Azure). Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
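Much of the stream-processing work described above reduces to windowed aggregation over keyed events. A simplified sketch of a tumbling-window count in plain Python (frameworks like Flink or Kafka Streams add state stores, watermarks, and fault tolerance on top of this shape; the function and variable names here are illustrative, not any framework's API):

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count (timestamp_ms, key) events per fixed-size tumbling window.

    Each event belongs to exactly one window, identified by the window's
    start timestamp: floor(ts / window_ms) * window_ms.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_ms) * window_ms
        counts[(window_start, key)] += 1
    return dict(counts)

# Four events spread across two one-second windows.
events = [(1000, "click"), (1500, "click"), (2500, "click"), (2600, "view")]
result = tumbling_window_counts(events, window_ms=1000)
```

The two "click" events at 1000 ms and 1500 ms fall in the window starting at 1000, while the events at 2500 ms and 2600 ms fall in the window starting at 2000.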
Posted 9 hours ago
8.0 - 13.0 years
85 - 90 Lacs
Noida
Work from Office
About the Role We are looking for a Staff Engineer specialized in Master Data Management to design and develop our next-generation MDM platform. This role is ideal for engineers who have created or contributed significantly to MDM solutions. You'll lead the architecture and development of our core MDM engine, focusing on data modeling, matching algorithms, and governance workflows that enable our customers to achieve a trusted, 360-degree view of their critical business data. A Day in the Life Collaborate with data scientists, product managers, and engineering teams to define system architecture and design. Architect and develop scalable, fault-tolerant MDM platform components that handle various data domains. Design and implement sophisticated entity matching and merging algorithms to create golden records across disparate data sources. Develop or integrate flexible data modeling frameworks that can adapt to different industries and use cases. Create robust data governance workflows, including approval processes, audit trails, and role-based access controls. Build data quality monitoring and remediation capabilities into the MDM platform. Collaborate with product managers, solution architects, and customers to understand industry-specific MDM requirements. Develop REST APIs and integration patterns for connecting the MDM platform with various enterprise systems. Mentor junior engineers and promote best practices in MDM solution development. Lead technical design reviews and contribute to the product roadmap. What You Need 8+ years of software engineering experience, with at least 5 years focused on developing master data management solutions or components. Proven experience creating or significantly contributing to commercial MDM platforms, data integration tools, or similar enterprise data management solutions. Deep understanding of MDM concepts including data modeling, matching/merging algorithms, data governance, and data quality management.
Strong expertise in at least one major programming language such as Java, Scala, Python, or Go. Experience with database technologies including relational (Snowflake, Databricks, PostgreSQL) and NoSQL systems (MongoDB, Elasticsearch). Knowledge of data integration patterns and ETL/ELT processes. Experience designing and implementing RESTful APIs and service-oriented architectures. Understanding of cloud-native development and deployment on AWS or Azure. Familiarity with containerization (Docker) and orchestration tools (Kubernetes). Experience with event-driven architectures and messaging systems (Kafka, RabbitMQ). Strong understanding of data security and privacy considerations, especially for sensitive master data. Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
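The match-and-merge step at the heart of an MDM engine can be illustrated with a toy deterministic matcher plus a simple survivorship rule. This is a hedged sketch with invented names, not any product's API; real platforms layer probabilistic scoring, blocking, and configurable survivorship policies on top of this skeleton:

```python
def normalize(record):
    """Canonicalize field values before comparison (illustrative rules only)."""
    return {k: str(v).strip().lower() for k, v in record.items()}

def is_match(a, b, keys=("email",)):
    """Toy deterministic matcher: records match if any key field agrees
    after normalization. Real MDM engines add fuzzy/probabilistic scoring."""
    na, nb = normalize(a), normalize(b)
    return any(na.get(k) and na.get(k) == nb.get(k) for k in keys)

def merge(a, b):
    """Simple survivorship rule: start from record a, let non-empty
    values from record b (e.g. the newer source) win."""
    golden = dict(a)
    for k, v in b.items():
        if v:
            golden[k] = v
    return golden

# Two source records for the same person, from hypothetical CRM and ERP feeds.
crm = {"name": "Ann Lee", "email": "ANN@x.com", "phone": ""}
erp = {"name": "Ann Lee", "email": "ann@x.com", "phone": "555-0101"}
golden = merge(crm, erp) if is_match(crm, erp) else None
```

The resulting golden record keeps the CRM fields but fills the missing phone number from the ERP record, which is the essence of building a 360-degree view from disparate sources.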
Posted 9 hours ago
8.0 - 13.0 years
85 - 90 Lacs
Noida
Work from Office
About the Role We are looking for a Staff Engineer to lead the design and development of a scalable, secure, and robust data platform. You will play a key role in building data platform capabilities for data quality, metadata management, lineage tracking, and compliance across all data layers. If you're passionate about building foundational data infrastructure that accelerates innovation in healthcare, we'd love to talk. A Day in the Life Architect, design, and build scalable data governance tools and frameworks. Collaborate with cross-functional teams to ensure data compliance, security, and usability. Lead initiatives around metadata management, data lineage, and data cataloging. Define and evangelize standards and best practices across data engineering teams. Own the end-to-end lifecycle of governance tooling, from prototyping to production deployment. Mentor and guide junior engineers and contribute to technical leadership across the organization. Drive innovation in privacy-by-design, regulatory compliance (e.g., HIPAA), and data observability solutions. What You Need 8+ years of experience in software engineering. Strong experience building distributed systems for metadata management, data lineage, and quality tracking. Proficient in backend development (Python, Java, Scala, or Go) and familiar with RESTful API design. Expertise in modern data stacks: Kafka, Spark, Airflow, Snowflake, etc. Experience with open-source data governance frameworks like Apache Atlas, Amundsen, or DataHub is a big plus. Familiarity with cloud platforms (AWS, Azure, GCP) and their native data governance offerings. Prior experience in building metadata management frameworks at scale. Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
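Lineage tracking of the kind described above ultimately stores a dataset dependency graph and answers impact queries over it ("if this source changes, what breaks downstream?"). A minimal illustrative sketch; the class and method names are hypothetical, not the Apache Atlas, Amundsen, or DataHub API:

```python
from collections import defaultdict

class LineageGraph:
    """Minimal dataset-level lineage store (hypothetical names).

    Records which datasets are derived from which, and answers
    downstream-impact queries via graph traversal.
    """
    def __init__(self):
        self.downstream = defaultdict(set)

    def record_edge(self, source, target):
        """Declare that `target` is derived from `source`."""
        self.downstream[source].add(target)

    def impacted_by(self, dataset):
        """Return every dataset transitively downstream of `dataset`."""
        seen, stack = set(), [dataset]
        while stack:
            for nxt in self.downstream[stack.pop()]:
                if nxt not in seen:
                    seen.add(nxt)
                    stack.append(nxt)
        return seen

# A three-hop pipeline: raw -> staging -> mart -> dashboard.
g = LineageGraph()
g.record_edge("raw.events", "staging.events")
g.record_edge("staging.events", "mart.daily_claims")
g.record_edge("mart.daily_claims", "dashboard.kpis")
impact = g.impacted_by("raw.events")
```

Production lineage systems attach much richer metadata to each node and edge (owners, schemas, job runs), but the impact query itself is this traversal.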
Posted 9 hours ago
2.0 - 6.0 years
6 - 18 Lacs
Hyderabad
Work from Office
Familiarity with programming patterns in Python. Exposure to Kafka, RabbitMQ, or AWS EventBridge. Data science exposure. Built or contributed to agentic systems, ML/AI pipelines, or intelligent automation tools. Understanding of MLOps. Food allowance Health insurance Provident fund
Posted 9 hours ago
5.0 - 9.0 years
0 - 0 Lacs
Mumbai, Pune, Bengaluru
Hybrid
Data Engineer. Experience: 5 to 10 years. Location: Pune (Yerawada), hybrid. Primary skills: Scala coding, Spark SQL.
**Key Responsibilities:**
- Design and implement high-performance data pipelines using Apache Spark and Scala.
- Optimize Spark jobs for efficiency and scalability.
- Collaborate with diverse data sources and teams to deliver valuable insights.
- Monitor and troubleshoot production pipelines to ensure smooth operations.
- Maintain thorough documentation for all systems and code.
**Required Skills & Qualifications:**
- Minimum of 3 years of hands-on experience with Apache Spark and Scala.
- Strong grasp of distributed computing principles and Spark internals.
- Proficiency in working with big data technologies like HDFS, Hive, Kafka, and HBase.
- Ability to write optimized Spark jobs using Scala effectively.
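The core of such a Spark pipeline is usually a keyed aggregation; a groupBy-and-sum stage has the same logical shape as this plain-Python sketch (Spark distributes the work with a shuffle, but the computation per key is identical; the function and field names here are illustrative):

```python
from collections import defaultdict

def group_by_sum(rows, key_field, value_field):
    """Plain-Python equivalent of a Spark groupBy(key).sum(value) stage.

    In Spark this runs across partitions with a shuffle step; locally
    it is just an accumulation keyed by the grouping column.
    """
    totals = defaultdict(float)
    for row in rows:
        totals[row[key_field]] += row[value_field]
    return dict(totals)

rows = [
    {"user": "a", "amount": 10.0},
    {"user": "b", "amount": 5.0},
    {"user": "a", "amount": 2.5},
]
totals = group_by_sum(rows, "user", "amount")
```

Optimizing the real Spark version is then mostly about minimizing shuffle volume (partitioning, pre-aggregation) rather than changing this logic.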
Posted 10 hours ago
3.0 - 8.0 years
25 - 30 Lacs
Udaipur
Work from Office
Hiring Sr. Node.js Engineer with expertise in Node.js, Express, CI/CD, RabbitMQ/Kafka, the Event Loop, performance tuning, and debugging. Lead dev projects and quality, design scalable apps, and collaborate across teams. Strong communication & leadership skills.
Posted 10 hours ago
7.0 - 12.0 years
22 - 37 Lacs
Noida, Gurugram, Bengaluru
Hybrid
We're Hiring: Senior Java Backend Developers. Locations: Gurgaon / Noida / Bangalore (only candidates available for F2F interview in these locations). Experience: 7-14 years ONLY. Joiners: end of July or 30 days' notice. Key Skills: Core Java, Spring Boot, Microservices, Multithreading, Cloud (any). Minimum Education: 15 years (B.Sc, BCA, B.Tech, MCA, M.Sc, M.Tech). Hiring Process: 1) MCQ Test 2) Face-to-Face Interview (weekday/weekend based on panel availability). NOTE: Do NOT apply if your experience falls outside the stated range. Only apply if you are ready for the test and F2F interview. Irrelevant profiles will not be considered. Send relevant profiles to: harpreet.r@anlage.co.in
Posted 10 hours ago
20.0 years
20 - 40 Lacs
Thiruvananthapuram
Remote
About the Job As a specialist in electronics and software for the past 20 years, in-tech is a dynamic, fast-growing company headquartered in Munich, Germany, employing 1,350 people globally across 20 project locations in 8 countries. It has been a strategic partner of Infosys Ltd since 2024, after becoming a 100% subsidiary. Our India location Our India-based locations bring together the best of Indian and European work cultures, creating a unique in-tech environment that promotes strong team spirit and a positive, collaborative workplace. Now part of Infosys, we're expanding our capabilities to meet a growing range of digital engineering requirements. This includes close collaboration with Infosys teams on cutting-edge engineering projects. We're currently inviting applications for the role of Lead Java Developer for an exciting Infosys project based in Trivandrum/Bangalore/Chennai with hybrid work options. Responsibilities The primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes. You would be a key contributor to building efficient programs/systems, so apply if you think you fit right in and want to help our clients navigate their next in their digital transformation journey. Requirements Must have 5-10 years of relevant experience in software development with Java, including expertise in Java 8 and above, microservices architecture, and multi-threaded programming.
Proficiency in frameworks such as Spring Boot, Hibernate, and Spring Security. Experience with distributed computing, messaging systems (Kafka, RabbitMQ), and caching solutions (Redis, ElastiCache). Strong understanding of RESTful API design, Web Services, SOA, and microservices development. Hands-on experience with database technologies including SQL (Oracle, PostgreSQL, MySQL) and NoSQL (MongoDB, Cassandra). Familiarity with DevOps practices, CI/CD pipelines, Docker, Kubernetes, and cloud platforms (AWS/GCP/Azure). Proficient in Linux/Unix environments with shell scripting capabilities. Experience in performance tuning, profiling, and optimizing Java applications. Strong understanding of Agile methodologies, Scrum, and test-driven development (TDD) using JUnit and Mockito. Apply with us If you have the experience and team spirit and are looking for a great place to work, then start your job with us. As part of our dedication to the diversity of our workforce, in-tech is committed to equal employment opportunity without regard for age, race, colour, national origin, gender, ethnicity, protected veteran status, disability, sexual orientation, gender identity, or religion. Java Lead in-tech.com Job Types: Full-time, Permanent Pay: ₹2,000,000.00 - ₹4,000,000.00 per year Benefits: Health insurance Provident Fund Work from home Schedule: Day shift Monday to Friday Experience: Java: 5 years (Required) Location: Trivandrum, Kerala (Required) Work Location: In person
Posted 11 hours ago
Kafka, a popular distributed streaming platform, has gained significant traction in the tech industry in recent years. Job opportunities for Kafka professionals in India have been on the rise, with many companies looking to leverage Kafka for real-time data processing and analytics. If you are a job seeker interested in Kafka roles, here is a comprehensive guide to help you navigate the job market in India.
Major tech hubs such as Bengaluru, Pune, Hyderabad, and Chennai are known for their thriving tech industries and have a high demand for Kafka professionals.
The average salary range for Kafka professionals in India varies based on experience levels. Entry-level positions may start at around INR 6-8 lakhs per annum, while experienced professionals can earn between INR 12-20 lakhs per annum.
Career progression in Kafka typically follows a path from Junior Developer to Senior Developer, and then to a Tech Lead role. As you gain more experience and expertise in Kafka, you may also explore roles such as Kafka Architect or Kafka Consultant.
In addition to Kafka expertise, employers often look for professionals with skills in:
- Apache Spark
- Apache Flink
- Hadoop
- Java/Scala programming
- Data engineering and data architecture
As you explore Kafka job opportunities in India, remember to showcase your expertise in Kafka and related skills during interviews. Prepare thoroughly, demonstrate your knowledge confidently, and stay updated with the latest trends in Kafka to excel in your career as a Kafka professional. Good luck with your job search!