
13024 Kafka Jobs - Page 17

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

2.0 - 5.0 years

14 - 17 Lacs

Mumbai

Work from Office

As a Data Engineer at IBM, you'll play a vital role in application development and design, providing regular support and guidance to project teams on complex coding, issue resolution, and execution. Your primary responsibilities include:
- Lead the design and construction of new solutions using the latest technologies, always looking to add business value and meet user requirements
- Strive for continuous improvement by testing the built solution and working under an agile framework
- Discover and implement the latest technology trends to build creative solutions

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Apache Spark (PySpark): in-depth knowledge of Spark's architecture, core APIs, and PySpark for distributed data processing
- Big data technologies: familiarity with Hadoop, HDFS, Kafka, and other big data tools
- Data engineering skills: strong understanding of ETL pipelines, data modeling, and data warehousing concepts
- Strong proficiency in Python: expertise in Python programming with a focus on data processing and manipulation
- Data processing frameworks: knowledge of libraries such as Pandas and NumPy
- SQL proficiency: experience writing optimized SQL queries for large-scale data analysis and transformation
- Cloud platforms: experience with AWS, Azure, or GCP, including cloud storage systems

Preferred technical and professional experience:
- Define, drive, and implement an architecture strategy and standards for end-to-end monitoring
- Partner with other technology teams, including application development, enterprise architecture, testing services, and network engineering
- Good to have: experience with detection and prevention tools for company products, the platform, and customer-facing systems
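The ETL and SQL requirements above are concrete enough to sketch. As a rough illustration of an extract-transform-load rollup, here is a self-contained example using Python's stdlib sqlite3 as a stand-in for a warehouse (the events table, column names, and values are all invented for the example, not taken from the listing):

```python
import sqlite3

# Extract: raw event rows as they might arrive from an upstream source.
raw_events = [
    ("2024-01-01", "mumbai", 120.0),
    ("2024-01-01", "pune", 80.0),
    ("2024-01-02", "mumbai", 95.5),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (day TEXT, city TEXT, amount REAL)")
conn.executemany("INSERT INTO events VALUES (?, ?, ?)", raw_events)

# Transform + load: aggregate into a daily summary table, the kind of
# rollup an optimized warehouse query would produce.
conn.execute("""
    CREATE TABLE daily_totals AS
    SELECT day, SUM(amount) AS total, COUNT(*) AS n_events
    FROM events
    GROUP BY day
    ORDER BY day
""")

rows = conn.execute("SELECT day, total, n_events FROM daily_totals").fetchall()
print(rows)  # [('2024-01-01', 200.0, 2), ('2024-01-02', 95.5, 1)]
```

The same GROUP BY rollup translates directly to Spark SQL or a PySpark `groupBy().agg()` at scale.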

Posted 1 day ago

Apply

2.0 - 5.0 years

13 - 17 Lacs

Chennai

Work from Office

The Developer leads cloud application development and deployment for the client based on AWS development methodology, tools, and best practices. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience with technologies like Spring Boot and Java
- Demonstrated technical leadership on high-impact, customer-facing projects
- Experience building web applications in the Java/J2EE stack
- Experience with a UI framework such as React JS
- Working knowledge of a messaging system (Kafka preferred)

Preferred technical and professional experience:
- Strong experience in concurrent design and multi-threading; object-oriented programming (OOP); SQL Server, Oracle, or MySQL
- Working knowledge of Azure or AWS cloud
- Experience building applications in a container-based environment (Docker/Kubernetes) on AWS Cloud
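"Working knowledge of any messaging system" above boils down to the produce/consume pattern. Kafka itself needs a running broker, so as a hedged stand-in this sketch shows the same pattern with Python's stdlib queue and threads (the order-event payloads and sentinel convention are invented for the example):

```python
import queue
import threading

topic = queue.Queue()  # stand-in for a Kafka topic partition

def producer():
    # Publish a few order events, then a sentinel to signal end-of-stream.
    for order_id in range(3):
        topic.put({"order_id": order_id, "status": "CREATED"})
    topic.put(None)

consumed = []

def consumer():
    # Poll the topic until the sentinel arrives, like a consumer loop.
    while True:
        msg = topic.get()
        if msg is None:
            break
        consumed.append(msg["order_id"])

t1 = threading.Thread(target=producer)
t2 = threading.Thread(target=consumer)
t1.start(); t2.start()
t1.join(); t2.join()

print(consumed)  # [0, 1, 2]
```

A real Kafka consumer adds partitions, consumer groups, and offset commits on top of this loop, but the decoupled producer/consumer shape is the same.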

Posted 1 day ago

Apply

4.0 - 9.0 years

12 - 16 Lacs

Kochi

Work from Office

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases
- Process data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS
- Develop efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies
- Develop streaming pipelines
- Work with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on Azure cloud data platforms
- Experience with Databricks, Azure HDInsight, Azure Data Factory, Synapse, and SQL Server
- Good to excellent SQL skills
- Exposure to streaming solutions and message brokers such as Kafka

Preferred technical and professional experience:
- Certification in Azure and Databricks, or Cloudera Spark Certified Developer
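The streaming-pipeline experience asked for above usually comes down to windowed aggregation over timestamped events. A minimal pure-Python sketch of a tumbling window (the event shape and the 10-second window width are assumptions for the example; a production pipeline would express this in Spark Structured Streaming or Kafka Streams):

```python
from collections import defaultdict

# (epoch_seconds, value) pairs as they might arrive from a stream.
events = [(1, 5), (4, 3), (11, 7), (14, 1), (25, 2)]

WINDOW = 10  # tumbling window width in seconds

def tumbling_sums(stream, window):
    """Sum values per window; each window is keyed by its start time."""
    sums = defaultdict(int)
    for ts, value in stream:
        # Integer division buckets each timestamp into its window.
        sums[(ts // window) * window] += value
    return dict(sums)

result = tumbling_sums(events, WINDOW)
print(result)  # {0: 8, 10: 8, 20: 2}
```

Real streaming engines add the hard parts on top of this bucketing: late data, watermarks, and incremental state, but the window-key computation is the same idea.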

Posted 1 day ago

Apply

5.0 - 7.0 years

14 - 18 Lacs

Mumbai

Work from Office

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the Azure Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases
- Process data with Spark, Python, PySpark, and Hive, HBase, or other NoSQL databases on the Azure Cloud Data Platform or HDFS
- Develop efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies
- Develop streaming pipelines
- Work with Hadoop/Azure ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark and Kafka

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 5-7+ years of total experience in data management (DW, DL, data platforms, lakehouse) and data engineering
- Minimum 4+ years of experience in big data technologies, with extensive data engineering experience in Spark with Python or Scala
- Minimum 3 years of experience on Azure cloud data platforms
- Experience with Databricks, Azure HDInsight, Azure Data Factory, Synapse, and SQL Server
- Exposure to streaming solutions and message brokers such as Kafka
- Experience with Unix/Linux commands and basic shell scripting

Preferred technical and professional experience:
- Certification in Azure and Databricks, or Cloudera Spark Certified Developer

Posted 1 day ago

Apply

2.0 - 6.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Function: Software Engineering, Backend Development

Responsibilities:
- Work on building the biggest neo-banking app in India
- Own the design process and the implementation of standard software engineering methodologies while improving performance, scalability, and maintainability
- Translate functional and technical requirements into detailed design and architecture
- Collaborate with UX designers and product owners on detailed product requirements
- Be part of a fast-growing engineering group
- Mentor other engineers, help define our tech culture, and help build a fast-growing team

Requirements:
- 2-6 years of experience in product development, design, and architecture
- Hands-on expertise in at least one of the following programming languages: Java, Python, NodeJS, or Go
- Hands-on expertise in SQL and NoSQL databases
- Expertise in problem solving, data structures, and algorithms
- Deep understanding of and experience in object-oriented design
- Ability to design and architect horizontally scalable software systems
- Drive to constantly learn and improve yourself and the processes around you
- Mentoring, collaborating, and sharing knowledge with other engineers on the team
- Self-starter who strives to write the best code possible, day in and day out
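The "data structures and algorithms" expectation above is often probed with a classic exercise such as an LRU cache. A compact sketch using stdlib OrderedDict (the capacity and the keys are arbitrary illustration values):

```python
from collections import OrderedDict

class LRUCache:
    """Least-recently-used cache: evicts the stalest key at capacity."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self.data: OrderedDict = OrderedDict()

    def get(self, key):
        if key not in self.data:
            return None
        self.data.move_to_end(key)  # mark as most recently used
        return self.data[key]

    def put(self, key, value):
        if key in self.data:
            self.data.move_to_end(key)
        self.data[key] = value
        if len(self.data) > self.capacity:
            self.data.popitem(last=False)  # evict least recently used

cache = LRUCache(2)
cache.put("a", 1)
cache.put("b", 2)
cache.get("a")     # touch "a" so "b" becomes the eviction candidate
cache.put("c", 3)  # capacity exceeded: "b" is evicted
print(cache.get("b"))  # None
print(cache.get("a"))  # 1
```

OrderedDict gives O(1) get and put here; the same structure is conventionally built from a hash map plus a doubly linked list when asked from first principles.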

Posted 1 day ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Pune

Work from Office

The Developer leads cloud application development and deployment. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Strong proficiency in Java, Spring Framework, Spring Boot, and RESTful APIs; excellent understanding of OOP and design patterns
- Strong knowledge of ORM tools like Hibernate or JPA and Java-based microservices frameworks; hands-on experience with Spring Boot microservices
- Primary skills: Core Java, Spring Boot, Java2/EE, microservices; Hadoop ecosystem (HBase, Hive, MapReduce, HDFS, Pig, Sqoop, etc.); Spark; Python is good to have
- Strong knowledge of microservice logging, monitoring, debugging, and testing
- In-depth knowledge of relational databases (e.g., MySQL)
- Experience with container platforms such as Docker and Kubernetes, and messaging platforms such as Kafka or IBM MQ
- Good understanding of test-driven development
- Familiarity with Ant, Maven, or other build automation frameworks; good knowledge of basic UNIX commands

Preferred technical and professional experience:
- Experience in concurrent design and multi-threading

Posted 1 day ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Kochi

Work from Office

Role Overview: As a Java Developer, you will be responsible for designing, developing, and maintaining our high-quality Java-based backend applications. You will work closely with our cross-functional development team to ensure the efficient and reliable delivery of our products and services.

Responsibilities:
- Design, develop, and maintain robust and scalable Java backend applications
- Collaborate with the development team to analyze requirements and translate them into technical specifications
- Write clean, efficient, and well-documented code
- Conduct unit testing and integration testing to ensure code quality
- Optimize application performance and scalability
- Troubleshoot and resolve technical issues
- Stay up to date with the latest Java technologies and industry trends

Required education: Bachelor's Degree

Required technical and professional expertise:
- At least 3 years of hands-on experience with Java backend development
- Strong understanding of object-oriented programming principles and design patterns
- Experience working with cloud-native environments and platforms (e.g., AWS, GCP, Azure)
- Familiarity with containerization technologies (e.g., Docker, Kubernetes)
- Experience with data processing frameworks (e.g., Kafka, ClickHouse) is a plus
- Strong problem-solving and analytical skills
- Excellent communication and teamwork skills

Preferred technical and professional experience:
- Experience with high-volume data processing and distributed systems
- Knowledge of microservices architecture
- Familiarity with DevOps practices and tools (e.g., CI/CD pipelines, version control)
- Hands-on experience with distributed tracing and application performance monitoring
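"Conduct unit testing and integration testing to ensure code quality" above maps to a concrete habit worth showing. A minimal sketch with Python's stdlib unittest (the `normalize_email` function under test is invented for illustration; the same shape applies to JUnit in a Java codebase):

```python
import unittest

def normalize_email(raw: str) -> str:
    """Trim surrounding whitespace and lowercase an email address."""
    return raw.strip().lower()

class NormalizeEmailTest(unittest.TestCase):
    def test_strips_and_lowercases(self):
        self.assertEqual(normalize_email("  User@Example.COM "), "user@example.com")

    def test_clean_input_unchanged(self):
        self.assertEqual(normalize_email("dev@example.com"), "dev@example.com")

# Run the suite programmatically so the sketch works outside a test runner.
suite = unittest.defaultTestLoader.loadTestsFromTestCase(NormalizeEmailTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print(result.wasSuccessful())  # True
```

Each test exercises one behavior with a descriptive name, which is the property CI gates and coverage checks rely on.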

Posted 1 day ago

Apply

3.0 - 6.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Position Summary: Artificial Intelligence & Engineering

Join our AI & Engineering team in transforming technology platforms, driving innovation, and helping make a significant impact on our clients' success. You'll work alongside talented professionals reimagining and re-engineering the operations and processes that are critical to businesses. Your contributions can help clients improve financial performance, accelerate new digital ventures, and fuel growth through innovation.

AI & Engineering leverages cutting-edge engineering capabilities to build, deploy, and operate integrated, verticalized sector solutions in software, data, AI, network, and hybrid cloud infrastructure. These solutions are powered by engineering for business advantage, transforming mission-critical operations. We enable clients to stay ahead with the latest advancements by transforming engineering teams and modernizing technology and data platforms. Our delivery models are tailored to meet each client's unique requirements.

Role: Consultant. As a Consultant at Deloitte Consulting, you will be responsible for individually delivering high-quality work products within due timelines in an agile framework. On a requirement basis, consultants will mentor and/or direct junior team members and liaise with onsite/offshore teams to understand the functional requirements.

The work you will do includes:
- Understand business requirements and processes
- Develop software solutions using industry-standard delivery methodologies like Agile and Waterfall across different architectural patterns
- Write clean, efficient, and well-documented code maintaining industry and client standards, ensuring code quality and code coverage adherence, as well as debugging and resolving any issues/defects
- Participate in delivery processes like Agile development, actively contributing to sprint planning, daily stand-ups, and retrospectives
- Resolve issues or incidents reported by end users and escalate any quality issues or risks to team leads, scrum masters, or project leaders
- Develop expertise in the end-to-end construction cycle, from design (low-level and high-level) through coding, unit testing, deployment, and defect fixing, while coordinating with multiple stakeholders
- Create and maintain technical documentation, including design specifications, API documentation, and usage guidelines
- Demonstrate a problem-solving mindset and the ability to analyze business requirements

Qualifications — Skills / Project Experience

Must Have:
- Excellent written and verbal communication skills
- 3 to 6 years of experience working on microservices architecture, web services, API development, and enterprise integration layers
- Experience implementing microservices architecture, visualization, and development processes
- Strong technical skills in Java and the Spring Boot framework
- Experience with RESTful and SOAP web services
- Experience implementing a services layer using more than one integration technology
- Knowledge of API management, service discovery, service orchestration, and security as a service
- Implementation experience with XML, version control systems like GitHub and SVN, and build tools (Maven/Gradle/Ant)
- Experience with best practices such as OOP principles, exception handling, and the use of generics; well-defined, reusable, easy-to-maintain code; and tools like JUnit, Mockito, SOAP UI, Postman, Checkstyle, SonarQube, etc.
- Experience with SQL databases like MySQL, PostgreSQL, or Oracle and frameworks such as JPA/Hibernate
- Experience using logging and monitoring tools like Splunk, Dynatrace, or similar

Good to Have:
- Experience working with Docker and Kubernetes
- Experience with NoSQL databases like MongoDB, DynamoDB, etc.
- Experience with at least one cloud platform (AWS/Azure/GCP)
- Experience with build and test automation and continuous integration (CI) using Jenkins/Hudson
- Knowledge of Agile and Scrum software development methodologies
- Knowledge of design patterns like the circuit breaker pattern, proxy pattern, etc.
- Experience using message broker tools like Apache Kafka, ActiveMQ, etc.
- Experience deploying microservices on cloud platforms

Education: B.E./B.Tech/M.C.A./M.Sc (CS) degree or equivalent from an accredited university
Prior Experience: 3-6 years of hands-on experience with microservices and Spring Boot on cloud technologies
Location: Bengaluru/Hyderabad/Pune/Mumbai

The Team: Deloitte Consulting LLP's Technology Consulting practice is dedicated to helping our clients build tomorrow by solving today's complex business problems involving strategy, procurement, design, delivery, and assurance of technology solutions. Our service areas include analytics and information management, delivery, cyber risk services, and technical strategy and architecture, as well as the spectrum of digital strategy, design, and development services. The Core Business Operations practice optimizes clients' business operations and helps them take advantage of new technologies; it drives product and service innovation, improves financial performance, accelerates speed to market, and operates client platforms to innovate continuously. Learn more about our Technology Consulting practice on www.deloitte.com.

Our purpose: Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. We prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, is a tangible symbol of our commitment to the holistic growth and development of our people.

Benefits to help you thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria.

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300099
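Of the design patterns the listing above names (circuit breaker, proxy), the circuit breaker is the one most often asked about. A hedged, minimal sketch of the idea (real Spring Boot services would use a library such as Resilience4j; the two-failure threshold and the `flaky` downstream call are invented for the example):

```python
class CircuitBreaker:
    """Opens after `threshold` consecutive failures; while open,
    calls fail fast instead of hitting the broken downstream service."""

    def __init__(self, threshold: int = 3):
        self.threshold = threshold
        self.failures = 0

    @property
    def is_open(self) -> bool:
        return self.failures >= self.threshold

    def call(self, fn, *args):
        if self.is_open:
            raise RuntimeError("circuit open: failing fast")
        try:
            result = fn(*args)
        except Exception:
            self.failures += 1  # count consecutive failures
            raise
        self.failures = 0  # any success resets the counter
        return result

breaker = CircuitBreaker(threshold=2)

def flaky():
    raise ConnectionError("downstream unavailable")

for _ in range(2):
    try:
        breaker.call(flaky)
    except ConnectionError:
        pass

print(breaker.is_open)  # True
```

Production implementations add a half-open state that periodically retries the downstream service so the circuit can close again; this sketch shows only the open/closed core.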

Posted 1 day ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Pune

Work from Office

The Developer leads cloud application development and deployment for the client based on AWS development methodology, tools, and best practices. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access control, and network security using automation and configuration management tools.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience with technologies like Spring Boot and Java
- Demonstrated technical leadership on high-impact, customer-facing projects
- Experience building web applications in the Java/J2EE stack
- Experience with a UI framework such as React JS
- Working knowledge of a messaging system (Kafka preferred)
- Experience designing and integrating REST APIs using Spring Boot

Preferred technical and professional experience:
- Strong experience in concurrent design and multi-threading; object-oriented programming (OOP); SQL Server, Oracle, or MySQL
- Working knowledge of Azure or AWS cloud
- Experience building applications in a container-based environment (Docker/Kubernetes) on AWS Cloud
- Basic knowledge of SQL or NoSQL database design and queries (Postgres, MongoDB, DynamoDB preferred)

Posted 1 day ago

Apply

3.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Position Summary: Artificial Intelligence & Engineering

Join our AI & Engineering team in transforming technology platforms, driving innovation, and helping make a significant impact on our clients' success. You'll work alongside talented professionals reimagining and re-engineering the operations and processes that are critical to businesses. Your contributions can help clients improve financial performance, accelerate new digital ventures, and fuel growth through innovation.

AI & Engineering leverages cutting-edge engineering capabilities to build, deploy, and operate integrated, verticalized sector solutions in software, data, AI, network, and hybrid cloud infrastructure. These solutions are powered by engineering for business advantage, transforming mission-critical operations. We enable clients to stay ahead with the latest advancements by transforming engineering teams and modernizing technology and data platforms. Our delivery models are tailored to meet each client's unique requirements.

Role: Consultant. As a Consultant at Deloitte Consulting, you will be responsible for individually delivering high-quality work products within due timelines in an agile framework. On a requirement basis, consultants will mentor and/or direct junior team members and liaise with onsite/offshore teams to understand the functional requirements.

The work you will do includes:
- Understand business requirements and processes
- Develop software solutions using industry-standard delivery methodologies like Agile and Waterfall across different architectural patterns
- Write clean, efficient, and well-documented code maintaining industry and client standards, ensuring code quality and code coverage adherence, as well as debugging and resolving any issues/defects
- Participate in delivery processes like Agile development, actively contributing to sprint planning, daily stand-ups, and retrospectives
- Resolve issues or incidents reported by end users and escalate any quality issues or risks to team leads, scrum masters, or project leaders
- Develop expertise in the end-to-end construction cycle, from design (low-level and high-level) through coding, unit testing, deployment, and defect fixing, while coordinating with multiple stakeholders
- Create and maintain technical documentation, including design specifications, API documentation, and usage guidelines
- Demonstrate a problem-solving mindset and the ability to analyze business requirements

Qualifications — Skills / Project Experience

Must Have:
- Excellent written and verbal communication skills
- 3 to 6 years of experience working on microservices architecture, web services, API development, and enterprise integration layers
- Experience implementing microservices architecture, visualization, and development processes
- Strong technical skills in Java and the Spring Boot framework
- Experience with RESTful and SOAP web services
- Experience implementing a services layer using more than one integration technology
- Knowledge of API management, service discovery, service orchestration, and security as a service
- Implementation experience with XML, version control systems like GitHub and SVN, and build tools (Maven/Gradle/Ant)
- Experience with best practices such as OOP principles, exception handling, and the use of generics; well-defined, reusable, easy-to-maintain code; and tools like JUnit, Mockito, SOAP UI, Postman, Checkstyle, SonarQube, etc.
- Experience with SQL databases like MySQL, PostgreSQL, or Oracle and frameworks such as JPA/Hibernate
- Experience using logging and monitoring tools like Splunk, Dynatrace, or similar

Good to Have:
- Experience working with Docker and Kubernetes
- Experience with NoSQL databases like MongoDB, DynamoDB, etc.
- Experience with at least one cloud platform (AWS/Azure/GCP)
- Experience with build and test automation and continuous integration (CI) using Jenkins/Hudson
- Knowledge of Agile and Scrum software development methodologies
- Knowledge of design patterns like the circuit breaker pattern, proxy pattern, etc.
- Experience using message broker tools like Apache Kafka, ActiveMQ, etc.
- Experience deploying microservices on cloud platforms

Education: B.E./B.Tech/M.C.A./M.Sc (CS) degree or equivalent from an accredited university
Prior Experience: 3-6 years of hands-on experience with microservices and Spring Boot on cloud technologies
Location: Bengaluru/Hyderabad/Pune/Mumbai

The Team: Deloitte Consulting LLP's Technology Consulting practice is dedicated to helping our clients build tomorrow by solving today's complex business problems involving strategy, procurement, design, delivery, and assurance of technology solutions. Our service areas include analytics and information management, delivery, cyber risk services, and technical strategy and architecture, as well as the spectrum of digital strategy, design, and development services. The Core Business Operations practice optimizes clients' business operations and helps them take advantage of new technologies; it drives product and service innovation, improves financial performance, accelerates speed to market, and operates client platforms to innovate continuously. Learn more about our Technology Consulting practice on www.deloitte.com.

Our purpose: Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. We prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, is a tangible symbol of our commitment to the holistic growth and development of our people.

Benefits to help you thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria.

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 300099

Posted 1 day ago

Apply

0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Position Overview: Senior Developer

Mandatory skills: GCP BigQuery, SQL, Cloud Run

Job description: We need a Senior Application Developer with a Google Cloud Platform skill set, experienced in BigQuery, SQL, and Cloud Run, for a project involving the re-design and re-platforming of a legacy Revenue Allocation system.

Desired skills: Linux shell scripting is a huge plus. Nice to have: Kafka, MQ Series, Oracle PL/SQL.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Baseforge Technologies is hiring a Python Developer for Nuacem AI!

Job category: Software Engineer. Job type: Full Time. Job location: Hyderabad. Company: Nuacem AI.

About Nuacem AI: Nuacem AI is at the forefront of Conversational AI, building intelligent customer engagement platforms using advanced NLP and proprietary Whitebox NLU technology. We power smart bots and AI-first systems that operate seamlessly across voice, text, and video for global enterprises.

Role Overview: We are seeking a skilled Sr. Python Developer to join our dynamic backend engineering team. You'll be responsible for building scalable, robust APIs, integrating with cloud systems, and optimizing performance for AI-driven products.

Key Responsibilities:
- Develop, test, and maintain RESTful APIs using Flask, FastAPI, or Django
- Collaborate with product and ML teams to integrate backend services with AI modules
- Design efficient and scalable architecture for microservices and backend systems
- Implement messaging systems using Kafka, RabbitMQ, or similar tools
- Optimize application performance using Redis, Memcached, and caching strategies
- Manage database schemas, queries, and performance for PostgreSQL or MariaDB
- Participate in code reviews, sprint planning, and Agile ceremonies
- Maintain code integrity and ensure high standards through CI/CD pipelines and Git workflows

Required Skills & Qualifications:
- 3-5 years of hands-on experience in Python backend development
- Proficiency in at least one Python web framework: Flask, FastAPI, or Django
- Solid understanding of software architecture, OOP, and design patterns
- Experience with Kafka or RabbitMQ (or similar message brokers)
- Proficiency in Redis, Memcached, or other caching mechanisms
- Strong experience with SQL databases such as PostgreSQL or MariaDB
- Familiarity with Git, CI/CD workflows, and Agile development
- Excellent problem-solving, debugging, and communication skills

Bonus / Preferred Skills:
- Experience with containerization tools (Docker, Kubernetes)
- Familiarity with cloud platforms like AWS, Azure, or GCP
- Exposure to working alongside AI/ML teams or in data-driven product environments

Send your resume: careers@baseforge.com
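The Redis/Memcached caching strategies this role mentions reduce, at their core, to time-bounded key expiry. A minimal in-process TTL-cache sketch (Redis itself needs a server; the injectable clock, the 5-second TTL, and the "session" key are all invented for the example):

```python
import time

class TTLCache:
    """Dict-like cache whose entries expire after `ttl` seconds."""

    def __init__(self, ttl: float = 60.0, clock=time.monotonic):
        self.ttl = ttl
        self.clock = clock  # injectable so tests can fake the passage of time
        self.store = {}     # key -> (value, stored_at)

    def set(self, key, value):
        self.store[key] = (value, self.clock())

    def get(self, key, default=None):
        entry = self.store.get(key)
        if entry is None:
            return default
        value, stored_at = entry
        if self.clock() - stored_at > self.ttl:
            del self.store[key]  # lazily expire on read
            return default
        return value

# A fake clock makes expiry deterministic without sleeping.
now = [0.0]
cache = TTLCache(ttl=5.0, clock=lambda: now[0])
cache.set("session", "abc123")
print(cache.get("session"))  # abc123
now[0] = 6.0                 # advance past the TTL
print(cache.get("session"))  # None
```

Redis's `SETEX`/`EXPIRE` commands implement the same contract server-side, with eviction handled for you across processes.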

Posted 1 day ago

Apply

0 years

0 Lacs

India

On-site

Job description Company Description Evallo is a leading provider of a comprehensive SaaS platform for tutors and tutoring businesses, revolutionizing education management. With features like advanced CRM, profile management, standardized test prep, automatic grading, and insightful dashboards, we empower educators to focus on teaching. We're dedicated to pushing the boundaries of ed-tech and redefining efficient education management. Why this role matters Evallo is scaling from a focused tutoring platform to a modular operating system for all service businesses that bill by the hour. As we add payroll, proposals, whiteboarding, and AI tooling, we need a Solution Architect who can translate product vision into a robust, extensible technical blueprint. You’ll be the critical bridge between product, engineering, and customers, owning architecture decisions that keep us reliable at 5k+ concurrent users and cost-efficient at 100k+ total users. Outcomes we expect Map current backend + frontend, flag structural debt, and publish an Architecture Gap Report Define naming & layering conventions, linter / formatter rules, and a lightweight ADR process Ship reference architecture for new modules Lead cross-team design reviews; no major feature ships without architecture sign-off The eventual goal is to have Evallo run in a fully observable, autoscaling environment with < 10 % infra cost waste. Monitoring dashboards should trigger < 5 false positives per month. Day-to-day Solution Design: Break down product epics into service contracts, data flows, and sequence diagrams. Choose the right patterns—monolith vs. microservice, event vs. REST, cache vs. DB index—based on cost, team maturity, and scale targets. Platform-wide Standards: Codify review checklists (security, performance, observability) and enforce via GitHub templates and CI gates. Champion a shift-left testing mindset; critical paths reach 80 % automated coverage before QA touches them. 
Scalability & Cost Optimization: Design load-testing scenarios that mimic 5k concurrent tutoring sessions; guide DevOps on autoscaling policies and CDN strategy. Audit infra spend monthly; recommend serverless, queuing, or data-tier changes to cut waste. Release & Environment Strategy: Maintain clear promotion paths: local → sandbox → staging → prod with one-click rollback. Own schema-migration playbooks; zero-downtime releases are the default, not the exception. Technical Mentorship: Run fortnightly architecture clinics; level up engineers on domain-driven design and performance profiling. Act as a tie-breaker on competing technical proposals, keeping debates respectful and evidence-based. Qualifications: 3+ yrs engineering experience, 2+ yrs in a dedicated architecture or staff-level role on a high-traffic SaaS product. Proven track record designing multi-tenant systems that scaled beyond 50k users or 1k RPM. Deep knowledge of Node.js/TypeScript (our core stack), MongoDB or similar NoSQL, plus comfort with event brokers (Kafka, NATS, or RabbitMQ). Fluency in AWS (preferred) or GCP primitives—EKS, Lambda, RDS, CloudFront, IAM. Hands-on with observability stacks (Datadog, New Relic, Sentry, or OpenTelemetry). Excellent written communication; you can distill technical trade-offs in one page for execs and in one diagram for engineers.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Lexmark is now a proud part of Xerox, bringing together two trusted names and decades of expertise into a bold and shared vision. When you join us, you step into a technology ecosystem where your ideas, skills, and ambition can shape what comes next. Whether you’re just starting out or leading at the highest levels, this is a place to grow, stretch, and make real impact—across industries, countries, and careers. From engineering and product to digital services and customer experience, you’ll help connect data, devices, and people in smarter, faster ways. This is meaningful, connected work—on a global stage, with the backing of a company built for the future, and a robust benefits package designed to support your growth, well-being, and life beyond work. Responsibilities: A Data Engineer with AI/ML focus combines traditional data engineering responsibilities with the technical requirements for supporting Machine Learning (ML) systems and artificial intelligence (AI) applications. This role involves not only designing and maintaining scalable data pipelines but also integrating advanced AI/ML models into the data infrastructure. The role is critical for enabling data scientists and ML engineers to efficiently train, test, and deploy models in production. This role is also responsible for designing, building, and maintaining scalable data infrastructure and systems to support advanced analytics and business intelligence. This role often involves leading data engineering projects, mentoring junior team members, and collaborating with cross-functional teams. Key Responsibilities: Data Infrastructure for AI/ML: Design and implement robust data pipelines that support data preprocessing, model training, and deployment. Ensure that the data pipeline is optimized for high-volume and high-velocity data required by ML models. Build and manage feature stores that can efficiently store, retrieve, and serve features for ML models.
AI/ML Model Integration: Collaborate with ML engineers and data scientists to integrate machine learning models into production environments. Implement tools for model versioning, experimentation, and deployment (e.g., MLflow, Kubeflow, TensorFlow Extended). Support automated retraining and model monitoring pipelines to ensure models remain performant over time. Data Architecture & Design Design and maintain scalable, efficient, and secure data pipelines and architectures. Develop data models (both OLTP and OLAP). Create and maintain ETL/ELT processes. Data Pipeline Development Build automated pipelines to collect, transform, and load data from various sources (internal and external). Optimize data flow and collection for cross-functional teams. MLOps Support: Develop CI/CD pipelines to deploy models into production environments. Implement model monitoring, alerting, and logging for real-time model predictions. Data Quality & Governance Ensure high data quality, integrity, and availability. Implement data validation, monitoring, and alerting mechanisms. Support data governance initiatives and ensure compliance with data privacy laws (e.g., GDPR, HIPAA). Tooling & Infrastructure Work with cloud platforms (AWS, Azure, GCP) and data engineering tools like Apache Spark, Kafka, Airflow, etc. Use containerization (Docker, Kubernetes) and CI/CD pipelines for data engineering deployments. Team Collaboration & Mentorship Collaborate with data scientists, analysts, product managers, and other engineers. Provide technical leadership and mentor junior data engineers. 
Core Competencies Data Engineering: Apache Spark, Airflow, Kafka, dbt, ETL/ELT pipelines ML/AI Integration: MLflow, Feature Store, TensorFlow, PyTorch, Hugging Face GenAI: LangChain, OpenAI API, Vector DBs (FAISS, Pinecone, Weaviate) Cloud Platforms: AWS (S3, SageMaker, Glue), GCP (BigQuery, Vertex AI) Languages: Python, SQL, Scala, Bash DevOps & Infra: Docker, Kubernetes, Terraform, CI/CD pipelines Educational Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or related field. 5+ years of experience in data engineering or related field. Strong understanding of data modeling, ETL/ELT concepts, and distributed systems. Experience with big data tools and cloud platforms. Soft Skills: Strong problem-solving and critical-thinking skills. Excellent communication and collaboration abilities. Leadership experience and the ability to guide technical decisions. How to Apply ? Are you an innovator? Here is your chance to make your mark with a global technology leader. Apply now! Global Privacy Notice Lexmark is committed to appropriately protecting and managing any personal information you share with us. Click here to view Lexmark's Privacy Notice.

Posted 1 day ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Lexmark is now a proud part of Xerox, bringing together two trusted names and decades of expertise into a bold and shared vision. When you join us, you step into a technology ecosystem where your ideas, skills, and ambition can shape what comes next. Whether you’re just starting out or leading at the highest levels, this is a place to grow, stretch, and make real impact—across industries, countries, and careers. From engineering and product to digital services and customer experience, you’ll help connect data, devices, and people in smarter, faster ways. This is meaningful, connected work—on a global stage, with the backing of a company built for the future, and a robust benefits package designed to support your growth, well-being, and life beyond work. Responsibilities : A Senior Data Engineer with AI/ML focus combines traditional data engineering responsibilities with the technical requirements for supporting Machine Learning (ML) systems and artificial intelligence (AI) applications. This role involves not only designing and maintaining scalable data pipelines but also integrating advanced AI/ML models into the data infrastructure. The role is critical for enabling data scientists and ML engineers to efficiently train, test, and deploy models in production. This role is also responsible for designing, building, and maintaining scalable data infrastructure and systems to support advanced analytics and business intelligence. This role often involves leading data engineering projects, mentoring junior team members, and collaborating with cross-functional teams. Key Responsibilities: Data Infrastructure for AI/ML: Design and implement robust data pipelines that support data preprocessing, model training, and deployment. Ensure that the data pipeline is optimized for high-volume and high-velocity data required by ML models. Build and manage feature stores that can efficiently store, retrieve, and serve features for ML models. 
AI/ML Model Integration: Collaborate with ML engineers and data scientists to integrate machine learning models into production environments. Implement tools for model versioning, experimentation, and deployment (e.g., MLflow, Kubeflow, TensorFlow Extended). Support automated retraining and model monitoring pipelines to ensure models remain performant over time. Data Architecture & Design Design and maintain scalable, efficient, and secure data pipelines and architectures. Develop data models (both OLTP and OLAP). Create and maintain ETL/ELT processes. Data Pipeline Development Build automated pipelines to collect, transform, and load data from various sources (internal and external). Optimize data flow and collection for cross-functional teams. MLOps Support: Develop CI/CD pipelines to deploy models into production environments. Implement model monitoring, alerting, and logging for real-time model predictions. Data Quality & Governance Ensure high data quality, integrity, and availability. Implement data validation, monitoring, and alerting mechanisms. Support data governance initiatives and ensure compliance with data privacy laws (e.g., GDPR, HIPAA). Tooling & Infrastructure Work with cloud platforms (AWS, Azure, GCP) and data engineering tools like Apache Spark, Kafka, Airflow, etc. Use containerization (Docker, Kubernetes) and CI/CD pipelines for data engineering deployments. Team Collaboration & Mentorship Collaborate with data scientists, analysts, product managers, and other engineers. Provide technical leadership and mentor junior data engineers. 
Core Competencies Data Engineering: Apache Spark, Airflow, Kafka, dbt, ETL/ELT pipelines ML/AI Integration: MLflow, Feature Store, TensorFlow, PyTorch, Hugging Face GenAI: LangChain, OpenAI API, Vector DBs (FAISS, Pinecone, Weaviate) Cloud Platforms: AWS (S3, SageMaker, Glue), GCP (BigQuery, Vertex AI) Languages: Python, SQL, Scala, Bash DevOps & Infra: Docker, Kubernetes, Terraform, CI/CD pipelines Educational Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or related field. 5+ years of experience in data engineering or related field. Strong understanding of data modeling, ETL/ELT concepts, and distributed systems. Experience with big data tools and cloud platforms. Soft Skills: Strong problem-solving and critical-thinking skills. Excellent communication and collaboration abilities. Leadership experience and the ability to guide technical decisions. How to Apply ? Are you an innovator? Here is your chance to make your mark with a global technology leader. Apply now! Global Privacy Notice Lexmark is committed to appropriately protecting and managing any personal information you share with us. Click here to view Lexmark's Privacy Notice.

Posted 1 day ago

Apply

3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

SDE II Backend Developer 3+ yrs Mumbai Work from office (5 day week) About Rebel We are surrounded by the world's leading consumer companies led by technology - Amazon for retail, Airbnb for hospitality, Uber for mobility, Netflix and Spotify for entertainment, etc. Food & Beverage is the only consumer sector where large players are still traditional restaurant companies. At Rebel Foods, we are challenging this status quo as we are building the world's most valuable restaurant company on the internet, superfast. The opportunity for us is immense due to the exponential growth in the food delivery business worldwide which has helped us build 'The World's Largest Internet Restaurant Company' in the last few years. Rebel Foods' current presence in India, UAE & UK with close to 50 brands and 4500+ internet restaurants has been built on The Rebel Operating System. While for us it is still Day 1, we know we are in the middle of a revolution towards creating never seen before customer-first experiences. We bring you a once-in-a-lifetime opportunity to disrupt the 500-year-old industry with technology at its core. We urge you to refer to the below to understand how we are changing the restaurant industry before applying at Rebel Foods. https://spirit.rebelfoods.com/why-is-rebel-foods-hiring-super-talented-engineers-b88586223ebe https://spirit.rebelfoods.com/how-to-build-1000-restaurants-in-24-months-the-rebel-method-cb5b0cea4dc8 https://spirit.rebelfoods.com/winning-the-last-frontier-for-consumer-internet-5f2a659c43db https://spirit.rebelfoods.com/a-unique-take-on-food-tech-dcef8c51ba41 Responsibilities as SDE II Backend Developer Design, develop, and maintain high-quality backend services and APIs using Node.js. Contribute to full-stack development efforts, leveraging Java skills when needed. Create low-level designs and perform requirement breakdowns to guide the development process. Lead by example, providing technical guidance and mentorship to junior developers. 
Collaborate with cross-functional teams, including front-end developers, product managers, and designers, to deliver comprehensive solutions. Implement and manage database solutions using MySQL and MongoDB. Integrate and manage middleware solutions, including Kafka, to ensure robust data processing and communication. Utilize cloud platforms (AWS, Azure, GCP) to deploy, monitor, and scale applications effectively. Participate in code reviews to maintain high code quality and share knowledge with the team. Stay up-to-date with the latest industry trends and technologies to continuously improve the development process. Requirements: Bachelor’s degree in Computer Science, Information Technology, or a related field. 3+ years of professional experience in backend development with a strong focus on Node.js. Additional skills in Java/React JS are a plus. Strong mentorship and leadership abilities, with a proven track record of guiding junior developers. Experience with cloud platforms (AWS, Azure, GCP) and their respective services. Proficiency in designing and implementing low-level designs (DB modelling) and performing requirement breakdowns. Solid experience with relational databases (MySQL) and NoSQL databases (MongoDB). Knowledge of middleware solutions, particularly Kafka, for data streaming and processing. Familiarity with DevOps practices and CI/CD pipelines is a plus. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Experience with containerization and orchestration tools (e.g., Docker, Kubernetes). Familiarity with microservices architecture and related best practices. Knowledge of front-end technologies and frameworks is a bonus. The Rebel Culture We believe in empowering and growing people to perform the best at their job functions. We follow an outcome-oriented, fail-fast, iterative & collaborative culture to move fast in building tech solutions. Rebel is not a usual workplace.
The following slides will give you a sense of our culture, how Rebel conducts itself and who will be the best fit for our company. We suggest you go through it before making up your mind. Culture@Rebel

Posted 1 day ago

Apply

8.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Your Role and Impact Bachelor’s Degree in Computer Science/Engineering or equivalent experience required. 8+ years of software development experience. 8+ years of Java server-side design and development experience. Highly proficient in J2EE, Spring Boot, and Hibernate. Highly proficient in JUnit and Mockito. 3+ years Linux/Unix experience. Elastic Search, queuing technologies (ActiveMQ, Kafka), distributed caching (Redis). Excellent knowledge of RESTful APIs. Experience with data models, SQL, and NoSQL. Excellent knowledge of microservices architecture and implementation. Experience with GitHub/Bitbucket, Jira, Scrum, Sonar Cloud, and CI/CD processes. Working knowledge of Linux. Experience working on software-as-a-service (SaaS), large-scale distributed systems, and relational/NoSQL databases. Experience working in a small team setting, along with an offshore development team. Strong verbal and written communication skills: proven ability to lead both vertically and horizontally to achieve results; thrives in a dynamic, fast-paced environment and does what it takes to deliver results. Committed to security practices.

Posted 1 day ago

Apply

7.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

7+ years of proven experience in designing and documenting APIs and integration solutions in customer-facing environments in Java. Strong understanding of RESTful APIs, SOAP, JSON, XML. Strong understanding of Angular 2+ versions. Strong knowledge of Java Streams and functional programming. Strong advocate of test-driven development for ALL APIs built. Experience with asynchronous messaging technology. Proficiency with integration platforms (e.g., Kafka, Dell Boomi) and API management tools (Postman, Swagger, etc.). 6+ years’ hands-on experience in designing and developing applications from scratch using the Spring framework. Excellent communication skills, both written and verbal, with the ability to convey technical concepts to non-technical stakeholders and vice versa.

Posted 1 day ago

Apply

4.0 - 7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

We are looking for a committed and self-driven Full Stack Developer to join our growing product engineering team. You will be responsible for building responsive, user-friendly frontends using ReactJS and/or Flutter, while also contributing to robust and scalable Python-based backend systems that power our SaaS products in F&B, Quick Commerce, and Enterprise Platforms. This role is ideal for someone with a passion for full-stack development, comfortable working in dynamic environments, and excited about using modern AI-assisted coding tools (like ChatGPT, Claude, GitHub Copilot, etc.) to enhance productivity and innovation. Key Responsibilities · Develop responsive frontends using ReactJS and/or Flutter based on business requirements and design specs. · Design and develop scalable, robust, and secure backend applications using Python with Django, Flask, or FastAPI. · Architect and implement microservices-based solutions, using Kafka/MQTT/RabbitMQ for communication. · Write reusable, testable, and efficient code adhering to SOLID principles and design patterns. · Utilize AI-assisted tools (ChatGPT, Claude, GitHub Copilot) to boost development efficiency and code quality. · Integrate with third-party services: payment gateways (Stripe, PayPal, PayU), SMS/email/push providers, and analytics tools. · Contribute to AI/ML module development and data-driven features. · Collaborate closely with UI/UX designers, product managers, and QA to deliver high-quality software. · Write and maintain comprehensive technical documentation. · Participate in Agile processes: stand-ups, sprint planning, retrospectives. Required Skills & Experience · 4 to 7 years of full-stack development experience. · Solid hands-on experience with ReactJS and/or Flutter for frontend development. · Strong backend expertise in Python with frameworks like Django, Flask, or FastAPI. · Proficient in RESTful APIs, job schedulers, and microservices architecture. 
· Experience working with Kafka, RabbitMQ, or MQTT. · Good understanding of MySQL, PostgreSQL, and query optimization. · Familiarity with cloud platforms such as AWS, Google Cloud, or DigitalOcean. · Proficient with Git, CI/CD pipelines, Docker (Kubernetes is a plus). · Prior experience using AI-assisted coding tools for development and testing. · Excellent debugging and performance optimization skills. Education & Other Requirements · B.Tech (CSE/IT), MCA, MSc IT, or equivalent degree. · Strong analytical and problem-solving abilities. · Self-motivated, collaborative, and proactive in ownership. · Comfortable in a fast-paced, small-team environment. · Available to join immediately or on short notice. Nice to Have · Experience in AI/ML module development. · Exposure to DevOps practices and containerization. · Strong UI/UX design sense. Location: Hyderabad Job Type: Full-Time Experience: 4 to 7 years Compensation: As per industry standards

Posted 1 day ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Data Engineer Job Description We are currently looking for a dynamic, professional, team-oriented, self-motivated individual with excellent communication skills and a positive attitude to join our Data Engineering team. This role is responsible for developing integrations, ETL, storage, and usage of data within an organization, and for the implementation, integration, and expansion of the Data Platform system using big data, cloud technology, and other trending technologies. Candidates must have experience in Python, Spark, MySQL, at least one cloud technology (Google preferred), and Databricks on Google Cloud. Knowledge of working with Iceberg tables, blob storage, and S3 buckets. Minimum of 4 years' work experience as a Data Engineer and 3+ years' experience in Data Warehouse design and development for a large-scale, complex Data Platform. Experience with star schemas, dimensional modeling, and extract-transform-load (ETL) design and development. Expertise in cloud-based data platforms such as Google, AWS, and Azure. In-depth knowledge of data modeling concepts and techniques, including relational, dimensional, and NoSQL data models. Experience with open table formats like Iceberg, Hudi, and Delta. Understanding of major programming/scripting languages like Java and/or R. Expertise with data integration technologies like Kafka and Spark. Experience working with RDBMSs like Oracle, SQL Server, MySQL, and Postgres. The developer will be responsible for developing the cloud data warehouse using the Google or Azure platform. Candidates should be familiar with Python, Spark, SQL, NoSQL, Dataproc, Airflow, DAGs, automations, API integrations, Jupyter notebooks, and Databricks.
Good to have: knowledge of working on large datasets, data pipeline optimization/performance tuning, Azure/Google DWH & data lakes, and analytics. Experience working on Azure/Google Cloud services (IaaS, PaaS). Understand organization strategy and have the ability to design information systems that deliver the strategy on the Azure/Google platform. Identify high-level business needs and lead the creation of detailed functional specs and other documentation, such as requirement traceability matrices, workflow diagrams, use cases, etc. Experience developing integration architecture and planning system integration with third-party solutions and services. Involvement in various cloud infrastructure architectures, with proven experience designing solutions in line with Azure/Google architecture. Location: IND Bangalore - 2nd Floor Divyasree Towers Bannerghatta Main Road Language Requirements: Time Type: Full time If you are a California resident, by submitting your information, you acknowledge that you have read and have access to the Job Applicant Privacy Notice for California Residents R1613435

Posted 1 day ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description Some careers shine brighter than others. If you’re looking for a career that will help you stand out, join HSBC and fulfil your potential. Whether you want a career that could take you to the top, or simply take you in an exciting new direction, HSBC offers opportunities, support and rewards that will take you further. HSBC is one of the largest banking and financial services organisations in the world, with operations in 64 countries and territories. We aim to be where the growth is, enabling businesses to thrive and economies to prosper, and, ultimately, helping people to fulfil their hopes and realise their ambitions. We are currently seeking an experienced professional to join our team in the role of a Consultant Specialist. In this role, you will be part of the IT Compute pod, in charge of the Pricing component on both the legacy and strategic platforms. You will participate in development on Sophis, in our regression tests, and in the maintenance/enhancement of our infrastructure (both on-premise and GCP). Requirements To be successful in this role, you should meet the following requirements: B.E/B.Tech/M.E/M.S degree with 7-10 years of IT experience. Excellent programming skills in Java. Hands-on programming experience in Java, Python, and SQL is a must. Good knowledge of relational databases like Oracle; PL/SQL programming skills. Experience in C++ and C# would be a strong asset. Excellent programming skills and complete hands-on expertise. Sound knowledge of data structures, algorithms, and design patterns. Working knowledge of one or more of Apache Kafka, REST APIs, Docker, and Kubernetes would be a great advantage. Proficient in code versioning tools such as Git, SVN, etc. Good understanding of agile methodologies and DevOps tooling such as JIRA, Maven, Jenkins, Ansible, etc. Sound knowledge of functions, triggers, materialized views, DB management, and schema design. Technology/programming language should not be a barrier to getting things done.
Open to work on multiple technologies as required. Proficient in Data Structures & Algorithms and their practical usages Experience mentoring juniors and designing and building systems from scratch End to end ownership of delivery of a product (from requirement to production). Excellent problem solving and logical reasoning skills. You’ll achieve more when you join HSBC. www.hsbc.com/careers HSBC is committed to building a culture where all employees are valued, respected and opinions count. We take pride in providing a workplace that fosters continuous professional development, flexible working and opportunities to grow within an inclusive and diverse environment. Personal data held by the Bank relating to employment applications will be used in accordance with our Privacy Statement, which is available on our website. Issued by – HSBC Software Development India

Posted 1 day ago

Apply

10.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Our technology services client is seeking multiple Java Spring Boot Developers to join their team on a contract basis. These positions offer strong potential for conversion to full-time employment upon completion of the initial contract period. Below are further details about the role: Role: Java Spring Boot Developer Experience: 10-12 Years Location: Hyderabad, Bangalore Notice period: Immediate to 15 Days Mandatory Skills: Java, Spring Boot, Kafka, Camel Job description Minimum 12 years of experience in Java development; 6 to 12 years of hands-on Java experience. Experience in building microservices with Java Spring Boot, enterprise integration with Apache Camel, messaging with Kafka, and cloud environments (preferably Azure). Good to have: experience in containerization (Docker deployment, Kubernetes) and DevOps (Jenkins, GitLab, Azure DevOps, etc.). Strong in development engineering practices and communication skills. Nice to have: experience in Volante or XSLT integration. If you are interested, share the updated resume to sushmitha.r@s3staff.com

Posted 1 day ago

Apply

8.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Description Product Owner (PO) We are looking for a talented Technical Product Owner to join our Infra team on our great journey. This role involves the specification of new components and improvements to existing ones. Responsibilities As a product owner you will be responsible for: Analyzing market & customer requirements alongside the product manager, designing future capabilities, and translating those into clear stories and corresponding acceptance criteria for the technical team. Writing detailed & high-quality product requirements and user stories. Working closely with Engineering teams through development using agile methodology, including determining scope and priorities for product development cycles. Accepting/rejecting completed user stories. Owning and prioritizing the product backlog. Fostering collaboration between Engineering and Product teams. Participating in customer requirements discussions. Essential Requirements Overall 8-10 years’ experience. Relevant academic background (degree in computer science, information systems, or equivalent). At least 2 years’ experience as a technical PO, developer, or other hands-on role, to enable effective communication with all stakeholders. 3 years’ experience as a requirements analyst/product owner/system engineer in a software company. Experience in designing end-to-end product solutions. Technical: Java 11, JSP/Angular UI, microservices, Kafka, Apache Camel, Vert.x, PostgreSQL, Oracle, K8s, app servers (JBoss/WildFly), MQ, Grafana, Prometheus, DevOps practices. Working experience, or familiarity from an architectural standpoint, required in as many areas as possible. Technical knowledge related to the MS ecosystem (development, architecture & security) will be preferred. Strong understanding of cloud-native architectures, DevOps, and CI/CD processes and best practices. Strong verbal and written communication skills with an excellent grasp of English. Excellent interpersonal & leadership skills.
Self-starter, able to work in a fast-paced dynamic environment independently with minimal guidance. Experience in agile development – preferred. Experience in Payments/ Financial industry - preferred.

Posted 1 day ago

Apply

7.0 years

0 Lacs

Greater Chennai Area

On-site

Redefine the future of customer experiences. One conversation at a time. We’re changing the game with a first-of-its-kind, conversation-centric platform that unifies team collaboration and customer experience in one place. Powered by AI, built by amazing humans. Our culture is forward-thinking, customer-obsessed and built on an unwavering belief that connection fuels business and life; connections to our customers with our signature Amazing Service®, our products and services, and most importantly, each other. Since 2008, 100,000+ companies and 1M+ users rely on Nextiva for customer and team communication. If you’re ready to collaborate and create with amazing people, let your personality shine and be on the frontlines of helping businesses deliver amazing experiences, you’re in the right place. Build Amazing - Deliver Amazing - Live Amazing - Be Amazing We’re looking for an experienced Engineering Manager to lead backend and data platform teams building the next-generation product. You will be responsible for leading the development of Java-based services, ETL pipelines, and data infrastructure that power mission-critical features like scheduling, labor forecasting, time tracking, and analytics. You’ll collaborate closely with product, data science, and infrastructure teams to ensure our systems are scalable, reliable, and data-driven — enabling our customers to optimize workforce operations in real time. Key Responsibilities Lead a team of backend and data engineers responsible for: Building and maintaining Java microservices (Spring Boot) for WFM features. Designing and scaling ETL pipelines, data ingestion, and data lake components. Supporting reporting, analytics, and forecasting models with high-quality datasets. Define and evolve the architecture for data processing, streaming, and batch workloads using tools like Apache Kafka, Airflow, AWS Glue, or Spark.
Collaborate with Product Managers and Data Analysts to turn business requirements into scalable data solutions. Drive engineering best practices in CI/CD, code quality, observability, and data governance. Mentor engineers, foster a strong team culture, and support career growth through coaching and feedback. Work cross-functionally with QA, DevOps, and InfoSec to ensure compliance, scalability, and performance.

Required Qualifications
- 7+ years of backend software engineering experience, with at least 3 years in engineering leadership roles.
- Strong hands-on experience with Java (Spring Boot) and microservice architecture.
- Proven experience managing ETL workflows, data pipelines, and distributed data processing.
- Knowledge of relational and analytical databases (e.g., PostgreSQL, Redshift, Snowflake).
- Experience with event streaming platforms (Kafka, Kinesis, or similar).
- Cloud-native development experience with AWS, GCP, or Azure.
- Familiarity with data warehousing, schema evolution, and data quality best practices.
- Solid understanding of Agile development methodologies and team management.

Preferred Qualifications
- Experience with observability tools like Prometheus, Grafana, or Datadog.
- Exposure to ML/forecasting models for labor planning is a plus.

Nextiva DNA (Core Competencies)
Nextiva’s most successful team members share common traits and behaviors:
Drives Results: Action-oriented with a passion for solving problems. They bring clarity and simplicity to ambiguous situations, challenge the status quo, and ask what can be done differently. They lead and drive change, celebrating success to build more success.
Critical Thinker: Understands the "why" and identifies key drivers, learning from the past. They are fact-based and data-driven, forward-thinking, and see problems a few steps ahead. They provide options, recommendations, and actions, understanding risks and dependencies.
Right Attitude: They are team-oriented, collaborative, competitive, and hate losing.
They are resilient, able to bounce back from setbacks, zoom in and out, and get in the trenches to help solve important problems. They cultivate a culture of service, learning, support, and respect, caring for customers and teams.

Total Rewards
Our Total Rewards offerings are designed to allow our employees to take care of themselves and their families so they can be their best, in and out of the office. Our compensation packages are tailored to each role and candidate's qualifications. We consider a wide range of factors, including skills, experience, training, and certifications, when determining compensation. We aim to offer competitive salaries or wages that reflect the value you bring to our team. Depending on the position, compensation may include base salary and/or hourly wages, incentives, or bonuses.
Medical 🩺 - Medical insurance coverage is available for employees, their spouse, and up to two dependent children with a limit of 500,000 INR, as well as their parents or in-laws for up to 300,000 INR. This comprehensive coverage ensures that essential healthcare needs are met for the entire family unit, providing peace of mind and security in times of medical necessity.
Group Term & Group Personal Accident Insurance 💼 - Provides insurance coverage against the risk of death or injury during the policy period sustained due to an accident caused by violent, visible & external means. Coverage Type - Employee Only; Sum Insured - 3 times annual CTC with a minimum cap of INR 10,00,000; Free Cover Limit - 1.5 Crore.
Work-Life Balance ⚖️ - 15 days of Privilege leave, 6 days of Paid Sick leave, and 6 days of Casual leave per calendar year. Paid 26 weeks of Maternity leave, 1 week of Paternity leave, a day off on your birthday, and paid holidays.
Financial Security 💰 - Provident Fund & Gratuity.
Wellness 🤸 - Employee Assistance Program and comprehensive wellness initiatives.
Growth 🌱 - Access to ongoing learning and development opportunities and career advancement.
At Nextiva, we're committed to supporting our employees' health, well-being, and professional growth. Join us and build a rewarding career!
Established in 2008 and headquartered in Scottsdale, Arizona, Nextiva secured $200M from Goldman Sachs in late 2021, valuing the company at $2.7B. To check out what’s going on at Nextiva, check us out on Instagram, Instagram (MX), YouTube, LinkedIn, and the Nextiva blog.

Posted 1 day ago


4.0 years

0 Lacs

Greater Chennai Area

On-site

We are seeking a highly skilled Backend Software Engineer to join our engineering team. As a Backend Software Engineer, you will be responsible for designing, developing, and maintaining robust and scalable backend systems. You will play a critical role in shaping the architecture of our products and mentoring junior engineers.
Responsibilities:
- Design, develop, and maintain backend services and APIs
- Collaborate with frontend and mobile teams to deliver end-to-end solutions
- Optimize application performance and scalability
- Write clean, well-structured, and maintainable code
- Participate in code reviews and provide constructive feedback
- Identify and implement process improvements
- Mentor and guide junior engineers

Qualifications:
- Proven experience as a Software Engineer with a minimum of 4 years of experience
- In-depth knowledge of modern software development methodologies (Agile, DevOps)
- Expertise in Java (Java 8+), Spring Boot, REST APIs, and building RESTful web applications with the Spring Framework
- Strong proficiency in SQL and experience with databases like MySQL and Postgres
- Experience with analytics tools (Power BI, Tableau, or similar)
- Familiarity with Kafka and microservices architecture
- Cloud experience (AWS, GCP, or Azure)
- Data modeling and ETL concepts
- Familiarity with caching solutions like Redis
- Solid understanding of system design principles and architecture
- Experience with distributed systems is a plus
- Knowledge of Docker and Kubernetes is a plus
- Strong problem-solving and debugging skills
- Excellent communication and collaboration skills
- Degree in mathematics or computer science preferred

Additional Qualities
- Project ownership
- Self-motivation and dedication
- Ability to work with deadlines
- Multi-tasking, managing multiple tasks
- Attention to detail
- Team player as well as individual contributor
- Willing to develop new projects, debug, and fix issues in existing projects

Posted 1 day ago
