Home
Jobs

10506 Kafka Jobs - Page 35

Set up a Job Alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

0.0 - 6.0 years

8 - 20 Lacs

Bengaluru, Karnataka

On-site

Job Title: Java API Management & Microservices
Location: Bangalore
Experience Required: 6 to 8 years
Employment Type: Full-time
Notice Period: Immediate to 30 days preferred

We are looking for a highly skilled Java API Developer specializing in Spring Boot, APIs, and microservices to join our growing team. The ideal candidate will have strong backend development expertise and hands-on experience building scalable, secure, and high-performance applications using Java and the Spring Boot framework.

Key Responsibilities:
- Design, develop, and maintain Java-based applications using Spring Boot.
- Build RESTful APIs and a microservices architecture.
- Work closely with front-end teams, QA, and DevOps to deliver integrated solutions.
- Participate in code reviews, unit testing, and performance tuning.
- Ensure best practices in software design, development, and documentation.

Required Skills:
- Strong experience in Java 8+, Spring Boot, and the Spring Framework.
- Experience with REST APIs, JPA/Hibernate, and microservices.
- Familiarity with SQL/NoSQL databases such as MySQL, PostgreSQL, and MongoDB.
- Knowledge of CI/CD tools (Jenkins, Git, Maven).
- Exposure to cloud environments (AWS, Azure, or GCP) is a plus.

Must-Have Skills: Java, Spring Boot, APIs, Microservices

Nice to Have:
- Experience with Kafka, Docker, Kubernetes.
- Knowledge of Agile methodologies and tools like JIRA or Confluence.

Job Type: Full-time
Pay: ₹800,000.00 - ₹2,000,000.00 per year
Benefits: Health insurance, Life insurance, Provident Fund
Location Type: In-person
Schedule: Fixed shift
Application Question(s): Do you have 15 years of full-time education? (No open schooling, open university, or correspondence)
Experience: Java: 6 years (Required); Spring Boot: 6 years (Required); Microservices: 6 years (Required)
Location: Bangalore, Karnataka (Required)
Work Location: In person

Posted 5 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Key Responsibilities:
- Design, develop, and maintain Java-based applications.
- Write clean, scalable, and reusable code using industry best practices.
- Build and consume RESTful APIs and integrate with third-party services.
- Collaborate with cross-functional teams including QA, DevOps, and Product.
- Participate in code reviews and contribute to a high-performing, supportive development team.
- Troubleshoot production issues and provide timely fixes.
- Ensure application performance, quality, and responsiveness.
- Mentor junior developers and contribute to knowledge sharing.

Technical Skills Required:
- Languages: Java 8/11/17
- Frameworks: Spring Boot, Spring MVC, Spring Data JPA
- Web Services: RESTful APIs, JSON, Swagger/OpenAPI
- Databases: MySQL, PostgreSQL, Oracle, or MongoDB
- ORM: Hibernate, JPA
- Tools: Git, Maven/Gradle, Jenkins, JIRA
- Testing: JUnit, Mockito, Postman
- Cloud (Preferred): AWS / Azure / GCP
- Others (Good to Have): Kafka, Docker, Kubernetes, microservices architecture

Qualifications:
- Bachelor's/Master's degree in Computer Science or a related field.
- 5+ years of professional experience in Java development.
- Strong understanding of object-oriented programming and design patterns.
- Good communication and interpersonal skills.
- Agile/Scrum experience is a plus.

Nice to Have:
- Exposure to CI/CD pipelines.
- Domain knowledge in [Banking/Healthcare/E-Commerce etc.].
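The "build and consume RESTful APIs" responsibility above reduces to serializing typed resources to JSON and parsing them back, whatever the stack. A minimal language-agnostic sketch, shown in Python for brevity; the User resource and its fields are invented for illustration, not taken from the listing:

```python
import json
from dataclasses import dataclass, asdict

# Hypothetical "user" resource such an API might expose.
@dataclass
class User:
    id: int
    name: str
    email: str

def to_json(user: User) -> str:
    """Serialize a resource the way a GET /users/{id} response body might look."""
    return json.dumps(asdict(user))

def from_json(body: str) -> User:
    """Parse a POST /users request body back into a typed object."""
    return User(**json.loads(body))

round_tripped = from_json(to_json(User(1, "Asha", "asha@example.com")))
print(round_tripped.name)  # prints "Asha": the payload survives a serialize/parse cycle
```

In a Spring Boot service the same round trip is what Jackson performs behind `@RestController` request and response bodies.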

Posted 5 days ago

Apply

8.0 - 12.0 years

0 Lacs

Kochi, Kerala, India

On-site

Candidates ready to join immediately can share their details (CCTC | ECTC | Notice Period | Location Preference) 📌 via email to nitin.patil@ust.com for quick processing. Act fast for immediate attention! ⏳📩

Key Responsibilities:
1. Technology Expertise & Development Leadership
- Demonstrate deep expertise in Java or Node.js, including data structures, algorithms, APIs, libraries, and best practices.
- Lead and guide the development team in implementing high-quality, scalable, and efficient solutions.
- Provide hands-on coding support, technical reviews, and troubleshooting assistance as needed.
2. System Design & Low-Level Architecture
- Design and implement scalable, maintainable, and high-performance applications.
- Ensure adherence to architectural principles, coding standards, and integration patterns.
- Conduct low-level design reviews, focusing on error handling, logging, and maintainability.
3. Cloud & DevOps Practices
- Develop and deploy cloud-native applications on AWS or Azure, leveraging microservices, containerization (Docker/Kubernetes), and serverless computing.
- Collaborate with DevOps teams to improve automation, CI/CD pipelines, and infrastructure reliability.
- Ensure security best practices are followed in development and deployment.
4. Agile Development & Best Practices
- Work within an Agile development environment (Scrum/Kanban) and drive best practices for software development.
- Encourage code reusability, modular development, and performance optimization within the team.
- Contribute to continuous integration, automated testing, and deployment strategies.
5. Requirement Analysis & Collaboration
- Engage with business analysts and product owners to analyze and refine functional and non-functional requirements.
- Translate business needs into technical design and implementation plans.
- Ensure alignment of software solutions with business and technical requirements.
6. Technical Mentorship & Code Reviews
- Mentor junior developers and provide technical guidance to the team.
- Conduct code reviews, enforce coding standards, and ensure best practices in software development.
- Foster a culture of continuous learning and technical excellence.
7. Solution Structuring & Implementation Support
- Assist in defining component solutions based on architectural guidance.
- Provide technical inputs for solution proposals and feasibility analysis.
- Support the deployment of solutions, ensuring adherence to scalability, performance, and security requirements.

Required Skills & Qualifications

Must-Have Skills:
- Experience: 8-12 years of hands-on experience in software development, system design, and architecture.
- Technical Expertise: Strong proficiency in Java and/or Node.js, along with relevant frameworks and tools.
- Architecture Patterns: Deep understanding of SOA, microservices, N-tier, and event-driven architecture.
- Cloud & DevOps: Hands-on experience with AWS and/or Azure, including serverless computing, containerization (Docker/Kubernetes), and CI/CD pipelines.
- Agile & DevOps Practices: Proficiency in Agile methodologies (Scrum, Kanban) and DevOps best practices.
- Database Management: Strong knowledge of SQL and NoSQL databases and data modeling principles.
- Problem-Solving: Excellent troubleshooting and analytical skills for diagnosing complex technical issues.
- Leadership & Communication: Effective communication and stakeholder management skills with the ability to mentor teams.
- Industry Experience: Prior experience in the healthcare industry (preferred but not mandatory).

Good-to-Have Skills:
- Frontend Development: Experience with modern front-end frameworks (React/Angular).
- Security & Compliance: Exposure to security best practices and compliance standards.
- CI/CD & Automation: Hands-on experience with CI/CD pipelines and automation tools.
- Event-Driven Systems: Knowledge of API gateways, message brokers (Kafka, RabbitMQ), and event-driven architectures.

Skills: Software Architecture, Java, Node.js, Cloud Technologies
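The event-driven architecture this listing asks about rests on one idea: producers publish events to named topics, and the broker fans them out to decoupled subscribers. A toy in-memory sketch of that pattern follows; topic and event names are invented, and a real broker such as Kafka or RabbitMQ adds persistence, partitioning, and delivery guarantees on top.

```python
from collections import defaultdict
from typing import Callable

class MiniBroker:
    """Toy in-memory message broker: topics fan events out to subscribers,
    decoupling producers from consumers as Kafka/RabbitMQ do at scale."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, event: dict) -> None:
        # Synchronous fan-out; real brokers deliver asynchronously.
        for handler in self._subscribers[topic]:
            handler(event)

broker = MiniBroker()
seen = []
broker.subscribe("orders.created", seen.append)  # hypothetical topic name
broker.publish("orders.created", {"order_id": 42})
print(seen)  # [{'order_id': 42}]
```

The producer never references the consumer, which is the decoupling property that lets event-driven systems add new subscribers without touching publishers.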

Posted 5 days ago

Apply

8.0 - 12.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site

Candidates ready to join immediately can share their details (CCTC | ECTC | Notice Period | Location Preference) 📌 via email to nitin.patil@ust.com for quick processing. Act fast for immediate attention! ⏳📩

Key Responsibilities:
1. Technology Expertise & Development Leadership
- Demonstrate deep expertise in Java or Node.js, including data structures, algorithms, APIs, libraries, and best practices.
- Lead and guide the development team in implementing high-quality, scalable, and efficient solutions.
- Provide hands-on coding support, technical reviews, and troubleshooting assistance as needed.
2. System Design & Low-Level Architecture
- Design and implement scalable, maintainable, and high-performance applications.
- Ensure adherence to architectural principles, coding standards, and integration patterns.
- Conduct low-level design reviews, focusing on error handling, logging, and maintainability.
3. Cloud & DevOps Practices
- Develop and deploy cloud-native applications on AWS or Azure, leveraging microservices, containerization (Docker/Kubernetes), and serverless computing.
- Collaborate with DevOps teams to improve automation, CI/CD pipelines, and infrastructure reliability.
- Ensure security best practices are followed in development and deployment.
4. Agile Development & Best Practices
- Work within an Agile development environment (Scrum/Kanban) and drive best practices for software development.
- Encourage code reusability, modular development, and performance optimization within the team.
- Contribute to continuous integration, automated testing, and deployment strategies.
5. Requirement Analysis & Collaboration
- Engage with business analysts and product owners to analyze and refine functional and non-functional requirements.
- Translate business needs into technical design and implementation plans.
- Ensure alignment of software solutions with business and technical requirements.
6. Technical Mentorship & Code Reviews
- Mentor junior developers and provide technical guidance to the team.
- Conduct code reviews, enforce coding standards, and ensure best practices in software development.
- Foster a culture of continuous learning and technical excellence.
7. Solution Structuring & Implementation Support
- Assist in defining component solutions based on architectural guidance.
- Provide technical inputs for solution proposals and feasibility analysis.
- Support the deployment of solutions, ensuring adherence to scalability, performance, and security requirements.

Required Skills & Qualifications

Must-Have Skills:
- Experience: 8-12 years of hands-on experience in software development, system design, and architecture.
- Technical Expertise: Strong proficiency in Java and/or Node.js, along with relevant frameworks and tools.
- Architecture Patterns: Deep understanding of SOA, microservices, N-tier, and event-driven architecture.
- Cloud & DevOps: Hands-on experience with AWS and/or Azure, including serverless computing, containerization (Docker/Kubernetes), and CI/CD pipelines.
- Agile & DevOps Practices: Proficiency in Agile methodologies (Scrum, Kanban) and DevOps best practices.
- Database Management: Strong knowledge of SQL and NoSQL databases and data modeling principles.
- Problem-Solving: Excellent troubleshooting and analytical skills for diagnosing complex technical issues.
- Leadership & Communication: Effective communication and stakeholder management skills with the ability to mentor teams.
- Industry Experience: Prior experience in the healthcare industry (preferred but not mandatory).

Good-to-Have Skills:
- Frontend Development: Experience with modern front-end frameworks (React/Angular).
- Security & Compliance: Exposure to security best practices and compliance standards.
- CI/CD & Automation: Hands-on experience with CI/CD pipelines and automation tools.
- Event-Driven Systems: Knowledge of API gateways, message brokers (Kafka, RabbitMQ), and event-driven architectures.

Skills: Software Architecture, Java, Node.js, Cloud Technologies

Posted 5 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Profile: Data Management (AWS) Developer
Location: Pune/Bangalore
Experience: 5 to 10 years

Overview: We are looking for a Data Management (AWS) developer who will serve as the technical counterpart to data stewards across various business domains. This role will focus on the technical aspects of data management, including the integration of data catalogs, data quality management, and access management frameworks within our data lakehouse.

Key Responsibilities:
1. Integrate the Acryl data catalog with the AWS Glue Data Catalog to enhance data discoverability and management.
2. Develop frameworks and processes for deploying and maintaining data classification and data quality rules in the data lakehouse.
3. Implement and maintain Lake Formation access frameworks, including OpenID Connect (OIDC), for secure data access.
4. Build and maintain data quality and classification reports and visualizations to support data-driven decision-making.
5. Develop and implement mechanisms for column-level data lineage in the data lakehouse.
6. Collaborate with data stewards to ensure effective data ownership, cataloging, and metadata management.

Qualifications:
1. Relevant experience in data management, data governance, or related technical fields.
2. Strong technical expertise in AWS services, particularly AWS Glue, Lake Formation, and data quality management tools.
3. Familiarity with data security practices, including OIDC and AWS IAM.
4. Experience with AWS Athena and Apache Airflow.
5. Relevant certifications (e.g., CDMP) are a plus.
6. Experience with Terraform, GitHub, and Python.

Desired Skills:
- Master's in IT, or a Bachelor's with 10+ years of work experience.
- At least 5 years' experience in cloud development.
- Strong analytical and problem-solving skills.
- Ability to communicate technical concepts to non-technical stakeholders.
- Understanding of data modeling concepts (3NF, snowflake schemas, data vault).
- Knowledge of various database technologies and data lifecycle management principles.
- Familiarity with open table formats and streaming technologies (e.g., Confluent Kafka) is a plus.
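The "data quality rules" responsibility above can be pictured as predicates evaluated per row, aggregated into a violations report. A minimal sketch under that assumption follows; the rule names and sample rows are made up, and a managed service like AWS Glue Data Quality would express such rules declaratively rather than as Python lambdas.

```python
# Each rule is a predicate over a row dict; the report counts violations per rule.
rules = {
    "customer_id_not_null": lambda row: row.get("customer_id") is not None,
    "amount_non_negative": lambda row: row.get("amount", 0) >= 0,
}

def quality_report(rows):
    """Run every rule against every row and count failures per rule."""
    report = {name: 0 for name in rules}
    for row in rows:
        for name, rule in rules.items():
            if not rule(row):
                report[name] += 1
    return report

rows = [
    {"customer_id": 1, "amount": 10.0},      # passes both rules
    {"customer_id": None, "amount": -5.0},   # fails both rules
]
print(quality_report(rows))  # {'customer_id_not_null': 1, 'amount_non_negative': 1}
```

Deployed at lakehouse scale, the same shape of output (rule name to violation count) is what feeds the quality reports and visualizations the listing describes.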

Posted 5 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About the Team
Data is at the foundation of DoorDash's success. The Data Engineering team builds database solutions for various use cases including reporting, product analytics, marketing optimization, and financial reporting. The team serves as the foundation for decision-making at DoorDash.

About the Role
DoorDash is looking for a Staff Software Engineer, Data to be a technical lead and help architect and scale our data reliability, data infrastructure, automation, and tools to meet growing business needs.

You're excited about this opportunity because you will…
- Own critical data systems that support multiple products/teams
- Develop, implement, and enforce best practices for data infrastructure and automation
- Design, develop, and implement large-scale, high-volume, high-performance data models and pipelines for the data lake and data warehouse
- Improve the reliability and scalability of our ingestion, data processing, ETL, reporting tools, and data ecosystem services
- Manage a portfolio of data products that deliver high-quality, trustworthy data
- Help onboard and support other engineers as they join the team

We're excited about you because…
- 8+ years of professional experience as a hands-on engineer and technical leader leading multiple projects
- 6+ years of experience working in data platform and data engineering or a similar role
- Proficiency in programming languages such as Python/Kotlin/Scala
- 4+ years of experience with ETL orchestration and workflow management tools like Airflow
- Expertise in database fundamentals, SQL, data reliability practices, and distributed computing
- 4+ years of experience with the distributed data ecosystem (Spark, Presto) and streaming technologies such as Kafka/Flink/Spark Streaming
- Excellent communication skills, experience working with technical and non-technical teams, and knowledge of reporting tools
- Comfortable working in a fast-paced environment; a self-starter and self-organizer
- Ability to think strategically, and to analyze and interpret market and consumer information
- You must be located near one of our engineering hubs indicated above

We use Covey as part of our hiring and/or promotional process for jobs in NYC and certain features may qualify it as an AEDT. As part of the evaluation process we provide Covey with job requirements and candidate-submitted applications. We began using Covey Scout for Inbound on June 20, 2024. Please see the independent bias audit report covering our use of Covey here.

About DoorDash
At DoorDash, our mission to empower local economies shapes how our team members move quickly, learn, and reiterate in order to make impactful decisions that display empathy for our range of users—from Dashers to merchant partners to consumers. We are a technology and logistics company that started with door-to-door delivery, and we are looking for team members who can help us go from a company that is known for delivering food to a company that people turn to for any and all goods. DoorDash is growing rapidly and changing constantly, which gives our team members the opportunity to share their unique perspectives, solve new challenges, and own their careers. We're committed to supporting employees' happiness, healthiness, and overall well-being by providing comprehensive benefits and perks.

Our Commitment to Diversity and Inclusion
We're committed to growing and empowering a more inclusive community within our company, industry, and cities. That's why we hire and cultivate diverse teams of people from all backgrounds, experiences, and perspectives. We believe that true innovation happens when everyone has room at the table and the tools, resources, and opportunity to excel. If you need any accommodations, please inform your recruiting contact upon initial connection.
We use Covey as part of our hiring and/or promotional process for jobs in certain locations. The Covey tool has been reviewed by an independent auditor. Results of the audit may be viewed here: https://getcovey.com/nyc-local-law-144 To request a reasonable accommodation under applicable law or alternate selection process, please inform your recruiting contact upon initial connection.
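The "ETL orchestration and workflow management tools like Airflow" requirement above revolves around modeling a pipeline as a DAG and deriving a valid task order from declared dependencies. A stdlib-only sketch of that core idea follows; the task names are invented, and Airflow itself layers scheduling, retries, and operators on top of this ordering.

```python
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on
# (the predecessor-mapping form graphlib expects).
dag = {
    "load_warehouse": {"transform"},
    "transform": {"extract_orders", "extract_users"},
    "extract_orders": set(),
    "extract_users": set(),
}

# static_order() yields tasks with all dependencies satisfied first.
order = list(TopologicalSorter(dag).static_order())
print(order[-1])  # load_warehouse: it runs only after everything upstream
```

`TopologicalSorter` also raises `CycleError` on circular dependencies, the same validation an orchestrator performs when a DAG is registered.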

Posted 5 days ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Summary: A Data Integration Engineer is responsible for designing, developing, and maintaining data integration solutions within an organization. They work closely with various stakeholders, including business analysts, data scientists, and software developers, to ensure the smooth flow of data between different systems, databases, and applications. Their primary objective is to create efficient, reliable, and scalable data integration pipelines that enable accurate data analysis and reporting.

Responsibilities: The Data Integration Engineer is responsible for designing, developing, and maintaining data integration solutions within the organization.

Skills Requirements:

Technical Skills:
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with data integration tools and technologies such as Apache Kafka, Apache NiFi, or Talend.
- Familiarity with SQL and working knowledge of relational databases like MySQL, Oracle, or SQL Server.
- Understanding of data modeling and ETL (Extract, Transform, Load) processes.
- Knowledge of data warehousing concepts and technologies.
- Experience with cloud platforms like AWS, Azure, or Google Cloud.
- Familiarity with API development and integration.
- Understanding of data quality and data governance principles.

Analytical and Problem-Solving Skills:
- Ability to analyze complex data integration requirements and translate them into effective technical solutions.
- Strong problem-solving skills to identify and resolve data integration issues.
- Attention to detail to ensure data accuracy and integrity.

Education Requirements: Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
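The ETL (Extract, Transform, Load) process named above can be shown end to end in miniature: pull raw records from a source, normalize them, and load them into a relational store. A toy sketch follows; the source records and table schema are invented for illustration, with SQLite standing in for a real warehouse.

```python
import sqlite3

def extract():
    """Extract: pretend source system returning raw, messy records."""
    return [{"name": " Alice ", "score": "91"}, {"name": "bob", "score": "78"}]

def transform(records):
    """Transform: trim whitespace, normalize casing, cast types."""
    return [(r["name"].strip().title(), int(r["score"])) for r in records]

def load(rows):
    """Load: write the cleaned rows into a relational target."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE results (name TEXT, score INTEGER)")
    conn.executemany("INSERT INTO results VALUES (?, ?)", rows)
    return conn

conn = load(transform(extract()))
print(conn.execute("SELECT name, score FROM results ORDER BY score DESC").fetchall())
# [('Alice', 91), ('Bob', 78)]
```

Tools like NiFi or Talend wrap each of these three stages in configurable processors, but the extract/transform/load decomposition is the same.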

Posted 5 days ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Architect and Senior Architect - Data Governance
Chennai, Bangalore, Hyderabad

Who we are
Tiger Analytics is a global leader in Data, AI, and Analytics, helping Fortune 500 companies solve their most complex business challenges. We offer full-stack AI and analytics services and solutions to empower businesses to achieve real outcomes and value at scale. We are on a mission to push the boundaries of what AI and analytics can do to help enterprises navigate uncertainty and move forward decisively. Our purpose is to provide certainty to shape a better tomorrow. Our team of 5000+ technologists and consultants are based in the US, Canada, the UK, India, Singapore, and Australia, working closely with clients across CPG, Retail, Insurance, BFS, Manufacturing, Life Sciences, and Healthcare. Many of our team leaders rank in Top 10 and 40 Under 40 lists, exemplifying our dedication to innovation and excellence. In recognition of its exceptional workplace culture, industry impact, and leadership in AI strategy, Tiger Analytics has received multiple prestigious accolades in 2025, including the 3AI Pinnacle Award, India's Best Workplaces (2024-2025), WOW Workplaces of 2025, and the Leading GenAI Service Provider title at the GenAI Conclave 2025. The firm was also celebrated as a High-Performance Culture Curator at Darwin Unboxed 2025 and honored with the Minsky Award for Excellence in AI Strategy Consulting.

Job Description
As a Data Governance Architect, your work is a combination of hands-on contribution, customer engagement, and technical team management. Overall, you'll design, architect, deploy, and maintain big-data-based data governance solutions. More specifically, this will involve:
- Technical management across the full life cycle of big-data-based data governance projects, from requirement gathering and analysis to platform selection, design of the architecture, and deployment.
- Scaling the solution in a cloud-based infrastructure.
- Collaborating with business consultants, data scientists, engineers, and developers to develop data solutions.
- Exploring new technologies for creative business problem-solving.
- Leading and mentoring a team of data governance engineers.

What do we expect?
- 10+ years of technical experience, with 5+ years in the Hadoop ecosystem and 3+ years in data governance solutions.
- Hands-on experience with data governance solutions, with a good understanding of: data catalogs, business glossaries, business/technical/operational metadata, data quality, data profiling, and data lineage.

Expertise and Qualification
- Hands-on experience with the following technologies: the Hadoop ecosystem (HDFS, Hive, Sqoop, Kafka, ELK Stack, etc.); Spark, Scala, Python, and core/advanced Java; and the relevant AWS/GCP components required to build big data solutions.
- Good to know: Databricks, Snowflake.
- Familiarity with: designing/building large cloud-computing infrastructure solutions (in AWS/GCP); data lake design and implementation; the full life cycle of a Hadoop solution; distributed computing and parallel processing environments; and HDFS administration, configuration management, monitoring, debugging, and performance tuning.

You are important to us, let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Additional Benefits: Health insurance (self & family), virtual wellness platform, car lease program, and knowledge communities.

Posted 5 days ago

Apply

0 years

39 - 42 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary
We are looking for an experienced Azure Databricks Developer to design, develop, and manage big data solutions using Azure services, with a strong focus on Databricks, Spark, and ETL pipelines. The ideal candidate will have a solid understanding of data engineering, Azure cloud components, and real-time/batch data processing.

Key Responsibilities
- Design and build scalable and robust data pipelines using Azure Databricks (Spark).
- Develop ETL/ELT processes for ingesting, transforming, and loading data from various sources.
- Integrate data solutions with Azure Data Lake, Azure Data Factory, and SQL databases.
- Optimize Spark jobs for performance and cost-efficiency.
- Collaborate with data architects, analysts, and business stakeholders to deliver data solutions.
- Implement data quality checks, validations, and unit testing.
- Ensure best practices in CI/CD, data security, and governance.

Must-Have Skills
- Strong hands-on experience with Azure Databricks and Apache Spark (PySpark/Scala).
- Proficiency in Azure Data Factory, Azure Data Lake, and Azure Synapse (optional).
- Solid understanding of SQL, Delta Lake, and data warehousing concepts.
- Experience building automated data pipelines and working with large datasets.
- Good understanding of DevOps, version control (e.g., Git), and CI/CD pipelines.
- Strong problem-solving and debugging skills.

Nice-to-Have
- Experience with Azure Synapse Analytics, Power BI, or Kafka/Event Hubs integration.
- Familiarity with Databricks REST APIs, MLflow, or Databricks Jobs orchestration.
- Knowledge of data modeling, data governance, and metadata management.

Qualifications
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Azure certifications (e.g., DP-203, Azure Data Engineer Associate) are a plus.

Skills: Python, Delta Lake, Apache Spark (PySpark/Scala), PySpark, CI/CD, Azure Data Lake, Azure, Azure Data Factory, ETL/ELT processes, data governance, Azure Databricks, data modeling, Git, Databricks, metadata management, architect, SQL
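The Delta Lake skills listed above center on the merge/upsert pattern: incoming records update rows with matching keys and insert the rest. A pure-Python sketch of those semantics follows, with no Spark dependency; the keys and rows are invented, and on Databricks the same operation is expressed as SQL `MERGE INTO` or `DeltaTable.merge`.

```python
def merge_upsert(target, updates, key="id"):
    """Apply Delta-style upsert semantics: matched keys are replaced,
    unmatched incoming rows are inserted."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = row  # update existing keys, insert new ones
    return sorted(merged.values(), key=lambda r: r[key])

target = [{"id": 1, "v": "a"}, {"id": 2, "v": "b"}]
updates = [{"id": 2, "v": "B"}, {"id": 3, "v": "c"}]
print(merge_upsert(target, updates))
# [{'id': 1, 'v': 'a'}, {'id': 2, 'v': 'B'}, {'id': 3, 'v': 'c'}]
```

What Delta Lake adds beyond these semantics is making the merge atomic and versioned via its transaction log, so readers never observe a half-applied update.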

Posted 5 days ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

TransUnion's Job Applicant Privacy Notice

What We'll Bring
The Manager for 24x7 Application Support is responsible for ensuring the smooth and efficient operation of mission-critical applications in a round-the-clock support environment. This role involves:
- Overseeing support teams to maintain high performance and morale.
- Managing incidents promptly to minimize downtime and impact.
- Ensuring service levels are consistently met or exceeded.
- Coordinating with various stakeholders to maintain system uptime and reliability.
- Proactively identifying potential issues to prevent disruptions.
- Continuously improving service delivery processes.
- Ensuring application performance aligns with business objectives.

What You'll Bring
Key Roles and Responsibilities:

Service Delivery Management
- Oversee the 24x7 support services for all critical business applications, ensuring high availability and timely resolution of incidents and requests.
- Manage and ensure adherence to service level agreements for response times, resolution times, and overall service quality.
- Coordinate with cross-functional teams, including level-2, infrastructure, and development teams, to ensure seamless application support.

Incident Management
- Ensure timely identification and resolution of high-priority incidents to minimize business disruption.
- Oversee the incident management process, ensuring incidents are logged, tracked, and escalated appropriately.
- Ensure a robust problem management process is in place to address recurring application issues, and work with teams to implement long-term solutions.

Team Leadership and Support
- Lead and manage the 24x7 support team, ensuring they are adequately staffed and trained to handle the demands of continuous application support.
- Provide leadership and mentoring to team members, setting performance goals and ensuring team performance aligns with organizational expectations.
- Foster a collaborative and supportive environment, ensuring the team can effectively manage incidents and communicate with stakeholders.

Continual Service Improvement
- Proactively identify areas for improvement in application support, implementing enhancements to processes, tools, and technologies.
- Monitor application performance and trends to identify potential issues and ensure service continuity.
- Work closely with other teams to implement proactive measures that reduce incidents and improve service availability.

Stakeholder Communication and Reporting
- Serve as a primary point of contact for key stakeholders, providing regular updates on application performance, incident resolution, and service metrics.
- Produce regular reports on application support performance, including incident management, downtime, and KPIs.

Risk and Compliance Management
- Ensure that the application support service complies with relevant regulatory and security requirements.
- Monitor and address any security vulnerabilities related to the supported applications.
- Ensure proper backup, disaster recovery, and business continuity plans are in place for all supported applications.

Qualifications / Impact You'll Make:

Experience
- 10+ years of experience in application support (Java, Linux, Kafka, PostgreSQL based) and IT service management, with at least 2-3 years in a leadership role.
- Proven experience managing 24x7 support environments, particularly for mission-critical or enterprise-level applications.
- Strong understanding of ITIL processes, including incident management, change management, and problem management.
- Experience with monitoring and ticketing tools (Splunk, BMC Remedy, JIRA).
- Familiarity with cloud platforms, infrastructure, and applications is a plus.

Skills
- Excellent leadership and team management abilities.
- Strong communication skills, with the ability to effectively communicate with both technical teams and business stakeholders.
- Ability to manage high-pressure situations and ensure swift resolution of incidents.
- Analytical mindset with the ability to identify trends, root causes, and improvement opportunities.
- Ability to deliver high-quality support.

Certifications
- ITIL Foundation Certification (preferred)

This is a hybrid position and involves regular performance of job responsibilities virtually as well as in person at an assigned TU office location for a minimum of two days a week.

TransUnion Job Title: Manager I, Applications Support
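The SLA-adherence duties under Incident Management reduce to comparing each incident's resolution time against a per-priority target. A minimal sketch of that check follows; the priority tiers and target durations are invented for illustration, not drawn from any real service level agreement.

```python
from datetime import datetime, timedelta

# Hypothetical per-priority resolution targets (a real SLA would also
# define response-time targets and business-hours rules).
SLA_TARGETS = {"P1": timedelta(hours=1), "P2": timedelta(hours=4)}

def breached(priority: str, opened: datetime, resolved: datetime) -> bool:
    """True when an incident's resolution time exceeded its SLA target."""
    return (resolved - opened) > SLA_TARGETS[priority]

t0 = datetime(2024, 1, 1, 9, 0)
print(breached("P1", t0, t0 + timedelta(minutes=90)))  # True: over the 1h target
print(breached("P2", t0, t0 + timedelta(hours=2)))     # False: within 4h
```

Ticketing tools such as BMC Remedy or JIRA Service Management automate this comparison per ticket and roll the breach counts into the kind of SLA reports this role owns.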

Posted 5 days ago

Apply

0 years

39 - 42 Lacs

Gurugram, Haryana, India

On-site

Job Summary We are looking for an experienced Azure Databricks Developer to design, develop, and manage big data solutions using Azure services, with a strong focus on Databricks , Spark , and ETL pipelines . The ideal candidate will have a solid understanding of data engineering, Azure cloud components, and real-time/batch data processing. Key Responsibilities Design and build scalable and robust data pipelines using Azure Databricks (Spark). Develop ETL/ELT processes for ingesting, transforming, and loading data from various sources. Integrate data solutions with Azure Data Lake, Azure Data Factory, and SQL Databases. Optimize Spark jobs for performance and cost-efficiency. Collaborate with data architects, analysts, and business stakeholders to deliver data solutions. Implement data quality checks, validations, and unit testing. Ensure best practices in CI/CD, data security, and governance. Must-Have Skills Strong hands-on experience with Azure Databricks and Apache Spark (PySpark/Scala). Proficiency in Azure Data Factory, Azure Data Lake, and Azure Synapse (optional). Solid understanding of SQL, Delta Lake, and data warehousing concepts. Experience building automated data pipelines and working with large datasets. Good understanding of DevOps, version control (e.g., Git), and CI/CD pipelines. Strong problem-solving and debugging skills. Nice-to-Have Experience with Azure Synapse Analytics, Power BI, or Kafka/Event Hub integration. Familiarity with DataBricks REST APIs, MLFlow, or Databricks Jobs orchestration. Knowledge of data modeling, data governance, and metadata management. Qualifications Bachelor’s or Master’s degree in Computer Science, Information Systems, or related field. Azure certifications (e.g., DP-203, Azure Data Engineer Associate) are a plus. 
Skills: python, delta lake, apache spark (pyspark/scala), pyspark, ci/cd, azure data lake, azure, azure data factory, etl/elt processes, data governance, azure databricks, data modeling, git, databricks, metadata management, architect, sql
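The data-quality checks and validations this posting calls for boil down to splitting incoming rows into accepted and rejected sets before loading. A minimal, framework-free sketch of that step follows; the function, field names, and records are invented for illustration, and a real Databricks job would express the same logic in PySpark over DataFrames rather than plain lists.

```python
# Illustrative row-level data-quality check, as might run in an ETL
# pipeline before loading to a Delta table. All names are hypothetical.

def validate_rows(rows, required_fields):
    """Split rows into (valid, rejected) based on required non-empty fields."""
    valid, rejected = [], []
    for row in rows:
        missing = [f for f in required_fields if not row.get(f)]
        if missing:
            rejected.append({"row": row, "missing": missing})
        else:
            valid.append(row)
    return valid, rejected

raw = [
    {"id": 1, "amount": "120.50", "country": "IN"},
    {"id": 2, "amount": "", "country": "IN"},  # empty amount: rejected
]
good, bad = validate_rows(raw, ["id", "amount", "country"])
```

Keeping the rejects (with the reason attached) rather than silently dropping them is what makes the later "accuracy, reliability, and data integrity" testing possible.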

Posted 5 days ago

Apply

0 years

39 - 42 Lacs

Greater Kolkata Area

On-site

Job Summary We are looking for an experienced Azure Databricks Developer to design, develop, and manage big data solutions using Azure services, with a strong focus on Databricks, Spark, and ETL pipelines. The ideal candidate will have a solid understanding of data engineering, Azure cloud components, and real-time/batch data processing. Key Responsibilities Design and build scalable and robust data pipelines using Azure Databricks (Spark). Develop ETL/ELT processes for ingesting, transforming, and loading data from various sources. Integrate data solutions with Azure Data Lake, Azure Data Factory, and SQL databases. Optimize Spark jobs for performance and cost-efficiency. Collaborate with data architects, analysts, and business stakeholders to deliver data solutions. Implement data quality checks, validations, and unit testing. Ensure best practices in CI/CD, data security, and governance. Must-Have Skills Strong hands-on experience with Azure Databricks and Apache Spark (PySpark/Scala). Proficiency in Azure Data Factory, Azure Data Lake, and Azure Synapse (optional). Solid understanding of SQL, Delta Lake, and data warehousing concepts. Experience building automated data pipelines and working with large datasets. Good understanding of DevOps, version control (e.g., Git), and CI/CD pipelines. Strong problem-solving and debugging skills. Nice-to-Have Experience with Azure Synapse Analytics, Power BI, or Kafka/Event Hub integration. Familiarity with Databricks REST APIs, MLflow, or Databricks Jobs orchestration. Knowledge of data modeling, data governance, and metadata management. Qualifications Bachelor’s or Master’s degree in Computer Science, Information Systems, or a related field. Azure certifications (e.g., DP-203, Azure Data Engineer Associate) are a plus.
Skills: python, delta lake, apache spark (pyspark/scala), pyspark, ci/cd, azure data lake, azure, azure data factory, etl/elt processes, data governance, azure databricks, data modeling, git, databricks, metadata management, architect, sql

Posted 5 days ago

Apply

0.0 - 7.0 years

0 Lacs

Kovilpatti, Tamil Nadu

On-site

Title: Senior Backend Developer (API Specialist) Years of Experience: 7+ years Location: Onsite (The selected candidate is required to relocate to Kovilpatti, Tamil Nadu for the initial three-month project training session. Post-training, the candidate will be relocated to one of our onsite locations: Chennai, Hyderabad, or Pune, based on project allocation.) Job Description The Senior Backend Developer (API Specialist) will design and build enterprise-grade backend services and APIs. This role is key in enabling secure, scalable integrations across internal systems and external platforms. The ideal candidate has hands-on experience with microservices, API gateways, and DevSecOps, along with performance tuning and versioning best practices. Key responsibilities · Design microservice architectures and RESTful APIs using modern backend frameworks · Implement secure API authentication/authorization protocols (OAuth2, OpenID) · Manage API versioning, rate-limiting, and lifecycle documentation · Work with API Management platforms (e.g., Azure API Management) · Develop unit and performance tests to ensure robust integration · Enable backend telemetry, tracing, and analytics · Partner with frontend, cloud, and DevOps teams for end-to-end delivery Technical Skills · Languages: C#, .NET Core, Python, Java, Go · API Technologies: Swagger/OpenAPI, GraphQL, Postman, API Management · Cloud: Azure Functions, Azure API Management, AWS Lambda · Messaging: Kafka, RabbitMQ, Azure Service Bus · Databases: SQL Server, Cosmos DB, Redis, Elasticsearch · DevOps: Docker, Azure DevOps, Terraform, GitHub · Security: OAuth2, TLS, JWT, Azure AD Qualification Bachelor’s in Computer Science, IT, or related field Microsoft Certified: Azure Developer or Integration Architect is a plus Experience with high-load, mission-critical API development Deep understanding of integration patterns, performance tuning, and observability Job Type: Full-time Pay: Up to ₹80,000.00 per year Location Type: In-person
Ability to commute/relocate: Kovilpatti, Tamil Nadu: Reliably commute or willing to relocate with an employer-provided relocation package (Required) Application Question(s): Expected annual salary (INR) Experience: API Backend Developer: 7 years (Required) Work Location: In person
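The security stack this posting lists (OAuth2, TLS, JWT) ultimately rests on verifying a token's signature before trusting its claims. As a hedged, standard-library-only sketch of HS256 JWT signing and verification (not this employer's code; production services should use a vetted library such as PyJWT, and OAuth2 resource servers typically use RS256 with a provider's public keys):

```python
# Minimal HS256 JWT sign/verify with the standard library only.
# Secret, claims, and helper names are illustrative.
import base64
import hashlib
import hmac
import json

def _b64url(data: bytes) -> str:
    # JWT uses base64url without padding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict, secret: bytes) -> str:
    header = _b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = _b64url(json.dumps(payload).encode())
    msg = f"{header}.{body}".encode()
    sig = _b64url(hmac.new(secret, msg, hashlib.sha256).digest())
    return f"{header}.{body}.{sig}"

def verify(token: str, secret: bytes) -> dict:
    header, body, sig = token.split(".")
    msg = f"{header}.{body}".encode()
    expected = _b64url(hmac.new(secret, msg, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):  # constant-time comparison
        raise ValueError("bad signature")
    pad = body + "=" * (-len(body) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(pad))

token = sign({"sub": "user-42", "scope": "read"}, b"s3cret")
claims = verify(token, b"s3cret")
```

The `compare_digest` call is the important detail: a plain `==` on signatures can leak timing information to an attacker probing the API.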

Posted 5 days ago

Apply

0.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

Remote

Title: Senior Backend Developer (API Specialist) Years of Experience: 7+ years Location: The selected candidate is required to work onsite at our Chennai location for the initial six-month project training and execution period. After the six months, the candidate will be offered remote opportunities. Job Description The Senior Backend Developer (API Specialist) will design and build enterprise-grade backend services and APIs. This role is key in enabling secure, scalable integrations across internal systems and external platforms. The ideal candidate has hands-on experience with microservices, API gateways, and DevSecOps, along with performance tuning and versioning best practices. Key responsibilities Design microservice architectures and RESTful APIs using modern backend frameworks Implement secure API authentication/authorization protocols (OAuth2, OpenID) Manage API versioning, rate-limiting, and lifecycle documentation Work with API Management platforms (e.g., Azure API Management) Develop unit and performance tests to ensure robust integration Enable backend telemetry, tracing, and analytics Partner with frontend, cloud, and DevOps teams for end-to-end delivery Technical Skills Languages: C#, .NET Core, Python, Java, Go API Technologies: Swagger/OpenAPI, GraphQL, Postman, API Management Cloud: Azure Functions, Azure API Management, AWS Lambda Messaging: Kafka, RabbitMQ, Azure Service Bus Databases: SQL Server, Cosmos DB, Redis, Elasticsearch DevOps: Docker, Azure DevOps, Terraform, GitHub Security: OAuth2, TLS, JWT, Azure AD Qualification Bachelor’s in Computer Science, IT, or related field Microsoft Certified: Azure Developer or Integration Architect is a plus Experience with high-load, mission-critical API development Deep understanding of integration patterns, performance tuning, and observability Job Type: Full-time Pay: Up to ₹80,000.00 per year Location Type: In-person Ability to commute/relocate: Chennai, Tamil Nadu: Reliably commute or planning to
relocate before starting work (Required) Application Question(s): Expected annual salary (INR) Experience: API Backend Developer: 7 years (Required) Work Location: In person
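Among the responsibilities above, rate-limiting is the one most often implemented in application code when no gateway sits in front. A common approach is a token bucket per client; the sketch below is an illustration only (capacity and refill rate are arbitrary example values, and a gateway product like Azure API Management would enforce this declaratively instead):

```python
# Token-bucket rate limiter sketch: allows short bursts up to `capacity`,
# then throttles to `refill_per_sec` sustained requests per second.
import time

class TokenBucket:
    def __init__(self, capacity: int, refill_per_sec: float):
        self.capacity = capacity
        self.tokens = float(capacity)
        self.refill_per_sec = refill_per_sec
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # refill proportionally to elapsed time, capped at capacity
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.last) * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

bucket = TokenBucket(capacity=3, refill_per_sec=1)
results = [bucket.allow() for _ in range(5)]  # burst of 3 allowed, then throttled
```

In a real service one bucket would be keyed per API key or client IP, with state in Redis so all instances share the same limit.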

Posted 5 days ago

Apply

5.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Responsibilities The Sr. Integration Developer (Senior Software Engineer) will work in the Professional Services Team and play a significant role in designing and implementing complex integration solutions using the Adeptia platform. This role requires hands-on expertise in developing scalable and efficient solutions to meet customer requirements. The engineer will act as a key contributor to the team's deliverables while mentoring junior engineers. They will ensure high-quality deliverables by collaborating with cross-functional teams and adhering to industry standards and best practices. Responsibilities include, but are not limited to: Collaborate with customers' Business and IT teams to understand integration requirements in the B2B/Cloud/API/Data/ETL/EAI Integration space and implement solutions using the Adeptia platform. Design, develop, and configure complex integration solutions, ensuring scalability, performance, and maintainability. Take ownership of assigned modules and lead the implementation lifecycle from requirement gathering to production deployment. Troubleshoot issues during implementation and deployment, ensuring smooth system performance. Guide team members in addressing complex integration challenges and promote best practices and performance optimization. Collaborate with offshore/onshore and internal teams to ensure timely execution and coordination of project deliverables. Write efficient, well-documented, and maintainable code, adhering to established coding standards. Review code and designs of team members, providing constructive feedback to improve quality. Participate in Agile processes, including Sprint Planning, Daily Standups, and Retrospectives, ensuring effective task management and delivery. Stay updated with emerging technologies to continuously enhance technical expertise and team skills.
Essential Skills: Technical 5-7 years of hands-on experience in designing and implementing integration solutions across B2B, ETL, EAI, Cloud, API & Data Integration environments using leading platforms such as Adeptia, Talend, MuleSoft, or equivalent enterprise-grade tools. Proficiency in designing and implementing integration solutions, including integration processes, data pipelines, and data mappings, to facilitate the movement of data between applications and platforms. Proficiency in applying data transformation and data cleansing as needed to ensure data quality and consistency across different data sources and destinations. Good experience in performing thorough testing and validation of data integration processes to ensure accuracy, reliability, and data integrity. Proficiency in working with SOA, RESTful APIs, and SOAP Web Services with all security policies. Good understanding and implementation experience with various security concepts, best practices, and security standards and protocols such as OAuth, SSL/TLS, SSO, SAML, and IDP (Identity Provider). Strong understanding of XML, XSD, XSLT, and JSON. Good understanding of RDBMS/NoSQL technologies (MSSQL, Oracle, MySQL). Proficiency with transport protocols (HTTPS, SFTP, JDBC) and experience with messaging systems such as Kafka, ASB (Azure Service Bus), or RabbitMQ. Hands-on experience in Core Java and exposure to commonly used Java frameworks. Desired Skills: Technical Familiarity with JavaScript frameworks like ReactJS, AngularJS, or NodeJS. Exposure to integration standards (EDI, EDIFACT, IDOC). Experience with modern web UI tools and frameworks. Exposure to DevOps tools such as Git, Jenkins, and CI/CD pipelines. About Adeptia Adeptia believes business users should be able to access information anywhere, anytime by creating data connections themselves, and its mission is to enable that self-service capability.
Adeptia is a unique social network for digital business connectivity for “citizen integrators” to respond quickly to business opportunities and get to revenue faster. Adeptia helps Information Technology (IT) staff to manage this capability while retaining control and security. Adeptia’s unified hybrid offering — with simple data connectivity in the cloud, and optional on-premises enterprise process-based integration — provides a competitive advantage to 450+ customers, ranging from Fortune 500 companies to small businesses. Headquartered in Chicago, Illinois, USA and with an office in Noida, India, Adeptia provides world-class support to its customers around the clock. For more, visit www.adeptia.com Our Locations: India R&D Centre: Office 56, Sixth floor, Tower-B, The Corenthum, Sector-62, Noida, U.P. US Headquarters: 332 S Michigan Ave, Unit LL-A105, Chicago, IL 60604, USA
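The data mappings and cleansing this role describes reduce to a declarative spec applied uniformly to each record. A tiny sketch follows; the mapping spec and record shapes are invented for illustration, and a platform like Adeptia would express the same mapping visually rather than in code:

```python
# Illustrative declarative field mapping with light cleansing,
# as an integration pipeline might apply per record. Names are hypothetical.

def apply_mapping(record: dict, mapping: dict) -> dict:
    """Rename source fields to target fields, trimming string values."""
    out = {}
    for src, dst in mapping.items():
        value = record.get(src)
        out[dst] = value.strip() if isinstance(value, str) else value
    return out

mapping = {"cust_nm": "customerName", "cust_id": "customerId"}
source = {"cust_nm": "  Acme Corp ", "cust_id": 9001, "extra": "dropped"}
target = apply_mapping(source, mapping)
```

Keeping the mapping as data (rather than hard-coded assignments) is what lets the same transform engine serve many source/destination pairs, which is the core idea behind these integration platforms.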

Posted 5 days ago

Apply

0.0 - 7.0 years

7 - 10 Lacs

Gomtinagar, Lucknow, Uttar Pradesh

On-site

Position Overview: We are looking for a talented and experienced Node.js Developer to join our dynamic engineering team. As a Senior Backend Developer, you will be responsible for designing, developing, and maintaining the backend infrastructure for our [product/service]. You will collaborate closely with cross-functional teams to ensure scalability, performance, and reliability of our systems. Key Responsibilities: Design and implement highly scalable, reliable, and performant backend systems and APIs. Develop and optimize database schemas and queries for both relational and NoSQL databases. Write clean, maintainable, and testable code with high-quality standards. Collaborate with product managers, frontend developers, and other team members to understand requirements and design system architecture. Participate in the design and architecture of complex software systems. Troubleshoot and resolve complex technical issues in a timely and efficient manner. Mentor junior developers and conduct code reviews to ensure code quality and best practices. Stay updated with emerging technologies and industry best practices, and evaluate their applicability to existing systems. Improve system performance and reliability through optimizations, monitoring, and testing. Contribute to the automation of deployment pipelines and CI/CD processes. Required Skills & Qualifications: Bachelor’s degree in Computer Science, Engineering, or a related field, or equivalent practical experience. 5+ years of professional backend development experience. Strong proficiency in backend programming languages such as Python, Java, Go, Node.js, Ruby, etc. Solid understanding of software engineering principles, algorithms, and data structures. Experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB, Cassandra). Expertise in RESTful API design and development. Hands-on experience with cloud services (AWS, GCP, Azure).
Experience with containerization and orchestration (Docker, Kubernetes). Familiarity with version control systems like Git. Strong understanding of microservices architecture and distributed systems. Proven ability to write clean, efficient, and well-documented code. Excellent problem-solving and debugging skills. Strong collaboration and communication skills. Preferred Skills: Experience with message queuing systems like Kafka, RabbitMQ, or similar. Knowledge of GraphQL APIs. Familiarity with DevOps practices, continuous integration, and continuous deployment (CI/CD). Experience in Agile/Scrum environments. Job Types: Full-time, Permanent Pay: ₹700,000.00 - ₹1,000,000.00 per year Location Type: In-person Schedule: Day shift Ability to commute/relocate: Gomtinagar, Lucknow, Uttar Pradesh: Reliably commute or planning to relocate before starting work (Required) Application Question(s): How soon can you join the company, if selected? This is an on-site position in Lucknow. Are you comfortable with that? Education: Bachelor's (Required) Experience: APIs: 7 years (Required) Node.js: 7 years (Required) Back-end development: 7 years (Required) MongoDB: 7 years (Required) Work Location: In person Application Deadline: 17/07/2025
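Message queues like the Kafka and RabbitMQ listed above generally deliver at-least-once, so a redelivered message must not be processed twice. The standard answer is an idempotent consumer that deduplicates by message ID; the sketch below is illustrative only (message shape is invented, and a real service would back the seen-set with Redis or a database rather than process memory):

```python
# Idempotent-consumer sketch: each message carries a unique id, and a
# duplicate redelivery is skipped rather than re-processed.

def process_once(messages, handler):
    seen = set()
    handled = []
    for msg in messages:
        if msg["id"] in seen:
            continue  # duplicate delivery: already processed, skip
        seen.add(msg["id"])
        handled.append(handler(msg))
    return handled

msgs = [{"id": "a", "v": 1}, {"id": "b", "v": 2}, {"id": "a", "v": 1}]
out = process_once(msgs, lambda m: m["v"] * 10)
```

Making handlers safe to replay this way is usually cheaper than chasing exactly-once delivery from the broker itself.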

Posted 5 days ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary An experienced technology expert who manages technical delivery and works with multiple squads or hives to deliver complex features, epics, and releases. Key Responsibilities Strategy Develop and drive application strategy and roadmap in collaboration with the Domain Tech Lead. Ensure the technical consistency of solutions in the Hive with the business architecture, technology architecture and enterprise technology standards. Execute the Hive’s product vision with the Hive Leads. Collaborate with Hive Lead and DTL on technology roadmap. Communicate technical feasibility and constraints, articulate clear explanation of technical issues to stakeholders. Business Support business in appropriate sequencing of the backlog, refinement, and rationalization through a technical lens. Perform solution sizing, design detailed technological solutions for platforms, applications, or collections of applications. Deliver solution intent (SI) and application designs, produce clear and comprehensive design and solution intent documents. Collaborate with IT Change and Release Managers to develop strategic delivery and release plans with technical dependencies. Proactively identify cross-hive or squad dependencies and risks, and develop mitigation plans to address them without waiting for formal planning sessions. Develop change implementation, release plans, and run books with technical dependencies, aligning all partners with support from the squad and architect. Technical Leadership Influence prioritization for technology backlog and strategic initiatives. Incorporate considerations of technical debt into the roadmap alongside customer outcomes, ensuring a balanced approach to long-term sustainability. Align detailed technical solutions with strategic roadmaps, long-term architecture and product/platform vision and with overall organization's technology direction and standards.
Support the squad in clearly slicing the backlog into independently shippable experiences for customers, and be responsible for technical delivery and technical excellence across the squads. Identify and address technical and design dependencies to enhance the speed and quality of delivery, ensuring smooth delivery. Accountable to work with technical counterparts to mitigate technical debt and balance risk, and regulatory items with new features, functionality, or changes to keep the cost of change low. Responsible to work in partnership with technology to ensure a balance of functional and non-functional requirements are represented in the backlog and that there is an approach to mitigate or avoid technical debt. Lead the squad in defining both functional and non-functional requirements, ensuring critical aspects like API response times and overall system performance are met. Act as an expert for resolving technical and design issues. Drive an accountable and sensible technical direction to build reusable, scalable and interoperable solutions that integrate with existing investments. Skills And Experience API & Microservices Container Platforms (OpenShift, Kubernetes) Database Technologies (SQL, NoSQL) Integration Technologies Agile Methodologies, Lean Framework DevOps Engineering Qualifications Graduation or Post-Graduation in Computer Science Engineering Experience in software development using Agile methodologies Knowledge & Experience in practising Agile & Lean frameworks Knowledge & Experience in API & Microservices Knowledge & Experience in J2EE and good understanding of OOA and OOP Knowledge & Experience in Integration technologies – MQ, Kafka, Solace Knowledge / Experience in Container platform and Cloud technologies Knowledge / Experience in DevOps & SRE Experience with relational databases and NoSQL databases 10+ years’ experience in the software development domain Proven experience in solution management, business development, or a related field
Strong understanding of the software development life cycle and experience with agile methodologies Excellent communication and interpersonal skills, with the ability to build and maintain relationships with clients and partners Strong project management skills and experience leading cross-functional teams Strong problem-solving and analytical skills Strong understanding of technology and the ability to learn new technologies quickly About Standard Chartered We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion. Together We Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well Are better together, we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term What We Offer In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. 
Time-off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, combining to a minimum of 30 days. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits. A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.

Posted 5 days ago

Apply

0.0 - 10.0 years

0 Lacs

Kovilpatti, Tamil Nadu

On-site

Title: Senior Data Architect Years of Experience: 10+ years Location: Onsite (The selected candidate is required to relocate to Kovilpatti, Tamil Nadu for the initial three-month project training session. Post-training, the candidate will be relocated to one of our onsite locations: Chennai, Hyderabad, or Pune, based on project allocation.) Job Description The Senior Data Architect will design, govern, and optimize the entire data ecosystem for advanced analytics and AI workloads. This role ensures data is collected, stored, processed, and made accessible in a secure, performant, and scalable manner. The candidate will drive architecture design for structured/unstructured data, build data governance frameworks, and support the evolution of modern data platforms across cloud environments. Key responsibilities Architect enterprise data platforms using Azure/AWS/GCP and modern data lake/data mesh patterns Design logical and physical data models, semantic layers, and metadata frameworks Establish data quality, lineage, governance, and security policies Guide the development of ETL/ELT pipelines using modern tools and streaming frameworks Integrate AI and analytics solutions with operational data platforms Enable self-service BI and ML pipelines through Databricks, Synapse, or Snowflake Lead architecture reviews, design sessions, and CoE reference architecture development Technical Skills · Cloud Platforms: Azure Synapse, Databricks, Azure Data Lake, AWS Redshift · Data Modeling: ERWin, dbt, Power Designer · Storage & Processing: Delta Lake, Cosmos DB, PostgreSQL, Hadoop, Spark · Integration: Azure Data Factory, Kafka, Event Grid, SSIS · Metadata/Lineage: Purview, Collibra, Informatica · BI Platforms: Power BI, Tableau, Looker · Security & Compliance: RBAC, encryption at rest/in transit, NIST/FISMA Qualification Bachelor’s or Master’s in Computer Science, Information Systems, or Data Engineering Microsoft Certified: Azure Data Engineer / Azure Solutions Architect
Strong experience building cloud-native data architectures Demonstrated ability to create data blueprints aligned with business strategy and compliance. Job Type: Full-time Pay: Up to ₹80,000.00 per month Ability to commute/relocate: Kovilpatti, Tamil Nadu: Reliably commute or willing to relocate with an employer-provided relocation package (Required) Application Question(s): Expected annual salary (INR) Experience: Data Architect: 10 years (Required) License/Certification: Azure Data Engineer or Azure Solutions Architect (Required) Work Location: In person

Posted 5 days ago

Apply

0.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Title: Senior Data Architect Years of Experience: 10+ years Location: The selected candidate is required to work onsite at our Chennai location for the initial six-month project training and execution period. After the six months, the candidate may be offered onsite opportunities. Job Description The Senior Data Architect will design, govern, and optimize the entire data ecosystem for advanced analytics and AI workloads. This role ensures data is collected, stored, processed, and made accessible in a secure, performant, and scalable manner. The candidate will drive architecture design for structured/unstructured data, build data governance frameworks, and support the evolution of modern data platforms across cloud environments. Key responsibilities Architect enterprise data platforms using Azure/AWS/GCP and modern data lake/data mesh patterns Design logical and physical data models, semantic layers, and metadata frameworks Establish data quality, lineage, governance, and security policies Guide the development of ETL/ELT pipelines using modern tools and streaming frameworks Integrate AI and analytics solutions with operational data platforms Enable self-service BI and ML pipelines through Databricks, Synapse, or Snowflake Lead architecture reviews, design sessions, and CoE reference architecture development Technical Skills Cloud Platforms: Azure Synapse, Databricks, Azure Data Lake, AWS Redshift Data Modeling: ERWin, dbt, Power Designer Storage & Processing: Delta Lake, Cosmos DB, PostgreSQL, Hadoop, Spark Integration: Azure Data Factory, Kafka, Event Grid, SSIS Metadata/Lineage: Purview, Collibra, Informatica BI Platforms: Power BI, Tableau, Looker Security & Compliance: RBAC, encryption at rest/in transit, NIST/FISMA Qualification Bachelor’s or Master’s in Computer Science, Information Systems, or Data Engineering Microsoft Certified: Azure Data Engineer / Azure Solutions Architect Strong experience building cloud-native data architectures Demonstrated ability
to create data blueprints aligned with business strategy and compliance. Job Type: Full-time Pay: Up to ₹80,000.00 per month Ability to commute/relocate: Chennai, Tamil Nadu: Reliably commute or planning to relocate before starting work (Required) Application Question(s): Expected annual salary (INR) Experience: Data Architect: 10 years (Required) License/Certification: Azure Data Engineer or Azure Solutions Architect (Required) Work Location: In person

Posted 5 days ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Position: Senior Java Developer Location: Gurugram, Cyber City (for all positions) Experience: 8 years Qualification: * Bachelor's degree in computer science or related fields preferred. * 8+ years of experience developing core Java applications across enterprise, SME, or start-up environments. * Proven experience with distributed systems and event-driven architectures. * Expertise in Spring Boot, Spring Framework, and RESTful API development. * Experience in designing, building, and monitoring microservices. * Solid background in persistence technologies including JPA, Hibernate, MS-SQL, and PostgreSQL. * Proficient in Java 11+, including features like Streams, Lambdas, and Functional Programming. * Experience with CI/CD pipelines using tools such as Jenkins, GitLab CI, GitHub Actions, or AWS DevOps. * Familiarity with major cloud platforms: AWS, Azure, or GCP (AWS preferred). * Front-end development experience using React or Angular, with a good understanding of best practices around HTML, CSS3/Tailwind, and responsive design. * Comfortable in Agile environments with iterative development and regular demos. * Experience with container orchestration using managed Kubernetes (EKS, AKS, or GKE). * Working knowledge of Domain-Driven Design (DDD) and Backend-for-Frontend (BFF) concepts. * Hands-on experience integrating applications with cloud services. * Familiarity with event-driven technologies (e.g., Kafka, MQ, event buses). * Hospitality services domain experience is a plus. * Strong problem-solving skills, with the ability to work independently and in a team. * Proficiency in Agile methodologies and software development best practices. * Skilled in code and query optimization. * Experience with version control systems, particularly Git.

Posted 5 days ago

Apply

4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Greetings from TCS! Skill: Automation Testing Years of Experience: 4+ Years Work Mode: Work from office Location: Chennai, Bangalore JD: Must-Have: Experience in designing, developing, and executing functional and automated testing scripts. Should have worked in Agile and must be aware of Agile processes. Exposure to JIRA. Good understanding of automated end-to-end testing. Knowledge and application experience of programming concepts in the Java technology stack required. Good understanding of Micro-Services Architecture. Good understanding of REST APIs and WebSocket-based communication. Demonstrated ability to solve complex software development/design issues using clean, coherent code following established coding guidelines. Good-to-Have: Experience working with banking applications. Good understanding of asynchronous messaging systems like Kafka, Solace, MQ, or similar. Knowledge of QA and testing standards, methodologies, tools, and processes. Working understanding of Docker, Kubernetes, and other deployment and infrastructure-as-code technologies. Ability to succeed in a high-energy, dynamic environment. Self-driven, with good communication skills. Shradha HR Recruitment

Posted 5 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Position Summary... What you'll do... Job Description Summary: Responsible for coding, unit testing, and building high-performance, scalable applications that meet the needs of millions of Walmart International customers in the areas of supply chain management and customer experience. About Team: Our team collaborates with Walmart International, which has over 5,900 retail units operating outside of the United States under 55 banners in 26 countries including Africa, Argentina, Canada, Central America, Chile, China, India, Japan, and Mexico, to name a few. What you'll do: Architect, design, build, test, and deploy cutting-edge solutions at scale, impacting millions of customers worldwide, and drive value from data at Walmart scale. Interact with Walmart engineering teams across geographies to leverage expertise and contribute to the tech community. Engage with Product Management and Business to drive the agenda, set your priorities, and deliver awesome product features to keep the platform ahead of market scenarios. Identify the right open source tools to deliver product features by performing research, POCs/pilots, and/or interacting with various open source forums. Develop and/or contribute features that enable customer analytics at Walmart scale. Deploy and monitor products on cloud platforms. Develop and implement best-in-class monitoring processes to enable data applications to meet SLAs. What you'll bring: Minimum 3-6 years of web-based application development experience. Demonstrates up-to-date expertise in building enterprise-grade web-based applications. End-to-end knowledge of full-stack web application development using Java/Spring Boot, Kafka, RESTful services, APIs, and other frameworks. Proficient in GUI technologies like React.js, JavaScript, HTML, CSS. Passionate about building user-friendly, intuitive web applications to solve complex business problems.
Extremely strong technical background; hands-on, with the respect and ability to mentor top individual technical talent.
Good experience working with globally distributed teams in a collaborative and productive manner.
Excellent interpersonal skills; good with people; ability to negotiate.
Retail experience is a huge plus.

About Walmart Global Tech
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.

Flexible, hybrid work
We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.

Equal Opportunity Employer: Walmart, Inc. is an Equal Opportunity Employer By Choice.
We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions while being welcoming of all people.

Minimum Qualifications...
Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.
Option 1: Bachelor's degree in computer science, information technology, engineering, information systems, cybersecurity, or related area and 2 years' experience in software engineering or related area at a technology, retail, or data-driven company.
Option 2: 4 years' experience in software engineering or related area at a technology, retail, or data-driven company.

Preferred Qualifications...
Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.
Certification in Security+, Network+, GISF, GSEC, CISSP, or CCSP; Master's degree in Computer Science, Information Technology, Engineering, Information Systems, Cybersecurity, or related area.

Primary Location...
Pardhanani Wilshire II, Cessna Business Park, Kadubeesanahalli Village, Varthur Hobli, India R-2127130

Posted 5 days ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Position Summary... What you'll do...

About Team: Marketplace is one of the fastest-growing businesses at Walmart. As part of this team you will work on third-party seller integrations with the Walmart ecosystem, manage seller data, and build recommendation engines to improve the seller experience and business with Walmart.

What you will do:
As a Software Engineer III at Walmart, you'll have the opportunity to:
Develop intuitive software that meets and exceeds the needs of the customer and the company, and collaborate with team members to develop best practices and requirements for the software.
Professionally maintain all code and create updates regularly to address customer and company concerns.
Analyze and test programs/products before formal launch to ensure flawless performance.
Troubleshoot coding problems quickly and efficiently, growing your skills in a high-pace, high-impact environment.
Develop programs that monitor the sharing of private information; software security is of prime importance and adds tremendous credibility to your work.
Seek ways to improve the software and its effectiveness.
Adhere to company policies, procedures, mission, values, and standards of ethics and integrity.

What you will bring:
B.E./B.Tech/MS/MCA in Computer Science or a related technical field.
Minimum 3 years of object-oriented programming experience in Java.
Excellent computer systems fundamentals, DS/algorithms and problem-solving skills.
Hands-on experience building web-based Java EE services/applications with Kafka, Apache Camel, RESTful web services, Spring, Hibernate, Splunk, and caching.
Excellent organization, communication and interpersonal skills.
Large-scale distributed services experience, including scalability and fault tolerance.
Exposure to cloud infrastructure, such as OpenStack, Azure, GCP, or AWS.
Exposure to build, CI/CD and deployment pipelines and related technologies like Kubernetes, Docker, Jenkins, etc.
A continuous drive to explore, improve, enhance, automate and optimize systems and tools.
Experience in systems design and distributed systems.
Exposure to SQL/NoSQL data stores like Cassandra, Elastic, Mongo, etc.

About Walmart Global Tech
Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow. We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail.

Flexible, hybrid work
We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives.

Benefits
Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more.
Belonging
We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers and suppliers, and to be a Walmart for everyone. At Walmart, our vision is "everyone included." By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate.

Equal Opportunity Employer
Walmart, Inc. is an Equal Opportunities Employer – By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions – while being inclusive of all people.

Minimum Qualifications...
Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications.
Option 1: Bachelor's degree in computer science, information technology, engineering, information systems, cybersecurity, or related area and 2 years' experience in software engineering or related area at a technology, retail, or data-driven company.
Option 2: 4 years' experience in software engineering or related area at a technology, retail, or data-driven company.

Preferred Qualifications...
Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications.
Certification in Security+, Network+, GISF, GSEC, CISSP, or CCSP; Master's degree in Computer Science, Information Technology, Engineering, Information Systems, Cybersecurity, or related area.

Primary Location...
BLOCK-1, PRESTIGE TECH PACIFIC PARK, SY NO. 38/1, OUTER RING ROAD, KADUBEESANAHALLI, India R-2222538

Posted 5 days ago

Apply

0.0 - 7.0 years

0 Lacs

Kovilpatti, Tamil Nadu

On-site

Title: AI Integration Specialist
Years of Experience: 7+ years
Location: Onsite

Job Description
The AI Integration Specialist bridges the gap between intelligent models and operational systems. They will ensure seamless integration of AI models, APIs, and cognitive services into enterprise platforms, web/mobile applications, and automation pipelines. This role demands strong backend development skills, knowledge of integration patterns, and experience working across system boundaries to bring AI capabilities to business workflows.

Key responsibilities
· Integrate AI services into enterprise applications (CRM, ERP, portals, mobile apps)
· Develop middleware services to enable interaction with AI models (REST, GraphQL, gRPC)
· Manage APIs, version control, security, and authentication for AI endpoints
· Implement monitoring, logging, and failover strategies for AI-powered modules
· Facilitate integration of LLM-based copilots, agents, and intelligent assistants
· Collaborate with Dev, QA, and AI teams to ensure end-to-end performance and quality
· Enable multi-system orchestration using platforms like Power Automate or Logic Apps
· Document integration blueprints, SDKs, and reusable patterns for scale
Technical Skills
· Backend: Node.js, C#, Python (Flask, FastAPI), .NET Core
· API Management: Azure API Management, Postman, Swagger/OpenAPI
· Integration Platforms: Azure Logic Apps, Power Platform, MuleSoft, Zapier
· AI Services: OpenAI, Azure OpenAI, Cognitive Search, GPT/LLM APIs
· CI/CD: Azure DevOps, GitHub Actions, Jenkins
· Familiarity with event-driven architecture (Service Bus, Kafka, Pub/Sub)
· Understanding of OAuth, JWT, and API gateway security best practices

Qualification
Bachelor's or Master's in Computer Science, Software Engineering, or Systems Integration
Microsoft Certified: Azure Developer Associate / Integration Architect desirable
Hands-on experience integrating AI/ML models into customer-facing applications
Strong understanding of RESTful architecture, data contracts, and system interoperability

Job Type: Full-time
Pay: Up to ₹700,000.00 per year
Ability to commute/relocate: Kovilpatti, Tamil Nadu: Reliably commute or willing to relocate with an employer-provided relocation package (Required)
Application Question(s): Expected salary in annual (INR)
Experience: AI Integration Specialist: 7 years (Required)
Work Location: In person

Posted 5 days ago

Apply

0.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

Remote

Title: AI Integration Specialist
Years of Experience: 7+ years
Location: The selected candidate is required to work onsite at our Chennai location for the initial six-month project training and execution period. After the six months, the candidate will be offered remote opportunities.

Job Description
The AI Integration Specialist bridges the gap between intelligent models and operational systems. They will ensure seamless integration of AI models, APIs, and cognitive services into enterprise platforms, web/mobile applications, and automation pipelines. This role demands strong backend development skills, knowledge of integration patterns, and experience working across system boundaries to bring AI capabilities to business workflows.

Key responsibilities
Integrate AI services into enterprise applications (CRM, ERP, portals, mobile apps)
Develop middleware services to enable interaction with AI models (REST, GraphQL, gRPC)
Manage APIs, version control, security, and authentication for AI endpoints
Implement monitoring, logging, and failover strategies for AI-powered modules
Facilitate integration of LLM-based copilots, agents, and intelligent assistants
Collaborate with Dev, QA, and AI teams to ensure end-to-end performance and quality
Enable multi-system orchestration using platforms like Power Automate or Logic Apps
Document integration blueprints, SDKs, and reusable patterns for scale
Technical Skills
Backend: Node.js, C#, Python (Flask, FastAPI), .NET Core
API Management: Azure API Management, Postman, Swagger/OpenAPI
Integration Platforms: Azure Logic Apps, Power Platform, MuleSoft, Zapier
AI Services: OpenAI, Azure OpenAI, Cognitive Search, GPT/LLM APIs
CI/CD: Azure DevOps, GitHub Actions, Jenkins
Familiarity with event-driven architecture (Service Bus, Kafka, Pub/Sub)
Understanding of OAuth, JWT, and API gateway security best practices

Qualification
Bachelor's or Master's in Computer Science, Software Engineering, or Systems Integration
Microsoft Certified: Azure Developer Associate / Integration Architect desirable
Hands-on experience integrating AI/ML models into customer-facing applications
Strong understanding of RESTful architecture, data contracts, and system interoperability

Job Type: Full-time
Pay: Up to ₹700,000.00 per year
Ability to commute/relocate: Kovilpatti, Tamil Nadu: Reliably commute or planning to relocate before starting work (Required)
Application Question(s): Expected salary in annual (INR)
Experience: AI Integration Specialist: 7 years (Required)
Work Location: In person

Posted 5 days ago

Apply