6.0 - 10.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Roles & Responsibilities:
- Design, build, and manage Kafka clusters using Confluent Platform and Kafka cloud services (AWS MSK, Confluent Cloud).
- Develop and maintain Kafka topics, schemas (Avro/Protobuf), and connectors for data ingestion and processing pipelines.
- Monitor and ensure the reliability, scalability, and security of Kafka infrastructure.
- Collaborate with application and data engineering teams to integrate Kafka with other AWS-based services (e.g., Lambda, S3, EC2, Redshift).
- Implement and manage Kafka Connect, Kafka Streams, and ksqlDB where applicable.
- Optimize Kafka performance, troubleshoot issues, and manage incident response.

Preferred candidate profile:
- 4-6 years of experience working with Apache Kafka and Confluent Kafka.
- Strong knowledge of Kafka internals (brokers, ZooKeeper, partitions, replication, offsets).
- Experience with Kafka Connect, Schema Registry, REST Proxy, and Kafka security.
- Hands-on experience with AWS (EC2, IAM, CloudWatch, S3, Lambda, VPC, load balancers).
- Proficiency in scripting and automation using Terraform, Ansible, or similar tools.
- Familiarity with DevOps practices and tools (CI/CD pipelines; monitoring tools such as Prometheus/Grafana, Splunk, Datadog).
- Experience with containerization (Docker, Kubernetes) is a plus.
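As a rough illustration of the durability reasoning behind settings like these (a hedged sketch of standard Kafka guidance, not part of this posting): with acks=all, surviving f broker failures without losing acknowledged data needs min.insync.replicas = f + 1, and staying writable through those failures needs a replication factor of at least min.insync.replicas + f.

```python
def replication_settings(f: int) -> dict:
    """Sketch: topic settings to survive f broker failures with acks=all.

    min.insync.replicas = f + 1 keeps every acknowledged write on f + 1
    replicas, so losing f brokers cannot lose acknowledged data; the
    replication factor must then be at least min.insync.replicas + f for
    the topic to stay writable while f brokers are down.
    """
    if f < 0:
        raise ValueError("f must be non-negative")
    min_isr = f + 1
    return {
        "replication.factor": min_isr + f,
        "min.insync.replicas": min_isr,
        "producer.acks": "all",
    }

# The classic production default falls out at f = 1: RF 3, min.insync.replicas 2
print(replication_settings(1))
```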
Posted 2 months ago
6.0 - 11.0 years
4 - 9 Lacs
Bengaluru
Work from Office
SUMMARY

Job Role: Apache Kafka Admin
Experience: 6+ years
Location: Pune (preferred), Bangalore, Mumbai
Must-Have: At least 6 years of relevant experience in Apache Kafka

Job Description: We are seeking a highly skilled and experienced Senior Kafka Administrator to join our team. The ideal candidate will have 6-9 years of hands-on experience in managing and optimizing Apache Kafka environments. As a Senior Kafka Administrator, you will play a critical role in designing, implementing, and maintaining Kafka clusters to support our organization's real-time data streaming and event-driven architecture initiatives.

Responsibilities:
- Design, deploy, and manage Apache Kafka clusters, including installation, configuration, and optimization of Kafka brokers, topics, and partitions.
- Monitor Kafka cluster health, performance, and throughput metrics, and implement proactive measures to ensure optimal performance and reliability.
- Troubleshoot and resolve issues related to Kafka message delivery, replication, and data consistency.
- Implement and manage Kafka security mechanisms, including SSL/TLS encryption, authentication, authorization, and ACLs.
- Configure and manage Kafka Connect connectors for integrating Kafka with various data sources and sinks.
- Collaborate with development teams to design and implement Kafka producers and consumers for building real-time data pipelines and streaming applications.
- Develop and maintain automation scripts and tools for Kafka cluster provisioning, deployment, and management.
- Implement backup, recovery, and disaster recovery strategies for Kafka clusters to ensure data durability and availability.
- Stay up to date with the latest Kafka features, best practices, and industry trends, and provide recommendations for optimizing our Kafka infrastructure.

Requirements:
- 6-9 years of experience as a Kafka Administrator or in a similar role, with a proven track record of managing Apache Kafka clusters in production environments.
- In-depth knowledge of Kafka architecture, components, and concepts, including brokers, topics, partitions, replication, and consumer groups.
- Hands-on experience with Kafka administration tasks such as cluster setup, configuration, performance tuning, and monitoring.
- Experience with Kafka ecosystem tools and technologies such as Kafka Connect, Kafka Streams, and Confluent Platform.
- Proficiency in scripting languages such as Python, Bash, or Java.
- Strong understanding of distributed systems, networking, and Linux operating systems.
- Excellent problem-solving and troubleshooting skills, with the ability to diagnose and resolve complex technical issues.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams and communicate technical concepts to non-technical stakeholders.
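One of the Kafka internals listed here, keyed partitioning, fits in a few lines. Below is a plain-Python sketch of the default partitioner's behaviour (a murmur2 hash of the key, made positive, modulo the partition count); it mirrors the Java client's logic but is an illustrative port, not the production code path.

```python
def murmur2(data: bytes) -> int:
    """Pure-Python port of Kafka's 32-bit murmur2 hash (seed 0x9747B28C)."""
    m, mask = 0x5BD1E995, 0xFFFFFFFF
    h = (0x9747B28C ^ len(data)) & mask
    # Mix four bytes at a time into the hash
    for i in range(len(data) // 4):
        k = int.from_bytes(data[4 * i:4 * i + 4], "little")
        k = (k * m) & mask
        k ^= k >> 24
        k = (k * m) & mask
        h = ((h * m) & mask) ^ k
    # Fold in the last 1-3 bytes, as in the Java switch fall-through
    tail, rest = len(data) & ~3, len(data) % 4
    if rest == 3:
        h ^= data[tail + 2] << 16
    if rest >= 2:
        h ^= data[tail + 1] << 8
    if rest >= 1:
        h ^= data[tail]
        h = (h * m) & mask
    # Final avalanche
    h ^= h >> 13
    h = (h * m) & mask
    return h ^ (h >> 15)

def partition_for(key: bytes, num_partitions: int) -> int:
    """Default partitioner behaviour for a keyed record: positive
    murmur2 of the key, modulo the topic's partition count."""
    return (murmur2(key) & 0x7FFFFFFF) % num_partitions

p = partition_for(b"order-42", 12)
print(p, p == partition_for(b"order-42", 12))  # stable for a given key
```

This is why records with the same key always land in the same partition (and thus preserve per-key ordering), and why changing a topic's partition count reshuffles key placement.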
Posted 3 months ago
5.0 - 10.0 years
16 - 19 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Detailed job description - skill set:
- Proven experience as a Kafka Developer.
- Knowledge of Kafka schemas and use of the Schema Registry.
- Strong knowledge of Kafka and other big data technologies.
- Best practices for optimizing the Kafka ecosystem based on use case and workload.
- Knowledge of Kafka clustering and its fault-tolerance model supporting high availability.
- Strong fundamentals in Kafka client configuration and troubleshooting.
- Designing and implementing data pipelines using Apache Kafka.
- Develop and maintain Kafka-based data pipelines.
- Monitor and optimize Kafka clusters.
- Troubleshoot and resolve issues related to Kafka and data processing.
- Ensure data security and compliance with industry standards.
- Create and maintain documentation for Kafka configurations and processes.
- Implement best practices for Kafka architecture and operations.

Mandatory Skills: Kafka Developer
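The Schema Registry skill above centres on compatibility checking. The BACKWARD rule for record schemas can be approximated in plain Python (a simplified sketch for flat field lists, not the registry's actual checker): the new schema must be able to read old data, so any field it adds must carry a default, and shared fields must keep their types.

```python
def is_backward_compatible(old_fields: dict, new_fields: dict) -> bool:
    """Simplified BACKWARD check: can a reader using new_fields read
    data written with old_fields?

    Both arguments map field name -> {"type": ..., "default": ...},
    where "default" is present only when the field declares one.
    """
    for name, spec in new_fields.items():
        if name not in old_fields and "default" not in spec:
            return False      # new reader has no value for old records
        if name in old_fields and old_fields[name]["type"] != spec["type"]:
            return False      # type changes would need promotion rules
    return True               # fields dropped by the reader are fine

old = {"id": {"type": "long"}, "amount": {"type": "double"}}
ok  = {**old, "currency": {"type": "string", "default": "USD"}}
bad = {**old, "currency": {"type": "string"}}
print(is_backward_compatible(old, ok), is_backward_compatible(old, bad))
```

In practice this is why "add a field with a default" is the safe evolution move on a topic whose consumers upgrade before producers.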
Posted 3 months ago
5.0 - 8.0 years
9 - 14 Lacs
Bengaluru
Work from Office
Role Purpose

The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:
- Oversee and support the process by reviewing daily transactions against performance parameters.
- Review the performance dashboard and the scores for the team.
- Support the team in improving performance parameters by providing technical support and process guidance.
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions.
- Ensure standard processes and procedures are followed to resolve all client queries.
- Resolve client queries as per the SLAs defined in the contract.
- Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting.
- Document and analyze call logs to spot the most frequent trends and prevent future problems.
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
- Ensure all product information and disclosures are given to clients before and after the call/email requests.
- Avoid legal challenges by monitoring compliance with service agreements.
- Handle technical escalations through effective diagnosis and troubleshooting of client queries.
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner.
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists.
- Develop and conduct trainings (triages) within products for Production Specialists as per target; inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and any other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, technical test performance

Mandatory Skills: Kafka Integration. Experience: 5-8 Years.
Posted 3 months ago
10.0 - 13.0 years
35 - 50 Lacs
Chennai
Work from Office
Cognizant is hiring a Payments BA!

Location: Chennai, Bangalore, Hyderabad

Job Summary
- At least 10 years of experience in the BA role, including a couple of years in a BA lead role.
- Good domain knowledge in SWIFT/ISO 20022, a payments background, and stakeholder management.
- Java, microservices, and Spring Boot.

Technical Knowledge: Java/Spring Boot, Kafka Streams, REST, JSON, the Netflix microservices suite (Zuul, Eureka, Hystrix, etc.), 12-Factor Apps, Oracle, PostgreSQL, Cassandra, and ELK. Ability to work with geographically dispersed and highly varied stakeholders.

Responsibilities

Strategy
- Develop the strategic direction and roadmap for our flagship payments platform, aligning with the business strategy, tech and ops strategy, and investment priorities.
- Tap into the latest industry trends and innovative products and solutions to deliver effective and faster product capabilities.
- Support cash management operations, leveraging technology to streamline processes, enhance productivity, reduce risk, and improve controls.

Business
- Work hand in hand with the payments business, taking product programs from investment decisions into design specifications, solutioning, development, implementation, and hand-over to operations, while securing support and collaboration from other teams.
- Ensure delivery to the business meets time, cost, and high-quality constraints.
- Support the respective businesses in growing return on investment, commercializing capabilities, supporting bid teams, monitoring usage, improving client experience, enhancing operations, and addressing defects and continuous improvement of systems.
- Foster an ecosystem of innovation and enable the business through technology.

Processes
- Be responsible for the end-to-end deliveries of the technology portfolio comprising key business product areas such as payments, clearing, etc.
- Own technology delivery of projects and programs across global markets that (a) develop/enhance core product capabilities, (b) ensure compliance with regulatory mandates, (c) support operational improvements, process efficiencies, and the zero-touch agenda, and (d) build the payments platform to align with the latest technology and architecture trends with improved stability and scale.
- Interface with business and technology leaders of other systems for collaborative delivery.
Posted 3 months ago
3 - 6 years
4 - 8 Lacs
Bengaluru
Work from Office
Location: IN - Bangalore | Posted today | End date to apply: May 22, 2025 | Job requisition ID: R140300

Company Overview
A.P. Moller - Maersk is an integrated container logistics company and member of the A.P. Moller Group, connecting and simplifying trade to help our customers grow and thrive. With a dedicated team of over 95,000 employees operating in 130 countries, we go all the way to enable global trade for a growing world. From the farm to your refrigerator, or the factory to your wardrobe, A.P. Moller - Maersk is developing solutions that meet customer needs from one end of the supply chain to the other.

About the Team
At Maersk, the Global Ocean Manifest team is at the heart of global trade compliance and automation. We build intelligent, high-scale systems that seamlessly integrate customs regulations across 100+ countries, ensuring smooth cross-border movement of cargo by ocean, rail, and other transport modes. Our mission is to digitally transform customs documentation, reducing friction, optimizing workflows, and automating compliance for a complex web of regulatory bodies, ports, and customs authorities. We deal with real-time data ingestion, document generation, regulatory rule engines, and multi-format data exchange while ensuring resilience and security at scale.

Key Responsibilities
- Work with large, complex datasets and ensure efficient data processing and transformation.
- Collaborate with cross-functional teams to gather and understand data requirements.
- Ensure data quality, integrity, and security across all processes.
- Implement data validation, lineage, and governance strategies to ensure data accuracy and reliability.
- Build, optimize, and maintain ETL pipelines for structured and unstructured data, ensuring high throughput, low latency, and cost efficiency.
- Build scalable, distributed data pipelines for processing real-time and historical data.
- Contribute to the architecture and design of data systems and solutions.
- Write and optimize SQL queries for data extraction, transformation, and loading (ETL).
- Advise Product Owners to identify and manage risks, debt, issues, and opportunities for technical improvement.
- Provide continuous improvement suggestions for internal code frameworks, best practices, and guidelines.
- Contribute to engineering innovations that fuel Maersk's vision and mission.

Required Skills & Qualifications
- 4+ years of experience in data engineering or a related field.
- Strong problem-solving and analytical skills.
- Experience with Java and the Spring framework.
- Experience building data processing pipelines using Apache Flink and Spark.
- Experience with distributed data lake environments (Dremio, Databricks, Google BigQuery, etc.).
- Experience with Apache Kafka and Kafka Streams.
- Experience working with databases (PostgreSQL preferred), with solid experience in writing and optimizing SQL queries.
- Hands-on experience in cloud environments such as Azure (preferred), AWS, Google Cloud, etc.
- Experience with data warehousing and ETL processes.
- Experience designing and integrating data APIs (REST/GraphQL) for real-time and batch processing.
- Knowledge of Great Expectations, Apache Atlas, or DataHub would be a plus.
- Knowledge of RBAC, encryption, and GDPR compliance would be a plus.

Business Skills
- Excellent communication and collaboration skills.
- Ability to translate between technical language and business language, and to communicate with different target groups.
- Ability to understand complex designs.
- Ability to balance competing forces and opinions within the development team.

Personal Profile
- Fact-based and result-oriented.
- Ability to work independently and guide the team.
- Excellent verbal and written communication.

Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking.
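Real-time pipelines of the kind described here typically aggregate by event time. A tumbling-window count, as Flink or Kafka Streams would compute it, can be sketched in plain Python; the record shape, keys, and window size below are illustrative assumptions, not Maersk specifics.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Count events per (key, window), where each event is (key, timestamp_ms)
    and a window's start is ts - (ts % window_ms), as in a tumbling window."""
    counts = defaultdict(int)
    for key, ts in events:
        window_start = ts - (ts % window_ms)
        counts[(key, window_start)] += 1
    return dict(counts)

events = [("vessel-A", 1_000), ("vessel-A", 4_999),
          ("vessel-B", 5_000), ("vessel-A", 5_001)]
print(tumbling_window_counts(events, window_ms=5_000))
# vessel-A has 2 events in window [0, 5000) and 1 in [5000, 10000)
```

A real engine adds state stores, watermarks, and incremental emission on top of exactly this grouping rule.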
Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing .
Posted 3 months ago
6 - 9 years
10 - 16 Lacs
Hyderabad
Hybrid
Senior Software Engineer - Java Developer with Kafka Streaming, Spark & OpenShift

Position Description
At CGI, we're a team of builders. We call our employees members because all who join CGI are building their own company - one that has grown to 72,000 professionals located in 40 countries. Founded in 1976, CGI is a leading IT and business process services firm committed to helping clients succeed. We have the global resources, expertise, stability, and dedicated professionals needed to achieve results for our clients - and for our members. Come grow with us. Learn more at www.cgi.com.

This is a great opportunity to join a winning team. CGI offers a competitive compensation package with opportunities for growth and professional development. Benefits for full-time, permanent members start on the first day of employment and include a paid time-off program and profit participation and stock purchase plans. We wish to thank all applicants for their interest and effort in applying for this position; however, only candidates selected for interviews will be contacted. No unsolicited agency referrals, please.
Job Title: Senior Software Engineer - Java Developer with Kafka Streaming, Spark & OpenShift
Experience: 6 to 9 Years
Category: Software Development/Engineering
Main location: Hyderabad
Shift Timings: General Shift
Employment Type: Full Time - Permanent

Your future duties and responsibilities

Job Summary:
• CGI is looking for a skilled and proactive Java Developer with hands-on experience in Kafka streaming, Apache Spark, and Red Hat OpenShift.
• The ideal candidate will play a key role in designing, developing, and deploying scalable backend systems and real-time data pipelines.
• This position is ideal for someone passionate about building high-performance systems and working with cutting-edge technologies in cloud-native environments.

Key Responsibilities:
• Design, develop, and maintain robust Java-based microservices and backend applications.
• Develop real-time data streaming applications using Apache Kafka and Kafka Streams.
• Build and optimize large-scale batch and stream processing pipelines with Apache Spark.
• Containerize applications and manage deployments using OpenShift and Kubernetes.
• Collaborate with DevOps teams to ensure CI/CD pipelines are robust and scalable.
• Write unit tests and conduct code reviews to maintain code quality and reliability.
• Work closely with Product and Data Engineering teams to understand requirements and translate them into technical solutions.
• Troubleshoot and debug production issues across multiple environments.

Required qualifications to be successful in this role:
• Strong programming skills in Java (Java 8 or higher).
• Hands-on experience with Apache Kafka, Kafka Streams, and event-driven architecture.
• Solid knowledge of Apache Spark (batch and streaming).
• Experience with OpenShift, Kubernetes, and container orchestration.
• Familiarity with microservices architecture, RESTful APIs, and distributed systems.
• Experience with build tools such as Maven or Gradle.
• Familiarity with Git, Jenkins, CI/CD pipelines, and Agile development practices.
• Excellent problem-solving skills and the ability to work in a fast-paced environment.

Education & Experience:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• Minimum 6 years of experience in backend development with Java and related technologies.

Preferred Skills (Nice to Have):
• Knowledge of cloud platforms like AWS, Azure, or GCP.
• Understanding of security best practices in cloud-native environments.
• Familiarity with SQL/NoSQL databases (e.g., PostgreSQL, Cassandra, MongoDB).
• Experience with Scala or Python for Spark jobs is a plus.
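A Kafka Streams topology of the kind this role builds - the canonical word count, for instance - groups records by key and folds state over a stream. Sketched in plain Python as a stand-in for the actual Streams DSL (which runs in Java):

```python
from collections import Counter

def word_count(stream):
    """Fold a stream of text records into per-word counts, mirroring
    flatMapValues -> groupBy -> count in the Kafka Streams DSL."""
    counts = Counter()
    for record in stream:
        counts.update(record.lower().split())  # flatMap + lowercase key
    return counts

counts = word_count(["Kafka streams", "kafka connect", "spark and kafka"])
print(counts["kafka"])  # → 3
```

The real DSL differs in that the state (a KTable backed by a changelog topic) updates incrementally per record and survives restarts, but the fold itself is the same shape.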
Posted 3 months ago
3 - 5 years
5 - 9 Lacs
Bengaluru
Work from Office
About The Role

Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications to be deployed at a client end, and to ensure they meet 100% quality assurance parameters.

Do

1. Be instrumental in understanding the requirements and design of the product/software:
- Develop software solutions by studying information needs, systems flow, data usage, and work processes.
- Investigate problem areas through the software development life cycle.
- Facilitate root-cause analysis of system issues and the problem statement.
- Identify ideas to improve system performance and availability.
- Analyze client requirements and convert them into feasible designs.
- Collaborate with functional teams or systems analysts who carry out the detailed investigation into software requirements.
- Confer with project managers to obtain information on software capabilities.

2. Perform coding and ensure optimal software/module development:
- Determine operational feasibility by evaluating analysis, problem definition, requirements, software development, and proposed software.
- Develop and automate processes for software validation by setting up and designing test cases/scenarios/usage cases and executing them.
- Modify software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.
- Analyze information to recommend and plan the installation of new systems or modifications of an existing system.
- Ensure that code is error-free, with no bugs or test failures.
- Prepare reports on programming project specifications, activities, and status.
- Ensure all the codes are raised as per the norms defined for the project/program/account, with clear descriptions and replication patterns.
- Compile timely, comprehensive, and accurate documentation and reports as requested.
- Coordinate with the team on daily project status and progress, and document it.
- Provide feedback on usability and serviceability, trace results to quality risks, and report them to the concerned stakeholders.

3. Status reporting and customer focus on an ongoing basis with respect to the project and its execution:
- Capture all requirements and clarifications from the client for better-quality work.
- Take feedback on a regular basis to ensure smooth and on-time delivery.
- Participate in continuing education and training to remain current on best practices, learn new programming languages, and better assist other team members.
- Consult with engineering staff to evaluate software-hardware interfaces and develop specifications and performance requirements.
- Document and demonstrate solutions by developing documentation, flowcharts, layouts, diagrams, charts, code comments, and clear code.
- Document necessary details and reports formally for proper understanding of the software from client proposal to implementation.
- Ensure good quality of interaction with the customer w.r.t. e-mail content, fault report tracking, voice calls, business etiquette, etc.
- Respond to customer requests in a timely manner, with no instances of complaints either internally or externally.

Deliver:
No. | Performance Parameter | Measure
1 | Continuous integration, deployment & monitoring of software | 100% error-free onboarding and implementation; throughput %; adherence to the schedule/release plan
2 | Quality & CSAT | On-time delivery; manage software; troubleshoot queries; customer experience; completion of assigned certifications for skill upgradation
3 | MIS & reporting | 100% on-time MIS and report generation

Mandatory Skills: Kafka Integration. Experience: 3-5 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 3 months ago
5 - 8 years
9 - 14 Lacs
Hyderabad
Work from Office
About The Role

Role Purpose
The purpose of the role is to support process delivery by ensuring daily performance of the Production Specialists, resolving technical escalations, and developing technical capability within the Production Specialists.

Do:
- Oversee and support the process by reviewing daily transactions against performance parameters.
- Review the performance dashboard and the scores for the team.
- Support the team in improving performance parameters by providing technical support and process guidance.
- Record, track, and document all queries received, problem-solving steps taken, and total successful and unsuccessful resolutions.
- Ensure standard processes and procedures are followed to resolve all client queries.
- Resolve client queries as per the SLAs defined in the contract.
- Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting.
- Document and analyze call logs to spot the most frequent trends and prevent future problems.
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
- Ensure all product information and disclosures are given to clients before and after the call/email requests.
- Avoid legal challenges by monitoring compliance with service agreements.
- Handle technical escalations through effective diagnosis and troubleshooting of client queries.
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner.
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled callbacks to customers to record feedback and ensure compliance with contract SLAs.
- Build people capability to ensure operational excellence and maintain superior customer service levels for the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialists.
- Develop and conduct trainings (triages) within products for Production Specialists as per target; inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and any other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver:
No. | Performance Parameter | Measure
1 | Process | No. of cases resolved per day; compliance to process and quality standards; meeting process-level SLAs; Pulse score; customer feedback; NSAT/ESAT
2 | Team Management | Productivity, efficiency, absenteeism
3 | Capability Development | Triages completed, technical test performance

Mandatory Skills: Kafka Integration. Experience: 5-8 Years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention - of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA: as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
Posted 3 months ago
8 - 13 years
20 - 35 Lacs
Kolkata, Pune, Bengaluru
Hybrid
Role & Responsibilities
- 8+ years of experience in the relevant fields below (internships, prototypes, and personal projects are not counted).
- Coding is required (ideally Python or Java).
- Own the end-to-end lifecycle, from development to deployment in a production environment.
- Experience building or deploying solutions in the cloud, either cloud-native (serverless: S3, Lambda, AWS Batch, ECS) or cloud-agnostic (Kubernetes, Helm charts, ArgoCD, Prometheus, Grafana).
- CI/CD experience: GitHub Actions or Jenkins.
- Infrastructure as code: e.g., Terraform.
- Experience in at least one of these focus areas:
  - Big Data: building big data pipelines or platforms to process petabytes of data (PySpark, Hudi, data lineage, AWS Glue, AWS EMR, Kafka, Schema Registry).
  - GraphDB: ingesting and consuming data in a graph database such as Neo4j, AWS Neptune, JanusGraph, or DGraph.

Preferred Candidate Profile

Specifically highlight Kafka expertise, including details such as:
- Experience with Kafka cluster management and configuration.
- Stream processing with Kafka Streams or KSQL.
- Schema Registry implementation and management.
- Kafka Connect for data integration.

Put significant focus on PySpark skills:
- Experience building and optimizing PySpark jobs for batch processing.
- Stream processing with Spark Structured Streaming.
- Familiarity with Delta Lake, Hudi, or Iceberg for lakehouse implementation.

Highlight data engineering skills that complement these technologies:
- Data pipeline design and implementation.
- Experience with data quality, validation, and lineage tracking.
- Performance optimization for large-scale data processing.
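One building block behind reliable Kafka pipelines like those above is idempotent consumption: track the highest offset applied per partition and skip redelivered records. A hedged plain-Python sketch (the (partition, offset, payload) record shape is an assumption for illustration):

```python
def apply_once(records, state=None):
    """Apply records idempotently: skip any record whose (partition, offset)
    is at or below the highest offset already applied for that partition."""
    last_applied = state if state is not None else {}
    applied = []
    for partition, offset, payload in records:
        if offset <= last_applied.get(partition, -1):
            continue                      # duplicate delivery: drop it
        applied.append(payload)
        last_applied[partition] = offset
    return applied, last_applied

# A batch containing one redelivered record (partition 0, offset 1)
batch = [(0, 0, "a"), (0, 1, "b"), (0, 1, "b"), (1, 0, "c")]
out, state = apply_once(batch)
print(out)  # duplicates removed → ['a', 'b', 'c']
```

In production the equivalent state lives transactionally next to the sink data, so a crash between "apply" and "commit offset" cannot double-apply a record.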
Posted 3 months ago
3 - 8 years
8 - 18 Lacs
Gurugram
Remote
Kafka/MSK:
- In-depth understanding of Kafka broker configurations, ZooKeeper, and connectors.
- Understanding of Kafka topic design and creation.
- Good knowledge of replication and high availability for Kafka systems.

Linux

ElasticSearch/OpenSearch
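For illustration only (host names and values are placeholder assumptions, not from this posting), a server.properties fragment touching the broker, replication, and high-availability settings listed above might look like:

```properties
# Broker identity and quorum (ZooKeeper-based deployment assumed)
broker.id=1
zookeeper.connect=zk1:2181,zk2:2181,zk3:2181/kafka

# Replication / high-availability defaults for new topics
default.replication.factor=3
min.insync.replicas=2
unclean.leader.election.enable=false

# Listeners for clients
listeners=PLAINTEXT://0.0.0.0:9092
advertised.listeners=PLAINTEXT://broker1.internal:9092
```

Disabling unclean leader election trades availability for durability: a partition stays offline rather than electing an out-of-sync replica as leader.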
Posted 3 months ago
4 - 6 years
15 - 22 Lacs
Gurugram
Hybrid
The Job We are looking out for a Sr Data Engineer responsible to Design, Develop and Support Real Time Core Data Products to support TechOps Applications. Work with various teams to understand business requirements, reverse engineer existing data products and build state of the art performant data pipelines. AWS is the cloud of choice for these pipelines and a solid understand and experience of architecting , developing and maintaining real time data pipelines in AWS Is highly desired. Design, Architect and Develop Data Products that provide real time core data for applications. Production Support and Operational Optimisation of Data Projects including but not limited to Incident and On Call Support , Performance Optimization , High Availability and Disaster Recovery. Understand Business Requiremensts interacting with business users and or reverse engineering existing legacy data products. Mentor and train junior team members and share architecture , design and development knowdge of data products and standards. Mentor and train junior team members and share architecture , design and development knowdge of data products and standards. Good understand and working knowledge of distributed databases and pipelines. Your Profile An ideal candidate will have 4+ yrs of experience in Real Time Streaming along with hands on Spark, Kafka, Apache Flink, Java, Big data technologies, AWS and MSK (managed service kafka) AWS Distrubuited Database technologies including Managed Services Kafka, Managed Apache Flink, DynamoDB, S3, Lambda. 
Experience designing and developing real-time data products with Apache Flink (Scala experience can be considered). Experience with Python and PySpark. SQL code development. AWS solutions architecture experience for data products is required. Manage and troubleshoot real-time data pipelines in the AWS Cloud. Experience with high availability and disaster recovery solutions for real-time data streaming. Excellent analytical, problem-solving, and communication skills. Must be self-motivated, with the ability to work independently. Ability to understand existing SQL and code and user requirements, and translate them into modernized data products.
Posted 3 months ago
3 - 7 years
14 - 15 Lacs
Hyderabad
Work from Office
Hi, greetings for the day! We found your profile suitable for the below opening; kindly go through the JD and reach out to us if you are interested. About Us Incorporated in 2006, we are an 18-year-old recruitment and staffing company, providing manpower for some of the Fortune 500 companies for junior/middle/executive talent. About Client Hiring for one of the most prestigious multinational corporations! Job Description Job Title: Kafka Testing Qualification: Any Graduate or above Relevant Experience: 3+ yrs Location: Hyderabad - initial 8 weeks work from office, followed by work from home CTC Range: 15 LPA (lakhs per annum) Notice period: Immediate Shift Timing: 1 PM to 10 PM Mode of Interview: Virtual Joel, IT Staff, Black and White Business Solutions Pvt Ltd, Bangalore, Karnataka, INDIA. 8067432416 | joel.manivasan@blackwhite.in | www.blackwhite.in
Posted 3 months ago
4 - 8 years
10 - 17 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Warm greetings from SP Staffing Services Pvt Ltd!!!! Experience: 4-7 yrs Work Location: Chennai/Hyderabad/Bangalore Interested candidates, kindly share your updated resume to gokul.priya@spstaffing.in or contact number (WhatsApp: 9360311230) to proceed further. Job Description: Confluent Kafka platform setup, maintenance, and upgrades. Hands-on experience with Kafka brokers. Hands-on experience with Schema Registry. Hands-on experience with ksqlDB and understanding of its underlying implementation and functions. Hands-on experience with Kafka connectors and understanding of their underlying implementation. Proficient understanding of Kafka producer and consumer functioning. Experience with Kafka deployment in Azure Kubernetes Service. Experience working in the Azure cloud.
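As a hedged illustration of the producer/consumer tuning such a role involves, typical delivery-semantics settings can be sketched as plain dictionaries (librdkafka-style property names; the broker address and group id are placeholders):

```python
# Illustrative client settings showing the delivery-semantics trade-offs
# a Kafka developer tunes. Values are examples, not a recommendation.
producer_config = {
    "bootstrap.servers": "broker:9092",  # placeholder address
    "acks": "all",                 # wait for all in-sync replicas
    "enable.idempotence": True,    # retries cannot create duplicates
    "compression.type": "lz4",     # trade CPU for network/disk savings
}

consumer_config = {
    "bootstrap.servers": "broker:9092",
    "group.id": "example-group",   # hypothetical consumer group
    "enable.auto.commit": False,   # commit offsets only after processing
    "auto.offset.reset": "earliest",
}

print(producer_config["acks"])
```

Committing offsets manually after processing (rather than auto-committing on a timer) is what gives at-least-once consumption; the producer settings above target no-duplicate writes on retry.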
Posted 3 months ago
7.0 - 12.0 years
8 - 18 Lacs
Bengaluru, Delhi / NCR, Mumbai (All Areas)
Work from Office
7+ years’ experience (3+ in Kafka - Apache, Confluent, MSK - and RabbitMQ) with strong skills in monitoring, optimization, and incident resolution. Proficient in brokers, connectors, ZooKeeper/KRaft, Schema Registry, and middleware performance metrics.
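Consumer lag, the central metric in the monitoring work described above, is simply the log-end offset minus the committed offset per partition. A minimal sketch with made-up offsets:

```python
def consumer_lag(log_end_offsets: dict, committed_offsets: dict) -> dict:
    """Per-partition lag: how far a consumer group trails the log end."""
    return {p: log_end_offsets[p] - committed_offsets.get(p, 0)
            for p in log_end_offsets}

# Hypothetical snapshot for a 3-partition topic
end = {0: 1_500, 1: 2_000, 2: 900}
committed = {0: 1_400, 1: 2_000, 2: 650}
print(consumer_lag(end, committed))  # -> {0: 100, 1: 0, 2: 250}
```

In practice these offsets come from the brokers (e.g., via admin tooling or JMX metrics scraped into Prometheus); a lag that grows monotonically is the usual incident trigger.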
Posted Date not available
8.0 - 13.0 years
30 - 45 Lacs
Pune
Remote
Mandate skillsets: Debezium, Oracle connector (LogMiner), Kafka. Review the current Debezium deployment architecture, including Oracle connector configuration, Kafka integration, and downstream consumers. Analyze Oracle database setup for CDC compatibility (e.g., redo log configuration, supplemental logging, privileges). Evaluate connector performance, lag, and error handling mechanisms. Identify bottlenecks, misconfigurations, or anti-patterns in the current implementation. Provide a detailed report with findings, best practices, and actionable recommendations. Optionally, support implementation of recommended changes and performance tuning. Experience: Strong hands-on experience with Debezium, especially the Oracle connector (LogMiner). Deep understanding of Oracle internals relevant to CDC: redo logs, SCNs, archive log mode, supplemental logging. Proficiency with Apache Kafka and Kafka ecosystem tools. Experience with monitoring and debugging Debezium connectors in production environments. Ability to analyze logs, metrics, and connector configurations to identify root causes of issues. Strong documentation and communication skills for delivering technical assessments
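A representative Debezium Oracle connector registration, sketched as a Python dict, shows the knobs such a review would examine. Hostnames, credentials, and table names are placeholders, and the exact property set varies by Debezium version:

```python
import json

# Illustrative Debezium Oracle (LogMiner) connector payload, as it would
# be POSTed to Kafka Connect's REST API. All identifiers are placeholders.
connector = {
    "name": "oracle-cdc-example",
    "config": {
        "connector.class": "io.debezium.connector.oracle.OracleConnector",
        "database.hostname": "oracle-host",
        "database.port": "1521",
        "database.user": "dbz_user",
        "database.password": "secret",
        "database.dbname": "ORCLCDB",
        "topic.prefix": "oracle-src",
        "table.include.list": "INVENTORY.ORDERS",
        # LogMiner-based capture requires the source DB to run in
        # ARCHIVELOG mode with supplemental logging enabled.
        "log.mining.strategy": "online_catalog",
    },
}
print(json.dumps(connector, indent=2))
```

An assessment like the one described would check exactly these areas: the include lists, the mining strategy, and whether the database-side prerequisites (redo/archive logs, supplemental logging, privileges) match what the connector assumes.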
Posted Date not available
15.0 - 20.0 years
10 - 14 Lacs
Coimbatore
Work from Office
Project Role: Application Lead Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact. Must have skills: Data Engineering Good to have skills: API Management, Microsoft Azure IaaS. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years full time education Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders to drive project success. You will also engage in problem-solving activities, providing guidance and support to your team while ensuring that best practices are followed throughout the development process. Roles & Responsibilities: - 7+ years in Apache Kafka/Azure Event Hub, Kafka Streams, and distributed messaging systems - Must have lead experience of handling projects independently and leading project tasks end to end - Proficient in designing event-driven microservices and decoupled architectures using Kafka or cloud-native messaging platforms - Skilled in analyzing functional specifications and deriving technical design and implementation plans - Proficient in Java, Python, or Scala for developing and integrating event-based solutions - Expertise in stream processing with Kafka Streams, Flink, or Spark Streaming - Configure and manage Kafka clusters, topics, partitions, and replication for optimal performance and availability - Implement authentication, authorization (RBAC), and encryption (SSL/SASL) for secure Kafka communication and data protection - Hands-on with Avro/Protobuf schemas, topic partitioning, and event ordering strategies - Experience integrating Kafka with external systems via Kafka Connect or REST Proxy - Familiar with deploying and monitoring services on Kubernetes and cloud platforms like AWS or Azure - Good understanding of security, fault tolerance, and observability in event-based architectures - Knowledge of best practices and guidelines - Working knowledge of integration/API design - Hands-on with Infrastructure as Code (Terraform, Helm, Ansible) - Exposure to observability tools (Prometheus, Grafana, ELK) Additional Information: - The candidate should have a minimum of 7.5 years of experience in Data Engineering. - This position is based at our Bengaluru office. - A 15 years full time education is required.
Posted Date not available
6.0 - 8.0 years
6 - 16 Lacs
Hyderabad
Work from Office
Key Responsibilities: Develop and maintain scalable data processing pipelines using Python and PySpark. Design, implement, and manage real-time data streaming applications using Apache Kafka. Collaborate with data scientists, analysts, and other stakeholders to understand data requirements. Optimize data workflows for performance and reliability in distributed computing environments. Write efficient, reusable, and well-documented code. Monitor and troubleshoot data pipeline issues and ensure data quality. Work with cloud platforms (AWS, Azure, or GCP) and big data tools as needed. Participate in code reviews, testing, and continuous integration/deployment. Stay updated with emerging technologies and best practices in big data and streaming platforms. Required Skills & Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. Strong programming skills in Python. Hands-on experience with PySpark for large-scale data processing. Solid understanding of Apache Kafka and experience with building streaming data pipelines. Familiarity with distributed computing concepts and frameworks. Experience working with relational and NoSQL databases. Knowledge of data formats such as JSON, Avro, Parquet, etc. Experience with cloud platforms and containerization (Docker) is a plus. Familiarity with version control (Git) and CI/CD practices. Strong problem-solving skills and ability to work in a collaborative team environment. Preferred Qualifications: Experience with data orchestration tools like Apache Airflow. Knowledge of other big data technologies such as Hadoop, Hive, or Presto. Experience with monitoring and logging tools for data pipelines. Familiarity with machine learning workflows or data science tools.
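The windowed aggregations such pipelines compute (e.g., with Spark Structured Streaming) boil down to logic like this simplified single-process sketch, which assigns each event to a fixed tumbling window:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_s=60):
    """Count events per key per fixed (tumbling) time window.

    events: iterable of (epoch_seconds, key) pairs. Each event belongs
    to exactly one window starting at a multiple of window_s."""
    counts = defaultdict(int)
    for ts, key in events:
        window_start = ts - (ts % window_s)
        counts[(window_start, key)] += 1
    return dict(counts)

# Hypothetical events: two for "a" in [0, 60), one for "b", one for "a" in [60, 120)
events = [(0, "a"), (30, "a"), (45, "b"), (61, "a")]
print(tumbling_window_counts(events))
# -> {(0, 'a'): 2, (0, 'b'): 1, (60, 'a'): 1}
```

A real streaming engine adds what this sketch omits: distributed state, late-data handling via watermarks, and fault-tolerant checkpointing.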
Posted Date not available
15.0 - 20.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Software Development Engineer Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills: Confluent Event Streaming Platform Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of features, troubleshooting issues, and maintaining high standards of code quality. You will also participate in discussions to share insights and contribute to the overall improvement of the development process, ensuring that the applications meet client requirements and industry standards. Roles & Responsibilities: - Expected to perform independently and become an SME (L2 Program) - Required active participation/contribution in team discussions - Contribute to providing solutions to work-related problems - Assist in the documentation of application processes and workflows - Engage in code reviews to ensure adherence to best practices and standards. Professional & Technical Skills: - Must To Have Skills: Proficiency in Confluent Event Streaming Platform - Strong understanding of event-driven architecture and microservices - Experience with Kafka and its ecosystem, including Kafka Connect and Kafka Streams - Familiarity with cloud platforms and deployment strategies for streaming applications - Knowledge of programming languages such as Java or Scala. Additional Information: - The candidate should have a minimum of 3 years of experience in Confluent Event Streaming Platform. - This position is based at our Hyderabad office. - A 15 years full time education is required.
Posted Date not available
15.0 - 20.0 years
4 - 8 Lacs
Hyderabad
Work from Office
Project Role: Software Development Engineer Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work. Must have skills: Confluent Event Streaming Platform Good to have skills: NA. Minimum 3 year(s) of experience is required. Educational Qualification: 15 years full time education Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code for multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, while also performing maintenance and enhancements to existing applications. You will be responsible for delivering high-quality code and contributing to the overall success of the projects you are involved in, ensuring that client requirements are met effectively and efficiently. Roles & Responsibilities: - Expected to perform independently and become an SME (L2 Program) - Required active participation/contribution in team discussions - Contribute to providing solutions to work-related problems - Assist in the documentation of software specifications and design - Collaborate with cross-functional teams to ensure seamless integration of software components. Professional & Technical Skills: - Must To Have Skills: Proficiency in Confluent Event Streaming Platform - Strong understanding of event-driven architecture and microservices - Experience with Kafka and its ecosystem, including Kafka Connect and Kafka Streams - Familiarity with cloud platforms and deployment strategies - Knowledge of programming languages such as Java or Python. Additional Information: - The candidate should have a minimum of 3 years of experience in Confluent Event Streaming Platform. - This position is based at our Hyderabad office. - A 15 years full time education is required.
Posted Date not available
3.0 - 6.0 years
6 - 10 Lacs
Bengaluru
Work from Office
About the role: We are a modern agile product organization looking for an excellent DevOps engineer that can support and offload a remote product development team. Our platform handles tens of thousands of requests/second with sub-second response times across the globe. We serve ads to some of the biggest live events in the world, providing reports and forecasts based on billions of log rows. These are some of the complex challenges that make development and operational work at INVIDI interesting and rewarding. To accomplish this, we use the best frameworks and tools out there or, when they are not good enough, we write our own. Most of the code we write is Java or Kotlin on top of Dropwizard, but every problem is unique, and we always evaluate the best tools for the job. We work with technologies such as Kafka, Google Cloud (GKE, Pub/Sub), BigTable, Terraform and Jsonnet and a lot more. The position will report directly to the Technical Manager of Software Development and will be based in our Chennai, India office. Key responsibilities: You will maintain, deploy and operate backend services in Java and Kotlin that are scalable, durable and performant. You will proactively evolve deployment pipelines and artifact generation. You will have a commitment to Kubernetes and infrastructure maintenance. You will troubleshoot incoming issues from support and clients, fixing and resolving what you can You will collaborate closely with peers and product owners in your team. You will help other team members grow as engineers through code review, pairing, and mentoring. Our Requirements: You are an outstanding DevOps Engineer who loves to work with distributed high-volume systems. You care about the craft and cherish the opportunity to work with smart, supportive, and highly motivated colleagues. You are curious; you like to learn new things, mentor and share knowledge with team members. Like us, you strive to handle complexity by keeping things simple and elegant. 
As a part of the DevOps team, you will be on-call for the services and clusters that the team owns. You are on call for one week, approximately once or twice per month. While on-call, you are required to be reachable by telephone and able to act upon alarms using your laptop. Skills and qualifications: Bachelor's or Master's degree in computer science, or equivalent 4+ years of experience in the computer science industry Strong development and troubleshooting skill sets Ability to support a SaaS environment to meet service objectives Ability to collaborate effectively and work well in an Agile environment Excellent oral and written communication skills in English Ability to quickly learn new technologies and work in a fast-paced environment. Highly Preferred: Experience building service applications with Dropwizard/Spring Boot Experience with cloud services such as GCP and/or AWS. Experience with Infrastructure as Code tools such as Terraform. Experience in a Linux environment. Experience working with technologies such as SQL, Kafka, Kafka Streams Experience with Docker Experience with SCM and CI/CD tools such as Git and Bitbucket Experience with build tools such as Gradle or Maven Experience in writing Kubernetes deployment manifests and troubleshooting cluster and application-level issues. Physical Requirements: INVIDI is a conscious, clean, well-organized, and supportive office environment. Prolonged periods of sitting at a desk and working on a computer are normal. Note: Final candidates must successfully pass INVIDI's background screening requirements. Final candidates must be legally authorized to work in India. INVIDI has reopened its offices on a flexible hybrid model.
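For illustration, a Kubernetes Deployment manifest of the kind referenced above can be sketched as a Python dict, the way a templating tool might build it; the service name, image, and probe path are placeholders:

```python
import json

# Skeleton of a Kubernetes Deployment (apps/v1). All names are
# hypothetical; a real manifest would add resources, env, etc.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "example-service"},
    "spec": {
        "replicas": 3,
        "selector": {"matchLabels": {"app": "example-service"}},
        "template": {
            "metadata": {"labels": {"app": "example-service"}},
            "spec": {
                "containers": [{
                    "name": "example-service",
                    "image": "registry.example.com/example-service:1.0.0",
                    "ports": [{"containerPort": 8080}],
                    # Liveness probe so Kubernetes restarts wedged pods
                    "livenessProbe": {
                        "httpGet": {"path": "/healthcheck", "port": 8080},
                        "initialDelaySeconds": 10,
                    },
                }],
            },
        },
    },
}
print(json.dumps(deployment)[:60])
```

Serializing the dict (to JSON or YAML) yields a manifest that `kubectl apply` would accept; the selector labels must match the pod template labels, which is a common source of troubleshooting.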
Posted Date not available
3.0 - 6.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Location: IN - Bangalore. Posted today; end date: May 22, 2025 (5 days left to apply). Job requisition ID: R140300. Company Overview A.P. Moller - Maersk is an integrated container logistics company and member of the A.P. Moller Group. Connecting and simplifying trade to help our customers grow and thrive. With a dedicated team of over 95,000 employees, operating in 130 countries, we go all the way to enable global trade for a growing world. From the farm to your refrigerator, or the factory to your wardrobe, A.P. Moller - Maersk is developing solutions that meet customer needs from one end of the supply chain to the other. About the Team At Maersk, the Global Ocean Manifest team is at the heart of global trade compliance and automation. We build intelligent, high-scale systems that seamlessly integrate customs regulations across 100+ countries, ensuring smooth cross-border movement of cargo by ocean, rail, and other transport modes. Our mission is to digitally transform customs documentation, reducing friction, optimizing workflows, and automating compliance for a complex web of regulatory bodies, ports, and customs authorities. We deal with real-time data ingestion, document generation, regulatory rule engines, and multi-format data exchange while ensuring resilience and security at scale. Key Responsibilities Work with large, complex datasets and ensure efficient data processing and transformation. Collaborate with cross-functional teams to gather and understand data requirements. Ensure data quality, integrity, and security across all processes. Implement data validation, lineage, and governance strategies to ensure data accuracy and reliability. Build, optimize, and maintain ETL pipelines for structured and unstructured data, ensuring high throughput, low latency, and cost efficiency. Experience in building scalable, distributed data pipelines for processing real-time and historical data.
Contribute to the architecture and design of data systems and solutions. Write and optimize SQL queries for data extraction, transformation, and loading (ETL). Advise Product Owners to identify and manage risks, debt, issues, and opportunities for technical improvement. Provide continuous improvement suggestions in internal code frameworks, best practices, and guidelines. Contribute to engineering innovations that fuel Maersk's vision and mission. Required Skills & Qualifications 4+ years of experience in data engineering or a related field. Strong problem-solving and analytical skills. Experience in Java and the Spring framework. Experience in building data processing pipelines using Apache Flink and Spark. Experience in distributed data lake environments (Dremio, Databricks, Google BigQuery, etc.). Experience with Apache Kafka and Kafka Streams. Experience working with databases, PostgreSQL preferred, with solid experience in writing and optimizing SQL queries. Hands-on experience in cloud environments such as Azure Cloud (preferred), AWS, Google Cloud, etc. Experience with data warehousing and ETL processes. Experience in designing and integrating data APIs (REST/GraphQL) for real-time and batch processing. Knowledge of Great Expectations, Apache Atlas, or DataHub would be a plus. Knowledge of RBAC, encryption, and GDPR compliance would be a plus. Business skills Excellent communication and collaboration skills Ability to translate between technical language and business language, and communicate to different target groups Ability to understand complex design Possessing the ability to balance and find competing forces & opinions within the development team Personal profile Fact based and result oriented Ability to work independently and guide the team Excellent verbal and written communication Maersk is committed to a diverse and inclusive workplace, and we embrace different styles of thinking.
Maersk is an equal opportunities employer and welcomes applicants without regard to race, colour, gender, sex, age, religion, creed, national origin, ancestry, citizenship, marital status, sexual orientation, physical or mental disability, medical condition, pregnancy or parental leave, veteran status, gender identity, genetic information, or any other characteristic protected by applicable law. We will consider qualified applicants with criminal histories in a manner consistent with all legal requirements. We are happy to support your need for any adjustments during the application and hiring process. If you need special assistance or an accommodation to use our website, apply for a position, or to perform a job, please contact us by emailing .
Posted Date not available
5.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
JD: Kafka: Design and develop Kafka and Kafka Connect data pipelines and integration solutions. Implement and manage Kafka producers, consumers, topics, and partitions. Integrate Kafka with external systems (e.g., databases, APIs, cloud services). Optimize Kafka performance for throughput, latency, and reliability. Collaborate with DevOps, data engineering, and application teams. Monitor and troubleshoot Kafka clusters and streaming applications. Ensure data security, compliance, and governance in Kafka implementations. Maintain documentation for Kafka configurations, schemas, and processes. Mandatory Skills: Kafka Integration. Experience: 5-8 Years.
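One reason producers, topics, and partitions are managed together is that per-key ordering depends on stable key-to-partition routing. A simplified sketch (real Kafka clients use murmur2 hashing; CRC32 stands in here only to show the idea):

```python
import zlib

def pick_partition(key: str, num_partitions: int) -> int:
    """Stable key -> partition mapping. The same key always lands on the
    same partition, which is what preserves per-key message ordering.
    Note: actual Kafka clients hash with murmur2, not CRC32."""
    return zlib.crc32(key.encode("utf-8")) % num_partitions

p = pick_partition("order-42", 6)  # hypothetical key and partition count
print(p)
```

The corollary this sketch illustrates: increasing the partition count of an existing topic changes where keys land, so topics that rely on key ordering are usually sized generously up front.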
Posted Date not available
8.0 - 10.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Role Purpose The purpose of the role is to create exceptional integration architectural solution design and thought leadership and enable delivery teams to provide exceptional client engagement and satisfaction. Do 1. Define integration architecture for the new deals/ major change requests in existing deals a. Creates an enterprise-wide integration architecture that ensures that systems are seamlessly integrated while being scalable, reliable, and manageable. b. Provide solutioning for digital integration for RFPs received from clients and ensure overall design assurance i. Analyse applications, exchange points, data formats, connectivity requirements, technology environment, enterprise specifics, client requirements to set an integration solution design framework/ architecture ii. Provide technical leadership to the design, development and implementation of integration solutions through thoughtful use of modern technology iii. Define and understand current state integration solutions and identify improvements, options & tradeoffs to define target state solutions iv. Clearly articulate, document and use integration patterns, best practices and processes. v. Evaluate and recommend products and solutions to integrate with overall technology ecosystem vi. Works closely with various IT groups to transition tasks, ensure performance and manage issues through to resolution vii. Document integration architecture covering logical, deployment and data views mentioning all the artefacts in detail viii. Validate the integration solution/ prototype from technology, cost structure and customer differentiation point of view ix. Identify problem areas and perform root cause analysis of integration architectural design and solutions and provide relevant solutions to the problem x. Collaborating with sales, program/project, consulting teams to reconcile solutions to architecture xi. Tracks industry integration trends and relates these to planning current and future IT needs c. 
Provides technical and strategic input during the project planning phase in the form of technical architectural designs and recommendations d. Collaborates with all relevant parties in order to review the objectives and constraints of solutions and determine conformance with the Enterprise Architecture. e. Identifies implementation risks and potential impacts. 2. Enable Delivery Teams by providing optimal delivery solutions/ frameworks a. Build and maintain relationships with technical leaders, product owners, peer architects and other stakeholders to become a trusted advisor b. Develops and establishes relevant integration metrics (KPI/SLA) to drive results c. Identify risks related to integration and prepares a risk mitigation plan d. Ensure quality assurance of the integration architecture or design decisions and provides technical mitigation support to the delivery teams e. Leads the development and maintenance of integration framework and related artefacts f. Develops trust and builds effective working relationships through respectful, collaborative engagement across individual product teams g. Ensures integration architecture principles, patterns and standards are consistently applied to all the projects h. Ensure optimal Client Engagement i. Support pre-sales team while presenting the entire solution design and its principles to the client ii. Coordinate with the client teams to ensure all requirements are met and create an effective integration solution iii. Demonstrate thought leadership with strong technical capability in front of the client to win the confidence and act as a trusted advisor 3. Competency Building and Branding a. Ensure completion of necessary trainings and certifications on integration middleware b. Develop Proof of Concepts (POCs),case studies, demos etc. for new growth areas and solve new customer problems based on market and customer research c. 
Develop and present a point of view of Wipro on digital integration by writing white papers, blogs, etc. d. Help in attaining market recognition through analyst rankings, client testimonials and partner credits e. Be the voice of Wipro's Thought Leadership by speaking in forums (internal and external) f. Mentor developers, designers and junior architects in the project for their further career development and enhancement g. Contribute to the integration practice by conducting selection interviews etc. 4. Team Management a. Resourcing i. Anticipating new talent requirements as per the market/industry trends or client requirements ii. Support in hiring adequate and right resources for the team through conducting interviews b. Talent Management i. Ensure adequate onboarding and training for the team members to enhance capability & effectiveness c. Performance Management i. Provide inputs to project manager in setting appraisal objectives for the team, conduct timely performance reviews and provide constructive feedback to own direct reports (if present) Mandatory Skills: Kafka Integration. Experience: 8-10 Years.
Posted Date not available
10.0 - 15.0 years
35 - 40 Lacs
Pune
Work from Office
Experience Required: 10+ years overall, with 5+ years in Kafka infrastructure management and operations. Must have successfully deployed and maintained Kafka clusters in production environments, with proven experience in securing, monitoring, and scaling Kafka for enterprise-grade data streaming. Overview: We are seeking an experienced Kafka Administrator to lead the deployment, configuration, and operational management of Apache Kafka clusters supporting real-time data ingestion pipelines. The role involves ensuring secure, scalable, and highly available Kafka infrastructure for streaming flow records into centralized data platforms. Role & responsibilities Architect and deploy Apache Kafka clusters with high availability. Implement Kafka MirrorMaker for cross-site replication and disaster recovery readiness. Integrate Kafka with upstream flow record sources using IPFIX-compatible plugins. Configure Kafka topics, partitions, replication, and retention policies based on data flow requirements. Set up TLS/SSL encryption, Kerberos authentication, and access control using Apache Ranger. Monitor Kafka performance using Prometheus, Grafana, or Cloudera Manager and ensure proactive alerting. Perform capacity planning, cluster upgrades, patching, and performance tuning. Ensure audit logging, compliance with enterprise security standards, and integration with SIEM tools. Collaborate with solution architects and Kafka developers to align infrastructure with data pipeline needs. Maintain operational documentation, SOPs, and support SIT/UAT and production rollout activities. Preferred candidate profile Proven experience in Apache Kafka, Kafka Connect, Kafka Streams, and Schema Registry. Strong understanding of IPFIX, nProbe Cento, and network flow data ingestion. Hands-on experience with Apache Spark (Structured Streaming) and modern data lake or DWH platforms. Familiarity with Cloudera Data Platform, HDFS, YARN, Ranger, and Knox.
Deep knowledge of data security protocols, encryption, and governance frameworks. Excellent communication, documentation, and stakeholder management skills.
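The retention and capacity planning mentioned in this posting reduces to simple arithmetic; a hedged sketch with made-up throughput figures:

```python
def retention_ms(days: int) -> int:
    """Time-based retention expressed in the milliseconds Kafka expects."""
    return days * 24 * 60 * 60 * 1000

def retention_bytes_per_partition(mb_per_s: float, days: int,
                                  partitions: int) -> int:
    """Disk needed per partition to honor a time-based retention target,
    assuming write throughput is spread evenly across partitions."""
    total_bytes = mb_per_s * 1024 * 1024 * days * 24 * 3600
    return int(total_bytes / partitions)

print(retention_ms(7))  # -> 604800000
```

In practice this estimate is padded for replication (each replica stores a full copy) and for segment rollover, since Kafka deletes whole segments, not individual records.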
Posted Date not available