
33 Confluent Kafka Jobs

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal

On-site

At PwC, our deals team focuses on providing strategic advice and support to clients in areas such as mergers and acquisitions, divestitures, and restructuring. We help clients navigate complex transactions and maximize value in their business deals. Those in deal integration and valuation realization at PwC assist clients in successfully integrating acquisitions and maximizing the value of their investments. You will be responsible for conducting valuations, performing financial analysis, and developing strategies for post-merger integration.

Focused on relationships, you are building meaningful client connections and learning how to manage and inspire others. Navigating increasingly complex situations, you are growing your personal brand, deepening technical expertise, and building awareness of your strengths. You are expected to anticipate the needs of your teams and clients and to deliver quality. Embracing increased ambiguity, you are comfortable when the path forward isn't clear, you ask questions, and you use these moments as opportunities to grow.

Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
- Respond effectively to the diverse perspectives, needs, and feelings of others.
- Use a broad range of tools, methodologies, and techniques to generate new ideas and solve problems.
- Use critical thinking to break down complex concepts.
- Understand the broader objectives of your project or role and how your work fits into the overall strategy.
- Develop a deeper understanding of the business context and how it is changing.
- Use reflection to develop self-awareness, enhance strengths, and address development areas.
- Interpret data to inform insights and recommendations.
- Uphold and reinforce professional and technical standards (e.g. specific PwC tax and audit guidance), the Firm's code of conduct, and independence requirements.

Essential Skills and Experience:
- 3+ years of relevant experience.
- Must-have skills: JAVA, Spring/Spring Boot, REST API, Microservices; JAVA or NODE.JS, JBOSS, SQL; MS Azure (Azure EventHub, Confluent Kafka, ASP) or the AWS equivalent.
- Working knowledge: Bitbucket, GIT, Confluence, JIRA; strong experience with DevOps pipelines, CI/CD, and related tools.
- Nice to have: OAuth and event-driven messaging, Postman, O/S (Windows, Linux), JBoss scripting/CLI, prior FI experience.
- Expert knowledge of the business, the broader organization, the technical environment, standards, processes, tools, procedures, multiple programming languages, operating systems, solutions design, and other relevant technology areas from a design/support/solutions perspective.
- Responsibility for overall development activities and progress, in alignment with the development standards and guidelines set by the practice.
- Provide technical guidelines and support, and align the development practice with the bank's strategic vision and objectives.
- Provide technical input and support to Architecture & Design, delivery, and other team leads/partners as required.
- Act as the point of escalation for the development team.
- Readiness and motivation to work autonomously in a lead capacity on a diverse range of activities (e.g. design, support of technical business solutions), with the ability to coach, educate, and monitor the work of others.
- Primary subject matter expertise in multiple areas; seasoned in counseling clients and project teams on all aspects of research, analysis, design, hardware and software support, development of technical solutions, and testing.
- Involvement in coaching and advising clients, partners, and project teams; capable of serving as an internal expert resource in technical information exchange.
- Commitment to and belief in the quality of your deliverables.
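
For illustration only (not part of the posting): a minimal sketch of the JAVA / Spring Boot / REST / Kafka stack this role names, consuming events from a topic and exposing them over a REST endpoint. The topic, consumer group, URL path, and class names are hypothetical, and the payload is treated as a plain string.

```java
// Hypothetical sketch of a Spring Boot microservice using REST + Kafka.
// Assumes spring-boot-starter-web and spring-kafka on the classpath, with
// consumer connection settings supplied via application properties.
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RestController;

import java.util.Queue;
import java.util.concurrent.ConcurrentLinkedQueue;

@SpringBootApplication
public class PaymentsEventServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(PaymentsEventServiceApplication.class, args);
    }
}

@RestController
class PaymentEventsController {
    // Buffer the most recent events in memory purely for demonstration.
    private final Queue<String> recentEvents = new ConcurrentLinkedQueue<>();

    // Consume from a Kafka-compatible broker (e.g. Confluent Kafka or the
    // Kafka endpoint of Azure Event Hubs); names here are illustrative.
    @KafkaListener(topics = "payments.events", groupId = "payments-service")
    public void onEvent(String event) {
        recentEvents.add(event);
        if (recentEvents.size() > 100) {
            recentEvents.poll();
        }
    }

    // Expose the buffered events as a REST resource.
    @GetMapping("/api/payments/events")
    public Iterable<String> latest() {
        return recentEvents;
    }
}
```

In a real service the listener would typically deserialize typed events and persist them rather than buffer them in memory; the sketch only shows how the REST and messaging pieces of the stack fit together.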

Posted 1 week ago

Apply

10.0 - 12.0 years

10 - 12 Lacs

Hyderabad, Telangana, India

On-site

About Chubb

Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance, and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength, and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com.

About Chubb India

At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work for the third consecutive year, a reflection of the culture at Chubb, where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2,500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.

Position Details
- Job Title: Confluent Kafka Platform Engineer / Technical Lead
- Function/Department: Technology
- Location: Hyderabad
- Employment Type: Full Time
- Reports To: Vijai Sai Kosireddy

Key Responsibilities
- Design, implement, and manage a robust Confluent Kafka platform on Azure AKS or Azure VMs.
- Configure, optimize, and scale Kafka clusters to ensure high availability and performance.
- Provide support and guidance to development teams on integrating applications with Confluent Kafka.
- Troubleshoot and resolve issues related to Kafka integrations, ensuring minimal service disruption.
- Lead the migration of existing Kafka implementations to the latest Confluent Kafka version, ensuring a smooth transition with minimal downtime.
- Collaborate with teams to define migration strategies, testing protocols, and rollback plans.
- Oversee the operational health of Kafka clusters, ensuring uptime, performance, and reliability.
- Implement monitoring, logging, and alerting solutions to track system performance and potential issues.
- Actively monitor the environment for security vulnerabilities and implement necessary patches and updates.
- Ensure compliance with industry security standards and best practices.
- Create and maintain documentation for Kafka architecture, configurations, and best practices.
- Conduct training sessions and workshops to upskill team members on Confluent Kafka usage and management.
- Communicate regularly with stakeholders to provide updates on platform health, upcoming changes, and migration progress.
- Collaborate with cross-functional teams to align on project goals and timelines.

Skills and Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field; Master's degree preferred.
- Minimum of 10 years of experience building and supporting data streaming solutions, specifically with Confluent Kafka.
- Demonstrated experience with Azure AKS and Azure VM environments.
- Solid understanding of Core Java fundamentals and experience developing Java-based applications.
- Strong knowledge of Kafka architecture, including brokers, topics, consumers, and producers.
- Familiarity with migration processes for Kafka, including version upgrades and data migrations.
- Experience with monitoring tools (e.g., Prometheus, Grafana) for Kafka performance and health checks.
- Excellent troubleshooting skills with a passion for problem-solving.

Good to Have
- Confluent Kafka certification or other relevant cloud certifications (e.g., Azure).
- Experience with microservices architecture, RESTful APIs, and containerization.
- Knowledge of security best practices related to data streaming and cloud environments.

Why Chubb

Join Chubb to be part of a leading global insurance company!
- Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence.
- A great place to work: Chubb India has been recognized as a Great Place to Work for the years 2023-2024, 2024-2025, and 2025-2026.
- Laser focus on excellence: At Chubb, we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results.
- Start-up culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter.
- Growth and success: As we continue to grow, we are steadfast in our commitment to providing our employees with the best work experience, enabling them to advance their careers in a conducive environment.

Employee Benefits

Our company offers a comprehensive benefits package designed to support our employees' health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision-related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include:
- Savings and investment plans: We provide specialized benefits such as Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), retiral benefits, and car lease, which help employees optimally plan their finances.
- Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling, such as education reimbursement programs, certification programs, and access to global learning programs.
- Health and welfare benefits: We care about our employees' well-being in and out of work, with benefits such as a hybrid work environment, an Employee Assistance Program (EAP), yearly free health campaigns, and comprehensive insurance benefits.

Application Process

Our recruitment process is designed to be transparent and inclusive.
- Step 1: Submit your application via the Chubb Careers Portal.
- Step 2: Engage with our recruitment team for an initial discussion.
- Step 3: Participate in HackerRank assessments and technical/functional interviews (if applicable).
- Step 4: Final interaction with Chubb leadership.

Join Us

With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India's journey.

Apply Now: Chubb External Careers
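
To illustrate the cluster-configuration work this role describes (not code from the posting): a hedged sketch that provisions a topic with high-availability settings using Kafka's Java AdminClient. The bootstrap address, topic name, partition count, and config values are assumptions chosen for the example.

```java
// Hypothetical sketch: provisioning a Kafka topic with replication and
// min-ISR settings, the kind of availability tuning described above.
import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;

import java.util.Map;
import java.util.Properties;
import java.util.Set;

public class TopicProvisioner {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka.example.internal:9092");

        try (AdminClient admin = AdminClient.create(props)) {
            // 12 partitions, replication factor 3: tolerates a broker failure.
            NewTopic topic = new NewTopic("claims.events", 12, (short) 3)
                    .configs(Map.of(
                            // Require 2 in-sync replicas so acks=all writes
                            // survive the loss of a single broker.
                            "min.insync.replicas", "2",
                            "retention.ms", String.valueOf(7L * 24 * 60 * 60 * 1000)));
            admin.createTopics(Set.of(topic)).all().get();
        }
    }
}
```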

Posted 1 week ago

Apply

4.0 - 8.0 years

4 - 8 Lacs

Hyderabad, Telangana, India

On-site

About Chubb

Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance, and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength, and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com.

About Chubb India

At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work for the third consecutive year, a reflection of the culture at Chubb, where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2,500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.

Position Details
- Job Title: Senior Software Engineer
- Function/Department: Technology
- Location: Hyderabad - Work From Office
- Employment Type: Full-time
- Reports To: Kuldeep Kumar

Role Overview

As a Full Stack engineer, you will work across both front-end and back-end programming languages, contributing to the development of frameworks and integrating third-party libraries.

Key Responsibilities
- Full stack development: Work on both front-end and back-end technologies to build robust and scalable applications.
- Data streaming solutions: Build and support data streaming solutions, particularly using Confluent Kafka.
- Framework and API development: Develop frameworks and integrate third-party libraries using technologies such as .Net Framework, C#, Asp.Net, and Web APIs.
- Cloud application design: Design, build, test, and maintain cloud applications and services on Microsoft Azure, with proficiency in the various Azure SDKs, data storage, and app authentication.
- UI/UX and databases: Work with databases such as MSSQL and MySQL, perform performance tuning of relational databases, and contribute to UI/UX design.
- DevOps and CI/CD: Implement and manage CI/CD pipelines using tools such as BitBucket, Bamboo, PowerShell, and SonarQube.

Skills and Qualifications
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field; Master's degree preferred.
- Experience: 4-8 years of experience building and supporting data streaming solutions, specifically with Confluent Kafka.
- Technical skills: In-depth knowledge of .Net Framework, C#, Asp.Net, and Web APIs; hands-on experience with Angular 13+ and JavaScript frameworks; familiarity with Agile methodologies and Scrum; expertise in designing, building, testing, and maintaining cloud applications on Microsoft Azure; familiarity with MSSQL, MySQL, IIS, UI/UX design, and performance tuning of relational databases; proficiency in DevOps practices and implementing CI/CD pipelines using BitBucket, Bamboo, PowerShell, and SonarQube.

Why Chubb
- Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence.
- A great place to work: Chubb India has been recognized as a Great Place to Work for the years 2023-2024, 2024-2025, and 2025-2026.
- Focus on excellence: Chubb takes pride in a culture of excellence, constantly seeking innovative ways to excel at work and deliver outstanding results.
- Start-up culture: Embracing the spirit of a start-up, Chubb focuses on speed and agility, enabling quick responses to market needs.
- Growth and success: Chubb remains committed to providing employees with the best work experience, supporting career advancement and continuous learning.

Employee Benefits

Chubb offers a comprehensive benefits package that supports employees' health, well-being, and professional growth. The benefits include:
- Savings and investment plans: Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), retiral benefits, and car lease.
- Upskilling and career growth: Customized programs supporting education reimbursement, certification programs, and access to global learning programs.
- Health and welfare benefits: A hybrid work environment, an Employee Assistance Program (EAP), yearly free health campaigns, and comprehensive insurance benefits.

Application Process
- Step 1: Submit your application via the Chubb Careers Portal.
- Step 2: Engage with the recruitment team for an initial discussion.
- Step 3: Participate in HackerRank assessments or technical/functional interviews.
- Step 4: Final interaction with Chubb leadership.

Join Us

With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India's journey.

Apply Now: Chubb External Careers

Posted 1 week ago

Apply

6.0 - 8.0 years

6 - 8 Lacs

Hyderabad, Telangana, India

On-site

About Chubb

Chubb is a world leader in insurance. With operations in 54 countries and territories, Chubb provides commercial and personal property and casualty insurance, personal accident and supplemental health insurance, reinsurance, and life insurance to a diverse group of clients. The company is defined by its extensive product and service offerings, broad distribution capabilities, exceptional financial strength, and local operations globally. Parent company Chubb Limited is listed on the New York Stock Exchange (NYSE: CB) and is a component of the S&P 500 index. Chubb employs approximately 40,000 people worldwide. Additional information can be found at: www.chubb.com.

About Chubb India

At Chubb India, we are on an exciting journey of digital transformation driven by a commitment to engineering excellence and analytics. We are proud to share that we have been officially certified as a Great Place to Work for the third consecutive year, a reflection of the culture at Chubb, where we believe in fostering an environment where everyone can thrive, innovate, and grow. With a team of over 2,500 talented professionals, we encourage a start-up mindset that promotes collaboration, diverse perspectives, and a solution-driven attitude. We are dedicated to building expertise in engineering, analytics, and automation, empowering our teams to excel in a dynamic digital landscape. We offer an environment where you will be part of an organization dedicated to solving real-world challenges in the insurance industry. Together, we will work to shape the future through innovation and continuous learning.

Position Details
- Job Title: Senior Software Engineer
- Function/Department: Technology
- Location: Hyderabad - Work From Office
- Employment Type: Full-time
- Reports To: Kuldeep Kumar

Role Overview

As a Full Stack engineer, you will engage in both front-end and back-end programming languages, contributing to the development of frameworks and integrating third-party libraries.

Skills and Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field; Master's degree preferred.
- Minimum of 6-8 years of experience building and supporting data streaming solutions, specifically with Confluent Kafka.
- In-depth knowledge of .Net Framework, C#, Asp.Net, and Web APIs.
- Hands-on experience with Angular 13+ and JavaScript frameworks.
- Familiarity with Agile methodologies and Scrum.
- Expertise in designing, building, testing, and maintaining cloud applications and services on Microsoft Azure, including the ability to program in a language supported by Azure and proficiency with Azure SDKs, data storage options, data connections, APIs, app authentication and authorization, compute and container deployment, debugging, performance tuning, and monitoring.
- Familiarity with databases (e.g. MSSQL and MySQL), web servers (IIS), UI/UX design, and performance tuning of relational databases.
- Skilled in DevOps, with knowledge of implementing CI/CD pipelines using BitBucket, Bamboo, PowerShell, and SonarQube.

Why Chubb

Join Chubb to be part of a leading global insurance company!
- Industry leader: Chubb is a world leader in the insurance industry, powered by underwriting and engineering excellence.
- A great place to work: Chubb India has been recognized as a Great Place to Work for the years 2023-2024, 2024-2025, and 2025-2026.
- Laser focus on excellence: At Chubb, we pride ourselves on our culture of greatness, where excellence is a mindset and a way of being. We constantly seek new and innovative ways to excel at work and deliver outstanding results.
- Start-up culture: Embracing the spirit of a start-up, our focus on speed and agility enables us to respond swiftly to market requirements, while a culture of ownership empowers employees to drive results that matter.
- Growth and success: As we continue to grow, we are steadfast in our commitment to providing our employees with the best work experience, enabling them to advance their careers in a conducive environment.

Employee Benefits

Our company offers a comprehensive benefits package designed to support our employees' health, well-being, and professional growth. Employees enjoy flexible work options, generous paid time off, and robust health coverage, including treatment for dental and vision-related requirements. We invest in the future of our employees through continuous learning opportunities and career advancement programs, while fostering a supportive and inclusive work environment. Our benefits include:
- Savings and investment plans: We provide specialized benefits such as Corporate NPS (National Pension Scheme), Employee Stock Purchase Plan (ESPP), Long-Term Incentive Plan (LTIP), retiral benefits, and car lease, which help employees optimally plan their finances.
- Upskilling and career growth opportunities: With a focus on continuous learning, we offer customized programs that support upskilling, such as education reimbursement programs, certification programs, and access to global learning programs.
- Health and welfare benefits: We care about our employees' well-being in and out of work, with benefits such as a hybrid work environment, an Employee Assistance Program (EAP), yearly free health campaigns, and comprehensive insurance benefits.

Application Process

Our recruitment process is designed to be transparent and inclusive.
- Step 1: Submit your application via the Chubb Careers Portal.
- Step 2: Engage with our recruitment team for an initial discussion.
- Step 3: Participate in HackerRank assessments and technical/functional interviews (if applicable).
- Step 4: Final interaction with Chubb leadership.

Join Us

With you, Chubb is better. Whether you are solving challenges on a global stage or creating innovative solutions for local markets, your contributions will help shape the future. If you value integrity, innovation, and inclusion, and are ready to make a difference, we invite you to be part of Chubb India's journey.

Apply Now: Chubb External Careers

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Hyderabad, Telangana

On-site

You will be responsible for designing, implementing, and maintaining scalable event-streaming architectures that support real-time data.

Responsibilities:
- Design, build, and manage Kafka clusters using Confluent Platform and Kafka cloud services (AWS MSK, Confluent Cloud).
- Develop and maintain Kafka topics, schemas (Avro/Protobuf), and connectors for data ingestion and processing pipelines.
- Monitor and ensure the reliability, scalability, and security of the Kafka infrastructure.
- Collaborate with application and data engineering teams to integrate Kafka with other AWS-based services (e.g., Lambda, S3, EC2, Redshift).
- Implement and manage Kafka Connect, Kafka Streams, and ksqlDB where applicable.
- Optimize Kafka performance, troubleshoot issues, and manage incidents.

Requirements:
- At least 3-5 years of experience working with Apache Kafka and Confluent Kafka.
- Strong knowledge of Kafka internals such as brokers, ZooKeeper, partitions, replication, and offsets.
- Experience with Kafka Connect, Schema Registry, REST Proxy, and Kafka security.
- Hands-on experience with AWS services such as EC2, IAM, CloudWatch, S3, Lambda, VPC, and load balancers.
- Proficiency in scripting and automation using tools such as Terraform or Ansible is preferred.
- Familiarity with DevOps practices and tools, including CI/CD pipelines and monitoring tools such as Prometheus/Grafana, Splunk, and Datadog, is beneficial.
- Experience with containerization using Docker and Kubernetes is an advantage.
- A Confluent Certified Developer or Administrator certification, AWS certification, experience with CI/CD tools such as AWS CodePipeline or Harness, and knowledge of containers (Docker, Kubernetes) will be considered additional assets.
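
As a purely illustrative sketch of the Avro and Schema Registry work mentioned above (not from the posting): producing a record whose schema is registered and validated by Confluent Schema Registry. The broker and registry endpoints, topic, and schema are assumptions, and the example requires Confluent's kafka-avro-serializer dependency.

```java
// Hypothetical sketch: Kafka producer using Avro + Confluent Schema Registry.
import org.apache.avro.Schema;
import org.apache.avro.generic.GenericData;
import org.apache.avro.generic.GenericRecord;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;

import java.util.Properties;

public class AvroEventProducer {
    // Illustrative Avro schema; real schemas would be versioned in the registry.
    private static final String CLICK_SCHEMA =
            "{\"type\":\"record\",\"name\":\"ClickEvent\",\"fields\":["
            + "{\"name\":\"userId\",\"type\":\"string\"},"
            + "{\"name\":\"url\",\"type\":\"string\"}]}";

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "broker.example.internal:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                "io.confluent.kafka.serializers.KafkaAvroSerializer");
        // Schema Registry validates and versions the schema for the topic.
        props.put("schema.registry.url", "https://schema-registry.example.internal:8081");

        Schema schema = new Schema.Parser().parse(CLICK_SCHEMA);
        GenericRecord event = new GenericData.Record(schema);
        event.put("userId", "u-123");
        event.put("url", "/checkout");

        try (KafkaProducer<String, GenericRecord> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("clickstream.events", "u-123", event));
        }
    }
}
```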

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group

Job Description:

bp is transforming, and at our Digital Hub in Pune we are growing the digital expertise and solutions needed to advance the global energy transition. Digital Engineering is a team of technology and software specialists providing innovative, custom-built or customized software and technical platforms to bp colleagues and external users.

Let Me Tell You About The Role

As an Integration Senior Enterprise Tech Engineer, you are a senior member of a team creating application integration solutions for bp colleagues and external users. Your team's mission is to be the digital provider of choice to your area of bp, delivering innovation at speed where it's wanted and day-in-day-out reliability where it's needed. You will operate in a dynamic and commercially focused environment, with the resources of one of the world's largest digital organisations and leading digital and IT vendors working with you. You will be part of growing and strengthening our technical talent base: experts coming together to solve bp's and the world's problems.

What You Will Deliver
- Lead enterprise technology architecture, security frameworks, and platform engineering across enterprise landscapes.
- Oversee the end-to-end security of enterprise platforms, ensuring compliance with industry standards and regulatory requirements.
- Drive enterprise operations excellence, optimising system performance, availability, and scalability.
- Provide leadership in enterprise modernization and transformation, ensuring seamless integration with enterprise IT.
- Establish governance, security standards, and risk management strategies aligned with global security policies.
- Design and implement automated security monitoring, vulnerability assessments, and identity management solutions for enterprise environments.
- Drive CI/CD, DevOps, and Infrastructure-as-Code adoption for enterprise deployments.
- Ensure disaster recovery, high availability, and resilience planning for enterprise platforms.
- Engage with business leaders, technology teams, and external vendors to ensure enterprise solutions align with enterprise goals.
- Mentor and lead enterprise security and operations teams, fostering a culture of excellence, innovation, and continuous improvement.
- Provide executive-level insights and technical recommendations on enterprise investments, cybersecurity threats, and operational risks.

What You Will Need To Be Successful (Experience and Qualifications)

Technical skills we need from you:
- Bachelor's (or higher) degree, ideally in Computer Science, MIS/IT, Mathematics, or a hard science.
- Years of experience: 8-12 years, with a minimum of 5-7 years of relevant experience.

Essential skills:
- SME in the enterprise integration domain; able to design highly scalable integrations involving APIs, messaging, files, databases, and cloud services.
- Experience with integration tools such as TIBCO/MuleSoft, Apache Camel/Spring Integration, Confluent Kafka, etc.
- Expert in Enterprise Integration Patterns (EIPs) and iBlocks to build secure integrations.
- Willingness and ability to learn, becoming skilled in at least one more cloud-native (AWS and Azure) integration solution on top of your existing skillset.
- Deep understanding of the interface development lifecycle, including design, security, design patterns for extensible and reliable code, automated unit and functional testing, CI/CD, and telemetry.
- Demonstrated understanding of modern technologies such as cloud native, containers, and serverless.
- Emerging technology monitoring.
- Application support.
- Strong inclusive leadership and people management.
- Stakeholder management.
- Embracing a culture of continuous improvement.

Skills that set you apart:
- Agile methodologies
- ServiceNow
- Risk management
- Systems development management
- Monitoring and telemetry tools
- User experience analysis
- Cybersecurity and compliance

Key behaviors:
- Empathetic: Cares about our people, our community, and our planet.
- Curious: Seeks to explore and excel.
- Creative: Imagines the extraordinary.
- Inclusive: Brings out the best in each other.

About bp

Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner!

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.
Skills: Automation, Integration

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status, or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position, and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
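
For illustration of the Enterprise Integration Patterns this role references (a hedged sketch, not bp code): a content-based router in Apache Camel's Java DSL, one of the integration tools listed above. The endpoint URIs, header name, and route logic are assumptions, and the Kafka, JMS, and file components would need their Camel dependencies configured.

```java
// Hypothetical sketch: the content-based router EIP in Apache Camel.
import org.apache.camel.builder.RouteBuilder;
import org.apache.camel.main.Main;

public class OrderRoutingApp {
    public static void main(String[] args) throws Exception {
        Main main = new Main();
        main.configure().addRoutesBuilder(new OrderRoutes());
        main.run(args);
    }

    static class OrderRoutes extends RouteBuilder {
        @Override
        public void configure() {
            // Content-based router: inspect a message header and fan
            // messages out to the matching downstream system.
            from("kafka:orders?brokers=broker.example.internal:9092")
                .choice()
                    .when(header("orderType").isEqualTo("PRIORITY"))
                        .to("jms:queue:priority-orders")
                    .otherwise()
                        .to("file:data/orders");
        }
    }
}
```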

Posted 1 week ago

Apply

5.0 - 9.0 years

5 - 10 Lacs

Hyderabad, Telangana, India

On-site

We are seeking a highly skilled Kafka Integration Specialist with extensive experience in designing, developing, and integrating Apache Kafka solutions. You will be responsible for creating and maintaining Kafka-based data pipelines, managing clusters, and ensuring high availability and performance. This role requires a strong understanding of distributed systems and data streaming concepts to deliver robust, real-time integration solutions.

Roles & Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Skills Required:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESB.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog is a plus.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation is a plus.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON is a plus.

Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.
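
By way of illustration only (not from the posting): a minimal Kafka Streams pipeline of the kind this role describes, reading a topic, transforming records, and writing downstream. Topic names and the transformation are placeholder assumptions.

```java
// Hypothetical sketch: a minimal Kafka Streams data pipeline.
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

import java.util.Properties;

public class PaymentsEnrichmentPipeline {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payments-enrichment");
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "broker.example.internal:9092");
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> raw = builder.stream("payments.raw");
        raw.filter((key, value) -> value != null && !value.isBlank())
           .mapValues(String::toUpperCase)   // stand-in for real enrichment logic
           .to("payments.enriched");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```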

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Ahmedabad, Gujarat, India

On-site

We are seeking a highly skilled Kafka Integration Specialist with extensive experience in designing, developing, and integrating Apache Kafka solutions. You will be responsible for creating and maintaining Kafka-based data pipelines, managing clusters, and ensuring high availability and performance. This role requires a strong understanding of distributed systems and data streaming concepts to deliver robust, real-time integration solutions.

Roles & Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Skills Required:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESB.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog is a plus.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation is a plus.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON is a plus.

Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Gurgaon, Haryana, India

On-site

We are seeking a highly skilled Kafka Integration Specialist with extensive experience in designing, developing, and integrating Apache Kafka solutions. You will be responsible for creating and maintaining Kafka-based data pipelines, managing clusters, and ensuring high availability and performance. This role requires a strong understanding of distributed systems and data streaming concepts to deliver robust, real-time integration solutions.

Roles & Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Skills Required:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESB.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog is a plus.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation is a plus.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON is a plus.

Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Chennai, Tamil Nadu, India

On-site

We are seeking a highly skilled Kafka Integration Specialist with extensive experience in designing, developing, and integrating Apache Kafka solutions. You will be responsible for creating and maintaining Kafka-based data pipelines, managing clusters, and ensuring high availability and performance. This role requires a strong understanding of distributed systems and data streaming concepts to deliver robust, real-time integration solutions.

Roles & Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Skills Required:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESB.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog is a plus.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation is a plus.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON is a plus.

Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Pune, Maharashtra, India

On-site

We are seeking a highly skilled Kafka Integration Specialist with extensive experience in designing, developing, and integrating Apache Kafka solutions. You will be responsible for creating and maintaining Kafka-based data pipelines, managing clusters, and ensuring high availability and performance. This role requires a strong understanding of distributed systems and data streaming concepts to deliver robust, real-time integration solutions.

Roles & Responsibilities:
- Design, implement, and maintain Kafka-based data pipelines.
- Develop integration solutions using Kafka Connect, Kafka Streams, and other related technologies.
- Manage Kafka clusters, ensuring high availability, scalability, and performance.
- Collaborate with cross-functional teams to understand integration requirements and deliver robust solutions.
- Implement best practices for data streaming, including message serialization, partitioning, and replication.
- Monitor and troubleshoot Kafka performance, latency, and security issues.
- Ensure data integrity and implement failover strategies for critical data pipelines.

Skills Required:
- Strong experience with Apache Kafka (Kafka Streams, Kafka Connect).
- Proficiency in programming languages such as Java, Python, or Scala.
- Experience with distributed systems and data streaming concepts.
- Familiarity with ZooKeeper, Confluent Kafka, and Kafka broker configurations.
- Expertise in creating and managing topics, partitions, and consumer groups.
- Hands-on experience with integration tools such as REST APIs, MQ, or ESB.
- Knowledge of cloud platforms such as AWS, Azure, or GCP for Kafka deployment.
- Experience with monitoring tools such as Prometheus, Grafana, or Datadog is a plus.
- Exposure to DevOps practices, CI/CD pipelines, and infrastructure automation is a plus.
- Knowledge of data serialization formats such as Avro, Protobuf, or JSON is a plus.

Qualification: Bachelor's degree in Computer Science, Information Technology, or a related field.

Posted 2 weeks ago

Apply

7.0 - 11.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Back-end Software Development Engineering Senior Engineer at TekWissen, located in Chennai, you will play a crucial role in our engineering team. Your responsibilities will include designing, developing, and maintaining robust, scalable, and high-performance backend services. Leveraging your expertise in technologies such as Spring Boot, microservices architecture, and GraphQL, you will be instrumental in building the core of our enterprise applications. Seamless integration with frontend systems and external services will be a key aspect of your work. A strong understanding of cloud-native development principles is essential for this role, along with hands-on experience in deploying, managing, and optimizing cloud infrastructure on Google Cloud Platform (GCP). Working in a collaborative, cross-functional environment, you will contribute to technical strategy, best practices, and the overall success of our product suite.

The ideal candidate should possess skills in Dynatrace, Microservices, Java Full Stack, GraphQL, Spring, Confluent Kafka, and GCP. Experience with Agile software development and CI/CD is preferred. The desired candidate should have a Bachelor's degree and 7-10 years of experience as a senior engineer, with a strong background in Spring Java microservices, GCP, and GraphQL.

At TekWissen Group, we are committed to providing equal opportunities and supporting workforce diversity. Join us in our mission to make the world a better place through innovative technology solutions.
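
As a hedged illustration of the Spring + GraphQL combination this role names (not TekWissen code): a minimal query resolver using Spring for GraphQL annotations. The schema field, types, and return value are hypothetical, and the project would need spring-boot-starter-graphql plus a matching schema file.

```java
// Hypothetical sketch: a Spring for GraphQL query resolver in a microservice.
import org.springframework.graphql.data.method.annotation.Argument;
import org.springframework.graphql.data.method.annotation.QueryMapping;
import org.springframework.stereotype.Controller;

@Controller
class VehicleOrderController {

    // Resolves the (assumed) GraphQL query: orderStatus(orderId: ID!): String
    @QueryMapping
    public String orderStatus(@Argument String orderId) {
        // In a real service this would call a downstream microservice or database.
        return "IN_TRANSIT";
    }
}
```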

Posted 2 weeks ago

Apply

12.0 - 15.0 years

8 - 10 Lacs

Hyderabad, Telangana, India

Remote

Basic Qualifications:
- Master's degree in computer science & engineering preferred, with 12-15 years of software development experience; OR Bachelor's degree in computer science & engineering preferred, with 11-15 years of software development experience.
- Minimum of 7 years of professional experience in technology, including at least 3 years in a data architecture and AI solution architect role.
- Strong expertise in cloud platforms, preferably Azure and GCP, and their associated data and AI services.
- Proven experience architecting and deploying scalable data solutions, including data lakes, warehouses, and streaming platforms.
- Working knowledge of tools/technologies such as Azure Data Factory, Confluent Kafka, Spark, Databricks, BigQuery, and Vertex AI.
- Deep understanding of AI/ML frameworks and tools such as TensorFlow, PyTorch, Spark ML, or Azure ML.

Preferred Qualifications:
- Programming languages: proficiency in multiple languages (e.g., Python, Java, Scala) is crucial.
- Experience with API integration, serverless, and microservices architecture.
- Proficiency with Azure Data Factory, Confluent Kafka, Spark, Databricks, BigQuery, and Vertex AI.
- Proficiency with AI/ML frameworks and tools such as TensorFlow, PyTorch, Spark ML, or Azure ML.
- Solid understanding of data governance, security, privacy, and compliance standards.
- Exceptional communication, presentation, and stakeholder management skills.
- Experience working in agile project environments.

Good-to-Have Skills:
- Willingness to work on AI applications.
- Experience with popular large language models.
- Experience with the LangChain or LlamaIndex frameworks for language models.
- Experience with prompt engineering and model fine-tuning.
- Knowledge of NLP techniques for text analysis and sentiment analysis.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, remote teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

This is a data engineer position in which you will be responsible for designing, developing, implementing, and maintaining data flow channels and data processing systems that support the collection, storage, batch and real-time processing, and analysis of information in a scalable, repeatable, and secure manner, in coordination with the Data & Analytics team. Your main objective will be to define optimal solutions for data collection, processing, and warehousing, particularly within the banking & finance domain. You must have expertise in Spark Java development for big data processing, Python, and Apache Spark. You will be involved in designing, coding, and testing data systems and integrating them into the internal infrastructure.

Your responsibilities will include:
- Ensuring high-quality software development with complete documentation.
- Developing and optimizing scalable Spark Java-based data pipelines.
- Designing and implementing distributed computing solutions for risk modeling, pricing, and regulatory compliance.
- Ensuring efficient data storage and retrieval using big data technologies.
- Implementing best practices for Spark performance tuning.
- Maintaining high code quality through testing, CI/CD pipelines, and version control.
- Working on batch processing frameworks for market risk analytics.
- Promoting unit/functional testing and code inspection processes.
- Collaborating with business stakeholders, business analysts, and other data scientists to understand and interpret complex datasets.

Qualifications:
- 5-8 years of experience working in data ecosystems.
- 4-5 years of hands-on experience with Hadoop, Scala, Java, Spark, Hive, Kafka, Impala, Unix scripting, and other big data frameworks.
- 3+ years of experience with relational SQL and NoSQL databases such as Oracle, MongoDB, and HBase.
- Strong proficiency in Python and Spark Java, with knowledge of core Spark concepts (RDDs, DataFrames, Spark Streaming, etc.), Scala, and SQL.
- Data integration, migration, and large-scale ETL experience.
- Data modeling experience.
- Experience building and optimizing big data pipelines, architectures, and datasets.
- Strong analytic skills and experience working with unstructured datasets.
- Experience with technologies such as Confluent Kafka, Red Hat jBPM, CI/CD build pipelines, Git, BitBucket, Jira, external cloud platforms, container technologies, and supporting frameworks.
- Highly effective interpersonal and communication skills.
- Experience with the software development life cycle.

Education:
- Bachelor's/University degree or equivalent experience in computer science, engineering, or a similar domain.

This is a full-time position in the Data Architecture job family group within the Technology sector.
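
To illustrate the Spark Java batch work described above (a sketch under assumed inputs, not code from the posting): reading trade-level data, aggregating it, and writing results back out. The paths, column names, and aggregation are placeholder assumptions.

```java
// Hypothetical sketch: a Spark Java batch job for risk-style aggregation.
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

import static org.apache.spark.sql.functions.col;
import static org.apache.spark.sql.functions.sum;

public class MarketRiskBatchJob {
    public static void main(String[] args) {
        SparkSession spark = SparkSession.builder()
                .appName("market-risk-batch")
                .getOrCreate();

        // Read trade-level exposures, then aggregate exposure per desk.
        Dataset<Row> trades = spark.read().parquet("hdfs:///data/trades/2024-06-30");
        Dataset<Row> exposureByDesk = trades
                .filter(col("notional").gt(0))
                .groupBy(col("desk"))
                .agg(sum(col("notional")).alias("total_notional"));

        exposureByDesk.write().mode("overwrite")
                .parquet("hdfs:///data/risk/exposure_by_desk");
        spark.stop();
    }
}
```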

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Punjab

On-site

As an Integration DevOps Engineer in Sydney, you will need a skillset that includes expertise in the Red Hat OpenShift Kubernetes container platform, Infrastructure as Code using Terraform, and a range of DevOps concepts, tools, and languages. With at least 3+ years of experience, you will be responsible for developing, configuring, and maintaining OpenShift in lower environments and production settings.

Your role will involve working with automation and CI/CD tools such as Ansible, Jenkins, Tekton, or Bamboo pipelines in conjunction with Kubernetes containers. A strong understanding of security policies, including BasicAuth, OAuth, and WSSE tokens, along with experience configuring security policies for APIs using HashiCorp Vault, will be essential for this position.

In addition, you will be expected to create environments, namespaces, virtual hosts, API proxies, and caches, and to work with Apigee X, Istio service mesh, and Confluent Kafka setup and deployments. Your experience with cloud architectures such as AWS, Azure, private cloud, on-prem, and multi-cloud will be valuable.

Furthermore, you will play a key role in developing, managing, and supporting automation tools, processes, and runbooks. Your contribution to delivering services or features via an agile DevOps approach, ensuring information security for the cloud, and promoting good practices in coding and automation processes will be crucial. Effective communication skills are essential, as you will be required to provide thought leadership on cloud platforms, automation, coding patterns, and components. A self-rating matrix from the candidate on skills such as OpenShift, Kubernetes, and Apigee X is mandatory for consideration.

If you have any questions or need further clarification, please feel free to reach out.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

Hyderabad, Telangana

On-site

You are a Senior Cloud Application Developer (AWS to Azure Migration) with 8+ years of experience. Your role involves hands-on development of applications for both the AWS and Azure platforms, and you should have a strong understanding of Azure services for application development and deployment, including Azure IaaS and PaaS services.

Your responsibilities and required experience include:
- Proficiency in AWS-to-Azure cloud migration, including service mapping and SDK/API conversion.
- Code refactoring and application remediation for cloud compatibility.
- A minimum of 5 years of experience in application development using Java, Python, Node.js, or .NET.
- A solid understanding of CI/CD pipelines, deployment automation, and Azure DevOps.
- Experience with containerized applications, AKS, Kubernetes, and Helm charts.
- Application troubleshooting, support, and testing in cloud environments.

Experience with the following tech stack is highly preferred:
- Spring Boot REST API, NodeJS REST API
- Apigee config, Spring Server Config
- Confluent Kafka, AWS S3 Sync Connector
- Azure Blob Storage, Azure Files, Azure Functions
- Aurora PostgreSQL to Azure DB migration
- EKS to AKS migration, S3 to Azure Blob Storage
- AWS to Azure SDK conversion

Location options for this role include Hyderabad, Bangalore, and Pune. A notice period of 10-15 days is expected.
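
As a hedged sketch of the SDK conversion work this role involves (not from the posting): the same "read an object, write it to the target store" operation expressed against the AWS SDK v2 and the azure-storage-blob client side by side. The bucket, container, object key, and credential wiring are illustrative assumptions.

```java
// Hypothetical sketch: mapping an S3 object read to an Azure Blob write.
import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobServiceClientBuilder;
import software.amazon.awssdk.core.ResponseBytes;
import software.amazon.awssdk.services.s3.S3Client;
import software.amazon.awssdk.services.s3.model.GetObjectRequest;
import software.amazon.awssdk.services.s3.model.GetObjectResponse;

import java.io.ByteArrayInputStream;

public class S3ToBlobCopy {
    public static void main(String[] args) {
        // AWS side: read the object bytes from S3.
        try (S3Client s3 = S3Client.create()) {
            ResponseBytes<GetObjectResponse> object = s3.getObjectAsBytes(
                    GetObjectRequest.builder()
                            .bucket("legacy-app-bucket")
                            .key("reports/2024-06.csv")
                            .build());
            byte[] payload = object.asByteArray();

            // Azure side: write the same payload to Blob Storage.
            BlobClient blob = new BlobServiceClientBuilder()
                    .connectionString(System.getenv("AZURE_STORAGE_CONNECTION_STRING"))
                    .buildClient()
                    .getBlobContainerClient("migrated-app-container")
                    .getBlobClient("reports/2024-06.csv");
            blob.upload(new ByteArrayInputStream(payload), payload.length, true);
        }
    }
}
```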

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

Join us as a Senior ETL Test Engineer at Barclays, responsible for supporting the successful delivery of location strategy projects to plan, budget, and agreed quality and governance standards. You'll spearhead the evolution of our digital landscape, driving innovation and excellence, and harness cutting-edge technology to revolutionize our digital offerings, ensuring unparalleled customer experiences.

To be successful as a Senior ETL Test Engineer, you should have experience with:
- Expertise in an ETL tool, e.g. Informatica.
- Developing and executing ETL test cases, test scripts, and data validation scenarios.
- Validating data extraction, transformation, and loading, along with data completeness.
- Test automation: developing and implementing ETL test automation scripts using Python, SQL scripting, QuerySurge, Unix, and shell scripts.
- Automating comparison, schema validation, and regression testing.
- Integrating test automation with CI/CD pipelines (Jenkins, GitLab, DevOps).
- Optimizing and maintaining automated test suites for scalability and performance.
- Understanding requirements and user stories and relating them to the design document.
- Working closely with business analysts and the development team to define the test scope.
- Maintaining test plans, test data, and automation in version control.
- Documenting best practices, lessons learned, and continuous improvement strategies.
- Identifying and logging defects via JIRA and defect management.
- Working with business analysts and developers to troubleshoot data issues and pipeline failures.
- Providing detailed reports on test execution, coverage, and defect analysis.
- Understanding agile development/test methodology and practicing it in day-to-day work.
- Unearthing gaps between business requirements and user stories.
- Ensuring the ETL process adheres to data privacy and compliance requirements.
- Validating data masking, encryption, and access control.
- Audit and data reconciliation testing to track data modification.

Some other highly valued skills may include:
- Prior coding experience with an engineering background.
- Detailed understanding of cloud technology, viz. AWS and Confluent Kafka.
- Hands-on experience in BDD/TDD.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based out of Pune.

Purpose of the role

To design, develop, and execute testing strategies to validate functionality, performance, and user experience, while collaborating with cross-functional teams to identify and resolve defects, and continuously improving testing processes and methodologies to ensure software quality and reliability.

Accountabilities
- Development and implementation of comprehensive test plans and strategies to validate software functionality and ensure compliance with established quality standards.
- Creation and execution of automated test scripts, leveraging testing frameworks and tools to facilitate early detection of defects and quality issues.
- Collaboration with cross-functional teams to analyze requirements, participate in design discussions, and contribute to the development of acceptance criteria, ensuring a thorough understanding of the software being tested.
- Root cause analysis for identified defects, working closely with developers to provide detailed information and support defect resolution.
- Collaboration with peers, participation in code reviews, and promotion of a culture of code quality and knowledge sharing.
- Staying informed of industry technology trends and innovations, and actively contributing to the organization's technology communities to foster a culture of technical excellence and growth.

Analyst Expectations
- Perform prescribed activities in a timely manner and to a high standard, consistently driving continuous improvement.
- Bring in-depth technical knowledge and experience in the assigned area of expertise, with a thorough understanding of the underlying principles and concepts.
- Lead and supervise a team, guiding and supporting professional development, allocating work requirements, and coordinating team resources. If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviors to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviors are: L - Listen and be authentic, E - Energize and inspire, A - Align across the enterprise, D - Develop others. For an individual contributor, develop technical expertise in the work area, acting as an advisor where appropriate.
- Have an impact on the work of related teams within the area, partnering with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities; escalate breaches of policies/procedures appropriately; take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision-making within own area of expertise, and take ownership of managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function, and demonstrate understanding of how areas coordinate and contribute to the objectives of the organization's sub-function.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents; guide and persuade team members and communicate complex/sensitive information.
- Act as the contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship (our moral compass, helping us do what we believe is right), as well as the Barclays Mindset to Empower, Challenge, and Drive (the operating manual for how we behave).
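
For illustration of the data-completeness validation described above (a hedged sketch, not Barclays tooling): a JUnit test that compares source and target row counts over JDBC. The connection URLs, table names, and credential handling are placeholder assumptions; real suites would also validate content, schema, and transformations.

```java
// Hypothetical sketch: an automated ETL completeness check via JDBC + JUnit 5.
import org.junit.jupiter.api.Test;

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

import static org.junit.jupiter.api.Assertions.assertEquals;

class LoadCompletenessTest {

    private long countRows(String jdbcUrl, String table) throws Exception {
        try (Connection conn = DriverManager.getConnection(jdbcUrl);
             Statement stmt = conn.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT COUNT(*) FROM " + table)) {
            rs.next();
            return rs.getLong(1);
        }
    }

    @Test
    void targetRowCountMatchesSource() throws Exception {
        long sourceRows = countRows("jdbc:oracle:thin:@source-db:1521/SRC", "TRADES_STG");
        long targetRows = countRows("jdbc:postgresql://target-db:5432/dwh", "TRADES_FACT");
        // Completeness check: every staged row should have been loaded.
        assertEquals(sourceRows, targetRows, "Row count mismatch between source and target");
    }
}
```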

Posted 3 weeks ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Navi Mumbai

Work from Office

We are hiring a RabbitMQ Admin with strong expertise in Kafka, messaging systems, and performance monitoring. This role involves managing and optimizing enterprise messaging infrastructure in a banking environment. Required candidate profile: an experienced messaging admin with hands-on Kafka and RabbitMQ skills, certified in Confluent Kafka, adept at ensuring high-performance message delivery, troubleshooting issues, and securing middleware systems.

Posted 1 month ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Key Skills: Confluent Kafka, Kafka Connect, Schema Registry, Kafka Brokers, KSQL, KStreams, Java/J2EE, Troubleshooting, RCA, Production Support. Roles & Responsibilities: Design and develop Kafka pipelines. Perform unit testing of the code and prepare test plans as required. Analyze, design, and develop programs in a development environment. Support applications and jobs in the production environment for issues or failures. Develop operational documents for applications, including DFD, ICD, HLD, etc. Troubleshoot production issues and provide solutions within defined SLA. Prepare RCA (Root Cause Analysis) documents for production issues. Provide permanent fixes for production issues. Experience Requirement: 5-10 years of experience working with Confluent Kafka. Hands-on experience with Kafka Connect using Schema Registry. Strong knowledge of Kafka brokers and KSQL. Familiarity with Kafka Control Center, Zookeepers, and KStreams is good to have. Experience with Java/J2EE is a plus. Education: B.E., B.Tech.
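
As a rough illustration of what "design and develop Kafka pipelines" means day to day, here is a consume-transform-produce loop. Kafka Streams itself is a Java library; this sketch shows the equivalent pipeline logic using Confluent's Python client (confluent-kafka), and the broker address and the orders.raw/orders.enriched topic names are invented for the example.

```python
# Minimal consume-transform-produce sketch using Confluent's Python client.
# Broker address and topic names are hypothetical placeholders.
import json
from confluent_kafka import Consumer, Producer

consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "pipeline-demo",
    "auto.offset.reset": "earliest",
})
producer = Producer({"bootstrap.servers": "localhost:9092"})
consumer.subscribe(["orders.raw"])

try:
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        order = json.loads(msg.value())
        # Transform step: filter and enrich before forwarding downstream.
        if order.get("amount", 0) > 0:
            order.setdefault("currency", "INR")
            producer.produce("orders.enriched", json.dumps(order).encode())
            producer.poll(0)  # serve delivery callbacks
finally:
    producer.flush()
    consumer.close()
```

A production pipeline would add dead-letter handling, explicit offset commits, and Schema Registry-backed Avro serialization in place of raw JSON.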

Posted 1 month ago

Apply

7.0 - 11.0 years

0 Lacs

pune, maharashtra

On-site

You are a results-driven Data Project Manager (PM) responsible for leading data initiatives within a regulated banking environment, focusing on leveraging Databricks and Confluent Kafka. Your role involves overseeing the successful end-to-end delivery of complex data transformation projects aligned with business and regulatory requirements. In this position, you will be required to lead the planning, execution, and delivery of enterprise data projects using Databricks and Confluent. This includes developing detailed project plans, delivery roadmaps, and work breakdown structures, as well as ensuring resource allocation, budgeting, and adherence to timelines and quality standards. Collaboration with data engineers, architects, business analysts, and platform teams is essential to align on project goals. You will act as the primary liaison between business units, technology teams, and vendors, facilitating regular updates, steering committee meetings, and issue/risk escalations. Your technical oversight responsibilities include managing solution delivery on Databricks for data processing, ML pipelines, and analytics, as well as overseeing real-time data streaming pipelines via Confluent Kafka. Ensuring alignment with data governance, security, and regulatory frameworks such as GDPR, CBUAE, and BCBS 239 is crucial. Risk and compliance management are key aspects of your role, involving ensuring regulatory reporting data flows comply with local and international financial standards and managing controls and audit requirements in collaboration with Compliance and Risk teams. The required skills and experience for this role include 7+ years of Project Management experience within the banking or financial services sector, proven experience in leading data platform projects, a strong understanding of data architecture, pipelines, and streaming technologies, experience in managing cross-functional teams, and proficiency in Agile/Scrum and Waterfall methodologies. Technical exposure to Databricks (Delta Lake, MLflow, Spark), Confluent Kafka (Kafka Connect, KSQL, Schema Registry), Azure or AWS cloud platforms, integration tools, CI/CD pipelines, and Oracle ERP implementation is expected. Preferred qualifications include PMP/Prince2/Scrum Master certification, familiarity with regulatory frameworks, and a strong understanding of data governance principles. The ideal candidate will hold a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. Key performance indicators for this role include on-time, on-budget delivery of data initiatives, uptime and SLAs of data pipelines, user satisfaction, and compliance with regulatory milestones.

Posted 1 month ago

Apply

6.0 - 10.0 years

0 Lacs

maharashtra

On-site

NTT DATA is looking for a Data Ingest Engineer to join the team in Pune, Maharashtra (IN-MH), India (IN). As a Data Ingest Engineer, you will be part of the Ingestion team of the DRIFT data ecosystem, focusing on ingesting data in a timely, complete, and comprehensive manner using the latest technology available to Citi. Your role will involve leveraging new and creative methods for repeatable data ingestion from various sources while ensuring the highest quality data is provided to downstream partners. Responsibilities include partnering with management teams to integrate functions effectively, identifying necessary system enhancements for new products and process improvements, and resolving high impact problems/projects through evaluation of complex business processes and industry standards. You will provide expertise in applications programming, ensure application design aligns with the overall architecture blueprint, and develop standards for coding, testing, debugging, and implementation. Additionally, you will analyze issues, develop innovative solutions, and mentor mid-level developers and analysts. The ideal candidate should have 6-10 years of experience in Apps Development or systems analysis, with extensive experience in system analysis and programming of software applications. Proficiency in Application Development using JAVA, Scala, and Spark, familiarity with event-driven applications and streaming data, and experience with various schemas, data types, ELT methodologies, and formats are required. Experience working with Agile and version control tool sets, leadership skills, and clear communication abilities are also essential. NTT DATA is a trusted global innovator of business and technology services, serving 75% of the Fortune Global 100. With experts in more than 50 countries and a strong partner ecosystem, NTT DATA is committed to helping clients innovate, optimize, and transform for long-term success. As a part of the NTT Group, NTT DATA invests significantly in R&D to support organizations and society in moving confidently into the digital future. For more information, visit us at us.nttdata.com.
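
To ground the ingestion role in something concrete, here is a hedged sketch of a streaming ingest job of the general shape described: read events from Kafka, parse them against a schema, and land them in a queryable store. The listing names Java/Scala/Spark; this uses PySpark for brevity, and the topic name, schema fields, and paths are all invented placeholders.

```python
# Hedged sketch of a streaming ingestion job (illustrative only).
# Requires the spark-sql-kafka package on the Spark classpath; topic,
# schema, broker address, and paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql.functions import from_json, col
from pyspark.sql.types import StructType, StringType, TimestampType

spark = SparkSession.builder.appName("ingest-sketch").getOrCreate()

schema = (StructType()
          .add("event_id", StringType())
          .add("payload", StringType())
          .add("event_time", TimestampType()))

raw = (spark.readStream
       .format("kafka")
       .option("kafka.bootstrap.servers", "localhost:9092")
       .option("subscribe", "ingest.events")
       .load())

# Kafka delivers bytes; cast to string and parse into typed columns.
parsed = (raw.select(from_json(col("value").cast("string"), schema).alias("e"))
             .select("e.*"))

query = (parsed.writeStream
         .format("parquet")
         .option("path", "/data/landing/events")
         .option("checkpointLocation", "/data/checkpoints/events")
         .start())
query.awaitTermination()
```

The checkpoint location is what makes the ingestion repeatable: on restart, the job resumes from the last committed Kafka offsets rather than re-reading or dropping data.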

Posted 1 month ago

Apply

5.0 - 10.0 years

3 - 6 Lacs

Chennai, Tamil Nadu, India

On-site

Skills: 5+ years of experience as a developer in a .NET environment; exposure to Azure/AWS cloud infrastructure and services; hands-on tech stack deployment and implementation; client-facing experience; good communication skills. Desirable: Angular/React exposure; TCP/IP socket programming; message queues using Apache/Confluent Kafka; SQL Server/MongoDB/Cosmos DB work experience; Redis cache usage experience. Ideally, this would be a solution architect who can do hands-on coding and deployment.

Posted 1 month ago

Apply

8.0 - 13.0 years

3 - 5 Lacs

Chennai, Tamil Nadu, India

On-site

8+ years of experience as a developer in a .NET environment; exposure to Azure/AWS cloud infrastructure and services; hands-on tech stack deployment and implementation; client-facing experience; good communication skills. Desirable: Angular/React exposure; TCP/IP socket programming; message queues using Apache/Confluent Kafka; SQL Server/MongoDB/Cosmos DB work experience; Redis cache usage experience. Ideally, this would be a solution architect who can do hands-on coding and deployment.

Posted 1 month ago

Apply

7.0 - 12.0 years

12 - 18 Lacs

Pune, Chennai

Work from Office

Key Responsibilities: Implement Confluent Kafka-based CDC solutions to support real-time data movement across banking systems. Implement event-driven and microservices-based data solutions for enhanced scalability, resilience, and performance. Integrate CDC pipelines with core banking applications, databases, and enterprise systems. Ensure data consistency, integrity, and security, adhering to banking compliance standards (e.g., GDPR, PCI-DSS). Lead the adoption of Kafka Connect, Kafka Streams, and Schema Registry for real-time data processing. Optimize data replication, transformation, and enrichment using CDC tools like Debezium, GoldenGate, or Qlik Replicate. Collaborate with the infrastructure team, data engineers, DevOps teams, and business stakeholders to align data streaming capabilities with business objectives. Provide technical leadership in troubleshooting, performance tuning, and capacity planning for CDC architectures. Stay updated with emerging technologies and drive innovation in real-time banking data solutions. Required Skills & Qualifications: Extensive experience in Confluent Kafka and Change Data Capture (CDC) solutions. Strong expertise in Kafka Connect, Kafka Streams, and Schema Registry. Hands-on experience with CDC tools such as Debezium, Oracle GoldenGate, or Qlik Replicate. Hands-on experience with IBM Analytics. Solid understanding of core banking systems, transactional databases, and financial data flows. Knowledge of cloud-based Kafka implementations (AWS MSK, Azure Event Hubs, or Confluent Cloud). Proficiency in SQL and NoSQL databases (e.g., Oracle, MySQL, PostgreSQL, MongoDB) with CDC configurations. Strong experience in event-driven architectures, microservices, and API integrations. Familiarity with security protocols, compliance, and data governance in banking environments. Excellent problem-solving, leadership, and stakeholder communication skills.
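
For illustration, registering a Debezium source connector is typically a single REST call to the Kafka Connect worker. The sketch below is a hypothetical example: host names, credentials, and table names are placeholders, and the config keys follow Debezium 2.x naming (earlier versions use database.server.name instead of topic.prefix).

```python
# Hedged sketch: registering a Debezium MySQL CDC connector via the Kafka
# Connect REST API. All hosts, credentials, and table names are placeholders.
import requests

connector = {
    "name": "core-banking-cdc",
    "config": {
        "connector.class": "io.debezium.connector.mysql.MySqlConnector",
        "database.hostname": "corebank-db",
        "database.port": "3306",
        "database.user": "cdc_user",
        "database.password": "change-me",
        "database.server.id": "184054",
        "topic.prefix": "corebank",
        "table.include.list": "bank.transactions,bank.accounts",
        # Debezium stores DDL history in its own Kafka topic:
        "schema.history.internal.kafka.bootstrap.servers": "localhost:9092",
        "schema.history.internal.kafka.topic": "schema-history.corebank",
    },
}

# Kafka Connect workers expose their REST API on port 8083 by default.
resp = requests.post("http://localhost:8083/connectors", json=connector, timeout=10)
resp.raise_for_status()
print(resp.json())
```

Once running, the connector emits one change event per committed row change to topics such as corebank.bank.transactions, which downstream Kafka Streams or KSQL jobs can consume.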

Posted 1 month ago

Apply

6.0 - 8.0 years

27 - 30 Lacs

Pune, Ahmedabad, Chennai

Work from Office

Technical Skills Must Have: 8+ years overall IT industry experience, with 5+ years in a solution or technical architect role using service and hosting solutions such as private/public cloud IaaS, PaaS, and SaaS platforms. 5+ years of hands-on development experience with event-driven-architecture-based implementations. Achieved one or more of the typical solution and technical architecture certifications, e.g. Microsoft, MS Azure Certification, TOGAF, AWS Cloud Certified, SAFe, PMI, and SAP. Hands-on experience with:
- Claims-based authentication (SAML/OAuth/OIDC), MFA, JIT, and/or RBAC/Ping etc.
- Architecting mission-critical technology components with DR capabilities.
- Multi-geography, multi-tier service design and management.
- Project financial management, solution plan development, and product cost estimation.
- Supporting peer teams and their responsibilities, such as infrastructure, operations, engineering, and info-security.
- Configuration management and automation tools such as Azure DevOps, Ansible, Puppet, Octopus, Chef, Salt, etc.
- Software development full-lifecycle methodologies, patterns, frameworks, libraries, and tools.
- Relational, graph, and/or unstructured data technologies such as SQL Server, Azure SQL, Cosmos, Azure Data Lake, HD Insights, Hadoop, Neo4j, etc.
- Data management and data governance technologies.
- Data movement and transformation technologies.
- AI and machine learning tools such as Azure ML.
- Architecting mobile applications that are either independent applications or supplementary add-ons (to intranet or extranet).
- Cloud security controls including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc.
- Apache Kafka, Confluent Kafka, Kafka Streams, and Kafka Connect.
- Proficiency in NodeJS, Java, Scala, or Python languages.
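
Of the items above, claims-based authentication is the one most easily made concrete. Below is a hedged sketch of validating an OIDC access token (JWT) against an identity provider's published signing keys, using the PyJWT library; the issuer URL, JWKS endpoint, and audience are invented placeholders, not any particular provider's values.

```python
# Hedged sketch of claims-based authentication: verify an OIDC JWT's
# signature, expiry, issuer, and audience. Requires: pip install pyjwt[crypto]
# All URLs and the audience below are hypothetical placeholders.
import jwt
from jwt import PyJWKClient

ISSUER = "https://login.example-idp.com"       # hypothetical IdP
JWKS_URL = f"{ISSUER}/.well-known/jwks.json"   # key-discovery endpoint
AUDIENCE = "api://payments-service"            # hypothetical audience

def validate_token(token: str) -> dict:
    # Fetch the signing key matching the token's 'kid' header, then verify
    # everything in one call; raises on any failed check.
    signing_key = PyJWKClient(JWKS_URL).get_signing_key_from_jwt(token)
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=AUDIENCE,
        issuer=ISSUER,
    )

# claims = validate_token(bearer_token)
# A role claim in the result (e.g. claims.get("roles")) is what typically
# drives the RBAC decisions the listing mentions.
```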

Posted 1 month ago

Apply