Home
Jobs

11926 Kafka Jobs - Page 25

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Summary: We are seeking a skilled Data Engineer to join our dynamic team. In this role, you will be responsible for implementing and maintaining scalable data pipelines and infrastructure on the AWS cloud platform. The ideal candidate will have experience with AWS services, particularly in big data processing and analytics. The role involves working closely with cross-functional teams to support data-driven decision-making, with a focus on delivering business objectives while improving efficiency and ensuring high service quality.

Key Responsibilities: Develop and maintain large-scale data pipelines that can handle large datasets from multiple sources. Implement real-time data replication and batch processing using distributed computing platforms like Spark and Kafka. Optimize the performance of data processing jobs and ensure system scalability and reliability. Collaborate with DevOps teams to manage infrastructure, including cloud environments like AWS. Collaborate with data scientists, analysts, and business stakeholders to develop tools and platforms that enable advanced analytics and reporting. Maintain a healthy working relationship with business partners/users and other MLI departments. Take responsibility for the overall performance, cost and delivery of technology solutions.

Key Technical Competencies/Skills Required: Proficiency in programming languages such as Python, PySpark and SQL/PLSQL for implementing data pipelines and ETL processes. Hands-on knowledge of AWS services such as S3, Lambda, EMR, Glue, Redshift, Athena, etc. Knowledge of data modelling and of modern file and table formats. Experience with cloud/hybrid cloud (preferably AWS) data strategy for data lakes, BI and analytics. Experience with data warehousing concepts.

Desired Qualifications and Experience: Bachelor’s degree in Computer Science, Engineering, or a related field (Master’s preferred). Proven experience of 5+ years as a Data Engineer or in a similar role with a strong focus on the AWS cloud. Strong analytical and problem-solving skills with attention to detail. Excellent communication and collaboration skills.
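The real-time replication responsibility above centers on consuming event streams. As a purely illustrative sketch (not part of the posting), a minimal Java Kafka consumer that polls a batch of records and commits offsets only after processing, the usual at-least-once pattern for pipeline ingest, might look like this; the broker address, group id, and topic name are invented placeholders:

```java
import java.time.Duration;
import java.util.List;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class EventIngestConsumer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "pipeline-ingest");          // placeholder group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, "false"); // commit manually after processing

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("events")); // placeholder topic
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // In a real pipeline this is where records would be transformed
                    // and written to a sink such as S3 or a warehouse table.
                    System.out.printf("offset=%d key=%s value=%s%n",
                            record.offset(), record.key(), record.value());
                }
                consumer.commitSync(); // at-least-once: commit only after the batch is handled
            }
        }
    }
}
```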

Posted 3 days ago

Apply

2.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About Us: We’re Basedynamics, a fast-growing startup building a unified Customer Intelligence platform for modern B2B companies. Our stack spans React, TypeScript, Spring Boot, ClickHouse, and Kafka. We ship fast, data-driven, enterprise-grade features that product managers, growth teams, and customer-success leaders love.

What You’ll Do: Craft scalable, high-impact web apps with React, JavaScript (ES6+), HTML5, and CSS3. Bring ideas to life by partnering with backend engineers, designers, and product managers from brainstorm to ship. Integrate seamless experiences with RESTful and GraphQL APIs. Write robust, maintainable, and well-tested code (think: Jest, Cypress, Mocha, Chai). Stay ahead of the curve on frontend trends and emerging AI tech. Ship fast, learn faster, and help us keep iterating!

What We’re Looking For: 2+ years of real-world frontend experience (preferably in a product-based company). Pro-level JavaScript (ES6+), React, HTML5, and CSS3 skills. (Angular, Vue, or Ember? That works too!) A solid grip on software engineering best practices and the SDLC. Experience with REST/GraphQL API integrations. Familiarity with modern frontend testing frameworks. Curiosity, teamwork, and a problem-solving attitude.

Why Join Us? Be part of a fun, talented team shaping products from Chennai to the world. Flexible culture where your voice and ideas matter. Learn, experiment, and grow your career, your way. Make an impact you’ll be proud of. Ready to build what’s next? Even if you don’t tick every box, if you’re excited to join us, we want to hear from you!

Posted 3 days ago

Apply

13.0 years

0 Lacs

Mohali district, India

On-site

About the Company: Finvasia is a multi-disciplinary, multinational organisation that owns and operates over a dozen brands across the financial services, technology, real estate and healthcare verticals. Some of the notable brands owned by Finvasia are Fxview, Shoonya, ACT Trader, CapitalWallet, Zulutrade, AAAfx, Gini Health, bodyLoop, StackFlow, Finvasia Estates, and portfolios.com. Over the last 13 years of our history, Finvasia has managed funds for some of the notable hedge funds of Wall Street, launched the first and only commission-free ecosystem for listed and fee-based financial products in India, provided technology to some of the notable listed and unlisted financial services entities across the globe, launched a medically proven diabetes reversal program, and engaged scientists from various specialised fields to build nano and micro medical devices that can monitor and assist in various body functions.

Key Responsibilities: As a Java Developer, the Consultant's primary responsibilities will encompass various aspects of software development, ensuring the delivery of robust and high-quality solutions. The role involves active participation in the entire software development life cycle, from feature inception through grooming and up to release to production. Key responsibilities include: Feature Development: Collaborate with cross-functional teams to develop new features, understanding requirements and contributing to the design process. Actively engage in feature grooming sessions, providing valuable insights and expertise. Legacy Functionality Refactoring: Take a leading role in refactoring and improving existing legacy functionality to enhance code quality, maintainability, and overall system performance. Team Contributions: Demonstrate proactive participation in team activities, taking a leadership role in architecture decisions, design discussions, and code reviews. Foster a collaborative environment by mentoring junior developers and sharing knowledge with team members. Disciplined Software Engineering: Adhere to disciplined software engineering practices, emphasizing the importance of unit testing, code reviews, and continuous integration. Strive to write clean, simple, and maintainable code, following coding standards and established best practices. Implementation Best Practices: Implement solutions based on industry best practices, ensuring high reliability and service quality. Stay abreast of the latest developments in Java technologies and incorporate them into the development process. Quality Assurance: Contribute to the establishment and maintenance of quality assurance processes, ensuring that the software meets the defined standards and specifications. Documentation: Create and maintain comprehensive technical documentation for code, design decisions, and architectural considerations. Continuous Improvement: Actively participate in retrospectives and contribute to continuous improvement initiatives within the development team. Identify opportunities for process optimization and efficiency gains.

Qualifications: Minimum 3 years of backend development experience using Java. Strong expertise in Spring and Spring Boot. Knowledge of best practices, architectural design patterns and principles. Experience developing large-scale, highly available backend services. Experience working with Kafka and RabbitMQ. Experience with queue brokers. Experience with the GCP/AWS cloud stack. Experience with Docker and developing microservices. Experience developing for K8s is a plus. A learning and growth-focused mindset and a strong desire to share your skills and learn from those around you. English - fluent.
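The posting pairs Kafka experience with Spring Boot. For illustration only, a minimal spring-kafka listener looks roughly like this; the topic and group id are invented placeholders, and the spring-kafka dependency and broker configuration are assumed to be in place:

```java
import org.springframework.kafka.annotation.KafkaListener;
import org.springframework.stereotype.Component;

@Component
public class OrderEventsListener {
    // Topic and group id are illustrative placeholders, not from the posting.
    @KafkaListener(topics = "order-events", groupId = "order-service")
    public void onMessage(String payload) {
        // Deserialize the payload and hand off to domain logic here.
        System.out.println("Received: " + payload);
    }
}
```

With spring-kafka on the classpath and `spring.kafka.bootstrap-servers` configured, Spring Boot wires the consumer container automatically, which is why the listener reduces to a single annotated method.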

Posted 3 days ago

Apply

1.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Full Stack Developer. Job Location: Hyderabad. Notice Period: 15 Days.

Role Overview: · Play a crucial role in driving the Company's mission to simplify and innovate construction management. · Collaborate with diverse clients worldwide, helping them transform complex workflows. · Thrive in a fast-paced, tech-driven environment that encourages continuous learning and growth. · Advance your career by delivering real impact on large-scale infrastructure and construction projects.

Key Responsibilities: · We are looking for a tech enthusiast with a knack for full stack development, eager to dive into code and bring ideas to life. · Own features from brainstorming to deployment, handling everything from database architecture to front-end performance. · Optimize and Scale: Ensure that our platform is high-performing, scalable, and future-proof. You will be part of laying the groundwork for big, exciting growth. · Collaborate & Conquer: Work closely with our design, product, and AI teams to integrate machine learning and automation features into our platform, pushing the boundaries of what tech can do in construction. · Write clean, efficient, and maintainable code, with a track record that speaks for itself.

Required Qualifications: · Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field. · Equivalent practical experience may be acceptable with a strong portfolio and leadership track record. · 1+ years of experience with either the MEAN (MongoDB, Express, Angular, Node.js) or MERN (MongoDB, Express, React, Node.js) stack. · Hands-on experience in designing and building scalable, secure full-stack applications in a microservices or monolithic architecture. · Strong proficiency in Angular 15+, RxJS, NgRx (or other state management libraries). · Solid understanding of TypeScript, JavaScript, HTML5, and CSS3. · Experience building responsive and cross-browser applications. · Familiarity with Angular CLI, lazy loading, routing, and component-based architecture. · Proficiency in MongoDB, its query syntax, and the aggregation framework. · Knowledge of Mongoose (ODM). Understanding of schema design, indexing, and performance tuning.

Nice-to-have: · Experience with GraphQL, Socket.IO, or WebRTC. · Understanding of Server-Side Rendering (SSR) using Next.js (for MERN) or Angular Universal (for MEAN). · Knowledge of Redis, Kafka, or other message queues. · Familiarity with multi-tenant architecture or SaaS product engineering.

What We Offer: · Grow with purpose: Accelerate your career with hands-on learning and expert mentorship. · Culture that empowers: Join a team where your ideas matter and diversity is celebrated. · Perks that matter: Enjoy flexible work options and benefits designed to support your work-life balance. · Make a real impact: Work on advanced solutions that simplify construction and help build smarter cities and communities worldwide.

To Apply: Send your resume to hr@velsprint.com

Posted 3 days ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

We are looking for a Data Engineer to join the Data Platform team who can help develop and deploy data pipelines at a huge scale of ~5B daily events and a concurrency of 500K users. The platform is built on an AWS-based modern data stack enabling real-time reporting and data exploration from first principles. Experience: 2-5 Years. Job Location: Gurgaon.

Responsibilities:
➔ Create and maintain a robust, scalable, and optimized data pipeline.
➔ Handle TBs of data volume daily on the Wynk Music/Xstream (VoD & Live TV) data platform.
➔ Extract and consume data from our live systems for analysis, maintaining a robust big-data environment with a 99.999% SLA.
➔ Build and execute data mining and modeling activities using agile development techniques.
➔ Solve problems in robust and creative ways and demonstrate solid verbal, interpersonal and written communication skills.
➔ Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.
➔ Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc.
➔ Handle multiple kinds of SQL and NoSQL databases for both structured and unstructured datasets.
➔ Appreciate and understand the cloud computation model and how it affects application solutions, in both delivery and deployment.
➔ The incumbent will be in the driving seat for innovation and delivery across various exciting tech opportunities, from powering business analytics pipelines to machine learning applications.

Desired Profile:
➔ B.E / B.Tech / M.E / M.Tech / M.S in Computer Science or software engineering from a premier institute.
➔ 2+ years of experience in Spark and Scala.
➔ Very strong grounding in data structures and algorithms.
➔ Fluent with Hadoop/Pig/Hive/Storm/SQL.
➔ Good to have: knowledge of NoSQL solutions like Cassandra/MongoDB/CouchDB, Postgres, Kafka, Redis, ElasticSearch, Spark Streaming, or a real desire to learn about them.
➔ Own end-to-end product modules/features (from requirement to going live).
➔ Knowledge of at least one programming or scripting language like Java, Scala, or Python.
➔ Hands-on in big data infrastructure, distributed systems, dimensional modeling, query processing & optimization, and relational databases.
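At ~5B daily events, producer batching and compression settings matter as much as the pipeline itself. As an illustrative sketch unrelated to this employer's actual configuration, a throughput-oriented Java Kafka producer might look like this; the broker, topic, and payloads are placeholders:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class TelemetryProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        // Throughput-oriented settings: batch more records per request and compress on the wire.
        props.put(ProducerConfig.LINGER_MS_CONFIG, "20");
        props.put(ProducerConfig.BATCH_SIZE_CONFIG, Integer.toString(64 * 1024));
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "lz4");
        props.put(ProducerConfig.ACKS_CONFIG, "1"); // trades durability for latency; "all" is safer

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            for (int i = 0; i < 1000; i++) {
                producer.send(new ProducerRecord<>("play-events", "user-" + i, "{\"event\":\"play\"}"));
            }
        } // close() flushes any outstanding batches
    }
}
```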

Posted 3 days ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary

Strategy: Develop the strategic direction and roadmap for SCPAY, aligning with the Business Strategy, ITO Strategy and investment priorities. Tap into the latest industry trends and innovative products & solutions to deliver effective and faster product capabilities. Support Cash Management Operations, leveraging technology to streamline processes, enhance productivity, reduce risk and improve controls.

Business: Work hand in hand with the Payments Business, taking product programs from investment decisions into design, specifications, solutioning, development, implementation and hand-over to operations, securing support and collaboration from other SCB teams. Ensure delivery to the business meets time, cost and high quality constraints. Support respective businesses in growing return on investment, commercialisation of capabilities, bid teams, monitoring of usage, improving client experience, enhancing operations and addressing defects & continuous improvement of systems. Drive an ecosystem of innovation and enable the business through technology.

Processes: Responsible for the end-to-end delivery of the technology portfolio comprising key business product areas such as Payments & Clearing. Own technology delivery of projects and programs across global SCB markets that develop/enhance core product capabilities, ensure compliance with regulatory mandates, support operational improvements, process efficiencies and the zero-touch agenda, and build a payments platform aligned with the latest technology & architecture trends, improved stability and scale. Interface with business & technology leaders of other SCB systems for collaborative delivery.

People & Talent: Employ, engage and retain high quality talent to ensure the Payments Technology team is adequately staffed and skilled to deliver on business commitments. Lead through example and build an appropriate culture and values. Set appropriate tone and expectations for the team and work in collaboration with risk and control partners. Bridge skill/capability gaps through learning and development. Ensure roles, job descriptions and expectations are clearly set and periodic feedback is provided to the entire team. Ensure the optimal blend and balance of in-house and vendor resources.

Risk Management: Be proactive in ensuring regular assurance that the Payments ITO Team is performing to acceptable risk levels and control standards. Act quickly and decisively when any risk or control weakness becomes apparent and ensure it is addressed within prescribed timeframes and escalated through the relevant committees. Balance business delivery on time, quality and cost constraints with risks & controls to ensure that they do not materially threaten the Group's ability to remain within acceptable risk levels. Ensure business continuity and disaster recovery planning for the entire technology portfolio.

Governance: Promote an environment of compliance with internal control functions and the external regulatory framework.

Key Responsibilities: Regulatory & Business Conduct: Display exemplary conduct and live by the Group's Values and Code of Conduct. Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct. Lead the team to achieve the outcomes set out in the Bank's Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment. Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.

Key Stakeholders: Solution Architect, SCPAY; SCPAY Programme Managers; Group Payments Product Development Heads & Group Cash Operations.

Other Responsibilities: Embed Here for good and the Group's brand and values in the team; perform other responsibilities assigned under Group, Country, Business or Functional policies and procedures; multiple functions (double hats).

Skills and Experience: Java / Spring Boot; Kafka Streams, REST, JSON; design principles; Hazelcast & ELK; Oracle & Postgres.

Qualifications: Minimum 6-10 years of experience in a development role; a couple of years of experience in a dev lead role is an added advantage. Good knowledge of Java, Microservices and Spring Boot. Technical Knowledge: Java / Spring Boot, Kafka Streams, REST, JSON, the Netflix microservices suite (Zuul / Eureka / Hystrix, etc.), 12 Factor Apps, Oracle, PostgreSQL, Cassandra & ELK. Ability to work with geographically dispersed and highly varied stakeholders. Very good communication and interpersonal skills to manage senior stakeholders and top management. Knowledge of the JIRA and Confluence tools is desired.

About Standard Chartered: We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.

Together we: Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do. Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well. Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term.

What We Offer: In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing. Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations. Time-off including annual leave, parental/maternity (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to a minimum of 30 days. Flexible working options based around home and office locations, with flexible working patterns. Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and all sorts of self-help toolkits. A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning. Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity, across our teams, business functions and geographies - everyone feels respected and can realise their full potential.
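Since the role names Kafka Streams explicitly, here is a small illustrative topology in Java that routes hypothetical payment events to different topics by priority; the application id, topic names, and JSON matching are all invented for the sketch, not taken from SCPAY:

```java
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class PaymentRoutingTopology {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "payment-router"); // placeholder
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> payments = builder.stream("payments-in");
        // Route high-priority payments to a separate topic for additional screening;
        // a real topology would parse the JSON rather than string-match it.
        payments.filter((key, json) -> json.contains("\"priority\":\"HIGH\""))
                .to("payments-priority");
        payments.filterNot((key, json) -> json.contains("\"priority\":\"HIGH\""))
                .to("payments-standard");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```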

Posted 3 days ago

Apply

18.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

JD: Vice President – Delivery (Payments & Transaction Banking). Grade: VP. Experience: 18+ years. Location: Goregaon, Mumbai.

Role Overview: As Vice President – Delivery, you will lead the technology delivery of large-scale, mission-critical payments and transaction banking platforms, with a strong focus on modern tech stacks (Java, Microservices, J2EE). You will drive technical execution excellence, architecture alignment, and delivery rigor across global transformation engagements. The role requires deep technical leadership combined with domain expertise in Payments, Cash Management, and Digital Banking.

Key Responsibilities: Technical & Program Leadership: Lead the design, architecture, and end-to-end delivery of large tech programs in the payments and transaction banking space. Drive engineering rigor across Java-based platforms, Microservices, APIs, and integrations. Ensure scalability, reliability, and performance of platforms being delivered. Program Management: Oversee multi-stream programs, ensuring timelines, budgets, quality standards, and stakeholder alignment. Implement strong program governance, risk mitigation frameworks, and cadence reviews. Team Management: Manage large cross-functional technology teams (developers, architects, QA, DevOps). Drive performance, innovation, and a culture of engineering excellence. Stakeholder Engagement: Engage with C-level and senior stakeholders on architecture reviews, technical direction, and delivery roadmaps. Act as the escalation point for key technology delivery issues. Continuous Improvement & Best Practices: Champion DevOps, Agile/Scrum, and modern engineering principles. Lead initiatives around code quality, CI/CD, observability, and automation.

Core Areas of Expertise: Strong hands-on expertise in Java, Spring Boot, J2EE, Microservices architecture, and REST APIs. Proven delivery of enterprise-scale payment systems, transaction platforms, and cash/channel banking applications. Deep domain experience in digital payments, RTGS/NEFT, UPI, ISO20022, SWIFT, and reconciliation systems. Deep understanding of platform engineering, systems integration, and regulatory compliance. Ability to scale large tech teams, drive modernization, and lead cloud-native transformations.

Key Requirements: B.Tech in Computer Science or a related field. 18+ years of progressive experience in product engineering, platform delivery, or fintech transformation. Strong technical background with hands-on or architectural exposure to Java/J2EE, Spring Boot, Microservices, Kafka, and cloud platforms. Demonstrated success in leading enterprise banking/payment system implementations. Proficient in Agile, DevOps, SAFe, and global delivery methodologies. Experience handling high-volume, low-latency, mission-critical systems. PMP/Prince2 certification preferred. Willingness to travel as required.

Personal Attributes: Technically strong and detail-oriented with an engineering mindset. Strategic thinker with delivery discipline and executive presence. Excellent communicator with the ability to engage CXO-level stakeholders. Proactive, result-driven, and comfortable working in high-stakes environments.

Posted 3 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We are seeking a skilled Java Backend Developer to build and maintain scalable server-side applications. The ideal candidate will have hands-on experience with JDBC, JMS, Apache Kafka, and HTTP-based APIs, and will contribute to designing robust backend systems that power high-performance enterprise solutions. Experience with the Spring Framework, Spring Boot, and microservices architecture. Familiarity with SQL/NoSQL databases (e.g., MySQL, PostgreSQL, MongoDB). Proficient in using Git, JIRA, and CI/CD tools like Jenkins or GitLab. Knowledge of containerization (Docker) and cloud platforms (AWS, GCP, Azure) is a plus.

Responsibilities: Develop and maintain backend services using Java, Spring Boot, and related frameworks. Integrate with databases using JDBC and ORM tools like Hibernate. Implement asynchronous messaging using JMS and Kafka. Design and consume RESTful HTTP interfaces for internal and external integrations. Optimize application performance and ensure scalability. Collaborate with cross-functional teams including frontend, DevOps, and QA. Participate in code reviews, unit testing, and CI/CD pipelines. Document technical specifications and maintain clean, reusable code.

Behavioural: Good communication and problem-solving skills. Prior experience working with UK client stakeholders is beneficial. Able to handle ambiguity and takes a proactive approach to resolving queries. An inquisitive mind-set and desire to understand both data and business requirements. Continuous self-improvement and learning. Methodical in their work, with an ability to create and maintain clear documentation as required.

3 must haves: Java; experience in JDBC, JMS, and Apache Kafka; HTTP-based APIs. Strong expertise in JDBC, JMS (ActiveMQ, RabbitMQ, etc.), and Kafka. Solid understanding of HTTP protocols, RESTful APIs, and JSON/XML.
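For the JDBC must-have, a compact illustration of the idiomatic pattern, parameterized statements inside try-with-resources so resources always close, might look like this; the URL, credentials, table, and columns are placeholders invented for the sketch:

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class CustomerLookup {
    // URL, credentials, and schema are illustrative placeholders.
    private static final String URL = "jdbc:postgresql://localhost:5432/appdb";

    public static String findName(long customerId) throws SQLException {
        String sql = "SELECT name FROM customers WHERE id = ?";
        try (Connection conn = DriverManager.getConnection(URL, "app", "secret");
             PreparedStatement stmt = conn.prepareStatement(sql)) {
            stmt.setLong(1, customerId); // parameterized to avoid SQL injection
            try (ResultSet rs = stmt.executeQuery()) {
                return rs.next() ? rs.getString("name") : null;
            }
        } // connection, statement and result set all closed automatically
    }
}
```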

Posted 3 days ago

Apply

2.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Job Description: Data Analytics Engineer

Our Enterprise Data & Analytics (EDA) team is looking for a Data Engineer to join our growing data engineering team. We are a globally distributed, remote-first team. You’ll work in a collaborative Agile environment using the latest engineering best practices with involvement in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural & data modeling practices. You will be primarily developing Data Warehouse Solutions using technologies such as dbt, Airflow and Terraform.

What You Get To Do Every Single Day: Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes and design data models. Use best engineering practices such as version control systems, CI/CD, code review and pair programming. Design, build, and maintain ELT pipelines in the Enterprise Data Warehouse to ensure reliable business reporting. Design & build ELT-based data models using SQL & dbt. Build analytics solutions that provide practical insights into customer 360, finance, product, sales and other key business domains. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery. Work with data and analytics experts to strive for greater functionality in our data systems.

What you bring to the role:

Basic Qualifications: 2+ years of data/analytics engineering experience building, working with & maintaining data pipelines & ETL processes on big data environments. Basic knowledge of modern as well as classic Data Modeling - Kimball, Inmon, etc. Experience with any of these programming languages: Python, Go, Java, Scala (we primarily use Python). SQL knowledge and experience working with cloud columnar databases (we use Snowflake), as well as working familiarity with a variety of databases. Familiarity with processes supporting data transformation, data structures, metadata, dependency and workload management. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. Strong communication skills and adaptability to changing requirements and tech stacks.

Preferred Qualifications: Demonstrated experience in one or many business domains. 1+ completed projects with dbt. Proficient knowledge of SQL and/or Python. Experience using Airflow as a data orchestration tool.

What Does Our Data Stack Look Like: ELT (Snowflake, dbt, Airflow, Kafka, HighTouch). BI (Tableau, Looker). Infrastructure (GCP, AWS, Kubernetes, Terraform, Github Actions).

Please note that Zendesk can only hire candidates who are physically located and plan to work from Karnataka or Maharashtra. Please refer to the location posted on the requisition for where this role is based.

Hybrid: In this role, our hybrid experience is designed at the team level to give you a rich onsite experience packed with connection, collaboration, learning, and celebration - while also giving you flexibility to work remotely for part of the week. This role must attend our local office for part of the week. The specific in-office schedule is to be determined by the hiring manager.

The Intelligent Heart Of Customer Experience: Zendesk software was built to bring a sense of calm to the chaotic world of customer service. Today we power billions of conversations with brands you know and love. Zendesk believes in offering our people a fulfilling and inclusive experience. Our hybrid way of working enables us to purposefully come together in person, at one of our many Zendesk offices around the world, to connect, collaborate and learn whilst also giving our people the flexibility to work remotely for part of the week. Zendesk is an equal opportunity employer, and we’re proud of our ongoing efforts to foster global diversity, equity, & inclusion in the workplace. Individuals seeking employment and employees at Zendesk are considered without regard to race, color, religion, national origin, age, sex, gender, gender identity, gender expression, sexual orientation, marital status, medical condition, ancestry, disability, military or veteran status, or any other characteristic protected by applicable law. We are an AA/EEO/Veterans/Disabled employer. If you are based in the United States and would like more information about your EEO rights under the law, please click here. Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans pursuant to applicable federal and state law. If you are an individual with a disability and require a reasonable accommodation to submit this application, complete any pre-employment testing, or otherwise participate in the employee selection process, please send an e-mail to peopleandplaces@zendesk.com with your specific accommodation request.

Posted 3 days ago

Apply

2.0 - 4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About the Role: The Applications Development Programmer Analyst is an intermediate-level role focused on establishing and implementing new or updated application systems and programs in collaboration with the Technology team. This role involves writing code, collaborating with other developers and contributing effectively to an agile team.

Key Responsibilities: Software Development: Design, develop, and maintain robust, scalable, and high-performance application features. Develop clean, maintainable, and testable code following SOLID principles and software design best practices. Ensure high levels of unit test coverage, test-driven development (TDD), and behavior-driven development (BDD). Actively contribute to hands-on coding and refactoring to maintain high engineering standards.

Skills and Qualifications: Must-Have Skills: 2 to 4 years of strong hands-on experience coding in Java, multithreading, REST APIs, Spring Boot, Hibernate, SQL (Oracle), and Git. Hands-on experience with JUnit. Understanding of design patterns & SOLID design principles. Excellent problem-solving skills and ability to work in fast-paced, agile environments. Strong communication and collaboration skills. Good-to-Have Skills: Familiarity with Kafka, caching solutions and microservice architecture. Hands-on experience with UI technologies like Angular and JavaScript. Education: Bachelor’s degree/University degree or equivalent experience ------------------------------------------------------ Job Family Group: Technology ------------------------------------------------------ Job Family: Applications Development ------------------------------------------------------ Time Type: Full time ------------------------------------------------------ Most Relevant Skills Please see the requirements listed above. ------------------------------------------------------ Other Relevant Skills For complementary skills, please see above and/or contact the recruiter. ------------------------------------------------------ Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity review Accessibility at Citi. View Citi’s EEO Policy Statement and the Know Your Rights poster.
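Given the emphasis on JUnit and test-driven development, a minimal JUnit 5 example of the expected testing style could look like the following; the class under test is a trivial stand-in defined inline purely so the sketch is self-contained:

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertThrows;
import org.junit.jupiter.api.Test;

class PriceCalculatorTest {
    // Minimal class under test, defined inline so the example is self-contained.
    static class PriceCalculator {
        double finalPrice(double amount) {
            if (amount < 0) throw new IllegalArgumentException("amount must be non-negative");
            return amount >= 100.0 ? amount * 0.9 : amount; // 10% off at or above 100
        }
    }

    private final PriceCalculator calculator = new PriceCalculator();

    @Test
    void appliesTenPercentDiscountAtThreshold() {
        assertEquals(90.0, calculator.finalPrice(100.0), 0.001);
    }

    @Test
    void leavesSmallAmountsUnchanged() {
        assertEquals(50.0, calculator.finalPrice(50.0), 0.001);
    }

    @Test
    void rejectsNegativeAmounts() {
        assertThrows(IllegalArgumentException.class, () -> calculator.finalPrice(-1.0));
    }
}
```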

Posted 3 days ago

Apply

9.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About Us: Asiatic Stock & Securities Ltd is a SEBI-registered broker and member of the NSE and BSE since 1995. With an unblemished track record spanning three decades, we’ve combined our strong regulatory foundation with the agility of a modern trading firm. Over the last 9 years, we’ve actively supported and funded mid-frequency, overnight, and systematic trading strategies, and built our own in-house execution engine — currently integrated with Symphony Fintech's XTS API platform. We are now looking to upgrade our execution infrastructure, enhance its functionality, and make it more resilient, scalable, and intelligent.

Role Overview: We’re seeking an experienced Quantitative Execution Developer to take ownership of our existing execution engine — upgrading its architecture, extending its capabilities, and addressing real-world trading challenges in live environments. This is a critical role, sitting at the intersection of technology, trading operations, and market microstructure.

Key Responsibilities: Enhance our existing execution engine: optimize performance, reliability, and error handling. Develop smart execution logic: integrate TWAP, VWAP, order slicing, price/volume adaptive logic, etc. Improve order routing and reduce slippage: monitor fills, latency, and execution quality in live environments. Integrate new asset classes and strategy types: extend support to options, multi-leg, and conditional orders. Handle and troubleshoot execution issues: detect, log, and mitigate exchange rejections, throttling errors, order mismatches, and connectivity failures. Build real-time monitoring dashboards: provide visibility into orders, positions, latencies, and system health. Collaborate with quants, traders, and RMS: ensure seamless hand-off from signal generation to final execution, with tight risk controls.

Required Skills: Strong programming experience in Python and/or C++ (Rust/Go a bonus). Deep familiarity with Symphony Fintech XTS APIs (or similar broker/exchange APIs). Experience in live trading environments, especially around order management, risk controls, and post-trade reconciliation. Ability to debug and resolve common execution errors: exchange rejections (price bands, order limits); RMS rejections, margin errors, throttling; partial fills, slippage, mismatched positions; network/connectivity drops, session expiries. Understanding of market microstructure: LTP vs. bid-ask, impact costs, auction vs. continuous sessions. Strong debugging, logging, and fault-tolerance design mindset.

Nice to Have: Experience with Kafka, Redis, InfluxDB/Grafana, or a similar tech stack for real-time streaming and monitoring. Exposure to co-location / low-latency environments. Familiarity with SEBI compliance norms, exposure rules, and risk handling. Previous work at a quant trading desk, broker RMS, or HFT firm.

Why Join Us? Direct exchange access — no black boxes or intermediaries. Real PnL impact — your work directly drives trading outcomes. Opportunity to upgrade existing systems, not start from zero — but with full freedom to re-architect. Work with a close-knit team of traders, quants, and coders solving real-world problems together. Competitive compensation with performance-linked incentives.
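To make the TWAP/order-slicing responsibility concrete, here is a small sketch of the underlying arithmetic (written in Java to match the other illustrative snippets on this page, although the posting itself asks for Python or C++): it splits a parent order into equal time slices and pushes the rounding remainder into the final slice so child quantities sum exactly to the parent.

```java
import java.util.ArrayList;
import java.util.List;

public class TwapSlicer {
    /**
     * Splits a parent order into equal time slices, putting the rounding
     * remainder into the final slice so quantities sum exactly to totalQty.
     */
    public static List<Integer> slice(int totalQty, int slices) {
        if (slices <= 0 || totalQty <= 0) {
            throw new IllegalArgumentException("totalQty and slices must be positive");
        }
        List<Integer> childQtys = new ArrayList<>(slices);
        int base = totalQty / slices;
        for (int i = 0; i < slices - 1; i++) {
            childQtys.add(base);
        }
        childQtys.add(totalQty - base * (slices - 1)); // remainder lands in the last slice
        return childQtys;
    }

    public static void main(String[] args) {
        // 10,000 shares over 6 intervals -> [1666, 1666, 1666, 1666, 1666, 1670]
        System.out.println(slice(10_000, 6));
    }
}
```

A production slicer would, of course, layer in randomized slice sizes, participation limits, and fill feedback; this sketch only shows the base schedule arithmetic.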

Posted 3 days ago

Apply

3.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Summary Job Description: AWS Data Engineer About the Role We are looking for a highly technical and experienced AWS Data Engineer to join our team. The successful candidate will be responsible for designing, developing, and deploying machine learning models to solve complex business problems by leveraging large datasets on the AWS platform. This role requires working across the entire ML lifecycle, from data collection and preprocessing to model training, evaluation, and deployment using AWS AI Services. The goal is to create efficient self-learning applications capable of evolving over time. If you are passionate about data engineering and machine learning, possess strong programming skills, and have a deep understanding of statistical methods and various ML algorithms, we want to hear from you. Responsibilities Design, develop, and deploy machine learning models on AWS to address complex business challenges. Work across the ML lifecycle, including data collection, preprocessing, model training, evaluation, and deployment using services such as Amazon SageMaker, AWS Glue, and Amazon S3. Leverage large datasets to derive insights and create data-driven solutions using AWS analytics tools. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions on AWS. Optimize and maintain data pipelines and systems on AWS to ensure efficient data processing and storage. Implement and monitor model performance, making necessary adjustments to improve accuracy and efficiency using AWS monitoring tools. Keep up-to-date with the latest advancements in AWS AI and machine learning technologies. Document processes and models to ensure transparency and reproducibility. Preferred Qualifications Bachelor's or Master's degree in Computer Science, Data Science, Statistics, or a related field. Proven experience as a Data Engineer or in a similar role, with a strong focus on machine learning and AWS, ranging from 3 to 8 years of relevant experience. Proficiency in programming languages such as Python, with experience in using AWS SDKs and APIs. Deep understanding of statistical methods and various machine learning algorithms. Experience with AWS AI and ML frameworks and libraries, such as Amazon SageMaker, AWS Glue, and AWS Lambda. Strong analytical and problem-solving skills. Excellent communication and collaboration abilities. Knowledge of big data technologies and tools, such as Hadoop, Spark, or Kafka, is a plus. Familiarity with AWS cloud platform and services like Amazon EC2, Amazon RDS, and Amazon Redshift is an advantage. Ability to work independently and manage multiple projects simultaneously.

Posted 3 days ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.

Software engineering is the application of engineering to the design, development, implementation, testing and maintenance of software in a systematic method. The roles in this function cover all primary engineering activity across all technology functions, ensuring we deliver high-quality code for our applications, products and services, understand customer needs, and develop product roadmaps. These roles include, but are not limited to, analysis, design, coding, engineering, testing, debugging, standards, methods, tools analysis, documentation, research and development, maintenance, new development, operations and delivery. Every position has a requirement for building quality into every output. This also includes evaluating new tools, techniques and strategies; automation of common tasks; building common utilities to drive organizational efficiency, with a passion for technology and solutions and thought leadership on future capabilities and opportunities to apply technology in new and innovative ways. The role coordinates, supervises and is accountable for the daily activities of a business support, technical or production team or unit.

Primary Responsibilities: Impact of work is most often at the team level. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications: B.E. (Computer Engineering) or equivalent from reputed engineering colleges. 3+ years of relevant software lifecycle experience, specifically focusing on FSE over cloud. Experience with Kubernetes and automated codeable deployments. Technically hands-on and excellent in design, coding and testing. Knowledge of REST and Spring Boot on any cloud. Cassandra or other NoSQL exposure. Exposure to AWS services, Lambda and containerization. Sound fundamentals of Core Java. Knowledge of build tools such as Maven or Gradle. Knowledge or background in product, project or program work.

Related Tech Stack: Back End: Java, Spring, Spring Boot, REST, AWS. Middleware: Kafka and its ecosystem, Cassandra or NoSQL. Testing: JUnit, Gatling, Cypress, Selenium. DevOps: Jenkins, GitHub, Docker, Sonar, Fortify. Others: JMeter, Groovy. Development Methodology and Engineering Practices: Agile (SCRUM or KANBAN) knowledge.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.
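For the Spring Boot/REST portion of the tech stack above, a minimal illustrative controller might look like this; the endpoint path, payload, and stubbed response are invented, and a real service layer (for example, backed by Cassandra per the stack listed) would sit behind it:

```java
import java.util.Map;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/members") // hypothetical resource path
public class MemberController {
    // A real implementation would delegate to a service backed by a datastore.
    @GetMapping("/{id}")
    public ResponseEntity<Map<String, Object>> getMember(@PathVariable("id") String id) {
        if (id.isBlank()) {
            return ResponseEntity.badRequest().build();
        }
        return ResponseEntity.ok(Map.of("id", id, "status", "ACTIVE")); // stubbed payload
    }
}
```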

Posted 3 days ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

Primary Responsibilities: Design and develop software applications and application components in an agile environment, using solid expertise on the Azure cloud platform. Write complex SQL queries on Azure PostgreSQL DB. Provide explanations and interpretations within your area of expertise. Create high-quality software by conducting peer design/code reviews and developing automated test cases. Apply design patterns and optimize the performance of services. Act as a hands-on engineer, leading by example and working independently. Build production-ready features and capabilities adhering to the best engineering standards. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications: Graduate degree or equivalent experience. 3+ years of software development experience using Azure cloud technologies such as Kafka and microservices architecture, including the Spring Boot framework, multi-threading, caching techniques, software engineering best practices, and CI/CD systems such as Jenkins and GitHub Actions. Experience designing and developing data-driven applications using RDBMS (Oracle, PL/SQL) and/or NoSQL datastores.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes — an enterprise priority reflected in our mission.
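One common caching technique of the kind this posting alludes to is per-key memoization behind a thread-safe map. A minimal sketch, not tied to this employer's code, using ConcurrentHashMap.computeIfAbsent (which is atomic per key, so each value is computed at most once even under concurrent access):

```java
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ConcurrentMap;
import java.util.function.Function;

public class MemoizingCache<K, V> {
    private final ConcurrentMap<K, V> cache = new ConcurrentHashMap<>();
    private final Function<K, V> loader;

    public MemoizingCache(Function<K, V> loader) {
        this.loader = loader;
    }

    /** computeIfAbsent is atomic per key, so concurrent callers
     *  compute each value at most once. */
    public V get(K key) {
        return cache.computeIfAbsent(key, loader);
    }

    public static void main(String[] args) {
        MemoizingCache<String, Integer> lengths = new MemoizingCache<>(s -> {
            System.out.println("computing for " + s); // runs once per key
            return s.length();
        });
        System.out.println(lengths.get("kafka")); // computes -> 5
        System.out.println(lengths.get("kafka")); // cached   -> 5
    }
}
```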

Posted 3 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

- 5+ years of experience in Java development, with a focus on Java 11/17 or later.
- Strong expertise in Spring Boot and Spring Batch for building enterprise-grade applications.
- Proficiency in working with Oracle DB, including writing optimized SQL queries and handling large datasets.
- Hands-on experience with Mockito, JUnit 5, and other testing frameworks.
- Familiarity with Gradle/Jenkins/Harness/Splunk for build and dependency management.
- Experience with RESTful API development and integration.
- Strong understanding of batch processing concepts, including chunk-based processing and tasklets (see the sketch after this list).
- Knowledge of exception handling, logging, and monitoring in Spring-based applications.
- Experience with version control systems like Git and tools like IntelliJ IDEA.
- Strong problem-solving skills and the ability to work independently or as part of a team.
- Hands-on experience with any one middleware (Kafka, IBM MQ or Solace) is an added advantage.
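To ground the chunk-based-processing requirement, here is an illustrative Spring Batch 5 style chunk step: read items, transform them, and write them in transactional chunks. The step name, reader contents, and transform are stand-ins invented for the sketch, not anything from the actual role:

```java
import java.util.List;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.step.builder.StepBuilder;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.support.ListItemReader;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
public class SettlementStepConfig {
    // Chunk-oriented step: read items, transform them, and write them
    // in transactional chunks of 100.
    @Bean
    public Step settlementStep(JobRepository jobRepository,
                               PlatformTransactionManager txManager) {
        ItemProcessor<String, String> toUpper = String::toUpperCase; // stand-in transform
        return new StepBuilder("settlementStep", jobRepository)
                .<String, String>chunk(100, txManager)
                .reader(new ListItemReader<>(List.of("trade-1", "trade-2"))) // stand-in reader
                .processor(toUpper)
                .writer(items -> items.forEach(System.out::println))          // stand-in writer
                .build();
    }
}
```

A tasklet step, by contrast, wraps a single unit of work (e.g., a cleanup or file move) rather than the read-process-write loop shown here.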

Posted 3 days ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Title: Java Developer. Experience: 6+ Years. Location: On-Site, Noida. Employment Type: Full-Time.

Job Summary: We are seeking a highly skilled and motivated Java Developer with 6+ years of hands-on experience in building scalable, high-performance backend systems. The ideal candidate will have strong expertise in Java 8, Core Java, multithreading, and microservices architecture, along with experience working with Kafka and the Spring Boot framework.

Key Responsibilities: Design, develop, and maintain high-quality Java-based applications using Java 8 and Spring Boot. Implement and optimize multithreaded applications for performance and scalability. Develop and integrate microservices-based solutions aligned with modern architectural patterns. Work with messaging systems like Kafka to ensure reliable communication between distributed systems. Collaborate with cross-functional teams to define, design, and ship new features. Identify bottlenecks and bugs and devise solutions to mitigate and address these issues. Contribute to all phases of the development lifecycle, including writing well-designed, testable, and efficient code. Apply strong problem-solving skills and analytical thinking in a fast-paced environment.

Required Skills: Core Java & Java 8 – Strong foundational knowledge with hands-on coding experience. Multithreading – Proven experience in writing efficient and thread-safe code. Spring Boot – Deep understanding of creating RESTful APIs, annotations, and dependency injection. Microservices – Experience in designing and developing loosely coupled services. Kafka – Working knowledge of message queues, producers, and consumers. Problem Solving – Strong analytical and debugging skills.
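As a generic illustration of the multithreading skills listed above (not code from this employer), an ExecutorService can fan independent lookups out across a fixed thread pool and join the results; the task bodies here just simulate I/O latency:

```java
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelFetch {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(4);
        try {
            // Submit independent units of work; each Callable may run on its own thread.
            List<Callable<Integer>> tasks = List.of(
                    () -> expensiveLookup("orders"),
                    () -> expensiveLookup("inventory"),
                    () -> expensiveLookup("payments"));
            int total = 0;
            for (Future<Integer> f : pool.invokeAll(tasks)) {
                total += f.get(); // get() rethrows any task failure as ExecutionException
            }
            System.out.println("total = " + total);
        } finally {
            pool.shutdown(); // always release the pool's threads
        }
    }

    private static int expensiveLookup(String source) throws InterruptedException {
        Thread.sleep(100); // simulated I/O latency
        return source.length();
    }
}
```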

Posted 3 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Teamwork makes the stream work. Roku is changing how the world watches TV. Roku is the #1 TV streaming platform in the U.S., Canada, and Mexico, and we've set our sights on powering every television in the world. Roku pioneered streaming to the TV. Our mission is to be the TV streaming platform that connects the entire TV ecosystem. We connect consumers to the content they love, enable content publishers to build and monetize large audiences, and provide advertisers unique capabilities to engage consumers. From your first day at Roku, you'll make a valuable - and valued - contribution. We're a fast-growing public company where no one is a bystander. We offer you the opportunity to delight millions of TV streamers around the world while gaining meaningful experience across a variety of disciplines.

About the Team: The Data Foundations team plays a critical role in supporting Roku Ads business intelligence and analytics. The team is responsible for developing and managing foundational datasets designed to serve the operational and analytical needs of the broader organization. The team's mission is carried out through three focus areas: acting as the interface between data producers and consumers, simplifying data architecture, and creating tools in a standardized way.

About the Role: We are seeking a talented and experienced Senior Software Engineer with a strong background in big data technologies, including Apache Spark and Apache Airflow. This hybrid role bridges software and data engineering, requiring expertise in designing, building, and maintaining scalable systems for both application development and data processing. You will collaborate with cross-functional teams to design and manage robust, production-grade, large-scale data systems. The ideal candidate is a proactive self-starter with a deep understanding of high-scale data services and a commitment to excellence.

What you'll be doing: Software Development: Write clean, maintainable, and efficient code, ensuring adherence to best practices through code reviews. Big Data Engineering: Design, develop, and maintain data pipelines and ETL workflows using Apache Spark and Apache Airflow. Optimize data storage, retrieval, and processing systems to ensure reliability, scalability, and performance. Develop and fine-tune complex queries and data processing jobs for large-scale datasets. Monitor, troubleshoot, and improve data systems for minimal downtime and maximum efficiency. Collaboration & Mentorship: Partner with data scientists, software engineers, and other teams to deliver integrated, high-quality solutions. Provide technical guidance and mentorship to junior engineers, promoting best practices in data engineering.

We're excited if you have: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience). 5+ years of experience in software and/or data engineering with expertise in big data technologies such as Apache Spark, Apache Airflow and Trino. Strong understanding of SOLID principles and distributed systems architecture. Proven experience in distributed data processing, data warehousing, and real-time data pipelines. Advanced SQL skills, with expertise in query optimization for large datasets. Exceptional problem-solving abilities and the capacity to work independently or collaboratively. Excellent verbal and written communication skills. Experience with cloud platforms such as AWS, GCP, or Azure, and containerization tools like Docker and Kubernetes (preferred). Familiarity with additional big data technologies, including Hadoop, Kafka, and Presto (preferred). Strong programming skills in Python, Java, or Scala (preferred). Knowledge of CI/CD pipelines, DevOps practices, and infrastructure-as-code tools (e.g., Terraform) (preferred). Expertise in data modeling, schema design, and data visualization tools (preferred).

Benefits: Roku is committed to offering a diverse range of benefits as part of our compensation package to support our employees and their families. Our comprehensive benefits include global access to mental health and financial wellness support and resources. Local benefits include statutory and voluntary benefits which may include healthcare (medical, dental, and vision), life, accident, disability, commuter, and retirement options (401(k)/pension). Our employees can take time off work for vacation and other personal reasons to balance their evolving work and life needs. It's important to note that not every benefit is available in all locations or for every role. For details specific to your location, please consult with your recruiter.

The Roku Culture: Roku is a great place for people who want to work in a fast-paced environment where everyone is focused on the company's success rather than their own. We try to surround ourselves with people who are great at their jobs, who are easy to work with, and who keep their egos in check. We appreciate a sense of humor. We believe a fewer number of very talented folks can do more for less cost than a larger number of less talented teams. We're independent thinkers with big ideas who act boldly, move fast and accomplish extraordinary things through collaboration and trust. In short, at Roku you'll be part of a company that's changing how the world watches TV. We have a unique culture that we are proud of. We think of ourselves primarily as problem-solvers, which itself is a two-part idea. We come up with the solution, but the solution isn't real until it is built and delivered to the customer. That penchant for action gives us a pragmatic approach to innovation, one that has served us well since 2002. To learn more about Roku, our global footprint, and how we've grown, visit https://www.weareroku.com/factsheet. By providing your information, you acknowledge that you have read our Applicant Privacy Notice and authorize Roku to process your data subject to those terms.

Posted 3 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra

Remote

Job Description Senior Data Engineer Our Enterprise Data & Analytics (EDA) is looking for an experienced Senior Data Engineer to join our growing data engineering team. You’ll work in a collaborative Agile environment using the latest engineering best practices with involvement in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural & data modeling practices to maintain the foundation data layer serving as a single source of truth across Zendesk . You will be primarily developing Data Warehouse Solutions in BigQuery/Snowflake using technologies such as dbt, Airflow, Terraform. What you get to do every single day: Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes and design data models Serve as Data Model subject matter expert and data model spokesperson, demonstrated by the ability to address questions quickly and accurately Implement Enterprise Data Warehouse by transforming raw data into schemas and data models for various business domains using SQL & dbt Design, build, and maintain ELT pipelines in Enterprise Data Warehouse to ensure reliable business reporting using Airflow, Fivetran & dbt Optimize data warehousing processes by refining naming conventions, enhancing data modeling, and implementing best practices for data quality testing Build analytics solutions that provide practical insights into customer 360, finance, product, sales and other key business domains Build and Promote best engineering practices in areas of version control system, CI/CD, code review, pair programming Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery Work with data and analytics experts to strive for greater functionality in our data systems What you bring to the role: Basic Qualifications 5+ years of data engineering experience building, working & maintaining data pipelines & ETL processes on big data environments 5+ years of experience in Data Modeling and Data Architecture in a production environment 5+ years in writing complex SQL queries 5+ years of experience with Cloud columnar databases (We use Snowflake) 2+ years of production experience working with dbt and designing and implementing Data Warehouse solutions Ability to work closely with data scientists, analysts, and other stakeholders to translate business requirements into technical solutions. Strong documentation skills for pipeline design and data flow diagrams. Intermediate experience with any of the programming language: Python, Go, Java, Scala, we primarily use Python Integration with 3rd party API SaaS applications like Salesforce, Zuora, etc Ensure data integrity and accuracy by conducting regular data audits, identifying and resolving data quality issues, and implementing data governance best practices. Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement. 
Preferred Qualifications
Hands-on experience with the Snowflake data platform, including administration, SQL scripting, and query performance tuning
Good knowledge of modern as well as classic data modeling approaches (Kimball, Inmon, etc.)
Demonstrated experience in one or more business domains (Finance, Sales, Marketing)
3+ completed production-grade projects with dbt
Expert knowledge of Python

What does our data stack look like:
ELT (Snowflake, Fivetran, dbt, Airflow, Kafka, HighTouch)
BI (Tableau, Looker)
Infrastructure (GCP, AWS, Kubernetes, Terraform, GitHub Actions)

Please note that Zendesk can only hire candidates who are physically located and plan to work from Karnataka or Maharashtra. Please refer to the location posted on the requisition for where this role is based.

Hybrid: In this role, our hybrid experience is designed at the team level to give you a rich onsite experience packed with connection, collaboration, learning, and celebration - while also giving you flexibility to work remotely for part of the week. This role must attend our local office for part of the week. The specific in-office schedule is to be determined by the hiring manager.

The intelligent heart of customer experience
Zendesk software was built to bring a sense of calm to the chaotic world of customer service. Today we power billions of conversations with brands you know and love. Zendesk believes in offering our people a fulfilling and inclusive experience. Our hybrid way of working enables us to purposefully come together in person, at one of our many Zendesk offices around the world, to connect, collaborate, and learn, whilst also giving our people the flexibility to work remotely for part of the week.

Zendesk is an equal opportunity employer, and we're proud of our ongoing efforts to foster global diversity, equity, & inclusion in the workplace. Individuals seeking employment and employees at Zendesk are considered without regard to race, color, religion, national origin, age, sex, gender, gender identity, gender expression, sexual orientation, marital status, medical condition, ancestry, disability, military or veteran status, or any other characteristic protected by applicable law. We are an AA/EEO/Veterans/Disabled employer. If you are based in the United States and would like more information about your EEO rights under the law, please click here.

Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans pursuant to applicable federal and state law. If you are an individual with a disability and require a reasonable accommodation to submit this application, complete any pre-employment testing, or otherwise participate in the employee selection process, please send an e-mail to peopleandplaces@zendesk.com with your specific accommodation request.

Posted 3 days ago

Apply

0.0 years

0 Lacs

Pune, Maharashtra

Remote

Job Description: Data Analytics Engineer

Our Enterprise Data & Analytics (EDA) team is looking for a Data Engineer to join our growing data engineering team. We are a globally distributed, remote-first team. You'll work in a collaborative Agile environment using the latest engineering best practices, with involvement in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural and data modeling practices. You will primarily develop Data Warehouse solutions using technologies such as dbt, Airflow, and Terraform.

What you get to do every single day:
Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes, and design data models
Use best engineering practices such as version control, CI/CD, code review, and pair programming
Design, build, and maintain ELT pipelines in the Enterprise Data Warehouse to ensure reliable business reporting
Design and build ELT-based data models using SQL and dbt
Build analytics solutions that provide practical insights into customer 360, finance, product, sales, and other key business domains
Identify, design, and implement internal process improvements: automating manual processes and optimizing data delivery
Work with data and analytics experts to strive for greater functionality in our data systems

What you bring to the role:

Basic Qualifications
2+ years of data/analytics engineering experience building, working with, and maintaining data pipelines and ETL processes in big data environments
Basic knowledge of modern as well as classic data modeling approaches (Kimball, Inmon, etc.)
Experience with any of the following programming languages: Python, Go, Java, Scala (we primarily use Python)
SQL knowledge and experience working with cloud columnar databases (we use Snowflake), as well as working familiarity with a variety of databases
Familiarity with processes supporting data transformation, data structures, metadata, dependency, and workload management
Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
Strong communication skills and adaptability to changing requirements and tech stacks

Preferred Qualifications
Demonstrated experience in one or many business domains
1+ completed projects with dbt
Proficient knowledge of SQL and/or Python
Experience using Airflow as a data orchestration tool

What does our data stack look like:
ELT (Snowflake, dbt, Airflow, Kafka, HighTouch)
BI (Tableau, Looker)
Infrastructure (GCP, AWS, Kubernetes, Terraform, GitHub Actions)

Please note that Zendesk can only hire candidates who are physically located and plan to work from Karnataka or Maharashtra. Please refer to the location posted on the requisition for where this role is based.

Hybrid: In this role, our hybrid experience is designed at the team level to give you a rich onsite experience packed with connection, collaboration, learning, and celebration - while also giving you flexibility to work remotely for part of the week. This role must attend our local office for part of the week. The specific in-office schedule is to be determined by the hiring manager.

The intelligent heart of customer experience
Zendesk software was built to bring a sense of calm to the chaotic world of customer service. Today we power billions of conversations with brands you know and love. Zendesk believes in offering our people a fulfilling and inclusive experience. Our hybrid way of working enables us to purposefully come together in person, at one of our many Zendesk offices around the world, to connect, collaborate, and learn, whilst also giving our people the flexibility to work remotely for part of the week.

Zendesk is an equal opportunity employer, and we're proud of our ongoing efforts to foster global diversity, equity, & inclusion in the workplace. Individuals seeking employment and employees at Zendesk are considered without regard to race, color, religion, national origin, age, sex, gender, gender identity, gender expression, sexual orientation, marital status, medical condition, ancestry, disability, military or veteran status, or any other characteristic protected by applicable law. We are an AA/EEO/Veterans/Disabled employer. If you are based in the United States and would like more information about your EEO rights under the law, please click here.

Zendesk endeavors to make reasonable accommodations for applicants with disabilities and disabled veterans pursuant to applicable federal and state law. If you are an individual with a disability and require a reasonable accommodation to submit this application, complete any pre-employment testing, or otherwise participate in the employee selection process, please send an e-mail to peopleandplaces@zendesk.com with your specific accommodation request.
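The root-cause analysis this role asks for usually starts with cheap structural checks - row counts and null rates - before anything more elaborate. Here is a minimal, runnable sketch of such an audit; sqlite3 stands in for a real warehouse connection (e.g., Snowflake), and the table and column names are invented.

# Toy data-quality audit: row count plus per-column null rate.
# sqlite3 is a stand-in for a warehouse connection; names are hypothetical.
import sqlite3

def audit_table(conn, table, columns, max_null_rate=0.05):
    """Return human-readable findings for one table."""
    findings = []
    (total,) = conn.execute(f"SELECT COUNT(*) FROM {table}").fetchone()
    if total == 0:
        return [f"{table}: table is empty"]
    for col in columns:
        (nulls,) = conn.execute(
            f"SELECT COUNT(*) FROM {table} WHERE {col} IS NULL"
        ).fetchone()
        rate = nulls / total
        if rate > max_null_rate:
            findings.append(f"{table}.{col}: null rate {rate:.1%} exceeds threshold")
    return findings

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE accounts (id INTEGER, region TEXT)")
    conn.executemany("INSERT INTO accounts VALUES (?, ?)",
                     [(1, "emea"), (2, None), (3, None)])
    for finding in audit_table(conn, "accounts", ["id", "region"]):
        print(finding)  # flags accounts.region with a 66.7% null rate

In a dbt-centric stack, the same checks would more often live as dbt tests; the point of the sketch is the shape of the audit, not the tooling.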

Posted 3 days ago

Apply

0.0 - 12.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Job Description:

Job Responsibilities:
Collaborates with Product and Engineering stakeholders to design and build platform services that meet key product and infrastructure requirements
Produces both high-level and detailed designs for platform-level services
Evaluates software and products against business requirements and turns business requirements into robust technical solutions that fit corporate standards and strategy
Designs and implements microservices with thoughtfully defined APIs
Is conversant with relevant frameworks and architectures: Spring Boot, Spring Cloud, Spring Batch, messaging frameworks (such as Kafka), and microservice architecture
Works with other areas of the technology team to realize end-to-end solutions and estimates for delivery proposals
Has a sound understanding of Java concepts and of the technologies in the various architecture tiers - presentation, middleware, data access, and integration - to propose solutions using Java and open-source technologies
Designs modules that are scalable, reusable, modular, and secure
Clearly communicates design decisions, roadblocks, and timelines to key stakeholders
Adheres to all industry best practices and standards for the Agile/Scrum frameworks adopted by the organization, including but not limited to daily stand-ups, grooming, planning, retrospectives, sprint reviews, demos, and analytics via systems (JIRA) administration, to directly support initiatives set by Product Management and the organization at large
Actively participates in production stabilization and leads system software improvements along with team members

Technical Skills:
Candidates should have at least 8+ years of total experience in IT software development/design/architecture
3+ years of experience as an Architect building distributed, highly available, and scalable microservice-based cloud-native architectures
Experience in one or more open-source Java frameworks such as Spring Boot, Spring Batch, Quartz, Spring Cloud, Spring Security, BPM, etc.
Experience in a single-page web application framework like Angular
Experience with at least one messaging system (Apache Kafka (required), RabbitMQ)
Experience with at least one RDBMS (MySQL, PostgreSQL, Oracle)
Experience with at least one document-oriented DB (MongoDB, preferably Couchbase DB)
Experience with a NoSQL DB like Elasticsearch
Proficient in creating design documents - LLD documents with UML
Good exposure to design patterns, microservices architecture design patterns, and 12-factor applications
Experience working with observability/monitoring frameworks (Prometheus/Grafana, ELK) along with any APM tool
Ability to conceptualize end-to-end system components across a wide range of technologies and translate them into architectural design patterns for implementation
Knowledge of security systems like OAuth 2, Keycloak, and SAML
Familiarity with source code version control systems like Git/SVN
Experience using, designing, and building REST/gRPC/GraphQL/web service APIs
Production experience with container orchestration (Docker, Kubernetes, CI/CD) and maintaining production environments
Good understanding of public clouds (GCP, AWS, etc.)
Good exposure to API gateways and config servers
Familiarity with OWASP
Experience in Telecom BSS (Business Support System) CRM components is an added advantage

Qualification: BE/B.Tech/M.Tech/MCA with a Computer Science background

Mandatory Skills: Spring Cloud, Spring Boot, Kafka
Location: Bengaluru, Karnataka, India
Years Of Exp: 8 to 12 years

Why you should choose us?
Are you interested in working for a Global Leader in E-commerce? Are you excited about working on highly scalable platforms and applications that are accessed by millions of users every day? If so, read on to find out more about the opportunity.

Rakuten is the largest E-commerce company in Japan and one of the largest E-commerce and Internet Services companies in the World. Rakuten is ranked in the top 20 most innovative companies in the world by Forbes. Rakuten India Development Centre is the second largest technology hub outside of Japan that enables and builds platforms for global E-commerce, Payments, Digital, AI, and Data Science services across the globe. The India arm serves as a research and development center with an employee strength of around 450+ (and growing).

Rakuten is committed to cultivating and preserving a culture of inclusion and connectedness. We are able to grow and learn better together with a diverse team and inclusive workforce. The collective sum of the individual differences, life experiences, knowledge, innovation, self-expression, and talent that our employees invest in their work represents not only part of our culture, but our reputation and Rakuten's achievement as well. In recruiting for our team, we welcome the unique contributions that you can bring in terms of your education, opinions, culture, ethnicity, race, sex, gender identity and expression, nation of origin, age, languages spoken, veteran's status, color, religion, disability, sexual orientation, and beliefs.
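The event-driven side of this role reduces to a consume-process-commit loop per service. The listing itself is Java/Spring, but the pattern is language-neutral; below is a minimal sketch using the kafka-python client, with the topic, group id, and broker address invented for illustration.

# Minimal consume-process loop behind event-driven microservices.
# Topic, group id, and broker address are illustrative assumptions.
import json

from kafka import KafkaConsumer  # pip install kafka-python

consumer = KafkaConsumer(
    "order-events",                          # hypothetical topic
    bootstrap_servers=["localhost:9092"],
    group_id="billing-service",              # one consumer group per service
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    enable_auto_commit=False,                # commit only after successful handling
)

for message in consumer:
    event = message.value
    # Handler should be idempotent: a lost commit means the event is redelivered.
    print(f"billing: handling {event.get('type')} for order {event.get('order_id')}")
    consumer.commit()  # at-least-once delivery: commit offsets after processing

Committing offsets only after processing gives at-least-once semantics, which is why the comment stresses idempotent handlers; a Spring Kafka listener would make the same trade-off with manual acknowledgment.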

Posted 3 days ago

Apply

2.0 - 10.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Job Description:
We are looking for a Java, Spring Boot, and microservices professional with 6-10 years of experience who can technically lead a team of 4 to 6 members.

Key Responsibilities:
Works with the solution Designer/Architect to understand and clarify business requirements
Develop microservices that are REST-based or event-based using Java, Spring Boot, and Kafka
Lead and mentor junior developers and draw them into technical discussions
Coordinate cross-functional teams to deliver on the activities for existing technology solutions
Responsible for code reviews with the team members
Ensure the microservice system architecture is well maintained, per the standards and core principles of the business
Participate in sprint planning, system architecture, solution design, and technical detailed design meetings
Deliver high-quality code that adheres to standards and best practices
Present development progress updates to the teams and stakeholders
Be responsible for assessing current software development practices and principles to identify and implement process improvements

Technical skills you should have:
Overall, 6 to 10 years of experience with Java 8, Spring Boot, and microservices
Minimum 2 years of work experience as a Technical Lead
Must have development experience with Reactive Spring Boot
Solid experience using Spring Boot, Spring Core, Reactive Spring, Spring Cloud, Spring Data Integration, Hibernate, and microservices
Should have experience in writing and consuming REST APIs
Must have strong experience with microservice architecture concepts
Must have experience with Kafka (event-based model) and Elasticsearch
Must have experience with a NoSQL database like Couchbase (preferred), MongoDB, or Cassandra, and a SQL database like MySQL
Experience with designing, implementing, and deploying microservices in distributed systems
Experience using Git, JUnit, Maven, Jenkins, and Sonar
Decent knowledge of cloud platforms and tooling: Docker, Kubernetes, CI/CD, AWS, or any other
Good exposure to design patterns, API gateways, and config servers

Mandatory Skills: Java, Spring Boot, Microservices
Location: Bengaluru, Karnataka, India
Years Of Exp: 6 to 10 years

Why you should choose us?
Are you interested in working for a Global Leader in E-commerce? Are you excited about working on highly scalable platforms and applications that are accessed by millions of users every day? If so, read on to find out more about the opportunity.

Rakuten is the largest E-commerce company in Japan and one of the largest E-commerce and Internet Services companies in the World. Rakuten is ranked in the top 20 most innovative companies in the world by Forbes. Rakuten India Development Centre is the second largest technology hub outside of Japan that enables and builds platforms for global E-commerce, Payments, Digital, AI, and Data Science services across the globe. The India arm serves as a research and development center with an employee strength of around 450+ (and growing).

Rakuten is committed to cultivating and preserving a culture of inclusion and connectedness. We are able to grow and learn better together with a diverse team and inclusive workforce. The collective sum of the individual differences, life experiences, knowledge, innovation, self-expression, and talent that our employees invest in their work represents not only part of our culture, but our reputation and Rakuten's achievement as well. In recruiting for our team, we welcome the unique contributions that you can bring in terms of your education, opinions, culture, ethnicity, race, sex, gender identity and expression, nation of origin, age, languages spoken, veteran's status, color, religion, disability, sexual orientation, and beliefs.
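On the "writing and consuming REST APIs" requirement that recurs in these listings: the consuming side is mostly about the failure path - timeouts, status checks, and retries with backoff. A language-neutral sketch follows (in Python for brevity; the endpoint URL and payload shape are hypothetical).

# Consuming a REST API defensively: timeout, status check, retry with backoff.
# The endpoint URL and response shape are hypothetical.
import time

import requests

def fetch_order(order_id: str, retries: int = 3) -> dict:
    url = f"https://api.example.com/v1/orders/{order_id}"  # hypothetical endpoint
    for attempt in range(1, retries + 1):
        try:
            resp = requests.get(url, timeout=5)   # never call out without a timeout
            resp.raise_for_status()               # surface 4xx/5xx as exceptions
            return resp.json()
        except requests.RequestException:
            if attempt == retries:
                raise                             # give up after the last attempt
            time.sleep(2 ** attempt)              # simple exponential backoff

# Example: fetch_order("A-1001") would return the decoded JSON body.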

Posted 3 days ago

Apply

0.0 - 13.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Job Description:

Job Title: Staff Engineer - MCPD

Department Overview:
The Marketing Cloud Platform Department (MCPD)'s mission is to lead Rakuten's marketing-related product strategy and to execute product development and implementation. We empower Rakuten's internal marketing teams by building engaging, respectful, and cost-efficient marketing platforms that put our customer at the center. Our main advantage comes from our ability to leverage the Rakuten ecosystem. We provide marketing solutions such as marketing campaign management, multichannel communication, and personalization. With 200+ experts across Japan, India, and Singapore, we are proud to be a technology organization, and we share our knowledge across the Rakuten Tech community.

Position Overview:
We are seeking a highly skilled and experienced Staff Engineer to join our engineering team. The ideal candidate will possess deep expertise in Java, advanced Java, and microservices architecture, with strong skills in system design, low-level design (LLD), and database scaling. You will play a pivotal role in shaping our technology stack, ensuring high availability and scalability, and mentoring junior engineers. A key focus will be on leveraging Kubernetes for container orchestration and driving technical excellence across the organization. As a Staff Engineer, you will be responsible for hands-on development and for overseeing the delivery of our MCPD products, ensuring the highest standards of quality, performance, and reliability.

Key Responsibilities:
Architect and Design Systems: Lead the design and development of highly scalable and resilient microservices-based systems, providing both high-level architecture (HLD) and detailed low-level designs (LLD)
Code Reviews and Best Practices: Drive engineering best practices through code reviews, design discussions, and collaboration with cross-functional teams
Database Scaling: Design and optimize databases to support high transaction volumes, ensuring efficient scaling and performance tuning for production environments
Microservices Development: Build, deploy, and manage microservices using modern technologies and frameworks, ensuring smooth operation in distributed environments
System Ownership: Take full ownership of features or systems from conception to production, ensuring they meet reliability, performance, and security standards
Kubernetes & Cloud-Native Architecture: Leverage Kubernetes for deploying, scaling, and managing containerized applications, ensuring seamless orchestration in cloud environments
Mentorship and Leadership: Mentor junior engineers and contribute to their professional growth through knowledge sharing, coaching, and promoting a culture of continuous improvement
Collaboration: Work closely with product managers, architects, and other engineering teams to translate business requirements into technical solutions
Innovation and R&D: Stay updated with the latest technology trends and evaluate new tools, frameworks, and methodologies to ensure the team is leveraging cutting-edge solutions

Qualifications:
Bachelor's or master's degree in Computer Science, Engineering, or a related field
10+ years of experience in software development, with expertise in Java and advanced Java concepts
Proven experience with microservices architecture, including hands-on experience building and maintaining distributed systems
Strong understanding of system design and the ability to create high-availability, high-performance applications
Experience with low-level design (LLD), providing detailed design solutions for complex problems
Deep knowledge of database scaling techniques, such as partitioning, sharding, and replication
Proficiency in Kubernetes for container orchestration, including production experience with large-scale deployments
Knowledge of messaging systems (Kafka, RabbitMQ)
Familiarity with cloud platforms (AWS, GCP, Azure) and modern DevOps practices

Preferred Skills:
Experience with other languages such as Python, Go, or Node.js
Familiarity with AIOps, observability, and monitoring tools

Mandatory Skills: Java, Spring Boot, Go, Python, LLD, HLD
Location: Bangalore, Karnataka, India
Years Of Exp: 10 to 13 years

Why you should choose us?
Are you interested in working for a Global Leader in E-commerce? Are you excited about working on highly scalable platforms and applications that are accessed by millions of users every day? If so, read on to find out more about the opportunity.

Rakuten is the largest E-commerce company in Japan and one of the largest E-commerce and Internet Services companies in the World. Rakuten is ranked in the top 20 most innovative companies in the world by Forbes. Rakuten India Development Centre is the second largest technology hub outside of Japan that enables and builds platforms for global E-commerce, Payments, Digital, AI, and Data Science services across the globe. The India arm serves as a research and development center with an employee strength of around 450+ (and growing).

Rakuten is committed to cultivating and preserving a culture of inclusion and connectedness. We are able to grow and learn better together with a diverse team and inclusive workforce. The collective sum of the individual differences, life experiences, knowledge, innovation, self-expression, and talent that our employees invest in their work represents not only part of our culture, but our reputation and Rakuten's achievement as well. In recruiting for our team, we welcome the unique contributions that you can bring in terms of your education, opinions, culture, ethnicity, race, sex, gender identity and expression, nation of origin, age, languages spoken, veteran's status, color, religion, disability, sexual orientation, and beliefs.
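Of the database-scaling techniques this role names - partitioning, sharding, replication - sharding is the easiest to show in miniature: deterministically route each key to one of N shards. A toy sketch; the shard count and connection strings are assumptions.

# Toy hash-based sharding: stable mapping of a tenant/user key to a shard.
# Shard count and connection strings are illustrative assumptions.
import hashlib

SHARDS = [
    "postgres://db-shard-0.internal/app",   # hypothetical DSNs
    "postgres://db-shard-1.internal/app",
    "postgres://db-shard-2.internal/app",
    "postgres://db-shard-3.internal/app",
]

def shard_for(key: str) -> str:
    """md5 gives a stable hash across processes (Python's hash() is seeded)."""
    digest = hashlib.md5(key.encode("utf-8")).digest()
    index = int.from_bytes(digest[:4], "big") % len(SHARDS)
    return SHARDS[index]

print(shard_for("customer-42"))  # the same key always lands on the same shard

The caveat the interview question usually probes: naive mod-N routing remaps most keys when N changes, which is why production systems reach for consistent hashing or directory-based shard maps when reshaping the fleet.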

Posted 3 days ago

Apply

0.0 - 6.0 years

0 Lacs

Saidapet, Chennai, Tamil Nadu

On-site

Job Information
Date Opened: 07/11/2025
City: Saidapet
Country: India
Job Role: Data Engineering
State/Province: Tamil Nadu
Industry: IT Services
Job Type: Full time
Zip/Postal Code: 600096

Job Description

Introduction to the Role:
Are you passionate about unlocking the power of data to drive innovation and transform business outcomes? Join our cutting-edge Data Engineering team and be a key player in delivering scalable, secure, and high-performing data solutions across the enterprise. As a Data Engineer, you will play a central role in designing and developing modern data pipelines and platforms that support data-driven decision-making and AI-powered products. With a focus on Python, SQL, AWS, PySpark, and Databricks, you'll enable the transformation of raw data into valuable insights by applying engineering best practices in a cloud-first environment. We are looking for a highly motivated professional who can work across teams to build and manage robust, efficient, and secure data ecosystems that support both analytical and operational workloads.

Accountabilities:
Design, build, and optimize scalable data pipelines using PySpark, Databricks, and SQL on AWS cloud platforms
Collaborate with data analysts, data scientists, and business users to understand data requirements and ensure reliable, high-quality data delivery
Implement batch and streaming data ingestion frameworks from a variety of sources (structured, semi-structured, and unstructured data)
Develop reusable, parameterized ETL/ELT components and data ingestion frameworks
Perform data transformation, cleansing, validation, and enrichment using Python and PySpark
Build and maintain data models, data marts, and logical/physical data structures that support BI, analytics, and AI initiatives
Apply best practices in software engineering, version control (Git), code reviews, and agile development processes
Ensure data pipelines are well tested, monitored, and robust, with proper logging and alerting mechanisms
Optimize performance of distributed data processing workflows and large datasets
Leverage AWS services (such as S3, Glue, Lambda, EMR, Redshift, Athena) for data orchestration and lakehouse architecture design
Participate in data governance practices and ensure compliance with data privacy, security, and quality standards
Contribute to documentation of processes, workflows, metadata, and lineage using tools such as data catalogs or Collibra (if applicable)
Drive continuous improvement in engineering practices, tools, and automation to increase productivity and delivery quality

Essential Skills / Experience:
4 to 6 years of professional experience in Data Engineering or a related field
Strong programming experience with Python, including using Python for data wrangling, pipeline automation, and scripting
Deep expertise in writing complex and optimized SQL queries on large-scale datasets
Solid hands-on experience with PySpark and distributed data processing frameworks
Expertise working with Databricks for developing and orchestrating data pipelines
Experience with AWS cloud services such as S3, Glue, EMR, Athena, Redshift, and Lambda
Practical understanding of ETL/ELT development patterns and data modeling principles (Star/Snowflake schemas)
Experience with job orchestration tools like Airflow, Databricks Jobs, or AWS Step Functions
Understanding of data lake, lakehouse, and data warehouse architectures
Familiarity with DevOps and CI/CD tools for code deployment (e.g., Git, Jenkins, GitHub Actions)
Strong troubleshooting and performance optimization skills in large-scale data processing environments
Excellent communication and collaboration skills, with the ability to work in cross-functional agile teams

Desirable Skills / Experience:
AWS or Databricks certifications (e.g., AWS Certified Data Analytics, Databricks Data Engineer Associate/Professional)
Exposure to data observability, monitoring, and alerting frameworks (e.g., Monte Carlo, Datadog, CloudWatch)
Experience working in healthcare, life sciences, finance, or another regulated industry
Familiarity with data governance and compliance standards (GDPR, HIPAA, etc.)
Knowledge of modern data architectures (Data Mesh, Data Fabric)
Exposure to streaming data tools like Kafka, Kinesis, or Spark Structured Streaming
Experience with data visualization tools such as Power BI, Tableau, or QuickSight

Work Environment & Collaboration:
We value a hybrid, collaborative environment that encourages shared learning and innovation. You will work closely with product owners, architects, analysts, and data scientists across geographies to solve real-world business problems using cutting-edge technologies and methodologies. We encourage flexibility while maintaining a strong in-office presence for better team synergy and innovation.

About Agilisium
Agilisium is an AWS Advanced Consulting Partner that enables companies to accelerate their "data-to-insights leap." With $25+ million in annual revenue and over 40% year-over-year growth, Agilisium is one of the fastest-growing IT solution providers in Southern California. Our most important asset? People. Talent management plays a vital role in our business strategy. We're looking for "drivers": big thinkers with a growth and strategic mindset, people who are committed to customer obsession and aren't afraid to experiment with new ideas. And we are all about finding and nurturing individuals who are ready to do great work. At Agilisium, you'll collaborate with great minds while being challenged to meet and exceed your potential.
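As a flavor of the PySpark work in the Agilisium listing above: a small batch job that reads raw events, cleanses and deduplicates them, and writes a partitioned Parquet layer. A sketch only; the bucket paths and column names are assumptions, not a real pipeline.

# Minimal PySpark batch job: raw JSON in S3 -> cleansed, partitioned Parquet.
# Bucket paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("events-cleanse").getOrCreate()

raw = spark.read.json("s3://example-raw/events/")   # hypothetical bucket

cleansed = (
    raw.filter(F.col("event_id").isNotNull())             # basic validation
       .withColumn("event_ts", F.to_timestamp("event_ts"))  # string -> timestamp
       .withColumn("event_date", F.to_date("event_ts"))     # partition key
       .dropDuplicates(["event_id"])                        # drop replayed events
)

(cleansed.write
    .mode("overwrite")
    .partitionBy("event_date")                              # prunes downstream scans
    .parquet("s3://example-curated/events/"))               # hypothetical bucket

Partitioning by event date is the design choice doing the heavy lifting here: downstream SQL that filters on a date range only touches the matching partitions.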

Posted 3 days ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana

On-site

Position Title: Enterprise Integration Architect
Position Type: Regular - Full-Time
Position Location: Gurgaon
Requisition ID: 37105

Job Overview:
We are seeking a highly experienced Senior Application Integration Architect to lead the design, development, and governance of enterprise integration solutions across cloud, hybrid, and on-premises environments. This role requires deep expertise in integration architecture patterns, API-led connectivity, event-driven architecture, API management, and iPaaS platforms to drive digital transformation initiatives and ensure scalable, secure, and maintainable integrations across our application landscape. You will collaborate closely with enterprise architects, product teams, infrastructure, and security stakeholders to shape the integration strategy, define best practices, and oversee execution.

Key Responsibilities:
Define and implement enterprise integration architectures leveraging API-led and event-driven patterns to enable agility, reusability, and rapid delivery
Lead the design and governance of integration solutions, ensuring compliance with enterprise-wide architecture principles
Provide best practices for API management, EDA (event-driven architecture), and hybrid cloud integrations
Collaborate with business stakeholders, solution architects, and development teams to design and implement integration solutions
Define integration patterns, standards, and guidelines to improve performance, reusability, and security
Ensure smooth integration with SAP S/4HANA, legacy systems, third-party applications, and external APIs
Drive CI/CD automation, DevOps practices, and monitoring strategies for integration solutions
Troubleshoot complex integration issues and optimize solutions for better performance
Guide and mentor integration developers in implementing best practices and architectural guidelines

Required Skills & Qualifications:
10+ years of experience in enterprise integration architecture and iPaaS solutions
5+ years of hands-on experience with any iPaaS platform with cloud integration, API management, event mesh, and B2B integration capabilities
Strong knowledge of integration architectures, patterns, middleware, APIs (REST/SOAP), microservices, event-driven architectures, and security best practices
Proficiency in API security, OAuth, JWT, SAML, and other authentication mechanisms
Hands-on experience with CI/CD pipelines, DevOps, and monitoring tools for integration solutions
Knowledge of Kafka, MQ, webhooks, and other event streaming technologies
Ability to lead architecture discussions, influence stakeholders, and align technology with business goals
Excellent communication, leadership, and documentation skills

Soft Skills:
Strong communication and stakeholder management skills
Strategic thinking with a passion for problem solving and innovation
Ability to work in a fast-paced, cross-functional environment
Proven leadership and mentoring capabilities

McCain Foods is an equal opportunity employer. As a global family-owned company, we strive to be the employer of choice in the diverse communities around the world in which we live and work. We recognize that inclusion drives our creativity, resilience, and success and makes our business stronger. All qualified applicants will receive consideration for employment without regard to race, religion, color, national origin, sex, age, veteran status, disability, or any other protected characteristic under applicable law.

McCain is an accessible employer. If you require an accommodation throughout the recruitment process (including alternate formats of materials or accessible meeting rooms), please let us know and we will work with you to find appropriate solutions.

Your privacy is important to us. By submitting personal data or information to us, you agree this will be handled in accordance with McCain's Global Privacy Policy and Global Employee Privacy Policy, as applicable. You can understand how your personal information is being handled here.

Job Family: Information Technology
Division: Global Digital Technology
Department: GDTC India Function
Location(s): IN - India : Haryana : Gurgaon
Company: McCain Foods(India) P Ltd
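On the API-security requirement above (OAuth, JWT): the gatekeeping step at an integration edge is usually token validation. A minimal sketch with PyJWT follows; the shared secret, audience, and claims are stand-ins, and real OAuth 2.0 setups typically verify RS256 signatures against the identity provider's published JWKS rather than a shared secret.

# Minimal JWT validation at an integration boundary, using PyJWT.
# Secret, audience, and claims are stand-ins; production systems usually
# verify RS256 tokens against the identity provider's JWKS.
import jwt  # pip install PyJWT

SECRET = "shared-demo-secret"   # assumption: HS256 shared secret for the demo

def validate(token: str) -> dict:
    """Decode and verify a bearer token; raises a jwt exception on failure."""
    return jwt.decode(
        token,
        SECRET,
        algorithms=["HS256"],              # pin algorithms; never accept 'none'
        audience="integration-gateway",    # hypothetical audience claim
        options={"require": ["exp", "aud"]},  # expiry and audience are mandatory
    )

demo = jwt.encode(
    {"sub": "svc-billing", "aud": "integration-gateway", "exp": 4102444800},
    SECRET,
    algorithm="HS256",
)
print(validate(demo)["sub"])  # prints: svc-billing

Pinning the accepted algorithms is the non-negotiable part of the sketch: accepting whatever algorithm the token header claims is a classic JWT vulnerability.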

Posted 3 days ago

Apply

0.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Job Description:

As a Camunda Developer, you will design, develop, and implement Camunda components, primarily Job Workers and Process Models.

Requirements:
4 to 6 years of IT experience, with at least 2 years of hands-on development experience using Camunda
Develop, test, deploy, and maintain large-scale Java 8+ applications using a Spring Boot microservices architecture
Good at algorithms and data structures
Troubleshoot issues in production environments by analyzing log files, applying debugging techniques, and collaborating with team members
Good problem-solving and communication skills
Hands-on coding experience in Core Java, Spring Boot, Spring Security, Spring Data JPA, and Kafka
Hands-on experience with RESTful web services and microservices
Hands-on experience with Kafka or any messaging queue
Hands-on experience writing database queries, whether RDBMS/SQL or NoSQL
Hands-on knowledge of writing unit test cases using Java test frameworks (JUnit, Mockito)
Profound insight into Java and JEE internals (class loading, memory management, transaction management, etc.)
Experience with Atlassian Jira/Confluence
Experience with version control and CI/CD tools like GitLab/GitHub

Secondary:
Collaborate with cross-functional teams to identify requirements and implement solutions that meet business needs
Strong ability to troubleshoot vulnerabilities and Sonar issues and remediate them with solutions compliant with the applicable Sonar rules
Experience working on Agile projects and awareness of key Agile concepts
Knowledge of CI/CD processes
Knowledge of the OpenAPI spec / Swagger 2.0
Knowledge of NoSQL

Mandatory Skills: Java, Spring Boot, Microservices, and Camunda
Location: Bangalore, Karnataka, India
Years Of Exp: 4 to 8 years

Why you should choose us?
Are you interested in working for a Global Leader in E-commerce? Are you excited about working on highly scalable platforms and applications that are accessed by millions of users every day? If so, read on to find out more about the opportunity.

Rakuten is the largest E-commerce company in Japan and one of the largest E-commerce and Internet Services companies in the World. Rakuten is ranked in the top 20 most innovative companies in the world by Forbes. Rakuten India Development Centre is the second largest technology hub outside of Japan that enables and builds platforms for global E-commerce, Payments, Digital, AI, and Data Science services across the globe. The India arm serves as a research and development center with an employee strength of around 450+ (and growing).

Rakuten is committed to cultivating and preserving a culture of inclusion and connectedness. We are able to grow and learn better together with a diverse team and inclusive workforce. The collective sum of the individual differences, life experiences, knowledge, innovation, self-expression, and talent that our employees invest in their work represents not only part of our culture, but our reputation and Rakuten's achievement as well. In recruiting for our team, we welcome the unique contributions that you can bring in terms of your education, opinions, culture, ethnicity, race, sex, gender identity and expression, nation of origin, age, languages spoken, veteran's status, color, religion, disability, sexual orientation, and beliefs.
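The "Job Worker" half of this role follows a fetch-lock-execute-complete loop against the process engine. The schematic below mimics Camunda 7's external-task REST pattern in Python; the engine URL, topic name, and payloads are assumptions from memory (real projects would use the official Java client or a maintained client library), so treat it as a sketch of the pattern, not a reference implementation.

# Schematic Camunda-style external task worker: fetch & lock, execute, complete.
# Engine URL, topic, and payload shapes are assumptions; real projects should
# use the official Camunda clients rather than hand-rolled HTTP calls.
import requests

BASE = "http://localhost:8080/engine-rest"   # hypothetical engine URL
WORKER = "python-demo-worker"

def poll_once():
    # 1. Fetch and lock up to one task for our topic.
    tasks = requests.post(
        f"{BASE}/external-task/fetchAndLock",
        json={
            "workerId": WORKER,
            "maxTasks": 1,
            "topics": [{"topicName": "charge-card", "lockDuration": 10000}],
        },
        timeout=10,
    ).json()
    for task in tasks:
        # 2. Execute the business logic (stubbed here).
        result = {"amountCharged": {"value": 42, "type": "Integer"}}
        # 3. Complete the task so the process instance can advance.
        requests.post(
            f"{BASE}/external-task/{task['id']}/complete",
            json={"workerId": WORKER, "variables": result},
            timeout=10,
        ).raise_for_status()

# poll_once() would normally run inside a long-poll loop with error handling,
# reporting failures back to the engine so Camunda can retry or raise an incident.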

Posted 3 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

