6.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Position: We are looking for a skilled Java Kafka Developer to join our dynamic engineering team. You will be responsible for designing, developing, and maintaining real-time data streaming applications using Apache Kafka and Java technologies. Your role will be critical in building scalable and fault-tolerant event-driven systems that support our business needs.

Role: Senior Java Kafka Developer
Location: Hyderabad
Experience: 6-10 Years
Job Type: Full Time Employment

What You'll Do:
- Design, develop, and maintain robust Java applications using the Spring Framework.
- Develop and manage event-driven systems using Apache Kafka.
- Implement enterprise integration solutions using Apache Camel.
- Collaborate with cross-functional teams to gather and analyze requirements.
- Optimize and troubleshoot performance issues related to messaging and integration flows.
- Ensure code quality by conducting code reviews and writing unit/integration tests.
- Participate in agile development processes and deliver high-quality software on schedule.
- Mentor junior developers and share best practices across the team.

Expertise You'll Bring:
- 6 to 10 years of hands-on experience in Java development.
- Strong expertise in the Spring Framework (Spring Boot, Spring MVC, Spring Cloud).
- Solid experience with Apache Kafka for building event-driven architectures.
- Hands-on experience with Apache Camel for enterprise integration patterns.
- Good understanding of RESTful APIs, microservices architecture, and messaging systems.
- Experience with build tools (Maven/Gradle), version control (Git), and CI/CD pipelines.
- Knowledge of SQL and NoSQL database technologies.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication and interpersonal skills.
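One of the enterprise integration patterns this role touches on (and which Apache Camel implements) is the content-based router: messages are dispatched to different destinations based on their payload. As a rough, plain-Python illustration of the idea only — the message shape and queue names below are hypothetical, not from any real system:

```python
# Illustrative only: a plain-Python sketch of the "content-based router"
# enterprise integration pattern. Real systems would use a framework such
# as Apache Camel; the message shape and route names here are made up.

def route(message: dict) -> str:
    """Pick a destination queue based on message content."""
    msg_type = message.get("type")
    if msg_type == "order":
        return "orders-queue"
    if msg_type == "payment":
        return "payments-queue"
    return "dead-letter-queue"  # unroutable messages are parked for review

messages = [
    {"type": "order", "id": 1},
    {"type": "payment", "id": 2},
    {"type": "unknown", "id": 3},
]
destinations = [route(m) for m in messages]
```

The dead-letter fallback mirrors common messaging practice: rather than dropping a message nobody can handle, it is parked where operators can inspect it.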
Benefits:
- Competitive salary and benefits package
- Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with a disability and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.

Our company fosters a values-driven and people-centric work environment that enables our employees to:
- Accelerate growth, both professionally and personally
- Impact the world in powerful, positive ways, using the latest technologies
- Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
- Unlock global opportunities to work and learn with the industry's best

Let's unleash your full potential at Persistent. "Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."
Posted 1 day ago
6.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with: You will lead a high-energy, top-performing team of engineers and product managers, working alongside technology and business leaders to shape the vision and drive the execution of transformative data initiatives that make a real impact.

Let me tell you about the role: As a Senior Master Data Platform Services Manager, you will play a strategic role in shaping and securing enterprise-wide technology landscapes, ensuring their resilience, performance, and compliance. You will provide deep expertise in security, infrastructure, and operational excellence, driving large-scale transformation and automation initiatives. Your role will encompass platform architecture, system integration, cybersecurity, and operational continuity. You will collaborate with engineers, architects, and business partners to establish robust governance models, technology roadmaps, and innovative security frameworks that safeguard critical enterprise applications.

What you will deliver:
- Design and implement enterprise technology architecture, security frameworks, and platform engineering.
- Strengthen platform security and ensure compliance with industry standards and regulations.
- Optimize system performance, availability, and scalability.
- Advance enterprise modernization and drive seamless integration with enterprise IT.
- Establish governance, security standards, and risk management strategies.
- Develop automated security monitoring, vulnerability assessments, and identity management solutions.
- Drive adoption of CI/CD, DevOps, and Infrastructure-as-Code methodologies.
- Enhance disaster recovery and resilience planning for enterprise platforms.
- Partner with technology teams and external vendors to align enterprise solutions with business goals.
- Lead and mentor engineering teams, fostering a culture of innovation and excellence.
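The automated vulnerability assessment named among the deliverables above usually boils down to matching an inventory of installed components against a feed of known advisories. A deliberately minimal sketch of that matching step — the advisory data and package names below are entirely hypothetical, and a real scanner would pull from a CVE feed rather than a hard-coded dict:

```python
# Illustrative only: a minimal sketch of automated vulnerability assessment.
# The advisory data is hypothetical; real scans consume a vulnerability feed
# (e.g., a CVE database), not a hard-coded mapping.

# Hypothetical advisories: package name -> versions with known issues
ADVISORIES = {
    "examplelib": {"1.0.0", "1.0.1"},
    "othertool": {"2.3.0"},
}

def scan(installed: dict) -> list:
    """Return (package, version) pairs that match a known advisory."""
    findings = []
    for package, version in installed.items():
        if version in ADVISORIES.get(package, set()):
            findings.append((package, version))
    return findings

installed = {"examplelib": "1.0.1", "othertool": "2.4.0", "safe-pkg": "0.9"}
findings = scan(installed)
```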
- Shape strategies for enterprise investments, cybersecurity risk mitigation, and operational efficiency.
- Collaborate across teams to implement scalable solutions and long-term technology roadmaps.

What you will need to be successful (experience and qualifications):

Technical Skills We Need From You:
- Bachelor's degree in technology, engineering, or a related technical discipline.
- 6+ years of experience in enterprise technology, security, and operations in large-scale global environments.
- Experience implementing CI/CD pipelines, DevOps methodologies, and Infrastructure-as-Code (AWS Cloud Development Kit, Azure Bicep, etc.).
- Deep knowledge of ITIL, Agile, and enterprise IT governance frameworks.
- Proficiency in programming languages such as Python, Java, or Scala.
- Experience with data pipeline frameworks (e.g., Apache Airflow, Kafka, Spark) and cloud-based data platforms (AWS, GCP, Azure).
- Expertise in database technologies (SQL, NoSQL, data lakes) and data modeling principles.

Essential Skills:
- Proven technical expertise in Microsoft Azure, AWS, Databricks, and Palantir.
- Strong understanding of data ingestion, pipelines, governance, security, and visualization.
- Experience designing, deploying, and optimizing multi-cloud data platforms that support large-scale, cloud-native workloads, balancing cost efficiency with performance and resilience.
- Hands-on performance tuning, data indexing, and distributed query optimization.
- Experience with real-time and batch data streaming architectures.

Skills That Set You Apart:
- Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management.
- AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows.

About bp: Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate.
We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status.
Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
Posted 1 day ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with: You will be part of a high-energy, top-performing team of engineers and product managers, working alongside technology and business leaders to support the execution of transformative data initiatives that make a real impact.

Let me tell you about the role: As a Senior Data Platform Services Engineer, you will play a strategic role in shaping and securing enterprise-wide technology landscapes, ensuring their resilience, performance, and compliance. You will provide deep expertise in security, infrastructure, and operational excellence, driving large-scale transformation and automation initiatives. Your role will encompass platform architecture, system integration, cybersecurity, and operational continuity. You will collaborate with engineers, architects, and business partners to establish robust governance models, technology roadmaps, and innovative security frameworks that safeguard critical enterprise applications.

What You Will Deliver:
- Contribute to enterprise technology architecture, security frameworks, and platform engineering for our core data platform.
- Support end-to-end security implementation across our unified data platform, ensuring compliance with industry standards and regulatory requirements.
- Help drive operational excellence by supporting system performance, availability, and scalability.
- Contribute to modernization and transformation efforts, assisting in integration with enterprise IT systems.
- Assist in the design and execution of automated security monitoring, vulnerability assessments, and identity management solutions.
- Apply DevOps, CI/CD, and Infrastructure-as-Code (IaC) approaches to improve deployment and platform consistency.
- Support disaster recovery planning and high availability for enterprise platforms.
- Collaborate with engineering and operations teams to ensure platform solutions align with business needs.
- Provide guidance on platform investments, security risks, and operational improvements.
- Partner with senior engineers to support long-term technical roadmaps that reduce operational burden and improve scalability.

What you will need to be successful (experience and qualifications):

Technical Skills We Need From You:
- Bachelor's degree in technology, engineering, or a related technical discipline.
- 3-5 years of experience in enterprise technology, security, or platform operations in large-scale environments.
- Experience with CI/CD pipelines, DevOps methodologies, and Infrastructure-as-Code (e.g., AWS CDK, Azure Bicep).
- Knowledge of ITIL, Agile delivery, and enterprise governance frameworks.
- Proficiency with big data technologies such as Apache Spark, Hadoop, Kafka, and Flink.
- Experience with cloud platforms (AWS, GCP, Azure) and cloud-native data solutions (BigQuery, Redshift, Snowflake, Databricks).
- Strong skills in SQL, Python, or Scala, and hands-on experience with data platform engineering.
- Understanding of data modeling, data warehousing, and distributed systems architecture.

Essential Skills:
- Technical experience in Microsoft Azure, AWS, Databricks, and Palantir.
- Understanding of data ingestion pipelines, governance, security, and data visualization.
- Experience supporting multi-cloud data platforms at scale, balancing cost, performance, and resilience.
- Familiarity with performance tuning, data indexing, and distributed query optimization.
- Exposure to both real-time and batch data streaming architectures.

Skills That Set You Apart:
- Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management.
- AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows.

About bp: Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate.
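The data ingestion pipelines described above are typically orchestrated as dependency-ordered task graphs, the core idea behind workflow engines such as Apache Airflow. A minimal, stdlib-only sketch of that scheduling idea — the task names are hypothetical and each "task" merely records its execution order:

```python
# Illustrative only: a tiny dependency-ordered task runner in the spirit of
# workflow engines like Apache Airflow. Plain Python stdlib; task names are
# hypothetical and tasks just record the order they would run in.

from graphlib import TopologicalSorter

# task -> set of upstream tasks that must finish first
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

order = list(TopologicalSorter(dag).static_order())

executed = []
for task in order:
    executed.append(task)  # a real engine would invoke the task's callable here
```

`graphlib.TopologicalSorter` (Python 3.9+) also raises `CycleError` on circular dependencies, which is exactly the validation a real orchestrator performs when a DAG is registered.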
We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status.
Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
Posted 1 day ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with: bp is transforming, and at our Digital Hub in Pune we are growing the digital expertise and solutions needed to advance the global energy transition. Digital Engineering is a team of technology and software specialists providing innovative, custom-built or customized software and technical platforms to bp colleagues and external users.

Let Me Tell You About The Role: As an Integration Senior Enterprise Tech Engineer, you are a senior member of a team creating application integration solutions for bp colleagues and external users. Your team's mission is to be the digital provider of choice to your area of bp, delivering innovation at speed where it's wanted, and day-in, day-out reliability where it's needed. You will operate in a dynamic and commercially focussed environment, with the resources of one of the world's largest Digital organisations and leading Digital and IT vendors working with you. You will be part of growing and strengthening our technical talent base: experts coming together to solve bp's and the world's problems.

What You Will Deliver:
- Lead enterprise technology architecture, security frameworks, and platform engineering across enterprise landscapes.
- Oversee the end-to-end security of enterprise platforms, ensuring compliance with industry standards and regulatory requirements.
- Drive enterprise operations excellence, optimising system performance, availability, and scalability.
- Provide leadership in enterprise modernization and transformation, ensuring seamless integration with enterprise IT.
- Establish governance, security standards, and risk management strategies aligned with global security policies.
- Design and implement automated security monitoring, vulnerability assessments, and identity management solutions for enterprise environments.
- Drive CI/CD, DevOps, and Infrastructure-as-Code adoption for enterprise deployments.
- Ensure disaster recovery, high availability, and resilience planning for enterprise platforms.
- Engage with business leaders, technology teams, and external vendors to ensure solutions align with enterprise goals.
- Mentor and lead enterprise security and operations teams, fostering a culture of excellence, innovation, and continuous improvement.
- Provide executive-level insights and technical recommendations on enterprise investments, cybersecurity threats, and operational risks.

What you will need to be successful (experience and qualifications):

Technical Skills We Need From You:
- Bachelor's (or higher) degree, ideally in Computer Science, MIS/IT, Mathematics, or a hard science.
- Years of experience: 8-12 years, with a minimum of 5-7 years of relevant experience.

Essential Skills:
- Subject-matter expert in the enterprise integration domain, able to design highly scalable integrations involving APIs, messaging, files, databases, and cloud services.
- Experienced with integration tools such as TIBCO/MuleSoft, Apache Camel/Spring Integration, and Confluent Kafka.
- Expert in Enterprise Integration Patterns (EIPs) and iBlocks for building secure integrations.
- Willingness and ability to learn, becoming skilled in at least one more cloud-native (AWS or Azure) integration solution on top of your existing skillset.
- Deep understanding of the interface development lifecycle, including design, security, design patterns for extensible and reliable code, automated unit and functional testing, CI/CD, and telemetry.
- Demonstrated understanding of modern technologies such as cloud native, containers, and serverless.
- Emerging technology monitoring.
- Application support.
- Strong inclusive leadership and people management.
- Stakeholder management.
- Embrace a culture of continuous improvement.

Skills That Set You Apart:
- Agile methodologies
- ServiceNow
- Risk Management
- Systems Development Management
- Monitoring and telemetry tools
- User Experience Analysis
- Cybersecurity and compliance

Key Behaviors:
- Empathetic: Cares about our people, our community and our planet
- Curious: Seeks to explore and excel
- Creative: Imagines the extraordinary
- Inclusive: Brings out the best in each other

About bp: Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate. We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.
Skills: Automation, Integration

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company.
We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
Posted 1 day ago
5.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Technology
Job Family Group: IT&S Group

Job Description:

You will work with: You will be part of a high-energy, top-performing team of engineers and product managers, working alongside technology and business leaders to support the execution of transformative data initiatives that make a real impact.

Let Me Tell You About The Role: As a Senior Data Tooling Services Engineer, you will play a strategic role in shaping and securing enterprise-wide technology landscapes, ensuring their resilience, performance, and compliance. You will provide deep expertise in security, infrastructure, and operational excellence, driving large-scale transformation and automation initiatives. Your role will encompass platform architecture, system integration, cybersecurity, and operational continuity. You will collaborate with engineers, architects, and business partners to establish robust governance models, technology roadmaps, and innovative security frameworks that safeguard critical enterprise applications.

What You Will Deliver:
- Contribute to enterprise technology architecture, security frameworks, and platform engineering for our core data platform.
- Support end-to-end security implementation across our unified data platform, ensuring compliance with industry standards and regulatory requirements.
- Help drive operational excellence by supporting system performance, availability, and scalability.
- Contribute to modernization and transformation efforts, assisting in integration with enterprise IT systems.
- Assist in the design and execution of automated security monitoring, vulnerability assessments, and identity management solutions.
- Apply DevOps, CI/CD, and Infrastructure-as-Code (IaC) approaches to improve deployment and platform consistency.
- Support disaster recovery planning and high availability for enterprise platforms.
- Collaborate with engineering and operations teams to ensure platform solutions align with business needs.
- Provide guidance on platform investments, security risks, and operational improvements.
- Partner with senior engineers to support long-term technical roadmaps that reduce operational burden and improve scalability.

What you will need to be successful (experience and qualifications):

Technical Skills We Need From You:
- Bachelor's degree in technology, engineering, or a related technical discipline.
- 3-5 years of experience in enterprise technology, security, or platform operations in large-scale environments.
- Experience with CI/CD pipelines, DevOps methodologies, and Infrastructure-as-Code (e.g., AWS CDK, Azure Bicep).
- Knowledge of ITIL, Agile delivery, and enterprise governance frameworks.
- Proficiency with big data technologies such as Apache Spark, Hadoop, Kafka, and Flink.
- Experience with cloud platforms (AWS, GCP, Azure) and cloud-native data solutions (BigQuery, Redshift, Snowflake, Databricks).
- Strong skills in SQL, Python, or Scala, and hands-on experience with data platform engineering.
- Understanding of data modeling, data warehousing, and distributed systems architecture.

Essential Skills:
- Technical experience in Microsoft Azure, AWS, Databricks, and Palantir.
- Understanding of data ingestion pipelines, governance, security, and data visualization.
- Experience supporting multi-cloud data platforms at scale, balancing cost, performance, and resilience.
- Familiarity with performance tuning, data indexing, and distributed query optimization.
- Exposure to both real-time and batch data streaming architectures.

Skills That Set You Apart:
- Proven success navigating global, highly regulated environments, ensuring compliance, security, and enterprise-wide risk management.
- AI/ML-driven data engineering expertise, applying intelligent automation to optimize workflows.

About bp: Our purpose is to deliver energy to the world, today and tomorrow. For over 100 years, bp has focused on discovering, developing, and producing oil and gas in the nations where we operate.
We are one of the few companies globally that can provide governments and customers with an integrated energy offering. Delivering our strategy sustainably is fundamental to achieving our ambition to be a net zero company by 2050 or sooner.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}

Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status.
Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
Posted 1 day ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
PFB the detailed JD:

🔹 Experience: 6 to 8+ years (hands-on)
🔹 Location: Pune (WFO)
🔹 Notice Period: 0-30 days

Must Have:
- Proficiency in at least one of the following programming languages: Java, Scala, or Python
- Good understanding of SQL
- Experience developing and deploying at least one end-to-end data storage/processing pipeline
- Strong experience in Spark development with batch and streaming
- Intermediate-level expertise in HDFS and Hive
- Experience with PySpark and data engineering: ETL implementation and migration to Spark
- Experience working with a Hadoop cluster
- Python, PySpark, and Databricks development with knowledge of cloud
- Experience with Kafka and Spark streaming (DStream and Structured Streaming)
- Experience with Jupyter notebooks or any other developer tool
- Experience with Airflow or other workflow engines
- Good communication and logical skills

Good to Have Skills:
- Prior experience writing Spark jobs in Java is highly appreciated
- Prior experience with Cloudera Data Platform (CDP)
- Hands-on experience with NoSQL databases such as HBase, Cassandra, and Elasticsearch
- Experience using Maven and Git
- Agile Scrum methodologies
- Flink and Kudu streaming
- Automation of workflows, CI/CD
- NiFi streaming and transformation
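The Spark streaming experience asked for above centers on windowed aggregation over an event stream. As a rough, library-free illustration of the tumbling-window idea only, not actual Spark code — the events and the 10-second window size below are made up:

```python
# Illustrative only: tumbling-window event counts, the core idea behind
# Spark Structured Streaming aggregations. Plain Python, no Spark; the
# events and the 10-second window size are hypothetical.

from collections import defaultdict

WINDOW_SECONDS = 10

# (epoch_seconds, user_id) pairs standing in for a Kafka/stream source
events = [(3, "a"), (7, "b"), (12, "a"), (14, "a"), (21, "c")]

counts = defaultdict(int)
for ts, user in events:
    # Assign each event to the tumbling window containing its timestamp
    window_start = (ts // WINDOW_SECONDS) * WINDOW_SECONDS
    counts[(window_start, user)] += 1

result = dict(counts)
```

In real Structured Streaming the same grouping is expressed declaratively (a window over the event-time column plus a groupBy), and the engine additionally handles late data and watermarks, which this sketch omits.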
Posted 1 day ago
0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Role: As a lead, your primary role will be to build products and services for Airtel's growth. You will spend the better part of your time in development and writing technical documents. You will solve business problems at scale, work on existing and new repositories, review code, write design documents, and help resolve customer issues. As a senior lead, you will build and mentor engineering teams and deliver on the below:

Technical Expertise: Provide guidance on technical and functional aspects of projects and take decisions. You will set up and execute best practices in application architecture, development, code reviews, performance, deployment, and execution by owning end-to-end business deliveries.

Innovation and Continuous Learning: You will build a culture of innovation and continuous improvement in your team. You will encourage and adopt new technology and emerging industry trends and methods to improve your team's efficiency.

Communication and Collaboration: You will facilitate communication within the engineering teams and with stakeholders, including quality, operations, product, and program, to ensure alignment on business goals. You will address technical issues in teams and promote a positive working environment.

Team Management: Conduct performance reviews and set up a constructive feedback loop for team members, along with Engineering Managers.
You will be responsible for hiring new members to effectively meet organizational goals and deadlines. Project Management: You will be part of overall planning by providing correct estimation and execution of engineering projects and ensuring their timely delivery. At Airtel, we use many technologies, including: Application-layer technologies including Tomcat/Node.js, Netty, Spring Boot, Hibernate, Elasticsearch, Kafka, and Apache Flink Caching technologies like Redis, Aerospike, or Hazelcast Frontend technologies including React, Angular, and Android/iOS Data storage technologies like Oracle, S3, Postgres, and MongoDB Tooling including Git, command line, Jenkins, NiFi, Airflow, JMeter, Postman, Gatling, Nginx/HAProxy, Jira/Confluence, Grafana, and K8s
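The caching stores listed above (Redis, Aerospike, Hazelcast) all revolve around key expiry. As a rough, plain-Python illustration of time-to-live semantics (the class and its design are ours, not part of Airtel's stack):

```python
import time

class TTLCache:
    """Minimal time-to-live cache: entries expire ttl_seconds after being
    set, the same expiry semantics stores like Redis provide via EXPIRE."""

    def __init__(self, ttl_seconds, clock=time.monotonic):
        self.ttl = ttl_seconds
        self.clock = clock  # injectable clock makes expiry testable
        self._store = {}

    def set(self, key, value):
        self._store[key] = (value, self.clock() + self.ttl)

    def get(self, key, default=None):
        entry = self._store.get(key)
        if entry is None:
            return default
        value, expires_at = entry
        if self.clock() >= expires_at:
            del self._store[key]  # lazy expiry on read
            return default
        return value
```

Production caches add eviction policies (LRU/LFU), active expiry sweeps, and distribution; this sketch shows only the per-key TTL idea.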
Posted 1 day ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
We are seeking a skilled Data Engineer to join our growing data team in India. You will be responsible for designing, building, and maintaining scalable data infrastructure and pipelines that enable data-driven decision-making across our organization and client projects. This role offers the opportunity to work with cutting-edge technologies and contribute to innovative data solutions for global clients. What you do Technical Skills Minimum 3+ years of experience in data engineering or a related field Strong programming skills in Python and/or Scala/Java Experience with SQL and database technologies (PostgreSQL, MySQL, MongoDB) Hands-on experience with data processing frameworks: Apache Spark and the Hadoop ecosystem; Apache Kafka for streaming data; Apache Airflow or similar workflow orchestration tools Knowledge of data warehouse concepts and technologies Experience with containerization (Docker, Kubernetes) Understanding of data modeling principles and best practices Cloud & Platform Experience Experience with at least one major cloud platform (AWS, Azure, or GCP) Familiarity with cloud-native data services: data lakes, data warehouses, and analytics services; serverless computing and event-driven architectures; identity and access management for data systems Knowledge of Infrastructure as Code (Terraform, CloudFormation, ARM templates) Data & Analytics Understanding of data governance and security principles Experience with data quality frameworks and monitoring Knowledge of dimensional modeling and data warehouse design Familiarity with business intelligence and analytics tools Understanding of data privacy regulations (GDPR, CCPA) Preferred Qualifications Advanced Technical Skills Experience with modern data stack tools (dbt, Fivetran, Snowflake, Databricks) Knowledge of machine learning pipelines and MLOps practices Experience with event-driven architectures and microservices Familiarity with data mesh and data fabric concepts Experience with graph databases (Neo4j,
Amazon Neptune) Industry Experience Experience in digital agency or consulting environment Background in financial services, e-commerce, retail, or customer experience platforms Knowledge of marketing technology and customer data platforms Experience with real-time analytics and personalization systems Soft Skills Strong problem-solving and analytical thinking abilities Excellent communication skills for client-facing interactions Ability to work independently and manage multiple projects Adaptability to rapidly changing technology landscape Experience mentoring junior team members What we ask Data Infrastructure & Architecture Design and implement robust, scalable data architectures and pipelines Build and maintain ETL/ELT processes for batch and real-time data processing Develop data models and schemas optimized for analytics and reporting Ensure data quality, consistency, and reliability across all data systems Platform-Agnostic Development Work with multiple cloud platforms (AWS, Azure, GCP) based on client requirements Implement data solutions using various technologies and frameworks Adapt quickly to new tools and platforms as project needs evolve Maintain expertise across different cloud ecosystems and services Data Pipeline Development Create automated data ingestion pipelines from various sources (APIs, databases, files, streaming) Implement data transformation logic using modern data processing frameworks Build monitoring and alerting systems for data pipeline health Optimize pipeline performance and cost-efficiency Collaboration & Integration Work closely with data scientists, analysts, and business stakeholders Collaborate with DevOps teams to implement CI/CD for data pipelines Partner with client teams to understand data requirements and deliver solutions Participate in architecture reviews and technical decision-making What we offer You’ll join an international network of data professionals within our organisation. 
We support continuous development through our dedicated Academy. If you're looking to push the boundaries of innovation and creativity in a culture that values freedom and responsibility, we encourage you to apply. At Valtech, we’re here to engineer experiences that work and reach every single person. To do this, we are proactive about creating workplaces that work for every person at Valtech. Our goal is to create an equitable workplace which gives people from all backgrounds the support they need to thrive, grow and meet their goals (whatever they may be). You can find out more about what we’re doing to create a Valtech for everyone here. Please do not worry if you do not meet all of the criteria or if you have some gaps in your CV. We’d love to hear from you and see if you’re our next member of the Valtech team!
Posted 1 day ago
3.0 - 6.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Tasks Experience Experience in building and managing data pipelines. Experience with the development and operations of data pipelines in the cloud (preferably Azure). Experience with distributed data/computing tools: MapReduce, Hadoop, Hive, Spark. Deep expertise in architecting data pipelines in the cloud using cloud-native technologies. Good experience with both ETL and ELT ingestion patterns. Hands-on experience working on large volumes of data (petabyte scale) with distributed compute frameworks. Good understanding of container platforms: Kubernetes and Docker. Excellent knowledge of and experience with object-oriented programming. Familiarity with developing against RESTful API interfaces. Experience with markup languages such as JSON and YAML. Proficiency in relational database design and development. Good knowledge of data warehousing concepts. Working experience with Agile Scrum methodology. Technical Skills Strong skills in distributed cloud data analytics platforms like Databricks, HDInsight, EMR clusters, etc. Strong programming skills in Python/Java/R/Scala, etc. Experience with stream-processing systems: Kafka, Apache Storm, Spark Streaming, Apache Flink, etc. Hands-on working knowledge of cloud data lake stores like Azure Data Lake Storage. Data pipeline orchestration with Azure Data Factory or AWS Data Pipeline. Good knowledge of file formats like ORC, Parquet, Delta, Avro, etc. Good experience using SQL and NoSQL 
databases like MySQL, Elasticsearch, MongoDB, PostgreSQL, and Cassandra running huge volumes of data. Strong experience in networking and security measures. Proficiency with CI/CD automation, specifically with DevOps build and release pipelines. Proficiency with Git, including branching/merging strategies, pull requests, and basic command-line functions. Good data modelling skills. Job Responsibilities Cloud analytics, storage, security, resiliency, and governance. Building and maintaining the data architecture for data engineering and data science projects. Extract, transform, and load data from source systems to a data lake or data warehouse, leveraging a combination of various IaaS or SaaS components. Perform compute on huge volumes of data using open-source projects like Databricks/Spark or Hadoop. Define table schemas and quickly adapt the pipeline. Work with high-volume unstructured and streaming datasets. Responsible for managing NoSQL databases on the cloud (AWS, Azure, etc.)
Architect solutions to migrate projects from on-premises to the cloud. Research, investigate, and implement newer technologies to continually evolve security capabilities. Identify valuable data sources and automate collection processes. Implement adequate networking and security measures for the data pipeline. Implement a monitoring solution for the data pipeline. Support the design and implementation of data engineering solutions. Maintain excellent documentation for understanding and accessing data storage. Work independently as well as in teams to deliver transformative solutions to clients. Be proactive and constantly pay attention to the scalability, performance, and availability of our systems. Establish privacy/security hierarchy and regulate access. Collaborate with engineering and product development teams. Systematic problem-solving approach with strong communication skills and a sense of ownership and drive. Qualifications Bachelor's or Master's degree in Computer Science or relevant streams. Any relevant cloud data engineering certification.
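The extract-transform-load responsibility above decomposes into three small, composable stages. As an illustrative pure-Python outline (the record shape and field names are hypothetical, and in-memory lists stand in for real sources and sinks):

```python
def extract(source_rows):
    """Extract: yield raw records from a source (an in-memory list here,
    standing in for a database cursor, file reader, or API client)."""
    yield from source_rows

def transform(records):
    """Transform: validate and normalise each record, dropping rows that
    fail a data-quality check (here, a missing amount)."""
    for rec in records:
        if rec.get("amount") is None:
            continue  # data-quality gate: skip incomplete rows
        yield {"id": rec["id"], "amount": round(float(rec["amount"]), 2)}

def load(records, sink):
    """Load: append cleaned records to a sink (a list standing in for a
    warehouse table); return the number of rows loaded."""
    count = 0
    for rec in records:
        sink.append(rec)
        count += 1
    return count

table = []
raw = [{"id": 1, "amount": "12.50"}, {"id": 2, "amount": None}]
loaded = load(transform(extract(raw)), table)
print(loaded, table)  # 1 [{'id': 1, 'amount': 12.5}]
```

Because each stage is a generator, records stream through one at a time rather than being materialised in full, which is the same design pressure that motivates Spark and other distributed ETL engines at petabyte scale.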
Posted 1 day ago
5.0 - 7.0 years
15 - 25 Lacs
Pune
Work from Office
Required Skills and Qualifications: 3+ years of backend development experience in Java (Java 8+) and Spring Boot Strong understanding of REST APIs, JPA/Hibernate, and SQL databases (e.g., PostgreSQL, MySQL) Knowledge of software engineering principles and design patterns Experience with testing frameworks like JUnit and Mockito Familiarity with Docker and CI/CD tools Good communication and team collaboration skills Roles and Responsibilities Key Responsibilities: Develop and maintain backend systems using Java (Spring Boot) Build RESTful APIs and integrate with databases and third-party services Write unit and integration tests to ensure code quality Participate in code reviews and collaborate with peers and senior engineers Follow clean-code principles and best practices in microservices design Support CI/CD deployment pipelines and container-based workflows Continuously learn and stay updated with backend technologies Nice to Have: Exposure to Kubernetes and cloud platforms (AWS, GCP, etc.) Familiarity with messaging systems like Kafka or RabbitMQ Awareness of security standards and authentication protocols (OAuth2, JWT) Interest in DevOps practices and monitoring tools (Prometheus, ELK, etc.)
Posted 1 day ago
8.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
Location: Noida, Uttar Pradesh, India (Hybrid/Remote options available) Employment Type: Full-time About Fusionpact Technologies At Fusionpact Technologies, we are at the forefront of leveraging a fusion of cutting-edge technologies to create impactful solutions that drive significant business value for our clients globally. Established in 2022, we specialize in Cloud Services, Artificial Intelligence, Software Development, ERP Solutions, and IT Consulting. Our passion lies in pushing the boundaries of what's possible with technologies like AI/ML, Blockchain, Reactive Architecture, and Cloud-Native solutions . We're a dynamic, agile, and innovation-driven company committed to delivering high-quality, scalable, and secure software that truly makes a difference. With a proven track record across 175+ projects, including innovative products like ForestTwin™ for carbon tech and the ISO Platform for compliance, we are dedicated to transforming businesses and making a brighter world. The Opportunity We're looking for a highly skilled and experienced Tech Lead to join our dynamic engineering team. In this pivotal role, you'll be instrumental in shaping our technical vision, driving the development of next-generation reactive and microservices-based applications, and fostering a culture of technical excellence within our agile development environment. You'll be a key player in designing and implementing robust, scalable, and resilient systems. Your expertise in architectural principles will be crucial in guiding the team and ensuring the successful deployment of high-quality software. We're seeking a leader who can not only leverage strong fundamental knowledge but also expertly integrate and utilize AI tools to deliver superior software solutions in a fast-paced, agile manner. If you thrive on technical challenges, enjoy mentoring, and are excited about the impact of AI on software development, Fusionpact Technologies is the place for you. 
Responsibilities Technical Leadership & Architecture Lead the design, development, and deployment of complex reactive and microservices-based applications, ensuring adherence to Fusionpact's best practices, architectural principles, and quality standards. Define and enforce coding standards, design patterns, and architectural guidelines across development teams to ensure consistency and maintainability. Conduct rigorous technical reviews and provide constructive feedback to ensure high-quality code, scalable solutions, and optimal performance. Mentor, coach, and guide development teams on advanced architectural concepts, reactive programming paradigms (e.g., Akka), and microservices best practices. Agile Development & AI Integration Drive agile development practices within your scrum team, working closely with the Scrum Master, DevOps, QA, Backend, and Frontend engineers to ensure efficient workflows and timely delivery. Champion the adoption and effective utilization of cutting-edge AI tools (e.g., Cursor AI, GitHub Copilot, or similar generative AI solutions) to enhance code quality, accelerate development cycles, and improve overall team efficiency. Proactively identify opportunities to leverage AI for tasks such as intelligent code generation, automated refactoring, advanced bug detection, and smart automated testing frameworks. Ensure the seamless and effective integration of AI-powered workflows into the existing development pipeline, continuously optimizing the software delivery lifecycle. Project Management & Quality Assurance Effectively manage and contribute to multiple projects simultaneously, consistently delivering superior quality output in line with project timelines and client expectations. Take ownership of the technical success of projects, from initial conception and architectural design to successful deployment and ongoing maintenance. 
Collaborate with product owners and stakeholders to translate complex business requirements into clear, actionable technical specifications. Ensure the delivery of highly performant, secure, maintainable, and resilient software solutions that meet Fusionpact's high standards. Team Collaboration & Mentorship Foster a collaborative, innovative, and inclusive team environment, encouraging knowledge sharing, continuous learning, and cross-functional synergy. Provide dedicated technical guidance, coaching, and mentorship to junior and mid-level engineers, helping them grow their skills and careers. Champion a culture of continuous learning, staying abreast of emerging technologies, industry trends, and innovative software development methodologies, and bringing these insights back to the team. Required Skills & Experience 8+ years of progressive experience in software development, with at least 3+ years in a Tech Lead or similar leadership role focused on complex distributed systems. Proven hands-on experience in designing, building, and deploying highly available, scalable, and resilient reactive and microservices-based applications. Deep understanding of modern architecture principles, design patterns (e.g., Domain-Driven Design, Event Sourcing, CQRS), and software development best practices. Strong hands-on experience with at least one major programming language extensively used in reactive/microservices development (e.g., Java, Kotlin, Go, or Scala). Strong fundamental knowledge and practical experience leveraging AI tools (e.g., Cursor AI, GitHub Copilot, Tabnine, or similar) to enhance development workflows, improve code quality, and accelerate delivery. Demonstrated ability to effectively manage and contribute to multiple projects simultaneously while maintaining superior quality output. Extensive experience working in a fast-paced, agile (Scrum, Kanban) environment and guiding cross-functional scrum teams (Scrum Master, DevOps, QA, Backend, Frontend). 
Solid understanding of DevOps principles, CI/CD pipelines, and automated deployment strategies. Excellent communication, interpersonal, and leadership skills, with the ability to articulate complex technical concepts to diverse audiences. Strong ethics and integrity, with a proven ability to thrive and lead effectively in a remote or hybrid work environment. Preferred Qualifications Hands-on experience with Scala and Akka for building reactive systems. Proficiency with cloud platforms such as AWS, Azure, or GCP, including experience with their relevant services for microservices deployment and management. In-depth experience with containerization technologies (Docker, Kubernetes) and orchestration. Familiarity with various data storage technologies (relational databases, NoSQL databases like Cassandra, MongoDB, Redis) and message queues (Kafka, RabbitMQ). Experience with performance tuning, monitoring, and troubleshooting distributed systems. Certifications in relevant cloud platforms or agile methodologies. If you are a passionate and experienced Tech Lead with a strong background in reactive and microservices architectures, a knack for leveraging AI to deliver exceptional software, and a commitment to fostering a high-performing team, we encourage you to apply and become a part of Fusionpact Technologies' innovative journey! Apply Now!
Posted 1 day ago
6.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Our Company Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences! We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours! Our Team: The Search, Discovery, and Content (SDC) team provides the platform and services that power search, browse, recommendation, and discovery functionalities in Adobe Clouds such as Creative Cloud products, Adobe Express, Adobe Stock Marketplace, Document Cloud, Adobe HelpX, and more. Our platform caters to dozens of product integrations, millions of users, and billions of entities. The SDC Platform is expanding quickly in adoption and infrastructure footprint, with various innovative initiatives centered on NoSQL, Machine Learning, and other major evolving data technologies. Job Description: Bridge the gap between development and operations teams, foster collaboration, manage infrastructure availability, and improve software delivery. Key Responsibilities: Large-scale data: Work on large-scale search index setup and tuning, setting up real-time messaging and data ingestion platforms, NoSQL databases, web services, orchestration services, and more. Continuous Integration and Deployment: Automating CI/CD pipelines to enable faster and more frequent releases. Infrastructure as Code (IaC): Develop and manage infrastructure automation using industry-standard tools to build scalable and reproducible environments.
Set up monitoring and logging solutions to ensure the health, performance, and security of applications and systems. Respond promptly to alerts and incidents, resolving issues to minimize downtime. Collaboration: Work closely with development and QA teams to facilitate smooth code deployments, testing, and operations. Promote a DevOps culture that encourages shared responsibility and open communication. Security: Implement and maintain security standard methodologies in all stages of the development and deployment process. Conduct regular security assessments and audits to identify and address potential vulnerabilities. Cloud Services: In-depth understanding of cloud platforms like AWS and Azure to deploy and manage applications and infrastructure efficiently. Automation and Scripting: Develop automation scripts and tools to streamline manual processes, reducing human errors and improving efficiency. Performance Optimization: Identify performance bottlenecks and work on optimizing system performance to improve the user experience. Resource Forecasting: Anticipate and plan resource needs to support current and future demands. Requirements: Bachelor's or postgraduate degree, or equivalent experience, in Computer Science, Information Technology, or a related field. Proven experience as a DevOps Engineer or in a similar role with 6+ years of hands-on experience. Experience in crafting and scaling infrastructure for technologies like Elasticsearch, Kafka, HBase, Apache Spark, etc. Experience in building, deploying, and managing infrastructure in public clouds (AWS, Azure). Proficiency in scripting languages like Bash, Python, Go, or Ruby. Familiarity with containerization technologies such as Docker and container orchestration tools like Kubernetes. Good understanding of different cloud architectures and design principles. Knowledge of configuration management and provisioning tools like Chef, Terraform, and Terragrunt.
Familiarity with industry-standard monitoring tools like Prometheus, Grafana, ELK stack. Strong problem-solving and debugging skills. Excellent communication and collaboration abilities. Ability to work in an Agile/Scrum environment. Adobe is proud to be an Equal Employment Opportunity employer. We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Learn more about our vision here. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015.
Posted 1 day ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
We're looking for a Senior Software Engineer to build upon our application and data platform as we continue to innovate on application observability. We move fast and iterate quickly. We are passionate about solving customers’ problems. We have ambitious goals to build best-of-its-kind products. You will help the team win in a fast-growing market. If you are passionate about innovation and embrace the challenge of working on highly scalable systems that handle large volumes of data, this position is for you. What you'll get to do Design and build highly scalable solutions Work with a team of exceptionally capable and dedicated peers, all the way from engineering to product management and customer support Work in an open environment, work together to get things done, and adapt to the team's changing needs Leverage technologies including Kafka, Elasticsearch, Docker, and Kubernetes across different cloud environments like AWS and Azure Must-have Qualifications 5+ years of full-stack developer experience in designing and developing highly scalable, distributed applications, products, and services. Proficiency in the Java programming language. Experience in developing user-facing features with Angular/React. Proficiency in data structures, algorithms, threads, and concurrent programming. Extensive knowledge of SQL and at least one relational database engine, such as MySQL. Hands-on experience with RDS or NoSQL (DynamoDB, MongoDB, Couchbase) is a big plus. Understanding of microservices design, with expertise in Docker and Kubernetes. Strong communication skills, both verbal and written. Ability to multi-task and adapt quickly to changing requirements, scope, and priorities. Nice-to-have Qualifications We’ve taken special care to separate the must-have qualifications from the nice-to-haves. “Nice-to-have” means just that Nice. To. Have. So, don’t worry if you can’t check off every box. We’re not hiring a list of bullet points–we’re interested in the whole you.
Knowledge of other programming languages like Python, C++, and JavaScript. Experience working in the Cloud Observability space is an added advantage. Splunk is an Equal Opportunity Employer Splunk, a Cisco company, is an Equal Opportunity Employer, and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation, national origin, genetic information, age, disability, veteran status, or any other legally protected basis.
Posted 1 day ago
5.0 - 7.0 years
7 - 10 Lacs
Gurugram
Work from Office
Role description Education: Bachelor's / Master's in Software Engineering Responsibilities In this lead role, you will be designing and developing complex software systems that are successfully delivered to customers. Ensure the quality of system design by serving as a technical lead on our most demanding, cross-functional teams. Build reusable code and libraries for future use. Build highly available, high-performance, scalable software, and work on distributed systems at massive scale. Be responsible for the code quality of the product being developed, along with unit and integration testing. Mentor junior developers to improve their skills and make them more effective product software engineers. Communicate with technical teams and senior management to collect requirements and describe software product features, technical designs, and product strategy. Skills Sounds Like You? 5+ years of software development experience with Enterprise Java (JDK 8 and above), Spring (Boot, MVC, AOP, DI), and ORM frameworks. 1+ years of experience contributing to the architecture and design (LLD, HLD, SOLID principles, design patterns, and scaling) of new and current systems. Strong experience in technically leading junior developers with a focus on the goal. Strong experience in data structures and algorithms and their space and time complexities. Solid understanding of multithreading, microservices, and MVC, and strong OO skills with demonstrated experience in developing complex and reusable APIs. Strong experience working with databases, both SQL and NoSQL. Experience working with microservices-based architecture. Experience in high-traffic, highly scalable distributed system designs, complex web applications, and code reviews. Experience working in an Agile environment. Solid understanding of the full software development life cycle and the domain. Good to have: knowledge of messaging systems like SNS/SQS/Kafka, etc.
Posted 1 day ago
2.0 - 4.0 years
5 - 8 Lacs
Gurugram
Work from Office
Role description Do You Make The Cut? FreeCharge is looking for a passionate, independent, and results-driven Technical Program Management professional. As a program professional, you will be in a unique position to experience the best of both the product and engineering functions. You will drive programs, execute, and launch products that will impact millions of customers for the largest mobile commerce company in India. The program manager is expected to work in an extremely ambiguous and fast-paced environment and is expected to think on their feet to facilitate smooth and flawless product launches. Education: Bachelor's / Master's in Software Engineering Responsibilities The Program Manager manages product releases from concept to successful launch. Work closely with product and engineering teams and other cross-functional teams from concept through development to launch. Plan and deliver flawless products by coordinating with various engineering functions such as, but not limited to, development, QA, and release management. Prepare detailed project plans and generate appropriate metrics to assist with decision making. Track milestones, build consensus, resolve conflicts, prepare risk mitigation plans and strategies, and keep teams focused and aligned on delivery. Enable clear, concise, and transparent information sharing across all stakeholders. Proactively identify and clear bottlenecks, carry out escalation management in a timely manner, make tradeoffs, and balance business needs versus technical constraints. De-risk the product launch through contingency plans, iterative execution, or out-of-the-box solutions. Identify engineering and product process deficiencies and work with teams to eliminate them. Sounds Like You? Overall 2-4 years of software development experience and at least 4+ years of Technical Program Management experience in the software industry. Bachelor's degree in Engineering, Computer Science, or a related technical field.
Hands-on technical design and architecture experience. Analytical thinking skills and the ability to see the bigger picture. Experience with Agile methodologies and tools. Experience with tools such as Excel, PowerPoint, and SharePoint. A proven track record of delivering initiatives from conception through completion on time, within budget, and on or beyond scope. Ability to work under tight deadlines in a high-pressure environment and adjust to multiple demands. Excellent oral and written communication skills with both technical and non-technical individuals. Excellent interpersonal skills are required. Experience building stream-processing systems using solutions such as Storm or Spark Streaming. Experience with the integration of data from multiple data sources. Experience with SQL and NoSQL databases. Experience with various messaging systems, such as Kafka or RabbitMQ. Experience with Confluent components like Kafka Connect and Schema Registry. Experience with Cloudera/MapR/Hortonworks. Experience with AWS big data solutions. Good knowledge of big data querying tools, such as Pig, Hive, or Impala. Knowledge of various ETL techniques and frameworks, such as Flume.
Posted 1 day ago
8.0 - 10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Expectations: Design and develop data pipelines for real-time and batch data ingestion and processing using Confluent Kafka, ksqlDB, Kafka Connect, and Apache Flink. Build and configure Kafka connectors to ingest data from various sources (databases, APIs, message queues, etc.) into Kafka. Develop Flink applications for complex event processing, stream enrichment, and real-time analytics. Develop and optimize ksqlDB queries for real-time data transformations, aggregations, and filtering. Implement data quality checks and monitoring to ensure data accuracy and reliability throughout the pipeline. Monitor and troubleshoot data pipeline performance, identify bottlenecks, and implement optimizations. Automate data pipeline deployment, monitoring, and maintenance tasks. Stay up-to-date with the latest advancements in data streaming technologies and best practices. Contribute to the development of data engineering standards and best practices within the organization. Participate in code reviews and contribute to a collaborative and supportive team environment. Work closely with other architects and tech leads in India and the US, and create POCs and MVPs. Provide regular updates on tasks, status, and risks to the project manager. The experience we are looking to add to our team Qualifications: Bachelor's degree or higher from a reputed university. 8 to 10 years of total experience, with the majority related to ETL/ELT, big data, Kafka, etc. Proficiency in developing Flink applications for stream processing and real-time analytics. Strong understanding of data streaming concepts and architectures. Extensive experience with Confluent Kafka, including Kafka brokers, producers, consumers, and Schema Registry. Hands-on experience with ksqlDB for real-time data transformations and stream processing. Experience with Kafka Connect and building custom connectors.
Extensive experience implementing large-scale data ingestion and curation solutions. Good hands-on experience with the big data technology stack on any cloud platform. Excellent problem-solving, analytical, and communication skills. Ability to work independently and as part of a team.
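The windowed aggregations this role builds in ksqlDB and Flink can be illustrated, purely as a sketch, with a tumbling-window count in plain Python. The event tuples and the one-second window are invented for illustration; a real pipeline would express this as a ksqlDB `WINDOW TUMBLING` query or a Flink job rather than an in-memory function.

```python
from collections import defaultdict

def tumbling_window_counts(events, window_ms):
    """Group (timestamp_ms, key) events into fixed-size windows and
    count occurrences per key, mimicking a windowed stream aggregation."""
    windows = defaultdict(lambda: defaultdict(int))
    for ts, key in events:
        # Each event belongs to exactly one non-overlapping window.
        window_start = (ts // window_ms) * window_ms
        windows[window_start][key] += 1
    return {w: dict(counts) for w, counts in sorted(windows.items())}

events = [(100, "orders"), (900, "orders"), (1100, "payments"), (1900, "orders")]
result = tumbling_window_counts(events, window_ms=1000)
# window 0 holds the first two events; window 1000 holds the last two
```

The same grouping logic is what the streaming engine applies continuously over an unbounded stream, with the added concerns of late-arriving events and state checkpointing that this sketch omits.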
Posted 1 day ago
3.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Us Yubi stands for ubiquitous. But Yubi will also stand for transparency, collaboration, and the power of possibility. From being a disruptor in India's debt market to marching towards global corporate markets, from one product to one holistic product suite with seven products, Yubi is the place to unleash potential. Freedom, not fear. Avenues, not roadblocks. Opportunity, not obstacles. Yubi, formerly known as CredAvenue, is re-defining global debt markets by freeing the flow of finance between borrowers, lenders, and investors. We are the world's possibility platform for the discovery, investment, fulfillment, and collection of any debt solution. At Yubi, opportunities are plentiful, and we equip you with the tools to seize them. In March 2022, we became India's fastest fintech and most impactful startup to join the unicorn club with a Series B fundraising round of $137 million. In 2020, we began our journey with a vision of transforming and deepening the global institutional debt market through technology. Our two-sided debt marketplace helps institutional and HNI investors find the widest network of corporate borrowers and debt products on one side, and helps corporates discover investors and access debt capital efficiently on the other. Switching between platforms is easy, which means investors can lend, invest and trade bonds - all in one place. All of our platforms shake up the traditional debt ecosystem and offer new ways of digital finance. Yubi Credit Marketplace - With the largest selection of lenders on one platform, our credit marketplace helps enterprises partner with lenders of their choice for any and all capital requirements.
Yubi Invest - Fixed income securities platform for wealth managers and financial advisors to channel client investments in fixed income. Financial Services Platform - Designed for financial institutions to manage co-lending partnerships and asset-based securitization. Spocto - Debt recovery and risk mitigation platform. Corpository - Dedicated SaaS solutions platform powered by decision-grade data, analytics, pattern identification, early warning signals, and predictions for lenders, investors, and business enterprises. So far, we have onboarded 17,000+ enterprises and 6,200+ investors and lenders, and have facilitated debt volumes of over INR 1,40,000 crore. Backed by marquee investors like Insight Partners, B Capital Group, Dragoneer, Sequoia Capital, LightSpeed and Lightrock, we are the one-of-its-kind debt platform globally, revolutionizing the segment. At Yubi, people are at the core of the business and our most valuable assets. Yubi is constantly growing, with 1000+ like-minded individuals today who are changing the way people perceive debt. We are a fun bunch who are highly motivated and driven to create a purposeful impact. Come, join the club to be a part of our epic growth story. Role and Responsibilities: Develop a revolutionary finance marketplace product, including design, user experience, and business logic, ensuring the product is easy to use, appealing, and effective. Ensure that the implementation adheres to defined specs and processes in the PRD. Own end-to-end quality of deliverables during all phases of the software development lifecycle. Work with managers, leads, and peers to come up with implementation options. Ability to function effectively in a fast-paced environment and manage continuously changing business needs. Mentor junior engineers and foster innovation within the team. Design and develop the pod's software components and systems. Evaluate and recommend tools, technologies, and processes, driving adoption to ensure high-quality products.
Requirements: Minimum 3 years of experience in backend development, delivering enterprise-class web applications and services. Expertise in Java technologies including Spring, Hibernate, and Kafka. Strong knowledge of NoSQL and RDBMS, with expertise in schema design. Familiarity with Kubernetes deployment and managing CI/CD pipelines. Ability to function effectively in a fast-paced environment and manage continuously changing business needs. Experience with microservices architecture and RESTful APIs. Familiarity with monitoring and logging tools (Prometheus, Grafana, ELK stack). Competent in software engineering tools (e.g., Java build tools) and best practices (e.g., unit testing, test automation, continuous integration). Experience with the cloud technologies of AWS and GCP and developing secure applications. Strong understanding of the software development lifecycle and agile methodologies. Benefits: YUBI is committed to creating a diverse environment and is proud to be an equal opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, or age.
Posted 1 day ago
14.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
About Role: We are seeking a skilled and experienced Engineering Manager to join our team. In this role, you will lead a team of high-performing engineers and ensure the successful delivery of software solutions. You will develop high-quality, efficient, and scalable code that meets business requirements and design specifications. You will obsess over quality and developer productivity, driving innovation and adopting best practices while coaching and mentoring the engineering team. Responsibilities: Lead and manage a team of backend and frontend developers, providing technical guidance and mentoring.
Develop high-quality, efficient, and scalable code that meets business requirements and design specifications. Collaborate with project managers, business leaders, and other stakeholders to ensure the seamless integration of front-end and back-end systems. Design and implement data storage solutions, including databases and caching mechanisms. Ensure the reliability, scalability, and security of the systems. Drive the adoption of engineering best practices to improve the quality and reliability of deliverables. Manage deployments and infrastructure, working closely with DevOps teams to ensure the availability and performance of backend systems. Keep up to date with emerging trends and technologies in backend development and integrate new technologies where appropriate. Requirements: 14-18 years of hands-on experience building and running full-stack applications. Proven track record of successfully leading and managing software engineering teams. Excellent problem-solving skills and the ability to make sound technical and architectural decisions. Strong understanding of both back-end and front-end technologies. Strong knowledge of software development methodologies, best practices, and coding standards. Exceptional communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams. Having built a 0-1 product in a fast-growing startup environment would be a big plus. Entrepreneurial spirit and willingness to learn. Exposure to DevOps tools and practices, such as Docker, Kubernetes, or Ansible. Tech Stack: Java/Spring Boot, Microservices, ReactJS, MySQL, Kafka, Redis, AWS.
Posted 1 day ago
5.0 - 7.0 years
8 - 12 Lacs
Gurugram
Work from Office
Education: Bachelor's/Master's in Software Engineering. Responsibilities: In this lead role, you will design and develop complex software systems and see them successfully delivered to customers. Ensure the quality of system design by serving as a technical lead on our most demanding, cross-functional teams. Build reusable code and libraries for future use. Build highly available, high-performance, scalable software, and work on distributed systems for massive-scale systems. Responsible for the code quality of the product being developed, along with unit and integration testing. Mentor junior developers to improve their skills and make them more effective product software engineers. Communicate with technical teams and senior management to collect requirements and describe software product features, technical designs, and product strategy. Sounds Like You? 4+ years of software development experience with Enterprise Java (JDK 8 and above), Spring (Boot, MVC, AOP, DI), and ORM frameworks. 1+ years of experience contributing to the architecture and design (LLD, HLD, SOLID principles, design patterns, and scaling) of new and current systems. Strong experience technically leading junior developers with a focus on the goal. Strong experience in data structures and algorithms and their space and time complexities. Solid understanding of multithreading, microservices, and MVC, with strong OO skills and demonstrated experience developing complex and reusable APIs. Strong experience working with SQL and NoSQL databases. Experience working with microservices-based architecture. Experience in high-traffic, highly scalable distributed system designs, complex web applications, and code reviews. Experience working in an Agile environment. Solid understanding of the full software development life cycle and the domain. Good to have: knowledge of messaging systems such as SNS/SQS/Kafka.
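The multithreading and messaging-system experience this role asks for can be sketched with an in-process producer/consumer, using Python's bounded `queue.Queue` as a stand-in for a broker such as SQS or Kafka. The message payloads, worker count, and the uppercase "processing" step are all invented for illustration.

```python
import queue
import threading

def run_pipeline(messages, num_workers=2):
    """Fan messages out to worker threads via a bounded queue and
    collect the processed results, broker-style but in-process."""
    inbox = queue.Queue(maxsize=100)
    results = []
    lock = threading.Lock()

    def worker():
        while True:
            msg = inbox.get()
            if msg is None:                    # poison pill: shut this worker down
                inbox.task_done()
                return
            with lock:
                results.append(msg.upper())    # stand-in for real processing
            inbox.task_done()

    workers = [threading.Thread(target=worker) for _ in range(num_workers)]
    for w in workers:
        w.start()
    for msg in messages:                       # "producer" side
        inbox.put(msg)
    for _ in workers:                          # one poison pill per worker
        inbox.put(None)
    inbox.join()
    for w in workers:
        w.join()
    return sorted(results)

print(run_pipeline(["order-created", "payment-settled"]))
```

A real broker adds durability, partitioning, and redelivery on failure; the bounded queue here only demonstrates backpressure and clean shutdown, two concerns that carry over directly to consumer design in Kafka or SQS.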
Posted 1 day ago
2.0 - 5.0 years
5 - 8 Lacs
Gurugram
Work from Office
Programming Languages: Python, Scala. Machine Learning frameworks: Scikit-learn, XGBoost, TensorFlow, Keras, PyTorch, spaCy, Gensim, Stanford NLP, NLTK, OpenCV, Spark MLlib. Machine learning algorithms experience is good to have. Scheduling experience: Airflow. Big Data/Streaming/Queues: Apache Spark, Apache NiFi, Apache Kafka, RabbitMQ (any one of them). Databases: MySQL, Mongo/Redis/DynamoDB, Hive. Source Control: Git. Cloud: AWS. Build and Deployment: Jenkins, Docker, Docker Swarm, Kubernetes. BI tool: QuickSight (preferred), else any BI tool (must have).
Posted 1 day ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
Remote
Job Title: Lead Product Manager - Application and Platform Connectors Location: Hyderabad, India / Remote, Anywhere in India Department: Product Management Reports To: Director, Product Management Job Overview Celigo is seeking a Lead Product Manager - Application and Platform Connectors to drive the strategy and execution of our connector ecosystem within our iPaaS platform. This role is designed for experienced professionals who excel in technical product leadership without direct people management responsibilities. This role focuses on driving the technical vision, execution, and collaboration across cross-functional teams to ensure successful product outcomes. The ideal candidate will act as a key technical expert, bridging the gap between engineering, product strategy, and business needs. Key Responsibilities Product Strategy and Vision Develop and execute the connector strategy, ensuring alignment with business objectives and customer needs, while leveraging AI to enhance UX and streamline the integration experience. Conduct market research and competitive analysis to identify trends in connectivity. Product Development and Execution Own the full lifecycle of connectors, defining requirements, managing engineering execution, and optimizing performance. Develop Feature Requirements Documents (FRDs) and user stories for cloud and on-prem applications. Oversee product roadmap discussions, triad meetings, and prioritization to ensure security, scalability, and usability. Research and implement enhancements to improve the configuration and troubleshooting experience for users. Cross-Functional Collaboration and Influence Lead communication between engineering, sales, marketing, and support, ensuring clear updates on roadmap priorities and risks. Support go-to-market efforts through customer and partner enablement, including documentation and training. 
Ensure roadmap priorities are informed by internal stakeholders and customer needs, while the connector operations team manages communication of vendor updates. Work closely with UX, AI, and engineering teams to refine the product usability experience. Customer Focus Engage customers and communicate roadmap priorities, ensuring integration reliability and usability. Leverage customer insights and data analytics to refine connector performance and user experience. Lead Product Advisory Councils (PACs) to align development with customer expectations. Collaborate with the connector operations team to ensure upgrades, deprecations, and maintenance are incorporated into the product roadmap. Use AI-driven insights to enhance the onboarding experience and simplify complex integration flows. Risk Management and Problem Solving Identify and mitigate risks across API updates, endpoint-driven changes, and performance optimizations. Provide regular updates to senior leadership on connector health and major technical risks. Evaluate trade-offs between engineering investment, customer impact, and endpoint-driven requirements to ensure long-term connector stability. Technical Leadership Serve as a subject matter expert and lead within the connector team, providing technical guidance and best practices for managing integrations at scale. Encourage collaboration and innovation in API connectivity and integration product management. Support and mentor peers in understanding best practices for designing and maintaining high-performing connectors. Skills And Qualifications Educational Background Bachelor’s degree in Computer Science, Engineering, Business, or a related field. An advanced degree is a plus. Experience 10+ years in SaaS, iPaaS, or API-first product management, with some experience in connectivity and integrations across APIs (REST, SOAP, GraphQL), file transfers (SFTP, FTPS, FTP, cloud), event-driven systems (Kafka, JMS), and On-Prem Agent (OPA) connectors. 
Experience with ERP and CRM integration platforms (e.g. NetSuite, Salesforce, SAP, Microsoft) Proven experience leading strategic initiatives, and technical discussions, driving execution without direct management responsibilities. Experience in leveraging AI to enhance UX, particularly in reducing user friction in integration setup and troubleshooting. Strong ability to influence senior leadership and drive cross-functional collaboration. Core Competencies Strategic thinker with expertise in API integrations, connector frameworks, and iPaaS platforms. Technical acumen to engage in AI-driven discussions, API authentication protocols, on-prem connectivity, and UX improvements for integration automation. Excellent communication skills, capable of articulating complex connector challenges to technical and non-technical audiences. Data-driven decision-maker, optimizing connector performance using customer insights and API monitoring metrics. Bonus Skills Deep experience in iPaaS, API management, and connector frameworks. Expertise in API-first platforms and best practices for managing API deprecations and endpoint updates. Experience designing AI-driven UX enhancements, such as AI-powered recommendations, automated troubleshooting, and no-code integration assistance. Physical Requirements Prolonged periods sitting at a desk and working on a computer.
Posted 1 day ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
Req ID: 333121. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Java, Spring, Spring Boot, Kafka, REST API, Microservices, Azure, CD Developer to join our team in Chennai, Tamil Nadu (IN-TN), India (IN). Java, Spring, Spring Boot, Kafka, REST API, Microservices, Azure, CD - Developer. Java Full Stack Engineer 3 (6-9 years). Mandatory Skills: Hands-on experience in Java, Spring, Spring Boot, and event/listener messaging frameworks like Kafka. Hands-on experience designing and developing robust RESTful APIs and microservices. Hands-on experience with HashiCorp Vault, Terraform, and Packer. Hands-on experience with Kubernetes tools and services, including managed Kubernetes platforms, service meshes, monitoring solutions, and security tools. In-depth understanding of API management (Stratum/Apigee). Proven experience in designing, deploying, and maintaining cloud infrastructure across platforms like AWS, Azure, or Google Cloud, preferably Azure: Namespaces, AKS, ASB, Data Factory, API Management, Storage Accounts, and Redis. Knowledge of CD processes and tools, and testing frameworks and practices (GitHub, Jenkins, uDeploy, Stash). Good-to-have skills for this role: knowledge of Control-M, DB2 to CICS, Cloud to CICS, and MAUI. Detailed JD: The Expertise You Have: Bachelor's degree in computer science, engineering, or equivalent. You have hands-on experience building the interconnected systems that enable a business to operate, including hardware, software, network, and database. Very strong expertise in updating and maintaining legacy systems to leverage modern technologies and architectures. You have expertise and experience in designing and developing microservices that can handle high transactions-per-second traffic. Strong understanding of data governance principles and best practices.
You are experienced with a variety of modern programming languages and frameworks. 8+ years of experience working with Java, Spring Boot, Oracle, Kubernetes, Kafka, and Azure/AWS cloud technologies. You have a passion for technology and can stay on top of the latest technology trends. Good working knowledge of ITIL processes such as incident management and change management. You have hands-on experience leading or mentoring scrum teams focused on building software solutions for business-critical, architecturally distributed experiences. The teams you have worked with have multi-functional responsibilities such as engineering, quality, DevOps, and release implementation. You care about cycle time and use CI/CD practices and tools to rapidly deploy changes to production while minimizing risk. You have strong communication skills and the technical expertise to drive and participate in meaningful discussions with partners across different roles and skill sets. The Skills that are Key to This Role: Hands-on experience in Java, Spring, Spring Boot, and event/listener messaging frameworks. Hands-on experience designing and developing robust RESTful APIs. Hands-on experience with HashiCorp Vault, Terraform, and Packer. Hands-on experience with Kubernetes tools and services, including managed Kubernetes platforms, service meshes, monitoring solutions, and security tools. In-depth understanding of API management (Stratum/Apigee). Proven experience in designing, deploying, and maintaining cloud infrastructure across platforms like AWS, Azure, or Google Cloud, preferably Azure: Namespaces, AKS, ASB, Data Factory, API Management, Storage Accounts, and Redis.
Hands-on experience in container-based development (Docker). Hands-on experience working with EDA solutions such as Kafka/MQ. Hands-on experience working with database and data concepts, tools, and technologies (Oracle, PL/SQL, Informatica). Familiarity with the OAuth 2.0 framework and scopes. Experience implementing microservices architecture and building/deploying highly automated, scalable, and maintainable infrastructure. Experience designing and developing apps with high throughput and low latency utilizing load balancing, caching, threading, etc. Knowledge of CD processes and tools, and testing frameworks and practices (GitHub, Jenkins, uDeploy, Stash). Strategic thinking and critical problem-solving skills. General Expectations: Must have good communication skills. Must be ready to work in the 10:30 AM to 8:30 PM shift. Flexible to work at the client location (Ramanujam IT Park, Taramani, Chennai). Must be ready to work from the office in a hybrid work environment; full remote work is not an option. Expect a full return to office in 2025. About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com. NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users.
If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
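One of the low-latency techniques the role above mentions, caching, can be sketched with a minimal bounded LRU cache over a plain `OrderedDict`. The capacity and keys are invented for illustration; production services would typically reach for Redis or an in-process cache library rather than hand-rolling this.

```python
from collections import OrderedDict

class LRUCache:
    """Bounded cache that evicts the least recently used entry,
    a common building block for low-latency read paths."""
    def __init__(self, capacity):
        self.capacity = capacity
        self._store = OrderedDict()

    def get(self, key, default=None):
        if key not in self._store:
            return default
        self._store.move_to_end(key)         # mark as most recently used
        return self._store[key]

    def put(self, key, value):
        if key in self._store:
            self._store.move_to_end(key)
        self._store[key] = value
        if len(self._store) > self.capacity:
            self._store.popitem(last=False)  # evict the oldest entry

cache = LRUCache(capacity=2)
cache.put("user:1", "alice")
cache.put("user:2", "bob")
cache.get("user:1")                          # touch user:1 so user:2 is evicted next
cache.put("user:3", "carol")
```

The bound matters as much as the recency policy: an unbounded cache in a high-throughput service is a slow memory leak.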
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
ZS is a place where passion changes lives. As a management consulting and technology firm focused on improving life and how we live it, our most valuable asset is our people. Here you’ll work side-by-side with a powerful collective of thinkers and experts shaping life-changing solutions for patients, caregivers and consumers, worldwide. ZSers drive impact by bringing a client-first mentality to each and every engagement. We partner collaboratively with our clients to develop custom solutions and technology products that create value and deliver company results across critical areas of their business. Bring your curiosity for learning, bold ideas, courage and passion to drive life-changing impact to ZS. At ZS we honor the visible and invisible elements of our identities, personal experiences and belief systems—the ones that comprise us as individuals, shape who we are and make us unique. We believe your personal interests, identities, and desire to learn are part of your success here. Learn more about our diversity, equity, and inclusion efforts and the networks ZS supports to assist our ZSers in cultivating community spaces, obtaining the resources they need to thrive, and sharing the messages they are passionate about. Business Technology: ZS’s Technology group focuses on scalable strategies, assets and accelerators that deliver to our clients enterprise-wide transformation via cutting-edge technology. We leverage digital and technology solutions to optimize business processes, enhance decision-making, and drive innovation. Our services include, but are not limited to, Digital and Technology advisory, Product and Platform development, and Data, Analytics and AI implementation. What you’ll do: Work with business stakeholders to understand their business needs. Create data pipelines that extract, transform, and load (ETL) data from various sources into a usable format in a data warehouse.
Clean, filter, and validate data to ensure it meets quality and format standards. Develop data model objects (tables, views) to transform the data into a unified format for downstream consumption. Monitor, control, configure, and maintain processes in a cloud data platform. Optimize data pipelines and data storage for performance and efficiency. Participate in code reviews and provide meaningful feedback to other team members. Provide technical support and troubleshoot issues. What you’ll bring: Bachelor’s degree in Computer Science, Information Technology, or a related field, or equivalent work experience. Experience working in the AWS cloud platform. Data engineer with expertise in developing big data and data warehouse platforms. Experience working with structured and semi-structured data. Expertise in developing big data solutions and ETL/ELT pipelines for data ingestion, data transformation, and optimization techniques. Experience working directly with technical and business teams. Able to create technical documentation. Excellent problem-solving and analytical skills. Strong communication and collaboration abilities. AWS (big data services): S3, Glue, Athena, EMR. Programming: Python, Spark, SQL, MuleSoft, Talend, dbt. Data warehouse: ETL, Redshift/Snowflake. Additional skills: Experience in data modeling. AWS Data Engineer certification. Experience with ITSM processes/tools such as ServiceNow and Jira. Understanding of Spark, Hive, Kafka, Kinesis, Spark Streaming, and Airflow. Perks & Benefits: ZS offers a comprehensive total rewards package including health and well-being, financial planning, annual leave, personal growth and professional development. Our robust skills development programs, multiple career progression options, internal mobility paths, and collaborative culture empower you to thrive as an individual and global team member. We are committed to giving our employees a flexible and connected way of working.
A flexible and connected ZS allows us to combine work from home and on-site presence at clients/ZS offices for the majority of our week. The magic of ZS culture and innovation thrives in both planned and spontaneous face-to-face connections. Travel: Travel is a requirement at ZS for client-facing ZSers; business needs of your project and client are the priority. While some projects may be local, all client-facing ZSers should be prepared to travel as needed. Travel provides opportunities to strengthen client relationships, gain diverse experiences, and enhance professional growth by working in different environments and cultures. Considering applying? At ZS, we're building a diverse and inclusive company where people bring their passions to inspire life-changing impact and deliver better outcomes for all. We are most interested in finding the best candidate for the job and recognize the value that candidates with all backgrounds, including non-traditional ones, bring. If you are interested in joining us, we encourage you to apply even if you don't meet 100% of the requirements listed above. ZS is an equal opportunity employer and is committed to providing equal employment and advancement opportunities without regard to any class protected by applicable law. To Complete Your Application: Candidates must possess or be able to obtain work authorization for their intended country of employment. An online application, including a full set of transcripts (official or unofficial), is required to be considered. NO AGENCY CALLS, PLEASE. Find Out More At: www.zs.com
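The clean/filter/validate steps described in the data engineering role above can be sketched as a small pure-Python transform. The record schema ("id" and "amount" fields) and the validation rules are assumptions for illustration; at this role's scale the same logic would live inside Spark or Glue jobs rather than a single function.

```python
def validate_and_transform(records):
    """Split raw records into clean rows (unified format) and rejects,
    mirroring the clean -> filter -> validate stage of an ETL pipeline."""
    clean, rejects = [], []
    for rec in records:
        errors = []
        if not rec.get("id"):
            errors.append("missing id")
        amount = rec.get("amount")
        try:
            amount = float(amount)
        except (TypeError, ValueError):
            errors.append("amount not numeric")
        if errors:
            # Keep the failing record and the reason, for a dead-letter table.
            rejects.append({"record": rec, "errors": errors})
        else:
            clean.append({"id": str(rec["id"]).strip(), "amount": round(amount, 2)})
    return clean, rejects

rows = [{"id": " 42 ", "amount": "19.95"}, {"id": None, "amount": "x"}]
good, bad = validate_and_transform(rows)
```

Routing rejects to a quarantine table instead of silently dropping them is what makes the downstream quality monitoring the posting mentions possible.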
Posted 1 day ago
8.0 - 12.0 years
15 - 30 Lacs
Gurugram
Work from Office
Role description Lead and mentor a team of data engineers to design, develop, and maintain high-performance data pipelines and platforms. Architect scalable ETL/ELT processes, streaming pipelines, and data lake/warehouse solutions (e.g., Redshift, Snowflake, BigQuery). Own the roadmap and technical vision for the data engineering function, ensuring best practices in data modeling, governance, quality, and security. Drive adoption of modern data stack tools (e.g., Airflow, Kafka, Spark) and foster a culture of continuous improvement. Ensure the platform is reliable, scalable, and cost-effective across batch and real-time use cases. Champion data observability, lineage, and privacy initiatives to ensure trust in data across the org. Skills: Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field. 8+ years of hands-on experience in data engineering, with at least 2 years in a leadership or managerial role. Proven experience with distributed data processing frameworks such as Apache Spark, Flink, or Kafka. Strong SQL skills and experience in data modeling, data warehousing, and schema design. Proficiency with cloud platforms (AWS/GCP/Azure) and their native data services (e.g., AWS Glue, Redshift, EMR, BigQuery). Solid grasp of data architecture, system design, and performance optimization at scale. Experience working in an agile development environment and managing sprint-based delivery cycles.
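The data quality and observability gates this role champions can be sketched as a small rule runner over a batch of rows. The completeness rule, the 10% null threshold, and the column names are invented for illustration; real platforms express these as declarative checks in tools like Great Expectations or dbt tests.

```python
def run_quality_checks(rows, required_columns, max_null_fraction=0.1):
    """Evaluate simple completeness checks over a batch of dict rows and
    report violations, the kind of gate a pipeline runs before publishing."""
    violations = []
    total = len(rows)
    if total == 0:
        return ["batch is empty"]
    for col in required_columns:
        nulls = sum(1 for r in rows if r.get(col) is None)
        if nulls / total > max_null_fraction:
            violations.append(f"{col}: {nulls}/{total} nulls exceeds threshold")
    return violations

batch = [{"id": 1, "ts": 10}, {"id": 2, "ts": None}, {"id": 3, "ts": 20}]
issues = run_quality_checks(batch, required_columns=["id", "ts"])
# ts has 1/3 nulls, above the 10% threshold, so one violation is reported
```

Failing the pipeline (or quarantining the batch) when `issues` is non-empty is what turns such checks into the trust-in-data guarantee the posting describes.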