Home
Jobs

91666 Python Jobs - Page 20

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

12 - 17 Lacs

Bengaluru

Work from Office

Not Applicable
Specialism: Data, Analytics & AI

Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Must have:
- Minimum 5 years of relevant experience within 10-12 years of total experience (Architect/Managerial level).
- Deep expertise with technologies such as Data Factory, Databricks (advanced), SQL DB (writing complex stored procedures), Synapse, Python scripting (mandatory), PySpark scripting, and Azure Analysis Services.
- Certified in DP-203 (Azure Data Engineer Associate) and Databricks Certified Data Engineer Professional (Architect/Managerial level).
- Strong troubleshooting and debugging skills.
- Proven experience working with source control technologies (such as GitHub and Azure DevOps) and build and release pipelines.
- Experience in writing complex PySpark queries to perform data analysis.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

Mandatory skill sets: Azure Databricks, PySpark, Data Factory
Preferred skill sets: Azure Databricks, PySpark, Data Factory, Python, Azure DevOps
Years of experience required: 9-15 years
Education qualification: B.Tech / M.Tech / M.E / MCA / B.E
Education Degrees/Field of Study required: Master of Engineering, Bachelor of Technology, Bachelor of Engineering
Required Skills: Azure Data Factory, Databricks Platform
Additional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling {+ 32 more}
Travel Requirements:
Government Clearance Required?
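
The "complex PySpark queries" requirement largely comes down to expressing grouped aggregations over DataFrames. As a framework-free sketch, here is the plain-Python equivalent of a `df.groupBy("region").agg(sum("amount"))` query; the dataset and column names are invented for illustration:

```python
from collections import defaultdict

# Hypothetical rows, standing in for a Spark DataFrame (all values illustrative).
rows = [
    {"region": "south", "amount": 120.0},
    {"region": "south", "amount": 80.0},
    {"region": "north", "amount": 50.0},
]

def total_by_region(rows):
    """Plain-Python mirror of df.groupBy("region").agg(sum("amount"))."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["amount"]
    return dict(totals)

print(total_by_region(rows))  # {'south': 200.0, 'north': 50.0}
```

In PySpark the same shape would run distributed across a cluster; the sketch only shows the aggregation logic being expressed.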

Posted 1 day ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Summary: At PwC, our people in data management focus on organising and maintaining data to enable accuracy and accessibility for effective decision-making. These individuals handle data governance, quality control, and data integration to support business operations. In data governance at PwC, you will focus on establishing and maintaining policies and procedures to optimise the quality, integrity, and security of data. You will be responsible for optimising data management processes and mitigating risks associated with data usage.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

A career within Data and Analytics services will provide you with the opportunity to help organizations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organizational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organizations to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Responsibilities (must have):
1. SAP Native HANA modelling experience with good knowledge of the architecture and features of HANA DB and HANA on cloud, SLT, and SDI (e.g. hands-on experience with calculated columns, input parameters/variables, and performance optimization techniques).
2. SAP Analytics: SAP BW, SAP BW on HANA, SAP Native HANA, ADSOs (Advanced DataStore Objects), Composite Providers, Cubes, Routines, DSOs, InfoObjects, MultiProviders, InfoSets.
3. SAP Extractors, ABAP.
4. Hands-on experience with SQL queries, performance optimization, and delta/SCD logic, and able to handle complex transformation logic.
5. Working on SLT.
6. Able to independently handle ETL activities, including loading data into HANA from SAP ECC, third-party systems, flat files, and other business formats.
7. Understanding of data profiling, data quality, Data Integrator, and platform transformations.
8. Handling SAP BODS, problem definition, and architecture/design detailing of processes.

Mandatory skill sets: Native HANA, BW on HANA, SLT, SQL
Certifications (any one): SAP Native HANA, SAP BW on HANA, SAP BI 7.0, SAP BW 3.5
Preferred skill sets (good to have): working knowledge of Python, BODS
Years of experience required: 5-10 years
Education qualification: B.Tech / M.Tech / MCA / MBA
Education Degrees/Field of Study required: Master of Engineering, Master of Business Administration, Bachelor of Engineering
Required Skills: SAP HANA
Additional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Process Management (BPM), Communication, Corporate Governance, Creativity, Data Access Control, Database Administration, Data Governance Training, Data Processing, Data Processor, Data Quality, Data Quality Assessment, Data Quality Improvement Plans (DQIP), Data Stewardship, Data Stewardship Best Practices, Data Stewardship Frameworks, Data Warehouse Governance, Data Warehousing Optimization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 17 more}
Government Clearance Required? No
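
The delta/SCD logic in point 4 can be illustrated with a minimal, framework-free sketch of a Type 2 slowly-changing-dimension merge, the pattern such HANA/BODS transformations typically implement (the table shape and field names are invented for the example):

```python
from datetime import date

def apply_scd2(current, incoming, today):
    """Type 2 SCD merge: close changed rows and open new versions.

    current:  dimension rows as dicts with "key", "value", "valid_from",
              "valid_to" (open rows have valid_to=None).
    incoming: mapping of key -> latest value from the source extract.
    """
    out, open_keys = [], set()
    for row in current:
        if row["valid_to"] is None and row["key"] in incoming:
            open_keys.add(row["key"])
            if incoming[row["key"]] != row["value"]:
                out.append({**row, "valid_to": today})               # close old version
                out.append({"key": row["key"], "value": incoming[row["key"]],
                            "valid_from": today, "valid_to": None})  # open new version
                continue
        out.append(row)
    for key, value in incoming.items():                              # brand-new keys
        if key not in open_keys and not any(r["key"] == key for r in current):
            out.append({"key": key, "value": value,
                        "valid_from": today, "valid_to": None})
    return out

dim = [{"key": "c1", "value": "A", "valid_from": date(2024, 1, 1), "valid_to": None}]
updated = apply_scd2(dim, {"c1": "B", "c2": "X"}, date(2025, 1, 1))
```

After the merge, `c1` has a closed historical row and an open row with the new value, and `c2` appears as a new open row; a real implementation would express the same logic as a HANA SQL MERGE or a BODS transform.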

Posted 1 day ago

Apply

6.0 - 7.0 years

13 - 15 Lacs

Bengaluru

Work from Office

Job Title: Software Developer
Location: TechM Blr ITC06 07
Years of Experience: 2-5 years

Job Summary: We are seeking a skilled Software Developer with a strong background in SAP Archiving to join our dynamic team. The ideal candidate will have 2-5 years of experience in software development, with a focus on SAP solutions. You will be responsible for designing, developing, and implementing software applications that meet our business needs while ensuring data integrity and compliance through effective archiving strategies.

Responsibilities:
- Design, develop, and maintain software applications in accordance with business requirements.
- Implement and manage SAP Archiving solutions to optimize data storage and retrieval processes.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct code reviews and ensure adherence to best practices in software development.
- Perform testing and debugging of applications to ensure high-quality deliverables.
- Provide technical support and troubleshooting for existing applications.
- Stay updated with the latest industry trends and technologies related to SAP and software development.

Mandatory Skills: Strong knowledge and experience in SAP Archiving.

Posted 1 day ago

Apply

4.0 - 9.0 years

16 - 17 Lacs

Bengaluru

Work from Office

Must have 4+ years of experience in:
- Design, Development, Testing, and Deployment: Lead the creation of scalable Data & AI applications using best practices in software engineering such as automation, version control, and CI/CD. Develop and implement rigorous testing strategies to ensure application reliability and performance. Oversee deployment processes, addressing issues related to configuration, environment, or security.
- Engineering and Analytics: Translate Data & AI use case requirements into effective data models and pipelines, ensuring data integrity through statistical quality procedures and advanced AI techniques.
- API & Microservice Development: Architect and build secure, scalable microservices and APIs, ensuring broad usability, security, and adherence to best practices in documentation and version control.
- Platform Scalability & Optimization: Evaluate and select optimal technologies for cloud and on-premise deployments, implementing strategies for scalability, performance monitoring, and cost optimization.
- Knowledge of machine learning frameworks (TensorFlow, PyTorch, Keras).
- Understanding of MLOps (machine learning operations) and continuous integration/deployment (CI/CD).
- Familiarity with deployment tools (Docker, Kubernetes).
- Technologies: Demonstrated expertise with Data & AI technologies (e.g., Spark, Databricks), programming languages (Java, Scala, SQL), API development patterns (e.g., HTTP/REST, GraphQL), and cloud platforms (Azure).

Good to have:
- Technologies: Expertise with Data & AI technologies (e.g., Kafka, Snowflake), programming languages (Python, SQL), and API development patterns (e.g., HTTP/REST, GraphQL).

Location: IND:KA:Bengaluru / Innovator Building, Itpb, Whitefield Rd - Adm: Intl Tech Park, Innovator Bldg
Job ID: R-72059
Date posted: 06/26/2025
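
As a minimal sketch of the HTTP/REST microservice pattern the role calls for, the following stands up a tiny JSON health endpoint using only the Python standard library; the path and payload are invented, and a production service would add authentication, logging, and containerized deployment:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    """Smallest shape of a REST microservice: one GET endpoint returning JSON."""

    def do_GET(self):
        if self.path == "/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep the demo quiet
        pass

server = HTTPServer(("127.0.0.1", 0), HealthHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

with urllib.request.urlopen(f"http://127.0.0.1:{server.server_port}/health") as resp:
    payload = json.load(resp)
print(payload)  # {'status': 'ok'}
server.shutdown()
```

In practice the same contract would be served by a framework behind a Docker/Kubernetes deployment; the sketch only shows the request/response shape.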

Posted 1 day ago

Apply

5.0 - 8.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Job Title: Config Lead - o9 Supply Planning
Experience: 5-8 Years
Location: Bengaluru (Hybrid)

Job Summary: We are seeking a dynamic and detail-oriented o9 Supply Planning Config Lead to join our growing supply chain solutions team. As a Config Lead, you will be responsible for leading the configuration and testing of, and support for, the o9 Supply Planning solution for clients across industries. You'll work closely with business stakeholders and managers to deliver value through intelligent supply chain planning.

Key Responsibilities:
- Lead the complete lifecycle of a project, from build and configuration to delivery of o9's solutions to customers across industries and geographies.
- Collaborate with business stakeholders, IT teams, and o9 experts to ensure successful project execution.
- Effectively map the business requirements into o9's platform and come up with a clear, phased, achievable blueprint.
- Work with a team of experts to configure the solution as per the design to solve deep operations/supply chain problems, and institute a rigorous performance monitoring process.
- Develop and configure o9 models, planning workflows, and analytics dashboards to meet client needs.
- Define and implement data integration strategies between o9 and enterprise systems like SAP, Oracle, or other ERPs.
- Conduct user training and coordinate with the change management and post-go-live support teams to drive adoption.
- Actively help improve internal processes and product features based on customer feedback by interfacing with the development and operations teams.
- Support customer demonstrations.
- Be a mentor and guide to junior members.

Skills/Qualifications:
- 5+ years of experience in o9 implementation and supply chain planning.
- Strong understanding of supply chain processes, including supply planning, inventory management, and S&OP.
- Experience configuring and deploying o9's Enterprise Knowledge Graph (EKG) and Data Model in at least 2-3 end-to-end implementations.
- Hands-on experience with SQL, Python, or similar scripting languages for data transformations.
- Ability to work with large datasets, APIs, and cloud-based analytics tools.
- Strong problem-solving and analytical skills with a focus on business value realization.
- Excellent communication and stakeholder management abilities.
- Familiarity with Agile methodologies and project management frameworks.
- Consulting or client-facing experience in industries like CPG, Retail, Pharma, or Manufacturing.
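
For large datasets, the "Python for data transformations" skill usually means streaming records rather than loading everything into memory. A minimal generator-based sketch, with invented columns and cleaning rules:

```python
import csv
import io

def transform(rows):
    """Stream rows lazily so arbitrarily large extracts fit in constant memory."""
    for row in rows:
        qty = int(row["qty"])
        if qty > 0:  # drop zero-quantity (cancelled) lines
            yield {"sku": row["sku"].strip().upper(), "qty": qty}

# A small in-memory stand-in for a large CSV extract.
raw = io.StringIO("sku,qty\n ab-1 ,2\nzz-9,0\ncd-2,5\n")
result = list(transform(csv.DictReader(raw)))
print(result)  # [{'sku': 'AB-1', 'qty': 2}, {'sku': 'CD-2', 'qty': 5}]
```

The same filter-and-normalise step could equally be written as a SQL view; the generator form is what keeps memory flat when the file does not fit in RAM.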

Posted 1 day ago

Apply

5.0 - 10.0 years

11 - 12 Lacs

Bengaluru

Work from Office

Req ID: 331216 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an AEP Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

About the Role: We are seeking a highly skilled and experienced Senior Adobe Experience Platform (AEP) Developer to join our growing team. In this role, you will play a critical part in the support, maintenance, design, development, and implementation of innovative customer data solutions within the AEP ecosystem. You will be responsible for building and maintaining robust data pipelines, integrating data from various sources, and developing custom solutions to meet the unique needs of our business.

Key Responsibilities:
AEP Platform Expertise: Deep understanding of the AEP suite, Experience Data Model (XDM), Data Science Workspace, and other relevant modules. Proficiency with AEP APIs, Web SDKs, and integrations with other MarTech platforms (Adobe Target, CJA, AJO, Adobe Campaign, etc.). Experience with AEP data ingestion, transformation, and activation. Strong understanding of data modeling principles and best practices within the AEP ecosystem.
Data Engineering & Development: Design, develop, and maintain high-quality data pipelines and integrations using AEP and other relevant technologies. Develop and implement custom solutions within the AEP environment using scripting languages (e.g., JavaScript, Python) and other relevant tools. Troubleshoot and resolve data quality issues and performance bottlenecks. Ensure data accuracy, consistency, and security across all stages of the data lifecycle.
Customer Data Solutions: Collaborate with cross-functional teams (e.g., marketing, product, data science) to understand issues and support their resolution. Support and maintain data-driven solutions that improve customer experience, personalize marketing campaigns, and drive business growth. Analyze data trends and provide insights to inform business decisions.
Project Management & Collaboration: Contribute to the planning and execution of AEP projects, ensuring timely delivery and adherence to project timelines and budgets. Effectively communicate technical concepts to both technical and non-technical audiences. Collaborate with team members and stakeholders to ensure successful project outcomes.
Stay Updated: Stay abreast of the latest advancements in AEP and related technologies. Continuously learn and expand your knowledge of data management, data science, and customer experience.

Qualifications:
Education: Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent experience).
Experience: Overall IT experience of 5+ years, with 3-4 years of hands-on experience with Adobe Experience Platform (AEP).
Technical Skills: 3+ years of strong proficiency in JavaScript or other relevant programming languages. 3+ years of experience with RESTful APIs, JSON, and XML. 3+ years of experience with data warehousing, data modeling, and data quality best practices. 3+ years of experience with a tag management system such as Adobe Launch. 2+ years of experience working with Web SDK. Experience with Adobe Analytics is a plus. Knowledge of Python libraries and tools for data cleaning and analysis is a plus. Experience with cloud platforms (e.g., AWS, Azure, GCP) is a plus.
Soft Skills: Excellent analytical and problem-solving skills. Strong communication, interpersonal, and collaboration skills. Ability to work independently and as part of a team. Detail-oriented and results-driven. Strong organizational and time-management skills.
Bonus Points: Experience with other Adobe Marketing Cloud solutions (e.g., Adobe Analytics, Adobe Target). Experience with Agile development methodologies. Experience with data visualization tools (e.g., Tableau, Power BI). Experience with data governance and compliance (e.g., GDPR, CCPA). Understanding of Real-time Customer Data Platform (RT-CDP).
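The AEP ingestion work described in this posting revolves around assembling XDM-shaped event payloads. A minimal Python sketch follows; note this is a hypothetical illustration only: the schema reference URL, field names, and helper functions are invented for the example and do not come from a real AEP tenant schema.

```python
import json
from datetime import datetime, timezone

# Hypothetical sketch: schema ID and field names are placeholders,
# not a real AEP tenant schema.
def build_experience_event(email, event_type):
    """Assemble a minimal XDM ExperienceEvent-style payload as a dict."""
    return {
        "header": {"schemaRef": {"id": "https://ns.adobe.com/example/schemas/event"}},
        "body": {
            "xdmEntity": {
                "eventType": event_type,
                "timestamp": datetime.now(timezone.utc).isoformat(),
                "identityMap": {"Email": [{"id": email, "primary": True}]},
            }
        },
    }

def validate_event(payload):
    """Return the fields this sketch treats as minimally required, if absent."""
    entity = payload.get("body", {}).get("xdmEntity", {})
    return [f for f in ("eventType", "timestamp", "identityMap") if f not in entity]

event = build_experience_event("user@example.com", "web.webpagedetails.pageViews")
serialized = json.dumps(event)  # payload as it might be posted to an ingestion API
```

In a real integration the payload shape is dictated by the tenant's XDM schema, so validation would be driven by the schema registry rather than a hard-coded field list.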

Posted 1 day ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Bengaluru

Work from Office

Req ID: 329983 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a DevOps Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Once You Are Here, You Will: Quickly be steeped in a suite of custom accelerators, including continuously tested and delivered Infrastructure, Pipeline, and Policy as Code, plus extensible automation and dependency-management code frameworks. You will develop event-driven functions, APIs, backend services, command-line interfaces, and self-service developer portals. We eat our own dog food; all your work will be covered with unit, security, governance, and functional testing using appropriate frameworks. Armed with these accelerators, you will be among the first on the ground to customize and deploy the delivery platform that will enable the application developers who follow us to rapidly create, demonstrate, and deliver value sprint over sprint for much of the Global Fortune 500.

Basic Qualifications:
1+ years of scripting experience in Bash or PowerShell
3+ years of experience with GCP engineering, APIs, Pub/Sub, BigQuery, Python, CI/CD, and DevOps processes
3+ years of experience in the design, development, configuration, and implementation of projects in GCP
4+ years of networking experience (security, DNS, VPN, cloud, load balancing)
4+ years of systems administration experience with at least one operating system (Linux or Windows)
1+ years of experience with one of the following public cloud platforms (AWS or Azure)
1+ years managing, maintaining, or working with SonarQube

Desired Experience & Skills:
1+ years of serverless or container-based architecture experience
1+ years of Infrastructure as Code (IaC) experience
3+ years of Azure DevOps management
3+ years managing, maintaining, or working with SonarQube, JFrog, Jenkins
Can autonomously contribute to cloud and application orchestration code and be actively involved in peer reviews
Can deploy and manage the common tools we use (Jenkins, monitoring, logging, SCM, etc.) from code
Advanced networking skills (tcpdump, network flow security analysis; can collect and understand metrics between microservices)
Familiarity with advanced authentication technologies (federated auth, SSO)
A curious mindset with the ability to identify and resolve issues from start to end

Please note the shift timing requirement: 1:30 pm IST - 10:30 pm IST.
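As a rough sketch of the command-line interface work mentioned above (the tool name, subcommand-free design, and flags are invented for illustration, not a real NTT DATA accelerator), a minimal Python CLI might look like:

```python
import argparse

# Hypothetical sketch of a small self-service deploy CLI; names are invented.
def build_parser():
    parser = argparse.ArgumentParser(prog="deployctl",
                                     description="Toy deployment CLI sketch")
    parser.add_argument("--env", choices=["dev", "stage", "prod"], default="dev",
                        help="target environment")
    parser.add_argument("--dry-run", action="store_true",
                        help="print the plan without applying it")
    return parser

def plan(args):
    """Describe what the CLI would do for the parsed arguments."""
    action = "would deploy" if args.dry_run else "deploying"
    return f"{action} to {args.env}"

# Example invocation, parsed from an explicit argv list:
args = build_parser().parse_args(["--env", "stage", "--dry-run"])
```

A production accelerator would layer real work (Terraform, pipeline triggers) behind `plan`, but the argument-parsing skeleton is the same.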

Posted 1 day ago

Apply

5.0 - 8.0 years

11 - 15 Lacs

Bengaluru

Work from Office

Req ID: 324145 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Config Lead - o9 Supply Planning to join our team in Bangalore, Karnataka (IN-KA), India (IN). Job Title: Config Lead - o9 Supply Planning. Experience: 5-8 Years. Location: Bengaluru (Hybrid).

Job Summary: We are seeking a dynamic and detail-oriented o9 Supply Planning Config Lead to join our growing supply chain solutions team. As a Config Lead, you will be responsible for leading the configuration, testing, and support of the o9 Supply Planning solution for clients across industries. You'll work closely with business stakeholders and managers to deliver value through intelligent supply chain planning.

Key Roles & Responsibilities:
Lead the complete lifecycle of a project, from build and configuration to delivery of o9's solutions to customers across industries and geographies.
Collaborate with business stakeholders, IT teams, and o9 experts to ensure successful project execution.
Effectively map business requirements into o9's platform and produce a clear, phased, achievable blueprint.
Work with a team of experts to configure the solution per the design to solve deep operations / supply chain problems, and institute a rigorous performance monitoring process.
Develop and configure o9 models, planning workflows, and analytics dashboards to meet client needs.
Define and implement data integration strategies between o9 and enterprise systems like SAP, Oracle, or other ERPs.
Conduct user training and coordinate with the change management and post-go-live support teams to drive adoption.
Actively help improve internal processes and product features based on customer feedback by interfacing with the development and operations teams.
Support customer demonstrations.
Be a mentor and guide to junior members.

Skills/Qualifications:
5+ years of experience in o9 implementation and supply chain planning.
Strong understanding of supply chain processes, including supply planning, inventory management, and S&OP.
Experience configuring and deploying o9's Enterprise Knowledge Graph (EKG) and Data Model in at least 2-3 end-to-end implementations.
Hands-on experience with SQL, Python, or similar scripting languages for data transformations.
Ability to work with large datasets, APIs, and cloud-based analytics tools.
Strong problem-solving and analytical skills with a focus on business value realization.
Excellent communication and stakeholder management abilities.
Familiarity with Agile methodologies and project management frameworks.
Consulting or client-facing experience in industries like CPG, Retail, Pharma, or Manufacturing.
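The SQL/Python data-transformation skill called out above can be illustrated with a small, self-contained sketch using SQLite. The table and column names are invented for the example; a real o9 integration would read from ERP extracts rather than an in-memory table.

```python
import sqlite3

# Invented demand table for illustration; a real pipeline would load ERP data.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE demand (sku TEXT, region TEXT, qty INTEGER)")
conn.executemany(
    "INSERT INTO demand VALUES (?, ?, ?)",
    [("A1", "EU", 10), ("A1", "EU", 5), ("A1", "US", 7), ("B2", "EU", 3)],
)

# Aggregate raw transactions to the sku/region grain a planning model
# might expect as input.
rows = conn.execute(
    "SELECT sku, region, SUM(qty) FROM demand "
    "GROUP BY sku, region ORDER BY sku, region"
).fetchall()
```

The same shape of transformation (aggregate, re-grain, order) is what implementation teams typically script when staging data for a planning platform.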

Posted 1 day ago

Apply

7.0 - 12.0 years

3 - 7 Lacs

Chennai

Work from Office

Job Summary: We are looking for an experienced Senior Software Engineer with deep expertise in Spark SQL / SQL development to lead the design, development, and optimization of complex database systems. As a Senior Spark SQL/SQL Developer, you will play a key role in creating and maintaining high-performance, scalable database solutions that meet business requirements and support critical applications. You will collaborate with engineering teams, mentor junior developers, and drive improvements in database architecture and performance.

Key Responsibilities:
Design, develop, and optimize complex Spark SQL / SQL queries, stored procedures, views, and triggers for high-performance systems.
Lead the design and implementation of scalable database architectures to meet business needs.
Perform advanced query optimization and troubleshooting to ensure database performance, efficiency, and reliability.
Mentor junior developers and provide guidance on best practices for SQL development, performance tuning, and database design.
Collaborate with cross-functional teams, including software engineers, product managers, and system architects, to understand requirements and deliver robust database solutions.
Conduct code reviews to ensure code quality, performance standards, and compliance with database design principles.
Develop and implement strategies for data security, backup, disaster recovery, and high availability.
Monitor and maintain database performance, ensuring minimal downtime and optimal resource utilization.
Contribute to long-term technical strategies for database management and integration with other systems.
Write and maintain comprehensive documentation on database systems, queries, and architecture.

Required Skills & Qualifications:
Experience: 7+ years of hands-on experience in SQL development / data engineering or a related field.
Expert-level proficiency in Spark SQL and extensive experience with big data (Hive), MPP (Teradata), and relational databases such as SQL Server, MySQL, or Oracle.
Strong experience in database design, optimization, and troubleshooting.
Deep knowledge of query optimization, indexing, and performance tuning techniques.
Strong understanding of database architecture, scalability, and high availability strategies.
Experience with large-scale, high-transaction databases and data warehousing.
Strong problem-solving skills with the ability to analyze complex data issues and provide effective solutions.
Experience with data testing and data reconciliation.
Ability to mentor and guide junior developers and promote best practices in SQL development.
Proficiency in database migration, version control, and integration with applications.
Excellent communication and collaboration skills, with the ability to interact with both technical and non-technical stakeholders.
Preferred Qualifications: Experience with NoSQL databases (e.g., MongoDB, Cassandra) and cloud-based databases (e.g., AWS RDS, Azure SQL Database). Familiarity with data analytics, ETL processes, and data pipelines. Experience with automation tools, CI/CD pipelines, and agile methodologies. Familiarity with programming languages such as Python, Java, or C#.
Education: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field (or equivalent experience).
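The query optimization and indexing expertise listed above can be demonstrated in miniature with SQLite as a stand-in engine (the `orders` table is invented for illustration): adding an index turns a full table scan into an index search, which is the same behavior tuning work targets in Spark SQL, Hive, or Teradata.

```python
import sqlite3

# Invented orders table; real tuning would target a production engine.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(i, i % 100, i * 1.5) for i in range(1000)])

def plan_for(query):
    """Concatenate the detail column of SQLite's EXPLAIN QUERY PLAN output."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + query))

query = "SELECT total FROM orders WHERE customer_id = 42"
before = plan_for(query)                       # full scan of the table
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan_for(query)                        # now a search via the index
```

Reading the plan before and after an index change is the core loop of the "query optimization, indexing, and performance tuning" work the posting describes, whatever the engine.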

Posted 1 day ago

Apply

5.0 - 10.0 years

14 - 15 Lacs

Bengaluru

Work from Office

Job Description: Sr. Python Developer

We are seeking a talented and experienced Senior Python Developer to join our growing team. You will play a key role in designing, developing, and implementing complex software solutions using Python and related technologies. You will be responsible for the entire development lifecycle, from requirements gathering to deployment and maintenance.

Responsibilities:
Collaborate with designers, product managers, and other engineers to understand requirements and translate them into high-quality, maintainable Python code.
Design, develop, test, and deploy efficient and scalable back-end features using the Python language and frameworks (e.g., Django, Flask).
Write clean, well-documented, and unit-testable code adhering to best practices.
Participate in code reviews and provide constructive feedback to improve code quality.
Troubleshoot and debug complex technical problems.
Automate tasks using scripting languages (e.g., Bash, Python).
Stay up to date with the latest Python technologies and best practices.
Participate in the technical design and architecture of new systems.
Potentially mentor and guide junior developers.

Qualifications:
Minimum 5 years of experience in software development with Python.
Strong understanding of object-oriented programming (OOP) principles and design patterns.
Experience with web development frameworks (e.g., Django, Flask) is a plus.
Experience with database technologies is a plus.
Experience with version control systems (e.g., Git) is a must.
Experience with unit testing frameworks (e.g., unittest, pytest) is a plus.
Excellent problem-solving and analytical skills.
Strong communication and collaboration skills.
Ability to work independently and as part of a team.
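As a small illustration of the "clean, well-documented, and unit-testable code" expectation above, here is the kind of helper and pytest-style assertions involved. The `chunk` function is invented for the example, not part of any framework named in the posting.

```python
# Invented example of a small, unit-testable helper with a docstring.
def chunk(items, size):
    """Split a list into consecutive chunks of at most `size` items."""
    if size < 1:
        raise ValueError("size must be >= 1")
    return [items[i:i + size] for i in range(0, len(items), size)]

# pytest-style tests assert on small, deterministic cases:
assert chunk([1, 2, 3, 4, 5], 2) == [[1, 2], [3, 4], [5]]
assert chunk([], 3) == []
```

Under pytest, each assertion would live in its own `test_*` function so failures report independently; the principle (pure function, explicit error on bad input, deterministic cases) is the same.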

Posted 1 day ago

Apply

8.0 - 13.0 years

4 - 9 Lacs

Pune

Work from Office

Are you curious, motivated, and forward-thinking? At FIS, you'll have the opportunity to work on some of the most challenging and relevant issues in financial services and technology. Our talented people empower us, and we believe in being part of a team that is open, collaborative, entrepreneurial, passionate and, above all, fun.

Position: Automation Engineer (Software Test Analyst Senior). Required Experience Level: 8+ Years. Work Location: Pune. Educational Qualification: B.E. in Computer Science/IT. Work Shift: Regular.

About Team & Product/Project Details: As a QA SME, you'll be part of our PTP Platform development group, working in scrum teams responsible for key system functionalities. You will become involved with all phases of the QA lifecycle and take ownership of key QA functional areas.

What you will be doing (Shift Time & Key Roles and Responsibilities): In this position you will work as a Software Test Analyst Senior, providing end-to-end solutions for business problems and innovative automation solutions for fast-tracking releases. As an individual contributor you will:
Boost QA coverage, follow testing standards, and serve as the QA advocate.
Collaborate with your team and help QA activities to deliver quality releases.
Expand QA automation frameworks and features to increase test coverage.
Perform test analysis, design, execution, defect management and stakeholder engagement.
Mentor team members to implement wider QA and testing initiatives.
Be responsible for requirement analysis and test plan design; test data preparation with wide coverage of functionalities; test automation framework creation and maintenance; general QA activities including defect reporting and tracking; effective stakeholder engagement; CI/CD; and realizing automation and innovation opportunities to improve productivity.
Participate in Agile activities and contribute to achieving defined goals for each sprint.
Work with the team to understand business scenarios and automate them in Python or with other innovative solutions.
Work within the team as well as with various stakeholders, including clients and global teams geographically located in different regions.
Build an understanding of the product on a technical and functional level, understand the context of our clients, participate in solution design, drive testing activities, and deliver solutions that address the business needs.
Work with the Business Analyst and Solution Architect, as well as experienced people on the team, to understand project requirements and identify requirement gaps and functional risks so these can be addressed early in the process.
Evaluate and/or engineer test data to be used throughout the software development life cycle.
Drive all phases of test execution and documentation, including functional, regression, performance, usability, integration and acceptance testing.
Work closely with other team members, like developers and BAs, to ensure better turnaround for issue resolution and change implementation.
Implement processes to improve the quality of deliverables, create and maintain a repository of documents, and train and guide junior team members.
Ensure the quality of newly built or enhanced features, and regression test to make sure nothing is broken.

What you bring (Experience Level & Mandatory Skills):
Hands-on experience in automation frameworks using Python
Proficiency in SQL, including creating complex queries
Good hands-on experience with UNIX/Linux commands and scripting skills
Relevant degree in a numeric discipline, or equivalent work experience
Test case management (e.g. Zephyr / X-ray), Jira and Confluence (or similar) experience
Experience in Banking and Finance / Capital Markets
Excellent written and spoken English

Added bonus if you have (Desirable skills):
Docker / Kubernetes experience
Non-functional testing experience (performance, security scans, etc.)
Experience with financial markets and the trade lifecycle
API and GUI automation experience

What We Offer You:
A range of benefits designed to help support your lifestyle and wellbeing
A multi-faceted job with a broad spectrum of responsibilities
An international work environment and a dedicated and innovative team
A flexible and creative work environment
A diverse and collaborative atmosphere
Professional and personal development resources

Posted 1 day ago

Apply

5.0 - 8.0 years

11 - 12 Lacs

Bengaluru

Work from Office

Req ID: 330804 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Azure Cloud Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Domain/Skills required:
Banking domain; Azure consultant
Bachelor's/Master's degree in Computer Science or Data Science
5 to 8 years of experience in software development and with data structures/algorithms
5 to 7 years of experience with a programming language (Python or Java), database languages (e.g., SQL), and NoSQL
5 years of experience developing large-scale platforms, distributed systems or networks, with experience in compute technologies and storage architecture
Strong understanding of microservices architecture
Experience building AKS applications on Azure
Strong understanding of and experience with Kubernetes for availability and scalability of applications in Azure Kubernetes Service
Experience building and deploying applications on Azure using third-party tools (e.g., Docker, Kubernetes, and Terraform)
Experience working with AKS clusters, VNETs, NSGs, Azure storage technologies, Azure container registries, etc.
Good understanding of building Redis, Elasticsearch, and MongoDB applications
Preferably has worked with RabbitMQ
End-to-end understanding of ELK, Azure Monitor, Datadog, Splunk and logging stacks
Experience with development tools and CI/CD pipelines such as GitLab CI/CD, Artifactory, CloudBees, Jenkins, Helm, Terraform, etc.
Understanding of IAM roles on Azure and integration/configuration experience
Preferably has worked on DataRobot setup or similar applications on Cloud/Azure
Functional, integration, and security testing; performance validation

Posted 1 day ago

Apply

5.0 - 10.0 years

22 - 27 Lacs

Bengaluru

Work from Office

Req ID: 331415 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Systems Integration Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Required skills:
5+ years of experience in IT technology
2+ years of experience in AI/ML with strong working knowledge of neural networks
2+ years of data engineering experience, preferably using AWS Glue, Cribl, SignalFx, OpenTelemetry or AWS Lambda
2+ years of Python coding using NumPy, vectorization, and TensorFlow
2+ years of experience leading complex enterprise-wide integration programs and efforts as an individual contributor

Preferred skills:
Mathematics or physics degree
2+ years of technical knowledge of cloud technologies such as AWS, Azure, GCP
Excellent verbal, written, and interpersonal communication skills
Ability to provide strong customer service

#LI-INPAS

Posted 1 day ago

Apply

8.0 - 13.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Req ID: 329466 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an AWS Cloud Kafka Lead Data Engineer to join our team in Bengaluru, Karnataka (IN-KA), India (IN).

Data Engineer Lead:
Robust hands-on experience with industry-standard tooling and techniques, including SQL, Git and CI/CD pipelines (mandatory)
Management, administration, and maintenance of data streaming tools such as Kafka/Confluent Kafka, Flink
Experience with software support for applications written in Python and SQL
Administration, configuration and maintenance of Snowflake and dbt
Experience with data product environments that use tools such as Kafka Connect, Snyk, Confluent Schema Registry, Atlan, IBM MQ, SonarQube, Apache Airflow, Apache Iceberg, DynamoDB, Terraform and GitHub
Debugging issues, root cause analysis, and applying fixes
Management and maintenance of ETL processes (bug fixing and batch job monitoring)

Training & Certification: Apache Kafka Administration; Snowflake Fundamentals/Advanced Training

Experience: 8 years of experience in a technical role working with AWS; at least 2 years in a leadership or management role

#LI-INPAS

Posted 1 day ago

Apply

8.0 - 12.0 years

14 - 19 Lacs

Bengaluru

Work from Office

Job Summary: As a Senior Software Engineer on the Core Software and Data Management team at NetApp, you will focus on delivering solutions that meet customers' needs across engineered systems and cloud services. The CSDM team is responsible for a range of functions in ONTAP, NetApp's cross-platform data management and storage software. Areas of responsibility include the ONTAP filesystem, anti-ransomware and encryption data management capabilities, and core WAFL features and quality.

Job Requirements:
Excellent coding skills in C/C++ required; Python is optional.
System/kernel programming and experience with filesystems, networking, or file/cloud protocols is a must.
Proven track record of leading mid- to large-sized projects.
This position requires an individual to be creative, team-oriented, a quick learner and driven to produce results.
Responsible for providing support in the development and testing activities of other engineers that involve several inter-dependencies.
Participate in technical discussions within the team and with other groups within business units associated with specified projects.
Willing to work on additional tasks and responsibilities that will contribute towards team, department and company goals.
A strong understanding of and experience with concepts related to computer architecture, data structures and programming practices.

Education: Typically requires a minimum of 8-12 years of related experience with a Bachelor's degree, or 6 years and a Master's degree, or a PhD with 3 years of experience, or equivalent experience.

Posted 1 day ago

Apply

5.0 - 7.0 years

22 - 25 Lacs

Bengaluru

Work from Office

Job Summary: Member of a software engineering team involved in the development and design of features for NetApp's flagship storage operating system, ONTAP. ONTAP is a feature-rich stack whose data management capabilities deliver tremendous value to our customers and are used in mission-critical applications across the world. You will work as part of a team responsible for the development, testing and debugging of distributed software that drives NetApp cloud, hybrid-cloud, and on-premises solutions. As part of the Research and Development function, the overall focus of the group is on competitive market and customer requirements, supportability, technology advances, product quality, product cost and time-to-market. Software engineers focus on enhancements to existing products as well as new product development. This is a mid-level technical position that requires an individual to be broad-thinking, systems-focused, creative, team-oriented, technologically savvy, able to work in small and large cross-functional teams, willing to learn and driven to produce results.

Job Requirements:
Excellent coding skills in C/C++ required; Python is optional.
System/kernel programming and multithreading; experience with filesystems, networking or file/cloud protocols is a must.
Proven track record of working on mid- to large-sized projects.
This position requires an individual to be creative, team-oriented, a quick learner, and driven to produce results.
Responsible for providing support in the development and testing activities of other engineers that involve several inter-dependencies.
Participate in technical discussions within the team and across cross-functional teams.
Willing to work on additional tasks and responsibilities that will contribute towards team, department, and company goals.
A strong understanding of and experience with concepts related to computer architecture, data structures, and programming practices.
Work collaboratively within a team environment of other engineers to meet aggressive goals and high-quality standards.
Possess sufficient technical knowledge and experience to pick up new expertise quickly with guidance from the technical leads.
Participate in all phases of the product development cycle: from product definition and design through implementation, debugging, testing and early customer support.
Be resourceful in applying creative ideas to solve problems.
Support critical and/or high-visibility customer support engagements.

Education: Requires a minimum of 5-7 years of related experience with a Bachelor's degree, or 3-5 years and a Master's degree, or a PhD with 1 year of experience, or equivalent experience.

Posted 1 day ago

Apply

4.0 - 6.0 years

11 - 12 Lacs

Noida

Work from Office

Date posted: 07/03/2025. End Date: 07/24/2025. City: Noida. State/Region: Uttar Pradesh. Country: India. Location Type: Onsite.

Calling all innovators: find your future at Fiserv.

Job Title: Specialist, Quality Assurance Engineering

What does a successful Automation Testing Professional do?
Understand business requirements and translate them into test cases.
Work closely with business analysts, developers, and QAs to test software and ensure quality.
Design and maintain automated test suites.
Gain experience with different QA/defect-tracking tools and processes.

What you will do:
Work with the QA Automation team on short- to long-term automation projects.
Decompose requirements and develop test automation scripts for projects of simple to high complexity, including both functional and interoperability testing.
Practice automation development efficiency, maintainability, and reusability.
Develop automation scripts according to the coding standards and debug scripts developed by team peers.
Participate in technical review meetings and automation framework enhancement forums, and demonstrate work independently with minimal supervision.
Understand and enhance the automation framework.
Lead the design, development and maintenance of automation frameworks and automation test suites with CI/CD integration.
Report and track defects in a timely manner during test execution.

What you will need to have:
4 to 6 years of experience as a Quality Assurance Engineer.
Minimum 2 years of experience in automation testing.
Good knowledge of testing phases and approaches, including functional, regression, integration, and end-to-end testing.
Good knowledge of UI/API/Web API/mobile testing technologies.
Experience working in an Agile/Scrum development process.
Good understanding of the software test lifecycle, including test planning, test case design, test data setup, defect management, test logs, test results, and traceability matrices.
Experience with a programming language: Selenium/Java/.NET/Python/C#.
Experience in database testing using tools like MS SQL Server/MySQL/Oracle.
Knowledge of automated testing tools like TestNG, JUnit/Postman/ReadyAPI/SoapUI/UFT/TOSCA.
Working experience with CI/CD/DevOps tools: Jenkins or TeamCity.
Familiarity with version control systems like GitHub/Bitbucket/Azure DevOps.
Knowledge of a monitoring tool like Splunk.
Strong problem-solving skills.
Good communication skills to speak with business partners are essential and non-negotiable for this position.

What would be great to have:
Knowledge of cloud platforms like AWS/Azure/GCP.
Knowledge of tools like Pipelines, Git repos, SQL Developer.
Prior experience in the Payments or Banking and Finance domain in general.

Thank you for considering employment with Fiserv. Please apply using your legal name, complete the step-by-step profile, and attach your resume (either is acceptable; both are preferable).

Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information.
Any communications from a Fiserv representative will come from a legitimate Fiserv email address.
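The requirement decomposition and test-data work this posting describes often takes the form of data-driven test scripts: a table of cases drives the assertions. A schematic sketch follows; the `validate_amount` rule and the cases are invented for illustration, not Fiserv requirements.

```python
# Hypothetical rule under test: a payment amount must be positive and
# carry at most two decimal places. Invented for this sketch.
def validate_amount(amount):
    return amount > 0 and round(amount, 2) == amount

# Data-driven cases: (input, expected result). Adding coverage means
# adding rows, not new test code.
cases = [
    (10.50, True),
    (0.0, False),
    (-5, False),
    (3.999, False),
]

failures = [amt for amt, expected in cases if validate_amount(amt) != expected]
```

A framework like pytest would express the same table with `@pytest.mark.parametrize`, so each row reports pass/fail independently.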

Posted 1 day ago

Apply

3.0 - 6.0 years

4 - 8 Lacs

Pune

Work from Office

Overview: We are looking for a candidate who has:
CAD CATIA data to TC migration experience (mandatory)
Worked on legacy system to TC migration projects
Experience with Teamcenter migration tools like csv2tcxml, tcxml export, tcxml import, ips upload
Worked on analysing CATIA data, reporting issues and CATIA data cleansing
Knowledge of the Teamcenter Export Import Admin application (transfer modes, closure rules)
Knowledge of the Teamcenter data model and BMIDE usage
Knowledge of Teamcenter database tables
Knowledge of TC, CATIA and TCIC setup and configuration, and TC and CATIA Integration application usage
Teamcenter application usage, such as My Teamcenter, Structure Manager, Organization, Classification, Projects
Knowledge of the CATIA API and CAT Script API will be helpful
Experience in SQL, Python, Java, Windows batch scripting, XML transformation
Experience with Teamcenter ITK for batch utility development and customization
Good communication skills, defect analysis and defect-solving skills
Should work in a team, be proactive, and be responsible and accountable for tasks, from understanding requirements through implementation and testing
The architect should have led migration projects and be capable of leading in technical areas, customer and internal communications, and team and task management

Qualifications: Bachelor's degree in computer science or equivalent
Essential skills: as listed in the overview above
Experience: 3-6 years

Posted 1 day ago

Apply

5.0 - 8.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Title: Tech Lead Date: 4 Jul 2025 Location: Bangalore, KA, IN Job Description We are a technology-led healthcare solutions provider. We are driven by our purpose to enable healthcare organizations to be future-ready. We offer accelerated, global growth opportunities for talent that's bold, industrious, and nimble. With Indegene, you gain a unique career experience that celebrates entrepreneurship and is guided by passion, innovation, collaboration, and empathy. To explore exciting opportunities at the convergence of healthcare and technology, check out www.careers.indegene.com Looking to jump-start your career? We understand how important the first few years of your career are, which create the foundation of your entire professional journey. At Indegene, we promise you a differentiated career experience. You will not only work at the exciting intersection of healthcare and technology but also will be mentored by some of the most brilliant minds in the industry. We are offering a global fast-track career where you can grow along with Indegene's high-speed growth. We are purpose-driven. We enable healthcare organizations to be future-ready, and our customer obsession is our driving force. We ensure that our customers achieve what they truly want. We are bold in our actions, nimble in our decision-making, and industrious in the way we work. If this excites you, then apply below. Role: Research Lead - Agentic AI & Advanced RAG. Role Type: AI Research & Innovation. Description: We are looking for a curious, experimental, and technically deep Research Lead to shape the future of our Agentic AI and Advanced RAG platforms. You will lead PoCs involving Schema RAG, Knowledge Graphs, MCP, and multi-agent task orchestration.
Responsibilities
- Prototype LLM-based agents using LangChain/DSPy
- Research and validate Schema RAG, Multi-RAG, and contextual prompt memory
- Implement and test the Model Context Protocol across agent-to-agent (A2A) communication flows
- Integrate taxonomies and ontologies with vector DBs and prompt retrieval
- Publish internal papers, evaluate prompt safety, and support productization

Must Have
Experience: 5-8 years
Tech Stack: Python, LangChain, OpenAI APIs, Hugging Face, DSPy; Streamlit, Neo4j, pgvector, Redis, FastAPI
Cloud & Deployment: AWS (Bedrock, SageMaker), Azure AI Studio, GCP AI Platform
AI Tools & Productivity Stack: GitHub Copilot, Langfuse, code-review agents
Security & Compliance: Prompt safety, guardrail evaluation, hallucination monitoring, audit trail

Good to Have
Innovative thinker, research-oriented, strong communicator, collaborative, self-starter

EQUAL OPPORTUNITY
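The retrieval step behind RAG work like this can be sketched in plain Python: a toy in-memory vector store ranked by cosine similarity. This is only an illustrative sketch; the function names, toy embeddings, and store layout are hypothetical and not part of any stack named in the listing.

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def retrieve(query_vec, store, k=2):
    # Return the texts of the k chunks most similar to the query embedding.
    ranked = sorted(store, key=lambda item: cosine(query_vec, item["vec"]), reverse=True)
    return [item["text"] for item in ranked[:k]]

# Toy "vector store": in practice this would be pgvector, Neo4j, etc.
store = [
    {"text": "chunk A", "vec": [1.0, 0.0]},
    {"text": "chunk B", "vec": [0.0, 1.0]},
    {"text": "chunk C", "vec": [0.9, 0.1]},
]

print(retrieve([1.0, 0.0], store, k=2))  # → ['chunk A', 'chunk C']
```

Production systems delegate the ranking to a vector database, but the contract (embed the query, return top-k nearest chunks, feed them into the prompt) is the same.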

Posted 1 day ago

Apply

3.0 - 5.0 years

13 - 14 Lacs

Bengaluru

Work from Office

Title: Sr. Engineer
Date: 4 Jul 2025
Location: Bangalore, KA, IN

Job Description
We are a technology-led healthcare solutions provider, driven by our purpose to enable healthcare organizations to be future-ready. To explore exciting opportunities at the convergence of healthcare and technology, check out www.careers.indegene.com

Role: Senior Developer (LangChain experience)

Description: We are seeking a highly skilled Senior Python Developer with 3-5 years of experience to join our dynamic team. The ideal candidate will have extensive expertise in Python development and a strong understanding of LangChain. You will be responsible for developing and maintaining high-quality software solutions that align with our company's objectives.

Key Responsibilities
- Design, develop, and maintain scalable Python applications using best practices.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Implement and maintain integrations with LangChain, ensuring seamless communication between systems.
- Write clean, efficient, and maintainable code while adhering to coding standards.
- Conduct code reviews and provide constructive feedback to team members.
- Troubleshoot and debug issues to optimize performance and ensure reliability.
- Stay updated on emerging technologies and industry trends, particularly around LangChain and LLMs.
- Participate in Agile development processes, including sprint planning, daily stand-ups, and retrospectives.
- Mentor junior developers and share knowledge across the team.
- Collaborate with stakeholders to gather requirements and propose technical solutions.

Must Have
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Minimum of 3 years of professional experience in Python development.
- Proficiency in LangChain and experience with GenAI technologies.
- Strong understanding of software development principles, data structures, and algorithms.
- Experience with web frameworks.
- Familiarity with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., MongoDB).
- Knowledge of version control systems (e.g., Bitbucket) and CI/CD pipelines.
- Good communication and collaboration skills.
- Ability to work independently and as part of a team in a fast-paced environment.
- Experience with cloud platforms such as AWS, Azure, or GCP is a plus.

EQUAL OPPORTUNITY
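The "chain" idea LangChain is named for can be illustrated without the library itself: each step's output feeds the next, from prompt formatting through the model call to post-processing. A minimal sketch follows; the stub LLM and helper names are hypothetical stand-ins, not LangChain APIs.

```python
def chain(*steps):
    # Compose callables left to right: the output of each becomes the input of the next.
    def run(value):
        for step in steps:
            value = step(value)
        return value
    return run

def format_prompt(question):
    # Prompt template step (a real pipeline would use a richer template).
    return f"Answer briefly: {question}"

def fake_llm(prompt):
    # Stand-in for a model call; wraps the prompt so we can see the flow.
    return f"[model output for: {prompt}]"

def strip_tag(text):
    # Output-parser step: undo the stub's wrapper.
    return text.removeprefix("[model output for: ").removesuffix("]")

pipeline = chain(format_prompt, fake_llm, strip_tag)
print(pipeline("What is RAG?"))  # → Answer briefly: What is RAG?
```

Swapping `fake_llm` for a real model client turns this composition pattern into the template-model-parser pipeline the framework provides out of the box.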

Posted 1 day ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Pune

Work from Office

Overview
We are looking for a candidate with the following profile. CAD NX data to Teamcenter migration experience is mandatory.
- Worked on legacy-system-to-Teamcenter migration projects
- Experience with Teamcenter migration tools such as csv2tcxml, tcxml export, tcxml import, and ips upload
- Worked on analysing NX data, reporting issues, and NX data cleansing
- Knowledge of the Teamcenter Export/Import Admin application (transfer modes, closure rules)
- Knowledge of the Teamcenter data model and BMIDE usage
- Knowledge of Teamcenter database tables
- Knowledge of TC, NX, and TCIN setup and configuration, and TC-NX Integration application usage
- Teamcenter application usage such as My Teamcenter, Structure Manager, Organization, Classification, Projects
- Knowledge of the NX Open API will be helpful
- Experience in SQL, Python, Java, Windows batch scripting, and XML transformation
- Experience with Teamcenter ITK for batch-utility development and customization
- Good communication, defect-analysis, and defect-solving skills
- Should work well in a team, be proactive, and be responsible and accountable for tasks from requirement understanding through implementation and testing
- The architect should have led migration projects and be capable of leading in technical areas, customer and internal communications, and team and task management

Qualifications
Bachelor's degree in computer science or equivalent

Essential skills
As listed in the Overview above.

Experience: 3-6 years
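Migration tooling of the csv2tcxml kind is, at its core, CSV-to-XML transformation. A minimal stdlib sketch of that shape follows; the column names and element tags are hypothetical and do not reflect the actual tcxml schema.

```python
import csv
import io
import xml.etree.ElementTree as ET

def csv_to_xml(csv_text, root_tag="Items", row_tag="Item"):
    # Convert CSV rows into a flat XML document: one child element per row,
    # one sub-element per column. Real tcxml has a far richer schema.
    root = ET.Element(root_tag)
    for row in csv.DictReader(io.StringIO(csv_text)):
        item = ET.SubElement(root, row_tag)
        for column, value in row.items():
            ET.SubElement(item, column).text = value
    return ET.tostring(root, encoding="unicode")

# Hypothetical legacy export: item IDs with revision letters.
sample = "item_id,revision\nPART-001,A\nPART-002,B\n"
print(csv_to_xml(sample))
```

A real migration utility would add schema validation, attribute mapping against the BMIDE data model, and error reporting for unclean rows, but the row-to-element transformation above is the common core.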

Posted 1 day ago

Apply