
4508 Informatica Jobs - Page 32

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating pipelines/workflows for source-to-target data movement and implementing solutions that address the client's needs.

Your primary responsibilities include: Design, build, optimize, and support new and existing data models and ETL processes based on our client's business requirements. Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization. Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need it.

Required education: Bachelor's Degree. Preferred education: Master's Degree.

Required technical and professional expertise: Design, develop, and maintain Ab Initio graphs for extracting, transforming, and loading (ETL) data from diverse sources to various target systems. Implement data quality and validation processes within Ab Initio. Data modelling and analysis: collaborate with data architects and business analysts to understand data requirements and translate them into effective ETL processes; analyse and model data to ensure optimal ETL design and performance. Ab Initio components: utilize components such as Transform Functions, Rollup, Join, Normalize, and others to build scalable and efficient data integration solutions; implement best practices for reusable Ab Initio components.

Preferred technical and professional experience: Optimize Ab Initio graphs for performance, ensuring efficient data processing and minimal resource utilization. Conduct performance tuning and troubleshooting as needed. Collaboration: work closely with cross-functional teams, including data analysts, database administrators, and quality assurance, to ensure seamless integration of ETL processes. Participate in design reviews and provide technical expertise to enhance overall solution quality. Documentation.

Posted 1 week ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Are you ready to write your next chapter? Make your mark at one of the biggest names in payments. With proven technology, we process the largest volume of payments in the world, driving the global economy every day. When you join Worldpay, you join a global community of experts and changemakers, working to reinvent an industry by constantly evolving how we work and making the way millions of people pay easier, every day.

About The Role: Worldpay is on an exciting journey re-engineering our core merchant payments platform to be more cost-effective, scalable, and cloud-ready, utilizing the latest cutting-edge technologies. This journey will require the very best engineering talent to get us there, as it's not just a technical change; it's a cultural change as well.

About The Team: The New Acquiring Platform (NAP) is pioneering a payment revolution. Our state-of-the-art system equips us for the spending habits of tomorrow, as well as today. We will be able to deliver unrivalled insights into every trade and transaction. The platform is designed around payments, not just cards, so we can cater to every emerging trend with speed and customer centricity.

What You'll Own: We are looking for bright talent who can build future testing capability for ongoing BAU delivery and drive quality improvements across multiple agile teams. You will be working on the QA team that caters to the product, platform, and business needs of a number of Agile Release Trains, including Acquiring, Billing & Funding, Servicing, and Reporting. The E2E Volume and Functional Testing Quality Assurance team is fundamental to unlocking value for our Merchant business, being relied upon both to ensure the stability of our production releases and to deliver new boundary-breaking products for our customers. This includes full E2E testing of the Merchant Acquirer lifecycle, from merchant onboarding to invoice generation and reporting, whilst interacting with many payment modules/interfaces as part of delivery.

Where You'll Own It: You will own it in our vibrant office locations in our Bangalore and Indore hubs. With hubs in the heart of city centers and tech capitals, things move fast in APAC. We pride ourselves on being an agile and dynamic collective, collaborating with different teams and offices across the globe.

What You Bring: Proven track record of E2E systems integration testing across multiple team and application boundaries, including third parties and vendors. Experience of operating within an Agile team (SAFe methodology preferable). Testing of APIs using SoapUI, Postman, etc. Advanced SQL and PL/SQL skills (procedures, packages), basic performance tuning, and DBA metadata. ETL and data warehouse concepts, with experience in ETL tools (Informatica, ODI, etc.). Experience with automated testing development (Shell, Python, Java). Experience with testing frameworks such as Selenium and tools such as Cucumber. Good understanding of CI/CD principles, error logging, and reporting, including the supporting tools, e.g. GitHub, Jenkins, Splunk. Experience testing against modern cloud platforms and containerised applications (AWS/Azure). Understanding of Kafka/Hadoop (Spark) and/or event-driven design and principles. Understanding of job scheduler tools (Control-M, Autosys, etc.). Experience of the payments industry is preferable, along with working with large volumes of data across real-time processing systems (35+ million data records). Good understanding of Unix/Linux/Windows operating systems and Oracle databases. Working with complex reporting requirements across multiple systems. Experience in supporting a small team of experienced quality analysts. Experience in carrying out internal reviews to ensure quality standards are met. Must demonstrate the ability to own tasks and defects and see them through to completion. Building strong relationships across multiple engineering teams and stakeholders. Experience in reviewing progress and presenting results to stakeholders. Experience with environment management, deployments, and prioritisation. Provide subject-matter-expert knowledge in quality assurance best practices, tools, and software. Experience working with Rally for test case management and defect management.

What Makes a Worldpayer: It's simple: Think, Act, Win. We stay curious, always asking the right questions to be better every day, finding creative solutions to simplify the complex. We're dynamic: every Worldpayer is empowered to make the right decisions for their customers. And we're determined, always staying open, winning and failing as one. Does this sound like you? Then you sound like a Worldpayer. Apply now to write the next chapter in your career.

Privacy Statement: Worldpay is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how Worldpay protects personal information online, please see the Online Privacy Notice.

Sourcing Model: Recruitment at Worldpay works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. Worldpay does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company. #pridepass
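The API-testing stack this posting lists (SoapUI/Postman plus Python automation) can be pictured with a small pytest-style sketch. Everything here is illustrative: the endpoint, payload fields, and response contract are hypothetical placeholders, not Worldpay's actual API.

```python
# Illustrative only: endpoint, payload fields, and expected values are hypothetical.
import requests

BASE_URL = "https://api.example.com"  # placeholder for the system under test

def test_create_payment_returns_authorised():
    """E2E-style API check: submit a payment and assert on the response contract."""
    payload = {"amount": 1000, "currency": "GBP", "merchant_id": "M-123"}
    resp = requests.post(f"{BASE_URL}/payments", json=payload, timeout=10)

    assert resp.status_code == 201          # resource created
    body = resp.json()
    assert body["status"] == "AUTHORISED"   # business outcome
    assert body["amount"] == payload["amount"]  # round-trip integrity
```

In a CI/CD setup like the one described, a test such as this would run from Jenkins or GitHub Actions after each deployment, with failures surfaced to the reporting tools.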

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Data & Analytics Unit.

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in detail and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Skills: Technology - Data Management - Data Integration - Ab Initio. Preferred Skills: Technology - Data Management - Data Integration - Ab Initio.

Posted 1 week ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Data & Analytics Unit.

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to ensure effective design, development, validation, and support activities, to assure that our clients are satisfied with the high levels of service in the technology domain. You will gather the requirements and specifications to understand the client requirements in detail and translate them into system requirements. You will play a key role in the overall estimation of work requirements to provide the right information on project estimations to Technology Leads and Project Managers. You would be a key contributor to building efficient programs/systems. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Technical and Professional Skills: Technology - Data Management - Data Integration Administration - Informatica Administration. Preferred Skills: Technology - Data Management - Data Integration Administration - Informatica Administration.

Posted 1 week ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Data & Analytics Unit.

Responsibilities: Knowledge of more than one technology; basics of architecture and design fundamentals; knowledge of testing tools; knowledge of agile methodologies; understanding of project life cycle activities on development and maintenance projects; understanding of one or more estimation methodologies; knowledge of quality processes; basics of the business domain to understand the business requirements; analytical abilities, strong technical skills, and good communication skills; good understanding of the technology and domain; ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods; awareness of the latest technologies and trends; excellent problem-solving, analytical, and debugging skills.

Technical and Professional Skills: Primary skills: Informatica MDM / PIM / P360 / IDQ / S360, STIBO MDM, Reltio MDM, IBM MDM, Other (IBM. Preferred Skills: Technology - Data Management - MDM - IBM MDM.

Posted 1 week ago

Apply

9.0 - 11.0 years

10 - 13 Lacs

Pune

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Data & Analytics Unit.

Responsibilities: Knowledge of more than one technology; basics of architecture and design fundamentals; knowledge of testing tools; knowledge of agile methodologies; understanding of project life cycle activities on development and maintenance projects; understanding of one or more estimation methodologies; knowledge of quality processes; basics of the business domain to understand the business requirements; analytical abilities, strong technical skills, and good communication skills; good understanding of the technology and domain; ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods; awareness of the latest technologies and trends; excellent problem-solving, analytical, and debugging skills.

Technical and Professional Skills: Technology - Data on Cloud - Datastore - Cloud-based Integration Platforms - Informatica Intelligent Cloud Services (IICS). Preferred Skills: Technology - Data on Cloud - Datastore - Cloud-based Integration Platforms - Informatica Intelligent Cloud Services (IICS).

Posted 1 week ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering, BCS, BBA, BCom, MCA, MSc. Service Line: Data & Analytics Unit.

Responsibilities: Has good knowledge of Snowflake architecture: virtual warehouses (multi-cluster warehouses, autoscaling); metadata and system objects (query history, grants to users, grants to roles, users); micro-partitions; table clustering and auto re-clustering; materialized views and their benefits; data protection with Time Travel in Snowflake (extremely important); analyzing queries using Query Profile (extremely important - explain plan); cache architecture; virtual warehouses (VW); named stages; direct loading; Snowpipe; data sharing; streams; JavaScript procedures and tasks. Strong ability to design and develop workflows in Snowflake on at least one cloud technology (preferably AWS). Apply Snowflake programming and ETL experience to write Snowflake SQL and maintain a complex, internally developed reporting system. Preferable knowledge of ETL activities such as data processing from multiple source systems. Extensive knowledge of query performance tuning. Apply knowledge of BI tools. Manage time effectively; accurately estimate effort for tasks and meet agreed-upon deadlines; effectively juggle ad-hoc requests and longer-term projects.

Snowflake performance specialist: familiar with zero-copy cloning and using Time Travel features to clone tables; familiar with understanding a Snowflake query profile, what each step does, and identifying performance bottlenecks from the query profile; understanding of when a table needs to be clustered; choosing the right cluster key as part of table design to help query optimization; working with materialized views and their benefits-versus-cost scenarios; how Snowflake micro-partitions are maintained and the performance implications with respect to micro-partitions and pruning; horizontal vs. vertical scaling and when to do what; the concept of multi-cluster warehouses and autoscaling; advanced SQL knowledge including window functions and recursive queries, and the ability to understand and rewrite complex SQL as part of performance optimization.

Additional Responsibilities: Domain: Data Warehousing, Business Intelligence. Precise work location: Bhubaneswar, Bangalore, Hyderabad, Pune.

Technical and Professional Skills: Mandatory skills: Snowflake. Desired skills: Teradata, Python (not mandatory). Preferred Skills: Cloud Platform - Snowflake; Technology - OpenSystem - Python.
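Two of the Snowflake features this posting stresses, zero-copy cloning and Time Travel, look like this in practice. A minimal sketch using the snowflake-connector-python client; the account details and table names are placeholders, not part of the posting.

```python
# A minimal sketch of zero-copy cloning and Time Travel, two of the Snowflake
# features named above. Connection parameters and table names are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="my_user", password="...",  # placeholders
    warehouse="ANALYTICS_WH", database="SALES", schema="PUBLIC",
)
cur = conn.cursor()

# Zero-copy clone: an instant, storage-free copy that shares micro-partitions
# with the source table until either side diverges.
cur.execute("CREATE TABLE orders_clone CLONE orders")

# Time Travel: query the table as it looked one hour ago (within the
# configured retention window), e.g. to diff against a bad load.
cur.execute("SELECT COUNT(*) FROM orders AT (OFFSET => -3600)")
print(cur.fetchone())

# Restore a damaged table by cloning its historical state.
cur.execute("CREATE TABLE orders_restored CLONE orders AT (OFFSET => -3600)")
conn.close()
```

The clone is cheap precisely because of the micro-partition model the posting asks about: until rows change, both tables point at the same immutable partitions.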

Posted 1 week ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Educational Qualification: Bachelor of Engineering. Service Line: Data & Analytics Unit.

Responsibilities: Knowledge of more than one technology; basics of architecture and design fundamentals; knowledge of testing tools; knowledge of agile methodologies; understanding of project life cycle activities on development and maintenance projects; understanding of one or more estimation methodologies; knowledge of quality processes; basics of the business domain to understand the business requirements; analytical abilities, strong technical skills, and good communication skills; good understanding of the technology and domain; ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods; awareness of the latest technologies and trends; excellent problem-solving, analytical, and debugging skills.

Technical and Professional Skills: Technology - Data Management - MDM - Informatica MDM. Preferred Skills: Technology - Data Management - MDM - Informatica MDM; Technology - Data Management - MDM - Informatica PIM.

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Chennai

Work from Office

Ford Credit IT is looking for a proficient Technical Anchor with excellent hands-on experience in Salesforce Service Cloud, Interaction Studio, Omni Studio, Velocity, and Mobile Studio, plus AmpScript, JSON/Apex, JavaScript, Lightning components, Aura Components, and Lightning Web Components, together with sound software engineering practices. The Technical Anchor will build scalable and fully available technical solutions in the digital space with a team of software engineers based out of India and will be responsible for supporting NA markets. The Technical Anchor will collaborate directly and continuously with software engineers, product managers, designers, architects, engineering managers, and product owners of the Salesforce team.

Description: As a Technical Anchor working in Ford Credit IT, you will join a team that develops enterprise-scale applications and SaaS products on Salesforce Service Cloud/Auto Cloud. You will work on a balanced product team to define, design, develop, and deploy Salesforce Service Cloud/Auto Cloud solutions, developing form data models, Customer Data Platforms (CDP)/Interaction Studio/Journey Builder/Automation Studio/Email and Mobile Studio, Contact Builder, data extensions, data sync, sitemaps, and content blocks. Ability to productize (build/author) a document generation product as a SaaS (Software as a Service) product hosted on MuleSoft and Google Cloud Platform (GCP). Build and maintain digital expertise by researching the latest industry trends and standards, driving innovation through PoCs and experiments. Develop Salesforce Service Cloud/Auto Cloud applications. Evaluate potential solutions using both technical and commercial criteria that support the established cost and service requirements, with a continuous-improvement and innovative mindset. Develop and automate unit and integration test scripts. Integrate with MuleSoft applications for integrations around Sales/Service Clouds with Ford Credit systems. Act as a mentor for less experienced developers through both your technical knowledge and your ability to inspire a team to build extraordinary impact together. Understand the depth of user stories and provide accurate estimates. Automate performance monitoring and notification in the event of failures using best practices and tools. Research new technologies and influence and implement enterprise technology shifts and new trends impacting Ford application delivery. Do code deployments using the CI/CD Salesforce Sales Cloud and MuleSoft pipeline, with Service Cloud deployments via Copado. Participate in a highly collaborative environment. DevOps: continuous integration and continuous deployment (CI/CD); security (SAST/DAST); monitoring/logging/tracing tools (Splunk, etc.). Experience deploying using source control with Visual Studio Code/GitHub repos/Copado. Strong sense of code, with the ability to review code using SonarQube and Checkmarx, rework, and deliver quality code. Build reusable components using LWC components, AmpScript, Server-Side JavaScript (SSJS), and SQL. Integrate Salesforce Marketing Cloud with external systems using SFMC APIs. Follow enterprise architecture processes and advise teams on cloud design, development, architecture, and service blueprints. Engage in Agile practices including, but not limited to, stand-ups, backlog grooming, sprint demos, and journey mapping. B.E./B.Tech/M.C.A. Minimum 7 years of experience developing Salesforce Service/Auto Cloud customizations.

Responsibilities: Extensive experience in AmpScript, Apex, JavaScript, Lightning components, Aura Components, Lightning Web Components, OmniScript, and Velocity. Must have experience in Contact Builder, data extensions, data sync, sitemaps, and content blocks. Lead Service/Auto Cloud data modeling and architecture, including data extension modeling and cross-product data architecture and mapping. Ability to integrate MuleSoft, Informatica, GraphQL, Medallia, and Emplifi. Ability to create flows, modify objects, create custom objects, write Apex and triggers, and integrate API services using an IDE. Demonstrated ability to drive development of highly technical technology services and capabilities. Experience with the Salesforce.com Apex Data Loader and Salesforce.com web services APIs/Platform Events/Change Data Capture/REST/Pub/Sub. Strong sense of code, with the ability to review code using SonarQube and Checkmarx, rework, and deliver quality code. Demonstrated experience in Customer Data Platforms (CDP)/Interaction Studio/Journey Builder/Automation Studio/Email and Mobile Studio. Demonstrated experience establishing and maintaining data structures, data extensions, and automations within Salesforce Service/Auto Cloud. Experience in enterprise data analytics, reporting, and monitoring using Splunk, Dynatrace, healthnut, etc.

Qualifications: 5+ years of experience architecting and implementing fault-tolerant, highly available Service/Auto Cloud APIs (REST/SOAP, Platform Events, Pub/Sub). Salesforce Service/Auto Cloud Developer/Consultant or Salesforce Application Architect certifications will be an added advantage. Should have SQL knowledge and experience writing database scripts using DDL or queries using DML. Experience in SRE with Copado and the ability to architect services considering observability, traceability, and monitoring aspects. At least 4 years of experience in the Agile Scrum software development process. Ability to work in a team in a diverse, multi-stakeholder environment. Experience in, and desire to work in, a global delivery environment. Excellent communication skills, with the ability to adapt your communication style to the audience. Demonstrated ability to drive development of highly technical technology services and capabilities. Experience deploying using source control with change sets and CI/CD pipelines.
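The REST integration skills listed above can be illustrated with a short Python sketch that runs a SOQL query through the standard Salesforce REST /query endpoint. The org URL and token handling are assumptions for illustration; a real integration would first obtain the access token via an OAuth flow.

```python
# Hypothetical sketch of a Salesforce REST API integration: run a SOQL query
# via the /query endpoint. Instance URL and access token are placeholders.
import requests

INSTANCE_URL = "https://example.my.salesforce.com"  # placeholder org
ACCESS_TOKEN = "<oauth-access-token>"               # obtained via OAuth, not shown

def soql_query(soql: str) -> dict:
    """Execute a SOQL query against the Salesforce REST API and return JSON."""
    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v59.0/query",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        params={"q": soql},
        timeout=30,
    )
    resp.raise_for_status()  # surface 4xx/5xx errors immediately
    return resp.json()

records = soql_query("SELECT Id, Status FROM Case WHERE CreatedDate = TODAY")
print(records["totalSize"])
```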

Posted 1 week ago

Apply

2.0 - 7.0 years

2 - 6 Lacs

Pune

Work from Office

Educational Qualification: Master of Engineering, BTech, Bachelor of Engineering, MCA, BCA, MTech, MBA. Service Line: Cloud & Infrastructure Services.

Responsibilities: A day in the life of an Infoscion: Support the services and products across several technical domains with the full spectrum of production support responsibilities. Uphold high standards for timely issue resolution. Ensure workflows, processes, tooling, and applications are of the highest quality standard. Contribute expertise to the management of existing and new IT products and services. Define workarounds for known errors and initiate process improvements. Maintain a knowledge database. Flexible to work in APAC/EMEA hours.

Additional Responsibilities: Knowledge of more than one technology; basics of architecture and design fundamentals; knowledge of testing tools; knowledge of agile methodologies; understanding of project life cycle activities on development and maintenance projects; understanding of one or more estimation methodologies; knowledge of quality processes; basics of the business domain to understand the business requirements; analytical abilities, strong technical skills, and good communication skills; good understanding of the technology and domain; ability to demonstrate a sound understanding of software quality assurance principles, SOLID design principles, and modelling methods; awareness of the latest technologies and trends; excellent problem-solving, analytical, and debugging skills.

Technical and Professional Skills: Ideally 4-6 years of hands-on experience in an IT support role within the financial services industry. Proficient with software development tools such as Unix, shell scripting, Oracle, Sybase, MSSQL, Autosys, Informatica, Splunk, AppDynamics, and Java. Good knowledge of ITIL processes; knowledge of cloud technologies (Azure), RESTful APIs, microservices, and compliance and legal IT systems is an added advantage. Ability to solve complex issues; good at problem statement analysis and solution design thinking. Track record of influencing senior IT stakeholders and business partners. Confident communicator who can explain technology to non-technical audiences. Capable of understanding client needs and translating them into products and services. Contribute expertise to the management of existing and new IT products and services. Define workarounds for known errors and initiate process improvements. Maintain a knowledge database. Flexible to work in APAC/EMEA hours.

Preferred Skills: Application Support; Technology - Architecture - ITIL; Technology - Open System - Shell scripting; Technology - Open System - UNIX; Technology - Infrastructure - Batch Scheduler - Autosys; Technology - Oracle - PL/SQL.

Posted 1 week ago

Apply

0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

Are you ready to write your next chapter? Make your mark at one of the biggest names in payments. With proven technology, we process the largest volume of payments in the world, driving the global economy every day. When you join Worldpay, you join a global community of experts and changemakers, working to reinvent an industry by constantly evolving how we work and making the way millions of people pay easier, every day.

About The Role: Worldpay is on an exciting journey re-engineering our core merchant payments platform to be more cost-effective, scalable, and cloud-ready, utilizing the latest cutting-edge technologies. This journey will require the very best engineering talent to get us there, as it's not just a technical change; it's a cultural change as well.

About The Team: The New Acquiring Platform (NAP) is pioneering a payment revolution. Our state-of-the-art system equips us for the spending habits of tomorrow, as well as today. We will be able to deliver unrivalled insights into every trade and transaction. The platform is designed around payments, not just cards, so we can cater to every emerging trend with speed and customer centricity.

What You'll Own: We are looking for bright talent who can build future testing capability for ongoing BAU delivery and drive quality improvements across multiple agile teams. You will be working on the QA team that caters to the product, platform, and business needs of a number of Agile Release Trains, including Acquiring, Billing & Funding, Servicing, and Reporting. The E2E Volume and Functional Testing Quality Assurance team is fundamental to unlocking value for our Merchant business, being relied upon both to ensure the stability of our production releases and to deliver new boundary-breaking products for our customers. This includes full E2E testing of the Merchant Acquirer lifecycle, from merchant onboarding to invoice generation and reporting, whilst interacting with many payment modules/interfaces as part of delivery.

Where You'll Own It: You will own it in our vibrant office locations in our Bangalore and Indore hubs. With hubs in the heart of city centers and tech capitals, things move fast in APAC. We pride ourselves on being an agile and dynamic collective, collaborating with different teams and offices across the globe.

What You Bring: Proven track record of E2E systems integration testing across multiple team and application boundaries, including third parties and vendors. Experience of operating within an Agile team (SAFe methodology preferable). Testing of APIs using SoapUI, Postman, etc. Advanced SQL and PL/SQL skills (procedures, packages), basic performance tuning, and DBA metadata. ETL and data warehouse concepts, with experience in ETL tools (Informatica, ODI, etc.). Experience with automated testing development (Shell, Python, Java). Experience with testing frameworks such as Selenium and tools such as Cucumber. Good understanding of CI/CD principles, error logging, and reporting, including the supporting tools, e.g. GitHub, Jenkins, Splunk. Experience testing against modern cloud platforms and containerised applications (AWS/Azure). Understanding of Kafka/Hadoop (Spark) and/or event-driven design and principles. Understanding of job scheduler tools (Control-M, Autosys, etc.). Experience of the payments industry is preferable, along with working with large volumes of data across real-time processing systems (35+ million data records). Good understanding of Unix/Linux/Windows operating systems and Oracle databases. Working with complex reporting requirements across multiple systems. Experience in supporting a small team of experienced quality analysts. Experience in carrying out internal reviews to ensure quality standards are met. Must demonstrate the ability to own tasks and defects and see them through to completion. Building strong relationships across multiple engineering teams and stakeholders. Experience in reviewing progress and presenting results to stakeholders. Experience with environment management, deployments, and prioritisation. Provide subject-matter-expert knowledge in quality assurance best practices, tools, and software. Experience working with Rally for test case management and defect management.

What Makes a Worldpayer: It's simple: Think, Act, Win. We stay curious, always asking the right questions to be better every day, finding creative solutions to simplify the complex. We're dynamic: every Worldpayer is empowered to make the right decisions for their customers. And we're determined, always staying open, winning and failing as one. Does this sound like you? Then you sound like a Worldpayer. Apply now to write the next chapter in your career.

Privacy Statement: Worldpay is committed to protecting the privacy and security of all personal information that we process in order to provide services to our clients. For specific information on how Worldpay protects personal information online, please see the Online Privacy Notice.

Sourcing Model: Recruitment at Worldpay works primarily on a direct sourcing model; a relatively small portion of our hiring is through recruitment agencies. Worldpay does not accept resumes from recruitment agencies which are not on the preferred supplier list and is not responsible for any related fees for resumes submitted to job postings, our employees, or any other part of our company. #pridepass

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Role: Informatica IICS Developer. Location: Bangalore. Work Model: Hybrid. Notice Period: Immediate to 30 days.

JD: We are looking for senior IICS developer profiles with 5+ years of experience for an immediate billable position. Design, develop, implement, and maintain complex ETL/ELT processes using Informatica Intelligent Cloud Services (IICS), including mappings, transformations, taskflows, and schedules. Work with various IICS connectors (e.g., databases, flat files, cloud platforms like AWS S3) to extract, transform, and load data from diverse sources. Perform performance tuning and optimization of IICS mappings and workflows to ensure efficient data processing. Perform data analysis, data profiling, and data mapping to understand source and target data structures. Shell scripting for file manipulation (e.g., moving, copying, compressing, and decompressing files, PGP encryption), covering pre-processing and post-processing steps for IICS jobs. Proven experience with Python scripting for data manipulation and custom logic. If you are interested, please share your resume to jyothiveerabh.akula@hcltech.com
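The pre-/post-processing the JD describes (moving, copying, and compressing files around IICS jobs) is the kind of task a small script handles. A minimal Python sketch under assumed directory paths; the folder names and .csv pattern are illustrative, not from the posting.

```python
# A minimal sketch of a pre-/post-processing step for an integration job:
# sweep an inbound landing folder, gzip each file, and move it to an archive
# directory before/after an IICS taskflow runs. Paths are hypothetical.
import gzip
import shutil
from pathlib import Path

LANDING = Path("/data/landing")   # placeholder inbound directory
ARCHIVE = Path("/data/archive")   # placeholder archive directory

def compress_and_archive() -> None:
    ARCHIVE.mkdir(parents=True, exist_ok=True)
    for src in LANDING.glob("*.csv"):
        dest = ARCHIVE / (src.name + ".gz")
        with src.open("rb") as f_in, gzip.open(dest, "wb") as f_out:
            shutil.copyfileobj(f_in, f_out)  # stream-compress the file
        src.unlink()                         # remove the original after archiving

if __name__ == "__main__":
    compress_and_archive()
```

In practice such a script would be invoked from the IICS taskflow's command task or a scheduler, keeping the landing zone clean between runs.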

Posted 1 week ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Job Description: Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It With Pride.

You will provide technical contributions to the data science process. In this role, you are the internally recognized expert in data, building infrastructure and data pipelines/retrieval mechanisms to support our data needs.

How You Will Contribute: You will: Operationalize and automate activities for efficiency and timely production of data visuals. Assist in providing accessibility, retrievability, security, and protection of data in an ethical manner. Search for ways to get new data sources and assess their accuracy. Build and maintain the transports/data pipelines and retrieve applicable data sets for specific use cases. Understand data and metadata to support consistency of information retrieval, combination, analysis, pattern recognition, and interpretation. Validate information from multiple sources. Assess issues that might prevent the organization from making maximum use of its information assets.

What You Will Bring: A desire to drive your future and accelerate your career, and the following experience and knowledge: Extensive experience in data engineering in a large, complex business with multiple systems such as SAP, internal and external data, etc., and experience setting up, testing, and maintaining new systems. Experience with a wide variety of languages and tools (e.g. script languages) to retrieve, merge, and combine data. Ability to simplify complex problems and communicate to a broad audience.

In This Role: As a Senior Data Engineer, you will have the opportunity to design and build scalable, secure, and cost-effective cloud-based data solutions. You will develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes, ensuring data quality and validation processes to maintain data accuracy and integrity. You will ensure efficient data storage and retrieval for optimal performance, and collaborate closely with data teams, product owners, and other stakeholders to stay updated with the latest cloud technologies and best practices.

Role & Responsibilities: Design and Build: Develop and implement scalable, secure, and cost-effective cloud-based data solutions. Manage Data Pipelines: Develop and maintain data pipelines to extract, transform, and load data into data warehouses or data lakes. Ensure Data Quality: Implement data quality and validation processes to ensure data accuracy and integrity. Optimize Data Storage: Ensure efficient data storage and retrieval for optimal performance. Collaborate and Innovate: Work closely with data teams and product owners, and stay updated with the latest cloud technologies and best practices.

Technical Requirements: Programming: Python, PySpark, Go/Java. Database: SQL, PL/SQL. ETL & Integration: DBT, Databricks + DLT, AecorSoft, Talend, Informatica/Pentaho/Ab Initio, Fivetran. Data Warehousing: SCD, schema types, data marts. Visualization: Databricks Notebook, Power BI (optional), Tableau (optional), Looker. GCP Cloud Services: BigQuery, GCS, Cloud Functions, Pub/Sub, Dataflow, Dataproc, Dataplex. AWS Cloud Services: S3, Redshift, Lambda, Glue, CloudWatch, EMR, SNS, Kinesis. Azure Cloud Services: Azure Data Lake Gen2, Azure Databricks, Azure Synapse Analytics, Azure Data Factory, Azure Stream Analytics. Supporting Technologies: Graph databases/Neo4j, Erwin, Collibra, Ataccama DQ, Kafka, Airflow.

Soft Skills: Problem-Solving: the ability to identify and solve complex data-related challenges. Communication: effective communication skills to collaborate with product owners, analysts, and stakeholders. Analytical Thinking: the capacity to analyze data and draw meaningful insights. Attention to Detail: meticulousness in data preparation and pipeline development. Adaptability: the ability to stay updated with emerging technologies and trends in the data engineering field.

Within-country relocation support is available, and for candidates voluntarily moving internationally some minimal support is offered through our Volunteer International Transfer Policy.

Business Unit Summary: At Mondelēz International, our purpose is to empower people to snack right by offering the right snack, for the right moment, made the right way. That means delivering a broad range of delicious, high-quality snacks that nourish life's moments, made with sustainable ingredients and packaging that consumers can feel good about. We have a rich portfolio of strong brands, globally and locally, including many household names such as Oreo, belVita and LU biscuits; Cadbury Dairy Milk, Milka and Toblerone chocolate; Sour Patch Kids candy and Trident gum. We are proud to hold the top position globally in biscuits, chocolate, and candy, and the second top position in gum. Our 80,000 makers and bakers are located in more than 80 countries, and we sell our products in over 150 countries around the world. Our people are energized for growth and critical to us living our purpose and values. We are a diverse community that can make things happen, and happen fast.

Mondelēz International is an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, gender, sexual orientation or preference, gender identity, national origin, disability status, protected veteran status, or any other characteristic protected by law.

Job Type: Regular. Data Science. Analytics & Data Science.
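As a concrete picture of the pipeline work this posting describes, here is a minimal PySpark sketch of an extract-transform-load step. The paths, column names, and quality rules are illustrative assumptions, not Mondelēz's actual pipeline.

```python
# A minimal PySpark ETL sketch: read raw files, apply a data-quality gate,
# and write a curated, partitioned table. All names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

raw = spark.read.option("header", True).csv("/lake/raw/orders/")  # placeholder path

curated = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))  # typed timestamp
       .filter(F.col("order_id").isNotNull())               # basic data-quality gate
       .dropDuplicates(["order_id"])                        # idempotent re-runs
       .withColumn("load_date", F.current_date())           # partition/lineage column
)

curated.write.mode("overwrite").partitionBy("load_date").parquet("/lake/curated/orders/")
```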

Posted 1 week ago

Apply

6.0 years

3 - 7 Lacs

Gurgaon

On-site

A Day in Your Life at MKS: We are looking for an exceptional Senior Informatica Programmer/Analyst to perform development, implementation, and usage of information technology and management information systems within the Informatica Intelligent Data Management Cloud (IDMC) platform. You will work in partnership with the business relationship managers, super-users, end-users, and the technical team to ensure full adoption, effective usage, and efficient deployment of our IT solutions. You will effectively manage the change control process, gather end-user requirements, and communicate IT priorities and delivery status to the business.

You Will Make an Impact By: Collaborating with business partners to understand and document integration processes and solutions. Developing, testing, documenting, and implementing solutions leveraging Informatica IDMC. Actively demonstrating a passion for continuous improvement focused on end-user productivity and enterprise process integration. Working with various business groups in the organization to facilitate cross-functional implementation of new or improved business process requirements for all IT-related business, financial, and operations systems critical to core organizational functions. Effectively managing the IT change control process, gathering end-user requirements, preparing functional specifications, and communicating IT priorities and delivery status to the business.

Skills You Bring: Bachelor's degree in Computer Science, Information Technology, Information Systems, or a related field. 6+ years of Informatica development experience required, with 2 years of Informatica Intelligent Data Management Cloud experience preferred. Strong knowledge of SQL and DDL scripting. Strong communication skills, with experience drafting technical documents. Dissatisfaction with the status quo and a thirst to introduce change. Energetic team player with a can-do attitude.

We can't wait for your application! #LI-AM2

Globally, our policy is to recruit individuals from wide and diverse backgrounds. However, certain positions require access to controlled goods and technologies subject to the International Traffic in Arms Regulations (ITAR) or Export Administration Regulations (EAR). Applicants for these positions may need to be "U.S. persons." "U.S. persons" are generally defined as U.S. citizens, noncitizen nationals, lawful permanent residents (or green card holders), individuals granted asylum, and individuals admitted as refugees.

MKS Instruments, Inc. and its affiliates and subsidiaries ("MKS") is an affirmative action and equal opportunity employer: diverse candidates are encouraged to apply. We win as a team and are committed to recruiting and hiring qualified applicants regardless of race, color, national origin, sex (including pregnancy and pregnancy-related conditions), religion, age, ancestry, physical or mental disability or handicap, marital status, membership in the uniformed services, veteran status, sexual orientation, gender identity or expression, genetic information, or any other category protected by applicable law. Hiring decisions are based on merit, qualifications, and business needs. We conduct background checks and drug screens, in accordance with applicable law and company policies. MKS is generally only hiring candidates who reside in states where we are registered to do business. MKS is committed to working with and providing reasonable accommodations to qualified individuals with disabilities.

If you need a reasonable accommodation during the application or interview process due to a disability, please contact us at: accommodationsatMKS@mksinst.com. If applying for a specific job, please include the requisition number (ex: RXXXX), the title, and the location of the role.

Posted 1 week ago

Apply

0 years

4 - 8 Lacs

Hyderābād

On-site

Ready to shape the future of work? At Genpact, we don't just adapt to change; we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's AI Gigafactory, our industry-first accelerator, is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook.

Inviting applications for the role of Lead Consultant - Snowflake & Informatica Cloud Data Engineer.

Responsibilities: Design and implement scalable data solutions in Snowflake, following data engineering best practices and a layered architecture. Design and implement scalable data pipelines and ETL/ELT processes using dbt, integrated with Snowflake, for modern cloud data warehousing. Develop and optimize transformation logic and storage structures in Snowflake using SQL, Python, and Airflow. Collaborate with business and technical teams to translate data requirements into robust dbt-on-Snowflake integration solutions. Ensure data quality, security, and compliance by applying governance best practices across data transformation pipelines and within the Snowflake environments. Perform performance tuning in Snowflake and streamline ETL pipelines for efficient execution, supported by clear documentation of architecture and integration patterns.

Qualifications we seek in you! Minimum Qualifications: Bachelor's degree in information science, data management, computer science, or a related field preferred. Must have experience in the cloud data engineering domain. Proven experience in cloud data engineering using Snowflake and Informatica, with hands-on delivery of end-to-end data pipeline implementations. Strong knowledge of data warehousing, ELT/ETL design, OLAP concepts, and dimensional modelling using Snowflake, with experience in projects delivering complete data solutions. Hands-on expertise in developing, scheduling, and orchestrating scalable ETL/ELT pipelines using Informatica Cloud or PowerCenter. Proficiency in Python for data transformation and automation tasks integrated with Snowflake environments. Excellent communication and documentation skills, with the ability to clearly articulate Snowflake architectures and Informatica workflows. Experience implementing data quality, lineage, and governance frameworks using Informatica and Snowflake capabilities. Familiarity with CI/CD practices for deploying Informatica workflows and Snowflake objects within DevOps environments.

Why join Genpact? Be a transformation leader: work at the cutting edge of AI, automation, and digital innovation. Make an impact: drive change for global enterprises and solve business challenges that matter. Accelerate your career: get hands-on experience, mentorship, and continuous learning opportunities. Work with the best: join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture: our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Lead Consultant. Primary Location: India-Hyderabad. Schedule: Full-time. Education Level: Bachelor's / Graduation / Equivalent. Job Posting: Jul 8, 2025, 11:34:27 PM. Unposting Date: Ongoing. Master Skills List: Digital. Job Category: Full Time.
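The SQL + Python + Airflow combination named in the responsibilities can be sketched as a small DAG that pushes an ELT merge into Snowflake. The connection id, schedule, and table names are assumptions for illustration, not the actual setup described in the posting.

```python
# A hedged sketch of Airflow orchestrating an ELT step inside Snowflake,
# using the apache-airflow-providers-snowflake package. Names are hypothetical.
from datetime import datetime

from airflow import DAG
from airflow.providers.snowflake.operators.snowflake import SnowflakeOperator

with DAG(
    dag_id="daily_customer_elt",
    start_date=datetime(2025, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    merge_customers = SnowflakeOperator(
        task_id="merge_customers",
        snowflake_conn_id="snowflake_default",  # placeholder Airflow connection
        sql="""
            MERGE INTO analytics.customers AS tgt
            USING staging.customers_delta AS src
              ON tgt.customer_id = src.customer_id
            WHEN MATCHED THEN UPDATE SET tgt.email = src.email
            WHEN NOT MATCHED THEN INSERT (customer_id, email)
                 VALUES (src.customer_id, src.email);
        """,
    )
```

Pushing the MERGE into Snowflake itself (ELT rather than ETL) keeps the heavy lifting on the warehouse's compute, which is the pattern the responsibilities describe.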

Posted 1 week ago

Apply

0 years

2 - 5 Lacs

Hyderābād

On-site

DevOps Content Writer, Assurant-GCC, India. The DevOps Content Engineer role exists to bridge the gap between ICS and all of the documentation the department needs to create and have access to in order to operate effectively. The person in this role will partner with DevOps engineers, SRE, and platform engineering teams to design and maintain clear, structured, and scalable documentation that supports the entire ICS engineering organization's products and platforms in use at Assurant. The DevOps Content Engineer works with other engineers across a variety of disciplines to ensure all of our documentation is in a consistent framework and easily consumable. The person in this role also makes sure that the ICS engineers are both aware of and have access to the documentation they need from across ICS to perform effectively in their role. The DevOps Content Engineer is responsible for building documentation systems, automating content generation where possible, and ensuring engineers can find, understand, and act on the information they need. This position will be in Bangalore/Hyderabad at our India location.

What will be my duties and responsibilities in this job?

Content Architecture: Develop and maintain content standards and document formats for our cloud architecture, including platforms such as AWS, Azure, and OCI. Ensure that all documentation guidelines are clear, concise, and accessible to relevant stakeholders.

Compliance and Security Protocols Documentation: Understand and provide content standards and formats that demonstrate how our solutions adhere to security protocols and access controls. Offer clear, detailed guidelines to support compliance efforts.

Technical Administration: Assist team members in resolving issues, which may involve technical troubleshooting or providing guidance on procedural matters such as form-filling. Follow up diligently to ensure that all concerns are addressed in a timely and accurate manner.

Documentation Management: Build and maintain technical documentation for infrastructure, automation, and deployment processes. Ensure documentation is tightly integrated with DevOps tools and workflows.

Knowledge Sharing: Champion knowledge sharing and documentation automation in CI/CD pipelines.

Collaboration: Work closely with SREs, DevOps engineers, and platform teams to document complex systems in a scalable, accessible way. Define and evangelize content standards, templates, and best practices for internal teams.

What are the requirements needed for this position?

Education: A bachelor's degree in computer engineering, computer science, information technology, or another equivalent applicable STEM field.

Professional Experience: Technical Proficiency: a solid understanding of DevOps principles, CI/CD pipelines, containerization (e.g., Docker, Kubernetes), and cloud platforms (AWS, Azure, GCP), plus a solid understanding of application developer productivity tooling (GitHub, ADO), observability tooling (Datadog, Dynatrace, Graylog), infrastructure-as-code platforms (Terraform, Ansible), and application runtime, hosting, and integration technologies (MuleSoft CloudHub, Informatica). Writing and Communication: exceptional ability to convey technical information clearly and concisely, tailoring content to various audiences. Tool Familiarity: experience with documentation tools and platforms such as Confluence, Markdown, Sphinx, or MkDocs, and version control systems like Git.

Technology Skills: Technical Knowledge: demonstrated experience with cloud architecture components, specifically AWS, Azure, and OCI. Security Protocols Familiarity: solid understanding of security protocols and access controls, and the ability to articulate how solutions meet these standards. Content Architecture Skills: proven ability to develop and maintain high-quality content standards and document formats, including automating parts of documentation via scripts or tools (e.g., generating docs from code). Administrative Abilities: strong organizational skills, with the ability to follow up on issues and ensure they are resolved efficiently. Communication Skills: excellent written and verbal communication skills, with the ability to explain complex technical concepts clearly. ServiceNow Flow and Workflow developer experience.

What are the preferred requirements for this position? Education: A master's degree in computer engineering, computer science, information technology, or another equivalent applicable STEM field. Professional Experience: 3+ years of prior experience participating in a software COE. Technology Skills: previous experience in a similar role, combining content architecture and administration duties; familiarity with SOC and SOX controls.

Any posted application deadline that is blank on a United States role is a pipeline requisition, and we'll continue to collect applications on an ongoing basis. Any posted pay range considers a wide range of compensation factors, including candidate background, experience, and work location, while also allowing for salary growth within the position.

Helping People Thrive in a Connected World. Connect with us. Bring us your best work and your brightest ideas. And we'll bring you a place where you can thrive. Learn more at jobs.assurant.com. For U.S. benefit information, visit myassurantbenefits.com. For benefit information outside the U.S., please speak with your recruiter.

What's the culture like at Assurant? Our unique culture is a big reason why talented people choose Assurant. Named a Best/Great Place to Work in 13 countries and awarded the Fortune America's Most Innovative Companies recognition in 2023, we bring together top talent around the world. Although we have a wide variety of skills and experiences, we share common characteristics that are uniquely Assurant. A passion for service. An ability to innovate in practical ways. And a willingness to take chances. We call our culture The Assurant Way.

Company Overview: Assurant is a leading global business services company that supports, protects, and connects major consumer purchases. A Fortune 500 company with a presence in 21 countries, Assurant supports the advancement of the connected world by partnering with the world's leading brands to develop innovative solutions and deliver an enhanced customer experience through mobile device solutions, extended service contracts, vehicle protection services, renters insurance, lender-placed insurance products, and other specialty products.

Equal Opportunity Statement: Assurant is an Equal Employment Opportunity employer and does not use or consider race, color, religion, sex, national origin, age, disability, veteran status, sexual orientation, gender identity, or any other characteristic protected by federal, state, or local law in employment decisions.

Job Scam Alert: Please be aware that during Assurant's application process, we will never ask for personal information such as your Social Security number, bank account details, or passwords.
Learn more about what to look out for and how to report a scam here.

Posted 1 week ago

Apply

40.0 years

2 - 6 Lacs

Hyderābād

On-site

India - Hyderabad. JOB ID: R-218849. ADDITIONAL LOCATIONS: India - Hyderabad. WORK LOCATION TYPE: On Site. DATE POSTED: Jul. 08, 2025. CATEGORY: Information Systems.

ABOUT AMGEN: Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE: Role Description: We are seeking an experienced MDM Engineer with 8-12 years of experience to lead development and operations of our Master Data Management (MDM) platforms, with hands-on data engineering experience. This role will involve owning the backend data engineering solution within the MDM team. This is a technical role that will require hands-on work. To succeed in this role, the candidate must have strong data engineering experience with technologies such as SQL, Python, PySpark, Databricks, AWS, and API integrations.

Roles & Responsibilities: Develop distributed data pipelines using PySpark on Databricks for ingesting, transforming, and publishing master data. Write optimized SQL for large-scale data processing, including complex joins, window functions, and CTEs for MDM logic. Implement match/merge algorithms and survivorship rules using Informatica MDM or Reltio APIs. Build and maintain Delta Lake tables with schema evolution and versioning for master data domains. Use AWS services like S3, Glue, Lambda, and Step Functions for orchestrating MDM workflows. Automate data quality checks using IDQ or custom PySpark validators with rule-based profiling. Integrate external enrichment sources (e.g., D&B, LexisNexis) via REST APIs and batch pipelines. Design and deploy CI/CD pipelines using GitHub Actions or Jenkins for Databricks notebooks and jobs. Monitor pipeline health using the Databricks Jobs API, CloudWatch, and custom logging frameworks. Implement fine-grained access control using Unity Catalog and attribute-based policies for MDM datasets. Use MLflow for tracking model-based entity resolution experiments if ML-based matching is applied. Collaborate with data stewards to expose curated MDM views via REST endpoints or Delta Sharing.

Basic Qualifications and Experience: 8 to 13 years of experience in business, engineering, IT, or a related field.

Functional Skills - Must-Have Skills: Advanced proficiency in PySpark for distributed data processing and transformation. Strong SQL skills for complex data modeling, cleansing, and aggregation logic. Hands-on experience with Databricks, including Delta Lake, notebooks, and job orchestration. Deep understanding of MDM concepts, including match/merge, survivorship, and golden record creation. Experience with MDM platforms like Informatica MDM or Reltio, including REST API integration. Proficiency in AWS services such as S3, Glue, Lambda, Step Functions, and IAM. Familiarity with data quality frameworks and tools like Informatica IDQ or custom rule engines. Experience building CI/CD pipelines for data workflows using GitHub Actions, Jenkins, or similar. Knowledge of schema evolution, versioning, and metadata management in data lakes. Ability to implement lineage and observability using Unity Catalog or third-party tools. Comfort with Unix shell scripting or Python for orchestration and automation. Hands-on experience with RESTful APIs for ingesting external data sources and enrichment feeds.

Good-to-Have Skills: Experience with Tableau or Power BI for reporting MDM insights. Exposure to Agile practices and tools (JIRA, Confluence). Prior experience in Pharma/Life Sciences. Understanding of compliance and regulatory considerations in master data.

Professional Certifications: Any MDM certification (e.g., Informatica, Reltio). Any data analysis certification (SQL, Python, PySpark, Databricks). Any cloud certification (AWS or Azure).

Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams.

EQUAL OPPORTUNITY STATEMENT: Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. GCF Level 05A
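The survivorship and golden-record responsibilities above reduce, in the simplest case, to a windowed ranking over matched records. A minimal PySpark sketch under an assumed recency-wins rule; the table and column names are illustrative, not Amgen's actual schema.

```python
# A minimal sketch of an MDM survivorship step: rank duplicate records per
# master key and keep the most recently updated one as the golden record.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.window import Window

spark = SparkSession.builder.appName("mdm_survivorship").getOrCreate()

matched = spark.read.table("mdm.customer_matched")  # placeholder matched-records table

# Recency-wins survivorship rule: within each match cluster (master_id),
# the record with the latest updated_at survives.
w = Window.partitionBy("master_id").orderBy(F.col("updated_at").desc())

golden = (
    matched.withColumn("rn", F.row_number().over(w))  # rank records in each cluster
           .filter(F.col("rn") == 1)                  # survivor = most recent record
           .drop("rn")
)

golden.write.mode("overwrite").saveAsTable("mdm.customer_golden")
```

Real platforms like Informatica MDM or Reltio express survivorship as configurable, per-attribute rules (source trust, recency, completeness); this sketch shows only the simplest whole-record variant.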

Posted 1 week ago

Apply

7.0 years

0 Lacs

Hyderābād

Remote

Capgemini is looking for a skilled Data Engineer to join our dynamic team dedicated to delivering top-notch data solutions for clients. In this role, you will be responsible for designing, building, and maintaining the data architecture, databases, and ETL processes that power data analytics and reporting. You will collaborate closely with data analysts, data scientists, and business stakeholders to understand their data needs and deliver effective solutions. As a Data Engineer, you will work with various technologies and tools, ensuring the quality, reliability, and performance of the data systems. You'll have opportunities to innovate and help shape data strategies that drive business outcomes.

Key Responsibilities: Design, develop, and implement data pipelines for efficient data processing and analytics. Maintain and optimize existing ETL processes to improve data delivery and system performance. Work with SQL and NoSQL databases to extract complex data sets and analyze data trends. Collaborate with stakeholders to gather data requirements and translate them into technical specifications. Troubleshoot and resolve data-related issues to ensure data integrity and accuracy. Participate in code reviews and contribute to best practices in data engineering.

Requirements Education: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience: Technical Skills: Data Modelling - Intermediate; Data Profiling - Intermediate; Data Governance - Advanced; Reporting and Visualization - Advanced; Informatica - Advanced (7+ years); Snowflake - Advanced (7+ years); Power BI - Intermediate (4-6 years); SQL - Advanced (7+ years); Excel - Intermediate (4-6 years). Soft Skills: Strong problem-solving and analytical skills. Excellent communication skills and the ability to work collaboratively in a team environment. Strong attention to detail and a commitment to delivering high-quality solutions.

Benefits Competitive compensation and benefits package: Competitive salary and performance-based bonuses Comprehensive benefits package Career development and training opportunities Flexible work arrangements (remote and/or office-based) Dynamic and inclusive work culture within a globally renowned group Private Health Insurance Pension Plan Paid Time Off Training & Development Note: Benefits differ based on employee level.

About Capgemini Capgemini is a global leader in partnering with companies to transform and manage their business by harnessing the power of technology. The Group is guided every day by its purpose of unleashing human energy through technology for an inclusive and sustainable future. It is a responsible and diverse organization of over 340,000 team members in more than 50 countries. With its strong 55-year heritage and deep industry expertise, Capgemini is trusted by its clients to address the entire breadth of their business needs, from strategy and design to operations, fueled by the fast evolving and innovative world of cloud, data, AI, connectivity, software, digital engineering and platforms. The Group reported global revenues of €22.5 billion in 2023. https://www.capgemini.com/us-en/about-us/who-we-are/
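As a flavour of the "extract complex data sets and analyze data trends" responsibility above, here is a sketch using the Snowflake Python connector. The account, credentials, warehouse, and `orders` table are hypothetical placeholders.

```python
# Sketch: profiling a Snowflake table for row counts, null keys, and
# distinct customers. All connection details and names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",  # hypothetical
    warehouse="ANALYTICS_WH", database="SALES_DB", schema="PUBLIC",
)
query = """
    SELECT COUNT(*)                      AS total_rows,
           COUNT_IF(customer_id IS NULL) AS null_customer_ids,
           COUNT(DISTINCT customer_id)   AS distinct_customers
    FROM   orders
"""
cur = conn.cursor()
try:
    cur.execute(query)
    total, nulls, distinct = cur.fetchone()
    print(f"rows={total} null_ids={nulls} distinct={distinct}")
finally:
    cur.close()
    conn.close()
```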

Posted 1 week ago

Apply

15.0 years

0 Lacs

Hyderābād

On-site

Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must-have skills: Informatica Data Quality Good-to-have skills: NA Minimum 3 year(s) of experience is required Educational Qualification: 15 years of full-time education

Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data is accessible, reliable, and secure for stakeholders.

Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation and contribution in team discussions is required. - Contribute to providing solutions to work-related problems. - Develop and optimize data pipelines to enhance data processing efficiency. - Monitor and troubleshoot data quality issues to ensure data integrity.

Professional & Technical Skills: - Must have: Profisee. - Must-Have Skills: Proficiency in Informatica Data Quality. - Strong understanding of ETL processes and data integration techniques. - Experience with data profiling and data cleansing methodologies. - Familiarity with database management systems and SQL. - Knowledge of data governance and data management best practices.

Additional Information: - The candidate should have a minimum of 3 years of experience in Informatica Data Quality. - This position is based at our Hyderabad office. - 15 years of full-time education is required.
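Informatica Data Quality rules are configured in the tool itself, but the kind of profiling and cleansing logic this role implements can be sketched in plain Python/pandas as a stand-in. The column names, regex, and file paths below are hypothetical.

```python
# Stand-in sketch (pandas, not Informatica Data Quality itself) showing
# the style of profiling and cleansing rules described above.
import pandas as pd

df = pd.read_csv("customers.csv")  # hypothetical input extract

# Profiling: per-column completeness and distinct counts.
profile = pd.DataFrame({
    "completeness": 1 - df.isna().mean(),
    "distinct": df.nunique(),
})
print(profile)

# Cleansing rules: trim/standardise names, route invalid emails for review.
df["name"] = df["name"].str.strip().str.title()
df["email_valid"] = df["email"].str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False)
df[df["email_valid"]].to_csv("customers_clean.csv", index=False)
df[~df["email_valid"]].to_csv("customers_rejects.csv", index=False)
```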

Posted 1 week ago

Apply

5.0 - 6.0 years

3 - 8 Lacs

Chennai

On-site

Req ID: 325293 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Software Development Specialist to join our team in Chennai, Tamil Nādu (IN-TN), India (IN).

Responsibilities: 5-6 years of application support experience supporting .NET and Azure apps. Ability to debug and coordinate with development teams to ensure efficient issue resolution. Monitoring tools: Splunk, App Insights. Management of deployments and addressing root causes for payment-related issues. Monitor Informatica batch job failures and provide insights for downstream dependencies. Proactively handle payment and application downtime issues. Handle deployment monitoring and validations, escalating to the development team for resolution of critical issues when necessary. Shifts: Rotational 24x7

Mandatory Skills High-level programming languages: C# (.NET MVC, .NET Core and .NET 6/7) UI: Angular, JavaScript, CSS, ASP.NET MVC API: REST Web API, Azure Functions, or Azure Durable Functions CI/CD: Azure Pipelines, Terraform Scripting: PowerShell, Bash Database: Microsoft SQL Server or NoSQL (e.g. Cosmos DB) and Oracle Containerization: Azure Kubernetes Service, Kubernetes (open source) and Docker Agile knowledge

About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
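One way to picture the "monitor Informatica batch job failures" duty is a small log-scanning script like the sketch below. The log location, failure pattern, and downstream-dependency map are all hypothetical; in practice this monitoring would usually run through Splunk or App Insights queries.

```python
# Sketch: a lightweight log check for batch job failures, flagging which
# downstream feeds each failure blocks. Paths and patterns are invented.
import re
from pathlib import Path

LOG_DIR = Path("/var/log/informatica")  # hypothetical log directory
FAILURE = re.compile(r"Session \[(?P<job>\w+)\] .* status \[FAILED\]")

# Hypothetical map of jobs to the downstream feeds they block.
DOWNSTREAM = {"wf_payments_load": ["settlement_report", "gl_posting"]}

for log_file in LOG_DIR.glob("*.log"):
    for line in log_file.read_text(errors="ignore").splitlines():
        m = FAILURE.search(line)
        if m:
            job = m.group("job")
            blocked = DOWNSTREAM.get(job, [])
            print(f"ALERT: {job} failed in {log_file.name}; blocks {blocked}")
```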

Posted 1 week ago

Apply

5.0 - 8.0 years

4 - 9 Lacs

Chennai

Remote

Req ID: 328612 NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Data Engineer / Developer to join our team in Chennai/Bangalore, Tamil Nādu (IN-TN), India (IN).

Data Engineer / Developer Primary Skillset (Must Have): Oracle PL/SQL, SnapLogic Secondary Skillset (Good to Have): Informatica, Aerospike Tertiary Skillset (Nice to Have): Python Scripting Minimum Experience on Key Skills: 5 to 8 years

General Expectations 1) Must have good communication skills 2) Must be ready to work in the 10:30 AM to 8:30 PM shift 3) Flexible to work at the client locations: Ramanujam IT Park, Taramani, Chennai, or GV, Manyata or EGL, Bangalore 4) Must be ready to work from office in a hybrid work environment; fully remote work is not an option 5) Expect full return to office in 2025

Pre-requisites before submitting profiles 1) Must have genuine and digitally signed Form 16 for ALL employments 2) All employment history/details must be present in UAN/PPF statements 3) Candidates must be screened via video to confirm they are genuine and have a proper work setup 4) Candidates must have real work experience in the mandatory skills mentioned in the JD 5) Profiles must list the companies with which candidates are on payroll, not the client names, as their employers 6) As these are competitive positions and the client will not wait 60 days or carry the risk of drop-outs, candidates must have a notice period of 0 to 3 weeks 7) Candidates must be screened for any gaps after education and during employment, and the genuineness of the reasons verified.

About NTT DATA NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
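Combining the primary (Oracle PL/SQL) and tertiary (Python scripting) skillsets, a task here might resemble the sketch below, which calls a stored procedure through the python-oracledb driver. The DSN, credentials, and `refresh_daily_sales` procedure are hypothetical.

```python
# Sketch: invoking a PL/SQL procedure from Python with python-oracledb.
# Connection details and the procedure name are placeholders.
import datetime
import oracledb

conn = oracledb.connect(user="etl_user", password="***",
                        dsn="dbhost:1521/ORCLPDB1")  # hypothetical DSN
cur = conn.cursor()

# Call a hypothetical procedure that refreshes a reporting aggregate,
# returning a row count through an OUT parameter.
rows_out = cur.var(int)
cur.callproc("refresh_daily_sales", [datetime.date(2025, 7, 8), rows_out])
print("rows refreshed:", rows_out.getvalue())

conn.commit()
cur.close()
conn.close()
```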

Posted 1 week ago

Apply

13.0 - 17.0 years

32 - 35 Lacs

Noida, Gurugram

Work from Office

Requirement: Hands-on experience as a techno-functional consultant implementing the OFSAA FCCM (Financial Crime and Compliance Management) suite: Anti-Money Laundering (AML), Customer Screening (CS), KYC, Transaction Monitoring & Filtering, ECM, FATCA Management, and FCC Studio. Excellent working knowledge of Java, PL/SQL, and Linux. Knowledge of FCCM Analytics, CRR, and FATCA.
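FCCM rules are configured inside the OFSAA product itself, but the underlying transaction-monitoring idea can be illustrated with a plain-Python sketch (not OFSAA code). The threshold, window, and sample transactions below are invented.

```python
# Illustrative sketch of a rolling-window AML rule: flag accounts whose
# deposits within a 7-day window exceed a threshold. All values invented.
from collections import defaultdict
from datetime import datetime, timedelta

THRESHOLD = 10_000          # hypothetical aggregate limit
WINDOW = timedelta(days=7)

txns = [  # (account, timestamp, amount) -- sample data
    ("A1", datetime(2025, 7, 1), 4_000),
    ("A1", datetime(2025, 7, 3), 3_500),
    ("A1", datetime(2025, 7, 6), 3_200),
]

by_account = defaultdict(list)
for acct, ts, amt in sorted(txns, key=lambda t: t[1]):
    by_account[acct].append((ts, amt))

for acct, events in by_account.items():
    for ts, _ in events:
        window_sum = sum(a for t, a in events if ts <= t < ts + WINDOW)
        if window_sum > THRESHOLD:
            print(f"ALERT {acct}: {window_sum} within 7 days of {ts.date()}")
            break
```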

Posted 1 week ago

Apply

8.0 - 11.0 years

6 - 7 Lacs

Noida

On-site

SAP S4HANA Data Migration Lead Full-time Company Description About Sopra Steria Sopra Steria, a major Tech player in Europe with 56,000 employees in nearly 30 countries, is recognized for its consulting, digital services and software development. It helps its clients drive their digital transformation and obtain tangible and sustainable benefits. The Group provides end-to-end solutions to make large companies and organizations more competitive by combining in-depth knowledge of a wide range of business sectors and innovative technologies with a fully collaborative approach. Sopra Steria places people at the heart of everything it does and is committed to putting digital to work for its clients in order to build a positive future for all. In 2023, the Group generated revenues of €5.8 billion. The world is how we shape it.

Job Description Position: SAP S4HANA Data Migration Lead Location: Chennai/Noida Experience: 8-11 years Education: B.E./B.Tech./MCA

Job Description At least one end-to-end engagement as Data Lead on a project. At least 8 years of IT experience in SAP Data Migration, Data Analysis, Data Audits, Process Management, Business Analysis, ECM, Business Process Re-engineering, RFPs, Quality Assurance, Data Analysis and Modeling, and Testing of enterprise-wide client/server and web-based applications, covering all aspects of Software Engineering and the Systems Development Life Cycle (SDLC). Experience as an ABAP developer: able to understand ABAP and the data dictionary, read BAPI code, and debug. Certified in any of the ETL tools such as Informatica, BODS, or Syniti ADM. Good work experience in SAP data migration using LTMC and LTMOM. Basic development skills in VBA (Excel) or Microsoft SQL Server for cases where no tool is provided but file management is needed. Extensive experience in managing industrial, finance, and telecom clients; manufacturing process knowledge (sales, purchasing, production, finance) is needed for AMC here. Expertise in cutover planning, project planning, project design, gathering business and functional requirements, creating functional specifications, and use-case data flow diagrams. Worked in SAP Finance, SAP SD, SAP MM, SAP SCM, SAP MDM, SAP Project Systems, and business partner migration from SAP ECC or open-source systems to S/4HANA.

Qualifications B.Tech Computer Science.

Additional Information At our organization, we are committed to fighting against all forms of discrimination. We foster a work environment that is inclusive and respectful of all differences. All of our positions are open to people with disabilities.
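Where no migration tool is provided and file management falls to the Data Lead, a pre-load validation pass before the file goes into the S/4HANA migration cockpit (LTMC) might look like the Python sketch below, a stand-in for the VBA/SQL Server checks the posting mentions. The workbook, sheet, and column names are hypothetical.

```python
# Sketch: validating a business-partner migration template before load.
# File, sheet, and column names are hypothetical placeholders.
import pandas as pd

REQUIRED = ["BP_NUMBER", "NAME1", "COUNTRY", "CURRENCY"]

df = pd.read_excel("business_partners.xlsx", sheet_name="Data", dtype=str)

errors = []
for col in REQUIRED:
    if col not in df.columns:
        errors.append(f"missing column {col}")
    elif df[col].isna().any():
        errors.append(f"{df[col].isna().sum()} blank values in {col}")

dupes = df["BP_NUMBER"].duplicated().sum() if "BP_NUMBER" in df.columns else 0
if dupes:
    errors.append(f"{dupes} duplicate BP_NUMBER keys")

print("OK to load" if not errors else "\n".join(errors))
```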

Posted 1 week ago

Apply

0 years

4 - 8 Lacs

Calcutta

On-site

Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions – we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Lead Consultant – Snowflake & Informatica Cloud Data Engineer.

Responsibilities Design and implement scalable data solutions in Snowflake following data engineering best practices and a layered architecture. Design and implement scalable data pipelines and ETL/ELT processes using dbt, integrated with Snowflake for modern cloud data warehousing. Develop and optimize transformation logic and storage structures in Snowflake using SQL, Python, and Airflow. Collaborate with business and technical teams to translate data requirements into robust dbt-on-Snowflake integration solutions. Ensure data quality, security, and compliance by applying governance best practices across data transformation pipelines and within the Snowflake environments. Perform performance tuning in Snowflake and streamline ETL pipelines for efficient execution, supported by clear documentation of architecture and integration patterns.

Qualifications we seek in you! Minimum Qualifications Bachelor's degree in information science, data management, computer science or related field preferred. Must have experience in the Cloud Data Engineering domain. Proven experience in cloud data engineering using Snowflake and Informatica, with hands-on delivery of end-to-end data pipeline implementations. Strong knowledge of data warehousing, ELT/ETL design, OLAP concepts, and dimensional modelling using Snowflake, with experience in projects delivering complete data solutions. Hands-on expertise in developing, scheduling, and orchestrating scalable ETL/ELT pipelines using Informatica Cloud or PowerCenter. Proficiency in Python for data transformation and automation tasks integrated with Snowflake environments. Excellent communication and documentation skills, with the ability to clearly articulate Snowflake architectures and Informatica workflows. Experience implementing data quality, lineage, and governance frameworks using Informatica and Snowflake capabilities. Familiarity with CI/CD practices for deploying Informatica workflows and Snowflake objects within DevOps environments. Why join Genpact?
Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation Make an impact – Drive change for global enterprises and solve business challenges that matter Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. Job Lead Consultant Primary Location India-Kolkata Schedule Full-time Education Level Bachelor's / Graduation / Equivalent Job Posting Jul 8, 2025, 11:22:19 PM Unposting Date Ongoing Master Skills List Digital Job Category Full Time
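As a concrete picture of the dbt-on-Snowflake orchestration this role describes, a minimal Airflow DAG might look like the sketch below. The project path, target name, and schedule are hypothetical, and the `schedule` argument assumes Airflow 2.4+; older versions use `schedule_interval`.

```python
# Sketch: an Airflow DAG that runs dbt transformations against Snowflake
# and then executes dbt tests. Paths and names are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="dbt_snowflake_daily",
    start_date=datetime(2025, 7, 1),
    schedule="0 2 * * *",  # nightly, after source loads land
    catchup=False,
) as dag:
    dbt_run = BashOperator(
        task_id="dbt_run",
        bash_command="cd /opt/dbt/analytics && dbt run --target prod",
    )
    dbt_test = BashOperator(
        task_id="dbt_test",
        bash_command="cd /opt/dbt/analytics && dbt test --target prod",
    )
    dbt_run >> dbt_test
```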

Posted 1 week ago

Apply

7.0 - 12.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Experience required: 7+ years with data governance and the Informatica tool suite.

Key Responsibilities: Data Governance Framework Development: Develop, implement, and maintain data governance frameworks, policies, and standards to ensure high-quality, consistent, and secure data across the organization. Collaborate with business units and stakeholders to define and enforce data governance policies, ensuring alignment with business goals and regulatory requirements. Data Quality Management: Define and enforce data quality standards, monitoring key data quality metrics. Identify, analyze, and resolve data quality issues across various data sources and platforms. Work with cross-functional teams to implement data quality improvement initiatives. Data Lineage & Metadata Management: Implement and maintain data lineage and metadata management solutions to ensure visibility and traceability of data throughout its lifecycle. Work with data architects and engineers to establish and document data flows, transformations, and dependencies. Data Security & Compliance: Ensure that data governance practices comply with relevant regulatory requirements (e.g., GDPR, CCPA, HIPAA). Implement data security controls to protect sensitive data and manage access to sensitive information. Stakeholder Collaboration: Partner with data architects, data engineers, data scientists, and business analysts to ensure alignment between technical and business needs for data governance. Provide training and support for teams on data governance policies, best practices, and tools. Data Governance Tools & Technologies: Lead the implementation and optimization of data governance tools and platforms. Continuously evaluate emerging tools and technologies to improve data governance processes. Reporting & Documentation: Develop and maintain comprehensive data governance documentation and reports. Provide regular updates to senior management on the status of data governance initiatives, risks, and areas of improvement.

Requirements: Experience: 7+ years of experience in data governance, data management, or related fields. Proven track record in implementing data governance frameworks and policies at an enterprise level. In-depth knowledge of data governance concepts, including data quality, data lineage, metadata management, and data security. Technical Skills: Experience with data governance tools such as Collibra, Informatica, Alation, or similar. Strong understanding of databases, data warehousing, and big data platforms (e.g., Hadoop, Spark). Familiarity with data integration, ETL processes, and data modeling. Proficiency in SQL and other scripting languages (e.g., Python, Shell). Regulatory Knowledge: Solid understanding of data privacy and compliance regulations (GDPR, CCPA, HIPAA, etc.). Ability to assess and mitigate compliance risks related to data handling. Soft Skills: Excellent communication and interpersonal skills. Strong problem-solving skills and the ability to collaborate across teams. Ability to manage multiple projects and deadlines in a fast-paced environment.
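The "monitoring key data quality metrics" responsibility above is often implemented as a rule registry evaluated against SQL sources. Below is a self-contained sketch of that pattern using sqlite3, with invented rules and sample data; it illustrates the approach rather than any specific governance tool.

```python
# Sketch: governance data-quality rules defined as data and evaluated
# against a SQL source. Uses an in-memory sqlite3 database as a demo.
import sqlite3

RULES = [  # (rule name, violation-count query, allowed violations)
    ("customer_id not null",
     "SELECT COUNT(*) FROM customers WHERE customer_id IS NULL", 0),
    ("email unique",
     "SELECT COUNT(*) - COUNT(DISTINCT email) FROM customers", 0),
]

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (customer_id INT, email TEXT);
    INSERT INTO customers VALUES (1, 'a@x.com'), (NULL, 'a@x.com');
""")

for name, sql, allowed in RULES:
    violations = conn.execute(sql).fetchone()[0]
    status = "PASS" if violations <= allowed else "FAIL"
    print(f"{status}: {name} ({violations} violations)")
```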

Posted 1 week ago

Apply