
3075 Informatica Jobs - Page 13

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 11.0 years

8 - 14 Lacs

Gurugram

Work from Office

Source: Naukri

As a BI ETL Test Engineer, you are responsible for testing BI systems. This includes validating the business data flow, ETL components, data lineage, and ETL architecture, and analyzing defects found during data validation. You also define the testing strategy, recommend tools, and perform technical feasibility and risk assessments.

Primary Skills

As a BI ETL Test Specialist, you are expected to be a subject matter expert in this area of specialised testing. You understand the business data flow, ETL components, data lineage, and ETL architecture, and are able to analyze defects during data validation. You have good technical knowledge of databases, Unix/Linux, and ETL and BI tools. You are expected to develop the testing strategy, recommend tools, perform technical feasibility studies, conduct risk assessments, and build business cases (ROI). You are expected to own the delivery of specialised testing projects and to work independently while providing technical support and guidance.

Skills (competencies)
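The data-validation work described above usually starts with reconciliation checks between source and target. A minimal sketch, using Python's built-in sqlite3 as a stand-in for the real source and target databases; the table and column names are invented for illustration:

```python
import sqlite3

# Illustrative source/target reconciliation: compare row counts, a
# per-column aggregate, and look for keys lost in the ETL (lineage gaps).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE src_orders (id INTEGER, amount REAL);
    CREATE TABLE tgt_orders (id INTEGER, amount REAL);
    INSERT INTO src_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
    INSERT INTO tgt_orders VALUES (1, 10.0), (2, 25.5), (3, 7.25);
""")

def reconcile(conn, src, tgt, key, measure):
    """Return a dict of simple reconciliation checks between two tables."""
    q = "SELECT COUNT(*), COALESCE(SUM({m}), 0) FROM {t}"
    src_count, src_sum = conn.execute(q.format(m=measure, t=src)).fetchone()
    tgt_count, tgt_sum = conn.execute(q.format(m=measure, t=tgt)).fetchone()
    missing = conn.execute(
        f"SELECT {key} FROM {src} EXCEPT SELECT {key} FROM {tgt}"
    ).fetchall()
    return {
        "counts_match": src_count == tgt_count,
        "sums_match": abs(src_sum - tgt_sum) < 1e-9,
        "missing_keys": [r[0] for r in missing],
    }

result = reconcile(conn, "src_orders", "tgt_orders", "id", "amount")
```

In practice the same three checks (counts, aggregates, missing keys) are run per table against the real databases as part of the test strategy.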

Posted 2 days ago

Apply

0 years

0 Lacs

India

Remote

Source: LinkedIn

About Us

Our leading SaaS-based Global Employment Platform™ enables clients to expand into over 180 countries quickly and efficiently, without the complexities of establishing local entities. At G-P, we’re dedicated to breaking down barriers to global business and creating opportunities for everyone, everywhere. Our diverse, remote-first teams are essential to our success. We empower our Dream Team members with flexibility and resources, fostering an environment where innovation thrives and every contribution is valued and celebrated. The work you do here will positively impact lives around the world. We stand by our promise: Opportunity Made Possible. In addition to competitive compensation and benefits, we invite you to join us in expanding your skills and helping to reshape the future of work. At G-P, we assist organizations in building exceptional global teams in days, not months, streamlining the hiring, onboarding, and management process to unlock growth potential for all.

About The Position

As a Senior Engineering Manager at Globalization Partners, you will be responsible for both technical leadership and people management. This includes contributing to architectural discussions, decisions, and execution, as well as managing and developing a team of Data Engineers of different experience levels.

What You Can Expect To Do

- Own the strategic direction and execution of initiatives across our Data Platform, aligning technical vision with business goals.
- Guide teams through architectural decisions, delivery planning, and execution of complex programs that advance our platform capabilities.
- Lead and grow high-performing engineering teams responsible for the full data and analytics stack, from ingestion (ETL and streaming) through transformation, storage, and consumption, ensuring quality, reliability, and performance at scale.
- Partner cross-functionally with product managers, architects, engineering leaders, and stakeholders from Cloud Engineering and other business domains to shape product and platform capabilities, translating business needs into actionable engineering plans.
- Drive delivery excellence by setting clear expectations, removing blockers, and ensuring engineering teams are progressing efficiently towards milestones while maintaining technical integrity.
- Ensure adoption and consistency of platform standards and best practices, including shared components, reusable libraries, and scalable data patterns.
- Support technical leadership across teams by fostering a strong culture of engineering excellence, security, and operational efficiency. Guide technical leads in maintaining high standards in architecture, development, and testing.
- Contribute to strategic planning, including the evolution of the data platform roadmap, migration strategies, and long-term technology investments aligned with company goals.
- Champion agile methodologies and DevOps practices, driving continuous improvement in team collaboration, delivery cycles, and operational maturity.
- Mentor and develop engineering talent, creating an environment where individuals can thrive through coaching, feedback, and growth opportunities. Promote a culture of innovation, accountability, and psychological safety.
- Challenge Data Platform quality and performance by building and monitoring quality KPIs and building a quality-first culture.

What We Are Looking For

- Proven experience leading geographically distributed engineering teams in the design and delivery of complex data and analytics platforms.
- Strong technical foundation with hands-on experience in modern data architectures, handling structured and unstructured data, and programming in Python, capable of guiding teams and reviewing design and code at a high level when necessary.
- Proficiency in SQL and relational database technologies, with the ability to guide data modeling and performance optimization discussions.
- In-depth understanding of ETL processes and data integration strategies, with practical experience overseeing data ingestion (batch and streaming), transformation, and quality assurance initiatives.
- Familiarity with commercial data platforms (e.g., Databricks, Snowflake) and cloud-native data warehouses (e.g., Redshift, BigQuery), including trade-offs and best practices in enterprise environments.
- Working knowledge of data governance and cataloging solutions, such as Atlan, Alation, Informatica, or Collibra, and experience supporting enterprise data stewardship efforts.
- Deep understanding of data quality, experience in building quality processes, and usage of tools like Monte Carlo.
- Understanding of machine learning and AI workloads, including the orchestration of data pipelines for model training and deployment in both batch and streaming contexts.
- Strong analytical and problem-solving skills, with the ability to drive root-cause analysis, evaluate architectural trade-offs, and support decision-making in ambiguous or fast-changing environments.
- Exceptional communication skills, with a track record of clear and effective collaboration across technical and non-technical stakeholders. Fluent in English, both verbal and written, with the ability to influence at all levels of the organization.
- Bachelor’s degree in Computer Science or a related field; advanced degrees or equivalent professional experience are a plus.

We will consider for employment all qualified applicants who meet the inherent requirements for the position. Please note that background checks are required, and this may include criminal record checks.

G-P. Global Made Possible.

G-P is a proud Equal Opportunity Employer, and we are committed to building and maintaining a diverse, equitable and inclusive culture that celebrates authenticity. We prohibit discrimination and harassment against employees or applicants on the basis of race, color, creed, religion, national origin, ancestry, citizenship status, age, sex or gender (including pregnancy, childbirth, and pregnancy-related conditions), gender identity or expression (including transgender status), sexual orientation, marital status, military service and veteran status, physical or mental disability, genetic information, or any other legally protected status.

G-P also is committed to providing reasonable accommodations to individuals with disabilities. If you need an accommodation due to a disability during the interview process, please contact us at careers@g-p.com.

Posted 2 days ago

Apply

3.0 - 7.0 years

11 - 16 Lacs

Gurugram

Work from Office

Source: Naukri

Project description

We are looking for a star Python Developer who is not afraid of work and challenges! Partnering with a well-known financial institution, we are gathering a team of professionals with a wide range of skills to successfully deliver business value to the client.

Responsibilities
- Analyse existing SAS DI pipelines and SQL-based transformations.
- Translate and optimize SAS SQL logic into Python code using frameworks such as PySpark.
- Develop and maintain scalable ETL pipelines using Python on AWS EMR.
- Implement data transformation, cleansing, and aggregation logic to support business requirements.
- Design modular and reusable code for distributed data processing tasks on EMR clusters.
- Integrate EMR jobs with upstream and downstream systems, including AWS S3, Snowflake, and Tableau.
- Develop Tableau reports for business reporting.

Skills

Must have:
- 6+ years of experience in ETL development, with at least 5 years working with AWS EMR.
- Bachelor's degree in Computer Science, Data Science, Statistics, or a related field.
- Proficiency in Python for data processing and scripting.
- Proficiency in SQL and experience with one or more ETL tools (e.g., SAS DI, Informatica).
- Hands-on experience with AWS services: EMR, S3, IAM, VPC, and Glue.
- Familiarity with data storage systems such as Snowflake or RDS.
- Excellent communication skills and ability to work collaboratively in a team environment.
- Strong problem-solving skills and ability to work independently.

Nice to have: N/A

Languages: English B2 Upper Intermediate
Seniority: Senior
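The SAS-SQL-to-Python translation work described above can be sketched as follows. PySpark is the framework the posting names; as a minimal, dependency-free illustration, this uses Python's built-in sqlite3 to represent the original PROC SQL step and a plain-Python port of the same aggregation, the way one would verify a translated transformation against the original. Table and column names are invented:

```python
import sqlite3
from collections import defaultdict

# A SAS DI job might materialize this aggregation as a PROC SQL step:
#   SELECT region, SUM(amount) AS total
#   FROM transactions GROUP BY region;
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE transactions (region TEXT, amount REAL)")
rows = [("north", 10.0), ("south", 5.0), ("north", 2.5)]
conn.executemany("INSERT INTO transactions VALUES (?, ?)", rows)

sql_result = dict(conn.execute(
    "SELECT region, SUM(amount) FROM transactions GROUP BY region"
).fetchall())

def aggregate(rows):
    """Python port of the SQL GROUP BY / SUM step."""
    totals = defaultdict(float)
    for region, amount in rows:
        totals[region] += amount
    return dict(totals)

py_result = aggregate(rows)
```

In a real migration the `aggregate` step would be a PySpark `groupBy("region").sum("amount")` over a DataFrame, with the SQL result used as the regression baseline.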

Posted 2 days ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Hyderabad

Work from Office

Source: Naukri

Employment Type: Contract
Mandatory Skill: Informatica MDM & IDQ

Job Description:
- 5+ years of experience performing administrative activities for Informatica/Informatica MDM.
- Experience in installation, configuration, administration, and security of BI tools.
- Experience in software upgrades, implementation of hotfixes, implementation of new software offerings, and coordination of testing activities with project teams.
- Experience in implementation of SSL and different authentication methods.
- Design, installation, configuration, and administration of Informatica Platform v10 or higher (currently on v10.2) on Linux; installation experience with Informatica MDM 10.x preferred.
- Leads software upgrades, implementation of hotfixes, implementation of new software offerings, infrastructure, and maintenance, and coordinates testing activities with project teams.
- Researches and provides recommendations for capacity modifications; collaborates with the PM to document tasks and update status.
- Creates and maintains architecture diagrams; Informatica/Data Integration/Data Quality tools and UNIX troubleshooting; automating daily tasks.
- Informatica/Data Integration/Data Quality tools security.
- Informatica MDM platform administration and integration support.
- Coordinates patching and other infrastructure-related activities with different teams.
- Monitoring of servers and services.
- Responsible for BI and analytics administration at an enterprise level; identify and improve infrastructure and processes; ensure environments are stable and available at all times.
- Administer and support BI analytics infrastructure, including upgrades, testing, security administration, license management, troubleshooting of ETL, reports, and workbooks, performance monitoring, and system maintenance.
- Sarbanes-Oxley experience a plus.

IBM Cognos (good to have):
- Design, installation, and configuration of Cognos BI 10.x or higher (currently on v11) on Windows; installation and configuration of IIS; Cognos Analytics (v11) knowledge preferred.
- Create and maintain installation documentation and architectural diagrams.
- Implement package deployments and troubleshoot package issues with complex long-running reports.
- Experience working with Motio products a plus; devise or modify procedures to solve complex problems involving Cognos administration and/or cube and report generation.
- Perform backups and upgrades, set up security roles and users, and troubleshoot by understanding the various log messages.
- Implement and maintain Cognos security.
- Manage Cognos Analytics reporting while utilizing best practices and innovative ideas to build, maintain, distribute, and educate users on the reporting and query functionality of Cognos Analytics.
- Apply expertise to implementation of BI applications to support global reporting.

Tableau & Alteryx (good to have):
- Installation, upgrades, administration, and technical support for the enterprise-wide Tableau implementation and Alteryx Enterprise Server 2019.x.
- Experience developing in and administering Tableau Server 2019.x or later, Tableau Desktop, and Alteryx Designer, and administering Alteryx Server.
- Full knowledge of SSL and different authentication methods.
- Working knowledge of cloud SaaS and PaaS applications.

Posted 2 days ago

Apply

4.0 - 8.0 years

4 - 8 Lacs

Hyderabad, Gandhi

Hybrid

Source: Naukri

- Requirement analysis for new enhancements; implement changes/enhancements by developing change requests and report customizations per business needs. Good Planisware development experience is preferred.
- Customize the Planisware system based on business requirements and deploy the solution to target environments.
- Planisware product implementations, bug fixing, production support, server automation, and maintenance of the Planisware application.
- Offer consultation to the client on improving any existing reports or modules.
- Handle maintenance activities such as taking light dpx backups, restoring environments with the required production dump and performing post-restoration steps, configuring environment settings, restarting services, running batches, monitoring logs on the Linux server, and license management.
- Experience working on Planisware upgrade/migration, implementation, and support services.
- Create new opx2/lsp scripts to implement required solutions, and install patches.
- Help the business team leverage Planisware for their project scheduling, resource, and cost management requirements.
- Responsible for the implementation of technical systems, software, and solutions using Planisware.
- Lead the techno-functional team to carry out implementations on different Planisware instances.
- Support the program in setting up and implementing the system using Planisware.
- Change environment configuration to set up new functionalities/forms in the Planisware application instance.
- Good in-depth knowledge of environment files, common datasets, styles, named formulas, alerts, locks, attributes (additional and relational), persistent tables, breakdown structures, macros, and cost tables, as required for the Planisware development process.

Posted 2 days ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Karnataka

Remote

Source: Naukri

POSITION GENERAL DUTIES AND TASKS:

At NTT DATA, we know that with the right people on board, anything is possible. The quality, integrity, and commitment of our employees are key factors in our company's growth, market presence, and our ability to help our clients stay a step ahead of the competition. By hiring the best people and helping them grow both professionally and personally, we ensure a bright future for NTT DATA and for the people who work here. NTT DATA, Inc. currently seeks a DataStage Developer to join our team in Bangalore (currently remote).

Role Overview:
- Design and development of ETL and BI applications in DataStage.
- Design/develop testing processes to ensure end-to-end performance, data integrity, and usability.
- Carry out performance testing, integration testing, and system testing.
- Good SQL knowledge is mandatory.
- Basic Unix knowledge is required.
- Should be able to communicate with the client and work on technical requirements.

Posted 2 days ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Source: Naukri

Ab Initio / ETL Tester:
- Hands-on 3-5 years of experience in ETL / data warehousing, preferably Ab Initio.
- Hands-on 3-5 years of experience in Oracle advanced SQL (ability to construct and execute complex SQL queries and understand Oracle errors).
- Hands-on experience in API testing (fine for one of the resources to have this skill).
- Hands-on experience in Unix.
- Good analytical, reporting, and communication skills.
- Lead the scrum team in using Agile methodology and scrum practices.
- Help the product owner and development team to achieve customer satisfaction.
- Lead the scrum and development teams in self-organization.
- Remove impediments and coach the scrum team on removing impediments.
- Help the scrum and development teams to identify and fill in blanks in the Agile framework.
- Resolve conflicts and issues that occur.
- Help the scrum team achieve higher levels of scrum maturity.
- Support the product owner and provide education where needed.

Required Skills:
- Knowledge of tooling and integration with CI/CD tools like Jenkins, Travis CI, or AWS CodePipeline.
- Collaborate with clients to understand their business requirements and design custom contact center solutions using AWS Connect.
- Demonstrate deep knowledge of AWS Connect and its integration with other AWS services, including Lambda, S3, DynamoDB, and others.
- Prior experience of 3+ years on a scrum team.
- Must have AWS Connect knowledge.
- Ability to analyze and think quickly and to resolve conflict.
- Knowledgeable in techniques to fill in gaps in scrum.
- Ability to determine what is scrum and what is not.
- Experience with successful Agile techniques.
- Ability to work with and lead a team.
- Strong communication, interpersonal, and mentoring skills.
- Ability to adapt to a changing environment.
- Self-motivation and ability to stay focused in the middle of distractions.

Posted 2 days ago

Apply

5.0 - 10.0 years

4 - 7 Lacs

Bengaluru

Work from Office

Source: Naukri

Development of workflows and connectors for the Collibra platform; administration and configuration of the Collibra platform.

Duties:
- Collibra DGC administration and configuration.
- Collibra Connect administration and configuration.
- Collibra development of workflows and MuleSoft connectors.
- Ingesting metadata from external sources into Collibra.
- Installation, upgrading, and administration of Collibra components.
- Setup, support, deployment, and migration of Collibra components.
- Implement application changes: review and deploy code packages, perform post-implementation verifications.
- Participate in group meetings (including business partners) for problem solving, decision making, and implementation planning.

Senior Collibra Developer - Mandatory Skills:
- Collibra Connect
- Collibra DGC
- Java
- Advanced hands-on working knowledge of Unix/Linux
- Advanced hands-on experience with UNIX scripting
- SQL Server
- Groovy

Posted 2 days ago

Apply

6.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Source: Naukri

We are seeking a skilled and experienced Cognos and Informatica Administrator to join our team. You will be responsible for the installation, configuration, maintenance, and support of Cognos and Informatica software in our organization. Your role will involve collaborating with cross-functional teams, solving system issues, and ensuring the smooth functioning of the Cognos and Informatica environments.

Role Scope / Deliverables

Responsibilities:
- Install, configure, and upgrade Cognos and Informatica application components, including servers, clients, and related tools.
- Monitor and maintain the performance, availability, and security of Cognos and Informatica environments.
- Collaborate with developers, business analysts, and other stakeholders to understand requirements and provide technical guidance.
- Troubleshoot and resolve issues related to Cognos and Informatica applications, databases, servers, and integrations.
- Perform system backups, disaster recovery planning, and implementation.
- Implement and enforce best practices for Cognos and Informatica administration, security, and performance tuning.
- Manage user access, roles, and permissions within Cognos and Informatica environments.
- Coordinate with vendors for product support, patches, upgrades, and license management.
- Stay up to date with the latest trends and advancements in Cognos and Informatica technologies.
- Document technical processes, procedures, and configurations.

Nice-to-Have Skills:
- Development skills: familiarity with Cognos Report Studio, Framework Manager, Informatica PowerCenter, and other development tools to assist in troubleshooting and providing guidance to developers and users.
- Power BI experience: practiced in designing and building dashboards in Power BI, or Power BI administration experience.
- Microsoft SQL Server Analysis Services (SSAS) experience: install, configure, and maintain SSAS environments; proven knowledge as an SSAS administrator.
- Databricks experience: knowledge of Databricks and a strong understanding of its architecture, capabilities, and best practices.

Key Skills / Requirements:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Cognos and Informatica Administrator or similar role.
- Solid understanding of Cognos and Informatica installation, configuration, and administration.
- Familiarity with relational databases, SQL, and data warehousing concepts.
- Excellent troubleshooting and problem-solving skills.
- Ability to work independently and collaboratively in a team environment.
- Strong communication and interpersonal skills.
- Attention to detail and ability to prioritize tasks effectively.

Posted 2 days ago

Apply

5.0 - 10.0 years

4 - 7 Lacs

Mumbai

Hybrid

Source: Naukri

PF detection is mandatory.

1. Minimum 5 years of experience in database development and ETL tools.
2. Strong expertise in SQL and database platforms (e.g., SQL Server, Oracle, PostgreSQL).
3. Proficiency in ETL tools (e.g., Informatica, SSIS, Talend, DataStage) and scripting languages (e.g., Python, Shell).
4. Experience with data modeling and schema design.
5. Familiarity with cloud databases and ETL tools (e.g., AWS Glue, Azure Data Factory, Snowflake).
6. Understanding of data warehousing concepts and best practices.
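The database development and ETL skills listed above reduce, at their core, to extract-transform-load steps like the following minimal sketch. Python's built-in sqlite3 stands in here for a production database; the table names and cleansing rules are invented for illustration:

```python
import sqlite3

# Minimal extract-transform-load flow: read raw rows, apply cleansing
# rules, and load the result into a target table.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_customers (name TEXT, email TEXT);
    CREATE TABLE dim_customers (name TEXT, email TEXT);
    INSERT INTO raw_customers VALUES
        ('  Alice ', 'ALICE@EXAMPLE.COM'),
        ('Bob', 'bob@example.com'),
        ('Carol', NULL);               -- will be rejected: missing email
""")

# Extract
raw = conn.execute("SELECT name, email FROM raw_customers").fetchall()

# Transform: trim names, lowercase emails, drop rows without an email.
clean = [
    (name.strip(), email.lower())
    for name, email in raw
    if email is not None
]

# Load
conn.executemany("INSERT INTO dim_customers VALUES (?, ?)", clean)
loaded = conn.execute(
    "SELECT name, email FROM dim_customers ORDER BY name"
).fetchall()
```

Tools like Informatica, SSIS, or AWS Glue express the same three stages graphically or as job definitions; the cleansing rule in the transform step is where most of the development effort lives.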

Posted 2 days ago

Apply

6.0 - 11.0 years

3 - 7 Lacs

Karnataka

Hybrid

Source: Naukri

PF detection is mandatory.

Looking for a candidate with over 6 years of hands-on involvement in Snowflake. The primary expertise required is in Snowflake; the candidate must be capable of creating complex SQL queries for manipulating data and should excel at implementing complex scenarios within Snowflake. The candidate should also possess a strong foundation in Informatica PowerCenter, showcasing proficiency in executing ETL processes.

- Strong hands-on experience in SQL and RDBMS
- Strong hands-on experience in Unix shell scripting
- Knowledge of data warehousing and cloud data warehousing
- Good communication skills
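As an illustration of the kind of "complex SQL" such a role involves, the window-function query below deduplicates records to the latest version per key, a common pattern in Snowflake warehouses. It is shown running on Python's sqlite3 (which supports the same window functions) so the example stays self-contained; the table and columns are invented:

```python
import sqlite3

# Keep only the most recent row per customer using ROW_NUMBER(),
# a staple pattern in warehouse-style SQL (Snowflake and others).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customer_updates (cust_id INT, city TEXT, updated_at TEXT);
    INSERT INTO customer_updates VALUES
        (1, 'Pune',      '2024-01-01'),
        (1, 'Mumbai',    '2024-06-01'),
        (2, 'Bengaluru', '2024-03-15');
""")

latest = conn.execute("""
    SELECT cust_id, city
    FROM (
        SELECT cust_id, city,
               ROW_NUMBER() OVER (
                   PARTITION BY cust_id
                   ORDER BY updated_at DESC
               ) AS rn
        FROM customer_updates
    )
    WHERE rn = 1
    ORDER BY cust_id
""").fetchall()
```

The same query runs unchanged on Snowflake; only the connection layer differs.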

Posted 2 days ago

Apply

0.0 years

6 - 9 Lacs

Hyderābād

On-site

Source: GlassDoor

Our vision is to transform how the world uses information to enrich life for all. Micron Technology is a world leader in innovating memory and storage solutions that accelerate the transformation of information into intelligence, inspiring the world to learn, communicate and advance faster than ever.

Responsibilities and Tasks:

Understand the Business Problem and the Relevant Data
- Maintain an intimate understanding of company and department strategy.
- Translate analysis requirements into data requirements.
- Identify and understand the data sources that are relevant to the business problem.
- Develop conceptual models that capture the relationships within the data.
- Define the data-quality objectives for the solution.
- Be a subject matter expert in data sources and reporting options.

Architect Data Management Systems
- Design and implement optimum data structures in the appropriate data management system (Hadoop, Teradata, SQL Server, etc.) to satisfy the data requirements.
- Plan methods for archiving/deletion of information.

Develop, Automate, and Orchestrate an Ecosystem of ETL Processes for Varying Volumes of Data
- Identify and select the optimum method of access for each data source (real-time/streaming, delayed, static).
- Determine transformation requirements and develop processes to bring structured and unstructured data from the source to a new physical data model.
- Develop processes to efficiently load the transformed data into the data management system.

Prepare Data to Meet Analysis Requirements
- Work with the data scientists to implement strategies for cleaning and preparing data for analysis (e.g., outliers, missing data, etc.).
- Develop and code data extracts.
- Follow standard methodologies to ensure data quality and data integrity.
- Ensure that the data is fit for use in data science applications.

Qualifications and Experience:
- 0-7 years of experience developing, delivering, and/or supporting data engineering, advanced analytics or business intelligence solutions.
- Ability to work with multiple operating systems (e.g., MS Office, Unix, Linux, etc.).
- Experienced in developing ETL/ELT processes using Apache NiFi and Snowflake.
- Significant experience with big data processing and/or developing applications and data sources via Hadoop, Yarn, Hive, Pig, Sqoop, MapReduce, HBase, Flume, etc.
- Understanding of how distributed systems work.
- Familiarity with software architecture (data structures, data schemas, etc.).
- Strong working knowledge of databases (Oracle, MSSQL, etc.), including SQL and NoSQL.
- Strong mathematics background, with analytical, problem-solving, and organizational skills.
- Strong communication skills (written, verbal, and presentation).
- Experience working in a global, multi-functional environment.
- Minimum of 2 years' experience in any of the following: at least one high-level client, object-oriented language (e.g., C#, C++, Java, Python, Perl, etc.); one or more web programming languages (PHP, MySQL, Python, Perl, JavaScript, ASP, etc.); one or more data extraction tools (SSIS, Informatica, etc.); software development.
- Ability to travel as needed.

Education: B.S. degree in Computer Science, Software Engineering, Electrical Engineering, Applied Mathematics or a related field of study. M.S. degree preferred.

About Micron Technology, Inc.

We are an industry leader in innovative memory and storage solutions transforming how the world uses information to enrich life for all. With a relentless focus on our customers, technology leadership, and manufacturing and operational excellence, Micron delivers a rich portfolio of high-performance DRAM, NAND, and NOR memory and storage products through our Micron® and Crucial® brands. Every day, the innovations that our people create fuel the data economy, enabling advances in artificial intelligence and 5G applications that unleash opportunities from the data center to the intelligent edge and across the client and mobile user experience. To learn more, please visit micron.com/careers

All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, veteran or disability status. To request assistance with the application process and/or for reasonable accommodations, please contact hrsupport_india@micron.com

Micron prohibits the use of child labor and complies with all applicable laws, rules, regulations, and other international and industry labor standards. Micron does not charge candidates any recruitment fees or unlawfully collect any other payment from candidates as consideration for their employment with Micron.

AI alert: Candidates are encouraged to use AI tools to enhance their resume and/or application materials. However, all information provided must be accurate and reflect the candidate's true skills and experiences. Misuse of AI to fabricate or misrepresent qualifications will result in immediate disqualification.

Fraud alert: Micron advises job seekers to be cautious of unsolicited job offers and to verify the authenticity of any communication claiming to be from Micron by checking the official Micron careers website.
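The data-preparation step mentioned above (handling outliers and missing data) can be sketched in plain Python. The fill and clipping rules below are illustrative only; in practice this would be done with pandas or Spark over real extracts:

```python
import statistics

def prepare(values):
    """Fill missing readings with the median, then clip outliers to
    median +/- 3 * MAD (median absolute deviation), a robust rule
    that is not inflated by the outliers themselves."""
    present = [v for v in values if v is not None]
    fill = statistics.median(present)
    filled = [fill if v is None else v for v in values]

    med = statistics.median(filled)
    mad = statistics.median(abs(v - med) for v in filled)
    lo, hi = med - 3 * mad, med + 3 * mad
    return [min(max(v, lo), hi) for v in filled]

readings = [10.0, 11.0, None, 9.5, 10.5, 500.0]  # None = missing, 500 = outlier
cleaned = prepare(readings)
```

The MAD-based bound is chosen deliberately: a mean/standard-deviation rule would be skewed by the very outlier it is meant to catch on small samples like this one.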

Posted 2 days ago

Apply

5.0 years

8 - 9 Lacs

Hyderābād

On-site

Source: GlassDoor

About this role: Wells Fargo is seeking a Lead Data Engineer.

In this role, you will:
- Lead complex initiatives with broad impact and act as a key participant in large-scale software planning for the Technology area.
- Design, develop, and run tooling to discover problems in data and applications and report the issues to engineering and product leadership.
- Review and analyze complex software enhancement initiatives for business, operational, or technical improvements that require in-depth evaluation of multiple factors, including intangibles or unprecedented factors.
- Make decisions in complex and multi-faceted data engineering situations requiring understanding of software package options, programming languages, and compliance requirements that influence and lead Technology to meet deliverables and drive organizational change.
- Strategically collaborate and consult with internal partners to resolve highly risky data engineering challenges.

Required Qualifications:
- 5+ years of Database Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:
- Expertise in Informatica PowerCenter 10.2, Oracle, Unix, and Autosys.

Job Expectations:
- Able to work individually and alongside the US counterpart.
- Able to handle production issues at the data level.
- Able to create new jobs and schedule them within the time limit.
- Should follow the agile JIRA process and adhere to the JIRA standards.
- Should handle CRs effectively and get the job to production.

Posting End Date: 26 Jun 2025 (job posting may come down early due to volume of applicants).

We Value Equal Opportunity

Wells Fargo is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other legally protected characteristic. Employees support our focus on building strong customer relationships balanced with a strong risk-mitigating and compliance-driven culture which firmly establishes those disciplines as critical to the success of our customers and company. They are accountable for execution of all applicable risk programs (Credit, Market, Financial Crimes, Operational, Regulatory Compliance), which includes effectively following and adhering to applicable Wells Fargo policies and procedures, appropriately fulfilling risk and compliance obligations, timely and effective escalation and remediation of issues, and making sound risk decisions. There is emphasis on proactive monitoring, governance, risk identification and escalation, as well as making sound risk decisions commensurate with the business unit's risk appetite and all risk and compliance program requirements.

Candidates applying to job openings posted in Canada: Applications for employment are encouraged from all qualified candidates, including women, persons with disabilities, aboriginal peoples and visible minorities. Accommodation for applicants with disabilities is available upon request in connection with the recruitment process.

Applicants with Disabilities: To request a medical accommodation during the application or interview process, visit Disability Inclusion at Wells Fargo.

Drug and Alcohol Policy: Wells Fargo maintains a drug free workplace. Please see our Drug and Alcohol Policy to learn more.

Wells Fargo Recruitment and Hiring Requirements:
a. Third-party recordings are prohibited unless authorized by Wells Fargo.
b. Wells Fargo requires you to directly represent your own experiences during the recruiting and hiring process.

Posted 2 days ago

Apply

10.0 years

6 - 11 Lacs

Hyderābād

On-site

GlassDoor logo

Country India Working Schedule Full-Time Work Arrangement Hybrid Relocation Assistance Available No Posted Date 25-Jun-2025 Job ID 9085 Description and Requirements This position is responsible for design, implementation, and support of MetLife's enterprise data management and integration systems, the underlying infrastructure, and integrations with other enterprise systems and applications using AIX, Linux, or Microsoft Technologies. Job Responsibilities Provide technical expertise in the planning, engineering, design, implementation and support of data management and integration system infrastructures and technologies, including the systems' operational procedures and processes. Partner with the Capacity Management, Production Management, Application Development Teams and the Business to ensure customer expectations are maintained and exceeded. Participate in the evaluation and recommendation of new products and technologies; maintain knowledge of emerging technologies for application to the enterprise. Identify and resolve complex data management and integration system issues (Tier 3 support) utilizing product knowledge and structured troubleshooting tools and techniques. Support Disaster Recovery implementation and testing as required. Experience in designing and developing automation/scripting (shell, Perl, PowerShell, Python, Java, etc.). Good decision-making skills. Take ownership of the deliverables from the entire team. Strong collaboration with leadership groups. Learn new technologies based on demand. Coach other team members and bring them up to speed. Track project status working with team members and report to leadership. Participate in cross-departmental efforts. Lead initiatives within the community of practice. Willing to work in rotational shifts. Good communication skills with the ability to communicate clearly and effectively. Knowledge, Skills and Abilities Education Bachelor's degree in Computer Science, Information Systems, or a related field. 
Experience 10+ years of total experience with at least 7+ years in Informatica application implementation and support of data management and integration system infrastructures and technologies, including the systems' operational procedures and processes. Participate in the evaluation and recommendation of new products and technologies; maintain knowledge of emerging technologies for application to the enterprise. Good understanding of Disaster Recovery implementation and testing. Designing and developing automation/scripting (shell, Perl, PowerShell, Python, Java, etc.). Informatica PowerCenter, Informatica PWX, Informatica DQ, Informatica DEI, Informatica B2B/DX, Informatica MFT, Informatica MDM, Informatica ILM, Informatica Cloud (IDMC/IICS), Ansible (Automation), Operating System Knowledge (Linux/Windows/AIX), Azure DevOps Pipeline Knowledge, Python and/or PowerShell, Agile SAFe for Teams, Enterprise Scheduling Knowledge (Maestro), Troubleshooting, Communications, CP4D, DataStage, Mainframe z/OS Knowledge, OpenShift, Elastic. Experience in creating and working on ServiceNow tasks/tickets. About MetLife Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World’s 25 Best Workplaces™ for 2024, MetLife , through its subsidiaries and affiliates, is one of the world’s leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we’re inspired to transform the next century in financial services. At MetLife, it’s #AllTogetherPossible. Join us!

Posted 2 days ago

Apply

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Title: Informatica IDG Specialist / Consultant-Senior Job Summary: We are looking for an experienced Informatica IDG (Data Governance ) professional to lead and support our enterprise data governance initiatives. The candidate will be responsible for configuring and deploying Informatica Axon, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM ) tools to establish robust governance, data discovery, metadata management, and regulatory compliance across the organization. Key Responsibilities: Implement and configure Informatica IDG components including Axon Data Governance, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM). Collaborate with data owners, stewards, and business users to define and maintain business glossaries, data domains, policies, and governance workflows. Integrate IDG with other platforms (IDQ, MDM, IICS, PowerCenter, Snowflake, etc.) to ensure metadata lineage and impact analysis. Design and implement data governance strategies that align with data privacy regulations (GDPR, CCPA, etc.) and internal compliance requirements. Create and maintain data lineage maps, stewardship dashboards, and data quality insights using Informatica tools. Define and enforce role-based access controls and security configurations within IDG tools. Support adoption of data governance processes, including stewardship, policy approval, and issue resolution. Train business users and data stewards on using Axon, EDC, and other governance components. Ensure the sustainability of governance programs through change management, documentation, and governance councils. 
Required Qualifications: 3-7 years of experience in Informatica Data Governance (IDG) or related tools. Strong hands-on experience with Informatica Axon, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM). Understanding of data governance frameworks, metadata management, and policy management. Familiarity with data classification, data lineage, and data stewardship workflows. Experience with metadata ingestion and cataloging across hybrid/cloud platforms. Solid SQL skills and familiarity with cloud data platforms (AWS, Azure, GCP, Snowflake, etc.). Strong communication, stakeholder engagement, and documentation skills. Preferred Qualifications: Informatica certifications in Axon, EDC, or Data Governance. Experience with Data Quality (IDQ), Master Data Management (MDM), or IICS. Knowledge in CDGC will be an added value Knowledge of industry-specific regulations and data governance mandates. Familiarity with governance best practices from DAMA-DMBOK, DCAM, or similar frameworks. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 2 days ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Hi, greetings from IDESLABS. This is Navya from IDESLABS; we have a requirement on ETL Testing for one of our clients for a contract-to-hire role. Job Details: Skills: ETL Testing | Experience: 7+ Years | Location: Bangalore | Job type: Contract to Hire | Payroll company: IDESLABS | Work Model: Hybrid. JD (Primary Skill: ETL testing / Strong SQL): Looking for an ETL/DB tester with 5+ years of experience. Should have strong SQL skills. Should have hands-on coding knowledge in any scripting language. Should be able to design and write SQL queries for data validation. Should be able to verify and test ETL processes. Understanding of data warehousing concepts is a plus. Good communication skills and a testing mindset.
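The "design and write SQL queries for data validation" requirement usually means reconciling a source table against its loaded target. A minimal sketch using SQLite in place of a real warehouse; the table and column names are invented for illustration:

```python
# Typical ETL reconciliation test: compare row counts and a column checksum
# between a "source" and "target" table. SQLite stands in for the warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE src (id INTEGER, amount INTEGER)")
cur.execute("CREATE TABLE tgt (id INTEGER, amount INTEGER)")
cur.executemany("INSERT INTO src VALUES (?, ?)", [(1, 10), (2, 20), (3, 30)])
cur.executemany("INSERT INTO tgt VALUES (?, ?)", [(1, 10), (2, 20), (3, 30)])

def reconcile(table):
    # Row count plus a simple aggregate acts as a cheap fingerprint of the load.
    cur.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
    return cur.fetchone()

assert reconcile("src") == reconcile("tgt"), "source and target diverge"
print("counts and checksums match:", reconcile("src"))
```

Real suites extend this with per-column null counts, min/max ranges, and sampled row-level diffs.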

Posted 2 days ago

Apply

3.0 - 6.0 years

6 - 9 Lacs

Hyderabad

Work from Office

Naukri logo

Spark & Delta Lake: Understanding of Spark core concepts like RDDs, DataFrames, Datasets, Spark SQL and Spark Streaming. Experience with Spark optimization techniques. Deep knowledge of Delta Lake features like time travel, schema evolution and data partitioning. Ability to design and implement data pipelines using Spark with Delta Lake as the data storage layer. Proficiency in Python/Scala/Java for Spark development and integration with ETL processes. Knowledge of data ingestion techniques from various sources (flat files, CSV, API, database). Understanding of data quality best practices and data validation techniques. Other Skills: Understanding of data warehouse concepts and data modelling techniques. Expertise in Git for code management. Familiarity with CI/CD pipelines and containerization technologies. Nice to have: experience using data integration tools like DataStage/Prophecy/Informatica/Ab Initio.
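As a small illustration of the flat-file ingestion and data-validation skills listed above, here is a stdlib-only sketch; the column names and validation rule are invented, and a real pipeline would do this with Spark DataFrames rather than the `csv` module:

```python
# Hypothetical flat-file ingestion with basic schema and type validation.
import csv
import io

EXPECTED = ["id", "name", "price"]  # assumed target schema

def ingest(text):
    reader = csv.DictReader(io.StringIO(text))
    if reader.fieldnames != EXPECTED:
        raise ValueError(f"schema drift: {reader.fieldnames}")
    good, bad = [], []
    for row in reader:
        try:
            row["price"] = float(row["price"])  # type check doubles as validation
            good.append(row)
        except ValueError:
            bad.append(row)  # quarantine malformed rows instead of failing the load
    return good, bad

sample = "id,name,price\n1,widget,9.99\n2,gadget,oops\n"
good, bad = ingest(sample)
print(len(good), "valid,", len(bad), "rejected")
```

Quarantining bad rows rather than aborting mirrors the data-quality best practice the posting asks for.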

Posted 2 days ago

Apply

3.0 years

9 - 10 Lacs

Gurgaon

On-site

GlassDoor logo

About the Role: Grade Level (for internal use): 09 S&P Global Mobility The Role: ETL Developer The Team The ETL team forms an integral part of Global Data Operations (GDO) and caters to the North America & EMEA automotive business line. Core responsibilities include translating business requirements into technical design and ETL jobs along with unit testing, integration testing, regression testing, deployments & production operations. The team has an energetic and dynamic group of individuals, always looking to work through a challenge. Ownership, raising the bar and innovation is what the team runs on! The Impact The ETL team, being part of GDO, caters to the automotive business line and helps stakeholders with an optimum solution for their data needs. The role requires close coordination with global teams such as other development teams, research analysts, quality assurance analysts, architects etc. The role is vital for the automotive business as it involves providing highly efficient data solutions with high accuracy to various stakeholders. The role forms a bridge between the business and technical stakeholders. What’s in it for you Constant learning, working in a dynamic and challenging environment! Total Rewards. Monetary, beneficial, and developmental rewards! Work Life Balance. You can't do a good job if your job is all you do! Diversity & Inclusion. HeForShe! Internal Mobility. Grow with us! Responsibilities Using prior experience with file loading, cleansing and standardization, be able to translate business requirements into ETL design and efficient ETL solutions using Informatica PowerCenter (mandatory) and Talend Enterprise (preferred); knowledge of TIBCO would be a preferred skill as well. Understand relational database technologies and data warehousing concepts and processes. 
Using prior experience with high-volume data processing, be able to deal with complex technical issues. Work closely with all levels of management and employees across the Automotive business line. Participate as part of cross-functional teams responsible for investigating issues, proposing solutions and implementing corrective actions. Good communication skills are required for interfacing with various stakeholder groups; detail-oriented with analytical skills. What We’re Looking For The ETL development team within the Mobility domain is looking for a Software Engineer to work on design, development & operations efforts in the ETL (Informatica) domain. Primary Skills and qualifications required: Experience with Informatica and/or Talend ETL tools. Bachelor’s degree in Computer Science, with at least 3+ years of development and maintenance of ETL systems on Informatica PowerCenter and 1+ year of SQL experience. 3+ years of Informatica design and architecture experience and 1+ years of optimization and performance tuning of ETL code on Informatica. 1+ years of Python development experience and SQL, XML experience. Working knowledge or greater of cloud-based technologies, development, and operations is a plus. About S&P Global Mobility At S&P Global Mobility, we provide invaluable insights derived from unmatched automotive data, enabling our customers to anticipate change and make decisions with conviction. Our expertise helps them to optimize their businesses, reach the right consumers, and shape the future of mobility. We open the door to automotive innovation, revealing the buying patterns of today and helping customers plan for the emerging technologies of tomorrow. For more information, visit www.spglobal.com/mobility . What’s In It For You? Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology–the right combination can unlock possibility and change the world. 
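The file cleansing and standardization work the role describes can be illustrated with a small, hypothetical example; the field names and accepted date formats are assumptions, not this team's actual rules:

```python
# Illustrative cleansing/standardization step: trim whitespace, unify casing,
# and normalize dates to ISO format. Field names are invented.
from datetime import datetime

def standardize(record):
    out = {k: v.strip() for k, v in record.items()}
    out["make"] = out["make"].upper()
    # Accept either MM/DD/YYYY or YYYY-MM-DD input and emit ISO dates.
    for fmt in ("%m/%d/%Y", "%Y-%m-%d"):
        try:
            out["sale_date"] = datetime.strptime(out["sale_date"], fmt).date().isoformat()
            break
        except ValueError:
            continue
    return out

raw = {"make": "  toyota ", "sale_date": "06/25/2025"}
print(standardize(raw))
```

In Informatica PowerCenter this logic would live in expression and lookup transformations; the sketch just shows the intent of the mapping.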
Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress. Our People: We're more than 35,000 strong worldwide—so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in. We’re committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We’re constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference. Our Values: Integrity, Discovery, Partnership At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals. Benefits: We take care of you, so you can take care of business. We care about our people. That’s why we provide everything you—and your career—need to thrive at S&P Global. Our benefits include: Health & Wellness: Health care coverage designed for the mind and body. Flexible Downtime: Generous time off helps keep you energized for your time on. 
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills. Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs. Family Friendly Perks: It’s not just about you. S&P Global has perks for your partners and little ones, too, with some best-in class benefits for families. Beyond the Basics: From retail discounts to referral incentive awards—small perks can make a big difference. For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets. Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com . S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here . 
----------------------------------------------------------- Equal Opportunity Employer S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to: EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only: The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning) Job ID: 316976 Posted On: 2025-06-25 Location: Gurgaon, Haryana, India

Posted 2 days ago

Apply

15.0 years

0 Lacs

Bhubaneshwar

On-site

GlassDoor logo

Project Role: Advanced Application Engineer Project Role Description: Develop innovative technology solutions for emerging industries and products. Interpret system requirements into design specifications. Must-have skills: Informatica MDM Good-to-have skills: NA Minimum 3 year(s) of experience is required Educational Qualification: 15 years full-time education Summary: As an Advanced Application Engineer, you will engage in the development of innovative technology solutions tailored for emerging industries and products. Your typical day will involve interpreting system requirements and translating them into detailed design specifications, ensuring that the solutions meet the needs of the business and its clients. You will collaborate with cross-functional teams to refine these specifications and contribute to the overall success of the projects you are involved in, while also staying updated on the latest technological advancements in your field. Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation/contribution in team discussions is required. - Contribute to providing solutions to work-related problems. - Assist in the documentation of design specifications and system requirements. - Engage in continuous learning to stay abreast of industry trends and technologies. Professional & Technical Skills: - Must-Have Skills: Proficiency in Informatica MDM. - Strong understanding of data integration and data quality processes. - Experience with data modeling and metadata management. - Familiarity with ETL processes and data warehousing concepts. - Ability to troubleshoot and resolve data-related issues efficiently. Additional Information: - The candidate should have a minimum of 3 years of experience in Informatica MDM. - This position is based at our Bhubaneswar office. - A 15-year full-time education is required.

Posted 3 days ago

Apply

7.0 - 12.0 years

6 - 9 Lacs

Hyderabad

Work from Office

Naukri logo

Understanding of Spark core concepts like RDDs, DataFrames, Datasets, Spark SQL and Spark Streaming. Experience with Spark optimization techniques. Deep knowledge of Delta Lake features like time travel, schema evolution and data partitioning. Ability to design and implement data pipelines using Spark with Delta Lake as the data storage layer. Proficiency in Python/Scala/Java for Spark development and integration with ETL processes. Knowledge of data ingestion techniques from various sources (flat files, CSV, API, database). Understanding of data quality best practices and data validation techniques. Other Skills: Understanding of data warehouse concepts and data modelling techniques. Expertise in Git for code management. Familiarity with CI/CD pipelines and containerization technologies. Nice to have: experience using data integration tools like DataStage/Prophecy/Informatica/Ab Initio.

Posted 3 days ago

Apply

3.0 years

7 - 9 Lacs

Calcutta

On-site

GlassDoor logo

Line of Service Advisory Industry/Sector FS X-Sector Specialism Data, Analytics & AI Management Level Associate Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. *Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
Responsibilities: Role: Senior Associate | Exp: 3 - 6 Years | Location: Kolkata Technical Skills: · Strong expertise in Azure Databricks, Azure Data Factory (ADF), PySpark, SQL Server, and Python. · Solid understanding of Azure Functions and their application in data processing workflows. · Understanding of DevOps practices and CI/CD pipelines for data solutions. · Experience with other ETL tools such as Informatica Intelligent Cloud Services (IICS) is a plus. · Strong problem-solving skills and ability to work independently and collaboratively in a fast-paced environment. · Excellent communication skills to effectively convey technical concepts to non-technical stakeholders. Key Responsibilities: · Develop, maintain, and optimize scalable data pipelines using Azure Databricks, Azure Data Factory (ADF), and PySpark. · Collaborate with data architects and business stakeholders to translate requirements into technical solutions. · Implement and manage data integration processes using SQL Server and Python. · Design and deploy Azure Functions to support data processing workflows. · Monitor and troubleshoot data pipeline performance and reliability issues. · Ensure data quality, security, and compliance with industry standards and best practices. · Document technical specifications and maintain clear and concise project documentation. Mandatory skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark. Preferred skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark. 
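The "monitor and troubleshoot data pipeline performance and reliability" responsibility often starts with retry-and-log wrappers around pipeline steps. A minimal sketch with a simulated flaky step; everything here is illustrative, not ADF or Databricks API code:

```python
# Hypothetical retry wrapper: time each attempt, log failures, and retry
# transient errors before giving up. The flaky step is simulated.
import time

def run_with_retry(step, retries=3, delay=0.01):
    for attempt in range(1, retries + 1):
        start = time.perf_counter()
        try:
            result = step()
            print(f"attempt {attempt} ok in {time.perf_counter() - start:.4f}s")
            return result
        except RuntimeError as exc:
            print(f"attempt {attempt} failed: {exc}")
            time.sleep(delay)
    raise RuntimeError("step failed after all retries")

calls = {"n": 0}
def flaky_step():
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient source timeout")
    return "loaded 1000 rows"

result = run_with_retry(flaky_step)
print(result)
```

In ADF the equivalent knobs are activity retry counts and alerts; the sketch just shows the pattern those settings implement.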
Years of experience required: 3-6 Years Education qualification: B.E.(B.Tech)/M.E/M.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Microsoft Azure Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 11 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 3 days ago

Apply

3.0 years

7 - 9 Lacs

Calcutta

On-site

GlassDoor logo

Line of Service Advisory Industry/Sector FS X-Sector Specialism Operations Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
Job Description & Summary – Senior Associate – Azure Data Engineer Responsibilities: Role : Senior Associate Exp : 3 - 6 Years Location: Kolkata Technical Skills: · Strong expertise in Azure Databricks, Azure Data Factory (ADF), PySpark, SQL Server, and Python. · Solid understanding of Azure Functions and their application in data processing workflows. · Understanding of DevOps practices and CI/CD pipelines for data solutions. · Experience with other ETL tools such as Informatica Intelligent Cloud Services (IICS) is a plus. · Strong problem-solving skills and ability to work independently and collaboratively in a fast-paced environment. · Excellent communication skills to effectively convey technical concepts to non-technical stakeholders. Key Responsibilities: · Develop, maintain, and optimize scalable data pipelines using Azure Databricks, Azure Data Factory (ADF), and PySpark. · Collaborate with data architects and business stakeholders to translate requirements into technical solutions. · Implement and manage data integration processes using SQL Server and Python. · Design and deploy Azure Functions to support data processing workflows. · Monitor and troubleshoot data pipeline performance and reliability issues. · Ensure data quality, security, and compliance with industry standards and best practices. · Document technical specifications and maintain clear and concise project documentation. Mandatory skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark. Preferred skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark. 
Years of experience required: 3-6 Years Education qualification: B.E.(B.Tech)/M.E/M.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Microsoft Azure, PySpark Optional Skills Accepting Feedback, Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 3 days ago

Apply

5.0 - 10.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Naukri logo

SSIS Senior Developer At least 5+ years of data integration (sourcing, staging, mapping, loading, etc.) experience, SSIS preferred. Demonstrated experience with an enterprise-class integration tool such as SSIS, Informatica, Ab Initio, or DataStage. Demonstrated experience working in a team development environment using an IDE.
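The sourcing-staging-mapping-loading flow an SSIS developer builds can be sketched with SQLite standing in for SQL Server; the table names are invented, and SSIS would express the same steps as a data flow task:

```python
# Illustrative staging-then-load pattern: land raw data untouched, then cast,
# de-duplicate, and upsert into the target table (the "mapping" step).
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE stg_orders (id INTEGER, amount TEXT)")
cur.execute("CREATE TABLE dim_orders (id INTEGER PRIMARY KEY, amount REAL)")

# 1. Source -> staging: raw strings, duplicates and all.
cur.executemany("INSERT INTO stg_orders VALUES (?, ?)",
                [(1, "10.5"), (2, "20.0"), (2, "20.0")])

# 2. Staging -> target: cast types, collapse duplicates, replace on conflict.
cur.execute("""
    INSERT OR REPLACE INTO dim_orders (id, amount)
    SELECT id, CAST(amount AS REAL) FROM stg_orders GROUP BY id
""")
conn.commit()
rows = cur.execute("SELECT id, amount FROM dim_orders ORDER BY id").fetchall()
print(rows)
```

Keeping the staging table typed as text is deliberate: bad source values fail in the controlled mapping step, not during the initial land.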

Posted 3 days ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Hyderabad

Work from Office

Naukri logo

Minimum 3 to 5 years of Talend developer experience. Work on user stories and develop Talend jobs following best practices. Create detailed technical design documents for Talend job development work. Work with the SIT team and be involved in defect fixing for Talend components. Note: IBM Maximo tool knowledge would be an advantage for ConEd; otherwise it is okay.

Posted 3 days ago

Apply

3.0 - 7.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Title: Informatica IDG Specialist / Consultant-Senior Job Summary: We are looking for an experienced Informatica IDG (Data Governance ) professional to lead and support our enterprise data governance initiatives. The candidate will be responsible for configuring and deploying Informatica Axon, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM ) tools to establish robust governance, data discovery, metadata management, and regulatory compliance across the organization. Key Responsibilities: Implement and configure Informatica IDG components including Axon Data Governance, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM). Collaborate with data owners, stewards, and business users to define and maintain business glossaries, data domains, policies, and governance workflows. Integrate IDG with other platforms (IDQ, MDM, IICS, PowerCenter, Snowflake, etc.) to ensure metadata lineage and impact analysis. Design and implement data governance strategies that align with data privacy regulations (GDPR, CCPA, etc.) and internal compliance requirements. Create and maintain data lineage maps, stewardship dashboards, and data quality insights using Informatica tools. Define and enforce role-based access controls and security configurations within IDG tools. Support adoption of data governance processes, including stewardship, policy approval, and issue resolution. Train business users and data stewards on using Axon, EDC, and other governance components. Ensure the sustainability of governance programs through change management, documentation, and governance councils. 
Required Qualifications: 3-7 years of experience in Informatica Data Governance (IDG) or related tools. Strong hands-on experience with Informatica Axon, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM). Understanding of data governance frameworks, metadata management, and policy management. Familiarity with data classification, data lineage, and data stewardship workflows. Experience with metadata ingestion and cataloging across hybrid/cloud platforms. Solid SQL skills and familiarity with cloud data platforms (AWS, Azure, GCP, Snowflake, etc.). Strong communication, stakeholder engagement, and documentation skills. Preferred Qualifications: Informatica certifications in Axon, EDC, or Data Governance. Experience with Data Quality (IDQ), Master Data Management (MDM), or IICS. Knowledge in CDGC will be an added value Knowledge of industry-specific regulations and data governance mandates. Familiarity with governance best practices from DAMA-DMBOK, DCAM, or similar frameworks. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 days ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies