Home
Jobs

2510 Informatica Jobs

Filter Jobs
Min: 0 years
Max: 25 years
Min: ₹0
Max: ₹10000000
Set up a job alert
Filter
JobPe aggregates results for easy access, but applications are made directly on the original job portal.

6.0 - 8.0 years

18 - 27 Lacs

Bengaluru

Work from Office

Naukri logo

We are seeking a highly skilled and motivated Data Engineer to join our team. The ideal candidate will have a strong background in ETL development, data warehousing, and Informatica, along with proficiency in Python for data processing and automation tasks. You will be responsible for designing, developing, and maintaining scalable data pipelines and ensuring high data quality and availability across the organization.
Key Responsibilities: Design, develop, and maintain ETL workflows and data pipelines using Informatica PowerCenter/Cloud. Build and manage data warehouse solutions to support business intelligence and analytics. Collaborate with data analysts, business stakeholders, and software engineers to understand data requirements. Implement data integration processes from various sources (structured and unstructured). Develop scripts in Python to support data transformation, automation, and validation tasks. Optimize performance of data workflows and address data quality issues. Ensure data security, compliance, and governance standards are met. Monitor and troubleshoot production jobs and provide support as needed.
Required Skills and Qualifications: Bachelor's degree in Computer Science, Engineering, or a related field. 3+ years of experience in data engineering or ETL development. Proficient in Informatica (PowerCenter, Cloud, or similar tools). Strong experience with ETL processes and data warehousing concepts (star/snowflake schema, slowly changing dimensions, etc.). Hands-on programming experience in Python for data manipulation and automation. Solid understanding of SQL and experience with relational databases (e.g., Oracle, SQL Server, PostgreSQL). Experience working with large-scale data systems and performance tuning. Familiarity with cloud platforms (AWS, Azure, or GCP) is a plus.
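The posting asks for Python scripts that support data transformation and validation. Below is a minimal, hypothetical sketch of such a step; the column names and rules are illustrative assumptions, not part of the posting.

```python
# Minimal, hypothetical sketch of a Python validation/transformation step of the
# kind this posting describes. Column names and rules are assumptions.
import pandas as pd

def validate_and_transform(df: pd.DataFrame) -> pd.DataFrame:
    """Drop duplicates, reject rows missing the business key, and standardise a date column."""
    df = df.drop_duplicates(subset=["customer_id"])          # de-duplicate on the business key
    df = df[df["customer_id"].notna()]                       # reject rows with a missing key
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")  # coerce bad dates to NaT
    bad_dates = df["order_date"].isna().sum()
    if bad_dates:
        print(f"warning: {bad_dates} rows had unparseable dates")
    return df

if __name__ == "__main__":
    sample = pd.DataFrame(
        {"customer_id": [1, 1, 2, None],
         "order_date": ["2024-01-05", "2024-01-05", "not-a-date", "2024-02-01"]}
    )
    print(validate_and_transform(sample))
```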

Posted 1 day ago

Apply

12.0 - 17.0 years

7 - 12 Lacs

Chandigarh

Work from Office

Naukri logo

Job Summary: As a key contributor to our ERP Transformation Services team, the Senior ETL Data Migration Analyst is responsible for owning the design, development, and execution of enterprise-wide data migration activities. This role is instrumental in the success of global ERP implementations, primarily Oracle EBS and SAP ECC, by ensuring consistent, auditable, and high-quality data migration processes using industry-standard tools and frameworks.
In This Role, Your Responsibilities Will Be:
Pre-Go-Live: Planning & Development – Design and implement global data migration strategies for Oracle and SAP ERP projects. Develop ETL processes using Syniti DSP / SKP or an equivalent tool to support end-to-end migration. Collaborate with legacy system teams to extract and analyze source data. Build workflows for data profiling, cleansing, enrichment, and transformation. Ensure auditability and traceability of migrated data, aligned with compliance and governance standards.
Go-Live & Cutover Execution – Support mock loads, cutover rehearsals, and production data loads. Monitor data load progress and resolve issues related to performance, mapping, or data quality. Maintain a clear log of data migration actions and reconcile with source systems.
Post-Go-Live: Support & Stewardship – Monitor data creation and updates to ensure business process integrity post go-live. Provide data extract/load services for ongoing master data maintenance. Contribute to legacy data archiving strategies, tools, and execution.
Tools, Documentation & Collaboration – Maintain documentation of ETL procedures, technical specifications, and data lineage. Partner with implementation teams to translate business requirements into technical solutions. Contribute to the development and refinement of ETL frameworks and reusable components.
Travel Requirements: Willingness to travel up to 20% for project needs, primarily during key implementation phases.
Who You Are: You show a tremendous amount of initiative in tough situations; you are someone who has strong analytical and problem-solving skills. You are self-motivated, accountable, and proactive in learning and applying new technologies. You possess superb communication and collaboration skills across global teams.
For This Role, You Will Need: 12+ years of IT experience with a focus on ETL, data management, and ERP data migration. Strong hands-on experience with Oracle EBS or SAP ECC implementations. Proficiency in Syniti DSP, Informatica, Talend, or similar enterprise ETL tools. Proficient SQL skills; ability to write and optimize queries for large datasets. Demonstrable track record in data profiling, cleansing, and audit trail maintenance. Academic background in MCA / BE / BSC - Computer Science, Engineering, Information Systems, or Business Administration. Proven application development experience in .NET, ABAP, or scripting languages. Familiarity with Data Migration implementations and data modeling principles. Knowledge of project management methodologies (Agile, PMP, etc.).
Performance Indicators: Successful execution of data migration cutovers with minimal errors. Complete data traceability and audit compliance from source to target. Timely delivery of ETL solutions and reports per project phases. Continuous improvement and reuse of ETL frameworks and standard processes.
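The role stresses reconciling migrated data with source systems. A hedged sketch of a simple reconciliation check follows; sqlite3 stands in for the real legacy/target databases, and the table and column names are assumptions.

```python
# Hypothetical reconciliation sketch: compare row counts and a simple checksum
# between a "legacy" and a "target" table. sqlite3 stands in for the real
# source/target systems; table and column names are illustrative only.
import sqlite3

def table_stats(conn: sqlite3.Connection, table: str) -> tuple[int, int]:
    """Return (row_count, sum_of_amounts) as a crude reconciliation signature."""
    cur = conn.execute(f"SELECT COUNT(*), COALESCE(SUM(amount), 0) FROM {table}")
    count, total = cur.fetchone()
    return count, total

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE legacy_invoices (id INTEGER, amount INTEGER);
        CREATE TABLE target_invoices (id INTEGER, amount INTEGER);
        INSERT INTO legacy_invoices VALUES (1, 100), (2, 250);
        INSERT INTO target_invoices VALUES (1, 100), (2, 250);
        """
    )
    legacy = table_stats(conn, "legacy_invoices")
    target = table_stats(conn, "target_invoices")
    status = "MATCH" if legacy == target else "MISMATCH"
    print(f"legacy={legacy} target={target} -> {status}")
```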

Posted 1 day ago

Apply

3.0 - 6.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Naukri logo

Fusion Plus Solutions Inc is looking for a MuleSoft Developer to join our dynamic team and embark on a rewarding career journey. Ability to implement Mule-recommended best practices and solutions, meeting the acceptance criteria; Conduct reviews of developed code; End-to-end testing; Develop and guide the team members in enhancing their technical capabilities and increasing productivity; Requirement understanding; Design ETL processes using Informatica PowerCenter; Provide estimates for development; Provide on-call and after-hours support for production systems; Adaptable to a challenging environment; Gather requirements to define data definitions, transformation logic, data model logical and physical designs, data flow and process; Take ownership of projects and manage statuses and timelines effectively; Perform data analysis and data profiling against source systems; Be responsible for providing technical guidance / solutions; Unit testing covering positive & negative scenarios; Maintain up-to-date JIRA status

Posted 1 day ago

Apply

7.0 - 12.0 years

5 - 14 Lacs

Guwahati

Work from Office

Naukri logo

Job Description – Server Management & Data Protection
Work Location: Guwahati
Responsibilities: The Privacy Team @ Jio is focused on automating various aspects of the Privacy Program. The Privacy Data Protection Technology Team specifically is engaged in operationalizing state-of-the-art technologies to drive the Privacy Automation Program in Jio. Deployment and management of IT infrastructure systems including the webservers, application servers and database servers. Work with various Jio teams to set up the operating system and other infrastructure to deploy the IT infrastructure systems. Deploy various system components, including open-source technologies, to enable the environment for the setup of IT infrastructure systems. Deploy applications and provide ongoing monitoring and maintenance support. Installation, administration & maintenance of ETL (Extract-Transform-Load) solutions such as Informatica Solution components (PowerCenter, Test Data Management, Dynamic Data Management) or similar solutions. Create Rules, Policies for identifying Sensitive Data stored in Jio’s Infrastructure. Hands-on with Regular Expressions. Liaison with Application Support & DBA Team for establishing connectivity for discovery targets. Design, Configure & Implement Data Transformation Workflows. Troubleshoot and address errors encountered in scans & Data Transformation. Liaison with Product Support for troubleshooting and feature enhancement requirements. Integrate Incident Reports into the Solution’s Analytics Platform, either in-house or otherwise. Monitor & analyse Workflow Executions. Installation, administration & maintenance of DLP Solutions or other similar leak detection components, including those developed in-house. Configure Targets, Policies & perform discovery scans for identifying Personally Identifiable Information (PII) or other sensitive data. Troubleshoot and address errors encountered in scans. Liaison with Product Support for troubleshooting and feature enhancement requirements. Monitor & analyse Incidents and track to closure. Perform Rules and Policies fine-tuning to enhance productivity of the process. Perform Rules, Policies, Workflow, Data Domains & Patterns fine-tuning to enhance productivity of the process. Perform Data Classification to identify & categorize Sensitive, Personal data. Writing scripts (Shell, Python, AWK/GAWK, VBScript) for Data Discovery and Transformation. Integrate Incident Reports into the Analytics Platform. Enhance the Analytics platform to manage Incidents and generate actionable reports for management decisions. Ensuring that the technologies used sustain, and do not erode, privacy protections relating to the use, collection, and disclosure of personal information. Conducting privacy impact assessments of proposed rules on the privacy of personal information, including the type of personal information collected and the number of people affected. Work with other internal stakeholders to enable the implementation of privacy requirements within the organization. Document identified issues and discuss with business owners for their review and acceptance. Communicate issues with various stakeholders and track implementation and closure of these issues. Support periodic reporting of the issues as required by management.
Qualification: B.E./B.Tech/MCA. Work experience: 5-12 years. 5-12 years of experience in Information Security and Privacy with at least 2-3 years of experience in system administration, preferably Linux-based operating systems; experience in security technologies such as Data Leak Prevention systems or data classification, security incident handling, and similar. Systems administration experience in Data Leakage Prevention (DLP), sensitive data scanning and searching. Database administration experience including performing ETL functions. Experience in ETL tools similar to Informatica, etc. Exposure to Information Technology systems / services for Security / Privacy implementation. Exposure to technical application architecture that handles PII data. Understanding of Privacy concepts and current use of technology in the area of Privacy. Strong conceptual understanding of IT technology, systems, concepts. Good understanding of cryptographic controls for the protection of data. Good understanding of computer and networking protocols. Strong interest in security vulnerability and conceptual understanding of security vulnerabilities. Interest in project management, tracking and management activities. Interest in the areas of risk management and security management standards such as ISO 27001, ISO 22301, COBIT, PCI-DSS, and others.
Competencies/Expertise Required (Functional & Behavioral): Systematic problem-solving skills, with the ability to think critically. Excellent analytical thinking for translating data into informative visuals and reports. Adaptable to change. Quick learner – open to learning and working on new technologies and products
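The responsibilities above call for regex-based rules and Python scripting to discover sensitive data. Here is a toy sketch of that idea; the patterns are illustrative assumptions, not Jio's production rules.

```python
# Toy sketch of regex-based sensitive-data discovery of the kind described
# above. The patterns are illustrative assumptions, not production rules.
import re

PATTERNS = {
    "email":      re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "pan_number": re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"),   # Indian PAN format
    "phone_10d":  re.compile(r"\b\d{10}\b"),               # naive 10-digit phone
}

def scan(text: str) -> dict[str, list[str]]:
    """Return all matches per pattern so a reviewer can classify the findings."""
    return {name: rx.findall(text) for name, rx in PATTERNS.items() if rx.findall(text)}

if __name__ == "__main__":
    sample = "Contact jane.doe@example.com or 9876543210; PAN ABCDE1234F on file."
    for category, hits in scan(sample).items():
        print(category, "->", hits)
```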

Posted 1 day ago

Apply

5.0 - 9.0 years

12 - 16 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Naukri logo

The role involves hands-on experience with data testing, data integration, and supporting data quality in big data environments. Key responsibilities include selecting and integrating data tools and frameworks, providing technical guidance for software engineers, and collaborating with data scientists, data engineers, and other stakeholders. This role requires implementing ETL processes, monitoring performance, advising on infrastructure, and defining data retention policies. Candidates should be proficient in Python, advanced SQL, HiveQL, and Spark SQL, with hands-on experience in data testing tools like DBT, iCEDQ, QuerySurge, Denodo, or Informatica. Strong experience with NoSQL, Linux/Unix, and messaging systems (Kafka or RabbitMQ) is also required. Additional responsibilities include troubleshooting, debugging, UAT with business users in Agile environments, and automating tests to increase coverage and efficiency. Location: Chennai, Hyderabad, Pune, Kolkata, Ahmedabad, Remote
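The role centres on automated data testing. A hedged sketch of two such tests follows, written in the style of dbt's "unique" and "not_null" checks; the table and key names are assumptions, and in practice the data would come from the warehouse rather than being built inline.

```python
# Hedged sketch of automated data tests of the kind this role describes,
# modelled on dbt-style "unique" and "not_null" checks. Names are assumptions.
import pandas as pd

def load_target_table() -> pd.DataFrame:
    # Stand-in for a warehouse query such as spark.sql("SELECT * FROM dim_customer")
    return pd.DataFrame({"customer_key": [1, 2, 3], "country": ["IN", "IN", "US"]})

def test_customer_key_not_null():
    df = load_target_table()
    assert df["customer_key"].notna().all(), "found NULL customer_key values"

def test_customer_key_unique():
    df = load_target_table()
    assert not df["customer_key"].duplicated().any(), "found duplicate customer_key values"

if __name__ == "__main__":
    test_customer_key_not_null()
    test_customer_key_unique()
    print("all data tests passed")
```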

Posted 1 day ago

Apply

7.0 - 9.0 years

12 - 15 Lacs

Hyderabad

Work from Office

Naukri logo

We are seeking an experienced ETL Developer with a strong background in Python and Airflow to join our dynamic team in Hitech City, Hyderabad. The ideal candidate will have over 7 years of experience in ETL processes and data integration, with a focus on optimizing and enhancing data pipelines. While expertise in Snowflake is not mandatory, a strong understanding of RDBMS and SQL is essential.
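Since the posting centres on Python and Airflow, here is a minimal, hypothetical DAG sketch for an extract-transform-load flow (Airflow 2.x API); the task bodies and names are placeholders, not the employer's actual pipeline.

```python
# Minimal, hypothetical Airflow DAG sketch for an extract -> transform -> load
# flow of the kind this posting describes. Task bodies are placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling rows from the source RDBMS")

def transform():
    print("applying business rules / cleansing")

def load():
    print("writing curated rows to the target schema")

with DAG(
    dag_id="example_etl_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # Airflow >= 2.4 uses `schedule`; older versions use `schedule_interval`
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```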

Posted Just now

Apply

3.0 - 8.0 years

20 - 30 Lacs

Hyderabad, Pune

Hybrid

Naukri logo

Job Summary: Join our team and what we'll accomplish together. As an MDM Developer, you will be responsible for implementing and managing Master Data Management (MDM) projects. The ideal candidate will have extensive experience with Informatica MDM and proficiency in configuring MDM tools and integrating them with cloud environments. You will utilize your expertise in data engineering to build and maintain data pipelines, optimize performance, and ensure data quality. You will be working as part of a friendly, cross-discipline agile team who helps each other solve problems across all functions. As a custodian of customer trust, you will employ best practice in development, security, accessibility and design to achieve the highest quality of service for our customers. Our development team uses a range of technologies to get the job done, including ETL and data quality tools from Informatica, streaming via Apache NiFi, and Google-native tools on GCP (Dataflow, Composer, BigQuery, etc.). We also do some API design and development with Postman and Node.js. You will be part of the team building data pipelines that support our marketing, finance, campaign and Executive Leadership teams, as well as implementing Informatica Master Data Management (MDM) hosted on Amazon Web Services (AWS). Specifically, you’ll be building pipelines that support insights to enable our business partners’ analytics and campaigns. You are a fast learner, highly technical, passionate person looking to work within a team of multidisciplinary experts to improve your craft and contribute to the data development practice.
Here’s how: Learn new skills & advance your data development practice. Analyze and profile data. Design, develop, test, deploy, maintain and improve batch and real-time data pipelines. Assist with design and development of solution prototypes. Support consumers with understanding the data outcomes and technical design. Collaborate closely with multiple teams in an agile environment.
What you bring: You are a senior developer with 3+ years of experience in IT platform implementation in a technical capacity. Bachelor of Computer Science, Engineering or equivalent. Extensive experience with Informatica MDM (Multi-Domain Edition) version 10. Proficiency in MDM configuration, including Provisioning Tool, Business Entity Services, Customer 360, data modeling, match rules, cleanse rules, and metadata analysis. Expertise in configuring data models, match and merge rules, database schemas, and trust and validation settings. Understanding of data warehouses/cloud architectures and ETL processes. Working SQL knowledge and experience working with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases. Experience with the Google Cloud Platform (GCP) and its related technologies (Kubernetes, CloudSQL, PubSub, Storage, Logging, Dashboards, Airflow, BigQuery, BigTable, Python, BQ SQL, Dataplex, Datastream, etc.). Experience with Informatica IDQ/PowerCenter/IICS, Apache NiFi and other related ETL tools. Experience with Informatica MDM (preferred), but strong skills in other MDM tools are still an asset. Experience working with message queues like JMS, Kafka, PubSub. A passion for data quality.
Great-to-haves: Experience with Informatica MDM SaaS. Experience with Python and software engineering best practices. API development using Node.js and testing using Postman/SoapUI. Understanding of TMF standards.
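The posting lists "analyze and profile data" alongside BigQuery on GCP. A hedged sketch of a small profiling query via the official google-cloud-bigquery client follows; it assumes GCP credentials are configured, and the dataset/table and column names are hypothetical.

```python
# Hedged sketch of a null-rate profiling query against BigQuery, the kind of
# data-profiling task the posting mentions. Requires google-cloud-bigquery and
# GCP credentials; the table reference is a placeholder.
from google.cloud import bigquery

def profile_null_rate(table: str, column: str) -> float:
    client = bigquery.Client()
    sql = f"""
        SELECT COUNTIF({column} IS NULL) / COUNT(*) AS null_rate
        FROM `{table}`
    """
    row = next(iter(client.query(sql).result()))  # first (and only) result row
    return row.null_rate

if __name__ == "__main__":
    # Hypothetical table; replace with a real project.dataset.table reference.
    rate = profile_null_rate("my_project.crm.customers", "email")
    print(f"email null rate: {rate:.1%}")
```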

Posted Just now

Apply

8.0 - 11.0 years

8 - 13 Lacs

Gurgaon

On-site

GlassDoor logo

You are as unique as your background, experience and point of view. Here, you’ll be encouraged, empowered and challenged to be your best self. You'll work with dynamic colleagues - experts in their fields - who are eager to share their knowledge with you. Your leaders will inspire and help you reach your potential and soar to new heights. Every day, you'll have new and exciting opportunities to make life brighter for our Clients - who are at the heart of everything we do. Discover how you can make a difference in the lives of individuals, families and communities around the world. Job Description: Are you ready to shine? At Sun Life, we empower you to be your most brilliant self. Who we are? Sun Life is a leading financial services company with 157 years of history that helps our clients achieve lifetime financial security and live healthier lives. We serve millions in Canada, the U.S., Asia, the U.K., and other parts of the world. We have a network of Sun Life advisors, third-party partners, and other distributors. Through them, we’re helping set our clients free to live their lives their way, from now through retirement. We’re working hard to support their wellness and health management goals, too. That way, they can enjoy what matters most to them. And that’s anything from running a marathon to helping their grandchildren learn to ride a bike. To do this, we offer a broad range of protection and wealth products and services to individuals, businesses, and institutions, including: Insurance. Life, health, wellness, disability, critical illness, stop-loss, and long-term care insurance Investments. Mutual funds, segregated funds, annuities, and guaranteed investment products Advice. Financial planning and retirement planning services Asset management. Pooled funds, institutional portfolios, and pension funds With innovative technology, a strong distribution network and long-standing relationships with some of the world’s largest employers, we are today providing financial security to millions of people globally. Sun Life is a leading financial services company that helps our clients achieve lifetime financial security and live healthier lives, with strong insurance, asset management, investments, and financial advice portfolios. At Sun Life, our asset management business draws on the talent and experience of professionals from around the globe. SunLife Global Solutions (SLGS) provides Technology and Business Services to Sun Life businesses globally. Started in 2006, since then SLGS has achieved scale, growth, and operational maturity. We’ve also institutionalized Sun Life's global standards by integrating closely with their corporate functions. Asia Service Centres today has more than 1100 Engineers, 200+ AWS professionals, and a group of Certified Actuaries. They deliver various complex business solutions. The current workforce comprises of 75% Gen Y. We’re proud to be a young, bustling organization. The SLGS’s core digital competencies are cloud, mobile, data analytics, visualization, and RPA. In the past three years, we have launched over 20 digital assets, including the Sun Life Mobile Application, Conversational AI bots and many cutting-edge solutions. We’ve evolved our capability in information technology, business processing, investment research, and enterprise infrastructure to Sun Life businesses around the globe. Digital transformation, for us, is not just about technology advancement and application, but also about enabling business strategy. 
It's about building new business models, enhancing operational and value-chain efficiency, and creating best-in-class experiences. It’s also about building a digital culture and mindset. We enable all this with the latest technologies, data-driven insights, skillsets, talent and change frameworks. We are constantly expanding our strength in Information Technology and are looking for fresh talent who can bring ideas and values aligning with our Digital strategy. Our Client Impact strategy is motivated by the need to create an inclusive culture, empowered by highly engaged people. We are entering a new world that focuses on doing purpose-driven work. The kind that fills your day with excitement and determination, because when you love what you do, it never feels like work. We want to create an environment where you feel empowered to take action and are surrounded by people who challenge you, support you and inspire you to become the best version of yourself. As an employer, we not only want to attract top talent, but we want you to have the best Sun Life Experience. We strive to Shine Together, Make Life Brighter & Shape the Future!
What will you do? Responsible for building the technical product and ensuring that it works end-to-end, from low-level design to code. A strong ETL developer who has owned or played a pivotal role in Platform & Technology migration work in the past. He/she is a quick learner who adapts to new technologies & frameworks and has a good learning attitude. We are looking for an ETL Professional with a strong background in design and development using MS SQL and Informatica PowerCenter. The candidate should have hands-on exposure to design and development of user interfaces in ETL. The successful candidate will be an adaptable individual who enjoys driving projects to successful completion. This position requires a mix of techno-functional skills on a platform team that supports a vendor project. The role will work closely with Canadian IT teams using agile methodologies, delivering ETL solutions.
Key responsibilities: An expert in solution design with the ability to see the big picture across the portfolio; providing guidance and governance for the analysis, solution design, development and implementation of projects. A strategic thinker who will be responsible for the technical strategy within the portfolio, ensuring it aligns with the overall architecture roadmap and business strategy. An effective communicator who will utilize their technical/business knowledge to lead technical discussions with project teams, business sponsors, and external technical partners in a manner that ensures understanding of risks, options, and overall solutions. An individual with the ability to effectively consult/support technical discussions with Account Delivery and Business Sponsors, ensuring alignment to both technology and business visions. Collaborate with Designers, Business System Analysts, Application Analysts and Testing specialists to deliver high-quality solutions. Able to prepare high-level and detailed-level designs based on technical/business requirements and defined architectures and maintain documentation. Has been instrumental in platform migration work and technical migration work in the past and understands the intricacies involved.
Analyze, define and document requirements for data, workflow, logical processes, interface design, internal and external checks, controls, and outputs. Ensure information security standards and requirements are incorporated into all solutions. Stay current with trends in emerging technologies and how they could apply to Sun Life.
Key experience: A Bachelor’s or Master’s degree in Computer Science or a related field. 8-11 years of progressive information technology experience with the full application development life cycle. Domain knowledge of Insurance and Retail Wealth Management. Experience in Informatica PowerCenter / IDMC development. Experience applying various Informatica transformations and different types of sources. Ability to write complex T-SQL, stored procedures, and views. Experience in SQL Server 2014 and above. Exposure to DevOps and API architecture. Should have experience leading small teams (5-8 developers). Good knowledge and experience of Java 1.8 or above. Experience in PostgreSQL and NoSQL databases like MongoDB. Good knowledge of coding best practices and the ability to do code reviews of peers. Produce clean, efficient code based on specifications and troubleshoot, debug and upgrade existing software.
Primary Location: IN-Haryana – Gurgaon. Schedule: Full-time. Job Category: IT - Digital Development. Posting End Date: 26/06/2025
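The role asks for complex T-SQL and stored procedures. As a hedged illustration (in Python, the language used for the sketches on this page), here is one way a load procedure might be invoked from a script via pyodbc; the server, database, and procedure names are placeholders, not Sun Life's.

```python
# Hedged sketch of invoking a SQL Server stored procedure from Python via
# pyodbc, reflecting the T-SQL / stored-procedure work the posting mentions.
# Server, database, and procedure names are hypothetical placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver;DATABASE=etl_staging;Trusted_Connection=yes;"
)

def run_load_proc(batch_date: str) -> int:
    """Call a hypothetical load procedure and return the number of affected rows."""
    with pyodbc.connect(CONN_STR) as conn:
        cursor = conn.cursor()
        cursor.execute("{CALL dbo.usp_load_policy_facts (?)}", batch_date)
        conn.commit()
        return cursor.rowcount

if __name__ == "__main__":
    print("rows affected:", run_load_proc("2025-06-01"))
```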

Posted 2 hours ago

Apply

5.0 years

3 - 17 Lacs

Pune

Remote

GlassDoor logo

We're Hiring: SAP Specialist Experience: 5+ Years Location: remote About the Role: We are seeking a skilled SAP MDM Specialist to join our team and lead the management and governance of master data within our SAP landscape (S/4HANA). The ideal candidate will have strong technical knowledge, hands-on experience in SAP MDM/MDG, and the ability to collaborate with cross-functional teams to drive data integrity and quality. Key Responsibilities: Maintain and govern master data (customer, vendor, material, etc.) in SAP Ensure data accuracy, consistency, and compliance across systems Define data standards and governance frameworks Collaborate with business units to gather data requirements and ensure process alignment Configure and support SAP MDM/MDG workflows Troubleshoot data issues and support system enhancements Drive data migration and integration efforts during system implementations Monitor data quality KPIs and support audits Required Qualifications: 5+ years in SAP MDM/MDG with ECC and S/4HANA Expertise in data governance, data cleansing, and data lifecycle management Strong understanding of SAP modules (MM, SD, FI) Hands-on with SAP Data Services, LSMW, SQL, and Excel Bachelor's degree in IT, Data Management, or related field Preferred: SAP Certification (MDM/MDG) Industry experience in manufacturing, retail, or finance (beverage industry is a plus) Familiarity with data migration tools (Informatica, etc.) Soft Skills: High attention to detail and accuracy Strong analytical and communication skills Ability to manage multiple priorities and work independently Interested candidates may share their resume at [ hr@irizpro.com ] or apply directly through this post. Let’s build a data-driven future together. #SAPJobs #SAPMDM #SAPS4HANA #DataGovernance #MasterData #ETLJobs #HiringNow #TechCareers Job Type: Contractual / Temporary Contract length: 4 months Pay: ₹370,538.09 - ₹1,706,746.44 per year Schedule: Day shift Morning shift Work Location: Remote

Posted 2 hours ago

Apply

5.0 years

5 - 10 Lacs

Bengaluru

On-site

GlassDoor logo

Job requisition ID :: 84163 Date: Jun 23, 2025 Location: Bengaluru Designation: Senior Consultant Entity: We are seeking a Senior Data Engineer with extensive experience in cloud platforms and data engineering tools, with a strong emphasis on Databricks. The ideal candidate will have deep expertise in designing and optimizing data pipelines, building scalable ETL workflows, and leveraging Databricks for advanced analytics and data processing. Experience with Google Cloud Platform is beneficial, particularly in integrating Databricks with cloud storage solutions and data warehouses such as BigQuery. The candidate should have a proven track record of working on data enablement projects across various data domains and be well-versed in the Data as a Product approach, ensuring data solutions are scalable, reusable, and aligned with business needs. Key Responsibilities: Design, develop, and optimize scalable data pipelines using Databricks, ensuring efficient data ingestion, transformation, and processing. Implement and manage data storage solutions, including Delta Tables for structured storage and seamless data versioning. 5+ years of experience with cloud data services, with a strong focus on Databricks and its integration with Google Cloud Platform storage and analytics tools such as BigQuery. Leverage Databricks for advanced data processing, including the development and optimization of data workflows, Delta Live Tables, and ML-based data transformations. Monitor and optimize Databricks performance, focusing on cluster configurations, resource utilization, and Delta Table performance tuning. Collaborate with cross-functional teams to drive data enablement projects, ensuring scalable, reusable, and efficient solutions using Databricks. Apply the Data as a Product / Data as an Asset approach, ensuring high data quality, accessibility, and usability within Databricks environments. 5+ years of experience with analytical software and languages, including Spark (Databricks Runtime), Python, and SQL for data engineering and analytics. Should have strong expertise in Data Structures and Algorithms (DSA) and problem-solving, enabling efficient design and optimization of data workflows. Experienced in CI/CD pipelines using GitHub for automated data pipeline deployments within Databricks. Experienced in Agile/Scrum environments, contributing to iterative development processes and collaboration within data engineering teams. Experience in Data Streaming is a plus, particularly leveraging Kafka or Spark Structured Streaming within Databricks. Familiarity with other ETL/ELT tools is a plus, such as Qlik Replicate, SAP Data Services, or Informatica, with a focus on integrating these with Databricks. Qualifications: A Bachelor’s or Master’s degree in Computer Science, Engineering, or a related discipline. Over 5 years of hands-on experience in data engineering or a closely related field. Proven expertise in AWS and Databricks platforms. Advanced skills in data modeling and designing optimized data structures. Knowledge of Azure DevOps and proficiency in Scrum methodologies. Exceptional problem-solving abilities paired with a keen eye for detail. Strong interpersonal and communication skills for seamless collaboration. A minimum of one certification in AWS or Databricks, such as Cloud Engineering, Data Services, Cloud Practitioner, Certified Data Engineer, or an equivalent from reputable MOOCs.
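The posting's core pattern is curating data into Delta tables on Databricks. Below is a minimal PySpark sketch of that pattern; it assumes a Databricks runtime (or a local Spark session with delta-spark configured), and the schema and table name are illustrative.

```python
# Minimal PySpark sketch of writing curated data to a Delta table, the core
# Databricks pattern this posting describes. Table name and schema are assumptions.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta_ingest_sketch").getOrCreate()

raw = spark.createDataFrame(
    [(1, "IN", 120.0), (2, "US", 75.5)],
    schema="customer_id INT, country STRING, spend DOUBLE",
)

curated = raw.filter("spend > 0")  # placeholder for real cleansing/transformation logic

# On Databricks, saveAsTable with format("delta") creates a managed Delta table.
curated.write.format("delta").mode("overwrite").saveAsTable("analytics.customer_spend")
```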

Posted 2 hours ago

Apply

1.0 - 9.0 years

5 - 8 Lacs

Bengaluru

On-site

GlassDoor logo

Job requisition ID :: 84728 Date: Jun 22, 2025 Location: Bengaluru Designation: Consultant Entity: Technology & Transformation-EAD: ETL Testing-Analyst/Consultant/Senior Consultant
Your potential, unleashed. India’s impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realize your potential amongst cutting-edge leaders and organizations shaping the future of the region, and indeed, the world beyond. At Deloitte, bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.
The Team: Deloitte’s Technology & Transformation practice can help you uncover and unlock the value buried deep inside vast amounts of data. Our global network provides strategic guidance and implementation services to help companies manage data from disparate sources and convert it into accurate, actionable information that can support fact-driven decision-making and generate an insight-driven advantage. Our practice addresses the continuum of opportunities in business intelligence & visualization, data management, performance management and next-generation analytics and technologies, including big data, cloud, cognitive and machine learning.
Your work profile: As an Analyst/Consultant/Senior Consultant in our T&T Team, you’ll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations: Develop and execute automated test cases for ETL processes. Validate data transformation, extraction, and loading accuracy. Collaborate with data engineers and QA teams to understand ETL workflows. Identify and document defects and inconsistencies. Maintain test documentation and support manual testing efforts. Design and implement automated ETL test scripts and frameworks. Validate end-to-end data flows and transformation logic. Collaborate with data architects, developers, and QA teams. Integrate ETL testing into CI/CD pipelines where applicable. Analyze test results and troubleshoot data issues. Lead the architecture and development of advanced ETL automation frameworks. Drive best practices in ETL testing and data quality assurance. Mentor and guide junior consultants and analysts. Collaborate with stakeholders to align testing strategies with business goals. Integrate ETL testing within DevOps and CI/CD pipelines.
Desired Qualifications: 1 to 9 years of experience in ETL testing and automation. Knowledge of ETL tools such as Informatica, Talend, or DataStage. Experience with SQL and database querying. Basic scripting or programming skills (Python, Shell, etc.). Good analytical and communication skills. Strong SQL skills and experience with ETL tools like Informatica, Talend, or DataStage. Proficiency in scripting languages for automation (Python, Shell, etc.). Knowledge of data warehousing concepts and best practices. Strong problem-solving and communication skills. Expert knowledge of ETL tools and strong SQL proficiency. Experience with automation scripting and data validation techniques. Strong leadership, communication, and stakeholder management skills. Familiarity with big data technologies and cloud platforms is a plus.
Location and way of working: Base location: Bangalore. This profile involves occasional travelling to client locations. Hybrid is our default way of working. Each domain has customized the hybrid approach to their unique needs.
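Validating transformation logic end-to-end is central to the profile above. A hedged sketch follows: recompute a derived column from the source and compare it with what the target holds. sqlite3 and the column names are stand-ins for the real warehouse and mapping spec.

```python
# Hedged sketch of an end-to-end transformation-logic check of the kind the
# ETL-testing profile above describes. Tables and the derivation rule are
# illustrative assumptions.
import sqlite3

def mismatched_rows(conn: sqlite3.Connection) -> list[tuple]:
    """Rows where target.full_name differs from the rule first_name || ' ' || last_name."""
    sql = """
        SELECT s.id, s.first_name, s.last_name, t.full_name
        FROM source_customers s
        JOIN target_customers t ON t.id = s.id
        WHERE t.full_name <> s.first_name || ' ' || s.last_name
    """
    return conn.execute(sql).fetchall()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.executescript(
        """
        CREATE TABLE source_customers (id INT, first_name TEXT, last_name TEXT);
        CREATE TABLE target_customers (id INT, full_name TEXT);
        INSERT INTO source_customers VALUES (1, 'Asha', 'Rao'), (2, 'Vikram', 'Singh');
        INSERT INTO target_customers VALUES (1, 'Asha Rao'), (2, 'Vikram S.');
        """
    )
    bad = mismatched_rows(conn)
    print("transformation defects:", bad if bad else "none")
```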
How you’ll grow Connect for impact Our exceptional team of professionals across the globe are solving some of the world’s most complex business problems, as well as directly supporting our communities, the planet, and each other. Know more in our Global Impact Report and our India Impact Report. Empower to lead You can be a leader irrespective of your career level. Our colleagues are characterised by their ability to inspire, support, and provide opportunities for people to deliver their best and grow both as professionals and human beings. Know more about Deloitte and our One Young World partnership. Inclusion for all At Deloitte, people are valued and respected for who they are and are trusted to add value to their clients, teams and communities in a way that reflects their own unique capabilities. Know more about everyday steps that you can take to be more inclusive. At Deloitte, we believe in the unique skills, attitude and potential each and every one of us brings to the table to make an impact that matters. Drive your career At Deloitte, you are encouraged to take ownership of your career. We recognise there is no one size fits all career path, and global, cross-business mobility and up / re-skilling are all within the range of possibilities to shape a unique and fulfilling career. Know more about Life at Deloitte. Everyone’s welcome… entrust your happiness to us Our workspaces and initiatives are geared towards your 360-degree happiness. This includes specific needs you may have in terms of accessibility, flexibility, safety and security, and caregiving. Here’s a glimpse of things that are in store for you. Interview tips We want job seekers exploring opportunities at Deloitte to feel prepared, confident and comfortable. To help you with your interview, we suggest that you do your research, know some background about the organisation and the business area you’re applying to. Check out recruiting tips from Deloitte professionals.

Posted 2 hours ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Digital Risk – Manager - Data Governance Key Responsibilities: The purpose of this role will be to build & enhance the data governance capabilities and supervise delivery, provide technical and project leadership to your team members, as well as build relationships with clients. While delivering quality client services and enabling high-performing teams, you will drive high-value work products within expected timeframes and budget. You will monitor progress, manage risks and ensure key stakeholders are kept informed about progress and expected outcomes. Additionally, you will: Foster an innovative and inclusive team-oriented work environment. Play an active role in counselling and mentoring junior consultants within the firm. Consistently deliver quality client services. Drive high-quality work products within expected timeframes and on budget. Monitor progress manage risk and ensure key stakeholders are kept informed about progress and expected outcomes. Use knowledge of the current IT environment and industry trends to identify engagement and client service issues and communicate this information to the engagement team and client management through written correspondence and verbal presentations. Stay abreast of current business and industry trends relevant to the client's business. Foster relationships with client personnel to analyse, evaluate, and enhance information systems to develop and improve security at procedural and technology levels. Assist with cultivating and managing business development opportunities. Understand EY and its service lines and actively assess/present ways to serve clients. Demonstrate deep technical capabilities and professional knowledge. Demonstrate ability to quickly assimilate to new knowledge. Qualifications: Bachelor’s degree in Computer Science / Master's degree Computer Science, Information Management, Business Administration or a related field. Proven experience (6+ years) in data governance with hands-on experience with multiple disciplines within data management, such as Master Data Management Data Security Metadata Management Data Quality Management Business Intelligence Practical experience with DG, DMQ and MDM tools such as Informatica, Collibra, MS Purview and Databricks Unity Catalog. Good communication skills in English, both written and oral Ability to work collaboratively with stakeholders. Relevant certifications such as DAMA and DCAM EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 hours ago

Apply

12.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Linkedin logo

Job Description: About Us At Bank of America, we are guided by a common purpose to help make financial lives better through the power of every connection. Responsible Growth is how we run our company and how we deliver for our clients, teammates, communities, and shareholders every day. One of the keys to driving Responsible Growth is being a great place to work for our teammates around the world. We’re devoted to being a diverse and inclusive workplace for everyone. We hire individuals with a broad range of backgrounds and experiences and invest heavily in our teammates and their families by offering competitive benefits to support their physical, emotional, and financial well-being. Bank of America believes both in the importance of working together and offering flexibility to our employees. We use a multi-faceted approach for flexibility, depending on the various roles in our organization. Working at Bank of America will give you a great career with opportunities to learn, grow and make an impact, along with the power to make a difference. Join us! Global Business Services Global Business Services delivers Technology and Operations capabilities to Lines of Business and Staff Support Functions of Bank of America through a centrally managed, globally integrated delivery model and globally resilient operations. Global Business Services is recognized for flawless execution, sound risk management, operational resiliency, operational excellence and innovation. In India, we are present in five locations and operate as BA Continuum India Private Limited (BACI), a non-banking subsidiary of Bank of America Corporation and the operating company for India operations of Global Business Services. Process Overview* Employee experience technology, designs and delivers modern technology solutions for all teammates globally to interact, perform in their roles and service critical staff support organizations including Chief Administrative Office, Global Strategy & Enterprise Platforms, Global Human Resources, Corporate Audit & Credit Review, and Legal. Legal Technology enables modern practice of law through technology transformation and is responsible for delivering strategic technology solutions to the Legal Department and Office of the Corporate. Job Description* We are seeking a Senior Engineer to lead the architecture, design, and development of complex data solutions. The role requires the individual to be hands-on and as well collaborate with stakeholders to drive strategic design decisions. The ideal candidate has a proven track record in leading design and delivery including integration, data cleaning, transformation and control of data in operational and analytical data systems. Responsibilities* Codes complex solutions to integrate, clean, transform, and control data, builds processes supporting data transformation, data structures, metadata, data quality controls, dependency, and workload management, assembles complex data sets, and communicates required information for deployment. Leads documentation of system requirements, collaborates with development teams to understand data requirements and feasibility, and leverages architectural components to develop client requirements. Leads complex information technology projects to ensure on-time delivery and adherence to release processes and risk management and defines and builds data pipelines to enable data-informed decision making. 
Assembles large, complex data sets that meet functional and non-functional requirements, ensuring that the design and engineering approach is consistent across multiple systems. Maintains, improves, cleans, and manipulates large data for operational and analytics data systems, builds complex processes supporting data transformation, data structures, metadata, data quality controls, dependency, and workload management, and communicates required information for deployment, maintenance, and support of business functionality. Utilizes multiple architectural components in the design and development of client requirements and collaborates with development teams to understand data requirements and ensure the data architecture is feasible to implement. Defines and builds data pipelines to enable data-informed decision making, ensuring adherence to release processes and risk management routines. Leads the identification of gaps in data management standards adherence and works with appropriate partners to develop plans to close gaps, leading concept testing and conducting research to prototype toolsets and improve existing processes. Designs, develops, and maintains innovative automated reports, dashboards, and scorecards using Business Intelligence tool Tableau. Analyzes disparate database sources including relational structures, dimensional data models, and cubes. Designs and builds relational data models to support the development of actionable reports, dashboards, and scorecards. Design, develop, conducts unit testing, and maintains complex Tableau reports for scalability, manageability, extensibility, performance, and re-use. Determine the best implementation that will meet the design of the architect. Requirements* Education* Graduation / Post Graduation: BE/B.Tech/MCA Certifications If Any: NA Experience Range* 12-16 Years Foundational Skills* 12+ years of experience with strong knowledge of core BI skills and platform administration. 12+ years of experience in data engineering and data architecture. Extensive hands-on experience with SQL, Tableau, ETL Tools (Informatica or SSIS). Experience in designing Tableau dashboards and visualizations that align to business requirements. Experience working with technology and data teams with good analytical and problem-solving skills. Effective communication, Strong stakeholder engagement skills, Proven ability in leading and mentoring a team of software engineers in a dynamic environment. Desired Skills* Experience in Big Data technologies, Azure data factory, AI/ML is a big plus. Work Timings* 11:30 AM to 8:30 PM IST Job Location* Hyderabad
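The role builds "data quality controls" over relational and dimensional models. As a hedged illustration, the sketch below flags fact rows whose dimension key has no match (a referential-integrity check); the frame contents and key names are assumptions.

```python
# Hedged sketch of a data quality control of the kind the role describes:
# flag fact rows whose dimension key has no match. Names are illustrative.
import pandas as pd

def orphan_facts(facts: pd.DataFrame, dim: pd.DataFrame, key: str) -> pd.DataFrame:
    """Return fact rows whose key is absent from the dimension table."""
    merged = facts.merge(dim[[key]], on=key, how="left", indicator=True)
    return merged[merged["_merge"] == "left_only"].drop(columns="_merge")

if __name__ == "__main__":
    facts = pd.DataFrame({"account_key": [10, 11, 12], "balance": [500, 750, 900]})
    dim = pd.DataFrame({"account_key": [10, 11]})
    orphans = orphan_facts(facts, dim, "account_key")
    print(f"{len(orphans)} orphan fact rows\n{orphans}")
```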

Posted 3 hours ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Digital Risk – Manager - Data Governance Key Responsibilities: The purpose of this role will be to build & enhance the data governance capabilities and supervise delivery, provide technical and project leadership to your team members, as well as build relationships with clients. While delivering quality client services and enabling high-performing teams, you will drive high-value work products within expected timeframes and budget. You will monitor progress, manage risks and ensure key stakeholders are kept informed about progress and expected outcomes. Additionally, you will: Foster an innovative and inclusive team-oriented work environment. Play an active role in counselling and mentoring junior consultants within the firm. Consistently deliver quality client services. Drive high-quality work products within expected timeframes and on budget. Monitor progress manage risk and ensure key stakeholders are kept informed about progress and expected outcomes. Use knowledge of the current IT environment and industry trends to identify engagement and client service issues and communicate this information to the engagement team and client management through written correspondence and verbal presentations. Stay abreast of current business and industry trends relevant to the client's business. Foster relationships with client personnel to analyse, evaluate, and enhance information systems to develop and improve security at procedural and technology levels. Assist with cultivating and managing business development opportunities. Understand EY and its service lines and actively assess/present ways to serve clients. Demonstrate deep technical capabilities and professional knowledge. Demonstrate ability to quickly assimilate to new knowledge. Qualifications: Bachelor’s degree in Computer Science / Master's degree Computer Science, Information Management, Business Administration or a related field. Proven experience (6+ years) in data governance with hands-on experience with multiple disciplines within data management, such as Master Data Management Data Security Metadata Management Data Quality Management Business Intelligence Practical experience with DG, DMQ and MDM tools such as Informatica, Collibra, MS Purview and Databricks Unity Catalog. Good communication skills in English, both written and oral Ability to work collaboratively with stakeholders. Relevant certifications such as DAMA and DCAM EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 hours ago

Apply

8.0 - 11.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Title: Informatica MDM Specialist-Manager Job Summary: We are looking for a skilled Informatica MDM Specialist . The candidate will have hands-on experience implementing and maintaining Master Data Management solutions using Informatica MDM (Customer 360, Supplier 360, Product 360 and MDM Hub) . This role involves architecting and developing MDM solutions, managing data quality, and ensuring data governance practices across the enterprise. Key Responsibilities: Design, develop, and implement end-to-end MDM solutions using Informatica MDM platform. Configure Data Models, Match & Merge rules, Hierarchies, Trust Framework, and workflows. Collaborate with business stakeholders, data architects, and developers to gather and analyse requirements. Perform data profiling, cleansing, standardization, and validation for master data domains. Implement data governance and stewardship workflows for maintaining data quality. Monitor MDM performance, manage error handling and system tuning. Prepare and maintain technical documentation, deployment guides, and support materials. Provide technical support and troubleshooting during and post-deployment. Stay up to date with Informatica MDM product updates, industry trends, and best practices. Required Qualifications: 8-11 years of experience in Informatica MDM development and implementation. Strong understanding of MDM architecture, data modelling, and metadata management. Hands-on experience with Informatica MDM Hub, e360, IDD, SIF, MDM Provisioning Tool, and ETL/ELT. Experience with data quality tools (Informatica DQ or others) and MDM integration patterns. Understanding of data governance principles and master data domains (customer, product, vendor, etc.). Strong analytical and problem-solving skills. Excellent communication and stakeholder engagement skills. Preferred Qualifications: Informatica MDM certification(s). Experience with IDMC MDM – MDM SaaS Familiarity with data governance platforms (e.g., Collibra, Informatica Axon). Exposure to Agile/Scrum delivery methodologies. Experience in large-scale MDM implementations in domains like Retail, Manufacturing, Healthcare, or BFSI. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
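The posting centres on configuring match-and-merge rules. The toy sketch below only illustrates the idea behind such rules (fuzzy name matching above a threshold to propose candidate merges); it is not how Informatica MDM works internally, and the records and threshold are assumptions.

```python
# Toy illustration of the idea behind MDM match rules: fuzzy name matching to
# propose candidate duplicate pairs. Not Informatica MDM's engine; records and
# the threshold are illustrative assumptions.
from difflib import SequenceMatcher

records = [
    {"id": 1, "name": "Acme Industries Ltd", "source": "CRM"},
    {"id": 2, "name": "ACME Industries Limited", "source": "ERP"},
    {"id": 3, "name": "Zenith Traders", "source": "CRM"},
]

def similarity(a: str, b: str) -> float:
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def match_pairs(recs, threshold: float = 0.85):
    """Return candidate duplicate pairs whose name similarity exceeds the threshold."""
    pairs = []
    for i in range(len(recs)):
        for j in range(i + 1, len(recs)):
            score = similarity(recs[i]["name"], recs[j]["name"])
            if score >= threshold:
                pairs.append((recs[i]["id"], recs[j]["id"], round(score, 2)))
    return pairs

if __name__ == "__main__":
    print("candidate merges:", match_pairs(records))
```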

Posted 3 hours ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Digital Risk – Manager - Data Governance Key Responsibilities: The purpose of this role will be to build & enhance the data governance capabilities and supervise delivery, provide technical and project leadership to your team members, as well as build relationships with clients. While delivering quality client services and enabling high-performing teams, you will drive high-value work products within expected timeframes and budget. You will monitor progress, manage risks and ensure key stakeholders are kept informed about progress and expected outcomes. Additionally, you will: Foster an innovative and inclusive team-oriented work environment. Play an active role in counselling and mentoring junior consultants within the firm. Consistently deliver quality client services. Drive high-quality work products within expected timeframes and on budget. Monitor progress manage risk and ensure key stakeholders are kept informed about progress and expected outcomes. Use knowledge of the current IT environment and industry trends to identify engagement and client service issues and communicate this information to the engagement team and client management through written correspondence and verbal presentations. Stay abreast of current business and industry trends relevant to the client's business. Foster relationships with client personnel to analyse, evaluate, and enhance information systems to develop and improve security at procedural and technology levels. Assist with cultivating and managing business development opportunities. Understand EY and its service lines and actively assess/present ways to serve clients. Demonstrate deep technical capabilities and professional knowledge. Demonstrate ability to quickly assimilate to new knowledge. Qualifications: Bachelor’s degree in Computer Science / Master's degree Computer Science, Information Management, Business Administration or a related field. Proven experience (6+ years) in data governance with hands-on experience with multiple disciplines within data management, such as Master Data Management Data Security Metadata Management Data Quality Management Business Intelligence Practical experience with DG, DMQ and MDM tools such as Informatica, Collibra, MS Purview and Databricks Unity Catalog. Good communication skills in English, both written and oral Ability to work collaboratively with stakeholders. Relevant certifications such as DAMA and DCAM EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 hours ago

Apply

8.0 - 11.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Title: Informatica IDG Specialist / Consultant-Senior Job Summary: We are looking for an experienced Informatica IDG (Data Governance) professional to lead and support our enterprise data governance initiatives. The candidate will be responsible for configuring and deploying Informatica Axon, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM) tools to establish robust governance, data discovery, metadata management, and regulatory compliance across the organization. Key Responsibilities: Implement and configure Informatica IDG components including Axon Data Governance, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM). Collaborate with data owners, stewards, and business users to define and maintain business glossaries, data domains, policies, and governance workflows. Integrate IDG with other platforms (IDQ, MDM, IICS, PowerCenter, Snowflake, etc.) to ensure metadata lineage and impact analysis. Design and implement data governance strategies that align with data privacy regulations (GDPR, CCPA, etc.) and internal compliance requirements. Create and maintain data lineage maps, stewardship dashboards, and data quality insights using Informatica tools. Define and enforce role-based access controls and security configurations within IDG tools. Support adoption of data governance processes, including stewardship, policy approval, and issue resolution. Train business users and data stewards on using Axon, EDC, and other governance components. Ensure the sustainability of governance programs through change management, documentation, and governance councils. Required Qualifications: 8-11 years of experience in Informatica Data Governance (IDG) or related tools. Strong hands-on experience with Informatica Axon, Enterprise Data Catalog (EDC), and Data Privacy Management (DPM). Understanding of data governance frameworks, metadata management, and policy management. Familiarity with data classification, data lineage, and data stewardship workflows. Experience with metadata ingestion and cataloging across hybrid/cloud platforms. Solid SQL skills and familiarity with cloud data platforms (AWS, Azure, GCP, Snowflake, etc.). Strong communication, stakeholder engagement, and documentation skills. Preferred Qualifications: Informatica certifications in Axon, EDC, or Data Governance. Experience with Data Quality (IDQ), Master Data Management (MDM), or IICS. Knowledge in CDGC will be an added value Knowledge of industry-specific regulations and data governance mandates. Familiarity with governance best practices from DAMA-DMBOK, DCAM, or similar frameworks. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 hours ago

Apply

3.0 - 7.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Title: Informatica IDQ/CDQ Developer / Data Quality Specialist-Senior

Job Summary: We are seeking a highly skilled Informatica DQ (Data Quality)/Cloud Data Quality Developer. The candidate will be responsible for designing, developing, and deploying data quality solutions using Informatica IDQ/CDQ to ensure accurate, complete, and reliable data across the enterprise. This role involves close collaboration with data stewards, business analysts, data engineers, and governance teams to define and enforce data quality standards, rules, and processes.

Key Responsibilities:
• Design and implement data quality rules, scorecards, and dashboards using Informatica DQ.
• Perform data profiling, data cleansing, standardization, parsing, matching, and de-duplication.
• Collaborate with business stakeholders to define data quality metrics, thresholds, and SLAs.
• Develop reusable data quality assets (rules, mappings, workflows) and deploy them in production.
• Integrate DQ solutions with Informatica MDM, PowerCenter, IICS, or other ETL platforms.
• Monitor and troubleshoot DQ jobs and provide data quality issue resolution support.
• Work with data stewards and governance teams to establish data stewardship workflows.
• Conduct data analysis to identify root causes of data quality issues and recommend improvements.
• Create and maintain technical documentation, including data dictionaries and rule repositories.
• Participate in data governance programs, supporting continuous improvement and regulatory compliance.

Required Qualifications:
• 3-7 years of experience in Informatica Data Quality (IDQ/CDQ) development.
• Strong knowledge of data profiling, cleansing, and transformation techniques.
• Proficiency in building Informatica DQ mappings, mapplets, workflows, and scorecards.
• Experience working with relational databases (Oracle, SQL Server, etc.) and writing SQL queries.
• Familiarity with data governance frameworks and master data concepts.
• Solid understanding of data lifecycle management and data architecture principles.
• Strong problem-solving, analytical, and communication skills.

Preferred Qualifications:
• Informatica DQ certification.
• Experience with IICS (Informatica Intelligent Cloud Services) and Cloud Data Quality modules.
• Exposure to data governance tools like Informatica Axon, Collibra, or similar.
• Familiarity with Agile or DevOps methodologies and tools like JIRA, Git, Jenkins.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
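
The responsibilities above revolve around profiling, rule building, and scorecards, which in practice are configured inside Informatica IDQ/CDQ rather than coded by hand. Purely as an illustration of the kind of completeness and conformity checks involved, here is a minimal Python/pandas sketch; the customer table, column names, and email regex are hypothetical.

```python
import pandas as pd

# Hypothetical customer extract standing in for a profiled source table.
customers = pd.DataFrame({
    "customer_id": [101, 102, 103, 104],
    "email": ["a@x.com", None, "not-an-email", "d@y.org"],
    "country": ["IN", "IN", "US", None],
})

# Completeness rule: percentage of non-null values per column.
completeness = customers.notna().mean().mul(100).round(1)

# Conformity rule: emails must match a simple pattern (illustrative regex only).
email_ok = customers["email"].str.match(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False)

# A tiny "scorecard" of rule results, analogous to what an IDQ scorecard reports.
scorecard = pd.DataFrame({
    "rule": ["email_completeness_pct", "email_conformity_pct", "country_completeness_pct"],
    "score": [completeness["email"], round(email_ok.mean() * 100, 1), completeness["country"]],
})
print(scorecard)
```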

Posted 3 hours ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Linkedin logo

Job Reference # 313438BR
Job Type: Full Time

Your role
Are you confident managing evolving user requirements? Do you have the know-how to apply business analysis techniques to deliver IT projects? At UBS, we re-imagine the way we work, the way we connect with each other – our colleagues, clients and partners – and the way we deliver value. Being agile will make us more responsive, more adaptable, and ultimately more innovative.

We’re looking for a Software Engineer – Data & Analytics Specialist to:
• work in a dynamic, fast-paced environment that provides exposure to reference data and cloud technologies
• understand business requirements and help define high-level design and testing strategies
• collaborate to refine user requirements within the Agile framework
• communicate effectively with the larger global team and take full ownership of assigned tasks
• demonstrate strong problem-solving skills and share new concepts
• contribute to the reduction of security and operational risks, in line with policies and standards

Your team
In our agile operating model, crews are aligned to larger products and services fulfilling client needs and encompass multiple autonomous pods. You’ll be working in the Banking & Employee Compliance team in Pune, focusing on Data & Analytics.

Your expertise
• ideally 5+ years of experience as a Business/System Analyst on IT projects and reference data, with an Agile delivery model in financial institutions
• Business Process Modelling (BPMN), data flow diagrams, and data entity modelling
• strong analytical and soft skills in communication, stakeholder and expectation management, and conflict management
• solid experience in requirements management, system analysis, and data modelling; understanding of SDLC phases and dependencies
• ability to learn new technologies and willingness to consistently improve soft skills
• hands-on experience with SQL, HTTP, XML, JSON, Confluence, JIRA, SharePoint, and Azure (preferred)
• highly motivated and self-driven, able to work independently, with the ability to challenge the status quo
• Agile methodology and close collaboration with business and IT pods
• experience with Informatica is preferable but not essential

You should be:
• a strong team player and a good listener
• self-organized and able to proactively manage your workload
• goal-oriented and proactive
• knowledgeable about DevOps and CI/CD pipelines, using GitLab

About Us
UBS is the world’s largest and the only truly global wealth manager. We operate through four business divisions: Global Wealth Management, Personal & Corporate Banking, Asset Management and the Investment Bank. Our global reach and the breadth of our expertise set us apart from our competitors. We have a presence in all major financial centers in more than 50 countries.

How We Hire
We may request you to complete one or more assessments during the application process.

Disclaimer / Policy Statements
UBS is an Equal Opportunity Employer. We respect and seek to empower each individual and support the diverse cultures, perspectives, skills and experiences within our workforce.

Posted 3 hours ago

Apply

6.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Digital Risk – Manager - Data Governance

Key Responsibilities:
The purpose of this role will be to build & enhance the data governance capabilities and supervise delivery, provide technical and project leadership to your team members, as well as build relationships with clients. While delivering quality client services and enabling high-performing teams, you will drive high-value work products within expected timeframes and budget. You will monitor progress, manage risks and ensure key stakeholders are kept informed about progress and expected outcomes. Additionally, you will:
• Foster an innovative and inclusive team-oriented work environment.
• Play an active role in counselling and mentoring junior consultants within the firm.
• Consistently deliver quality client services.
• Drive high-quality work products within expected timeframes and on budget.
• Monitor progress, manage risks, and ensure key stakeholders are kept informed about progress and expected outcomes.
• Use knowledge of the current IT environment and industry trends to identify engagement and client service issues, and communicate this information to the engagement team and client management through written correspondence and verbal presentations.
• Stay abreast of current business and industry trends relevant to the client's business.
• Foster relationships with client personnel to analyse, evaluate, and enhance information systems to develop and improve security at procedural and technology levels.
• Assist with cultivating and managing business development opportunities.
• Understand EY and its service lines and actively assess/present ways to serve clients.
• Demonstrate deep technical capabilities and professional knowledge.
• Demonstrate the ability to quickly assimilate new knowledge.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Management, Business Administration, or a related field.
• Proven experience (6+ years) in data governance, with hands-on experience across multiple disciplines within data management, such as Master Data Management, Data Security, Metadata Management, Data Quality Management, and Business Intelligence.
• Practical experience with DG, DMQ and MDM tools such as Informatica, Collibra, MS Purview, and Databricks Unity Catalog.
• Good communication skills in English, both written and oral.
• Ability to work collaboratively with stakeholders.
• Relevant certifications such as DAMA and DCAM.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 hours ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru East, Karnataka, India

Remote

Linkedin logo

Req ID: 328866

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.

We are currently seeking a SQL, PL/SQL, Informatica, Unix/Linux, EKS, AWS/Azure - Developer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

SQL, PL/SQL, Informatica, Unix/Linux, EKS, AWS/Azure - Developer
FMRJP00035038 - Systems Engineer 2 (3-5 years)

Technical/Process
• SQL & PL/SQL - expertise in writing, debugging, and troubleshooting SQL and PL/SQL code, stored procedures, functions, and triggers
• Informatica or any similar ETL tool - expertise in debugging and troubleshooting session/workflow logs; understands mapping flows
• Hands-on experience with Autosys/Control-M is a must
• Unix/Linux - shell/Perl scripting and programming experience, plus basic commands
• Experience supporting/developing applications in the cloud and EKS; AWS/Azure certification (good to have)
• Hands-on experience with Splunk/SiteScope/Datadog
• Ability to handle incident bridge calls and crisis situations to mitigate incident impact
• Experience in incident management and problem management
• Ability to understand the business criticality of various applications as they relate to complex business processes
• Familiarity with the ITIL framework and/or Agile project management
• Good analytical, reporting, and problem-solving skills
• Apache Tomcat & Core Java - experience supporting Java-based applications (good to have)
• Working knowledge of basic investment terms and practices is desirable

Minimum Experience on Key Skills: 3-5 years

General Expectations
• Must have good communication skills
• Must be ready to work a 10:30 AM to 8:30 PM shift
• Flexible to work at the client location (GV, Manyata or EGL, Bangalore)
• Must be ready to work from the office in a hybrid work environment; full remote work is not an option
• Expect a full return to office from Feb/Mar '25

Pre-requisites before submitting profiles
• Must have genuine and digitally signed Form 16 for all employments
• All employment history/details must be present in UAN/PPF statements
• Candidates must be screened via video to confirm they are genuine and have a proper work setup
• Candidates must have real work experience in the mandatory skills mentioned in the JD
• Profiles must list the companies that actually run the candidate's payroll, not the client names, as their employers
• As these are competitive positions and the client will not wait 60 days or carry the risk of drop-outs, candidates must have a notice period of 0 to 3 weeks
• Candidates must be screened for any gaps after education and during employment, and the genuineness of the reasons verified

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.
Visit us at us.nttdata.com NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us . This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here . If you'd like more information on your EEO rights under the law, please click here . For Pay Transparency information, please click here .
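
Much of the support work described in this posting is reading Informatica session/workflow logs from a Unix shell. As a small, hedged illustration of that kind of scripting, the Python sketch below scans a log file for error-looking lines; the log path and the message prefixes in the pattern are assumptions and will differ by environment and PowerCenter version.

```python
import re
from pathlib import Path

# Hypothetical session log location; real paths depend on the Informatica install.
LOG_FILE = Path("/opt/infa/logs/s_m_load_customers.log")

# Illustrative pattern: generic ERROR/FATAL markers plus message-code prefixes.
ERROR_PATTERN = re.compile(r"\b(ERROR|FATAL|RR_\d+|TM_\d+)\b")

def summarize_errors(path: Path, context: int = 1) -> list[str]:
    """Return lines that look like errors, with a little preceding context."""
    lines = path.read_text(errors="replace").splitlines()
    hits: list[str] = []
    for i, line in enumerate(lines):
        if ERROR_PATTERN.search(line):
            hits.extend(lines[max(0, i - context):i + 1])
    return hits

if __name__ == "__main__":
    if LOG_FILE.exists():
        for line in summarize_errors(LOG_FILE):
            print(line)
    else:
        print(f"No log found at {LOG_FILE}")
```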

Posted 3 hours ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

ETL + SQL

Are you looking for a new career challenge? With LTIMindtree, are you ready to embark on a data-driven career? Working for a global leading manufacturing client, providing an engaging product experience through best-in-class PIM implementation and building rich, relevant, and trusted product information across channels and digital touchpoints so their end customers can make an informed purchase decision, will surely be a fulfilling experience.

Location: Hyderabad
E-mail: Swathi.Gangu@ltimindtree.com, Allen.Prashanth@ltimindtree.com

Enterprise Data Hub - Data Engineer / BI Developer
Work Location: Google, Hyderabad

Primary Skills
• Advanced SQL and strong analytical skills
• Hive/NoSQL database experience
• Strong experience in building reports with tools such as Looker, Tableau, Power BI, etc.

Note: Candidates would need to take a 45-minute SQL technical assessment test along with other interviews on core skill sets.

Primary Responsibilities
• Develop and support data pipelines to extract, transform, and load data into Google internal analytical systems
• Build/enhance reports and dashboards to support business-critical decision making

Job Description
• Develop ETL/data pipelines to populate the enterprise data warehouse from a variety of custom source systems/databases
• Design, build, launch, optimize, and extend full-stack data and business intelligence solutions spanning extraction, storage, transformation, and reporting/visualization layers
• Exposure to designing reusable extraction/ingestion and data quality frameworks, generic metadata, and data integration patterns
• Build continuous integration / continuous deployment (CI/CD) data pipelines and reporting dashboards with a version control repository
• Perform RCA (root cause analysis); troubleshoot and fix data/technology issues
• Implement enhancements in existing data pipelines and reports/dashboards
• Data analysis with SQL
• Develop, support, and optimize reports and dashboards using Looker, Tableau, Power BI, or other reporting tools for business growth
• Ability to showcase actionable insights through reports/dashboards
• Demonstrate the ability and willingness to learn quickly and deliver project work with high quality
• Demonstrate excellent collaboration, stakeholder management, interpersonal, and written communication skills, with the ability to work in a team environment
• Create project-specific artifacts such as technical design documents, source-to-target transformation mapping docs, data dictionary, data flow diagrams, data models, etc.
• Take proactive measures to maintain the good health of the overall data integration and reporting systems
• Open to participating in POCs/POVs for new technology stacks

Minimum Qualifications
• 8 years of solid hands-on experience with SQL scripting or ETL tools and reporting/dashboard development
• Hands-on experience with the design, development, and support of data analysis
• Understanding of Google Cloud Platform (GCP) technologies in the big data and data warehousing space (BigQuery, etc.)
• Experience with data platform and visualization technologies such as Looker, Tableau, and Power BI, or other similar tools/technologies
• Experience working with the Agile software development lifecycle
• Strong analytical, troubleshooting, and organizational skills
• Ability to analyze and troubleshoot complex issues and proficiency in managing stakeholders

Skills
Mandatory Skills: ANSI-SQL, Informatica PowerCenter, Informatica PowerExchange
Good to Have Skills: Dimensional Data Modeling

Why join us?
• Work on industry-leading implementations for Tier-1 clients
• Accelerated career growth and global exposure
• Collaborative, inclusive work environment rooted in innovation
• Exposure to a best-in-class automation framework
• Innovation-first culture: we embrace automation, AI insights, and clean data

Know someone who fits this perfectly? Tag them – let’s connect the right talent with the right opportunity. DM or email to know more. Let’s build something great together.
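
Given the timed SQL assessment and the BigQuery/Looker stack this role mentions, a windowed aggregation is the sort of query worth being fluent in. The sketch below runs one through the google-cloud-bigquery Python client; the project, dataset, table, and column names are made up, and it assumes the client library is installed and GCP credentials are already configured.

```python
from google.cloud import bigquery  # assumes google-cloud-bigquery is installed and credentials are set

# Hypothetical table and columns, for illustration only.
SQL = """
SELECT
  order_date,
  region,
  SUM(revenue) AS daily_revenue,
  SUM(SUM(revenue)) OVER (
    PARTITION BY region
    ORDER BY order_date
    ROWS BETWEEN 6 PRECEDING AND CURRENT ROW
  ) AS rolling_7d_revenue
FROM `my_project.sales.orders`
GROUP BY order_date, region
ORDER BY region, order_date
"""

def run_report() -> None:
    client = bigquery.Client()  # project is picked up from the environment
    for row in client.query(SQL).result():
        print(row.region, row.order_date, row.rolling_7d_revenue)

if __name__ == "__main__":
    run_report()
```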

Posted 3 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Role Description

Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions, including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes
• Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions.
• Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications.
• Document and communicate milestones/stages for end-to-end delivery.
• Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality.
• Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency.
• Validate results with user representatives, integrating the overall solution seamlessly.
• Develop and manage data storage solutions, including relational databases, NoSQL databases, and data lakes.
• Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools.
• Influence and improve customer satisfaction through effective data solutions.

Measures of Outcomes
• Adherence to engineering processes and standards
• Adherence to schedule/timelines
• Adherence to SLAs where applicable
• Number of defects post delivery
• Number of non-compliance issues
• Reduction of recurrence of known defects
• Quick turnaround of production bugs
• Completion of applicable technical/domain certifications
• Completion of all mandatory training requirements
• Efficiency improvements in data pipelines (e.g., reduced resource consumption, faster run times)
• Average time to detect, respond to, and resolve pipeline failures or data issues
• Number of data security incidents or compliance breaches

Outputs Expected

Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.

Documentation: Create and review templates, checklists, guidelines, and standards for design, processes, and development. Create and review deliverable documents, including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.

Configuration: Define and govern the configuration management plan. Ensure compliance within the team.

Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.

Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.

Project Management: Manage the delivery of modules effectively.

Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.

Estimation: Create and provide input for effort and size estimation for projects.

Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.

Release Management: Execute and monitor the release process to ensure smooth transitions.

Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.

Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.

Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.

Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples
• Proficiency in SQL, Python, or other programming languages used for data manipulation.
• Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF.
• Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g., AWS Glue, BigQuery).
• Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
• Experience in performance tuning of data processes.
• Expertise in designing and optimizing data warehouses for cost efficiency.
• Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets.
• Capacity to clearly explain and communicate design and development aspects to customers.
• Ability to estimate time and resource requirements for developing and debugging features or components.

Knowledge Examples
• Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, Azure ADF, and ADLF.
• Proficiency in SQL for analytics, including windowing functions.
• Understanding of data schemas and models relevant to various business contexts.
• Familiarity with domain-related data and its implications.
• Expertise in data warehousing optimization techniques.
• Knowledge of data security concepts and best practices.
• Familiarity with design patterns and frameworks in data engineering.

Additional Comments

Tech skills
• Proficient in Python (including popular Python packages, e.g., Pandas, NumPy) and SQL
• Strong background in distributed data processing and storage (e.g., Apache Spark, Hadoop)
• Large-scale (TBs of data) data engineering skills - model data and create production-ready ETL pipelines
• Development experience with at least one cloud (Azure highly preferred; AWS, GCP)
• Knowledge of data lake and data lakehouse patterns
• Knowledge of ETL performance tuning and cost optimization
• Knowledge of data structures and algorithms and good software engineering practices

Soft skills
• Strong communication skills to articulate complex situations concisely
• Comfortable picking up new technologies independently
• Eye for detail, good data intuition, and a passion for data quality
• Comfortable working in a rapidly changing environment with ambiguous requirements

Skills: Python, SQL, AWS, Azure
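
The proficiency summary above is essentially about building ingest-transform-load pipelines in Python/PySpark and landing them in a lake or lakehouse store. Here is a minimal sketch of that pattern, assuming placeholder S3 paths, column names, and a single data-quality gate; it is not a specific client pipeline.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

# Ingest: hypothetical landing-zone path and CSV layout.
raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("s3a://landing-bucket/orders/2024-06-01/"))

# Wrangle/transform: de-duplicate, fix types, derive a partition column, apply a simple quality gate.
cleaned = (raw
           .dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") > 0))

# Load: partitioned Parquet; swap for Delta Lake / BigQuery / Snowflake writers as the target dictates.
(cleaned.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3a://curated-bucket/orders/"))

spark.stop()
```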

Posted 3 hours ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Digital Risk – Manager - Data Governance

Key Responsibilities:
The purpose of this role will be to build & enhance the data governance capabilities and supervise delivery, provide technical and project leadership to your team members, as well as build relationships with clients. While delivering quality client services and enabling high-performing teams, you will drive high-value work products within expected timeframes and budget. You will monitor progress, manage risks and ensure key stakeholders are kept informed about progress and expected outcomes. Additionally, you will:
• Foster an innovative and inclusive team-oriented work environment.
• Play an active role in counselling and mentoring junior consultants within the firm.
• Consistently deliver quality client services.
• Drive high-quality work products within expected timeframes and on budget.
• Monitor progress, manage risks, and ensure key stakeholders are kept informed about progress and expected outcomes.
• Use knowledge of the current IT environment and industry trends to identify engagement and client service issues, and communicate this information to the engagement team and client management through written correspondence and verbal presentations.
• Stay abreast of current business and industry trends relevant to the client's business.
• Foster relationships with client personnel to analyse, evaluate, and enhance information systems to develop and improve security at procedural and technology levels.
• Assist with cultivating and managing business development opportunities.
• Understand EY and its service lines and actively assess/present ways to serve clients.
• Demonstrate deep technical capabilities and professional knowledge.
• Demonstrate the ability to quickly assimilate new knowledge.

Qualifications:
• Bachelor's or Master's degree in Computer Science, Information Management, Business Administration, or a related field.
• Proven experience (6+ years) in data governance, with hands-on experience across multiple disciplines within data management, such as Master Data Management, Data Security, Metadata Management, Data Quality Management, and Business Intelligence.
• Practical experience with DG, DMQ and MDM tools such as Informatica, Collibra, MS Purview, and Databricks Unity Catalog.
• Good communication skills in English, both written and oral.
• Ability to work collaboratively with stakeholders.
• Relevant certifications such as DAMA and DCAM.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 hours ago

Apply

8.0 - 11.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

Linkedin logo

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Title: Informatica MDM Specialist-Manager

Job Summary: We are looking for a skilled Informatica MDM Specialist. The candidate will have hands-on experience implementing and maintaining Master Data Management solutions using Informatica MDM (Customer 360, Supplier 360, Product 360 and MDM Hub). This role involves architecting and developing MDM solutions, managing data quality, and ensuring data governance practices across the enterprise.

Key Responsibilities:
• Design, develop, and implement end-to-end MDM solutions using the Informatica MDM platform.
• Configure Data Models, Match & Merge rules, Hierarchies, Trust Framework, and workflows.
• Collaborate with business stakeholders, data architects, and developers to gather and analyse requirements.
• Perform data profiling, cleansing, standardization, and validation for master data domains.
• Implement data governance and stewardship workflows for maintaining data quality.
• Monitor MDM performance, manage error handling and system tuning.
• Prepare and maintain technical documentation, deployment guides, and support materials.
• Provide technical support and troubleshooting during and post-deployment.
• Stay up to date with Informatica MDM product updates, industry trends, and best practices.

Required Qualifications:
• 8-11 years of experience in Informatica MDM development and implementation.
• Strong understanding of MDM architecture, data modelling, and metadata management.
• Hands-on experience with Informatica MDM Hub, e360, IDD, SIF, MDM Provisioning Tool, and ETL/ELT.
• Experience with data quality tools (Informatica DQ or others) and MDM integration patterns.
• Understanding of data governance principles and master data domains (customer, product, vendor, etc.).
• Strong analytical and problem-solving skills.
• Excellent communication and stakeholder engagement skills.

Preferred Qualifications:
• Informatica MDM certification(s).
• Experience with IDMC MDM – MDM SaaS.
• Familiarity with data governance platforms (e.g., Collibra, Informatica Axon).
• Exposure to Agile/Scrum delivery methodologies.
• Experience in large-scale MDM implementations in domains like Retail, Manufacturing, Healthcare, or BFSI.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
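
Match & merge and survivorship are configured declaratively in Informatica MDM Hub rather than hand-coded, but the intent of a simple deterministic rule can be sketched in a few lines. Everything in this toy Python example (the source systems, columns, normalized-email match key, and the "most recent wins" survivorship rule) is an illustrative assumption.

```python
import pandas as pd

# Toy customer records from two hypothetical source systems.
records = pd.DataFrame({
    "source":      ["CRM", "ERP", "CRM"],
    "customer_id": ["C-1", "E-9", "C-2"],
    "email":       ["Asha.K@Example.com", "asha.k@example.com ", "ravi@example.com"],
    "name":        ["Asha K", "Asha Kumar", "Ravi S"],
    "updated_at":  pd.to_datetime(["2024-01-10", "2024-05-02", "2024-03-15"]),
})

# Deterministic match rule: normalize the email and use it as the match key.
records["match_key"] = records["email"].str.strip().str.lower()

# Survivorship: keep the most recently updated record per match key ("most recent wins").
golden = (records.sort_values("updated_at")
                 .groupby("match_key", as_index=False)
                 .last())
print(golden[["match_key", "name", "source", "updated_at"]])
```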

Posted 3 hours ago

Apply

Exploring Informatica Jobs in India

The Informatica job market in India is thriving, with numerous opportunities for skilled professionals in this field. Companies across various industries are actively hiring Informatica experts to manage and optimize their data integration and data quality processes.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for Informatica professionals in India varies based on experience and expertise:
  • Entry-level: INR 3-5 lakhs per annum
  • Mid-level: INR 6-10 lakhs per annum
  • Experienced: INR 12-20 lakhs per annum

Career Path

A typical career progression in the Informatica field may include roles such as:
  • Junior Developer
  • Informatica Developer
  • Senior Developer
  • Informatica Tech Lead
  • Informatica Architect

Related Skills

In addition to Informatica expertise, professionals in this field are often expected to have skills in:
  • SQL
  • Data warehousing
  • ETL tools
  • Data modeling
  • Data analysis

Interview Questions

  • What is Informatica and why is it used? (basic)
  • Explain the difference between a connected and unconnected lookup transformation. (medium)
  • How can you improve the performance of a session in Informatica? (medium)
  • What are the various types of cache in Informatica? (medium)
  • How do you handle rejected rows in Informatica? (basic)
  • What is a reusable transformation in Informatica? (basic)
  • Explain the difference between a filter and router transformation in Informatica. (medium)
  • What is a workflow in Informatica? (basic)
  • How do you handle slowly changing dimensions in Informatica? (advanced; see the worked sketch after this list)
  • What is a mapplet in Informatica? (medium)
  • Explain the difference between an aggregator and joiner transformation in Informatica. (medium)
  • How do you create a mapping parameter in Informatica? (basic)
  • What is a session and a workflow in Informatica? (basic)
  • What is a rank transformation in Informatica and how is it used? (medium)
  • How do you debug a mapping in Informatica? (medium)
  • Explain the difference between static and dynamic cache in Informatica. (advanced)
  • What is a sequence generator transformation in Informatica? (basic)
  • How do you handle null values in Informatica? (basic)
  • Explain the difference between a mapping and mapplet in Informatica. (basic)
  • What are the various types of transformations in Informatica? (basic)
  • How do you implement partitioning in Informatica? (medium)
  • Explain the concept of pushdown optimization in Informatica. (advanced)
  • How do you create a session in Informatica? (basic)
  • What is a source qualifier transformation in Informatica? (basic)
  • How do you handle exceptions in Informatica? (medium)
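
For the slowly-changing-dimensions question flagged above, Informatica typically handles this with a Type 2 mapping (lookup, expression, and update-strategy transformations, or the SCD wizard). The Python/pandas sketch below shows the equivalent Type 2 logic on a toy dimension; the table layout, dates, and high-date convention are assumptions made for illustration.

```python
import pandas as pd

# Toy Type 2 dimension: one row per version of a customer, flagged by is_current.
# "2099-12-31" is used here as the "open-ended" high date (kept within pandas' timestamp range).
dim = pd.DataFrame({
    "customer_id": [1, 2],
    "city": ["Pune", "Mumbai"],
    "valid_from": pd.to_datetime(["2023-01-01", "2023-01-01"]),
    "valid_to": pd.to_datetime(["2099-12-31", "2099-12-31"]),
    "is_current": [True, True],
})

# Incoming source snapshot: customer 1 has moved to Bengaluru.
src = pd.DataFrame({"customer_id": [1, 2], "city": ["Bengaluru", "Mumbai"]})
load_date = pd.Timestamp("2024-06-01")

# 1. Detect keys whose tracked attribute changed versus the current dimension row.
cur = dim[dim["is_current"]]
cmp = cur.merge(src, on="customer_id", suffixes=("_dim", "_src"))
changed = cmp.loc[cmp["city_dim"] != cmp["city_src"], "customer_id"]

# 2. Expire the superseded versions (close the date range, drop the current flag).
mask = dim["customer_id"].isin(changed) & dim["is_current"]
dim.loc[mask, "valid_to"] = load_date
dim.loc[mask, "is_current"] = False

# 3. Insert a new current version for each changed key.
new_rows = src[src["customer_id"].isin(changed)].assign(
    valid_from=load_date,
    valid_to=pd.Timestamp("2099-12-31"),
    is_current=True,
)
dim = pd.concat([dim, new_rows], ignore_index=True)

print(dim.sort_values(["customer_id", "valid_from"]))
```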

Closing Remark

As you prepare for Informatica job opportunities in India, make sure to enhance your skills, stay updated with the latest trends in data integration, and approach interviews with confidence. With the right knowledge and expertise, you can excel in the Informatica field and secure rewarding career opportunities. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies