
3333 Informatica Jobs - Page 20

Set Up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

TCS Hiring for a Snowflake Developer Role!! TCS presents an excellent opportunity for the role of Snowflake on AWS Developer.
Desired Experience Range: 6-10 years
Location: Chennai, Hyderabad
Mode of Interview: Virtual
Date: 03-07-2025 (Thursday)
Must Have:
- Strong experience in Snowflake and SQL for data manipulation and engineering tasks.
- Strong expertise in SQL and Snowflake architecture.
- Experience in one or more cloud platforms, preferably Azure/AWS/GCP.
- In-depth knowledge of data warehousing concepts and experience building and managing data warehouses.
- Knowledge of ETL concepts to manage data migration into Snowflake: extracting data from various sources, transforming it into a usable form, and loading it into the Snowflake platform (a minimal sketch follows below).
- Informatica, DBT
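For illustration only (not part of the TCS posting): a minimal sketch of the stage-then-load pattern the "Must Have" list describes, using the snowflake-connector-python library. The account, file path, stage, and table names are assumptions.

```python
# A minimal sketch, assuming a CSV extract and illustrative table names:
# stage the file, COPY it into a raw table, then transform into a curated one.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",      # assumption: your Snowflake account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # Upload the extracted file to the table's internal stage (@%table).
    cur.execute("PUT file:///tmp/orders.csv @%ORDERS_RAW")
    cur.execute("""
        COPY INTO ORDERS_RAW
        FROM @%ORDERS_RAW
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
    """)
    # A simple transform step: dedupe and cast into the curated table.
    cur.execute("""
        INSERT INTO ORDERS
        SELECT DISTINCT order_id, TRY_TO_DATE(order_date), amount::NUMBER(12,2)
        FROM ORDERS_RAW
    """)
finally:
    conn.close()
```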

Posted 6 days ago

Apply

6.0 years

0 Lacs

India

Remote

Job Title: ServiceNow to ServiceMax Data Migration Specialist
Experience: 6+ Years
Location: Remote
Employment Type: Full-Time / Contract (Both)

Job Summary:
We are looking for an experienced Data Migration Specialist to lead the end-to-end migration of data from ServiceNow to ServiceMax as part of a strategic transformation initiative. The ideal candidate will have strong experience in data extraction, mapping, transformation, and validation, with prior exposure to Salesforce ecosystems, including ServiceMax. This is a remote opportunity, and we are seeking candidates who can work independently and collaboratively across global teams.

Key Responsibilities:
- Design and implement the data migration strategy from ServiceNow to ServiceMax (a hedged sketch of this flow follows below).
- Collaborate with business and technical stakeholders to define data mapping rules, transformation logic, and validation criteria.
- Extract, cleanse, transform, and load (ETL) data using appropriate tools and technologies.
- Ensure data integrity, accuracy, and consistency throughout the migration lifecycle.
- Conduct data quality audits and support User Acceptance Testing (UAT).
- Troubleshoot data-related issues and support go-live data readiness.
- Document the data migration process, rules, and any post-migration validations.

Required Skills & Experience:
- Minimum 6+ years of experience in data migration, especially across enterprise applications.
- Proven experience with ServiceNow and Salesforce/ServiceMax platforms.
- Hands-on experience with ETL tools (such as Informatica, Jitterbit, MuleSoft, Data Loader, etc.).
- Deep understanding of Salesforce data models, ServiceMax architecture, and the ServiceNow schema.
- Proficiency in data mapping, cleansing, validation, and reconciliation techniques.
- Strong problem-solving and analytical skills.
- Excellent communication and documentation abilities.

Preferred Qualifications:
- Experience with ServiceMax implementation/migration projects.
- Familiarity with Salesforce APIs, SOQL/SOSL, and data loader tools.
- Certifications in Salesforce or ServiceMax are a plus.
- Experience working in remote, cross-functional environments.
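For illustration only: a hedged sketch of the extract, map, and load flow this posting describes, using the ServiceNow Table API and the Salesforce Bulk API via the simple_salesforce library. The instance name, credentials, field mapping, and the ServiceMax object/field names (SVMXC__Service_Order__c, Subject__c, Legacy_Id__c) are all illustrative assumptions.

```python
# A minimal sketch, assuming illustrative instance names and field mappings.
import requests
from simple_salesforce import Salesforce

# 1. Extract: pull records from a ServiceNow table via the Table API.
resp = requests.get(
    "https://dev00000.service-now.com/api/now/table/task",
    auth=("sn_user", "***"),
    params={"sysparm_limit": 1000},
    headers={"Accept": "application/json"},
    timeout=60,
)
resp.raise_for_status()
source_rows = resp.json()["result"]

# 2. Transform: apply the agreed field-mapping rules (assumed mapping here).
def map_row(row):
    return {
        "Subject__c": row.get("short_description"),
        "Legacy_Id__c": row.get("sys_id"),  # keep the source key for reconciliation
    }

records = [map_row(r) for r in source_rows]

# 3. Load: bulk-insert into the target ServiceMax (Salesforce) object.
sf = Salesforce(username="mig_user@example.com", password="***",
                security_token="***")
results = sf.bulk.SVMXC__Service_Order__c.insert(records, batch_size=200)

# 4. Validate: fail loudly on any rejected record rather than passing silently.
errors = [r for r in results if not r["success"]]
assert not errors, f"{len(errors)} records failed to load"
```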

Posted 6 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description and Requirements

"At BMC trust is not just a word - it's a way of life!"

We are an award-winning, equal opportunity, culturally diverse, fun place to be. Giving back to the community drives us to be better every single day. Our work environment allows you to balance your priorities, because we know you will bring your best every day. We will champion your wins and shout them from the rooftops. Your peers will inspire, drive, support you, and make you laugh out loud! We help our customers free up time and space to become an Autonomous Digital Enterprise that conquers the opportunities ahead - and are relentless in the pursuit of innovation!

We are seeking a highly analytical and technically skilled professional to join our Finance team as Manager Finance | Principal Finance Analyst. In this role, you will own and lead the reporting and analytics of key top line metrics in the software industry, ensuring the accuracy, consistency, and usability of data to support critical financial and business decisions. You will work cross-functionally with Finance, Sales Operations, Business Intelligence, and IT teams to build scalable solutions, streamline reporting processes, and drive data integrity across systems.

Here is how, through this exciting role, YOU will contribute to BMC's and your own success:
- Lead the design and delivery of reporting and analytics for top line KPIs, including Total Contract Value (TCV), Annual Recurring Revenue (ARR), Remaining Performance Obligations (RPO), and more.
- Partner with FP&A, Revenue Accounting, SalesOps, and BI teams to align standardized metric definitions, data logic, and governance across systems like Tableau, Snowflake, and Workday Adaptive Planning.
- Support monthly and quarterly financial close processes by validating, reconciling, and finalizing revenue-related data in partnership with accounting teams.
- Design and manage interactive, self-service dashboards in Tableau that enable business users to explore revenue and customer trends effectively.
- Build and maintain robust ETL pipelines using tools such as Informatica or SSIS to transform and model data from various sources into finance reporting layers.
- Develop and optimize complex SQL queries and stored procedures to support dynamic reporting, reconciliations, and business insights.
- Ensure data quality and accuracy by implementing automated data validation, reconciliation checks, and exception reporting mechanisms (a minimal sketch follows below).
- Identify and lead process automation opportunities to enhance reporting speed, consistency, and scalability.
- Collaborate with IT and Finance Systems teams to test, implement, and document system and data model enhancements.
- Support audit and compliance activities by preparing necessary documentation, validating financial controls, and participating in audit walkthroughs.
- Cross-train team members/end users.

To ensure you are set up for success, you will bring the following skillset & experience:

Required Skills
- Bachelor's degree required (MBA preferred), with at least 10 years' experience, especially as a domain expert in building and maintaining financial metrics (TCV, ACV, ARR, Revenue).

Technical Skills
- Strong proficiency in SQL (Snowflake preferred), with experience building scalable, modular queries and views.
- Hands-on experience with Tableau: workbook development, LODs, parameters, dashboard actions, and performance tuning.
- Knowledge of ETL tools such as Informatica, SSIS, Alteryx, or custom Python/SQL-based pipelines.
- Understanding of data warehousing concepts and data modeling (e.g., star schema, dimensional modeling).
- Experience working with ERP/CRM systems such as Salesforce or Oracle.
- Familiarity with Workday Adaptive Planning and Power BI is a plus.

Finance & Business Acumen
- Strong understanding of software industry top line metrics (ARR, TCV, RPO, churn, upsell, etc.) and the finance logic behind software revenue recognition.
- Prior experience supporting FP&A, Revenue Accounting, or Business Operations teams in a data or systems role.
- Strong communication and collaboration skills to work effectively with both technical and non-technical stakeholders.

Our commitment to you! BMC's culture is built around its people. We have 6000+ brilliant minds working together across the globe. You won't be known just by your employee number, but for your true authentic self. BMC lets you be YOU!

If, after reading the above, you're unsure whether you meet the qualifications of this role but are deeply excited about BMC and this team, we still encourage you to apply! We want to attract talent from diverse backgrounds and experiences to ensure we face the world together with the best ideas!

BMC is committed to equal opportunity employment regardless of race, age, sex, creed, color, religion, citizenship status, sexual orientation, gender, gender expression, gender identity, national origin, disability, marital status, pregnancy, disabled veteran or status as a protected veteran. If you need a reasonable accommodation for any part of the application and hiring process, visit the accommodation request page.

BMC Software maintains a strict policy of not requesting any form of payment in exchange for employment opportunities, upholding a fair and ethical hiring process.

At BMC we believe in pay transparency and have set the midpoint of the salary band for this role at 3,380,000 INR. Actual salaries depend on a wide range of factors that are considered in making compensation decisions, including but not limited to skill sets; experience and training; licensure and certifications; and other business and organizational needs. The salary listed is just one component of BMC's employee compensation package. Other rewards may include a variable plan and country-specific benefits. We are committed to ensuring that our employees are paid fairly and equitably, and that we are transparent about our compensation practices.

(Returnship@BMC) Had a break in your career? No worries. This role is eligible for candidates who have taken a break in their career and want to re-enter the workforce. If your expertise matches the above job, visit https://bmcrecruit.avature.net/returnship to learn more and apply.
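For illustration only: a minimal sketch of the kind of automated reconciliation and exception reporting the posting mentions, comparing an ARR rollup in a reporting layer against the revenue source of record. The connection parameters, table, and column names are assumptions.

```python
# A minimal sketch, assuming illustrative Snowflake tables for the reporting
# layer and the revenue source of record; flag any per-period variance.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="fin_user",
                                    password="***", warehouse="FIN_WH",
                                    database="FINANCE")
cur = conn.cursor()
cur.execute("""
    WITH reported AS (
        SELECT fiscal_period, SUM(arr_amount) AS arr
        FROM REPORTING.ARR_BY_CUSTOMER GROUP BY fiscal_period
    ),
    source AS (
        SELECT fiscal_period, SUM(annualized_amount) AS arr
        FROM REVENUE.RECURRING_CONTRACTS GROUP BY fiscal_period
    )
    SELECT r.fiscal_period, r.arr, s.arr, r.arr - s.arr AS variance
    FROM reported r JOIN source s USING (fiscal_period)
    WHERE ABS(r.arr - s.arr) > 1   -- small tolerance for rounding
""")
exceptions = cur.fetchall()
# Surface variances for exception reporting rather than silently passing.
for period, reported_arr, source_arr, variance in exceptions:
    print(f"{period}: reported {reported_arr} vs source {source_arr} "
          f"(variance {variance})")
```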

Posted 6 days ago

Apply

3.0 years

0 Lacs

India

On-site

Job Title: Oracle Product Data Hub (PDH) Technical Consultant - Product Master Data Specialist
Location: India
Job Type: Full-Time Consultant
Experience Level: Mid to Senior-Level
Industry: ERP / Master Data Management / Manufacturing / Retail / Supply Chain

Job Summary:
We are seeking a skilled Oracle Product Data Hub (PDH) Technical Consultant with deep expertise in Product Master Data Management to support the end-to-end lifecycle of finished goods, raw materials, and pricing data in Oracle Fusion PDH. The ideal candidate will have hands-on experience in data cleansing, enrichment, transformation, validation, and mass data loading into Oracle Cloud PDH using best practices and tools such as FBDI, REST/SOAP APIs, and Data Import Templates. This role requires strong technical knowledge of Oracle PDH, a problem-solving mindset, and experience collaborating with functional teams and business users to ensure clean, standardized, and accurate product data is maintained across systems.

Key Responsibilities:
- Lead technical efforts in product data onboarding, including finished goods, raw materials, and pricing structures, into Oracle Fusion Product Data Hub.
- Perform data cleansing, de-duplication, normalization, and transformation activities using industry best practices and custom rulesets.
- Develop and execute data migration strategies using Oracle FBDI templates, Import Maps, REST/SOAP APIs, and spreadsheets (a hedged REST sketch follows below).
- Create and maintain scripts or tools for mass upload, update, and validation of product data.
- Collaborate with business analysts, data stewards, and IT to define and implement product data governance, data quality rules, and workflows.
- Conduct data validation and reconciliation activities post-load, ensuring accuracy, completeness, and compliance with business rules.
- Troubleshoot and resolve technical issues related to PDH data imports, validations, and integrations.
- Support product hierarchy setup, item class configuration, attribute groups, catalogs, and data quality scorecards.
- Document technical specifications, data load procedures, and configuration guides.

Required Skills and Experience:
- 3+ years of hands-on technical experience with Oracle Fusion Product Data Hub (PDH).
- Proven experience in mass loading and maintaining product data, including finished goods, raw materials, and pricing.
- Strong experience with Oracle FBDI templates, REST/SOAP web services, and Excel-based data load tools.
- Proficiency in SQL and PL/SQL for data analysis and transformation.
- Solid understanding of Oracle Fusion Product Hub structures: Item Classes, Templates, Catalogs, Attributes, and Change Orders.
- Knowledge of item lifecycle management, global product definitions, and cross-functional data dependencies.
- Familiarity with Oracle SCM modules (Inventory, Costing, Pricing) is a plus.
- Experience in large-scale data migration, cleansing, and conversion projects.
- Excellent analytical, communication, and stakeholder engagement skills.

Preferred Qualifications:
- Oracle Cloud Certification in Product Data Management or SCM.
- Experience with data governance frameworks or MDM tools.
- Exposure to tools like Oracle Integration Cloud (OIC), OACS, or Informatica MDM.
- Experience in manufacturing, apparel, or retail industries preferred.
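For illustration only: a hedged sketch of creating an item through an Oracle Fusion REST resource, one of the load paths the posting lists alongside FBDI. The host, resource version, item class, and payload field names here are assumptions; the REST catalog on your own pod is authoritative.

```python
# A minimal sketch, assuming an illustrative Fusion pod URL and payload.
import requests

payload = {
    "ItemNumber": "FG-000123",             # assumed item number
    "ItemClass": "Finished Goods",         # assumed item class name
    "ItemDescription": "Sample finished good",
    "OrganizationCode": "MASTER",          # assumed master organization
}
resp = requests.post(
    "https://your-pod.fa.ocs.oraclecloud.com/fscmRestApi/resources/latest/itemsV2",
    json=payload,
    auth=("integration_user", "***"),
    headers={"Content-Type": "application/json"},
    timeout=60,
)
resp.raise_for_status()
print("Created item:", resp.json().get("ItemNumber"))
```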

Posted 6 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

ETL Tester
Location: Chennai. Experience: 5 to 8 years.

Key Responsibilities
- Review ETL design documents and understand data flows, mapping documents, and business requirements.
- Develop comprehensive test plans, test cases, and test scripts for validating ETL processes.
- Perform data validation and data quality testing at various stages of the ETL cycle.
- Write and execute SQL queries to verify data transformation logic, source-to-target data mapping, and business rules (a minimal sketch follows below).
- Identify, troubleshoot, and document data anomalies, discrepancies, and system defects.
- Work closely with development teams to replicate, debug, and resolve issues.
- Participate in daily stand-ups, sprint planning, and defect triage meetings.
- Communicate clearly with stakeholders and provide timely updates on test status and results.
- Contribute to the development and maintenance of automated ETL testing solutions (optional, based on project).
- Ensure compliance with testing standards and best practices across data projects.

Required Skills and Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5+ years of hands-on experience in ETL testing or data validation roles.
- Strong knowledge of SQL and the ability to write complex queries for data verification.
- Familiarity with ETL tools (e.g., Informatica, Talend, DataStage, SSIS, etc.).
- Experience working with large datasets and relational databases (Oracle, SQL Server, PostgreSQL, etc.).
- Excellent problem-solving skills with a keen eye for identifying data quality issues.
- Strong analytical and critical thinking skills.
- Clear and concise verbal and written communication skills for cross-functional collaboration.
- Ability to work in agile/scrum environments with fast-changing priorities.

Nice to Have
- Experience with test automation for ETL pipelines using tools like Selenium, PyTest, or Apache Airflow validation scripts.
- Familiarity with cloud platforms such as AWS, Azure, or GCP.
- Exposure to BI tools like Power BI, Tableau, or Looker.
- Understanding of data warehousing and data lake concepts.

(ref:hirist.tech)
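For illustration only: a minimal sketch of the source-to-target checks this posting describes, comparing a row count and a column checksum between source and target after a load. The connection URLs, table, and column names are assumptions.

```python
# A minimal sketch, assuming illustrative source/target databases: run the
# same aggregate on both sides and compare, the core of source-to-target QA.
import sqlalchemy

src = sqlalchemy.create_engine("oracle+oracledb://qa_user:***@src-db/ORCL")
tgt = sqlalchemy.create_engine("postgresql://qa_user:***@tgt-db/dwh")

CHECKS = [
    ("row count", "SELECT COUNT(*) FROM sales_orders"),
    ("amount checksum", "SELECT SUM(order_amount) FROM sales_orders"),
]

for name, sql in CHECKS:
    with src.connect() as s, tgt.connect() as t:
        src_val = s.execute(sqlalchemy.text(sql)).scalar()
        tgt_val = t.execute(sqlalchemy.text(sql)).scalar()
    status = "PASS" if src_val == tgt_val else "FAIL"
    print(f"{status} {name}: source={src_val} target={tgt_val}")
```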

Posted 6 days ago

Apply

5.0 years

0 Lacs

Kochi, Kerala, India

On-site

The selected candidate will primarily work on Databricks and Reltio projects, focusing on data integration and transformation tasks. This role requires a deep understanding of Databricks, ETL tools, and data warehousing/data lake concepts. Experience in the life sciences domain is preferred, as is a Databricks certification.

Key Responsibilities:
- Design, develop, and maintain data integration solutions using Databricks (an illustrative PySpark sketch follows below).
- Collaborate with cross-functional teams to understand data requirements and deliver efficient data solutions.
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses/data lakes.
- Optimize and troubleshoot Databricks workflows and performance issues.
- Ensure data quality and integrity throughout the data lifecycle.
- Provide technical guidance and mentorship to junior developers.
- Stay updated with the latest industry trends and best practices in data integration and Databricks.

Required Qualifications:
- Bachelor's degree in computer science or equivalent.
- Minimum of 5 years of hands-on experience with Databricks.
- Strong knowledge of any ETL tool (e.g., Informatica, Talend, SSIS).
- Well-versed in data warehousing and data lake concepts.
- Proficient in SQL and Python for data manipulation and analysis.
- Experience with cloud platforms (e.g., AWS, Azure, GCP) and their data services.
- Excellent problem-solving skills.
- Strong communication and collaboration skills.

Preferred Qualifications:
- Certified Databricks Engineer.
- Experience in the life sciences domain.
- Familiarity with Reltio or similar MDM (Master Data Management) tools.
- Experience with data governance and data security best practices.

IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
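For illustration only: a minimal PySpark sketch of the extract-transform-load pattern this posting describes, as it might run in a Databricks notebook or job. The paths, table names, and dedupe rule are assumptions.

```python
# A minimal sketch, assuming an illustrative landing zone and curated table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Extract: read a raw landing zone.
raw = spark.read.format("json").load("/mnt/landing/products/")

# Transform: standardize and deduplicate, a typical quality step before MDM.
clean = (
    raw.withColumn("product_name", F.trim(F.upper("product_name")))
       .dropDuplicates(["product_id"])
       .filter(F.col("product_id").isNotNull())
)

# Load: write to a Delta table in the curated layer.
clean.write.format("delta").mode("overwrite").saveAsTable("curated.products")
```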

Posted 6 days ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About ProcDNA
ProcDNA is a global consulting firm. We fuse design thinking with cutting-edge technology to create game-changing Commercial Analytics and Technology solutions for our clients. We're a passionate team of 275+ across 6 offices, all growing and learning together since our launch during the pandemic. Here, you won't be stuck in a cubicle - you'll be out in the open water, shaping the future with brilliant minds. At ProcDNA, innovation isn't just encouraged; it's ingrained in our DNA. Ready to join our epic growth journey?

What We Are Looking For
You'll be driving the adoption of the latest technologies in our solutions, bringing in thought leadership to guide clients on complex data management problems, and driving business performance. You will work with the leadership team to bring subject matter expertise in areas such as Big Data, ETL, Reporting, CRM, Data Warehousing, MDM, DevOps, Software Development, etc. We are seeking an individual who not only possesses the requisite expertise but also thrives in the dynamic landscape of a fast-paced global firm.

What You'll Do
- Lead end-to-end data management solution projects for multiple clients across data engineering and BI technologies.
- Create the project management plan and ensure adherence to project timelines.
- Integrate multiple data sources into one visualization to tell a story.
- Interact with customers to understand their business problems and provide best-in-class analytics solutions.
- Interact with Data Platform leaders and understand data flows that integrate into Tableau/analytics.
- Understand data governance, quality, and security, and integrate analytics with these enterprise platforms.
- Interact with UX/UI global functions and design best-in-class visualizations for customers, harnessing all product capabilities.

Must Have
- 7-10 years of data warehousing and data engineering experience.
- Experience interacting with Life Science clients directly, discussing requirements, and stakeholder management.
- Experience in requirement gathering and designing enterprise warehouse solutions from scratch.
- Hands-on experience with ETL tools like ADF, Databricks, and Informatica; experience with data pipeline and workflow management tools such as Azkaban, Luigi, and Airflow (an illustrative Airflow sketch follows this posting); experience with data warehouses: SQL/NoSQL, Amazon Redshift, Snowflake, Apache Hive, HDFS, etc.
- BI tools knowledge and experience leading the implementation of dashboards.
- Deep understanding of data governance and data quality management frameworks.
- Strong communication and presentation skills with a strong problem-solving attitude.
- Excellent analytical, problem-solving, and debugging skills, with a strong ability to quickly learn and comprehend business processes and problems in order to develop effective technical solutions.

Skills: mdm, sql, hdfs, data warehousing, big data, devops, cloud, amazon redshift, snowflake, pharmaceutical consulting, data management, apache hive, azure, reporting, problem-solving, luigi, informatica, analytical skills, presentation skills, data governance, adf, data engineering, crm, databricks, bi technologies, airflow, team management, business technology, aws, azkaban, software development, etl, client management, data quality management, life science
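For illustration only: a minimal sketch of the workflow-management layer the "Must Have" list names, using Apache Airflow (one of the listed tools). A small DAG that runs extract, transform, and load daily; the DAG id and task bodies are placeholders.

```python
# A minimal sketch, assuming Airflow 2.4+ (which accepts the `schedule` kwarg).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():    # placeholder: pull from source systems
    ...

def transform():  # placeholder: apply mapping/cleansing rules
    ...

def load():       # placeholder: publish to the warehouse / BI layer
    ...

with DAG(
    dag_id="commercial_analytics_etl",  # illustrative name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t3 = PythonOperator(task_id="load", python_callable=load)
    t1 >> t2 >> t3  # run the three steps in sequence
```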

Posted 6 days ago

Apply

5.0 - 8.0 years

12 - 22 Lacs

Mumbai

Hybrid

Informatica Developer
Experience: 5 to 8 years
Location: Mumbai
Skills: Informatica PowerCenter, ETL, RDBMS concepts, shell scripting, Java, and Python

Informatica Developer with 5-7 years of experience in ETL, RDBMS concepts, Informatica, shell scripting, Java, and Python to support the applications in the Surveillance area.

Duties and Responsibilities
- Support the applications in the Surveillance area.
- Build and enhance applications in the Surveillance area as needed.
- Support and develop ETL data loads on Informatica and shell scripts (a hedged sketch follows below).
- Work with users and stakeholders, including application owners for upstream systems, to resolve any support issues.
- Take full ownership of issues that arise: provide analysis, escalate as and when necessary, and take them through to resolution within defined SLAs.

Interested candidates, share your CV at himani.girnar@alikethoughts.com with the below details:
- Candidate's name
- Email and alternate email ID
- Contact and alternate contact no.
- Total exp
- Relevant experience
- Current org
- Notice period
- CCTC
- ECTC
- Current location
- Preferred location
- Pancard no.
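For illustration only: a hedged sketch of the kind of support automation this role involves, kicking off a PowerCenter workflow from a script with the pmcmd command-line tool and escalating on failure. The service, domain, folder, and workflow names are assumptions.

```python
# A minimal sketch, assuming illustrative PowerCenter object names.
import subprocess

cmd = [
    "pmcmd", "startworkflow",
    "-sv", "IS_SURV",          # integration service (assumed)
    "-d", "Domain_Surv",       # domain (assumed)
    "-u", "etl_user", "-p", "***",
    "-f", "SURVEILLANCE",      # folder (assumed)
    "-wait",                   # block until the workflow finishes
    "wf_daily_trade_load",     # workflow name (assumed)
]
result = subprocess.run(cmd, capture_output=True, text=True)
if result.returncode != 0:
    # pmcmd returns non-zero on failure; raise so the scheduler alerts support.
    raise RuntimeError(f"Workflow failed:\n{result.stdout}\n{result.stderr}")
```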

Posted 6 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Chennai

Work from Office

Implement data integration solutions using Informatica PowerCenter and Teradata. Optimize ETL workflows and ensure efficient data processing for analytics.

Posted 6 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Implement and manage Informatica Data Quality (IDQ) solutions to ensure the accuracy, completeness, and reliability of data. Design data profiling, cleansing, and validation processes to maintain high-quality data across systems.
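IDQ itself is configured in its own tooling; for illustration only, this pandas sketch shows the profiling and validation concepts the posting describes (completeness, validity, uniqueness). The file and column names are assumptions.

```python
# A minimal sketch, assuming an illustrative customers.csv with email and
# customer_id columns: profile the data, then apply simple cleansing rules.
import pandas as pd

df = pd.read_csv("customers.csv")

profile = {
    "rows": len(df),
    "null_emails": int(df["email"].isna().sum()),                    # completeness
    "invalid_emails": int((~df["email"].astype(str)
                             .str.contains("@", na=False)).sum()),   # validity
    "duplicate_ids": int(df["customer_id"].duplicated().sum()),      # uniqueness
}
print(profile)

# A simple cleansing/validation pass on the same rules.
clean = (df.dropna(subset=["email"])
           .drop_duplicates(subset=["customer_id"]))
```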

Posted 6 days ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office

Develop and implement Oracle Business Intelligence Enterprise Edition (OBIEE) solutions for reporting and analytics. You will work with SQL, Unix, and IPC for application development. Expertise in OBIEE and SQL is required.

Posted 6 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Develop and manage integration solutions using Informatica Data Management Cloud (IDMC). Ensure smooth data integration processes, automate workflows, and ensure the consistency and reliability of data across systems.

Posted 6 days ago

Apply

8.0 - 11.0 years

20 - 30 Lacs

Gurugram, Bengaluru

Hybrid

Key responsibilities:
- An expert in solution design with the ability to see the big picture across the portfolio, providing guidance and governance for the analysis, solution design, development, and implementation of projects.
- A strategic thinker who will be responsible for the technical strategy within the portfolio, ensuring it aligns with the overall architecture roadmap and business strategy.
- An effective communicator who will use their technical/business knowledge to lead technical discussions with project teams, business sponsors, and external technical partners in a manner that ensures understanding of risks, options, and overall solutions.
- An individual with the ability to effectively consult on and support technical discussions with Account Delivery and Business Sponsors, ensuring alignment to both technology and business visions.
- Collaborate with Designers, Business System Analysts, Application Analysts, and Testing specialists to deliver high-quality solutions.
- Able to prepare high-level and detailed designs based on technical/business requirements and defined architectures, and maintain documentation.
- Has been instrumental in platform migration and technical migration work in the past and understands the intricacies involved.
- Analyze, define, and document requirements for data, workflow, logical processes, interface design, internal and external checks, controls, and outputs.
- Ensure information security standards and requirements are incorporated into all solutions.
- Stay current with trends in emerging technologies and how they could apply to Sun Life.

Key experience:
- A Bachelor's or Master's degree in Computer Science or a related field.
- 8-11 years of progressive information technology experience with the full application development life cycle.
- Domain knowledge of Insurance and Retail Wealth Management.
- Experience in Informatica PowerCenter / IDMC development.
- Experience applying various Informatica transformations and working with different types of sources.
- Ability to write complex T-SQL, stored procedures, and views; experience in SQL Server 2014 and above.
- Exposure to DevOps and API architecture.
- Experience leading small teams (5-8 developers).
- Good knowledge and experience of Java 1.8 or above.
- Experience in PostgreSQL and NoSQL databases like MongoDB.
- Good knowledge of coding best practices and the ability to review peers' code.
- Produce clean, efficient code based on specifications, and troubleshoot, debug, and upgrade existing software.

Posted 6 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Manage and maintain OBIEE (Oracle Business Intelligence Enterprise Edition) environments. Oversee installation, configuration, and performance tuning of OBIEE systems to ensure high availability and efficient reporting.

Posted 6 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Develop ETL solutions using Informatica PowerCenter.

Posted 6 days ago

Apply

5.0 - 6.0 years

7 - 8 Lacs

Pune

Work from Office

Develop and manage data solutions using Snowflake, focusing on optimizing data storage, integration, and processing. Ensure data consistency and provide analytical insights through Snowflake’s cloud data platform.

Posted 6 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Design and implement data integration and management solutions using Informatica Big Data Management (BDM). Ensure efficient handling of large data sets, optimizing performance and ensuring seamless data flow across systems.

Posted 6 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Develop and manage ETL processes using Informatica, ensuring smooth data extraction, transformation, and loading across multiple systems. Optimize data workflows to ensure high-quality data management.

Posted 6 days ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Mumbai

Work from Office

Design and optimize ETL workflows using Talend. Ensure data integrity and process automation.

Posted 6 days ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

About ProcDNA
ProcDNA is a global consulting firm. We fuse design thinking with cutting-edge technology to create game-changing Commercial Analytics and Technology solutions for our clients. We're a passionate team of 275+ across 6 offices, all growing and learning together since our launch during the pandemic. Here, you won't be stuck in a cubicle - you'll be out in the open water, shaping the future with brilliant minds. At ProcDNA, innovation isn't just encouraged; it's ingrained in our DNA. Ready to join our epic growth journey?

What We Are Looking For
You'll be driving the adoption of the latest technologies in our solutions, bringing in thought leadership to guide clients on complex data management problems, and driving business performance. You will work with the leadership team to bring subject matter expertise in areas such as Big Data, ETL, Reporting, CRM, Data Warehousing, MDM, DevOps, Software Development, etc. We are seeking an individual who not only possesses the requisite expertise but also thrives in the dynamic landscape of a fast-paced global firm.

What You'll Do
- Lead end-to-end data management solution projects for multiple clients across data engineering and BI technologies.
- Create the project management plan and ensure adherence to project timelines.
- Integrate multiple data sources into one visualization to tell a story.
- Interact with customers to understand their business problems and provide best-in-class analytics solutions.
- Interact with Data Platform leaders and understand data flows that integrate into Tableau/analytics.
- Understand data governance, quality, and security, and integrate analytics with these enterprise platforms.
- Interact with UX/UI global functions and design best-in-class visualizations for customers, harnessing all product capabilities.

Must Have
- 7-10 years of data warehousing and data engineering experience.
- Experience interacting with Life Science clients directly, discussing requirements, and stakeholder management.
- Experience in requirement gathering and designing enterprise warehouse solutions from scratch.
- Hands-on experience with ETL tools like ADF, Databricks, and Informatica; experience with data pipeline and workflow management tools such as Azkaban, Luigi, and Airflow; experience with data warehouses: SQL/NoSQL, Amazon Redshift, Snowflake, Apache Hive, HDFS, etc.
- BI tools knowledge and experience leading the implementation of dashboards.
- Deep understanding of data governance and data quality management frameworks.
- Strong communication and presentation skills with a strong problem-solving attitude.
- Excellent analytical, problem-solving, and debugging skills, with a strong ability to quickly learn and comprehend business processes and problems in order to develop effective technical solutions.

Skills: mdm, sql, hdfs, data warehousing, big data, devops, cloud, amazon redshift, snowflake, pharmaceutical consulting, data management, apache hive, azure, reporting, problem-solving, luigi, informatica, analytical skills, presentation skills, data governance, adf, data engineering, crm, databricks, bi technologies, airflow, team management, business technology, aws, azkaban, software development, etl, client management, data quality management, life science

Posted 6 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Lead data governance initiatives using Collibra. Manage data policies, procedures, and standards across the organization. Ensure data quality, compliance, and accessibility. Collaborate with data stewards and business units to define data ownership and accountability. Provide training on data governance tools and principles, and support the implementation of Collibra across business units.

Posted 6 days ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai

Work from Office

Design and implement data integration solutions using IBM Cognos. Focus on extracting, transforming, and loading data between various systems to provide accurate and actionable business insights.

Posted 6 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Hyderabad

Work from Office

Lead the development, implementation, and management of Power BI reporting solutions, focusing on data modeling, dashboards, and business intelligence.

Posted 6 days ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Design and optimize ETL processes using Informatica PowerCenter, PL/SQL, and Oracle 10g. Improve data integration performance.

Posted 6 days ago

Apply

4.0 - 5.0 years

6 - 7 Lacs

Bengaluru

Work from Office

Implement and manage Collibra’s data governance platform, ensuring the proper classification, access, and compliance of data across the organization. Provide support and training to users.

Posted 6 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies