5.0 - 10.0 years
20 - 35 Lacs
Bengaluru
Remote
Notice Period: Immediate to 30 days

Role Overview: As a Collibra Expert, you will be responsible for implementing, maintaining, and optimizing the Collibra Data Governance Platform to ensure data quality, governance, and lineage across the organization. You will partner with cross-functional teams to develop data management strategies and integrate Collibra solutions with Google Cloud Platform (GCP) to create a robust, scalable, and efficient data governance framework for the retail domain.

Key Responsibilities:
- Data Governance Management: Design, implement, and manage the Collibra Data Governance Platform for data cataloging, data quality, and data lineage within the retail domain.
- Collibra Expertise: Utilize Collibra for metadata management, data quality monitoring, policy enforcement, and data stewardship across various business units.
- Data Cataloging: Lead the implementation and continuous improvement of data cataloging processes to enable a centralized, user-friendly view of the organization's data assets.
- Data Quality Management: Collaborate with business and technical teams to ensure that data is high-quality, accessible, and actionable. Define data quality rules and KPIs to monitor data accuracy, completeness, consistency, and timeliness.
- Data Lineage Implementation: Build and maintain comprehensive data lineage models to visualize the flow of data from source to consumption, ensuring compliance with data governance standards.
- GCP Integration: Architect and implement seamless integrations between Collibra and Google Cloud Platform (GCP) tools such as BigQuery, Dataflow, and Cloud Storage, ensuring data governance policies are enforced in the cloud environment.
- Collaboration & Stakeholder Management: Collaborate with Data Engineers, Analysts, Business Intelligence teams, and leadership to define and implement data governance best practices and standards.
- Training & Support: Provide ongoing training and support to business users and technical teams on data governance practices, Collibra platform usage, and GCP-based solutions.
- Compliance & Security: Ensure data governance initiatives comply with internal policies, industry standards, and regulations (e.g., GDPR, CCPA).

Key Requirements:
- Proven Expertise in Collibra: Hands-on experience implementing and managing the Collibra Data Governance Platform (cataloging, lineage, data quality).
- Google Cloud Platform (GCP) Proficiency: Strong experience with GCP tools (BigQuery, Dataflow, Pub/Sub, Cloud Storage, etc.) and integrating them with Collibra for seamless data governance.
- Data Quality and Lineage Expertise: In-depth knowledge of data quality frameworks, metadata management, and data lineage implementation.
- Retail Industry Experience: Prior experience in data governance within the retail or eCommerce domain is a plus.
- Technical Skills: Strong understanding of cloud data architecture and best practices for managing data at scale in the cloud (preferably GCP).
- Problem-Solving and Analytical Skills: Ability to analyze complex data governance issues and find practical solutions to ensure high-quality data management across the organization.
- Excellent Communication Skills: Ability to communicate effectively with both technical and non-technical stakeholders to advocate for data governance best practices.
- Certifications: Relevant certifications in Collibra, Google Cloud, or Data Governance are highly desirable.

Education & Experience:
- Bachelor's degree (B.Tech/BE) mandatory; Master's optional.
- 5+ years of experience in Data Governance, with at least 3 years of specialized experience in Collibra and GCP.
- Experience working with data teams in a retail environment is a plus.
Posted 3 weeks ago
15.0 - 20.0 years
20 - 25 Lacs
Indore, Hyderabad, Ahmedabad
Work from Office
Locations: Offices in Austin (USA), Singapore, Hyderabad, Indore, Ahmedabad (India)
Primary Job Location: Hyderabad / Indore / Ahmedabad (India)
Role Type: Full-time | Onsite

What You Will Do

Role Overview: As a Data Governance Architect, you will define and lead enterprise-wide data governance strategies, design robust governance architectures, and enable seamless implementation of tools like Microsoft Purview, Informatica, and other leading data governance platforms. This is a key role bridging compliance, data quality, security, and metadata management across cloud and enterprise ecosystems.

Key Responsibilities:
1. Strategy, Framework, and Operating Model: Define governance strategies, standards, and policies for compliance and analytics readiness. Establish a governance operating model with clear roles and responsibilities. Conduct maturity assessments and lead change management efforts.
2. Metadata, Lineage & Glossary Management: Architect technical and business metadata workflows. Validate end-to-end lineage across ADF → Synapse → Power BI. Govern glossary approvals and term workflows.
3. Policy & Data Classification Management: Define and enforce rules for classification, access, retention, and sharing. Leverage Microsoft Information Protection (MIP) for automation. Ensure alignment with GDPR, HIPAA, CCPA, and SOX.
4. Data Quality Governance: Define quality KPIs, validation logic, and remediation rules. Build scalable frameworks embedded in pipelines and platforms.
5. Compliance, Risk & Audit Oversight: Establish compliance standards, dashboards, and alerts. Enable audit readiness and reporting through governance analytics.
6. Automation & Integration: Automate workflows using PowerShell, Azure Functions, Logic Apps, and REST APIs. Integrate governance into Azure Monitor, Synapse Link, Power BI, and third-party tools.

Primary Skills: Microsoft Purview Architecture & Administration; Data Governance Framework Design; Metadata & Data Lineage Management (ADF → Synapse → Power BI); Data Quality and Compliance Governance; Informatica / Collibra / BigID / Alation / Atlan; PowerShell, REST APIs, Azure Functions, Logic Apps; RBAC, Glossary Governance, Classification Policies; MIP, Insider Risk, DLP, Compliance Reporting; Azure Data Factory; Agile Methodologies

#DataGovernance #MicrosoftPurview #GovernanceArchitect #MetadataManagement #DataLineage #DataQuality #Compliance #RBAC #PowerShell #RESTAPI #Informatica #Collibra #BigID #AzureFunctions #ADF #Synapse #PowerBI #GDPR #HIPAA #CCPA #SOX #OnsiteJobs #HyderabadJobs #IndoreJobs #AhmedabadJobs #HiringNow #DataPrivacy #EnterpriseArchitecture #DSPM #GovernanceStrategy #InformationSecurity
Posted 3 weeks ago
7.0 - 11.0 years
12 - 18 Lacs
Mumbai, Indore, Hyderabad
Work from Office
Key Responsibilities:
1. Governance Strategy & Stakeholder Enablement: Define and drive enterprise-level data governance frameworks and policies. Align governance objectives with compliance, analytics, and business priorities. Work with IT, Legal, Compliance, and Business teams to drive adoption. Conduct training, workshops, and change management programs.
2. Microsoft Purview Implementation & Administration: Administer Microsoft Purview: accounts, collections, RBAC, and scanning policies. Design scalable governance architecture for large-scale data environments (>50 TB). Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, and Snowflake.
3. Metadata & Data Lineage Management: Design metadata repositories and workflows. Ingest technical/business metadata via ADF, REST APIs, PowerShell, and Logic Apps. Validate end-to-end lineage (ADF → Synapse → Power BI), impact analysis, and remediation.
4. Data Classification & Security: Implement and govern sensitivity labels (PII, PCI, PHI) and classification policies. Integrate with Microsoft Information Protection (MIP), DLP, Insider Risk, and Compliance Manager. Enforce lifecycle policies, records management, and information barriers.

Also required: working knowledge of GDPR, HIPAA, SOX, and CCPA, plus strong communication and leadership to bridge technical and business governance.

Location: Mumbai, Hyderabad, Indore, Ahmedabad
Posted 3 weeks ago
3.0 - 8.0 years
10 - 12 Lacs
Hyderabad, Chennai, Mumbai (All Areas)
Hybrid
POSITION: Data Documentarian
LOCATION: Global (remote)
REPORTS TO: DAPS Team (Data, Analytics, Products, and Strategy)

JOB SUMMARY
Client is seeking a team member with experience documenting key aspects of enterprise data. The Data Documentarian will play a key role in creating and maintaining high-quality documentation of data assets, including but not limited to definitions, procedures, methodologies, data flows, ownership, stewardship, and metadata. This work will enable the client to improve data transparency, governance, and literacy across the organization. The ideal candidate will support documentation, cataloging, research, and knowledge management efforts for data practices across various functional groups. This is an entrepreneurial environment that fosters learning, continuous improvement, and cross-functional collaboration. The Data Documentarian role is internal-facing and offers meaningful opportunities to shape the organization's data culture and maturity.

Client is a leading social impact and performance solutions firm that serves state, local, education, technology, and commercial clients across the U.S. and abroad. By elevating education systems, managing and securing critical networks, solving complex human capital and fiscal problems, and advancing equity as a performance imperative, we impact communities for good through strong client partnerships. Celebrating its 50th year in 2024, the firm attracts exceptional talent and empowers them to exceed client expectations as they navigate the dynamic needs of those we serve.

MAJOR AREAS OF RESPONSIBILITY
- Maintain and evolve the organization's data documentation, including the data dictionary, glossary, lineage, ownership, and governance protocols.
- Write and maintain clear, accurate, and user-friendly guides, training materials, and standard operating procedures (SOPs) for data-related workflows.
- Develop templates, standards, and best practices for documenting data processes, transformation logic, source systems, and change management workflows.
- Partner with analysts, engineers, and business stakeholders to identify and codify definitions and business logic behind critical data elements.
- Collaborate with data governance, systems, and analytics teams to ensure metadata and lineage are accurately captured and accessible.
- Support the roll-out of data catalog and metadata management tools; contribute content and structure to support usability.
- Support cross-functional efforts to improve data literacy and ensure documentation reflects evolving data practices.
- Track updates, revisions, and the lifecycle of key data assets to ensure documentation stays current and trusted.
- Conduct interviews and research with subject matter experts to document tacit knowledge and institutional memory about data usage and history.
- Support the development of onboarding and training materials to orient new employees to data sources and reporting conventions.

MINIMUM QUALIFICATIONS
- Bachelor's degree from an accredited college or university.
- Three (3) or more years of experience in technical writing, business analysis, data governance, information science, or related roles.
- Excellent English-language writing, editing, and organization skills with strong attention to detail.
- Familiarity with common data and analytics concepts, including data lineage, ETL/ELT processes, and data modeling.
- Experience developing publication-ready documentation with professional formatting and visual design for distribution to stakeholders and end users.

PREFERRED QUALIFICATIONS
- Experience with data catalog tools and metadata standards.
- Strong interpersonal and communication skills; able to translate technical concepts for diverse audiences.
- Ability to thrive in a fast-paced, dynamic environment with multiple ongoing initiatives.
- A collaborative mindset and interest in helping teams work more effectively with data.
- A sense of humor and curiosity about how things work.
Posted 3 weeks ago
5.0 - 9.0 years
8 - 12 Lacs
Noida
Work from Office
5-9 years of experience in Data Engineering and software development, including ELT/ETL and data extraction and manipulation in Data Lake/Data Warehouse environments. Expert-level hands-on experience with the following:
- Python, SQL
- PySpark
- DBT and Apache Airflow
- DevOps, Jenkins, CI/CD
- Data Governance and Data Quality frameworks
- Data Lakes, Data Warehouses
- AWS services including S3, SNS, SQS, Lambda, EMR, Glue, Athena, EC2, VPC, etc.
- Source code control: GitHub, VSTS, etc.

Mandatory Competencies: Python; Database - SQL; Data on Cloud - AWS S3; DevOps - CI/CD; DevOps - GitHub; ETL - AWS Glue; Behavioral - Communication
Posted 3 weeks ago
4.0 - 9.0 years
7 - 11 Lacs
Hyderabad
Work from Office
1. Role Overview
Serve as the Master Data (MD) Engineer for SAP Plant Maintenance (PM) master data within an Enterprise Asset Management (EAM) framework. Ensure accuracy, completeness, and integrity of master data (equipment, functional locations, maintenance plans, BOMs, task lists) aligned with EAM practices. Collaborate with engineering, maintenance, and data analysis teams to understand business needs and drive data quality and process efficiency.

2. Key Responsibilities
- Master Data Development: Develop PM master data to standard by extracting information from engineering documents and legacy data.
- Master Data Governance: Create, maintain, and manage SAP PM master data following standards and governance frameworks.
- Data Quality & Reporting: Run quality audits, cleanse data, and maintain KPIs, dashboards, and analytics using Excel, Power BI, etc.
- EAM-Centric Modelling: Structure and maintain equipment, functional locations, BOMs, maintenance plans, and technical objects in line with EAM best practices.
- Process Improvement: Support identification of gaps, propose enhancements, draft SOPs, and support continuous improvement initiatives.
- Planning & Tactics: Develop standard maintenance procedures from various maintenance documents. Create maintenance plans and schedules in SAP PM.

3. Required Experience
- Education: Bachelor's in Engineering (Mechanical/Electrical/Mining/Instrumentation/Production).
- SAP Expertise: 2-3 years working with SAP PM, functional locations, equipment structures, maintenance plans, and BOMs.
- EAM Knowledge: Familiar with EAM best practices: asset hierarchy, lifecycle, preventive maintenance.
- Maintenance: Hands-on experience in industrial maintenance practices. Good awareness of different types of rotary and static equipment, viz. scrubbers, dryers, screens, crushers, etc.
- Data Analysis Tools: Skilled in Excel (advanced), SQL, and Power BI.
- Data Governance: Hands-on experience in governance, data cleansing, and audit processes.
- Communication & Training: Strong stakeholder management, with the ability to document processes.
- Soft Skills: Detail-oriented, analytical, able to multitask, and driven to pursue continuous improvement in a cross-functional setting.

4. Nice to Have
- Experience in PM master data remediation projects.
- Exposure to different types of maintenance practices and their implementation in SAP PM.
- Exposure to reliability-centered maintenance.
Posted 3 weeks ago
5.0 - 10.0 years
10 - 15 Lacs
Pune
Work from Office
We are looking for a highly skilled and experienced Data Engineer with over 5 years of experience to join our growing data team. The ideal candidate will be proficient in Databricks, Python, PySpark, and Azure, and have hands-on experience with Delta Live Tables. In this role, you will be responsible for developing, maintaining, and optimizing data pipelines and architectures to support advanced analytics and business intelligence initiatives. You will collaborate with cross-functional teams to build robust data infrastructure and enable data-driven decision-making.

Key Responsibilities:
- Design, develop, and manage scalable and efficient data pipelines using PySpark and Databricks
- Build and optimize Spark jobs for processing large volumes of structured and unstructured data
- Integrate data from multiple sources into data lakes and data warehouses on the Azure cloud
- Develop and manage Delta Live Tables for real-time and batch data processing
- Collaborate with data scientists, analysts, and business teams to ensure data availability and quality
- Ensure adherence to best practices in data governance, security, and compliance
- Monitor, troubleshoot, and optimize data workflows and ETL processes
- Maintain up-to-date technical documentation for data pipelines and infrastructure components

Qualifications:
- 5+ years of hands-on experience in Databricks platform development.
- Proven expertise in Delta Lake and Delta Live Tables.
- Strong SQL and Python/Scala programming skills.
- Experience with cloud platforms such as Azure, AWS, or GCP (preferably Azure).
- Familiarity with data modeling and data warehousing concepts.
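For context on the Delta Live Tables work this role mentions, here is a minimal sketch of a DLT pipeline in Python. Table names and the landing path are hypothetical, and the code runs only inside a Databricks DLT pipeline, where the `dlt` module and `spark` session are provided by the platform:

```python
import dlt
from pyspark.sql.functions import col

# Bronze layer: incrementally ingest raw JSON files with Auto Loader.
@dlt.table(comment="Raw events landed from cloud storage")
def raw_events():
    return (
        spark.readStream.format("cloudFiles")   # Databricks Auto Loader
        .option("cloudFiles.format", "json")
        .load("/mnt/landing/events/")           # hypothetical landing path
    )

# Silver layer: enforce a data quality expectation and keep needed columns.
@dlt.table(comment="Validated events for downstream analytics")
@dlt.expect_or_drop("valid_event_id", "event_id IS NOT NULL")
def clean_events():
    return dlt.read_stream("raw_events").select(
        col("event_id"), col("event_type"), col("event_ts")
    )
```

The same pipeline definition serves both streaming and triggered (batch) execution, which is why DLT suits the "real-time and batch" requirement above.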
Posted 3 weeks ago
3.0 - 5.0 years
5 - 8 Lacs
Noida
Work from Office
Must have:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related discipline.
- 3-5 years of experience in SQL Development and Data Engineering.
- Strong hands-on skills in T-SQL, including complex joins, indexing strategies, and query optimization.
- Proven experience in Power BI development, including building dashboards, writing DAX expressions, and using Power Query.

Should have:
- At least 1+ year of hands-on experience with one or more components of the Azure Data Platform: Azure Data Factory (ADF), Azure Databricks, Azure SQL Database, Azure Synapse Analytics.
- Solid understanding of data warehouse architecture, including star and snowflake schemas, and data lake design principles.
- Familiarity with Data Lake and Delta Lake concepts, Lakehouse architecture, and data governance, data lineage, and security controls within Azure.
Posted 3 weeks ago
8.0 - 12.0 years
35 - 45 Lacs
Hyderabad, Pune, Delhi / NCR
Hybrid
Write specifications for Master Data Management (MDM) builds. Create requirements, including rules of survivorship, for migrating data to Markit EDM. Support implementation of data governance. Support testing. Develop data quality reports for the data warehouse.

Required Candidate Profile: 5+ years of experience documenting data management requirements. Experience writing technical specifications for MDM builds. Familiarity with enterprise data warehouses. Knowledge of data governance.
Posted 3 weeks ago
8.0 - 12.0 years
40 - 45 Lacs
Bhubaneswar, Bengaluru, Delhi / NCR
Hybrid
Write specifications for Master Data Management (MDM) builds. Create requirements, including rules of survivorship, for migrating data to Markit EDM. Support implementation of data governance. Support testing. Develop data quality reports for the data warehouse.

Required Candidate Profile: 5+ years of experience documenting data management requirements. Experience writing technical specifications for MDM builds. Familiarity with enterprise data warehouses. Knowledge of data governance.
Posted 3 weeks ago
2.0 - 4.0 years
10 - 11 Lacs
Pune
Work from Office
Role & Responsibilities:
- Strong data engineering knowledge and cloud development exposure.
- Proficiency in Python.
- Proficiency in both RDBMS (MySQL preferred) and NoSQL datastores.
- Experience with technologies such as Spark, Cassandra, the AWS data pipeline stack (Athena, S3, Glue Data Catalog, etc.), and Airflow; a minimal orchestration sketch follows this listing.
- Very comfortable with data lakes, warehouses, and ETL/ELT paradigms.
- Worked in an agile development environment.
- Optional: basic knowledge of statistical analysis, mathematical modelling, and machine learning.

Experience:
- Hands-on with Microservices, Docker, Kubernetes, Gradle/Ant, Kafka, and Git/Bitbucket in an agile workplace.
- Develops high-quality code with strong unit/integration tests; comfort with test-driven development is a plus.
- Comfortable exploring proven open-source tech such as Grafana, Kibana, Jira, Prometheus, caches like Redis/Memcached, and task queues like Celery, to name a few.
- Proficiency in SQL, Python, Java, Spring Boot, Hibernate, and REST API development is a good plus.

Preferred Candidate Profile:
- Last organization preference: product-based, Fintech, NBFC.
- Minimum 2 years of relevant experience in Data Engineering.
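As a rough illustration of the Airflow-based orchestration this stack implies, here is a minimal DAG sketch. The DAG id, task names, and schedule are assumptions for the example, and the task bodies are placeholders rather than a real extract/load implementation:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_to_s3(**context):
    # Placeholder: pull a source extract and write it to S3.
    print("extracting...")

def load_to_warehouse(**context):
    # Placeholder: load curated data into the warehouse (e.g., via Athena/Glue).
    print("loading...")

with DAG(
    dag_id="daily_elt",                 # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                  # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3",
                             python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_to_warehouse",
                          python_callable=load_to_warehouse)
    extract >> load                     # simple linear dependency
```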
Posted 3 weeks ago
10.0 - 15.0 years
35 - 40 Lacs
Gurugram, Bengaluru
Work from Office
Department: Technology
Reports To: Middle and Back Office Data Product Owner

About your team
The Technology function provides IT services that are integral to running an efficient run-the-business operating model and providing change-driven solutions to meet outcomes that deliver on our business strategy. These include the development and support of business applications that underpin our revenue, operational, compliance, finance, legal, marketing, and customer service functions. The broader organisation incorporates Infrastructure services that the firm relies on to operate on a day-to-day basis, including data centre, networks, proximity services, security, voice, incident management, and remediation. The ISS Technology group is responsible for providing technology solutions to the Investment Solutions & Services (ISS) business (which covers the Investment Management, Asset Management Operations & Distribution business units globally). The ISS Technology team supports and enhances existing applications as well as designs, builds, and procures new solutions to meet requirements and enable the evolving business strategy. As part of this group, a dedicated ISS Data Programme team has been mobilised as a key foundational programme to support the execution of the overarching ISS strategy.

About your role
The Middle and Back Office Data Analyst role is instrumental in the creation and execution of a future-state design for Fund Servicing & Oversight data across Fidelity's key business areas. The successful candidate will have an in-depth knowledge of data domains that represent Middle and Back Office operations and technology. The role will sit within the ISS Delivery Data Analysis chapter, fully aligned to deliver Fidelity's cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future-state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role is to maintain strong relationships with the various business contacts to ensure a superior service to our clients.

Data Product - Requirements Definition and Delivery of Data Outcomes
- Analysis of data product requirements to enable business outcomes, contributing to the data product roadmap.
- Capture both functional and non-functional data requirements, considering the data product and consumers' perspectives.
- Conduct workshops with both business and tech stakeholders for requirements gathering, elicitation, and walkthroughs.
- Responsible for the definition of data requirements, epics, and stories within the product backlog, and for providing analysis support throughout the SDLC.
- Responsible for supporting the UAT cycles, attaining business sign-off on outcomes being delivered.

Data Quality and Integrity
- Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality.
- Align the functional solution with best-practice data architecture and engineering principles.

Coordination and Communication
- Excellent communication skills to influence technology and business stakeholders globally, attaining alignment and sign-off on the requirements.
- Coordinate with internal and external stakeholders to communicate data product deliveries and the change impact to the operating model. An advocate for the ISS Data Programme.
- Collaborate closely with Data Governance, Business Architecture, and data owners.
- Conduct workshops within the scrum teams and across business teams, effectively document the minutes, and drive the actions.

About you
- At least 10 years of proven experience as a business/technical/data analyst within technology and/or business change in the financial services/asset management industry.
- Minimum 5 years as a senior business/technical/data analyst adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet, etc.
- Proven experience of delivering data-driven business outcomes using industry-leading data platforms such as Snowflake.
- Excellent knowledge of the data life cycle that drives Middle and Back Office capabilities such as trade execution, matching, confirmation, trade settlement, record keeping, accounting, fund and cash positions, custody, collateral/margin movements, corporate actions, and derivations and calculations such as holiday handling, portfolio turnover rates, and funds-of-funds look-through.
- In-depth expertise in data and calculations across the investment industry, covering:
  - Asset-specific data: data related to financial instruments' reference data, such as asset specifications, maintenance records, usage history, and depreciation schedules.
  - Market data: data such as security prices, exchange rates, index constituents, and the licensing restrictions on them.
  - ABOR & IBOR data: calculation engines covering input data sets, calculations, and treatment of various instruments for ABOR and IBOR data, leveraging platforms such as SimCorp, NeoXam, Invest1, Charles River, Aladdin, etc.
  - Knowledge of TPAs and how data can be structured in a unified way from heterogeneous structures.
- Problem solving, attention to detail, and critical thinking.
- Technical Skills: excellent hands-on SQL, advanced Excel, Python, ML (optional), and proven experience and knowledge of data solutions. Knowledge of data management, data governance, and data engineering practices. Hands-on experience with data modelling techniques such as dimensional, data vault, etc.
- Willingness to own and drive things; collaboration across business and tech stakeholders.
Posted 3 weeks ago
10.0 - 15.0 years
10 - 15 Lacs
Mumbai
Work from Office
Develop predictive models to accurately forecast product demand and perform SKU segmentation based on sales velocity, margin, and seasonality. Maintain high data integrity standards, ensuring accurate inputs across multiple business units and systems.

Required Candidate Profile: Bachelor's or Master's degree in Data Science, Business Analytics, or Supply Chain Management. Advanced skills in Excel, SQL, and BI/visualisation tools (e.g., Power BI, Tableau). Oversee MIS platforms.
Posted 3 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
We are seeking a skilled Python Developer with expertise in Django, Flask, and API development to join our growing team. The Python Developer will be responsible for designing and implementing backend services, APIs, and integrations that power our core platform. The ideal candidate should have a strong foundation in Python programming, experience with the Django and/or Flask frameworks, and a proven track record of delivering robust and scalable solutions.

Responsibilities:
- Design, develop, and maintain backend services and APIs using Python frameworks such as Django and Flask.
- Collaborate with front-end developers, product managers, and stakeholders to translate business requirements into technical solutions.
- Build and integrate RESTful APIs for seamless communication between our applications and external services.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
- 5+ years of professional experience as a Python Developer, with a focus on backend development.

Secondary Skills: Amazon Elastic File System (EFS), Amazon Redshift, Amazon S3, Apache Spark, Ataccama DQ Analyzer, AWS, Apache Airflow, AWS Athena, Azure Data Factory, Azure Data Lake Storage Gen2 (ADLS), Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse Analytics, BigID, C++, Cloud Storage, Collibra Data Governance (DG), Collibra Data Quality (DQ), Data Lake Storage, Data Vault Modeling, Databricks, Dataproc, DDI, Dimensional Data Modeling, EDC AXON, Electronic Medical Record (EMR), Extract Transform & Load (ETL), Financial Services Logical Data Model (FSLDM), Google Cloud Platform (GCP) BigQuery, GCP Bigtable, GCP Dataproc, HQL, IBM InfoSphere Information Analyzer, IBM Master Data Management (MDM), Informatica Data Explorer, Informatica Data Quality (IDQ), Informatica Intelligent Data Management Cloud (IDMC), Informatica Intelligent MDM SaaS, Inmon methodology, Java, Kimball methodology, Metadata Encoding & Transmission Standards (METS), Metasploit, Microsoft Excel, Microsoft Power BI, NewSQL, NoSQL, OpenRefine, OpenVAS, performance tuning, Python, R, RDD optimization, SAS, SQL, Tableau, Tenable Nessus, TIBCO Clarity.
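To illustrate the kind of RESTful backend work described above, here is a minimal Flask sketch. The resource name, routes, and in-memory store are illustrative assumptions, not the client's actual API:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# In-memory store standing in for a real database layer (illustrative only).
ORDERS = {}

@app.route("/api/orders/<int:order_id>", methods=["GET"])
def get_order(order_id):
    order = ORDERS.get(order_id)
    if order is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(order)

@app.route("/api/orders", methods=["POST"])
def create_order():
    payload = request.get_json(force=True)
    order_id = len(ORDERS) + 1
    ORDERS[order_id] = {"id": order_id, **payload}
    return jsonify(ORDERS[order_id]), 201

if __name__ == "__main__":
    app.run(port=5000)
```

A Django equivalent would typically express the same endpoints as a model plus a Django REST Framework serializer and viewset.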
Posted 3 weeks ago
5.0 - 7.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Position: Product Manager
Locations: Hyderabad / Pune
Experience Level: 5+ Years

About Us: Lera Technologies is a future-focused, AI-led digital transformation company that empowers businesses to innovate and grow in today's fast-paced technology landscape. Our core strength lies in our flagship products like the 9X Data Platform, a state-of-the-art solution for seamless data ecosystem management. Additionally, FinSight 360 is our advanced GenBI platform that elevates decision-making through intelligent business insights. We partner with enterprises through an ensemble of services to solve complex challenges around data modernization, integration, governance, and operational efficiency. By fostering a culture of continuous innovation and client-centricity, we deliver scalable, impactful solutions that drive measurable business outcomes. At Lera, we don't just enable transformation. We engineer it!

We are looking for a Technical Product Manager (TPM) who is passionate about building high-impact data platforms and analytics products. You will serve as the bridge between engineering, data science, and business teams, owning the end-to-end product lifecycle from ideation through delivery and beyond. The ideal candidate will have deep technical fluency, strong stakeholder management skills, and a track record of driving data-centric products from concept to production.

What You Bring:
- 5+ years of experience as a Technical Product Manager, ideally in a data engineering, analytics, or enterprise SaaS environment.
- Strong technical background: you can discuss architecture, databases, cloud infrastructure, APIs, and dev workflows with confidence.
- Experience working with data engineering tools and platforms (e.g., Spark, Airflow, Kafka, Snowflake, dbt, etc.).
- Proven track record of shipping technically complex products at scale.
- Strong understanding of agile methodologies and product lifecycle management.
- Excellent communication and stakeholder management skills.
- Ability to make data-driven product decisions and manage competing priorities.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.

Desirable Skills:
- Hands-on coding or data engineering experience.
- Experience working with cloud platforms (AWS, GCP, Azure).
- Exposure to ML/AI product workflows and platforms.
- Familiarity with data governance, security, and compliance frameworks.

Your Role: As a Product Manager, you will
- Own and evolve the roadmap for core data engineering products (e.g., data pipelines, ETL frameworks, data platforms, APIs, real-time streaming services).
- Define clear product requirements and technical specifications based on business and user needs.
- Collaborate closely with engineering to scope technical feasibility, manage trade-offs, and ensure timely delivery.
- Act as a translator between technical and non-technical stakeholders, ensuring alignment across business goals and engineering execution.
- Drive agile development processes, including sprint planning, backlog grooming, and user story definition.
- Lead product discovery, validation, and rollout strategies for new features and platform enhancements.
- Identify technical risks and work with architects/engineers to mitigate them early in the product lifecycle.
- Define product success metrics and use data to inform ongoing prioritization and performance tuning.
- Stay ahead of market and technology trends in data infrastructure, cloud platforms, and analytics tooling.

Why Choose LERA
- I.C.E. Philosophy: Embrace our core values of Innovation, Creativity, and Experimentation. We encourage boundary-pushing ideas and solutions.
- Impact: Make a significant impact on our clients' success across various industries through strategic data solutions.
- Culture: Thrive in a workplace that celebrates diversity and inclusive excellence.
- Professional Growth: Benefit from opportunities for professional development and advancement in a supportive environment.

Join Us: We invite you to apply if you are ready to take ownership, shape impactful solutions, and grow with a dynamic team!
LERA: Pioneering solutions, inspiring leaders. Apply today and be a part of shaping the digital future.
Posted 3 weeks ago
7.0 - 9.0 years
9 - 11 Lacs
Indore, Hyderabad, Ahmedabad
Work from Office
We're Hiring: Data Governance Lead
Locations: Offices in Austin (USA), Singapore, Hyderabad, Indore, Ahmedabad (India)
Primary Job Location: Mumbai / Hyderabad / Indore / Ahmedabad (Work from Office)
Compensation Range: Competitive | Based on experience and expertise
To Apply, Share Your Resume With: Current CTC, Expected CTC, Notice Period, Preferred Location

What You Will Do

Key Responsibilities:
1. Governance Strategy & Stakeholder Enablement: Define and drive enterprise-level data governance frameworks and policies. Align governance objectives with compliance, analytics, and business priorities. Work with IT, Legal, Compliance, and Business teams to drive adoption. Conduct training, workshops, and change management programs.
2. Microsoft Purview Implementation & Administration: Administer Microsoft Purview: accounts, collections, RBAC, and scanning policies. Design scalable governance architecture for large-scale data environments (>50 TB). Integrate with Azure Data Lake, Synapse, SQL DB, Power BI, and Snowflake.
3. Metadata & Data Lineage Management: Design metadata repositories and workflows. Ingest technical/business metadata via ADF, REST APIs, PowerShell, and Logic Apps. Validate end-to-end lineage (ADF → Synapse → Power BI), impact analysis, and remediation.
4. Data Classification & Security: Implement and govern sensitivity labels (PII, PCI, PHI) and classification policies. Integrate with Microsoft Information Protection (MIP), DLP, Insider Risk, and Compliance Manager. Enforce lifecycle policies, records management, and information barriers.

Also required: working knowledge of GDPR, HIPAA, SOX, and CCPA, plus strong communication and leadership to bridge technical and business governance.
Posted 3 weeks ago
8.0 - 13.0 years
7 - 11 Lacs
Noida
Work from Office
We are looking for a skilled professional with 8 to 15 years of experience in MDG technologies, specifically SAP MDG. The ideal candidate will have a strong background in designing master data governance processes and mapping functional capabilities of SAP MDG to business needs.

Roles and Responsibilities: Design master data governance processes and map functional capabilities of SAP MDG to business needs. Define process models including entity types, change requests, and business activities within SAP MDG. Configure SAP MDG solutions for master data domains such as Material, Customer, Vendor, Finance, or others based on business requirements. Enhance SAP MDG data models, workflows, and user interfaces to meet specific organizational needs. Create functional specifications, test plans, and scripts for unit, integration, and user acceptance testing (UAT). Troubleshoot and resolve technical and functional issues in MDG implementations.

Requirements: Minimum 8 years of experience in MDG technologies, specifically SAP MDG. Strong knowledge of SAP MDG, master data management, and data governance. Experience in designing and implementing master data governance processes. Ability to map functional capabilities of SAP MDG to business needs. Strong analytical and problem-solving skills. Excellent communication and training skills. A graduate degree is required for this position.
Posted 3 weeks ago
10.0 - 17.0 years
35 - 60 Lacs
Noida, Gurugram, Bengaluru
Hybrid
This is an individual contributor role. Looking for candidates from a Product/Life Sciences/Pharma/Consulting background only.

POSITION: Data Architect
LOCATION: NCR/Bangalore/Gurugram
PRODUCT: Axtria DataMAx is a global cloud-based data management product specifically designed for the life sciences industry. It facilitates the rapid integration of both structured and unstructured data sources, enabling accelerated and actionable business insights from trusted data. This product is particularly useful for pharmaceutical companies looking to streamline their data processes and enhance decision-making capabilities.

JOB OBJECTIVE: To leverage expertise in data architecture and management to design, implement, and optimize a robust data warehousing platform for the pharmaceutical industry. The goal is to ensure seamless integration of diverse data sources, maintain high standards of data quality and governance, and enable advanced analytics through the definition and management of semantic and common data layers. Utilizing Axtria DataMAx and generative AI technologies, the aim is to accelerate business insights and support regulatory compliance, ultimately enhancing decision-making and operational efficiency.

Key Responsibilities:
- Data Modeling: Design logical and physical data models to ensure efficient data storage and retrieval. Strong expertise in data modelling is expected, with the ability to design complex data models from the ground up and clearly articulate the rationale behind design choices.
- ETL Processes: Develop and optimize ETL processes to accurately and efficiently move data from various sources into the data warehouse. Must have worked with different loading strategies for facts and dimensions, such as SCD, full load, incremental load, upsert, append-only, and rolling window (see the sketch after this listing).
- Infrastructure Design: Plan and implement the technical infrastructure, including hardware, software, and network components.
- Data Governance: Ensure compliance with regulatory standards and implement data governance policies to maintain data quality and security.
- Performance Optimization: Continuously monitor and improve the performance of the data warehouse to handle large volumes of data and complex queries.
- Semantic Layer Definition: Define and manage the semantic layer architecture and technology stack, managing the lifecycle of semantic constructs including consumption into downstream systems.
- Common Data Layer Management: Integrate data from multiple sources into a centralized repository, ensuring consistency and accessibility.
- Systems Architecture: Deep expertise in architecting enterprise-grade software systems that are performant, scalable, resilient, and manageable. Architecting GenAI-based systems is an added plus.
- Advanced Analytics: Enable advanced analytics and machine learning to identify patterns in genomic data, optimize clinical trials, and personalize medication.
- Generative AI: Should have worked with a production-ready use case for GenAI-based data.
- Stakeholder Engagement: Work closely with business stakeholders to understand their data needs and translate them into technical solutions.
- Cross-Functional Collaboration: Collaborate with IT, data scientists, and business analysts to ensure the data warehouse supports various analytical and operational needs.
- Cloud Warehouse Skills: Expertise in leading cloud data warehouse platforms (Snowflake, Databricks, and Amazon Redshift), with a deep understanding of their architectural nuances, strengths, and limitations, enabling the design and deployment of scalable, high-performance data solutions aligned with business objectives.

Qualifications: Proven experience in data architecture and data warehousing, preferably in the pharmaceutical industry. Strong knowledge of data modeling, ETL processes, and infrastructure design. Experience with data governance and regulatory compliance in the life sciences sector. Proficiency in using Axtria DataMAx or similar data management products. Excellent analytical and problem-solving skills. Strong communication and collaboration skills.

Preferred Skills: Familiarity with advanced analytics and machine learning techniques. Experience in managing semantic and common data layers. Knowledge of FDA guidelines, HIPAA regulations, and other relevant regulatory standards. Experience with generative AI technologies and their application in data warehousing.
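To make one of the loading strategies named above concrete, here is a minimal PySpark/Delta Lake sketch of an incremental upsert (merge) into a dimension table. Table paths, the key column, and the staging source are hypothetical, and the session assumes an environment with Delta Lake already configured (e.g., Databricks):

```python
from pyspark.sql import SparkSession
from delta.tables import DeltaTable

# Assumes Delta Lake is available on this cluster (e.g., a Databricks runtime).
spark = SparkSession.builder.appName("dim-upsert").getOrCreate()

# Incoming batch of changed customer records (hypothetical staging source).
updates = spark.read.parquet("/staging/customers_delta/")

dim = DeltaTable.forPath(spark, "/warehouse/dim_customer/")

# Upsert: update rows whose key already exists, insert the rest.
# This is SCD Type 1 semantics; SCD Type 2 would instead expire the
# current row (set an end date / current flag) and insert a new version.
(dim.alias("t")
 .merge(updates.alias("s"), "t.customer_id = s.customer_id")
 .whenMatchedUpdateAll()
 .whenNotMatchedInsertAll()
 .execute())
```

A rolling-window strategy, by contrast, would delete the trailing N partitions of a fact table and reload them, while append-only simply inserts new rows without matching.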
Posted 3 weeks ago
3.0 - 5.0 years
3 - 7 Lacs
Noida
Work from Office
We are looking for a skilled IDMC MDM Specialist with 3 to 5 years of relevant experience in Informatica IDMC implementation within the Pharma/Life Sciences domain. The ideal candidate will have expertise in the MDM SaaS platform, IDQ, CAI/CDI, regulatory compliance, and data governance.

Roles and Responsibilities: Implement and manage Informatica IDMC solutions for clients. Collaborate with cross-functional teams to ensure seamless integration of IDMC with other systems. Provide expert guidance on the MDM SaaS platform, IDQ, CAI/CDI, and regulatory compliance. Develop and maintain documentation of IDMC implementations and configurations. Troubleshoot and resolve issues related to IDMC implementation and data governance. Ensure compliance with industry standards and regulations in IDMC implementations.

Requirements: Minimum 3 years of experience in Informatica IDMC implementation. Expertise in the MDM SaaS platform, IDQ, CAI/CDI, regulatory compliance, and data governance. Strong understanding of data governance principles and practices. Excellent problem-solving skills and attention to detail. Ability to work collaboratively in a team environment. Strong communication and interpersonal skills.

Additional Info: This is a full-time, long-term position with our company.
Posted 3 weeks ago
8.0 - 12.0 years
11 - 15 Lacs
Noida
Work from Office
We are looking for a skilled Reltio Architect with 8 to 12 years of experience to lead the design and implementation of enterprise-level MDM solutions using the Reltio Cloud platform. This position is based in Ranchi and Noida.

Roles and Responsibilities: Lead the design and architecture of Reltio-based MDM solutions for large-scale enterprise systems. Collaborate with data governance, analytics, and business teams to define data domains and governance policies. Define data models, match rules, survivorship, hierarchies, and integration strategies. Provide technical leadership for Reltio implementations, including upgrades, optimizations, and scaling. Conduct solution reviews and troubleshoot complex data integration or performance issues. Mentor developers and ensure technical deliverables meet architectural standards.

Requirements: Minimum 8 years of experience in MDM, with at least 3 years in Reltio Cloud MDM. Expertise in Reltio data modeling, workflow design, integration strategy, match/merge, and hierarchy management. Experience designing large-scale Reltio implementations across multiple domains. Hands-on experience with Reltio APIs, Reltio Integration Hub, and Informatica/IICS. Strong background in enterprise architecture, data strategy, and cloud platforms (AWS/GCP/Azure). Strong problem-solving, leadership, and communication skills.
Posted 3 weeks ago
10.0 - 15.0 years
3 - 7 Lacs
Bengaluru
Work from Office
We are looking for a skilled MDM Engineer with extensive experience in Informatica MDM to join our team. The ideal candidate will be responsible for designing, developing, and maintaining our Master Data Management (MDM) solutions to ensure data accuracy, consistency, and reliability across the organization. This role requires 10-15 years of experience.

Roles and Responsibilities: Design and implement MDM solutions using Informatica MDM, ensuring alignment with business requirements and data governance standards. Develop and manage ETL processes to integrate data from various sources into the MDM system. Implement data quality rules and processes to ensure the accuracy and consistency of master data. Configure the Informatica MDM Hub, including data modeling, data mappings, match and merge rules, and user exits. Monitor and optimize the performance of MDM solutions, ensuring high availability and reliability. Collaborate with data stewards, business analysts, and other stakeholders to gather requirements and ensure the MDM solution meets their needs. Create and maintain comprehensive documentation for MDM processes, configurations, and best practices. Troubleshoot issues related to MDM processes and systems.

Requirements: Minimum 10 years of hands-on experience in MDM design, development, and support using Informatica MDM. Proficiency in Informatica MDM ETL processes and data integration technologies. Strong understanding of data governance, data quality, and master data management principles. Excellent problem-solving and analytical skills with the ability to troubleshoot complex data issues. Strong communication and interpersonal skills with the ability to collaborate effectively with cross-functional teams. Experience in the Employment Firms/Recruitment Services industry is preferred.
Posted 3 weeks ago
5.0 - 10.0 years
3 - 6 Lacs
Noida
Work from Office
We are looking for a skilled Reltio MDM Developer with 5 to 10 years of experience to join our team. The ideal candidate will be responsible for implementing and managing the Reltio MDM platform, ensuring data remains consistent, accurate, and up to date across different systems.

Roles and Responsibilities: Design and implement data models in Reltio, focusing on data structure and relationships to ensure a unified view of master data. Integrate Reltio MDM with other systems such as CRMs, ERPs, data warehouses, and external data sources to ensure data synchronization. Set up and enforce data governance policies to ensure data quality, accuracy, and compliance with regulations. Use tools to cleanse, enrich, and standardize data to make it consistent across different systems. Customize the Reltio platform using APIs and configurations to meet organizational needs. Develop and implement business rules in Reltio to ensure data is processed according to predefined criteria and workflows. Perform performance tuning of the MDM platform to handle large volumes of data efficiently. Monitor the system for issues and perform maintenance tasks to keep it operating continuously.

Requirements: Deep understanding of Reltio's capabilities, including its data model, APIs, and integrations. Familiarity with Java, Python, or other programming languages for customization and scripting. Ability to design and implement data models that fit the organization's needs. Knowledge of MDM concepts such as data governance, data quality, and data lifecycle management. Experience with databases and query languages such as SQL or NoSQL. Familiarity with integrating MDM solutions with various systems using RESTful APIs, SOAP, or other protocols. Experience with cloud services (e.g., AWS, Azure), as Reltio can be deployed on the cloud. Knowledge of data quality tools and techniques for data validation and enrichment.

The company offers a full-time, long-term job opportunity with room for growth and development.
Posted 3 weeks ago
7.0 - 8.0 years
6 - 10 Lacs
Noida
Work from Office
We are looking for a skilled Data Warehouse Lead with 7 to 8 years of experience to design, develop, and maintain data models optimized for reporting and analysis. The ideal candidate will have a strong background in data warehousing concepts, principles, and methodologies. This position is based remotely.

Roles and Responsibilities: Lead the design, development, and maintenance of data models optimized for reporting and analysis. Ensure data quality, integrity, and consistency throughout the data warehousing process. Troubleshoot and resolve issues related to data pipelines and data integrity. Collaborate with business analysts and stakeholders to understand their data needs and provide solutions. Communicate technical concepts effectively to non-technical audiences. Ensure the data warehouse is scalable to accommodate growing data volumes and user demands. Adhere to data governance and privacy policies and procedures. Implement and monitor data quality metrics and processes. Lead and mentor a team of data warehouse developers, providing technical guidance and support. Stay updated with the latest trends and technologies in data warehousing and business intelligence. Foster a collaborative and high-performing team environment.

Requirements: Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. 7-8 years of progressive experience in data warehousing, with at least 3 years in a lead or senior role. Deep understanding of data warehousing concepts, principles, and methodologies. Strong proficiency in SQL and experience with various database platforms (e.g., BigQuery, Redshift, Snowflake). Good understanding of affiliate marketing data (GA4 and paid marketing channels like Google Ads, Facebook Ads, etc.; the more the better). Hands-on experience with dbt and other ETL/ELT tools and technologies. Experience with data modeling techniques (e.g., dimensional modeling, star schema, snowflake schema). Experience with cloud-based data warehousing solutions (AWS, Azure, GCP); GCP is highly preferred. Excellent problem-solving, analytical, and troubleshooting skills. Strong communication, presentation, and interpersonal skills. Ability to thrive in a fast-paced and dynamic environment. Familiarity with business intelligence and reporting tools (e.g., Tableau, Power BI, Looker). Experience with data governance and data quality frameworks is a plus.
Posted 3 weeks ago
5.0 - 7.0 years
4 - 7 Lacs
Noida
Work from Office
We are looking for a skilled Informatica MDM professional with 5 to 7 years of experience. The ideal candidate will have expertise in defining data models and architectures, configuring MDM solutions, and designing and developing the BES UI.

Roles and Responsibilities: Define data models and architecture for MDM solutions. Configure MDM components (base objects, staging tables, match and merge rules, hierarchies, relationship objects). Design and develop the BES UI. Design and develop C360 applications for data stewards according to client needs. Define data migration processes from legacy systems to MDM systems during M&A activities. Support and maintain MDM applications.

Requirements: Minimum 5 years of experience in Informatica MDM. Strong knowledge of data modeling and architecture. Experience in configuring MDM solutions and designing the BES UI. Ability to define data migration processes. Strong understanding of data stewardship concepts. Excellent problem-solving skills and attention to detail.
Posted 3 weeks ago
10.0 - 20.0 years
7 - 11 Lacs
Noida
Work from Office
Company: Apptad Technologies Pvt Ltd. | Industry: Employment Firms/Recruitment Services
Job Title: SAP MDG Technical Consultant
Job Location: Bengaluru
Job Type: Full Time

Job Summary: We are seeking an experienced SAP MDG Technical Consultant with over 7 years of experience in Master Data Governance development and configuration. The ideal candidate should have strong technical expertise in SAP MDG customizations, data modeling, workflow, and integration with SAP and non-SAP systems. The role involves working closely with business and functional teams to implement robust master data solutions across domains such as Business Partner, Customer, Vendor, and Material Master.

Key Responsibilities: Lead technical design and development of SAP MDG solutions for Customer, Vendor, Material, and Finance master data. Customize and enhance MDG data models, entity types, UI modeling (FPM), and data replication frameworks. Develop BRF+ rules for validations and derivations. Design and configure workflow templates using SAP Business Workflow and the Process Framework. Integrate SAP MDG with ECC, S/4HANA, and non-SAP systems using ALE/IDoc, Web Services, or SAP PI/PO/CPI. Create and maintain data replication models and key mapping (using DRF, SOA, and MDG Consolidation/Mass Processing). Optimize performance and ensure data quality across MDG processes. Collaborate with functional consultants and data governance teams for requirement gathering, testing, and deployment. Prepare technical specifications and solution documents, and participate in code reviews and quality assurance.

Required Skills and Experience: 7+ years of technical experience in SAP MDG development and support. Hands-on expertise in MDG data modeling (custom and standard), FPM (Floorplan Manager) UI enhancements, BRF+ rules configuration, SAP MDG Workflow (including rule-based and custom workflows), and DRF (Data Replication Framework) and SOA services. Strong ABAP knowledge, including Object-Oriented ABAP, enhancements (BADIs, user exits), and Web Dynpro. Experience with ALE/IDoc, Web Services, and integration tools like SAP PI/PO or SAP CPI. Familiarity with the MDG Consolidation and Central Governance frameworks. Good understanding of S/4HANA master data architecture is a plus.

Preferred Qualifications: SAP MDG Technical Certification. Experience with SAP BTP-based MDG extensions or Fiori apps. Familiarity with data migration tools like SAP Data Services or LSMW. Domain knowledge in the Manufacturing, Retail, or Finance industries.
Posted 3 weeks ago