
2724 Data Governance Jobs - Page 39

Set up a job alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

10.0 - 15.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Skill: Retail - Omni Channel Product Manager. Experience: 10-15 years.

Key Responsibilities:
- Effectively translate business strategies into product strategies, value increments, and product specifications to deliver against our core customer value propositions and our company's strategic and financial goals
- Help manage the creation and maintenance of user stories and business requirements for new features and enhancements, leveraging multiple work streams
- Prioritize new feature launches based on competitive analysis, industry trends, emerging technologies, and company vision
- Demonstrate empathy for the customer and steer discussions toward how to build customer trust
- Influence cross-functional teams without formal authority
- Articulate clear and concise requirements for new products and features
- Analyze complex data sets and leverage that analysis to make data-driven product decisions
- Work within a matrix organization, collaborating with business stakeholders, User Experience (UX) teams, engineers, and other relevant digital, technology, and business teams
- Improve value creation by defining and aligning on KPIs to measure success
- Work under rapid development cycles with medium to large teams to achieve a common goal
- Participate in day-to-day product team activities, driving high-quality customer experiences and exploring what is possible with technology

Basic Qualifications:
- Bachelor's degree in IT, Computer Science, Engineering, Business, Marketing, or a related field, OR equivalent experience
- 3-4 years of Product Management experience, OR experience working with development, User Experience, Strategy, or related, including: 1+ year of direct/indirect people management experience (preferred); 1+ year of relevant experience in strategy creation, customer-focused solutioning, cross-functional leadership, or related
- Experience successfully leading and delivering complex data products and initiatives
- Experience with data visualization tools, data warehousing, big data technologies, and/or machine learning
- Experience collaborating with cross-functional teams in a global and diverse environment

Preferred Qualifications:
- Experience working in an omni-channel retail environment
- Experience setting ambitious, tangible, and measurable team objectives and key results
- Strong understanding of data governance and best practices
- Experience with Google Cloud Platform

Omni Channel, Product Managers, Retail

Posted 2 weeks ago

Apply

3.0 - 6.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Detailed job description - Skill Set:
- Technically strong and hands-on
- Self-driven
- Good client communication skills
- Able to work independently; a good team player
- Flexible to work in PST hours (overlap for some hours)
- Past development experience for a Cisco client is preferred

Posted 3 weeks ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Gurugram

Work from Office

As the Senior Manager - Analytics at Nutrabay, you will be the strategic owner of all analytics efforts across the organization. You'll be responsible for leading the data and analytics roadmap, building scalable data infrastructure, enabling data-driven product decisions, and managing analytics tools, team, and processes. This role demands a blend of technical expertise, product mindset, business acumen, and leadership to translate data into insights and insights into action.

You should apply if you have:
- 6-10 years of experience in data analytics/product analytics with a strong foundation in e-commerce or product-based companies
- Proven experience building analytics platforms, pipelines, dashboards, and experimentation frameworks from scratch
- Hands-on expertise in Power BI, SQL, Google Analytics (GA4), Mixpanel, Firebase, and Appsflyer
- Strong understanding of data modelling, data governance, and ETL frameworks (Airflow/DBT preferred)
- Solid understanding of product metrics, conversion funnels, A/B testing, LTV, retention, and user segmentation
- Experience working closely with engineering, product, marketing, and business leadership
- Ability to manage and mentor a team of analysts, driving both execution and strategy

You should not apply if you:
- Haven't worked with cloud data warehouses like BigQuery or Redshift
- Are unfamiliar with analytics tools like Power BI, Mixpanel, or Firebase
- Do not have experience managing cross-functional analytics projects or teams
- Are uncomfortable driving business and product decisions based on analytics insights
- Prefer execution-only roles with minimal strategic involvement

Skills Required:
- Data Visualization & BI: Power BI, Google Looker Studio
- SQL & Data Warehousing: BigQuery, AWS Redshift
- Analytics Tools: GA4, Firebase, Mixpanel, Appsflyer
- ETL & Data Modeling: Airflow, DBT, CDP, custom pipeline architecture
- Product Analytics: funnel analysis, retention, A/B testing, journey mapping
- Python (preferred, for automation and advanced analytics)
- Data Governance & Compliance
- Stakeholder Communication & Data Storytelling
- Team Leadership & Strategic Thinking

What will you do?
- Lead and scale the analytics function across the organization, including data engineering, BI, and product analytics
- Own and drive the analytics roadmap aligned with business and product OKRs
- Build a robust analytics infrastructure to support real-time decision-making
- Define and implement key product and business KPIs for all departments
- Work with PMs and engineers to implement event tracking systems across platforms
- Run A/B tests, conversion analysis, and user behaviour research to guide product strategy
- Ensure data quality, privacy, and compliance with IT and security policies
- Develop self-serve dashboards and reporting for leadership and operations teams
- Hire, mentor, and manage a high-performing analytics team

Work Experience: 6-10 years in data or product analytics, with at least 3 years in a leadership or team management role. Prior experience in a fast-paced, product-first, or e-commerce environment is a strong advantage.

Working Days: Monday to Friday (Full-Time, Work from Office)
Location: Golf Course Road, Gurugram, Haryana

Perks:
- Opportunity to build the analytics ecosystem from the ground up
- Work closely with founders and product leadership to shape company direction
- High learning and personal growth opportunities
- Flexible timings and an open, transparent culture
- Freedom to drive experimentation, decision-making, and innovation

Why Nutrabay: We believe in an open, intellectually honest culture where everyone is given the autonomy to contribute and do their life's best work. As part of the dynamic team at Nutrabay, you will have a chance to learn new things, solve new problems, build your competence, and be part of an innovative marketing-and-tech startup that's revolutionising the health industry. Working with Nutrabay can be fun and a unique growth opportunity. Here you will learn how to maximise the potential of your available resources. You will get the opportunity to do work that helps you master a variety of transferable skills, that is, skills that are relevant across roles and departments. You will feel appreciated and valued for the work you deliver. We are creating a unique company culture that embodies respect and honesty, which builds more loyal employees than simply shelling out cash. We trust our employees and their voice, and we ask for their opinions on important business issues.

About Nutrabay: Nutrabay is the largest health & nutrition store in India. Our vision is to keep growing, maintain a sustainable business model, and continue to be the market leader in this segment by launching many innovative products. We are proud to have served over 1 million customers so far, and our family is constantly growing. We have built a complex, high-converting eCommerce system, and our monthly traffic has grown to a million. We are looking to build a visionary and agile team to help fuel our growth and contribute to further advancing our continuously evolving product.

Funding: We raised $5 million in Series A funding.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Chennai

Work from Office

Who we are: Perch Energy is a leading community solar servicer on a mission to make renewable energy more accessible and equitable for all. Community solar breaks down the traditional barriers preventing most people from participating in the renewable energy economy. We work in numerous states across the US to bring community solar to communities and individuals who can most benefit from a more inclusive energy system. By managing the customer experience for solar farm owners, Perch is able to bring electricity bill savings to the masses, from renters and homeowners to businesses, institutions, municipalities, and more, by connecting them to community solar projects in their area. Perch isn't just a for-profit company; we're a for-purpose company accelerating the shift to renewables nationwide. Everyone deserves to benefit from clean energy. Everyone has a place on this Perch!

What we're looking for: As a Data Engineer, you will play a key role in designing, developing, and maintaining our data infrastructure and pipelines. You will collaborate closely with the rest of our Data and Analytics Engineering team, and with engineering and operations teams, to ensure the smooth flow and availability of high-quality data for analysis and reporting purposes. Your expertise will be essential in optimizing data workflows, ensuring data integrity, and scaling our data infrastructure to support our company's growth. This is an exceptional opportunity for someone who relishes the chance to engage with cutting-edge technology, influence the development of a world-class data ecosystem, and work in a fast-paced environment on a small, high-impact team. Our core data stack makes heavy use of Snowflake and dbt Core, orchestrated with Prefect and Argo in our broader AWS-based ecosystem. Most of our wide range of data sources are loaded with Fivetran or Segment, but we use custom Python when it's the right tool for the job.

What you'll do:
- Design, develop, and maintain scalable and efficient data pipelines in an AWS environment, centered on our Snowflake instance and using Fivetran, Prefect, Argo, and dbt
- Collaborate with business analysts, analytics engineers, and software engineers to understand data requirements and deliver reliable solutions
- Design, build, and maintain tooling that enables users and services to interact with our data platform, including CI/CD pipelines for our data lakehouse, unit/integration/validation testing frameworks for our data pipelines, and command-line tools for ad hoc data evaluation
- Identify and implement best practices for data ingestion, transformation, and storage to ensure data integrity and accuracy
- Optimize and tune data pipelines for improved performance, scalability, and reliability
- Monitor data pipelines and proactively address any issues or bottlenecks to ensure uninterrupted data flow
- Develop and maintain documentation for data pipelines, ensuring knowledge sharing and smooth onboarding of new team members
- Implement data governance and security measures to ensure compliance with industry standards and regulations
- Keep up to date with emerging technologies and trends in data engineering and recommend their adoption as appropriate

What will help you succeed

Must-haves:
- 3+ years as a Data Engineer, data-adjacent Software Engineer, or a did-everything small-data-team member, with a focus on building and maintaining data pipelines
- Strong Python skills, especially in the context of data orchestration
- Strong understanding of database management and design, including experience with Snowflake or an equivalent platform
- Proficiency in SQL
- Familiarity with data integration patterns, ETL/ELT processes, and data warehousing concepts
- Experience with Argo, Prefect, Airflow, or similar data orchestration tools
- Excellent problem-solving and analytical skills with strong attention to detail
- Ability to bring a customer-oriented and empathetic approach to understanding how data is used to drive the business
- Strong communication skills

Nice-to-haves:
- Undergraduate and/or graduate degree in math, statistics, engineering, computer science, or a related technical field
- Experience with our stack: AWS, Snowflake, Fivetran, Argo, Prefect, dbt, and GitHub Actions, along with some ancillary tools
- Experience with DevOps practices, especially CI/CD
- Previous experience managing enterprise-level data pipelines and working with large datasets
- Experience in the energy sector

Benefits:
- Competitive compensation based on market standards
- Hybrid model with a remote-first policy; the office is in the heart of the city in case you need to step in for any purpose
- Apart from fixed base salary, candidates are eligible for the following benefits
- Flexible leave policy
- Medical insurance (1+5 family members), with comprehensive coverage including an accident policy and life insurance
- Annual performance cycle
- Quarterly team engagement activities and rewards & recognitions
- L&D programs to foster professional growth
- A supportive engineering culture that values diversity, empathy, teamwork, trust, and efficiency

Eliminating carbon footprints, eliminating carbon copies. Here at Perch, we cultivate diversity, celebrate individuality, and believe unique perspectives are key to our collective success in creating a clean energy future. Perch is committed to equal employment opportunity regardless of race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, disability, genetic information, protected veteran status, or any status protected by applicable federal, state, or local law. While we are currently unable to consider candidates who will require visa sponsorship, we welcome applications from all qualified candidates eligible to work in India.

We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation. Thank you.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

17 - 19 Lacs

Bengaluru

Work from Office

Minimum of 6 years of experience in the IT industry.
- Creating data models, building data pipelines, and deploying fully operational data warehouses within Snowflake
- Writing and optimizing SQL queries, tuning database performance, and identifying and resolving performance bottlenecks
- Integrating Snowflake with other tools and platforms, including ETL/ELT processes and third-party applications
- Implementing data governance policies, maintaining data integrity, and managing access controls
- Creating and maintaining technical documentation for data solutions, including data models, architecture, and processes
- Familiarity with cloud platforms and their integration with Snowflake
- Basic coding skills in languages like Python or Java, helpful for scripting and automation
- Outstanding ability to communicate, both verbally and in writing
- Strong analytical and problem-solving skills
- Experience in the Banking domain

Posted 3 weeks ago

Apply

5.0 - 8.0 years

8 - 12 Lacs

Pune

Work from Office

Role: The purpose of this role is to provide strategic guidance and recommendations on the pricing of contracts being executed in the assigned SBU, while maintaining competitive advantage and profit margins. Responsible for ensuring SoW adherence to internal guidelines for all contracts in the SBU.

Do:
- Contract pricing review and advice
- Pricing strategy deployment: drive the deployment of pricing strategy for the SBU/Vertical/Account in line with the overall pricing strategy for Wipro; partner with and educate Business Leaders about adherence to the pricing strategy, internal guidelines, and SoW
- Business partnering for advice on contract commercials: work closely with pre-sales and BU leadership to review contracts about to be finalized and provide inputs on structuring, payment milestones, and terms & conditions; review the Resource Loading Sheet (RLS) submitted by the pre-sales/delivery team and work on the contract pricing; collaborate with business leaders to propose competitive pricing based on the effort estimate, considering the cost of resources, skills availability, and identified premium skills
- Review adherence of the contract's commercial terms and conditions: review the commercial terms and conditions proposed in the SoW; ensure they are aligned with internal guidelines for credit period and the existing MSAs, and recommend payment milestones
- Ensure accurate revenue recognition and provide forecasts: implement and drive adherence to revenue recognition guidelines; ensure revenue recognition by the BFMs/Service Line Finance Managers is done as per IFRS standards; partner with Finance Managers and educate them on revenue recognition standards and Wipro's internal guidelines; provide accurate and timely revenue forecasts for the assigned SBU/Vertical/Cluster/Accounts
- Validation of order booking: adhere to order booking guidelines; oversee and ensure all documents, approvals, and guidelines are in place before the order is confirmed in the books of accounts; highlight any deviations from internal guidelines/standards and work with the concerned teams to address them
- Team Management: clearly define expectations for the team; assign goals, conduct timely performance reviews, and provide constructive feedback to direct reports; guide team members in acquiring relevant knowledge and developing their professional competence; educate and build awareness in the team of Wipro's guidelines on revenue recognition, pricing strategy, contract terms, and MSAs; ensure Performance Nxt is followed for the entire team
- Employee satisfaction and engagement: lead and drive engagement initiatives for the team; track team satisfaction scores and identify initiatives to build engagement within the team

Performance parameters:
1. Financials: monetizing Wipro's efforts and value additions; comprehensiveness of pricing recommendations; accurate inputs into revenue forecasting as per revenue recognition guidelines
2. Internal Customer: completeness of the contracts checklist before order booking
3. Team Management: team attrition %, employee satisfaction score, localization %, gender diversity %; training and skill building of the team on pricing operations

Mandatory Skills: Data Governance. Experience: 5-8 years.

Posted 3 weeks ago

Apply

0.0 - 5.0 years

7 - 12 Lacs

Mumbai

Work from Office

Join our dynamic Integrated Data Platform Operations team and be at the forefront of data innovation. Collaborate with clients and technology partners to ensure data excellence. Elevate your career by driving data quality and governance in a strategic environment.

Job Summary: As an Associate in the Integrated Data Platform Operations team, you will work with clients and technology partners to implement data quality and governance practices. You will define data standards and ensure data meets the highest quality. You will play a crucial role in enhancing data management across the Securities Services business.

Job Responsibilities:
- Define data quality standards
- Investigate data quality issues
- Collaborate with technology partners
- Establish dashboards and metrics
- Support data view and lineage tools
- Embed data quality in UAT cycles
- Assist Operations users with data access
- Work with project teams on implementations
- Implement data ownership processes
- Deliver tools and training for data owners
- Champion improvements to data quality

Required Qualifications, Capabilities, and Skills:
- Engage effectively across teams
- Understand data components for IBOR
- Comprehend trade lifecycle and cash management
- Possess technical data management skills
- Solve operational and technical issues
- Deliver with limited supervision
- Partner in a virtual team environment

Preferred Qualifications, Capabilities, and Skills:
- Demonstrate strong communication skills
- Exhibit leadership in data governance
- Adapt to changing project requirements
- Analyze complex data sets
- Implement innovative data solutions
- Foster collaboration across departments
- Drive continuous improvement initiatives

Posted 3 weeks ago

Apply

7.0 - 8.0 years

5 - 8 Lacs

Hyderabad

Remote

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 3 weeks ago

Apply

15.0 - 20.0 years

18 - 22 Lacs

Hyderabad

Remote

Contract: 6 months. Position Summary: We are seeking an experienced SAP Data Architect with a strong background in enterprise data management, SAP S/4HANA architecture, and data integration. This role demands deep expertise in designing and governing data structures, ensuring data consistency, and leading data transformation initiatives across large-scale SAP implementations.

Key Responsibilities:
- Lead the data architecture design across multiple SAP modules and legacy systems.
- Define data governance strategies, master data management (MDM), and metadata standards.
- Architect data migration strategies for SAP S/4HANA projects, including ETL, data validation, and reconciliation.
- Develop data models (conceptual, logical, and physical) aligned with business and technical requirements.
- Collaborate with functional and technical teams to ensure integration across SAP and non-SAP platforms.
- Establish data quality frameworks and monitoring practices.
- Conduct impact assessments and ensure scalability of the data architecture.
- Support reporting and analytics requirements through structured data delivery frameworks (e.g., SAP BW, SAP HANA).

Required Qualifications:
- 15+ years of experience in enterprise data architecture, with 8+ years in SAP landscapes.
- Proven experience with SAP S/4HANA data models, SAP Datasphere, SAP HANA Cloud, and SAC, and integrating these with an AWS Data Lake (S3).
- Strong knowledge of data migration tools (SAP Data Services, LTMC/LSMW, BODS, etc.).
- Expertise in data governance, master data strategy, and data lifecycle management.
- Experience with cloud data platforms (Azure, AWS, or GCP) is a plus.
- Strong analytical and communication skills to work across business and IT stakeholders.

Preferred Certifications:
- SAP Certified Technology Associate (SAP S/4HANA / Datasphere)
- TOGAF or other Enterprise Architecture certifications
- ITIL Foundation (for process alignment)

Posted 3 weeks ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.

Posted 3 weeks ago

Apply

7.0 - 10.0 years

9 - 12 Lacs

Ahmedabad

Work from Office

Location: Remote (India). Employment Type: Contract (Remote). Experience Required: 7+ years.

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 3 weeks ago


7.0 - 10.0 years

10 - 14 Lacs

Hyderabad

Work from Office

About the Job : We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and eager to explore the cutting edge of AI, we encourage you to apply.

Key Responsibilities :
- Dataiku Leadership : Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions.
- Data Pipeline Development : Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects, including developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources.
- Data Modeling & Architecture : Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance.
- ETL/ELT Expertise : Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility.
- Gen AI Integration : Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features.
- Programming & Scripting : Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions.
- Cloud Platform Deployment : Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency.
- Data Quality & Governance : Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets. Implement data governance best practices.
- Collaboration & Mentorship : Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions. Potentially mentor junior team members.
- Performance Optimization : Continuously monitor and optimize the performance of data pipelines and data systems.

Required Skills & Experience :
- Proficiency in Dataiku : Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications.
- Expertise in Data Modeling : Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures.
- ETL/ELT Processes & Tools : Extensive experience with ETL/ELT processes and a proven track record with various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS).
- Familiarity with LLM Mesh : Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding their concepts and potential for integration.
- Programming Languages : Strong proficiency in Python for data manipulation, scripting, and developing data solutions. Solid command of SQL for complex querying, data analysis, and database interactions.
- Cloud Platforms : Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse).
- Gen AI Concepts : Basic understanding of Generative AI concepts and their potential applications in data engineering.
- Problem-Solving : Excellent analytical and problem-solving skills with a keen eye for detail.
- Communication : Strong communication and interpersonal skills to collaborate effectively with cross-functional teams.

Bonus Points (Nice to Have) :
- Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake).
- Familiarity with data governance and data security best practices.
- Experience with MLOps principles and tools.
- Contributions to open-source projects related to data engineering or AI.

Education : Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
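The ETL/ELT responsibilities above reduce to three steps: extract raw records, apply transformations and data-quality rules, and load validated rows into a target table. A minimal, hypothetical sketch (the CSV payload, table, and rules are invented; real pipelines would pull from source systems and run under an orchestrator such as Airflow or Dataiku):

```python
import csv
import io
import sqlite3

# Hypothetical raw extract; in practice this would come from an API,
# a file drop, or a source database.
raw = """order_id,amount,country
1001,250.00,IN
1002,,US
1003,99.50,in
"""

# Extract: parse the raw CSV into dicts.
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop rows failing a data-quality rule (non-null amount)
# and normalize country codes to upper case.
clean = [
    {"order_id": int(r["order_id"]),
     "amount": float(r["amount"]),
     "country": r["country"].upper()}
    for r in rows if r["amount"]
]

# Load: write validated rows to the target table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL, country TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (:order_id, :amount, :country)", clean
)
loaded = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(loaded)  # (2, 349.5)
```

The rejected row (null amount) would normally be routed to a quarantine table for root-cause analysis rather than silently dropped.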

Posted 3 weeks ago


3.0 - 8.0 years

8 - 18 Lacs

Pune

Work from Office

About Position: Persistent is scaling up its global Digital Trust practice. Digital Trust encompasses the domains of Data Privacy, Responsible AI (RAI), GRC (Governance, Risk & Compliance), and other related areas. This is a rapidly evolving domain globally, at the intersection of technology, law, ethics, and compliance. Team members of this practice get an opportunity to work on innovative and cutting-edge solutions. As a Digital Trust Consultant, you will independently manage client projects in Digital Trust, including assessing, designing, implementing, managing, testing, monitoring, and auditing Digital Trust programs across multiple industries.

Role: Digital Trust Consultant
Location: All PSL Locations
Experience: 3-8 Years
Job Type: Full Time Employment

What You'll Do:

Project Ownership and Delivery
- Lead the end-to-end execution of one or more projects in the Digital Trust domain.
- Assist in conducting Impact/Gap Assessments vis-a-vis global and local Digital Trust regulations or standards (e.g., GDPR, DPDPA, CCPA, EU AI Act, ISO 42001, NIST standards) or client policies.
- Identify, mitigate, and monitor privacy, AI, and compliance-related risks.
- Work on policies and procedures (e.g., privacy notice, cookie policy, model documentation, data provenance, AI usage guidelines, data retention policy), impact assessments (e.g., Data Protection Impact Assessments (DPIA), AI Impact Assessments (AIIA)), Privacy by Design (PbD), bias/fairness evaluations of AI models, and other aspects integral to digital trust programs.
- Map data flows, prepare Records of Processing Activities (RoPA), and support consent and preference management activities.
- Participate in and support delivery of vendor risk assessments, third-party due diligence, and automated assessment processes.
- Understand client technology and cloud platforms, database systems, AI systems, and other applications, and translate policy/process-level controls and/or compliance requirements into specific, ground-level controls.
- Utilize internal and client-specific tools such as:
  - Privacy Management Platforms (OneTrust, TrustArc, Securiti.ai, etc.)
  - Responsible AI tools (e.g., model monitoring platforms, fairness checkers, data lineage tools)
  - GRC tools (e.g., Archer, LogicGate)
- Review and support customizing and improving internal assessment templates, frameworks, and reporting dashboards.
- Engage with internal and client stakeholders across legal, IT, product engineering, and applicable business and functional teams.
- Capture stakeholder requirements and translate them into technical and/or policy/process controls and requirements.
- Lead the planning and delivery of client workshops, trainings, and review meetings.
- Help track stakeholder action items and maintain transparent communication with project leads.

Team Contribution & Mentoring
- Mentor and review work done by Junior Consultants in the project team.
- Support knowledge transfer, share updates from your research, and contribute to internal wikis/templates.

Practice Development
- Contribute to the development of accelerators and artefacts.
- Participate in proposal drafting, PoCs, and responding to client RFPs.

Expertise You'll Bring:
- Bachelor's Degree in Engineering, Computer Science, Information Technology, Law, Business, or related fields. Candidates with a Postgraduate Degree / Diploma in Data Privacy, Cybersecurity, Law, or AI Ethics are preferred.
- IAPP certifications: CIPP/E, CIPP/US, CIPM, CIPT, AIGP
- ISACA / (ISC)² certifications: CDPSE, CISA; ISO/IEC 27001 or ISO/IEC 42001 implementation
- Strong attention to detail, logical structuring, and analytical thinking
- Ability to grasp new concepts quickly and keep up with developments in the field proactively, without expecting to be taught
- Ability to understand technical aspects in depth
- Good verbal and written communication skills, especially for report writing and presentations
- Demonstrated interest in privacy, ethics, compliance, and Responsible AI
- Ability to handle multiple projects and prioritize tasks under supervision
- High level of professionalism, discretion, and integrity

Benefits :
- Competitive salary and benefits package
- Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive. Our company fosters a values-driven and people-centric work environment that enables our employees to:
- Accelerate growth, both professionally and personally
- Impact the world in powerful, positive ways, using the latest technologies
- Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
- Unlock global opportunities to work and learn with the industry's best

Let's unleash your full potential at Persistent. "Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."
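The RoPA preparation mentioned in this posting is essentially structured record-keeping. A minimal sketch of one machine-readable RoPA entry, loosely following the fields GDPR Article 30 asks controllers to maintain; all values, field choices, and the DPIA-flag helper are invented for illustration:

```python
from dataclasses import dataclass, field

@dataclass
class RopaEntry:
    """One Record of Processing Activities entry (simplified)."""
    processing_activity: str
    purpose: str
    data_subject_categories: list = field(default_factory=list)
    personal_data_categories: list = field(default_factory=list)
    recipients: list = field(default_factory=list)
    retention_period: str = ""
    security_measures: list = field(default_factory=list)

    def involves_special_category_data(self, special=("health", "biometric")):
        # Toy rule a consultant might use to flag entries needing a DPIA
        # because they touch special-category data.
        return any(c in special for c in self.personal_data_categories)

payroll = RopaEntry(
    processing_activity="Payroll processing",
    purpose="Salary payment and statutory reporting",
    data_subject_categories=["employees"],
    personal_data_categories=["contact details", "bank details", "health"],
    recipients=["payroll provider", "tax authority"],
    retention_period="7 years after employment ends",
    security_measures=["encryption at rest", "role-based access"],
)
print(payroll.involves_special_category_data())  # True
```

In practice these records live in a privacy management platform (OneTrust, TrustArc, etc.) rather than code, but keeping them structured like this makes flows queryable for assessments.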

Posted 3 weeks ago


5.0 - 9.0 years

27 - 32 Lacs

Bengaluru

Hybrid

Role: Data Governance (Collibra) Specialist
Location: Bangalore (Hybrid model)
Shift: 02:30 - 10:30 AM

Skills Required:
- 5+ years of experience
- Microsoft Certified: Power BI Data Analyst Associate
- 3 years Collibra experience

IMMEDIATE JOINERS REQUIRED. Send your updated CV directly to: 9152808909

Position Summary: We are seeking to hire a Data Governance Specialist with Collibra expertise. This role involves implementing our data governance methodology, including identifying and prioritizing data elements in GBS systems, documenting business definitions, technical metadata, and lineage, and defining and implementing data governance policies, standards, and processes. The Data Governance Specialist will load metadata into Collibra and use the tool to maintain the data catalog and perform data governance. They will serve as a mentor to team members and other stakeholders on using Collibra to put data governance best practices into practice. This role will also support profiling data to identify data quality issues and performing root cause analysis.

ESSENTIAL RESPONSIBILITIES:
- Analyze large datasets using Collibra, SQL, and other technologies to identify, describe, and profile data.
- Load business and technical metadata, definitions, and lineage into Collibra and manage the data catalog.
- Guide data owners, stewards, and custodians in the use of Collibra and the application of data governance practices.
- Support data quality efforts by profiling source data, defining data quality rules, and performing root cause analysis.
- Compliance & Security: Ensure compliance with data privacy regulations and maintain data security protocols. Implement best practices for data handling, storage, and sharing to protect sensitive information.
- Collaboration with Cross-Functional Teams: Work closely with business analysts, data engineers, and other stakeholders to understand requirements and deliver solutions.

EDUCATION AND EXPERIENCE:
- Minimum Required Degree: Bachelor's degree
- Preferred Degree: Degree in Computers/Maths/Stats
- Microsoft Certified: Power BI Data Analyst Associate

KNOWLEDGE, SKILLS AND ABILITY:
- At least 3 years of Collibra experience
- Proficiency in SQL and data analysis and profiling tools
- Strong understanding of data analysis and data quality principles
- Familiarity with data governance, data privacy, and security best practices
- Experience with foundational source systems and metadata management
- Ability to document technical attributes, business descriptions, and data lineage
- Agile delivery, including user story grooming, sprint planning, development, and user acceptance
- Excellent communication and collaboration skills to engage with the team and stakeholders
- Ability to stay updated with the latest industry trends and technologies
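The profiling work described here (identifying data quality issues before onboarding a source into the catalog) usually starts with per-column completeness and uniqueness stats. A minimal, hypothetical sketch; the table, column names, and data are invented, with SQLite standing in for the real source system:

```python
import sqlite3

# Toy source table standing in for a system to be profiled before
# onboarding its metadata into a catalog such as Collibra.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE vendors (vendor_id INTEGER, name TEXT, country TEXT);
INSERT INTO vendors VALUES
  (1, 'Acme', 'IN'),
  (2, NULL,  'IN'),
  (3, 'Acme', NULL),
  (4, 'Beta', 'US');
""")

def profile_column(conn, table, column):
    """Row count, null count, and distinct count for one column:
    the basic stats behind completeness and uniqueness rules."""
    total, nulls, distinct = conn.execute(
        f"SELECT COUNT(*), "
        f"       COUNT(*) - COUNT({column}), "
        f"       COUNT(DISTINCT {column}) "
        f"FROM {table}"
    ).fetchone()
    return {"total": total, "nulls": nulls, "distinct": distinct}

print(profile_column(conn, "vendors", "name"))
# {'total': 4, 'nulls': 1, 'distinct': 2}
```

A data quality rule then becomes a threshold over these stats, for example "nulls / total must be below 1% for a critical data element", with breaches feeding root-cause analysis.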

Posted 3 weeks ago


1.0 - 3.0 years

3 - 6 Lacs

Pune

Work from Office

About Position: Persistent is scaling up its global Digital Trust practice. Digital Trust encompasses the domains of Data Privacy, Responsible AI, GRC (Governance, Risk & Compliance), and other related areas. This is a rapidly evolving domain globally, at the intersection of technology, law, ethics, and compliance. Team members of this practice get an opportunity to work on innovative and cutting-edge solutions. As a Digital Trust Junior Consultant, you will support client consulting engagements in Digital Trust. You will work alongside senior team members to assess, design, implement, manage, test, monitor, and audit Digital Trust programs across multiple industries.

Role: Digital Trust Junior Consultant
Location: All PSL Locations
Experience: 1-3 years
Job Type: Full Time Employment

What You'll Do:

Client Delivery & Advisory Support
- Assist in conducting gap assessments vis-a-vis global and local Digital Trust regulations (e.g., GDPR, DPDPA, CCPA, EU AI Act, ISO 42001, NIST standards).
- Work on policies and procedures (e.g., privacy notice, cookie policy, AI usage guidelines, data retention policy), impact assessments (e.g., Data Protection Impact Assessments (DPIA), AI Impact Assessments (AIIA)), Privacy by Design (PbD), bias/fairness evaluations of AI models, and other aspects integral to digital trust programs.
- Map data flows, prepare Records of Processing Activities (RoPA), and support consent and preference management activities.
- Participate in and support delivery of vendor risk assessments, third-party due diligence, and automated assessment processes.
- Understand client technology and cloud platforms, database systems, AI systems, and other applications, and translate policy/process-level controls and/or compliance requirements into specific, ground-level controls.
- Utilize internal and client-specific tools such as:
  - Privacy Management Platforms (OneTrust, TrustArc, Securiti.ai, etc.)
  - Responsible AI tools (e.g., model monitoring platforms, fairness checkers, data lineage tools)
  - GRC tools (e.g., Archer, LogicGate)
- Support customizing and improving internal assessment templates, frameworks, and reporting dashboards.
- Engage with internal and client stakeholders across legal, IT, product engineering, and applicable business and functional teams.
- Capture stakeholder requirements and translate them into technical and/or policy/process controls and requirements.
- Support the planning and delivery of client workshops, trainings, and review meetings.
- Help track stakeholder action items and maintain transparent communication with project leads.

Expertise You'll Bring:
- Bachelor's Degree in Engineering, Computer Science, Information Technology, Law, Business, or related fields. Candidates with a Postgraduate Degree / Diploma in Data Privacy, Cybersecurity, Law, or AI Ethics are preferred.
- IAPP certifications: CIPP/E, CIPP/US, CIPM, CIPT, AIGP
- ISACA / (ISC)² certifications: CDPSE, CISA, CISSP; ISO/IEC 27001 or ISO/IEC 42001 implementation
- Strong attention to detail, logical structuring, and analytical thinking
- Ability to grasp new concepts quickly and keep up with developments in the field proactively, without expecting to be taught
- Ability to understand technical aspects in depth
- Good verbal and written communication skills, especially for report writing and presentations
- Demonstrated interest in privacy, ethics, compliance, and Responsible AI
- Ability to handle multiple projects and prioritize tasks under supervision
- High level of professionalism, discretion, and integrity

Benefits:
- Competitive salary and benefits package
- Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment: Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive. Our company fosters a values-driven and people-centric work environment that enables our employees to:
- Accelerate growth, both professionally and personally
- Impact the world in powerful, positive ways, using the latest technologies
- Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
- Unlock global opportunities to work and learn with the industry's best

Let's unleash your full potential at Persistent. "Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."

Posted 3 weeks ago


7.0 - 10.0 years

9 - 12 Lacs

Pune

Remote

Key Responsibilities :
- Own and manage all Master Data Management (MDM) activities for SAP projects, ensuring alignment with business objectives.
- Develop and implement comprehensive MDM strategies and roadmaps.
- Lead the design and implementation of data governance frameworks and processes.
- Lead data migration and cutover activities in SAP S/4HANA projects, including Greenfield implementations, system migrations, and rollouts.
- Develop and execute data migration plans, ensuring data accuracy and consistency.
- Manage data cleansing, transformation, and validation processes.
- Establish and implement MDM best practices and data management capabilities.
- Define and enforce data management principles, policies, and lifecycle strategies.
- Ensure data compliance with regulatory requirements and internal policies.
- Work closely with MDM Leads and stakeholders to drive data governance initiatives.
- Develop and implement data quality metrics and reporting mechanisms.
- Monitor data quality and identify areas for improvement.
- Implement data quality controls and validation rules.
- Track and manage MDM objects, ensuring timely delivery and adherence to project timelines.
- Participate in daily stand-ups, issue tracking, and dashboard updates.
- Collaborate with cross-functional teams, including functional consultants, developers, and business stakeholders.
- Identify risks and process improvements for MDM.
- Conduct training sessions for teams on S/4HANA MDM best practices and processes.
- Develop and maintain training materials and documentation.

Required Skills & Qualifications :
- 7-10 years of experience in SAP Master Data Management (MDM).
- Strong knowledge of SAP S/4HANA, data migration, and rollouts.
- Expertise in data governance, lifecycle management, and compliance.
- Experience in defining data management principles, policies, and lifecycle strategies.
- Ability to monitor data quality with consistent metrics and reporting.
- Familiarity with Kanban boards, ticketing tools, and dashboards.
- Strong problem-solving and communication skills.
- Ability to track and manage MDM objects, ensuring timely delivery.

Preferred Skills :
- Experience in training teams on MDM best practices.

Automation & Productivity Tools :
- Knowledge of automation and productivity improvement tools.
- Familiarity with ABAP and SQL.
- Experience with SAP Data Services or other data migration tools.
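The data cleansing and validation work described here often boils down to normalizing attribute values and collapsing duplicate master records onto a single golden record before migration. A minimal, hypothetical sketch of that survivorship step; the record layout, match rule, and values are invented, and real S/4HANA migrations would use dedicated tooling:

```python
import re

# Toy vendor master records extracted from a legacy system.
raw_vendors = [
    {"id": "V001", "name": "ACME Corp.", "city": "Pune"},
    {"id": "V002", "name": "acme corp",  "city": "Pune"},
    {"id": "V003", "name": "Beta GmbH",  "city": "Mumbai"},
]

def match_key(rec):
    # Normalization rule: lower-case the name, strip punctuation and
    # whitespace, and qualify by city to form a duplicate-match key.
    name = re.sub(r"[^a-z0-9]", "", rec["name"].lower())
    return name + "|" + rec["city"].lower()

# Survivorship: the first record seen for each match key becomes the
# golden record; later records are treated as duplicates.
golden = {}
for rec in raw_vendors:
    golden.setdefault(match_key(rec), rec)

surviving_ids = sorted(r["id"] for r in golden.values())
print(surviving_ids)  # ['V001', 'V003']
```

Production rules are richer (fuzzy matching, configurable survivorship by field), but the shape is the same: normalize, key, and collapse.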

Posted 3 weeks ago


7.0 - 10.0 years

9 - 12 Lacs

Visakhapatnam

Remote

Key Responsibilities :
- Own and manage all Master Data Management (MDM) activities for SAP projects, ensuring alignment with business objectives.
- Develop and implement comprehensive MDM strategies and roadmaps.
- Lead the design and implementation of data governance frameworks and processes.
- Lead data migration and cutover activities in SAP S/4HANA projects, including Greenfield implementations, system migrations, and rollouts.
- Develop and execute data migration plans, ensuring data accuracy and consistency.
- Manage data cleansing, transformation, and validation processes.
- Establish and implement MDM best practices and data management capabilities.
- Define and enforce data management principles, policies, and lifecycle strategies.
- Ensure data compliance with regulatory requirements and internal policies.
- Work closely with MDM Leads and stakeholders to drive data governance initiatives.
- Develop and implement data quality metrics and reporting mechanisms.
- Monitor data quality and identify areas for improvement.
- Implement data quality controls and validation rules.
- Track and manage MDM objects, ensuring timely delivery and adherence to project timelines.
- Participate in daily stand-ups, issue tracking, and dashboard updates.
- Collaborate with cross-functional teams, including functional consultants, developers, and business stakeholders.
- Identify risks and process improvements for MDM.
- Conduct training sessions for teams on S/4HANA MDM best practices and processes.
- Develop and maintain training materials and documentation.

Required Skills & Qualifications :
- 7-10 years of experience in SAP Master Data Management (MDM).
- Strong knowledge of SAP S/4HANA, data migration, and rollouts.
- Expertise in data governance, lifecycle management, and compliance.
- Experience in defining data management principles, policies, and lifecycle strategies.
- Ability to monitor data quality with consistent metrics and reporting.
- Familiarity with Kanban boards, ticketing tools, and dashboards.
- Strong problem-solving and communication skills.
- Ability to track and manage MDM objects, ensuring timely delivery.

Preferred Skills :
- Experience in training teams on MDM best practices.

Automation & Productivity Tools :
- Knowledge of automation and productivity improvement tools.
- Familiarity with ABAP and SQL.
- Experience with SAP Data Services or other data migration tools.

Posted 3 weeks ago


7.0 - 10.0 years

9 - 12 Lacs

Mumbai

Remote

Key Responsibilities :
- Own and manage all Master Data Management (MDM) activities for SAP projects, ensuring alignment with business objectives.
- Develop and implement comprehensive MDM strategies and roadmaps.
- Lead the design and implementation of data governance frameworks and processes.
- Lead data migration and cutover activities in SAP S/4HANA projects, including Greenfield implementations, system migrations, and rollouts.
- Develop and execute data migration plans, ensuring data accuracy and consistency.
- Manage data cleansing, transformation, and validation processes.
- Establish and implement MDM best practices and data management capabilities.
- Define and enforce data management principles, policies, and lifecycle strategies.
- Ensure data compliance with regulatory requirements and internal policies.
- Work closely with MDM Leads and stakeholders to drive data governance initiatives.
- Develop and implement data quality metrics and reporting mechanisms.
- Monitor data quality and identify areas for improvement.
- Implement data quality controls and validation rules.
- Track and manage MDM objects, ensuring timely delivery and adherence to project timelines.
- Participate in daily stand-ups, issue tracking, and dashboard updates.
- Collaborate with cross-functional teams, including functional consultants, developers, and business stakeholders.
- Identify risks and process improvements for MDM.
- Conduct training sessions for teams on S/4HANA MDM best practices and processes.
- Develop and maintain training materials and documentation.

Required Skills & Qualifications :
- 7-10 years of experience in SAP Master Data Management (MDM).
- Strong knowledge of SAP S/4HANA, data migration, and rollouts.
- Expertise in data governance, lifecycle management, and compliance.
- Experience in defining data management principles, policies, and lifecycle strategies.
- Ability to monitor data quality with consistent metrics and reporting.
- Familiarity with Kanban boards, ticketing tools, and dashboards.
- Strong problem-solving and communication skills.
- Ability to track and manage MDM objects, ensuring timely delivery.

Preferred Skills :
- Experience in training teams on MDM best practices.

Automation & Productivity Tools :
- Knowledge of automation and productivity improvement tools.
- Familiarity with ABAP and SQL.
- Experience with SAP Data Services or other data migration tools.

Posted 3 weeks ago


7.0 - 10.0 years

9 - 12 Lacs

Chennai

Remote

Key Responsibilities :
- Own and manage all Master Data Management (MDM) activities for SAP projects, ensuring alignment with business objectives.
- Develop and implement comprehensive MDM strategies and roadmaps.
- Lead the design and implementation of data governance frameworks and processes.
- Lead data migration and cutover activities in SAP S/4HANA projects, including Greenfield implementations, system migrations, and rollouts.
- Develop and execute data migration plans, ensuring data accuracy and consistency.
- Manage data cleansing, transformation, and validation processes.
- Establish and implement MDM best practices and data management capabilities.
- Define and enforce data management principles, policies, and lifecycle strategies.
- Ensure data compliance with regulatory requirements and internal policies.
- Work closely with MDM Leads and stakeholders to drive data governance initiatives.
- Develop and implement data quality metrics and reporting mechanisms.
- Monitor data quality and identify areas for improvement.
- Implement data quality controls and validation rules.
- Track and manage MDM objects, ensuring timely delivery and adherence to project timelines.
- Participate in daily stand-ups, issue tracking, and dashboard updates.
- Collaborate with cross-functional teams, including functional consultants, developers, and business stakeholders.
- Identify risks and process improvements for MDM.
- Conduct training sessions for teams on S/4HANA MDM best practices and processes.
- Develop and maintain training materials and documentation.

Required Skills & Qualifications :
- 7-10 years of experience in SAP Master Data Management (MDM).
- Strong knowledge of SAP S/4HANA, data migration, and rollouts.
- Expertise in data governance, lifecycle management, and compliance.
- Experience in defining data management principles, policies, and lifecycle strategies.
- Ability to monitor data quality with consistent metrics and reporting.
- Familiarity with Kanban boards, ticketing tools, and dashboards.
- Strong problem-solving and communication skills.
- Ability to track and manage MDM objects, ensuring timely delivery.

Preferred Skills :
- Experience in training teams on MDM best practices.

Automation & Productivity Tools :
- Knowledge of automation and productivity improvement tools.
- Familiarity with ABAP and SQL.
- Experience with SAP Data Services or other data migration tools.

Posted 3 weeks ago


7.0 - 10.0 years

9 - 12 Lacs

Lucknow

Remote

Key Responsibilities :
- Own and manage all Master Data Management (MDM) activities for SAP projects, ensuring alignment with business objectives.
- Develop and implement comprehensive MDM strategies and roadmaps.
- Lead the design and implementation of data governance frameworks and processes.
- Lead data migration and cutover activities in SAP S/4HANA projects, including Greenfield implementations, system migrations, and rollouts.
- Develop and execute data migration plans, ensuring data accuracy and consistency.
- Manage data cleansing, transformation, and validation processes.
- Establish and implement MDM best practices and data management capabilities.
- Define and enforce data management principles, policies, and lifecycle strategies.
- Ensure data compliance with regulatory requirements and internal policies.
- Work closely with MDM Leads and stakeholders to drive data governance initiatives.
- Develop and implement data quality metrics and reporting mechanisms.
- Monitor data quality and identify areas for improvement.
- Implement data quality controls and validation rules.
- Track and manage MDM objects, ensuring timely delivery and adherence to project timelines.
- Participate in daily stand-ups, issue tracking, and dashboard updates.
- Collaborate with cross-functional teams, including functional consultants, developers, and business stakeholders.
- Identify risks and process improvements for MDM.
- Conduct training sessions for teams on S/4HANA MDM best practices and processes.
- Develop and maintain training materials and documentation.

Required Skills & Qualifications :
- 7-10 years of experience in SAP Master Data Management (MDM).
- Strong knowledge of SAP S/4HANA, data migration, and rollouts.
- Expertise in data governance, lifecycle management, and compliance.
- Experience in defining data management principles, policies, and lifecycle strategies.
- Ability to monitor data quality with consistent metrics and reporting.
- Familiarity with Kanban boards, ticketing tools, and dashboards.
- Strong problem-solving and communication skills.
- Ability to track and manage MDM objects, ensuring timely delivery.

Preferred Skills :
- Experience in training teams on MDM best practices.

Automation & Productivity Tools :
- Knowledge of automation and productivity improvement tools.
- Familiarity with ABAP and SQL.
- Experience with SAP Data Services or other data migration tools.

Posted 3 weeks ago

Apply

2.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Data Quality and Governance Analyst 2 JLL/Technologies

About JLL Technologies
JLL Technologies is a specialized group within JLL. We deliver unparalleled digital advisory, implementation, and services solutions to organizations globally. We provide best-in-class technologies to bring digital ambitions to life, aligning technology, people and processes. Our goal is to leverage technology to increase the value and liquidity of the world's buildings, while enhancing the productivity and the happiness of those who occupy them.

What this job involves: About the role
This role will maintain and develop the data governance and management function for the account, supporting the regional context in close cooperation with the global structures. It will be responsible for building a solid, long-term data governance program aimed at reducing organizational risks arising from poor-quality real estate data. The position will report to the APAC Regional Data Governance Lead and will be functionally embedded in the Account structures.
Specifically, this position will involve:
- Following the guidance of the broader Data Governance team, developing and maintaining the Client Account-level data governance policies, processes, and data standards
- Establishing and developing the Client Account data governance and management framework across service lines to ensure the consistency and integrity of the data
- Data profiling, followed by defining and maintaining compliance, usage, and quality metrics
- Oversight of Master Data Management for critical data elements: the key repository, involving gatekeeping of standards and promoting the procedures
- Daily business communication and relationship building with multiple stakeholders, both internal and external, to solve ongoing data issues and democratize knowledge of data, including development of a mature Data Stewardship program
- Creation and maintenance of the Data Dictionaries
- Facilitating meetings with data consumers, data stewards, data producers and other stakeholders to ensure data governance rules and standards are applied to data-related changes and projects
- Partnering with the IT, BI and service line counterparts to understand the business context for data management activities
- Identifying areas for improvement and recommending solutions regarding data quality and processes
- Supporting the Account in knowledge management by establishing and documenting data flows and processes

Sound like you? Before you apply, it's worth knowing what we are looking for:

Required Qualifications, Skills & Experience
- 2+ years of data governance / data management experience in an enterprise environment across multiple application systems and business functions
- Strong experience in client relationship management
- Experience in a business analyst or similar role
- Familiarity with SQL and database management processes
- High level of attention to detail and accuracy, and the ability to make effective decisions and solve problems
- Knowledge of data lifecycle and maintenance processes
- Good understanding of data quality management (processes and concepts)
- Knowledge of data governance and how it impacts business processes
- Advanced skills in MS Excel, Word, and PowerPoint required

Skills
- Relationship management skills: excellent listening and consultative capability, the ability to influence and negotiate with business and technology partners to drive change, and the ability to take a broad perspective and make key connections
- Strong ownership: responsibility paired with engagement
- Demonstrated information management, analytics and data profiling skills
- Control, compliance, audit and risk management experience, including risk identification, assessment, monitoring, and remediation, will be helpful
- Ability to establish and maintain a high level of customer trust and confidence in the overall information and analytics space
- Excellent oral, written, and presentation communication skills
- Strong negotiation and group facilitation skills; the ability to move a process forward while meeting the needs of a variety of clients
- Ability to work with peers at various levels, including analysts, developers and executives, on complex business and data-related issues
- Consulting and organization skills; demonstrated ability to facilitate mission-critical projects
- Ability to manage task timelines and deliverable schedules, and to recognize and share risks and roadblocks
- Collaborative, imaginative, resourceful, reliable, technically savvy
- Superior ability to manage, manipulate and analyze raw data, draw conclusions, and develop actionable recommendations using technology, and to articulate issues and resolutions via business-friendly communications
- Willingness to serve as the primary day-to-day contact for regional data management issues
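The data profiling and quality-metric duties above can be sketched as computing completeness and validity metrics per field. A minimal Python sketch, assuming a hypothetical set of lease records (the field names and validity rule are illustrative, not JLL's actual data model):

```python
# Minimal sketch of data profiling: completeness and validity metrics
# for one field across a set of records. All names are hypothetical.

def profile(records: list[dict], field: str, is_valid) -> dict:
    """Completeness = share of records with a non-empty value;
    validity = share of non-empty values that pass the is_valid check."""
    values = [r.get(field) for r in records]
    present = [v for v in values if v not in (None, "")]
    valid = [v for v in present if is_valid(v)]
    return {
        "completeness": len(present) / len(values),
        "validity": len(valid) / len(present) if present else 0.0,
    }

leases = [
    {"lease_id": "L-1", "area_sqm": 120.0},
    {"lease_id": "L-2", "area_sqm": -5.0},   # present but invalid
    {"lease_id": "L-3", "area_sqm": None},   # incomplete
    {"lease_id": "L-4", "area_sqm": 85.5},
]
metrics = profile(leases, "area_sqm", lambda v: v > 0)
```

Metrics like these, tracked per critical data element over time, are what typically feed the compliance and quality reporting the role describes.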

Posted 3 weeks ago

Apply

7.0 - 12.0 years

25 - 35 Lacs

Pune, Chennai

Hybrid

Roles and Responsibilities
- Design and develop data models using Erwin/data modeling tools to meet business requirements.
- Collaborate with stakeholders to gather requirements and ensure data models align with business needs.
- Develop and maintain data governance policies, procedures, and standards for the organization.
- Provide guidance on best practices for data management and analytics initiatives.
- Participate in the planning, execution, and delivery of projects related to data modeling.

Desired Candidate Profile
- 7-12 years of experience in data modeling or a related field (e.g., database design).
- Strong understanding of data modeling tools (e.g., Erwin, Snowflake) and the ability to apply them effectively.
- Experience working with large datasets and developing complex data models for various industries (BFSI preferred).
- Excellent communication skills and the ability to work collaboratively across multiple teams.
- Proven track record of delivering high-quality results under tight deadlines.

Posted 3 weeks ago

Apply

13.0 - 15.0 years

22 - 27 Lacs

Noida, Pune, Chennai

Work from Office

The Technical Architect and Lead should have expertise in Azure Data Services, Databricks, ETL/ELT processes, data quality frameworks, and data governance, and be skilled in data visualization using Qlik and Power BI.

Posted 3 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Bengaluru

Work from Office

Your Impact:
- Good logical and analytical skills.
- Clear verbal and written communication, with the ability to present and explain content to the team, users and stakeholders.
- Experience in the pre-sales and sales domain would be an advantage.

What the role offers:
- Excellent knowledge of the Enterprise Content Management and Governance domain (data discovery, analysis, classification and management).
- Experience in sensitive information management or PII discovery on unstructured data (GDPR).
- Strong experience with TRIM/HPRM/HP Records Manager/OpenText Content Manager deployment, customization and upgrades.
- Experience in data governance, metadata management and data quality frameworks using TRIM/HPRM/Records Manager/OpenText Content Manager.
- Strong experience leading the end-to-end design, architecture, and implementation of ECM (Enterprise Content Management) solutions.
- Strong experience defining and implementing master data governance processes, including data quality, metadata management, and business ownership.
- Experience managing extremely large record sets and hundreds of users across global sites.
- Experience with ECM cloud solutions and migration.
- Experience with TRIM/Records Manager/Content Manager integration with third-party applications like SharePoint, Iron Mountain, O365, etc.
- Strong knowledge of Content Manager security, audit configurations and workflow.
- Hands-on experience with SQL database queries.
- Microsoft C#.NET development: web services, web apps, and Windows Scheduler jobs; use of the Content Manager/Records Manager .NET SDK within those components; custom add-in development.
- Troubleshooting problems as necessary.

What you need to succeed:
- Experience with capture, imaging and scanning products and technologies.
- Hands-on experience with ECM products like OpenText xECM and Content Server.
- Cloud certification.
- Knowledge of operating systems (Windows/Unix) and basic networking.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

12 - 22 Lacs

Chennai

Hybrid

About Company: Papigen is a fast-growing global technology services company, delivering innovative digital solutions through deep industry experience and cutting-edge expertise. We specialize in technology transformation, enterprise modernization, and dynamic areas like Cloud, Big Data, Java, React, DevOps, and more. Our client-centric approach combines consulting, engineering, and data science to help businesses evolve and scale efficiently.

About the Role: We are seeking a Senior Data Engineer to join our growing cloud data team. In this role, you will design and implement scalable data pipelines and ETL processes using Azure Databricks, Azure Data Factory, PySpark, and Spark SQL. You'll work with cross-functional teams to develop high-quality, secure, and efficient data solutions in a data lakehouse architecture on Azure.

Key Responsibilities:
- Design, develop, and optimize scalable data pipelines using Databricks, ADF, PySpark, Spark SQL, and Python
- Build robust ETL workflows to transform and load data into a lakehouse architecture on Azure
- Ensure data quality, security, and compliance with data governance and privacy standards
- Collaborate with stakeholders to gather business requirements and deliver technical data solutions
- Create and maintain technical documentation for workflows, architecture, and data models
- Work within an Agile environment and track tasks using tools like Azure DevOps

Required Skills & Experience:
- 8+ years of experience in data engineering and enterprise data platform development
- Proven expertise in Azure Databricks, Azure Data Factory, PySpark, and Spark SQL
- Strong understanding of data warehouses, data marts, and operational data stores
- Proficiency in writing complex SQL/PL-SQL queries and understanding data models and data lineage
- Knowledge of data management best practices: data quality, lineage, metadata, reference/master data
- Experience working in Agile teams with tools like Azure DevOps
- Strong problem-solving skills, attention to detail, and the ability to multi-task effectively
- Excellent communication skills for interacting with both technical and business teams

Benefits and Perks:
- Opportunity to work with leading global clients
- Exposure to modern technology stacks and tools
- Supportive and collaborative team environment
- Continuous learning and career development opportunities
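The extract-transform-load flow with a data-quality gate that this role describes can be sketched in plain Python. The actual role calls for PySpark/Databricks and ADF; this stand-in only illustrates the pipeline stages, and every name (`order_id`, the source rows, the quarantine split) is hypothetical.

```python
# Sketch of an ETL pipeline with a data-quality gate, in plain Python.
# In a real Databricks job the extract/load steps would be Spark reads and
# Delta table writes; here they are stubbed so the flow is self-contained.

def extract() -> list[dict]:
    # Stand-in for reading raw rows from a source system or data lake.
    return [
        {"order_id": "O-1", "amount": "100.50", "country": "in"},
        {"order_id": "O-2", "amount": "bad", "country": "US"},
    ]

def transform(rows: list[dict]) -> tuple[list[dict], list[dict]]:
    """Cast types and normalize values; route bad rows to quarantine
    instead of failing the whole batch."""
    clean, quarantined = [], []
    for row in rows:
        try:
            clean.append({
                "order_id": row["order_id"],
                "amount": float(row["amount"]),
                "country": row["country"].upper(),
            })
        except (KeyError, ValueError):
            quarantined.append(row)
    return clean, quarantined

def load(rows: list[dict]) -> int:
    # Stand-in for writing a curated table; returns the row count written.
    return len(rows)

clean, quarantined = transform(extract())
loaded = load(clean)
```

Quarantining rather than dropping bad rows is one common way to satisfy the data quality and governance requirements listed above, since quarantined rows remain auditable.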

Posted 3 weeks ago

Apply