2193 Data Governance Jobs - Page 41

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must have skills: Workday Advanced Reporting
Good to have skills: NA
Minimum 5 years of experience is required
Educational Qualification: Looking for a Workday Time Tracking resource with experience and certifications in Workday Absence

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with stakeholders to understand their needs and translating them into functional design solutions.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead design discussions and provide innovative solutions
- Conduct regular code reviews and ensure best practices are followed

Professional & Technical Skills:
- Must-have skills: Proficiency in Workday Advanced Reporting; experience and certifications in Workday Absence and Workday Time Tracking
- Strong understanding of Workday reporting functionalities
- Experience in designing and implementing complex Workday reports
- Knowledge of Workday security and data governance
- Ability to troubleshoot and resolve issues related to Workday reporting

Additional Information:
- The candidate should have a minimum of 5 years of experience in Workday Advanced Reporting
- This position is based at our Hyderabad office
- A Workday Time Tracking resource with experience and certifications in Workday Absence is required

Posted 3 weeks ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Pune

Work from Office

Job Summary:
As an Associate Manager in Spend Analytics and Project Management, you will be responsible for leading the design, development, and implementation of AI/ML-powered procurement and analytics solutions. You will work closely with cross-functional teams to conceptualize and deploy platforms that identify cost-saving opportunities, enhance supplier management, and deliver business intelligence to enterprise clients.

Roles & Responsibilities:
- Lead end-to-end project management of spend analytics and procurement automation solutions.
- Implement AI-driven sourcing and savings assessment engines across multiple spend categories, including IT, Temp Labor, and Travel.
- Drive the architecture of GenAI-integrated platforms for PO-contract matching and compliance monitoring.
- Build and deliver business cases, custom demos, and POCs for prospective clients during pre-sales cycles.
- Collaborate with clients to understand pain points and tailor BI dashboards and tools that drive actionable insights.
- Drive client success through continuous program governance, risk mitigation, and value realization.
- Mentor junior team members and lead multi-disciplinary teams for project execution and delivery.

Professional & Technical Skills:
Must-have skills:
- Project management for spend analytics
- Expertise in spend analytics and NLP-powered classification tools
- Contract analytics, supplier clustering, and MDM frameworks
Good-to-have skills:
- ML/NLP tools for text classification and anomaly detection
- Cloud platforms such as AWS and Databricks
- SQL/NoSQL and data governance frameworks

Posted 3 weeks ago

Apply

7.0 - 12.0 years

18 - 22 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Architect
Project Role Description: Architects the data platform blueprint and implements the design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills: GCP Dataflow
Good to have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: BTech

Summary: As a Data Platform Architect, you will architect the data platform blueprint and implement the design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. Your day will involve designing and implementing data platform components, ensuring seamless integration across systems and data models.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead data platform architecture design and implementation
- Ensure seamless integration between data platform components
- Provide guidance and support to Integration Architects and Data Architects

Professional & Technical Skills:
- Must-have skills: Proficiency in GCP Dataflow
- Strong understanding of cloud data architecture
- Experience with data modeling and data integration
- Hands-on experience with data platform implementation
- Knowledge of data governance and security practices

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in GCP Dataflow
- This position is based at our Bengaluru office
- A BTech degree is required

Posted 3 weeks ago

Apply

7.0 - 12.0 years

15 - 19 Lacs

Mumbai

Work from Office

Project Role: Program/Project Management Lead
Project Role Description: Manage overall delivery of a program or project to achieve business outcomes. Define project scope and monitor execution of deliverables. Communicate across multiple stakeholders to manage expectations, issues, and outcomes.
Must have skills: SAP Master Data Governance MDG Tool
Good to have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As a Program/Project Management Lead, you will oversee the delivery of a program or project to meet business objectives. You will define project scope, monitor deliverables, and communicate with various stakeholders to manage expectations and outcomes effectively.

Roles & Responsibilities:
- Expected to be an SME; collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead project planning and execution.
- Manage project scope, budget, and timeline.
- Identify and mitigate project risks.
- Ensure project deliverables meet quality standards.

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP Master Data Governance MDG Tool.
- Strong understanding of data governance principles.
- Experience in managing large-scale projects.
- Excellent communication and stakeholder management skills.
- Good-to-have skills: Experience with SAP ERP systems.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP Master Data Governance MDG Tool.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

Posted 3 weeks ago

Apply

7.0 - 12.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must have skills: SAP Master Data Governance MDG Tool
Good to have skills: NA
Minimum 7.5 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop solutions and ensure applications align with business needs.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead and mentor junior professionals
- Conduct regular team meetings to discuss progress and challenges
- Stay updated on industry trends and technologies to enhance team performance

Professional & Technical Skills:
- Must-have skills: Proficiency in SAP Master Data Governance MDG Tool
- Strong understanding of data governance principles
- Experience in data modeling and data quality management
- Knowledge of SAP ERP systems and integration with the MDG Tool
- Hands-on experience in configuring and customizing MDG Tool functionalities

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP Master Data Governance MDG Tool
- This position is based at our Hyderabad office
- 15 years of full-time education is required

Posted 3 weeks ago

Apply

5.0 - 10.0 years

10 - 14 Lacs

Pune

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: SAP Master Data Governance MDG Tool
Good to have skills: NA
Minimum 5 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact
- Manage the team and ensure successful project delivery

Professional & Technical Skills:
- Must-have skills: Technical proficiency in SAP Master Data Governance MDG Tool
- Strong understanding of data governance principles and best practices
- Experience in implementing and configuring the SAP MDG Tool
- Knowledge of data modeling and data integration concepts
- Hands-on experience in data quality management and data cleansing techniques

Additional Information:
- The candidate should have a minimum of 5 years of experience in SAP Master Data Governance MDG Tool
- This position is based in Mumbai
- 15 years of full-time education is required

Posted 3 weeks ago

Apply

5.0 - 10.0 years

35 - 45 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid

Responsibilities:
- Write specifications for Master Data Management builds
- Create requirements, including rules of survivorship, for migrating data to Markit EDM
- Support implementation of data governance
- Support testing
- Develop data quality reports for the data warehouse

Required Candidate Profile:
- 5+ years of experience documenting data management requirements
- Experience writing technical specifications for MDM builds
- Familiarity with enterprise data warehouses
- Knowledge of data governance
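The "rules of survivorship" mentioned above decide which source system's value wins when duplicate records are merged into a single golden record. A minimal illustrative sketch of one common rule (most recent non-null value wins); the field names, source systems, and data are invented for the example, not taken from the posting:

```python
from datetime import date

# Candidate records for the same entity, from different source systems.
records = [
    {"source": "CRM", "updated": date(2024, 3, 1), "email": "a@x.com", "phone": None},
    {"source": "ERP", "updated": date(2024, 5, 9), "email": None, "phone": "555-0101"},
    {"source": "Legacy", "updated": date(2023, 1, 2), "email": "old@x.com", "phone": "555-0000"},
]

def survive(records, field):
    """Survivorship rule: the most recently updated non-null value wins."""
    candidates = [r for r in records if r[field] is not None]
    return max(candidates, key=lambda r: r["updated"])[field]

golden = {f: survive(records, f) for f in ("email", "phone")}
print(golden)  # {'email': 'a@x.com', 'phone': '555-0101'}
```

Real MDM tools express such rules declaratively (per-attribute precedence, recency, source trust scores); the specification work described in the listing is deciding and documenting which rule applies to which attribute.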

Posted 3 weeks ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Mumbai

Work from Office

Summary:
We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
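The knowledge-graph stack this listing names (RDF triples queried with SPARQL) boils down to storing subject-predicate-object facts and pattern-matching over them. A toy pure-Python sketch of that core idea; production work would use an RDF library and a triple store, and every name in the data here is invented for illustration:

```python
# Facts as (subject, predicate, object) triples -- the core RDF data model.
triples = {
    ("Acme", "type", "Supplier"),
    ("Acme", "locatedIn", "Pune"),
    ("WidgetA", "suppliedBy", "Acme"),
    ("WidgetA", "type", "Product"),
}

def match(pattern):
    """SPARQL-like basic graph pattern match: None acts as a variable."""
    s, p, o = pattern
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "Which entities are Suppliers?"  ~  SELECT ?s WHERE { ?s :type :Supplier }
suppliers = [s for s, _, _ in match((None, "type", "Supplier"))]
print(suppliers)  # ['Acme']
```

An ontology (BFO/CCO, OWL) then constrains which predicates and classes are valid, so that queries and inference behave consistently across sources.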

Posted 3 weeks ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Remote

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
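Of the must-have skills in this listing, SQL window functions are the easiest to illustrate in a self-contained way. A small sketch using Python's built-in sqlite3 (SQLite 3.25+ assumed for window-function support; the table and data are invented for the example, and the same query shape carries over to Snowflake or BigQuery):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE spend (category TEXT, supplier TEXT, amount REAL)")
conn.executemany("INSERT INTO spend VALUES (?, ?, ?)", [
    ("IT", "Acme", 500.0), ("IT", "Globex", 900.0),
    ("Travel", "Initech", 300.0), ("Travel", "Hooli", 120.0),
])

# ROW_NUMBER() OVER (PARTITION BY ...) ranks suppliers by amount within each category.
rows = conn.execute("""
    SELECT category, supplier,
           ROW_NUMBER() OVER (PARTITION BY category ORDER BY amount DESC) AS rnk
    FROM spend
""").fetchall()

top = [(c, s) for c, s, r in rows if r == 1]
print(sorted(top))  # [('IT', 'Globex'), ('Travel', 'Initech')]
```

The window function computes a per-group ranking without collapsing rows the way GROUP BY would, which is why it is the standard tool for "top N per group" transformations.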

Posted 3 weeks ago

Apply

8.0 - 12.0 years

12 - 16 Lacs

Hyderabad

Work from Office

At Storable, we're on a mission to power the future of storage. Our innovative platform helps businesses manage, track, and grow their self-storage operations, and we're looking for a Data Manager to join our data-driven team. Storable is committed to leveraging cutting-edge technologies to improve the efficiency, accessibility, and insights derived from data, empowering our team to make smarter decisions and foster impactful growth.

As a Data Manager, you will play a pivotal role in overseeing and shaping our data operations, ensuring that our data is organized, accessible, and effectively managed across the organization. You will lead a talented team, work closely with cross-functional teams, and drive the development of strategies to enhance data quality, availability, and security.

Key Responsibilities:
- Lead Data Management Strategy: Define and execute the data management vision, strategy, and best practices, ensuring alignment with Storable's business goals and objectives.
- Oversee Data Pipelines: Design, implement, and maintain scalable data pipelines using industry-standard tools to efficiently process and manage large-scale datasets.
- Ensure Data Quality & Governance: Implement data governance policies and frameworks to ensure data accuracy, consistency, and compliance across the organization.
- Manage Cross-Functional Collaboration: Partner with engineering, product, and business teams to make data accessible and actionable, and ensure it drives informed decision-making.
- Optimize Data Infrastructure: Leverage modern data tools and platforms (e.g., AWS, Apache Airflow, Apache Iceberg) to create an efficient, reliable, and scalable data infrastructure.
- Monitor & Improve Performance: Proactively monitor data processes and workflows, troubleshoot issues, and optimize performance to ensure high reliability and data integrity.
- Mentorship & Leadership: Lead and develop a team of data engineers and analysts, fostering a collaborative environment where innovation and continuous improvement are valued.

Qualifications:
- Proven Expertise in Data Management: Significant experience in managing data infrastructure, data governance, and optimizing data pipelines at scale.
- Technical Proficiency: Strong hands-on experience with data tools and platforms such as Apache Airflow, Apache Iceberg, and AWS services (S3, Lambda, Redshift, Glue).
- Data Pipeline Mastery: Familiarity with designing, implementing, and optimizing data pipelines and workflows in Python or other languages for data processing.
- Experience with Data Governance: Solid understanding of data privacy, quality control, and governance best practices.
- Leadership Skills: Ability to lead and mentor teams, influence stakeholders, and drive data initiatives across the organization.
- Analytical Mindset: Strong problem-solving abilities and a data-driven approach to improving business operations.
- Excellent Communication: Ability to communicate complex data concepts to both technical and non-technical stakeholders effectively.

Bonus Points:
- Experience with visualization tools (e.g., Looker, Tableau) and reporting frameworks to provide actionable insights.

Why Storable?
- Cutting-Edge Technology: Work with the latest tools and technologies to solve complex data challenges.
- Impactful Work: Join a dynamic and growing company where your work directly contributes to shaping the future of the storage industry.
- Collaborative Culture: Be part of a forward-thinking, inclusive environment where innovation and teamwork are at the core of everything we do.
- Career Growth: We believe in continuous learning and provide ample opportunities for personal and professional development.

Posted 3 weeks ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Hyderabad

Work from Office

What You Will Do:
As a Data Governance Architect at Kanerika, you will play a pivotal role in shaping and executing the enterprise data governance strategy. Your responsibilities include:

1. Strategy, Framework, and Governance Operating Model
- Develop and maintain enterprise-wide data governance strategies, standards, and policies.
- Align governance practices with business goals like regulatory compliance and analytics readiness.
- Define roles and responsibilities within the governance operating model.
- Drive governance maturity assessments and lead change management initiatives.

2. Stakeholder Alignment & Organizational Enablement
- Collaborate across IT, legal, business, and compliance teams to align governance priorities.
- Define stewardship models and create enablement, training, and communication programs.
- Conduct onboarding sessions and workshops to promote governance awareness.

3. Architecture Design for Data Governance Platforms
- Design scalable and modular data governance architecture.
- Evaluate tools like Microsoft Purview, Collibra, Alation, BigID, and Informatica.
- Ensure integration with metadata, privacy, quality, and policy systems.

4. Microsoft Purview Solution Architecture
- Lead end-to-end implementation and management of Microsoft Purview.
- Configure RBAC, collections, metadata scanning, business glossary, and classification rules.
- Implement sensitivity labels, insider risk controls, retention, data map, and audit dashboards.

5. Metadata, Lineage & Glossary Management
- Architect metadata repositories and ingestion workflows.
- Ensure end-to-end lineage (ADF → Synapse → Power BI).
- Define governance over the business glossary and approval workflows.

6. Data Classification, Access & Policy Management
- Define and enforce rules for data classification, access, retention, and sharing.
- Align with GDPR, HIPAA, CCPA, and SOX regulations.
- Use Microsoft Purview and MIP for policy enforcement automation.

7. Data Quality Governance
- Define KPIs, validation rules, and remediation workflows for enterprise data quality.
- Design scalable quality frameworks integrated into data pipelines.

8. Compliance, Risk, and Audit Oversight
- Identify risks and define standards for compliance reporting and audits.
- Configure usage analytics, alerts, and dashboards for policy enforcement.

9. Automation & Integration
- Automate governance processes using PowerShell, Azure Functions, Logic Apps, and REST APIs.
- Integrate governance tools with Azure Monitor, Synapse Link, Power BI, and third-party platforms.

Requirements:
- 5+ years in data governance and management.
- Expertise in Microsoft Purview, Informatica, and related platforms.
- Experience leading end-to-end governance initiatives.
- Strong understanding of metadata, lineage, policy management, and compliance regulations.
- Hands-on skills in Azure Data Factory, REST APIs, PowerShell, and governance architecture.
- Familiarity with Agile methodologies and stakeholder communication.

Benefits:
1. Culture:
- Open Door Policy: encourages open communication and accessibility to management.
- Open Office Floor Plan: fosters a collaborative and interactive work environment.
- Flexible Working Hours: allows employees flexibility in their work schedules.
- Employee Referral Bonus: rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: provides regular performance evaluations and feedback.

2. Inclusivity and Diversity:
- Hiring practices that promote diversity: ensures a diverse and inclusive workforce.
- Mandatory POSH training: promotes a safe and respectful work environment.

3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: offers medical coverage and financial protection.
- Health Insurance: provides coverage for medical expenses.
- Disability Insurance: offers financial support in case of disability.

4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: create opportunities for employees and their families to bond.
- Generous Parental Leave: allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: offers leave for employees to care for family members' medical needs.

5. Perks and Time-Off Benefits:
- Company-sponsored outings: organized recreational activities for employees.
- Gratuity: a monetary benefit as a token of appreciation.
- Provident Fund: helps employees save for retirement.
- Generous PTO: more than the industry standard for paid time off.
- Paid sick days: paid time off when employees are unwell.
- Paid holidays: paid time off for designated holidays.
- Bereavement Leave: time off for employees to grieve the loss of a loved one.

6. Professional Development Benefits:
- L&D with FLEX, the enterprise learning repository: access to a learning repository for professional development.
- Mentorship Program: guidance and support from experienced professionals.
- Job Training: training to enhance job-related skills.
- Professional Certification Reimbursements: assists employees in obtaining professional certifications.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

12 - 16 Lacs

Pune

Work from Office

I am sharing the job description (JD) for the Data Architect role. We are looking for someone who can join as soon as possible, and I have included a few key points below. The ideal candidate should have between 6 and 8 years of experience, and I am flexible for a strong profile.

Job Description:

Key Responsibilities:
The ideal profile should have a strong foundation in data concepts, design, and strategy, with the ability to work across diverse technologies in an agnostic manner.

Transactional Database Architecture:
- Design and implement high-performance, reliable, and scalable transactional database architectures.
- Collaborate with cross-functional teams to understand transactional data requirements and create solutions that ensure data consistency, integrity, and availability.
- Optimize database designs and recommend best practices and technology stacks.
- Oversee the management of entire transactional databases, including modernization and de-duplication initiatives.

Data Lake Architecture:
- Design and implement data lakes that consolidate data from disparate sources into a unified, scalable storage solution.
- Architect and deploy cloud-based or on-premises data lake infrastructure.
- Ensure self-service capabilities across the data engineering space for the business.
- Work closely with Data Engineers, Product Owners, and Business teams.

Data Integration & Governance:
- Understand ingestion and orchestration strategies.
- Implement data sharing and data exchange, and assess data sensitivity and criticality to recommend appropriate designs.
- Basic understanding of data governance practices.

Innovation:
- Evaluate and implement new technologies, tools, and frameworks to improve data accessibility, performance, and scalability.
- Stay up to date with industry trends and best practices to continuously innovate and enhance the data architecture strategy.

Location: Pune
Brand: Merkle
Time Type: Full time
Contract Type: Permanent

Posted 3 weeks ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Job Title: Data Governance Associate
Location: Hyderabad
Experience: 4-6 Years

Key Responsibilities:

1. Data Quality and Master Data Management:
- Assist in the development and implementation of data quality frameworks and master data management processes.
- Monitor and report on data quality metrics, identifying areas for improvement and ensuring compliance with data governance standards.

2. Data Object Dictionary:
- Support the creation and maintenance of a comprehensive data object dictionary to ensure that data assets are well documented and easily accessible.
- Collaborate with cross-functional teams to standardize terminology and enhance understanding of data objects across the organization.

3. Data Governance Support:
- Assist in the execution of data governance initiatives, including data stewardship and data lifecycle management.
- Participate in data governance meetings and contribute to the development of policies and procedures that promote data integrity and compliance.

4. Collaboration and Communication:
- Work closely with data owners, data stewards, and other stakeholders to ensure alignment on data governance practices.
- Communicate effectively with technical and non-technical teams to promote data governance awareness and best practices.

Qualifications:

Education:
- Bachelor's degree in Computer Science, Information Management, or a related field.

Experience:
- 3 to 4 years of experience in data governance, data quality, and master data management.
- Familiarity with data object dictionaries and data documentation practices.
- Experience in monitoring and improving data quality metrics.

Technical Skills:
- Basic proficiency in SQL for querying and extracting data from databases.
- Familiarity with data governance tools and platforms is a plus.
- Understanding of data integration techniques and tools.

Soft Skills:
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication skills, with the ability to convey complex data concepts clearly.
- Ability to work collaboratively in a team environment and manage multiple tasks effectively.

Preferred Qualifications:
- Experience in supporting data governance initiatives and projects.
- Knowledge of data quality principles and practices.
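"Monitoring and reporting on data quality metrics," as this role describes, usually starts with simple completeness checks. A minimal sketch of one such metric; the record structure and field names are hypothetical, invented only to make the idea concrete:

```python
# Sample records with some missing values (all data invented for illustration).
records = [
    {"customer_id": 1, "email": "a@x.com", "country": "IN"},
    {"customer_id": 2, "email": None, "country": "IN"},
    {"customer_id": 3, "email": "c@x.com", "country": None},
]

def completeness(rows, field):
    """Share of rows with a non-null value for `field` -- a basic DQ metric."""
    filled = sum(1 for r in rows if r[field] is not None)
    return filled / len(rows)

report = {f: round(completeness(records, f), 2)
          for f in ("customer_id", "email", "country")}
print(report)  # {'customer_id': 1.0, 'email': 0.67, 'country': 0.67}
```

In practice such metrics are computed per dataset on a schedule, tracked against thresholds, and surfaced in governance dashboards so that declining scores trigger remediation.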

Posted 3 weeks ago

Apply

10.0 - 14.0 years

45 - 55 Lacs

Bengaluru

Work from Office

As a Senior Engineering Manager - Myntra Data Platform, you will oversee the technical aspects of the data platform, drive innovation, and ensure efficient data management processes. Your role will have a significant impact on the organization's data strategy and overall business objectives.

Roles and Responsibilities:
- Lead and mentor a team of engineers to deliver high-quality data solutions.
- Develop and execute strategies for data platform scalability and performance optimization.
- Collaborate with cross-functional teams to align data platform initiatives with business goals.
- Define and implement best practices for data governance, security, and compliance.
- Drive continuous improvement through innovation and technological advancement.
- Monitor and analyze data platform metrics to identify areas for enhancement.
- Ensure seamless integration of new data sources and technologies into the platform.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10-14 years of experience in engineering roles with a focus on data management and analysis.
- Proven experience in leading high-performing engineering teams.
- Strong proficiency in data architecture, ETL processes, and database technologies.
- Excellent communication and collaboration skills to work effectively with stakeholders.
- Relevant certifications in data management or related fields are a plus.

Who are we? Myntra is India's leading fashion and lifestyle platform, where technology meets creativity. As pioneers in fashion e-commerce, we've always believed in disrupting the ordinary. We thrive on a shared passion for fashion, a drive to innovate and lead, and an environment that empowers each one of us to pave our own way. We're bold in our thinking, agile in our execution, and collaborative in spirit. Here, we create MAGIC by inspiring vibrant and joyous self-expression and expanding fashion possibilities for India, while staying true to what we believe in.
We believe in taking bold bets and changing the fashion landscape of India. We are a company that is constantly evolving into newer and better forms, and we look for people who are ready to evolve with us. From our humble beginnings as a customization company in 2007 to being technology and fashion pioneers today, Myntra is going places, and we want you to take part in this journey with us. Working at Myntra is challenging but fun: we are a young and dynamic team, firm believers in meritocracy and equal opportunity, who encourage intellectual curiosity and empower our teams with the right tools, space, and opportunities.

Posted 3 weeks ago

Apply

9.0 - 14.0 years

0 - 0 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities:
- Design and develop conceptual, logical, and physical data models for enterprise and application-level databases.
- Translate business requirements into well-structured data models that support analytics, reporting, and operational systems.
- Define and maintain data standards, naming conventions, and metadata for consistency across systems.
- Collaborate with data architects, engineers, and analysts to implement models into databases and data warehouses/lakes.
- Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements.
- Create entity relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships.
- Support data governance initiatives including data lineage, quality, and cataloging.
- Review and validate data models with business and technical stakeholders.
- Provide guidance on normalization, denormalization, and performance tuning of database designs.
- Ensure models comply with organizational data policies, security, and regulatory requirements.

Summary: We are looking for a Data Modeler/Architect to design conceptual, logical, and physical data models and translate business needs into scalable models for analytics and operational systems. Must be strong in normalization, denormalization, ERDs, and data governance practices; experience with star/snowflake schemas and medallion architecture is preferred. The role requires close collaboration with architects, engineers, and analysts.

Keywords: Data modelling, Normalization, Denormalization, Star and snowflake schemas, Medallion architecture, ERD, Logical data model, Physical data model, Conceptual data model
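As a rough illustration of the star-schema modeling named above, the sketch below splits denormalized rows into a fact table and a product dimension keyed by a surrogate key, so each product is stored once and facts reference it by key. All field names here are hypothetical.

```python
# Denormalized source rows (hypothetical fields): product attributes
# are repeated on every sale, which a star schema factors out.
sales = [
    {"sale_id": 1, "product": "Widget", "category": "Tools", "amount": 9.99},
    {"sale_id": 2, "product": "Widget", "category": "Tools", "amount": 9.99},
    {"sale_id": 3, "product": "Gadget", "category": "Toys",  "amount": 4.50},
]

dim_product = {}   # natural key (product, category) -> surrogate key
fact_sales = []

for row in sales:
    nk = (row["product"], row["category"])
    sk = dim_product.setdefault(nk, len(dim_product) + 1)
    fact_sales.append({"sale_id": row["sale_id"],
                       "product_sk": sk,
                       "amount": row["amount"]})

# The dimension holds each distinct product once; the fact table
# references it by surrogate key instead of repeating attributes.
print(len(dim_product))  # 2 distinct products
print(fact_sales[0])
```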

Posted 3 weeks ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Chennai

Work from Office

Role Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for an enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality.
- Provide training and support for end-users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.
- Participate in regular team meetings to discuss project progress and share insights.
- Assist in the development of training materials for internal stakeholders.

Qualifications:
- Minimum 8 years of experience in Power BI and Figma.
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team in a remote setting.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.

Posted 3 weeks ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Kolkata

Work from Office

Summary: We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
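The basic-graph-pattern matching that underlies SPARQL queries over a knowledge graph can be sketched in a few lines of pure Python. This is a toy illustration only, not rdflib or a real triple store, and the prefixed IRIs (`acme:`, `cco:`) are made up for the example.

```python
# Toy knowledge graph: a set of (subject, predicate, object) triples.
triples = {
    ("acme:OrderService", "rdf:type", "cco:InformationSystem"),
    ("acme:Order123", "rdf:type", "acme:Order"),
    ("acme:Order123", "acme:processedBy", "acme:OrderService"),
}

def match(s=None, p=None, o=None):
    """SPARQL-style triple pattern match; None behaves like a variable."""
    return [t for t in triples
            if (s is None or t[0] == s)
            and (p is None or t[1] == p)
            and (o is None or t[2] == o)]

# "SELECT ?s WHERE { ?s rdf:type acme:Order }" in miniature:
orders = [s for s, _, _ in match(p="rdf:type", o="acme:Order")]
print(orders)  # ['acme:Order123']
```

A real implementation would add indexing by subject/predicate/object and join multiple patterns, which is what production triple stores do at scale.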

Posted 3 weeks ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Chennai

Work from Office

Employment Type: Contract (Remote).

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
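The SQL window functions called out above are a common tool for the data-quality work this role describes, for example keeping only the latest row per key in a staging table. Below is a minimal sketch using Python's bundled sqlite3 rather than Snowflake or BigQuery (window functions require SQLite 3.25+); the table and column names are hypothetical, but the `ROW_NUMBER() OVER (PARTITION BY ...)` pattern carries over to those warehouses.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE staging_orders (order_id TEXT, amount REAL, updated_at TEXT)")
conn.executemany(
    "INSERT INTO staging_orders VALUES (?, ?, ?)",
    [("A1", 100, "2024-01-01"), ("A1", 120, "2024-02-01"), ("B2", 80, "2024-01-15")],
)

# Keep only the most recent row per order_id: rank rows within each
# partition by updated_at descending, then keep rank 1.
rows = conn.execute("""
    SELECT order_id, amount FROM (
        SELECT order_id, amount,
               ROW_NUMBER() OVER (PARTITION BY order_id
                                  ORDER BY updated_at DESC) AS rn
        FROM staging_orders
    ) WHERE rn = 1
    ORDER BY order_id
""").fetchall()

print(rows)  # [('A1', 120.0), ('B2', 80.0)]
```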

Posted 3 weeks ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Chennai

Work from Office

Job Title: Sr. Data Engineer - Ontology & Knowledge Graph Specialist
Department: Platform Engineering

Summary: We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Role Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for an enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality.
- Provide training and support for end-users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.
- Participate in regular team meetings to discuss project progress and share insights.
- Assist in the development of training materials for internal stakeholders.

Qualifications:
- Minimum 8 years of experience in Power BI and Figma.
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team in a remote setting.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Kolkata

Work from Office

Role Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for an enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality.
- Provide training and support for end-users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.
- Participate in regular team meetings to discuss project progress and share insights.
- Assist in the development of training materials for internal stakeholders.

Qualifications:
- Minimum 8 years of experience in Power BI and Figma.
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team in a remote setting.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.

Posted 3 weeks ago

Apply

7.0 - 9.0 years

7 - 12 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

About KPI Partners: KPI Partners is a leading provider of enterprise software solutions, enabling organizations to harness the power of data through innovation and advanced analytics. Our team is dedicated to delivering high-quality services and products that transform the way businesses operate.

We are seeking a seasoned Senior Report Developer - Tableau to drive and manage a strategic migration project from Tableau to Power BI. This role requires strong hands-on experience with Tableau dashboards, a deep understanding of Power BI, and the technical leadership to manage the end-to-end migration with minimal disruption to business users. You will serve as the bridge between business stakeholders, BI developers, and project managers, ensuring a seamless transition while maintaining data integrity, performance, and user experience.

Key Responsibilities:
- Lead the assessment, planning, and execution of the Tableau-to-Power BI migration.
- Analyze existing Tableau dashboards, data models, and data sources to define the migration scope and approach.
- Translate Tableau visualizations and logic (calculated fields, filters, LOD expressions, etc.) into Power BI equivalents.
- Oversee data model optimization, dashboard redesign, and user experience improvements during the migration.
- Collaborate with business stakeholders to validate migrated reports and ensure alignment with reporting needs.
- Manage a team of BI developers and provide technical direction and code reviews.
- Ensure proper version control, documentation, and change management throughout the migration.
- Establish performance benchmarks and implement best practices for Power BI deployment.
- Conduct knowledge transfer and training sessions for end-users transitioning to Power BI.

Required Qualifications:
- 7+ years of experience in Business Intelligence and Analytics.
- 5+ years of hands-on experience with Tableau development (dashboards, data blending, LODs, parameters).
- 2+ years of experience with Power BI, including DAX, Power Query, and data modeling.
- Proven experience in BI migration projects (preferably Tableau to Power BI).
- Strong SQL skills and experience working with relational databases (SQL Server, Oracle, etc.).
- Solid understanding of data governance, security, and access controls.
- Excellent communication, leadership, and stakeholder management skills.

Why Join KPI Partners?
- Opportunity to work with a talented and passionate team in a fast-paced environment.
- Competitive salary and benefits package.
- Continuous learning and professional development opportunities.
- A collaborative and inclusive workplace culture that values diversity and innovation.

KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Join us at KPI Partners and help us unlock the power of data for our clients!

Posted 3 weeks ago

Apply

12.0 - 15.0 years

16 - 20 Lacs

Chennai

Work from Office

Group Company: MINDSPRINT DIGITAL (INDIA) PRIVATE LIMITED
Designation: Data Solution Architect

Job Description:
- Design, architect, and implement scalable data solutions on Google Cloud Platform (GCP) to meet the strategic data needs of the organization.
- Lead the integration of diverse data sources into a unified data platform, ensuring seamless data flow and accessibility across the organization.
- Develop and enforce robust data governance, security, and compliance frameworks tailored to GCP's architecture.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to translate business requirements into technical data solutions.
- Optimize data storage, processing, and analytics solutions using GCP services such as BigQuery, Dataflow, and Cloud Storage.
- Drive the adoption of best practices in data architecture and cloud computing to enhance the performance, reliability, and scalability of data solutions.
- Conduct regular reviews and audits of the data architecture to ensure alignment with evolving business goals and technology advancements.
- Stay informed about emerging GCP technologies and industry trends to continuously improve data solutions and drive innovation.

Profile Description:
- Experience: 12-15 years of experience in data architecture, with extensive expertise in Google Cloud Platform (GCP).
- Skills: Deep understanding of GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM. Proficiency in data modeling, ETL processes, and data warehousing.
- Qualifications: Master's degree in Computer Science, Data Engineering, or a related field.
- Competencies: Strong leadership abilities, with a proven track record of managing large-scale data projects. Ability to balance technical and business needs in designing data solutions.
- Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification preferred.
- Knowledge: Extensive knowledge of data governance, security best practices, and compliance in cloud environments. Familiarity with big data technologies like Apache Hadoop and Spark.
- Soft Skills: Excellent communication skills to work effectively with both technical teams and business stakeholders. Ability to lead and mentor a team of data engineers and architects.
- Tools: Experience with version control (Git), CI/CD pipelines, and automation tools. Proficient in SQL, Python, and data visualization tools like Looker or Power BI.

Posted 3 weeks ago

Apply

12.0 - 15.0 years

15 - 19 Lacs

Chennai

Work from Office

Group Company: MINDSPRINT DIGITAL (INDIA) PRIVATE LIMITED
Designation: Data Solution Architect

Job Description:
- Design, architect, and implement scalable data solutions on Google Cloud Platform (GCP) to meet the strategic data needs of the organization.
- Lead the integration of diverse data sources into a unified data platform, ensuring seamless data flow and accessibility across the organization.
- Develop and enforce robust data governance, security, and compliance frameworks tailored to GCP's architecture.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to translate business requirements into technical data solutions.
- Optimize data storage, processing, and analytics solutions using GCP services such as BigQuery, Dataflow, and Cloud Storage.
- Drive the adoption of best practices in data architecture and cloud computing to enhance the performance, reliability, and scalability of data solutions.
- Conduct regular reviews and audits of the data architecture to ensure alignment with evolving business goals and technology advancements.
- Stay informed about emerging GCP technologies and industry trends to continuously improve data solutions and drive innovation.

Profile Description:
- Experience: 12-15 years of experience in data architecture, with extensive expertise in Google Cloud Platform (GCP).
- Skills: Deep understanding of GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM. Proficiency in data modeling, ETL processes, and data warehousing.
- Qualifications: Master's degree in Computer Science, Data Engineering, or a related field.
- Competencies: Strong leadership abilities, with a proven track record of managing large-scale data projects. Ability to balance technical and business needs in designing data solutions.
- Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification preferred.
- Knowledge: Extensive knowledge of data governance, security best practices, and compliance in cloud environments. Familiarity with big data technologies like Apache Hadoop and Spark.
- Soft Skills: Excellent communication skills to work effectively with both technical teams and business stakeholders. Ability to lead and mentor a team of data engineers and architects.
- Tools: Experience with version control (Git), CI/CD pipelines, and automation tools. Proficient in SQL, Python, and data visualization tools like Looker or Power BI.

Posted 3 weeks ago

Apply
cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies