5.0 - 7.0 years
10 - 14 Lacs
Ahmedabad
Work from Office
Summary: We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.
Responsibilities:
Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.
Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.
Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.
Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.
Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.
Qualifications:
Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.
Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
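To give a flavor of the semantic web stack this role calls for, here is a minimal Python sketch using rdflib that builds a tiny RDF graph and queries it with SPARQL. The ex: namespace and the Supplier class are invented for illustration and are not taken from the posting.

```python
from rdflib import Graph, Literal, Namespace, RDF, RDFS

EX = Namespace("http://example.com/ontology#")  # hypothetical namespace

g = Graph()
g.bind("ex", EX)

# Declare a class and one instance (assumed mini-ontology for illustration)
g.add((EX.Supplier, RDF.type, RDFS.Class))
g.add((EX.acme, RDF.type, EX.Supplier))
g.add((EX.acme, RDFS.label, Literal("ACME Corp")))

# SPARQL query over the in-memory graph
results = g.query("""
    PREFIX ex: <http://example.com/ontology#>
    PREFIX rdfs: <http://www.w3.org/2000/01/rdf-schema#>
    SELECT ?s ?label WHERE {
        ?s a ex:Supplier ;
           rdfs:label ?label .
    }
""")
for row in results:
    print(row.s, row.label)
```

In a production setting the same triples would live in one of the triple stores the posting names (e.g., GraphDB or Stardog) rather than an in-memory graph.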
Posted 2 weeks ago
10.0 - 15.0 years
25 - 30 Lacs
Pune
Work from Office
We're seeking a future team member for the role of Vice President, Data Management Engineer I to join our Data Solution Platform team. This role is located in Pune, Maharashtra - HYBRID.
In this role, you'll make an impact in the following ways:
- The Data Governance Analyst collaborates with the Platform's Data Leader / DDO to drive data management within the Platform. This role oversees the implementation and enforcement of data governance policies and procedures, and works closely with stakeholders to define data standards and ensure regulatory compliance.
- Work under the direction of the data leader or manager to implement and enforce data governance policies and procedures.
- Identify, manage, and measure data risks.
- Participate in the Platform's data maturity assessment.
- Manage the identification and maintenance of authoritative data for the Platform, including metadata.
- Adhere to the requirements of the data management policies.
To be successful in this role, we're seeking the following:
- Bachelor's degree or applicable work experience
- Eager to learn and lead data initiatives
Posted 2 weeks ago
4.0 - 9.0 years
4 - 9 Lacs
Pune, Chennai, Bengaluru
Work from Office
Roles and Responsibilities:
- Collaborate with cross-functional teams to design, develop, test, deploy, and maintain Collibra DQ solutions.
- Ensure seamless integration of Collibra DQ with other systems using APIs.
- Provide technical guidance on data governance best practices to stakeholders.
- Troubleshoot issues related to Collibra DQ implementation and provide timely resolutions.
- Participate in agile development methodologies such as Scrum.
Desired Candidate Profile:
- 4-9 years of experience in Collibra Data Quality (DQ) development or similar roles.
- Strong understanding of SQL queries for data extraction and manipulation.
- Experience working with API integrations for system connectivity.
- Bachelor's degree in any specialization (BCA or B.Sc).
- Proficiency in Agile tools for testing purposes.
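Since the role centers on wiring Collibra DQ to other systems over REST, here is a minimal sketch of that integration pattern in Python. The host, endpoint path, and JSON field names are hypothetical placeholders, not Collibra's documented API; a real integration would follow the vendor's published API reference.

```python
import requests

BASE_URL = "https://collibra-dq.example.com"  # hypothetical host
session = requests.Session()
session.headers.update({"Authorization": "Bearer <token>"})  # token acquisition omitted

# Fetch results of a data-quality job and flag failed rules
# (endpoint and response shape are illustrative assumptions)
resp = session.get(f"{BASE_URL}/api/dq/jobs/daily_orders_check/results", timeout=30)
resp.raise_for_status()

for rule in resp.json().get("rules", []):
    if rule.get("status") == "FAILED":
        print(f"Rule {rule['name']} failed: {rule.get('breakCount', 0)} breaking records")
```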
Posted 2 weeks ago
8.0 - 13.0 years
14 - 18 Lacs
Hyderabad
Work from Office
Overview: Customer Data Stewardship Sr Analyst (IBP)
Job Overview: PepsiCo Data Governance Program
PepsiCo is establishing a Data Governance program that will be the custodian of the processes, policies, rules, and standards by which the Company will define its most critical data. Enabling this program will:
- Define ownership and accountability of our critical data assets to ensure they are effectively managed and maintain integrity throughout PepsiCo's systems
- Leverage data as a strategic enterprise asset enabling data-based decision analytics
- Improve productivity and efficiency of daily business operations
Position Overview: The Customer Data Steward IBP role is responsible for working within the global data governance team and with their local businesses to maintain alignment to the Enterprise Data Governance's (EDG) processes, rules, and standards set to ensure data is fit for purpose.
Responsibilities
Primary Accountabilities:
- Deliver key elements of Data Discovery, Source Identification, Data Quality Management, and cataloging for program and Customer Domain data.
- Ensure data accuracy and adherence to PepsiCo-defined global governance practices, as well as driving acceptance of PepsiCo's enterprise data standards and policies across the various business segments.
- Maintain and advise relevant stakeholders on data governance-related matters in the customer domain and with respect to Demand Planning in IBP, with a focus on the business use of the data.
- Define Data Quality Rules from source systems, within the Enterprise Data Foundation, and through to the end-user systems to enable end-to-end Data Quality management and deliver a seamless user experience.
- Advise on various projects and initiatives to ensure that any data-related changes and dependencies are identified, communicated, and managed to ensure adherence with the established Enterprise Data Governance standards.
- Accountable for ensuring that data-centric activities are aligned with the EDG program and leverage applicable data standards, governance processes, and overall best practices.
Data Governance Business Standards:
- Ensures alignment of the data governance processes and standards with applicable enterprise, business segment, and local data support models.
- Champions the single set of Enterprise-level data standards and the repository of key elements pertaining to their in-scope data domain (e.g., Customer, Material, Vendor, Finance, Consumer), promoting their use throughout the PepsiCo organization.
Data Domain Coordination and Collaboration:
- Responsible for helping identify the need for sector-level data standards (and above) based on strategic business objectives and the evolution of enterprise-level capabilities and analytical requirements.
- Collaborates across the organization to ensure consistent and effective execution of data governance and management principles across PepsiCo's enterprise and analytical systems and data domains.
- Accountable for driving organizational acceptance of EDG-established data standards, policies, definitions, and process standards for critical and related enterprise data.
- Promotes and champions PepsiCo's Enterprise Data Governance Capability and data management program across the organization.
Qualifications
- 8+ years of experience working in Customer Operations, Demand Planning, Order to Cash, Commercial Data Governance, or Data Management within a global CPG company.
Posted 2 weeks ago
6.0 - 11.0 years
25 - 27 Lacs
Hyderabad
Work from Office
Overview: We are seeking a highly skilled and experienced Azure Data Engineer to join our dynamic team. In this critical role, you will be responsible for designing, developing, and maintaining robust and scalable data solutions on the Microsoft Azure platform. You will work closely with data scientists, analysts, and business stakeholders to translate business requirements into effective data pipelines and data models.
Responsibilities
- Design, develop, and implement data pipelines and ETL/ELT processes using Azure Data Factory, Azure Databricks, and other relevant Azure services.
- Develop and maintain data lakes and data warehouses on Azure, including Azure Data Lake Storage Gen2 and Azure Synapse Analytics.
- Build and optimize data models for data warehousing, data marts, and data lakes.
- Develop and implement data quality checks and data governance processes.
- Troubleshoot and resolve data-related issues.
- Collaborate with data scientists and analysts to support data exploration and analysis.
- Stay current with the latest advancements in cloud computing and data engineering technologies.
- Participate in all phases of the software development lifecycle, from requirements gathering to deployment and maintenance.
Qualifications
- 6+ years of experience in data engineering, with at least 3 years of experience working with Azure data services.
- Strong proficiency in SQL, Python, and other relevant programming languages.
- Experience with data warehousing and data lake architectures.
- Experience with ETL/ELT tools and technologies, such as Azure Data Factory, Azure Databricks, and Apache Spark.
- Experience with data modeling and data warehousing concepts.
- Experience with data quality and data governance best practices.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration skills.
- Experience with Agile development methodologies.
- Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred).
- Relevant Azure certifications (e.g., Azure Data Engineer Associate) are a plus.
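As a small illustration of the ETL/ELT work described above, the PySpark sketch below lands raw files from ADLS Gen2 into a Delta table on Azure Databricks. The storage paths and column names are invented for the example.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_bronze_to_silver").getOrCreate()

# Hypothetical ADLS Gen2 paths -- replace with real container/paths
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/orders/"
silver_path = "abfss://silver@examplelake.dfs.core.windows.net/orders/"

# Ingest raw CSVs, apply a basic quality filter, and add a load timestamp
orders = (
    spark.read.option("header", "true").csv(raw_path)
    .where(F.col("order_id").isNotNull())          # simple data-quality check
    .withColumn("_loaded_at", F.current_timestamp())
)

# Write as Delta, partitioned for downstream Synapse/BI queries
orders.write.format("delta").mode("append").partitionBy("order_date").save(silver_path)
```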
Posted 2 weeks ago
12.0 - 17.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Overview: We are seeking an experienced and strategic leader to join our Business Intelligence & Reporting organization as Deputy Director, BI Governance. This role will lead the design, implementation, and ongoing management of BI governance frameworks across sectors and capability centers. The ideal candidate will bring deep expertise in BI governance, data stewardship, demand management, and stakeholder engagement to ensure a standardized, scalable, and value-driven BI ecosystem across the enterprise.
Key Responsibilities
Governance Leadership:
- Define and implement the enterprise BI governance strategy, policies, and operating model.
- Drive consistent governance processes across sectors and global capability centers.
- Set standards for BI solution lifecycle, metadata management, report rationalization, and data access controls.
Stakeholder Management:
- Serve as a trusted partner to sector business leaders, IT, data stewards, and COEs to ensure alignment with business priorities.
- Lead governance councils, working groups, and decision forums to drive adoption and compliance.
Policy and Compliance:
- Establish and enforce policies related to report publishing rights, tool usage, naming conventions, and version control.
- Implement approval and exception processes for BI development outside the COE.
Demand and Intake Governance:
- Lead the governance of BI demand intake and prioritization processes.
- Ensure transparency and traceability of BI requests and outcomes across business units.
Metrics and Continuous Improvement:
- Define KPIs and dashboards to monitor BI governance maturity and compliance.
- Identify areas for process optimization and lead continuous improvement efforts.
Qualifications
- Experience: 12+ years in Business Intelligence, Data Governance, or related roles, with at least 4+ years in a leadership capacity.
- Domain Expertise: Strong understanding of BI platforms (Power BI, Tableau, etc.), data management practices, and governance frameworks.
- Strategic Mindset: Proven ability to drive change, influence at senior levels, and align governance initiatives with enterprise goals.
- Operational Excellence: Experience managing cross-functional governance processes and balancing centralized control with local flexibility.
- Education: Bachelor's degree required; MBA or Master's in Data/Analytics preferred.
Posted 2 weeks ago
0.0 years
25 - 30 Lacs
Hyderabad
Work from Office
Overview: The MDG Master Data Harmonization Senior Manager is a key contributor in sustaining, developing, and maintaining the PGT SAP master data solution. This role combines technical expertise with a deep understanding of master data processes to create robust, scalable, and efficient systems that enable data-driven decision-making. The ideal candidate will excel in master data harmonization, stakeholder collaboration, and aligning technical implementations with strategic business goals.
Responsibilities
Sustain, Design, and Maintain Harmonised SAP PGT Master Data:
- Develop, troubleshoot, and maintain robust SAP master data, including business partner (customer and vendor), material, and finance master data.
- Collaborate with stakeholders to design and implement scalable, future-proof solutions that meet business requirements.
Support Master Data Harmonisation Reports:
- Engage with business teams to highlight data differences across the landscapes, create a synchronization plan for master data, gather requirements, and translate them into effective technical designs.
- Provide advisory support to harmonise master data processes.
Ensure Master Data Is Optimized for Better System Performance:
- Ensure the stability and performance of SAP master data, performing optimization and tuning to handle growing data and user demands efficiently.
Data Integration and Automation:
- Manage data flows between PGT SAP and other systems, automating processes for data loading, transformation, and reconciliation.
Governance and Standards:
- Implement best practices for data governance, model development, documentation, and version control to maintain system reliability and accuracy.
Stakeholder Collaboration and Communication:
- Act as a liaison between technical teams and business stakeholders, translating complex technical solutions into clear, actionable outcomes for non-technical users.
Training and Support:
- Deliver training and support to business teams, empowering them to leverage master data solutions effectively for business insights.
Qualifications
Technical Expertise:
- Advanced proficiency in SAP master data processes, MDG, and supporting tools.
- Strong knowledge of data modeling, relational databases, and ETL processes.
- Familiarity with data harmonization add-ins (e.g., GDQ sustain reports) and integration with other tools (e.g., ERP systems, visualization tools).
SAP Process Knowledge:
- Solid understanding of OTC, MTD, and R2R processes and the dependencies of master data across them.
- Experience designing solutions aligned with SAP MDG-driven priorities and goals.
Solution Design and Advisory Skills:
- Expertise in analyzing business requirements and providing innovative, strategic solutions.
- Ability to design architecture for scalability, reliability, and future growth.
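To make the "data loading, transformation, and reconciliation" responsibility concrete, here is a minimal pandas sketch that diffs customer master records between two landscapes. The file names and key/attribute columns are invented for illustration; a real harmonization report would run against SAP extracts.

```python
import pandas as pd

# Hypothetical extracts of the customer master from two landscapes
ecc = pd.read_csv("ecc_customers.csv")   # e.g., columns: customer_id, name, country
s4 = pd.read_csv("s4_customers.csv")

merged = ecc.merge(s4, on="customer_id", how="outer",
                   suffixes=("_ecc", "_s4"), indicator=True)

# Records present in only one system
missing = merged[merged["_merge"] != "both"]

# Records present in both systems but with diverging attributes
both = merged[merged["_merge"] == "both"]
mismatched = both[(both["name_ecc"] != both["name_s4"]) |
                  (both["country_ecc"] != both["country_s4"])]

print(f"{len(missing)} unmatched records, {len(mismatched)} attribute mismatches")
```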
Posted 2 weeks ago
12.0 - 13.0 years
15 - 19 Lacs
Hyderabad
Work from Office
Overview: PepsiCo is embarking on a significant initiative of digitalization and standardization of the FP&A solution across all its divisions to make the finance organization more Capable, more Agile, and more Efficient. The MOSAIC program is a key enabler of that vision. It is the FP&A solution of the PepsiCo Global Template (PGT) that, for the first time, aims to integrate vertical planning for Operating Units (OUs) or markets, and horizontal planning for functions (e.g., Global Procurement, Compensation and Benefits, etc.) that have accountability across markets. The program aims to harmonize data, planning processes, and ways of working across PepsiCo markets.
The Finance Application Developer / Architect (TM1) is a key contributor in designing, developing, and maintaining financial planning and analytics solutions using IBM Planning Analytics (TM1). This role combines technical expertise with a deep understanding of finance processes to create robust, scalable, and efficient systems that enable data-driven decision-making. The ideal candidate will excel in solution design, stakeholder collaboration, and aligning technical implementations with strategic business goals.
Responsibilities
Design, Enhance, and Maintain the Mosaic Solution:
- Develop, troubleshoot, and maintain robust TM1/Planning Analytics applications, including cubes, rules, and TurboIntegrator (TI) processes, to support financial planning, forecasting, and reporting.
- Collaborate with stakeholders to design and implement scalable, future-proof solutions that meet business requirements.
Business Incident Triage:
- Engage with finance and business teams to understand objectives, gather requirements, and translate them into effective technical designs.
- Provide advisory support to optimize financial processes and restore the solution.
Optimize System Performance:
- Ensure the stability and performance of TM1 models, performing optimization and tuning to handle growing data and user demands efficiently.
Data Integration and Automation:
- Manage data flows between TM1 and other systems, automating processes for data loading, transformation, and reconciliation.
Governance and Standards:
- Implement best practices for data governance, model development, documentation, and version control to maintain system reliability and accuracy.
Training and Support:
- Deliver training and support to finance teams, empowering them to leverage TM1 solutions effectively for business insights.
Qualifications
- Bachelor's degree required; Master's degree preferred.
- 12-13+ years of experience configuring, deploying, and managing TM1 (preferred) or SAP-based Financial Planning & Analysis solutions, with a focus on topline planning.
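For readers unfamiliar with TM1 development, the sketch below shows the flavor of automating a Planning Analytics model from Python via the open-source TM1py library. The connection details, cube name, and MDX are placeholders, and exact method names can vary between TM1py versions, so treat this as an assumption-laden illustration rather than a reference.

```python
from TM1py.Services import TM1Service

# Placeholder connection details for a TM1/Planning Analytics instance
with TM1Service(address="tm1.example.com", port=8001,
                user="admin", password="secret", ssl=True) as tm1:
    # List the cubes in the model
    for cube in tm1.cubes.get_all_names():
        print(cube)

    # Pull a slice of a hypothetical planning cube via MDX
    mdx = """
    SELECT {[Version].[Forecast]} ON COLUMNS,
           {TM1SubsetAll([Account])} ON ROWS
    FROM [FinancialPlanning]
    """
    cells = tm1.cells.execute_mdx(mdx)
    print(f"Retrieved {len(cells)} cells")
```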
Posted 2 weeks ago
5.0 - 10.0 years
19 - 25 Lacs
Hyderabad
Work from Office
Overview: Seeking an Associate Manager, Data Operations, to support our growing data organization. In this role, you will assist in maintaining data pipelines and corresponding platforms (on-prem and cloud) while working closely with global teams on DataOps initiatives.
- Support the day-to-day operations of data pipelines, ensuring data governance, reliability, and performance optimization on Microsoft Azure. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and real-time streaming architectures is preferred.
- Assist in ensuring the availability, scalability, automation, and governance of enterprise data pipelines supporting analytics, AI/ML, and business intelligence.
- Contribute to DataOps programs, aligning with business objectives, data governance standards, and enterprise data strategy.
- Help implement real-time data observability, monitoring, and automation frameworks to improve data reliability, quality, and operational efficiency.
- Support the development of governance models and execution roadmaps to enhance efficiency across Azure, AWS, GCP, and on-prem environments.
- Work on CI/CD integration, data pipeline automation, and self-healing capabilities to improve enterprise-wide DataOps processes.
- Collaborate with cross-functional teams to support and maintain next-generation Data & Analytics platforms while promoting an agile and high-performing DataOps culture.
- Assist in the adoption of Data & Analytics technology transformations, ensuring automation for proactive issue identification and resolution.
- Partner with cross-functional teams to support process improvements, best practices, and operational efficiencies within DataOps.
Responsibilities
- Assist in the implementation and optimization of enterprise-scale data pipelines using Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, and Azure Stream Analytics.
- Support data ingestion, transformation, orchestration, and storage workflows, ensuring data reliability, integrity, and availability.
- Help ensure seamless batch, real-time, and streaming data processing, focusing on high availability and fault tolerance.
- Contribute to DataOps automation efforts, including CI/CD for data pipelines, automated testing, and version control using Azure DevOps and Terraform.
- Collaborate with Data Engineering, Analytics, AI/ML, CloudOps, and Business Intelligence teams to support data-driven decision-making.
- Assist in aligning DataOps practices with regulatory and security requirements by working with IT, data stewards, and compliance teams.
- Support data operations and sustainment activities, including testing and monitoring processes for global products and projects.
- Participate in data capture, storage, integration, governance, and analytics efforts, working alongside cross-functional teams.
- Assist in managing day-to-day DataOps activities, ensuring adherence to service-level agreements (SLAs) and business requirements.
- Engage with SMEs and business stakeholders to ensure data platform capabilities align with business needs.
- Contribute to Agile work intake and execution processes, helping to maintain efficiency in data platform teams.
- Help troubleshoot and resolve issues related to cloud infrastructure and data services in collaboration with technical teams.
- Support the development and automation of operational policies and procedures, improving efficiency and resilience.
- Assist in incident response and root cause analysis, contributing to self-healing mechanisms and mitigation strategies.
- Foster a customer-centric approach, advocating for operational excellence and continuous improvement in service delivery.
- Help build a collaborative, high-performing team culture, promoting automation and efficiency within DataOps.
- Adapt to shifting priorities and support cross-functional teams in maintaining productivity and achieving business goals.
- Utilize technical expertise in cloud and data operations to support service reliability and scalability.
Qualifications
- 5+ years of technology work experience in a large-scale global organization, with CPG industry experience preferred.
- 5+ years of experience in Data & Analytics roles, with hands-on expertise in data operations and governance.
- 2+ years of experience working within a cross-functional IT organization, collaborating with multiple teams.
- Experience in a lead or senior support role, with a focus on DataOps execution and delivery.
- Strong communication skills, with the ability to collaborate with stakeholders and articulate technical concepts to non-technical audiences.
- Analytical and problem-solving abilities, with a focus on prioritizing customer needs and operational improvements.
- Customer-focused mindset, ensuring high-quality service delivery and operational efficiency.
- Growth mindset, with a willingness to learn and adapt to new technologies and methodologies in a fast-paced environment.
- Experience supporting data operations in a Microsoft Azure environment, including data pipeline automation.
- Familiarity with Site Reliability Engineering (SRE) principles, such as monitoring, automated issue remediation, and scalability improvements.
- Understanding of operational excellence in complex, high-availability data environments.
- Ability to collaborate across teams, building strong relationships with business and IT stakeholders.
- Basic understanding of data management concepts, including master data management, data governance, and analytics.
- Knowledge of data acquisition, data catalogs, data standards, and data management tools.
- Strong execution and organizational skills, with the ability to follow through on operational plans and drive measurable results.
- Adaptability in a dynamic, fast-paced environment, with the ability to shift priorities while maintaining productivity.
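As one concrete flavor of the pipeline operations work above, the sketch below triggers and polls an Azure Data Factory pipeline run with the azure-mgmt-datafactory SDK. The subscription, resource group, factory, and pipeline names are placeholders.

```python
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers
SUB, RG, FACTORY, PIPELINE = "<subscription-id>", "rg-data", "adf-prod", "pl_orders_daily"

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUB)

# Kick off a pipeline run and poll until it finishes
run = adf.pipelines.create_run(RG, FACTORY, PIPELINE)
while True:
    status = adf.pipeline_runs.get(RG, FACTORY, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)

print(f"Pipeline {PIPELINE} finished with status: {status}")
```

In a production DataOps setup this polling loop would typically live behind monitoring/alerting rather than a script, but the API sequence is the same.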
Posted 2 weeks ago
8.0 - 13.0 years
15 - 19 Lacs
Hyderabad
Work from Office
Overview: The Master Data Workflow Specialist will be a key contributor to the functional designs, configuration, and implementation of SAP workflow processes to manage SAP master data. This specialist, working with the S4 Functional Leads, will be responsible for remediating the data maintenance tools (workflow solution, LSMW, Winshuttle, etc.) to ensure a seamless transition from ECC to S4.
Responsibilities
- This SAP data maintenance tool expert is responsible for remediating the existing tools to work seamlessly in S4.
- Lead delivery/remediation of SAP workflow solutions for Business Partners.
- Establish design patterns for existing data maintenance tools that enable reuse across multiple markets.
- Consult with architecture resources, the data conversion team, process teams, and the governance team to redesign the data tools.
- Partner with the data capability delivery teams (i.e., Conversion & Readiness, Master Data Governance) to ensure that data design changes are incorporated.
- Ability to understand complex functional and IT requirements, and to identify and offer multiple solution options to facilitate the best outcome.
- Ability to quickly adapt to changes in timelines and sequences, deal with ambiguity, and succeed in a high-pressure environment.
Qualifications
- 8+ years of functional design, delivery, and sustainment of SAP ERP (ECC) Business Workflow solutions with a focus on master data - Material, Customer, Vendor, and/or Finance.
- Experience delivering global workflow solutions across multiple PepsiCo businesses and geographies.
- Deep functional and technical experience architecting, designing, and delivering complex, re-usable business process workflow solutions, with particular emphasis on employing flexible, configured solutions.
- Experience with SAP workflow/BRF integration and analyst-coded business rules.
Posted 2 weeks ago
10.0 - 15.0 years
4 - 8 Lacs
Noida
Work from Office
Highly skilled and experienced Data Modeler to join the Enterprise Data Modelling team. The candidate will be responsible for creating and maintaining conceptual, logical, and physical data models, ensuring alignment with industry best practices and standards. Working closely with business and functional teams, the Data Modeler will play a pivotal role in standardizing data models at portfolio and domain levels, driving efficiencies and maximizing the value of clients' data assets. Preference will be given to candidates with prior experience within an Enterprise Data Modeling team. The ideal domain experience would be Insurance or Investment Banking.
Roles and Responsibilities:
- Develop comprehensive conceptual, logical, and physical data models for multiple domains within the organization, leveraging industry best practices and standards.
- Collaborate with business and functional teams to understand their data requirements and translate them into effective data models that support their strategic objectives.
- Serve as a subject matter expert in data modeling tools such as ERwin Data Modeler, providing guidance and support to other team members and stakeholders.
- Establish and maintain standardized data models across portfolios and domains, ensuring consistency, governance, and alignment with organizational objectives.
- Identify opportunities to optimize existing data models and enhance data utilization, particularly in critical areas such as fraud, banking, and AML.
- Provide consulting services to internal groups on data modeling tool usage, administration, and issue resolution, promoting seamless data flow and application connections.
- Develop and deliver training content and support materials for data models, ensuring that stakeholders have the necessary resources to understand and utilize them effectively.
- Collaborate with the enterprise data modeling group to develop and implement a robust governance framework and metrics for model standardization, with a focus on long-term automated monitoring solutions.
Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 10+ years of experience working as a Data Modeler or in a similar role, preferably within a large enterprise environment.
- Expertise in data modeling concepts and methodologies, with demonstrated proficiency in creating conceptual, logical, and physical data models.
- Hands-on experience with data modeling tools such as ERwin Data Modeler, as well as proficiency in database environments such as Snowflake and Netezza.
- Strong analytical and problem-solving skills, with the ability to understand complex data requirements and translate them into effective data models.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams and stakeholders.
Key skills: problem-solving skills, business intelligence platforms, ERwin, data modeling, database management systems, data warehousing, ETL processes, big data technologies, agile methodologies, data governance, SQL, enterprise data modelling, data visualization tools, cloud data services, analytical skills, data modelling tools, data architecture, communication skills
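To ground the handoff from model to database, here is a minimal sketch that materializes a physical model in Snowflake using the snowflake-connector-python package. The account credentials and the dimension/fact tables are invented examples from a hypothetical insurance model, not any client's actual schema.

```python
import snowflake.connector

# Placeholder credentials -- replace with real account details
conn = snowflake.connector.connect(
    account="xy12345", user="modeler", password="secret",
    warehouse="DEV_WH", database="ANALYTICS", schema="INSURANCE",
)

ddl_statements = [
    # A conformed dimension and a fact table from a hypothetical star schema
    """CREATE TABLE IF NOT EXISTS dim_policyholder (
           policyholder_key INTEGER PRIMARY KEY,
           full_name        VARCHAR,
           region           VARCHAR
       )""",
    """CREATE TABLE IF NOT EXISTS fact_claim (
           claim_id         INTEGER,
           policyholder_key INTEGER REFERENCES dim_policyholder(policyholder_key),
           claim_amount     NUMBER(12, 2),
           claim_date       DATE
       )""",
]

cur = conn.cursor()
try:
    for ddl in ddl_statements:
        cur.execute(ddl)
finally:
    cur.close()
    conn.close()
```

Note that Snowflake records but does not enforce the PRIMARY KEY and REFERENCES constraints; they serve here as model documentation.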
Posted 2 weeks ago
3.0 - 8.0 years
3 - 6 Lacs
Bengaluru
Hybrid
KEY RESPONSIBILITIES:
- Maintain and update material codes in the IFS ERP system, ensuring accurate part numbers, descriptions, and categorization.
- Conduct research and analysis of technical information provided by requesters to specify and register materials, ensuring compliance with established procedures and standards.
- Respond to tickets related to material registration and purchasing issues, supporting customer areas by updating data and ensuring the correct equipment is purchased for maintenance and operations.
- Analyze TAGs and BOMs using official company documents to ensure that equipment and its spare parts are properly registered according to the exploded view of the drawings.
- Analyze duplicate and obsolete materials to identify opportunities for cost savings, inventory reduction, and process improvements, enabling more efficient material management (see the sketch after this listing).
- Collaborate with the procurement and inventory management teams to ensure parts data aligns with inventory levels and procurement needs.
- Implement and oversee data governance policies for part management to ensure data integrity and consistency across the organization.
- Support the loading of parts related to new assets.
JOB REQUIREMENTS
- Bachelor's degree in Engineering, Data Management, or a related field.
- Proven experience working with ERP systems, with a strong understanding of data management principles and practices.
- Excellent attention to detail and a strong commitment to data accuracy and integrity.
- Strong analytical and problem-solving skills, with the ability to manage multiple tasks and projects simultaneously.
- Excellent communication and interpersonal skills, with the ability to work effectively with stakeholders at all levels of the organization.
- Autonomy to search for data in engineering vendor documentation.
- Experience in parts management, procurement, or inventory management.
- Effective time management and crisis resolution skills, with the ability to respond to urgent situations and meet tight deadlines.
Software and tools:
- Advanced English (required)
- Excel and PowerPoint (required)
- VBA (differential)
- IFS (differential)
- SQL (differential)
- Power BI (differential)
- Professional experience in material cataloging / BOM analysis (differential)
Experience
- 3+ years of experience in data management or data governance roles, preferably working with IFS ERP.
- Experience in managing master data, metadata, and data structures in an ERP environment.
- Proven track record of ensuring data quality, conducting data audits, and resolving data discrepancies.
- Hands-on experience with implementing and managing data governance policies and processes.
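The duplicate-materials analysis referenced above often starts with a simple normalization-and-grouping pass. Below is a minimal pandas sketch of that idea; the materials extract and its columns are invented for illustration.

```python
import pandas as pd

# Hypothetical export of the material master from IFS
materials = pd.read_csv("ifs_materials.csv")  # columns: part_no, description, category

# Normalize descriptions so trivial formatting differences don't hide duplicates
materials["desc_norm"] = (
    materials["description"].str.upper().str.strip()
    .str.replace(r"\s+", " ", regex=True)
)

# Candidate duplicates: one normalized description behind several part numbers
dupes = (
    materials.groupby("desc_norm")["part_no"]
    .nunique()
    .loc[lambda s: s > 1]
)
print(f"{len(dupes)} descriptions mapped to multiple part numbers")
```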
Posted 2 weeks ago
0.0 - 5.0 years
3 - 5 Lacs
Bengaluru
Work from Office
We are looking for a qualified and detail-oriented GLP Archivist to support the implementation of the OECD Principles of Good Laboratory Practice (GLP). The successful candidate will be responsible for managing the archiving of scientific study records, ensuring compliance with international GLP standards, and supporting the integrity and traceability of non-clinical safety data. We invite motivated and deserving candidates with a passion for regulatory compliance and data stewardship to apply for this opportunity.
Roles and Responsibilities
- Responsible for the management, operations, and procedures for archiving in accordance with the OECD Principles of GLP.
- Create and maintain archives of the collection for easy retrieval of records.
- Maintain a stable physical environment for the receipt, storage, and handling of the archival holdings.
- Knowledge of the OECD Principles of Good Laboratory Practice (GLP).
Posted 2 weeks ago
4.0 - 9.0 years
0 - 0 Lacs
Hyderabad
Work from Office
About the Role: We are looking for an experienced Atlan Developer to join our Data Engineering team. The ideal candidate will have hands-on experience in implementing and customizing the Atlan platform, building integrations with upstream and downstream systems, and enabling seamless data discovery and governance across the organization.
Responsibilities:
- Set up and configure Atlan for enterprise-scale deployment.
- Integrate Atlan with data platforms (e.g., Snowflake, BigQuery, Redshift, Databricks, dbt, Looker, Tableau, etc.).
- Develop and maintain Atlan connectors, APIs, and SDKs.
- Automate metadata ingestion from various sources using Atlan APIs.
- Customize Atlan to align with internal data governance policies.
- Work closely with Data Stewards, Engineers, and Analysts to define and manage metadata standards.
- Implement and manage data lineage, quality checks, and cataloging pipelines.
- Support data discovery and access control processes.
- Monitor Atlan platform performance and handle upgrades and maintenance.
Required Skills:
- Strong experience with Atlan's APIs, SDKs, and platform architecture.
- Solid knowledge of metadata management, data cataloging, and data governance.
- Hands-on experience with modern data stacks: Snowflake, dbt, Airflow, etc.
- Proficiency in Python and REST APIs.
- Experience integrating with BI tools like Looker, Power BI, and Tableau.
- Familiarity with data lineage, data quality, and access control concepts.
- Understanding of data compliance (GDPR, HIPAA, etc.) is a plus.
- Excellent communication and documentation skills.
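As a taste of the metadata-automation work above, the sketch below pushes an asset description to a catalog over REST. The endpoint path and payload shape are hypothetical placeholders rather than Atlan's documented API; in practice you would use Atlan's published APIs and SDKs.

```python
import requests

ATLAN_URL = "https://tenant.atlan.example.com"   # hypothetical tenant URL
HEADERS = {"Authorization": "Bearer <api-token>", "Content-Type": "application/json"}

# Illustrative payload: register ownership and a description for a warehouse table
payload = {
    "qualifiedName": "snowflake/analytics/public/orders",
    "description": "Curated orders table, loaded nightly by dbt",
    "owner": "data-platform-team",
}

# Endpoint path is an assumption for illustration only
resp = requests.post(f"{ATLAN_URL}/api/metadata/assets", json=payload,
                     headers=HEADERS, timeout=30)
resp.raise_for_status()
print("Asset metadata updated:", resp.status_code)
```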
Posted 2 weeks ago
8.0 - 12.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Role Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for an enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality (see the sketch after this listing).
- Provide training and support for end-users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.
- Participate in regular team meetings to discuss project progress and share insights.
- Assist in the development of training materials for internal stakeholders.
Qualifications
- Minimum 8 years of experience in Power BI and Figma.
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team in a remote setting.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.
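One way to approach the UAT step called out above is to reconcile a report's exported figures against the source database. Here is a minimal sketch of that check; the DSN, table, column, and export file names are invented for illustration.

```python
import pandas as pd
import pyodbc

# Hypothetical source connection and Power BI export
conn = pyodbc.connect("DSN=SalesWarehouse;UID=report_uat;PWD=secret")
source = pd.read_sql(
    "SELECT region, SUM(net_sales) AS net_sales FROM fact_sales GROUP BY region",
    conn,
)
report = pd.read_csv("powerbi_export.csv")  # columns: region, net_sales

# Flag regions where the dashboard diverges from the warehouse
check = source.merge(report, on="region", suffixes=("_db", "_pbi"))
check["diff"] = (check["net_sales_db"] - check["net_sales_pbi"]).abs()
print(check[check["diff"] > 0.01])
```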
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
Vadodara, Gujarat
On-site
The purpose of your role is to define and develop the Enterprise Data Structure, Data Warehouse, Master Data, Integration, and transaction processing while maintaining and strengthening modeling standards and business information.
You will define and develop Data Architecture that supports the organization and clients in new and existing deals. This includes partnering with business leadership to provide strategic recommendations, assessing data benefits and risks, creating data strategy and roadmaps, engaging stakeholders for data governance, ensuring data storage and database technologies are supported, monitoring compliance with data modeling standards, overseeing frameworks for data management, and collaborating with vendors and clients to maximize the value of information.
Additionally, you will be responsible for building enterprise technology environments for data architecture management. This involves developing, maintaining, and implementing standard patterns for data layers, data stores, and data hubs & lakes; evaluating implemented systems; collecting and integrating data; creating data models; implementing best security practices; and demonstrating strong experience in database architectures and design patterns.
You will also enable delivery teams by providing optimal delivery solutions and frameworks. This includes building relationships with delivery and practice leadership teams, defining database structures and specifications, establishing relevant metrics, monitoring system capabilities and performance, integrating new solutions, managing projects, identifying risks, ensuring quality assurance, recommending tools for reuse and automation, and supporting the integration team for better efficiency.
In addition, you will ensure optimal client engagement by supporting pre-sales teams, negotiating and coordinating with client teams, demonstrating thought leadership, and acting as a trusted advisor.
Join Wipro to reinvent your world and be a part of an end-to-end digital transformation partner with bold ambitions. Realize your ambitions in a business powered by purpose and empowered to design your reinvention. Applications from people with disabilities are explicitly welcome.
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
Navi Mumbai, Maharashtra
On-site
Seekify Global is looking for an experienced and motivated Data Catalog Engineer to join the Data Engineering team. The ideal candidate should have a significant background in designing and implementing metadata and data catalog solutions within AWS-centric data lake and data warehouse environments. As a Data Catalog Engineer at Seekify Global, you will play a crucial role in improving data discoverability, governance, and lineage across our enterprise data assets.
Your responsibilities will include leading the end-to-end implementation of a data cataloging solution within AWS, establishing and managing metadata frameworks for structured and unstructured data assets, and integrating the data catalog with various AWS-based storage solutions such as S3, Redshift, Athena, Glue, and EMR. You will collaborate closely with data governance/BPRG/IT project teams to define metadata standards, data classifications, and stewardship processes. Additionally, you will be responsible for developing automation scripts for catalog ingestion, lineage tracking, and metadata updates using tools like Python, Lambda, PySpark, or Glue/EMR custom jobs. Working in coordination with data engineers, data architects, and analysts, you will ensure that metadata is accurate, relevant, and up to date. Implementing role-based access controls and ensuring compliance with data privacy and regulatory standards will also be part of your role. Moreover, you will be expected to create detailed documentation and conduct training/workshops for internal stakeholders on effectively utilizing the data catalog.
**Key Responsibilities:**
- Lead end-to-end implementation of a data cataloging solution within AWS, preferably AWS Glue Data Catalog or third-party tools like Apache Atlas, Alation, Collibra, etc.
- Establish and manage metadata frameworks for structured and unstructured data assets in data lake and data warehouse environments.
- Integrate the data catalog with AWS-based storage solutions such as S3, Redshift, Athena, Glue, and EMR.
- Collaborate with data governance/BPRG/IT project teams to define metadata standards, data classifications, and stewardship processes.
- Develop automation scripts for catalog ingestion, lineage tracking, and metadata updates using Python, Lambda, PySpark, or Glue/EMR custom jobs.
- Work closely with data engineers, data architects, and analysts to ensure metadata is accurate, relevant, and up to date.
- Implement role-based access controls and ensure compliance with data privacy and regulatory standards.
**Required Skills and Qualifications:**
- 7-8 years of experience in data engineering or metadata management roles.
- Proven expertise in implementing and managing data catalog solutions within AWS environments.
- Strong knowledge of AWS Glue, S3, Athena, Redshift, EMR, Data Catalog, and Lake Formation.
- Hands-on experience with metadata ingestion, data lineage, and classification processes.
- Proficiency in Python, SQL, and automation scripting for metadata pipelines.
- Familiarity with data governance and compliance standards (e.g., GDPR, RBI guidelines).
- Experience integrating with BI tools (e.g., Tableau, Power BI) and third-party catalog tools is a plus.
- Strong communication, problem-solving, and stakeholder management skills.
**Preferred Qualifications:**
- AWS Certifications (e.g., AWS Certified Data Analytics, AWS Solutions Architect).
- Hands-on experience with data catalog tools like Alation, Collibra, Informatica EDC, or open-source alternatives.
- Exposure to data quality frameworks and stewardship practices.
- Knowledge of data migration with data catalogs and data marts is a plus.
This is a full-time position with the work location being in person.
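Since the role leans on the AWS Glue Data Catalog, the minimal boto3 sketch below walks catalog databases and tables to flag columns missing descriptions, a simple catalog-quality check. The region is an example; the databases it reports depend on the account it runs against.

```python
import boto3

glue = boto3.client("glue", region_name="ap-south-1")  # region is an example

# Walk every database/table registered in the Glue Data Catalog
for page in glue.get_paginator("get_databases").paginate():
    for db in page["DatabaseList"]:
        tables = glue.get_paginator("get_tables").paginate(DatabaseName=db["Name"])
        for tpage in tables:
            for table in tpage["TableList"]:
                columns = table.get("StorageDescriptor", {}).get("Columns", [])
                # Columns without a Comment are undocumented in the catalog
                undocumented = [c["Name"] for c in columns if not c.get("Comment")]
                if undocumented:
                    print(f"{db['Name']}.{table['Name']}: {undocumented}")
```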
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
As a Data Management Consultant at the SAP Success Delivery Center, you play a crucial role in supporting customers on their digital transformation journey by implementing Data Management solutions, including Data Migrations and Master Data Governance. Working as a techno-functional consultant, you will be an integral part of project teams responsible for delivering SAP implementations to clients. Your responsibilities include being hands-on with solutions, possessing good communication skills for engaging in business discussions, and having a functional understanding of Data Management. Prior development experience is considered an added advantage. While occasional travel may be required based on customer needs, the primary focus will be on remote and offshore delivery.
One of your key objectives is to own or acquire relevant SAP Business AI skills to effectively position and deliver SAP's AI offerings to customers. Your role also involves enhancing the adoption and consumption of various SAP AI offerings within customer use cases. You will be joining the Data Management Solution Area within BTP Delivery @ Scale, a robust team of over 100 professionals delivering engagements across a wide range of Data Management topics such as Data Migration, Data Integration, Data Engineering, Data Governance, and Data Quality.
At SAP, our innovations empower over four hundred thousand customers globally to collaborate more efficiently and leverage business insights effectively. Our company, known for its leadership in enterprise resource planning (ERP) software, has evolved into a market leader in end-to-end business application software and related services, including database, analytics, intelligent technologies, and experience management. With a cloud-based approach, two hundred million users, and a diverse workforce of over one hundred thousand employees worldwide, we are committed to being purpose-driven and future-focused. Our culture emphasizes collaboration, personal development, and a strong team ethic. We believe in connecting global industries, people, and platforms to provide solutions for every challenge. At SAP, you have the opportunity to bring out your best.
Diversity and inclusion are at the core of SAP's culture, with a focus on health, well-being, and flexible working models that ensure every individual, regardless of background, feels included and empowered to perform at their best. We believe in the strength that comes from the unique capabilities and qualities each person brings to our organization, and we invest in our employees to nurture confidence and unlock their full potential. SAP is dedicated to unleashing all talent and contributing to a better and more equitable world.
SAP is an equal opportunity workplace and an affirmative action employer. We uphold the values of Equal Employment Opportunity and provide accessibility accommodations to applicants with physical and/or mental disabilities. If you require accommodation or special assistance to navigate our website or complete your application, please contact the Recruiting Operations Team at Careers@sap.com.
For SAP employees, only permanent roles qualify for the SAP Employee Referral Program, subject to the eligibility criteria outlined in the SAP Referral Policy. Specific conditions may apply to roles in Vocational Training. EOE AA M/F/Vet/Disability: Successful candidates may undergo a background verification with an external vendor.
Requisition ID: 422298 | Work Area: Consulting and Professional Services | Expected Travel: 0 - 10% | Career Status: Professional | Employment Type: Regular Full Time | Additional Locations: #LI-Hybrid
Posted 2 weeks ago
12.0 - 18.0 years
0 Lacs
Noida, Uttar Pradesh
On-site
We are looking for an experienced Manager - Data Engineering with a strong background in Databricks or the Apache data stack to lead the implementation of complex data platforms. In this role, you will be responsible for overseeing impactful data engineering projects for global clients, delivering scalable solutions, and steering digital transformation initiatives.
With 12-18 years of overall experience in data engineering, including 3-5 years in a leadership position, you will need hands-on expertise in either Databricks or the core Apache stack (Spark, Kafka, Hive, Airflow, NiFi, etc.). Proficiency in at least one cloud platform such as AWS, Azure, or GCP, ideally with Databricks on the cloud, is required. Strong programming skills in Python, Scala, and SQL are essential, along with experience in constructing scalable data architectures, delta lakehouses, and distributed data processing. Familiarity with modern data governance, cataloging, and data observability tools is also necessary. You should have a proven track record of managing delivery in an onshore-offshore or hybrid model, coupled with exceptional communication, stakeholder management, and team mentoring abilities.
As a Manager - Data Engineering, your key responsibilities will include leading the design, development, and deployment of modern data platforms utilizing Databricks, Apache Spark, Kafka, Delta Lake, and other big data tools. You will be tasked with designing and implementing data pipelines (both batch and real-time), data lakehouses, and large-scale ETL frameworks. Furthermore, you will take ownership of delivery accountability for data engineering programs across various industries, collaborating with global stakeholders, product owners, architects, and business teams to drive data-driven outcomes. Ensuring best practices in DevOps, CI/CD, infrastructure-as-code, data security, and governance will be crucial. Additionally, you will be responsible for managing and mentoring a team of 10-25 engineers, conducting performance reviews, capability building, and coaching, as well as supporting presales activities including solutioning, technical proposals, and client workshops.
At GlobalLogic, we prioritize a culture of caring where people come first. We offer continuous learning and development opportunities to help you grow personally and professionally. You'll have the chance to work on interesting and meaningful projects that have a real impact. With various career areas, roles, and work arrangements, we believe in providing a balance between work and life. As a high-trust organization, integrity is at the core of everything we do. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner known for creating innovative digital products and experiences. Join us in collaborating with forward-thinking companies to transform businesses and redefine industries through intelligent products, platforms, and services.
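To illustrate the kind of real-time pipeline the role describes, here is a minimal PySpark Structured Streaming sketch that reads a Kafka topic into a Delta table. The broker address, topic name, and storage paths are invented placeholders.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("events_stream").getOrCreate()

# Placeholder Kafka settings -- substitute real brokers and topic
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1:9092")
    .option("subscribe", "clickstream-events")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast the payload for downstream parsing
decoded = events.selectExpr("CAST(value AS STRING) AS json_payload", "timestamp")

# Append to a Delta table with checkpointing for exactly-once semantics
query = (
    decoded.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/clickstream")
    .outputMode("append")
    .start("/mnt/delta/clickstream")
)
query.awaitTermination()
```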
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Karnataka
On-site
As an Azure ML and Python Dev - Senior 1/2 at EY GDS Consulting digital engineering, you will be responsible for designing and implementing data pre-processing, feature engineering, and model training pipelines. Your role will involve collaborating closely with data scientists to ensure model performance and reliability in production environments. Proficiency in Azure ML services, Python programming, and a strong background in machine learning are essential for this position. You will have the opportunity to lead and contribute to cutting-edge projects on the Azure platform.
Your key responsibilities will include developing and deploying machine learning models on the Azure cloud platform, designing efficient data pipelines, collaborating with stakeholders, implementing best practices for ML development, and maintaining APIs for model deployment and integration with applications. Additionally, you will monitor and optimize model performance, participate in code reviews and troubleshooting, stay updated with industry trends, mentor junior team members, and contribute to innovation initiatives.
To qualify for this role, you must have a bachelor's or master's degree in computer science, data science, or a related field, along with 6-8 years of experience in machine learning, data engineering, and cloud computing. Strong communication skills, a proven track record of successful project delivery, and relevant certifications are highly desirable. Experience with other cloud platforms and programming languages is a plus.
Ideally, you will possess the analytical ability to manage multiple projects simultaneously, familiarity with advanced ML techniques and frameworks, knowledge of cloud security principles, and experience with Big Data technologies. Working at EY offers you the opportunity to work on inspiring projects, receive support and coaching from engaging colleagues, develop new skills, progress your career, and enjoy freedom and flexibility in your role.
EY is dedicated to building a better working world by creating new value for clients, people, society, and the planet. With a focus on data, AI, and advanced technology, EY teams help clients shape the future with confidence and address pressing issues. By working across a full spectrum of services in assurance, consulting, tax, strategy, and transactions, EY teams provide services globally and emphasize high quality, knowledge exchange, and interdisciplinary collaboration.
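For concreteness, below is a minimal sketch of submitting a training script to Azure ML with the v2 Python SDK (azure-ai-ml). The workspace identifiers, environment string, and compute name are placeholders to be replaced with real workspace resources.

```python
from azure.ai.ml import MLClient, command
from azure.identity import DefaultAzureCredential

# Placeholder workspace coordinates
ml_client = MLClient(
    DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="rg-ml",
    workspace_name="ws-ml-dev",
)

# Wrap a local training script as a command job on a compute cluster
job = command(
    code="./src",                          # folder containing train.py
    command="python train.py --epochs 10",
    environment="AzureML-sklearn-1.0-ubuntu20.04-py38-cpu@latest",  # example curated env
    compute="cpu-cluster",
    display_name="baseline-training-run",
)

returned_job = ml_client.jobs.create_or_update(job)
print("Submitted:", returned_job.name)
```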
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Bhubaneswar
On-site
You will be responsible for providing digital estate planning services to clients, helping them identify, organize, manage, and plan their assets and identities in the online world for orderly and safe inheritance and disposal. Your role will involve collaborating with legal, technical, product, and customer service teams to promote the implementation of digital estate planning.
Your main responsibilities will include communicating with customers to understand their digital asset needs, assisting them in organizing and classifying their online assets, designing digital wills and access rights, ensuring legal and compliant digital estate plans, and maintaining customer digital estate archives. Additionally, you will participate in the formulation of company standards related to digital estate services and provide external education and consulting services to enhance public awareness.
To qualify for this role, you should have a Bachelor's degree or above in a relevant field such as law, information management, data security, psychology, or sociology. You should have more than 3 years of experience in wealth management, legal planning, estate management, data governance, or customer consulting services. It is essential to have a good understanding of digital assets, major platform policies, digital literacy, privacy regulations, and data security laws. Strong communication, empathy, project coordination, responsibility, and confidentiality skills are required. Proficiency in digital tools like Office suites and document management systems is essential.
Candidates with experience as a lawyer, will planner, trust consultant, or data governance consultant are preferred. Familiarity with blockchain technology, digital identity management, and the Web3 ecosystem is a plus. Experience in psychological counseling, living will services, or cross-platform digital estate planning projects for multinational clients is advantageous. Proficiency in multiple languages for overseas client support is desirable.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
Pune, Maharashtra
On-site
As an Enterprise Architect, Data Integration and BI at Myers-Holum, you will be responsible for leading the strategic design, architecture, and implementation of enterprise data solutions to ensure alignment with our clients' long-term business goals. Your role will involve developing and promoting the architectural vision for data integration, Business Intelligence (BI), and analytics solutions across various business functions and applications.
You will design and build scalable, high-performance data warehouses and BI solutions for clients using cutting-edge cloud-based and on-premise technologies. Additionally, you will lead cross-functional teams in developing data governance frameworks, data models, and integration architectures to support seamless data flow across disparate systems. You will translate high-level business requirements into technical specifications, ensuring alignment with broader organizational IT strategies and compliance standards. Your responsibilities will also include architecting end-to-end data pipelines, data integration frameworks, and data governance models to enable the seamless flow of structured and unstructured data from multiple sources.
Furthermore, you will provide thought leadership in evaluating and recommending emerging technologies, tools, and best practices for data management, integration, and business intelligence. You will oversee the deployment and adoption of key enterprise data initiatives, engage with C-suite executives and senior stakeholders to communicate architectural solutions, and lead and mentor technical teams to foster a culture of continuous learning and innovation in the areas of data management, BI, and integration.
Your role will involve conducting architectural reviews, providing guidance on best practices for data security, compliance, and performance optimization, as well as leading technical workshops, training sessions, and collaborative sessions with clients to ensure successful adoption of data solutions. You will contribute to the development of internal frameworks, methodologies, and standards for data architecture, integration, and BI, while staying up to date with industry trends and emerging technologies to continuously evolve the enterprise data architecture.
To qualify for this position, you should have 10+ years of relevant professional experience in data management, business intelligence, and integration architecture, along with 6+ years of experience in designing and implementing enterprise data architectures. You should possess expertise in cloud-based data architectures, proficiency in data integration tools, experience with relational databases, and a solid understanding of BI platforms. Additionally, strong business analysis, stakeholder management, and written and verbal communication skills, along with experience in leading digital transformation initiatives, are required.
At Myers-Holum, you will have the opportunity to collaborate with other curious minds, shape your future, positively influence change for customers, and discover your true potential. As part of our team, you will be encouraged to remain curious, humble, and resilient, while contributing to our mission and operating principles. With over 40 years of experience, a strong internal framework, cutting-edge technology partners, and a focus on employee well-being and growth, Myers-Holum offers a rewarding and supportive environment for professional development and career advancement.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Panchkula, Haryana
On-site
You are a highly skilled Data Architect & ETL Engineer with strong data fundamentals, SQL expertise, and experience in ETL transformations and analytics tools. Your role involves designing, implementing, and managing scalable data architectures and ETL pipelines that enable high-quality business intelligence and analytics solutions.

You will design and implement scalable data architectures to support analytics, data integration, and reporting, and develop and maintain ETL pipelines using Pentaho or comparable ETL transformation tools. Close collaboration with business analysts, data scientists, and application teams is essential to ensure efficient data flow. You will optimize SQL queries and build efficient data models for reporting and data warehousing, such as star and snowflake schemas, implement data governance, quality control, and security measures, and develop interactive reports and dashboards using Tableau, QuickSight, or Power BI. Monitoring and troubleshooting data pipelines to ensure high availability and reliability, and documenting data flows, ETL processes, and architectural designs, are also important parts of the role.

You should possess a Bachelor's or Master's degree in Computer Science, Information Systems, or a related field, along with 5+ years of experience in data engineering, ETL development, or data architecture. Strong SQL skills and experience with relational databases (PostgreSQL, MySQL, SQL Server, Oracle) are required, together with hands-on experience with Pentaho or similar ETL transformation tools and with Tableau, QuickSight, or Power BI for data visualization and reporting. Knowledge of data modeling, data warehousing, and BI best practices, plus an understanding of data governance, metadata management, and data security, is essential, as are strong problem-solving and communication skills.

Preferred skills include experience with cloud platforms (AWS, Azure, or GCP) and cloud-based data services, experience in Python or Java for data transformation and automation, and knowledge of CI/CD pipelines for data workflows. Joining the team offers the opportunity to work on high-impact projects in the fintech industry, providing valuable learning experiences.
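To make the star-schema modeling mentioned above concrete, here is a small, self-contained sketch using Python's built-in sqlite3 module: one fact table joined to two dimension tables, followed by the kind of rollup query a BI dashboard would issue. All table and column names are illustrative; a production warehouse would run on PostgreSQL, Snowflake, or similar, with far richer schemas.

```python
# Minimal star-schema sketch; names are illustrative only.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, month TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
    CREATE TABLE fact_sales (
        date_key INTEGER REFERENCES dim_date(date_key),
        product_key INTEGER REFERENCES dim_product(product_key),
        units INTEGER,
        revenue REAL
    );
""")
conn.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                 [(1, "2024-01-15", "2024-01"), (2, "2024-02-10", "2024-02")])
conn.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                 [(10, "Widget", "Hardware"), (11, "Gadget", "Hardware")])
conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                 [(1, 10, 5, 250.0), (2, 10, 3, 150.0), (2, 11, 7, 700.0)])

# Typical BI-style rollup: revenue by month and product category.
for row in conn.execute("""
    SELECT d.month, p.category, SUM(f.revenue) AS revenue
    FROM fact_sales f
    JOIN dim_date d ON d.date_key = f.date_key
    JOIN dim_product p ON p.product_key = f.product_key
    GROUP BY d.month, p.category
    ORDER BY d.month
"""):
    print(row)
```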
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
Coimbatore, Tamil Nadu
On-site
We are seeking a NetSuite Analytics Developer & Data Warehousing expert to design, build, and optimize NetSuite analytics solutions and enterprise data warehouses. You will leverage NetSuite's SuiteAnalytics tools and external data warehousing platforms such as Oracle Analytics Warehouse, Google Cloud Platform (GCP), and Snowflake to deliver scalable, data-driven insights through advanced visualizations and reporting across the organization.

You will design, develop, and maintain SuiteAnalytics reports, saved searches, and dashboards within NetSuite to meet evolving business needs; build and optimize data pipelines and ETL processes that integrate NetSuite data into enterprise data warehouses; develop data models and schemas; and maintain data marts that support business intelligence and analytical requirements. You will also implement advanced visualizations and reports using tools such as Tableau, Power BI, or Looker, ensuring high performance and usability. Further responsibilities include collaborating with business stakeholders to gather requirements and translate them into effective technical solutions, monitoring and troubleshooting data flow and reporting performance, upholding data governance, security, and quality standards across analytics and reporting systems, and providing documentation, training, and support to end users.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Systems, or a related field, along with a minimum of 3 years of experience working with NetSuite, including SuiteAnalytics (saved searches, datasets, workbooks). You should have strong expertise in data warehousing concepts, ETL processes, and data modeling; hands-on experience with external data warehouse platforms such as Oracle Analytics Warehouse, GCP (BigQuery), or Snowflake; proficiency in SQL and performance optimization of complex queries; experience with BI and visualization tools like Tableau, Power BI, or Looker; and an understanding of data governance, compliance, and best practices in data security.
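The extract-transform-stage responsibility above can be illustrated with a short, hedged Python sketch. The endpoint, token, response shape, and field names below are hypothetical placeholders, not a real NetSuite API; actual extraction would typically go through SuiteTalk, SuiteAnalytics Connect, or a managed connector, and the staged CSV stands in for a warehouse bulk-load step such as COPY.

```python
# Illustrative extract-transform-stage step. Endpoint, token, and field
# names are hypothetical placeholders, not a real NetSuite API.
import csv
import requests

ENDPOINT = "https://example.invalid/api/transactions"  # hypothetical
TOKEN = "redacted"  # hypothetical credential

def extract() -> list[dict]:
    """Pull raw records from the (hypothetical) source API."""
    resp = requests.get(ENDPOINT,
                        headers={"Authorization": f"Bearer {TOKEN}"},
                        timeout=30)
    resp.raise_for_status()
    return resp.json()["items"]  # assumed response shape

def transform(records: list[dict]) -> list[dict]:
    """Conform field names and types to the warehouse schema."""
    return [
        {
            "txn_id": r["id"],
            "txn_date": r["tranDate"][:10],  # keep the date portion only
            "amount": round(float(r["amount"]), 2),
        }
        for r in records
    ]

def stage(rows: list[dict], path: str = "stage_transactions.csv") -> None:
    """Write a CSV that a warehouse bulk loader can ingest."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["txn_id", "txn_date", "amount"])
        writer.writeheader()
        writer.writerows(rows)

if __name__ == "__main__":
    stage(transform(extract()))
```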
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
Karnataka
On-site
About Credit Saison India
Established in 2019, Credit Saison India (CS India) is one of the country's fastest-growing Non-Bank Financial Company (NBFC) lenders. With verticals in wholesale, direct lending, and tech-enabled partnerships with NBFCs and fintechs, CS India's tech-enabled model, coupled with underwriting capability, facilitates lending at scale, addressing India's significant credit gap, especially within underserved and underpenetrated population segments. Committed to long-term growth as a lender in India, CS India aims to evolve its offerings for MSMEs, households, individuals, and more. Registered with the Reserve Bank of India (RBI) and holding an AAA rating from CRISIL and CARE Ratings, CS India currently operates through a branch network of 45 physical offices, servicing 1.2 million active loans, managing an AUM of over US$1.5B, and employing around 1,000 individuals.

As part of Saison International, a global financial company with a mission to foster resilient and innovative financial solutions for positive impact by bringing people, partners, and technology together, CS India aligns with Saison International's commitment to being a transformative partner in creating opportunities and enabling people's dreams. With a presence in Singapore and operations across India, Indonesia, Thailand, Vietnam, Mexico, and Brazil, Saison International employs a diverse workforce of over 1,000 people.

Product Head for Data, Analytics, and Platform - Director/Sr. Director

Role Overview:
As the Product Head for Data and Analytics, you will spearhead the strategic development and execution of data platform initiatives, ensuring alignment with business objectives and delivering measurable results. Your responsibilities include defining product strategy, setting OKRs, and overseeing the successful implementation of programs that drive the product roadmap. Collaborating cross-functionally with technical teams, you will integrate advanced analytics, machine learning, and AI capabilities into scalable data platforms.

Key Responsibilities:
- Define and drive the strategic direction for data platforms, ensuring alignment with business goals.
- Set clear OKRs for data platform development and ensure alignment across all teams.
- Establish and enforce standards for planning, project management, execution, and documentation.
- Lead the design and implementation of scalable data pipelines, data lakes, and real-time analytics architectures (a streaming sketch follows after this listing).
- Collaborate with Data Science, Analytics, Platform, and AI teams to integrate machine learning models, predictive analytics, and AI technologies seamlessly.
- Maintain high standards for project management, ensuring timely delivery within budget, using Agile methodologies.
- Ensure comprehensive documentation of technical processes, product specifications, and architectural decisions.
- Implement data governance and security protocols to ensure compliance with data protection regulations.
- Develop key performance metrics to assess the success of data products and drive continuous improvement.
- Mentor and lead product managers and technical teams to foster a culture of ownership, innovation, and excellence.

Qualifications:
- BTech and/or MBA from reputed colleges (e.g., IITs, NITs, ISBs, IIMs, or equivalent).
- 10+ years of product management experience with a focus on building data platforms and integrating advanced analytics.
- Proven track record in setting and executing strategic roadmaps and OKRs while ensuring business alignment.
- Deep understanding of cloud technologies (e.g., AWS, Azure) and data frameworks (Hadoop, Spark, Kafka).
- Strong experience in data architectures, data governance, and real-time data pipelines.
- Proficiency in integrating AI/ML models into data products and driving BI initiatives.
- Program management expertise with the ability to lead cross-functional teams and deliver complex projects.
- Knowledge of data governance frameworks and experience in ensuring regulatory compliance.
- Excellent leadership, communication, and interpersonal skills for engaging and aligning stakeholders.
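As a hedged illustration of the real-time pipeline work referenced in the responsibilities above, the PySpark sketch below reads a Kafka topic with Structured Streaming. The broker address and topic name are hypothetical, and a production design would add schema enforcement, checkpointing, and a durable sink rather than a console print.

```python
# Minimal Structured Streaming read from Kafka. Broker and topic are
# hypothetical; requires pyspark plus the spark-sql-kafka package.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("loan-events-demo").getOrCreate()

events = (
    spark.readStream
    .format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
    .option("subscribe", "loan-events")                # hypothetical topic
    .load()
)

# Kafka delivers key/value as binary; cast to strings for inspection.
decoded = events.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)

# Console sink is for demonstration only; real pipelines write to a
# lake table or warehouse with checkpointing enabled.
query = decoded.writeStream.format("console").start()
query.awaitTermination()
```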
Posted 2 weeks ago