
2637 Data Governance Jobs - Page 27

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 10.0 years

0 - 1 Lacs

Bengaluru

Work from Office

Data Governance Engineer
Level of Experience: 8+ years of experience
Must-Have Skillset:
- Hands-on experience with Data Governance: Data Governance encompasses components such as Data Control, Data Privacy, Data Ethics, and Data Strategy. The candidate need not have experience in all areas, but hands-on experience in at least one and an understanding of the others is essential.
- An existing understanding of EDM/DAMA/DCAM would be very useful.
- Experience in multi-team global collaboration: the CoE team is central to multiple global teams in International, and the candidate should be adept at navigating these complexities.
- Experience with strategic initiatives: the I-DM CoE, particularly the Data Governance segment, is a strategic team within International. Prior experience with strategic solutioning is crucial, whereas experience limited to delivery roles may not be suitable.
- Strong communication skills.
Good-to-Have Skillset:
- Pharma background, as the enterprise data landscape in the pharma industry differs from other domains.
- Experience working with non-US clients.
- Consulting background.
Our Commitment to Diversity & Inclusion: Did you know that Apexon has been Certified by Great Place To Work®, the global authority on workplace culture, in each of the three regions in which it operates: USA (for the fourth time in 2023), India (seven consecutive certifications as of 2023), and the UK. Apexon is committed to being an equal opportunity employer and promoting diversity in the workplace. We take affirmative action to ensure equal employment opportunity for all qualified individuals. Apexon strictly prohibits discrimination and harassment of any kind and provides equal employment opportunities to employees and applicants without regard to gender, race, color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law.
You can read about our Job Applicant Privacy policy here: Job Applicant Privacy Policy (apexon.com)
Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience, and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer:
- Group Health Insurance covering a family of 4
- Term Insurance and Accident Insurance
- Paid Holidays & Earned Leaves
- Paid Parental Leave
- Learning & Career Development
- Employee Wellness
About Apexon: Apexon is a digital-first technology services firm specializing in accelerating business transformation and delivering human-centric digital experiences. We have been meeting customers wherever they are in the digital lifecycle and helping them outperform their competition through speed and innovation. Apexon brings together distinct core competencies in AI, analytics, app development, cloud, commerce, CX, data, DevOps, IoT, mobile, quality engineering and UX, and our deep expertise in BFSI, healthcare, and life sciences – to help businesses capitalize on the unlimited opportunities digital offers. Our reputation is built on a comprehensive suite of engineering services, a dedication to solving clients’ toughest technology problems, and a commitment to continuous improvement. Backed by Goldman Sachs Asset Management and Everstone Capital, Apexon now has a global presence of 15 offices (and 10 delivery centers) across four continents. We enable #HumanFirstDigital

Posted 1 week ago

Apply

12.0 - 20.0 years

35 - 45 Lacs

Gurugram

Work from Office

Role & responsibilities
Type of profile we are looking for:
- 10+ years of experience in driving large data programs in banking.
- 8-10 years of experience in implementing Data Governance frameworks.
- In-depth understanding of RDAR, BCBS 239, and financial & non-financial risks.
- Experience in data engineering and a good understanding of ETL & data platforms.
- Experience in risk, regulatory, and data programs.
- Experience creating data architectures in GCP.
- Working knowledge of Databricks.
- BFS domain experience is a must.
- Good communication skills.
- Must visit the office 3 days a week.
Key day-to-day responsibilities:
- Work with client technology partners.
- Be the link between the engineering team and business stakeholders.
- Take reporting & data aggregation requirements from the business and liaise with Tech to integrate the logical data models into the Datahub.
- Assist the client tech teams in building the new data platform.
- Experience in building data models, quality controls, and data profiling.
Good to have:
- Understanding of ServiceNow.
- Worked on BCBS 239.
- Project management experience.
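The data quality controls this kind of role builds can be illustrated with a small sketch. This is not any specific bank's framework: the field names, records, and thresholds below are invented, in the spirit of the completeness controls BCBS 239 risk-data aggregation programs require.

```python
# Sketch of a completeness control over risk exposure records.
# Field names ("counterparty", "rating") and thresholds are illustrative only.

def completeness_ratio(records, field):
    """Share of records where `field` is present and non-empty."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) not in (None, ""))
    return filled / len(records)

def run_controls(records, rules):
    """Evaluate each (field -> minimum ratio) rule; return failing fields."""
    return [field for field, threshold in rules.items()
            if completeness_ratio(records, field) < threshold]

exposures = [
    {"counterparty": "ACME", "exposure": 1_200_000, "rating": "BB"},
    {"counterparty": "ZenCo", "exposure": 800_000, "rating": ""},
]
breaches = run_controls(exposures, {"counterparty": 1.0, "rating": 1.0})
```

In a real program such checks would be profiled continuously against the data platform rather than run on in-memory dictionaries.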

Posted 1 week ago

Apply

15.0 - 20.0 years

18 - 22 Lacs

Noida

Remote

Contract: 6 months. Position Summary : We are seeking an experienced SAP Data Architect with a strong background in enterprise data management, SAP S/4HANA architecture, and data integration. This role demands deep expertise in designing and governing data structures, ensuring data consistency, and leading data transformation initiatives across large-scale SAP implementations. Key Responsibilities : - Lead the data architecture design across multiple SAP modules and legacy systems. - Define data governance strategies, master data management (MDM), and metadata standards. - Architect data migration strategies for SAP S/4HANA projects, including ETL, data validation, and reconciliation. - Develop data models (conceptual, logical, and physical) aligned with business and technical requirements. - Collaborate with functional and technical teams to ensure integration across SAP and non-SAP platforms. - Establish data quality frameworks and monitoring practices. - Conduct impact assessments and ensure scalability of the data architecture. - Support reporting and analytics requirements through structured data delivery frameworks (e.g., SAP BW, SAP HANA). Required Qualifications : - 15+ years of experience in enterprise data architecture, with 8+ years in SAP landscapes. - Proven experience with SAP S/4HANA data models, SAP Datasphere, SAP HANA Cloud & SAC, and integrating these with an AWS Data Lake (S3). - Strong knowledge of data migration tools (SAP Data Services, LTMC/LSMW, BODS, etc.). - Expertise in data governance, master data strategy, and data lifecycle management. - Experience with cloud data platforms (Azure, AWS, or GCP) is a plus. - Strong analytical and communication skills to work across business and IT stakeholders. Preferred Certifications : - SAP Certified Technology Associate (SAP S/4HANA / Datasphere) - TOGAF or other Enterprise Architecture certifications - ITIL Foundation (for process alignment)
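The data validation and reconciliation step in a migration like this usually reduces to comparing source and target per business key. A minimal sketch, with invented field names and no SAP tooling involved (real projects would use SAP Data Services or LTMC reconciliation reports):

```python
# Hedged sketch of post-load reconciliation: compare per-key content hashes
# between source and target. The "matnr"/"desc" fields are invented examples.
import hashlib

def row_fingerprint(row):
    """Stable hash of a row's values, for source-vs-target comparison."""
    joined = "|".join(str(row[k]) for k in sorted(row))
    return hashlib.sha256(joined.encode()).hexdigest()

def reconcile(source_rows, target_rows, key):
    """Return keys missing in target and keys whose content differs."""
    src = {r[key]: row_fingerprint(r) for r in source_rows}
    tgt = {r[key]: row_fingerprint(r) for r in target_rows}
    missing = sorted(set(src) - set(tgt))
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return {"missing_in_target": missing, "value_mismatches": mismatched}

source = [{"matnr": "M1", "desc": "Bolt"}, {"matnr": "M2", "desc": "Nut"}]
target = [{"matnr": "M1", "desc": "Bolt"}]
result = reconcile(source, target, "matnr")
```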

Posted 1 week ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Noida

Work from Office

Role Responsibilities : - Develop and design comprehensive Power BI reports and dashboards. - Collaborate with stakeholders to understand reporting needs and translate them into functional requirements. - Create visually appealing interfaces using Figma for enhanced user experience. - Utilize SQL for data extraction and manipulation to support reporting requirements. - Implement DAX measures to ensure accurate data calculations. - Conduct data analysis to derive actionable insights and facilitate decision-making. - Perform user acceptance testing (UAT) to validate report performance and functionality. - Provide training and support for end-users on dashboards and reporting tools. - Monitor and enhance the performance of existing reports on an ongoing basis. - Work closely with cross-functional teams to align project objectives with business goals. - Maintain comprehensive documentation for all reporting activities and processes. - Stay updated on industry trends and best practices related to data visualization and analytics. - Ensure compliance with data governance and security standards. - Participate in regular team meetings to discuss project progress and share insights. - Assist in the development of training materials for internal stakeholders. Qualifications - Minimum 8 years of experience in Power BI and Figma. - Strong proficiency in SQL and database management. - Extensive knowledge of data visualization best practices. - Expertise in DAX for creating advanced calculations. - Proven experience in designing user interfaces with Figma. - Excellent analytical and problem-solving skills. - Ability to communicate complex data insights to non-technical stakeholders. - Strong attention to detail and commitment to quality. - Experience with business analytics and reporting tools. - Familiarity with data governance and compliance regulations. - Ability to work independently and as part of a team in a remote setting. 
- Strong time management skills and ability to prioritize tasks. - Ability to adapt to fast-paced working environments. - Strong interpersonal skills and stakeholder engagement capability. - Relevant certifications in Power BI or data analytics are a plus.
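DAX measures execute inside the Power BI model, so they cannot be run here; as a language-neutral illustration, the calculation behind a typical year-over-year growth measure (with DIVIDE-style safe division) can be sketched in Python. Data and names are invented.

```python
# Illustrative only: the arithmetic a "YoY Growth %" DAX measure performs,
# expressed in plain Python over an invented sales table.
sales = [
    {"year": 2023, "amount": 100.0},
    {"year": 2023, "amount": 150.0},
    {"year": 2024, "amount": 300.0},
]

def total_for_year(rows, year):
    """Sum of amounts filtered to one year (cf. CALCULATE with a year filter)."""
    return sum(r["amount"] for r in rows if r["year"] == year)

def yoy_growth(rows, year):
    """(current - prior) / prior; returns None on a zero prior, like DIVIDE."""
    prior = total_for_year(rows, year - 1)
    current = total_for_year(rows, year)
    return None if prior == 0 else (current - prior) / prior

growth = yoy_growth(sales, 2024)
```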

Posted 1 week ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Noida

Remote

Job Summary : We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques. Key Responsibilities : - Design and implement scalable data models using Snowflake and Erwin Data Modeler. - Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow). - Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models. - Develop efficient SQL queries and stored procedures for data transformation, quality, and validation. - Collaborate with business analysts and stakeholders to gather data requirements and convert them into physical and logical models. - Ensure performance tuning, security, and optimization of the Snowflake data warehouse. - Document metadata, data lineage, and business logic behind data structures and flows. - Participate in code reviews, enforce coding standards, and provide best practices for data modeling and governance. Must-Have Skills : - Snowflake architecture, schema design, and data warehouse experience. - DBT (Data Build Tool) for data transformation and pipeline development. - Strong expertise in SQL (query optimization, complex joins, window functions, etc.) - Hands-on experience with Erwin Data Modeler (logical and physical modeling). - Experience with GCP (BigQuery, Cloud Composer, Cloud Storage). - Experience in reverse engineering legacy systems like Sailfish or DDMS using DBeaver. Good To Have : - Experience with CI/CD tools and DevOps for data environments. - Familiarity with data governance, security, and privacy practices. - Exposure to Agile methodologies and working in distributed teams. 
- Knowledge of Python for data engineering tasks and orchestration scripts. Soft Skills : - Excellent problem-solving and analytical skills. - Strong communication and stakeholder management. - Self-driven with the ability to work independently in a remote setup.
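The window-function SQL this role emphasizes is standard syntax, so as a sketch it can be run locally on SQLite as a stand-in for Snowflake. The table and data are invented; only the query shape is the point.

```python
# Running total per customer via a window function, executed on SQLite
# as a stand-in for Snowflake (the SQL itself is standard syntax).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("a", 10.0), ("a", 30.0), ("b", 5.0)])

rows = conn.execute("""
    SELECT customer, amount,
           SUM(amount) OVER (PARTITION BY customer ORDER BY amount
                             ROWS UNBOUNDED PRECEDING) AS running_total
    FROM orders
    ORDER BY customer, amount
""").fetchall()
```

The same transformation would typically live in a dbt model, with the window function materialized as a column in the target table.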

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Noida

Work from Office

Position Overview : We are seeking an experienced Data Catalog Lead to lead the implementation and ongoing development of an enterprise data catalog using Collibra. This role focuses specifically on healthcare payer industry requirements, including complex regulatory compliance, member data privacy, and multi-system data integration challenges unique to health plan operations. Key Responsibilities : - Data Catalog Implementation & Development - Configure and customize Collibra workflows, data models, and governance processes to support health plan business requirements - Develop automated data discovery and cataloging processes for healthcare data assets including claims, eligibility, provider networks, and member information - Design and implement data lineage tracking across complex healthcare data ecosystems spanning core administration systems, data warehouses, and analytics platforms - Healthcare-Specific Data Governance - Build specialized data catalog structures for healthcare data domains including medical coding systems (ICD-10, CPT, HCPCS), pharmacy data (NDC codes), and provider taxonomies - Configure data classification and sensitivity tagging for PHI (Protected Health Information) and PII data elements in compliance with HIPAA requirements - Implement data retention and privacy policies within Collibra that align with healthcare regulatory requirements and member consent management - Develop metadata management processes for regulatory reporting datasets (HEDIS, Medicare Stars, MLR reporting, risk adjustment) Technical Integration & Automation : - Integrate Collibra with healthcare payer core systems including claims processing platforms, eligibility systems, provider directories, and clinical data repositories - Implement automated data quality monitoring and profiling processes that populate the data catalog with technical and business metadata - Configure Collibra's REST APIs to enable integration with existing data governance tools and business 
intelligence platforms Required Qualifications : - Collibra Platform Expertise - 8+ years of hands-on experience with Collibra Data Intelligence Cloud platform implementation and administration - Expert knowledge of Collibra's data catalog, data lineage, and data governance capabilities - Proficiency in Collibra workflow configuration, custom attribute development, and role-based access control setup - Experience with Collibra Connect for automated metadata harvesting and system integration - Strong understanding of Collibra's REST APIs and custom development capabilities - Healthcare Payer Industry Knowledge - 4+ years of experience working with healthcare payer/health plan data environments - Deep understanding of healthcare data types including claims (professional, institutional, pharmacy), eligibility, provider data, and member demographics - Knowledge of healthcare industry standards including HL7, X12 EDI transactions, and FHIR specifications - Familiarity with healthcare regulatory requirements (HIPAA, ACA, Medicare Advantage, Medicaid managed care) - Understanding of healthcare coding systems (ICD-10-CM/PCS, CPT, HCPCS, NDC, SNOMED CT) Technical Skills : - Strong SQL skills and experience with healthcare databases (claims databases, clinical data repositories, member systems) - Knowledge of cloud platforms (AWS, Azure, GCP) and their integration with Collibra cloud services - Understanding of data modeling principles and healthcare data warehouse design patterns Data Governance & Compliance : - Experience implementing data governance frameworks in regulated healthcare environments - Knowledge of data privacy regulations (HIPAA, state privacy laws) and their implementation in data catalog tools - Understanding of data classification, data quality management, and master data management principles - Experience with audit trail requirements and compliance reporting in healthcare organizations Preferred Qualifications : - Advanced Healthcare Experience - 
Experience with specific health plan core systems (such as HealthEdge, Facets, QNXT, or similar platforms) - Knowledge of Medicare Advantage, Medicaid managed care, or commercial health plan operations - Understanding of value-based care arrangements and their data requirements - Experience with clinical data integration and population health analytics Technical Certifications & Skills : - Collibra certification (Data Citizen, Data Steward, or Technical User) - Experience with additional data catalog tools (Alation, Apache Atlas, IBM Watson Knowledge Catalog) - Knowledge of data virtualization tools and their integration with data catalog platforms - Experience with healthcare interoperability standards and API management
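The PHI/PII classification and sensitivity tagging described above is, at its core, rule-driven metadata labeling. A hypothetical sketch follows: the patterns and column names are invented, and a real Collibra deployment would drive this through its workflows and REST APIs rather than a local script.

```python
# Hypothetical PHI tagging rules for catalog columns. The pattern list and
# column names are invented examples, not a real compliance ruleset.
PHI_PATTERNS = ("ssn", "dob", "member_id", "diagnosis", "ndc")

def classify_column(name):
    """Tag a column as PHI if its name matches a known sensitive pattern."""
    lowered = name.lower()
    if any(pattern in lowered for pattern in PHI_PATTERNS):
        return "PHI"
    return "Public"

columns = ["member_ssn", "plan_name", "primary_diagnosis_code"]
tags = {c: classify_column(c) for c in columns}
```

Name-based rules are only a first pass; production tagging would also profile column contents and carry steward review before a sensitivity label is trusted.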

Posted 1 week ago

Apply

7.0 - 10.0 years

10 - 14 Lacs

Noida

Work from Office

About the Job : We are seeking a highly skilled and experienced Senior Data Engineer to join our dynamic team. In this pivotal role, you will be instrumental in driving our data engineering initiatives, with a strong emphasis on leveraging Dataiku's capabilities to enhance data processing and analytics. You will be responsible for designing, developing, and optimizing robust data pipelines, ensuring seamless integration of diverse data sources, and maintaining high data quality and accessibility to support our business intelligence and advanced analytics projects. This role requires a unique blend of expertise in traditional data engineering principles, advanced data modeling, and a forward-thinking approach to integrating cutting-edge AI technologies, particularly LLM Mesh for Generative AI applications. If you are passionate about building scalable data solutions and are eager to explore the cutting edge of AI, we encourage you to apply. Key Responsibilities : - Dataiku Leadership : Drive data engineering initiatives with a strong emphasis on leveraging Dataiku capabilities for data preparation, analysis, visualization, and the deployment of data solutions. - Data Pipeline Development : Design, develop, and optimize robust and scalable data pipelines to support various business intelligence and advanced analytics projects. This includes developing and maintaining ETL/ELT processes to automate data extraction, transformation, and loading from diverse sources. - Data Modeling & Architecture : Apply expertise in data modeling techniques to design efficient and scalable database structures, ensuring data integrity and optimal performance. - ETL/ELT Expertise : Implement and manage ETL processes and tools to ensure efficient and reliable data flow, maintaining high data quality and accessibility. - Gen AI Integration : Explore and implement solutions leveraging LLM Mesh for Generative AI applications, contributing to the development of innovative AI-powered features. 
- Programming & Scripting : Utilize programming languages such as Python and SQL for data manipulation, analysis, automation, and the development of custom data solutions. - Cloud Platform Deployment : Deploy and manage scalable data solutions on cloud platforms such as AWS or Azure, leveraging their respective services for optimal performance and cost-efficiency. - Data Quality & Governance : Ensure seamless integration of data sources, maintaining high data quality, consistency, and accessibility across all data assets. Implement data governance best practices. - Collaboration & Mentorship : Collaborate closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver impactful solutions. Potentially mentor junior team members. - Performance Optimization : Continuously monitor and optimize the performance of data pipelines and data systems. Required Skills & Experience : - Proficiency in Dataiku : Demonstrable expertise in Dataiku for data preparation, analysis, visualization, and building end-to-end data pipelines and applications. - Expertise in Data Modeling : Strong understanding and practical experience in various data modeling techniques (e.g., dimensional modeling, Kimball, Inmon) to design efficient and scalable database structures. - ETL/ELT Processes & Tools : Extensive experience with ETL/ELT processes and a proven track record of using various ETL tools (e.g., Dataiku's built-in capabilities, Apache Airflow, Talend, SSIS, etc.). - Familiarity with LLM Mesh : Familiarity with LLM Mesh or similar frameworks for Gen AI applications, understanding its concepts and potential for integration. - Programming Languages : Strong proficiency in Python for data manipulation, scripting, and developing data solutions. Solid command of SQL for complex querying, data analysis, and database interactions. 
- Cloud Platforms : Knowledge and hands-on experience with at least one major cloud platform (AWS or Azure) for deploying and managing scalable data solutions (e.g., S3, EC2, Azure Data Lake, Azure Synapse, etc.). - Gen AI Concepts : Basic understanding of Generative AI concepts and their potential applications in data engineering. - Problem-Solving : Excellent analytical and problem-solving skills with a keen eye for detail. - Communication : Strong communication and interpersonal skills to collaborate effectively with cross-functional teams. Bonus Points (Nice to Have) : - Experience with other big data technologies (e.g., Spark, Hadoop, Snowflake). - Familiarity with data governance and data security best practices. - Experience with MLOps principles and tools. - Contributions to open-source projects related to data engineering or AI. Education : Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related quantitative field.
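The extract-transform-load shape described above can be sketched minimally in plain Python. The data and function names are invented, and in practice Dataiku (or an orchestrator such as Airflow) would schedule and monitor each stage against real sources.

```python
# Minimal ETL sketch: extract raw rows, cast and validate them, route
# unparsable rows to a reject list, and load the clean rows. Data is invented.
def extract():
    """Stand-in for reading from a real source system."""
    return [{"id": 1, "amt": "12.5"}, {"id": 2, "amt": "bad"}, {"id": 3, "amt": "7"}]

def transform(rows):
    """Cast amounts to float; unparsable rows go to a reject list."""
    clean, rejects = [], []
    for row in rows:
        try:
            clean.append({"id": row["id"], "amt": float(row["amt"])})
        except ValueError:
            rejects.append(row)
    return clean, rejects

def load(rows, target):
    """Stand-in for writing to a warehouse table; returns rows loaded."""
    target.extend(rows)
    return len(rows)

warehouse = []
good, bad = transform(extract())
loaded = load(good, warehouse)
```

Keeping a reject path (rather than failing the whole batch) is what makes the data quality of each load auditable.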

Posted 1 week ago

Apply

4.0 - 7.0 years

15 - 18 Lacs

Bhubaneswar, Coimbatore, Bengaluru

Work from Office

Role & responsibilities
The candidate must have deep expertise in data management maturity models, data governance frameworks, and regulatory requirements, ensuring businesses can maximize their data assets while complying with both local and international regulations. This is an exciting opportunity to work in a consulting environment, collaborating with industry leaders and driving data-driven business transformation. This role is based in India, with the expectation of traveling to Middle Eastern client locations as required.
1. Data Strategy & Advisory: Develop and implement enterprise-wide data strategies aligned with business objectives. Assess data maturity levels using industry-standard frameworks and define roadmaps for data-driven transformation. Advise clients on data monetization, data quality, and data lifecycle management.
2. Data Governance & Compliance: Define and implement data governance frameworks, policies, and best practices. Ensure compliance with local and international data regulations, including GDPR, HIPAA, and region-specific laws. Develop data stewardship programs, ensuring clear roles and responsibilities for data management.
3. Regulatory & Risk Management: Provide expertise on data privacy, security, and risk management strategies. Align data strategies with regulatory frameworks such as ISO 27001, NIST, and other industry-specific compliance standards. Advise on data sovereignty and cross-border data transfer policies.
4. Consulting & Pre-Sales Support: Conduct client workshops to define data strategy and governance models. Develop thought leadership, whitepapers, and strategic insights to support client engagements. Assist in business development efforts, including proposals and pre-sales discussions.
5. Team Mentorship & Leadership: Mentor junior consultants on data governance and strategic advisory. Stay updated on emerging trends in data strategy, regulations, and governance technologies. Represent the company at industry events, conferences, and knowledge-sharing forums.
Preferred candidate profile
1. Education & Experience: Bachelor's or Master's in Data Management, Business Analytics, Information Systems, or a related field. 5+ years of experience in data strategy, governance, or regulatory compliance consulting.
2. Technical & Regulatory Expertise: Deep understanding of data management maturity models (e.g., DAMA-DMBOK, CMMI for Data Management); should be DAMA certified. Basic proficiency in data governance tools such as Collibra, Informatica, or Azure Purview. Strong knowledge of local and international data regulations (e.g., GDPR, CCPA, PDPA, UAE’s NDPL, KSA-NDMO, UAE DGE Data Regulations, Dubai Data Law).

Posted 1 week ago

Apply

9.0 - 12.0 years

16 - 21 Lacs

Pune

Hybrid

So, what’s the role all about? As a Program Manager, you will be responsible for overseeing multiple projects and initiatives that support the organization's strategic goals. You will work closely with cross-functional teams to ensure successful project execution, on-time delivery, and adherence to quality standards. How will you make an impact? Define project scope, goals, and deliverables that support business goals in collaboration with senior management and stakeholders. Develop and maintain a detailed project plan to track progress and ensure timely delivery of project milestones. Monitor project progress and performance, identify and mitigate risks and issues, and communicate status updates to stakeholders and senior management. Collaborate with cross-functional teams to identify and resolve project-related issues and roadblocks. End-to-end Agile project management responsibility in terms of scope, quality, resources, and risk management, as well as timeline and organizational release readiness. Develop and maintain strong relationships with key stakeholders to ensure project success and alignment with business objectives. Ensure adherence to project management methodologies, standards, and best practices, and continuously improve project management processes and tools. Lead project meetings and presentations, and facilitate communication and collaboration among team members and stakeholders. Have you got what it takes? 10-14 years of experience in the IT industry, with 5+ years of experience in core software development project & program management. Strong understanding of project management methodologies, tools, and techniques. Proven track record of successfully managing multiple projects and initiatives simultaneously. Excellent communication, negotiation, and interpersonal skills. Ability to work collaboratively with cross-functional teams and manage multiple stakeholders. Strong attention to detail and ability to manage competing priorities.
Working knowledge of various methodologies such as Agile-Scrum practices. Ability to drive project decisions through strong data governance and metrics. Strong problem-solving and decision-making skills. Hands-on knowledge & experience of software development & quality processes and standards, release management, and pre- & post-production product launches. Hands-on experience with Atlassian tools (JIRA/Confluence). Basic knowledge of cloud (AWS) and DevOps practices. PMP certification preferred. What’s in it for you? Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr! Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere. Reporting into: Tech Manager, Program Management. Role Type: Individual Contributor

Posted 1 week ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Pune

Hybrid

So, what’s the role all about? We are looking for a highly driven and technically skilled Software Engineer to lead the integration of various Content Management Systems with AWS Knowledge Hub, enabling advanced Retrieval-Augmented Generation (RAG) search across heterogeneous customer data—without requiring data duplication. This role will also be responsible for expanding the scope of Knowledge Hub to support non-traditional knowledge items and enhance customer self-service capabilities. You will work at the intersection of AI, search infrastructure, and developer experience to make enterprise knowledge instantly accessible, actionable, and AI-ready. How will you make an impact? Integrate CMS with AWS Knowledge Hub to allow seamless RAG-based search across diverse data types—eliminating the need to copy data into Knowledge Hub instances. Extend Knowledge Hub capabilities to ingest and index non-knowledge assets, including structured data, documents, tickets, logs, and other enterprise sources. Build secure, scalable connectors to read directly from customer-maintained indices and data repositories. Enable self-service capabilities for customers to manage content sources using App Flow, Tray.ai, configure ingestion rules, and set up search parameters independently. Collaborate with the NLP/AI team to optimize relevance and performance for RAG search pipelines. Work closely with product and UX teams to design intuitive, powerful experiences around self-service data onboarding and search configuration. Implement data governance, access control, and observability features to ensure enterprise readiness. Have you got what it takes? Proven experience with search infrastructure, RAG pipelines, and LLM-based applications. 5+ Years’ hands-on experience with AWS Knowledge Hub, AppFlow, Tray.ai, or equivalent cloud-based indexing/search platforms. Strong backend development skills (Python, Typescript/NodeJS, .NET/Java) and familiarity with building and consuming REST APIs. 
Infrastructure as Code (IaC) knowledge with services such as AWS CloudFormation and CDK. Deep understanding of data ingestion pipelines, index management, and search query optimization. Experience working with unstructured and semi-structured data in real-world enterprise settings. Ability to design for scale, security, and multi-tenant environments. What’s in it for you? Join an ever-growing, market-disrupting, global company where the teams – comprised of the best of the best – work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr! Enjoy NICE-FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere. Reporting into: Tech Manager, Engineering, CX. Role Type: Individual Contributor
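The retrieval half of a RAG pipeline reduces to scoring indexed chunks against a query and returning the best matches. A self-contained sketch follows: production systems use embeddings and a vector index, so simple token overlap stands in here, and the documents are invented.

```python
# Toy retrieval step of a RAG pipeline: rank chunks by token overlap with
# the query. Embedding similarity would replace this scoring in practice.
def tokenize(text):
    return set(text.lower().split())

def top_k(query, chunks, k=2):
    """Return the k chunks sharing the most tokens with the query."""
    query_tokens = tokenize(query)
    scored = sorted(chunks,
                    key=lambda c: len(query_tokens & tokenize(c)),
                    reverse=True)
    return scored[:k]

chunks = [
    "reset your password from the account settings page",
    "shipping times vary by region",
    "contact support to reset a locked account",
]
hits = top_k("how do I reset my password", chunks, k=1)
```

The retrieved chunks would then be injected into the LLM prompt as grounding context, which is the "augmented" part of retrieval-augmented generation.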

Posted 1 week ago

Apply

8.0 - 13.0 years

13 - 17 Lacs

Mumbai

Work from Office

Project description Our client is a leading commodity trading and logistics company. They are committed to building and maintaining world-class IT applications and infrastructure. The Trading IT group directly supports the trading business, and this business has started a far-reaching programme to enhance and improve its trading applications using an innovative architecture to support business growth across the full range of business lines and geographies, and to enable the sharing of systems across different businesses. This programme is aimed at delivering functional capabilities, enhancements, and technical infrastructure upgrades to enable continued business growth and enhanced profitability for the firm. Client is looking to replace existing reconciliation system Gresham with Exceptor which will be enterprise-wide recon platform across FO, MO and BO Responsibilities a) Determine and define project scope and objectives b) Predict resources needed to reach objectives and manage resources in an effective and efficient manner c) Develop and manage a detailed project schedule and work plan d) Provide project updates on a consistent basis to various stakeholders about strategy, adjustments, and progress e) Manage vendors and stakeholder tasks and communicating expected deliverables f) Utilize industry best practices, techniques, and standards throughout entire project execution g) Monitor progress and make adjustments as needed h) Measure project performance to identify areas for improvement i) Maintain roadmap and maintain resource allocation / utilization Skills Must have Knowledge & Experience: Overall 8+ years of experience out of which at least 5 years in OTC derivatives space Minimum 5 years of experience as project manager Knows how to handle project complexity in terms of stakeholder management, conflict management, change management etc. 
Understand concepts such as static data, industry codes, data governance and control, as well as financial reporting Have worked in a finance department and understand basic reporting concepts Experience working in team engagements to finalize new operating models and roadmaps for change across people, process, data, and technology Review processes, bypasses, and challenges ahead, and propose a proxy approach Adaptable to an evolving scope of tasks, comfortable with uncertainty as well as changing global requirements Leads by example on change management best practice on initiatives driven by the workstreams Familiarity with Agile methodologies Knowledge of project planning tools. Familiar with and able to apply project management methodologies (for example, PMI, PRINCE2, and Agile) Good understanding of current and emerging technologies and how other enterprises are employing them to drive digital business Exceptional verbal and written communication skills; expertise in setting and managing customer expectations Distinctive blend of business, IT, financial, and communication skills, as this is a highly visible position with substantial impact Effective influencing and negotiating skills in an environment where this role may not directly control resources Nice to have Prior experience in reconciliation Other Languages English: C2 Proficient Seniority Senior

Posted 1 week ago

Apply

3.0 - 10.0 years

0 Lacs

thane, maharashtra

On-site

We are seeking a dynamic and experienced individual to manage Call Centre operations as a Contact Centre Manager. As the Contact Centre Manager, you will be responsible for optimizing Customer Response Centre (CRC) processes, leading a team to achieve service excellence, and managing added responsibilities such as Warranty Administration and Service Master data management. Your role will involve ensuring compliance with manufacturer and company policies, maintaining accurate records, and facilitating excellent customer support both internally and externally. Your strong interpersonal skills will enable you to build and maintain positive relationships with colleagues, clients, and stakeholders, fostering a collaborative and supportive work environment. To qualify for this role, you should have a Bachelor's degree in Business Administration, Electronics and Telecommunications, Electrical, or a related field. Additionally, you should have at least 10 years of experience in contact centre operations, with a minimum of 3 years in a managerial role. Your key responsibilities will include overseeing the daily operations of the subcontracted contact centre, developing customer service strategies, coaching and managing a team of customer service representatives, monitoring key performance indicators, handling escalated customer issues, analyzing call centre data, developing training programs, ensuring compliance with company policies and industry regulations, and working closely with service and IT teams to improve customer support processes. In addition, you will be responsible for reviewing extended warranty claims, communicating with Service Engineers and manufacturers, tracking and monitoring warranty claims, maintaining detailed records, assisting customers and internal teams with warranty-related inquiries, staying updated on internal policies and warranty guidelines, and supporting service department operations as needed.
You will also be involved in developing, implementing, and maintaining master data management policies, collaborating with cross-functional teams, managing data lifecycle processes, resolving data quality issues, enforcing data governance frameworks, generating reports from master data, providing training to business users, and using enterprise resource planning tools to log and track warranty claims and service requests. To excel in this role, you should have proven experience in contact centre management or a similar leadership role, a strong understanding of customer service principles and call centre technologies, excellent leadership and team-building skills, the ability to analyze data and make strategic decisions, proficiency in Oracle E-Business Suite, call centre software, and workforce management tools, and the ability to handle high-pressure situations and multitask effectively. Furthermore, you should possess good domain knowledge in the field service and service sales domains, including understanding Service Level Agreements (SLAs), Key Performance Indicators (KPIs), service processes, sales processes, problem-solving skills, and critical thinking. Your soft skills should include strong communication and presentation skills, collaboration skills, attention to detail, curiosity, continuous learning, and the ability to work in an interruption-driven environment. Travel may be required up to 5% (domestic and international), and the successful candidate will be expected to embrace Vertiv's Core Principles & Behaviors to help execute the company's Strategic Priorities. Please note that Vertiv will only employ those who are legally authorized to work in the United States, and this position does not offer sponsorship for work authorization.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Data Quality Engineer, you will collaborate with product, engineering, and customer teams to gather requirements and develop a comprehensive data quality strategy. You will lead data governance processes, including data preparation, obfuscation, integration, slicing, and quality control. Testing data pipelines, ETL processes, APIs, and system performance to ensure reliability and accuracy will be a key responsibility. Additionally, you will prepare test data sets, conduct data profiling, and perform benchmarking to identify inconsistencies or inefficiencies. Creating and implementing strategies to verify the quality of data products and ensuring alignment with business standards will be crucial. You will set up data quality environments and applications in compliance with defined standards, contributing to CI/CD process improvements. Participation in the design and maintenance of data platforms, as well as building automation frameworks for data quality testing and resolving potential issues, will be part of your role. Providing support in troubleshooting data-related issues to ensure timely resolution is also expected. It is essential to ensure that all data quality processes and tools align with organizational goals and industry best practices. Collaboration with stakeholders to enhance data platforms and optimize data quality workflows will be necessary to drive success in this role. 
Requirements: - Bachelor's degree in Computer Science or a related technical field involving coding, such as physics or mathematics - At least three years of hands-on experience in Data Management, Data Quality verification, Data Governance, or Data Integration - Strong understanding of data pipelines, Data Lakes, and ETL testing methodologies - Proficiency in CI/CD principles and their application in data processing - Comprehensive knowledge of SQL, including aggregation and window functions - Experience in scripting with Python or similar programming languages - Databricks and Snowflake experience is a must, with good exposure to notebooks, SQL editors, etc. - Experience in developing test automation frameworks for data quality assurance - Familiarity with Big Data principles and their application in modern data systems - Experience in data analysis and requirements validation, including gathering and interpreting business needs - Experience in maintaining QA environments to ensure smooth testing and deployment processes - Hands-on experience in Test Planning, Test Case design, and Test Result Reporting in data projects - Strong analytical skills, with the ability to approach problems methodically and communicate solutions effectively - English proficiency at B2 level or higher, with excellent verbal and written communication skills Nice to have: - Familiarity with advanced data visualization tools to enhance reporting and insights - Experience in working with distributed data systems and frameworks like Hadoop.
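The data profiling and quality-control duties described in this posting can be sketched in plain Python. This is a toy illustration only; the record layout, column names, and metrics are invented, and production pipelines would run such checks inside frameworks like Great Expectations or Databricks notebooks.

```python
def profile(rows, key, required):
    """Toy data-quality profile: null rate per required column and duplicate-key count."""
    total = len(rows)
    null_rate = {
        col: sum(1 for r in rows if r.get(col) in (None, "")) / total
        for col in required
    }
    seen, dupes = set(), 0
    for r in rows:
        k = r.get(key)
        dupes += k in seen  # True counts as 1
        seen.add(k)
    return {"rows": total, "null_rate": null_rate, "duplicate_keys": dupes}

rows = [
    {"id": 1, "country": "IN", "amount": 100},
    {"id": 2, "country": "", "amount": 250},
    {"id": 2, "country": "US", "amount": None},
]
report = profile(rows, key="id", required=["country", "amount"])
print(report)  # 3 rows, one duplicate id, one null in each required column
```

Benchmarking in this context usually means tracking such metrics over time so that a sudden rise in null rates or duplicates flags an upstream pipeline problem.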

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

The Applications Development Technology Lead Analyst position at our organization entails taking charge of establishing and executing new or updated application systems and programs in collaboration with the Technology team. Your main responsibility will be to lead activities related to applications systems analysis and programming. In this role, we are looking for a skilled Database Architect who can contribute to designing, developing, and maintaining our upcoming big data platform. Your key focus will be on shaping our data strategy and ensuring the scalability, performance, and reliability of our data infrastructure. Proficiency in distributed systems, Hadoop, Spark, NoSQL databases, and either Python or Scala is essential for this position. Your responsibilities will include designing and implementing scalable big data solutions, developing and managing data models and ETL processes, optimizing database performance, and ensuring high availability. Collaboration with data engineers, scientists, and analysts to grasp data requirements and offer technical guidance is crucial. Additionally, you will be expected to evaluate new technologies, uphold data security and governance policies, troubleshoot database issues, document database architecture decisions, and mentor junior team members. To qualify for this role, you should hold a Bachelor's degree in Computer Science or a related field, along with a total of 12+ years of experience, including at least 5 years as a Data Architect. Strong knowledge of database design principles, data modeling techniques, and extensive experience with Hadoop, Spark, Kafka, and related technologies are necessary. Proficiency in Python or Scala, experience with NoSQL databases such as Cassandra and MongoDB, as well as excellent communication and collaboration skills are also required. 
Any additional experience with data warehousing, business intelligence tools, data governance, security best practices, or relevant certifications in technologies like Hadoop and Spark would be considered a bonus. A Master's degree is preferred but not mandatory. Please note that this job description is a general overview of the role's responsibilities, and additional duties may be assigned as needed.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

As a Senior Specialist, HR Data and Digital at NTT DATA, your primary focus will be on innovating HR platforms and data constructs. You will collaborate closely with HR, IT, and finance teams to ensure alignment and collaboration within the organization. Your responsibilities will include regular reviews to maintain data integrity, testing system changes, report writing, and analyzing data flows. You will extract and compile data, write reports using appropriate tools, and provide support for HR platforms like Workday, SuccessFactors, and Phenom People. Additionally, you will participate in major release reviews and integration testing, maintain HRIS procedures and documentation, and manage HR data and digital projects. To excel in this role, you should have a strong understanding of HR data management principles, data analytics concepts, and data governance. You should be familiar with HR technology systems, data privacy regulations, and emerging digital trends in HR. Proficiency in data analysis tools, attention to detail, problem-solving skills, and effective communication are essential for success in this role. Academically, a Bachelor's degree in Information Technology or a related field is required, along with certifications such as Workday, SuccessFactors, Lean Six Sigma Black Belt, and Certified Maintenance & Reliability Professional. Previous experience with HRIS platforms, talent analytics, and digital HR projects is crucial for this role. This position offers a hybrid working environment, and NTT DATA is an equal opportunity employer. If you are looking to drive innovation in HR, optimize processes, and enhance employee experiences, this role at NTT DATA could be the perfect fit for you.

Posted 2 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

telangana

On-site

As the Vice President of Engineering at Teradata in India, you will be responsible for leading the software development organization for the AI Platform Group. This includes overseeing the execution of the product roadmap for key technologies such as Vector Store, Agent platform, Apps, user experience, and AI/ML-driven use-cases. Your success in this role will be measured by your ability to build a world-class engineering culture, attract and retain technical talent, accelerate product delivery, and drive innovation that brings tangible value to customers. In this role, you will lead a team of over 150 engineers with a focus on helping customers achieve outcomes with Data and AI. Collaboration with key functions such as Product Management, Product Operations, Security, Customer Success, and Executive Leadership will be essential to your success. You will also lead a regional team of up to 500 individuals, including software development, cloud engineering, DevOps, engineering operations, and architecture teams. Collaboration with various stakeholders at regional and global levels will be a key aspect of your role. To be considered a qualified candidate for this position, you should have at least 10 years of senior leadership experience in product development or engineering within enterprise software product companies. Additionally, you should have a minimum of 3 years of experience in a VP Product or equivalent role managing large-scale technical teams in a growth market. You must have a proven track record of leading agentic AI development and scaling AI in a hybrid cloud environment, as well as experience with Agile and DevSecOps methodologies. Your background should include expertise in cloud platforms, data harmonization, data analytics for AI, Kubernetes, containerization, and microservices-based architectures. 
Experience in delivering SaaS-based data and analytics platforms, modern data stack technologies, AI/ML infrastructure, enterprise security, and performance engineering is also crucial. A passion for open-source collaboration, building high-performing engineering cultures, and inclusive leadership is highly valued. Ideally, you should hold a Master's degree in Engineering or Computer Science, or an MBA. At Teradata, we prioritize a people-first culture, offer a flexible work model, focus on well-being, and are committed to Diversity, Equity, and Inclusion. Join us in our mission to empower our customers and drive innovation in the world of AI and data analytics.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

0 Lacs

chennai, tamil nadu

On-site

We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. The ideal candidate should possess a solid background in data architecture, cloud data platforms, and Snowflake implementation, along with practical experience in end-to-end data pipeline and data warehouse design. In this role, you will be responsible for leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. You will also be tasked with defining data modeling standards, best practices, and governance frameworks. Collaborating with stakeholders to comprehend data requirements and translating them into robust architectural solutions will be a key part of your responsibilities. Furthermore, you will be required to design and optimize ETL/ELT pipelines utilizing tools like Snowpipe, Azure Data Factory, Informatica, or DBT. Implementing data security, privacy, and role-based access controls within Snowflake is also essential. Providing guidance to development teams on performance tuning, query optimization, and cost management within Snowflake will be part of your duties. Additionally, ensuring high availability, fault tolerance, and compliance across data platforms will be crucial. Mentoring developers and junior architects on Snowflake capabilities is an important aspect of this role. Qualifications and Experience: - 8+ years of overall experience in data engineering, BI, or data architecture, with a minimum of 3+ years of hands-on Snowflake experience. - Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization. - Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP). - Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion. - Good understanding of data lakes, data mesh, and modern data stack principles. 
- Experience with CI/CD for data pipelines, DevOps, and data quality frameworks. - Solid knowledge of data governance, metadata management, and cataloging. Desired Skills: - Snowflake certification (e.g., SnowPro Core/Advanced Architect). - Familiarity with Apache Airflow, Kafka, or event-driven data ingestion. - Knowledge of data visualization tools such as Power BI, Tableau, or Looker. - Experience in healthcare, BFSI, or retail domain projects. Please note that this job description is sourced from hirist.tech.
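The role-based access controls mentioned in this posting follow a common grant model: privileges are attached to roles, not users. This is a minimal Python sketch of the idea with invented roles, objects, and privileges; Snowflake itself expresses the same model in SQL GRANT statements rather than application code.

```python
# Illustrative role-based access control, loosely modelled on warehouse-style grants.
# Role names, objects, and privileges here are invented for the sketch.
GRANTS = {
    "analyst": {("sales", "SELECT")},
    "engineer": {("sales", "SELECT"), ("sales", "INSERT"), ("staging", "ALL")},
}

def is_allowed(role, obj, privilege):
    """True if the role holds the privilege directly or via an ALL grant on the object."""
    grants = GRANTS.get(role, set())
    return (obj, privilege) in grants or (obj, "ALL") in grants

print(is_allowed("engineer", "staging", "INSERT"))  # True (covered by ALL)
print(is_allowed("analyst", "sales", "INSERT"))     # False
```

Keeping grants in one declarative table like this is what makes access auditable, which is the point of "role-based access controls within Snowflake" above.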

Posted 2 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

maharashtra

On-site

Join our dynamic Integrated Data Platform Operations team and be at the forefront of data innovation. Collaborate with clients and technology partners to ensure data excellence. Elevate your career by driving data quality and governance in a strategic environment. As an Associate in the Integrated Data Platform Operations team, you will work with clients and technology partners to implement data quality and governance practices. You will define data standards and ensure data meets the highest quality. You will play a crucial role in enhancing data management across the Securities Services business. Key Responsibilities: - Define data quality standards - Investigate data quality issues - Collaborate with technology partners - Establish dashboards and metrics - Support data view and lineage tools - Embed data quality in UAT cycles - Assist Operations users with data access - Work with project teams on implementations - Implement data ownership processes - Deliver tools and training for data owners - Champion improvements to data quality Required Qualifications, Capabilities, and Skills: - Engage effectively across teams - Understand data components for IBOR - Comprehend trade lifecycle and cash management - Possess technical data management skills - Solve operational and technical issues - Deliver with limited supervision - Partner in a virtual team environment Preferred Qualifications, Capabilities, and Skills: - Demonstrate strong communication skills - Exhibit leadership in data governance - Adapt to changing project requirements - Analyze complex data sets - Implement innovative data solutions - Foster collaboration across departments - Drive continuous improvement initiatives
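Defining data quality standards, as this posting describes, usually means expressing them as named, measurable rules whose pass rates feed a dashboard. The sketch below is illustrative only; the trade fields and rules are invented, not the firm's actual standards.

```python
# Data-quality standards expressed as named rules (field names are invented).
RULES = {
    "isin_present": lambda rec: bool(rec.get("isin")),
    "quantity_non_negative": lambda rec: rec.get("quantity", 0) >= 0,
    "currency_is_iso": lambda rec: rec.get("currency") in {"USD", "EUR", "GBP", "INR"},
}

def evaluate(records):
    """Score each rule across all records: pass rate per rule for a quality dashboard."""
    scores = {}
    for name, rule in RULES.items():
        passed = sum(1 for rec in records if rule(rec))
        scores[name] = passed / len(records)
    return scores

trades = [
    {"isin": "US0378331005", "quantity": 10, "currency": "USD"},
    {"isin": "", "quantity": -5, "currency": "usd"},
]
print(evaluate(trades))  # each rule passes for 1 of 2 records -> 0.5
```

Publishing per-rule pass rates over time is what turns standards into the "dashboards and metrics" the responsibilities list calls for.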

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As a GTM Operations Analyst at Snowflake, you will play a crucial role in supporting the GTM Operations Shared Services organization. Your responsibilities will include aligning data across users, records, and systems, supporting Partner processes, troubleshooting and resolving process gaps, and participating in projects to enhance GTM systems and processes. You will be at the forefront of ensuring sales systems, processes, and data are driving business value, with a focus on operational excellence and continuous improvement. Your day-to-day tasks will involve providing first-line support for GTM operations workflows, utilizing data quality practices to ensure accuracy and completeness, ensuring compliance with policies and regulations, communicating with stakeholders, managing requests via a queue, and maintaining key documentation. Additionally, you will identify opportunities for process improvement and automation to enhance shared services systems. To excel in this role, you should have a minimum of 5 years of professional experience, with at least 3 years in sales operations, master data management, sales systems, processes, and tools. You should be well-versed in Salesforce CRM and connected applications, have experience in case queue management, SLA execution, and service organizations, and possess expertise in master data management, data quality, and data governance. Flexibility to work shifts based on operational demands, strong communication skills, business acumen, analytical abilities, and problem-solving skills are essential. An undergraduate degree is required, and an MBA is a plus. Snowflake is a rapidly growing company, and we are seeking individuals who align with our values, challenge conventional thinking, and drive innovation. If you are passionate about making an impact and contributing to our growth journey, we invite you to explore opportunities with Snowflake. 
For further details on job location, salary, and benefits in the United States, please refer to the job posting on the Snowflake Careers Site at careers.snowflake.com.

Posted 2 weeks ago

Apply

0.0 - 3.0 years

0 Lacs

indore, madhya pradesh

On-site

The leading provider of comprehensive waste and environmental services in North America, Waste Management (WM), is seeking an individual to join their HR Technology team in an entry-level position. As a part of the People Organization, you will be responsible for configuring and supporting software application systems that impact HR processes. Your role will involve providing technical and analytical support for HR foundational elements and structure. Key responsibilities include monitoring HR systems, troubleshooting application-related issues, and addressing system-related queries. You will also be involved in performing process review analysis, making configuration changes, and ensuring data integrity and governance through various validation methods. Additionally, you will play a crucial role in preparing for releases, upgrades, and patches by conducting testing, reporting, and analysis of changes. In this role, you will not have any supervisory duties. To be successful in this position, you must hold a Bachelor's Degree or a High School Diploma/GED with four years of relevant work experience. Previous experience is not required beyond the education requirement. The work environment for this role involves using motor coordination, exerting physical effort in handling objects, and may involve exposure to physical occupational risks and environmental elements. The normal setting for this job is an office environment, and you must be available to work standard business hours with the flexibility to work non-standard hours in case of emergencies. As a part of Waste Management, you will receive a competitive total compensation package that includes Medical, Dental, Vision, Life Insurance, Short Term Disability, and more. Additionally, employees enjoy benefits such as a Stock Purchase Plan, Company match on 401K, Paid Vacation, Holidays, and Personal Days. Please note that benefits may vary by site.
If you are looking for an opportunity to contribute to a Fortune 250 company and have the necessary qualifications and experience, we encourage you to click "Apply" to be considered for this role.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

karnataka

On-site

As an Operations Analyst at Lam Research, you will have the opportunity to play a crucial role in enhancing the operational effectiveness and efficiency of the Global Operations team. Your primary responsibility will be to utilize analytical methodologies to guide decision-makers towards achieving operational excellence. Your contributions will be instrumental in driving improvements and optimizing processes within the organization. Your main responsibilities will include developing, automating, and maintaining comprehensive reports and dashboards using tools such as Excel and Power BI. You will analyze datasets to provide valuable insights and create visualizations that effectively communicate data stories. It will be essential to ensure compliance with analytical standards and data governance policies to uphold data integrity and accuracy. Additionally, you will be expected to challenge stakeholders to prioritize long-term, data-driven decisions over quick fixes and identify process gaps, offering data-driven recommendations to leadership. You will also play a key role in facilitating change management for data and process changes, ensuring smooth implementation and seamless rollout. Monitoring and publishing operational performance against established metrics and targets will be crucial to track progress and make informed decisions. The ideal candidate for this role will hold a Bachelor's degree in business administration, operations management, supply chain, project management, finance, engineering, or a related field. You should have a minimum of 5+ years of experience in operations, with a focus on extracting and analyzing operational data to derive meaningful insights. Proficiency in data analysis tools and software, particularly Excel and Power BI, is required. Strong self-learning ability, excellent written and verbal communication skills, effective task management, and innovative problem-solving skills are essential qualities for this position. 
Preferred qualifications include experience with Alteryx for data preparation, modelling, and advanced analytics, as well as expertise in analyzing and optimizing complex operational processes. Demonstrated experience in process mapping, workflow analysis, root cause analysis, and corrective action planning will be advantageous. At Lam Research, we are committed to creating an inclusive environment where every individual is valued, included, and empowered to achieve their full potential. Our work location models offer flexibility based on role requirements, with hybrid roles combining on-site collaboration with remote work options to support a balanced approach to work-life integration.

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

At NiCE, we challenge our limits and strive to be game changers in everything we do. If you are ambitious, innovative, and always play to win, we have the ultimate career opportunity that will ignite your passion for excellence. We are currently looking for an experienced AI Architect who possesses not only strategic thinking skills but also a hands-on approach to coding. In this role, you will be involved in both proof-of-concept (POC) and production-grade AI projects. Your responsibilities will include mentoring team members, establishing ethical AI practices, and making critical decisions regarding AI deployment strategies. It is essential to have a strong background in deploying AI solutions on cloud platforms like Azure or AWS, as well as expertise in building secure and compliant data and machine learning pipelines. As a proactive leader, you will be responsible for bridging the gap between innovation and execution while ensuring scalability, security, and governance in AI systems. Your impact will be significant as you: - Build scalable AI systems and infrastructure capable of handling large datasets, ensuring performance, reliability, and maintainability. - Lead the development of secure and compliant data and machine learning pipelines, aligning with data governance and regulatory standards. - Design, develop, and implement AI models and algorithms to solve real-world business problems. - Mentor team members on AI technologies, best practices, and system architecture. - Collaborate with stakeholders to identify AI-driven innovation opportunities and translate business requirements into technical solutions. - Promote ethical and responsible AI practices across the organization. - Take ownership of strategic decisions related to AI deployment and lifecycle management. - Conduct research and implement machine learning algorithms, including Retrieval-Augmented Generation (RAG) techniques. 
- Develop AI applications using modern frameworks and run experiments to enhance model performance. - Define and implement AI project Software Development Lifecycle (SDLC) processes. To be successful in this role, you should have: - Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field. - Proven experience as an AI Architect with a track record of deploying AI solutions in production. - Strong expertise in AI/ML technologies, cloud platforms, and secure data management. - Proficiency in programming languages such as Python and .NET, and in AI/ML frameworks. - Experience with AI project SDLC, CI/CD for ML, and AI testing strategies. - Familiarity with DevOps and Data Engineering tools and practices. - Strong analytical and problem-solving skills. - Excellent communication skills to convey complex technical concepts. Join NiCE, a global company where innovation and collaboration thrive. Embrace the NICE-FLEX hybrid work model for maximum flexibility and endless opportunities for growth and development. If you are passionate, innovative, and ready to raise the bar, come join us at NiCE! Requisition ID: 7474 Reporting into: Tech Manager Role Type: Individual Contributor About NiCE: NICE Ltd. (NASDAQ: NICE) is a global leader in software products used by over 25,000 businesses worldwide. With a focus on delivering exceptional customer experiences and ensuring public safety, NiCE is known for its innovation in AI, cloud, and digital domains. Join our team of over 8,500 employees across 30+ countries and be part of our journey towards excellence.
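The Retrieval-Augmented Generation (RAG) technique this posting names can be sketched minimally: retrieve relevant context, then prepend it to the prompt before calling a language model. Production systems use embeddings and vector stores; this toy version uses token overlap purely for illustration, and the corpus is invented.

```python
def retrieve(query, corpus, k=2):
    """Rank documents by token overlap with the query (stand-in for vector similarity)."""
    q = set(query.lower().split())
    scored = sorted(
        corpus, key=lambda doc: len(q & set(doc.lower().split())), reverse=True
    )
    return scored[:k]

def build_prompt(query, corpus):
    """Augment the prompt with retrieved context before calling an LLM."""
    context = "\n".join(retrieve(query, corpus))
    return f"Context:\n{context}\n\nQuestion: {query}"

corpus = [
    "Refund requests are handled within 14 days.",
    "The contact centre is open 9am to 6pm.",
    "Password resets are self-service via the portal.",
]
print(build_prompt("how are refund requests handled", corpus))
```

Swapping the overlap score for cosine similarity over embeddings, and the list for a vector store, turns this skeleton into the pattern an AI Architect would deploy.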

Posted 2 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

karnataka

On-site

The Reporting & Data Product Owner - ISS Data (Associate Director) role at Fidelity involves leading the creation and execution of a future state data reporting product to enable Regulatory, Client, Vendor, Internal & MI reporting and analytics. This key role requires in-depth knowledge of data domains related to institutional clients, the investment life cycle, and regulatory and client reporting data requirements. Sitting within the ISS Delivery Data Analysis chapter, the successful candidate will collaborate with Business Architecture, Data Architecture, and business stakeholders to build a future state platform. Maintaining strong relationships with various business contacts is essential to ensure superior service to internal business stakeholders and clients.

**Key Responsibilities**

**Leadership and Management:**
- Lead ISS distribution, Client Propositions, Sustainable Investing, and Regulatory reporting data outcomes
- Define data roadmap and capabilities, supporting execution and delivery of data solutions as a Data Product lead
- Line management responsibilities for junior data analysts within the chapter
- Define data product vision and strategy with end-to-end thought leadership
- Lead and define the data product backlog, documentation, analysis effort estimation, and planning
- Drive efficiencies, scale, and innovation as a catalyst for change

**Data Quality and Integrity:**
- Define data quality use cases for all required data sets
- Contribute to technical frameworks of data quality
- Align functional solution with best-practice data architecture & engineering

**Coordination and Communication:**
- Communicate at a senior management level to influence senior tech and business stakeholders globally
- Coordinate with internal and external teams impacted by data flows
- Advocate for the ISS Data Programme
- Collaborate closely with Data Governance, Business Architecture, Data owners, etc.
- Conduct workshops within scrum teams and across business teams, effectively documenting minutes and driving actions

This role offers a comprehensive benefits package, prioritizes wellbeing, supports development, and provides flexibility in work arrangements. Fidelity is committed to ensuring a motivating work environment where employees feel valued and part of a team. Visit careers.fidelityinternational.com to learn more about our work, approach to dynamic working, and opportunities for building a future with us.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

You are a highly motivated and detail-oriented Data Catalog Analyst with expertise in the erwin Data Intelligence Suite (DIS), particularly the erwin Data Catalog module. Your main responsibility will be to build and maintain a centralized metadata repository that enables data discovery, lineage, and governance across the enterprise.

You will configure, implement, and maintain the erwin Data Catalog to support enterprise metadata management. It will be your duty to harvest metadata from various data sources such as databases, ETL tools, and BI platforms, ensuring accuracy and completeness. Your role will involve developing and maintaining data lineage, impact analysis, and data flow documentation. Collaboration with data stewards, business analysts, and IT teams is essential for defining and enforcing metadata standards and governance policies. You will also support the creation and maintenance of business glossaries, technical metadata, and data dictionaries. Ensuring metadata is accessible and well organized to enable data discovery and self-service analytics will be one of your priorities. You are expected to provide training and support to business and technical users on using the erwin platform effectively. Monitoring system performance and troubleshooting issues related to metadata ingestion and cataloging will also be part of your responsibilities.

To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field, along with at least 3 years of experience in data governance, metadata management, or enterprise data architecture. Hands-on experience with erwin DIS, especially erwin Data Catalog and erwin Data Modeler, is required; experience with comparable tools such as Collibra or Atlan can also be considered. A strong understanding of metadata management, data lineage, and data governance frameworks (e.g., DAMA-DMBOK) is necessary. You should be familiar with relational databases, data warehouses, and cloud data platforms (e.g., AWS, Azure, GCP), along with proficiency in SQL and data profiling tools.

Preferred skills for this role include experience with other data governance tools (e.g., Collibra, Informatica, Alation), knowledge of regulatory compliance standards (e.g., GDPR, HIPAA, CCPA), strong communication and stakeholder engagement skills, documentation skills, experience with Agile methodology, and the ability to work independently and manage multiple priorities in a fast-paced environment.
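The metadata harvesting this role describes can be illustrated with a minimal, tool-agnostic sketch. This is not how erwin Data Catalog works internally (it uses its own connectors and scanners); the example below uses Python's built-in `sqlite3` as a stand-in source and reads table and column metadata from the database's system catalog into a simple dictionary, the kind of technical-metadata record a catalog stores.

```python
import sqlite3

def harvest_table_metadata(conn):
    """Collect basic technical metadata (table names, column names,
    types, nullability) from a relational source, in the spirit of a
    data catalog's metadata harvest."""
    catalog = {}
    cur = conn.cursor()
    tables = [row[0] for row in cur.execute(
        "SELECT name FROM sqlite_master WHERE type = 'table'")]
    for table in tables:
        # Each PRAGMA table_info row is (cid, name, type, notnull, default, pk).
        columns = cur.execute(f"PRAGMA table_info({table})").fetchall()
        catalog[table] = [
            {"name": c[1], "type": c[2], "nullable": not c[3]}
            for c in columns
        ]
    return catalog

# Demo source: one table standing in for an enterprise database.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customer ("
    "id INTEGER PRIMARY KEY, name TEXT NOT NULL, region TEXT)")
metadata = harvest_table_metadata(conn)
print(metadata["customer"])
```

In a real deployment the harvested records would be pushed into the catalog tool and enriched with business-glossary terms and lineage, rather than kept in an in-memory dictionary.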

Posted 2 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Faridabad, Haryana

On-site

As a Data Governance Specialist, you will be responsible for developing and executing data governance strategies and roadmaps to ensure the integrity, accuracy, and efficiency of master data across the organization. This includes leading the implementation and enhancement of SAP MDG solutions, such as data modeling, data stewardship, and workflow management. Designing and enforcing data governance policies, procedures, and standards to maintain data quality and consistency will be a crucial part of your role.

You will collaborate closely with cross-functional teams, including business stakeholders, IT teams, and external vendors, to gather requirements, design solutions, and ensure successful project delivery. Managing the integration of SAP MDG with other SAP modules and third-party applications is essential to ensure seamless data flow and consistency. Additionally, you will implement and manage data quality checks, validations, and cleansing processes to uphold high standards of data accuracy and reliability.

Facilitating change management processes, including training and support for end users, is a key aspect of this role to ensure effective adoption of MDG solutions. You will also be responsible for identifying opportunities for process improvements and automation in master data management practices and recommending enhancements to existing systems and processes. Providing advanced support for troubleshooting and resolving complex issues related to SAP MDG and master data management will be part of your responsibilities to ensure smooth operations and functionality.
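The data quality checks and validations this role mentions can be sketched in a simplified, hypothetical form. The field names (`material_id`, `base_unit`) and rules below are illustrative only, not SAP MDG's actual data model or validation framework; in MDG such rules would typically live in the data model, BRFplus rules, or validation BAdIs.

```python
def validate_material_record(record,
                             required_fields=("material_id",
                                              "description",
                                              "base_unit")):
    """Apply simple data-quality rules to one master data record and
    return a list of rule violations (empty list = record passes)."""
    errors = []
    # Completeness rule: every required field must have a non-blank value.
    for field in required_fields:
        value = record.get(field, "")
        if not str(value).strip():
            errors.append(f"missing value for '{field}'")
    # Format rule: material IDs are assumed numeric in this sketch.
    material_id = str(record.get("material_id", ""))
    if material_id and not material_id.isdigit():
        errors.append("material_id must be numeric")
    return errors

clean = {"material_id": "100045", "description": "Steel bolt M6",
         "base_unit": "EA"}
dirty = {"material_id": "ABC", "description": "  ", "base_unit": "EA"}
print(validate_material_record(clean))  # []
print(validate_material_record(dirty))
```

Running checks like these against incoming records before they reach downstream systems is the essence of the "data quality checks, validations, and cleansing" responsibility, whatever tool enforces them.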

Posted 2 weeks ago

Apply