
2326 Data Governance Jobs - Page 33

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 11.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Build Your Career at Informatica

We seek innovative thinkers who believe in the power of data to drive meaningful change. At Informatica, we welcome adventurous, work-from-anywhere minds eager to tackle the world's most complex challenges. Our employees are empowered to push their bold ideas forward, and we are united by a shared passion for using data to do the extraordinary for each other and the world.

Senior Solution Architect - Presales (Remote)

We're looking for a senior solution architect with experience in presales, data integration, and MDM to join our team remotely. You will report to the Director, Technical Sales.

Technology You'll Use: Presales, Data Integration, and MDM

Your Role Responsibilities: Here's What You'll Do
- Basic knowledge of the top 3 cloud ecosystems and the top 2 data-related technologies
- Basic knowledge of cloud computing security
- Basic certification on at least 1 cloud ecosystem and 1 data-related cloud technology, at the level defined by the business area of focus
- Skills on at least one Informatica (INFA) software platform/technology, storytelling, and experience establishing communication and engagement with prospects around specific use cases
- Ability to engage and build relationships with influencers, coaches, decision makers, and partners
- Basic technical knowledge of hybrid deployment of software solutions, data warehousing, database, and/or business intelligence software concepts and products

What We'd Like to See
- Manage customer engagements without support.
- Responsible for sharing best practices, content, and tips and tricks within the primary area of responsibility.
- Stay current on certifications of services required for the area of responsibility.
- Perform all activities leading up to the delivery of a customer demo with some assistance, including discovery, technical qualification/fit, customer presentations, standard demos, and related customer-facing communication.
- Assist on RFP responses and/or POCs.
- Partner with the CSM team on nurture activities, including technical advisory, workshops, etc.
- Provide customer feedback on product gaps using Vivun.
- Ability to support demos at marketing events without support.

Role Essentials
- 6+ years of relevant experience in data integration, master data management, or data governance
- 8+ years of presales/technical sales, industry, or consulting experience
- BA/BS or equivalent educational background is preferred

Perks & Benefits
- Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and a 401k plan or international pension/retirement plans)
- Flexible time-off policy and hybrid working practices
- Equity opportunities and an employee stock purchase program (ESPP)
- Comprehensive Mental Health and Employee Assistance Program (EAP) benefit

Our DATA values are our north star, and we are passionate about building and delivering solutions that accelerate data innovation. At Informatica, our employees are our greatest competitive advantage. So, if your experience aligns but doesn't exactly match every qualification, apply anyway. You may be exactly who we need to fuel our future with innovative ideas and a thriving culture. Informatica (NYSE: INFA), a leader in enterprise AI-powered cloud data management, brings data and AI to life by empowering businesses to realize the transformative power of their most critical assets.
We pioneered the Informatica Intelligent Data Management Cloud, which manages data across any multi-cloud, hybrid system, democratizing data to advance business strategies. Customers in approximately 100 countries and more than 80 of the Fortune 100 rely on Informatica. www.informatica.com. Connect with us on LinkedIn, X, and Facebook. Informatica. Where data and AI come to life.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

20 - 27 Lacs

Bengaluru

Work from Office

Build Your Career at Informatica

We're looking for a diverse group of collaborators who believe data has the power to improve society. Adventurous minds who value solving some of the world's most challenging problems. Here, employees are encouraged to push their boldest ideas forward, united by a passion to create a world where data improves the quality of life for people and businesses everywhere.

Principal Advisory Services Consultant

Informatica is looking for a Principal Consultant, Advisory Services, with practitioner experience leading large-scale data management and analytics projects. This is a remote position reporting to a Senior Director, Data Strategy & Governance. You have experience implementing data governance programs and defining vision and data strategy with peers and senior leadership to gain support for the strategy and the overall value of Informatica products and solutions. You will join our Professional Services team and provide pre- and post-sale, business-oriented strategic consulting services, typically onsite at the customer's location. Responsibilities include providing clients with overall data strategy development and alignment, implementation guidance, program design, business use case identification, program road mapping, and business outcome definition.

Essential Duties & Responsibilities
- Analyzes complex customer environments comprised of Informatica and non-Informatica products.
- Organizes large-scale programs and coordinates/leads multiple delivery teams.
- Applies innovative design solutions by keeping current on new technology trends and changing industry standards and patterns.
- Travel to customer sites typically exceeds 50%, and may exceed 75% for extended periods, as applicable to the customer engagement.

Knowledge & Skills
- Holds expert-level experience and uses professional concepts and company objectives to resolve complex issues in creative and effective ways.
- Works on complex issues where analysis of situations or data requires an in-depth evaluation of variable factors.
- Exercises judgment in methods, techniques, and evaluation criteria for obtaining results.
- Extensively leverages business acumen and subject matter expertise to provide expert-level advice and guidance to clients.
- Thorough understanding of Informatica business priorities, strategy, and direction.
- Works across the organization and maintains/builds strong working relationships based on experiences and past interactions.
- Significant experience leading the delivery of complex enterprise data management projects/initiatives.
- Competent in navigating, using, and demonstrating functionality in Informatica's business-facing applications.
- Published industry white papers, best practices, field guides, and external communications.
- Strong written communication skills, with competency in developing professional-looking presentation materials and customer deliverables.
- Developed ability in communicating to executive-level audiences in both interpersonal and presentation formats.

Education/Experience
- BA/BS or equivalent educational background is preferred.
- Minimum 10+ years of relevant professional experience.
Perks & Benefits
- Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and a 401k plan or international pension/retirement plans)
- Flexible time-off policy and hybrid working practices
- Equity opportunities and an employee stock purchase program (ESPP)
- Comprehensive Mental Health and Employee Assistance Program (EAP) benefit

Our DATA values are our north star, and we are passionate about building and delivering solutions that accelerate data innovation. At Informatica, our employees are our greatest competitive advantage. So, if your experience aligns but doesn't exactly match every qualification, apply anyway. You may be exactly who we need to fuel our future with innovative ideas and a thriving culture. Informatica (NYSE: INFA), an Enterprise Cloud Data Management leader, brings data and AI to life by empowering businesses to realize the transformative power of their most critical assets. We pioneered the Informatica Intelligent Data Management Cloud, which manages data across any multi-cloud, hybrid system, democratizing data to advance business strategies. Customers in over 100 countries and 85 of the Fortune 100 rely on Informatica. www.informatica.com. Connect with us on LinkedIn, Twitter, and Facebook. Informatica. Where data and AI come to life.

Posted 2 weeks ago

Apply

6.0 - 7.0 years

13 - 15 Lacs

Bengaluru

Work from Office

Job Title: Software Developer
Location: TechM Blr ITC06 07
Years of Experience: 2-5 Years

Job Summary:
We are seeking a skilled Software Developer with a strong background in SAP Archiving to join our dynamic team. The ideal candidate will have 2-5 years of experience in software development, with a focus on SAP solutions. You will be responsible for designing, developing, and implementing software applications that meet our business needs while ensuring data integrity and compliance through effective archiving strategies.

Responsibilities:
- Design, develop, and maintain software applications in accordance with business requirements.
- Implement and manage SAP Archiving solutions to optimize data storage and retrieval processes.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct code reviews and ensure adherence to best practices in software development.
- Perform testing and debugging of applications to ensure high-quality deliverables.
- Provide technical support and troubleshooting for existing applications.
- Stay updated with the latest industry trends and technologies related to SAP and software development.

Mandatory Skills:
- Strong knowledge and experience in SAP Archiving

Posted 2 weeks ago

Apply

5.0 - 10.0 years

10 - 11 Lacs

Pune

Work from Office

Job Overview:
We are seeking a skilled Data Solution Architect to design solutions and lead implementations on GCP. The ideal candidate will possess extensive experience in data architecting, solution design, and data management practices.

Responsibilities:
- Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms.
- Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions.
- Develop high-level and detailed data architecture and design documentation.
- Implement data management and data governance strategies, ensuring compliance with industry standards.
- Architect both batch and real-time data solutions, leveraging cloud-native services and technologies.
- Design and manage data pipeline processes for historic data migration and data integration.
- Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables.
- Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies.
- Communicate complex ideas and concepts effectively, both verbally and in writing.
- Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.

Requirements:
- Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments.
- Experience with common GCP services such as BigQuery, Dataflow, GCS, service accounts, and Cloud Functions.
- Extremely strong in BigQuery design and development.
- Extensive knowledge and implementation experience in data management, governance, and security frameworks.
- Proven experience in creating high-level and detailed data architecture and design documentation.
- Strong aptitude for business analysis to understand domain data requirements.
- Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred.
- Hands-on experience with architecting end-to-end data solutions for both batch and real-time designs.
- Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
- Familiarity with Data Fabric and Data Mesh architecture is a plus.
- Excellent verbal and written communication skills.
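The batch-pipeline and data-quality duties described in this role can be sketched in a few lines. This is an illustrative, pure-Python sketch with hypothetical field names; in a real GCP pipeline the loadable rows would be handed to the BigQuery client rather than returned from a function.

```python
# Illustrative batch-pipeline step: validate records before loading them into
# a warehouse table. All names here are hypothetical assumptions; a real GCP
# pipeline would pass the valid rows to the BigQuery client for loading.

def validate(record: dict) -> bool:
    """A record is loadable if it has a non-empty id and a numeric amount."""
    return bool(record.get("id")) and isinstance(record.get("amount"), (int, float))

def run_batch(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a batch into loadable rows and rejects (sent to a review queue)."""
    loadable = [r for r in records if validate(r)]
    rejects = [r for r in records if not validate(r)]
    return loadable, rejects

batch = [
    {"id": "a1", "amount": 10.5},
    {"id": "", "amount": 3.0},       # rejected: empty id
    {"id": "a2", "amount": "oops"},  # rejected: non-numeric amount
]
good, bad = run_batch(batch)
print(len(good), len(bad))  # 1 2
```

Separating validation from loading keeps rejects auditable, which is the usual governance expectation for historic data migrations.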

Posted 2 weeks ago

Apply

8.0 - 10.0 years

15 - 19 Lacs

Mumbai

Work from Office

Job Description

Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It Uniquely Yours.

The Senior Manager, Finance Data Governance is a critical role responsible for leading and executing the finance master data governance strategy. This role will drive the implementation of data governance policies, standards, and processes to ensure data quality, integrity, and security. The Senior Manager will collaborate with business stakeholders, IT teams, and data owners to establish a data-driven culture and enable effective use of data for business decision-making.

How you will contribute
- Strategy and Leadership: Contribute to the development and execution of the overall data governance strategy, aligning with business objectives and regulatory requirements. Promote data governance awareness and adoption throughout the organization.
- Policy and Standards: Develop and maintain data governance policies, standards, and procedures, ensuring alignment with industry best practices and regulatory guidelines. Define data quality metrics and monitor data quality performance. Establish data ownership and stewardship responsibilities.
- Implementation and Execution: Lead the implementation of data governance tools and technologies. Work with business units to identify and prioritize data governance initiatives. Ensure data lineage is documented and maintained.
- Collaboration and Communication: Partner with business stakeholders to understand data needs and requirements. Collaborate with IT teams to ensure data governance requirements are integrated into system development and maintenance processes. Communicate data governance policies and procedures to the organization. Facilitate data governance council meetings and working groups.
- Data Quality Management: Establish data quality rules and monitor data quality metrics. Identify and resolve data quality issues. Implement data quality improvement initiatives.
- Compliance and Security: Ensure data governance policies and procedures comply with relevant regulations, such as GDPR, CCPA, and other data privacy laws. Implement data security measures to protect sensitive data. Monitor and audit data governance activities to ensure compliance.

What you will bring
A desire to drive your future and accelerate your career, plus the following experience and knowledge:

Qualifications
- Education: Bachelor's degree in a relevant field (e.g., Finance, Business Administration, Data & Analytics). Master's degree preferred.
- Experience: Minimum of 8-10 years of experience in data governance, data management, or related fields. Proven experience in leading and implementing data governance initiatives. Strong understanding of data governance principles, practices, and technologies. Experience with data quality management tools and techniques.
- Skills: Excellent leadership and communication skills. Strong analytical and problem-solving skills. Ability to work effectively with cross-functional teams. Proficiency in data governance tools and technologies (e.g., Collibra, Informatica, Alation). Knowledge of data warehousing and business intelligence concepts. Strong project management skills.

Key Competencies
- Strategic Thinking: Ability to develop and execute a data governance strategy aligned with business objectives.
- Communication: Ability to communicate complex data governance concepts to both technical and non-technical audiences.
- Collaboration: Ability to work effectively with cross-functional teams.
- Problem Solving: Ability to identify and resolve data governance issues.
- Technical Proficiency: Strong understanding of data governance tools and technologies.
- Results Orientation: Ability to drive data governance initiatives to achieve measurable results.

More about this role
- Education / Certifications: Bachelor's degree in a relevant field (e.g., Finance, Business Administration, Data & Analytics). Master's degree preferred.
- Job specific requirements: Minimum of 8-10 years of experience in data governance, data management, or related fields. Proven experience in leading and implementing data governance initiatives. Strong understanding of data governance principles, practices, and technologies. Experience with data quality management tools and techniques.
- Travel requirements: Occasional
- Work schedule: Flexible
- Relocation Support Available? No

Business Unit Summary
We value our talented employees, and whenever possible strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think the open position you see is right for you, we encourage you to apply! Our people make all the difference in our success.

IF YOU REQUIRE SUPPORT TO COMPLETE YOUR APPLICATION OR DURING THE INTERVIEW PROCESS, PLEASE CONTACT THE RECRUITER.

Job Type: Regular
Project and Program Management
Business Capability

Posted 2 weeks ago

Apply

7.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Experience: 7-10 years

Job Description:
We are looking for an experienced SAP Master Data Management (MDM) Consultant with expertise in ECC, S/4HANA migration, rollouts, and data management. The ideal candidate will lead and execute MDM strategies, manage data migration, and drive continuous improvements.

Key Responsibilities:
- Own and manage Master Data Management (MDM) activities for SAP projects.
- De-duplicate master records.
- Lead data migration and cutovers in SAP S/4HANA projects (greenfield, migration, or rollouts).
- Establish and implement MDM best practices and data management capabilities.
- Define data management principles, policies, and lifecycle strategies.
- Monitor data quality with consistent metrics and reporting.
- Work with MDM stakeholders to drive data governance and compliance.
- Track and manage MDM objects, ensuring timely delivery.
- Conduct training sessions for teams on ECC and S/4HANA MDM.
- Participate in daily stand-ups, issue tracking, and dashboard updates.
- Identify risks and process improvements for MDM.

Required Skills & Qualifications:
- Minimum 7-10 years of experience in SAP MDM.
- Strong knowledge of ECC, SAP S/4HANA, data migration, and rollouts.
- Experience in data governance, lifecycle management, and compliance.
- Familiarity with JIRA Kanban boards, ticketing tools, and dashboards.
- Strong problem-solving and communication skills.
- Ability to work with cross-functional teams, especially ABAP, middleware, and functional consultants.
- Knowledge of Excel is a must; ABAP knowledge is preferable.
- SAP training or certifications are an asset.
- Team player with strong communication skills and a collaborative spirit.
- Able to coach, support, train, and develop junior consultants.
- Customer-oriented, results-driven, and focused on delivering quality.
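The de-duplication responsibility above is usually a fuzzy-matching pass over master records. A minimal sketch, assuming hypothetical record fields and an arbitrary 0.85 similarity threshold; real SAP MDM de-duplication would run against material or customer masters with tool-specific match rules.

```python
# Illustrative master-data de-duplication pass using fuzzy name matching.
# The "name"/"id" fields and the 0.85 threshold are assumptions for the sketch.
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Case-insensitive similarity ratio between two names (0.0 to 1.0)."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def find_duplicates(masters: list[dict], threshold: float = 0.85) -> list[tuple[str, str]]:
    """Return pairs of record ids whose names look like the same master."""
    pairs = []
    for i, left in enumerate(masters):
        for right in masters[i + 1:]:
            if similarity(left["name"], right["name"]) >= threshold:
                pairs.append((left["id"], right["id"]))
    return pairs

records = [
    {"id": "C001", "name": "Acme Industries Ltd"},
    {"id": "C002", "name": "ACME Industries Limited"},
    {"id": "C003", "name": "Globex Corporation"},
]
print(find_duplicates(records))  # [('C001', 'C002')]
```

Flagged pairs would then go to a data steward for merge-or-keep decisions rather than being merged automatically.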

Posted 2 weeks ago

Apply

2.0 - 7.0 years

7 - 12 Lacs

Chennai

Work from Office

Role Purpose
The purpose of this role is to execute the process and drive the performance of the team on the key metrics of the process.

Job Details
- Country/Region: India
- Employment Type: Onsite
- Work Type: Contract
- State: Tamil Nadu
- City: Chennai

Requirements
- Onsite at Abu Dhabi; contract for 2 years.
- Shift: Abu Dhabi general shift timings.
- Someone who can travel onsite ASAP, or within a maximum of 40 days, to start working on this role.

Job Description
Minimum overall work experience: 10 years

Financial Systems Support (L1/L2)
- Provide financial systems support to all end users across ADD functions relating to financial systems.
- Coordinate with the Finance HQ IT support team to resolve reported issues through RITM/Incident/Idea tickets.
- Endorse and approve financial roles access and authorization tickets.

Month End Closing (MEC) Support
- Provide support during transactional data processing, preparing uploads, data validation, and reconciliation; identify incorrect master data assignments and suggest corrective actions.
- Closely coordinate with financial users on data reconciliation and validation during month-end closing activities.
- Coordinate closely with HUB and BI teams to update SAP data so that segmented finance reports stay aligned with SAP data.
- Validate BI financial reports and dashboards.
- Master data governance and policy and procedure compliance.
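The month-end reconciliation duty above boils down to comparing totals from two systems and flagging variances. A minimal sketch, assuming hypothetical segment names and tolerance; real MEC reconciliation would pull SAP and BI totals from their respective reports.

```python
# Illustrative month-end reconciliation: compare ledger totals from SAP
# against BI report totals per segment and flag variances. Segment names
# and the tolerance value are assumptions for the sketch.

def reconcile(sap: dict[str, float], bi: dict[str, float], tol: float = 0.01) -> dict[str, float]:
    """Return segments whose SAP and BI totals differ by more than tol."""
    mismatches = {}
    for segment in sap.keys() | bi.keys():
        diff = sap.get(segment, 0.0) - bi.get(segment, 0.0)
        if abs(diff) > tol:
            mismatches[segment] = round(diff, 2)
    return mismatches

sap_totals = {"Retail": 120_500.00, "Wholesale": 98_000.00}
bi_totals  = {"Retail": 120_500.00, "Wholesale": 97_250.00}
print(reconcile(sap_totals, bi_totals))  # {'Wholesale': 750.0}
```

Segments that appear in only one system fall out as mismatches too, which is often the first symptom of an incorrect master data assignment.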

Posted 2 weeks ago

Apply

4.0 - 6.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Senior Databricks Engineer / Tech Lead
Location: Bangalore | Industry: Technology

About the Role:
As part of our Innovation Team, we are seeking a certified Senior Databricks Engineer / Tech Lead with 7-8 years of hands-on experience in building scalable data platforms. This role will focus on designing, building, and operationalizing data solutions on the Databricks platform to accelerate advanced analytics and AI use cases.

Key Responsibilities:
- Architect, develop, productionize, and maintain end-to-end solutions in Databricks.
- Implement and optimize ETL/ELT processes for structured and semi-structured data.
- Leverage Delta Lake for ACID transactions, data versioning, and time-travel features.
- Drive adoption of the Lakehouse architecture to unify data warehousing and AI/ML workloads.
- Implement CI/CD pipelines using Databricks Repos, Asset Bundles, and integration with DevOps tools.
- Configure and enforce Unity Catalog for secure, governed access to data assets.
- Design and implement data quality and validation frameworks to ensure trusted data.
- Lead performance tuning and optimization efforts for Spark jobs and queries.
- Integrate with external systems such as Kafka, Event Hub, and REST APIs for real-time and batch processing.
- Collaborate with data scientists and business stakeholders to build feature-rich datasets and reusable assets.
- Troubleshoot and debug complex data workflows in development and production environments.
- Guide junior engineers and contribute to best practices in data engineering and platform usage.
- Ensure platform security, access controls, and compliance with enterprise data governance standards.

Required Skills:
- Expertise in Apache Spark and the Databricks platform.
- Experience with the Databricks Lakehouse architecture and Delta Lake concepts.
- Proficient in PySpark, SQL, and Delta Lake.
- Strong knowledge of data engineering concepts.
- Experience with data ingestion and ETL/ELT pipelines.
- Familiarity with Unity Catalog and data governance.
- Hands-on with Databricks Notebooks and Jobs.
- CI/CD automation with Databricks Repos, DevOps, and Asset Bundles, including Asset Bundle implementation knowledge.
- Strong understanding of performance tuning in Spark.
- Data quality and validation framework implementation.
- Experience in handling structured and semi-structured data.
- Proficient in debugging and troubleshooting.
- Collaboration with data scientists and analysts.
- Good understanding of security and access control.
- Experience with Mosaic AI or Databricks ML capabilities.
- Exposure to streaming pipelines using Structured Streaming.
- Familiarity with data observability and lineage tools.
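The upsert (MERGE) semantics a Delta Lake pipeline relies on can be illustrated without a Spark cluster. This is a plain-Python sketch of the idea only; in Databricks the same match-update-insert logic is expressed with a Delta table's `merge` operation.

```python
# Plain-Python illustration of upsert (MERGE) semantics: match rows on a key,
# update the matched rows, insert the unmatched ones. This mimics on dicts
# what a Delta Lake MERGE does transactionally on tables.

def upsert(target: dict[str, dict], updates: list[dict], key: str = "id") -> dict[str, dict]:
    """Apply updates to target keyed by `key`; returns a new merged mapping."""
    merged = dict(target)
    for row in updates:
        merged[row[key]] = row  # update when matched, insert when not
    return merged

target = {"1": {"id": "1", "qty": 5}}
updates = [{"id": "1", "qty": 7}, {"id": "2", "qty": 3}]
print(upsert(target, updates))
# {'1': {'id': '1', 'qty': 7}, '2': {'id': '2', 'qty': 3}}
```

What Delta Lake adds over this sketch is that the merge is ACID: readers never see a half-applied batch, and each merge produces a new table version that time travel can query.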

Posted 2 weeks ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Pune

Hybrid

Dear Candidate,

This is with reference to Senior Business Intelligence Analyst openings at Wolters Kluwer, Pune. Kindly share your resume at jyoti.salvi@wolterskluwer.com.

Job Specifications:
- Skillset requirement: Looking for Data Governance professionals with experience in Collibra. Experience in Microsoft Purview is highly preferred.
- Experience range: Candidates with 2 to 8 years of relevant Data Governance experience.
- Primary skills: Data Governance, Microsoft Purview, Collibra.

Responsibilities:
- Architect, design, and implement data governance solutions using Microsoft Purview.
- Experience in data lifecycle management, including data retention, deletion, and archiving strategies using Microsoft Purview Data Lifecycle Management.
- Assist transitions to Microsoft Purview services, including setting up data lifecycle management and eDiscovery configurations.
- Maintain accurate documentation of configurations, processes, and procedures related to Microsoft Purview.
- Experience in implementing data governance policies and procedures to ensure compliance with regulatory requirements and organizational standards.
- Ensure data quality and compliance by applying expertise in MDM and data governance principles, including data governance frameworks and practices, to ensure the relevancy, quality, security, and compliance of master data.
- Develop and implement data integration solutions for metadata, data lineage, and data quality.

Posted 2 weeks ago

Apply

9.0 - 14.0 years

11 - 16 Lacs

Bengaluru

Work from Office

Responsibilities:
- Design and implement custom workflows using Collibra Workflow Designer (BPMN).
- Collaborate with data governance teams to translate business requirements into technical solutions.
- Develop and maintain Collibra integrations using APIs, Collibra Connect, and third-party tools.
- Configure and manage Collibra Data Intelligence Cloud components, including domains, assets, and communities.
- Support metadata management, data lineage, and data catalog initiatives.
- Troubleshoot and resolve workflow issues and performance bottlenecks.
- Ensure compliance with data governance policies and standards.
- Document workflow logic, configurations, and integration processes.

Required Skills & Qualifications:
- 5+ years of experience in data governance or metadata management.
- 2+ years of hands-on experience with the Collibra platform, especially workflow development.
- Proficiency in Java and Groovy for scripting and workflow customization.
- Experience with Collibra Connect, REST APIs, and integration with tools like Informatica, Snowflake, or Azure.
- Familiarity with BPMN 2.0 and workflow lifecycle management.
- Strong understanding of data governance frameworks (e.g., DAMA-DMBOK).
- Excellent problem-solving and communication skills.

Mandatory skills: Data Governance
Desired skills: Collibra
Domain: Foods and Beverages
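The REST API integration work mentioned above typically means creating or updating catalog assets programmatically. A hedged sketch: the `/rest/2.0/assets` path and payload fields follow Collibra's public REST API as I understand it, but treat the exact shape as an assumption; the request is built without being sent, so it can be inspected offline.

```python
# Illustrative sketch of preparing a Collibra REST API call to create an
# asset. The endpoint path and payload field names are assumptions based on
# Collibra's public REST API; the instance URL and ids are hypothetical.
import json
from urllib import request

def build_create_asset_request(base_url: str, name: str, domain_id: str, type_id: str):
    """Build (but do not send) a POST request that would create one asset."""
    payload = {"name": name, "domainId": domain_id, "typeId": type_id}
    return request.Request(
        url=f"{base_url}/rest/2.0/assets",
        data=json.dumps(payload).encode(),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

req = build_create_asset_request(
    "https://example.collibra.com",  # hypothetical instance URL
    "Customer Email", "domain-123", "type-456",
)
print(req.get_method(), req.full_url)
```

Keeping request construction separate from sending makes the integration unit-testable without a live Collibra instance, which matters when workflow changes are promoted through environments.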

Posted 2 weeks ago

Apply

12.0 - 15.0 years

45 - 50 Lacs

Mumbai

Work from Office

This is an exciting time at TransUnion CIBIL. With investments in our people, technology, and new business markets, we are redefining the role and purpose of a credit bureau. This role involves overseeing and managing priority sector lending and financial inclusion data acquisition initiatives, ensuring compliance with regulatory requirements while driving growth and impact.

What you'll bring:

Data Acquisition Strategy & Execution
- Execute the functional strategy to drive customer engagement on data acquisition across all associated member institutions.
- Identify, explore, and detail opportunities to solve critical data submission issues for clients.
- Understand business initiatives and their purpose to drive and channel discussions with diverse teams in distributed work environments.

Stakeholder Management & Collaboration
- Maintain key customer relationships and develop and implement data-related strategies with key decision makers.
- Provide regular inputs to the product teams on data reporting, any changes in reporting, and best practices in the market for a smooth as well as prompt response.
- Collaborate with multiple business stakeholders (Sales, Operations, and Products) to identify priorities and metrics, and track progress on identified data acquisition initiatives.

Reporting & Insights Generation
- Draw meaningful conclusions and recommendations based on data analysis results for effective member engagement.
- Take complete ownership of data directives to achieve assigned tasks, from planning and analysis through to providing the business insights that enable rational decision-making.

Team Leadership & Management
- Build and lead a high-performing data acquisition team, including data analysts and data acquisition managers.
- Set clear KPIs and performance benchmarks for data acquisition teams on data enhancement and reporting.
- Provide specialized training and capacity-building programs for data acquisition team members on MFI data reporting best practices and compliance.

Regulatory Compliance & Data Governance
- Ensure complete, accurate, and timely reporting of data, and comply with the relevant regulatory guidelines.
- Establish a governance framework for data ingestion, data validation, and standardization.
- Monitor adherence to regulatory standards and data reporting practices.
- Liaise with legal and compliance teams to stay updated on policy changes affecting data acquisition.

Experience and Skills
- Master's degree in Agriculture, Rural Business Administration, or a related field.
- Minimum 12+ years of relevant experience in managing priority sector lending or financial inclusion.
- Flexibility to travel as needed.
- Self-starter; ability to work independently, handle ambiguous situations, and exercise judgment in a variety of situations.
- Strong communication and organizational skills, both verbal and written.
- High degree of responsibility and ownership; strong multitasking and coordination; tenaciously looking for ways to get results.

This job is assigned as On-Site Essential and requires in-person work at an assigned TU office location as a condition of employment.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Responsibilities:
- Define and implement data quality rules, scorecards, and issue management workflows.
- Profile datasets to identify and resolve data quality issues.
- Monitor and report on data quality metrics and KPIs.
- Support data stewardship activities and provide training to business users.
- Implement and configure the Collibra Data Intelligence Platform for metadata management, data cataloging, and governance workflows.
- Collaborate with business and IT stakeholders to define and enforce data governance policies and standards.

Required Skills & Qualifications:
- 3+ years of experience in data governance and data quality.
- Strong understanding of data governance frameworks (e.g., DAMA-DMBOK).
- Experience with SQL.
- Familiarity with data privacy regulations (e.g., GDPR, CCPA).
- Excellent communication and stakeholder management skills.

Mandatory skills: Data Quality
Desired skills: Collibra
Domain: Foods and Beverages
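The rules-and-scorecards responsibility above has a simple core: each rule is a predicate over a row, and the scorecard is the pass rate per rule. A minimal sketch with hypothetical rule names and sample rows; a platform like Collibra attaches these scores to cataloged assets and routes failures into issue workflows.

```python
# Illustrative data-quality scorecard: each rule checks one field-level
# condition, and the scorecard reports the pass rate per rule. The rule
# names and sample rows are assumptions for the sketch.

RULES = {
    "customer_id_present": lambda row: bool(row.get("customer_id")),
    "email_has_at_sign": lambda row: "@" in row.get("email", ""),
}

def scorecard(rows: list[dict]) -> dict[str, float]:
    """Percentage of rows passing each rule."""
    return {
        rule: round(100.0 * sum(check(r) for r in rows) / len(rows), 1)
        for rule, check in RULES.items()
    }

rows = [
    {"customer_id": "C1", "email": "a@x.com"},
    {"customer_id": "",   "email": "b@x.com"},
    {"customer_id": "C3", "email": "no-email"},
]
print(scorecard(rows))  # {'customer_id_present': 66.7, 'email_has_at_sign': 66.7}
```

Expressing rules as named predicates keeps the scorecard extensible: stewards add a rule, and every profiled dataset picks it up on the next run.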

Posted 2 weeks ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Bengaluru

Work from Office

Project description We are looking for an experienced Finance Data Hub Platform Product Manager to own the strategic direction, development, and management of the core data platform that underpins our Finance Data Hub. This role is focused on ensuring the platform is scalable, reliable, secure, and optimized to support data ingestion, transformation, and access across the finance organisation. As the Platform Product Manager, you will work closely with engineering, architecture, governance, and infrastructure teams to define the technical roadmap, prioritize platform enhancements, and ensure seamless integration with data and UI product streams. Your focus will be on enabling data products and services by ensuring the platform's core capabilities meet evolving business needs. Responsibilities Key Responsibilities Platform Strategy & VisionDefine and own the roadmap for the Finance Data Hub platform, ensuring it aligns with business objectives and supports broader data product initiatives. Technical Collaborate with architects, data engineers, and governance teams to define and prioritise platform capabilities, including scalability, security, resilience, and data lineage. Integration ManagementEnsure the platform seamlessly integrates with data streams and serves UI products, enabling efficient data ingestion, transformation, storage, and consumption. Infrastructure CoordinationWork closely with infrastructure and DevOps teams to ensure platform performance, cost optimisation, and alignment with enterprise architecture standards. Governance & CompliancePartner with data governance and security teams to ensure the platform adheres to data management standards, privacy regulations, and security protocols. Backlog ManagementOwn and prioritise the platform development backlog, balancing technical needs with business priorities, and ensuring timely delivery of enhancements. 
Agile Leadership: Support and often lead agile ceremonies, write clear user stories focused on platform capabilities, and facilitate collaborative sessions with technical teams. Stakeholder Communication: Provide clear updates on platform progress, challenges, and dependencies to stakeholders, ensuring alignment across product and engineering teams. Continuous Improvement: Regularly assess platform performance, identify areas for optimization, and champion initiatives that enhance reliability, scalability, and efficiency. Risk Management: Identify and mitigate risks related to platform stability, security, and data integrity. Skills (Must have): 10+ years of proven experience as a Product Manager focused on data platforms, infrastructure, or similar technical products. Strong understanding of data platforms and infrastructure, including data ingestion, processing, storage, and access within modern data ecosystems. Experience with cloud data platforms (e.g., Azure, AWS, GCP) and knowledge of data lake architectures. Understanding of data governance, security, and compliance best practices. Strong stakeholder management skills, particularly with technical teams (engineering, architecture, security). Experience managing product backlogs and roadmaps in an Agile environment. Ability to balance technical depth with business acumen to drive effective decision-making. Nice to have: Experience with financial systems and data sources, such as HFM, Fusion, or other ERPs. Knowledge of data orchestration and integration tools (e.g., Apache Airflow, Azure Data Factory). Experience with transitioning platforms from legacy technologies (e.g., Teradata) to modern solutions. Familiarity with cost optimization strategies for cloud platforms.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Bengaluru

Work from Office

Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management, and associated technologies. Communicate risks and ensure understanding of these risks. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Graduate with a minimum of 6+ years of related experience. Experience in modelling and business system designs. Good hands-on experience with DataStage and cloud-based ETL services. Strong expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred technical and professional experience: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting at all levels of the organization. Ability to communicate complex business problems and technical solutions.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

10 - 14 Lacs

Mumbai

Work from Office

As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Experience in the integration efforts between Alation and Manta, ensuring seamless data flow and compatibility. Collaborate with cross-functional teams to gather requirements and design solutions that leverage both Alation and Manta platforms effectively. Develop and maintain data governance processes and standards within Alation, leveraging Manta's data lineage capabilities.
Analyze data lineage and metadata to provide insights into data quality, compliance, and usage patterns. Preferred technical and professional experience: Lead the evaluation and implementation of new features and updates for both Alation and Manta platforms, ensuring alignment with organizational goals and objectives. Drive continuous improvement initiatives to enhance the efficiency and effectiveness of data management processes, leveraging Alation.

Posted 2 weeks ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Mumbai

Work from Office

KEY STAKEHOLDERS INTERNAL: Business Supply Chain team, Business Development team, Finance, Accounts Payable, Accounts Receivable. KEY STAKEHOLDERS EXTERNAL: Customers, vendors. REPORTING STRUCTURE: Will report to Manager - GBSS Master Data Management. Team size: Individual Contributor role. EXPERIENCE: 3 to 5 years of experience in a Master Data Management governance role. Good communication skills, both written and verbal, a must. Attention to detail and ability to work with minimal supervision. Experience in master data cleaning and policy compliance in Customer and Material Master. Strong background in the pharmaceutical/FMCG/manufacturing industry. Strong understanding of SAP systems. Strong analytical and problem-solving skills. Ability to work well with people at various levels and with different cultural backgrounds. CRITICAL QUALITIES: Ability to handle work pressure and SLA commitments. Thorough understanding of the SAP master data transactions related to customer/material/vendor master. Understanding of the various fields maintained in customer masters and their impact on downstream processes. Willingness to work in a shared service function. Flexibility to work in night shift (if required). Key Role and Responsibilities: Responsible for creation/change/review of various masters in SAP such as Customer Master, Material/SKU Masters, Vendor Masters, BOM, Prices, etc. Perform business policy and data accuracy checks on requests received from requestors for master data creation/change. Maintain SLA adherence (99% of requests to be completed in 1 working day) with 100% accuracy. Data governance to ensure that masters are created complying with the statutory and legal framework. Educate stakeholders on required policy compliance. Maintain good master data hygiene to ensure there are no duplicate master records, incomplete masters, or incorrect/invalid information.
Understand customer priorities and deliver master data requirements accordingly. Continue the development of data standards, processes, and procedures for various master data domains. Identify and manage projects to improve data standardization, accuracy, and integrity. Support SAP implementation projects (for SAP S/4HANA during UAT) by representing the MDM function. Support project teams during data cleansing and migration activities. Manage data quality exception reports. Work with the SAP IT team or RPA team to develop or improve automation levels as well as system-based controls in the Master Data Management process. Work collaboratively with other functions (e.g. Quality, Regulatory, Finance, Sales, BD, IT, manufacturing sites, etc.) to ensure master data and other related issues are resolved and objectives are met. Generate MIS reports related to Master Data Management from time to time. QUALIFICATION: Bachelor's degree with an MBA from a reputed University/Institute. An orientation course in the SAP MM/SD module will be an added advantage.
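The master data hygiene duty above (no duplicate master records) is often approached by normalizing records into a match key and grouping on it. The sketch below is illustrative only: real MDM tooling in SAP or dedicated platforms uses far richer fuzzy matching, and the field names and sample records here are invented.

```python
# Sketch of duplicate detection in customer master data: normalize each
# record to a simple match key (name + city, purely illustrative) and
# group records that collide, so near-duplicates surface for steward review.
from collections import defaultdict

def match_key(record):
    """Build a normalized key; production MDM uses fuzzy/phonetic matching."""
    name = "".join(record["name"].lower().split())
    city = record.get("city", "").strip().lower()
    return (name, city)

def find_duplicates(records):
    """Return groups of record ids that share the same match key."""
    groups = defaultdict(list)
    for rec in records:
        groups[match_key(rec)].append(rec["id"])
    return [ids for ids in groups.values() if len(ids) > 1]

masters = [
    {"id": "C001", "name": "Acme Pharma", "city": "Mumbai"},
    {"id": "C002", "name": "ACME  PHARMA", "city": "mumbai "},
    {"id": "C003", "name": "Zenith Labs", "city": "Pune"},
]
dupes = find_duplicates(masters)
```

Flagged groups like these would then feed the request review and cleansing steps the posting describes, rather than being merged automatically.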

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Noida

Work from Office

We're Hiring | Power BI + Power Platform Developer | 3-7 Yrs | Noida | Hybrid. Location: Noida (Hybrid, 3 days/week in office). Experience: 3 to 7 years. Joiners: Immediate to max 2 weeks' notice period ONLY. Shift: 2:00 PM to 10:30 PM IST (cab provided). Key Responsibilities: Analyze requirements and develop interactive Power BI dashboards. Work with complex data models; integrate data from multiple sources (SQL, Oracle, Excel, Web, etc.). Write and optimize SQL queries for high-performance data extraction and analysis. Implement security measures, including role-based access. Develop business applications using Power Apps and workflows using Power Automate. Automate business processes with advanced triggers, approvals, and notifications. Collaborate with business and tech teams to deliver scalable BI solutions. Must-Have Skills: 3+ years of experience in Power BI, DAX, and M language. Strong skills in SQL, Power Apps, and Power Automate. Understanding of data warehousing and ETL. Good knowledge of relational databases and data governance. Strong communication and problem-solving skills. Good-to-Have: Knowledge of Python. Microsoft Data Analyst Associate or Power Platform certification. To Apply: Send your resume to vijay.s@xebia.com with these details: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Notice Period / LWD (if serving), Primary Skills, LinkedIn Profile. Note: Apply only if you're an immediate joiner or on a notice period of 2 weeks or less and are not in process with any other open roles with us. #PowerBIJobs #PowerPlatform #ImmediateJoiners #HiringNow #PowerAutomate #PowerApps #NoidaJobs #XebiaHiring #HybridJobs #DAX #DataVisualization

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Noida

Work from Office

We're Hiring | Microsoft Fabric Developer | 3-7 Yrs | Noida | Hybrid. Location: Noida (Hybrid, 3 days/week in office). Experience: 3 to 7 years. Joiners: Immediate to max 2 weeks' notice period ONLY. Shift: 2:00 PM to 10:30 PM IST (cab provided). Key Responsibilities: Set up and manage the Microsoft Fabric platform. Build secure, scalable Lakehouses and implement Azure Data Factory pipelines. Design and manage data warehouses for analytics. Develop and manage reports and semantic models using Fabric. Write complex SQL queries and Spark SQL, and build data solutions using PySpark. Schedule and optimize Spark jobs for performance and reliability. Leverage Data Activator for real-time analytics and insights. Must-Have Skills: 3+ years of experience with Microsoft Fabric, OneLake, and Lakehouses. Proven expertise with Azure Data Factory and ETL. Strong knowledge of Power BI, data warehousing, and data governance. Proficiency in Python, PySpark, and Spark SQL. Practical exposure to real-time analytics in Fabric. Good-to-Have: Knowledge of AWS services and Snowflake. To Apply: Send your resume to vijay.s@xebia.com with these details: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Notice Period / LWD (if serving), Primary Skills, LinkedIn Profile. Note: Apply only if you're an immediate joiner or on a notice period of 2 weeks or less and are not in process with any other open roles with us. #MicrosoftFabric #FabricDeveloper #PySparkJobs #PowerBI #AzureDataFactory #ImmediateJoiners #NoidaHiring #HybridJobs #XebiaHiring #BigData #ETL #Lakehouse

Posted 2 weeks ago

Apply

15.0 - 20.0 years

11 - 14 Lacs

Bengaluru

Work from Office

At Curriculum Associates, we believe in the potential of every child and are changing the face of education technology with award-winning learning programs like i-Ready that serve a third of the nation's K-8 students. For more than 50 years, our commitment to making classrooms better places, serving educators, and supporting accessible learning experiences for all students has driven the continuous improvement of our innovative programs. Our team of more than 2,500 employees is composed of lifelong learners who stand behind this mission, working tirelessly to serve the educational community with world-class programs and support every day. Summary: The Manager of Data Engineering is responsible for developing reliable, scalable data platform applications and pipelines to market by setting technical direction, coordinating risk and priority across teams and vendors, shaping architectural strategy, managing people, and collaborating closely with product partners on project delivery. This role powers the data behind i-Ready's reports and provides insights to teachers about millions of students. Essential duties/responsibilities: Team Leadership: Lead, mentor, and develop a high-performing team of data engineering and backend engineering professionals, promoting collaboration, inclusivity, and professional growth. Provide technical guidance, ensuring your team stays current with emerging technologies and adopts appropriate industry best practices. Strategy Development: Work closely with Product Managers, QE, and business stakeholders to develop strategic technical roadmaps. Align initiatives clearly with business priorities, manage feature delivery timelines, and balance addressing technical debt. Cross-functional Collaboration: Build effective relationships across teams, facilitating clear communication and alignment in an Agile/Scrum environment to swiftly address production issues and prioritize team efforts appropriately.
Data Platform and Architecture: Drive best practices in data platform management and software development, ensuring high standards for data quality, architecture, testing, deployment processes, and observability. Data Engineering: Design and execute scalable batch and real-time data pipelines for performance and reliability while addressing business needs. Focus the team on success, unblock issues, escalate as needed, and build relationships with peers for success. SDLC and Process Maturity: Continuously enhance engineering processes and team practices, aligning objectives with broader organizational goals. Stay informed about industry trends and integrate new frameworks and methodologies where appropriate. Automation and Efficiency: Champion automation initiatives, driving enhancements in operational efficiency, data integrity, and scalability. Continuously streamline workflows and promote practices that accelerate delivery. Production Support: Manage team responsibilities for addressing production issues promptly, maintaining clear stakeholder communication, and ensuring smooth release cycles. Required job skills: Strong communication and relationship-building skills, particularly in asynchronous and geographically distributed environments. Able to discuss solutions effectively with team members of varying technical backgrounds. Excellent software design skills, with deep knowledge of data engineering and backend development patterns, including performance optimization. Proficient in developing high-quality, well-structured code in Java, Scala, and SQL, following test-driven development approaches and thorough debugging practices. Proven ability to maintain clear, concise, and organized technical documentation. Deep understanding of modern product development methodologies (Agile, SAFe). Experience building and maintaining data platforms, including data governance, ETL processes, data lakes, and data warehouses (Amazon S3, Snowflake).
Knowledge of Amazon cloud computing infrastructure, specifically Aurora MySQL, DynamoDB, EMR, Lambda, Step Functions, and S3. Skilled at performing thoughtful and detailed code reviews, providing constructive feedback aimed at improving code quality and mentoring developers. Familiarity with software engineering and project management tools. Commitment to adhering to security protocols and best practices in data governance. Ability to define KPIs and leverage metrics effectively to drive process improvements. Minimum qualifications: 15+ years of experience in designing and developing enterprise-level software solutions. 10 years of experience with large-volume data processing and big data tools such as Apache Spark, Scala, and Hadoop technologies. 5 years of experience developing Scala/Java applications and microservices using Spring Boot. 5 years of experience with SQL and relational databases. 3+ years in an engineering leadership position. 2 years of experience working with the Agile/Scrum methodology. Preferred qualifications: Knowledge of MemSQL DB and Snowflake. Experience with Amazon cloud computing infrastructure (Aurora MySQL, DynamoDB, EMR, Lambda, Step Functions, etc.). Educational domain background.

Posted 2 weeks ago

Apply

17.0 - 22.0 years

32 - 40 Lacs

Pune

Work from Office

We are Allvue Systems, the leading provider of software solutions for the Private Capital and Credit markets. Whether a client wants an end-to-end technology suite, or independently focused modules, Allvue helps eliminate the boundaries between systems, information, and people. We're looking for ambitious, smart, and creative individuals to join our team and help our clients achieve their goals. Working at Allvue Systems means working with pioneers in the fintech industry. Our efforts are powered by innovative thinking and a desire to build adaptable financial software solutions that help our clients achieve even more. With our common goals of growth and innovation, whether you're collaborating on a cutting-edge project or connecting over shared interests at an office happy hour, the passion is contagious. We want all of our team members to be open, accessible, curious and always learning. As a team, we take initiative, own outcomes, and have passion for what we do. With these pillars at the center of what we do, we strive for continuous improvement, excellent partnership and exceptional results. Come be a part of the team that's revolutionizing the alternative investment industry. Define your own future with Allvue Systems! Strategic Leadership: Define and execute the data science roadmap aligned with Allvue Systems' business objectives. Partner with executive leadership to identify opportunities for leveraging data science to drive innovation and competitive advantage across the company. Foster a culture of data-driven decision-making across the organization. Effectively communicate complex data insights and recommendations to both technical and non-technical audiences, including senior leadership. Team Management: Lead, mentor, and grow a high-performing team of data scientists and analysts. Promote collaboration, innovation, and professional development within the team. Manage the data science team's budget effectively, prioritizing investments in key areas.
Cross-Functional Collaboration: Work closely with Product, Engineering, and other teams to identify data science opportunities and deliver impactful solutions. Collaborate with machine learning and core engineering teams to integrate data science models into production systems. Business Impact: Enable the team to develop and deploy predictive models, machine learning algorithms, and AI-driven solutions to optimize content recommendations, personalization, and the different revenue lines. Leverage natural language processing (NLP) and LLMs (Large Language Models) to analyze and derive insights from multimodal content to power recommendations and personalization as well as smarter financial decisions. Build algorithms to measure and improve content performance, audience engagement, and subscription growth. Data Governance & Innovation: Collaborate with machine learning engineering, data engineering, and IT teams to ensure the integrity, accuracy, and security of data used for analysis and modelling. Champion a culture of experimentation and A/B testing, driving the development of new data products and features. Stay at the forefront of data science trends and technologies, exploring new tools and methodologies to enhance Allvue Systems' capabilities. 17+ years of overall experience, with 8+ years in leading large-scale data science projects and teams. Proven track record of delivering impactful AI/ML solutions in the finance domain. Excellent communication, presentation, and interpersonal skills. Exceptional ability in simplifying complex concepts and effectively influencing senior executive stakeholders. Expertise in aligning technology initiatives with business goals, with a hands-on approach to execution. Deep understanding of ML/AI and statistical modelling. Hands-on expertise in predictive modelling, recommendation systems, (Gen)AI, and NLP. Experience with cloud computing platforms (AWS, GCP, or Azure) and big data technologies (e.g. Spark).
Fluent in Python, SQL, and ML frameworks like TensorFlow, PyTorch, or scikit-learn.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Bengaluru

Work from Office

Job Title: Data Governance Specialist. Experience: 5-7 Years. Location: Bangalore, India. Domain: Financial Services. Notice Period: Immediate to 30 Days. Job Description: A skilled Data Governance Specialist is required to join the data management team in Bangalore. This role will focus on implementing and maintaining data governance frameworks, ensuring high-quality data assets, and enabling consistent use of metadata across the organization. Key Responsibilities: Establish and maintain data governance policies, standards, and processes. Develop and manage the enterprise data glossary and metadata repositories. Monitor and improve data quality metrics, ensuring accuracy and consistency across systems. Work closely with business and technical teams to ensure data lineage and traceability. Support Agile delivery using tools like JIRA and Confluence. Collaborate across departments to promote data stewardship and governance awareness. Key Requirements: 5-7 years of experience in data governance, metadata management, or data quality roles. Strong knowledge of data glossary, lineage, and metadata practices. Experience working in Agile environments; familiarity with JIRA and Confluence. Excellent communication and stakeholder management skills. Prior experience in the financial services or banking domain is preferred. Preferred Skills: Knowledge of data governance tools (e.g., Collibra, Informatica, Alation) is a plus. Understanding of regulatory data requirements (e.g., BCBS 239, GDPR) is an advantage. Intake Call Notes: Data governance, Data Glossary, metadata management, data quality, Agile, JIRA, Confluence. Keywords: data governance, data quality, Agile. If interested, please share your resume to sunidhi.manhas@portraypeople.com
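The glossary, metadata, and lineage responsibilities in this role can be pictured with a toy in-memory model. This is only a sketch: the terms, asset names, and functions below are invented for illustration, and real governance platforms such as Collibra or Alation expose much richer models through their own APIs.

```python
# Toy business-glossary store with simple lineage links between data assets.
# All term and asset names are illustrative placeholders.

glossary = {}
lineage = {}   # asset -> list of its upstream source assets

def define_term(term, definition, steward):
    """Register a business term with its definition and data steward."""
    glossary[term] = {"definition": definition, "steward": steward}

def add_lineage(asset, upstream):
    """Record that `asset` is derived from `upstream`."""
    lineage.setdefault(asset, []).append(upstream)

def trace_upstream(asset):
    """Walk lineage links to find every upstream source of an asset."""
    seen, stack = set(), [asset]
    while stack:
        for up in lineage.get(stack.pop(), []):
            if up not in seen:
                seen.add(up)
                stack.append(up)
    return seen

define_term("Net Revenue", "Gross revenue minus returns and discounts", "Finance")
add_lineage("finance_mart.net_revenue", "staging.sales")
add_lineage("staging.sales", "erp.sales_orders")

upstream = trace_upstream("finance_mart.net_revenue")
```

Traceability of this kind is exactly what lets a governance team answer regulatory questions (e.g. under BCBS 239) about where a reported figure originated.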

Posted 2 weeks ago

Apply

8.0 - 10.0 years

0 - 0 Lacs

Bengaluru

Work from Office

Informatica, MDM, SQL, Axon, EDC, Data Governance

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Pune

Work from Office

Product & Business Alignment: Collaborate with the Product Owner to align data solutions with business objectives and product vision. Data Pipeline Development: Design, develop, and implement efficient data pipelines for ingesting, transforming, and transporting data into Cummins Digital Core (Azure Data Lake, Snowflake) from various sources, including transactional systems (ERP, CRM). Architecture & Standards Compliance: Ensure alignment with AAI Digital Core and AAI Solutions Architecture standards for data pipeline design, storage architectures, and governance processes. Automation & Optimization: Implement and automate distributed data systems, ensuring reliability, scalability, and efficiency through monitoring, alerting, and performance tuning. Data Quality & Governance: Develop and enforce data governance policies, including metadata management, access control, and retention policies, while actively monitoring and troubleshooting data quality issues. Modeling & Storage: Design and implement conceptual, logical, and physical data models, optimizing storage architectures using distributed and cloud-based platforms (e.g., Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB). Documentation & Best Practices: Create and maintain data engineering documentation, including standard operating procedures (SOPs) and best practices, with guidance from senior engineers. Tool Evaluation & Innovation: Support proof-of-concept (POC) initiatives and evaluate emerging data tools and technologies to enhance efficiency and effectiveness. Testing & Troubleshooting: Participate in the testing, troubleshooting, and continuous improvement of data pipelines to ensure data integrity and usability. Agile & DevOps Practices: Utilize agile development methodologies, including DevOps, Scrum, and Kanban, to drive iterative improvements in data-driven applications.
Preferred Experience: Hands-on experience gained through internships, co-ops, student employment, or team-based extracurricular projects. Proficiency in SQL query language and experience in developing analytical solutions. Exposure to open-source Big Data technologies such as Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka. Familiarity with cloud-based, clustered computing environments and large-scale data movement applications. Understanding of Agile software development methodologies. Exposure to IoT technology and data-driven solutions. Technical Skills: Programming Languages: Proficiency in Python, Java, and/or Scala. Database Management: Expertise in SQL and NoSQL databases. Big Data Technologies: Hands-on experience with Hadoop, Spark, Kafka, and similar frameworks. Cloud Services: Experience with Azure, Databricks, and AWS platforms. ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes. Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus. API Integration: Experience working with APIs to consume data from ERP and CRM systems.
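The ingest-transform-load pipeline work this role describes can be sketched as three composable steps. Everything below is a stand-in: the field names, validation rule, and in-memory "warehouse" are invented for illustration and do not represent actual Cummins, ERP/CRM, or Snowflake interfaces.

```python
# Minimal ETL sketch: extract rows, transform them, load them to a target.
# Source and target are in-memory stand-ins; a real pipeline would use
# source connectors, a quarantine path for bad rows, and bulk loaders.

def extract(source_rows):
    """Pull raw records; a real extract would page through an API or table."""
    return list(source_rows)

def transform(rows):
    """Normalize currency codes and drop records failing basic validation."""
    out = []
    for r in rows:
        if r.get("amount") is None:
            continue  # a production pipeline would quarantine, not drop
        out.append({**r, "currency": r.get("currency", "USD").upper()})
    return out

def load(rows, target):
    """Append transformed rows to the target; returns the row count loaded."""
    target.extend(rows)
    return len(rows)

warehouse = []
raw = [{"amount": 120, "currency": "usd"}, {"amount": None}]
loaded = load(transform(extract(raw)), warehouse)
```

Keeping the stages as separate functions mirrors how orchestration tools schedule, retry, and monitor each step independently.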

Posted 2 weeks ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Pune

Work from Office

So, what's the role all about? We are looking for a highly driven and technically skilled Software Engineer to lead the integration of various Content Management Systems with AWS Knowledge Hub, enabling advanced Retrieval-Augmented Generation (RAG) search across heterogeneous customer data without requiring data duplication. This role will also be responsible for expanding the scope of Knowledge Hub to support non-traditional knowledge items and enhance customer self-service capabilities. You will work at the intersection of AI, search infrastructure, and developer experience to make enterprise knowledge instantly accessible, actionable, and AI-ready. How will you make an impact? Integrate CMS with AWS Knowledge Hub to allow seamless RAG-based search across diverse data types, eliminating the need to copy data into Knowledge Hub instances. Extend Knowledge Hub capabilities to ingest and index non-knowledge assets, including structured data, documents, tickets, logs, and other enterprise sources. Build secure, scalable connectors to read directly from customer-maintained indices and data repositories. Enable self-service capabilities for customers to manage content sources using AppFlow and Tray.ai, configure ingestion rules, and set up search parameters independently. Collaborate with the NLP/AI team to optimize relevance and performance for RAG search pipelines. Work closely with product and UX teams to design intuitive, powerful experiences around self-service data onboarding and search configuration. Implement data governance, access control, and observability features to ensure enterprise readiness. Have you got what it takes? Proven experience with search infrastructure, RAG pipelines, and LLM-based applications. 5+ years of hands-on experience with AWS Knowledge Hub, AppFlow, Tray.ai, or equivalent cloud-based indexing/search platforms. Strong backend development skills (Python, TypeScript/NodeJS, .NET/Java) and familiarity with building and consuming REST APIs.
Knowledge of Infrastructure as Code (IaC) services such as AWS CloudFormation and CDK. Deep understanding of data ingestion pipelines, index management, and search query optimization. Experience working with unstructured and semi-structured data in real-world enterprise settings. Ability to design for scale, security, and multi-tenant environments. What's in it for you? Enjoy NICE-FLEX! Reporting into: Tech Manager, Engineering, CX. Role Type: Individual Contributor. About NiCE
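The retrieval step that underpins a RAG pipeline can be illustrated with a bare-bones keyword scorer. This is only a sketch of the general shape: a production system (such as the Knowledge Hub integration this role describes) would use vector embeddings and approximate nearest-neighbor indices, and the document ids and query here are invented.

```python
# Bare-bones retrieval sketch behind a RAG search: score indexed documents
# by keyword overlap with the query and return the top-k hits. Real systems
# use embeddings + ANN indices; this only shows the retrieve-then-rank shape.

def tokenize(text):
    """Lowercase and split into a set of tokens."""
    return set(text.lower().split())

def retrieve(query, documents, k=2):
    """Rank documents by token overlap with the query; return top-k ids."""
    q = tokenize(query)
    scored = sorted(
        documents.items(),
        key=lambda kv: len(q & tokenize(kv[1])),
        reverse=True,
    )
    return [doc_id for doc_id, _ in scored[:k]]

docs = {
    "kb-1": "how to reset a customer password",
    "kb-2": "quarterly revenue report for finance",
    "kb-3": "password policy and reset procedure",
}
top = retrieve("reset password", docs, k=2)
```

In a full RAG flow, the retrieved passages would then be concatenated into the LLM prompt as grounding context; reading directly from customer-maintained indices, as the posting describes, means only this retrieval layer needs access to the data.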

Posted 2 weeks ago

Apply

13.0 - 19.0 years

17 - 20 Lacs

Pune

Remote

Looking for a Java full-stack developer with 15 years of experience; 5 years as a Data Architect is mandatory. Decode complex business challenges using diverse data sources. Design and build scalable data warehouses and marts.

Posted 2 weeks ago

Apply