2326 Data Governance Jobs - Page 32

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements, utilizing your expertise in the Databricks Unified Data Analytics Platform to develop efficient and effective applications.

Roles & Responsibilities:
- Perform independently and become an SME.
- Participate actively in team discussions.
- Contribute to providing solutions to work-related problems.
- Collaborate with cross-functional teams to gather and analyze business requirements.
- Design, develop, and test applications using the Databricks Unified Data Analytics Platform.
- Troubleshoot and debug applications to ensure optimal performance and functionality.
- Implement security and data protection measures.
- Document technical specifications and user manuals for reference and reporting purposes.

Professional & Technical Skills:
- Must have: Proficiency in the Databricks Unified Data Analytics Platform.
- Strong understanding of data engineering concepts and techniques.
- Experience with data integration and ETL processes.
- Knowledge of programming languages such as Python or Scala.
- Familiarity with cloud platforms like AWS or Azure.
- Good to have: Experience with big data technologies such as Hadoop or Spark.
- Understanding of data governance and data quality principles.
- Knowledge of SQL and database management systems.

Additional Information:
- The candidate should have a minimum of 3 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Pune

Work from Office

Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: AWS Glue
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As part of a Data Transformation programme, you will join the Data Marketplace team. You will be responsible for the architecture and design of automated data management compliance validation, monitoring, and reporting through rule-based and AI-driven mechanisms, integrating with metadata repositories and governance tools for real-time policy enforcement. You will also deliver design specifications for real-time metadata integration, enhanced automation, audit logging, monitoring capabilities, and lifecycle management (including version control, decommissioning, and rollback). Experience with the implementation and adaptation of data management and data governance controls around Data Product implementations, preferably on AWS, is preferred; experience with AI is appreciated. Example skills: Data Architecture, Data Marketplace, Data Governance, Data Engineering, AWS DataZone, AWS SageMaker Unified Studio.

As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning sessions to align project goals with organizational objectives, ensuring that all stakeholders are informed and involved in decision-making. The role requires a balance of technical expertise and leadership skills to drive project success and foster a collaborative team environment.

Roles & Responsibilities:
- Act as an SME.
- Collaborate with and manage the team.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and implement necessary adjustments to meet deadlines.

Professional & Technical Skills:
- Must have: Proficiency in AWS Glue.
- Strong understanding of data integration and ETL processes.
- Experience with cloud computing platforms and services.
- Familiarity with data warehousing concepts and best practices.
- Ability to troubleshoot and optimize data workflows.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in AWS Glue.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum experience: 7.5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Act as an SME.
- Collaborate with and manage the team.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure timely delivery of application features.

Professional & Technical Skills:
- Must have: Proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of database design principles and data architecture.
- Experience with data integration and ETL processes.
- Familiarity with data governance and data quality frameworks.
- Ability to analyze and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: PySpark, Microsoft Azure Databricks
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will be responsible for designing, building, and configuring applications to meet business process and application requirements. Your typical day will involve collaborating with teams to develop innovative solutions and streamline processes.

Roles & Responsibilities:
- Act as an SME.
- Collaborate with and manage the team.
- Take responsibility for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead the development and implementation of new applications.
- Conduct code reviews and ensure coding standards are met.
- Stay updated on industry trends and best practices.

Professional & Technical Skills:
- Must have: Proficiency in the Databricks Unified Data Analytics Platform.
- Good to have: Experience with PySpark.
- Strong understanding of data engineering concepts.
- Experience in building and optimizing data pipelines.
- Knowledge of cloud platforms like Microsoft Azure.
- Familiarity with data governance and security practices.

Additional Information:
- The candidate should have a minimum of 5 years of experience in the Databricks Unified Data Analytics Platform.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
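For context, a minimal sketch of the kind of Databricks/PySpark pipeline this role describes is shown below; the Delta paths and column names are illustrative assumptions, not details from the posting.

```python
# A minimal sketch of a Databricks-style PySpark pipeline.
# Paths and columns are hypothetical; Delta support is assumed (built in on Databricks).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_pipeline").getOrCreate()

# Read raw data from a Delta table
raw = spark.read.format("delta").load("/mnt/raw/orders")

# Basic cleansing and enrichment
cleaned = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("order_ts").isNotNull())
       .withColumn("order_date", F.to_date("order_ts"))
)

# Aggregate and write to a curated Delta table
daily = cleaned.groupBy("order_date").agg(F.sum("amount").alias("total_amount"))
daily.write.format("delta").mode("overwrite").save("/mnt/curated/daily_orders")
```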

Posted 2 weeks ago

Apply

3.0 - 4.0 years

13 - 17 Lacs

Gurugram

Work from Office

Project Role: Security Architect
Project Role Description: Define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Document the implementation of the cloud security controls and transition to cloud security-managed operations.
Must-have skills: Security Data Privacy
Good-to-have skills: NA
Minimum experience: 12 years
Educational Qualification: 15 years of full-time education

Summary: As a Security Architect, you will define the cloud security framework and architecture, ensuring it meets the business requirements and performance goals. Your typical day will involve collaborating with various teams to assess security needs, documenting the implementation of cloud security controls, and transitioning to cloud security-managed operations. You will engage in discussions to refine security strategies and ensure compliance with industry standards, all while adapting to the evolving landscape of cloud technologies and security threats.

Roles & Responsibilities:
- Maintain the integrity of data and processes in OneTrust or Securiti.ai.
- Provide hands-on support in OneTrust or Securiti.ai for data discovery, classification, and data governance.
- Provide hands-on support in OneTrust or Securiti.ai for data security posture management.
- Support the team with OneTrust or Securiti.ai privacy assessments.
- Provide hands-on support in OneTrust or Securiti.ai for policy and notice management and DPIA.
- Provide hands-on support in OneTrust or Securiti.ai for cookie compliance, including scanning and banners.
- Provide hands-on support in OneTrust or Securiti.ai for consent compliance and maintain records of consent.
- Provide hands-on support in OneTrust or Securiti.ai for Data Subject Requests, automating the request-to-fulfilment process to meet regulatory deadlines.
- Manage data retention and deletion: enforce retention policies and data deletion.
- Evaluate PIA/DPIA assessments for risk management, including vendors.

Professional & Technical Skills:
- 3-4 years of hands-on experience as a OneTrust or Securiti.ai administrator.
- 3 years of work experience with data privacy regulations such as GDPR, CCPA, and DPDP (mandatory).
- 2 years of work experience in defining and managing DSAR, DPIA, consent, cookie, TPRM, and RoPA lifecycles.
- 2 years of work experience in performing data discovery, classification, data governance, data mapping and cataloging, and data security posture management.
- Excellent written and verbal communication skills in English.
- OneTrust or Securiti.ai certified professional (required).

Additional Information:
- The candidate should have a minimum of 12 years of experience in Security Data Privacy.
- 15 years of full-time education is required.
- This is a work-from-office role on all 5 days, and the resource needs to work from the client location only.

Posted 2 weeks ago

Apply

1.0 - 6.0 years

9 - 13 Lacs

Faridabad

Work from Office

Postdoctoral Positions in Amrita School of Artificial Intelligence, Faridabad - Amrita Vishwa Vidyapeetham

The School of Artificial Intelligence, Amrita Vishwa Vidyapeetham, Faridabad Campus invites applications from qualified candidates for Postdoctoral Research Fellowships in Cryptography and Applications. Applicants should hold a Ph.D. in Cryptography and Applications or a closely related area. Duration: 1 year, extendable by one additional year based on performance and funding. Last date to apply: July 15, 2025. For details, contact: aihroffice@dl.amrita.edu

Posted 2 weeks ago

Apply

9.0 - 11.0 years

20 - 25 Lacs

Pune

Work from Office

SAP MDG Senior DevOps Lead
Location: Pune
Experience: 10+ years

We are seeking an experienced SAP MDG Senior DevOps Lead to join our dynamic team. The ideal candidate will have extensive experience in SAP Master Data Governance (MDG) and a strong background in DevOps practices. You will be responsible for leading the development and operational management of SAP MDG solutions, ensuring high availability, performance, and scalability.

Key Responsibilities:
- Lead the design, implementation, and management of SAP MDG solutions.
- Collaborate with cross-functional teams to define requirements and deliver effective solutions.
- Implement DevOps practices to enhance the deployment process and increase system reliability.
- Monitor and optimize system performance and troubleshoot issues effectively.
- Develop and maintain CI/CD pipelines for SAP MDG applications.
- Ensure compliance with best practices in security and data governance.
- Mentor and guide junior team members in SAP MDG and DevOps methodologies.
- Stay updated with the latest SAP technologies and industry trends.
- Work with SAP Solution Manager (ChaRM) and the Transport Management System (TMS).
- Optimize SAP MDG performance and integrations with other SAP and non-SAP systems.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 10+ years of experience in SAP MDG with a solid understanding of data governance concepts.
- Proficiency in DevOps tools and methodologies, including CI/CD, Docker, Kubernetes, and cloud services.
- Strong analytical and problem-solving skills.
- Excellent communication and leadership abilities.

If you are passionate about leveraging your expertise in SAP MDG and DevOps to drive business success, we invite you to apply for this exciting opportunity.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

20 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Roles and Responsibilities:
- Develop and implement data governance frameworks, policies, and procedures to ensure compliance with regulatory requirements.
- Collaborate with stakeholders to identify business needs and develop solutions that meet those needs while ensuring data quality and integrity.
- Design and maintain databases using SQL to support reporting, analysis, and decision-making across the organization.
- Provide expert guidance on data management best practices, including data governance analytics, risk reporting, model risk assessment, and regulatory reporting.
- Ensure effective communication of complex technical concepts to non-technical stakeholders through clear documentation and presentations.

Desired Candidate Profile:
- 5-10 years of experience in a similar role within an investment banking, venture capital, or private equity firm, or a related industry.
- Strong understanding of Basel II/III regulations and their impact on financial institutions' operations.
- Proficiency in CAR/CCAR/IFRS 9/IFRS 17 standards for regulatory reporting purposes.
- Experience working with large datasets using SQL; ability to design efficient queries for extracting insights from complex datasets.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Pune

Hybrid

Dear Candidate,

This is with reference to Senior Business Intelligence Analyst openings at Wolters Kluwer, Pune. Kindly share your resume at jyoti.salvi@wolterskluwer.com.

Job Specifications:
- Skillset requirement: Data Governance professionals with experience in Collibra; experience in Microsoft Purview is highly preferred.
- Experience range: candidates with 2 to 8 years of relevant Data Governance experience.
- Primary skills: Data Governance, Microsoft Purview, Collibra.

Responsibilities:
- Architect, design, and implement data governance solutions using Microsoft Purview.
- Apply experience in data lifecycle management, including data retention, deletion, and archiving strategies, using Microsoft Purview Data Lifecycle Management.
- Assist transitions to Microsoft Purview services, including setting up data lifecycle management and eDiscovery configurations.
- Maintain accurate documentation of configurations, processes, and procedures related to Microsoft Purview.
- Implement data governance policies and procedures to ensure compliance with regulatory requirements and organizational standards.
- Ensure data quality and compliance by applying expertise in MDM and data governance principles, including data governance frameworks and practices, to ensure the relevancy, quality, security, and compliance of master data.
- Develop and implement data integration solutions for metadata, data lineage, and data quality.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

10 - 20 Lacs

Bengaluru

Hybrid

Job Title: Data Governance & Quality Specialist
Experience: 3-8 Years
Location: Bangalore (Hybrid)
Domain: Financial Services
Notice Period: Immediate to 30 Days

What You'll Do:
- Define and enforce data-governance policies (BCBS 239/GDPR) across credit-risk datasets.
- Design, monitor, and report on data-quality KPIs; perform profiling and root-cause analysis in SAS/SQL.
- Collaborate with data stewards, risk teams, and auditors to remediate data issues.
- Develop governance artifacts: data-lineage maps, stewardship RACI, council presentations.

Must Have:
- 3-8 years in data-governance or data-quality roles (financial services).
- Advanced SAS for data profiling and reporting; strong SQL skills.
- Hands-on experience with governance frameworks and regulatory requirements.
- Excellent stakeholder-management and documentation abilities.

Nice to Have:
- Experience with Collibra, Informatica, or Talend.
- Exposure to credit-risk model inputs (PD/LGD/EAD).
- Automation via SAS macros or Python scripting.

If interested, please share your resume at sunidhi.manhas@portraypeople.com.
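As an illustration of the data-quality KPI profiling and Python automation this role mentions, here is a small hypothetical sketch; the input file, column set, and 95% completeness threshold are invented for the example.

```python
# Hypothetical data-quality KPI check; names and thresholds are assumptions.
import pandas as pd

df = pd.read_csv("credit_risk_extract.csv")  # assumed input extract

kpis = pd.DataFrame({
    "completeness_pct": 100 * df.notna().mean(),  # share of non-null values per column
    "distinct_count": df.nunique(),               # cardinality per column
})

# Flag columns breaching an illustrative 95% completeness SLA
breaches = kpis[kpis["completeness_pct"] < 95.0]
print(breaches)
```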

Posted 2 weeks ago

Apply

2.0 - 6.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Cigna is a global health services company dedicated to improving the health and well-being of those we serve. Through our divisions, Cigna Healthcare and Evernorth Health Services, we provide a wide range of services and solutions that enhance the lives of our clients, customers, and patients. Part of Cigna Healthcare, International Health delivers a diverse range of health services and solutions globally, ensuring access to quality care and support. Our International Health Technology team is at the forefront of technological innovation, ensuring seamless integration of systems and processes across global operations. We leverage advanced technologies to enhance service delivery and support strategic goals, meeting the evolving needs of our international community.

Role Overview:
We are seeking an experienced Enterprise Architecture Coordinator to play a crucial role in supporting the development and implementation of enterprise architecture strategies that align with business goals. The role involves collaborating with senior architects and stakeholders to ensure the seamless integration of systems and processes, driving technological innovation within the organization.

Key Responsibilities:
- Assist in aligning IT initiatives with business strategy, ensuring that technology solutions support overarching business objectives and contribute to long-term success.
- Support the establishment and enforcement of architectural standards and governance policies, ensuring compliance with industry best practices and regulatory requirements.
- Assist in adopting and integrating emerging technologies, ensuring investments support the architecture runway and enhance the organization's technological capabilities.
- Help translate business needs into actionable technical solutions by collaborating with cross-functional teams and leveraging advanced technologies.
- Recommend enhancements to existing solutions for improved efficiency, scalability, and performance.
- Assist in developing and maintaining governance frameworks to ensure consistent application of architectural principles and practices.
- Work closely with senior architects and team members on seamless integration of new systems and processes, ensuring minimal disruption to operations.
- Maintain and update architectural documentation, ensuring that all architectural artifacts are accurate, up to date, and accessible to relevant stakeholders.
- Apply SAFe principles to support agile transformation initiatives, emphasizing lean-agile leadership, technical agility, product delivery, enterprise solution delivery, and portfolio management.
- Support the Enterprise Architecture framework using tools like LeanIX, ensuring accurate and up-to-date architectural data and insights that inform decision-making.
- Identify and assess technology risks, and develop mitigation strategies to ensure the security, reliability, and resilience of IT systems.
- Foster a culture of innovation by exploring and recommending new technologies and methodologies that can enhance the organization's IT capabilities and drive competitive advantage.

Required Skills and Qualifications:
- Strong understanding of data governance, digital transformation, and integration architecture in the health insurance industry.
- Excellent communication skills for conveying technical concepts to both technical and non-technical stakeholders.
- Strong analytical and problem-solving skills to identify and resolve issues effectively.
- Willingness to learn and adapt to new technologies and methodologies.
- Ability to adapt to changing business needs and technological advancements.
- Knowledge of frameworks (TOGAF, Zachman), modelling and design tools, system integration, security architecture, and cloud services.

Additional Information:
- Ability to travel internationally as needed.
- Ability to work effectively in a globally distributed team environment.
- Commitment to continuous learning and professional development.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

12 - 16 Lacs

Pune

Work from Office

What You'll Do:
We are looking for a Senior Analyst based in the Pune office. You will empower the HR team to harness the full potential of data, enabling them to make informed decisions that drive organizational success. Partnering with the engineering team, you will ensure seamless integration and availability of people data across various systems and tools. You will collaborate closely with both HR stakeholders and tech teams to design, develop, and maintain scalable data pipelines using tools such as Snowflake, dbt, and Fivetran, while implementing and optimizing ELT processes to ensure data accuracy and reliability. You will report to the Lead Project Manager.

What Your Responsibilities Will Be:
- Build and monitor data pipelines daily to ensure seamless data flow and availability for analysis.
- Troubleshoot and resolve any data-related issues, minimizing disruptions to HR operations.
- Partner with the engineering team to integrate people data across systems, ensuring it is handled securely and in accordance with data governance standards, making it accessible for cross-functional use.
- Develop and maintain SQL queries and scripts to support data analysis and reporting needs, ensuring data integrity and security across all HR-related data systems.
- Document processes, provide technical support and training to HR team members on tools, and improve data infrastructure and processes to enhance performance and scalability.

What You'll Need to Be Successful:
- 6+ years of experience as a Data Engineer or in a similar role.
- Documentation skills, ensuring models are understandable and maintainable by all stakeholders.
- Strong proficiency in SQL and deep experience with modern data stack tools (Snowflake, dbt, Fivetran, GitLab).
- Ability to translate business requirements into data models.
- Experience with ELT processes and data pipeline development.
- Knowledge of data security and privacy regulations, with experience implementing techniques such as row-level security or data masking to keep data safe.
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Highly desirable: experience with Workday reporting, calculated fields, and RaaS.
- A plus: experience with ICIMS, Workramp, Gallup, Espresa, Adaptive, or Salesforce.
- A plus: experience with Power BI, particularly data marts within Power BI.
- A plus: experience using a scripting language such as Python to automate ELT processes.
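Since the posting highlights data masking to protect people data, below is a minimal, hypothetical sketch of applying a Snowflake dynamic masking policy through the snowflake-connector-python package; all account, role, table, and column names are placeholders, not details from the posting.

```python
# Illustrative only: apply a Snowflake dynamic masking policy to an HR email column.
# Connection parameters and object names are placeholder assumptions.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="etl_user", password="***",
    warehouse="HR_WH", database="HR", schema="PEOPLE",
)
cur = conn.cursor()

# Mask email for every role except an assumed HR analyst role
cur.execute("""
    CREATE OR REPLACE MASKING POLICY email_mask AS (val STRING)
    RETURNS STRING ->
      CASE WHEN CURRENT_ROLE() = 'HR_ANALYST' THEN val ELSE '***MASKED***' END
""")
cur.execute("ALTER TABLE employees MODIFY COLUMN email SET MASKING POLICY email_mask")
conn.close()
```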

Posted 2 weeks ago

Apply

4.0 - 9.0 years

15 - 25 Lacs

Pune

Work from Office

Role & Responsibilities:

Design & Configuration:
- Develop and configure MDM and Data Quality tools such as Syndigo, Snowflake, and Alation.
- Perform data quality functions including audits, assessments, entity resolution, data profiling, scorecard development, and exception management configuration.
- Configure workflows for Products and Customer data in Syndigo and Winshuttle.
- Support the development of Master Data and IT system architecture roadmaps to improve e-commerce strategy, supply chain efficiency, and customer experience.
- Collaborate in an agile environment to design and build data solutions, ensuring thorough end-to-end testing.
- Implement data orchestration pipelines, data sourcing, cleansing, and quality control processes.
- Contribute to software verification plans and quality assurance procedures.
- Document and maintain data pipeline architecture.
- Contribute to IT standards, procedures, and processes.
- Document and maintain software functionality.

Training & Support:
- Develop and maintain technical design documentation.
- Create and maintain relevant project documentation throughout the SDLC.
- Create clear and concise training documentation.
- Lead end-to-end delivery of technical solutions.
- Support the installation, maintenance, and upgrades of MDM tools.
- Create test cases and support user testing throughout relevant test cycles.

Preferred Candidate Profile:
- Bachelor's degree in computer science, systems analysis, or a related study, or equivalent experience.
- Mastery-level knowledge of the job area obtained through advanced education, experience, or both.
- Minimum of 4 years of experience in Data Management and Data Quality solutions, of which a minimum of 3 years of experience in the Syndigo application is required.
- Data Quality related certifications (e.g., GS1, CIMP, IQCP, ICP, CDMP, and CMMI Enterprise Data Management) are a plus.
- Conceptual understanding of SAP and PLMs with a strong understanding of key business processes involving Customer & Product master data.
- In-depth knowledge of SQL, Python, and Snowflake.
- Knowledge of cloud-based solutions and Alation/data governance tools (preferable).
- Strong analytical and problem-solving skills.
- Knowledge of Agile/Waterfall methodologies.
- Ability to communicate effectively, orally and in writing, with IT & business stakeholders.
- Propensity to learn innovative technologies and approaches and use this knowledge to enhance Smith & Nephew's strategy, standard practices, and processes.
- Ability to prioritize tasks and adapt to frequent changes in priorities.
- Ability to work in a distributed team setting and a fast-paced environment.

Perks and Benefits:
- Major medical coverage plus policy exclusions and insurance non-medical limit.
- Educational assistance.
- Flexible personal/vacation time off, privilege leave, floater leave.
- Parents/parents-in-law insurance (employer contribution of 8,000/- annually), Employee Assistance Program, parental leave.
- Hybrid work model.
- Hands-on, team-customized mentorship.
- Extra perks: free cab transport facility for all employees; one-time meal provided to all employees as per shift; night shift allowance for shift-based roles.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

15 - 20 Lacs

Pune

Work from Office

Notice Period: Immediate

About the role:
We are hiring a Senior Snowflake Data Engineer with 10+ years of experience in cloud data warehousing and deep expertise on the Snowflake platform. The ideal candidate should have strong skills in SQL, ETL/ELT, data modeling, and performance tuning, along with a solid understanding of Snowflake architecture, security, and cost optimization.

Roles & Responsibilities:
- Collaborate with data engineers, product owners, and QA teams to translate business needs into efficient Snowflake-based data models and pipelines.
- Design, build, and optimize data solutions leveraging Snowflake features such as virtual warehouses, data sharing, cloning, and time travel.
- Develop and maintain robust ETL/ELT pipelines using tools like Talend, Snowpipe, Streams, Tasks, and Python.
- Ensure optimal performance of SQL queries, warehouse sizing, and cost-efficient design strategies.
- Implement best practices for data quality, security, and governance, including RBAC, network policies, and masking.
- Contribute to code reviews and development standards to ensure high-quality deliverables.
- Support analytics and BI teams with data exploration and visualization using tools like Tableau or Power BI.
- Maintain version control using Git and follow Agile development practices.

Required Skills:
- Snowflake expertise: deep knowledge of Snowflake architecture and core features.
- SQL development: advanced proficiency in writing and optimizing complex SQL queries.
- ETL/ELT: hands-on experience with ETL/ELT design using Snowflake tools and scripting (Python).
- Data modeling: proficient in dimensional modeling, data vault, and best practices within Snowflake.
- Automation & scripting: Python or a similar scripting language for data workflows.
- Cloud integration: familiarity with Azure and its services integrated with Snowflake.
- BI & visualization: exposure to Tableau, Power BI, or similar platforms.
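As a sketch of the Streams and Tasks pattern named in the posting, here is a hypothetical incremental-load setup issued through the Snowflake Python connector; the object names, warehouse, and schedule are assumptions for illustration.

```python
# Sketch of Snowflake Streams + Tasks for incremental ELT, via the Python connector.
# All object names are illustrative assumptions.
import snowflake.connector

conn = snowflake.connector.connect(account="my_account", user="etl_user", password="***")
cur = conn.cursor()

# Stream captures change records on the staging table
cur.execute("CREATE OR REPLACE STREAM raw_orders_stream ON TABLE staging.raw_orders")

# Task loads new rows into the curated table every 5 minutes, only when data arrived
cur.execute("""
    CREATE OR REPLACE TASK load_orders_task
      WAREHOUSE = etl_wh
      SCHEDULE = '5 MINUTE'
      WHEN SYSTEM$STREAM_HAS_DATA('raw_orders_stream')
    AS
      INSERT INTO curated.orders
      SELECT order_id, amount, order_ts FROM raw_orders_stream
""")
cur.execute("ALTER TASK load_orders_task RESUME")  # tasks are created suspended
conn.close()
```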

Posted 2 weeks ago

Apply

8.0 - 13.0 years

0 - 1 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Job Position Title: Project Manager - SFDC

Responsibilities:
The Project Manager will be responsible for overseeing the planning, execution, and delivery of the Data Cloud Project in Salesforce. This role requires strong leadership skills, extensive experience with Salesforce, and a deep understanding of data management best practices. The Project Manager will work closely with cross-functional teams to ensure that project goals are met on time and within budget.

Key Responsibilities:
- Lead and manage the full lifecycle of the Data Cloud Project in Salesforce, including planning, execution, monitoring, and closure.
- Develop detailed project plans, timelines, and budgets to guide project execution and ensure alignment with business objectives.
- Coordinate with key stakeholders, including business analysts, developers, and IT teams, to gather requirements and define project scope.
- Oversee the design, configuration, and integration of Salesforce Data Cloud to meet organizational data management needs.
- Identify and mitigate project risks, and implement contingency plans to ensure successful project delivery.
- Monitor project progress and performance, providing regular status updates to stakeholders and senior management.
- Facilitate effective communication and collaboration among project team members and stakeholders.
- Ensure compliance with data governance and security policies throughout the project lifecycle.
- Conduct project evaluations and post-implementation reviews to identify lessons learned and areas for improvement.
- Stay current with Salesforce updates, industry trends, and best practices in data management and cloud solutions.

Qualifications:
- Bachelor's (BE/BTech) degree in Information Technology, Computer Science, Business Administration, or a related field. A Master's degree is a plus.
- Proven experience (5+ years) in project management, specifically managing Salesforce projects.
- Strong understanding of Salesforce Data Cloud and data management principles.
- Project Management Professional (PMP) certification or equivalent is preferred.
- Excellent leadership, communication, and interpersonal skills.
- Ability to manage multiple projects simultaneously and prioritize tasks effectively.
- Strong problem-solving skills and attention to detail.
- Experience with Agile/Scrum methodologies is a plus.
- Proficiency in project management software and tools.

Mandatory skill sets: Project Management experience
Preferred skill sets: PMO
Years of experience required: 8+ years
Education qualification: B.E./B.Tech

Posted 2 weeks ago

Apply

7.0 - 12.0 years

25 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Description:
- Strong change and project management skills.
- Stakeholder management, communications, and reporting.
- Data management, data governance, and data quality management domain knowledge.
- Subject matter expertise required in more than one of the following areas: Data Management, Data Governance, Data Quality Measurement and Reporting, Data Quality Issues Management.
- Liaise with IWPB markets and stakeholders to coordinate delivery of organizational DQ governance objectives, and provide consultative support to facilitate progress.
- Conduct analysis of the IWPB DQ portfolio to identify thematic trends and insights, to effectively advise stakeholders in managing their respective domains.
- Proficiency in MI reporting and visualization is strongly preferred.
- Proficiency in change and project management is strongly preferred.
- Ability to prepare programme update materials and present to senior stakeholders, with prompt responses to any issues or escalations.
- Strong communications and stakeholder management skills: should be able to work effectively and maintain strong working relationships as an integral part of a larger team.
- 8+ years of relevant experience preferred.

Posted 2 weeks ago

Apply

6.0 - 7.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Job Title: Technical Team Lead
Location: TechM Blr ITC06 07
Years of Experience: 5-7 years

Job Summary:
We are seeking a highly skilled and motivated Technical Team Lead with a strong background in SAP Archiving. The ideal candidate will lead a team of technical professionals, ensuring the successful delivery of projects while maintaining high standards of quality and efficiency. This role requires a deep understanding of SAP Archiving processes and technologies, as well as the ability to mentor and guide team members in best practices.

Responsibilities:
- Lead and manage a team of technical professionals, providing guidance and support in SAP Archiving projects.
- Design, implement, and optimize SAP Archiving solutions to enhance system performance and data management.
- Collaborate with cross-functional teams to gather requirements and ensure alignment with business objectives.
- Conduct regular code reviews and provide constructive feedback to team members.
- Monitor project progress, identify risks, and implement mitigation strategies to ensure timely delivery.
- Stay updated with the latest SAP Archiving trends and technologies, and share knowledge with the team.
- Facilitate training sessions and workshops to enhance team skills in SAP Archiving.
- Prepare and present project status reports to stakeholders and management.

Mandatory Skills:
- Strong expertise in SAP Archiving, including knowledge of archiving objects, data retention policies, and data retrieval processes.
- Proven experience in leading technical teams and managing projects in a fast-paced environment.
- Excellent problem-solving skills and the ability to troubleshoot complex technical issues.
- Strong communication and interpersonal skills, with the ability to work collaboratively with diverse teams.
- Experience with SAP modules and integration points related to archiving.

Preferred Skills:
- Familiarity with SAP S/4HANA and its archiving capabilities.
- Knowledge of data governance and compliance standards related to data archiving.
- Experience with project management methodologies (Agile, Scrum, etc.).
- Certifications in SAP or related technologies.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5-7 years of experience in SAP Archiving and technical team leadership.
- Proven track record of successful project delivery and team management.

If you are a passionate leader with a strong background in SAP Archiving and are looking to take the next step in your career, we encourage you to apply for this exciting opportunity.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

20 - 25 Lacs

Bengaluru

Work from Office

SAP MDG (10+ years of experience)
- Experience in SAP MDG EhP6 and MDG 7.0/8.0 (preferably 9.0).
- Extensive ECC and/or S/4HANA experience; worked on at least 2 MDG projects.
- Expertise in implementation of the SAP MDG solution for masters like Customer, Vendor, Material, etc.
- Expertise in Data Model Enhancement, Data Transfer (DIF/DEF), Data Replication Framework (DRF), and Business Rules Framework plus (BRFplus).
- Experience in configuring rule-based workflow and in integrating business process requirements with the technical implementation of SAP Master Data Governance.
- Experience in user interface modelling (design and creation of UI, value restriction, defining navigation elements of type hyperlink or push button, data quality, validation and derivation rules).
- Experience in process modelling (entity, business activity change, request type, workflow, edition type, relationship, data replication techniques, SOA service, ALE connection, key & value mapping, data transfer, export & import master data, convert master data).
- Expert knowledge in activation and configuration of the MDG modules and components.
- SAP ERP logistics knowledge (SAP modules SD or MM), especially master data, is required.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

20 - 25 Lacs

Bengaluru

Work from Office

We are seeking an experienced Data Platform Reliability Engineer to lead our efforts in designing, implementing, and maintaining highly reliable data infrastructure. The ideal candidate will bring extensive expertise in building enterprise-grade data platforms with a focus on reliability engineering, governance, and SLA/SLO design. This role will be instrumental in developing advanced monitoring solutions, including LLM-powered systems, to ensure the integrity and availability of our critical data assets.

Platform Architecture and Design:
- Design and architect scalable, fault-tolerant data platforms leveraging modern technologies like Snowflake, Databricks, and cloud-native services.
- Establish architectural patterns that ensure high availability and resiliency across data systems.
- Develop technical roadmaps for platform evolution with reliability as a core principle.

Reliability Engineering:
- Implement comprehensive SLA/SLO frameworks for data services.
- Design and execute chaos engineering experiments to identify and address potential failure modes.
- Create automated recovery mechanisms for critical data pipelines and services.
- Establish incident management processes and runbooks.

Monitoring and Observability:
- Develop advanced monitoring solutions, including LLM-powered anomaly detection.
- Design comprehensive observability strategies across the data ecosystem.
- Implement proactive alerting systems to identify issues before they impact users.
- Create dashboards and visualization tools for reliability metrics.

Data Quality and Governance:
- Establish data quality monitoring processes and tools.
- Implement data lineage tracking mechanisms.
- Develop automated validation protocols for data integrity.
- Collaborate with data governance teams to ensure compliance with policies.

Innovation and Improvement:
- Research and implement AI/ML approaches to improve platform reliability.
- Lead continuous improvement initiatives for data infrastructure.
- Mentor team members on reliability engineering best practices.
- Stay current with emerging technologies and reliability patterns in the data platform space.

Qualifications:
- 10+ years of experience in data platform engineering or related fields.
- Proven expertise with enterprise data platforms (Snowflake, Databricks, etc.).
- Strong background in reliability engineering, SRE practices, or similar disciplines.
- Experience implementing data quality monitoring frameworks.
- Knowledge of AI/ML applications for system monitoring and reliability.
- Excellent communication skills and ability to translate technical concepts to diverse stakeholders.
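As a much-simplified stand-in for the anomaly detection and proactive alerting described above (not the LLM-powered approach the role names), here is a hypothetical rolling z-score check on pipeline run times; the input file, column names, window, and 3-sigma threshold are invented for illustration.

```python
# Flag pipeline-latency outliers with a rolling z-score; all names are placeholders.
import pandas as pd

latency = pd.read_csv("pipeline_latency.csv", parse_dates=["run_ts"])  # assumed metrics export

window = 24  # compare each run against the trailing 24 runs
rolling = latency["duration_s"].rolling(window)
latency["zscore"] = (latency["duration_s"] - rolling.mean()) / rolling.std()

anomalies = latency[latency["zscore"].abs() > 3]  # illustrative 3-sigma alert threshold
print(anomalies[["run_ts", "duration_s", "zscore"]])
```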

Posted 2 weeks ago

Apply

3.0 - 6.0 years

20 - 25 Lacs

Hyderabad

Work from Office

Blend is hiring a Senior Data Scientist (Generative AI) to spearhead the development of advanced AI-powered classification and matching systems on Databricks. You will contribute to flagship programs like the Diageo AI POC by building RAG pipelines, deploying agentic AI workflows, and scaling LLM-based solutions for high-precision entity matching and MDM modernization.

Key Responsibilities:
- Design and implement end-to-end AI pipelines for product classification, fuzzy matching, and deduplication using LLMs, RAG, and Databricks-native workflows.
- Develop scalable, reproducible AI solutions within Databricks notebooks and job clusters, leveraging Delta Lake, MLflow, and Unity Catalog.
- Engineer Retrieval-Augmented Generation (RAG) workflows using vector search and integrate them with Python-based matching logic.
- Build agent-based automation pipelines (rule-driven + GenAI agents) for anomaly detection, compliance validation, and harmonization logic.
- Implement explainability, audit trails, and governance-first AI workflows aligned with enterprise-grade MDM needs.
- Collaborate with data engineers, BI teams, and product owners to integrate GenAI outputs into downstream systems.
- Contribute to modular system design and documentation for long-term scalability and maintainability.

Qualifications:
- Bachelor's/Master's in Computer Science, Artificial Intelligence, or a related field.
- 5+ years of overall Data Science experience with 2+ years in Generative AI / LLM-based applications.
- Deep experience with the Databricks ecosystem: Delta Lake, MLflow, DBFS, Databricks Jobs & Workflows.
- Strong Python and PySpark skills with the ability to build scalable data pipelines and AI workflows in Databricks.
- Experience with LLMs (e.g., OpenAI, LLaMA, Mistral) and frameworks like LangChain or LlamaIndex.
- Working knowledge of vector databases (e.g., FAISS, Chroma) and prompt engineering for classification/retrieval.
- Exposure to MDM platforms (e.g., Stibo STEP) and familiarity with data harmonization challenges.
- Experience with explainability frameworks (e.g., SHAP, LIME) and AI audit tooling.

Preferred Skills:
- Knowledge of agentic AI architectures and multi-agent orchestration.
- Familiarity with Azure Data Hub and enterprise data ingestion frameworks.
- Understanding of data governance, lineage, and regulatory compliance in AI systems.
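To illustrate the vector-search step of such a matching pipeline, here is a minimal FAISS sketch; the embeddings are random placeholders standing in for LLM or embedding-model output, and the dimension and top-k values are arbitrary assumptions.

```python
# Minimal FAISS similarity search for candidate entity matches.
# Embeddings are random stand-ins; in practice they come from an embedding model.
import numpy as np
import faiss

dim = 384
rng = np.random.default_rng(0)
catalog_vecs = rng.random((1000, dim), dtype=np.float32)  # "embedded" product records
query_vecs = rng.random((5, dim), dtype=np.float32)       # "embedded" incoming records

faiss.normalize_L2(catalog_vecs)  # normalizing makes inner product = cosine similarity
faiss.normalize_L2(query_vecs)

index = faiss.IndexFlatIP(dim)
index.add(catalog_vecs)

scores, ids = index.search(query_vecs, 3)  # top-3 candidate matches per query
print(ids[0], scores[0])
```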

Posted 2 weeks ago

Apply

3.0 - 5.0 years

6 - 10 Lacs

Mumbai

Work from Office

Position: Data Lifecycle Management (DLM) Specialist | Mumbai | WFO
Location: Goregaon, Mumbai (apply if you are from the Western line)
Shift Timing: 9 AM - 6 PM
Notice Period: Immediate to 30 Days
Experience: 3 to 5 years
Work Mode: Work from Office (WFO)

Interested candidates can apply to saikeertana.r@twsol.com.

Role Overview:
We are seeking a highly motivated and client-centric DLM Specialist with 3-5 years of experience in data management, financial services, or other regulated industries. This role focuses on reviewing applications and ensuring data retention, disposition, and archiving compliance while aligning with privacy regulations and internal policy.

Key Responsibilities:
- Assess data retention, archiving, and disposition requirements across all business divisions.
- Conduct regular reviews and stakeholder meetings with business and technology teams.
- Manage data risk identification and mitigation plans related to retention, location, and transfer.
- Document concise data management requirements and ensure implementation tracking.
- Support the definition of operational and compliance controls.
- Compile analysis reports and drive implementation of recommendations.
- Engage system owners in problem-solving and decision-making.
- Represent DLM in cross-functional meetings to communicate policy standards.
- Prepare progress reports and contribute to process improvements.

Required Qualifications:
- Bachelor's degree.
- 3 to 5 years of experience in information/data management, data storage, or financial services operations.
- Strong business analysis skills.
- Excellent verbal and written communication skills in English.
- High attention to detail with the ability to document complex information clearly.
- Demonstrated client-servicing ability and stakeholder management.
- Experience in developing business and functional requirements for tech systems.

Nice to Have:
- Degree in Information Systems, Business Administration, Archiving, or Law.
- Understanding of personal data protection and privacy regulations.
- Familiarity with database and cloud technologies, and AI trends.
- Reporting experience with Power BI / Tableau.
- Experience working with high-volume datasets.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

8 - 9 Lacs

Pune

Work from Office

Req ID: 332236

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Consulting Technical Analyst with ETL and GCP experience using PySpark to join our team in Pune, Maharashtra (IN-MH), India.

Key Responsibilities:
- Data pipeline development: design, implement, and optimize data pipelines on GCP using PySpark for efficient and scalable data processing.
- ETL workflow development: build and maintain ETL workflows for extracting, transforming, and loading data into various GCP services.
- GCP service utilization: leverage GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc for data storage, processing, and analysis.
- Data transformation: utilize PySpark for data manipulation, cleansing, enrichment, and validation.
- Performance optimization: ensure the performance and scalability of data processing jobs on GCP.
- Collaboration: work with data scientists, analysts, and other stakeholders to understand data requirements and translate them into technical solutions.
- Data quality and governance: implement and maintain data quality standards, security measures, and compliance with data governance policies on GCP.
- Troubleshooting and support: diagnose and resolve issues related to data pipelines and infrastructure.
- Staying updated: keep abreast of the latest GCP services, PySpark features, and best practices in data engineering.

Required Skills:
- GCP expertise: strong understanding of GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc.
- PySpark proficiency: demonstrated experience in using PySpark for data processing, transformation, and analysis.
- Python programming: solid Python programming skills for data manipulation and scripting.
- Data modeling and ETL: experience with data modeling, ETL processes, and data warehousing concepts.
- SQL: proficiency in SQL for querying and manipulating data in relational databases.
- Big data concepts: understanding of big data principles and distributed computing concepts.
- Communication and collaboration: ability to effectively communicate technical solutions and collaborate with cross-functional teams.
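A minimal sketch of the GCP + PySpark pattern this role centers on (for example, a Dataproc job using the spark-bigquery connector) is shown below; the project, dataset, table, and bucket names are placeholder assumptions.

```python
# Illustrative PySpark-on-GCP ETL using the spark-bigquery connector
# (assumed available on the cluster, e.g., Dataproc). Names are placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("gcp_etl").getOrCreate()

# Extract from BigQuery
events = (spark.read.format("bigquery")
          .option("table", "my-project.analytics.raw_events")
          .load())

# Transform: cleanse and aggregate
daily = (events.filter(F.col("event_ts").isNotNull())
               .withColumn("event_date", F.to_date("event_ts"))
               .groupBy("event_date").count())

# Load: write back to BigQuery (the connector needs a temporary GCS bucket)
(daily.write.format("bigquery")
      .option("table", "my-project.analytics.daily_event_counts")
      .option("temporaryGcsBucket", "my-temp-bucket")
      .mode("overwrite")
      .save())
```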

Posted 2 weeks ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Mumbai

Work from Office

Key Responsibilities:
- Design, develop, and maintain robust, scalable, and efficient data pipelines to collect, process, and store structured and unstructured data.
- Build and optimize data warehouses, data lakes, and ETL/ELT workflows.
- Integrate data from multiple sources including databases, APIs, and streaming platforms.
- Collaborate with data scientists and analysts to understand data requirements and deliver high-quality datasets.
- Ensure data quality, integrity, and security throughout the data lifecycle.
- Monitor and troubleshoot data pipeline performance and failures.
- Implement data governance and compliance policies.
- Automate data workflows and implement data orchestration tools (e.g., Apache Airflow).
- Optimize storage and query performance in cloud and on-premises environments.
- Keep up to date with emerging data engineering tools, techniques, and best practices.
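Since the posting names Apache Airflow for orchestration, here is a minimal hypothetical DAG showing the pattern; the task names and logic are placeholders, and the `schedule` parameter assumes Airflow 2.4 or later.

```python
# A minimal Apache Airflow DAG: two placeholder ETL tasks run daily in sequence.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from source systems")  # placeholder extract step

def transform_load():
    print("cleanse and load into the warehouse")  # placeholder transform/load step

with DAG(
    dag_id="daily_etl",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform_load", python_callable=transform_load)
    t1 >> t2  # extract runs before transform_load
```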

Posted 2 weeks ago

Apply

13.0 - 16.0 years

32 - 40 Lacs

Bengaluru

Work from Office

Key Responsibilities:
- Facilitate the integration of diverse data types and sources to provide a comprehensive view of patient health and treatment outcomes.
- Provide coaching and peer review to ensure that the team's work reflects the industry's best practices for data curation activities, including data privacy and anonymization standards.
- Ensure all datasets meet analysis-ready and privacy requirements by performing necessary data curation activities (e.g., pre-process, contextualize, and/or anonymize).
- Ensure that datasets are processed to meet the conditions in the approved data re-use request (e.g., remove subjects from countries that do not allow re-use).
- Write clean, readable code; ensure that deliverables are appropriately quality controlled, documented, and, when required, can be handed over to the R&D Tech team for production pipeline implementation.
- Transform raw healthcare data into products that can be used to catalyze the work of the wider RWDMA and Biostatistics teams and be leveraged by our diverse group of stakeholders to generate insights.
- Ensure data quality, integrity, and security across various data sources.
- Support data-driven decision-making processes that enhance patient outcomes and operational efficiencies.

Education Requirements:
- Advanced degree (Master's or Ph.D.) in Life Sciences, Epidemiology, Biostatistics, Public Health, Computer Science, Mathematics, Statistics, or a related field, with applicable experience.

Job-Related Experience:
- Experience in data engineering and curation, with the majority of experience on real-world data in the healthcare or pharmaceutical industry.
- Proven ability to handle and process large datasets efficiently, ensuring data privacy.
- Proficiency in handling structured, semi-structured, and unstructured data while ensuring data privacy.
- Understanding of data governance principles and practices with a focus on data privacy.
- Innovative mindset and willingness to challenge the status quo; solution-oriented mindset.
- Fluent in written and spoken English, able to articulate complex concepts to diverse audiences.
- Experience working in a global matrix environment and managing stakeholders effectively.
- Experience in complex batch processing, Azure Data Factory, Databricks, Airflow, Delta Lake, PySpark, Pandas, and other Python dataframe libraries, including how to apply them to achieve industry standards and data privacy.
- Proven ability to collaborate with cross-functional teams.
- Strong communication skills to present curated data.
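As a hedged illustration of a typical anonymization step in this kind of curation work, here is a small PySpark sketch; the table paths, column names, and salt are assumptions, and real projects would follow formal anonymization standards and approved privacy protocols.

```python
# Sketch of a pseudonymization/generalization step on real-world healthcare data.
# Paths, columns, and the salt value are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("rwd_curation").getOrCreate()
raw = spark.read.format("delta").load("/mnt/raw/patient_visits")

curated = (
    raw
    # pseudonymize the patient identifier with a salted hash
    .withColumn(
        "patient_key",
        F.sha2(F.concat(F.lit("project_salt"), F.col("patient_id").cast("string")), 256),
    )
    # drop direct identifiers entirely
    .drop("patient_id", "name", "address", "phone")
    # generalize date of birth to birth year to reduce re-identification risk
    .withColumn("birth_year", F.year("date_of_birth")).drop("date_of_birth")
)
curated.write.format("delta").mode("overwrite").save("/mnt/curated/patient_visits")
```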

Posted 2 weeks ago

Apply