7.0 - 12.0 years
25 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Job Description:
- Strong change and project management skills
- Stakeholder management, communications, and reporting
- Data management, data governance, and data quality management domain knowledge
- Subject matter expertise required in more than one of the following areas: Data Management, Data Governance, Data Quality Measurement and Reporting, Data Quality Issues Management
- Liaise with IWPB markets and stakeholders to coordinate delivery of organizational DQ governance objectives, and provide consultative support to facilitate progress
- Conduct analysis of the IWPB DQ portfolio to identify thematic trends and insights, to effectively advise stakeholders in managing their respective domains
- Proficiency in MI reporting and visualization is strongly preferred
- Proficiency in change and project management is strongly preferred
- Ability to prepare programme update materials and present to senior stakeholders, with prompt responses to any issues/escalations
- Strong communications and stakeholder management skills: should be able to work effectively and maintain strong working relationships as an integral part of a larger team
- 8+ years of relevant experience preferred
Posted 2 weeks ago
6.0 - 7.0 years
20 - 25 Lacs
Bengaluru
Work from Office
Job Title: Technical Team Lead
Location: TechM Blr ITC06 07
Years of Experience: 5-7 Years

Job Summary:
We are seeking a highly skilled and motivated Technical Team Lead with a strong background in SAP Archiving. The ideal candidate will lead a team of technical professionals, ensuring the successful delivery of projects while maintaining high standards of quality and efficiency. This role requires a deep understanding of SAP Archiving processes and technologies, as well as the ability to mentor and guide team members in best practices.

Responsibilities:
- Lead and manage a team of technical professionals, providing guidance and support in SAP Archiving projects
- Design, implement, and optimize SAP Archiving solutions to enhance system performance and data management
- Collaborate with cross-functional teams to gather requirements and ensure alignment with business objectives
- Conduct regular code reviews and provide constructive feedback to team members
- Monitor project progress, identify risks, and implement mitigation strategies to ensure timely delivery
- Stay updated with the latest SAP Archiving trends and technologies, and share knowledge with the team
- Facilitate training sessions and workshops to enhance team skills in SAP Archiving
- Prepare and present project status reports to stakeholders and management

Mandatory Skills:
- Strong expertise in SAP Archiving, including knowledge of archiving objects, data retention policies, and data retrieval processes
- Proven experience in leading technical teams and managing projects in a fast-paced environment
- Excellent problem-solving skills and the ability to troubleshoot complex technical issues
- Strong communication and interpersonal skills, with the ability to work collaboratively with diverse teams
- Experience with SAP modules and integration points related to archiving

Preferred Skills:
- Familiarity with SAP S/4HANA and its archiving capabilities
- Knowledge of data governance and compliance standards related to data archiving
- Experience with project management methodologies (Agile, Scrum, etc.)
- Certifications in SAP or related technologies

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field
- 5-7 years of experience in SAP Archiving and technical team leadership
- Proven track record of successful project delivery and team management

If you are a passionate leader with a strong background in SAP Archiving and are looking to take the next step in your career, we encourage you to apply for this exciting opportunity.
Posted 2 weeks ago
10.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
SAP MDG
- Experience in SAP MDG EhP6 & MDG 7.0/8.0 (preferably 9.0); 10+ years of experience
- Extensive ECC and/or S/4HANA experience; worked on at least 2 MDG projects
- Expertise in implementation of the SAP MDG solution for masters like Customer, Vendor, Material, etc.
- Expertise in Data Model Enhancement, Data Transfer (DIF/DEF), Data Replication Framework (DRF), and Business Rules Framework plus (BRFplus)
- Experience in configuring rule-based workflow and in integrating business process requirements with the technical implementation of SAP Master Data Governance
- Experience in user interface modelling (design and creation of UI, value restriction, defining navigation elements of type hyperlink or push button, data quality, validation and derivation rules)
- Experience in process modelling (entity, business activity change, request type, workflow, edition type, relationship, data replication techniques, SOA service, ALE connection, key & value mapping, data transfer, export & import master data, convert master data)
- Expert knowledge in activation and configuration of the MDG modules & components
- SAP ERP logistics knowledge (SAP modules SD or MM), especially master data, is required
Posted 2 weeks ago
10.0 - 15.0 years
20 - 25 Lacs
Bengaluru
Work from Office
We are seeking an experienced Data Platform Reliability Engineer to lead our efforts in designing, implementing, and maintaining highly reliable data infrastructure. The ideal candidate will bring extensive expertise in building enterprise-grade data platforms with a focus on reliability engineering, governance, and SLA/SLO design. This role will be instrumental in developing advanced monitoring solutions, including LLM-powered systems, to ensure the integrity and availability of our critical data assets.

Platform Architecture and Design
- Design and architect scalable, fault-tolerant data platforms leveraging modern technologies like Snowflake, Databricks, and cloud-native services
- Establish architectural patterns that ensure high availability and resiliency across data systems
- Develop technical roadmaps for platform evolution with reliability as a core principle

Reliability Engineering
- Implement comprehensive SLA/SLO frameworks for data services
- Design and execute chaos engineering experiments to identify and address potential failure modes
- Create automated recovery mechanisms for critical data pipelines and services
- Establish incident management processes and runbooks

Monitoring and Observability
- Develop advanced monitoring solutions, including LLM-powered anomaly detection
- Design comprehensive observability strategies across the data ecosystem
- Implement proactive alerting systems to identify issues before they impact users
- Create dashboards and visualization tools for reliability metrics

Data Quality and Governance
- Establish data quality monitoring processes and tools
- Implement data lineage tracking mechanisms
- Develop automated validation protocols for data integrity
- Collaborate with data governance teams to ensure compliance with policies

Innovation and Improvement
- Research and implement AI/ML approaches to improve platform reliability
- Lead continuous improvement initiatives for data infrastructure
- Mentor team members on reliability engineering best practices
- Stay current with emerging technologies and reliability patterns in the data platform space

Qualifications
- 10+ years of experience in data platform engineering or related fields
- Proven expertise with enterprise data platforms (Snowflake, Databricks, etc.)
- Strong background in reliability engineering, SRE practices, or similar disciplines
- Experience implementing data quality monitoring frameworks
- Knowledge of AI/ML applications for system monitoring and reliability
- Excellent communication skills and ability to translate technical concepts to diverse stakeholders
Posted 2 weeks ago
3.0 - 6.0 years
20 - 25 Lacs
Hyderabad
Work from Office
Blend is hiring a Senior Data Scientist (Generative AI) to spearhead the development of advanced AI-powered classification and matching systems on Databricks. You will contribute to flagship programs like the Diageo AI POC by building RAG pipelines, deploying agentic AI workflows, and scaling LLM-based solutions for high-precision entity matching and MDM modernization.

Key Responsibilities
- Design and implement end-to-end AI pipelines for product classification, fuzzy matching, and deduplication using LLMs, RAG, and Databricks-native workflows
- Develop scalable, reproducible AI solutions within Databricks notebooks and job clusters, leveraging Delta Lake, MLflow, and Unity Catalog
- Engineer Retrieval-Augmented Generation (RAG) workflows using vector search and integrate them with Python-based matching logic
- Build agent-based automation pipelines (rule-driven + GenAI agents) for anomaly detection, compliance validation, and harmonization logic
- Implement explainability, audit trails, and governance-first AI workflows aligned with enterprise-grade MDM needs
- Collaborate with data engineers, BI teams, and product owners to integrate GenAI outputs into downstream systems
- Contribute to modular system design and documentation for long-term scalability and maintainability

Qualifications
- Bachelor's/Master's in Computer Science, Artificial Intelligence, or a related field
- 5+ years of overall Data Science experience with 2+ years in Generative AI / LLM-based applications
- Deep experience with the Databricks ecosystem: Delta Lake, MLflow, DBFS, Databricks Jobs & Workflows
- Strong Python and PySpark skills with the ability to build scalable data pipelines and AI workflows in Databricks
- Experience with LLMs (e.g., OpenAI, LLaMA, Mistral) and frameworks like LangChain or LlamaIndex
- Working knowledge of vector databases (e.g., FAISS, Chroma) and prompt engineering for classification/retrieval
- Exposure to MDM platforms (e.g., Stibo STEP) and familiarity with data harmonization challenges
- Experience with explainability frameworks (e.g., SHAP, LIME) and AI audit tooling

Preferred Skills
- Knowledge of agentic AI architectures and multi-agent orchestration
- Familiarity with Azure Data Hub and enterprise data ingestion frameworks
- Understanding of data governance, lineage, and regulatory compliance in AI systems
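The vector-search step of the RAG-based matching workflow described above can be sketched in a few lines: embed product names, then retrieve the closest catalogue entry by cosine similarity. This is an illustrative toy only; the `embed` function below (character-trigram counts) stands in for a real embedding model, and the product names are invented, not from any Databricks, LangChain, or Diageo system.

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    # Toy stand-in for a real embedding model: character-trigram counts.
    t = text.lower()
    return Counter(t[i:i + 3] for i in range(len(t) - 2))

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[k] * b[k] for k in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, catalogue: list[str], k: int = 1) -> list[str]:
    # Retrieval step of a RAG pipeline: rank catalogue entries by similarity.
    q = embed(query)
    ranked = sorted(catalogue, key=lambda c: cosine(q, embed(c)), reverse=True)
    return ranked[:k]

catalogue = ["Johnnie Walker Black Label 750ml", "Smirnoff Red Vodka 1L", "Guinness Draught 440ml"]
print(retrieve("JW Black Lbl 750 ml", catalogue))  # -> ['Johnnie Walker Black Label 750ml']
```

In a production pipeline the retrieved candidates would then be passed, with the query, to an LLM prompt for final match adjudication; here the similarity ranking alone illustrates the fuzzy-matching idea.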
Posted 2 weeks ago
3.0 - 5.0 years
6 - 10 Lacs
Mumbai
Work from Office
Position: Data Lifecycle Management (DLM) Specialist | Mumbai | WFO
Location: Goregaon, Mumbai (apply if you are from the Western line)
Shift Timing: 9 AM - 6 PM
Notice Period: Immediate to 30 days
Experience: 3 to 5 years
Work Mode: Work from Office (WFO)
Interested candidates can apply to saikeertana.r@twsol.com

Role Overview:
Seeking a highly motivated and client-centric DLM Specialist with 3-5 years of experience in data management, financial services, or other regulated industries. This role focuses on reviewing applications and ensuring data retention, disposition, and archiving compliance while aligning with privacy regulations and internal policy.

Key Responsibilities:
- Assess data retention, archiving, and disposition requirements across all business divisions
- Conduct regular reviews and stakeholder meetings with business and technology teams
- Manage data risk identification and mitigation plans related to retention, location, and transfer
- Document concise data management requirements and ensure implementation tracking
- Support in defining operational and compliance controls
- Compile analysis reports and drive recommendation implementation
- Engage system owners in problem-solving and decision-making
- Represent DLM in cross-functional meetings to communicate policy standards
- Prepare progress reports and contribute to process improvements

Required Qualifications:
- Bachelor's degree
- 3 to 5 years' experience in information/data management, data storage, or financial services operations
- Strong business analysis skills
- Excellent verbal and written communication skills in English
- High attention to detail with the ability to document complex information clearly
- Demonstrated client servicing ability and stakeholder management
- Experience in developing business and functional requirements for tech systems

Nice to Have:
- Degree in Information Systems, Business Administration, Archiving, or Law
- Understanding of personal data protection and privacy regulations
- Familiarity with database and cloud technologies, and AI trends
- Reporting experience with Power BI / Tableau
- Experience working with high-volume datasets
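The retention-and-disposition assessment at the heart of this role can be sketched as a simple decision rule: given a record's type and age, decide whether to retain it in the active system, archive it, or dispose of it. The policy table and retention periods below are purely illustrative assumptions, not any regulator's actual requirements.

```python
from datetime import date

# Hypothetical retention policy per record type:
# (years active, total years before disposal). Periods are illustrative only.
POLICY = {"trade_confirmation": (2, 7), "marketing_email": (1, 2)}

def disposition(record_type: str, created: date, today: date) -> str:
    active_years, total_years = POLICY[record_type]
    age_years = (today - created).days / 365.25
    if age_years < active_years:
        return "retain"     # still in its active retention window
    if age_years < total_years:
        return "archive"    # past active use, but must be kept
    return "dispose"        # past total retention period

today = date(2025, 1, 1)
print(disposition("trade_confirmation", date(2024, 6, 1), today))  # -> retain
print(disposition("trade_confirmation", date(2020, 1, 1), today))  # -> archive
print(disposition("marketing_email", date(2022, 1, 1), today))     # -> dispose
```

In practice the specialist's job is defining and validating the policy table itself with legal and business stakeholders; the mechanical check above is what downstream systems then enforce.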
Posted 2 weeks ago
5.0 - 10.0 years
8 - 9 Lacs
Pune
Work from Office
Req ID: 332236

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now.

We are currently seeking a Business Consulting Technical Analyst with ETL and GCP experience using PySpark to join our team in Pune, Maharashtra (IN-MH), India (IN).

Key Responsibilities:
- Data Pipeline Development: Designing, implementing, and optimizing data pipelines on GCP using PySpark for efficient and scalable data processing
- ETL Workflow Development: Building and maintaining ETL workflows for extracting, transforming, and loading data into various GCP services
- GCP Service Utilization: Leveraging GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc for data storage, processing, and analysis
- Data Transformation: Utilizing PySpark for data manipulation, cleansing, enrichment, and validation
- Performance Optimization: Ensuring the performance and scalability of data processing jobs on GCP
- Collaboration: Working with data scientists, analysts, and other stakeholders to understand data requirements and translate them into technical solutions
- Data Quality and Governance: Implementing and maintaining data quality standards, security measures, and compliance with data governance policies on GCP
- Troubleshooting and Support: Diagnosing and resolving issues related to data pipelines and infrastructure
- Staying Updated: Keeping abreast of the latest GCP services, PySpark features, and best practices in data engineering

Required Skills:
- GCP Expertise: Strong understanding of GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc
- PySpark Proficiency: Demonstrated experience in using PySpark for data processing, transformation, and analysis
- Python Programming: Solid Python programming skills for data manipulation and scripting
- Data Modeling and ETL: Experience with data modeling, ETL processes, and data warehousing concepts
- SQL: Proficiency in SQL for querying and manipulating data in relational databases
- Big Data Concepts: Understanding of big data principles and distributed computing concepts
- Communication and Collaboration: Ability to effectively communicate technical solutions and collaborate with cross-functional teams
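The extract-transform-load flow this role describes can be sketched compactly. The version below uses plain Python dicts so it runs anywhere; in the actual role each stage would be a PySpark DataFrame operation on Dataproc or Dataflow, reading from Cloud Storage and writing to BigQuery. All field names and the quality rule are invented for illustration.

```python
def extract(rows):
    # Stand-in for spark.read on a Cloud Storage export.
    return list(rows)

def transform(rows):
    # Cleanse and validate: drop rows missing a key, normalise country codes.
    out = []
    for r in rows:
        if not r.get("id"):
            continue  # data-quality rule: reject records with no id
        out.append(dict(r, country=r.get("country", "").strip().upper()))
    return out

def load(rows, sink):
    # Stand-in for df.write to a BigQuery table: append validated rows.
    sink.extend(rows)
    return len(rows)

sink = []
raw = [{"id": 1, "country": " in "}, {"id": None, "country": "US"}, {"id": 2, "country": "de"}]
loaded = load(transform(extract(raw)), sink)
print(loaded, [r["country"] for r in sink])  # -> 2 ['IN', 'DE']
```

The same three-stage shape scales up directly: `transform` becomes a chain of `withColumn`/`filter` calls and `load` a connector write, while the quality rules move into a reusable validation layer.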
Posted 2 weeks ago
3.0 - 6.0 years
5 - 8 Lacs
Mumbai
Work from Office
Key Responsibilities:
- Design, develop, and maintain robust, scalable, and efficient data pipelines to collect, process, and store structured and unstructured data
- Build and optimize data warehouses, data lakes, and ETL/ELT workflows
- Integrate data from multiple sources including databases, APIs, and streaming platforms
- Collaborate with data scientists and analysts to understand data requirements and deliver high-quality datasets
- Ensure data quality, integrity, and security throughout the data lifecycle
- Monitor and troubleshoot data pipeline performance and failures
- Implement data governance and compliance policies
- Automate data workflows and implement data orchestration tools (e.g., Apache Airflow)
- Optimize storage and query performance in cloud and on-premises environments
- Keep up to date with emerging data engineering tools, techniques, and best practices
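The orchestration duty above (e.g., with Apache Airflow) comes down to running pipeline tasks in dependency order. That core idea can be sketched with the standard library's topological sorter; the task names are invented for illustration, and a real deployment would declare them as Airflow operators inside a DAG rather than plain strings.

```python
from graphlib import TopologicalSorter

# Hypothetical pipeline: each task maps to the set of tasks it depends on,
# mirroring how an Airflow DAG wires operators together.
deps = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_join": {"extract_orders", "extract_customers"},
    "quality_checks": {"transform_join"},
    "load_warehouse": {"quality_checks"},
}

# static_order() yields every task only after all of its dependencies.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

An orchestrator adds scheduling, retries, and alerting on top of this ordering, which is why the posting pairs orchestration with the monitoring and troubleshooting duties.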
Posted 2 weeks ago
13.0 - 16.0 years
32 - 40 Lacs
Bengaluru
Work from Office
Key Responsibilities
- Facilitate the integration of diverse data types and sources to provide a comprehensive view of patient health and treatment outcomes
- Provide coaching and peer review to ensure that the team's work reflects the industry's best practices for data curation activities, including data privacy and anonymization standards
- Ensure all datasets meet analysis-ready and privacy requirements by performing necessary data curation activities (e.g. pre-process, contextualize and/or anonymize)
- Ensure that datasets are processed to meet conditions mentioned in the approved data re-use request (e.g., remove subjects from countries that do not allow re-use)
- Write clean, readable code
- Ensure that deliverables are appropriately quality controlled, documented, and, when required, can be handed over to the R&D Tech team for production pipeline implementation
- Transform raw healthcare data into products that can be used to catalyze the work of the wider RWDMA and Biostatistics teams and be leveraged by our diverse group of stakeholders to generate insights
- Ensure data quality, integrity, and security across various data sources
- Support data-driven decision-making processes that enhance patient outcomes and operational efficiencies

Education Requirements
- Advanced degree (Master's or Ph.D.) in Life Sciences, Epidemiology, Biostatistics, Public Health, Computer Science, Mathematics, Statistics or a related field, with applicable experience

Job Related Experience
- Experience in data engineering and curation, with the majority of experience on real-world data in the healthcare or pharmaceutical industry
- Proven ability to handle and process large datasets efficiently, ensuring data privacy
- Proficiency in handling structured, semi-structured, and unstructured data while ensuring data privacy
- Understanding of data governance principles and practices with a focus on data privacy
- Innovative mindset and willingness to challenge the status quo; solution-oriented mindset
- Fluent in written and spoken English, able to articulate complex concepts to diverse audiences
- Experience of working in a global matrix environment and managing stakeholders effectively
- Experience in complex batch processing, Azure Data Factory, Databricks, Airflow, Delta Lake, PySpark, Pandas and other Python dataframe libraries, including how to apply them to achieve industry standards and data privacy
- Proven ability to collaborate with cross-functional teams
- Strong communication skills to present curated data
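Two of the curation steps named above, anonymizing subjects and enforcing an approved re-use request's country restrictions, can be sketched together. The blocked-country list, salt, and record fields are all hypothetical; a real pipeline would take the country list from the approved request and keep the salt as a managed secret.

```python
import hashlib

BLOCKED = {"DE", "FR"}      # hypothetical: countries excluded by the re-use approval
SALT = "example-salt"       # illustrative only; a real salt is a managed secret

def pseudonymise(subject_id: str) -> str:
    # One-way hash so curated data cannot be joined back to identities.
    return hashlib.sha256((SALT + subject_id).encode()).hexdigest()[:12]

def curate(records):
    out = []
    for r in records:
        if r["country"] in BLOCKED:
            continue  # drop subjects from countries that do not allow re-use
        # Keep only analysis fields; replace the identifier with a pseudonym.
        out.append({"subject": pseudonymise(r["subject_id"]), "outcome": r["outcome"]})
    return out

raw = [
    {"subject_id": "S001", "country": "IN", "outcome": "responder"},
    {"subject_id": "S002", "country": "DE", "outcome": "non-responder"},
]
curated = curate(raw)
print(len(curated), curated[0]["outcome"])  # -> 1 responder
```

Salted hashing is pseudonymization rather than full anonymization; the k-anonymity and quasi-identifier checks the posting's privacy standards imply would sit on top of a filter like this.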
Posted 2 weeks ago
6.0 - 11.0 years
35 - 40 Lacs
Bengaluru
Work from Office
Build Your Career at Informatica

We seek innovative thinkers who believe in the power of data to drive meaningful change. At Informatica, we welcome adventurous, work-from-anywhere minds eager to tackle the world's most complex challenges. Our employees are empowered to push their bold ideas forward, and we are united by a shared passion for using data to do the extraordinary for each other and the world.

Senior Solution Architect - Presales (Remote)

We're looking for a senior solution architect with experience in presales, data integration, and MDM to join our team remotely. You will report to the Director, Technical Sales.

Technology You'll Use
Presales, Data Integration and MDM

Your Role Responsibilities? Here's What You'll Do
- Basic knowledge of the top 3 cloud ecosystems and top 2 data-related technologies
- Basic knowledge of cloud computing security aspects
- Basic certification on at least 1 cloud ecosystem and 1 data-related cloud technology at the level defined by the business area of focus
- Skills on at least one INFA-related software platform/technology, storytelling, and experience establishing communication and engagement with prospects specific to use cases
- Ability to engage and create relationships with influencers, coaches, decision makers, and partners
- Basic technical knowledge of hybrid deployment of software solutions, Data Warehousing, Database, and/or Business Intelligence software concepts and products

What We'd Like to See
- Manage customer engagements without support
- Responsible for sharing best practices, content, and tips and tricks within the primary area of responsibility
- Stay current on certification of services required for the area of responsibility
- Perform all activities leading up to the delivery of a customer demo with some assistance, including discovery, technical qualification/fit, customer presentations, standard demos, and related customer-facing communication
- Assist on RFP responses and/or POCs
- Partner with the CSM team on nurture activities including technical advisory, workshops, etc.
- Provide customer feedback on product gaps using Vivun
- Ability to support demos at marketing events without support

Role Essentials
- 6+ years of relevant experience in data integration, master data management, or data governance
- 8+ years of presales/technical sales, industry, or consulting experience
- BA/BS or equivalent educational background is preferred

Perks & Benefits
- Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and 401k plan or international pension/retirement plans)
- Flexible time-off policy and hybrid working practices
- Equity opportunities and an employee stock purchase program (ESPP)
- Comprehensive Mental Health and Employee Assistance Program (EAP) benefit

Our DATA values are our north star and we are passionate about building and delivering solutions that accelerate data innovations. At Informatica, our employees are our greatest competitive advantage. So, if your experience aligns but doesn't exactly match every qualification, apply anyway. You may be exactly who we need to fuel our future with innovative ideas and a thriving culture.

Informatica (NYSE: INFA), a leader in enterprise AI-powered cloud data management, brings data and AI to life by empowering businesses to realize the transformative power of their most critical assets. We pioneered the Informatica Intelligent Data Management Cloud that manages data across any multi-cloud, hybrid system, democratizing data to advance business strategies. Customers in approximately 100 countries and more than 80 of the Fortune 100 rely on Informatica. www.informatica.com. Connect with LinkedIn, X, and Facebook. Informatica. Where data and AI come to life.
Posted 2 weeks ago
10.0 - 15.0 years
20 - 27 Lacs
Bengaluru
Work from Office
Build Your Career at Informatica

We're looking for a diverse group of collaborators who believe data has the power to improve society. Adventurous minds who value solving some of the world's most challenging problems. Here, employees are encouraged to push their boldest ideas forward, united by a passion to create a world where data improves the quality of life for people and businesses everywhere.

Principal Advisory Services Consultant

Informatica is looking for a Principal Consultant, Advisory Services, with practitioner experience leading large-scale data management and analytics projects. This is a remote position reporting to a Senior Director, Data Strategy & Governance. You have experience implementing data governance programs and defining vision and data strategy with peers and senior leadership to gain support for the strategy and the overall value of Informatica Products & Solutions, and will join our Professional Services team. You will provide pre- and post-sale business-oriented strategic consulting services, typically onsite at the customer's location. Responsibilities include providing clients with overall data strategy development and alignment, implementation guidance, program design, business use case identification, program road mapping, and business outcome definition.

Essential Duties & Responsibilities
- Analyzes complex customer environments comprised of Informatica and non-Informatica products
- Organizes large-scale programs and coordinates/leads multiple delivery teams
- Applies innovative design solutions by keeping current on new technology trends and changing industry standards and patterns
- Travel to customer sites typically exceeds 50%, but may exceed 75% for extended periods, as applicable to the customer engagement

Knowledge & Skills
- Holds expert-level experience and uses professional concepts and company objectives to resolve complex issues in creative and effective ways
- Works on complex issues where analysis of situations or data requires an in-depth evaluation of variable factors
- Exercises judgment in methods, techniques, and evaluation criteria for obtaining results
- Extensively leverages business acumen and subject matter expertise to provide expert-level advice and guidance to clients
- Thorough understanding of Informatica business priorities, strategy and direction
- Works across the organization and maintains/builds strong working relationships based on experiences/past interactions
- Significant experience leading the delivery of complex enterprise data management projects/initiatives
- Competent in navigating, using, and demonstrating functionality in Informatica's business-facing applications
- Published industry white papers, best practices, field guides and external communications
- Strong written communication skills with competency in developing professional-looking presentation materials and customer deliverables
- Developed ability in communicating to executive-level audiences in both interpersonal and presentation formats

Education/Experience
- BA/BS or equivalent educational background is preferred
- Minimum 10+ years of relevant professional experience

Perks & Benefits
- Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and 401k plan or international pension/retirement plans)
- Flexible time-off policy and hybrid working practices
- Equity opportunities and an employee stock purchase program (ESPP)
- Comprehensive Mental Health and Employee Assistance Program (EAP) benefit

Our DATA values are our north star and we are passionate about building and delivering solutions that accelerate data innovations. At Informatica, our employees are our greatest competitive advantage. So, if your experience aligns but doesn't exactly match every qualification, apply anyway. You may be exactly who we need to fuel our future with innovative ideas and a thriving culture.

Informatica (NYSE: INFA), an Enterprise Cloud Data Management leader, brings data and AI to life by empowering businesses to realize the transformative power of their most critical assets. We pioneered the Informatica Intelligent Data Management Cloud that manages data across any multi-cloud, hybrid system, democratizing data to advance business strategies. Customers in over 100 countries and 85 of the Fortune 100 rely on Informatica. www.informatica.com. Connect with LinkedIn, Twitter, and Facebook. Informatica. Where data and AI come to life.
Posted 2 weeks ago
6.0 - 7.0 years
13 - 15 Lacs
Bengaluru
Work from Office
Job Title: Software Developer
Location: TechM Blr ITC06 07
Years of Experience: 2-5 Years

Job Summary:
We are seeking a skilled Software Developer with a strong background in SAP Archiving to join our dynamic team. The ideal candidate will have 2-5 years of experience in software development, with a focus on SAP solutions. You will be responsible for designing, developing, and implementing software applications that meet our business needs while ensuring data integrity and compliance through effective archiving strategies.

Responsibilities:
- Design, develop, and maintain software applications in accordance with business requirements
- Implement and manage SAP Archiving solutions to optimize data storage and retrieval processes
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications
- Conduct code reviews and ensure adherence to best practices in software development
- Perform testing and debugging of applications to ensure high-quality deliverables
- Provide technical support and troubleshooting for existing applications
- Stay updated with the latest industry trends and technologies related to SAP and software development

Mandatory Skills:
- Strong knowledge and experience in SAP Archiving
Posted 2 weeks ago
5.0 - 10.0 years
10 - 11 Lacs
Pune
Work from Office
Job Overview: We are seeking a skilled Data Solution Architect to design solutions and lead implementations on GCP. The ideal candidate will possess extensive experience in data architecture, solution design, and data management practices.
Responsibilities: Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms. Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions. Develop high-level and detailed data architecture and design documentation. Implement data management and data governance strategies, ensuring compliance with industry standards. Architect both batch and real-time data solutions, leveraging cloud-native services and technologies. Design and manage data pipeline processes for historic data migration and data integration. Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables. Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies. Demonstrate excellent verbal and written communication skills to communicate complex ideas and concepts effectively. Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.
Requirements: Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments. Experience with common GCP services such as BigQuery, Dataflow, GCS, Service Accounts, and Cloud Functions. Extremely strong in BigQuery design and development. Extensive knowledge and implementation experience in data management, governance, and security frameworks. Proven experience in creating high-level and detailed data architecture and design documentation. Strong aptitude for business analysis to understand domain data requirements. Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred. Hands-on experience with architecting end-to-end data solutions for both batch and real-time designs. Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions. Familiarity with Data Fabric and Data Mesh architecture is a plus. Excellent verbal and written communication skills.
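The "data pipeline processes" responsibility above typically includes a validation stage that splits a batch into clean and rejected rows. A minimal sketch in plain Python, where the field names (`record_id`, `amount`, `load_date`) and rules are hypothetical illustrations, not a reference implementation:

```python
# Hypothetical batch-validation stage: route records that fail schema
# checks to a reject queue, pass the rest downstream.
REQUIRED_FIELDS = {"record_id", "amount", "load_date"}  # illustrative schema

def validate_record(record: dict) -> list:
    """Return a list of data quality issues found in one record."""
    issues = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        issues.append(f"missing fields: {sorted(missing)}")
    if "amount" in record and not isinstance(record["amount"], (int, float)):
        issues.append("amount is not numeric")
    return issues

def run_batch(records: list) -> tuple:
    """Split a batch into clean rows and rejected rows with reasons."""
    clean, rejected = [], []
    for rec in records:
        problems = validate_record(rec)
        if problems:
            rejected.append((rec, problems))
        else:
            clean.append(rec)
    return clean, rejected

batch = [
    {"record_id": 1, "amount": 100.0, "load_date": "2024-01-01"},
    {"record_id": 2, "amount": "bad", "load_date": "2024-01-01"},
    {"record_id": 3},
]
clean, rejected = run_batch(batch)
print(len(clean), len(rejected))  # → 1 2
```

In a real GCP deployment this logic would live inside a Dataflow transform or a BigQuery load check; the sketch only shows the split-and-reject pattern itself.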
Posted 2 weeks ago
8.0 - 10.0 years
15 - 19 Lacs
Mumbai
Work from Office
Job Description: Are You Ready to Make It Happen at Mondelēz International? Join our Mission to Lead the Future of Snacking. Make It Uniquely Yours. The Senior Manager, Finance Data Governance is a critical role responsible for leading and executing the finance master data governance strategy. This role will drive the implementation of data governance policies, standards, and processes to ensure data quality, integrity, and security. The Senior Manager will collaborate with business stakeholders, IT teams, and data owners to establish a data-driven culture and enable effective use of data for business decision-making. How you will contribute: Strategy and Leadership: Contribute to the development and execution of the overall data governance strategy, aligning with business objectives and regulatory requirements. Promote data governance awareness and adoption throughout the organization. Policy and Standards: Develop and maintain data governance policies, standards, and procedures, ensuring alignment with industry best practices and regulatory guidelines. Define data quality metrics and monitor data quality performance. Establish data ownership and stewardship responsibilities. Implementation and Execution: Lead the implementation of data governance tools and technologies. Work with business units to identify and prioritize data governance initiatives. Ensure data lineage is documented and maintained. Collaboration and Communication: Partner with business stakeholders to understand data needs and requirements. Collaborate with IT teams to ensure data governance requirements are integrated into system development and maintenance processes. Communicate data governance policies and procedures to the organization. Facilitate data governance council meetings and working groups. Data Quality Management: Establish data quality rules and monitor data quality metrics. Identify and resolve data quality issues. Implement data quality improvement initiatives.
Compliance and Security: Ensure data governance policies and procedures comply with relevant regulations, such as GDPR, CCPA, and other data privacy laws. Implement data security measures to protect sensitive data. Monitor and audit data governance activities to ensure compliance. What you will bring: A desire to drive your future and accelerate your career, and the following experience and knowledge. Qualifications: Education: Bachelor's degree in a relevant field (e.g., Finance, Business Administration, Data & Analytics); Master's degree preferred. Experience: Minimum of 8-10 years of experience in data governance, data management, or related fields. Proven experience in leading and implementing data governance initiatives. Strong understanding of data governance principles, practices, and technologies. Experience with data quality management tools and techniques. Skills: Excellent leadership and communication skills. Strong analytical and problem-solving skills. Ability to work effectively with cross-functional teams. Proficiency in data governance tools and technologies (e.g., Collibra, Informatica, Alation). Knowledge of data warehousing and business intelligence concepts. Strong project management skills. Key Competencies: Strategic Thinking: Ability to develop and execute a data governance strategy aligned with business objectives. Communication: Ability to communicate complex data governance concepts to both technical and non-technical audiences. Collaboration: Ability to work effectively with cross-functional teams. Problem Solving: Ability to identify and resolve data governance issues. Technical Proficiency: Strong understanding of data governance tools and technologies. Results Orientation: Ability to drive data governance initiatives to achieve measurable results. More about this role: Education / Certifications: Bachelor's degree in a relevant field (e.g., Finance, Business Administration, Data & Analytics); Master's degree preferred.
Job specific requirements: Minimum of 8-10 years of experience in data governance, data management, or related fields. Proven experience in leading and implementing data governance initiatives. Strong understanding of data governance principles, practices, and technologies. Experience with data quality management tools and techniques. Travel requirements: Occasional. Work schedule: Flexible. Relocation Support Available? No. Business Unit Summary: We value our talented employees, and whenever possible strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think the open position you see is right for you, we encourage you to apply! Our people make all the difference in our success. IF YOU REQUIRE SUPPORT TO COMPLETE YOUR APPLICATION OR DURING THE INTERVIEW PROCESS, PLEASE CONTACT THE RECRUITER. Job Type: Regular. Project and Program Management. Business Capability.
Posted 2 weeks ago
7.0 - 10.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Experience: 7-10 years Job Description: We are looking for an experienced SAP Master Data Management (MDM) Consultant with expertise in ECC, S/4HANA migration, rollouts, and data management. The ideal candidate will lead and execute MDM strategies, manage data migration, and drive continuous improvements. Key Responsibilities: Own and manage Master Data Management (MDM) activities for SAP projects. De-duplication of master records. Lead data migration and cutovers in SAP S/4HANA projects (Greenfield, Migration, or Rollouts). Establish and implement MDM best practices and data management capabilities. Define data management principles, policies, and lifecycle strategies. Monitor data quality with consistent metrics and reporting. Work with MDM stakeholders to drive data governance and compliance. Track and manage MDM objects, ensuring timely delivery. Conduct training sessions for teams on ECC & S/4HANA MDM. Participate in daily stand-ups, issue tracking, and dashboard updates. Identify risks and process improvements for MDM. Required Skills & Qualifications: Minimum 7-10 years of experience in SAP MDM. Strong knowledge of ECC, SAP S/4HANA, data migration, and rollouts. Experience in data governance, lifecycle management, and compliance. Familiarity with JIRA Kanban boards, ticketing tools, and dashboards. Strong problem-solving and communication skills. Ability to work with other teams, especially ABAP, Middleware, and Functional consultants. Knowledge of Excel is a must. ABAP knowledge is preferred. SAP training or certifications are an asset. Team player with strong communication skills and a collaborative spirit. Able to coach, support, train, and develop junior consultants. Customer-oriented, result-driven, and focused on delivering quality.
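The "de-duplication of masters" task above usually starts by clustering master records on a normalized match key. A hedged sketch in plain Python, where the record fields (`name`, `tax_id`) and normalization rules are hypothetical examples rather than SAP MDM behaviour:

```python
# Hypothetical vendor-master de-duplication: group records whose
# normalized name and tax id collide, flagging clusters of size > 1.
def match_key(record: dict) -> tuple:
    """Build a normalized key from name and tax id for duplicate detection."""
    name = " ".join(record.get("name", "").upper().split())  # collapse case/spaces
    tax_id = record.get("tax_id", "").replace("-", "").strip()
    return (name, tax_id)

def find_duplicates(records: list) -> dict:
    """Map each match key to its records; keep only keys with duplicates."""
    groups = {}
    for rec in records:
        groups.setdefault(match_key(rec), []).append(rec)
    return {k: v for k, v in groups.items() if len(v) > 1}

masters = [
    {"id": "V001", "name": "Acme  Corp", "tax_id": "12-345"},
    {"id": "V002", "name": "acme corp", "tax_id": "12345"},
    {"id": "V003", "name": "Globex", "tax_id": "99999"},
]
dupes = find_duplicates(masters)
print(len(dupes))  # → 1  (V001/V002 form one duplicate cluster)
```

Real SAP de-duplication adds fuzzy matching and survivorship rules; the sketch shows only the exact-key clustering step.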
Posted 2 weeks ago
2.0 - 7.0 years
7 - 12 Lacs
Chennai
Work from Office
Role Purpose: The purpose of this role is to execute the process and drive the performance of the team on the key metrics of the process. Job Details: Country/Region: India. Employment Type: Onsite. Work Type: Contract. State: Tamil Nadu. City: Chennai. Requirements: Onsite at Abu Dhabi. Contract for 2 years. Shift: Abu Dhabi general shift timings. Candidate should be able to travel onsite ASAP, or within a maximum of 40 days, to start working in this role. Job Description: Minimum overall work experience: 10 years. Financial Systems Support (L1/L2): Provide financial system support to all end users across ADD functions relating to financial systems. Coordinate with the Finance HQ IT support team to resolve reported issues through RITM/Incident/Idea tickets. Endorse and approve financial roles access & authorization tickets. Month End Closing (MEC) Support: Provide support during transactional data processing, preparing uploads, data validation & reconciliation; identify incorrect master data assignments and suggest corrective actions. Closely coordinate with financial users on data reconciliation and validation during month-end closing activities. Closely coordinate with HUB and BI teams to update SAP data to produce correct segmented finance reports aligned with SAP data. Validate BI financial reports & dashboards. Master data governance and policy & procedure compliance.
Posted 2 weeks ago
4.0 - 6.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Posting Title: Senior Databricks Engineer / Tech Lead
City: Bangalore
Industry: Technology
About the Role: As part of our Innovation Team, we are seeking a Certified Senior Databricks Engineer / Tech Lead with 7-8 years of hands-on experience in building scalable data platforms. This role will focus on designing, building, and operationalizing data solutions on the Databricks platform to accelerate advanced analytics and AI use cases.
Key Responsibilities: Architect, develop, productionize, and maintain end-to-end solutions in Databricks. Implement and optimize ETL/ELT processes for structured and semi-structured data. Leverage Delta Lake for ACID transactions, data versioning, and time-travel features. Drive adoption of the Lakehouse architecture to unify data warehousing and AI/ML workloads. Implement CI/CD pipelines using Databricks Repos, Asset Bundles, and integration with DevOps tools. Configure and enforce Unity Catalog for secure, governed access to data assets. Design and implement data quality and validation frameworks to ensure trusted data. Lead performance tuning and optimization efforts for Spark jobs and queries. Integrate with external systems such as Kafka, Event Hub, and REST APIs for real-time and batch processing. Collaborate with data scientists and business stakeholders to build feature-rich datasets and reusable assets. Troubleshoot and debug complex data workflows in development and production environments. Guide junior engineers and contribute to best practices in data engineering and platform usage. Ensure platform security, access controls, and compliance with enterprise data governance standards.
Required Skills: Expertise in Apache Spark and the Databricks platform. Experience with Databricks Lakehouse architecture and Delta Lake concepts. Proficient in PySpark, SQL, and Delta Lake. Strong knowledge of data engineering concepts. Experience with data ingestion and ETL/ELT pipelines. Familiarity with Unity Catalog and data governance. Hands-on with Databricks Notebooks and Jobs. CI/CD automation with Databricks Repos, DevOps, and Asset Bundles. Databricks Asset Bundle implementation knowledge. Strong understanding of performance tuning in Spark. Data quality and validation framework implementation. Experience in handling structured and semi-structured data. Proficient in debugging and troubleshooting. Collaboration with data scientists and analysts. Good understanding of security and access control. Experience with Mosaic AI or Databricks ML capabilities. Exposure to streaming pipelines using Structured Streaming. Familiarity with data observability and lineage tools.
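The Delta Lake upsert pattern referenced above (MERGE: "when matched, update; when not matched, insert") runs on Spark in practice, but its key-matching semantics can be illustrated in plain Python for readers without a cluster. The table shape and key name are hypothetical; Delta additionally records each merge as a new table version for time travel:

```python
# Conceptual illustration of MERGE-style upsert semantics, keyed on "id".
# This is NOT the Delta Lake API, just the matching logic it applies.
def merge_upsert(target: dict, updates: list, key: str = "id") -> dict:
    """Update rows whose key exists in target; insert rows whose key does not."""
    merged = dict(target)  # Delta would also write a new table version here
    for row in updates:
        merged[row[key]] = row  # matched -> update, unmatched -> insert
    return merged

target = {1: {"id": 1, "status": "open"}, 2: {"id": 2, "status": "open"}}
updates = [{"id": 2, "status": "closed"}, {"id": 3, "status": "open"}]
result = merge_upsert(target, updates)
print(sorted(result), result[2]["status"])  # → [1, 2, 3] closed
```

On Databricks the equivalent operation is a single `MERGE INTO` statement (or `DeltaTable.merge` in PySpark), which performs this matching atomically under Delta's ACID guarantees.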
Posted 2 weeks ago
3.0 - 8.0 years
15 - 30 Lacs
Pune
Hybrid
Dear Candidate, This is with reference to Senior Business Intelligence Analyst openings at Wolters Kluwer, Pune. Kindly share your resume at jyoti.salvi@wolterskluwer.com. Job Specifications: Skillset Requirement: Looking for Data Governance professionals with experience in Collibra. Experience in Microsoft Purview is highly preferred. Experience Range: Candidates with 2 to 8 years of relevant Data Governance experience. Primary Skills: Data Governance, Microsoft Purview, Collibra. Architecting, designing, and implementing data governance solutions using Microsoft Purview. Experience in data lifecycle management, including data retention, deletion, and archiving strategies using Microsoft Purview Data Lifecycle Management. Assist transitions to Microsoft Purview services, including setting up data lifecycle management and eDiscovery configurations. Maintain accurate documentation of configurations, processes, and procedures related to Microsoft Purview. Experience in implementation of data governance policies and procedures to ensure compliance with regulatory requirements and organizational standards. Ensure data quality and compliance by applying expertise in MDM and data governance principles, including data governance frameworks and practices, to ensure the relevancy, quality, security, and compliance of master data. Develop and implement data integration solutions for metadata, data lineage, and data quality.
Posted 2 weeks ago
9.0 - 14.0 years
11 - 16 Lacs
Bengaluru
Work from Office
Design and implement custom workflows using Collibra Workflow Designer (BPMN). Collaborate with data governance teams to translate business requirements into technical solutions. Develop and maintain Collibra integrations using APIs, Collibra Connect, and third-party tools. Configure and manage Collibra Data Intelligence Cloud components including domains, assets, and communities. Support metadata management, data lineage, and data catalog initiatives. Troubleshoot and resolve workflow issues and performance bottlenecks. Ensure compliance with data governance policies and standards. Document workflow logic, configurations, and integration processes. Required Skills & Qualifications: 5+ years of experience in data governance or metadata management. 2+ years of hands-on experience with the Collibra platform, especially workflow development. Proficiency in Java and Groovy for scripting and workflow customization. Experience with Collibra Connect, REST APIs, and integration with tools like Informatica, Snowflake, or Azure. Familiarity with BPMN 2.0 and workflow lifecycle management. Strong understanding of data governance frameworks (e.g., DAMA-DMBOK). Excellent problem-solving and communication skills. Mandatory skills: Data Governance. Desired skills: Collibra. Domain: Foods and Beverages.
Posted 2 weeks ago
12.0 - 15.0 years
45 - 50 Lacs
Mumbai
Work from Office
This is an exciting time at TransUnion CIBIL. With investments in our people, technology, and new business markets, we are redefining the role and purpose of a credit bureau. This role involves overseeing and managing priority sector lending and financial inclusion data acquisition initiatives, ensuring compliance with regulatory requirements while driving growth and impact. What you'll bring: Data Acquisition Strategy & Execution: Execute functional strategy to drive customer engagement on data acquisition across all associated member institutions. Identify, explore, and detail opportunities to solve critical data submission issues for clients. Understand business initiatives and their purpose to drive and channel discussions with diverse teams in distributed work environments. Stakeholder Management & Collaboration: Maintain key customer relationships and develop and implement data-related strategies with key decision makers. Provide regular inputs to the Product teams on data reporting, any changes in reporting, and best practices in the market for smooth as well as prompt responses. Collaborate with multiple business stakeholders (Sales, Operations, and Products) to identify priorities and metrics and track progress on identified data acquisition initiatives. Reporting & Insights Generation: Draw meaningful conclusions and recommendations based on data analysis results for effective member engagement. Take complete ownership of data directives to achieve assigned tasks, from planning and analysis through providing the business insights that enable rational decision-making. Team Leadership & Management: Build and lead a high-performing data acquisition team, including data analysts and data acquisition managers.
Set clear KPIs and performance benchmarks for data acquisition teams on data enhancement and reporting. Provide specialized training and capacity-building programs for data acquisition team members related to MFI data reporting best practices and compliance. Regulatory Compliance & Data Governance: Ensure complete, accurate, and timely reporting of data and comply with the relevant regulatory guidelines. Establish a governance framework for data ingestion, data validation, and standardization. Monitor adherence to regulatory standards and data reporting practices. Liaise with legal and compliance teams to stay updated on policy changes affecting data acquisition. Experience and Skills: Master's degree in Agriculture, Rural Business Administration, or a related field. Minimum 12+ years of relevant experience in managing priority sector lending or financial inclusion. Flexibility to travel as needed. Self-starter, with the ability to work independently, handle ambiguous situations, and exercise judgement in a variety of situations. Strong communication, organizational, verbal & written skills. High degree of responsibility and ownership; strong multitasking and coordination, tenaciously looking for ways to get results. This job is assigned as On-Site Essential and requires in-person work at an assigned TU office location as a condition of employment.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Define and implement data quality rules, scorecards, and issue management workflows. Profile datasets to identify and resolve data quality issues. Monitor and report on data quality metrics and KPIs. Support data stewardship activities and provide training to business users. Implement and configure the Collibra Data Intelligence Platform for metadata management, data cataloging, and governance workflows. Collaborate with business and IT stakeholders to define and enforce data governance policies and standards. Required Skills & Qualifications: 3+ years of experience in data governance and data quality. Strong understanding of data governance frameworks (e.g., DAMA-DMBOK). Experience with SQL. Familiarity with data privacy regulations (e.g., GDPR, CCPA). Excellent communication and stakeholder management skills. Mandatory skills: Data Quality. Desired skills: Collibra. Domain: Foods and Beverages.
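The "data quality rules, scorecards" responsibility above can be sketched as rules that are predicates over a record, with the scorecard reporting the pass rate per rule. The rule names and fields (`customer_id`, `country`) are hypothetical examples, not any specific platform's rule syntax:

```python
# Hypothetical data quality rules: each maps a record to pass/fail.
RULES = {
    "customer_id_present": lambda r: bool(r.get("customer_id")),
    "country_is_iso2": lambda r: len(r.get("country", "")) == 2,
}

def scorecard(records: list) -> dict:
    """Return the fraction of records passing each rule."""
    total = len(records) or 1  # avoid division by zero on an empty batch
    return {
        name: sum(rule(r) for r in records) / total
        for name, rule in RULES.items()
    }

sample = [
    {"customer_id": "C1", "country": "IN"},
    {"customer_id": "", "country": "IND"},   # fails both rules
]
print(scorecard(sample))
```

Tools like Collibra Data Quality express the same idea declaratively and attach failing records to issue-management workflows; the sketch shows only the rule-and-pass-rate core.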
Posted 2 weeks ago
10.0 - 15.0 years
12 - 17 Lacs
Bengaluru
Work from Office
Project description
We are looking for an experienced Finance Data Hub Platform Product Manager to own the strategic direction, development, and management of the core data platform that underpins our Finance Data Hub. This role is focused on ensuring the platform is scalable, reliable, secure, and optimized to support data ingestion, transformation, and access across the finance organisation. As the Platform Product Manager, you will work closely with engineering, architecture, governance, and infrastructure teams to define the technical roadmap, prioritize platform enhancements, and ensure seamless integration with data and UI product streams. Your focus will be on enabling data products and services by ensuring the platform's core capabilities meet evolving business needs.
Key Responsibilities
Platform Strategy & Vision: Define and own the roadmap for the Finance Data Hub platform, ensuring it aligns with business objectives and supports broader data product initiatives.
Technical Collaboration: Collaborate with architects, data engineers, and governance teams to define and prioritise platform capabilities, including scalability, security, resilience, and data lineage.
Integration Management: Ensure the platform seamlessly integrates with data streams and serves UI products, enabling efficient data ingestion, transformation, storage, and consumption.
Infrastructure Coordination: Work closely with infrastructure and DevOps teams to ensure platform performance, cost optimisation, and alignment with enterprise architecture standards.
Governance & Compliance: Partner with data governance and security teams to ensure the platform adheres to data management standards, privacy regulations, and security protocols.
Backlog Management: Own and prioritise the platform development backlog, balancing technical needs with business priorities, and ensuring timely delivery of enhancements.
Agile Leadership: Support and often lead agile ceremonies, write clear user stories focused on platform capabilities, and facilitate collaborative sessions with technical teams.
Stakeholder Communication: Provide clear updates on platform progress, challenges, and dependencies to stakeholders, ensuring alignment across product and engineering teams.
Continuous Improvement: Regularly assess platform performance, identify areas for optimization, and champion initiatives that enhance reliability, scalability, and efficiency.
Risk Management: Identify and mitigate risks related to platform stability, security, and data integrity.
Skills
Must have: Proven 10+ years of experience as a Product Manager focused on data platforms, infrastructure, or similar technical products. Strong understanding of data platforms and infrastructure, including data ingestion, processing, storage, and access within modern data ecosystems. Experience with cloud data platforms (e.g., Azure, AWS, GCP) and knowledge of data lake architectures. Understanding of data governance, security, and compliance best practices. Strong stakeholder management skills, particularly with technical teams (engineering, architecture, security). Experience managing product backlogs and roadmaps in an Agile environment. Ability to balance technical depth with business acumen to drive effective decision-making.
Nice to have: Experience with financial systems and data sources, such as HFM, Fusion, or other ERPs. Knowledge of data orchestration and integration tools (e.g., Apache Airflow, Azure Data Factory). Experience with transitioning platforms from legacy technologies (e.g., Teradata) to modern solutions. Familiarity with cost optimization strategies for cloud platforms.
Posted 2 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure they are understood. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Graduate with a minimum of 6+ years of related experience. Experience in modelling and business system design. Good hands-on experience with DataStage and cloud-based ETL services. Strong expertise in writing T-SQL code. Well-versed with data warehouse schemas and OLAP techniques. Preferred technical and professional experience: Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting throughout all levels of the organization. Ability to communicate complex business problems and technical solutions.
Posted 2 weeks ago
3.0 - 6.0 years
10 - 14 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models, as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing various enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education: Bachelor's Degree. Preferred education: Master's Degree. Required technical and professional expertise: Experience in the integration efforts between Alation and Manta, ensuring seamless data flow and compatibility. Collaborate with cross-functional teams to gather requirements and design solutions that leverage both Alation and Manta platforms effectively. Develop and maintain data governance processes and standards within Alation, leveraging Manta's data lineage capabilities.
Analyze data lineage and metadata to provide insights into data quality, compliance, and usage patterns. Preferred technical and professional experience: Lead the evaluation and implementation of new features and updates for both the Alation and Manta platforms, ensuring alignment with organizational goals and objectives. Drive continuous improvement initiatives to enhance the efficiency and effectiveness of data management processes, leveraging Alation.
Posted 2 weeks ago