3.0 - 8.0 years
15 - 30 Lacs
Pune
Hybrid
Dear Candidate,
This is with reference to Senior Business Intelligence Analyst openings at Wolters Kluwer, Pune. Kindly share your resume at jyoti.salvi@wolterskluwer.com.
Job Specifications:
Skillset Requirement: Looking for Data Governance professionals with experience in Collibra. Experience in Microsoft Purview is highly preferred.
Experience Range: Candidates with 2 to 8 years of relevant Data Governance experience.
Primary Skills: Data Governance, Microsoft Purview, Collibra
- Architecting, designing, and implementing data governance solutions using Microsoft Purview
- Experience in data lifecycle management, including data retention, deletion, and archiving strategies using Microsoft Purview Data Lifecycle Management
- Assist transitions to Microsoft Purview services, including setting up data lifecycle management and eDiscovery configurations
- Maintain accurate documentation of configurations, processes, and procedures related to Microsoft Purview
- Experience in implementing data governance policies and procedures to ensure compliance with regulatory requirements and organizational standards
- Ensure data quality and compliance by applying expertise in MDM and data governance principles, including data governance frameworks and practices, to ensure the relevancy, quality, security, and compliance of master data
- Develop and implement data integration solutions for metadata, data lineage, and data quality
Posted 2 weeks ago
9.0 - 14.0 years
11 - 16 Lacs
Bengaluru
Work from Office
Responsibilities:
- Design and implement custom workflows using Collibra Workflow Designer (BPMN).
- Collaborate with data governance teams to translate business requirements into technical solutions.
- Develop and maintain Collibra integrations using APIs, Collibra Connect, and third-party tools.
- Configure and manage Collibra Data Intelligence Cloud components including domains, assets, and communities.
- Support metadata management, data lineage, and data catalog initiatives.
- Troubleshoot and resolve workflow issues and performance bottlenecks.
- Ensure compliance with data governance policies and standards.
- Document workflow logic, configurations, and integration processes.
Required Skills & Qualifications:
- 5+ years of experience in data governance or metadata management.
- 2+ years of hands-on experience with the Collibra platform, especially workflow development.
- Proficiency in Java and Groovy for scripting and workflow customization.
- Experience with Collibra Connect, REST APIs, and integration with tools like Informatica, Snowflake, or Azure (a minimal API sketch follows below).
- Familiarity with BPMN 2.0 and workflow lifecycle management.
- Strong understanding of data governance frameworks (e.g., DAMA-DMBOK).
- Excellent problem-solving and communication skills.
Mandatory skills: Data Governance
Desired skills: Collibra
Domain: Foods and Beverages
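For context on the kind of API-based integration work this role describes, here is a minimal sketch of calling a Collibra REST endpoint from Python to list assets in a domain. The instance URL, endpoint path, and credentials are illustrative assumptions, not a definitive contract; consult your Collibra Data Intelligence Cloud API documentation for the exact interface.

```python
# Minimal sketch: query a Collibra REST API for assets in one domain.
# BASE_URL, AUTH, and the endpoint path are hypothetical placeholders.
import requests

BASE_URL = "https://your-instance.collibra.com/rest/2.0"  # assumed instance URL
AUTH = ("svc_governance", "app-password")                 # placeholder credentials

def list_assets(domain_id: str, limit: int = 50) -> list[dict]:
    """Fetch up to `limit` assets belonging to one Collibra domain."""
    resp = requests.get(
        f"{BASE_URL}/assets",
        params={"domainId": domain_id, "limit": limit},
        auth=AUTH,
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("results", [])

if __name__ == "__main__":
    for asset in list_assets(domain_id="00000000-0000-0000-0000-000000000000"):
        print(asset.get("name"), asset.get("id"))
```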
Posted 2 weeks ago
12.0 - 15.0 years
45 - 50 Lacs
Mumbai
Work from Office
This is an exciting time at TransUnion CIBIL. With investments in our people, technology, and new business markets, we are redefining the role and purpose of a credit bureau. This role involves overseeing and managing priority sector lending and financial inclusion data acquisition initiatives, ensuring compliance with regulatory requirements while driving growth and impact.
What you'll bring:
Data Acquisition Strategy & Execution:
- Execute the functional strategy to drive customer engagement on data acquisition across all associated member institutions.
- Identify, explore, and detail opportunities to solve critical data submission issues for clients.
- Understand business initiatives and their purpose to drive and channel discussions with diverse teams in distributed work environments.
Stakeholder Management & Collaboration:
- Maintain key customer relationships and develop and implement data-related strategies with key decision makers.
- Provide regular inputs to the Product teams on data reporting, changes in reporting, and best practices in the market to ensure smooth as well as prompt responses.
- Collaborate with multiple business stakeholders (Sales, Operations, and Products) to identify priorities and metrics, and track progress on identified data acquisition initiatives.
Reporting & Insights Generation:
- Draw meaningful conclusions and recommendations based on data analysis results for effective member engagement.
- Take complete ownership of data directives and assigned tasks, from planning and analysis through to providing the business insights that enable rational decision making.
Team Leadership & Management:
- Build and lead a high-performing data acquisition team, including data analysts and data acquisition managers.
- Set clear KPIs and performance benchmarks for data acquisition teams on data enhancement and reporting.
- Provide specialized training and capacity-building programs for data acquisition team members on MFI data reporting best practices and compliance.
Regulatory Compliance & Data Governance:
- Ensure complete, accurate, and timely reporting of data and comply with the relevant regulatory guidelines.
- Establish a governance framework for data ingestion, data validation, and standardization.
- Monitor adherence to regulatory standards and data reporting practices.
- Liaise with legal and compliance teams to stay updated on policy changes affecting data acquisition.
Experience and Skills:
- Master's degree in Agriculture, Rural Business Administration, or a related field.
- Minimum 12+ years of relevant experience in managing priority sector lending or financial inclusion.
- Flexibility to travel as needed.
- Self-starter, able to work independently, handle ambiguous situations, and exercise judgment in a variety of situations.
- Strong communication and organizational skills, both verbal and written.
- High degree of responsibility and ownership, strong multitasking and coordination, and tenacity in looking for ways to get results.
This job is assigned as On-Site Essential and requires in-person work at an assigned TU office location as a condition of employment.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Responsibilities:
- Define and implement data quality rules, scorecards, and issue management workflows.
- Profile datasets to identify and resolve data quality issues (a small profiling sketch follows below).
- Monitor and report on data quality metrics and KPIs.
- Support data stewardship activities and provide training to business users.
- Implement and configure the Collibra Data Intelligence Platform for metadata management, data cataloging, and governance workflows.
- Collaborate with business and IT stakeholders to define and enforce data governance policies and standards.
Required Skills & Qualifications:
- 3+ years of experience in data governance and data quality.
- Strong understanding of data governance frameworks (e.g., DAMA-DMBOK).
- Experience with SQL.
- Familiarity with data privacy regulations (e.g., GDPR, CCPA).
- Excellent communication and stakeholder management skills.
Mandatory skills: Data Quality
Desired skills: Collibra
Domain: Foods and Beverages
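To illustrate the rule-based profiling this role calls for, here is a small, tool-agnostic sketch in Python using pandas. The column names and the 5% null threshold are hypothetical examples, not requirements from the posting.

```python
# Illustrative data-quality profiling: count nulls, duplicates, and rule violations.
# Column names and thresholds are placeholders chosen for the example.
import pandas as pd

def profile_customers(df: pd.DataFrame) -> dict:
    """Return basic data-quality metrics for a customer dataset."""
    return {
        "row_count": len(df),
        "null_email_pct": round(df["email"].isna().mean() * 100, 2),
        "duplicate_customer_ids": int(df["customer_id"].duplicated().sum()),
        "invalid_country_codes": int((~df["country"].str.len().eq(2)).sum()),
    }

if __name__ == "__main__":
    sample = pd.DataFrame(
        {
            "customer_id": [1, 2, 2, 4],
            "email": ["a@x.com", None, "c@x.com", "d@x.com"],
            "country": ["IN", "US", "USA", "GB"],
        }
    )
    metrics = profile_customers(sample)
    # A scorecard rule might flag the dataset when a metric breaches its threshold.
    print(metrics, "FAIL" if metrics["null_email_pct"] > 5 else "PASS")
```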
Posted 2 weeks ago
10.0 - 15.0 years
12 - 17 Lacs
Bengaluru
Work from Office
Project description
We are looking for an experienced Finance Data Hub Platform Product Manager to own the strategic direction, development, and management of the core data platform that underpins our Finance Data Hub. This role is focused on ensuring the platform is scalable, reliable, secure, and optimized to support data ingestion, transformation, and access across the finance organisation. As the Platform Product Manager, you will work closely with engineering, architecture, governance, and infrastructure teams to define the technical roadmap, prioritize platform enhancements, and ensure seamless integration with data and UI product streams. Your focus will be on enabling data products and services by ensuring the platform's core capabilities meet evolving business needs.
Key Responsibilities
- Platform Strategy & Vision: Define and own the roadmap for the Finance Data Hub platform, ensuring it aligns with business objectives and supports broader data product initiatives.
- Technical Collaboration: Collaborate with architects, data engineers, and governance teams to define and prioritise platform capabilities, including scalability, security, resilience, and data lineage.
- Integration Management: Ensure the platform seamlessly integrates with data streams and serves UI products, enabling efficient data ingestion, transformation, storage, and consumption.
- Infrastructure Coordination: Work closely with infrastructure and DevOps teams to ensure platform performance, cost optimisation, and alignment with enterprise architecture standards.
- Governance & Compliance: Partner with data governance and security teams to ensure the platform adheres to data management standards, privacy regulations, and security protocols.
- Backlog Management: Own and prioritise the platform development backlog, balancing technical needs with business priorities, and ensuring timely delivery of enhancements.
- Agile Leadership: Support and often lead agile ceremonies, write clear user stories focused on platform capabilities, and facilitate collaborative sessions with technical teams.
- Stakeholder Communication: Provide clear updates on platform progress, challenges, and dependencies to stakeholders, ensuring alignment across product and engineering teams.
- Continuous Improvement: Regularly assess platform performance, identify areas for optimization, and champion initiatives that enhance reliability, scalability, and efficiency.
- Risk Management: Identify and mitigate risks related to platform stability, security, and data integrity.
Skills
Must have
- Proven 10+ years of experience as a Product Manager focused on data platforms, infrastructure, or similar technical products.
- Strong understanding of data platforms and infrastructure, including data ingestion, processing, storage, and access within modern data ecosystems.
- Experience with cloud data platforms (e.g., Azure, AWS, GCP) and knowledge of data lake architectures.
- Understanding of data governance, security, and compliance best practices.
- Strong stakeholder management skills, particularly with technical teams (engineering, architecture, security).
- Experience managing product backlogs and roadmaps in an Agile environment.
- Ability to balance technical depth with business acumen to drive effective decision-making.
Nice to have
- Experience with financial systems and data sources, such as HFM, Fusion, or other ERPs.
- Knowledge of data orchestration and integration tools (e.g., Apache Airflow, Azure Data Factory).
- Experience with transitioning platforms from legacy technologies (e.g., Teradata) to modern solutions.
- Familiarity with cost optimization strategies for cloud platforms.
Posted 2 weeks ago
6.0 - 11.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance and information management, and associated technologies. Communicate risks and ensure understanding of these risks.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Graduate with a minimum of 6+ years of related experience.
- Experience in modelling and business system designs.
- Good hands-on experience with DataStage and cloud-based ETL services.
- Strong expertise in writing T-SQL code.
- Well-versed with data warehouse schemas and OLAP techniques.
Preferred technical and professional experience:
- Ability to manage and make decisions about competing priorities and resources.
- Ability to delegate where appropriate.
- Must be a strong team player/leader.
- Ability to lead data transformation projects with multiple junior data engineers.
- Strong oral, written, and interpersonal skills for interacting with all levels of the organization.
- Ability to communicate complex business problems and technical solutions.
Posted 2 weeks ago
3.0 - 6.0 years
10 - 14 Lacs
Mumbai
Work from Office
As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.
In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- Experience in integration efforts between Alation and Manta, ensuring seamless data flow and compatibility.
- Collaborate with cross-functional teams to gather requirements and design solutions that leverage both Alation and Manta platforms effectively.
- Develop and maintain data governance processes and standards within Alation, leveraging Manta's data lineage capabilities.
- Analyze data lineage and metadata to provide insights into data quality, compliance, and usage patterns.
Preferred technical and professional experience:
- Lead the evaluation and implementation of new features and updates for both Alation and Manta platforms, ensuring alignment with organizational goals and objectives.
- Drive continuous improvement initiatives to enhance the efficiency and effectiveness of data management processes, leveraging Alation.
Posted 2 weeks ago
3.0 - 5.0 years
4 - 8 Lacs
Mumbai
Work from Office
KEY STAKEHOLDERS INTERNAL: Business Supply Chain team, Business Development team, Finance, Accounts Payable, Accounts Receivable
KEY STAKEHOLDERS EXTERNAL: Customers, vendors
REPORTING STRUCTURE: Will report to Manager - GBSS Master Data Management. Team size: Individual Contributor role.
EXPERIENCE:
- 3 to 5 years of experience in a master data management governance role.
- Good communication skills, both written and verbal, a must.
- Attention to detail and ability to work with minimal supervision.
- Experience in master data cleaning and policy compliance in Customer and Material Master.
- Strong background in the pharmaceutical/FMCG/manufacturing industry.
- Strong understanding of SAP systems.
- Strong analytical and problem-solving skills.
- Ability to work well with people at various levels and with different cultural backgrounds.
CRITICAL QUALITIES:
- Ability to handle work pressure and SLA commitments.
- Thorough understanding of SAP master data transactions related to customer/material/vendor masters.
- Understanding of the various fields maintained in customer masters and their impact on downstream processes.
- Willingness to work in a shared service function.
- Flexibility to work in night shift (if required).
Key Roles and Responsibilities:
- Responsible for creation/change/review of various masters in SAP, such as Customer Master, Material/SKU Masters, Vendor Masters, BOM, Prices, etc.
- Perform business policy and data accuracy checks on requests received for master data creation/change.
- Maintain SLA adherence (99% of requests completed in 1 working day) with 100% accuracy.
- Data governance to ensure masters are created in compliance with the statutory and legal framework. Educate stakeholders on required policy compliance.
- Maintain good master data hygiene to ensure there are no duplicate master records, incomplete masters, or incorrect/invalid information.
- Understand customer priorities and deliver master data requirements accordingly.
- Continue the development of data standards, processes, and procedures for various master data domains.
- Identify and manage projects to improve data standardization, accuracy, and integrity.
- Support SAP implementation projects (for SAP S/4HANA during UAT) by representing the MDM function. Support project teams during data cleansing and migration activities.
- Manage data quality exception reports.
- Work with the SAP IT team or RPA team to develop or improve automation levels and system-based controls in the Master Data Management process.
- Work collaboratively with other functions (e.g. Quality, Regulatory, Finance, Sales, BD, IT, manufacturing sites, etc.) to ensure master data and other related issues are resolved and objectives are met.
- Generate MIS related to Master Data Management from time to time.
QUALIFICATION: Bachelor's degree with MBA from a reputed University/Institute. An orientation course in the SAP MM/SD module will be an added advantage.
Posted 2 weeks ago
5.0 - 10.0 years
15 - 25 Lacs
Noida
Work from Office
We're Hiring | Power BI + Power Platform Developer | 3-7 Yrs | Noida | Hybrid
Location: Noida (Hybrid - 3 days/week in office)
Experience: 3 to 7 years
Joiners: Immediate to max 2 weeks' notice period ONLY
Shift: 2:00 PM to 10:30 PM IST (cab provided)
Key Responsibilities:
- Analyze requirements and develop interactive Power BI dashboards
- Work with complex data models; integrate data from multiple sources (SQL, Oracle, Excel, Web, etc.)
- Write and optimize SQL queries for high-performance data extraction and analysis
- Implement security measures, including role-based access
- Develop business applications using Power Apps and workflows using Power Automate
- Automate business processes with advanced triggers, approvals, and notifications
- Collaborate with business and tech teams to deliver scalable BI solutions
Must-Have Skills:
- 3+ years of experience in Power BI, DAX, and M language
- Strong skills in SQL, Power Apps, and Power Automate
- Understanding of data warehousing and ETL
- Good knowledge of relational databases and data governance
- Strong communication and problem-solving skills
Good-to-Have:
- Knowledge of Python
- Microsoft Data Analyst Associate or Power Platform certification
To Apply: Send your resume to vijay.s@xebia.com with these details: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Notice Period / LWD (if serving), Primary Skills, LinkedIn Profile
Note: Apply only if you're an immediate joiner or on a notice period of 2 weeks or less, and are not in process with any other open roles with us.
#PowerBIJobs #PowerPlatform #ImmediateJoiners #HiringNow #PowerAutomate #PowerApps #NoidaJobs #XebiaHiring #HybridJobs #DAX #DataVisualization
Posted 2 weeks ago
5.0 - 10.0 years
20 - 30 Lacs
Noida
Work from Office
We're Hiring | Microsoft Fabric Developer | 3-7 Yrs | Noida | Hybrid
Location: Noida (Hybrid - 3 days/week in office)
Experience: 3 to 7 years
Joiners: Immediate to max 2 weeks' notice period ONLY
Shift: 2:00 PM to 10:30 PM IST (cab provided)
Key Responsibilities:
- Set up and manage the Microsoft Fabric platform
- Build secure, scalable Lakehouses and implement Azure Data Factory pipelines
- Design and manage data warehouses for analytics
- Develop and manage reports and semantic models using Fabric
- Write complex SQL queries and Spark SQL, and build data solutions using PySpark (a minimal sketch follows below)
- Schedule and optimize Spark jobs for performance and reliability
- Leverage Data Activator for real-time analytics and insights
Must-Have Skills:
- 3+ years of experience with Microsoft Fabric, OneLake, and Lakehouses
- Proven expertise with Azure Data Factory and ETL
- Strong knowledge of Power BI, data warehousing, and data governance
- Proficiency in Python, PySpark, and Spark SQL
- Practical exposure to real-time analytics in Fabric
Good-to-Have:
- Knowledge of AWS services and Snowflake
To Apply: Send your resume to vijay.s@xebia.com with these details: Full Name, Total Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Notice Period / LWD (if serving), Primary Skills, LinkedIn Profile
Note: Apply only if you're an immediate joiner or on a notice period of 2 weeks or less, and are not in process with any other open roles with us.
#MicrosoftFabric #FabricDeveloper #PySparkJobs #PowerBI #AzureDataFactory #ImmediateJoiners #NoidaHiring #HybridJobs #XebiaHiring #BigData #ETL #Lakehouse
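As a concrete flavour of the PySpark/Spark SQL work described above, here is a minimal batch-transformation sketch that writes a curated Lakehouse table. The table and column names are hypothetical, and the snippet assumes any Spark session (for example a Fabric notebook, where `spark` is already provided).

```python
# Minimal PySpark sketch: read a raw table, aggregate it, and persist the result
# as a managed table. Table/column names (raw_sales, curated_daily_sales) are
# illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # in a Fabric notebook, `spark` already exists

raw = spark.read.table("raw_sales")         # assumed Lakehouse source table

daily = (
    raw.filter(F.col("amount") > 0)                         # drop refunds/invalid rows
       .groupBy("order_date", "region")
       .agg(F.sum("amount").alias("total_amount"),
            F.countDistinct("order_id").alias("orders"))
)

# Equivalent Spark SQL, for teams that prefer SQL over the DataFrame API:
raw.createOrReplaceTempView("raw_sales_v")
daily_sql = spark.sql("""
    SELECT order_date, region,
           SUM(amount)              AS total_amount,
           COUNT(DISTINCT order_id) AS orders
    FROM raw_sales_v
    WHERE amount > 0
    GROUP BY order_date, region
""")

daily.write.mode("overwrite").saveAsTable("curated_daily_sales")
```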
Posted 2 weeks ago
15.0 - 20.0 years
11 - 14 Lacs
Bengaluru
Work from Office
At Curriculum Associates, we believe in the potential of every child and are changing the face of education technology with award-winning learning programs like i-Ready that serve a third of the nation's K-8 students. For more than 50 years, our commitment to making classrooms better places, serving educators, and supporting accessible learning experiences for all students has driven the continuous improvement of our innovative programs. Our team of more than 2,500 employees is composed of lifelong learners who stand behind this mission, working tirelessly to serve the educational community with world-class programs and support every day.
Summary: The Manager of Data Engineering is responsible for delivering reliable, scalable data platform applications and pipelines by setting technical direction, coordinating risk and priority across teams and vendors, shaping architectural strategy, managing people, and collaborating closely with product partners on project delivery. This role powers the data behind i-Ready's reports and provides insights to teachers about millions of students.
Essential duties/responsibilities:
- Team Leadership: Lead, mentor, and develop a high-performing team of data engineering and backend engineering professionals, promoting collaboration, inclusivity, and professional growth. Provide technical guidance, ensuring your team stays current with emerging technologies and adopts appropriate industry best practices.
- Strategy Development: Work closely with Product Managers, QE, and business stakeholders to develop strategic technical roadmaps. Align initiatives clearly with business priorities, manage feature delivery timelines, and balance addressing technical debt.
- Cross-functional Collaboration: Build effective relationships across teams, facilitating clear communication and alignment in an Agile/Scrum environment to swiftly address production issues and prioritize team efforts appropriately.
- Data Platform and Architecture: Drive best practices in data platform management and software development, ensuring high standards for data quality, architecture, testing, deployment processes, and observability.
- Data Engineering: Design and execute scalable batch and real-time data pipelines for performance and reliability while addressing business needs. Focus the team on success, unblock issues, escalate as needed, and build relationships with peers for success.
- SDLC and Process Maturity: Continuously enhance engineering processes and team practices, aligning objectives with broader organizational goals. Stay informed about industry trends and integrate new frameworks and methodologies where appropriate.
- Automation and Efficiency: Champion automation initiatives, driving enhancements in operational efficiency, data integrity, and scalability. Continuously streamline workflows and promote practices that accelerate delivery.
- Production Support: Manage team responsibilities for addressing production issues promptly, maintaining clear stakeholder communication, and ensuring smooth release cycles.
Required job skills:
- Strong communication and relationship-building skills, particularly in asynchronous and geographically distributed environments. Able to discuss solutions effectively with team members of varying technical backgrounds.
- Excellent software design skills, with deep knowledge of data engineering and backend development patterns, including performance optimization.
- Proficient in developing high-quality, well-structured code in Java, Scala, and SQL, following test-driven development approaches and thorough debugging practices.
- Proven ability to maintain clear, concise, and organized technical documentation.
- Deep understanding of modern product development methodologies (Agile, SAFe).
- Experience building and maintaining data platforms, including data governance, ETL processes, data lakes, and data warehouses (Amazon S3, Snowflake).
- Knowledge of Amazon cloud computing infrastructure, specifically Aurora MySQL, DynamoDB, EMR, Lambda, Step Functions, and S3.
- Skilled at performing thoughtful and detailed code reviews, providing constructive feedback aimed at improving code quality and mentoring developers.
- Familiarity with software engineering and project management tools.
- Commitment to adhering to security protocols and best practices in data governance.
- Ability to define KPIs and leverage metrics effectively to drive process improvements.
Minimum qualifications:
- 15+ years of experience in designing and developing enterprise-level software solutions
- 10 years of experience with large-volume data processing and big data tools such as Apache Spark, Scala, and Hadoop technologies
- 5 years of experience developing Scala/Java applications and microservices using Spring Boot
- 5 years of experience with SQL and relational databases
- 3+ years in an engineering leadership position
- 2 years of experience working with the Agile/Scrum methodology
Preferred qualifications:
- Knowledge of MemSQL DB and Snowflake
- Experience with Amazon cloud computing infrastructure (Aurora MySQL, DynamoDB, EMR, Lambda, Step Functions, etc.)
- Educational domain background
Posted 2 weeks ago
17.0 - 22.0 years
32 - 40 Lacs
Pune
Work from Office
We are Allvue Systems, the leading provider of software solutions for the private capital and credit markets. Whether a client wants an end-to-end technology suite or independently focused modules, Allvue helps eliminate the boundaries between systems, information, and people. We're looking for ambitious, smart, and creative individuals to join our team and help our clients achieve their goals. Working at Allvue Systems means working with pioneers in the fintech industry. Our efforts are powered by innovative thinking and a desire to build adaptable financial software solutions that help our clients achieve even more. With our common goals of growth and innovation, whether you're collaborating on a cutting-edge project or connecting over shared interests at an office happy hour, the passion is contagious. We want all of our team members to be open, accessible, curious, and always learning. As a team, we take initiative, own outcomes, and have passion for what we do. With these pillars at the center of what we do, we strive for continuous improvement, excellent partnership, and exceptional results. Come be a part of the team that's revolutionizing the alternative investment industry. Define your own future with Allvue Systems!
Strategic Leadership:
- Define and execute the data science roadmap aligned with Allvue Systems' business objectives.
- Partner with executive leadership to identify opportunities for leveraging data science to drive innovation and competitive advantage across the company.
- Foster a culture of data-driven decision-making across the organization.
- Effectively communicate complex data insights and recommendations to both technical and non-technical audiences, including senior leadership.
Team Management:
- Lead, mentor, and grow a high-performing team of data scientists and analysts.
- Promote collaboration, innovation, and professional development within the team.
- Manage the data science team's budget effectively, prioritizing investments in key areas.
Cross-Functional Collaboration:
- Work closely with Product, Engineering, and other teams to identify data science opportunities and deliver impactful solutions.
- Collaborate with machine learning and core engineering teams to integrate data science models into production systems.
Business Impact:
- Enable the team to develop and deploy predictive models, machine learning algorithms, and AI-driven solutions to optimize content recommendations, personalization, and the different revenue lines.
- Leverage natural language processing (NLP) and large language models (LLMs) to analyze and derive insights from multimodal content to power recommendations, personalization, and smarter financial decisions.
- Build algorithms to measure and improve content performance, audience engagement, and subscription growth.
Data Governance & Innovation:
- Collaborate with machine learning engineering, data engineering, and IT teams to ensure the integrity, accuracy, and security of data used for analysis and modelling.
- Champion a culture of experimentation and A/B testing, driving the development of new data products and features.
- Stay at the forefront of data science trends and technologies, exploring new tools and methodologies to enhance Allvue Systems' capabilities.
Requirements:
- 17+ years of overall experience, with 8+ years leading large-scale data science projects and teams.
- Proven track record of delivering impactful AI/ML solutions in the finance domain.
- Excellent communication, presentation, and interpersonal skills.
- Exceptional ability to simplify complex concepts and effectively influence senior executive stakeholders.
- Expertise in aligning technology initiatives with business goals, with a hands-on approach to execution.
- Deep understanding of ML/AI and statistical modelling.
- Hands-on expertise in predictive modelling, recommendation systems, (Gen) AI, and NLP.
- Experience with cloud computing platforms (AWS, GCP, or Azure) and big data technologies (e.g. Spark).
- Fluent in Python, SQL, and ML frameworks like TensorFlow, PyTorch, or scikit-learn.
Posted 2 weeks ago
5.0 - 10.0 years
9 - 19 Lacs
Bengaluru
Work from Office
Job Title: Data Governance Specialist
Experience: 5-7 Years
Location: Bangalore, India
Domain: Financial Services
Notice Period: Immediate to 30 Days
Job Description: We require a skilled Data Governance Specialist to join our data management team in Bangalore. This role will focus on implementing and maintaining data governance frameworks, ensuring high-quality data assets, and enabling consistent use of metadata across the organization.
Key Responsibilities:
- Establish and maintain data governance policies, standards, and processes.
- Develop and manage the enterprise data glossary and metadata repositories.
- Monitor and improve data quality metrics, ensuring accuracy and consistency across systems.
- Work closely with business and technical teams to ensure data lineage and traceability.
- Support Agile delivery using tools like JIRA and Confluence.
- Collaborate across departments to promote data stewardship and governance awareness.
Key Requirements:
- 5-7 years of experience in data governance, metadata management, or data quality roles.
- Strong knowledge of data glossary, lineage, and metadata practices.
- Experience working in Agile environments; familiarity with JIRA and Confluence.
- Excellent communication and stakeholder management skills.
- Prior experience in the financial services or banking domain is preferred.
Preferred Skills:
- Knowledge of data governance tools (e.g., Collibra, Informatica, Alation) is a plus.
- Understanding of regulatory data requirements (e.g., BCBS 239, GDPR) is an advantage.
Intake call notes: data governance, data glossary, metadata management, data quality, Agile, JIRA, Confluence
Keywords: data governance, data quality, Agile
If interested, please share your resume with sunidhi.manhas@portraypeople.com
Posted 2 weeks ago
8.0 - 10.0 years
0 - 0 Lacs
Bengaluru
Work from Office
Informatica, MDM, SQL, Axon, EDC, Data Governance
Posted 2 weeks ago
4.0 - 8.0 years
6 - 10 Lacs
Pune
Work from Office
Product & Business Alignment: Collaborate with the Product Owner to align data solutions with business objectives and product vision.
Data Pipeline Development: Design, develop, and implement efficient data pipelines for ingesting, transforming, and transporting data into Cummins Digital Core (Azure Data Lake, Snowflake) from various sources, including transactional systems (ERP, CRM). A minimal ingestion sketch follows below.
Architecture & Standards Compliance: Ensure alignment with AAI Digital Core and AAI Solutions Architecture standards for data pipeline design, storage architectures, and governance processes.
Automation & Optimization: Implement and automate distributed data systems, ensuring reliability, scalability, and efficiency through monitoring, alerting, and performance tuning.
Data Quality & Governance: Develop and enforce data governance policies, including metadata management, access control, and retention policies, while actively monitoring and troubleshooting data quality issues.
Modeling & Storage: Design and implement conceptual, logical, and physical data models, optimizing storage architectures using distributed and cloud-based platforms (e.g., Hadoop, HBase, Cassandra, MongoDB, Accumulo, DynamoDB).
Documentation & Best Practices: Create and maintain data engineering documentation, including standard operating procedures (SOPs) and best practices, with guidance from senior engineers.
Tool Evaluation & Innovation: Support proof-of-concept (POC) initiatives and evaluate emerging data tools and technologies to enhance efficiency and effectiveness.
Testing & Troubleshooting: Participate in the testing, troubleshooting, and continuous improvement of data pipelines to ensure data integrity and usability.
Agile & DevOps Practices: Utilize agile development methodologies, including DevOps, Scrum, and Kanban, to drive iterative improvements in data-driven applications.
Preferred Experience:
- Hands-on experience gained through internships, co-ops, student employment, or team-based extracurricular projects.
- Proficiency in the SQL query language and experience in developing analytical solutions.
- Exposure to open-source big data technologies such as Spark, Scala/Java, MapReduce, Hive, HBase, and Kafka.
- Familiarity with cloud-based, clustered computing environments and large-scale data movement applications.
- Understanding of Agile software development methodologies.
- Exposure to IoT technology and data-driven solutions.
Technical Skills:
- Programming Languages: Proficiency in Python, Java, and/or Scala.
- Database Management: Expertise in SQL and NoSQL databases.
- Big Data Technologies: Hands-on experience with Hadoop, Spark, Kafka, and similar frameworks.
- Cloud Services: Experience with Azure, Databricks, and AWS platforms.
- ETL Processes: Strong understanding of Extract, Transform, Load (ETL) processes.
- Data Replication: Working knowledge of replication technologies like Qlik Replicate is a plus.
- API Integration: Experience working with APIs to consume data from ERP and CRM systems.
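As one concrete illustration of the pipeline work described above, the sketch below loads staged files into a Snowflake table via the snowflake-connector-python package. The account locator, stage, and table names are hypothetical placeholders; a production pipeline would normally add orchestration, retries, and monitoring around this step.

```python
# Minimal sketch: load staged CSV files into a Snowflake table with COPY INTO.
# All identifiers (account, database, stage, table) are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.east-us-2.azure",   # hypothetical account locator
    user="ETL_SERVICE",
    password="********",
    warehouse="LOAD_WH",
    database="DIGITAL_CORE",
    schema="RAW",
)

try:
    cur = conn.cursor()
    # COPY INTO pulls files already landed in an external stage (e.g., ADLS).
    cur.execute("""
        COPY INTO RAW.ERP_ORDERS
        FROM @RAW.ERP_STAGE/orders/
        FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)
        ON_ERROR = 'ABORT_STATEMENT'
    """)
    print("COPY result:", cur.fetchall())  # one row per staged file with its load status
finally:
    conn.close()
```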
Posted 2 weeks ago
3.0 - 7.0 years
5 - 9 Lacs
Pune
Work from Office
So, what's the role all about?
We are looking for a highly driven and technically skilled Software Engineer to lead the integration of various content management systems with AWS Knowledge Hub, enabling advanced Retrieval-Augmented Generation (RAG) search across heterogeneous customer data without requiring data duplication. This role will also be responsible for expanding the scope of Knowledge Hub to support non-traditional knowledge items and enhance customer self-service capabilities. You will work at the intersection of AI, search infrastructure, and developer experience to make enterprise knowledge instantly accessible, actionable, and AI-ready.
How will you make an impact?
- Integrate CMS with AWS Knowledge Hub to allow seamless RAG-based search across diverse data types, eliminating the need to copy data into Knowledge Hub instances.
- Extend Knowledge Hub capabilities to ingest and index non-knowledge assets, including structured data, documents, tickets, logs, and other enterprise sources.
- Build secure, scalable connectors to read directly from customer-maintained indices and data repositories.
- Enable self-service capabilities for customers to manage content sources using AppFlow and Tray.ai, configure ingestion rules, and set up search parameters independently.
- Collaborate with the NLP/AI team to optimize relevance and performance for RAG search pipelines (a generic retrieval sketch follows below).
- Work closely with product and UX teams to design intuitive, powerful experiences around self-service data onboarding and search configuration.
- Implement data governance, access control, and observability features to ensure enterprise readiness.
Have you got what it takes?
- Proven experience with search infrastructure, RAG pipelines, and LLM-based applications.
- 5+ years of hands-on experience with AWS Knowledge Hub, AppFlow, Tray.ai, or equivalent cloud-based indexing/search platforms.
- Strong backend development skills (Python, TypeScript/Node.js, .NET/Java) and familiarity with building and consuming REST APIs.
- Infrastructure-as-Code (IaC) experience with services like AWS CloudFormation and CDK.
- Deep understanding of data ingestion pipelines, index management, and search query optimization.
- Experience working with unstructured and semi-structured data in real-world enterprise settings.
- Ability to design for scale, security, and multi-tenant environments.
What's in it for you? Enjoy NICE-FLEX!
Reporting into: Tech Manager, Engineering, CX
Role Type: Individual Contributor
About NiCE
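For readers unfamiliar with the retrieval step of a RAG pipeline, here is a deliberately generic toy sketch: embed a query, score it against pre-computed document embeddings, and hand the top passages to an LLM prompt. It is not tied to AWS Knowledge Hub or any vendor API, and the embeddings here are random stand-ins purely to make the snippet runnable.

```python
# Toy RAG retrieval sketch: rank documents by cosine similarity to a query vector
# and assemble a grounded prompt. Embedding model and LLM call are left abstract.
import numpy as np

def cosine_top_k(query_vec: np.ndarray, doc_vecs: np.ndarray, k: int = 3) -> np.ndarray:
    """Return indices of the k documents most similar to the query."""
    q = query_vec / np.linalg.norm(query_vec)
    d = doc_vecs / np.linalg.norm(doc_vecs, axis=1, keepdims=True)
    return np.argsort(d @ q)[::-1][:k]

# Hypothetical corpus: one embedding row per knowledge item.
docs = ["Reset your password from the login page.",
        "Refunds are processed within 5 business days.",
        "Use the mobile app to track your order."]
doc_vecs = np.random.default_rng(0).normal(size=(len(docs), 384))   # stand-in embeddings
query_vec = np.random.default_rng(1).normal(size=384)               # stand-in query embedding

context = "\n".join(docs[i] for i in cosine_top_k(query_vec, doc_vecs, k=2))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: How do I get a refund?"
print(prompt)  # in a real pipeline, this prompt is sent to the LLM of choice
```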
Posted 2 weeks ago
13.0 - 19.0 years
17 - 20 Lacs
Pune
Remote
Looking for a Java full-stack developer with 15 years of experience (5 years as a Data Architect is mandatory). Decode complex business challenges using diverse data sources. Design and build scalable data warehouses and marts.
Posted 2 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru
Hybrid
Key Responsibilities:
- Develop and implement data governance solutions using Informatica CDGC.
- Configure and manage metadata ingestion, lineage, and data cataloging functionalities.
- Collaborate with data stewards to define and enforce data governance policies and standards.
- Design and implement data quality rules and metrics to monitor and improve data accuracy.
- Integrate CDGC with other enterprise systems and data sources for seamless metadata management.
- Work with business users to capture and maintain business glossaries and data dictionaries.
- Conduct data profiling and analysis to support data governance initiatives.
- Provide training and support to users on leveraging CDGC for data governance and cataloging.
- Participate in solution design reviews, troubleshooting, and performance tuning.
- Stay updated with the latest trends and best practices in data governance and cataloging.
Must-Have Skills:
- 4+ years of experience in data governance and cataloging, with at least 1 year on the Informatica CDGC platform.
- Proficiency in configuring and managing Informatica CDGC components.
- Ability to integrate CDGC with various data sources and enterprise systems.
- Experience in debugging issues and applying fixes in Informatica CDGC.
- In-depth understanding of the data management landscape, including the technology landscape, standards, and best practices prevalent in data governance, metadata management, cataloging, data lineage, data quality, and data privacy.
- Familiarity with data management principles and practices in DMBOK.
- Experience in creating frameworks, policies, and processes.
- Strong experimental mindset to drive innovation amidst uncertainty and solve problems.
- Strong experience in process improvements, hands-on operational management, and change management.
Good-to-Have Skills:
- Certifications in data governance or related fields (e.g., DAMA-CDMP, DCAM, CDMC) or any data governance tool certification.
- Experience with other data governance tools such as Collibra, Talend, Microsoft Purview, Atlan, Solidatus, etc.
- Experience working on RFPs, internal/external POVs, accelerators, and other frameworks.
- Knowledge of data manipulation and data visualization tools.
- Experience in designing solutions as an architect for client use cases.
Posted 2 weeks ago
8.0 - 13.0 years
15 - 25 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Role & Responsibilities (DQ Analyst):
- Subject matter expertise required in more than one of the following areas: Data Management, Data Governance, Data Quality Measurement and Reporting, Data Quality Issues Management.
- Liaise with IWPB markets and stakeholders to coordinate delivery of organizational DQ governance objectives, and provide consultative support to facilitate progress.
- Conduct analysis of the IWPB DQ portfolio to identify thematic trends and insights, to effectively advise stakeholders in managing their respective domains.
- Proficiency in MI reporting and visualization is strongly preferred.
- Proficiency in change and project management is strongly preferred. Ability to prepare programme update materials and present them to senior stakeholders, with prompt response to any issues/escalations.
- Strong communication and stakeholder management skills: should be able to work effectively and maintain strong working relationships as an integral part of a larger team.
Posted 2 weeks ago
6.0 - 11.0 years
20 - 30 Lacs
Navi Mumbai
Work from Office
Business Analyst - Data Governance Domain (Hiring for Senior Officer & Assistant Vice President Levels)
Location: Navi Mumbai
Shift Timing: General Shift | Work from Office
Job Type: Full-Time | Permanent
Vacancies: Senior Officer - 1; AVP - 1
Job Summary: We are hiring experienced Business Analysts to join our Data Governance team, with openings at Senior Officer (SO) and Assistant Vice President (AVP) levels. This is a high-impact role that involves managing regulatory reporting, data governance, visualization, and analytics across the corporate banking domain, with a strong regional focus on India and APAC. The ideal candidates will have a deep understanding of regulatory compliance, banking operations, and cross-functional project execution, along with the ability to work collaboratively with international teams and regulatory bodies.
Key Responsibilities (common for SO & AVP):
- Data Source Visualization/Analysis: Analyze data sources, files, and field structures for internal and external reporting needs.
- Regulatory Reporting Compliance: Ensure accurate, timely reporting aligned with RBI, MAS, HKMA, and other APAC regulators.
- Data Governance Implementation: Enforce governance frameworks covering data ownership, quality, lineage, and stewardship.
- Business Analysis & Documentation: Prepare BRDs, FRDs, data dictionaries, and UAT documentation per APAC standards.
- Cross-Regional Coordination: Collaborate with operations teams in India and stakeholders across APAC (e.g., Singapore, Hong Kong, Australia).
- Data Quality & Control: Define data quality rules, perform root cause analysis, and drive remediation efforts.
- Audit Support: Assist in internal/external audits with traceability, documentation, and evidence of compliance.
- Tool Enablement & Automation: Use tools like SQL, Excel macros, Power BI, Tableau, and UiPath to enhance reporting and automate governance processes.
- Stakeholder Engagement: Act as the liaison between IT, Compliance, Operations, and Finance teams for data/reporting alignment.
Role-Specific Responsibilities:
Senior Officer (SO):
- Support daily project tasks, documentation, and requirement translation.
- Assist in business requirement gathering and reporting development.
Assistant Vice President (AVP):
- Lead and coordinate project execution between cross-functional teams.
- Develop in-depth data analysis, drive decision-making, and mentor junior analysts.
- Own planning and promotion of user tasks from a business analyst's perspective.
Key Skills & Experience (common for both roles):
- Domain Expertise: Core banking and back-office operations experience (Trade, Payments, Lending, CASA). Strong understanding of APAC regulatory frameworks: RBI, MAS 610, HKMA, BCBS 239.
- Tools & Technologies: Strong proficiency in SQL, MS Access, Excel macros, Power BI, Tableau, and UiPath (RPA). Familiarity with data governance tools, metadata repositories, and reporting platforms.
- Documentation & BA Skills: Expertise in BRD/FRD creation, UAT planning and execution, and Agile/Waterfall methodology.
- Data Knowledge: Understanding of downstream data flows, and AI/chatbot tools (e.g., ChatGPT) for potential integration.
- Soft Skills: Strong analytical, communication, and stakeholder management skills. Detail-oriented with a high standard of documentation quality.
- Travel Flexibility: Willingness to travel within the APAC region for business and project requirements.
Qualifications: Master's degree (preferred in Science, Finance, Business, or IT).
Experience Requirements:
Senior Officer (SO): Total experience 3-8 years, with a minimum of 3 years in a Business Analyst role covering corporate banking, regulatory reporting, and data governance/visualization.
Assistant Vice President (AVP): Total experience 10-15 years, with a minimum of 5 years in a Business Analyst role covering information systems, regulatory reporting, data governance, and corporate banking.
Interested candidates, please share your updated CV at swetha@intuitiveapps.com
Posted 2 weeks ago
10.0 - 17.0 years
20 - 35 Lacs
Chennai
Work from Office
Must have: Data Governance / DataOps
Work from Chennai office 5 days a week
Experience: 10+ years (8+ years considered)
CTC: 35 LPA
Should have EDC development experience covering data discovery, data domain creation, relationships, profiling, data lineage, data curation, etc.
Required candidate profile: Exposure to architecting a data governance solution at the enterprise level using the Informatica tool AXON. Must be able to integrate AXON with other tools and Informatica tools like EDC, IDQ, etc.
Posted 2 weeks ago
5.0 - 8.0 years
15 - 25 Lacs
Chennai
Work from Office
Job Summary: The Team Lead - Business Analyst will join our Merchandising Analytics Team based in Dania Beach, FL. In this role, the Business Analyst will directly support our Enterprise Data and BI team in delivering scalable data infrastructure, supporting data governance initiatives, and improving the self-service merchandising reporting environment. You will report to the Sr. Manager, Enterprise Data and BI, and partner with analysts, data engineers, and the corporate IT team to anticipate data infrastructure needs and proactively build solutions that fuel data-driven insights.
The ideal candidate will demonstrate exemplary skills in retrieving and analyzing data sets with millions of records and is highly proficient in SQL, Excel, and Tableau. They will have proven experience creating metrics and dashboards using customer, marketing, ecommerce, and financial metrics. In addition to a strong analytics background, this individual will have the ability to apply data visualization techniques that present abstract and complex topics in a clear and concise manner.
What You'll Do:
• Partner with analytics, data engineering, and BI teams to develop and QA new data tables and data sources across multiple platforms
• Partner with business leaders and report end-users to continuously improve the Tableau experience and ensure that KPIs are both actionable and insightful
• Support the Sr. Manager of Enterprise BI in developing and maintaining strict data governance standards in both Tableau and underlying data sources
• Develop and maintain reusable SQL scripts that can serve multiple business cases and reporting needs
• Support the development and creation of insightful and intuitive Tableau dashboards using multiple data sources, parameters, and measures
• Perform complex analyses on customer behavior, marketing mix, and funnel performance by leveraging data sets with millions of records from multiple systems
• Partner with various analytics teams to scope and build scalable, self-service reporting for cross-functional stakeholders
• Partner with stakeholders to construct and prioritize product release roadmaps using JIRA, Confluence, and other workflow management and documentation tools
Posted 2 weeks ago
5.0 - 10.0 years
20 - 30 Lacs
Chennai
Hybrid
Experience: 4+ Years
Responsibilities:
- Design and optimize data pipelines integrating various data sources (1st party, 3rd party, operational) to support BI and advanced analytics.
- Develop data models and flows enabling personalized customer experiences and omnichannel engagement.
- Lead data governance, data quality, and security initiatives, ensuring compliance with GDPR and CCPA.
- Implement and maintain data warehousing solutions in Snowflake for large-scale data processing.
- Optimize workflows to streamline data transformation and modeling.
- Leverage Azure for cloud infrastructure, storage, and real-time analytics, ensuring scalability and performance.
- Collaborate with cross-functional teams to align data architecture with business needs.
- Support both real-time and batch data integration for actionable insights.
- Continuously evaluate and integrate new data technologies and methodologies.
Qualifications:
- 4+ years of experience in Data Engineering/Data Architecture.
- Hands-on experience in Snowflake and Azure.
- Strong knowledge of data modeling, ETL/ELT, and modern data architecture.
- Experience designing scalable data solutions for marketing, sales, and customer analytics.
- Working knowledge of cloud platforms (preferably Azure) and big data technologies.
- Hands-on with Python for scripting and data tasks.
Primary Skills:
- 3+ years of experience in DBT, Snowflake, CI/CD, and SQL (see the sketch below); Python is nice to have.
- Strong task ownership and accountability.
- Eager to learn, good communication skills, and enthusiastic about upskilling.
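To make the dbt/Snowflake/CI-CD combination above concrete, here is a minimal sketch of a CI gate script that builds and tests a dbt project against a Snowflake target and fails the pipeline on any error. The project directory and target name are assumptions; in practice this usually runs as a step in a YAML-defined CI job rather than a standalone script.

```python
# Minimal CI gate: install dbt package dependencies, then build and test all
# models against a Snowflake target. Project path and target name are placeholders.
import subprocess
import sys

DBT_PROJECT_DIR = "analytics"   # hypothetical dbt project folder
TARGET = "ci"                   # hypothetical profiles.yml target pointing at Snowflake

def run(step: list[str]) -> None:
    """Run one dbt command and abort the CI job on failure."""
    print(">>>", " ".join(step))
    result = subprocess.run(step, cwd=DBT_PROJECT_DIR)
    if result.returncode != 0:
        sys.exit(result.returncode)

if __name__ == "__main__":
    run(["dbt", "deps"])                                       # install package dependencies
    run(["dbt", "build", "--target", TARGET, "--fail-fast"])   # run models and their tests
```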
Posted 2 weeks ago
0.0 - 1.0 years
2 - 3 Lacs
Bengaluru
Work from Office
Job Title: Mechanical Engineering Data Analyst
Location: Bangalore
Experience: 0-1 Years
Job Summary: We are looking for a Mechanical Engineering Data Analyst with experience in mechanical components, BOM management, and data handling across PLM and engineering systems. The ideal candidate will have strong analytical skills, experience with engineering drawings, and the ability to manage high volumes of technical data related to components, attributes, and supplier catalogs. This role will work closely with engineering, manufacturing, and supply chain teams to ensure accurate classification and data governance.
Required Skills:
- 3-6 years of experience in a mechanical engineering or data management environment.
- Good knowledge of mechanical components across various commodities.
- Strong ability to interpret 2D and 3D CAD drawings.
- Hands-on experience with CREO for 3D model analysis.
- Proficiency in engineering and manufacturing BOM handling.
- Solid understanding of manufacturing processes for mechanical components.
- Advanced knowledge of Microsoft Excel (including formulas, filters, pivot tables, etc.).
- Basic knowledge of PLM systems (Windchill, Teamcenter, etc.).
- Experience with large-scale data processing and cleansing.
- Strong communication, attention to detail, and teamwork skills.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Gurugram
Work from Office
What we expect from you:
- 5+ years of experience in designing and managing BI tools.
- Data modelling and design experience: designing, building, and configuring applications that utilise BigQuery for data storage, processing, and analysis.
- Experience building scalable data models with BigQuery.
- Writing and optimising complex SQL queries to extract and load data, analyse data, and generate reports (a minimal sketch follows below).
- Experience with data integration, connecting BigQuery to various data sources including cloud storage and cloud services.
- Experience with data quality and data governance, ensuring data integrity.
- Writing and maintaining comprehensive technical documentation, including data dictionaries, query documentation, and application documentation, to ensure proper data governance and quality.
- Experience with Terraform to support billing backend logic.
What you can expect from us:
We won't just meet your expectations. We'll defy them. So you'll enjoy the comprehensive rewards package you'd expect from a leading technology company. But also, a degree of personal flexibility you might not expect. Plus, thoughtful perks, like flexible working hours and your birthday off. You'll also benefit from an investment in cutting-edge technology that reflects our global ambition. But with a nimble, small-business feel that gives you the freedom to play, experiment and learn. And we don't just talk about diversity and inclusion. We live it every day, with thriving networks including dh Gender Equality Network, dh Proud, dh Family, dh One, dh Enabled and dh Thrive as the living proof. We want everyone to have the opportunity to shine and perform at your best throughout our recruitment process. Please let us know how we can make this process work best for you.
Our approach to Flexible Working:
At dunnhumby, we value and respect difference and are committed to building an inclusive culture by creating an environment where you can balance a successful career with your commitments and interests outside of work. We believe that you will do your best at work if you have a work/life balance. Some roles lend themselves to flexible options more than others, so if this is important to you please raise this with your recruiter, as we are open to discussing agile working opportunities during the hiring process.
For further information about how we collect and use your personal information please see our Privacy Notice which can be found (here).
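A minimal sketch of the BigQuery SQL work described above, run through the google-cloud-bigquery Python client. The project, dataset, and table names are hypothetical, and authentication is assumed to come from application default credentials in the environment.

```python
# Minimal sketch: run an aggregation query against a BigQuery table and print
# the results. Project/dataset/table names are illustrative placeholders.
from google.cloud import bigquery

client = bigquery.Client(project="my-analytics-project")  # hypothetical project id

SQL = """
    SELECT region,
           COUNT(*)          AS orders,
           SUM(order_amount) AS revenue
    FROM `my-analytics-project.sales.orders`
    WHERE order_date >= DATE_SUB(CURRENT_DATE(), INTERVAL 30 DAY)
    GROUP BY region
    ORDER BY revenue DESC
"""

for row in client.query(SQL).result():   # result() waits for the query job to finish
    print(row["region"], row["orders"], row["revenue"])
```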
Posted 2 weeks ago