1.0 - 4.0 years
9 - 13 Lacs
Pune
Work from Office
Overview
The Data Technology group at MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers Reference, Market, and other critical datapoints to various products of the firm. The platform, hosted on the firm's data centers and on the Azure and GCP public clouds, processes 100 TB+ of data and is expected to run 24x7. With an increased focus on automation around systems development and operations, data-science-based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team to support our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation and is committed to providing self-serve tools to our internal customers.

Responsibilities
- Implement & Maintain Data Catalogs: Deploy and manage the Collibra data catalog tool to improve data discoverability and governance.
- Metadata & Lineage Management: Automate metadata collection, establish data lineage, and maintain consistent data definitions across systems.
- Enable Data Governance: Collaborate with governance teams to apply data policies, classifications, and ownership structures in the catalog.
- Support Self-Service & Adoption: Promote catalog usage across teams through training, documentation, and continuous support.
- Cross-Team Collaboration: Work closely with data engineers, analysts, and stewards to align catalog content with business needs.
- Tooling & Automation: Build scripts and workflows for metadata ingestion, tagging, and monitoring of catalog health (an illustrative sketch follows this listing); leverage AI tools to automate cataloging activities.
- Reporting & Documentation: Maintain documentation and generate usage metrics, ensuring transparency and operational efficiency.

Qualifications
- Self-motivated, collaborative individual with a passion for excellence
- B.E. in Computer Science or equivalent, with 5+ years of total experience and at least 2 years of experience working with Azure DevOps tools and technologies
- Good working knowledge of source control applications such as Git, with prior experience building deployment workflows using this tool
- Good working knowledge of Snowflake, YAML, and Python
- Tools: Experience with data catalog platforms (e.g., Collibra, Alation, DataHub)
- Metadata & Lineage: Understanding of metadata management and data lineage
- Scripting: Proficient in SQL and Python for automation and integration
- APIs & Integration: Ability to connect catalog tools with data sources using APIs
- Cloud Knowledge: Familiar with cloud data services (Azure, GCP)
- Data Governance: Basic knowledge of data stewardship, classification, and compliance
- Collaboration: Strong communication skills to work across data and business teams

What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards, and perform beyond expectations for yourself, our clients, and our industry.

MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process.

MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries.

To all recruitment agencies: MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes.

Note on recruitment scams: We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try to elicit personal information from job seekers. Read our full note on careers.msci.com
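As a purely illustrative sketch of the metadata-ingestion and tagging scripting described in the Tooling & Automation responsibility above, the snippet below registers a table asset in a Collibra-style catalog and attaches a business attribute over its REST API. The base URL, endpoint paths, payload fields, and every UUID shown are assumptions for demonstration, not MSCI's actual configuration, and would need to be verified against the Collibra version actually deployed.

```python
"""Illustrative sketch only: registering dataset metadata in a Collibra-style catalog
over REST. Endpoint paths, payload fields, and IDs below are assumptions."""
import os
import requests

COLLIBRA_URL = os.environ.get("COLLIBRA_URL", "https://example.collibra.com")  # placeholder
SESSION = requests.Session()
SESSION.auth = (os.environ.get("COLLIBRA_USER", "svc_catalog"),
                os.environ.get("COLLIBRA_PASSWORD", "change-me"))  # placeholder credentials


def register_table_asset(name: str, domain_id: str, asset_type_id: str) -> dict:
    """Create a catalog asset for a physical table (assumed /rest/2.0/assets endpoint)."""
    resp = SESSION.post(
        f"{COLLIBRA_URL}/rest/2.0/assets",
        json={"name": name, "domainId": domain_id, "typeId": asset_type_id},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


def tag_asset(asset_id: str, attribute_type_id: str, value: str) -> dict:
    """Attach a business attribute (e.g. data owner, sensitivity) to an existing asset."""
    resp = SESSION.post(
        f"{COLLIBRA_URL}/rest/2.0/attributes",
        json={"assetId": asset_id, "typeId": attribute_type_id, "value": value},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()


if __name__ == "__main__":
    # Hypothetical asset and attribute identifiers, for illustration only.
    asset = register_table_asset("REF_SECURITY_MASTER", "<domain-uuid>", "<table-type-uuid>")
    tag_asset(asset["id"], "<owner-attribute-uuid>", "Reference Data Engineering")
```

In practice a script like this would typically run from a CI/CD pipeline (for example Azure DevOps) on a schedule, with credentials injected from a secret store rather than environment variables.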
Posted 2 weeks ago
10.0 - 12.0 years
25 - 35 Lacs
Faridabad
Work from Office
We're looking for a seasoned SAP Datasphere specialist to design and implement enterprise-level data models and integration pipelines. The role demands strong ETL craftsmanship using SAP native tools, with a foundational knowledge of BW systems leveraged during transitions and migrations.

Key Responsibilities
- Data Pipeline Development: Act as an architect for complex ETL workflows leveraging SAP Datasphere's graphical and scripting tools. Should have worked with various sources and targets for data integration such as S/4HANA, ECC, Oracle, and other third-party sources. Experience using BW Bridge and standard BW DataSources with Datasphere. Ensure data replication, federation, and virtualization use cases are optimally addressed.
- Modeling & Governance: Design and maintain business-oriented semantic layers within Datasphere, creating abstracted, reusable data models and views tailored for analytics consumption. Apply rigorous data governance, lineage tracking, and quality frameworks.
- Performance & Operations: Design highly optimized, performant data models that hold up under heavy data volumes. Continuously track and enhance the performance of data pipelines and models, ensuring efficient processing and robust scalability. Manage workspace structures, access controls, and overall system hygiene.
- Team Collaboration & Mentorship: Collaborate with IT, analytics, and business teams to operationalize data requirements. Coach junior engineers and drive standardized practices across the team.

Must-Have Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Minimum 8 years in SAP data warehousing, including exposure to BW/BW4HANA.
- At least 2 years of hands-on experience in SAP Datasphere for ETL, modeling, and integration.
- Expertise in SQL and scripting (Python).
- Solid understanding of data governance, lineage, security, and metadata standards.
- Awareness of ongoing and rapid changes in the SAP landscape, such as the introduction of BDC and Databricks to SAP Data and Analytics.

Nice-to-Have
- Certifications in SAP Datasphere, BW/4HANA, or data engineering.
- Knowledge of data virtualization, federation architectures, and hybrid cloud deployments.
- Experience with Agile or DevOps practices and CI/CD pipelines.
Posted 2 weeks ago
5.0 - 10.0 years
10 - 16 Lacs
Hyderabad
Remote
Job description
As an ETL Developer for the Data and Analytics team at Guidewire, you will participate and collaborate with our customers and SI Partners who are adopting the Guidewire Data Platform as the centerpiece of their data foundation. You will facilitate, and be an active developer when necessary, to operationalize our customers' agreed-upon ETL architecture goals, adhering to Guidewire best practices and standards. You will work with our customers, partners, and other Guidewire team members to deliver successful data transformation initiatives. You will utilize best practices for design, development, and delivery of customer projects. You will share knowledge with the wider Guidewire Data and Analytics team to enable predictable project outcomes and emerge as a leader in our thriving data practice. One of our principles is to have fun while we deliver, so this role will need to keep the delivery process fun and engaging for the team in collaboration with the broader organization. Given the dynamic nature of the work in the Data and Analytics team, we are looking for decisive, highly skilled technical problem solvers who are self-motivated, take proactive action for the benefit of our customers, and ensure that they succeed in their journey to the Guidewire Cloud Platform. You will collaborate closely with teams located around the world and adhere to our core values: Integrity, Collegiality, and Rationality.

Key Responsibilities:
- Build out technical processes from specifications provided in High Level Design and data specification documents.
- Integrate test and validation processes and methods into every step of the development process.
- Work with Lead Architects and provide input into defining user stories, scope, acceptance criteria, and estimates.
- Bring a systematic problem-solving approach, coupled with a sense of ownership and drive.
- Work independently in a fast-paced Agile environment.
- Actively contribute to the knowledge base from every project you are assigned to.

Qualifications:
- Bachelor's or Master's degree in Computer Science, or an equivalent level of demonstrable professional competency, and 3-5+ years in a technical capacity building out complex ETL data integration frameworks.
- 3+ years of experience with data processing and ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) concepts.
- Experience with ADF or AWS Glue, Spark/Scala, GDP, CDC, and ETL data integration.
- Experience working with relational and/or NoSQL databases.
- Experience working with different cloud platforms (such as AWS, Azure, Snowflake, Google Cloud, etc.).
- Ability to work independently and within a team.

Nice to have:
- Insurance industry experience
- Experience with ADF or AWS Glue
- Experience with Azure Data Factory, Spark/Scala
- Experience with the Guidewire Data Platform.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
Position Overview:
The Domain Data Steward role is responsible for working within the global data governance team and with their local businesses to maintain alignment with the Enterprise Data Governance's (EDG) processes, rules, and standards, ensuring data is fit for purpose. This will be achieved through the EDG Data Steward operating as the single point of contact for those creating and consuming data within their respective data domain(s). Additionally, they will drive the team to interact directly with key domain and project stakeholders, the EDG Lead, the Governance Council, other data stewards across the organization, and relevant SMEs throughout the organization as necessary. This position collaborates with and advises PepsiCo's Governance Council, which is accountable for the success of PepsiCo's EDG program.

Responsibilities
Primary Accountabilities:
- Partner closely with the PepsiCo Supply Chain & Ops Transformation team to ensure data requirements are met to enable timely, accurate, and insightful reporting and analysis in support of FP&A digitization initiatives.
- Promote data accuracy and adherence to PepsiCo-defined global governance practices, and drive acceptance of PepsiCo's enterprise data standards and policies across the various business segments.
- Maintain and advise relevant stakeholders on data governance-related matters in the relevant data domains, with a focus on the business use of the data.
- Monitor operational incidents, support root cause analysis, and, based on recurrence, propose ways to optimize the Data Governance framework (processes, Data Quality Rules, etc.).
- Provide recommendations and supporting documentation for new or proposed data standards, business rules, and policy (in conjunction with the Governance Council as appropriate).
- Advise on various projects and initiatives to ensure that any data-related changes and dependencies are identified, communicated, and managed to ensure adherence to the established Enterprise Data Governance standards.
- Represent market-specific needs in Sector data councils and above, ensuring local user needs are heard, met, and addressed; voice opinions on why proposals will or will not work for the market you represent and provide alternative solutions.
- Coordinate across the Sector (with fellow Market Data Stewards and the EDG Steward; strategic initiatives, Digital Use Cases, and the federated data network) in order to maintain consistency of PepsiCo's critical enterprise, digital, operational, and analytical data.
- Accountable for ensuring that data-centric activities are aligned with the EDG program and leverage applicable data standards, governance processes, and overall best practices.

Data Governance Business Standards:
- Ensures alignment of the data governance processes and standards with applicable enterprise, business segment, and local data support models.
- Champions the single set of enterprise-level data standards and the repository of key elements pertaining to the finance domain, and promotes their use throughout the PepsiCo organization.
- Owns one or multiple domain perspectives in defining and continually evolving the roadmap for enterprise data governance based upon strategic business objectives, existing capabilities/programs, cultural considerations, and a general understanding of emerging technologies and governance models/techniques.

Data Domain Coordination and Collaboration:
- Responsible for helping identify the need for sector-level data standards (and above) based on strategic business objectives and the evolution of enterprise-level capabilities and analytical requirements.
- Collaborates across the organization to ensure consistent and effective execution of data governance and management principles across PepsiCo's enterprise and analytical systems and data domains.
- Accountable for driving organizational acceptance of EDG-established data standards, policies, definitions, and process standards for critical/related enterprise data.
- Promotes and champions PepsiCo's Enterprise Data Governance capability and data management program across the organization.

Qualifications:
- 5+ years of experience working in Data Governance or Data Management within a global CPG (Consumer Packaged Goods) company; strong data management background with an understanding of data, how to ingest data, proper data use/consumption, data quality, and stewardship.
- 5+ years of experience working with data across multiple domains (with a particular focus on Finance data), associated processes, involved systems, and data usage.
- Minimum of 5+ years of functional experience working with and designing standards for data cataloging processes and tools.
- Ability to partner with both business and technical subject matter experts to ensure standardization of operational information and that enterprise-wide data governance policies and procedures are defined and implemented.
- Matrix management skills and business acumen.

Competencies:
- Strong knowledge and understanding of master data elements and processes related to data across multiple domains.
- Understanding of operational usage of transactional data as it relates to financial planning.
- Strong communication skills; able to persuade and influence others at all organization levels, with the ability to foster lasting partnerships.
- Ability to translate business requirements into critical data dependencies and requirements.
- Ability to think beyond the current state (processes, roles, and tools) and work towards an unconstrained, optimized design; an ability to solicit followership from the functional teams to think beyond the way things work today.
- Able to align various stakeholders to a common set of standards and sell the benefits of the EDG program.
- Ability to foster lasting relationships across varying organizational levels and business segments, with the maturity to interface with all levels of management.
- Self-starter who welcomes responsibility, thrives in an evolving organization, and can bring structure to unstructured situations.
- Ability to arbitrate on difficult decisions and drive consensus through a diplomatic approach.
- Excellent written and verbal communication skills.

Skills & Traits:
- Passion for data and a positive attitude to champion data standards.
Posted 2 weeks ago
10.0 - 15.0 years
18 - 22 Lacs
Gurugram
Remote
Work Mode: Remote
Contract: 6 months

Position Summary: We are seeking an experienced SAP Data Architect with a strong background in enterprise data management, SAP S/4HANA architecture, and data integration. This role demands deep expertise in designing and governing data structures, ensuring data consistency, and leading data transformation initiatives across large-scale SAP implementations.

Key Responsibilities:
- Lead the data architecture design across multiple SAP modules and legacy systems.
- Define data governance strategies, master data management (MDM), and metadata standards.
- Architect data migration strategies for SAP S/4HANA projects, including ETL, data validation, and reconciliation.
- Develop data models (conceptual, logical, and physical) aligned with business and technical requirements.
- Collaborate with functional and technical teams to ensure integration across SAP and non-SAP platforms.
- Establish data quality frameworks and monitoring practices.
- Conduct impact assessments and ensure scalability of data architecture.
- Support reporting and analytics requirements through structured data delivery frameworks (e.g., SAP BW, SAP HANA).

Required Qualifications:
- 15+ years of experience in enterprise data architecture, with 8+ years in SAP landscapes.
- Proven experience in SAP S/4HANA data models, SAP Datasphere, SAP HANA Cloud, and SAC, and integrating these with AWS Data Lake (S3).
- Strong knowledge of data migration tools (SAP Data Services, LTMC/LSMW, BODS, etc.).
- Expertise in data governance, master data strategy, and data lifecycle management.
- Experience with cloud data platforms (Azure, AWS, or GCP) is a plus.
- Strong analytical and communication skills to work across business and IT stakeholders.

Preferred Certifications:
- SAP Certified Technology Associate, SAP S/4HANA / Datasphere
- TOGAF or other Enterprise Architecture certifications
- ITIL Foundation (for process alignment)
Posted 2 weeks ago
5.0 - 10.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Overview
We are seeking a skilled Associate Manager, AIOps & MLOps Operations to support and enhance the automation, scalability, and reliability of AI/ML operations across the enterprise. This role requires a solid understanding of AI-driven observability, machine learning pipeline automation, cloud-based AI/ML platforms, and operational excellence. The ideal candidate will assist in deploying AI/ML models, ensuring continuous monitoring, and implementing self-healing automation to improve system performance, minimize downtime, and enhance decision-making with real-time AI-driven insights.

Responsibilities
- Support and maintain AIOps and MLOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy.
- Assist in implementing real-time data observability, monitoring, and automation frameworks to enhance data reliability, quality, and operational efficiency.
- Contribute to developing governance models and execution roadmaps to drive efficiency across data platforms, including Azure, AWS, GCP, and on-prem environments.
- Ensure seamless integration of CI/CD pipelines, data pipeline automation, and self-healing capabilities across the enterprise.
- Collaborate with cross-functional teams to support the development and enhancement of next-generation Data & Analytics (D&A) platforms.
- Assist in managing the people, processes, and technology involved in sustaining Data & Analytics platforms, driving operational excellence and continuous improvement.
- Support Data & Analytics technology transformations by ensuring proactive issue identification and the automation of self-healing capabilities across the PepsiCo Data Estate.
- Support the implementation of AIOps strategies for automating IT operations using Azure Monitor, Azure Log Analytics, and AI-driven alerting.
- Assist in deploying Azure-based observability solutions (Azure Monitor, Application Insights, Azure Synapse for log analytics, and Azure Data Explorer) to enhance real-time system performance monitoring.
- Enable AI-driven anomaly detection and root cause analysis (RCA) by collaborating with data science teams using Azure Machine Learning (Azure ML) and AI-powered log analytics.
- Contribute to developing self-healing and auto-remediation mechanisms using Azure Logic Apps, Azure Functions, and Power Automate to proactively resolve system issues.
- Support ML lifecycle automation using Azure ML, Azure DevOps, and Azure Pipelines for CI/CD of ML models.
- Assist in deploying scalable ML models with Azure Kubernetes Service (AKS), Azure Machine Learning Compute, and Azure Container Instances.
- Automate feature engineering, model versioning, and drift detection using Azure ML Pipelines and MLflow (see the illustrative sketch after this listing).
- Optimize ML workflows with Azure Data Factory, Azure Databricks, and Azure Synapse Analytics for data preparation and ETL/ELT automation.
- Implement basic monitoring and explainability for ML models using the Azure Responsible AI Dashboard and InterpretML.
- Collaborate with Data Science, DevOps, CloudOps, and SRE teams to align AIOps/MLOps strategies with enterprise IT goals.
- Work closely with business stakeholders and IT leadership to implement AI-driven insights and automation to enhance operational decision-making.
- Track and report AI/ML operational KPIs, such as model accuracy, latency, and infrastructure efficiency.
- Assist in coordinating with cross-functional teams to maintain system performance and ensure operational resilience.
- Support the implementation of AI ethics, bias mitigation, and responsible AI practices using Azure Responsible AI Toolkits.
- Ensure adherence to Azure Information Protection (AIP), Role-Based Access Control (RBAC), and data security policies.
- Assist in developing risk management strategies for AI-driven operational automation in Azure environments.
- Prepare and present program updates, risk assessments, and AIOps/MLOps maturity progress to stakeholders as needed.
- Support efforts to attract and build a diverse, high-performing team to meet current and future business objectives.
- Help remove barriers to agility and enable the team to adapt quickly to shifting priorities without losing productivity.
- Contribute to developing the appropriate organizational structure, resource plans, and culture to support business goals.
- Leverage technical and operational expertise in cloud and high-performance computing to understand business requirements and earn trust with stakeholders.

Qualifications
- 5+ years of technology work experience in a global organization, preferably in CPG or a similar industry.
- 5+ years of experience in the Data & Analytics field, with exposure to AI/ML operations and cloud-based platforms.
- 5+ years of experience working within cross-functional IT or data operations teams.
- 2+ years of experience in a leadership or team coordination role within an operational or support environment.
- Experience in AI/ML pipeline operations, observability, and automation across platforms such as Azure, AWS, and GCP.
- Excellent communication: ability to convey technical concepts to diverse audiences and empathize with stakeholders while maintaining confidence.
- Customer-centric approach: strong focus on delivering the right customer experience by advocating for customer needs and ensuring issue resolution.
- Problem ownership and accountability: proactive mindset to take ownership, drive outcomes, and ensure customer satisfaction.
- Growth mindset: willingness and ability to adapt and learn new technologies and methodologies in a fast-paced, evolving environment.
- Operational excellence: experience in managing and improving large-scale operational services with a focus on scalability and reliability.
- Site reliability and automation: understanding of SRE principles, automated remediation, and operational efficiencies.
- Cross-functional collaboration: ability to build strong relationships with internal and external stakeholders through trust and collaboration.
- Familiarity with CI/CD processes, data pipeline management, and self-healing automation frameworks.
- Strong understanding of data acquisition, data catalogs, data standards, and data management tools.
- Knowledge of master data management concepts, data governance, and analytics.
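As a minimal, hypothetical sketch of the model-versioning and drift-detection responsibility above: the snippet below logs a model to the MLflow registry (each run creates a new version) and flags feature drift with a two-sample Kolmogorov-Smirnov test. The experiment name, registered model name, and drift threshold are illustrative assumptions, not PepsiCo's actual pipeline.

```python
"""Illustrative sketch only: model versioning with MLflow plus a simple drift check."""
import mlflow
import mlflow.sklearn
import numpy as np
from scipy.stats import ks_2samp
from sklearn.linear_model import LogisticRegression

DRIFT_P_VALUE = 0.05  # assumed significance threshold for drift


def train_and_register(X_train, y_train) -> str:
    """Train a toy classifier and register it; repeated calls create new model versions."""
    mlflow.set_experiment("aiops-demo")  # assumed experiment name
    with mlflow.start_run() as run:
        model = LogisticRegression(max_iter=500).fit(X_train, y_train)
        mlflow.log_metric("train_accuracy", model.score(X_train, y_train))
        mlflow.sklearn.log_model(model, "model", registered_model_name="aiops-demo-model")
        return run.info.run_id


def feature_drift(reference: np.ndarray, current: np.ndarray) -> bool:
    """Two-sample KS test per feature; returns True if any feature appears to have drifted."""
    for col in range(reference.shape[1]):
        _, p_value = ks_2samp(reference[:, col], current[:, col])
        if p_value < DRIFT_P_VALUE:
            return True
    return False


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X, y = rng.normal(size=(200, 3)), rng.integers(0, 2, size=200)
    print("run id:", train_and_register(X, y))
    shifted = X + rng.normal(0.5, 1.0, size=X.shape)  # simulated drifted batch
    print("drift detected:", feature_drift(X, shifted))
```

In a production setting the same logging and drift-check steps would typically be wrapped as Azure ML or Azure DevOps pipeline stages, with the drift result used to trigger retraining or alerting.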
Posted 2 weeks ago
1.0 - 5.0 years
2 - 4 Lacs
Chennai, Tamil Nadu
Work from Office
Overview
As Sales Sr. Mgr., you will ensure that exceptional leadership and operational direction is provided by your analyst team to sales employees across multiple teams and markets. Your Planogram Analysts deliver visually appealing planograms based on store clustering, space definitions, and defined flow. They work closely with Category Management and Space teams to ensure planograms meet approved parameters, and conduct planogram quality audits ensuring all planograms meet assortment requirements, visual appeal, innovation opportunities, and shelving metrics. You will continuously identify opportunities and implement processes to improve quality, timeliness of output, and process efficiency through automation.

Responsibilities
- Head the DX Sector Planogram Analyst team and ensure efficient, effective, and comprehensive support of the sales employees across multiple teams and markets.
- Lead and manage the Planogram Analysts' work stream by working closely with the Sector Space & Planogram team.
- Ensure accurate and timely delivery of tasks, including: delivering visually appealing, versioned planograms based on store clustering, space definitions, and defined flow; working closely with Category Management and Space teams to ensure planograms meet approved parameters; conducting planogram quality control, ensuring all planograms meet assortment requirements, visual appeal, innovation opportunities, and shelving metrics; electronically delivering planograms to both internal teams and external customer-specific systems; managing multiple project timelines simultaneously; ensuring timelines are met by tracking project progress, coordinating activities, and resolving issues; building and maintaining relationships with internal project partners; managing planogram version/store combinations and/or store planogram assignments and providing reporting and data as needed; maintaining the planogram database with the most updated planogram files; and retaining planogram models and files for historical reference, as needed.
- Invest in and drive adoption of industry best practices across regions/sector, as required.
- Partner with global teams to define the strategy for end-to-end execution ownership and accountability.
- Lead workload forecasting and effectively drive prioritization conversations to support capacity management.
- Build stronger business context and elevate the team's capability from execution-focused to end-to-end capability-focused.
- Ensure delivery of accurate and timely planograms in accordance with agreed service level agreements (SLAs).
- Work across multiple functions to aid in collecting insights for action-oriented cause-of-change analysis.
- Focus on speed of execution and quality of service delivery rather than on achievement of SLAs alone.
- Recognize opportunities and take action to improve delivery of work.
- Implement continued improvements and simplifications of processes and optimal use of technology.
- Scale up the operation in line with business growth, both within the existing scope and in new areas of opportunity.
- Create an inclusive and collaborative environment.

People Leadership
- Enable direct reports' capabilities and enforce consistency in execution of key capability areas: planogram QC, development, and timely delivery.
- Responsible for hiring, talent assessment, competency development, performance management, productivity improvement, talent retention, and career planning and development.
- Provide and receive feedback about the global team and support effective partnership.

Qualifications
- 10+ years of retail/merchandising experience (including JDA).
- 2+ years of people leadership experience in a space planning/planogram environment.
- Bachelor's in Commerce/Business Administration/Marketing; Master's degree is a plus.
- Advanced skill in Microsoft Office, with demonstrated intermediate-to-advanced Excel skills.
- Experience analyzing and reporting data to identify issues, trends, or exceptions to drive improvement of results and find solutions.
- Advanced knowledge of and experience with the JDA space management technology platform.
- Propensity to learn PepsiCo software systems.
- Ability to provide superior customer service.
- Best-in-class time management skills, with the ability to multitask, set priorities, and plan.
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Gurugram
Work from Office
Data Catalog Lead (Collibra) : Position Overview : We are seeking an experienced Data Catalog Lead to lead the implementation and ongoing development of enterprise data catalog using Collibra. This role focuses specifically on healthcare payer industry requirements, including complex regulatory compliance, member data privacy, and multi-system data integration challenges unique to health plan operations. Key Responsibilities : - Data Catalog Implementation & Development - Configure and customize Collibra workflows, data models, and governance processes to support health plan business requirements - Develop automated data discovery and cataloging processes for healthcare data assets including claims, eligibility, provider networks, and member information - Design and implement data lineage tracking across complex healthcare data ecosystems spanning core administration systems, data warehouses, and analytics platforms - Healthcare-Specific Data Governance - Build specialized data catalog structures for healthcare data domains including medical coding systems (ICD-10, CPT, HCPCS), pharmacy data (NDC codes), and provider taxonomies - Configure data classification and sensitivity tagging for PHI (Protected Health Information) and PII data elements in compliance with HIPAA requirements - Implement data retention and privacy policies within Collibra that align with healthcare regulatory requirements and member consent management - Develop metadata management processes for regulatory reporting datasets (HEDIS, Medicare Stars, MLR reporting, risk adjustment) Technical Integration & Automation : - Integrate Collibra with healthcare payer core systems including claims processing platforms, eligibility systems, provider directories, and clinical data repositories - Implement automated data quality monitoring and profiling processes that populate the data catalog with technical and business metadata - Configure Collibra's REST APIs to enable integration with existing data governance tools and business intelligence platforms Required Qualifications : - Collibra Platform Expertise - 8+ years of hands-on experience with Collibra Data Intelligence Cloud platform implementation and administration - Expert knowledge of Collibra's data catalog, data lineage, and data governance capabilities - Proficiency in Collibra workflow configuration, custom attribute development, and role-based access control setup - Experience with Collibra Connect for automated metadata harvesting and system integration - Strong understanding of Collibra's REST APIs and custom development capabilities - Healthcare Payer Industry Knowledge - 4+ years of experience working with healthcare payer/health plan data environments - Deep understanding of healthcare data types including claims (professional, institutional, pharmacy), eligibility, provider data, and member demographics - Knowledge of healthcare industry standards including HL7, X12 EDI transactions, and FHIR specifications - Familiarity with healthcare regulatory requirements (HIPAA, ACA, Medicare Advantage, Medicaid managed care) - Understanding of healthcare coding systems (ICD-10-CM/PCS, CPT, HCPCS, NDC, SNOMED CT) Technical Skills : - Strong SQL skills and experience with healthcare databases (claims databases, clinical data repositories, member systems) - Knowledge of cloud platforms (AWS, Azure, GCP) and their integration with Collibra cloud services - Understanding of data modeling principles and healthcare data warehouse design patterns Data Governance & Compliance : - 
Experience implementing data governance frameworks in regulated healthcare environments - Knowledge of data privacy regulations (HIPAA, state privacy laws) and their implementation in data catalog tools - Understanding of data classification, data quality management, and master data management principles - Experience with audit trail requirements and compliance reporting in healthcare organizations Preferred Qualifications : - Advanced Healthcare Experience - Experience with specific health plan core systems (such as HealthEdge, Facets, QNXT, or similar platforms) - Knowledge of Medicare Advantage, Medicaid managed care, or commercial health plan operations - Understanding of value-based care arrangements and their data requirements - Experience with clinical data integration and population health analytics Technical Certifications & Skills : - Collibra certification (Data Citizen, Data Steward, or Technical User) - Experience with additional data catalog tools (Alation, Apache Atlas, IBM Watson Knowledge Catalog) - Knowledge of data virtualization tools and their integration with data catalog platforms - Experience with healthcare interoperability standards and API management
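As a hedged illustration of the PHI/PII classification and sensitivity-tagging work this listing describes, the sketch below shows a naive rule-based pass that proposes sensitivity labels from column names before a steward reviews them in a catalog tool such as Collibra. The term lists, labels, and table/column names are assumptions for demonstration only, not a compliance-approved HIPAA ruleset.

```python
"""Illustrative sketch only: propose PHI/PII sensitivity tags from column names."""

# Assumed keyword lists; a real ruleset would be steward-curated and far more complete.
PHI_TERMS = {"member_name", "dob", "date_of_birth", "ssn", "mrn", "diagnosis", "icd10", "ndc"}
PII_TERMS = {"email", "phone", "address", "zip"}


def classify_column(column_name: str) -> str:
    """Return a proposed sensitivity label for one column name."""
    name = column_name.lower()
    if any(term in name for term in PHI_TERMS):
        return "PHI"
    if any(term in name for term in PII_TERMS):
        return "PII"
    return "Internal"


def propose_tags(table: str, columns: list[str]) -> list[dict]:
    """Build tag proposals that a data steward would review before catalog upload."""
    return [{"table": table, "column": c, "sensitivity": classify_column(c)} for c in columns]


if __name__ == "__main__":
    for tag in propose_tags("CLAIMS_DETAIL", ["claim_id", "member_name", "icd10_code", "paid_amount"]):
        print(tag)
```

The output of a pass like this would typically be loaded into the catalog as draft classifications (for example via Collibra's import or REST interfaces) rather than applied automatically, so that stewards retain final sign-off.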
Posted 2 weeks ago
8.0 - 12.0 years
25 - 30 Lacs
Gurugram
Work from Office
Role Responsibilities : - Develop and design comprehensive Power BI reports and dashboards. - Collaborate with stakeholders to understand reporting needs and translate them into functional requirements. - Create visually appealing interfaces using Figma for enhanced user experience. - Utilize SQL for data extraction and manipulation to support reporting requirements. - Implement DAX measures to ensure accurate data calculations. - Conduct data analysis to derive actionable insights and facilitate decision-making. - Perform user acceptance testing (UAT) to validate report performance and functionality. - Provide training and support for end-users on dashboards and reporting tools. - Monitor and enhance the performance of existing reports on an ongoing basis. - Work closely with cross-functional teams to align project objectives with business goals. - Maintain comprehensive documentation for all reporting activities and processes. - Stay updated on industry trends and best practices related to data visualization and analytics. - Ensure compliance with data governance and security standards. - Participate in regular team meetings to discuss project progress and share insights. - Assist in the development of training materials for internal stakeholders. Qualifications - Minimum 8 years of experience in Power BI and Figma. - Strong proficiency in SQL and database management. - Extensive knowledge of data visualization best practices. - Expertise in DAX for creating advanced calculations. - Proven experience in designing user interfaces with Figma. - Excellent analytical and problem-solving skills. - Ability to communicate complex data insights to non-technical stakeholders. - Strong attention to detail and commitment to quality. - Experience with business analytics and reporting tools. - Familiarity with data governance and compliance regulations. - Ability to work independently and as part of a team in a remote setting. - Strong time management skills and ability to prioritize tasks. - Ability to adapt to fast-paced working environments. - Strong interpersonal skills and stakeholder engagement capability. - Relevant certifications in Power BI or data analytics are a plus.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
At PwC, our team in managed services specializes in providing outsourced solutions and supporting clients across various functions. We help organizations enhance their operations, reduce costs, and boost efficiency by managing key processes and functions on their behalf. Our expertise lies in project management, technology, and process optimization, allowing us to deliver high-quality services to our clients. In managed service management and strategy at PwC, the focus is on transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your role will involve continuous improvement and optimization of managed services processes, tools, and services.

As a Managed Services - Data Engineer Senior Associate at PwC, you will be part of a team of problem solvers dedicated to addressing complex business issues from strategy to execution using Data, Analytics & Insights skills. Your responsibilities will include using feedback and reflection to enhance self-awareness and personal strengths, acting as a subject matter expert in your chosen domain, mentoring junior resources, and conducting knowledge sharing sessions. You will be required to demonstrate critical thinking, ensure quality of deliverables, adhere to SLAs, and participate in incident, change, and problem management. Additionally, you will be expected to review your work and that of others for quality, accuracy, and relevance, as well as demonstrate leadership capabilities by working directly with clients and leading engagements.

The primary skills required for this role include ETL/ELT, SQL, SSIS, SSMS, Informatica, and Python, with secondary skills in Azure/AWS/GCP, Power BI, Advanced Excel, and Excel Macro. As a Data Ingestion Senior Associate, you should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines, designing and implementing ETL processes, monitoring and troubleshooting data pipelines, implementing data security measures, and creating visually impactful dashboards for data reporting. You should also have expertise in writing and analyzing complex SQL queries, be proficient in Excel, and possess strong communication, problem-solving, quantitative, and analytical abilities.

In our Managed Services platform, we focus on leveraging technology and human expertise to deliver simple yet powerful solutions to our clients. Our team of skilled professionals, combined with advanced technology and processes, enables us to provide effective outcomes and add greater value to our clients' enterprises. We aim to empower our clients to focus on their business priorities by providing flexible access to world-class business and technology capabilities that align with today's dynamic business environment.

If you are a candidate who thrives in a high-paced work environment, capable of handling critical Application Evolution Service offerings, engagement support, and strategic advisory work, then we are looking for you to join our team in the Data, Analytics & Insights Managed Service at PwC. Your role will involve working on a mix of help desk support, enhancement and optimization projects, as well as strategic roadmap initiatives, while also contributing to customer engagements from both a technical and relationship perspective.
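For context on the ETL/ELT and Python skills this listing emphasizes, here is a minimal, self-contained sketch of an extract-transform-load step using pandas and SQLite. File names, table names, and the cleansing rules are assumed for illustration and are not tied to any PwC engagement or client system.

```python
"""Illustrative sketch only: a minimal extract-transform-load step with pandas + SQLite."""
import sqlite3

import pandas as pd


def run_pipeline(source_csv: str = "orders.csv", target_db: str = "warehouse.db") -> int:
    """Load a CSV, apply basic cleansing, and write it to a staging table."""
    # Extract
    df = pd.read_csv(source_csv)

    # Transform: drop exact duplicates, standardise column names, enforce a date type
    df = df.drop_duplicates()
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    if "order_date" in df.columns:
        df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")

    # Load
    with sqlite3.connect(target_db) as conn:
        df.to_sql("stg_orders", conn, if_exists="replace", index=False)
    return len(df)


if __name__ == "__main__":
    print("rows loaded:", run_pipeline())
```

A production version of the same step would usually swap SQLite for the client's warehouse connection, add logging and data-quality checks, and run under an orchestrator such as ADF or an SSIS package.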
Posted 2 weeks ago
14.0 - 18.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Senior Technical Team Leader specializing in Business Intelligence, Data Governance, and Reporting, you will play a crucial role in leading the development and implementation of BI strategies, tools, and reporting solutions to align with the organization's business objectives. Your responsibilities will include serving as a subject matter expert in BI, providing support for internal initiatives, and mentoring team members on best practices. You will design, implement, and maintain scalable data models, analytical layers, and interactive dashboards using modern BI tools, primarily Power BI. Your role will involve optimizing BI architecture for scalability, performance, and adaptability to evolving business needs, as well as applying performance optimization techniques to enhance data processing, dashboard responsiveness, and user experience. Ensuring high standards of data quality, consistency, and governance across all BI solutions will be essential. Collaboration with cross-functional teams, such as data engineers, data scientists, and business stakeholders, to define and meet BI requirements will also be a key aspect of your responsibilities. Utilizing advanced Power BI features like DAX, Power Query, and the Power BI Service, you will build robust, automated reporting and analytical solutions. Additionally, hosting workshops and office hours to guide business units on Power BI usage, self-service BI strategies, and technical troubleshooting will be part of your role. Staying updated on emerging BI tools, trends, and methodologies to drive continuous innovation and improvement is crucial for success in this position.

To excel in this role, you should hold a Bachelor's or Master's degree in Computer Science, Data Science, Engineering, Mathematics, or a related field, along with at least 10 years of experience in Business Intelligence, including data warehousing, ETL pipelines, and reporting. Expert-level proficiency in BI tools, particularly Power BI, and certifications such as Certified Power BI Data Analyst Associate (PL-300) and Certified Data Management Professional (CDMP) - DAMA are desired. A strong command of DAX, Power Query, and SQL for data modeling and integration, and Python for analysis, is essential. Proficiency in Agile/Scrum or traditional project management methodologies, fostering a collaborative team culture, and encouraging continuous learning are also valuable skills for this role.

This position requires a total experience of 14-18 years and may involve work at the secondary location of Noida Campus. At our organization, we are dedicated to promoting diversity and inclusivity, providing equal opportunities for all individuals, including those with disabilities.
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
pune, maharashtra
On-site
Myers-Holum is expanding the NSAW Practice and is actively seeking experienced Enterprise Architects with strong end-to-end data warehousing and business intelligence experience to play a pivotal role leading client engagements on this team. As an Enterprise Architect specializing in Data Integration and Business Intelligence, you will be responsible for leading the strategic design, architecture, and implementation of enterprise data solutions to ensure alignment with clients' long-term business goals. You will have the opportunity to develop and promote architectural visions for data integration, Business Intelligence (BI), and analytics solutions across various business functions and applications. Leveraging cutting-edge technologies such as the Oracle NetSuite Analytics Warehouse (NSAW) platform, NetSuite ERP, Suite Commerce Advanced (SCA), and other cloud-based and on-premise tools, you will design and build scalable, high-performance data warehouses and BI solutions for clients. In this role, you will lead cross-functional teams in developing data governance frameworks, data models, and integration architectures to facilitate seamless data flow across disparate systems. By translating high-level business requirements into technical specifications, you will ensure that data architecture decisions align with broader organizational IT strategies and compliance standards. Additionally, you will architect end-to-end data pipelines, integration frameworks, and governance models to enable the seamless flow of structured and unstructured data from multiple sources. Your responsibilities will also include providing thought leadership in evaluating emerging technologies, tools, and best practices for data management, integration, and business intelligence. You will oversee the deployment and adoption of key enterprise data initiatives, engage with C-suite executives and senior stakeholders to communicate architectural solutions, and lead and mentor technical teams to foster a culture of continuous learning and innovation in data management, BI, and integration. Furthermore, as part of the MHI team, you will have the opportunity to contribute to the development of internal frameworks, methodologies, and standards for data architecture, integration, and BI. By staying up to date with industry trends and emerging technologies, you will continuously evolve the enterprise data architecture to meet the evolving needs of the organization and its clients.

To qualify for this role, you should possess 10+ years of relevant professional experience in data management, business intelligence, and integration architecture, along with 6+ years of experience in designing and implementing enterprise data architectures. You should have expertise in cloud-based data architectures, proficiency in data integration tools, experience with relational databases, and a strong understanding of BI platforms. Additionally, you should have hands-on experience with data governance, security, and compliance frameworks, as well as exceptional communication and stakeholder management skills.

Joining Myers-Holum as an Enterprise Architect offers you the opportunity to collaborate with curious and thought-provoking minds, shape your future, and positively influence change for clients. You will be part of a dynamic team that values continuous learning, growth, and innovation, while providing stability and growth opportunities within a supportive and forward-thinking organization. If you are ready to embark on a rewarding career journey with Myers-Holum and contribute to the evolution of enterprise data architecture, we invite you to explore the possibilities and discover your true potential with us.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
haryana
On-site
You will have the opportunity to lead and manage SAP MDG implementation projects, ensuring alignment with business requirements and best practices. Collaborate with stakeholders to gather and analyze business requirements, translating them into functional specifications. Your role will involve designing, configuring, and customizing SAP MDG solutions to meet business needs. You will conduct workshops and training sessions for end-users and key stakeholders, perform data modeling, data mapping, and data validation activities. Ensuring data quality and governance standards are maintained throughout the project lifecycle will be a key responsibility. You will provide ongoing support and maintenance for SAP MDG solutions, troubleshooting and resolving issues as they arise. It will be important to stay updated with the latest SAP MDG features and enhancements, recommending improvements to existing processes. You will also work on data modeling and replication, workflow configuration, and integration with other systems. To be successful in this role, you will need a Bachelor's degree in Computer Science, Information Technology, or a related field. A minimum of 5 years of experience in SAP MDG implementation and support is required. Strong understanding of master data management principles and best practices is essential. Proficiency in SAP MDG configuration, data modeling, and data governance is a must. Excellent analytical and problem-solving skills, along with strong communication and interpersonal skills, are desired. SAP MDG certification and functional knowledge of financial processes are considered a plus. A strong understanding of SAP workflows is also important. Your skills and experience should include a strong background in SAP MDG and excellent communication skills.,
Posted 2 weeks ago
0.0 - 4.0 years
0 Lacs
noida, uttar pradesh
On-site
As an Operations Executive at our company located in Noida, Sector 58, you will be responsible for supporting our business operations through efficient data management, validation, and reporting. Your meticulous attention to detail will be crucial in ensuring the accuracy and consistency of data, facilitating informed decision-making and seamless collaboration among various teams. Your primary responsibilities will include collecting and consolidating data from diverse sources, meticulously verifying data for accuracy and completeness, and preparing insightful reports and dashboards. By maintaining data quality and adhering to governance standards, you will contribute significantly to operational efficiency and effectiveness. Additionally, collaborating closely with finance, sales, and delivery teams will be essential in meeting data and reporting requirements efficiently. To excel in this role, you should hold a Bachelor's or Master's degree in Business or a related field. A solid understanding of data management and reporting practices is essential, along with proficiency in Microsoft Excel. Experience with data visualization tools will be considered advantageous. Your strong analytical and communication skills will be instrumental in fulfilling the responsibilities of this position, and having basic knowledge of data governance will be a plus. If you are passionate about operational excellence and possess the required qualifications and skills, we encourage you to apply and be a part of our dynamic team dedicated to driving business success through effective operations management.,
Posted 2 weeks ago
1.0 - 5.0 years
0 Lacs
karnataka
On-site
As a member of the Sanctions team within the Global Financial Crimes (GFC) program at FIS, you will play a key role in supporting the Senior Director to establish a strong tech and data knowledge base, as well as assist in system implementations across Global Financial Crimes Compliance. Your responsibilities will include writing documentation on sanctions workflows, standards, guidelines, testing procedures, taxonomies, and operating procedures. You will also be tasked with developing and optimizing complex SQL queries to extract, manipulate, and analyze large volumes of financial data to ensure data accuracy and integrity. Additionally, you will be responsible for creating and maintaining comprehensive data lineage documentation, contributing to the development and maintenance of master data management processes, and generating regular reporting on Financial Crimes Data Governance KPIs, metrics, and activities. Your role will involve monitoring LOB compliance activities, verifying regulatory compliance deadlines are met, and tracking product data compliance deficiencies to completion. To excel in this role, you should possess a Bachelor's or Master's degree in a relevant field such as Computer Science, Statistics, or Engineering, along with 1-3 years of experience in the regulatory compliance field. Previous experience as a Data Analyst in the financial services industry, particularly with a focus on Financial Crimes Compliance, is highly desirable. Proficiency in SQL, data analysis tools, and experience with data governance practices is essential. Strong analytical, problem-solving, and communication skills are key to success in this position. If you have experience in regulatory oversight of high-risk product lines containing complex banking functions or are considered a subject matter expert in sanctions regulatory compliance, it would be considered an added bonus. At FIS, we offer a flexible and creative work environment, diverse and collaborative atmosphere, professional and personal development resources, opportunities for volunteering and supporting charities, as well as competitive salary and benefits. FIS is committed to protecting the privacy and security of all personal information processed to provide services to clients. Our recruitment model primarily relies on direct sourcing, and we do not accept resumes from recruitment agencies that are not on our preferred supplier list. We take pride in our commitment to diversity, inclusion, and professional growth, and we invite you to be part of our team to advance the world of fintech.,
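As a hedged illustration of the SQL-based data-quality work this role describes, the sketch below runs simple completeness and duplicate checks against an in-memory SQLite table. The table layout, column names, and sample rows are assumptions for demonstration only, not FIS's actual sanctions-screening schema.

```python
"""Illustrative sketch only: SQL completeness and duplicate checks on screening data."""
import sqlite3

# Assumed check queries over a hypothetical screening_hits table.
CHECKS = {
    "missing_party_name": "SELECT COUNT(*) FROM screening_hits WHERE party_name IS NULL",
    "duplicate_hits": """
        SELECT COUNT(*) FROM (
            SELECT party_name, list_name, COUNT(*) AS n
            FROM screening_hits
            GROUP BY party_name, list_name
            HAVING n > 1
        )
    """,
}


def run_checks(conn: sqlite3.Connection) -> dict:
    """Execute each check and return {check_name: offending_row_count}."""
    return {name: conn.execute(sql).fetchone()[0] for name, sql in CHECKS.items()}


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE screening_hits (party_name TEXT, list_name TEXT, hit_date TEXT)")
    conn.executemany(
        "INSERT INTO screening_hits VALUES (?, ?, ?)",
        [
            ("ACME LTD", "OFAC SDN", "2024-01-02"),
            ("ACME LTD", "OFAC SDN", "2024-01-03"),  # duplicate party/list pair
            (None, "EU Consolidated", "2024-01-05"),  # missing party name
        ],
    )
    print(run_checks(conn))
```

In practice the same queries would run against the production compliance database, and the results would feed the KPI and data-governance reporting mentioned in the listing above.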
Posted 2 weeks ago
10.0 - 14.0 years
0 Lacs
karnataka
On-site
The Director, Clinical Operational Data Governance role at GSK is crucial in driving the governance and management of clinical operational data to support groundbreaking research and development efforts aimed at delivering transformative medicines. Your primary responsibility will be to ensure the availability of high-quality operational data from clinical studies, supporting informed decision-making for current and future clinical pipeline activities. As a key player in shaping the future of healthcare, you will play a pivotal role in ensuring the integrity and accessibility of operational data that drives our clinical studies, ultimately enabling us to bring innovative treatments to patients more efficiently. Your responsibilities will include serving as the Global Process Owner/Lead for processes related to clinical operational data intake, management, and downstream provisioning. Additionally, you will act as the Clinical Technology Lead to ensure the maintenance of an optimal technology landscape that supports the execution of business processes related to clinical operational data. As the Domain Owner for the study-level Clinical Operational Data Domain, you will lead data governance activities and collaborate closely with stakeholders to maintain data integrity, reliability, and compliance with regulations and standards. You will oversee the collection, storage, and maintenance of clinical operational data, ensuring it is organized and accessible for analysis and reporting. Your role will also involve stakeholder management, communication, and continuous improvement efforts to foster a culture of accountability, collaboration, and data quality awareness. You will be responsible for managing relationships with stakeholders, leading cross-functional projects, and identifying opportunities to enhance data management processes through advanced analytics and automation. To be successful in this role, you should have a Bachelor's degree in Information Systems, Life Sciences, Data Science, or a related field, along with at least 10 years of experience in the pharmaceutical or life sciences industry in data management and governance. Strong industry experience in clinical trial processes, regulatory requirements, and data governance principles is essential. Additionally, a proven track record in stakeholder management and leading cross-functional teams will be beneficial. Join GSK in uniting science, technology, and talent to stay ahead of disease and positively impact the health of billions of people while delivering sustainable shareholder returns. GSK is committed to creating an inclusive and inspiring work environment where employees can thrive, grow, and contribute to our shared ambition of getting Ahead Together.,
Posted 2 weeks ago
4.0 - 8.0 years
7 - 17 Lacs
Hyderabad
Hybrid
Job Title: IT - Senior Engineer, Azure Lake Years of Experience: 4-6 Years Mandatory Skills: Azure, Data Lake, SAP BW, Power BI, Tableau Key Responsibilities: Develop and maintain data architecture strategy, including design and architecture validation reviews. Architect scalable data flows, storage, and analytics platforms in cloud/hybrid environments, ensuring secure and cost-effective solutions. Establish and enforce data governance frameworks, promoting data quality and compliance. Act as a technical advisor on complex data projects and collaborate with stakeholders on project scope and planning. Drive adoption of new technologies, monitor emerging technology trends, and define standards for data management. Develop using SQL, Synapse, Databricks, Power BI, and Fabric. Required Qualifications & Experience: Bachelor's or Master's degree in Computer Science or a related field. Experience in data architecture with at least 3 years in a leadership role. Deep knowledge of Azure/AWS, Databricks, Synapse, and other cloud data platforms. Understanding of SAP technologies (SAP BW, SAP DataSphere, HANA, S/4, ECC) and visualization tools (Power BI, Tableau). Understanding of data modeling, ETL/ELT, big data, relational/NoSQL databases, and data security. Experience with AI/ML and familiarity with data mesh/fabric. 5 years of back-end/full-stack development on large-scale projects with Azure Synapse / Databricks.
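As an illustration of the Synapse/Databricks development this role calls for, here is a minimal PySpark sketch under stated assumptions: the ADLS path, the order_id column, and the target table are hypothetical, and Delta Lake support is assumed to be available (as it is on Databricks clusters).

```python
# Minimal PySpark sketch of a lake-to-curated flow; names and paths are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw CSV files landed in the data lake (hypothetical container and path).
raw = (spark.read
       .option("header", "true")
       .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/"))

# Basic data-quality step: deduplicate on a hypothetical business key and stamp the load date.
curated = (raw
           .withColumn("load_date", F.current_date())
           .dropDuplicates(["order_id"]))

# Persist as a curated Delta table (Delta support assumed on the cluster).
(curated.write
 .format("delta")
 .mode("overwrite")
 .saveAsTable("curated.sales_orders"))
```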
Posted 2 weeks ago
3.0 - 6.0 years
3 - 6 Lacs
Vapi
Work from Office
The Business Analyst/Senior Business Analyst (BA/SBA) for Master Data Management (MDM) in the Shared Service Center (SSC) will be responsible for managing and ensuring the accuracy and consistency of the organization's master data. This role will involve working closely with various departments to collect, analyze, and make changes to data as necessary. The BA/SBA will also be responsible for creating and implementing data standards and policies.
Posted 2 weeks ago
2.0 - 5.0 years
9 - 13 Lacs
Hyderabad
Work from Office
Overview We are seeking a skilled Associate Manager AIOps & MLOps Operations to support and enhance the automation, scalability, and reliability of AI/ML operations across the enterprise. This role requires a solid understanding of AI-driven observability, machine learning pipeline automation, cloud-based AI/ML platforms, and operational excellence. The ideal candidate will assist in deploying AI/ML models, ensuring continuous monitoring, and implementing self-healing automation to improve system performance, minimize downtime, and enhance decision-making with real-time AI-driven insights. Responsibilities Support and maintain AIOps and MLOps programs, ensuring alignment with business objectives, data governance standards, and enterprise data strategy. Assist in implementing real-time data observability, monitoring, and automation frameworks to enhance data reliability, quality, and operational efficiency. Contribute to developing governance models and execution roadmaps to drive efficiency across data platforms, including Azure, AWS, GCP, and on-prem environments. Ensure seamless integration of CI/CD pipelines, data pipeline automation, and self-healing capabilities across the enterprise. Collaborate with cross-functional teams to support the development and enhancement of next-generation Data & Analytics (D&A) platforms. Assist in managing the people, processes, and technology involved in sustaining Data & Analytics platforms, driving operational excellence and continuous improvement. Support Data & Analytics Technology Transformations by ensuring proactive issue identification and the automation of self-healing capabilities across the PepsiCo Data Estate. Responsibilities: Support the implementation of AIOps strategies for automating IT operations using Azure Monitor, Azure Log Analytics, and AI-driven alerting. Assist in deploying Azure-based observability solutions (Azure Monitor, Application Insights, Azure Synapse for log analytics, and Azure Data Explorer) to enhance real-time system performance monitoring. Enable AI-driven anomaly detection and root cause analysis (RCA) by collaborating with data science teams using Azure Machine Learning (Azure ML) and AI-powered log analytics. Contribute to developing self-healing and auto-remediation mechanisms using Azure Logic Apps, Azure Functions, and Power Automate to proactively resolve system issues. Support ML lifecycle automation using Azure ML, Azure DevOps, and Azure Pipelines for CI/CD of ML models. Assist in deploying scalable ML models with Azure Kubernetes Service (AKS), Azure Machine Learning Compute, and Azure Container Instances. Automate feature engineering, model versioning, and drift detection using Azure ML Pipelines and MLflow. Optimize ML workflows with Azure Data Factory, Azure Databricks, and Azure Synapse Analytics for data preparation and ETL/ELT automation. Implement basic monitoring and explainability for ML models using Azure Responsible AI Dashboard and InterpretML. Collaborate with Data Science, DevOps, CloudOps, and SRE teams to align AIOps/MLOps strategies with enterprise IT goals. Work closely with business stakeholders and IT leadership to implement AI-driven insights and automation to enhance operational decision-making. Track and report AI/ML operational KPIs, such as model accuracy, latency, and infrastructure efficiency. Assist in coordinating with cross-functional teams to maintain system performance and ensure operational resilience. 
Support the implementation of AI ethics, bias mitigation, and responsible AI practices using Azure Responsible AI Toolkits. Ensure adherence to Azure Information Protection (AIP), Role-Based Access Control (RBAC), and data security policies. Assist in developing risk management strategies for AI-driven operational automation in Azure environments. Prepare and present program updates, risk assessments, and AIOps/MLOps maturity progress to stakeholders as needed. Support efforts to attract and build a diverse, high-performing team to meet current and future business objectives. Help remove barriers to agility and enable the team to adapt quickly to shifting priorities without losing productivity. Contribute to developing the appropriate organizational structure, resource plans, and culture to support business goals. Leverage technical and operational expertise in cloud and high-performance computing to understand business requirements and earn trust with stakeholders. Qualifications 5+ years of technology work experience in a global organization, preferably in CPG or a similar industry. 5+ years of experience in the Data & Analytics field, with exposure to AI/ML operations and cloud-based platforms. 5+ years of experience working within cross-functional IT or data operations teams. 2+ years of experience in a leadership or team coordination role within an operational or support environment. Experience in AI/ML pipeline operations, observability, and automation across platforms such as Azure, AWS, and GCP. Excellent Communication: Ability to convey technical concepts to diverse audiences and empathize with stakeholders while maintaining confidence. Customer-Centric Approach: Strong focus on delivering the right customer experience by advocating for customer needs and ensuring issue resolution. Problem Ownership & Accountability: Proactive mindset to take ownership, drive outcomes, and ensure customer satisfaction. Growth Mindset: Willingness and ability to adapt and learn new technologies and methodologies in a fast-paced, evolving environment. Operational Excellence: Experience in managing and improving large-scale operational services with a focus on scalability and reliability. Site Reliability & Automation: Understanding of SRE principles, automated remediation, and operational efficiencies. Cross-Functional Collaboration: Ability to build strong relationships with internal and external stakeholders through trust and collaboration. Familiarity with CI/CD processes, data pipeline management, and self-healing automation frameworks. Strong understanding of data acquisition, data catalogs, data standards, and data management tools. Knowledge of master data management concepts, data governance, and analytics.
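To make the model drift detection and MLflow responsibilities above concrete, here is a hedged sketch of a simple drift check logged to MLflow; the feature data, threshold, and run naming are illustrative assumptions rather than a prescribed implementation, and a real pipeline would source reference and live data from the feature store or scoring logs.

```python
# Illustrative drift check: two-sample KS test on a single feature, logged to MLflow.
import numpy as np
import mlflow
from scipy.stats import ks_2samp

def check_drift(reference: np.ndarray, live: np.ndarray, threshold: float = 0.05) -> bool:
    """Flag drift when the KS-test p-value falls below the chosen threshold."""
    statistic, p_value = ks_2samp(reference, live)
    with mlflow.start_run(run_name="drift-check"):
        mlflow.log_metric("ks_statistic", statistic)
        mlflow.log_metric("p_value", p_value)
        mlflow.log_metric("drift_detected", int(p_value < threshold))
    return p_value < threshold

if __name__ == "__main__":
    rng = np.random.default_rng(42)
    reference = rng.normal(loc=0.0, scale=1.0, size=5_000)   # training-time feature sample
    live = rng.normal(loc=0.3, scale=1.0, size=5_000)        # shifted production sample
    print("drift detected:", check_drift(reference, live))
```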
Posted 2 weeks ago
5.0 - 10.0 years
7 - 12 Lacs
Chennai
Work from Office
Customers trust the Alation Data Intelligence Platform for self-service analytics, cloud transformation, data governance, and AI-ready data, fostering data-driven innovation at scale. With more than $340M in funding, valued at over $1.7 billion, and nearly 600 customers including 40% of the Fortune 100, Alation helps organizations realize value from data and AI initiatives. Alation has been recognized in 2024 as one of Inc. Magazine's Best Workplaces for the fifth time, a testament to our commitment to creating an inclusive, innovative, and collaborative environment. Collaboration is at the forefront of everything we do. We strive to bring diverse perspectives together and empower each team member to contribute their unique strengths to live out our values each day. These are: Move the Ball, Build for the Long Term, Listen Like You're Wrong, and Measure Through Customer Impact. Joining Alation means being part of a fast-paced, high-growth company where every voice matters, and where we're shaping the future of data intelligence with AI-ready data. Join us on our journey to build a world where data culture thrives and curiosity is celebrated each day! Job Description: About the Role An independent contributor role that is one of the pillars of the team. Opportunity to own business-critical components and services, powering key use cases for Alation's customers. The role is one of high ownership and large business impact, executing on a future vision that customers will love. What you'll do Own the design, development, and optimization of features and services Solve challenging technical problems with minimal guidance Develop, maintain, and evangelize scalable, maintainable, and resilient source code Maintain and improve development best practices, code quality, and testing strategies Enhance automation by reducing execution time for faster feedback and reliable regression detection Identify technical risks and propose mitigation strategies Contribute to architectural discussions and provide input to ensure technical clarity across the team You should have 5+ years of professional experience designing, developing, and shipping software products and/or n-tier services Proficiency in any object-oriented language, preferably Golang, Python or Java Experience in developing, deploying, and maintaining micro-services Strong problem-solving and analytical skills Excellent communication and collaboration skills Ability to learn through collaboration and apply the knowledge to the assigned tasks Bachelor's Degree in Computer Science or similar A Big Plus Experience with Airflow, Kafka Working experience in Kubernetes and/or Docker Exposure to data modeling in RDBMS #LI-VV1
Posted 2 weeks ago
5.0 - 10.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Responsibilities: Development of workflows and connectors for the Collibra Platform Administration and configuration of the Collibra Platform Duties: Collibra DGC administration and configuration Collibra Connect administration and configuration Development of Collibra workflows and MuleSoft connectors Ingesting metadata from external sources into Collibra Installation, upgrade, and administration of Collibra components Setup, support, deployment & migration of Collibra components Implement application changes: review and deploy code packages, perform post-implementation verifications Participate in group meetings (including business partners) for problem solving, decision making, and implementation planning Senior Collibra Developer - Mandatory Skills MUST HAVE SKILLS: Collibra Connect Collibra DGC Java Advanced hands-on working knowledge of Unix/Linux Advanced hands-on experience with UNIX scripting SQL Server Groovy Nice to have: Knowledge of and interest in data governance and/or metadata management Working knowledge of Jira would be an asset
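For orientation, a hedged sketch of what ingesting metadata from an external source into Collibra over its REST API can look like using Python's requests library; the instance URL, endpoint path, UUIDs, and credentials below are illustrative assumptions, and the exact contract should be taken from your Collibra version's API documentation.

```python
# Hedged sketch: registering one harvested table as a Collibra asset via REST.
# Endpoint path, IDs, and credentials are placeholders, not a verified contract.
import requests

COLLIBRA_URL = "https://your-instance.collibra.com"    # hypothetical instance
session = requests.Session()
session.auth = ("svc_catalog_user", "********")         # placeholder service account

asset_payload = {
    "name": "CUSTOMER_MASTER",                                   # table found in the source system
    "domainId": "00000000-0000-0000-0000-000000000000",          # placeholder domain UUID
    "typeId": "00000000-0000-0000-0000-000000000001",            # placeholder 'Table' asset type UUID
}

response = session.post(f"{COLLIBRA_URL}/rest/2.0/assets", json=asset_payload, timeout=30)
response.raise_for_status()
print("created asset id:", response.json().get("id"))
```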
Posted 2 weeks ago
1.0 - 6.0 years
5 - 9 Lacs
Gurugram
Work from Office
As an EEO/Affirmative Action Employer, all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, age, national origin, disability, or veteran status. Waste Management (WM), a Fortune 250 company, is the leading provider of comprehensive waste and environmental services in North America. We are strongly committed to a foundation of operating excellence, professionalism and financial strength. WM serves nearly 25 million customers in residential, commercial, industrial and municipal markets throughout North America through a network of collection operations, transfer stations, landfills, recycling facilities and waste-based energy production projects. I. Job Summary This entry-level position is responsible for the configuration and support of software application systems within the People Organization. As part of the HR Technology team, provides fundamental technical and analytical support of HR foundational elements and structure that impact HR processes. II. Essential Duties and Responsibilities To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. Other duties may be assigned. Monitors HR systems and open cases, and reviews current processes to troubleshoot application-related issues and answer system-related questions. Performs process review analysis and provides documentation on current to future state. Continues to seek development on the job and through training. Makes required configuration changes according to documented requirements. Generally responsible for foundational workforce data structures such as job codes, positions, location tables, HR departments and other organizational structures and data fields. Considers impact of configuration of tables, data fields and foundational structures on downstream systems and integrations. Ensures data integrity and governance by supporting data imports and extracts and validating accuracy through reporting and queries. Reviews new software application products or new modules in existing applications. Provides fundamental day-to-day support and maintenance for system(s), preparation for releases, upgrades and/or patches through testing, reporting and analysis of changes. Executes unit, integration and acceptance testing. Working with the functional team, provides screen shots and system steps for testing and change management. Delivers simple reports and queries utilizing delivered software. Follows established data governance. Documents all configuration. III. Supervisory Responsibilities No supervisory duties. IV. Qualifications The requirements listed below are representative of the qualifications necessary to perform the job. A. Education and Experience Education: Bachelor's Degree (accredited), or in lieu of degree, High School Diploma or GED (accredited) and four (4) years of relevant work experience. Experience: No previous work experience required (in addition to the education requirement). V. Work Environment Listed below are key points regarding environmental demands and work environment of the job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of the job. Required to use motor coordination with finger dexterity (such as keyboarding, machine operation, etc.) 
most of the work day; Required to exert physical effort in handling objects less than 30 pounds rarely; Required to be exposed to physical occupational risks (such as cuts, burns, exposure to toxic chemicals, etc.) rarely; Required to be exposed to physical environment which involves dirt, odors, noise, weather extremes or similar elements rarely; Normal setting for this job is: office setting. Must be available to work standard business hours, as well as be available to work non-standard hours in case of emergency (natural disasters, power outages, etc.). May need to attend after-hours calls with the offshore team. Benefits At Waste Management, each eligible employee receives a competitive total compensation package including Medical, Dental, Vision, Life Insurance and Short Term Disability, as well as a Stock Purchase Plan, Company match on 401K, and more! Our employees also receive Paid Vacation, Holidays, and Personal Days. Please note that benefits may vary by site. If this sounds like the opportunity that you have been looking for, please click "Apply".
Posted 2 weeks ago
5.0 - 10.0 years
15 - 19 Lacs
Pune
Work from Office
Master Data Team Leader - Finance & Quality Job Description You're not the person who will settle for just any role. Neither are we. Because we're out to create Better Care for a Better World, and that takes a certain kind of person and teams who care about making a difference. At Kimberly-Clark, you'll bring your professional expertise, talent, and drive to building and managing our portfolio of iconic, ground-breaking brands. From our new Hub in Pune, you will own important work that will enable our organization to succeed globally. Role Purpose: Master data is a critical business asset with a significant impact on business performance and decision-making. Properly managed and high-quality master data will enable the company to drive business results, while poorly controlled and low-quality data will lead to higher costs and missed opportunities. Role Accountabilities: You will apply data governance best practices in the delivery of master data management services, ensuring that high-quality (timely, complete, accurate, and consistent) master data is delivered, maintained, and used to drive business results. You will be responsible for delivering master data management services globally, supporting the implementation of systems in partnership with ITS, leading continuous improvement initiatives, and driving global standardization. The current scope covers the majority of master data types (domains), including product, customer, vendor, material, and finance. The scope is unrestricted and may expand in the future to include additional master data types where business value is identified. Leadership: Organizational and management skills. Strong ability to communicate with multiple organizational levels and cultures. Excellent interpersonal and collaboration skills. Problem-solving skills. Analytical and critical thinking skills. Results-oriented and customer-focused. Superior attention to detail. Advanced project management skills. Consistently demonstrate the KC Values (We Care, We Own, We Act) and Our Ways of Working (Focus on Consumers, Play to Win, Move Fast and Grow Our People). Qualifications: Bachelor's degree in Business Administration, Engineering, or a related field. Over 5 years of relevant experience. Expert in Finance Master Data principles, quality, and practices, and their relationship to the business. Good understanding of financial and accounting concepts and processes, and experience in related activities (month-end closing, costing, reporting). 
Knowledge of external and internal controls for Vendor and Customer Master Data and adherence to SOX control compliance. Advanced user of SAP Finance (FI) modules. CI/LEAN experience. Advanced Microsoft Excel proficiency. Knowledge of Power BI, PowerApps, Winshuttle, macros and/or SAP scripting will be an advantage. About Us Huggies. Kleenex. Cottonelle. Scott. Kotex. Poise. Depend. Kimberly-Clark Professional. You already know our legendary brands, and so does the rest of the world. In fact, millions of people use Kimberly-Clark products every day. We know these amazing Kimberly-Clark products wouldn't exist without talented professionals, like you. At Kimberly-Clark, you'll be part of the best team committed to driving innovation, growth and impact. We're founded on more than 150 years of market leadership, and we're always looking for new and better ways to perform - so there's your open door of opportunity. It's all here for you at Kimberly-Clark. Led by Purpose. Driven by You. About You You perform at the highest level possible, and you appreciate a performance culture fueled by authentic caring. You want to be part of a company actively dedicated to sustainability, inclusion, wellbeing, and career development. You love what you do, especially when the work you do makes a difference. At Kimberly-Clark, we're constantly exploring new ideas on how, when, and where we can best achieve results. When you join our team, you'll experience Flex That Works: flexible (hybrid) work arrangements that empower you to have purposeful time in the office and partner with your leader to make flexibility work for both you and the business. And finally, the fine print. Employment is subject to verification of pre-screening tests, which may include drug screening, background check, and DMV check. #LI-Hybrid Primary Location India - Pune Additional Locations Worker Type Employee Worker Sub-Type Regular Time Type Full time
Posted 2 weeks ago
10.0 - 18.0 years
50 - 70 Lacs
Mumbai
Work from Office
A Chief Data Officer (CDO) is responsible for developing and executing an organizational data strategy. Key highlights of the role are listed below (purely indicative and not limiting): Enterprise Data Strategy & Governance Develop and lead the enterprise-wide data strategy in alignment with business priorities and long-term digital goals. Define a prioritization framework to drive cross-functional data initiatives across the organization. Collaborate with business units to identify data needs and translate them into strategic data initiatives. Establish robust data governance frameworks, policies, and standards to ensure data quality, integrity, and accessibility. Data Privacy, Protection & Compliance Collaborate with the CISO, Legal, Compliance, and Risk teams to ensure adherence to data protection and privacy regulations (RBI, SEBI, DPDP, etc.). Lead the organization's data privacy program, including policy development and breach response protocols. Oversee accurate and timely regulatory reporting based on validated and compliant data sources. Data Architecture & Management Build and manage the data architecture and infrastructure, including data lakes, warehouses, and master data systems (GEP). Ensure secure, scalable, and efficient data storage, access, and sharing across the enterprise. Monitor and enforce compliance with internal data policies and external regulatory requirements. Analytics, BI & Advanced Use Cases Drive adoption of advanced analytics (AI/ML) across the customer lifecycle: acquisition, credit underwriting, collections, fraud, and portfolio management. Enable real-time dashboards and business intelligence reporting for operational and strategic decision-making. Promote the use of analytics for customer segmentation, CLTV, and personalized engagement strategies. Build scalable data architecture including data lakes, data warehouses, and a unified Single Customer View (SCV). Vendor & Third-Party Data Management Partner with the CIO on data architecture modernization and API integration across partners (AUA/KUA, credit bureaus, etc.). Periodically evaluate vendor platforms for data lineage, encryption, and audit capabilities. Cross-Functional Collaboration & Advisory Act as a strategic advisor to business units on data-driven initiatives and prioritization. Partner with internal stakeholders (Risk, IT, Product, Credit, Marketing, Operations, Compliance) to embed data into core processes and decision-making. Establish and manage external data partnerships and co-develop data products where applicable. Leadership & Capability Building Build and lead a high-performing data organization comprising data scientists, data engineers, analysts, and governance professionals. Foster a data-driven culture by improving data literacy across functions and leadership teams. Champion change management initiatives to support enterprise-wide data transformation.
Posted 2 weeks ago
3.0 - 8.0 years
4 - 8 Lacs
Mumbai
Work from Office
Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Snowflake Data Warehouse Good to have skills: Data Engineering, Databricks Unified Data Analytics Platform Minimum 3 year(s) of experience is required Educational Qualification: 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data solutions are efficient, scalable, and aligned with business objectives. You will also monitor and optimize existing data processes to enhance performance and reliability, while staying updated with the latest industry trends and technologies to continuously improve data management practices. Roles & Responsibilities: - Expected to perform independently and become an SME. - Active participation/contribution in team discussions is required. - Contribute to providing solutions to work-related problems. - Collaborate with stakeholders to gather and analyze data requirements. - Design and implement data models that support business needs. Professional & Technical Skills: - Must Have Skills: Proficiency in Snowflake Data Warehouse. - Good To Have Skills: Experience with Data Engineering and the Databricks Unified Data Analytics Platform. - Strong understanding of ETL processes and data integration techniques. - Experience with data quality assurance and data governance practices. - Familiarity with cloud-based data solutions and architecture. Additional Information: - The candidate should have a minimum of 3 years of experience in Snowflake Data Warehouse. - This position is based at our Mumbai office. - 15 years of full time education is required. Qualification: 15 years full time education
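As a minimal sketch of the kind of Snowflake ETL step described above, assuming the snowflake-connector-python package and hypothetical account, stage, schema, and table names:

```python
# Illustrative Snowflake load-and-promote step; all identifiers are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="your_account",          # placeholder account identifier
    user="etl_user",
    password="********",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

cur = conn.cursor()
try:
    # Load raw files from an external stage into a staging table.
    cur.execute(
        "COPY INTO STAGING.ORDERS_RAW FROM @raw_stage/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
    )
    # Promote only rows that pass a basic data-quality rule to the curated table.
    cur.execute("""
        INSERT INTO CURATED.ORDERS
        SELECT * FROM STAGING.ORDERS_RAW
        WHERE order_id IS NOT NULL
    """)
finally:
    cur.close()
    conn.close()
```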
Posted 2 weeks ago