5.0 - 9.0 years
0 Lacs
maharashtra
On-site
The Workday Adaptive Planning + Workday Finance role requires an experienced professional with 5-8 years of experience; the position is open PAN India. As the ideal candidate, you will be responsible for managing the Workday Financials system, including maintenance, configuration of allocations, integration testing, and review with third-party systems. You will also maintain the structure of Workday Financials and Adaptive Planning, including the chart of accounts, organizational hierarchy, and calculation logic.

Key responsibilities:
- Build and update complex financial models within Adaptive Planning to support budgeting, forecasting, and scenario analysis.
- Provide training to end users on effectively utilizing Workday Financials and Adaptive Planning features.
- Assist the team in maintaining metadata, business processes, security groups, and user-raised support tickets for both systems, adhering to established Service Level Agreements for support tickets and commitments.
- Work closely with the business to gather requirements, develop fit-gap analyses, provide training on new features, and make adoption recommendations for new or deprecated functionality from Workday Financials and Adaptive Planning releases and updates.
- Create customized dashboards and reports using data from both systems to provide key insights to stakeholders.
- Actively participate in implementations, upgrades, integration support, and enhancements of financial systems.
- Submit external auditor requests related to IT support of financial systems in a timely manner.
- Collaborate with finance teams, including FP&A and accounting, to understand their business needs and translate them into system configurations and reporting requirements.
General accounting knowledge of financial statements, system consolidation, varying ledger and reporting currencies, and complex intercompany transactions is necessary. If you find this opportunity interesting and aligned with your skills and experience, please share your CV at Sneha.Gedam@ltimindtree.com.
Posted 1 week ago
8.0 - 10.0 years
22 - 27 Lacs
Bengaluru
Work from Office
Role Purpose
The purpose of the role is to define and develop the Enterprise Data Structure, including the Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

Do

1. Define and develop a Data Architecture that aids the organization and clients in new/existing deals:
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations that maximize the value of data and information assets, and protect the organization from disruptions while also embracing innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
c. Create the data strategy and road maps for the Reference Data Architecture as required by clients
d. Engage all stakeholders to implement data governance models and ensure that implementation reflects every change request
e. Ensure that data storage and database technologies are supported by the enterprise's data management and infrastructure
f. Develop, communicate, support, and monitor compliance with Data Modelling standards
g. Oversee and monitor all frameworks used to manage data across the organization
h. Provide insights on database storage and platforms for ease of use and minimal manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present the data repository, objects, and source systems, along with data scenarios for front-end and back-end usage
l. Define high-level data migration plans to transition data from source to target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Oversee all data standards/references/papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance:
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems, and identify improvements
v. Evaluate and recommend solutions that integrate with the overall technology ecosystem while keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from a technology, cost-structure, and customer-differentiation point of view
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate these to planning current and future IT needs

2. Build the enterprise technology environment for data architecture management:
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hub & lake, and data management processes
b. Evaluate all implemented systems to determine their viability in terms of cost-effectiveness
c. Collect structural and non-structural data from different places and integrate it into one database form
d. Work through every stage of data processing: analysis, creation, physical data model designs, solutions, and reports
e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement best security practices across all databases based on accessibility and technology
g. Maintain a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration

3. Enable delivery teams by providing optimal delivery solutions/frameworks:
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, and back-up and recovery specifications
c. Develop and establish relevant technical, business-process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously occurring errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks, and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to the delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration teams improve efficiency and client experience, including through the use of AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement:
i. Support the pre-sales team in presenting the entire solution design and its principles to the client
ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met
iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

Mandatory Skills: Data Governance. Experience: 8-10 Years.
Posted 1 week ago
2.0 - 4.0 years
6 - 12 Lacs
Hyderabad
Work from Office
We are seeking experienced Data Analysts / Data Engineers with strong expertise in U.S. pharmaceutical commercial datasets to support critical Data Operations initiatives. This role will focus on onboarding third-party data, ensuring data quality, and implementing outlier detection techniques. Familiarity with ML/AI approaches to anomaly detection is highly desirable.

Key Responsibilities:
Pharma Data Integration:
- Work extensively with U.S. pharmaceutical commercial datasets.
- Ingest and onboard third-party data sources such as IQVIA, Symphony Health, Komodo Health, etc.
- Ensure alignment of data schemas, dictionary mapping, and metadata integrity.
Data Quality & Governance:
- Design and implement QC protocols for data integrity and completeness.
- Track data lineage and maintain proper documentation of data flows and transformations.
Outlier Detection & Analytics:
- Apply statistical or algorithmic techniques to identify anomalies in data related to sales, claims, or patient-level records.
- Utilize ML/AI tools (if applicable) for automated outlier detection and trend analysis.
Collaboration & Reporting:
- Work cross-functionally with business teams, data scientists, and IT to ensure timely delivery of reliable data.
- Provide detailed reports and insights to help stakeholders make commercial decisions.

Required Skills & Qualifications:
- 3+ years of experience in Pharmaceutical Data Operations, preferably with U.S. market data.
- Strong hands-on experience with third-party commercial healthcare data sources (IQVIA, Symphony, Komodo, etc.).
- Solid understanding of ETL pipelines, data ingestion frameworks, and metadata management.
- Proficiency in SQL, Python, or R for data processing and quality checks.
- Experience with outlier detection techniques, both statistical (Z-score, IQR, etc.) and ML-based (Isolation Forest, autoencoders, etc.).
- Familiarity with Snowflake, Databricks, AWS, or similar cloud platforms is a plus.
- Excellent problem-solving, documentation, and communication skills.
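The statistical techniques named above can be sketched in a few lines of standard-library Python. This is an illustrative example, not part of the posting: the function names and the sample prescription counts are invented for demonstration.

```python
from statistics import mean, stdev

def zscore_outliers(values, threshold=3.0):
    """Flag points whose z-score (distance from the mean in
    standard deviations) exceeds the threshold."""
    mu, sigma = mean(values), stdev(values)
    return [v for v in values if abs(v - mu) / sigma > threshold]

def iqr_outliers(values, k=1.5):
    """Flag points outside [Q1 - k*IQR, Q3 + k*IQR], the Tukey
    fences used in box plots; robust to a single extreme value."""
    s = sorted(values)
    n = len(s)
    q1, q3 = s[n // 4], s[(3 * n) // 4]
    iqr = q3 - q1
    lo, hi = q1 - k * iqr, q3 + k * iqr
    return [v for v in values if v < lo or v > hi]

# Hypothetical weekly script counts with one anomalous spike.
weekly_scripts = [102, 98, 101, 97, 103, 99, 100, 540]
print(iqr_outliers(weekly_scripts))  # -> [540]
```

Note that the IQR fence catches the spike here while a z-score at the usual threshold of 3 would not, because a single extreme value inflates the standard deviation; this masking effect is one reason ML-based methods like Isolation Forest are also used.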
Posted 1 week ago
10.0 - 15.0 years
35 - 40 Lacs
Hyderabad
Work from Office
Role Purpose
The purpose of the role is to define and develop the Enterprise Data Structure, including the Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and strengthening modelling standards and business information.

Do

1. Define and develop a Data Architecture that aids the organization and clients in new/existing deals:
a. Partner with business leadership (adopting the rationalization of the data value chain) to provide strategic, information-based recommendations that maximize the value of data and information assets, and protect the organization from disruptions while also embracing innovation
b. Assess the benefits and risks of data by using tools such as business capability models to create a data-centric view that quickly visualizes what data matters most to the organization, based on the defined business strategy
c. Create the data strategy and road maps for the Reference Data Architecture as required by clients
d. Engage all stakeholders to implement data governance models and ensure that implementation reflects every change request
e. Ensure that data storage and database technologies are supported by the enterprise's data management and infrastructure
f. Develop, communicate, support, and monitor compliance with Data Modelling standards
g. Oversee and monitor all frameworks used to manage data across the organization
h. Provide insights on database storage and platforms for ease of use and minimal manual work
i. Collaborate with vendors to ensure integrity, objectives, and system configuration
j. Collaborate with functional and technical teams and clients to understand the implications of data architecture and maximize the value of information across the organization
k. Present the data repository, objects, and source systems, along with data scenarios for front-end and back-end usage
l. Define high-level data migration plans to transition data from source to target system/application, addressing the gaps between the current and future state, typically in sync with IT budgeting or other capital planning processes
m. Maintain knowledge of all data service provider platforms and ensure an end-to-end view
n. Oversee all data standards/references/papers for proper governance
o. Promote, guard, and guide the organization towards common semantics and the proper use of metadata
p. Collect, aggregate, match, consolidate, quality-assure, persist, and distribute such data throughout the organization to ensure a common understanding, consistency, accuracy, and control
q. Provide solutions for RFPs received from clients and ensure overall implementation assurance:
i. Develop a direction to manage the portfolio of all databases, including systems and shared infrastructure services, to better match business outcome objectives
ii. Analyse the technology environment, enterprise specifics, and client requirements to set a collaboration solution for big/small data
iii. Provide technical leadership for the implementation of custom solutions through thoughtful use of modern technology
iv. Define and understand current issues and problems, and identify improvements
v. Evaluate and recommend solutions that integrate with the overall technology ecosystem while keeping consistency throughout
vi. Understand the root-cause problems in integrating business and product units
vii. Validate the solution/prototype from a technology, cost-structure, and customer-differentiation point of view
viii. Collaborate with sales and delivery leadership teams to identify future needs and requirements
ix. Track industry and application trends and relate these to planning current and future IT needs

2. Build the enterprise technology environment for data architecture management:
a. Develop, maintain, and implement standard patterns for data layers, data stores, data hub & lake, and data management processes
b. Evaluate all implemented systems to determine their viability in terms of cost-effectiveness
c. Collect structural and non-structural data from different places and integrate it into one database form
d. Work through every stage of data processing: analysis, creation, physical data model designs, solutions, and reports
e. Build the enterprise conceptual and logical data models for analytics, operational, and data mart structures in accordance with industry best practices
f. Implement best security practices across all databases based on accessibility and technology
g. Maintain a strong understanding of activities within the primary discipline, such as Master Data Management (MDM), Metadata Management, and Data Governance (DG)
h. Demonstrate strong experience in conceptual, logical, and physical database architectures, design patterns, best practices, and programming techniques around relational data modelling and data integration

3. Enable delivery teams by providing optimal delivery solutions/frameworks:
a. Build and maintain relationships with delivery and practice leadership teams and other key stakeholders to become a trusted advisor
b. Define database physical structure, functional capabilities, security, and back-up and recovery specifications
c. Develop and establish relevant technical, business-process, and overall support metrics (KPI/SLA) to drive results
d. Monitor system capabilities and performance by performing tests and configurations
e. Integrate new solutions and troubleshoot previously occurring errors
f. Manage multiple projects and accurately report the status of all major assignments while adhering to all project management standards
g. Identify technical, process, and structural risks, and prepare a risk mitigation plan for all projects
h. Ensure quality assurance of all architecture and design decisions, and provide technical mitigation support to the delivery teams
i. Recommend tools for reuse and automation for improved productivity and reduced cycle times
j. Help the support and integration teams improve efficiency and client experience, including through the use of AI methods
k. Develop trust and build effective working relationships through respectful, collaborative engagement across individual product teams
l. Ensure architecture principles and standards are consistently applied to all projects
m. Ensure optimal client engagement:
i. Support the pre-sales team in presenting the entire solution design and its principles to the client
ii. Negotiate, manage, and coordinate with client teams to ensure all requirements are met
iii. Demonstrate thought leadership with strong technical capability in front of the client to win confidence and act as a trusted advisor

Mandatory Skills: AI Application Integration. Experience: 10 Years.
Posted 1 week ago
1.0 - 4.0 years
9 - 13 Lacs
Pune
Work from Office
Overview
The Data Technology group at MSCI is responsible for building and maintaining a state-of-the-art data management platform that delivers Reference, Market, and other critical datapoints to various products of the firm. The platform, hosted in the firm's data centers and on the Azure and GCP public clouds, processes 100 TB+ of data and is expected to run 24*7. With increased focus on automation around systems development and operations, Data Science-based quality control, and cloud migration, several tech stack modernization initiatives are currently in progress. To accomplish these initiatives, we are seeking a highly motivated and innovative individual to join the Data Engineering team to support our next generation of developer tools and infrastructure. The team is the hub around which the Engineering and Operations teams revolve for automation and is committed to providing self-serve tools to our internal customers.

Responsibilities
- Implement & Maintain Data Catalogs: Deploy and manage the data catalog tool Collibra to improve data discoverability and governance.
- Metadata & Lineage Management: Automate metadata collection, establish data lineage, and maintain consistent data definitions across systems.
- Enable Data Governance: Collaborate with governance teams to apply data policies, classifications, and ownership structures in the catalog.
- Support Self-Service & Adoption: Promote catalog usage across teams through training, documentation, and continuous support.
- Cross-Team Collaboration: Work closely with data engineers, analysts, and stewards to align catalog content with business needs.
- Tooling & Automation: Build scripts and workflows for metadata ingestion, tagging, and monitoring of catalog health; leverage AI tools to automate cataloging activities.
- Reporting & Documentation: Maintain documentation and generate usage metrics, ensuring transparency and operational efficiency.
Qualifications
- Self-motivated, collaborative individual with a passion for excellence
- Degree in Computer Science or equivalent, with 5+ years of total experience and at least 2 years of experience working with Azure DevOps tools and technologies
- Good working knowledge of source control applications like Git, with prior experience building deployment workflows using this tool
- Good working knowledge of Snowflake, YAML, and Python
- Tools: Experience with data catalog platforms (e.g., Collibra, Alation, DataHub)
- Metadata & Lineage: Understanding of metadata management and data lineage
- Scripting: Proficient in SQL and Python for automation and integration
- APIs & Integration: Ability to connect catalog tools with data sources using APIs
- Cloud Knowledge: Familiar with cloud data services (Azure, GCP)
- Data Governance: Basic knowledge of data stewardship, classification, and compliance
- Collaboration: Strong communication skills to work across data and business teams

What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups.
All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. 
Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 1 week ago
5.0 - 10.0 years
4 - 7 Lacs
Bengaluru
Work from Office
Responsibilities:
- Development of workflows and connectors for the Collibra Platform
- Administration and configuration of the Collibra Platform

Duties:
- Collibra DGC administration and configuration
- Collibra Connect administration and configuration
- Collibra development of workflows and MuleSoft connectors
- Ingesting metadata from any external sources into Collibra
- Installation, upgrading, and administration of Collibra components
- Setup, support, deployment, and migration of Collibra components
- Implement application changes: review and deploy code packages, perform post-implementation verifications
- Participate in group meetings (including business partners) for problem solving, decision making, and implementation planning

Senior Collibra Developer - Mandatory Skills

Must-have skills:
- Collibra Connect
- Collibra DGC
- Java
- Advanced hands-on working knowledge of Unix/Linux
- Advanced hands-on experience with UNIX scripting
- SQL Server
- Groovy

Nice to have:
- Knowledge of and interest in data governance and/or metadata management
- Working knowledge of Jira would be an asset
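Ingesting metadata from an external source into a catalog typically means building a JSON payload per asset and posting it to the catalog's REST API. The sketch below is illustrative only: the URL, the domain and type identifiers, and the payload fields are assumptions modeled on the general shape of Collibra's REST API, and must be checked against the API reference for your Collibra version before use.

```python
import json

# Hypothetical instance URL -- replace with your environment's address.
COLLIBRA_ASSETS_URL = "https://your-instance.collibra.com/rest/2.0/assets"

def build_asset_payload(name, domain_id, type_id):
    """Build the JSON body for registering one metadata asset.
    Field names follow the common id-referencing pattern of catalog
    APIs; verify them against your Collibra API documentation."""
    return {"name": name, "domainId": domain_id, "typeId": type_id}

# Example: register a database table as a catalog asset. The ids are
# placeholders -- real ones come from your Collibra domain/type setup.
payload = build_asset_payload("sales_db.orders", "dom-123", "typ-table")
body = json.dumps(payload)

# In a real run you would POST this with an authenticated session, e.g.:
#   requests.post(COLLIBRA_ASSETS_URL, json=payload, auth=(user, password))
print(body)
```

Keeping payload construction separate from the HTTP call, as above, makes the ingestion logic unit-testable without a live Collibra instance.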
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
vadodara, gujarat
On-site
The role aims to define and develop the Enterprise Data Structure, including the Data Warehouse, Master Data, Integration, and transaction processing, while maintaining and enhancing modelling standards and business information.

You will be responsible for defining and developing a Data Architecture that supports the organization and clients in new/existing deals. This involves partnering with business leadership to provide strategic recommendations, creating data strategy and road maps, implementing data governance models, ensuring data storage technologies align with enterprise infrastructure, monitoring compliance with Data Modelling standards, and collaborating with various stakeholders to maximize the value of data architecture.

Additionally, you will build the enterprise technology environment for data architecture management by developing standard patterns for data layers, data stores, data hub & lake, and data management processes; evaluating system implementations for cost-effectiveness; building conceptual and logical data models; implementing best security practices; and demonstrating strong experience in Master Data Management, Metadata Management, and Data Governance.

Furthermore, you will enable delivery teams by providing optimal delivery solutions/frameworks, maintaining relationships with key stakeholders, defining database physical structure and specifications, establishing relevant technical and business-process metrics, monitoring system capabilities and performance, identifying and mitigating risks, ensuring quality assurance of architecture/design decisions, recommending tools for improved productivity, and supporting integration teams for better efficiency and client experience.

Wipro is seeking individuals who are inspired by reinvention and looking to evolve in their careers. The company is dedicated to digital transformation and welcomes applications from individuals with disabilities. If you are motivated by constant evolution and wish to be part of a purpose-driven organization, consider joining Wipro to realize your ambitions.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You have a total of 4-6 years of development/design experience, with a minimum of 3 years' experience in Big Data technologies on-prem and in the cloud. You should be proficient in Snowflake and possess strong SQL programming skills. The role requires strong experience with data modeling and schema design, as well as extensive experience using data warehousing tools such as Snowflake, BigQuery, or Redshift and BI tools such as Tableau, QuickSight, or Power BI (at least one of each is a must-have). You must also have experience with orchestration tools like Airflow and the transformation tool DBT. Your responsibilities will include implementing ETL/ELT processes and building data pipelines, along with workflow management, job scheduling, and monitoring. You should have a good understanding of Data Governance, Security and Compliance, Data Quality, Metadata Management, Master Data Management, and Data Catalogs, as well as cloud services (AWS), including IAM and log analytics. Excellent interpersonal and teamwork skills are essential, along with experience leading and mentoring other team members. Good knowledge of Agile Scrum and strong communication skills are also required.

At GlobalLogic, the culture prioritizes caring and inclusivity. You'll join an environment where people come first, fostering meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. Continuous learning and development opportunities are provided to help you grow personally and professionally. Meaningful work awaits you at GlobalLogic, where you'll have the chance to work on impactful projects and engage your curiosity and problem-solving skills. The organization values balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve a healthy balance between work and life. GlobalLogic is a high-trust organization where integrity is key, ensuring a safe, reliable, and ethical global environment for all employees.
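The ELT pattern this role centers on — land raw data first, then transform it with SQL inside the warehouse — can be sketched with Python's built-in sqlite3 standing in for Snowflake. This is an illustrative example: the table names, columns, and order rows are invented, and a real pipeline would express the transform as a DBT model scheduled by Airflow.

```python
import sqlite3

# Land raw rows as-is (the "EL" of ELT); quality issues stay visible.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id INTEGER, amount REAL, status TEXT)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, 120.0, "shipped"), (2, None, "shipped"), (3, 75.5, "cancelled")],
)

# Transform step: keep only valid, shipped orders. A DBT model would
# express this same SELECT in a .sql file referencing the source table.
conn.execute("""
    CREATE TABLE fct_orders AS
    SELECT id, amount
    FROM raw_orders
    WHERE amount IS NOT NULL AND status = 'shipped'
""")

# Data-quality check, analogous to a DBT not_null / accepted_values test.
rows = conn.execute("SELECT id, amount FROM fct_orders ORDER BY id").fetchall()
print(rows)  # -> [(1, 120.0)]
```

The value of the pattern is that the raw layer preserves the bad rows (the NULL amount, the cancelled order) for auditing, while downstream consumers only ever see the cleaned `fct_orders` table.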
Truthfulness, candor, and integrity are fundamental values upheld in everything GlobalLogic does. GlobalLogic, a Hitachi Group Company, is a trusted digital engineering partner that collaborates with the world's largest and most forward-thinking companies. Leading the digital revolution since 2000, GlobalLogic helps create innovative digital products and experiences, transforming businesses and redefining industries through intelligent products, platforms, and services.,
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
At PwC, the infrastructure team focuses on designing and implementing secure IT systems that support business operations. The primary goal is to ensure the smooth functioning of networks, servers, and data centers to enhance performance and minimize downtime. In the infrastructure engineering role at PwC, you will create robust and scalable technology infrastructure solutions for clients, working on network architecture, server management, and cloud computing.

For the Data Modeler role, we are seeking candidates with a solid background in data modeling, metadata management, and data system optimization. Your responsibilities will include analyzing business requirements, developing long-term data models, and ensuring the efficiency and consistency of our data systems. Key areas of expertise for this role include:
- Analyzing and translating business needs into long-term data model solutions.
- Evaluating existing data systems and suggesting enhancements.
- Defining rules for translating and transforming data across various models.
- Collaborating with the development team to create conceptual data models and data flows.
- Establishing best practices for data coding to maintain consistency within the system.
- Reviewing modifications of existing systems for cross-compatibility.
- Implementing data strategies and developing physical data models.
- Updating and optimizing local and metadata models.
- Utilizing canonical data modeling techniques to improve data system efficiency.
- Evaluating implemented data systems for variances, discrepancies, and efficiency.
- Troubleshooting and optimizing data systems for optimal performance.
- Demonstrating strong expertise in relational and dimensional modeling (OLTP, OLAP).
- Using data modeling tools such as Erwin, ER/Studio, Visio, or PowerDesigner effectively.
- Proficiency in SQL and database management systems (Oracle, SQL Server, MySQL, PostgreSQL).
- Understanding of NoSQL databases (MongoDB, Cassandra) and their data structures. - Experience with data warehouses and BI tools (Snowflake, Redshift, BigQuery, Tableau, Power BI). - Familiarity with ETL processes, data integration, and data governance frameworks. - Strong analytical, problem-solving, and communication skills. Qualifications for this position include: - A Bachelor's degree in Engineering or a related field. - 5 to 9 years of experience in data modeling or a related field. - 4+ years of hands-on experience with dimensional and relational data modeling. - Expert knowledge of metadata management and related tools. - Proficiency with data modeling tools such as Erwin, Power Designer, or Lucid. - Knowledge of transactional databases and data warehouses. Preferred Skills: - Experience in cloud-based data solutions (AWS, Azure, GCP). - Knowledge of big data technologies (Hadoop, Spark, Kafka). - Understanding of graph databases and real-time data processing. - Certifications in data management, modeling, or cloud data engineering. - Excellent communication and presentation skills. - Strong interpersonal skills to collaborate effectively with various teams.,
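The qualifications above center on relational and dimensional modeling. As a minimal sketch of what a dimensional (star-schema) design looks like in practice, the snippet below builds one fact table joined to two dimensions and runs an OLAP-style rollup; it uses Python's built-in sqlite3 module, and all table and column names are invented for illustration.

```python
import sqlite3

# Minimal star schema: one fact table keyed to two dimensions.
# All table and column names are hypothetical illustrations.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category     TEXT
);
CREATE TABLE dim_date (
    date_key  INTEGER PRIMARY KEY,
    full_date TEXT,
    year      INTEGER
);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    quantity    INTEGER,
    revenue     REAL
);
""")
conn.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
conn.execute("INSERT INTO dim_date VALUES (20240101, '2024-01-01', 2024)")
conn.execute("INSERT INTO fact_sales VALUES (1, 20240101, 10, 250.0)")

# A typical OLAP-style rollup: aggregate facts by dimension attributes.
row = conn.execute("""
    SELECT p.category, d.year, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON f.product_key = p.product_key
    JOIN dim_date d ON f.date_key = d.date_key
    GROUP BY p.category, d.year
""").fetchone()
print(row)  # ('Hardware', 2024, 250.0)
```

The same shape carries over to the warehouse platforms the posting lists; only the DDL dialect changes.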
Posted 2 weeks ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
As a Senior Data Analyst in the Solution Design team at Barclays, your primary responsibility will be to support the definition and design of technology and business solutions that align with organizational goals. This includes requirements gathering, data analysis, data architecture, system integration, and delivering scalable, high-quality designs that cater to both business and technical needs. To excel in this role, you must have experience in delivering large-scale changes in complex environments, leading requirements documentation, and facilitating workshops to gather, clarify, and communicate business needs effectively. Your strong data analysis and data modeling skills will be crucial for performing data validations, detecting anomalies, and deriving insights from large volumes of data to support decision-making. Proficiency in advanced SQL for querying, joining, and transforming data, along with experience in data visualization tools such as Tableau, Qlik, or Business Objects, is essential. Furthermore, you should be an effective communicator capable of translating complex technical concepts into clear language for diverse audiences. Your ability to liaise between business stakeholders and technical teams to ensure a mutual understanding of data interpretations, requirements definition, and solution designs will be key. Previous experience in banking and financial services, particularly in wholesale credit risk, as well as knowledge of implementing data governance standards, will be advantageous. Additional skills highly valued for this role include experience with Python data analysis and visualization tools, familiarity with external data vendors for integrating financials and third-party datasets, and exposure to wholesale credit risk IRB models and regulatory frameworks.
Your responsibilities will include investigating and analyzing data quality issues, executing data cleansing and transformation tasks, designing and building data pipelines, applying advanced analytical techniques such as machine learning and AI, and documenting data quality findings for improvement. It will also be essential to contribute to strategy, drive requirements, manage resources, and deliver continuous improvements in alignment with organizational goals. As a Senior Data Analyst, you will be expected to demonstrate leadership behaviors that create an environment for colleagues to thrive and deliver excellence. If the position includes leadership responsibilities, you will be required to set strategic direction, manage policies and processes, and drive continuous improvement. Additionally, you will advise key stakeholders, manage and mitigate risks, and collaborate with other areas of the organization to achieve business goals. All colleagues at Barclays are expected to embody the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, as well as the Barclays Mindset of Empower, Challenge, and Drive.
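Where the role mentions data validations and anomaly detection over large volumes of data, one common lightweight approach is a robust outlier test. The sketch below applies the modified z-score (median/MAD) rule; the 3.5 threshold follows the Iglewicz-Hoaglin convention, and the sample exposure figures are invented for illustration.

```python
from statistics import median

def find_anomalies(values, threshold=3.5):
    """Flag values whose modified z-score exceeds the threshold.

    Uses median and median absolute deviation (MAD), which, unlike
    mean/stdev, are not themselves distorted by the outliers being hunted.
    """
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:  # all values (nearly) identical: nothing to flag
        return []
    return [v for v in values if 0.6745 * abs(v - med) / mad > threshold]

# Invented exposure figures with one obvious anomaly.
exposures = [100.0, 102.5, 98.7, 101.2, 99.9, 500.0, 100.4]
print(find_anomalies(exposures))  # [500.0]
```

A plain 3-sigma test on the same data would miss the 500.0 record, because the outlier inflates the standard deviation it is measured against; that masking effect is why robust statistics are preferred for this kind of validation.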
Posted 2 weeks ago
6.0 - 10.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Data Architect specializing in OLTP & OLAP systems, you will play a crucial role in designing, optimizing, and governing data models for both OLTP and OLAP environments. Your responsibilities will include architecting end-to-end data models across different layers, defining conceptual, logical, and physical data models, and collaborating closely with stakeholders to capture functional and performance requirements. You will need to optimize database structures for real-time and analytical workloads, enforce data governance, security, and compliance best practices, and enable schema versioning, lineage tracking, and change control. Additionally, you will review query plans and indexing strategies to enhance performance. To excel in this role, you must possess a deep understanding of OLTP and OLAP systems architecture, along with proven experience in GCP databases such as BigQuery, Cloud SQL, and AlloyDB. Your expertise in database tuning, indexing, sharding, and normalization/denormalization will be critical, as well as proficiency in data modeling tools like DBSchema, ERWin, or equivalent. Familiarity with schema evolution, partitioning, and metadata management is also required. Experience in the BFSI or mutual fund domain, knowledge of near real-time reporting and streaming analytics architectures, and familiarity with CI/CD for database model deployments are preferred skills that will set you apart. Strong communication, stakeholder management, strategic thinking, and the ability to mentor data modelers and engineers are essential soft skills for success in this position. By joining our team, you will have the opportunity to own the core data architecture for a cloud-first enterprise, bridge business goals with robust data design, and work with modern data platforms and tools. If you are looking to make a significant impact in the field of data architecture, this role is perfect for you.
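Reviewing query plans and indexing strategies, as described above, can be illustrated with a tiny experiment: compare a query's plan before and after adding an index. SQLite stands in here for the GCP engines named in the posting; the table and index names are hypothetical, and the exact plan wording varies by engine and version.

```python
import sqlite3

# Compare query plans before and after adding an index.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE trades (id INTEGER PRIMARY KEY, account TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO trades (account, amount) VALUES (?, ?)",
    [(f"ACC{i % 100}", float(i)) for i in range(1000)],
)

query = "SELECT SUM(amount) FROM trades WHERE account = 'ACC7'"

# EXPLAIN QUERY PLAN rows end with a human-readable detail string.
before = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()
conn.execute("CREATE INDEX idx_trades_account ON trades(account)")
after = conn.execute("EXPLAIN QUERY PLAN " + query).fetchall()

print(before[0][-1])  # e.g. 'SCAN trades' (full table scan)
print(after[0][-1])   # e.g. 'SEARCH trades USING INDEX idx_trades_account (account=?)'
```

The same scan-versus-seek distinction is what a plan review looks for in BigQuery's execution details or a Postgres `EXPLAIN` on AlloyDB, just with richer cost output.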
Posted 2 weeks ago
7.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
As a Business Analyst with 7-14 years of experience, you will be responsible for various tasks including Business Requirement Documents (BRD) and Functional Requirement Documents (FRD) creation, Stakeholder Management, User Acceptance Testing (UAT), understanding Datawarehouse Concepts, SQL queries, and subqueries, as well as utilizing Data Visualization tools such as Power BI or MicroStrategy. It is essential that you have a deep understanding of the Investment Domain, specifically in areas like Capital markets, Asset management, and Wealth management. Your primary responsibilities will involve working closely with stakeholders to gather requirements, analyzing data, and testing systems to ensure they meet business needs. Additionally, you should have a strong background in investment management or financial services, with experience in areas like Asset management, Investment operations, and Insurance. Your familiarity with concepts like Critical Data Elements (CDEs), data traps, and reconciliation workflows will be beneficial in this role. Technical expertise in BI and analytics tools like Power BI, Tableau, and MicroStrategy is required, along with proficiency in SQL. You should also possess excellent communication skills, analytical thinking capabilities, and the ability to engage effectively with stakeholders. Experience in working within Agile/Scrum environments with cross-functional teams is highly valued. In terms of technical skills, you should demonstrate proven abilities in analytical problem-solving, with a deep knowledge of investment data platforms such as Golden Source, NeoXam, RIMES, and JPM Fusion. Expertise in cloud data technologies like Snowflake, Databricks, and AWS/GCP/Azure data services is essential. Understanding data governance frameworks, metadata management, and data lineage is crucial, along with compliance standards in the investment management industry. 
Hands-on experience with Investment Books of Record (IBORs) like BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle Pace, and Eagle DataMart is preferred. Familiarity with investment data platforms including Golden Source, FINBOURNE, NeoXam, RIMES, and JPM Fusion, as well as cloud data platforms like Snowflake and Databricks, will be advantageous. Your background in data governance, metadata management, and data lineage frameworks will be essential in ensuring data accuracy and compliance within the organization.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
noida, uttar pradesh
On-site
As a Senior Data Engineer at Veersa, you will utilize your deep expertise in ETL/ELT processes, data warehousing principles, and both real-time and batch data integrations. In this role, you will have the opportunity to mentor junior engineers, establish best practices, and contribute to the overarching data strategy of the company. Your proficiency in SQL, Python, and ideally Airflow and Bash scripting will be instrumental in designing and implementing scalable data integration and pipeline solutions using Azure cloud services. Your key responsibilities will include architecting and implementing data solutions, developing ETL/ELT processes, building and automating data workflows, orchestrating pipelines, and writing Bash scripts for system automation. Collaborating with business and technical stakeholders to understand data requirements and translating them into technical solutions will be a key aspect of your role. Moreover, you will be expected to develop data flows, mappings, quality standards, and validation rules across various systems, ensuring adherence to best practices in data modeling, metadata management, and data governance. To qualify for this role, you must hold a B.Tech or B.E degree in Computer Science, Information Systems, or a related field, along with a minimum of 3 years of experience in data engineering, focusing on Azure-based solutions. Your proficiency in SQL and Python, experience with Airflow and Bash scripting, and proven track record in real-time and batch data integrations will be essential. Hands-on experience with Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks is highly desirable, as well as a strong understanding of data warehousing principles, ETL/ELT methodologies, and data pipeline architecture. In addition, familiarity with data quality, metadata management, and data validation frameworks, coupled with strong problem-solving skills and clear communication abilities, will set you up for success in this role. 
Preferred qualifications include experience with multi-tenant SaaS data solutions, knowledge of DevOps practices, CI/CD pipelines, and version control systems like Git, and a proven ability to mentor and coach other engineers in technical decision-making processes. By joining Veersa as a Senior Data Engineer, you will play a crucial role in driving innovation and delivering cutting-edge technical solutions to clients in the US healthcare industry.
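The batch side of the ETL/ELT work described above can be sketched as an extract-validate-transform-load loop. This is a deliberately minimal illustration of the pattern, not any specific Veersa pipeline; the field names and validation rules are invented.

```python
# Minimal batch ETL sketch: extract raw records, apply validation
# rules, transform to the target schema, and load into a target store.
def extract():
    # Stand-in for reading from a source system or landing zone.
    return [
        {"patient_id": "P1", "visit_date": "2024-03-01", "charge": "120.50"},
        {"patient_id": "",   "visit_date": "2024-03-02", "charge": "80.00"},  # fails validation
        {"patient_id": "P3", "visit_date": "2024-03-03", "charge": "200.00"},
    ]

def validate(row):
    # Example rules: key must be present, amount must be non-negative.
    return bool(row["patient_id"]) and float(row["charge"]) >= 0

def transform(row):
    # Normalize types: store money as integer cents in the target.
    return {
        "patient_id": row["patient_id"],
        "visit_date": row["visit_date"],
        "charge_cents": int(round(float(row["charge"]) * 100)),
    }

def load(rows, target):
    # Stand-in for a warehouse write (e.g. a Synapse or Databricks table).
    target.extend(rows)

warehouse = []
good = [transform(r) for r in extract() if validate(r)]
load(good, warehouse)
print(len(warehouse))  # 2 rows survive validation
```

In production each stage would be a task in an orchestrator such as Airflow or Azure Data Factory, with the rejected rows routed to a quarantine table rather than silently dropped.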
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
haryana
On-site
We are seeking an Analytics Developer with expertise in Databricks, Power BI, and ETL technologies to design, develop, and deploy advanced analytics solutions. Your focus will be on creating robust, scalable data pipelines, implementing actionable business intelligence frameworks, and delivering insightful dashboards and reports to drive strategic decision-making. This role involves close collaboration with technical teams and business stakeholders to ensure analytics initiatives align with organizational objectives. With 8+ years of experience in analytics, data integration, and reporting, you should possess 4+ years of hands-on experience with Databricks, including proficiency in Databricks Notebooks for development and testing. Your key responsibilities will include:
- Leveraging Databricks to develop and optimize scalable data pipelines for real-time and batch data processing.
- Designing and implementing Databricks Notebooks for exploratory data analysis, ETL workflows, and machine learning models.
- Managing and optimizing Databricks clusters for performance, cost efficiency, and scalability.
- Using Databricks SQL for advanced query development, data aggregation, and transformation.
- Incorporating Python and/or Scala within Databricks workflows to automate and enhance data engineering processes.
- Developing solutions to integrate Databricks with other platforms, such as Azure Data Factory, for seamless data orchestration.
- Creating interactive and visually compelling Power BI dashboards and reports to enable self-service analytics.
- Leveraging DAX to build calculated columns, measures, and complex aggregations.
- Designing effective data models in Power BI using star schema and snowflake schema principles for optimal performance.
- Configuring and managing Power BI workspaces, gateways, and permissions for secure data access.
- Implementing row-level security and data masking strategies in Power BI to ensure compliance with governance policies.
- Building real-time dashboards by integrating Power BI with Databricks, Azure Synapse, and other data sources.
- Providing end-user training and support for Power BI adoption across the organization.
- Developing and maintaining ETL/ELT workflows that ensure high data quality and reliability.
- Implementing data governance frameworks to maintain data lineage, security, and compliance with organizational policies.
- Optimizing data flow across multiple environments, including data lakes, warehouses, and real-time processing systems.
- Collaborating with data governance teams to enforce standards for metadata management and audit trails.
- Working closely with IT teams to integrate analytics solutions with ERP, CRM, and other enterprise systems.
- Troubleshooting and resolving technical challenges related to data integration, analytics performance, and reporting accuracy.
- Staying updated on the latest advancements in Databricks, Power BI, and data analytics technologies.
- Driving innovation by integrating AI/ML capabilities into analytics solutions using Databricks.
- Contributing to the enhancement of organizational analytics maturity through scalable and reusable approaches.
You should also bring strong self-management skills, the ability to think outside the box and learn new technologies, logical thinking, fluency in English, strong communication skills, a Bachelor's degree in Computer Science, Data Science, or a related field (Master's preferred), relevant certifications, and the ability to manage multiple priorities in a fast-paced environment with high customer expectations.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
At PwC, our team in managed services focuses on providing a variety of outsourced solutions and supporting clients across multiple functions. We help organizations streamline their operations, reduce costs, and enhance efficiency by managing key processes and functions on their behalf. Our team is skilled in project management, technology, and process optimization, ensuring the delivery of high-quality services to our clients. Those in managed service management and strategy at PwC are responsible for transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your role will involve continuous improvement and optimization of managed services processes, tools, and services. As a member of our team, you will build meaningful client relationships and learn how to manage and inspire others. You will navigate complex situations, develop your personal brand, deepen your technical expertise, and leverage your strengths. Anticipating the needs of your teams and clients, you will deliver quality results. Embracing ambiguity, you will be comfortable when the path forward is unclear, asking questions and using such moments as opportunities for growth. 
Required skills, knowledge, and experiences for this role include, but are not limited to:
- Responding effectively to diverse perspectives, needs, and feelings of others
- Using a broad range of tools, methodologies, and techniques to generate new ideas and solve problems
- Applying critical thinking to break down complex concepts
- Understanding the broader objectives of your project or role and how your work aligns with the overall strategy
- Developing a deeper understanding of the business context and its changing dynamics
- Using reflection to enhance self-awareness, strengths, and development areas
- Interpreting data to derive insights and recommendations
- Upholding and reinforcing professional and technical standards, along with the Firm's code of conduct and independence requirements
As a Senior Associate, you will work collaboratively with a team of problem solvers, addressing complex business issues from strategy to execution through Data, Analytics & Insights skills. Your responsibilities at this level include:
- Using feedback and reflection to enhance self-awareness, personal strengths, and address development areas
- Demonstrating critical thinking and the ability to structure unstructured problems
- Reviewing deliverables for quality, accuracy, and relevance
- Adhering to SLAs, incident management, change management, and problem management
- Leveraging tools effectively in different situations and explaining the rationale behind the choices
- Seeking opportunities for exposure to diverse situations, environments, and perspectives
- Communicating straightforwardly and structurally to influence and connect with others
- Demonstrating leadership by engaging directly with clients and leading engagements
- Collaborating in a team environment with client interactions, workstream management, and cross-team cooperation
- Contributing to cross-competency work and Center of Excellence activities
- Managing escalations and risks effectively
Position Requirements:
- Primary skills: Tableau, visualization, Excel
- Secondary skills: Power BI, Cognos, Qlik, SQL, Python, advanced Excel, Excel macros
BI Engineer Role:
- Minimum 5 years of hands-on experience in building advanced data analytics
- Minimum 5 years of hands-on experience in delivering managed data and analytics programs
- Extensive experience in developing scalable, repeatable, and secure data structures and pipelines
- Proficiency in industry tools like Python, SQL, and Spark for data analytics
- Experience in building data governance solutions using leading tools
- Knowledge of data consumption patterns and BI tools like Tableau, Qlik Sense, and Power BI
- Strong communication, problem-solving, quantitative, and analytical abilities
Certifications in Tableau and other BI tools are advantageous, along with certifications in any cloud platform. In our Managed Services - Data, Analytics & Insights team at PwC, we focus on collaborating with clients to leverage technology and human expertise, delivering simple yet powerful solutions. Our goal is to enable clients to focus on their core business while trusting us as their IT partner. We are driven by the passion to enhance our clients' capabilities every day. Within our Managed Services platform, we offer integrated services grounded in industry experience and powered by top talent. Our team of global professionals, combined with cutting-edge technology, ensures effective outcomes that add value to our clients' enterprises. Through a consultative approach, we enable transformational journeys that drive sustained client outcomes, allowing clients to focus on accelerating their priorities and optimizing their operations. As a member of our Data, Analytics & Insights Managed Service team, you will contribute to critical Application Evolution Service offerings, help desk support, enhancement and optimization projects, and strategic roadmap development. Your role will involve technical expertise and relationship management to support customer engagements effectively.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
hyderabad, telangana
On-site
You will have a pivotal role in implementing and embracing the data governance framework at Amgen, which aims to revolutionize the company's data ecosystem and establish Amgen as a pioneer in biopharma innovation. This position will make use of cutting-edge technologies such as Generative AI, Machine Learning, and integrated data. Your expertise in domains, technical knowledge, and business processes will be crucial in providing exceptional support for Amgen's data governance framework. Collaboration with business stakeholders and data analysts will be essential to ensure successful implementation and adoption of the data governance framework. Working closely with the Product Owner and other Business Analysts will be necessary to guarantee operational support and excellence from the team. You will be responsible for the implementation of the data governance and data management framework within a specific domain of expertise, such as Research, Development, or Supply Chain. Operationalizing the Enterprise data governance framework and aligning a broader stakeholder community with their data governance needs, including data quality, data access controls, compliance with privacy and security regulations, master data management, data sharing, communication, and change management will be part of your responsibilities. Collaborating with Enterprise MDM and Reference Data to enforce standards and data reusability will also be key. You will drive cross-functional alignment in your area of expertise to ensure adherence to Data Governance principles and maintain privacy policies and procedures to safeguard sensitive data and ensure compliance. Regular privacy risk assessments and audits will be conducted by you to identify and mitigate potential risks as required. Furthermore, you will be responsible for maintaining documentation on data definitions, data standards, data flows, legacy data structures, common data models, and data harmonization for the assigned domains. 
Ensuring compliance with data privacy, security, and regulatory policies for the assigned domains, including GDPR, CCPA, and other relevant legislation, will be critical. Together with Technology teams, business functions, and enterprise teams, you will define the specifications shaping the development and implementation of data foundations. Building strong relationships with key business leads and partners to ensure their needs are met will also be part of your role. Your must-have functional skills include technical knowledge of pharma processes with specialization in a domain; an in-depth understanding of data management, data quality, master data management, data stewardship, and data protection; familiarity with data protection laws and regulations; experience in the development life cycle of data products; and proficiency in tools like Collibra and Alation. Strong problem-solving skills, excellent communication, and experience working with data governance frameworks are essential. Experience with data governance councils, Agile software development methodologies, proficiency in data analysis and quality tools, and 3-5 years of experience in data privacy or compliance are good-to-have functional skills. Soft skills required for this role include integrity, adaptability, proactivity, leadership, organization, analytical skills, the ability to work effectively with teams and manage multiple priorities, ambition to develop skills and career, the ability to build business relationships and understand end-to-end data use and needs, interpersonal skills, initiative, self-motivation, presentation skills, attention to detail, time management, and customer focus. Basic qualifications for this position include any degree and 9-13 years of experience.
Posted 2 weeks ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. The ideal candidate should possess a solid background in data architecture, cloud data platforms, and Snowflake implementation, along with practical experience in end-to-end data pipeline and data warehouse design. In this role, you will be responsible for leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. You will also be tasked with defining data modeling standards, best practices, and governance frameworks. Collaborating with stakeholders to comprehend data requirements and translating them into robust architectural solutions will be a key part of your responsibilities. Furthermore, you will be required to design and optimize ETL/ELT pipelines utilizing tools like Snowpipe, Azure Data Factory, Informatica, or DBT. Implementing data security, privacy, and role-based access controls within Snowflake is also essential. Providing guidance to development teams on performance tuning, query optimization, and cost management within Snowflake will be part of your duties. Additionally, ensuring high availability, fault tolerance, and compliance across data platforms will be crucial. Mentoring developers and junior architects on Snowflake capabilities is an important aspect of this role. Qualifications and Experience: - 8+ years of overall experience in data engineering, BI, or data architecture, with a minimum of 3+ years of hands-on Snowflake experience. - Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization. - Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP). - Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion. - Good understanding of data lakes, data mesh, and modern data stack principles. 
- Experience with CI/CD for data pipelines, DevOps, and data quality frameworks. - Solid knowledge of data governance, metadata management, and cataloging. Desired Skills: - Snowflake certification (e.g., SnowPro Core/Advanced Architect). - Familiarity with Apache Airflow, Kafka, or event-driven data ingestion. - Knowledge of data visualization tools such as Power BI, Tableau, or Looker. - Experience in healthcare, BFSI, or retail domain projects. Please note that this job description is sourced from hirist.tech.
Posted 2 weeks ago
3.0 - 7.0 years
0 Lacs
karnataka
On-site
You are a highly motivated and detail-oriented Data Catalog Analyst with expertise in the erwin Data Intelligence Suite (DIS), particularly the erwin Data Catalog module. Your main responsibility will be to build and maintain a centralized metadata repository that enables data discovery, lineage, and governance across the enterprise. You will configure, implement, and maintain the erwin Data Catalog to support enterprise metadata management. It will be your duty to harvest metadata from various data sources such as databases, ETL tools, and BI platforms, ensuring accuracy and completeness. Your role will involve developing and maintaining data lineage, impact analysis, and data flow documentation. Collaboration with data stewards, business analysts, and IT teams is essential for defining and enforcing metadata standards and governance policies. You will also support the creation and maintenance of business glossaries, technical metadata, and data dictionaries. Ensuring metadata is accessible and well-organized to enable data discovery and self-service analytics will be one of your priorities. You are expected to provide training and support to business and technical users on effectively using the erwin platform. Monitoring system performance and troubleshooting issues related to metadata ingestion and cataloging will also be part of your responsibilities. To qualify for this role, you should hold a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field. Additionally, you should have at least 3 years of experience in data governance, metadata management, or enterprise data architecture. Hands-on experience with erwin DIS, especially erwin Data Catalog and erwin Data Modeler, is required. Knowledge of other tools such as Collibra or Alation can also be considered. A strong understanding of metadata management, data lineage, and data governance frameworks (e.g., DAMA-DMBOK) is necessary.
You should be familiar with relational databases, data warehouses, and cloud data platforms (e.g., AWS, Azure, GCP), along with proficiency in SQL and data profiling tools. Preferred skills for this role include experience with other data governance tools (e.g., Collibra, Informatica, Alation), knowledge of regulatory compliance standards (e.g., GDPR, HIPAA, CCPA), strong communication and stakeholder engagement skills, experience creating documentation, experience with Agile methodology, and the ability to work independently and manage multiple priorities in a fast-paced environment.
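Metadata harvesting of the kind erwin Data Catalog automates can be approximated in a few lines: read technical metadata (tables, columns, types, keys) from a source system's own catalog into a repository structure. The sketch below uses SQLite's `sqlite_master` table and `PRAGMA table_info` as stand-ins for real source systems; the catalog layout is a hypothetical illustration.

```python
import sqlite3

# Harvest technical metadata from a source database into a simple catalog.
# SQLite stands in for the real sources (databases, ETL tools, BI platforms).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT, email TEXT)")

catalog = {}
tables = conn.execute(
    "SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()
for (table_name,) in tables:
    cols = conn.execute(f"PRAGMA table_info({table_name})").fetchall()
    # table_info rows are (cid, name, type, notnull, dflt_value, pk).
    catalog[table_name] = [
        {"column": c[1], "type": c[2], "pk": bool(c[5])} for c in cols
    ]

print(catalog["customer"][0])  # {'column': 'id', 'type': 'INTEGER', 'pk': True}
```

A real harvester would add source-system identifiers and refresh timestamps to each entry so lineage and impact analysis can be layered on top.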
Posted 2 weeks ago
4.0 - 8.0 years
7 - 11 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
Job Title: Erwin Data Modeler, Insurance Domain
Location: Any
Job Type: Full-Time | 2-11 pm Shift
Job Summary
We are seeking a skilled and experienced Data Modeler with hands-on expertise in Erwin Data Modeler to join our team. The ideal candidate will have a strong background in data architecture and modeling, with a minimum of 4 years of relevant experience. Knowledge of the insurance domain is a significant plus.
Key Responsibilities
- Design, develop, and maintain conceptual, logical, and physical data models using Erwin Data Modeler.
- Collaborate with business analysts, data architects, and developers to understand data requirements and translate them into data models.
- Ensure data models align with enterprise standards and best practices.
- Perform data analysis and profiling to support modeling efforts.
- Maintain metadata and documentation for data models.
- Support data governance and data quality initiatives.
- Participate in reviews and provide feedback on data models and database designs.
Required Skills & Qualifications
- Strong understanding of data modeling concepts, including normalization, denormalization, and dimensional modeling.
- Knowledge of any relational database is an advantage.
- Familiarity with data warehousing and ETL processes.
- Excellent analytical and problem-solving skills.
- Strong communication and collaboration abilities.
Posted 2 weeks ago
5.0 - 10.0 years
13 - 18 Lacs
Hyderabad
Work from Office
Overview Enterprise Data Operations Analyst Job OverviewAs an Analyst, Data Modeling , your focus would be to partner with D&A Data Foundation team members to create data models for Global projects. This would include independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse, satisfying project requirements. Role will advocate Enterprise Architecture, Data Design, and D&A standards, and best practices. You will be performing all aspects of Data Modeling working closely with Data Governance, Data Engineering and Data Architects teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics . The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premise data sources as well as cloud and remote systems. You will establish data design patterns that will drive flexible, scalable, and efficient data models to maximize value and reuse. Responsibilities Responsibilities Complete conceptual, logical and physical data models for any supported platform, including SQL Data Warehouse, EMR, Spark, DataBricks , Snowflake, Azure Synapse or other Cloud data warehousing technologies. Governs data design/ modeling documentation of metadata (business definitions of entities and attributes) and constructions database objects, for baseline and investment funded projects, as assigned. 
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to advance data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical stores and data lakes; data streaming (consumption/production); and data in transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the Data Governance team to standardize the classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.
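A source-to-target mapping, one of the deliverables named in this posting, is commonly expressed as data that drives a transform rather than hand-written per-column code. The sketch below is a minimal Python version of the idea; the column names and transforms are invented for illustration and are not from any actual system.

```python
# A source-to-target mapping expressed as data, then applied to a source record.
# Column names and transforms are illustrative only.
STM = {
    "cust_nm": {"target": "customer_name", "transform": str.strip},
    "ord_amt": {"target": "order_amount",  "transform": float},
    "ord_dt":  {"target": "order_date",    "transform": lambda s: s[:10]},
}

def apply_mapping(source_row: dict, mapping: dict) -> dict:
    """Rename and transform each mapped source column into its target column."""
    return {
        spec["target"]: spec["transform"](source_row[src])
        for src, spec in mapping.items()
        if src in source_row
    }

row = apply_mapping(
    {"cust_nm": "  Acme Corp ", "ord_amt": "199.50", "ord_dt": "2024-03-01T00:00:00"},
    STM,
)
print(row)
# {'customer_name': 'Acme Corp', 'order_amount': 199.5, 'order_date': '2024-03-01'}
```

Keeping the mapping as data means ETL and BI developers can review it alongside the model documentation without reading transform code.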
Qualifications
- 5+ years of overall technology experience, including at least 2+ years of data modeling and systems architecture.
- Around 2+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 2+ years of experience developing enterprise data models.
- Experience building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience integrating multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).
Posted 2 weeks ago
8.0 - 13.0 years
18 - 22 Lacs
Hyderabad
Work from Office
Overview
Enterprise Data Operations Sr Analyst L08

Job Overview
As Senior Analyst, Data Modeling, your focus will be to partner with D&A Data Foundation team members to create data models for global projects. This includes independently analyzing project data needs, identifying data storage and integration needs/issues, and driving opportunities for data model reuse to satisfy project requirements. The role advocates Enterprise Architecture, Data Design, and D&A standards and best practices. You will perform all aspects of data modeling, working closely with the Data Governance, Data Engineering, and Data Architecture teams. As a member of the data modeling team, you will create data models for very large and complex data applications in public cloud environments, directly impacting the design, architecture, and implementation of PepsiCo's flagship data products around topics like revenue management, supply chain, manufacturing, and logistics. The primary responsibilities of this role are to work with data product owners, data management owners, and data engineering teams to create physical and logical data models with an extensible philosophy to support future, unknown use cases with minimal rework. You'll be working in a hybrid environment with in-house, on-premises data sources as well as cloud and remote systems. You will establish data design patterns that drive flexible, scalable, and efficient data models to maximize value and reuse.

Responsibilities
- Complete conceptual, logical, and physical data models for any supported platform, including SQL data warehouses, EMR, Spark, Databricks, Snowflake, Azure Synapse, or other cloud data warehousing technologies.
- Govern data design/modeling documentation of metadata (business definitions of entities and attributes) and construct database objects for baseline and investment-funded projects, as assigned.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Support assigned project contractors (both on- and off-shore), orienting new contractors to standards, best practices, and tools.
- Contribute to project cost estimates, working with senior members of the team to evaluate the size and complexity of changes or new development.
- Ensure physical and logical data models are designed with an extensible philosophy to support future, unknown use cases with minimal rework.
- Develop a deep understanding of the business domain and enterprise technology inventory to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Partner with IT, data engineering, and other teams to ensure the enterprise data model incorporates the key dimensions needed for proper management: business and financial policies, security, local-market regulatory rules, and consumer privacy-by-design principles (PII management), all linked across fundamental identity foundations.
- Drive collaborative reviews of design, code, data, and security feature implementations performed by data engineers to advance data product development.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Create source-to-target mappings for ETL and BI developers.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical stores and data lakes; data streaming (consumption/production); and data in transit.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and cleansing.
- Partner with the Data Governance team to standardize the classification of unstructured data into standard structures for data discovery and action by business customers and stakeholders.
- Support data lineage and mapping of source system data to canonical data stores for research, analysis, and productization.
Qualifications
- 8+ years of overall technology experience, including at least 4+ years of data modeling and systems architecture.
- 3+ years of experience with data lake infrastructure, data warehousing, and data analytics tools.
- 4+ years of experience developing enterprise data models.
- Experience building solutions in the retail or supply chain space.
- Expertise in data modeling tools (ER/Studio, Erwin, IDM/ARDM models).
- Experience integrating multi-cloud services (Azure) with on-premises technologies.
- Experience with data profiling and data quality tools like Apache Griffin, Deequ, and Great Expectations.
- Experience building/operating highly available, distributed systems for data extraction, ingestion, and processing of large data sets.
- Experience with at least one MPP database technology such as Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with Azure Data Factory, Databricks, and Azure Machine Learning is a plus.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Familiarity with business intelligence tools (such as Power BI).

Does the person hired for this job need to be based in a PepsiCo office, or can they be remote? The employee must be based in a PepsiCo office.
Primary Work Location: Hyderabad HUB-IND
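The data profiling requirement above names tools like Apache Griffin, Deequ, and Great Expectations; at their core, these automate per-column checks of the kind sketched below. This is a deliberately minimal, tool-free Python illustration, not an example of any of those tools' APIs.

```python
def profile(rows, column):
    """Basic profile of one column: row count, null rate, and distinct values."""
    values = [r.get(column) for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "count": len(values),
        "null_rate": 1 - len(non_null) / len(values),
        "distinct": len(set(non_null)),
    }

# Toy data set with one missing value.
rows = [
    {"region": "north"}, {"region": "south"},
    {"region": "north"}, {"region": None},
]
print(profile(rows, "region"))  # {'count': 4, 'null_rate': 0.25, 'distinct': 2}
```

Dedicated tools add scheduling, expectation suites, and reporting on top of checks like these, which is why they appear as a distinct skill in the posting.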
Posted 2 weeks ago
3.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Title: OCI Data Catalog Specialist
Experience Level: 3+ Years
Job Level: Consultant / Senior Consultant (based on total experience)
Work Location: Hyderabad, Bangalore, Chennai, Mumbai, Pune, Kolkata, Gurgaon
Work Mode: Hybrid (minimum 2 days/week from office)
Work Timings: 10:00 AM - 7:00 PM IST
Contract to Hire

Role Overview:
We are looking for a skilled and proactive OCI Data Catalog PoC Specialist to design, implement, and demonstrate a Proof of Concept (PoC) for Oracle Cloud Infrastructure (OCI) Data Catalog as part of a strategic data governance initiative. The role involves showcasing OCI Data Catalog features, evaluating its fit for business needs, and guiding future adoption in production.

Key Responsibilities:
- Lead the end-to-end delivery of the OCI Data Catalog PoC.
- Collaborate with client stakeholders to understand data governance goals and cataloging needs.
- Configure and integrate OCI Data Catalog with data sources including Oracle Autonomous Database, OCI Object Storage, and on-premises databases.
- Develop and execute test scenarios for metadata harvesting, data lineage, data classification, and stewardship and search.
- Integrate OCI Data Catalog metadata output with Marketplace applications for automated sharing.
- Troubleshoot PoC issues and coordinate with Oracle support.
- Document PoC results, provide lessons learned, and recommend steps for production rollout.
- Provide training and knowledge transfer sessions to client teams.

Required Skills and Experience:
- 3+ years of experience in data management, governance, or cloud-based data solutions.
- Hands-on experience with OCI Data Catalog is mandatory.
- Strong understanding of metadata management, data lineage, and classification and cataloging principles.
- Proven experience integrating data catalogs with multiple source types (cloud + on-prem).
- Familiarity with Oracle Cloud Infrastructure (OCI) and associated data services.
- Strong analytical, communication, and documentation skills.
- Ability to work with cross-functional teams and present findings to both technical and business stakeholders.

Good to Have:
- Experience with other data governance tools or frameworks.
- Exposure to Oracle Marketplace and its integration with metadata workflows.
- Prior experience in client-facing PoC or advisory roles in cloud data environments.
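Metadata harvesting, the first PoC test scenario this posting calls for, means reading a source system's own catalog into catalog entries. The sketch below shows the idea generically, with SQLite's system catalog standing in for the real OCI-connected sources; it does not use the actual OCI SDK, whose calls differ.

```python
import sqlite3

def harvest(conn):
    """Collect table/column metadata from SQLite's system catalog into plain entries."""
    entries = []
    tables = conn.execute(
        "SELECT name FROM sqlite_master WHERE type='table'"
    ).fetchall()
    for (table,) in tables:
        # PRAGMA table_info yields (cid, name, type, notnull, default, pk) per column.
        for cid, col, ctype, *_ in conn.execute(f"PRAGMA table_info({table})"):
            entries.append({"table": table, "column": col, "type": ctype})
    return entries

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE policy (id INTEGER, holder TEXT)")
print(harvest(conn))
# [{'table': 'policy', 'column': 'id', 'type': 'INTEGER'},
#  {'table': 'policy', 'column': 'holder', 'type': 'TEXT'}]
```

A real harvester adds source-specific connectors, scheduling, and lineage capture; the PoC scenarios above exercise exactly those layers.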
Posted 2 weeks ago
7.0 - 10.0 years
0 - 0 Lacs
Noida
On-site
Lead Taxonomist AEM Operations_Full-time_Noida [Remote/Hybrid]

Job Title: Lead Taxonomist, AEM Operations
Experience Required: 7+ Years
Location: Noida [Remote/Hybrid]
Employment Type: [Full-time]

About the Role
*We are seeking a highly skilled and strategic Lead Taxonomist to join our AEM Operations team.
*In this role, you will be responsible for designing, implementing, and maintaining the visa.com taxonomy structure to ensure intuitive content categorization and seamless navigation across digital platforms.
*You will partner with UX/UI designers, content strategists, SEO specialists, and engineering teams to deliver a robust taxonomy framework that enhances discoverability, improves content performance, and supports a consistent user experience.
*This role is ideal for someone who thrives in a data-driven, collaborative environment and is passionate about organizing information in meaningful and scalable ways.

Key Responsibilities
*Design and maintain the visa.com taxonomy to support intuitive navigation, consistent content structure, and a scalable content strategy.
*Develop tagging strategies and metadata frameworks to ensure accurate content labeling, improved search capabilities, and optimal content discoverability.
*Collaborate cross-functionally with UX/UI, content, SEO, and engineering teams to implement, validate, and enhance taxonomy solutions across platforms powered by Adobe Experience Manager (AEM).
*Conduct regular audits of taxonomy and metadata structures; analyze user behavior and site analytics to continuously optimize the taxonomy for usability and performance.
*Champion best practices in information architecture, ensuring the taxonomy evolves with the digital ecosystem and business objectives.
*Translate business and user requirements into actionable taxonomy updates and metadata enhancements.
Required Qualifications & Skills
*Bachelor's or Master's degree in Information Science, Data Science, Library Science, Human-Computer Interaction (HCI), or a related field.
*5+ years of experience in taxonomy development, metadata management, or information architecture within digital product, CMS, or website environments.
*Strong understanding of content management systems (especially AEM), tagging frameworks, metadata schemas, and SEO best practices.
*Proficiency in auditing digital content using analytics tools and translating insights into actionable taxonomy improvements.
*Exceptional attention to detail, with strong analytical and problem-solving skills.
*Excellent communication skills and a demonstrated ability to work collaboratively across cross-functional teams, including product, design, engineering, and marketing.
*Comfortable working in an agile, fast-paced digital environment.

Preferred Qualifications (Nice to Have)
*Experience working with Adobe Experience Manager (AEM) or similar enterprise-level CMS platforms.
*Familiarity with tools like Adobe Analytics, Google Analytics, or ContentSquare.
*Experience with accessibility and internationalization considerations in taxonomy.

What We Offer
*An opportunity to shape the information structure of a global digital platform.
*A dynamic, collaborative work environment with leading industry professionals.
*Support for ongoing professional development in taxonomy and content strategy.
*Flexible working hours and remote opportunities.
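A taxonomy like the one this role maintains is, structurally, a tree of categories that content gets tagged against. The toy sketch below shows the bare data structure and a path enumeration over it; the node names are invented for illustration and have nothing to do with visa.com's actual taxonomy.

```python
# A toy taxonomy: nested categories, with a walker that enumerates every node path.
# Node names are purely illustrative.
TAXONOMY = {
    "travel": {
        "cards": {"credit": {}, "prepaid": {}},
        "support": {},
    }
}

def paths(tree, prefix=()):
    """Yield every node path in the tree, e.g. ('travel', 'cards', 'credit')."""
    for name, children in tree.items():
        node = prefix + (name,)
        yield node
        yield from paths(children, node)

all_paths = list(paths(TAXONOMY))
print(all_paths)
# [('travel',), ('travel', 'cards'), ('travel', 'cards', 'credit'),
#  ('travel', 'cards', 'prepaid'), ('travel', 'support')]
```

In a CMS such as AEM the same structure is held as tag hierarchies; audits of the kind described above amount to comparing the enumerated paths against how content is actually tagged.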
---------------
If you are interested, please share your updated resume along with the following details for the next steps:
# Your full name (First : Middle : Last) (all expanded):
# Present employer name & work location:
# Permanent / contract employee:
# Current location:
# Preferred location (Noida):
# Highest qualification (university name and passing year):
# Total experience:
# Current CTC and take-home:
# Expected CTC and take-home:
# Official notice period:
# Are you serving notice period? If yes, mention LWD (Last Working Day):
# Any offer you are holding (if yes, please share the offer amount):
Posted 2 weeks ago
7.0 - 11.0 years
0 Lacs
Vadodara, Gujarat
On-site
The purpose of your role is to define and develop Enterprise Data Structure, Data Warehouse, Master Data, Integration, and transaction processing while maintaining and strengthening modeling standards and business information.

You will define and develop Data Architecture that supports the organization and clients in new and existing deals. This includes partnering with business leadership to provide strategic recommendations, assessing data benefits and risks, creating data strategy and roadmaps, engaging stakeholders for data governance, ensuring data storage and database technologies are supported, monitoring compliance with data modeling standards, overseeing frameworks for data management, and collaborating with vendors and clients to maximize the value of information.

Additionally, you will be responsible for building enterprise technology environments for data architecture management. This involves developing, maintaining, and implementing standard patterns for data layers, data stores, and data hubs & lakes; evaluating implemented systems; collecting and integrating data; creating data models; implementing best security practices; and demonstrating strong experience in database architectures and design patterns.

You will also enable delivery teams by providing optimal delivery solutions and frameworks. This includes building relationships with delivery and practice leadership teams, defining database structures and specifications, establishing relevant metrics, monitoring system capabilities and performance, integrating new solutions, managing projects, identifying risks, ensuring quality assurance, recommending tools for reuse and automation, and supporting the integration team for better efficiency.

In addition, you will ensure optimal client engagement by supporting pre-sales teams, negotiating and coordinating with client teams, demonstrating thought leadership, and acting as a trusted advisor.
Join Wipro to reinvent your world and be part of an end-to-end digital transformation partner with bold ambitions. Realize your ambitions in a business powered by purpose and empowered to design your reinvention. Applications from people with disabilities are explicitly welcome.
Posted 2 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
At PwC, our team in infrastructure is dedicated to designing and implementing secure and robust IT systems that facilitate business operations. We focus on ensuring the smooth functioning of networks, servers, and data centers to enhance performance and reduce downtime. As part of the infrastructure engineering team at PwC, your role will involve creating and implementing scalable technology infrastructure solutions for our clients, encompassing tasks such as network architecture, server management, and cloud computing.

We are currently seeking a Data Modeler with a solid background in data modeling, metadata management, and optimizing data systems. In this role, you will be responsible for analyzing business requirements, developing long-term data models, and maintaining the efficiency and consistency of our data systems.

Key Responsibilities:
- Analyze business needs and translate them into long-term data model solutions.
- Evaluate existing data systems and suggest enhancements.
- Define rules for data translation and transformation across different models.
- Collaborate with the development team to design conceptual data models and data flows.
- Establish best practices for data coding to ensure system consistency.
- Review modifications to existing systems to ensure cross-compatibility.
- Implement data strategies and create physical data models.
- Update and optimize local and metadata models.
- Utilize canonical data modeling techniques to improve system efficiency.
- Evaluate implemented data systems for discrepancies, variances, and efficiency.
- Troubleshoot and optimize data systems to achieve optimal performance.

Key Requirements:
- Proficiency in relational and dimensional modeling (OLTP, OLAP).
- Experience with data modeling tools such as Erwin, ER/Studio, Visio, and PowerDesigner.
- Strong skills in SQL and database management systems like Oracle, SQL Server, MySQL, and PostgreSQL.
- Familiarity with NoSQL databases such as MongoDB and Cassandra, including their data structures.
- Hands-on experience with data warehouses and BI tools like Snowflake, Redshift, BigQuery, Tableau, and Power BI.
- Knowledge of ETL processes, data integration, and data governance frameworks.
- Excellent analytical, problem-solving, and communication skills.

Qualifications:
- Bachelor's degree in Engineering or a related field.
- 5 to 9 years of experience in data modeling or related areas.
- 4+ years of practical experience in dimensional and relational data modeling.
- Expertise in metadata management and relevant tools.
- Proficiency in data modeling tools like Erwin, PowerDesigner, or Lucid.
- Understanding of transactional databases and data warehouses.

Preferred Skills:
- Experience in cloud-based data solutions, such as AWS, Azure, and GCP.
- Knowledge of big data technologies like Hadoop, Spark, and Kafka.
- Understanding of graph databases and real-time data processing.
- Certifications in data management, modeling, or cloud data engineering.
- Strong communication and presentation skills.
- Excellent interpersonal skills to collaborate effectively with diverse teams.
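Normalization, the first modeling concept in this posting's requirements, means splitting a wide record set so each fact is stored once. The sketch below splits a denormalized order list into customer and order tables in plain Python; all names and figures are invented for illustration.

```python
# Normalizing a denormalized order list into customers + orders (a 3NF-style split).
# All data here is made up for the example.
denormalized = [
    {"order_id": 1, "customer": "Asha", "city": "Pune",   "amount": 120.0},
    {"order_id": 2, "customer": "Asha", "city": "Pune",   "amount": 80.0},
    {"order_id": 3, "customer": "Ravi", "city": "Mumbai", "amount": 200.0},
]

customers, orders, cust_ids = {}, [], {}
for row in denormalized:
    # One customer row per unique name: repeated (customer, city) pairs collapse.
    if row["customer"] not in cust_ids:
        cust_ids[row["customer"]] = len(cust_ids) + 1
        customers[cust_ids[row["customer"]]] = {
            "name": row["customer"], "city": row["city"],
        }
    # Each order keeps only a foreign key to its customer.
    orders.append({
        "order_id": row["order_id"],
        "customer_id": cust_ids[row["customer"]],
        "amount": row["amount"],
    })

print(len(customers), len(orders))  # 2 3
```

Dimensional (OLAP) modeling runs the trade-off the other way, deliberately keeping some redundancy in dimension tables to make analytic queries simpler.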
Posted 2 weeks ago