
1467 Data Governance Jobs - Page 19

Set up a Job Alert
JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

15.0 - 20.0 years

15 - 18 Lacs

Bengaluru

Work from Office

We are seeking a skilled and strategic Data and Analytics Executive to lead our data initiatives and build a high-performing data team. The ideal candidate will possess extensive experience in developing data teams, data architecture, and analytics, along with a strong ability to align data priorities with business goals. This role is essential in driving data-driven decision-making across the organization and ensuring that our data landscape serves our business objectives effectively.

Key Responsibilities:
1. Team Building and Mentorship: Recruit, develop, and mentor a talented team of data professionals, including engineers, business analysts, data product managers, and visualization experts. Foster a culture of continuous learning and improvement within the team. Establish clear performance metrics and career development pathways for team members.
2. Goal Prioritization Based on Business Value: Collaborate with business teams to define and prioritize data and analytics initiatives that align with overall business strategy. Assess the potential business impact of data projects to inform prioritization and resource allocation. Communicate project goals and progress to stakeholders to ensure alignment and transparency.
3. Design and Architecture of the Data Landscape: Own the design and execution of the organization's data architecture, ensuring scalability, security, and accessibility. Evaluate and implement modern data technologies and frameworks that support business objectives. Collaborate with the Data Governance team and implement practices to maintain data quality, privacy, and compliance.
4. Development of Certified Data Products: Lead the development and deployment of data products that are reliable, scalable, and meet the needs of end users. Collaborate with product teams to identify opportunities for new data product development and enhancement. Establish testing and certification processes to validate data products' effectiveness and reliability.
5. Ensuring Data Infrastructure Meets Business Requirements: Assess the organization's data infrastructure and identify opportunities for improvement to meet evolving business needs. Ensure that data pipelines, storage solutions, and analytics tools effectively support data integration and analysis. Monitor and improve the performance of data systems to ensure timely and accurate reporting.

Required Qualifications:
- Bachelor's degree in Computer Science, Statistics, or a related field; Master's degree preferred.
- 15+ years of IT experience with a proven track record in building data teams, including at least 5 years in a leadership role.
- Ability to build and lead high-performing teams in a dynamic, fast-paced environment.
- Strong understanding of data architecture, data governance, and analytics technologies.
- Excellent communication and interpersonal skills, with the ability to influence stakeholders at all levels.

Posted 2 weeks ago

Apply

6.0 - 8.0 years

6 - 9 Lacs

Bengaluru

Work from Office

Responsibilities:
- Design and implement data extraction solutions from S/4HANA; create and enhance DataSources in S/4 for master data and transaction data.
- Monitor and troubleshoot data loads; ensure data accuracy and timeliness.
- Design and build InfoObjects, DataStore Objects (DSOs), MultiProviders, and InfoCubes (as applicable), ensuring performance optimization and scalability in data models.
- Develop and enhance BEx Queries and Analysis for Office reports.
- Support business users in defining key KPIs, metrics, and drill-down capabilities.
- Provide training and documentation for end users and key stakeholders.
- Analyze business requirements and translate them into technical specifications for reports.
- Collaborate with functional teams (FICO) to understand source data and business logic.
- Participate in requirement gathering, design workshops, and UAT sessions.
- Provide support for regression testing, go-live, and hypercare activities.
- Follow change management and transport procedures.
- Ensure compliance with data governance, security, and quality standards.

Systems: S/4HANA, SAC P, BW on HANA (7.4 SP20)
Reporting tools: Query Designer, BEx Analyzer, Analysis for Office
Mandatory skills: SAP BW (data extraction / data modelling / reporting)
Desired/secondary skills: Good to have S/4HANA with CDS views built
Max vendor rate per day (currency in relevance to work location): 12000 INR/day
Delivery anchor for tracking sourcing statistics, technical evaluation, interviews, feedback, etc.

Posted 2 weeks ago

Apply

4.0 - 12.0 years

3 - 7 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

- Design, develop, and test chatbot solutions leveraging AWS Lex, Python, AWS Lambda, and AWS DynamoDB; experience in building conversation flows.
- Collaborate with technical analysts / solution architects to assist with solution design.
- Take ownership and accountability beyond individual deliverables, always pushing the envelope to deliver strong results for business stakeholders.
- Follow and promote best engineering practices such as clean reusable code, Continuous Integration (CI), Test-Driven Development (TDD), and Infrastructure as Code.
- Provide technical software support, including investigating and qualifying bugs, interpreting procedure manuals, and maintaining accurate documentation.

Key Responsibilities:
- Design, develop, and deploy solutions on the Salesforce Data Cloud platform.
- Collaborate with stakeholders to gather requirements and translate them into technical specifications.
- Build custom applications, integrations, and data pipelines using Salesforce Data Cloud tools and technologies.
- Develop and optimize data models to support business processes and reporting needs.
- Implement data governance and security best practices to ensure data integrity and compliance.
- Perform troubleshooting, debugging, and performance tuning of Salesforce Data Cloud solutions.
- Stay current with Salesforce Data Cloud updates, best practices, and industry trends.
- Provide technical guidance and support to other team members and end users.
- Document solution designs, configurations, and customizations.
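The Lex/Lambda work described above can be sketched as a minimal fulfillment handler. This is a sketch under assumptions: the intent name ("CheckOrder"), the slot ("OrderId"), and the sample event are invented for illustration, following the general Lex V2 event/response shape; a real handler would also persist state in DynamoDB.

```python
def lambda_handler(event, context):
    """Minimal Lex V2-style fulfillment handler: read a slot, close the intent."""
    intent = event["sessionState"]["intent"]
    slots = intent.get("slots") or {}
    # Pull the interpreted value of a hypothetical "OrderId" slot, if present.
    order_id = None
    slot = slots.get("OrderId")
    if slot and slot.get("value"):
        order_id = slot["value"].get("interpretedValue")
    message = (f"Order {order_id} is being processed."
               if order_id else "I could not find an order id.")
    return {
        "sessionState": {
            "dialogAction": {"type": "Close"},
            "intent": {"name": intent["name"], "state": "Fulfilled"},
        },
        "messages": [{"contentType": "PlainText", "content": message}],
    }
```

The handler is a plain function, so the conversation flow can be unit-tested locally with dictionary events before deploying behind Lex.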

Posted 2 weeks ago

Apply

8.0 - 12.0 years

12 - 17 Lacs

Mumbai

Work from Office

As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience in integration efforts between Alation and Manta, ensuring seamless data flow and compatibility.
- Collaborate with cross-functional teams to gather requirements and design solutions that leverage both Alation and Manta platforms effectively.
- Develop and maintain data governance processes and standards within Alation, leveraging Manta's data lineage capabilities.
- Analyze data lineage and metadata to provide insights into data quality, compliance, and usage patterns.

Preferred technical and professional experience:
- Lead the evaluation and implementation of new features and updates for both Alation and Manta platforms, ensuring alignment with organizational goals and objectives.
- Drive continuous improvement initiatives to enhance the efficiency and effectiveness of data management processes, leveraging Alation.
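The lineage analysis this role centers on (Manta traces lineage, Alation catalogs it) amounts to traversing a dependency graph of data assets. A toy sketch, with asset names invented for illustration:

```python
# Toy lineage graph: each asset maps to the assets it is derived from.
LINEAGE = {
    "report.revenue": ["mart.sales"],
    "mart.sales": ["stg.orders", "stg.customers"],
    "stg.orders": ["raw.orders"],
    "stg.customers": ["raw.customers"],
}

def upstream(asset, graph=LINEAGE):
    """Return every upstream source an asset ultimately depends on."""
    seen = set()
    stack = [asset]
    while stack:
        node = stack.pop()
        for parent in graph.get(node, []):
            if parent not in seen:
                seen.add(parent)
                stack.append(parent)
    return sorted(seen)
```

Walking the graph this way answers impact-analysis questions such as "which raw sources feed this report?", which is the kind of insight the posting asks the engineer to surface.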

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Hyderabad

Work from Office

As a Data Engineer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Experience in integration efforts between Alation and Manta, ensuring seamless data flow and compatibility.
- Collaborate with cross-functional teams to gather requirements and design solutions that leverage both Alation and Manta platforms effectively.
- Develop and maintain data governance processes and standards within Alation, leveraging Manta's data lineage capabilities.
- Analyze data lineage and metadata to provide insights into data quality, compliance, and usage patterns.

Preferred technical and professional experience:
- Lead the evaluation and implementation of new features and updates for both Alation and Manta platforms, ensuring alignment with organizational goals and objectives.
- Drive continuous improvement initiatives to enhance the efficiency and effectiveness of data management processes, leveraging Alation.

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Job Purpose: ICE Data Services, a subsidiary of ICE, has an exciting opportunity within our third-party Market Data Governance team. The Market Data Governance team is responsible for the governance, oversight, and administration of third-party market data usage rights within the organization.

Responsibilities:
- Summarize and document market data usage policies and distribution rights.
- Assist with processing source-mandated changes to usage policies and pricing by updating documentation, maintaining internal entitlement systems, and preparing materials to communicate changes both internally and externally to clients.
- Administer ICE product entitlement systems, including setting up new services, products, and other features.
- Review and resolve inquiries related to usage policies, pricing, and billing issues, as well as entitlement system administration.
- Provide support to the external audit defense team by researching audit-related inquiries, providing entitlement system audit reports, and tracking remediation items.
- Assist with various compliance-related projects.

Knowledge and Experience:
- Experience in the market data industry; a compliance and contract management/negotiation background is a plus.
- Familiarity with real-time, end-of-day, and derived data use cases.
- Ability to analyze complex policy data (contracts and agreements, fee schedules, new rule announcements) to identify adjustments to current systems and processes.
- Highly effective written and verbal communication skills.
- Strong analytical and problem-solving skills.
- Proficiency in the standard Microsoft Office suite (Excel, Word, Access); knowledge of Microsoft Visio a plus.
- Proficiency with SQL.
- Ability to learn new software applications and systems.
- Adept at creating and maintaining effective relationships through strong interpersonal skills.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Gurugram

Work from Office

About the Role: Seeking a highly skilled Senior Data Engineer with 8+ years of experience to join our dynamic team.

Requirements:
- Experienced in architecting, building, and maintaining end-to-end data pipelines using Python and Spark in Databricks.
- Proficient in designing and implementing scalable data lake and data warehouse solutions on Azure, including Azure Data Lake, Data Factory, Synapse, and Azure SQL.
- Hands-on experience leading the integration of complex data sources and the development of efficient ETL processes.
- Champions best practices in data governance, data quality, and data security across the organization.
- Adept at collaborating closely with data scientists, analysts, and business stakeholders to deliver high-impact data solutions.
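The extract-transform-load pattern the posting describes can be sketched in a few lines. This is a plain-Python stand-in for a Spark/Databricks job, and the column names and sample data are invented; the shape (extract, then a transform that drops incomplete rows, de-duplicates on a key, and casts types) is what carries over.

```python
import csv
import io

RAW = """order_id,amount,country
1,100.5,IN
2,,IN
1,100.5,IN
3,250.0,US
"""

def extract(text):
    """Parse a CSV extract into a list of row dicts."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Drop rows missing an amount, de-duplicate on order_id, cast types."""
    seen, out = set(), []
    for r in rows:
        if not r["amount"] or r["order_id"] in seen:
            continue
        seen.add(r["order_id"])
        out.append({"order_id": int(r["order_id"]),
                    "amount": float(r["amount"]),
                    "country": r["country"]})
    return out

clean = transform(extract(RAW))
```

In a real pipeline the same transform logic would run over a Spark DataFrame and the load step would write to the lake or warehouse; keeping the transform a pure function makes it easy to test either way.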

Posted 2 weeks ago

Apply

7.0 - 17.0 years

9 - 13 Lacs

Gurugram

Work from Office

Job Title: MDG Developer
Location: Chennai, Bangalore, Hyderabad, Noida, Gurugram, Jaipur
Experience: 7-17 years
Work Mode: Contract
Duration: 6 months (Hybrid)
Shift Timings: 2-11 PM

Job Summary: We are seeking a talented and detail-oriented MDG Developer to join our growing SAP team. The ideal candidate will have hands-on experience with SAP Master Data Governance (MDG), particularly in developing and enhancing MDG frameworks for master data objects such as Customer, Vendor, Material, and Finance. This role involves working closely with functional consultants, data governance teams, and business stakeholders to build and maintain high-quality master data solutions.

Required Skills:
- Strong hands-on experience in SAP MDG development.
- Proficiency in ABAP, including OO ABAP, BAdIs, and enhancement frameworks.
- Experience with UI modeling using FPM and Web Dynpro ABAP.
- Familiarity with BRF+, rule-based workflows, and change requests.
- Good understanding of data replication (DRF) and integration with SAP ECC/S/4HANA and external systems.
- Experience in MDG Consolidation and Mass Processing (preferred).
- Strong problem-solving skills and ability to work independently.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

8 - 11 Lacs

Mumbai

Work from Office

Power BI Data Specialist

The Atlas Corp. and Seaspan teams are goal-driven and share a high-performance culture, focusing on building services offerings to become a leading asset manager. Seaspan provides many of the world's major shipping lines with alternatives to vessel ownership by offering long-term leases on large, modern containerships and pure car, truck carriers (PCTCs) combined with industry-leading ship management services. Seaspan's fleet has evolved over time to meet the varying needs of our customer base. We own vessels in a wide range of sizes, from 2,500 TEU to 24,000 TEU. As a wholly owned subsidiary of Atlas Corp, Seaspan delivers on the company's core strategy as a leading asset management and core infrastructure company.

Position Description: We are seeking a skilled and detail-oriented Power BI Data Specialist to join our Data Operations team. The ideal candidate will have strong SQL skills, a solid understanding of data warehousing concepts, and hands-on experience with Power BI development and administration. This role involves ensuring data quality and integrity, supporting business units with analytical needs, and enhancing the organization's data infrastructure through the addition of new dimensions, columns, and other data model updates. Experience with Databricks, big data technologies, and Power Apps/Automate is considered a strong asset.

Job Responsibilities:
- Data Analysis and Quality Assurance: Perform regular data audits and quality checks to ensure the accuracy and consistency of enterprise data. Investigate and resolve data issues in collaboration with business units. Proactively identify data anomalies and opportunities for improvement.
- Data Warehouse Support: Understand the structure and flow of data within the enterprise data warehouse. Add and maintain data warehouse dimensions, attributes, and measures as required. Collaborate with data engineers to optimize data structures and queries.
- Reporting and Visualization: Design, develop, and publish interactive Power BI dashboards and reports to meet business requirements. Maintain and administer Power BI workspaces, datasets, and security. Support report lifecycle management, ensuring reports are relevant, accurate, and up to date.
- Collaboration and Stakeholder Engagement: Work closely with business units to gather requirements and deliver analytical insights. Act as a liaison between business and technical teams to ensure data needs are met.
- Technical Tool Expertise: Write efficient SQL queries to extract, transform, and analyze data from various sources. Utilize Power BI for data visualization, modeling, and reporting.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 3+ years of experience in a Data Analyst or BI Analyst role.
- Hands-on experience with both Power BI development and administration.
- Strong proficiency in SQL for querying and manipulating data.
- Solid understanding of data warehousing concepts and practices.
- Familiarity with data quality frameworks and best practices.
- Excellent analytical thinking and problem-solving skills.
- Strong communication skills and ability to work collaboratively with non-technical stakeholders.

Preferred Qualifications:
- Experience with Databricks or similar big data platforms.
- Exposure to ETL tools and data integration processes.
- Understanding of data governance and security principles.
- Knowledge of Power Apps and Power Automate.
- Power BI or Microsoft Data certifications.

Job Demands and/or Physical Requirements: As Seaspan is a global company, occasional work outside of regular office hours may be required.

Atlas Corp. and Seaspan Corporation are an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to age, race, colour, religion, gender, sexual orientation, gender identity, national origin, disability, or protected Veteran status. We thank all applicants in advance. If your application is shortlisted to be included in the interview process, one of our team will be in contact with you.

A Week at Sea with Seaspan (video): filmed on board our 4250 TEU vessel, Seaspan Santos, during a six-day passage in September 2013. When you join the Seaspan family, you become part of a company with one of the newest and most advanced fleets in the industry. Our modern vessels are maintained to the highest standards and offer excellent living conditions for our seafarers.
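The "regular data audits and quality checks" responsibility above can be sketched as a small reusable check. This is a minimal sketch with invented sample data (vessel records keyed by a hypothetical `imo` column); real audits would run the same checks via SQL against the warehouse.

```python
def audit(rows, key, required):
    """Return a small data-quality report: duplicate keys and missing values."""
    seen, dupes = set(), set()
    missing = {col: 0 for col in required}
    for r in rows:
        k = r.get(key)
        if k in seen:
            dupes.add(k)
        seen.add(k)
        for col in required:
            if r.get(col) in (None, ""):
                missing[col] += 1
    return {"duplicates": sorted(dupes), "missing": missing}

# Invented sample records: one duplicate key, one missing name.
vessels = [
    {"imo": "91", "name": "Santos", "teu": 4250},
    {"imo": "92", "name": "", "teu": 10000},
    {"imo": "91", "name": "Santos", "teu": 4250},
]
report = audit(vessels, key="imo", required=["name", "teu"])
```

The report feeds directly into the "investigate and resolve data issues" loop: each duplicate key or missing-value count is a concrete item to chase with the owning business unit.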

Posted 2 weeks ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Job Description for Power BI.

Key Responsibilities:
- Power BI Development: Design and develop interactive dashboards, reports, and visualizations using Power BI, ensuring they are user-friendly and meet business requirements.
- Data Modeling: Create efficient data models by integrating data from multiple sources, ensuring consistency, accuracy, and integrity. Implement relationships, calculated columns, and measures using DAX (Data Analysis Expressions).
- Data Integration: Work with various data sources (SQL Server, Excel, SharePoint, APIs, etc.) to extract, transform, and load (ETL) data into Power BI datasets for analysis.
- Report Optimization: Optimize Power BI reports and dashboards for performance and scalability. Implement best practices for report design and visualization.
- Collaboration with Stakeholders: Work closely with business stakeholders to understand their reporting needs and translate them into effective BI solutions. Provide technical support and advice to end users.
- Data Analysis & Insight Generation: Analyze complex data sets and provide actionable insights to support business decision-making. Identify trends, patterns, and opportunities for improvement.
- Data Governance & Security: Ensure data governance standards are adhered to, including setting up proper security roles, user access, and data privacy measures in Power BI workspaces.
- Troubleshooting & Support: Address issues with Power BI reports and dashboards. Provide technical support to users and troubleshoot data or visualization errors.
- Documentation & Training: Document Power BI reports, dashboards, and data models for future reference. Provide training sessions to help business users effectively use the Power BI reports and visualizations.
- Continuous Improvement: Stay up to date with the latest Power BI features and best practices, incorporating them to improve report functionality and user experience.
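A "measure" of the kind the Data Modeling bullet describes is an aggregation computed over filtered fact rows. As a rough analogue (not DAX itself), the sketch below computes each region's share of total sales in plain Python; the table and column names are invented for illustration.

```python
from collections import defaultdict

# Invented fact table: one row per sale.
sales = [
    {"region": "North", "amount": 120.0},
    {"region": "South", "amount": 80.0},
    {"region": "North", "amount": 50.0},
]

def share_by_region(rows):
    """Analogue of a '% of total' measure: each region's share of all sales."""
    totals = defaultdict(float)
    for r in rows:
        totals[r["region"]] += r["amount"]
    grand = sum(totals.values())
    return {region: t / grand for region, t in totals.items()}

shares = share_by_region(sales)
```

In Power BI the same idea is expressed declaratively as a DAX measure over the model, with the filter context supplied by the visual rather than by explicit grouping code.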

Posted 2 weeks ago

Apply

8.0 - 13.0 years

35 - 50 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Overall Work Experience: 8+ years
Must-have skillset: Strong data governance and data management concepts, Collibra or IDMC, and excellent communication.

Work collaboratively with the customer's onshore team to support the following initiatives:
- Interface with business stakeholders, understand their data and analytics needs, establish requirements with technical stakeholders, and align on the delivery plan.
- Understand various data sources around asset classes, portfolios, historical performance, market trends, etc., and develop/enhance data documentation.
- Help deliver data-driven analysis and recommendations that effectively influence business decisions.
- Extract data, perform data cleansing / data quality checking tasks, and prepare data quality reports and model-ready data.
- Synthesize different sources of data into a single source while conducting data quality checks, applying relevant filters, etc.
- Develop Power BI dashboards to define procedures/metrics for reporting data quality.

Candidate Profile:
- Over 5 years of experience in data analytics, governance, and business analysis.
- Strong understanding of data analytics and ability to derive actionable insights.
- Skilled in developing strategic project roadmaps and aligning data initiatives with business goals.
- Proactive in proposing suggestions and providing regular project updates to stakeholders.
- Capable of writing SQL and Python code to troubleshoot and resolve data quality issues.
- Hands-on experience with data governance frameworks; Collibra knowledge helpful but not mandatory.
- Strong comprehension of metadata strategies and real-world use cases.
- Excellent communication skills and ability to work across business and technical teams.
- Familiar with the technology stack: SQL, Snowflake, Power BI.
- Experience with IceDQ (a plus).
- Understanding of investment fundamentals is a valuable asset.
- Detail-oriented, self-motivated, and adept at cross-functional collaboration.
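The "synthesize different sources into a single source while conducting data quality checks" task above can be sketched as a keyed merge that fills gaps and flags conflicts. The sources (`crm`, `erp`) and field names are invented; the pattern of preferring the primary source, filling blanks from the secondary, and surfacing disagreements for a steward to resolve is the point.

```python
def synthesize(primary, secondary, key="id"):
    """Merge two record sources into one set, flagging field-level conflicts."""
    merged, conflicts = {}, []
    for row in primary + secondary:
        k = row[key]
        if k not in merged:
            merged[k] = dict(row)
            continue
        for field, value in row.items():
            existing = merged[k].get(field)
            if existing in (None, ""):
                merged[k][field] = value          # fill a gap
            elif value not in (None, "") and value != existing:
                conflicts.append((k, field))      # flag a disagreement
    return list(merged.values()), conflicts

# Invented sample sources with one gap and one conflict.
crm = [{"id": 1, "name": "Acme", "sector": ""}]
erp = [{"id": 1, "name": "ACME Ltd", "sector": "Mfg"},
       {"id": 2, "name": "Beta", "sector": "Fin"}]
records, conflicts = synthesize(crm, erp)
```

The conflict list is exactly what a data quality report (or a Power BI quality dashboard, per the posting) would surface for review.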

Posted 2 weeks ago

Apply

7.0 - 9.0 years

15 - 18 Lacs

Kolkata, Hyderabad, Chennai

Hybrid

Key Skills & Responsibilities:
- Strong hands-on experience with Snowflake database design, coding, and documentation.
- Expertise in performance tuning for both Oracle and Snowflake.
- Experience as an Apps DBA, capable of coordinating with application teams.
- Proficiency in using OEM, Tuning Advisor, and analyzing AWR reports.
- Strong SQL skills with the ability to guide application teams on improvements.
- Efficient management of compute and storage in Snowflake architecture.
- Execute administrative tasks, handle multiple Snowflake accounts, and apply best practices.
- Implement data governance via column-level security, dynamic masking, and RBAC.
- Utilize Time Travel, cloning, replication, and recovery methods.
- Manage DML/DDL operations, concurrency models, and security policies.
- Enable secure data sharing internally and externally.

Locations: Multiple (Bangalore, Hyderabad, Chennai, Kolkata, Mumbai, Pune, Gurugram)
Interview Mode: Virtual (2 rounds)

Skills: Snowflake database design, coding, documentation, performance tuning, Apps DBA, Oracle, OEM, Tuning Advisor, AWR report analysis, SQL, compute and storage management, data governance, column-level security, dynamic masking, RBAC, Time Travel, cloning, replication, recovery methods, DML/DDL operations, concurrency models, security policies, secure data sharing.
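The "dynamic masking and RBAC" bullet above describes governance where the value a reader sees depends on their role. In Snowflake this is done with masking policies attached to columns; the plain-Python sketch below only illustrates the decision logic, with hypothetical roles and columns.

```python
MASKING_POLICIES = {
    # column -> roles allowed to see clear-text values (hypothetical roles)
    "email": {"DATA_STEWARD", "COMPLIANCE"},
    "salary": {"HR_ADMIN"},
}

def apply_masking(row, role):
    """Return a copy of the row with governed columns masked for this role."""
    out = {}
    for col, val in row.items():
        allowed = MASKING_POLICIES.get(col)
        out[col] = val if allowed is None or role in allowed else "***MASKED***"
    return out

row = {"name": "Priya", "email": "priya@example.com", "salary": 90000}
```

The key design point, which Snowflake's server-side policies share, is that masking is decided at read time from the current role, so the same table serves every consumer without duplicating data.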

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Title: S&C Global Network - Strategy - MC - Industry X - Smart Connected Products - Consultant
Management Level: 07 - Manager
Location: Bangalore / Gurgaon / Pune / Mumbai
Must-have skills: Business Process Consulting
Additional skills: Problem definition, architecture, design, R&D, innovation management, PLM, BOM, Digital Twin and Thread space, process excellence, digital transformations, SAP PLM packages (PDM functional knowledge and configuration, PPM, Master Data, EBOM, PS) with strong functional and implementation knowledge

Job Summary: Looking for a self-driven and seasoned Senior Manager/Manager with exceptional skills in coordinating, organizing, and supporting the execution of transformations/improvements in PLM programs for our clients, and in building and growing the Engineering and R&D Digitization team. As Senior Manager/Manager in Engineering and R&D Digitization, you will work closely with leadership to define and deliver in the areas of PLM enablement, BOM management, Master Data Management, and Digital Twin & Thread.

Roles & Responsibilities:
- Lead Engineering and R&D transformation programs to drive innovation and process enablement for clients.
- Lead and curate relevant assets and offerings in PLM enablement, integrated BOM, product and engineering Master Data Management, and Digital Twin and Thread areas; develop and execute go-to-market for these along with leadership.
- In-depth understanding of Product Data Management; able to drive the product journey with capabilities in defining PLM roadmaps, process design, value realization, and PLM maturity assessment.
- Experience in master/material data management and data migration tools and solutions that meet our clients' needs in innovative ways.
- Enable transformation in R&D utilizing SAP PLM capabilities by creating business processes for package/product design, bill of material management, engineering change management, product research, simulations, prototyping, product testing (qualitative & quantitative), and supplier integration.

Professional & Technical Skills:
- At least 10 years of experience in business process consulting, problem definition, and architecture/design/detailing of processes.
- At least 7 years of experience in SAP PLM packages (PDM functional knowledge and configuration, PPM, Master Data, EBOM, PS) with strong functional and implementation knowledge, as well as general project management and customer management skills.
- At least 6 years of industry experience with SAP PLM package implementations, including strong knowledge of configuration, Agile architecture, and all its components. Experience in classification migration, master data cleansing, and engineering master data is preferred.
- At least 5 years of experience in configuration/solution evaluation, validation, and deployment.
- Project management experience with strong communication and teamwork skills.
- Ability to work in a global environment using an onshore-offshore model; sensitivity and skill at working with different cultures and styles.
- Ability to rapidly learn and apply new engineering technologies; exposure to other PLM tools.

Additional Information:
- Experience of working in the PLM, BOM, Master Data Management, and Digital Twin and Thread space.
- Expert in SAP PLM, process excellence, data governance, digital transformations, and shaping end-to-end engineering transformations.
- Concrete experience leading complex PLM solution design across multiple industries.
- Ability to work in a rapidly changing environment where continuous innovation is desired.
- Analytical and quantitative skills and the ability to use hard data and metrics to back up assumptions and develop business cases, with the ability to clearly communicate these data insights to others.
- General manager / owner mentality; works closely with the team to deliver.

Experience: Minimum 5 years of experience is required.
Educational Qualification: Engineering degree; MBA preferred.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 18 Lacs

Kolkata, Hyderabad, Chennai

Hybrid

Key Skills & Responsibilities:
- Strong hands-on experience with Snowflake database design, coding, and documentation.
- Expertise in performance tuning for both Oracle and Snowflake.
- Experience as an Apps DBA, capable of coordinating with application teams.
- Proficiency in using OEM, Tuning Advisor, and analyzing AWR reports.
- Strong SQL skills with the ability to guide application teams on improvements.
- Efficient management of compute and storage in Snowflake architecture.
- Execute administrative tasks, handle multiple Snowflake accounts, and apply best practices.
- Implement data governance via column-level security, dynamic masking, and RBAC.
- Utilize Time Travel, cloning, replication, and recovery methods.
- Manage DML/DDL operations, concurrency models, and security policies.
- Enable secure data sharing internally and externally.

Locations: Multiple (Bangalore, Hyderabad, Chennai, Kolkata, Mumbai, Pune, Gurugram)
Interview Mode: Virtual (2 rounds)

Skills: Snowflake database design, coding, documentation, performance tuning, Apps DBA, Oracle, OEM, Tuning Advisor, AWR report analysis, SQL, compute and storage management, data governance, column-level security, dynamic masking, RBAC, Time Travel, cloning, replication, recovery methods, DML/DDL operations, concurrency models, security policies, secure data sharing.

Posted 2 weeks ago

Apply

5.0 - 7.0 years

15 - 18 Lacs

Mumbai

Work from Office

Business Analyst focused on designing and implementing a Data Governance strategy and Master Data Management (MDM) framework. This role will support the high-level design and detailed design phases of a transformative project involving systems such as 3DS PLM, SAP, Team Centre, and Blue Yonder. The ideal candidate will bring a blend of business analysis expertise, data governance knowledge, and automotive/manufacturing domain experience to drive workshops, map processes, and deliver actionable recommendations. Working closely with the GM of Master Data and MDM technical resources, you will play a pivotal role in aligning people, processes, and technology to achieve M&M's data governance and MDM objectives. Key Responsibilities Requirements Gathering & Workshops: Lead and facilitate workshops with business and IT stakeholders to elicit requirements, define data governance policies, and establish MDM strategies for automotive-specific data domains (e.g., parts, engineering data, bill of material, service parts, supplier and dealer master data). Process Mapping & Design: Document and design master data-related processes, including data flows between systems such as 3DS, SAP, Talend, and Blue Yonder, ensuring alignment with business needs and technical feasibility. Analysis & Recommendations: Analyse existing data structures, processes, and system integrations to identify gaps and opportunities; provide clear, actionable recommendations to support the Data Governance and MDM strategy. Stakeholder Collaboration: Act as a bridge between business units, IT teams, and technical resources (e.g., 3DS specialists) to ensure cohesive delivery of the project objectives. Documentation & Communication: Create high-quality deliverables, including process maps, requirement specifications, governance frameworks, and summary reports, tailored to both technical and non-technical audiences.
Support Detailed Design: Collaborate with the 3DS/Talend technical resource to translate high-level designs into detailed MDM solutions, ensuring consistency across people, process, and technology components. Project Support: Assist the MDM Leadership in planning, tracking, and executing project milestones, adapting to evolving client needs. Required Skills & Qualifications: 5+ years of experience as a Business Analyst, with a focus on data governance and master data management (MDM) tools such as Talend, Informatica, Reltio, etc. Proven track record of working on automotive/manufacturing industry projects, ideally with exposure to systems like 3DS, Team Centre, SAP S/4HANA, MDG, or Blue Yonder. Technical Knowledge: Strong understanding of MDM concepts, data flows, and governance frameworks. Familiarity with auto-specific data domains (e.g., ECCMA/E-Class Schema). Experience with process modelling tools (e.g., Visio, Lucidchart, or BPMN) and documentation standards. Soft Skills: Exceptional communication and facilitation skills, with the ability to engage diverse stakeholders and drive consensus in workshops. Methodical and structured approach to problem-solving and project delivery. Ability to summarize complex information into clear, concise recommendations. Education: Bachelor's degree in Business, Information Systems, or a related field (or equivalent experience). Certifications: Relevant certifications (e.g., CBAP, PMP, or MDM-specific credentials) are a plus but not required. Preferred Qualifications: Prior consulting experience in a client-facing role. Hands-on experience with MDG, Talend, Informatica, Reltio, or similar MDM platforms.
Exposure to data quality analysis or profiling (not required to be at a Data Analyst level) Skills: 3ds,eccma/e-class schema,data governance policies,high-quality deliverables,data flows,talend,governance frameworks,mdm platforms,mdm strategy,mdm strategies,team centre,mdm leadership,mdm objectives,m&m's data governance,master data management (mdm),sap,process modelling tools,cbap,master data,stakeholder collaboration,process mapping,visio,pmp,data quality analysis,mdg,data governance,informatica,3ds plm,data governance knowledge,problem-solving,workshop facilitation,mdm,mdm concepts,bpmn,auto-specific data domains,mdm-specific credentials,blue yonder,communication skills,reltio,sap s/4hana,lucid chart,profiling

Posted 2 weeks ago

Apply

5.0 - 8.0 years

18 - 30 Lacs

Hyderabad

Remote

Key Responsibilities Atlan Deployment & Connector Setup: Configure Atlan (SaaS or private cloud), set up connectors for Databricks, Hadoop, and Power BI, and schedule metadata ingestion pipelines. Metadata Modeling & Domain Onboarding: Work with domain owners to map schemas, define custom metadata attributes (sensitivity, owner, SLA), and create standardized ingestion playbooks for new data domains. Lineage Instrumentation & Data Profiling: Instrument column-level lineage via OpenLineage or native Atlan connectors; configure automated profiling jobs (row counts, null rates) to surface data quality metrics. Governance Policy Implementation: Translate policies (PII detection, data retention) into Atlan's rule engine, configure RBAC and SSO/LDAP integration, and implement encryption/masking for sensitive datasets. Monitoring & Troubleshooting: Build monitoring dashboards (CloudWatch, Grafana) to track ingestion health and API errors, diagnose pipeline failures, and coordinate fixes with Atlan Support and source-system teams. Required Qualifications 3-5 years of hands-on experience implementing or supporting Atlan's platform (catalog, lineage, policy automation); Collibra or Alation experience is good to have. Proficiency in Python or JavaScript for API integrations, strong SQL skills, and hands-on experience with ETL/ELT frameworks. Familiarity with cloud platforms (AWS/GCP), containerization (Kubernetes/Docker), and infrastructure-as-code scripting (Terraform, CloudFormation). Solid understanding of metadata concepts (technical vs. business metadata, lineage, profiling) and data classification schemes (PII, PCI, PHI). Strong stakeholder-engagement skills: able to run onboarding sessions and create clear runbooks. Bachelor's in Computer Science, Data Engineering, or a related field; relevant cloud or Atlan certifications a plus
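The automated profiling jobs mentioned above (row counts, null rates) reduce to simple per-column statistics. A minimal pure-Python sketch, with invented column names, shows the shape of the computation:

```python
# Minimal data-profiling sketch: row count and per-column null rate,
# the two metrics called out in the listing. Column names are illustrative.
def profile(rows, columns):
    total = len(rows)
    null_rate = {
        col: (sum(1 for r in rows if r.get(col) is None) / total) if total else 0.0
        for col in columns
    }
    return {"row_count": total, "null_rate": null_rate}

stats = profile(
    [{"id": 1, "email": None}, {"id": 2, "email": "a@example.com"}],
    ["id", "email"],
)
```

A catalog tool would persist these metrics per asset and alert when a null rate drifts past a threshold.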

Posted 2 weeks ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Bengaluru

Work from Office

Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: PySpark Good to have skills: NA. Minimum 5 year(s) of experience is required Educational Qualification: 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities:- Expected to be an SME.- Collaborate with and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Mentor junior team members to enhance their skills and knowledge in data engineering.- Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills:- Must-Have Skills: Proficiency in PySpark.- Strong understanding of data modeling and database design principles.- Experience with data warehousing solutions and ETL tools.- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.- Knowledge of data governance and data quality frameworks. Additional Information:- The candidate should have a minimum of 5 years of experience in PySpark.- This position is based at our Bengaluru office.- 15 years of full-time education is required. Qualification: 15 years full time education
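The extract-transform-load flow this role describes can be sketched in plain Python (a production pipeline of this kind would use PySpark DataFrames, which are not assumed here); the column names and the quality rule below are invented for illustration:

```python
# Hedged ETL sketch: extract rows from CSV, apply a transform with a simple
# data-quality rule (drop rows missing an id), normalize a field, and return
# the cleaned records ready for loading.
import csv
import io

def transform(record):
    # Quality rule: reject rows with no id; normalize the city field.
    if not record.get("id"):
        return None
    record["city"] = record["city"].strip().title()
    return record

def run_pipeline(source_csv: str):
    rows = csv.DictReader(io.StringIO(source_csv))
    return [r for r in (transform(dict(row)) for row in rows) if r]

out = run_pipeline("id,city\n1, bengaluru \n,chennai\n")
```

In PySpark the same shape would be a `filter` plus `withColumn` over a DataFrame, with the quality rule expressed as a column predicate.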

Posted 2 weeks ago

Apply


3.0 - 8.0 years

5 - 10 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: Databricks Unified Data Analytics Platform Good to have skills: NA. Minimum 3 year(s) of experience is required Educational Qualification: 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand data requirements and contribute to the overall data strategy of the organization, ensuring that data is accessible, reliable, and actionable for stakeholders. Roles & Responsibilities:- Expected to perform independently and become an SME.- Active participation/contribution in team discussions is required.- Contribute to providing solutions to work-related problems.- Assist in the design and implementation of data architecture and data models.- Monitor and optimize data pipelines for performance and reliability. Professional & Technical Skills:- Must-Have Skills: Proficiency in Databricks Unified Data Analytics Platform.- Strong understanding of data integration techniques and ETL processes.- Experience with data quality frameworks and data governance practices.- Familiarity with cloud platforms and services related to data storage and processing.- Knowledge of programming languages such as Python or Scala for data manipulation.
Additional Information:- The candidate should have minimum 3 years of experience in Databricks Unified Data Analytics Platform.- This position is based at our Hyderabad office.- A 15 years full time education is required. Qualification 15 years full time education

Posted 2 weeks ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Hyderabad

Work from Office

Project Role: Data Engineer Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills: PySpark Good to have skills: NA. Minimum 5 year(s) of experience is required Educational Qualification: 15 years full time education Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities:- Expected to be an SME.- Collaborate with and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute to key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Mentor junior team members to enhance their skills and knowledge in data engineering.- Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills:- Must-Have Skills: Proficiency in PySpark.- Strong understanding of data modeling and database design principles.- Experience with data warehousing solutions and ETL tools.- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.- Knowledge of data governance and data quality best practices. Additional Information:- The candidate should have a minimum of 5 years of experience in PySpark.- This position is based in Hyderabad.- 15 years of full-time education is required. Qualification: 15 years full time education

Posted 2 weeks ago

Apply


6.0 - 11.0 years

8 - 13 Lacs

Gurugram

Work from Office

Develop, test, and support future-ready data solutions for customers across industry verticals. Develop, test, and support end-to-end batch and near real-time data flows/pipelines. Demonstrate understanding of data architectures, modern data platforms, big data, analytics, cloud platforms, data governance, information management, and associated technologies. Communicate risks and ensure they are understood. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Graduate with a minimum of 6+ years of related experience required. Experience in data modelling and business system design. Strong hands-on experience with DataStage and cloud-based ETL services. Strong expertise in writing T-SQL code. Well-versed in data warehouse schemas and OLAP techniques. Preferred technical and professional experience Ability to manage and make decisions about competing priorities and resources. Ability to delegate where appropriate. Must be a strong team player/leader. Ability to lead data transformation projects with multiple junior data engineers. Strong oral, written, and interpersonal skills for interacting with all levels of the organization. Ability to communicate complex business problems and technical solutions.
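The "OLAP techniques" and warehouse-schema work mentioned above come down to grouped aggregation over a fact table. A toy Python rollup under that assumption (a real warehouse would express this as a T-SQL GROUP BY; the fact-table fields are invented):

```python
# Toy OLAP-style rollup: aggregate a measure over chosen dimensions,
# the same operation a warehouse expresses as GROUP BY.
from collections import defaultdict

def rollup(facts, dims, measure):
    totals = defaultdict(float)
    for f in facts:
        key = tuple(f[d] for d in dims)  # grouping key over the dimensions
        totals[key] += f[measure]
    return dict(totals)

sales = [
    {"region": "South", "year": 2024, "amount": 10.0},
    {"region": "South", "year": 2024, "amount": 5.0},
    {"region": "North", "year": 2024, "amount": 7.0},
]
by_region = rollup(sales, ["region"], "amount")
```

Passing more dimensions (`["region", "year"]`) drills down; fewer dimensions rolls up, which is the cube operation in miniature.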

Posted 2 weeks ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office

As a Software Developer at IBM you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets. In this role, your responsibilities may include: Implementing and validating predictive models as well as creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques. Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours. Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modelling results. Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Bachelor's degree in Computer Science, Supply Chain, Information Systems, or a related field. Minimum of 5-7 years of experience in Master Data Management or a related field. 3-5 years of SAP/ERP experience required, with strong exposure to at least two of the functional areas described above. Proven experience in leading an MDM team. Strong knowledge of data governance principles, best practices, and technologies. Experience with data profiling, cleansing, and enrichment tools.
Ability to work with cross-functional teams to understand and address their master data needs. Proven ability to build predictive analytics tools using Power BI, Spotfire, or similar tools. Preferred technical and professional experience You thrive on teamwork and have excellent verbal and written communication skills. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
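The master-data cleansing and matching work this role involves can be illustrated with a deliberately simple match-key dedupe. Real MDM platforms (Informatica, Reltio, etc.) use fuzzy matching and survivorship rules; the supplier names below are invented:

```python
# Toy master-data dedupe: normalize supplier names to a match key and keep
# the first record seen per key (a crude stand-in for survivorship rules).
def match_key(name: str) -> str:
    # Lowercase and strip non-alphanumerics so "Acme Corp." == "ACME CORP".
    return "".join(ch for ch in name.lower() if ch.isalnum())

def dedupe(records):
    seen = {}
    for r in records:
        seen.setdefault(match_key(r["supplier"]), r)
    return list(seen.values())

golden = dedupe([
    {"supplier": "Acme Corp."},
    {"supplier": "ACME CORP"},
    {"supplier": "Globex"},
])
```

The surviving records play the role of "golden" master records; a production match engine would score candidate pairs rather than rely on an exact key.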

Posted 2 weeks ago

Apply

1.0 - 4.0 years

3 - 6 Lacs

Hyderabad

Work from Office

Job Description Job Purpose ICE Data Services, a subsidiary of ICE, has an exciting opportunity within our third-party Market Data Governance team. The Market Data Governance team is responsible for the governance, oversight, and administration of third-party market data usage rights within the organization. Responsibilities Summarize and document market data usage policies and distribution rights Assist with processing source-mandated changes to usage policies and pricing by updating documentation, maintaining internal entitlement systems, and preparing materials to communicate changes both internally and externally to clients Administer ICE product entitlement systems, including setting up new services, products, and other features Review and resolve inquiries related to usage policies, pricing, and billing issues, as well as entitlement system administration Provide support to the external audit defense team by researching audit-related inquiries, providing entitlement system audit reports, and tracking remediation items Assist with various compliance-related projects Knowledge and Experience Experience in the market data industry, including a compliance and contract management/negotiation background, a plus Familiarity with real-time, end-of-day, and derived data use cases Ability to analyze complex policy data (contracts and agreements, fee schedules, new rule announcements) to identify adjustments to current systems and processes Highly effective written and verbal communication skills Strong analytical and problem-solving skills Proficiency in the standard Microsoft Office suite (Excel, Word, Access); knowledge of Microsoft Visio a plus Proficiency with SQL Ability to learn new software applications and systems Adept at creating and maintaining effective relationships through strong interpersonal skills

Posted 2 weeks ago

Apply

6.0 - 10.0 years

14 - 15 Lacs

Bengaluru

Work from Office

At Broadridge, we've built a culture where the highest goal is to empower others to accomplish more. If you're passionate about developing your career while helping others along the way, come join the Broadridge team. 1. 6-10 years of experience as a Product Owner, with demonstrated expertise in managing complex data elements across multiple products. 2. Capable of defining and communicating a clear vision for data products, ensuring alignment with the organization's priorities and goals. 3. Collaborate with multiple cross-functional teams - including engineering, product, and business SMEs - to gather and understand their data needs and translate them into actionable requirements. 4. Design and implement a robust, scalable, and transparent intake process for collecting, evaluating, prioritizing, and tracking data-related requests from across the organization. 5. Own the product backlog, ensuring stories and tasks are clear, complete, and prioritized according to business value and technical feasibility. 6. Partner with technical teams to deeply understand source systems, data models, data pipelines, data quality, and data governance issues. Facilitate discussions around data architecture, accessibility, and integration. 7. Define clear, comprehensive user stories and acceptance criteria based on business and technical requirements, especially for complex data use cases and integrations. 8. Ensure effective communication and change management practices are in place as data products evolve and are rolled out to users. We are dedicated to fostering a collaborative, engaging, and inclusive environment and are committed to providing a workplace that empowers associates to be authentic and bring their best to work. We believe that associates do their best when they feel safe, understood, and valued, and we work diligently and collaboratively to ensure Broadridge is a company, and ultimately a community, that recognizes and celebrates everyone's unique perspective.

Posted 2 weeks ago

Apply