
1386 Data Governance Jobs - Page 12

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 12.0 years

25 - 30 Lacs

Pune

Work from Office


The Data & Analytics team is looking for a Databricks Platform Engineer to drive our mission of unlocking the potential of data assets on the AWS cloud by continuously innovating, moving data to our cloud platforms, and enabling our business engineering teams to consume it according to set standards and principles. Our Cloud Data Platform (Databricks) is used by various groups to derive insights and perform Machine Learning and Data Science activities, which in turn support revenue-generating initiatives across the organization.

Role - Databricks Platform Engineer
- Design and develop high-quality, secure, scalable software solutions based on technical requirements, specifications, and design artifacts within the expected time and budget.
- Develop and build solutions on cloud data platforms (Databricks), adhering to industry standards.
- Develop and build Databricks as a platform for the engineering community to use.
- Build a governance framework for cloud data storage and usage for enterprise data platforms.
- Collaborate on cross-functional agile teams that include developers, report/BI developers, and product owners.
- Work closely with Product Owners to understand cloud use cases and determine the best way to implement them.
- Provide technology leadership in the cloud big data space to development and product teams, and to other leaders in the big data and cloud data platform space.
- Stay abreast of data platform technology trends and industry best practices.
- Participate in architectural discussions, iteration planning, and feature sizing meetings.
- Adhere to Agile processes and participate actively in agile ceremonies.
- Stakeholder management skills.

All About You
- 8-12 years of hands-on experience with cloud data platforms.
- Hands-on experience in Databricks and a deep understanding of its architecture.
- Experience managing, developing, and supporting data lakes, lakehouses, and warehouses (on premise and in the cloud), ETL solutions, and other analytics solutions.
- Exposure to technologies such as R and Python is an added advantage.
- Experience working with development/engineering teams in a global environment.
- In-depth practical experience with native cloud services (Azure and/or AWS).
- Strong understanding of cloud DevOps practices and implementation.
- Experience in cloud data migration for large enterprise data warehouse environments.
- Understanding of cloud data governance, including compute optimization, cost control, and user profile management.
- In-depth understanding of cloud operations strategy and user management across various cloud providers and platforms.
- Strong understanding of, and hands-on experience with, cloud security models, encryption strategy, network layers for on-prem plus cloud hybrid models, and related concepts.
- Experience establishing a new engineering team and taking it to a steady state.
- Understanding of database technologies such as Exadata, Oracle, Netezza, and SQL Server.
- Understanding of SDLC and experience establishing processes, standards, and governance to bring efficiency to the development team.
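The governance-framework responsibility above usually translates into access controls on the platform itself. Below is a minimal sketch, assuming Databricks Unity Catalog and Spark SQL; the catalog, schema, table, and group names are invented for illustration and are not part of this listing.

```python
# Hypothetical sketch: read-only grants for the engineering community on
# governed Unity Catalog tables, issued through Spark SQL on Databricks.
# Catalog/schema/table and group names are placeholders.
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

governed_tables = [
    ("analytics", "sales", "daily_orders"),
    ("analytics", "finance", "ledger_entries"),
]

for catalog, schema, table in governed_tables:
    fq_name = f"{catalog}.{schema}.{table}"
    # Grant read-only access to the shared engineering group.
    spark.sql(f"GRANT SELECT ON TABLE {fq_name} TO `data-engineering`")
    # Surface the effective grants as a simple audit record.
    spark.sql(f"SHOW GRANTS ON TABLE {fq_name}").show(truncate=False)
```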

Posted 1 week ago

Apply

4.0 - 9.0 years

8 - 13 Lacs

Gurugram

Work from Office


I. Job Summary
This intermediate-level position is part of a team responsible for the configuration and support of software application systems within the People Organization. As part of the HR Technology team, this role provides basic technical and analytical support for delivering HR processes. With experienced team members, may provide input for delivering HR processes.

II. Essential Duties and Responsibilities
To perform this job successfully, an individual must be able to perform each essential duty satisfactorily. Other duties may be assigned.
- Monitors HR systems and open cases, and reviews current processes to troubleshoot application-related issues and answer system-related questions. Seeks development on the job and through more formal training.
- Performs analysis and documents current and new processes to take advantage of technology.
- Makes required configuration changes according to documented requirements. Advises on impacts to current configuration and downstream applications.
- Maintains foundational workforce structures, data fields, and processes. Analyzes the impact of configuration of tables, data fields, foundational structures, and processes on downstream systems and integrations.
- Ensures data integrity and governance by supporting data imports and extracts and validating accuracy through reporting and queries. Supports integrations/file transfers.
- With guidance, analyzes new software application products or new modules in existing applications.
- Provides day-to-day support and maintenance for system(s), and preparation for releases, upgrades, and/or patches. Executes testing, reporting, and analysis of changes. Monitors open tickets/vendor escalations for progress.
- Executes unit, integration, and acceptance testing. Working with the functional team, provides screenshots and system steps for testing and change management.
- May be responsible for configuring and delivering basic reports and queries using delivered software. Follows established data governance. Documents all configuration.

III. Supervisory Responsibilities
No supervisory duties.

IV. Qualifications
The requirements listed below are representative of the qualifications necessary to perform the job.
A. Education and Experience
Education: Bachelor's Degree (accredited), or in lieu of a degree, a High School Diploma or GED (accredited) and four (4) years of relevant work experience. Experience: Two (2) years of previous experience (in addition to the education requirement).
B. Certificates, Licenses, Registrations or Other Requirements
None required.
C. Other Knowledge, Skills or Abilities Required
Hands-on configuration of application(s) and supporting releases, patches, upgrades, and/or enhancements. Excellent written and verbal communication skills. Strong organizational and analytical skills. Ability to provide efficient, timely, reliable, and courteous service to customers. Ability to effectively present information.

V. Work Environment
Listed below are key points regarding the environmental demands and work environment of the job. Reasonable accommodations may be made to enable individuals with disabilities to perform the essential functions of the job.
Required to use motor coordination with finger dexterity (such as keyboarding, machine operation, etc.) most of the work day. Required to exert physical effort in handling objects less than 30 pounds rarely. Required to be exposed to physical occupational risks (such as cuts, burns, exposure to toxic chemicals, etc.) rarely. Required to be exposed to a physical environment which involves dirt, odors, noise, weather extremes, or similar elements rarely. The normal setting for this job is an office setting. Must be available to work standard business hours, as well as non-standard hours in case of emergency (natural disasters, power outages, etc.). May need to attend after-hours calls with the offshore team.

Posted 1 week ago

Apply

3.0 - 8.0 years

11 - 15 Lacs

Gurugram

Work from Office


3+ years of experience with Microsoft Purview.

Requirements
- Excellent communication skills, with the ability to effectively liaise with both technical and non-technical stakeholders.
- Capable of generating accurate, comprehensive as-built documentation representing the total output of work delivered to the client.
- Strong ability to create a positive impression on clients and maintain confidence while guiding client IT teams in enterprise deployments of Purview, navigating various client challenges, attitudes, concerns, and expectations while achieving technical success.
- Strong analytical, problem-solving, and troubleshooting skills.

Role
- Design and implement Data Security solutions and capabilities that are clearly aligned to the client's business, technology, and threat drivers.
- Implement the Microsoft Information Protection and Microsoft Purview suite of unified data governance solutions within a complex business environment, through requirements gathering, building, testing, and production roll-out.
- Demonstrate proven problem-solving skills with an emphasis on tool implementation and integration.
- Act as a subject matter expert for Microsoft Purview unified data governance solutions that manage data services across on-premises, multi-cloud, and software-as-a-service (SaaS) estates.
- Create an up-to-date map of the entire data estate, including data classification and end-to-end lineage, and identify where sensitive data is stored in the estate.
- Create a secure environment for data consumers to find valuable data, generate insights about how data is stored and used, and manage access to data in the estate securely and at scale.
- Hands-on knowledge and capability to build proof-of-concept solutions and integrations with workflow management, identity, and security operations.
- Develop strategy and roadmap, operating model, policies/standards, and tool design and process documents.
- Evaluate new solutions and services, providing a business case on whether the firm should develop skills and vendor relations within new Data Security solutions and technologies.
- Identify and address client needs and build relationships with clients.
- Demonstrate documentation and presentation skills, strong critical thinking, and problem solving with clear communication.
- The ability and mindset to fully own the production environment: identify production issues, design and develop enhanced monitoring solutions, and automate fixes for those issues.
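As a rough illustration of the kind of Purview integration work described, the sketch below lists business glossaries through Purview's Atlas-compatible REST endpoint. The account name is a placeholder, and the endpoint path and token scope are assumptions based on Purview's public data-plane API rather than details from this posting.

```python
# Assumed endpoint layout: https://<account>.purview.azure.com/catalog/api/atlas/v2
# The account name below is hypothetical.
import requests
from azure.identity import DefaultAzureCredential

account = "contoso-purview"
endpoint = f"https://{account}.purview.azure.com/catalog/api/atlas/v2/glossary"

# Acquire an Azure AD token for the Purview data plane.
credential = DefaultAzureCredential()
token = credential.get_token("https://purview.azure.net/.default").token

resp = requests.get(endpoint, headers={"Authorization": f"Bearer {token}"})
resp.raise_for_status()

for glossary in resp.json():
    terms = glossary.get("terms", [])
    print(f"{glossary.get('name')}: {len(terms)} terms")
```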

Posted 1 week ago

Apply

10.0 - 15.0 years

9 - 13 Lacs

Hyderabad

Work from Office


ServiceNow is seeking a seasoned product management leader to drive the end-to-end lifecycle of Generative AI Platform capabilities, with a strong focus on data infrastructure, AI readiness, and trusted AI delivery. This leader will partner closely with engineering, platform and horizontal product teams, and go-to-market functions to deliver AI-native platform features that power next-gen applications across multiple use cases. The ideal candidate is deeply passionate about Gen AI platforms, understands the critical role of data in building and scaling AI products, and thrives in fast-paced, ambiguous environments.

Responsibilities
- Own the product strategy and roadmap for Gen AI platform services, with emphasis on data ingestion, model lifecycle management, grounding, prompt orchestration, and output validation.
- Partner with engineering and design to build robust, scalable platform components that address the unique challenges of Gen AI deployment in the enterprise.
- Drive requirements and integration strategies for data governance, vector databases, LLM evaluation tooling, and observability throughout the AI lifecycle.
- Leverage existing ServiceNow capabilities while identifying key innovations needed to unlock the full value of Gen AI across the product portfolio.
- Influence horizontal and vertical product teams to adopt common Gen AI and data standards, ensuring reuse, scalability, and trust.
- Collaborate with Outbound partners to deliver ecosystem-aligned, data-powered solutions to market.
- Analyze competitive Gen AI platform trends and identify whitespace opportunities to differentiate ServiceNow.
- Engage deeply with customers to drive platform adoption, gather feedback, and continuously iterate on the roadmap based on evolving enterprise needs.
- Prototype and test new AI capabilities with cross-functional teams, translating early learnings into product direction.
- Champion customer-centric thinking across the organization and be the voice of AI platform users, including developers, data scientists, and enterprise architects.

To be successful in this role you have:
- Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
- Experience in building or managing Gen AI platform capabilities, including data pipelines, model orchestration, LLM tuning, and evaluation frameworks, with a strong track record as an individual contributor.
- 10+ years of enterprise software product management experience, with at least 5 in SaaS; AI/ML or data platform experience strongly preferred.
- Familiarity with ServiceNow's platform and application portfolio is a strong plus.
- Thought leadership in Generative AI trends, AI safety and ethics, and enterprise AI adoption patterns.
- Comfortable navigating complexity and ambiguity, with a bias for action and continuous learning.
- Excellent communicator who can tailor messages to technical and business audiences alike, from LLM practitioners to C-suite stakeholders.
- Analytical thinker with strong data literacy; able to connect technical metrics with product strategy and user outcomes.
- Proven collaborator with experience driving consensus and execution across engineering, design, sales, and customer success teams.
- Obsession with product-market fit and delivering value at speed while maintaining a long-term architectural vision.
- Entrepreneurial mindset with experience launching 0-to-1 products or platform capabilities; proven ability to scale offerings over time.
- Ability to lead and manage a high-performing team of product managers, leveraging strong leadership skills to inspire and motivate them to achieve exceptional results.
- Strong understanding of the role of data in AI development (labeling, quality, governance) and how it impacts model performance and business outcomes.

Posted 1 week ago

Apply

7.0 - 9.0 years

9 - 12 Lacs

Chennai

Work from Office


Position Purpose
As a member of the WM IT DCIO team, the candidate will work on WM Data Governance workstreams as prioritized (Data Quality, Data Literacy, Data Lineage/Architecture, Data Privacy by Design, and Data Protection). Support the WM IT DCIO team in driving Data initiatives transversally with WM Leadership, Application Development & Maintenance (ADM), Engineering & Production (E&P), and Security (CISO) teams. Support the development and testing of software/applications for Data Management. Note: DCIO complements the Chief Data Office (CDO) within the Wealth Management IT organization.

Responsibilities
Direct Responsibilities
- Work closely with the WM IT DCIO team to execute the Data Governance implementation across Data initiatives, e.g., RAMI (Data Retention), Data Privacy by Design, Data Quality, etc.
- Create and test proof-of-concept solutions to support the strategic evolution of the software applications.
- Act as the Data Governance SME within Wealth Management, operationally working with the Data Custodian IT Officer (DCIO), DPO (Data Protection Officer), and CDO (Chief Data Officer) teams.
- Hands-on with the development, testing, configuration, and deployment of software systems in the Data Transversal organization.
- Operationalize Data Policies/Frameworks, including Business Glossaries, Data Dictionaries, Data Profiling, etc.

Technical & Behavioral Competencies
Minimum 7+ years of experience with data expertise (at least 2 of the following: Data Governance, Data Quality, Data Privacy & Protection, Data Management). Bachelor's degree in Engineering (Computer Science or Electronics & Communications).

Qualifications
- Hands-on experience working with data (Data Profiling, Scorecards/BI).
- Previously worked in Data Governance and Data Security.
- Knowledge of Financial Services products and applications.
- Working knowledge across Excel, SQL, Python, Collibra, Power BI, and Cloud.
- Plus: Collibra Developer, Ranger Certified, or a similar certification is preferred.

Skills required
- Knowledge about data and the compliance/regulatory environment (global and local data regulations).
- Demonstrates flexibility and willingness to accept assignments and challenges in a rapidly changing environment.
- Understands how data is used (e.g., Analytics, Business Intelligence, etc.).
- Working knowledge of the data lifecycle and data transformations/data lineage.
- At least 2 of the following: Data Quality, Data Architecture, Database Management, Data Privacy & Protection, Security of Data.
- Ability to define relevant key performance indicators (KPIs).
- Problem solving and team collaboration.
- Self-motivated and results driven.
- Project management and business analysis.
- Agile thinking.

Transversal skills
- Proficient in designing new processes and adapting Group IT processes to Wealth Management IT.
- Strong communication to collaborate across stakeholders and support change.
- Minimum 7 years of experience in Data/Tech.
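For the data profiling and scorecard work mentioned above, a first pass often looks like the short pandas sketch below. The file name, columns, and metrics are illustrative only.

```python
# Illustrative pandas profile: completeness and cardinality per column,
# the kind of raw numbers that feed a data quality scorecard.
import pandas as pd

df = pd.read_csv("client_accounts.csv")  # hypothetical extract

profile = pd.DataFrame({
    "completeness_pct": (df.notna().mean() * 100).round(2),
    "distinct_values": df.nunique(),
    "dtype": df.dtypes.astype(str),
})
print(profile.sort_values("completeness_pct"))
```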

Posted 1 week ago

Apply

15.0 - 20.0 years

20 - 25 Lacs

Indore, Hyderabad, Ahmedabad

Work from Office


What You Will Do

Role Overview
As a Data Governance Architect, you will define and lead enterprise-wide data governance strategies, design robust governance architectures, and enable seamless implementation of tools like Microsoft Purview, Informatica, and other leading data governance platforms. This is a key role bridging compliance, data quality, security, and metadata management across cloud and enterprise ecosystems.

Key Responsibilities
1. Strategy, Framework, and Operating Model
- Define governance strategies, standards, and policies for compliance and analytics readiness.
- Establish a governance operating model with clear roles and responsibilities.
- Conduct maturity assessments and lead change management efforts.
2. Metadata, Lineage & Glossary Management
- Architect technical and business metadata workflows.
- Validate end-to-end lineage across ADF, Synapse, and Power BI.
- Govern glossary approvals and term workflows.
3. Policy & Data Classification Management
- Define and enforce rules for classification, access, retention, and sharing.
- Leverage Microsoft Information Protection (MIP) for automation.
- Ensure alignment with GDPR, HIPAA, CCPA, and SOX.
4. Data Quality Governance
- Define quality KPIs, validation logic, and remediation rules.
- Build scalable frameworks embedded in pipelines and platforms.
5. Compliance, Risk & Audit Oversight
- Establish compliance standards, dashboards, and alerts.
- Enable audit readiness and reporting through governance analytics.
6. Automation & Integration
- Automate workflows using PowerShell, Azure Functions, Logic Apps, and REST APIs.
- Integrate governance into Azure Monitor, Synapse Link, Power BI, and third-party tools.

Primary Skills
- Microsoft Purview Architecture & Administration
- Data Governance Framework Design
- Metadata & Data Lineage Management (ADF, Synapse, Power BI)
- Data Quality and Compliance Governance
- Informatica / Collibra / BigID / Alation / Atlan
- PowerShell, REST APIs, Azure Functions, Logic Apps
- RBAC, Glossary Governance, Classification Policies
- MIP, Insider Risk, DLP, Compliance Reporting
- Azure Data Factory, Agile Methodologies
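For the data quality governance item above, validation logic embedded in pipelines often reduces to a small, declarative rule set. The sketch below assumes PySpark as the pipeline engine; the table name and rules are made up for illustration.

```python
# Declarative quality rules evaluated against a governed table; failure
# counts like these can feed compliance dashboards and alerts.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
df = spark.table("curated.customers")  # hypothetical table name

rules = {
    "customer_id_not_null": F.col("customer_id").isNotNull(),
    "email_contains_at": F.col("email").contains("@"),
    "country_code_is_two_chars": F.length("country_code") == 2,
}

total = df.count()
for rule_name, condition in rules.items():
    failed = df.filter(~condition).count()
    print(f"{rule_name}: {failed}/{total} rows failed")
```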

Posted 1 week ago

Apply

3.0 - 6.0 years

12 - 20 Lacs

Chennai

Remote


Develop, test, and implement Alteryx workflows and macros. Collaborate with business stakeholders to gather and analyze data requirements. Optimize existing Alteryx workflows for performance and efficiency. Troubleshoot and resolve issues.

Posted 1 week ago

Apply

10.0 - 15.0 years

25 - 27 Lacs

Mumbai

Work from Office


Preferred Qualification: BE / B.Tech / MCA / MBA
Required Qualification: BE / B.Tech / MCA / MBA

Skills:
The incumbent should possess hands-on knowledge of database design and should be able to devise data migration strategies from scratch.
- Ability to design the database E-R model vis-a-vis legacy system database models by understanding the legacy packages and procedures.
- The incumbent would function as Technical Lead for Database Migration in the transformation project and would be responsible for initiating, leading, planning, and ensuring the successful migration of legacy data as part of the transformation.
- Ability to analyse, assess, and evaluate trending technologies in DB-related tools.
- Ability to understand and troubleshoot data-related issues and suggest mitigation solutions.
- Significant prior experience in designing, building, and supporting enterprise-class database systems in the capital markets fixed income / foreign exchange / derivatives domain.
- Implement and govern the database migration strategies.
- Experience with agile development methodologies and supporting tools.
- Thorough understanding of SDLC processes, including testing methodologies; exposure to automated testing would be an added advantage.
- Good communication and presentation skills.
- Understand the legacy data model and data, and provide strategic database direction to the teams. Be responsible for maintaining staging databases.

Core competency
- Database design/development in the Capital Markets / Fixed Income / Foreign Exchange / Derivatives / Treasury domain; Information Technology / Fintech
- DB Security and Governance
- DB Performance Tuning
- IT Team Management & Delivery
- Oracle Certification (DBA Track) on Oracle 10g/11g/19c
- Hands-on experience with complex PL/SQL queries, procedures, packages, and DB administration
- Appreciation of enterprise functional database architecture in Capital Markets
- Cloud-native design, Agile methodologies
- AWS Certification

Job Purpose:
The successful candidate would join the IT Department in the Transformation Programme for Clearing and Settlement systems. The transformation programme is focused on transforming the existing applications for the clearing and settlement systems as per the new technical architecture and design. The successful candidate would be responsible for leading a database team or teams through all stages of the development life cycle while also working closely with all other project stakeholders.

Area of Operations: PL/SQL and database strategic development management; recruitment and team building.

Key Responsibility: ER modelling, project estimation and delivery tracking, DB design, DB reconciliation, DB migration, DB support (UAT and Production). Developing assessment criteria, conducting interviews, guiding the development team, problem solving, setting clear goals, and delivering through teams.

Any Other Requirement: Leadership skills with the ability to build consensus across multiple stakeholders. Would be required to work with multiple projects/teams concurrently. Team building and teamwork.
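One concrete piece of the "DB reconciliation" responsibility is comparing row counts between the legacy and migrated databases. The sketch below is a generic Python DB-API version; driver setup, credentials, and column-level checks are deliberately left out and would be project-specific.

```python
# Generic reconciliation sketch: compare per-table row counts between a
# legacy connection and a target connection (both assumed to follow the
# Python DB-API, e.g. Oracle drivers).
def reconcile_row_counts(legacy_conn, target_conn, tables):
    mismatches = []
    for table in tables:
        src = legacy_conn.cursor()
        dst = target_conn.cursor()
        src.execute(f"SELECT COUNT(*) FROM {table}")
        dst.execute(f"SELECT COUNT(*) FROM {table}")
        legacy_count = src.fetchone()[0]
        target_count = dst.fetchone()[0]
        src.close()
        dst.close()
        if legacy_count != target_count:
            mismatches.append((table, legacy_count, target_count))
    return mismatches
```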

Posted 1 week ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Mumbai

Work from Office


Position Purpose
The Data Architect supports the work of ensuring that systems are designed, upgraded, managed, decommissioned, and archived in compliance with data policy across the full data life cycle. This includes complying with the data strategy, undertaking the design of data models, and supporting the management of metadata. The Data Architect's mission includes a focus on GDPR, contributing to privacy impact assessments and the Record of Processes & Activities relating to personal data. The scope is CIB EMEA and CIB ASIA.

Responsibilities
Direct Responsibilities
- Engage with key business stakeholders to assist with establishing fundamental data governance processes.
- Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards.
- Help to identify and deploy enterprise data best practices such as data scoping, metadata standardization, data lineage, data deduplication, mapping and transformation, and business validation.
- Structure the information in the Information System (using a data modelling tool like Abacus), i.e. the way information is grouped, as well as the navigation methods and the terminology used within the Information Systems of the entity, as defined by the lead data architects.
- Create and manage data models (business flows of personal data with the processes involved) in all their forms, including conceptual models, functional database designs, message models, and others, in compliance with the data framework policy.
- Allow people to step logically through the Information System (be able to train them to use tools like Abacus).
- Contribute to and enrich the Data Architecture framework through the material collected during analysis, projects, and IT validations.
- Update all records in Abacus collected from stakeholder interviews/meetings.

Skill Area Expected
Communicating between the technical and the non-technical: Communicates effectively across organisational, technical, and political boundaries, understanding the context. Makes complex and technical information and language simple and accessible for non-technical audiences. Advocates for and communicates what a team does to create trust and authenticity, and can respond to challenge. Effectively translates and accurately communicates across technical and non-technical stakeholders, and facilitates discussions within a multidisciplinary team with potentially difficult dynamics.
Data Modelling (business flows of data in Abacus): Produces data models and understands where to use different types of data models. Understands different tools and is able to compare different data models. Able to reverse engineer a data model from a live system. Understands industry-recognized data modelling patterns and standards, and the concepts and principles of data modelling; able to produce, maintain, and update relevant data models for specific business needs.
Data Standards (rules defined to manage/maintain data): Develops and sets data standards for an organisation. Communicates the business benefit of data standards, championing and governing those standards across the organisation. Develops data standards for a specific component. Analyses where data standards have been applied or breached and undertakes an impact analysis of that breach.
Metadata Management: Understands a variety of metadata management tools. Designs and maintains the appropriate metadata repositories to enable the organization to understand its data assets. Works with metadata repositories to complete and maintain them so that information remains accurate and up to date. The objective is to manage own learning and contribute to domain knowledge building.
Turning business problems into data design: Works with business and technology stakeholders to translate business problems into data designs. Creates optimal designs through iterative processes, aligning user needs with organisational objectives and system requirements. Designs data architecture for specific business problems, aligning it to enterprise-wide standards and principles. Works within the context of a well-understood architecture and identifies appropriate patterns.

Contributing Responsibilities
The data architect is expected to apply knowledge and experience of the capability, including tools and techniques, adopting those most appropriate for the environment. The Data Architect needs knowledge of:
- The Functional & Application Architecture, Enterprise Architecture, and architecture rules and principles
- The Global Markets and/or Global Banking activities
- Market meta-models, taxonomies, and ontologies (such as FpML, CDM, ISO 20022)

Skill Area Expected
Data Communication: Uses the most appropriate medium to visualise data to tell compelling and actionable stories relevant to business goals. Presents, communicates, and disseminates data appropriately and with high impact. Able to create basic visuals and presentations.
Data Governance: Understands data governance and how it works in relation to other organisational governance structures. Participates in or delivers the assurance of a service. Understands what data governance is required and contributes to it.
Data Innovation: Recognises and exploits business opportunities to ensure more efficient and effective performance of organisations. Explores new ways of conducting business and organisational processes. Aware of opportunities for innovation with new tools and uses of data.

Technical & Behavioral Competencies
1. Able to effectively translate and accurately communicate across technical and non-technical stakeholders, and facilitate discussions within a multidisciplinary team with potentially difficult dynamics.
2. Able to create basic visuals and presentations.
3. Experience working with enterprise tools for data cataloging and data management (e.g., Abacus, Collibra, Alation).
4. Experience working with BI tools (e.g., Power BI).
5. Good understanding of Excel (formulas and functions).

Specific Qualifications
Preferred: BE/BTech, BSc-IT, BSc-Comp, MSc-IT, MSc-Comp, MCA

Skills Referential
Behavioural Skills: Communication skills - oral & written; Ability to collaborate / Teamwork; Ability to deliver / Results driven; Creativity & Innovation / Problem solving
Transversal Skills: Analytical ability; Ability to understand, explain and support change; Ability to develop and adapt a process; Ability to anticipate business / strategic evolution

Education Level: Bachelor Degree or equivalent
Experience Level: At least 10 years

Other/Specific Qualifications
1. Experience in GDPR (General Data Protection Regulation) or in Privacy by Design would be preferred.
2. DAMA certification (good to have).

Posted 1 week ago

Apply

7.0 - 12.0 years

9 - 14 Lacs

Mumbai

Work from Office


Position Purpose
The Data Architect supports the work of ensuring that systems are designed, upgraded, managed, decommissioned, and archived in compliance with data policy across the full data life cycle. This includes complying with the data strategy, undertaking the design of data models, and supporting the management of metadata. The Data Architect's mission includes a focus on GDPR, contributing to privacy impact assessments and the Record of Processes & Activities relating to personal data. The scope is CIB EMEA and CIB ASIA.

Responsibilities
Direct Responsibilities
- Engage with key business stakeholders to assist with establishing fundamental data governance processes.
- Define key data quality metrics and indicators and facilitate the development and implementation of supporting standards.
- Help to identify and deploy enterprise data best practices such as data scoping, metadata standardization, data lineage, data deduplication, mapping and transformation, and business validation.
- Structure the information in the Information System (using a data modelling tool like Abacus), i.e. the way information is grouped, as well as the navigation methods and the terminology used within the Information Systems of the entity, as defined by the lead data architects.
- Create and manage data models (business flows of personal data with the processes involved) in all their forms, including conceptual models, functional database designs, message models, and others, in compliance with the data framework policy.
- Allow people to step logically through the Information System (be able to train them to use tools like Abacus).
- Contribute to and enrich the Data Architecture framework through the material collected during analysis, projects, and IT validations.
- Update all records in Abacus collected from stakeholder interviews/meetings.

Skill Area Expected
Communicating between the technical and the non-technical: Communicates effectively across organisational, technical, and political boundaries, understanding the context. Makes complex and technical information and language simple and accessible for non-technical audiences. Advocates for and communicates what a team does to create trust and authenticity, and can respond to challenge. Effectively translates and accurately communicates across technical and non-technical stakeholders, and facilitates discussions within a multidisciplinary team with potentially difficult dynamics.
Data Modelling (business flows of data in Abacus): Produces data models and understands where to use different types of data models. Understands different tools and is able to compare different data models. Able to reverse engineer a data model from a live system. Understands industry-recognized data modelling patterns and standards, and the concepts and principles of data modelling; able to produce, maintain, and update relevant data models for specific business needs.
Data Standards (rules defined to manage/maintain data): Develops and sets data standards for an organisation. Communicates the business benefit of data standards, championing and governing those standards across the organisation. Develops data standards for a specific component. Analyses where data standards have been applied or breached and undertakes an impact analysis of that breach.
Metadata Management: Understands a variety of metadata management tools. Designs and maintains the appropriate metadata repositories to enable the organization to understand its data assets. Works with metadata repositories to complete and maintain them so that information remains accurate and up to date. The objective is to manage own learning and contribute to domain knowledge building.
Turning business problems into data design: Works with business and technology stakeholders to translate business problems into data designs. Creates optimal designs through iterative processes, aligning user needs with organisational objectives and system requirements. Designs data architecture for specific business problems, aligning it to enterprise-wide standards and principles. Works within the context of a well-understood architecture and identifies appropriate patterns.

Contributing Responsibilities
The data architect is expected to apply knowledge and experience of the capability, including tools and techniques, adopting those most appropriate for the environment. The Data Architect needs knowledge of:
- The Functional & Application Architecture, Enterprise Architecture, and architecture rules and principles
- The Global Markets and/or Global Banking activities
- Market meta-models, taxonomies, and ontologies (such as FpML, CDM, ISO 20022)

Skill Area Expected
Data Communication: Uses the most appropriate medium to visualise data to tell compelling and actionable stories relevant to business goals. Presents, communicates, and disseminates data appropriately and with high impact. Able to create basic visuals and presentations.
Data Governance: Understands data governance and how it works in relation to other organisational governance structures. Participates in or delivers the assurance of a service. Understands what data governance is required and contributes to it.
Data Innovation: Recognises and exploits business opportunities to ensure more efficient and effective performance of organisations. Explores new ways of conducting business and organisational processes. Aware of opportunities for innovation with new tools and uses of data.

Technical & Behavioral Competencies
1. Able to effectively translate and accurately communicate across technical and non-technical stakeholders, and facilitate discussions within a multidisciplinary team with potentially difficult dynamics.
2. Able to create basic visuals and presentations.
3. Experience working with enterprise tools (e.g., Abacus, Informatica, big data platforms, Collibra).
4. Experience working with BI tools (e.g., Power BI).
5. Good understanding of Excel (formulas and functions).

Specific Qualifications
Preferred: BE/BTech, BSc-IT, BSc-Comp, MSc-IT, MSc-Comp, MCA

Skills Referential
Behavioural Skills: Communication skills - oral & written; Ability to collaborate / Teamwork; Ability to deliver / Results driven; Creativity & Innovation / Problem solving
Transversal Skills: Analytical ability; Ability to understand, explain and support change; Ability to develop and adapt a process; Ability to anticipate business / strategic evolution

Education Level: Bachelor Degree or equivalent
Experience Level: At least 7 years

Other/Specific Qualifications
1. Experience in GDPR (General Data Protection Regulation) or in Privacy by Design would be preferred.
2. DAMA Certified.

Posted 1 week ago

Apply

2.0 - 6.0 years

5 - 8 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Job Description
KPI Partners is seeking an enthusiastic and skilled Data Engineer specializing in STIBO (STEP) development to join our dynamic team. As a pivotal member of our data engineering team, you will be responsible for designing, developing, and implementing data solutions that meet the needs of our clients. This role requires a strong understanding of data management principles along with technical expertise in the STIBO STEP platform.

Key Responsibilities
- Design and develop data models and solutions using STIBO STEP for effective Master Data Management (MDM).
- Collaborate with data architects, data analysts, and business stakeholders to gather requirements and translate them into technical specifications.
- Implement and maintain ETL processes for data extraction, transformation, and loading to ensure data integrity and reliability.
- Optimize data pipelines and workflows for performance and efficiency.
- Monitor data quality and implement best practices for data governance.
- Troubleshoot and resolve technical issues related to STIBO STEP development and data processes.
- Provide technical support and guidance to team members and stakeholders regarding best practices in data management.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role, with a focus on STIBO (STEP) development.
- Strong understanding of Master Data Management concepts and methodologies.
- Proficiency in data modeling and experience with ETL tools and data integration processes.
- Familiarity with database technologies such as SQL Server, Oracle, or PostgreSQL.
- Excellent problem-solving skills and the ability to work independently as well as part of a team.
- Strong communication skills to effectively collaborate with technical and non-technical stakeholders.
- Experience with data visualization tools is a plus.

What We Offer
- Competitive salary and performance-based incentives.
- Opportunity to work on innovative projects in a collaborative environment.
- Professional development and training opportunities to enhance your skills.
- A flexible work environment that promotes work-life balance.
- A vibrant company culture that values creativity and teamwork.

If you are passionate about data engineering and want to play a crucial role in shaping our clients' data strategies, we would love to hear from you! Apply now to join KPI Partners in delivering impactful data solutions. KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.

Posted 1 week ago

Apply

6.0 - 8.0 years

27 - 42 Lacs

Bengaluru

Work from Office


Job Summary
The Sr. Developer role involves designing, developing, and maintaining data integration solutions using Ab Initio (including administration) and other ETL tools. The candidate will work on data warehousing projects, ensuring efficient data processing and integration. This position requires a strong understanding of data warehousing concepts and proficiency in SQL and Unix shell scripting. The role is hybrid with no travel required.

Responsibilities
- Design and develop data integration solutions using Ab Initio tools to ensure seamless data processing and transformation.
- Collaborate with cross-functional teams to gather and analyze requirements for data warehousing projects.
- Implement ETL processes to extract, transform, and load data from various sources into data warehouses.
- Optimize SQL queries to enhance performance and ensure efficient data retrieval and manipulation.
- Utilize Unix shell scripting for automation and scheduling of data processing tasks.
- Monitor and troubleshoot data integration workflows to ensure data accuracy and integrity.
- Provide technical support and guidance to team members on data warehousing and ETL best practices.
- Conduct regular reviews of data integration processes to identify areas for improvement and implement necessary changes.
- Ensure compliance with data governance and security standards in all data integration activities.
- Collaborate with stakeholders to understand business requirements and translate them into technical specifications.
- Develop and maintain documentation for data integration processes and workflows.
- Stay updated with the latest trends and technologies in data warehousing and ETL to drive innovation.
- Contribute to the company's mission by enabling data-driven decision-making through robust data integration solutions.

Qualifications
- Demonstrated expertise in data warehousing concepts and scheduling basics to design effective data solutions.
- Strong skills in ETL and SQL to manage data extraction and transformation processes efficiently.
- Proficiency in Ab Initio GDE, Conduct>It, and Co>Operating System for advanced data integration tasks.
- Experience in Unix shell scripting to automate and streamline data workflows.
- Nice to have: domain experience in Telecom to understand industry-specific data requirements.
- Ability to work in a hybrid model, balancing remote and on-site tasks effectively.
- Strong analytical and problem-solving skills to address data integration challenges.

Certifications Required
- Ab Initio Certification
- SQL Certification

Posted 1 week ago

Apply

15.0 - 20.0 years

20 - 25 Lacs

Hyderabad

Work from Office


Job Role: Global Data Governance and Quality Lead
Job Location: Hyderabad
Work Mode: Work from office
Shift Timings: 2 PM to 11 PM

Job Overview
This role will lead and deliver the implementation and institutionalization of the Data Governance and Master Data Management (MDM) platform, along with the complementary processes, across the whole global firm. This initiative is one of the foundational fixes around underlying data management that the Global Data Strategy & Architecture team is addressing to drive standardization and simplification of data creation, management, and usage at Clifford Chance. The role will oversee an external delivery team to achieve target outcomes on time and on budget.

Who you will work with
You will work closely with data management teams, business stakeholders, IT professionals, and delivery partners to deliver on the programme objectives.

What you will do and be responsible for

Project Scoping and Planning
- Develop a multi-year project and budget plan for the execution of the Data Governance and Master Data Management (MDM) priorities, aligning the approach with the broader Global Data Strategy & Architecture team.
- Establish positive relationships with a large network of cross-functional and leadership stakeholders to drive engagement, buy-in, and collaborative working arrangements to support delivery of target outcomes.
- Work with the Global Head of Data Strategy & Architecture to select the technology platforms and external delivery partner to support the project.

Project Delivery
Provide day-to-day oversight of the selected external delivery partner, leading on the following target outcomes:
- A single source of truth for the firm's master data.
- Standardized and managed taxonomies across the firm.
- Clearly defined linkages and relationships between master and taxonomy data.
- Clearly defined categories of the firm's data, owned and managed by nominated individuals.
- A governance approach to changes to master data.
- An approach to maximise automated data capture (minimizing manual entry and defects), and its implementation.
- Implementation of the selected MDM platform.
- Embedded behavioural change in the firm around the use of data.
Work closely with the ERP/CGP Data Governance Lead to ensure that the guidance and support provided to these programmes to establish and maintain data standards, policies, and processes is in line with the firm-wide approach.

Project Management and Stakeholder Management
- Own the plan for the delivery of project outcomes, managing project and technical interdependencies and a large network of cross-functional and leadership stakeholders (e.g. business units, Technology, Legal and Compliance) to deliver and embed project outputs into BAU.
- Continuously ensure that the operational and technical outcomes of the projects align with the expected strategic and business outcomes of the overall Global Data Strategy & Architecture programme.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office


This role requires a deep understanding of data warehousing, business intelligence (BI), and data governance principles, with a strong focus on the Microsoft technology stack.
- Data Architecture: Develop and maintain the overall data architecture, including data models, data flows, and data quality standards. Design and implement data warehouses, data marts, and data lakes on the Microsoft Azure platform.
- Business Intelligence: Design and develop complex BI reports, dashboards, and scorecards using Microsoft Power BI.
- Data Engineering: Work with data engineers to implement ETL/ELT pipelines using Azure Data Factory.
- Data Governance: Establish and enforce data governance policies and standards.

Primary Skills
Experience: 15+ years of relevant experience in data warehousing, BI, and data governance. Proven track record of delivering successful data solutions on the Microsoft stack. Experience working with diverse teams and stakeholders.
Required Technical Skills: Strong proficiency in data warehousing concepts and methodologies. Expertise in Microsoft Power BI. Experience with Azure Data Factory, Azure Synapse Analytics, and Azure Databricks. Knowledge of SQL and scripting languages (Python, PowerShell). Strong understanding of data modeling and ETL/ELT processes.

Secondary Skills
Soft Skills: Excellent communication and interpersonal skills. Strong analytical and problem-solving abilities. Ability to work independently and as part of a team. Strong attention to detail and organizational skills.

Posted 1 week ago

Apply

4.0 - 9.0 years

25 - 35 Lacs

Pune, Gurugram, Bengaluru

Hybrid


Experience: 4-7 years
Skillset: IDMC (preferred) or any other data governance tool, SQL, ETL
Location: Gurgaon (preferred), Bangalore, Pune
Notice Period: Urgent

Work collaboratively with the customer's onshore team to support the following initiatives:
- Interface with business stakeholders, understand their data and analytics needs, establish requirements with technical stakeholders, and align on the delivery plan.
- Understand various data sources around asset classes, portfolios, historical performance, market trends, etc., and develop/enhance data documentation.
- Help deliver data-driven analysis and recommendations that effectively influence business decisions.
- Extract data, perform data cleansing and data quality checking tasks, and prepare data quality reports and model-ready data.

Candidate Profile:
- Over 4 years of experience in data analytics, governance, and business analysis.
- Strong understanding of data analytics and ability to derive actionable insights.
- Skilled in developing strategic project roadmaps and aligning data initiatives with business goals.
- Proactive in proposing suggestions and providing regular project updates to stakeholders.
- Hands-on experience with data governance frameworks; Collibra knowledge helpful but not mandatory.
- Strong comprehension of metadata strategies and real-world use cases.
- Excellent communication skills and ability to work across business and technical teams.
- Familiar with the technology stack: SQL, Snowflake, Power BI.
- Experience with IceDQ (a plus).
- Understanding of investment fundamentals is a valuable asset.
- Detail-oriented, self-motivated, and adept at cross-functional collaboration.

Posted 1 week ago

Apply

5.0 - 10.0 years

14 - 24 Lacs

Mumbai, Hyderabad, Bengaluru

Hybrid


Integration Specialist - Collibra

Must have:
- Understanding of data governance concepts
- Experience with source-to-target data mapping
- Collibra Data Catalog
- API-based integration programming knowledge: Python, Java

Good to have:
- Knowledge of Groovy scripts
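As a rough picture of the API-based integration work listed above, the sketch below searches assets through Collibra's Core REST API. The instance URL, credentials, and query are placeholders, and the endpoint shape is an assumption based on the publicly documented 2.0 API rather than anything stated in this listing.

```python
# Hypothetical Collibra asset lookup via the Core REST API (v2.0).
import requests

BASE = "https://yourcompany.collibra.com/rest/2.0"  # placeholder instance
session = requests.Session()
session.auth = ("svc_integration", "********")       # placeholder credentials

resp = session.get(
    f"{BASE}/assets",
    params={"name": "Customer", "nameMatchMode": "ANYWHERE"},
)
resp.raise_for_status()

for asset in resp.json().get("results", []):
    print(asset["id"], asset["name"])
```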

Posted 1 week ago

Apply

12.0 - 14.0 years

20 - 27 Lacs

Bengaluru

Work from Office


Job Description & Summary: We are seeking an experienced Senior Data Architect to lead the design and development of our data architecture, leveraging cloud-based technologies, big data processing frameworks, and DevOps practices. The ideal candidate will have a strong background in data warehousing, data pipelines, performance optimization, and collaboration with DevOps teams.

Responsibilities:
1. Design and implement end-to-end data pipelines using cloud-based services (AWS/GCP/Azure) and conventional data processing frameworks.
2. Lead the development of data architecture, ensuring scalability, security, and performance.
3. Collaborate with cross-functional teams, including DevOps, to design and implement data lakes, data warehouses, and data ingestion/extraction processes.
4. Develop and optimize data processing workflows using PySpark, Kafka, and other big data processing frameworks.
5. Ensure data quality, integrity, and security across all data pipelines and architectures.
6. Provide technical leadership and guidance to junior team members.
7. Design and implement data load strategies, data partitioning, and data storage solutions.
8. Collaborate with stakeholders to understand business requirements and develop data solutions to meet those needs.
9. Work closely with the DevOps team to ensure seamless integration of data pipelines with the overall system architecture.
10. Participate in the design and implementation of CI/CD pipelines for data workflows.

DevOps Requirements:
1. Knowledge of DevOps practices and tools, such as Jenkins, GitLab CI/CD, or Apache Airflow.
2. Experience with containerization using Docker.
3. Understanding of infrastructure as code (IaC) concepts using tools like Terraform or AWS CloudFormation.
4. Familiarity with monitoring and logging tools, such as Prometheus, Grafana, or the ELK Stack.

Requirements:
1. 12-14 years of experience (Senior Data Architect) in data architecture, data warehousing, and big data processing.
2. Strong expertise in cloud-based technologies (AWS/GCP/Azure) and data processing frameworks (PySpark, Kafka, Flink, Beam, etc.).
3. Experience with data ingestion, data extraction, data warehousing, and data lakes.
4. Strong understanding of performance optimization, data partitioning, and data storage solutions.
5. Excellent leadership and communication skills.
6. Experience with NoSQL databases is a plus.

Mandatory skill sets:
1. Experience with agile development methodologies.
2. Certification in cloud-based technologies (AWS/GCP/Azure) or data processing frameworks.
3. Experience with data governance, data quality, and data security.

Preferred skill sets: Knowledge of Agentic AI and Gen AI is an added advantage.
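To make the Kafka/PySpark pipeline work concrete, here is a small Structured Streaming sketch: ingest a topic, parse JSON events, and land them partitioned by date. The topic, schema, and storage paths are invented for the example and are not taken from this listing.

```python
# Illustrative Spark Structured Streaming pipeline: Kafka topic -> parsed
# JSON -> parquet files partitioned by event date.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

event_schema = StructType([
    StructField("order_id", StringType()),
    StructField("status", StringType()),
    StructField("event_time", TimestampType()),
])

raw = (spark.readStream.format("kafka")
       .option("kafka.bootstrap.servers", "broker:9092")
       .option("subscribe", "orders")
       .load())

events = (raw
          .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
          .select("e.*")
          .withColumn("event_date", F.to_date("event_time")))

query = (events.writeStream.format("parquet")
         .option("path", "s3://lake/curated/orders/")
         .option("checkpointLocation", "s3://lake/_checkpoints/orders/")
         .partitionBy("event_date")
         .trigger(processingTime="1 minute")
         .start())
query.awaitTermination()
```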

Posted 1 week ago

Apply

4.0 - 7.0 years

8 - 9 Lacs

Bengaluru

Work from Office


Responsibilities
- Complete PMD tasks in CS Quotations from MATCH locations.
- Maintain global PMD in SAP: MRP type, weight, dimensions, delivery time and origin, stackable flag.
- Process intercompany order requests.
- Manage information flow.
- Support SAS in PMD-related questions and tasks.

Required Candidate Profile
- Understands technical/mechanical topics: machines, spare and wear parts.
- Experience with data administration.
- Good communication in English (must); MS Office.
- Interpersonal and teamwork skills; customer focus.
- Manufacturing background only.

Perks and Benefits
Must have experience in PMD management in MNCs (manufacturing companies only).

Posted 1 week ago

Apply

5.0 - 8.0 years

5 - 12 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Detailed JD (Roles and Responsibilities)
- Understand the current data governance structure of the organization and draft a data governance charter and operating model, with roles and responsibilities at each level of the operating model.
- Create a glossary with terms and definitions, mapping between logical elements and physical elements, and simple and complex relations for mapping.
- Set up Collibra communities, domains, types, attributes, status, articulation, and workflows, and customize attribution.
- Identify and prioritize data domains for data governance based on business value and ease of implementation.
- Define roles and responsibilities governing communities and assets within the Collibra environment.
- Recommend and implement workflows to govern metadata.
- Engage with client SMEs to identify key business terms from shared documents to be showcased in Collibra as part of the business glossary.
- Identify key attributes such as definition, criticality, security classification, and purpose associated with the business terms.
- Create templates to gather the information required about business term attributes and technical metadata.
- Automate the manual data demand process by configuring and implementing workflows.
- Create end-to-end lineage in Collibra DGC based on the analysis performed and display the lineage in a visual format for business users.
- Document best practices and provide training to stakeholders.

Mandatory skills: Collibra
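For the template-creation task above, a starting point can be as simple as the sketch below: one row per candidate business term with the attribute columns the listing names (definition, criticality, security classification, purpose). The extra columns and sample row are made up, and a real template would follow the client's operating model.

```python
# Simple business-glossary intake template built with pandas; the sample
# term and the extra columns (data_domain, proposed_owner) are hypothetical.
import pandas as pd

columns = [
    "business_term", "definition", "criticality",
    "security_classification", "purpose", "data_domain", "proposed_owner",
]
template = pd.DataFrame(columns=columns)
template.loc[0] = [
    "Customer ID", "Unique identifier assigned to a customer", "High",
    "Internal", "Joins customer records across systems", "Customer", "TBD",
]
template.to_csv("business_glossary_intake_template.csv", index=False)
```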

Posted 1 week ago

Apply

6.0 - 10.0 years

22 - 25 Lacs

Mumbai, Hyderabad

Work from Office


About the role
As Master Data Management Manager, you will manage a cluster of technology platforms, continuously evaluate technology solutions, and induct an innovative technology stack to drive business excellence at ICICI Bank. You will work with cross-functional business teams to create technology solutions by leveraging digital and data capabilities and inducting new-age technologies. In this role, you will have opportunities to ideate, develop, manage, maintain, and improve our digital offerings as well as our internal tools and platforms.

Key Responsibilities
- Design and Develop: Design and develop customized MDM code based on specifications. Customize MDM using features such as extensions, additions, business proxies, rules, and services.
- Support: Design, develop, and conduct unit test cases for new releases of software components within the MDM repository.
- Be Up-to-Date: Committed to learning and expanding professional and technical knowledge in master data management processes and tools, data modeling, and data integration.

Key Qualifications & Skills
- Educational Qualification: B.E/B.Tech/M.E/M.Tech with 6 to 10 years of relevant experience in IBM InfoSphere Master Data Management.
- Expertise in IBM's MDM version 11.x product, with hands-on implementation experience, preferably in the banking domain.
- Experience with IBM MDM customization, including extensions, additions, business proxies, SDP, match-merge rules, and Event Manager.
- Support: Provide support in understanding OSGi architecture in terms of MDM customization and deployment, and how to troubleshoot failures.
- Expert Java development background with RSA/RAD, MDM Workbench, SOAP Web Services, XML, XSD, WSDL.
- Communication skills: Excellent oral and written communication skills.

Posted 1 week ago

Apply

10.0 - 13.0 years

12 - 15 Lacs

Bengaluru

Work from Office


Atos is a global leader in digital transformation and the European number one in Cloud, Cybersecurity and High-Performance Computing. The Group provides end-to-end Orchestrated Hybrid Cloud, Big Data, Business Applications and Digital Workplace solutions. The Group is the Worldwide Information Technology Partner for the Olympic & Paralympic Games and operates under the brands Atos, Atos Syntel, and Unify. Atos is an SE (Societas Europaea), listed on the CAC40 Paris stock index. The purpose of Atos is to help design the future of the information space. Its expertise and services support the development of knowledge, education and research in a multicultural approach and contribute to the development of scientific and technological excellence. Across the world, the Group enables its customers, employees, and members of society at large to live, work and develop sustainably, in a safe and secure information space.

Role Overview: The Technical Architect has expertise in Google Cloud Platform (GCP) AI technologies, including Gemini. This role involves designing and implementing cutting-edge AI solutions that drive business transformation, scalability, and efficiency.

Responsibilities:
- Architect and implement AI-driven solutions using GCP services and Gemini technologies.
- Collaborate with stakeholders to understand business requirements and translate them into scalable, secure technical solutions.
- Design efficient architectures for AI applications, ensuring compliance with industry standards and best practices.
- Optimize AI workflows for performance, reliability, and cost-effectiveness.
- Establish and enforce best practices for AI development, deployment, and monitoring.
- Provide technical leadership and mentorship to teams working on AI projects.
- Stay updated with emerging trends in AI and cloud technologies, particularly within the GCP ecosystem.
- Troubleshoot and resolve complex technical issues related to AI systems.
- Document architectural designs and decisions for future reference.

Key Technical Skills & Responsibilities
- Strategic AI leadership and strategic AI system design
- Advanced model development
- Google Cloud architecture mastery
- Data science and engineering
- Prompt engineering expertise
- Responsible AI implementation and ethical AI practices
- Cross-functional collaboration
- AI governance and compliance
- Programming mastery - Python
- Multi-agent orchestration using Vertex AI Agent Builder
- Cloud infrastructure expertise (Vertex ML, Google Cloud API management)
- Integration and interoperability
- Performance optimization

Eligibility Criteria:
- Bachelor's degree in Computer Science, Data Science, or a related field.
- Extensive experience with GCP AI services and Gemini technologies.
- Strong understanding of AI architecture, design, and optimization.
- Proficiency in programming languages such as Python and Java.
- Experience with cloud-based AI solutions is a plus.
- Familiarity with ethical AI principles and practices.
- Knowledge of data governance and compliance standards.
- Excellent problem-solving and analytical skills.
- Proven leadership and team management abilities.
- Ability to work in a fast-paced environment and manage multiple priorities.
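As a minimal sketch of working with Gemini on GCP, the snippet below calls the model through the Vertex AI Python SDK. The project ID, region, model name, and prompt are placeholders, not details from this listing.

```python
# Minimal Vertex AI / Gemini call (assumes the google-cloud-aiplatform
# package is installed and application default credentials are configured).
import vertexai
from vertexai.generative_models import GenerativeModel

vertexai.init(project="my-gcp-project", location="us-central1")

model = GenerativeModel("gemini-1.5-pro")
response = model.generate_content(
    "Summarize the data governance checks needed before an ML feature ships."
)
print(response.text)
```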

Posted 1 week ago

Apply

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

We are currently seeking a Data Privacy Specialist to join the CSC Enterprise Data Governance and Privacy team. As a Data Privacy Specialist, you will play a crucial role in ensuring the protection and compliance of personal data within our organization. You will be responsible for supporting the Global Data Privacy and Protection Office with developing, implementing, and maintaining data privacy policies and procedures to safeguard our data assets and ensure compliance with relevant regulations and standards.

Key Responsibilities:
Data Privacy Compliance: Monitor and assess the organization's compliance with data privacy laws, regulations, and standards such as GDPR, CCPA and HIPAA.
Policy Development: Develop and maintain data privacy policies, procedures, and guidelines tailored to the organization's needs and regulatory requirements.
Privacy Impact Assessments (PIAs): Conduct PIAs to identify and assess the potential privacy risks associated with new projects, systems, or processes, and recommend mitigation strategies.
Record of Processing Activities (ROPAs): Develop and maintain the Record of Processing Activities in accordance with regulatory requirements. Ensure ROPAs are updated regularly to reflect changes in data processing activities within the organization (an illustrative ROPA entry is sketched after this listing).
Data Mapping and Inventory: Maintain an inventory of data assets, including personal and sensitive data, and ensure appropriate data mapping to understand data flows and identify privacy risks.
Privacy Training and Awareness: Develop and deliver privacy training programs and awareness campaigns to educate employees about data privacy best practices and their responsibilities.
Personal Data Incident/Breach Tracking: Develop and implement procedures for responding to data privacy incidents, including breach notification requirements, and coordinate incident response efforts as needed.
Projects: Lead or represent Data Governance and Privacy as needed for project work.

Qualifications:
Bachelor's degree in a relevant field such as Information Technology, Law, or Business Administration; Master's preferred.
Privacy certification preferred (e.g., CIPM, CIPP/E).
Overall 5-7 years of experience, including 3+ years of proven Data Privacy and Protection experience.
Knowledge of Data Privacy and Protection regulations (e.g., GDPR, CCPA/CPRA) required.
Strong analytical and problem-solving skills.
Self-motivated with the ability to drive tasks to completion.
Excellent communication and interpersonal skills.
Intermediate proficiency with PC-based software programs and automated database management systems required (Excel, Access, Visio, PowerPoint).
Demonstrated process improvement, workflow, benchmarking and/or evaluation of business processes.
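As a rough illustration of the ROPA work described above — a minimal sketch in Python, with field names loosely modelled on GDPR Article 30 rather than any prescribed schema — a single processing-activity record could be captured as a small data structure:

```python
# Minimal sketch of a Record of Processing Activities (ROPA) entry as a data structure.
# Field names loosely follow GDPR Article 30; this is an illustrative model, not a
# prescribed schema.
from dataclasses import dataclass, field
from typing import List


@dataclass
class RopaEntry:
    processing_activity: str                                   # e.g. "Employee payroll processing"
    controller: str                                            # legal entity acting as controller
    purpose: str                                               # why the personal data is processed
    data_subjects: List[str] = field(default_factory=list)     # e.g. ["employees"]
    data_categories: List[str] = field(default_factory=list)   # e.g. ["bank account details"]
    recipients: List[str] = field(default_factory=list)        # internal/external recipients
    retention_period: str = ""                                  # e.g. "7 years after employment ends"
    security_measures: str = ""                                 # technical/organizational measures


entry = RopaEntry(
    processing_activity="Employee payroll processing",
    controller="Example Corp Ltd.",
    purpose="Salary payment and statutory reporting",
    data_subjects=["employees"],
    data_categories=["name", "bank account details", "salary"],
    recipients=["payroll provider"],
    retention_period="7 years after end of employment",
    security_measures="Encryption at rest, role-based access control",
)
print(entry.processing_activity, "-", entry.purpose)
```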

Posted 1 week ago

Apply

4.0 - 8.0 years

13 - 17 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: Associate Specialist - Data Engineering
Location: Bengaluru
Shift: UK Shift

About the Role:
We are seeking a skilled and experienced Data Engineer to join our team and help build, optimize, and maintain data pipelines and architectures. The ideal candidate will have deep expertise in the Microsoft data engineering ecosystem, particularly leveraging tools such as Azure Data Factory, Databricks, Synapse Analytics and Microsoft Fabric, and a strong command of SQL, Python, and Apache Spark.

Key Responsibilities:
Design, develop, and optimize scalable data pipelines and workflows using Azure Data Factory, Synapse Pipelines, and Microsoft Fabric.
Build and maintain ETL/ELT processes for ingesting structured and unstructured data from various sources.
Develop and manage data transformation logic using Databricks (PySpark/Spark SQL) and Python (a minimal PySpark sketch follows this listing).
Collaborate with data analysts, architects, and business stakeholders to understand requirements and deliver high-quality data solutions.
Ensure data quality, integrity, and governance across the data lifecycle.
Implement monitoring and alerting for data pipelines to ensure reliability and performance.
Work with Azure Synapse Analytics to build data models and enable analytics and reporting.
Utilize SQL for querying and managing large datasets efficiently.
Participate in data architecture discussions and contribute to technical design decisions.

Required Skills and Qualifications:
Proven experience in data engineering or a related field.
Strong proficiency in the Microsoft Azure data ecosystem, including:
Azure Data Factory (ADF)
Azure Synapse Analytics
Microsoft Fabric
Azure Databricks
Solid experience with Python and Apache Spark (including PySpark).
Advanced skills in SQL for data manipulation and transformation.
Experience in designing and implementing data lakes and data warehouses.
Familiarity with data governance, security, and compliance standards.
Strong analytical and problem-solving skills.
Excellent communication and collaboration abilities.

Preferred Qualifications:
Microsoft Azure certifications (e.g., Azure Data Engineer Associate).
Experience with DevOps tools and CI/CD practices in data workflows.
Knowledge of REST APIs and integration techniques.
Background in agile methodologies and working in cross-functional teams.
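For context on the transformation work above, here is a minimal sketch, assuming hypothetical mount paths, column names and an illustrative aggregation, of a Databricks-style PySpark job that reads raw data, filters it and writes a curated aggregate:

```python
# Minimal sketch of a PySpark batch transformation: read a raw zone, clean and
# aggregate, then write to a curated zone. Paths, columns and the aggregation
# are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-curation").getOrCreate()

raw = spark.read.parquet("/mnt/raw/orders")  # hypothetical raw-zone path

curated = (
    raw.filter(F.col("order_status") == "COMPLETED")          # keep completed orders only
       .withColumn("order_date", F.to_date("order_timestamp"))
       .groupBy("order_date", "region")
       .agg(
           F.sum("amount").alias("daily_revenue"),
           F.count("*").alias("order_count"),
       )
)

# Write the curated aggregate, partitioned by date, to a hypothetical curated zone.
curated.write.mode("overwrite").partitionBy("order_date").parquet("/mnt/curated/daily_revenue")
```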

Posted 1 week ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Number of Openings: 2
ECMS Request No.: 530008 & 530010
Total Years of Experience: 10+ years
Relevant Years of Experience: 8+ years
Job Description: Minimum 8 years of experience in the design, development, and deployment of large-scale, distributed environments. Development experience in ABAP, FPM, Web Dynpro, Web Services, Workflow, Fiori and BADI. Expertise with SAP MDG configurations for data modelling, UI modelling, process modelling, rules and derivations, BRF+, and replication configurations. Must have knowledge and experience of implementing multiple SAP MDG (Master Data Governance) architectures and business processes across multiple master data domains.
Mandatory Skill: SAP MDG
Desired Skills: SAP MDG
Domain: SAP MDG
Vendor Billing Rate: 12.5K to 12.7K
Precise Work Location: Offshore
BG Check: Post Onboarding
Delivery Anchor for screening, interviews and feedback: saravanan.arumugam@infosys.com
Working in shifts apart from standard daylight (to avoid confusion post onboarding): Normal Shift

Posted 1 week ago

Apply

10.0 - 15.0 years

12 - 17 Lacs

Chennai

Work from Office

Naukri logo

We are looking for a BI Architect with 13+ years of experience to lead the design and implementation of scalable BI and data architecture solutions. The role involves driving data modeling, cloud-based pipelines, migration projects, and data lake initiatives using technologies like AWS, Kafka, Spark, SQL, and Python. Experience with EDW modeling and architecture is a strong plus.

Key Responsibilities:
Design and develop scalable BI and data models to support enterprise analytics.
Lead data platform migration from legacy BI systems to modern cloud architectures.
Architect and manage data lakes, batch and streaming pipelines, and real-time integrations via Kafka and APIs (a minimal streaming-ingestion sketch follows this listing).
Support data governance, quality, and access control initiatives.
Partner with data engineers, analysts, and business stakeholders to deliver reliable, high-performing data solutions.
Contribute to architecture decisions and platform scalability planning.

Qualifications:
Should have 10-15 years of relevant experience.
10+ years in BI, data engineering, or data architecture roles.
Proficiency in SQL, Python, Apache Spark, and Kafka.
Strong hands-on experience with AWS data services (e.g., S3, Redshift, Glue, EMR).
Track record of leading data migration and modernization projects.
Solid understanding of data governance, security, and scalable pipeline design.
Excellent collaboration and communication skills.

Good to Have:
Experience with enterprise data warehouse (EDW) modeling and architecture.
Familiarity with BI tools like Power BI, Tableau, Looker, or QuickSight.
Knowledge of lakehouse, data mesh, or modern data stack concepts.
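As a rough illustration of the streaming-pipeline work above — the broker address, topic name, event schema and S3 paths are all hypothetical assumptions, and the Kafka connector package must be available on the cluster — a Spark Structured Streaming job reading events from Kafka and landing them in an S3-based data lake might look like this:

```python
# Minimal sketch of a Spark Structured Streaming job that reads events from Kafka
# and writes them to an S3-based data lake. Broker, topic, schema and paths are
# illustrative assumptions; the spark-sql-kafka connector must be on the classpath.
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-to-lake").getOrCreate()

# Hypothetical event schema for messages on the topic.
event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("customer_id", StringType()),
    StructField("amount", DoubleType()),
])

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")   # hypothetical broker
         .option("subscribe", "transactions")                 # hypothetical topic
         .load()
         .select(F.from_json(F.col("value").cast("string"), event_schema).alias("e"))
         .select("e.*")
)

query = (
    events.writeStream.format("parquet")
          .option("path", "s3a://example-data-lake/raw/transactions/")              # hypothetical bucket
          .option("checkpointLocation", "s3a://example-data-lake/checkpoints/transactions/")
          .start()
)
query.awaitTermination()
```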

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.


Featured Companies