
1949 Data Governance Jobs - Page 23

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

4.0 - 6.0 years

3 - 6 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd. | Industry: Employment Firms/Recruitment Services Firms | Experience: 4 to 6 years | Title: Data Building Tool | Ref: 6566428

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models. You will play a crucial role in shaping the data platform components.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Actively participate in and contribute to team discussions.
- Contribute solutions to work-related problems.
- Collaborate with cross-functional teams to design and implement data platform solutions.
- Develop and maintain data pipelines for efficient data processing.
- Optimize data storage and retrieval processes for improved performance.
- Implement data governance policies and ensure data quality standards are met.
- Stay updated with industry trends and best practices in data engineering.

Professional & Technical Skills:
- Must have: proficiency in Data Building Tool.
- Strong understanding of data modeling and database design principles.
- Experience in ETL processes and data integration techniques.
- Knowledge of cloud platforms and services for data storage and processing.
- Hands-on experience with data visualization tools for reporting and analysis.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Data Building Tool.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Company: Apptad Technologies Pvt Ltd. | Industry: Employment Firms/Recruitment Services Firms | Experience: 8 to 15 years | Title: Informatica MDM Developer | Ref: 6566421

MDM Architect, accountable for delivery of the MDM capabilities required by the business functions. Hands-on experience in Informatica MDM and Customer 360 implementation is a must. Certification in INFA MDM or Customer 360 is required.

Responsibilities:
- Understand specs, user stories, and product planning docs around the MDM implementation.
- Understand the Customer Master roadmap, aligning with business objectives and outcomes.
- Support creation and implementation of a data model that supports all business use cases.
- Participate in achieving a single source of truth from a master data perspective.
- Participate in the full lifecycle of complex cross-functional programs/projects with considerable impact across multiple organizations.
- Participate in adoption and implementation of best practices across key data elements and processes.
- Convert business requirements into technical specifications and decide timelines to accomplish them.
- Understand business and functional requirements documents.
- Partner with business, technology, and operations to drive data cleanup and validation efforts across various systems to achieve clean, complete master data.
- Identify data issues, perform root cause analysis, prioritize, plan remediation, and coordinate and execute remediation activities.
- Update system data documentation, metadata dictionary, and lineage in accordance with established policies.
- Help resolve critical issues and provide technical resolution.
- Participate in development and maintenance of the technologies that automate data processes.
- Design and guide developers on topics related to development and deployment of MDM processes.
- Work with users and team members at all levels on performance improvements and suggestions.
- Develop processes and tools to monitor and analyse model performance and data accuracy and quality.
- Perform detailed analysis of data management requirements from a DataOps standpoint, specifically Data Governance and Quality, across all systems, platforms, and applications.
- Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, redesigning infrastructure for greater scalability, etc.
- Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

6 - 10 Lacs

Noida

Work from Office

Company: Apptad Technologies Pvt Ltd. | Industry: Employment Firms/Recruitment Services Firms | Experience: 6 to 10 years | Title: SAP MDG | Ref: 6566190

SAP Master Data Governance (MDG) Tool 8 & 9 - Pune. 8-10 yrs experience: 30 LPA (request ID 75800-1); 6-8 yrs: 21 LPA (request ID 75799-1). Work from office, 5 days a week. Location: Pune, on Apptad's payroll.

Must-have skills:
- Proficiency in the SAP Master Data Governance (MDG) Tool.
- Strong understanding of data governance principles and practices.
- Experience with application design and configuration.
- Ability to lead cross-functional teams and manage stakeholder expectations.
- Familiarity with project management methodologies and tools.

Additional information:
- The candidate should have a minimum of 7.5 years of experience in the SAP Master Data Governance MDG Tool.
- This position is based in Pune.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

17 - 30 Lacs

Bengaluru

Work from Office

Role: Functional Data Steward Manager (Finance)
Location: Bangalore
Experience: 3+ years in data management and governance
Notice period: Immediate joiners
Mandatory skills: Data Governance & Stewardship, Master Data Management (MDM), Data Quality Management, SQL/SSIS & Data Profiling Tools, Documentation & Communication, Cross-Functional Collaboration, Audit & Compliance Support.

About the Data Governance Team: Within the Business Efficiency and Evolution function, the Data Governance team is accountable for data quality against specific business processes, ensuring that data maintenance activities are executed in line with agreed data standards, underpinned by regular audits. It manages the delivery of configurable master data additions, deletions, and amendments. The team also reviews existing functional and non-functional requirements and ensures they contain all the pertinent design and architecture components to support the deployment of key features in the form of reports using ISDS and SSIS/SQL.

The practice is looking for a Functional Data Steward (Finance) who is accountable for data quality against specific business processes and areas of Mercury functionality, ensuring that data maintenance activities are executed in line with agreed data standards, underpinned by regular audits, and who manages the delivery of configurable master data additions, deletions, and amendments.

Essential functions of the job:
- Manage the hierarchies and governance processes to ensure they are maintained, supported, and meet current business requirements, including scheduled reviews and updates.
- Maintain current documentation, including hierarchy and master data definitions, standards, and policies.
- Provide expert knowledge through research and acquired business knowledge.
- Determine and publish data quality standards and measures/KPIs for master data attributes.
- Investigate, promote, and recommend new uses of hierarchies, taxonomies, and governance processes.
- Work with project management and enterprise architecture counterparts to ensure a unified strategy around hierarchies, reference data, and vocabularies.
- Evaluate new data sources for adherence to the organization's quality standards, work with Information Architecture to identify trusted or best sources of data, and populate the Business Data Glossary with that metadata.
- Provide input and guidance regarding master data and data quality standards and processes to business and solution teams.
- Manage the master data governance processes.
- Provide functional requirements for governance workflow process development.
- Direct the development of governance workflow processes, and coordinate UAT and production roll-out.
- Communicate all new and changed hierarchies and governance processes to the primary stakeholders and the user community.
- Serve as the primary quality control analyst for all master data entities in the primary databases.
- Develop and maintain data profiling and monitoring queries, routines, and reports against all MDM to ensure a defined standard of data quality for master data attributes.
- Audit and report data quality measures/KPIs against all systems to the business units, as they pertain to the master data attributes.
- Escalate data management decisions that cannot be made easily or that are blocked by unresolved conflict.
- Report on, and support through enforcement, the agreed and signed-off policies mandated by the data governance policy.
- Identify data governance metrics and execute audits to benchmark the state of data quality, retention, security, etc., and its impact on desired business outcomes.
- Regularly expose data governance metrics via standard reporting mechanisms (for example, a data quality scorecard or dashboard).
- Engage with business leadership (key business managers, Talent leadership, etc.) to quantify and articulate the business impact of policy violations.
- Develop data quality methodologies to execute data quality improvement projects.
- Actively participate in the design and deployment of applications and data integration processes to ensure standards and controls, so that high-quality data is implemented in adherence with data governance policies.

What we are looking for in you:
- 7+ years of experience in the data management field, in a supporting role of data or information management involving data analysis, data stewardship, metadata management, data governance, and/or master data management in a large business setting.
- Experience that demonstrates the ability to work with multiple business lines to drive governance and stewardship within a global organization.
- Experience driving the value proposition and functional requirements for the functional team, and liaising with various stakeholders to ensure deliverables are met.
- Experience in change management, configuration management, or incident management would be a plus.
- Experience with major querying and data profiling tools and knowledge of database concepts preferred.
- Strong research and analytical skills, as well as data profiling, manipulation, and cleansing experience.
- Detail oriented, with strong organizational skills and the ability to work independently or as part of a team.
- Multi-tasking abilities and the ability to work in a fast-paced environment.
- Desire and aptitude to learn Master Data Management technologies and concepts.
- Able to interpret data from both technical and business perspectives.
- Understanding of master data management, data integration, and SOA methodologies, concepts, and approaches.
- Ability and flexibility to work in a virtual environment across multiple time zones.
- Responsible for driving the work done by the senior associates within the team and managing their careers.

Regards,
Daina
Infosys Recruitment Team

Posted 2 weeks ago

Apply

8.0 - 12.0 years

30 - 35 Lacs

Pune, Thiruvananthapuram

Hybrid

Should have experience in Data Analysis, Data Governance, and Business Development, leveraging agile methodologies to drive continuous improvement, value delivery, and strategic decision-making. Certified in SAFe 6.0 (Scaled Agile Framework).

Required candidate profile: Proven ability to conduct in-depth data analysis, employing tools such as Microsoft Excel, SQL, and Databricks, to extract meaningful insights and support data-driven decision-making.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

2 - 30 Lacs

Pune

Work from Office

Join Barclays as a Senior Data Engineer. At Barclays, we are building the bank of tomorrow. As a Strategy and Transformation Lead, you will build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

To be a successful Senior Data Engineer, you should have experience with:
- Hands-on work with large-scale data platforms and development of cloud solutions on the AWS data platform, with a proven track record of driving business success.
- Strong understanding of AWS and distributed computing paradigms; ability to design and develop data ingestion programs to process large data sets in batch mode using Glue, Lambda, S3, Redshift, Snowflake, and Databricks.
- Ability to develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming, and related technologies.
- Hands-on programming experience in Python and PySpark.
- Understanding of DevOps pipelines using Jenkins and GitLab; strong data modelling and data architecture concepts; well versed in project management tools and Agile methodology.
- Sound knowledge of data governance principles and tools (Alation, Glue Data Quality, mesh); capable of suggesting solution architecture for diverse technology applications.

Additional relevant skills that are highly valued:
- Experience working in the financial services industry, across Settlements and Sub-ledger functions such as PNS, Stock Record and Settlements, and PNL.
- Knowledge of the BPS, IMPACT, and Gloss products from Broadridge; creating ML models using Python, Spark, and Java.

You may be assessed on key critical skills relevant for success in the role, such as risk and controls, change and transformation, business acumen, strategic thinking, and digital and technology, as well as job-specific technical skills. This role is based in Pune.

Purpose of the role: To build and maintain the systems that collect, store, process, and analyse data, such as data pipelines, data warehouses, and data lakes, to ensure that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architectures and pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage the appropriate data volumes and velocity and adhere to the required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Vice President expectations:
- Contribute to or set strategy, drive requirements, and make recommendations for change. Plan resources, budgets, and policies; manage and maintain policies and processes; deliver continuous improvements and escalate breaches of policies or procedures.
- If managing a team, define jobs and responsibilities, plan for the department's future needs and operations, counsel employees on performance, and contribute to employee pay decisions and changes. They may also lead a number of specialists to influence the operations of a department, in alignment with strategic as well as tactical priorities, while balancing short- and long-term goals and ensuring that budgets and schedules meet corporate requirements.
- If the position has leadership responsibilities, People Leaders are expected to demonstrate a clear set of leadership behaviours to create an environment for colleagues to thrive and deliver to a consistently excellent standard. The four LEAD behaviours are: L - Listen and be authentic, E - Energise and inspire, A - Align across the enterprise, D - Develop others.
- An individual contributor will be a subject matter expert within their own discipline and will guide technical direction. They will lead collaborative, multi-year assignments, guide team members through structured assignments, and identify the need to include other areas of specialisation to complete assignments. They will train, guide, and coach less experienced specialists and provide information affecting long-term profits, organisational risks, and strategic decisions.
- Advise key stakeholders, including functional leadership teams and senior management, on functional and cross-functional areas of impact and alignment.
- Manage and mitigate risks through assessment, in support of the control and governance agenda.
- Demonstrate leadership and accountability for managing risk and strengthening controls in relation to the work your team does.
- Demonstrate comprehensive understanding of the organisation's functions to contribute to achieving the goals of the business.
- Collaborate with other areas of work and business-aligned support areas to keep up to speed with business activity and business strategies.
- Create solutions based on sophisticated analytical thought, comparing and selecting complex alternatives. In-depth analysis with interpretative thinking will be required to define problems and develop innovative solutions.
- Adopt and include the outcomes of extensive research in problem-solving processes.
- Seek out, build, and maintain trusting relationships and partnerships with internal and external stakeholders in order to accomplish key business objectives, using influencing and negotiating skills to achieve outcomes.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge and Drive, the operating manual for how we behave.

Posted 2 weeks ago

Apply

6.0 - 10.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Job Title: Informatica Architect
Job Type: Full-time, Contractor
Location: Hybrid (Bengaluru | Pune)

About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Job Summary: Join our customer's team as an Informatica Architect and play a critical role in shaping data governance, data catalog, and data quality initiatives for enterprise-level products. As a key leader, you will collaborate closely with Data & Analytics leads, ensuring the integrity, accessibility, and quality of business-critical data assets across multiple domains.

Key Responsibilities:
- Lead data governance, data catalog, and data quality efforts utilizing Informatica and other industry-leading tools.
- Design, develop, and manage data catalogs and enterprise data assets to support analytics and reporting across the organization.
- Configure and optimize Informatica CDQ and Data Quality modules, ensuring adherence to enterprise data standards and policies.
- Implement and maintain business glossaries, data domains, data lineage, and data stewardship resources for enterprise-wide use.
- Collaborate with cross-functional teams to define critical data elements, data governance rules, and quality policies for multiple data sources.
- Develop dashboards and visualizations to support data quality monitoring, compliance, and stewardship activities.
- Continuously review, assess, and enhance data definitions, catalog resources, and governance practices to stay ahead of evolving business needs.

Required Skills and Qualifications:
- Minimum 7-8 years of enterprise data integration, management, and governance experience, with proven expertise in EDW technologies.
- At least 5 years of hands-on experience with Informatica CDQ and Data Quality solutions, having executed 2+ large-scale Data Governance and Quality projects from inception to production.
- Demonstrated proficiency configuring business glossaries, policies, dashboards, and search functions within Informatica or similar platforms.
- In-depth expertise in data quality, data cataloguing, and data governance frameworks and best practices.
- Strong background in Master Data Management (MDM), ensuring oversight and control of complex product catalogs.
- Exceptional written and verbal communication skills, able to effectively engage technical and business stakeholders.
- Experience collaborating with diverse teams to deliver robust data governance and analytics solutions.

Preferred Qualifications:
- Administration and management experience with industry data catalog tools such as Collibra, Alation, or Atlan.
- Strong working knowledge of configuring user groups, permissions, data profiling, and lineage within catalog platforms.
- Hands-on experience implementing open-source data catalog tools in enterprise environments.

Posted 2 weeks ago

Apply

6.0 - 9.0 years

8 - 11 Lacs

Hyderabad

Work from Office

This role involves the development and application of engineering practice and knowledge in defining, configuring, and deploying industrial digital technologies (including but not limited to PLM and MES) for managing continuity of information across the engineering enterprise, including design, industrialization, manufacturing, and supply chain, and for managing manufacturing data.

Job Description - Grade Specific: Focus on Digital Continuity Manufacturing. Fully competent in own area. Acts as a key contributor in a more complex, critical environment. Proactively acts to understand and anticipate client needs. Manages costs and profitability for a work area. Manages own agenda to meet agreed targets. Develops plans for projects in own area. Looks beyond the immediate problem to the wider implications. Acts as a facilitator and coach, and moves teams forward.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

18 - 33 Lacs

Pune

Work from Office

Role & responsibilities:
- Analyse, manage, and maintain entity reference data to ensure accuracy, consistency, and completeness across systems.
- Collaborate with product and technology teams to design, develop, and implement new features and controls for our reference data products.
- Identify opportunities for product improvement and process optimization, providing actionable recommendations based on thorough data analysis.
- Conduct root cause analysis of data issues, propose solutions, and oversee their implementation.
- Define and translate business requirements into functional specifications for developers and QA teams.
- Evaluate and integrate new data sources, ensuring alignment with business needs and data governance standards.
- Utilize SQL for data extraction, transformation, and analysis; automate data quality checks and reporting.
- Leverage exploratory data tools (such as Alteryx, KNIME, or similar) to build workflows, analyse large datasets, and generate insights.
- Monitor data quality metrics, develop dashboards, and prepare management reports.
- Engage with stakeholders to gather requirements, document processes, and communicate findings effectively.
- Support ongoing data governance, compliance, and audit requirements within the reference data domain.

Preferred candidate profile:
- 8-10 years of relevant experience in entity reference data management, preferably within financial services, fintech, or data-centric organizations.
- Proven expertise in data analysis, data quality management, and process improvement.
- Advanced proficiency in SQL for querying, analysing, and managing large datasets.
- Hands-on experience with at least one leading data exploration or ETL tool (e.g., Alteryx, KNIME).
- Strong understanding of data governance, data standards, and regulatory compliance in the context of reference data.
- Excellent problem-solving skills, with the ability to work independently on complex data challenges.
- Strong communication skills, both verbal and written, with the ability to interact effectively with technical and business stakeholders.
- Bachelor's or Master's degree in Computer Science, Engineering, Finance, Business, or a related field.

Posted 2 weeks ago

Apply

11.0 - 16.0 years

40 - 45 Lacs

Pune

Work from Office

Role Description: This role is for a Senior Business Functional Analyst in Group Architecture and will be instrumental in establishing and maintaining bank-wide data policies, principles, standards, and tool governance. The Senior Business Functional Analyst acts as a link between the business divisions and the data solution providers to align the target data architecture with the enterprise data architecture principles and to apply agreed best practices and patterns. Group Architecture partners with each division of the bank to ensure that architecture is defined, delivered, and managed in alignment with the bank's strategy and in accordance with the organization's architectural standards.

Your key responsibilities:
- Data Architecture: Work closely with stakeholders to understand their data needs, break out business requirements into implementable building blocks, and design the solution's target architecture.
- AI/ML: Identify and support the creation of AI use cases focused on delivering the data architecture strategy and data governance tooling. Identify AI/ML use cases and architect pipelines that integrate data flows, data lineage, and data quality. Embed AI-powered data quality, detection, and metadata enrichment to accelerate data discoverability. Assist in defining and driving the data architecture standards and requirements for AI that need to be enabled and used.
- GCP Data Architecture & Migration: Strong working experience with GCP data architecture is a must (BigQuery, Dataplex, Cloud SQL, Dataflow, Apigee, Pub/Sub, ...), along with an appropriate GCP architecture-level certification. Experience in handling hybrid architectures and patterns addressing non-functional requirements such as data residency, compliance (e.g., GDPR), and security and access control. Experience in developing reusable components and reference architectures using IaC (Infrastructure as Code) platforms such as Terraform.
- Data Mesh: Proficiency in Data Mesh design strategies that embrace the decentralized nature of data ownership, with good domain knowledge to ensure that the data products developed are aligned with business goals and provide real value.
- Data Management Tooling: Assess various tools and solutions comprising data governance capabilities such as data catalogue, data modelling and design, metadata management, data quality and lineage, and fine-grained data access management. Assist in development of the medium- to long-term target state of the technologies within the data governance domain.
- Collaboration: Collaborate with stakeholders, including business leaders, project managers, and development teams, to gather requirements and translate them into technical solutions.

Your skills and experience:
- Demonstrable experience in designing and deploying AI tooling architectures and use cases.
- Extensive experience in data architecture within Financial Services.
- Strong technical knowledge of data integration patterns, batch and stream processing, data lake / data lakehouse / data warehouse / data mart, caching patterns, and policy-based fine-grained data access.
- Proven experience working on data management principles, data governance, data quality, data lineage, and data integration, with a focus on Data Mesh.
- Knowledge of data modelling concepts such as dimensional modelling and 3NF; experience with systematic, structured review of data models to enforce conformance to standards.
- High-level understanding of data management solutions, e.g. Collibra, Informatica Data Governance.
- Proficiency in data modeling and experience with different data modelling tools.
- Very good understanding of streaming and non-streaming ETL and ELT approaches for data ingest.
- Strong analytical and problem-solving skills, with the ability to identify complex business requirements and translate them into technical solutions.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Noida, Gurugram, Uttar Pradesh

Work from Office

The Role: We are seeking a skilled Data Governance and Data Quality Analyst to support our organization's data management strategy. This role is critical in establishing and maintaining data governance frameworks and ensuring the accuracy, consistency, and reliability of enterprise data. The ideal candidate will work closely with cross-functional teams to define data standards, monitor data quality metrics, and drive continuous improvement initiatives that enhance data integrity and support informed business decisions.

The Impact: As a Data Governance and Data Quality Analyst, you will not only enhance the quality of our data but also contribute to the overall success and growth of S&P Global by enabling better decision-making, improving operational processes, and fostering a culture of data excellence.

What's in it for you: Joining S&P Global as a Data Governance and Data Quality Analyst offers a unique opportunity to be part of a leading organization dedicated to data excellence. Here's what you can expect if you become a part of our team:

Professional Growth: You will have access to ongoing training and development resources to enhance your skills in data governance and quality management. S&P Global is committed to investing in your professional growth, ensuring you stay at the forefront of industry trends and best practices.

Impactful Work: Your contributions will directly influence the quality of data that drives critical business decisions at S&P Global. You will play a key role in shaping the data landscape of a global leader, making your work both meaningful and rewarding.

Collaborative Environment: You will work alongside a diverse team of professionals who are passionate about data management. Our culture promotes collaboration, innovation, and knowledge-sharing, allowing you to learn from others while also contributing your expertise.

Career Advancement Opportunities: S&P Global values internal mobility and career progression. As you demonstrate your skills and make an impact, you will have opportunities to advance your career within the organization and explore various paths in data management, analytics, and beyond.

Global Exposure: As part of a global organization, you will have the opportunity to collaborate with teams and stakeholders from around the world, gaining insights into different markets and cultures while expanding your professional network.

Posted 2 weeks ago

Apply

2.0 - 3.0 years

7 - 11 Lacs

Bengaluru

Work from Office

What is the job? We are hiring a Marketing Operations Analyst who will be an integral part of the marketing and sales team. You will be responsible for automating the day-to-day work between marketing and sales, keeping best practices in mind. You will surface actionable insights for sales and marketing through data and funnel analysis. You will also help the marketing and product teams with routine research work, including (but not limited to) keyword analysis, list building, and coordinating with vendors.

Why should you apply? If inbound marketing and a small team interest you, this is an incredible opportunity to create a significant impact on a fast-growing product. You can also extend your reach to other aspects of inbound / content marketing.

Responsibilities:
- Manage technical aspects of key marketing systems (marketing automation, CRM) used to generate, distribute, and report on leads.
- Establish and maintain scalable processes that ensure best practices in campaign and lead management.
- Create and maintain metrics reports on marketing and sales activities, effectiveness, and business impact.
- Analyze marketing and sales data to develop insights and make recommendations on areas for optimization.
- Monitor and maintain data quality within the marketing database.
- Evaluate new technologies and add-on applications to improve and optimize marketing team performance.
- Help the marketing team in list-building exercises through various channels.
- Coordinate with various marketing vendors.
- Optimize online paid campaigns.
- Help the marketing team in routine research activities such as SEO keywords and tool selection.
- Help in circulating weekly marketing reports across the company.

Requirements:
- Bachelor's degree is a must; an MBA would be a plus.
- 2-3 years of experience in marketing automation and analytics.
- Strong analytical skills and experience with reporting and data analysis.
- Proficient in marketing automation systems (e.g. HubSpot) and integrating those systems with other technologies.
- Strong drive to learn, excellent communicator, and a desire to improve processes.

Posted 2 weeks ago

Apply

2.0 - 4.0 years

0 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Skills:
- Collibra Data Quality (DQ), including rule creation, monitoring, and troubleshooting.
- Experience in application support, DevOps, or data engineering.
- Understanding of Kubernetes concepts, including pods, services, deployments, config maps, and secrets.
- Experience with Docker, Helm, and CI/CD tools (e.g., Jenkins, GitHub Actions).
- Proficiency in SQL and basic scripting (e.g., Shell, Python).
- Knowledge of JDBC/ODBC connectors, REST APIs, and enterprise data integration patterns.
- Exposure to cloud environments (AWS, Azure, or GCP) and managing DQ components in cloud-native architectures.

Posted 2 weeks ago

Apply

14.0 - 20.0 years

40 - 60 Lacs

Bengaluru

Hybrid

Role & responsibilities:
- Shape technical strategy (e.g., build vs. buy decisions, technical road-mapping) in collaboration with architects.
- Evaluate and identify appropriate technology stacks, platforms, and vendors, including web application frameworks and cloud providers, for solution development.
- Attend team ceremonies as required, in particular feature refinement and cross-team iteration reviews/demos.
- Drive the resolution of technical impediments.
- Own the 'success' of foundational enablers.
- Champion research and innovation.
- Lead in scaled agile ceremonies and activities, such as quarterly reviews, quarterly increment planning, and OKR writing.
- Collaborate with the Platform Owner in the writing and prioritization of technical capabilities and enablers.
- Present platform delivery metrics, OKR health, and platform finance status to executive audiences.
- Collaborate with other Technical Leads.
- Create and maintain the technical roadmap for in-scope products and services at the platform.

Preferred candidate profile:
- Total work experience of 14-20 years.
- B.E. / B.Tech or equivalent engineering degree; a Master's degree or equivalent experience in Marketing, Business, or Finance is an added advantage.
- 10+ years of experience in technical architecture, solution design, and platform engineering.
- Strong experience in MDM, Data Quality, and Data Governance practices, including tool stacks such as Informatica MDM SaaS, Informatica Data Quality, and Collibra, is a plus.
- Good experience with major cloud platforms and data tools in the cloud, including but not limited to AWS, Microsoft Azure, Kafka, CDC, Tableau, and data virtualization tools.
- Good experience in ETL and BI solution development and tool stacks; Informatica ETL experience is a plus.
- Good experience in data architecture, SQL, NoSQL, REST APIs, data security, and AI concepts.
- Familiarity with agile methodologies and data factory operations processes, including tools such as Confluence, Jira, and Miro.
- Strong knowledge of industry standards and regulations: a data platform owner should have knowledge of standards and regulations related to data management, such as HIPAA, PCI-DSS, and GDPR.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

15 - 20 Lacs

Pune

Work from Office

Druva Inc. is looking for a Manager, Cloud Operations to join our dynamic team and embark on a rewarding career journey.

Cloud Infrastructure Management: Design, deploy, and manage cloud infrastructure on platforms such as AWS, Azure, or Google Cloud. Configure and maintain virtual servers, storage, and networking components.
Security and Compliance: Implement and monitor security measures to protect cloud-based systems and data. Ensure compliance with industry standards and regulatory requirements.
Performance Optimization: Monitor and optimize the performance of cloud-based applications and services. Identify and resolve issues related to scalability, reliability, and efficiency.
Automation: Implement automation scripts and tools for provisioning and managing cloud resources. Streamline repetitive tasks to improve operational efficiency.
Backup and Disaster Recovery: Develop and maintain backup and disaster recovery plans for cloud-based systems. Conduct regular tests to ensure the effectiveness of recovery procedures.
Collaboration: Collaborate with other IT teams, including development, infrastructure, and security teams. Participate in cross-functional projects and initiatives.
Cost Management: Monitor and manage cloud resource costs. Implement cost optimization strategies without compromising performance or security.
Documentation: Create and maintain documentation for cloud architecture, configurations, and procedures. Provide training and knowledge transfer to team members.
Incident Response: Respond to and resolve cloud-related incidents and outages. Implement preventive measures to minimize the risk of future incidents.

Posted 2 weeks ago

Apply

0.0 - 5.0 years

4 - 9 Lacs

Noida

Remote

Identify, analyze, and interpret trends or patterns in complex data sets. Filter and "clean" data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems. Work with management to prioritize business and information needs.

Required candidate profile: Knowledge of and experience with reporting packages, databases (SQL, etc.), and programming (XML, JavaScript, or ETL frameworks). Adept at queries, report writing, and presenting findings.

Perks and benefits: Flexible work arrangements.

Posted 2 weeks ago

Apply

0.0 - 5.0 years

4 - 9 Lacs

Chennai

Remote

Identify, analyze, and interpret trends or patterns in complex data sets. Filter and "clean" data by reviewing computer reports, printouts, and performance indicators to locate and correct code problems. Work with management to prioritize business and information needs.

Required candidate profile: Knowledge of and experience with reporting packages, databases (SQL, etc.), and programming (XML, JavaScript, or ETL frameworks). Adept at queries, report writing, and presenting findings.

Perks and benefits: Flexible work arrangements.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

3 - 6 Lacs

Pune, Ahmedabad, Vadodara

Work from Office

Job Description: Design, build, and maintain scalable data pipelines on Snowflake. Experience with or knowledge of Snowpipe, Time Travel, and Fail-safe. Write and optimize SQL queries for data extraction and transformation. Develop ETL processes to integrate various data sources into Snowflake. Monitor and troubleshoot data warehouse performance issues. Implement security measures and data governance practices. Sound knowledge of Snowflake architecture. Knowledge of Fivetran is an added advantage. Collaborate with cross-functional teams to support analytical and reporting needs.

Posted 2 weeks ago

Apply

9.0 - 14.0 years

6 - 9 Lacs

Bengaluru

Work from Office

8+ years of experience as a Snowflake developer or data engineer with a focus on data warehousing and ETL. Snowflake certification(s) is a plus. Strong SQL skills and proficiency in data modeling and database design. Knowledge of cloud data warehousing concepts and best practices. Familiarity with data integration tools and technologies. Solid understanding of data governance, data security, and compliance requirements. Experience with version control systems and deployment processes. Excellent problem-solving and troubleshooting skills. Strong communication and collaboration abilities. Ability to work in an Agile or iterative development environment.

Responsibilities: Collaborate with data architects to design and develop Snowflake data models and schemas. Write complex SQL queries, stored procedures, and user-defined functions (UDFs) to support data analytics and reporting needs. Ensure SQL code follows best practices for readability and performance. Develop ETL (Extract, Transform, Load) processes to ingest data from various sources into Snowflake. Design and implement data pipelines using Snowflake features such as tasks and streams. Strong knowledge of Snowpipe and SnowSQL.

Job location: Bangalore, Chennai, Mumbai, Pune.

Posted 2 weeks ago

Apply

3.0 - 6.0 years

7 - 11 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

To design, build, and optimize scalable data pipelines and solutions using Azure Databricks and related technologies, enabling Zodiac Maritime to make faster, data-driven decisions as part of its data transformation journey. Proficiency in data integration techniques, ETL processes, and data pipeline architectures. Well versed in data quality rules, principles, and implementation.

Key result areas and activities:
- Data Pipeline Development: Design and implement robust batch and streaming data pipelines using Azure Databricks and Spark.
- Data Architecture Implementation: Apply Medallion Architecture to structure data layers (raw, enriched, curated).
- Data Quality & Governance: Ensure data accuracy, consistency, and governance using tools like Azure Purview and Unity Catalog.
- Performance Optimization: Optimize Spark jobs, Delta Lake tables, and SQL queries for efficiency and cost-effectiveness.
- Collaboration & Delivery: Work closely with analysts, architects, and business teams to deliver end-to-end data solutions.

Technical experience:
Must have:
- Hands-on experience with Azure Databricks, Delta Lake, and Data Factory.
- Proficiency in Python, PySpark, and SQL with strong query optimization skills.
- Deep understanding of Lakehouse architecture and Medallion design patterns.
- Experience building scalable ETL/ELT pipelines and data transformations.
- Familiarity with Git, CI/CD pipelines, and Agile methodologies.
Good to have:
- Knowledge of data quality frameworks and monitoring practices.
- Experience with Power BI or other data visualization tools.
- Understanding of IoT data pipelines and streaming technologies like Kafka/Event Hubs.
- Awareness of emerging technologies such as Knowledge Graphs.

Qualifications:
- Education: Likely a degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Experience: Proven hands-on experience with the Azure data stack (Databricks, Data Factory, Delta Lake); experience building scalable ETL/ELT pipelines; familiarity with data governance and DevOps practices.

Qualities:
- Strong problem-solving and analytical skills.
- Attention to detail and commitment to data quality.
- Collaborative mindset and effective communication.
- Proactive and self-driven.
- Passion for learning and staying updated with emerging data technologies.

Posted 2 weeks ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Thane

Work from Office

Job Description: You are responsible for ensuring that projects are successfully completed on time and on budget. This includes project governance, budget and timeline development, build quality, testing and operational readiness, and the completed project's readiness to go live.

Are you ready to make it happen at Mondelēz International? Join our mission to lead the future of snacking. Make it uniquely yours.

Food safety is paramount to Mondelēz. It is the foundation of our commitment to delivering high-quality, delicious products that consumers trust and enjoy. The Food Safety Data Intelligence platform is designed to capture and analyze food safety data from across our enterprise: internal plants, external manufacturers, and suppliers. Data governance is a critical process in sustaining the Food Safety Data Intelligence solution. By collaborating with the project team and quality teams in all regions, BUs, and plants, you are responsible for ensuring the correct data governance processes are implemented during and beyond the project.

How you will contribute - you will:
- Lead the data governance of the solution per the roadmap.
- Ensure data quality meets Mondelēz standards.
- Bring a good understanding of data analytics and modelling.
- Drive consistency and quality in the execution of projects using the appropriate methodology and tools.
- As required, support the implementation leads in rolling out the solution.
- Identify, assess, and mitigate project-level risks and escalate them to the project team when needed.
- Confirm the completion of the implementation in plants, the consistency of the execution, and the readiness to go live.
- Provide hyper-care and ongoing support to ensure full usage of the solution.
- Provide status updates using status reports and review meetings.

What you will bring - a desire to drive your future and accelerate your career, plus the following experience and knowledge:
- Proven learning agility.
- Excellent communication and influencing skills, able to drive effective discussions with project resources and multi-function teams.
- Excellent project planning and proactiveness in identifying risks and mitigations and managing strict timelines.
- Knowledge of relevant manufacturing processes and related technology solutions.
- Stakeholder management and the ability to influence decision making positively.
- Proven ability in building effective teams across internal and external partners.
- Strong conceptual and analytic skills; enjoys problem solving.
- Fluent in English; Spanish an advantage.

More about this role - what you need to know about this position: You will gain experience in a high-profile global project, working across functions, countries, and regions; build an in-depth understanding of the data and technical solution; and enhance your project management, change management, and communication skills.

What extra ingredients you will bring:
Education / Certifications:
Job-specific requirements:
Travel requirements: minimal local travel.
Work schedule: Preferred locations are Poland, India, Mexico, Brazil, and China. No relocation support available.

Business Unit Summary
Job Type: Temporary (Fixed Term)
Data Science - Analytics & Data Science

Posted 2 weeks ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Would you like to work with cutting-edge technologies? Are you passionate about leveraging the power of technology to effect change? Join our Global Sales Tools & Support Team.

The Central Sales Ops team is a cross-divisional business operations function. It operationalizes business architecture by designing, defining, and governing policies, processes, and tools supporting global go-to-market teams.

Partner with the best: As a Senior Business Operations Analyst, you will collaborate with sales management, divisional teams, and key stakeholders. You will understand business needs and design, deploy, and support critical business processes effectively, enabling Akamai's sales tech vision through analysis, use cases, business requirements, and implementation support.

As a Business Operations Analyst Senior, you will be responsible for:
- Partnering with sales and channels leaders to define and execute strategic projects that increase revenue, productivity, and efficiency.
- Being a thought leader and go-to expert on processes and tools, and partnering with IT to drive alignment and execute initiatives.
- Executing on the technology roadmap to continuously improve systems, processes, and user experience for sellers and partners.
- Supporting data governance by stewarding data to improve hygiene and enhance its usefulness.
- Establishing formal and informal processes for collecting feedback, requirements, and enhancement requests related to sales processes and tools.
- Leading cross-functional initiatives and collaborating closely with sales, pre-sales, marketing, and finance.

Do what you love. To be successful in this role you will:
- Possess 8 years of experience and a Bachelor's degree.
- Demonstrate progressive experience with Salesforce Sales Cloud and/or Service Cloud.
- Have experience refining, optimizing, or supporting sales processes and tools.
- Demonstrate deep knowledge of sales processes, Salesforce, and tools, along with excellent analytical, interpersonal, organizational, and critical-thinking skills.
- Be result-oriented with attention to detail, and able to build credibility, trust, and rapport with stakeholders.
- Possess excellent teamwork skills, flexibility, and the ability to handle and prioritize multiple tasks.
- Communicate information effectively to business partners and team members across the entire operations and GTM communities.
- Be able to work cross-functionally and interact effectively with all levels of the organization.

Build your career at Akamai. Our ability to shape digital life today relies on developing exceptional people like you, the kind that can turn impossible into possible. We're doing everything we can to make Akamai a great place to work: a place where you can learn, grow, and have a meaningful impact. With our company moving so fast, it's important that you're able to build new skills, explore new roles, and try out different opportunities. There are so many different ways to build your career at Akamai, and we want to support you as much as possible. We have all kinds of development opportunities available, from programs such as GROW and Mentoring, to internal events like the APEX Expo and tools such as LinkedIn Learning, all to help you expand your knowledge and experience here.

Learn more: Not sure if this job is the right match for you, or want to learn more about the job before you apply? Schedule a 15-minute exploratory call with the recruiter, who would be happy to share more details.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

37 - 45 Lacs

Hyderabad

Work from Office

Position Summary: The resource is responsible for managing BMC Defender & MainView, Cubus, Product Builder, and the HP PX Calculator, which includes implementation, configuration, maintenance, upgrades, patches, administration, and performance tuning, as well as identifying repetitive tasks and automating them.

Job Responsibilities:
- Good experience building applications on both Windows and Linux infrastructure, and able to lead projects.
- Troubleshoot application issues, identify bugs, and fix them.
- Good knowledge of performance tuning.
- Good experience in automation.
- Strong collaboration with team members; capable of coaching new team members.
- Learn new technologies based on demand.
- Participate in cross-departmental efforts and lead initiatives within the community of practice.
- Good decision-making skills; take ownership of the deliverables from the entire team.
- Able to generate and present reports to leadership.
- Coach other team members and bring them up to speed.
- Willing to work in rotational shifts.
- Good communication skills, with the ability to communicate clearly and effectively.

Knowledge, Skills and Abilities:
Education: Bachelor's degree in Computer Science, Information Systems, or a related field.
Experience: 10+ years of total experience, with at least 7+ years building applications on both Windows and Linux infrastructure and leading projects.
Skills: BMC MainView, BMC Defender, Trillium, Collibra, Linux administration, network DNS and domain setup, web server scripting, Ansible, Azure Pipelines, Python, Windows debugging, Collibra Data Governance - Edge (K3s application), BMC CorreLog SIEM Agent/Server for IBM z/OS, application debugging, Elastic, and Azure DevOps. Experience creating change tickets and working on tasks in ServiceNow. Experience with AI and the ability to use AI and related tools (GitHub Copilot) to solve complex problems.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

11 - 20 Lacs

Pune

Hybrid

A Day in the Life: Our Global Diabetes Capability Center in Pune is expanding to serve more people living with diabetes globally. Our state-of-the-art facility is dedicated to transforming diabetes management through innovative solutions and technologies that reduce the burden of living with diabetes.

Medtronic is hiring a Senior Data Governance Engineer. As a Senior Engineer, you will be a key member of our Data Governance team, defining scope, assessing current data governance capabilities, and building roadmaps to mature data governance capabilities. This role offers a dynamic opportunity to join Medtronic's Diabetes business. Medtronic has announced its intention to separate the Diabetes division to promote future growth and innovation within the business and reallocate investments and resources across Medtronic, subject to applicable information and consultation requirements. While you will start your employment with Medtronic, upon establishment of SpinCo or the transition of the Diabetes business to another company, your employment may transfer to either SpinCo or the other company, at Medtronic's discretion and subject to any applicable information and consultation requirements in your jurisdiction.

Responsibilities may include the following, and other duties may be assigned:
- Data governance strategy development: responsible for the strategic development, architectural definition, and enterprise-wide data governance and data management initiatives supporting the delivery of data as a service.
- Provide Data Governance and Data Management advisory expertise.
- Identify and evaluate metadata platform alternatives, develop the logical metadata framework architecture, and define and implement metadata maintenance processes.
- Define Data Governance operating models, roles, and responsibilities in collaboration with Medtronic requirements.
- Assess and select the Data Management tools landscape, including data profiling, data quality, metadata, MDM, rules engines, and pattern matching.
- Refine and develop the data-centric approach and methodologies, which may include areas such as data strategy, data governance, data lineage, analytics, business intelligence, data architecture, data quality, master data management, and data integration and delivery.
- Assess current-state capabilities, identify gaps versus leading practices, and recommend future-state data requirements.
- Identify opportunities for new workflows, third-party glossary/metadata tools, database architectures, and third-party pre-existing data models.
- Work closely with business and technology stakeholders to assist in understanding and documenting data requirements and lineage, and partner with IT data integration and data warehouse teams to ensure requirements are effectively executed.
- Help define data modeling naming standards, abbreviations, guidelines, and best practices.
- Enhance or design data model review processes based on business requirements.

Minimum qualifications:
- At least 5 years of experience developing and structuring an enterprise-wide data governance organization and business process (operating models, roles, partner organizations, responsibilities).
- Hands-on experience, on both the business side and the IT side, implementing or supporting MDM and/or data warehouse and reporting IT solutions.
- Strong business knowledge of the investment management industry and common data management operations.
- Broad understanding of the role of data management within global markets organizations, information flow, and data governance issues.
- Domain expertise in specific areas of data management such as data strategy, data governance, data lineage, analytics, business intelligence, data architecture, data quality, master data management, and data integration and delivery.

Note: Immediate joiners or candidates serving their notice period are preferred. If interested, please share your updated CV at ashwini.ukekar@medtronic.com.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

18 - 27 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

Job Description:
- Design, implement, and maintain data pipelines and data integration solutions using Azure Synapse.
- Develop and optimize data models and data storage solutions on Azure.
- Collaborate with data scientists and analysts to implement data processing and data transformation tasks.
- Ensure data quality and integrity through data validation and cleansing methodologies.
- Monitor and troubleshoot data pipelines to identify and resolve performance issues.
- Collaborate with cross-functional teams to understand and prioritize data requirements.
- Stay up to date with the latest trends and technologies in data engineering and Azure services.

Skills & Qualifications:
- Bachelor's degree in IT, computer science, computer engineering, or similar.
- 8+ years of experience in data engineering.
- Microsoft Azure Synapse Analytics experience is essential (Azure Data Factory, Dedicated SQL Pool, Lake Database, Azure Storage).
- Hands-on experience with Spark notebooks (Python or Scala) is mandatory.
- End-to-end data warehouse experience: ingestion, ETL, big data pipelines, data architecture, message queuing, BI/reporting, and data security.
- Advanced SQL/relational database knowledge and query authoring.
- Demonstrated experience in designing and delivering data platforms for Business Intelligence and Data Warehousing.
- Strong skills in handling and analysing complex, high-volume data with excellent attention to detail.
- Knowledge of data modelling and data warehousing concepts, such as Data Vault or 3NF.
- Experience with data governance (quality, lineage, data dictionary, and security).
- Familiarity with Agile methodology and Agile working environments.
- Ability to work independently with POs, BAs, and Architects.

Posted 2 weeks ago

Apply