1.0 - 3.0 years
8 - 11 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: We are seeking an experienced Data Governance and Catalog Specialist with expertise in data access, cataloging, and governance. You will ensure data integrity, collaborate with stakeholders, and implement efficient data processes using tools like Informatica, Alation, Atlan, or Collibra. This role requires strong technical expertise, problem-solving skills, and the ability to work in a dynamic environment.
Key Responsibilities:
- Lead data governance, cataloging, and access management initiatives.
- Develop and manage enterprise data catalogs for analytics and reporting.
- Configure data governance resources, business glossaries, policies, and dashboards.
- Implement critical data elements, data quality rules, and governance policies.
- Administer data catalog tools, configure data profiling, and manage lineage.
- Collaborate with data owners, stewards, and stakeholders to refine governance frameworks.
- Ensure compliance with data governance standards and best practices.
- Optimize data access processes to improve efficiency and security.
Qualifications:
- 1 to 3 years of experience in data integration, management, and governance.
- Hands-on experience with Collibra or similar tools.
- Experience executing large-scale data governance projects.
- Strong knowledge of metadata management and regulatory compliance.
- Familiarity with data lineage, data quality management, and critical data elements.
- Ability to troubleshoot data-related issues and implement solutions.
- Strong communication and collaboration skills to work with cross-functional teams.
Posted 9 hours ago
7.0 - 11.0 years
11 - 21 Lacs
Pune
Work from Office
Our Global Diabetes Capability Center in Pune is expanding to serve more people living with diabetes globally. Our state-of-the-art facility is dedicated to transforming diabetes management through innovative solutions and technologies that reduce the burden of living with diabetes. Medtronic is hiring a Senior Data Governance Engineer. As a Senior Engineer, you will be a key member of our Data Governance team, defining scope, assessing current data governance capabilities, and building roadmaps to mature those capabilities. This role offers a dynamic opportunity to join Medtronic's Diabetes business. Medtronic has announced its intention to separate the Diabetes division to promote future growth and innovation within the business and reallocate investments and resources across Medtronic, subject to applicable information and consultation requirements. While you will start your employment with Medtronic, upon establishment of SpinCo or the transition of the Diabetes business to another company, your employment may transfer to either SpinCo or the other company, at Medtronic's discretion and subject to any applicable information and consultation requirements in your jurisdiction.
Responsibilities may include the following, and other duties may be assigned:
- Data Governance Strategy Development: lead the strategic development, architectural definition, and execution of enterprise-wide data governance and data management initiatives supporting the delivery of data as a service.
- Provide Data Governance and Data Management advisory expertise.
- Identify and evaluate metadata platform alternatives, develop the logical metadata framework architecture, and define and implement metadata maintenance processes.
- Define Data Governance operating models, roles, and responsibilities in collaboration with Medtronic requirements.
- Assess and select the Data Management tools landscape, including data profiling, data quality, metadata, MDM, rules engines, and pattern matching.
- Refine and develop the data-centric approach and methodologies, which may span data strategy, data governance, data lineage, analytics, business intelligence, data architecture, data quality, master data management, and data integration and delivery.
- Assess current-state capabilities, identify gaps versus leading practices, and recommend future-state data requirements.
- Identify opportunities for new workflows, third-party glossary/metadata tools, database architectures, and third-party pre-existing data models.
- Work closely with business and technology stakeholders to understand and document data requirements and lineage, partnering with IT data integration and data warehouse teams to ensure requirements are effectively executed.
- Help define data modeling naming standards, abbreviations, guidelines, and best practices.
- Enhance or design data model review processes based on business requirements.
Required Knowledge and Experience:
- At least 5 years of experience developing/structuring an enterprise-wide data governance organization and business process (operating models, roles, partner organizations, responsibilities).
- Hands-on experience on both the business and IT sides implementing or supporting MDM and/or Data Warehouse and Reporting solutions.
- Strong business knowledge of the investment management industry and common data management operations.
- Broad understanding of the role of data management within global markets organizations, information flow, and data governance issues.
- Domain expertise in specific areas of Data Management such as data strategy, data governance, data lineage, analytics, business intelligence, data architecture, data quality, master data management, and data integration and delivery.
Posted 1 day ago
10.0 - 15.0 years
55 - 60 Lacs
Hyderabad, Bengaluru, Delhi / NCR
Work from Office
Position Overview: We are seeking an experienced Data Catalog Lead to drive the implementation and ongoing development of an enterprise data catalog using Collibra. This role focuses specifically on healthcare payer industry requirements, including complex regulatory compliance, member data privacy, and the multi-system data integration challenges unique to health plan operations.
Key Responsibilities:
Data Catalog Implementation & Development
- Configure and customize Collibra workflows, data models, and governance processes to support health plan business requirements.
- Develop automated data discovery and cataloging processes for healthcare data assets including claims, eligibility, provider networks, and member information.
- Design and implement data lineage tracking across complex healthcare data ecosystems spanning core administration systems, data warehouses, and analytics platforms.
Healthcare-Specific Data Governance
- Build specialized data catalog structures for healthcare data domains including medical coding systems (ICD-10, CPT, HCPCS), pharmacy data (NDC codes), and provider taxonomies.
- Configure data classification and sensitivity tagging for PHI (Protected Health Information) and PII data elements in compliance with HIPAA requirements.
- Implement data retention and privacy policies within Collibra that align with healthcare regulatory requirements and member consent management.
- Develop metadata management processes for regulatory reporting datasets (HEDIS, Medicare Stars, MLR reporting, risk adjustment).
Technical Integration & Automation
- Integrate Collibra with healthcare payer core systems including claims processing platforms, eligibility systems, provider directories, and clinical data repositories.
- Implement automated data quality monitoring and profiling processes that populate the data catalog with technical and business metadata.
- Configure Collibra's REST APIs to enable integration with existing data governance tools and business intelligence platforms.
Required Qualifications:
Collibra Platform Expertise
- 8+ years of hands-on experience with Collibra Data Intelligence Cloud platform implementation and administration.
- Expert knowledge of Collibra's data catalog, data lineage, and data governance capabilities.
- Proficiency in Collibra workflow configuration, custom attribute development, and role-based access control setup.
- Experience with Collibra Connect for automated metadata harvesting and system integration.
- Strong understanding of Collibra's REST APIs and custom development capabilities.
Healthcare Payer Industry Knowledge
- 4+ years of experience working with healthcare payer/health plan data environments.
- Deep understanding of healthcare data types including claims (professional, institutional, pharmacy), eligibility, provider data, and member demographics.
- Knowledge of healthcare industry standards including HL7, X12 EDI transactions, and FHIR specifications.
- Familiarity with healthcare regulatory requirements (HIPAA, ACA, Medicare Advantage, Medicaid managed care).
- Understanding of healthcare coding systems (ICD-10-CM/PCS, CPT, HCPCS, NDC, SNOMED CT).
Technical Skills
- Strong SQL skills and experience with healthcare databases (claims databases, clinical data repositories, member systems).
- Knowledge of cloud platforms (AWS, Azure, GCP) and their integration with Collibra cloud services.
- Understanding of data modeling principles and healthcare data warehouse design patterns.
Data Governance & Compliance
- Experience implementing data governance frameworks in regulated healthcare environments.
- Knowledge of data privacy regulations (HIPAA, state privacy laws) and their implementation in data catalog tools.
- Understanding of data classification, data quality management, and master data management principles.
- Experience with audit trail requirements and compliance reporting in healthcare organizations.
Preferred Qualifications:
Advanced Healthcare Experience
- Experience with specific health plan core systems (such as HealthEdge, Facets, QNXT, or similar platforms).
- Knowledge of Medicare Advantage, Medicaid managed care, or commercial health plan operations.
- Understanding of value-based care arrangements and their data requirements.
- Experience with clinical data integration and population health analytics.
Technical Certifications & Skills
- Collibra certification (Data Citizen, Data Steward, or Technical User).
- Experience with additional data catalog tools (Alation, Apache Atlas, IBM Watson Knowledge Catalog).
- Knowledge of data virtualization tools and their integration with data catalog platforms.
- Experience with healthcare interoperability standards and API management.
Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
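As a loose illustration of the REST-based integration work this role describes, here is a minimal sketch of building an asset-search URL against Collibra's documented v2 REST API. The /rest/2.0/assets path and the nameMatchMode parameter follow Collibra's public API, but the host and the search values are hypothetical placeholders, and a real integration would add authentication and paging.

```python
# Sketch: building (not sending) an asset-search URL for Collibra's v2 REST API.
# The host is a hypothetical placeholder; real calls need auth headers/session.
from urllib.parse import urlencode

BASE_URL = "https://example.collibra.com"  # hypothetical Collibra instance

def asset_search_url(name_fragment, limit=10):
    """Build the GET URL that searches catalog assets by partial name."""
    params = {"name": name_fragment, "nameMatchMode": "ANYWHERE", "limit": limit}
    return f"{BASE_URL}/rest/2.0/assets?{urlencode(params)}"

print(asset_search_url("claims"))
```

A downstream governance tool would issue this GET with a Collibra session or basic-auth header and page through the returned asset list.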
Posted 1 day ago
5.0 - 10.0 years
20 - 25 Lacs
Bengaluru
Work from Office
The Platform Data Engineer will be responsible for designing and implementing robust data platform architectures, integrating diverse data technologies, and ensuring scalability, reliability, performance, and security across the platform. The role involves setting up and managing infrastructure for data pipelines, storage, and processing; developing internal tools to enhance platform usability; implementing monitoring and observability; collaborating with software engineering teams for seamless integration; and driving capacity planning and cost optimization initiatives.
Posted 2 days ago
5.0 - 8.0 years
8 - 16 Lacs
Gandhinagar, Ahmedabad
Work from Office
About TELUS Digital: TELUS Digital (NYSE and TSX: TIXT) designs, builds, and delivers next-generation digital solutions to enhance the customer experience (CX) for global and disruptive brands. The company's services support the full lifecycle of its clients' digital transformation journeys and enable them to more quickly embrace next-generation digital technologies to deliver better business outcomes. TELUS Digital's integrated solutions and capabilities span digital strategy, innovation, consulting and design, digital transformation and IT lifecycle solutions, data annotation, intelligent automation, and omnichannel CX solutions that include content moderation, trust and safety solutions, and other managed solutions. Fueling all stages of company growth, TELUS Digital partners with brands across high-growth industry verticals, including tech and games, communications and media, eCommerce and fintech, healthcare, and travel and hospitality. Learn more at: telusinternational.com.
Position Overview: We have an ambitious Enterprise Data Office and are building a class-leading data team that works to solve complex business challenges and provide insights to improve our business and customer experience. To enhance the team, we are looking for an innovative and enterprising Data Governance Engineer to play a critical role in shaping and implementing our enterprise-wide data governance and data management roadmap and strategy. The Data Governance Engineer will promote 'data is an asset' thinking across the enterprise, integrating the Data Governance tool implementations of the Collibra Data Intelligence Platform, Collibra Data Quality, Google Cloud data-related tools, and Informatica MDM/RDM. This person will work with internal partners and developers to brainstorm and evaluate technical solutions, product integration opportunities, and demonstrations.
This role requires creative thinking, a deep curiosity and understanding of data models and usage, as well as empathy for partner/client challenges and pain points. Essential Responsibilities:
Posted 2 weeks ago
8.0 - 10.0 years
10 - 12 Lacs
Hyderabad
Work from Office
ABOUT THE ROLE
Role Description: We are seeking a highly skilled and experienced hands-on Test Automation Engineering Manager with deep expertise in Data Quality (DQ), Data Integration (DIF), and Data Governance. In this role, you will design and implement automated frameworks that ensure data accuracy, metadata consistency, and compliance throughout the data pipeline, leveraging technologies like Databricks, AWS, and cloud-native tools. You will have a major focus on Data Cataloguing and Governance, ensuring that data assets are well-documented, auditable, and secure across the enterprise. You will be responsible for the end-to-end design and development of a test automation framework, working collaboratively with the team. As the delivery owner for test automation, your primary focus will be on building and automating comprehensive validation frameworks for data cataloging, data classification, and metadata tracking, while ensuring alignment with internal governance standards. You will also work closely with data engineers, product teams, and data governance leads to enforce data quality and governance policies. Your efforts will play a key role in driving data integrity, consistency, and trust across the organization. The role is highly technical and hands-on, with a strong focus on automation, metadata validation, and ensuring data governance practices are seamlessly integrated into development pipelines.
Roles & Responsibilities:
Data Quality & Integration Frameworks
- Design and implement Data Quality (DQ) frameworks that validate schema compliance, transformations, completeness, null checks, duplicates, threshold rules, and referential integrity.
- Build Data Integration Frameworks (DIF) that validate end-to-end data pipelines across ingestion, processing, storage, and consumption layers.
- Automate data validations in Databricks/Spark pipelines, integrated with AWS services like S3, Glue, Athena, and Lake Formation.
- Develop modular, reusable validation components using PySpark, SQL, and Python, orchestrated via CI/CD pipelines.
Data Cataloging & Governance
- Integrate automated validations with the AWS Glue Data Catalog to ensure metadata consistency, schema versioning, and lineage tracking.
- Implement checks to verify that data assets are properly cataloged, discoverable, and compliant with internal governance standards.
- Validate and enforce data classification, tagging, and access controls, ensuring alignment with data governance frameworks (e.g., PII/PHI tagging, role-based access policies).
- Collaborate with governance teams to automate policy enforcement and compliance checks for audit and regulatory needs.
Visualization & UI Testing
- Automate validation of data visualizations in tools like Tableau, Power BI, Looker, or custom React dashboards.
- Ensure charts, KPIs, filters, and dynamic views correctly reflect backend data using UI automation (Selenium with Python) and backend validation logic.
- Conduct API testing (via Postman or Python test suites) to ensure accurate data delivery to visualization layers.
Technical Skills and Tools
- Hands-on experience with data automation tools like Databricks and AWS is essential, as the manager will be instrumental in building and managing data pipelines.
- Leverage automated testing frameworks and containerization tools to streamline processes and improve efficiency.
- Experience in UI and API functional validation using tools such as Selenium with Python and Postman, ensuring comprehensive testing coverage.
Technical Leadership, Strategy & Team Collaboration
- Define and drive the overall QA and testing strategy for UI and search-related components with a focus on scalability, reliability, and performance, while establishing alerting and reporting mechanisms for test failures, data anomalies, and governance violations.
- Contribute to system architecture and design discussions, bringing a strong quality and testability lens early into the development lifecycle.
- Lead test automation initiatives by implementing best practices and scalable frameworks, embedding test suites into CI/CD pipelines to enable automated, continuous validation of data workflows, catalog changes, and visualization updates.
- Mentor and guide QA engineers, fostering a collaborative, growth-oriented culture focused on continuous learning and technical excellence.
- Collaborate cross-functionally with product managers, developers, and DevOps to align quality efforts with business goals and release timelines.
- Conduct code reviews, test plan reviews, and pair-testing sessions to ensure team-level consistency and high-quality standards.
Good-to-Have Skills:
- Experience with data governance tools such as Apache Atlas, Collibra, or Alation
- Understanding of DataOps methodologies and practices
- Familiarity with monitoring/observability tools such as Datadog, Prometheus, or CloudWatch
- Experience building or maintaining test data generators
- Contributions to internal quality dashboards or data observability systems
- Awareness of metadata-driven testing approaches and lineage-based validations
- Experience working with agile testing methodologies such as Scaled Agile
- Familiarity with automated testing frameworks like Selenium, JUnit, TestNG, or PyTest
Must-Have Skills:
- Strong hands-on experience with Data Quality (DQ) framework design and automation
- Expertise in PySpark, Python, and SQL for data validations
- Solid understanding of ETL/ELT pipeline testing in Databricks or Apache Spark environments
- Experience validating structured and semi-structured data formats (e.g., Parquet, JSON, Avro)
- Deep familiarity with AWS data services: S3, Glue, Athena, Lake Formation, Data Catalog
- Integration of test automation with the AWS Glue Data Catalog or similar catalog tools
- UI automation using Selenium with Python for dashboard and web interface validation
- API testing using Postman, Python, or custom API test scripts
- Hands-on testing of BI tools such as Tableau, Power BI, Looker, or custom visualization layers
- CI/CD test integration with tools like Jenkins, GitHub Actions, or GitLab CI
- Familiarity with containerized environments (e.g., Docker, AWS ECS/EKS)
- Knowledge of data classification, access control validation, and PII/PHI tagging
- Understanding of data governance standards (e.g., GDPR, HIPAA, CCPA)
- Understanding of data structures: knowledge of various data structures and their applications, and the ability to analyze data and identify inconsistencies
- Proven hands-on experience in test automation and data automation using Databricks and AWS
- Strong knowledge of Data Integrity Frameworks (DIF) and Data Quality (DQ) principles
- Strong understanding of data transformation techniques and logic
Education and Professional Certifications:
- Bachelor's degree in Computer Science and Engineering preferred; other Engineering fields considered. Master's degree and 6+ years of experience, or Bachelor's degree and 8+ years.
Soft Skills:
- Excellent analytical and troubleshooting skills
- Strong verbal and written communication skills
- Ability to work effectively with global, virtual teams
- High degree of initiative and self-motivation
- Ability to manage multiple priorities successfully
- Team-oriented, with a focus on achieving team goals
- Strong presentation and public speaking skills
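The DQ framework duties described in this posting (null checks, duplicate detection, threshold rules) can be sketched as plain rule functions. This is a Spark-free illustration of the rule logic only; in the actual role these checks would run as PySpark jobs over Databricks tables, and the "claim_id"/"amount" columns and the amount threshold here are illustrative assumptions.

```python
# Minimal sketch of DQ rule checks: completeness (null key), uniqueness
# (duplicate key), and a threshold rule on a numeric column.
# Column names and the 1_000_000 ceiling are illustrative, not from any spec.
def run_dq_checks(rows, key="claim_id", amount_col="amount", max_amount=1_000_000):
    failures = []
    seen = set()
    for i, row in enumerate(rows):
        if row.get(key) is None:
            failures.append((i, "null_key"))           # completeness check
        elif row[key] in seen:
            failures.append((i, "duplicate_key"))      # uniqueness check
        else:
            seen.add(row[key])
        amt = row.get(amount_col)
        if amt is not None and not (0 <= amt <= max_amount):
            failures.append((i, "amount_out_of_range"))  # threshold rule
    return failures

sample = [
    {"claim_id": "C1", "amount": 120.0},
    {"claim_id": None, "amount": 50.0},
    {"claim_id": "C1", "amount": -5.0},
]
print(run_dq_checks(sample))
```

In a real framework each rule would be a reusable component emitting metrics to an alerting/reporting layer rather than an in-memory list.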
Posted 2 weeks ago
7.0 - 12.0 years
13 - 22 Lacs
Chennai, Bengaluru
Work from Office
Talend Developer: Data Warehouse/BI, Data Warehouse implementation, Unit Testing, troubleshooting, ETL (Talend, DataStage), Data Catalog, cloud databases (Snowflake), developing Data Marts, Data Warehousing, Operational Data Store, DWH concepts, Performance Tuning, Query
Posted 3 weeks ago
5.0 - 10.0 years
18 - 30 Lacs
Hyderabad
Hybrid
We are seeking a skilled and experienced Collibra Developer to support and enhance our data governance and metadata management capabilities. The ideal candidate will be responsible for designing, developing, implementing, and maintaining Collibra solutions, integrating with various enterprise data systems, and ensuring alignment with data governance standards and business requirements.
Key Responsibilities:
- Design and configure Collibra Data Intelligence Cloud solutions (Data Catalog, Data Governance, Lineage, Privacy).
- Develop and maintain workflows using Collibra Workflow Designer (BPMN).
- Integrate Collibra with enterprise systems (ETL tools, BI tools, data lakes/warehouses) via APIs, JDBC, or other connectors.
- Define and maintain data domains, data dictionaries, business glossaries, and data stewardship roles.
Required Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 5-7+ years of experience working with the Collibra Data Intelligence Platform.
- Hands-on experience in Collibra Administration, Workflow Development (BPMN), and DGC configuration.
Interested candidates, share your CV at himani.girnar@alikethoughts.com with the details below: Candidate's name, Email and alternate email ID, Contact and alternate contact no., Total experience, Relevant experience, Current organization, Notice period, CCTC, ECTC, Current location, Preferred location, PAN card no.
Posted 3 weeks ago
3.0 - 8.0 years
0 - 1 Lacs
Thane, Hyderabad, Navi Mumbai
Work from Office
Role & Responsibilities:
- Lead the end-to-end implementation of a data cataloging solution within AWS (preferably the AWS Glue Data Catalog, or third-party tools like Apache Atlas, Alation, Collibra, etc.).
- Establish and manage metadata frameworks for structured and unstructured data assets in data lake and data warehouse environments.
- Integrate the data catalog with AWS-based storage solutions such as S3, Redshift, Athena, Glue, and EMR.
- Collaborate with data governance/BPRG/IT project teams to define metadata standards, data classifications, and stewardship processes.
- Develop automation scripts for catalog ingestion, lineage tracking, and metadata updates using Python, Lambda, PySpark, or custom Glue/EMR jobs.
- Work closely with data engineers, data architects, and analysts to ensure metadata is accurate, relevant, and up to date.
- Implement role-based access controls and ensure compliance with data privacy and regulatory standards.
- Create detailed documentation and deliver training/workshops for internal stakeholders on using the data catalog.
Preferred candidate profile:
- AWS certifications (e.g., AWS Certified Data Analytics, AWS Solutions Architect).
- Experience with data catalog tools like Alation, Collibra, or Informatica EDC, or hands-on experience with open-source tools.
- Exposure to data quality frameworks and stewardship practices.
- Knowledge of data migration with data catalogs and data marts is a plus.
- 4 to 8+ years of experience in data engineering or metadata management roles.
- Proven expertise in implementing and managing data catalog solutions within AWS environments.
- Strong knowledge of AWS Glue, S3, Athena, Redshift, EMR, Data Catalog, and Lake Formation.
- Hands-on experience with metadata ingestion, data lineage, and classification processes.
- Proficiency in Python, SQL, and automation scripting for metadata pipelines.
- Familiarity with data governance and compliance standards (e.g., GDPR, RBI guidelines).
- Experience integrating with BI tools (e.g., Tableau, Power BI) and third-party catalog tools is a plus.
- Strong communication, problem-solving, and stakeholder management skills.
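The catalog-automation work in this posting often boils down to checks over catalog metadata. Here is a minimal sketch of a metadata-completeness check; the dict loosely mirrors the shape of a Glue Data Catalog table entry (as returned by boto3's glue.get_table), but the table, the required parameters, and the governance rules are illustrative assumptions, not an AWS specification.

```python
# Sketch: flag catalog entries that are missing governance metadata.
# The required parameter keys ("classification", "owner") are assumed
# conventions for this example, and the table entry is made up.
REQUIRED_PARAMS = {"classification", "owner"}

def missing_metadata(table):
    """Return a list of problems found in one catalog table entry."""
    problems = []
    params = table.get("Parameters", {})
    for key in sorted(REQUIRED_PARAMS - params.keys()):
        problems.append(f"missing parameter: {key}")
    for col in table.get("StorageDescriptor", {}).get("Columns", []):
        if not col.get("Comment"):
            problems.append(f"undocumented column: {col['Name']}")
    return problems

example_table = {
    "Name": "claims_raw",
    "Parameters": {"classification": "parquet"},
    "StorageDescriptor": {"Columns": [
        {"Name": "claim_id", "Type": "string", "Comment": "primary key"},
        {"Name": "amount", "Type": "double"},
    ]},
}
print(missing_metadata(example_table))
```

A Lambda or Glue job could run this over every table returned by the catalog API and route the findings to stewards.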
Posted 3 weeks ago
7.0 - 12.0 years
18 - 20 Lacs
Hyderabad
Work from Office
We are hiring a Data Governance Analyst (Level 3) for a US-based IT company in Hyderabad. Candidates with experience in Data Governance can apply.
Job Title: Data Governance Analyst Level 3
Location: Hyderabad
Experience: 7+ Years
CTC: 18 LPA - 20 LPA
Working shift: Day shift
Description: We are seeking a seasoned and detail-focused Senior Data Governance Analyst (Level 3) to support and drive enterprise-wide data governance initiatives. This position will play a crucial role in implementing and managing data governance frameworks, ensuring data quality, and supporting regulatory compliance efforts across business units. The ideal candidate will bring deep expertise in data governance best practices, data quality management, metadata, and compliance standards within the financial services industry. As a senior team member, the analyst will collaborate closely with data stewards, business stakeholders, and technical teams to ensure consistent, accurate, and trusted use of enterprise data.
Key Responsibilities:
- Implement and enhance data governance policies, standards, and processes across the organization
- Partner with business and technical teams to define and manage data ownership, stewardship, and accountability models
- Maintain and improve metadata and data lineage documentation using tools such as Collibra, Alation, or similar platforms
- Monitor key data quality metrics, conduct root cause analysis, and lead issue resolution efforts
- Ensure compliance with regulatory data requirements (e.g., BCBS 239, GDPR, CCPA)
- Facilitate and lead data governance meetings, working groups, and stakeholder communications
- Support the creation and deployment of data literacy initiatives across the enterprise
- Document governance practices and develop reports for audits and executive leadership
- Serve as a subject matter expert in data governance and promote data management best practices across departments
Required Skills & Qualifications:
- 5+ years of experience in Data Governance, Data Quality, or Data Management roles
- Proven experience in developing and managing data governance frameworks in complex organizational environments
- Strong understanding of data quality principles, data standards, and issue management workflows
- Experience with metadata management, data cataloging, and lineage tracking
- Proficiency with governance tools like Collibra, Alation, or similar platforms
- Solid grasp of data compliance and regulatory standards in the financial services sector
- Excellent communication, stakeholder engagement, and documentation skills
- Strong analytical thinking and problem-solving capabilities
Preferred Qualifications:
- Experience in banking or financial services environments
- Understanding of enterprise data architecture, Master Data Management (MDM), and BI/reporting systems
- Knowledge of data privacy regulations such as GDPR, CCPA, etc.
- Experience working within Agile project methodologies
For further assistance, contact/WhatsApp 9354909517 or write to hema@gist.org.in
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Introduction: A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.
Your role and responsibilities
Role Overview: We are looking for an experienced Denodo SME to design, implement, and optimize data virtualization solutions using Denodo as the enterprise semantic and access layer over a Cloudera-based data lakehouse. The ideal candidate will lead the integration of structured and semi-structured data across systems, enabling unified access for analytics, BI, and operational use cases.
Key Responsibilities:
- Design and deploy the Denodo Platform for data virtualization over Cloudera, RDBMS, APIs, and external data sources.
- Define logical data models, derived views, and metadata mappings across layers (integration, business, presentation).
- Connect to Cloudera Hive, Impala, Apache Iceberg, Oracle, and other on-prem/cloud sources.
- Publish REST/SOAP APIs and JDBC/ODBC endpoints for downstream analytics and applications.
- Tune virtual views, caching strategies, and federation techniques to meet performance SLAs for high-volume data access.
- Implement Denodo smart query acceleration, usage monitoring, and access governance.
- Configure role-based access control (RBAC) and row/column-level security, and integrate with enterprise identity providers (LDAP, Kerberos, SSO).
- Work with data governance teams to align Denodo with enterprise metadata catalogs (e.g., Apache Atlas, Talend).
Required education: Bachelor's Degree
Preferred education: Master's Degree
Required technical and professional expertise:
- 8-12 years in data engineering, with 4+ years of hands-on experience in the Denodo Platform.
- Strong experience integrating RDBMS (Oracle, SQL Server), Cloudera CDP (Hive, Iceberg), and REST/SOAP APIs.
- Denodo Admin Tool, VQL, Scheduler, Data Catalog; SQL, shell scripting, basic Python (preferred).
- Deep understanding of query optimization, caching, memory management, and federation principles.
- Experience implementing data security, masking, and user access control in Denodo.
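To illustrate the "publish REST endpoints for downstream applications" responsibility, here is a sketch of how a consumer might construct the URL for a view exposed through Denodo's RESTful web service. The host, database, and view names are hypothetical, and the URL path and $filter parameter are assumptions modeled on Denodo's default REST service; the exact syntax depends on how the service is published and on the Denodo version.

```python
# Sketch: URL construction for querying a Denodo-published REST view.
# Everything concrete here (host, db, view, path, $filter support) is an
# assumption for illustration; check the actual published service contract.
from urllib.parse import quote

DENODO_HOST = "https://denodo.example.com:9443"  # hypothetical server

def rest_view_url(database, view, filter_expr=None, fmt="json"):
    """Build the GET URL for one published view, optionally filtered."""
    url = f"{DENODO_HOST}/denodo-restfulws/{database}/views/{view}"
    query = f"$format={fmt}"
    if filter_expr:
        query += "&$filter=" + quote(filter_expr, safe="")
    return f"{url}?{query}"

print(rest_view_url("wm_semantic", "bv_client_positions", "region = 'APAC'"))
```

Pushing the filter into the URL lets Denodo delegate the predicate to the underlying source instead of filtering after federation, which is the point of the tuning work described above.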
Posted 4 weeks ago
5.0 - 10.0 years
10 - 20 Lacs
Chennai
Hybrid
Location - Chennai
Responsibilities
Direct Responsibilities:
- Work closely with the WM IT DCIO team to execute Data Governance Implementation across data initiatives, e.g., RAMI (Data Retention), Data Privacy by Design, Data Quality, etc.
- Create and test proof-of-concept solutions to support the strategic evolution of the software applications.
- Act as Data Governance SME within Wealth Management, operationally working with the Data Custodian IT Officer (DCIO), DPO (Data Protection Officer), and CDO (Chief Data Officer) teams.
- Hands-on with the development, testing, configuration, and deployment of software systems in the Data Transversal organization.
- Operationalize data policies/frameworks including business glossaries, data dictionaries, data profiling, etc.
Technical & Behavioral Competencies:
- Minimum 7+ years of experience in data expertise (at least 2 of the following: Data Governance, Data Quality, Data Privacy & Protection, Data Management)
- Bachelor's degree in Engineering (Computer Science or Electronics & Communications)
Qualifications:
- Hands-on experience working with data (data profiling, scorecards/BI)
- Previously worked in Data Governance and Data Security
- Financial services products and applications knowledge
- Working knowledge across Excel, SQL, Python, Collibra, PowerBI, and cloud
- Plus: Collibra Developer, Ranger Certified, or a similar certification is preferred
Skills required:
- Knowledge of data and the compliance/regulatory environment (global and local data regulations)
- Demonstrates flexibility and willingness to accept assignments and challenges in a rapidly changing environment
- Understands how data is used (e.g., Analytics, Business Intelligence, etc.)
Working knowledge of the data lifecycle and Data Transformations / Data Lineage At least 2 of the following: Data Quality, Data Architecture, Database Management, Data Privacy & Protection, Security of Data Ability to define relevant key performance indicators (KPIs) Problem solving and team collaboration Self-motivated and results driven Project management and business analysis Agile thinking Transversal skills: Proficient in designing new processes and adapting Group IT processes to Wealth Management IT Strong communication skills to collaborate across stakeholders and support change Minimum 7 years of experience in Data / Tech
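The data quality and profiling work these governance roles describe reduces to rules such as completeness and uniqueness checks feeding a scorecard. A minimal, illustrative sketch (thresholds and column values are hypothetical):

```python
# Two common data quality rules a governance team might define:
# completeness (null rate below a threshold) and uniqueness.
# The 10% null threshold and the sample IDs are invented.

def profile_column(values, unique_required=False, max_null_rate=0.1):
    """Return a small quality scorecard for one column of data."""
    total = len(values)
    nulls = sum(1 for v in values if v is None)
    null_rate = nulls / total if total else 0.0
    non_null = [v for v in values if v is not None]
    is_unique = len(non_null) == len(set(non_null))
    passed = null_rate <= max_null_rate and (is_unique or not unique_required)
    return {"null_rate": null_rate, "unique": is_unique, "passed": passed}

# A customer-ID column should be complete and unique; this sample is neither.
customer_ids = ["C1", "C2", "C2", None]
print(profile_column(customer_ids, unique_required=True))
# -> {'null_rate': 0.25, 'unique': False, 'passed': False}
```

Tools like Collibra or Informatica IDQ let stewards express equivalent rules declaratively and roll the pass/fail results up into dashboards.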
Posted 4 weeks ago
0.0 years
0 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Ready to shape the future of work? At Genpact, we don't just adapt to change, we drive it. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. Our breakthrough solutions, from large-scale models onward, tackle companies' most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that's shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Inviting applications for the role of Senior Principal Consultant - Lead Solution Architect, Google Cloud Platform Pre-Sales - Data & AI About Genpact: Genpact (NYSE: G) is a global professional services firm delivering outcomes that transform businesses. With a proud 25-year history and over 125,000 diverse professionals in 30+ countries, we are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for our clients. We serve leading global enterprises, including the Fortune Global 500, leveraging our deep industry expertise, digital innovation, and cutting-edge capabilities in data, technology, and AI. Join our vibrant team in India to shape the future of business through intelligent operations and drive meaningful impact. 
The Opportunity: Genpact India is expanding its Google Cloud Platform (GCP) capabilities and seeking a highly experienced and technically astute Senior Principal Consultant / Lead Solution Architect specializing in Data and Artificial Intelligence. This critical role will be at the forefront of Genpact's growth in the GCP ecosystem in India and globally, leading complex pre-sales engagements, designing transformative data and AI solutions, and fostering strong client relationships. You will operate as a trusted advisor, translating intricate client challenges into compelling, implementable solutions on Google Cloud. Responsibilities: Solution Architecture & Design Leadership: Lead the technical pre-sales process for complex data and AI opportunities on Google Cloud Platform / AWS / Azure (any 2) from initial discovery through to proposal and Statement of Work (SOW) development. Design and articulate highly scalable, secure, and resilient enterprise-grade data and AI architectures leveraging a wide array of GCP / AWS / Azure services (any 2). Client Engagement & Advisory: Engage deeply with prospective clients' senior IT and business stakeholders, including CXOs, to understand their strategic objectives, critical business challenges, and existing data landscape. Position Genpact's GCP / AWS / Azure Data & AI offerings as key enablers for their digital transformation journey. Technical Demonstrations & Workshops: Conduct impactful technical presentations, deep-dive workshops, and product demonstrations (including Proof-of-Concepts where required) tailored to specific client needs, showcasing the advanced capabilities of GCP / AWS / Azure Data & AI services and Genpact's differentiated value. Proposal Development & Commercial Support: Take ownership of the technical sections of proposals, RFPs, and SOWs, ensuring accuracy, technical feasibility, clear value articulation, and alignment with Genpact's delivery capabilities. 
Provide robust technical estimation and sizing for proposed solutions. Sales & Delivery Collaboration: Partner closely with Genpact's sales teams to drive deal progression, providing technical guidance, competitive intelligence, and effective solution positioning. Collaborate with delivery teams to ensure proposed solutions are executable, scalable, and align with Genpact's operational excellence standards. Technology & Market Expertise: Maintain expert-level knowledge of the latest trends, services, and product roadmaps in Google Cloud Platform, AWS / Azure (any two), Data Engineering, Machine Learning, Artificial Intelligence (including Generative AI), and relevant industry best practices. Thought Leadership & IP Contribution: Contribute to Genpact's intellectual property by developing reusable assets, solution accelerators, and participating in internal/external knowledge sharing, including whitepapers, blogs, and industry events. Mentorship & Capability Building: Mentor and guide junior architects and data engineers within the team, fostering a culture of technical excellence and continuous learning in Google Cloud Data & AI. Qualifications we seek in you! Minimum Qualifications: Progressive experience in technical roles within data analytics, data warehousing, business intelligence, machine learning, and artificial intelligence, with a significant portion in a client-facing pre-sales, solution architecture, or consulting capacity. Hands-on experience architecting, designing, and delivering complex data and AI solutions on Google Cloud Platform. Deep and demonstrable expertise across the Google Cloud Data & AI stack: Core Data Services: BigQuery, Dataflow, Dataproc, Pub/Sub, Cloud Storage, Cloud SQL, Cloud Spanner, Composer, Data Catalog, Dataplex. 
AI/ML Services: Vertex AI (including MLOps, Workbench, Training, Prediction, Explainable AI), Generative AI offerings (e.g., Gemini, Imagen), Natural Language API, Vision AI, Speech-to-Text, Dialogflow, Recommendation AI. BI & Visualization: Looker, Data Studio. In addition to the Google Cloud stack above, exposure to one other cloud data stack is needed - either AWS or Azure. Proven experience in translating complex business challenges into viable, scalable technical solutions on GCP, articulated with clear business value. Exceptional communication, presentation, and interpersonal skills, with the ability to engage, influence, and build rapport with diverse audiences from technical teams to senior business executives. Strong problem-solving, analytical, and strategic thinking abilities, with a commercial mindset. Experience in leading and contributing to large, complex deal pursuits in a competitive environment. Bachelor's degree in Computer Science, Engineering, or a related technical field. Master's degree preferred. Google Cloud Professional Certifications are highly preferred (e.g., Professional Cloud Architect, Professional Data Engineer, Professional Machine Learning Engineer). Ability to travel to client sites within India and potentially internationally as required. Why join Genpact? Be a transformation leader - Work at the cutting edge of AI, automation, and digital innovation. Make an impact - Drive change for global enterprises and solve business challenges that matter. Accelerate your career - Get hands-on experience, mentorship, and continuous learning opportunities. Work with the best - Join 140,000+ bold thinkers and problem-solvers who push boundaries every day. Thrive in a values-driven culture - Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress. Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. 
Let's build tomorrow together. Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 1 month ago
4.0 - 7.0 years
8 - 14 Lacs
Noida
Hybrid
Data Engineer (L3) || GCP Certified Employment Type: Full-Time Work Mode: In-office / Hybrid Notice: Immediate joiners As a Data Engineer, you will design, develop, and support data pipelines and related data products and platforms. Your primary responsibilities include designing and building data extraction, loading, and transformation pipelines across on-prem and cloud platforms. You will perform application impact assessments, requirements reviews, and develop work estimates. Additionally, you will develop test strategies and site reliability engineering measures for data products and solutions, participate in agile development "scrums" and solution reviews, mentor junior Data Engineering Specialists, lead the resolution of critical operations issues, and perform technical data stewardship tasks, including metadata management, security, and privacy by design. Required Skills: Design, develop, and support data pipelines and related data products and platforms. Design and build data extraction, loading, and transformation pipelines and data products across on-prem and cloud platforms. Perform application impact assessments, requirements reviews, and develop work estimates. Develop test strategies and site reliability engineering measures for data products and solutions. Participate in agile development "scrums" and solution reviews. Mentor junior Data Engineers. Lead the resolution of critical operations issues, including post-implementation reviews. Perform technical data stewardship tasks, including metadata management, security, and privacy by design. Design and build data extraction, loading, and transformation pipelines using Python and other GCP Data Technologies. Demonstrate SQL and database proficiency in various data engineering tasks. Automate data workflows by setting up DAGs in tools like Control-M, Apache Airflow, and Prefect. Develop Unix scripts to support various data operations. 
Model data to support business intelligence and analytics initiatives. Utilize infrastructure-as-code tools such as Terraform, Puppet, and Ansible for deployment automation. Expertise in GCP data warehousing technologies, including BigQuery, Cloud SQL, Dataflow, Data Catalog, Cloud Composer, Google Cloud Storage, IAM, Compute Engine, Cloud Data Fusion, and Dataproc (good to have). Keywords: data pipelines, agile development, scrums, GCP Data Technologies, Python, DAGs, Control-M, Apache Airflow, data solution architecture. Qualifications: Bachelor's degree in Software Engineering, Computer Science, Business, Mathematics, or related field. 4+ years of data engineering experience. 2 years of data solution architecture and design experience. GCP Certified Data Engineer (preferred). Job Type: Full-time
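The DAG-based workflow automation this posting mentions (Control-M, Airflow, Prefect) rests on one idea: a task runs only after its upstream dependencies finish. A toy, stdlib-only sketch of that ordering follows; the task names are invented, and real Airflow DAGs add scheduling, retries, and operators on top of this:

```python
# Minimal illustration of DAG-ordered task execution, the concept
# behind workflow schedulers like Apache Airflow or Control-M.
from graphlib import TopologicalSorter

# Each task maps to the set of upstream tasks it depends on.
dag = {
    "extract": set(),
    "transform": {"extract"},
    "load": {"transform"},
    "report": {"load"},
}

def run(dag):
    """Execute tasks so every task's dependencies run before it does."""
    order = list(TopologicalSorter(dag).static_order())
    for task in order:
        print(f"running {task}")  # a real scheduler would invoke the task here
    return order

print(run(dag))
# -> ['extract', 'transform', 'load', 'report']
```

In Airflow the same dependency chain would be declared with operators and `>>` chaining (`extract >> transform >> load >> report`), with the scheduler handling ordering, retries, and backfills.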
Posted 1 month ago
4 - 9 years
10 - 20 Lacs
Pune, Mumbai (All Areas)
Hybrid
Manager- Enterprise Data- 4-8 years- Mumbai (Hybrid) LOCATION - Mumbai Future Employer- This global financial services organization operates as a strategic capability center supporting a leading investment and savings business headquartered in the UK. With a focus on delivering technology, data, operations, and customer service solutions, it plays a critical role in enabling business growth and operational efficiency across global markets. Primary Key Responsibilities: Collaborate with Data Owners, Data Stewards, and business/technology teams to populate the enterprise data catalogue. Work with stakeholders to define and create data quality rules within the chosen data quality tooling. Support the roll-out and adoption of data management tooling and marketplace across various business areas. Assist in the implementation of data management policies, standards, and procedures. Contribute to other data management activities as required to enhance the organization's data capabilities. Utilize SQL to query and analyze datasets for data profiling and validation. Effectively manage time, prioritize tasks, and organize work to ensure deadlines are met. Demonstrate a proactive approach to learning and a passion for making a tangible difference in data management practices. Requirements: Practical experience of working in a data governance/management team at a significant organization for 4+ years. Working experience in one or more data management tools such as Informatica Axon/IDQ/EDC, Collibra, Datactics, or Microsoft Purview. Good knowledge of SQL for querying and analyzing datasets. Effective time management, prioritization, and organizational skills. Demonstrated passion and enthusiasm to learn and contribute to data management initiatives. Bachelor's or Master's degree in IT. Experience working in the financial services / insurance industry (Desirable). Data Management certifications (a plus). What's in it for you? 
Opportunity to be part of a significant transformation program within a leading financial services organization. Exposure to cutting-edge data management tools and technologies. Collaborative work environment with opportunities to interact with various business and technology stakeholders. Be a key contributor in building and enhancing the organization's data governance and management capabilities. Potential for professional growth and development within the Enterprise Data team and M&G plc. Be part of an inclusive employer that values diversity and fosters a culture where difference is celebrated. Reach us: If you feel this opportunity is well aligned with your career progression plans, please feel free to reach me with your updated profile at priya.bhatia@crescendogroup.in Disclaimer: Crescendo Global specializes in senior to C-level niche recruitment. We are passionate about empowering job seekers and employers with an engaging, memorable job search and leadership hiring experience. Crescendo Global does not discriminate on the basis of race, religion, color, origin, gender, sexual orientation, age, marital status, veteran status, or disability status. Note: We receive a lot of applications on a daily basis, so it becomes difficult for us to get back to each candidate. Please assume that your profile has not been shortlisted in case you don't hear back from us in 1 week. Your patience is highly appreciated.
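The SQL-driven profiling and validation this role calls for can be sketched against an in-memory SQLite database. The table and column names are invented for illustration; production checks would run the same queries against the governed warehouse:

```python
# SQL data profiling sketch: completeness and uniqueness checks,
# run here against an in-memory SQLite table with sample data.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.executemany(
    "INSERT INTO customers VALUES (?, ?)",
    [(1, "a@x.com"), (2, None), (3, "a@x.com")],
)

# Completeness: how many email values are missing?
null_count, = conn.execute(
    "SELECT COUNT(*) FROM customers WHERE email IS NULL"
).fetchone()

# Uniqueness: how many distinct email values appear more than once?
dup_count, = conn.execute(
    "SELECT COUNT(*) FROM (SELECT email FROM customers "
    "WHERE email IS NOT NULL GROUP BY email HAVING COUNT(*) > 1)"
).fetchone()

print(null_count, dup_count)
# -> 1 1  (one missing email, one duplicated value)
```

Data quality tooling such as Informatica IDQ or Datactics essentially generates and schedules queries of this shape, then surfaces the counts as rule pass/fail results.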
Posted 1 month ago
10 - 20 years
20 - 30 Lacs
Hyderabad
Remote
Looking for Fulltime, Consulting or Freelance Experts with Alation Experience Note: Looking for Immediate Joiners only Summarized Purpose: We are seeking an exceptionally skilled and motivated Solution Architect / Data Governance Lead to play a pivotal role in data analytics and business intelligence, driving impactful solutions that empower organizations to harness the full potential of their data, make informed decisions that help achieve business objectives, and foster a data-driven culture. Essential Functions: Design and architect end-to-end data governance solutions, focusing on the implementation of the Alation tool, to meet the organization's data governance objectives and requirements. Collaborate with business stakeholders, data stewards, and IT teams to understand data governance needs and translate them into technical requirements, leveraging the capabilities of the Alation tool. Develop data governance strategies, policies, and standards that align with industry best practices and regulatory requirements, leveraging the Alation tool's features and functionalities. Implement and configure the Alation tool to support data governance initiatives, including data cataloging, data lineage, data quality monitoring, and metadata management. Define and implement data governance workflows and processes within the Alation tool, ensuring efficient data governance operations across the organization. Collaborate with cross-functional teams to integrate the Alation tool with existing data systems and infrastructure, ensuring seamless data governance processes. Conduct data profiling, data quality assessments, and data lineage analysis using the Alation tool to identify data issues and develop remediation strategies. Provide guidance and support to business stakeholders, data stewards, and data owners on the effective use of the Alation tool for data governance activities. 
Stay updated on the latest trends, emerging technologies, and best practices in data governance and the Alation tool, and proactively recommend enhancements and improvements. Collaborate with IT teams to ensure the successful implementation, maintenance, and scalability of the Alation tool, including upgrades, patches, and configurations. Knowledge, Skills and Abilities: Bachelor's degree in computer science, information systems, or a related field. A master's degree is preferred. Must have a minimum of 15 years of IT experience, including at least 3 years with Alation and 10+ years in SQL, EDW, or other Data Engineering / Data Science capabilities. Proven experience as a Data Governance Solution Architect, Data Architect, or similar role, with hands-on experience implementing data governance solutions using the Alation tool. (Must Have) Strong expertise in designing and implementing end-to-end data governance frameworks and solutions, leveraging the Alation tool. (Must Have) In-depth knowledge of data governance principles, data management best practices, and regulatory requirements (e.g., GDPR, CCPA). Proficiency in configuring and customizing the Alation tool, including data cataloging, data lineage, data quality, and metadata management features. (Must Have) Experience in data profiling, data quality assessment, and data lineage analysis using the Alation tool or similar data governance platforms. (Must Have) Familiarity with data integration, data modeling, and data architecture concepts. Excellent analytical, problem-solving, and decision-making skills with keen attention to detail. Strong communication and interpersonal skills with the ability to effectively collaborate with cross-functional teams and influence stakeholders. Proven ability to manage multiple projects simultaneously, prioritize tasks, and meet deadlines. 
Professional certifications in data governance, such as CDMP (Certified Data Management Professional), DAMA-CDMP (Data Management Association Certified Data Management Professional), or similar certifications, are a plus. (Nice to Have) Good understanding of Master Data Management, Data Integration, and SQL skills. Exposure to dimensional data modeling, Data Vault modeling, ETL, ELT, and data warehousing methodologies. Ability to effectively communicate in writing and orally with a wide range of audiences and maintain interpersonal relationships. Ability to work within time constraints and manage multiple tasks against critical deadlines. Ability to perform problem solving and apply critical thinking, deductive reasoning, and inductive reasoning to identify solutions. Certifications from Alation are desired. (Must Have)
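The cataloging and lineage capabilities described for Alation can be illustrated with a minimal, hypothetical model of what a catalog tracks. Real tools persist far richer metadata, but the upstream-walk behind "where does this data come from?" is the same idea:

```python
# Toy data catalog: each dataset entry records an owner, glossary
# terms, and the upstream datasets it was derived from. All names
# here are invented for illustration.

catalog = {
    "sales_summary": {
        "owner": "finance_team",
        "glossary_terms": ["revenue"],
        "upstream": ["raw_sales", "fx_rates"],
    },
    "raw_sales": {"owner": "ingest_team", "glossary_terms": [], "upstream": []},
    "fx_rates": {"owner": "ingest_team", "glossary_terms": [], "upstream": []},
}

def lineage(catalog, name):
    """Walk upstream dependencies to find every source of a dataset."""
    seen = []
    stack = list(catalog[name]["upstream"])
    while stack:
        ds = stack.pop()
        if ds not in seen:
            seen.append(ds)
            stack.extend(catalog.get(ds, {}).get("upstream", []))
    return sorted(seen)

print(lineage(catalog, "sales_summary"))
# -> ['fx_rates', 'raw_sales']
```

A lineage query like this is what lets a steward run impact analysis: before changing `raw_sales`, list every downstream dataset that would be affected.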
Posted 1 month ago