
245 Data Architect Jobs - Page 3

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

4.0 - 9.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Be an essential element to a brighter future. We work together to transform essential resources into critical ingredients for mobility, energy, connectivity and health. Join our values-led organization committed to building a more resilient world with people and planet in mind. Our core values are the foundation that make us successful for ourselves, our customers and the planet.

Job Description

Overview : As part of the Global Data & Analytics Technology team within Corporate IT, the Enterprise Master Data Architect plays a strategic role in shaping and executing enterprise-wide master data initiatives. This role partners closely with business leaders, the Corporate Master Data Management team, and Business Relationship Managers to define and deliver scalable solutions using SAP Master Data Governance (MDG). We're looking for a forward-thinking architect with a strong blend of technical expertise and business acumen: someone who can balance innovation with execution, and who thrives in a fast-paced, collaborative environment.

Key Responsibilities :
- Collaborate with business stakeholders to define enterprise master data strategies and governance frameworks.
- Design and implement SAP MDG solutions that support the collection, processing, and stewardship of master data across domains.
- Lead the development and enforcement of data governance policies, standards, and best practices.
- Architect and deliver SAP-centric master data solutions that align with enterprise goals and compliance requirements.
- Provide technical leadership and mentorship to MDM team members and cross-functional partners.
- Ensure consistency, quality, and accessibility of master data across systems and business units.
- Drive continuous improvement in data architecture, modeling, and integration practices.

Qualifications :
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Proven experience designing and architecting enterprise Master Data solutions.
- 4+ years of hands-on experience with SAP MDG and SAP Data Architecture.
- Strong functional knowledge of master data domains: customer, vendor, product/material, and finance in S/4HANA or ECC.
- Experience with SAP Data Services and SAP Information Steward for data conversion, quality, and cleansing.
- Proficiency in defining systems strategy, requirements gathering, prototyping, testing, and deployment.
- Strong configuration and solution design skills.
- ABAP development experience required, including custom enhancements and data modeling.
- Experience with SAP S/4HANA 2021 or later preferred.
- Excellent communication, collaboration, and time management skills.
- Ability to lead cross-functional teams and manage multiple priorities in a dynamic environment.

Benefits of Joining Albemarle :
- Competitive compensation
- Comprehensive benefits package
- A diverse array of resources to support you professionally and personally.

We are partners to one another in pioneering new ways to be better for ourselves, our teams, and our communities. When you join Albemarle, you become our most essential element, and you can anticipate competitive compensation, a comprehensive benefits package, and resources that foster your well-being and fuel your personal growth. Help us shape the future, build with purpose and grow together.

Posted 1 month ago


3.0 - 7.0 years

0 Lacs

Maharashtra

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Role :
- Develop chatbots using Dialogflow CX.
- Architect and build AI-powered chatbot applications using platforms such as Dialogflow CX.
- Design and develop user-centric conversation experiences involving chat, text, or voice. Labels used by Conversational Architects in the conversation nodes can be treated as Pages in Dialogflow.

Your Profile :
- Sound knowledge of cloud platforms (GCP/AWS/Azure).
- Experience in integrating APIs using NodeJS and Python.
- Ability to construct intents, entities, and annotations in the Dialogflow tool.
- Write API documentation outlining the endpoints customers need to implement CCAI on their end.
- Liaise with the customer and the Data Architect on use case requirements and API technical requirements.

What you'll love about working here : You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, along with personalized career guidance from our leaders. You will get comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini : Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, from strategy and design to engineering, fueled by market-leading capabilities in AI, generative AI, cloud and data, combined with deep industry expertise and a partner ecosystem.
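
The Dialogflow CX work described in this posting centers on intents, pages, and webhook fulfillment. As an illustration only (not Capgemini's implementation), the sketch below builds a JSON payload in the documented shape of a CX webhook response; the fulfillment tag and reply text are hypothetical.

```python
def build_cx_webhook_response(reply_text, session_params=None):
    """Build a Dialogflow CX webhook response payload.

    The top-level keys follow the documented CX WebhookResponse shape:
    fulfillment_response.messages carries the bot's reply, and
    session_info.parameters can update session parameters.
    """
    response = {
        "fulfillment_response": {
            "messages": [{"text": {"text": [reply_text]}}]
        }
    }
    if session_params:
        response["session_info"] = {"parameters": session_params}
    return response


def handle_cx_request(request_body):
    """Route on the fulfillment tag configured in the CX console.

    The tag name below is a hypothetical example, not a real project's tag.
    """
    tag = request_body.get("fulfillmentInfo", {}).get("tag", "")
    if tag == "check-order-status":  # hypothetical tag
        return build_cx_webhook_response(
            "Your order is on its way.", {"order_checked": True}
        )
    return build_cx_webhook_response("Sorry, I didn't catch that.")
```

In practice this handler would sit behind an HTTPS endpoint (e.g. a NodeJS or Python service, as the profile mentions) that Dialogflow CX calls during fulfillment.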

Posted 1 month ago


2.0 - 18.0 years

12 - 16 Lacs

Hyderabad

Work from Office

Career Category : Engineering

Job Description [Role Name : IS Architecture]
Job Posting Title : Data Architect
Workday Job Profile : Principal IS Architect
Department Name : Digital, Technology & Innovation
Role GCF : 06A

ABOUT AMGEN : Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description : The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Architect is a senior-level position responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of foundational data. This role will manage a team of Data Modelers.

Roles & Responsibilities :
- Provide oversight to data modeling team members.
- Develop and maintain conceptual, logical, and physical data models to support business needs.
- Establish and enforce data standards, governance policies, and best practices.
- Design and manage metadata structures to enhance information retrieval and usability.
- Maintain comprehensive documentation of the architecture, including principles, standards, and models.
- Evaluate and recommend technologies and tools that best fit the solution requirements.
- Evaluate emerging technologies and assess their potential impact.
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency.

Basic Qualifications and Experience [GCF Level 6A] :
- Doctorate degree and 2 years of experience in Computer Science, IT or a related field, OR
- Master's degree with 8 - 10 years of experience in Computer Science, IT or a related field, OR
- Bachelor's degree with 10 - 14 years of experience in Computer Science, IT or a related field, OR
- Diploma with 14 - 18 years of experience in Computer Science, IT or a related field.

Functional Skills :
Must-Have Skills :
- Data Modeling: Expert in creating conceptual, logical, and physical data models to represent information structures. Ability to interview and communicate with business subject matter experts to develop data models that are useful for their analysis needs.
- Metadata Management: Knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality.
- Information Governance: Familiarity with policies and procedures for managing information assets, including security, privacy, and compliance.
- Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, SparkSQL), and performance tuning of big data processing.
Good-to-Have Skills :
- Experience with graph technologies such as Stardog, AllegroGraph, MarkLogic.
Professional Certifications :
- Certifications in Databricks are desired.
Soft Skills :
- Excellent critical-thinking and problem-solving skills.
- Strong communication and collaboration skills.
- Demonstrated awareness of how to function in a team setting.
- Demonstrated awareness of presentation skills.

Shift Information : This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

EQUAL OPPORTUNITY STATEMENT : We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
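
The posting's governance responsibilities ("establish and enforce data standards" and "design and manage metadata structures") are often operationalized as a data dictionary that drives automated quality checks. The sketch below is illustrative only, not Amgen's tooling; the column names are hypothetical.

```python
# A minimal data dictionary: each column entry records its expected type and
# whether nulls are allowed. validate() reports rule violations per record.
DATA_DICTIONARY = {  # hypothetical columns for an illustrative dataset
    "material_id": {"type": str, "nullable": False},
    "description": {"type": str, "nullable": True},
    "unit_price": {"type": float, "nullable": False},
}


def validate(record, dictionary=DATA_DICTIONARY):
    """Return a list of data-standard violations for one record."""
    errors = []
    for column, rule in dictionary.items():
        value = record.get(column)
        if value is None:
            if not rule["nullable"]:
                errors.append(f"{column}: null not allowed")
        elif not isinstance(value, rule["type"]):
            errors.append(f"{column}: expected {rule['type'].__name__}")
    return errors
```

The same dictionary can double as documentation (the "data dictionaries" deliverable) and as the single source of truth for enforcement, so standards and checks cannot drift apart.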

Posted 1 month ago


8.0 - 14.0 years

30 - 35 Lacs

Hyderabad

Work from Office

Role Description : The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Architect is a senior-level position responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of foundational data. This role will manage a team of Data Modelers.

Roles & Responsibilities :
- Provide oversight to data modeling team members.
- Develop and maintain conceptual, logical, and physical data models to support business needs.
- Establish and enforce data standards, governance policies, and best practices.
- Design and manage metadata structures to enhance information retrieval and usability.
- Maintain comprehensive documentation of the architecture, including principles, standards, and models.
- Evaluate and recommend technologies and tools that best fit the solution requirements.
- Evaluate emerging technologies and assess their potential impact.
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency.

Basic Qualifications and Experience [GCF Level 6A] :
- Doctorate degree and 2 years of experience in Computer Science, IT or a related field, OR
- Master's degree with 8 - 10 years of experience in Computer Science, IT or a related field, OR
- Bachelor's degree with 10 - 14 years of experience in Computer Science, IT or a related field, OR
- Diploma with 14 - 18 years of experience in Computer Science, IT or a related field.

Functional Skills :
Must-Have Skills :
- Data Modeling: Expert in creating conceptual, logical, and physical data models to represent information structures. Ability to interview and communicate with business subject matter experts to develop data models that are useful for their analysis needs.
- Metadata Management: Knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality.
- Information Governance: Familiarity with policies and procedures for managing information assets, including security, privacy, and compliance.
- Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, SparkSQL), and performance tuning of big data processing.
Good-to-Have Skills :
- Experience with graph technologies such as Stardog, AllegroGraph, MarkLogic.
Professional Certifications :
- Certifications in Databricks are desired.
Soft Skills :
- Excellent critical-thinking and problem-solving skills.
- Strong communication and collaboration skills.
- Demonstrated awareness of how to function in a team setting.
- Demonstrated awareness of presentation skills.

Shift Information : This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

EQUAL OPPORTUNITY STATEMENT : We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 month ago


8.0 - 12.0 years

19 - 22 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Description : Senior Data Architect (Contract)
Company : Emperen Technologies
Location : Remote; Delhi NCR, Bengaluru, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad
Type : Contract (8-12 months)
Experience : 8-12 years

Role Overview : We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities :
Data Architecture Design and Development :
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.
MS Dynamics and Data Lake Integration :
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.
ETL and Data Integration :
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error handling mechanisms.
Data Modeling and Data Governance :
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.
Issue Resolution and Troubleshooting :
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.
Collaboration and Communication :
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work well both as an individual contributor and as a team player.

Qualifications :
Experience :
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.
Technical Skills :
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.
Soft Skills :
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- Intelligent, rigorous thinker who can operate successfully amongst bright people.
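
The ETL responsibilities above (extraction, transformation, loading, validation, and error handling) follow a standard pattern. The sketch below is illustrative only, not Emperen's stack: it extracts rows from CSV text, transforms them with basic validation and quarantining of bad records, and loads the clean rows into a SQLite staging table. Table and column names are hypothetical.

```python
import csv
import io
import sqlite3


def extract(csv_text):
    """Extract: parse CSV text into a list of dict rows."""
    return list(csv.DictReader(io.StringIO(csv_text)))


def transform(rows):
    """Transform: normalise names, cast amounts, quarantine bad records."""
    clean, rejects = [], []
    for row in rows:
        try:
            clean.append((row["customer"].strip().title(), float(row["amount"])))
        except (KeyError, ValueError):
            rejects.append(row)  # error handling: keep bad rows for review
    return clean, rejects


def load(clean_rows, conn):
    """Load: write validated rows into a staging table."""
    conn.execute("CREATE TABLE IF NOT EXISTS sales (customer TEXT, amount REAL)")
    conn.executemany("INSERT INTO sales VALUES (?, ?)", clean_rows)
    conn.commit()


def run_pipeline(csv_text, conn):
    clean, rejects = transform(extract(csv_text))
    load(clean, conn)
    return len(clean), len(rejects)
```

In a production setting the same shape maps onto the tools the posting names: Azure Data Factory or SSIS orchestrating the steps, with the quarantine feeding a data-quality review queue.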

Posted 1 month ago


13.0 - 20.0 years

35 - 70 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Required Skills and Experience :
- 13+ years of total experience is a must, with 7+ years of relevant experience working on Big Data Platform technologies.
- Proven technical experience with Cloudera, Teradata, Databricks, MS Data Fabric, Apache Hadoop, BigQuery, and AWS Big Data solutions (EMR, Redshift, Kinesis, Qlik).
- Good domain experience in the BFSI or Manufacturing area.
- Excellent communication skills to engage with clients and influence decisions.
- High level of competence in preparing architectural documentation and presentations.
- Must be organized, self-sufficient, and able to manage multiple initiatives simultaneously.
- Must have the ability to coordinate with other teams independently.
- Work with both internal and external stakeholders to identify business requirements, and develop solutions to meet those requirements / build the opportunity.

Note : If you have experience in the BFSI domain, the location will be Mumbai only. If you have experience in the Manufacturing domain, the location will be Mumbai or Bangalore only.

Interested candidates can share their updated resumes at shradha.madali@sdnaglobal.com

Posted 1 month ago


8.0 - 17.0 years

13 - 17 Lacs

Hyderabad

Work from Office

Career Category : Engineering

Job Description [Role Name : IS Architecture]
Job Posting Title : Data Architect
Workday Job Profile : Principal IS Architect
Department Name : Digital, Technology & Innovation
Role GCF : 06A

ABOUT AMGEN : Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE
Role Description : The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Architect is a senior-level position responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of foundational data. This role will manage a team of Data Modelers.

Roles & Responsibilities :
- Provide oversight to data modeling team members.
- Develop and maintain conceptual, logical, and physical data models to support business needs.
- Establish and enforce data standards, governance policies, and best practices.
- Design and manage metadata structures to enhance information retrieval and usability.
- Maintain comprehensive documentation of the architecture, including principles, standards, and models.
- Evaluate and recommend technologies and tools that best fit the solution requirements.
- Evaluate emerging technologies and assess their potential impact.
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency.

Basic Qualifications and Experience [GCF Level 6A] :
- Doctorate degree and 8 years of experience in Computer Science, IT or a related field, OR
- Master's degree with 12 - 15 years of experience in Computer Science, IT or a related field, OR
- Bachelor's degree with 14 - 17 years of experience in Computer Science, IT or a related field.

Functional Skills :
Must-Have Skills :
- Data Modeling: Expert in creating conceptual, logical, and physical data models to represent information structures. Ability to interview and communicate with business subject matter experts to develop data models that are useful for their analysis needs.
- Metadata Management: Knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality.
- Information Governance: Familiarity with policies and procedures for managing information assets, including security, privacy, and compliance.
- Hands-on experience with big data technologies and platforms, such as Databricks and Apache Spark (PySpark, SparkSQL), and performance tuning of big data processing.
Good-to-Have Skills :
- Experience with graph technologies such as Stardog, AllegroGraph, MarkLogic.
Professional Certifications :
- Certifications in Databricks are desired.
Soft Skills :
- Excellent critical-thinking and problem-solving skills.
- Strong communication and collaboration skills.
- Demonstrated awareness of how to function in a team setting.
- Demonstrated awareness of presentation skills.

Shift Information : This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work during evening or night shifts, as required based on business requirements.

EQUAL OPPORTUNITY STATEMENT : Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 month ago


5.0 - 10.0 years

6 - 10 Lacs

Mumbai

Remote

Travel Requirement : Willingness to travel to the UK as needed is a plus.

Job Description : We are seeking a highly experienced Senior Data Engineer with a strong background in Microsoft Fabric and completed projects on the platform. This is a remote position based in India, ideal for professionals who are open to occasional travel to the UK; a valid passport is required.

Key Responsibilities :
- Design and implement scalable data solutions using Microsoft Fabric.
- Lead complex data integration, transformation, and migration projects.
- Collaborate with global teams to deliver end-to-end data pipelines and architecture.
- Optimize the performance of data systems and troubleshoot issues proactively.
- Ensure data governance, security, and compliance with industry best practices.

Required Skills and Experience :
- 5+ years of experience in data engineering, including architecture and development.
- Expertise in Microsoft Fabric, Data Lake, Azure Data Services, and related technologies.
- Experience in SQL, data modeling, and data pipeline development.
- Knowledge of modern data platforms and big data technologies.
- Excellent communication and leadership skills.

Preferred Qualifications :
- Good communication skills.
- Understanding of data governance and security best practices.

Perks & Benefits :
- Work-from-home flexibility.
- Competitive salary and perks.
- Opportunities for international exposure.
- Collaborative and inclusive work culture.

Posted 1 month ago


3.0 - 6.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Duration : 6 months
Timings : General IST
Notice Period : Within 15 days or immediate joiner

About The Role : As a Data Engineer for the Data Science team, you will play a pivotal role in enriching and maintaining the organization's central repository of datasets. This repository serves as the backbone for advanced data analytics and machine learning applications, enabling actionable insights from financial and market data. You will work closely with cross-functional teams to design and implement robust ETL pipelines that automate data updates and ensure accessibility across the organization. This is a critical role requiring technical expertise in building scalable data pipelines, ensuring data quality, and supporting data analytics and reporting infrastructure for business growth.

Note : Must be ready for a face-to-face interview in Bangalore (last round). Should be working with Azure as the cloud technology.

Key Responsibilities :
ETL Development :
- Design, develop, and maintain efficient ETL processes for handling multi-scale datasets.
- Implement and optimize data transformation and validation processes to ensure data accuracy and consistency.
- Collaborate with cross-functional teams to gather data requirements and translate business logic into ETL workflows.
Data Pipeline Architecture :
- Architect, build, and maintain scalable and high-performance data pipelines to enable seamless data flow.
- Evaluate and implement modern technologies to enhance the efficiency and reliability of data pipelines.
- Build pipelines for extracting data via web scraping to source sector-specific datasets on an ad hoc basis.
Data Modeling :
- Design and implement data models to support analytics and reporting needs across teams.
- Optimize database structures to enhance performance and scalability.
Data Quality and Governance :
- Develop and implement data quality checks and governance processes to ensure data integrity.
- Collaborate with stakeholders to define and enforce data quality standards across the organization.
Documentation and Communication :
- Maintain detailed documentation of ETL processes, data models, and other key workflows.
- Effectively communicate complex technical concepts to non-technical stakeholders and business teams.
Collaboration :
- Work closely with the Quant team and developers to design and optimize data pipelines.
- Collaborate with external stakeholders to understand business requirements and translate them into technical solutions.

Essential Requirements / Basic Qualifications :
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Familiarity with big data technologies like Hadoop, Spark, and Kafka.
- Experience with data modeling tools and techniques.
- Excellent problem-solving, analytical, and communication skills.
- Proven experience as a Data Engineer with expertise in ETL techniques (minimum years).
- 3-6 years of strong programming experience in languages such as Python, Java, or Scala.
- Hands-on experience in web scraping to extract and transform data from publicly available web sources.
- Proficiency with cloud-based data platforms such as AWS, Azure, or GCP.
- Strong knowledge of SQL and experience with relational and non-relational databases.
- Deep understanding of data warehousing concepts.

Preferred Qualifications :
- Master's degree in Computer Science or Data Science.
- Knowledge of data streaming and real-time processing frameworks.
- Familiarity with data governance and security best practices.
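
The web-scraping extraction step this role describes can be sketched with only the standard library: a parser that collects the text of table cells so sector-specific figures can be pulled from a fetched page. This is an illustration under stated assumptions (a simple HTML table as input); a real pipeline would add HTTP fetching, retries, and politeness controls (robots.txt, rate limiting).

```python
from html.parser import HTMLParser


class CellExtractor(HTMLParser):
    """Collect the stripped text content of every <td> cell in a page."""

    def __init__(self):
        super().__init__()
        self._in_cell = False
        self.cells = []

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self._in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self._in_cell = False

    def handle_data(self, data):
        if self._in_cell and data.strip():
            self.cells.append(data.strip())


def scrape_cells(html):
    """Return the table-cell texts found in an HTML snippet."""
    parser = CellExtractor()
    parser.feed(html)
    return parser.cells
```

The extracted cells would then flow into the same transform/validate/load stages as any other ETL source.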

Posted 1 month ago


4.0 - 9.0 years

11 - 15 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

We're Hiring : Cloud Data Architect - Azure Databricks

Key Responsibilities :
- Design and implement scalable, efficient cloud data models using Azure Data Lake and Azure Databricks.
- Ensure data quality, consistency, and integrity across all data models and platforms.
- Define and enforce development standards and best practices.
- Architect and model Business Intelligence (BI) and Analytics solutions to support data-driven decision-making.
- Collaborate with stakeholders to gather business requirements and translate them into technical specifications.
- Develop and maintain data models, data integration pipelines, and data warehousing solutions.
- Build and manage ETL (Extract, Transform, Load) processes to ensure timely and accurate data availability.

Required Qualifications :
- Proven experience as a Data Scientist, Data Architect, Data Analyst, or similar role.
- Strong understanding of data warehouse architecture and principles.
- Proficiency in SQL and experience with database systems such as Oracle, SQL Server, or PostgreSQL.
- Hands-on experience with Databricks for data engineering and SQL warehouse (must).
- Familiarity with data visualization tools like Power BI or Qlik.
- Experience with data warehousing platforms such as Snowflake or Amazon Redshift.
- Strong analytical and problem-solving skills.
- Excellent communication and collaboration abilities.
- Bachelor's degree in Computer Science, Engineering, or a related field; Master's degree preferred.
- Demonstrated expertise in BI and Analytics architecture, ETL design, and data integration workflows.
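
The BI and data-warehouse modeling this posting asks for typically takes the form of a star schema: a central fact table joined to dimension tables for aggregation. The toy sketch below uses SQLite purely for illustration (the role targets Azure Databricks); all table names and data are hypothetical.

```python
import sqlite3

# A minimal star schema: one fact table keyed to two dimensions.
DDL = """
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, iso_date TEXT);
CREATE TABLE fact_sales  (product_id INTEGER, date_id INTEGER, revenue REAL);
"""


def build_demo_mart():
    """Create and populate an in-memory demo data mart."""
    conn = sqlite3.connect(":memory:")
    conn.executescript(DDL)
    conn.executemany("INSERT INTO dim_product VALUES (?, ?)",
                     [(1, "Widget"), (2, "Gadget")])
    conn.execute("INSERT INTO dim_date VALUES (1, '2024-01-01')")
    conn.executemany("INSERT INTO fact_sales VALUES (?, ?, ?)",
                     [(1, 1, 100.0), (1, 1, 50.0), (2, 1, 70.0)])
    return conn


def revenue_by_product(conn):
    """The classic BI query shape: join fact to dimension, then aggregate."""
    return conn.execute(
        """SELECT p.name, SUM(f.revenue)
           FROM fact_sales f JOIN dim_product p USING (product_id)
           GROUP BY p.name ORDER BY p.name"""
    ).fetchall()
```

On Databricks the same schema and query pattern carry over directly to Delta tables and the SQL warehouse; only the DDL dialect changes.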

Posted 1 month ago


3.0 - 8.0 years

9 - 14 Lacs

Gurugram

Remote

Healthcare experience is mandatory.

Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills with the ability to work with ambiguous requirements
- Excellent communication skills with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization
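The dimensional-modeling work described in this posting (claims facts keyed to member, provider, and date dimensions, with data quality checks) can be sketched in miniature. Everything below is illustrative: the table layouts, column names, and the `check_referential_integrity` helper are hypothetical examples, not taken from any real health plan model.

```python
# Hypothetical star-schema sketch for health plan claims analytics.
# Table and column names are illustrative only.

CLAIM_FACT = {
    "claim_id": "string",        # degenerate dimension (source claim number)
    "member_key": "int",         # FK -> MEMBER_DIM
    "provider_key": "int",       # FK -> PROVIDER_DIM
    "service_date_key": "int",   # FK -> DATE_DIM (YYYYMMDD)
    "icd10_code": "string",      # diagnosis code
    "cpt_code": "string",        # procedure code
    "paid_amount": "decimal",    # additive measure
}

MEMBER_DIM = {"member_key": "int", "member_id": "string", "plan_type": "string"}


def check_referential_integrity(facts, dim_rows, fk, pk):
    """Return fact rows whose foreign key has no matching dimension row.

    A minimal example of the kind of data quality validation a modeler
    would automate across claims, eligibility, and provider datasets.
    """
    dim_keys = {row[pk] for row in dim_rows}
    return [f for f in facts if f[fk] not in dim_keys]
```

In a real warehouse the same check would typically run as a SQL anti-join or a platform constraint; the pure-Python version above just makes the rule explicit.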

Posted 1 month ago

Apply

7.0 - 12.0 years

20 - 25 Lacs

Mumbai

Work from Office

Deloitte is looking for a Technology and Transformation - EAD - Data Architect - Senior Consultant to join our dynamic team and embark on a rewarding career journey. A Data Architect is a professional who is responsible for designing, building, and maintaining an organization's data architecture.
1. Designing and implementing data models, data integration solutions, and data management systems that ensure data accuracy, consistency, and security.
2. Developing and maintaining data dictionaries, metadata, and data lineage documents to ensure data governance and compliance.
3. A Data Architect should have a strong technical background in data architecture and management, as well as excellent communication skills.
4. Strong problem-solving skills and the ability to think critically are also essential to identify and implement solutions to complex data issues.

Posted 1 month ago

Apply

10.0 - 17.0 years

8 - 13 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Data Architect with data migration experience in the Banking domain.

Role & responsibilities

Preferred candidate profile

Posted 1 month ago

Apply

10.0 - 15.0 years

30 - 45 Lacs

Hyderabad

Hybrid

Job Title: IT Lead Architect - AI
Years of Experience: 10-15 years
Mandatory Skills: Data Architecture, Team Leadership, AI/ML Expertise, Azure, SAP
Good to have: Visualization, Python

Key Responsibilities:
- Lead a team of architects and engineers focused on strategic Azure architecture and AI projects.
- Develop and maintain the company's data architecture strategy and lead design/architecture validation reviews.
- Drive the adoption of new AI/ML technologies and assess their impact on data strategy.
- Architect scalable data flows, storage, and analytics platforms, ensuring secure and cost-effective solutions.
- Establish data governance frameworks and promote best practices for data quality.
- Act as a technical advisor on complex data projects and collaborate with stakeholders.
- Work with technologies including SQL, Synapse, Databricks, Power BI, Fabric, Python, SQL Server, and NoSQL.

Required Qualifications & Experience:
- Bachelor's or Master's degree in Computer Science or a related field.
- At least 5 years in a leadership role in data architecture.
- Expert in Azure, Databricks, and Synapse.
- Proven experience leading technical teams and strategic projects, specifically designing and implementing AI solutions within data architectures.
- Deep knowledge of cloud data platforms (Azure, Fabric, Databricks, AWS), data modeling, ETL/ELT, big data, relational/NoSQL databases, and data security.
- 5 years of experience in AI model design and deployment.
- Strong experience in solution architecture.
- Excellent communication, stakeholder management, and problem-solving skills.

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

chennai, tamil nadu

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, supported and inspired by a collaborative community of colleagues around the world, and able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Key requirements:
- Experience developing digital marketing / digital analytics solutions using Adobe products
- Experience with Adobe Experience Cloud products and recent experience with Adobe Experience Platform or a similar CDP
- Good knowledge of the Data Science Workspace and building intelligent services on AEP
- Strong knowledge of datasets in Adobe Experience Platform, and of loading data into the platform through data source connectors, APIs, and streaming ingestion connectors
- Experience creating all required Adobe XDM (Experience Data Model) schemas in JSON, based on approved data models, for all data files being loaded
- Knowledge of using the Adobe Experience Platform (AEP) UI and Postman to automate customer schema, data lake, and profile design setups within each sandbox environment
- Experience configuring all necessary identities and privacy settings within Adobe Experience Platform, and creating new segments within AEP to meet customer use cases
- Testing and validating segments with the required destinations, managing customer data using Real-Time Customer Data Platform (RTCDP), and analyzing customer data using Customer Journey Analytics (CJA)
- Experience creating connections, data views, and dashboards in CJA
- Hands-on experience configuring and integrating Adobe Marketing Cloud modules such as Audience Manager, Analytics, Campaign, and Target
- Adobe Experience Cloud tool certifications (Adobe Campaign, Adobe Experience Platform, Adobe Target, Adobe Analytics) are desirable
- Proven ability to communicate verbally and in writing in a high-performance, collaborative environment
- Experience with data analysis, modeling, and mapping to coordinate closely with Data Architect(s) is beneficial

At Capgemini, you can shape your career with a range of career paths and internal opportunities within the Capgemini group. You will receive personalized career guidance from our leaders and comprehensive wellness benefits, including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

Capgemini is committed to ensuring that people of all backgrounds feel encouraged and have a sense of belonging. You are valued for who you are, and you can bring your original self to work. Every Monday, kick off the week with a musical performance by our in-house band, The Rubber Band. Also, get to participate in internal sports events, yoga challenges, or marathons. At Capgemini, you can work on cutting-edge projects in tech and engineering with industry leaders or create solutions to overcome societal and environmental challenges.
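The XDM requirement in this posting (authoring Experience Data Model schemas in JSON) can be illustrated with a minimal sketch. The schema content below is hypothetical, and in practice XDM schemas are composed against Adobe's Schema Registry with standard field groups rather than written free-hand; this only shows the JSON shape of the exercise.

```python
import json

# Illustrative XDM-style profile schema payload. Field names and the
# extended class are examples only, not a schema from any real sandbox.
schema = {
    "title": "Retail Customer Profile",
    "type": "object",
    "meta:extends": ["https://ns.adobe.com/xdm/context/profile"],
    "properties": {
        "personID": {"type": "string"},
        "email": {"type": "string", "format": "email"},
        "loyaltyTier": {"type": "string", "enum": ["bronze", "silver", "gold"]},
    },
}

# Serialize for submission (e.g., via Postman or an API client).
payload = json.dumps(schema, indent=2)
```

A schema like this would then back dataset and profile setup within a sandbox, which is the automation workflow the requirement describes.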

Posted 1 month ago

Apply

10.0 - 12.0 years

27 - 32 Lacs

Gurugram

Work from Office

Key Responsibilities:

As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and directly engage with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- 10+ years of strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience in advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across different simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI
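The customer-centric data design this posting describes (consolidating CRM, call-center, and point-of-sale feeds into one customer dataset) can be sketched as a simple merge keyed on a common identifier. The source names, field names, and email-based key below are assumptions for illustration only; a production design would use surrogate keys and survivorship rules.

```python
# Sketch of unifying customer records from multiple hypothetical feeds.

def unify_customers(sources):
    """Merge records from several feeds, keyed on a normalized email.

    Later sources overwrite earlier ones field by field, and each unified
    record tracks which feeds contributed to it (simple lineage).
    """
    unified = {}
    for source_name, records in sources:
        for rec in records:
            key = rec["email"].strip().lower()  # normalize the match key
            merged = unified.setdefault(key, {"sources": []})
            merged.update({k: v for k, v in rec.items() if k != "email"})
            merged["sources"].append(source_name)
    return unified


customers = unify_customers([
    ("crm", [{"email": "A@x.com", "name": "Ada"}]),
    ("pos", [{"email": "a@x.com", "last_store": "12"}]),
])
```

Here the CRM and POS rows collapse into one customer because the normalized email matches, which is the core of a customer-centric dataset regardless of the platform (Snowflake, Azure, or otherwise) it runs on.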

Posted 1 month ago

Apply

3.0 - 8.0 years

9 - 14 Lacs

Noida

Remote

Role: Data Modeler Lead
Location: Remote
Experience: 10+ years
Healthcare experience is mandatory.

Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities:

Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment)
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets
- Establish data modeling standards and best practices across the organization

Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica
- Architect scalable data solutions that handle large volumes of healthcare transactional data
- Collaborate with data engineers to optimize data pipelines and ensure data quality

Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI)
- Design data models that support analytical, reporting, and AI/ML needs
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions

Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements
- Establish data quality monitoring and validation processes for critical health plan metrics
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources

Required Qualifications:

Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks)
- Proficiency with data modeling tools (Hackolade, ERwin, or similar)

Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data
- Experience with healthcare data standards and medical coding systems
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment)
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI)

Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments
- Strong analytical and problem-solving skills with the ability to work with ambiguous requirements
- Excellent communication skills with the ability to explain technical concepts to business stakeholders
- Experience mentoring team members and establishing technical standards

Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations
- Cloud platform certifications (AWS, Azure, or GCP)
- Experience with real-time data streaming and modern data lake architectures
- Knowledge of machine learning applications in healthcare analytics
- Previous experience in a lead or architect role within a healthcare organization

Posted 1 month ago

Apply

8.0 - 10.0 years

16 - 20 Lacs

Noida

Remote

IT & Technology Senior Manager - Data Analytics

Company Background:
- GET provides skills and expertise to the Oil & Gas industry.
- It provides operational & supervisory field support, remote engineering, technical consulting, and training services.

Role and Responsibilities:
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases to drive optimization and improvement of business strategies.
- Assess the effectiveness and accuracy of data sources and data gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modelling to increase and optimize business outcomes.
- Work individually or with extended teams to operationalize models & algorithms into structured software, programs, or operational processes.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Provide recommendations to business users based upon data/model outcomes, and implement recommendations including changes in operational processes, technology, or data management.
- Primary area of focus: PSCM/VMI business; secondary area of focus: ICS KPIs.
- Business improvements pre & post (either operational program, algorithm, model, or resultant software), measured in time and/or dollar savings.
- Satisfaction score of business users (of either operational program, algorithm, model, or resultant software).

Qualifications and Education Requirements:
- Graduate BSc/BTech in applied sciences with second-year statistics courses.
- Relevant internship (at least 2 months) OR relevant certifications in the preferred skills.

Preferred Skills:
- Strong problem-solving skills with an emphasis on business development.
- Experience with the following coding languages: R or Python (data cleaning, statistical, and modelling packages), SQL, VBA, and DAX (Power BI).
- Knowledge of working with and creating data architectures.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience with applications.
- Excellent written and verbal communication skills for coordinating across teams.
- A drive to learn and master new technologies and techniques.

Compliance Requirements:
- GET has a Business Ethics Policy which provides guidance to all employees in their day-to-day roles as well as helping you and the business comply with the law at all times. The incumbent must read, understand, and comply with the policy at all times, along with all other corresponding policies, procedures, and directives.

QHSE Responsibilities:
- Demonstrate a personal commitment to Quality, Health, Safety, and the Environment.
- Apply GET's, and where appropriate the Client Company's, Quality, Health, Safety & Environment Policies and Safety Management Systems.
- Promote a culture of continuous improvement, and lead by example to ensure company goals are achieved and exceeded.

Skills: Analytical skills; Negotiation; Convincing skills
Key Competencies: Never-give-up attitude; Flexible; Eye for detail
Experience: Minimum 8 years of experience
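As a tiny illustration of the regression skill this posting asks for, here is an ordinary least-squares line fit in pure Python. No statistical packages are assumed; in practice the same fit would come from R or a Python library, as the skills list suggests.

```python
# Ordinary least squares for a simple linear model y = a + b*x.

def fit_line(xs, ys):
    """Return intercept a and slope b minimizing squared error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    # Slope: covariance of x and y divided by variance of x.
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b


# Points lying exactly on y = 1 + 2x recover a = 1, b = 2.
a, b = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
```

The closed form shown here is the one-predictor special case; multiple regression generalizes it to a matrix solve.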

Posted 1 month ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Noida

Work from Office

Key Responsibilities:

As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and directly engage with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- 10+ years of strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience in advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across different simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI

Posted 1 month ago

Apply

8.0 - 10.0 years

16 - 20 Lacs

Pune, Anywhere in /Multiple Locations

Work from Office

IT & Technology Senior Manager - Data Analytics

Company Background:
- GET provides skills and expertise to the Oil & Gas industry.
- It provides operational & supervisory field support, remote engineering, technical consulting, and training services.

Role and Responsibilities:
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases to drive optimization and improvement of business strategies.
- Assess the effectiveness and accuracy of data sources and data gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modelling to increase and optimize business outcomes.
- Work individually or with extended teams to operationalize models & algorithms into structured software, programs, or operational processes.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Provide recommendations to business users based upon data/model outcomes, and implement recommendations including changes in operational processes, technology, or data management.
- Primary area of focus: PSCM/VMI business; secondary area of focus: ICS KPIs.
- Business improvements pre & post (either operational program, algorithm, model, or resultant software), measured in time and/or dollar savings.
- Satisfaction score of business users (of either operational program, algorithm, model, or resultant software).

Qualifications and Education Requirements:
- Graduate BSc/BTech in applied sciences with second-year statistics courses.
- Relevant internship (at least 2 months) OR relevant certifications in the preferred skills.

Preferred Skills:
- Strong problem-solving skills with an emphasis on business development.
- Experience with the following coding languages: R or Python (data cleaning, statistical, and modelling packages), SQL, VBA, and DAX (Power BI).
- Knowledge of working with and creating data architectures.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience with applications.
- Excellent written and verbal communication skills for coordinating across teams.
- A drive to learn and master new technologies and techniques.

Compliance Requirements:
- GET has a Business Ethics Policy which provides guidance to all employees in their day-to-day roles as well as helping you and the business comply with the law at all times. The incumbent must read, understand, and comply with the policy at all times, along with all other corresponding policies, procedures, and directives.

QHSE Responsibilities:
- Demonstrate a personal commitment to Quality, Health, Safety, and the Environment.
- Apply GET's, and where appropriate the Client Company's, Quality, Health, Safety & Environment Policies and Safety Management Systems.
- Promote a culture of continuous improvement, and lead by example to ensure company goals are achieved and exceeded.

Skills: Analytical skills; Negotiation; Convincing skills
Key Competencies: Never-give-up attitude; Flexible; Eye for detail
Experience: Minimum 8 years of experience

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

InOrg Global is looking for a Data Architect - Databricks to join our dynamic team and embark on a rewarding career journey. We are seeking an experienced and innovative architect specializing in Azure Databricks. As an architect, you will play a crucial role in designing, implementing, and optimizing solutions leveraging Azure Databricks for our clients. You will be responsible for understanding client requirements, architecting scalable and efficient solutions, and providing technical leadership throughout the project lifecycle. The ideal candidate will have a strong background in data engineering, cloud computing, and hands-on experience with Azure Databricks.

Responsibilities:
- Collaborate with clients to understand their business requirements and translate them into technical solutions leveraging Azure Databricks.
- Design end-to-end data engineering solutions on Azure Databricks, including data ingestion, processing, transformation, and visualization.
- Architect scalable and reliable data pipelines using Azure Databricks, Apache Spark, and other relevant technologies.
- Provide technical leadership and guidance to development teams throughout the project lifecycle.
- Perform code reviews, optimize performance, and ensure best practices are followed in Azure Databricks implementations.
- Collaborate with cross-functional teams, including data scientists, analysts, and infrastructure teams, to deliver integrated solutions.
- Stay updated with the latest advancements in Azure Databricks and recommend innovative solutions to enhance productivity and efficiency.
- Participate in pre-sales activities, including solution demonstrations, workshops, and proposal development.
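The ingestion and transformation responsibilities above can be illustrated with a minimal data-quality gate of the kind a Databricks pipeline might place between raw and curated layers. This is a pure-Python sketch with hypothetical rule and field names; in Spark the same idea would be expressed with DataFrame filters or table constraints.

```python
# Minimal quality gate: split incoming records into accepted rows and
# quarantined rows. Rules and field names are illustrative only.

RULES = {
    "order_id": lambda v: isinstance(v, str) and v != "",
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}


def validate(records):
    """Return (good, bad): records passing all rules vs. quarantined ones."""
    good, bad = [], []
    for rec in records:
        ok = all(field in rec and check(rec[field])
                 for field, check in RULES.items())
        (good if ok else bad).append(rec)
    return good, bad
```

Quarantining rather than dropping bad rows preserves them for investigation, which is the usual reliability pattern in layered pipelines.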

Posted 1 month ago

Apply

10.0 - 12.0 years

20 - 25 Lacs

Pune

Work from Office

Key Responsibilities:

As an Enterprise Data Architect, you will:
- Lead Data Architecture: Design, develop, and implement comprehensive enterprise data architectures, primarily leveraging Azure and Snowflake platforms.
- Data Transformation & ETL: Oversee and guide complex data transformation and ETL processes for large and diverse datasets, ensuring data integrity, quality, and performance.
- Customer-Centric Data Design: Specialize in designing and optimizing customer-centric datasets from various sources, including CRM, Call Center, Marketing, Offline, and Point of Sale systems.
- Data Modeling: Drive the creation and maintenance of advanced data models, including Relational, Dimensional, Columnar, and Big Data models, to support analytical and operational needs.
- Query Optimization: Develop, optimize, and troubleshoot complex SQL and NoSQL queries to ensure efficient data retrieval and manipulation.
- Data Warehouse Management: Apply advanced data warehousing concepts to build and manage high-performing, scalable data warehouse solutions.
- Tool Evaluation & Implementation: Evaluate, recommend, and implement industry-leading ETL tools such as Informatica and Unifi, ensuring best practices are followed.
- Business Requirements & Analysis: Lead efforts in business requirements definition and management, structured analysis, process design, and use case documentation to translate business needs into technical specifications.
- Reporting & Analytics Support: Collaborate with reporting teams, providing architectural guidance and support for reporting technologies like Tableau and Power BI.
- Software Development Practices: Apply professional software development principles and best practices to data solution delivery.
- Stakeholder Collaboration: Interface effectively with sales teams and directly engage with customers to understand their data challenges and lead them to successful outcomes.
- Project Management & Multi-tasking: Demonstrate exceptional organizational skills, with the ability to manage and prioritize multiple simultaneous customer projects effectively.
- Strategic Thinking & Leadership: Act as a self-managed, proactive, and customer-focused leader, driving innovation and continuous improvement in data architecture.

Position Requirements:
- 10+ years of strong experience with data transformation & ETL on large data sets.
- Experience designing customer-centric datasets (e.g., CRM, Call Center, Marketing, Offline, Point of Sale).
- 5+ years of data modeling experience (Relational, Dimensional, Columnar, Big Data).
- 5+ years of complex SQL or NoSQL experience.
- Extensive experience in advanced data warehouse concepts.
- Proven experience with industry ETL tools (e.g., Informatica, Unifi).
- Solid experience with business requirements definition and management, structured analysis, process design, and use case documentation.
- Experience with reporting technologies (e.g., Tableau, Power BI).
- Demonstrated experience in professional software development.
- Exceptional organizational skills and the ability to multi-task across different simultaneous customer projects.
- Strong verbal & written communication skills to interface with sales teams and lead customers to successful outcomes.
- Must be self-managed, proactive, and customer-focused.

Technical Skills:
- Cloud Platforms: Microsoft Azure
- Data Warehousing: Snowflake
- ETL Methodologies: Extensive experience in ETL processes and tools
- Data Transformation: Large-scale data transformation
- Data Modeling: Relational, Dimensional, Columnar, Big Data
- Query Languages: Complex SQL, NoSQL
- ETL Tools: Informatica, Unifi (or similar enterprise-grade tools)
- Reporting & BI: Tableau, Power BI

Posted 1 month ago

Apply

8.0 - 10.0 years

22 - 27 Lacs

Ahmedabad

Work from Office

Role And Responsibilities :

- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases to drive optimization and improvement of business strategies.
- Assess the effectiveness and accuracy of data sources and data gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modelling to increase and optimize business outcomes.
- Work individually or with extended teams to operationalize models & algorithms into structured software, programs, or operational processes.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Provide recommendations to business users based upon data/model outcomes, and implement recommendations, including changes in operational processes, technology, or data management.
- Primary area of focus : PSCM/VMI business; secondary area of focus : ICS KPIs.
- Business improvements pre & post (either operational program, algorithm, model, or resultant software), measured in time and/or dollar savings.
- Satisfaction score of business users (of either operational program, algorithm, model, or resultant software).

Qualifications And Education Requirements :

- Graduate BSc/BTech in applied sciences with second-year statistics courses.
- Relevant internship (at least 2 months) OR relevant certifications in the preferred skills.

Preferred Skills :

- Strong problem-solving skills with an emphasis on business development.
- Experience with the following coding languages : R or Python (data cleaning, statistical, and modelling packages), SQL, VBA, and DAX (PowerBI).
- Knowledge of working with and creating data architectures.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience with applications.
- Excellent written and verbal communication skills for coordinating across teams.
- A drive to learn and master new technologies and techniques.

Compliance Requirements :

- GET has a Business Ethics Policy which provides guidance to all employees in their day-to-day roles and helps both you and the business comply with the law at all times. The incumbent must read, understand, and comply with the policy at all times, along with all other corresponding policies, procedures, and directives.

QHSE Responsibilities :

- Demonstrate a personal commitment to Quality, Health, Safety and the Environment.
- Apply GET's, and where appropriate the Client Company's, Quality, Health, Safety & Environment Policies and Safety Management Systems.
- Promote a culture of continuous improvement, and lead by example to ensure company goals are achieved and exceeded.

Skills :

- Analytical skills
- Negotiation
- Convincing skills

Key Competencies :

- Never-give-up attitude
- Flexible
- Eye for detail

Experience : Minimum 8 years of experience.

Posted 1 month ago

Apply

3.0 - 8.0 years

9 - 14 Lacs

Ahmedabad

Remote

Healthcare experience is Mandatory

Position Overview :

We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.

Key Responsibilities :

Data Architecture & Modeling :
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management.
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment).
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets.
- Establish data modeling standards and best practices across the organization.

Technical Leadership :
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica.
- Architect scalable data solutions that handle large volumes of healthcare transactional data.
- Collaborate with data engineers to optimize data pipelines and ensure data quality.

Healthcare Domain Expertise :
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI).
- Design data models that support analytical, reporting, and AI/ML needs.
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations.
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions.

Data Governance & Quality :
- Implement data governance frameworks specific to healthcare data privacy and security requirements.
- Establish data quality monitoring and validation processes for critical health plan metrics.
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources.

Required Qualifications :

Technical Skills :
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data.
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches.
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing.
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks).
- Proficiency with data modeling tools (Hackolade, ERwin, or similar).

Healthcare Industry Knowledge :
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data.
- Experience with healthcare data standards and medical coding systems.
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment).
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI).

Leadership & Communication :
- Proven track record of leading data modeling projects in complex healthcare environments.
- Strong analytical and problem-solving skills, with the ability to work with ambiguous requirements.
- Excellent communication skills, with the ability to explain technical concepts to business stakeholders.
- Experience mentoring team members and establishing technical standards.

Preferred Qualifications :
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations.
- Cloud platform certifications (AWS, Azure, or GCP).
- Experience with real-time data streaming and modern data lake architectures.
- Knowledge of machine learning applications in healthcare analytics.
- Previous experience in a lead or architect role within a healthcare organization.
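To give a flavor of the data-quality validation work this listing describes, a common building block is a structural check on medical codes before they enter the warehouse. The sketch below checks only the general shape of ICD-10-CM diagnosis codes (a letter, two more characters, and an optional dotted extension); it is a simplified, hypothetical pattern, not a complete validator, and it does not check whether a code is actually assigned in the ICD-10-CM code set.

```python
# Simplified structural check for ICD-10-CM-shaped codes.
# This pattern is an assumption for illustration only: real validation
# should consult the official ICD-10-CM code set.
import re

ICD10_SHAPE = re.compile(r"^[A-Z][0-9][0-9A-Z](\.[0-9A-Z]{1,4})?$")

def looks_like_icd10(code: str) -> bool:
    """Return True if the string has the general shape of an ICD-10-CM code."""
    return bool(ICD10_SHAPE.match(code))

# E11.9 and J45.909 are real diagnosis codes; C4A is a valid 3-character
# category; "1234" fails the shape check.
results = [looks_like_icd10(c) for c in ("E11.9", "J45.909", "C4A", "1234")]
```

Checks like this are typically wired into pipeline-level data quality monitoring (e.g. in Databricks or Informatica jobs) so malformed claims records are quarantined rather than loaded.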

Posted 1 month ago

Apply

13.0 - 15.0 years

50 - 60 Lacs

Bengaluru

Work from Office

KPMG India is looking for an Azure Data Architect - Manager to join our dynamic team and embark on a rewarding career journey. A Data Architect is a professional responsible for designing, building, and maintaining an organization's data architecture. Responsibilities include :

- Designing and implementing data models, data integration solutions, and data management systems that ensure data accuracy, consistency, and security.
- Developing and maintaining data dictionaries, metadata, and data lineage documents to ensure data governance and compliance.

A Data Architect should have a strong technical background in data architecture and management, as well as excellent communication skills. Strong problem-solving skills and the ability to think critically are also essential to identify and implement solutions to complex data issues.
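The data dictionaries and lineage documents mentioned in this listing are, at their simplest, structured metadata about each column. One lightweight way to represent such an entry in code is sketched below; the field names are hypothetical, chosen only to illustrate the idea.

```python
# A minimal, hypothetical data-dictionary entry. Real metadata tools
# (e.g. Azure Purview) maintain far richer records than this sketch.
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class ColumnDef:
    name: str           # physical column name
    dtype: str          # declared data type
    description: str    # business-facing definition
    source_system: str  # supports simple data-lineage notes

customer_id = ColumnDef(
    name="customer_id",
    dtype="INTEGER",
    description="Surrogate key for the customer dimension",
    source_system="CRM",
)

# Serializable form, e.g. for publishing the dictionary as JSON.
entry = asdict(customer_id)
```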

Posted 1 month ago

Apply