Jobs
Interviews

241 Data Architect Jobs - Page 2

Set up a Job Alert
JobPe aggregates results for easy access, but you apply directly on the employer's job portal.

8.0 - 10.0 years

16 - 20 Lacs

Gurugram

Work from Office

Role and Responsibilities:
- Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions.
- Mine and analyze data from company databases to drive optimization and improvement of business strategies.
- Assess the effectiveness and accuracy of data sources and data-gathering techniques.
- Develop custom data models and algorithms to apply to data sets.
- Use predictive modelling to increase and optimize business outcomes.
- Work individually or with extended teams to operationalize models and algorithms into structured software, programs, or operational processes.
- Coordinate with different functional teams to implement models and monitor outcomes.
- Develop processes and tools to monitor and analyze model performance and data accuracy.
- Provide recommendations to business users based upon data/model outcomes, and implement recommendations including changes in operational processes, technology, or data management.
- Primary area of focus: PSCM/VMI business; secondary area of focus: ICS KPIs.
- Business improvements pre and post (either operational program, algorithm, model, or resultant software), measured in time and/or dollar savings.
- Satisfaction score of business users (of either operational program, algorithm, model, or resultant software).

Qualifications and Education Requirements:
- Graduate (BSc/BTech) in applied sciences with second-year statistics courses.
- Relevant internship (at least 2 months) OR relevant certifications in the preferred skills.

Preferred Skills:
- Strong problem-solving skills with an emphasis on business development.
- Experience with the following coding languages: R or Python (data cleaning, statistical, and modelling packages), SQL, VBA, and DAX (Power BI).
- Knowledge of working with and creating data architectures.
- Knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.) and their real-world advantages/drawbacks.
- Knowledge of advanced statistical techniques and concepts (regression, properties of distributions, statistical tests and their proper usage, etc.) and experience with applications.
- Excellent written and verbal communication skills for coordinating across teams.
- A drive to learn and master new technologies and techniques.

Compliance Requirements:
- GET has a Business Ethics Policy which provides guidance to all employees in their day-to-day roles as well as helping you and the business comply with the law at all times. The incumbent must read, understand, and comply with the policy at all times, along with all other corresponding policies, procedures, and directives.

QHSE Responsibilities:
- Demonstrate a personal commitment to Quality, Health, Safety and the Environment.
- Apply GET's, and where appropriate the Client Company's, Quality, Health, Safety & Environment Policies and Safety Management Systems.
- Promote a culture of continuous improvement, and lead by example to ensure company goals are achieved and exceeded.

Skills:
- Analytical skills
- Negotiation
- Convincing skills

Key Competencies:
- Never-give-up attitude
- Flexible
- Eye for detail

Posted 4 days ago

Apply

8.0 - 13.0 years

20 - 25 Lacs

Pune

Work from Office

Primary Roles and Responsibilities:
- Develop modern data warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with the client architect and team members.
- Orchestrate data pipelines in a scheduler via Airflow.

Skills and Qualifications:
- Bachelor's and/or master's degree in computer science, or equivalent experience.
- 6+ years of total IT experience, with 3+ years in data warehouse/ETL projects.
- Deep understanding of Star and Snowflake dimensional modelling.
- Strong knowledge of data management principles.
- Good understanding of the Databricks Data & AI platform and the Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python, and Spark (PySpark).
- Experience with the AWS/Azure stack.
- Desirable: ETL with batch and streaming (Kinesis).
- Experience building ETL / data warehouse transformation processes.
- Experience with Apache Kafka for streaming / event-based data.
- Experience with other open-source big data products, e.g., Hadoop (incl. Hive, Pig, Impala).
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.
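The Star-schema dimensional modelling this listing calls out can be sketched in a few lines of SQL; the fact/dimension tables below are hypothetical, and SQLite stands in for a warehouse engine such as Databricks SQL purely so the example is self-contained:

```python
# Minimal star-schema sketch: one fact table joined to two dimensions.
# Table and column names are hypothetical; SQLite is used for brevity,
# but the SQL shape is the same in a dimensional warehouse.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE fact_sales  (date_key INTEGER, product_key INTEGER, amount REAL);

    INSERT INTO dim_date    VALUES (1, 'Jan'), (2, 'Feb');
    INSERT INTO dim_product VALUES (10, 'Widget'), (20, 'Gadget');
    INSERT INTO fact_sales  VALUES (1, 10, 100.0), (1, 20, 50.0), (2, 10, 75.0);
""")

# The characteristic star-schema query: aggregate the fact table,
# sliced by attributes pulled from the dimension tables.
rows = conn.execute("""
    SELECT d.month, p.name, SUM(f.amount) AS total
    FROM fact_sales f
    JOIN dim_date d    ON f.date_key = d.date_key
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY d.month, p.name
    ORDER BY d.month, p.name
""").fetchall()
print(rows)
```

A Snowflake schema would further normalize the dimensions (e.g., splitting product category into its own table), trading join depth for reduced redundancy.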

Posted 4 days ago

Apply

8.0 - 12.0 years

19 - 22 Lacs

Ahmedabad

Work from Office

This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work as a strong individual contributor and a good team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- Intelligent, rigorous thinker who can operate successfully amongst bright people.
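The "data validation and error handling mechanisms" this listing describes can be sketched as a validate-and-quarantine step for records moving from a source system (e.g., MS Dynamics) into a data lake; the schema and rules below are hypothetical:

```python
# Minimal sketch of the validation/error-handling step in an integration
# pipeline; the required fields and the numeric-ID rule are hypothetical.
REQUIRED = {"account_id", "name", "created_on"}

def validate(record: dict) -> list[str]:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = [f"missing field: {f}" for f in sorted(REQUIRED - record.keys())]
    if "account_id" in record and not str(record["account_id"]).isdigit():
        errors.append("account_id must be numeric")
    return errors

batch = [
    {"account_id": "1001", "name": "Acme", "created_on": "2024-01-01"},
    {"account_id": "x9", "name": "Globex", "created_on": "2024-01-02"},
    {"name": "Initech"},
]

# Route valid records onward; quarantine the rest together with their errors
# so they can be reported and reprocessed rather than silently dropped.
valid = [r for r in batch if not validate(r)]
quarantined = [(r, validate(r)) for r in batch if validate(r)]
print(len(valid), len(quarantined))
```

In a production pipeline the quarantine side would typically land in an error table or dead-letter store with the error messages attached.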

Posted 5 days ago

Apply

8.0 - 12.0 years

19 - 22 Lacs

Jaipur

Work from Office

Type: Contract (8-12 months)

Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work as a strong individual contributor and a good team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted 5 days ago

Apply

8.0 - 12.0 years

19 - 22 Lacs

Noida

Work from Office

Job Description: Senior Data Architect (Contract)
Company: Emperen Technologies
Location: India (Remote)
Type: Contract (8-12 months)
Experience: 8-12 years

About Emperen Technologies:
Emperen Technologies is a dynamic consulting firm founded in 2010, dedicated to delivering impactful results for clients through strong, collaborative partnerships. We specialize in implementing strategic visions for Fortune 500 companies, non-profit organizations, and innovative startups. Our client-centric, values-driven approach, coupled with a scalable business model, empowers clients to navigate and thrive in the ever-evolving technology landscape. We are committed to building a team of top-tier talent who are inspired by our vision and driven to excellence.

Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work as a strong individual contributor and a good team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted 5 days ago

Apply

8.0 - 12.0 years

19 - 22 Lacs

Pune

Remote

Job Description: Senior Data Architect (Contract)
Company: Emperen Technologies
Location: India (Remote)
Type: Contract (8-12 months)
Experience: 8-12 years

About Emperen Technologies:
Emperen Technologies is a dynamic consulting firm founded in 2010, dedicated to delivering impactful results for clients through strong, collaborative partnerships. We specialize in implementing strategic visions for Fortune 500 companies, non-profit organizations, and innovative startups. Our client-centric, values-driven approach, coupled with a scalable business model, empowers clients to navigate and thrive in the ever-evolving technology landscape. We are committed to building a team of top-tier talent who are inspired by our vision and driven to excellence.

Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work as a strong individual contributor and a good team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted 5 days ago

Apply

8.0 - 12.0 years

19 - 22 Lacs

Surat

Remote

Job Description: Senior Data Architect (Contract)
Company: Emperen Technologies
Location: India (Remote)
Type: Contract (8-12 months)
Experience: 8-12 years

Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work as a strong individual contributor and a good team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted 5 days ago

Apply

8.0 - 12.0 years

19 - 22 Lacs

Kolkata

Remote

Job Description: Senior Data Architect (Contract)
Company: Emperen Technologies
Location: India (Remote)
Type: Contract (8-12 months)
Experience: 8-12 years

About Emperen Technologies:
Emperen Technologies is a dynamic consulting firm founded in 2010, dedicated to delivering impactful results for clients through strong, collaborative partnerships. We specialize in implementing strategic visions for Fortune 500 companies, non-profit organizations, and innovative startups. Our client-centric, values-driven approach, coupled with a scalable business model, empowers clients to navigate and thrive in the ever-evolving technology landscape. We are committed to building a team of top-tier talent who are inspired by our vision and driven to excellence.

Role Overview:
We are seeking a highly skilled and experienced Senior Data Architect to join our team on a contract basis. This role will be pivotal in designing and implementing robust data architectures, ensuring data governance, and driving data-driven insights. The ideal candidate will possess deep expertise in MS Dynamics, data lake architecture, ETL processes, data modeling, and data integration. You will collaborate closely with stakeholders to understand their data needs and translate them into scalable and efficient solutions.

Responsibilities:

Data Architecture Design and Development:
- Design and implement comprehensive data architectures, including data lakes, data warehouses, and data integration strategies.
- Develop and maintain conceptual, logical, and physical data models.
- Define and enforce data standards, policies, and procedures.
- Evaluate and select appropriate data technologies and tools.
- Ensure scalability, performance, and security of data architectures.

MS Dynamics and Data Lake Integration:
- Lead the integration of MS Dynamics with data lake environments.
- Design and implement data pipelines for efficient data movement between systems.
- Troubleshoot and resolve integration issues.
- Optimize data flow and performance within the integrated environment.

ETL and Data Integration:
- Design, develop, and implement ETL processes for data extraction, transformation, and loading.
- Ensure data quality and consistency throughout the integration process.
- Develop and maintain data integration documentation.
- Implement data validation and error-handling mechanisms.

Data Modeling and Data Governance:
- Develop and maintain data models that align with business requirements.
- Implement and enforce data governance policies and procedures.
- Ensure data security and compliance with relevant regulations.
- Establish and maintain data dictionaries and metadata repositories.

Issue Resolution and Troubleshooting:
- Proactively identify and resolve architectural issues.
- Conduct root cause analysis and implement corrective actions.
- Provide technical guidance and support to development teams.
- Communicate issues and risks proactively.

Collaboration and Communication:
- Collaborate with stakeholders to understand data requirements and translate them into technical solutions.
- Communicate effectively with technical and non-technical audiences.
- Participate in design reviews and code reviews.
- Work as a strong individual contributor and a good team player.

Qualifications:

Experience:
- 8-12 years of hands-on experience in data architecture and related fields.
- Minimum 4 years of experience in architectural design and integration.
- Experience working with cloud-based data solutions.

Technical Skills:
- Strong expertise in MS Dynamics and data lake architecture.
- Proficiency in ETL tools and techniques (e.g., Azure Data Factory, SSIS).
- Expertise in data modeling techniques (e.g., dimensional modeling, relational modeling).
- Strong understanding of data warehousing concepts and best practices.
- Proficiency in SQL and other data query languages.
- Experience with data quality assurance and data governance.
- Experience with cloud platforms such as Azure or AWS.

Soft Skills:
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to work independently and as part of a team.
- Flexible and adaptable to changing priorities.
- Proactive and self-motivated.
- Ability to deal with ambiguity.
- Open to continuous learning.
- Self-confident and humble.
- Intelligent, rigorous thinker who can operate successfully amongst bright people.

Posted 5 days ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Duration: 6 Months
Timings: General IST
Notice Period: within 15 days or immediate joiner

About The Role:
As a Data Engineer for the Data Science team, you will play a pivotal role in enriching and maintaining the organization's central repository of datasets. This repository serves as the backbone for advanced data analytics and machine learning applications, enabling actionable insights from financial and market data. You will work closely with cross-functional teams to design and implement robust ETL pipelines that automate data updates and ensure accessibility across the organization. This is a critical role requiring technical expertise in building scalable data pipelines, ensuring data quality, and supporting data analytics and reporting infrastructure for business growth.

Note: You must be ready for a face-to-face interview in Bangalore (last round) and should be working with Azure as the cloud technology.

Key Responsibilities:

ETL Development:
- Design, develop, and maintain efficient ETL processes for handling multi-scale datasets.
- Implement and optimize data transformation and validation processes to ensure data accuracy and consistency.
- Collaborate with cross-functional teams to gather data requirements and translate business logic into ETL workflows.

Data Pipeline Architecture:
- Architect, build, and maintain scalable, high-performance data pipelines to enable seamless data flow.
- Evaluate and implement modern technologies to enhance the efficiency and reliability of data pipelines.
- Build pipelines for extracting data via web scraping to source sector-specific datasets on an ad hoc basis.

Data Modeling:
- Design and implement data models to support analytics and reporting needs across teams.
- Optimize database structures to enhance performance and scalability.

Data Quality and Governance:
- Develop and implement data quality checks and governance processes to ensure data integrity.
- Collaborate with stakeholders to define and enforce data quality standards across the organization.

Documentation and Communication:
- Maintain detailed documentation of ETL processes, data models, and other key workflows.
- Effectively communicate complex technical concepts to non-technical stakeholders and business teams.

Collaboration:
- Work closely with the Quant team and developers to design and optimize data pipelines.
- Collaborate with external stakeholders to understand business requirements and translate them into technical solutions.

Essential Requirements - Basic Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Familiarity with big data technologies like Hadoop, Spark, and Kafka.
- Experience with data modeling tools and techniques.
- Excellent problem-solving, analytical, and communication skills.
- Proven experience as a Data Engineer with expertise in ETL techniques.
- 3-6 years of strong programming experience in languages such as Python, Java, or Scala.
- Hands-on experience in web scraping to extract and transform data from publicly available web sources.
- Proficiency with cloud-based data platforms such as AWS, Azure, or GCP.
- Strong knowledge of SQL and experience with relational and non-relational databases.
- Deep understanding of data warehousing concepts.

Preferred Qualifications:
- Master's degree in Computer Science or Data Science.
- Knowledge of data streaming and real-time processing frameworks.
- Familiarity with data governance and security best practices.
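Purely as an illustration of the data-quality-check responsibility this posting describes, here is a minimal Python sketch of an ETL validation step that quarantines bad records instead of loading them. Every field name and rule below is invented for the example, not taken from the role:

```python
# Toy ETL validation step: records that fail declared rules are
# quarantined rather than loaded. Fields and rules are illustrative.

REQUIRED_FIELDS = {"ticker", "close_price", "trade_date"}

def validate(record):
    """Return a list of rule violations for one record (empty = clean)."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    price = record.get("close_price")
    if price is not None and (not isinstance(price, (int, float)) or price < 0):
        errors.append("close_price must be a non-negative number")
    return errors

def split_batch(records):
    """Partition a batch into (clean, rejected) for load vs. quarantine."""
    clean, rejected = [], []
    for rec in records:
        errs = validate(rec)
        if errs:
            rejected.append((rec, errs))  # keep the reasons for auditing
        else:
            clean.append(rec)
    return clean, rejected

clean, rejected = split_batch([
    {"ticker": "ABC", "close_price": 101.5, "trade_date": "2024-01-02"},
    {"ticker": "XYZ", "close_price": -4.0, "trade_date": "2024-01-02"},
    {"close_price": 10.0},
])
```

In a real pipeline the rejected list would land in a quarantine table with its error reasons, which is what makes the validation auditable.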

Posted 5 days ago

Apply

5.0 - 9.0 years

0 Lacs

nagpur, maharashtra

On-site

As a Data Architect at our company, you will be responsible for designing scalable data architectures for web-based platforms or cloud-native systems. Your role will involve hands-on experience with relational and NoSQL databases such as PostgreSQL, MongoDB, and Cassandra. Additionally, you will work with cloud-based data services, data pipelines, and orchestration tools like Azure Data Services, AWS, GCP, Apache Airflow, and Azure Data Factory. In this role, you will have the opportunity to utilize your expertise in Big Data technologies including Spark, Kafka, and Delta Lake. A deep understanding of data modeling, ETL/ELT processes, and data lifecycle management will be crucial to your success in this position. Familiarity with cybersecurity, log/event data formats (e.g., syslog, JSON, STIX), and security telemetry is considered a strong advantage. Your responsibilities will include defining the data architecture and strategy for the CMP, ensuring alignment with product requirements and security standards. You will design and implement data models, data flows, and integration patterns for structured, semi-structured, and unstructured data. Collaboration with DevOps, engineering, and security teams will be essential to build scalable data pipelines and ensure real-time and batch processing capabilities. Moreover, you will be expected to select and integrate appropriate data storage and analytics technologies such as relational databases, data lakes, NoSQL, and time-series databases. Ensuring compliance with data governance, privacy, and security best practices will be a key aspect of your role. You will also establish data quality frameworks, metadata management, and lineage tracking to support analytics and reporting use cases with robust data architecture foundations. At our company, we offer a culture of caring where people come first. 
You will experience an inclusive culture of acceptance and belonging, building meaningful connections with collaborative teammates, supportive managers, and compassionate leaders. We are committed to your continuous learning and development, providing numerous opportunities to grow personally and professionally. You will have the chance to work on projects that matter, collaborating with clients globally to engineer impactful solutions. We believe in the importance of balance and flexibility, offering various career areas, roles, and work arrangements to help you achieve a work-life balance. As a high-trust organization, integrity is key, and you can trust GlobalLogic as a safe, reliable, and ethical global company. Join us in shaping the digital revolution, transforming businesses, and redefining industries through intelligent products, platforms, and services.

Posted 1 week ago

Apply

12.0 - 16.0 years

0 Lacs

haryana

On-site

You are invited to apply for the position of Global IT Data Architect-Senior Manager at a well-known management consulting firm in Gurgaon. With over 12 years of experience in the field, you will be responsible for leading data warehouse and database related projects, with a special emphasis on cloud databases like Snowflake and Redshift. Your role will involve designing Data Warehousing Architecture, BI/Analytical systems, Data cataloguing, and MDM. Your expertise in Conceptual, Logical, and Physical Data Modelling will be crucial for the success of the projects. Additionally, you will be expected to document all architecture-related work effectively and efficiently. Proficiency in data storage, ETL/ELT processes, and data analytics tools such as AWS Glue, DBT/Talend, FiveTran, APIs, Tableau, Power BI, and Alteryx is essential for this role. Moreover, experience with Cloud Big Data technologies like AWS, Azure, GCP, and Snowflake will be considered a strong asset. Experience in working with agile methodologies such as Scrum, Kanban, and Meta Scrum with cross-functional teams is advantageous. Your excellent written, oral communication, and presentation skills will be vital for effectively conveying architecture, features, and solution recommendations. To be considered for this position, you should hold a minimum of a Bachelor's degree in Computer Science, Engineering, or a related field. Additional certification in Data Management or cloud data platforms like Snowflake is preferred. If you meet these qualifications and are ready to take on this challenging role, please send your resume to leeba@mounttalent.com. Join us in shaping the future of IT data architecture and make a significant impact in the world of management consulting.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

0 Lacs

rajasthan

On-site

As a Data Architect in Georgia, you will be responsible for designing, creating, and managing data architecture to support the organization's data needs. With a minimum of 8 years and a maximum of 10 years of experience, you will utilize your expertise to develop data strategies, implement data models, and ensure data quality and integrity. Additionally, you will work closely with cross-functional teams to analyze data requirements, optimize data processes, and drive data-driven decision-making within the organization. A graduate qualification is required for this role.

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

pune, maharashtra

On-site

You should have over 10 years of experience in data architecture, data engineering, or related roles. Your expertise should include designing and implementing enterprise-level data solutions with a hands-on technical approach. You should have a proven track record of managing client relationships and leading technical teams. In terms of technical skills, you must be well-versed in data modeling, data warehousing, and database design, including both relational and NoSQL databases. You should have strong proficiency in data engineering, including experience with ETL tools, data integration frameworks, and big data technologies. Hands-on experience with the Google Cloud data platform and modern data processing frameworks is crucial. Moreover, familiarity with scripting and programming languages like Python and SQL for hands-on development and troubleshooting is essential. Experience with data governance frameworks and solutions such as Informatica, Collibra, Purview, etc., will be a plus. Soft skills required for this role include exceptional client management and communication skills to confidently interact with both technical and non-technical stakeholders. You should possess proven team management and leadership abilities, including mentoring, coaching, and project management. Strong analytical and problem-solving skills with a proactive, detail-oriented approach are necessary. The ability to work collaboratively in a fast-paced, dynamic environment while successfully driving multiple projects to completion is important. Preferred certifications for this position include Professional Cloud Architect (GCP), Data Architect, Certified Data Management Professional (CDMP), or similar credentials.

Posted 3 weeks ago

Apply

15.0 - 19.0 years

0 Lacs

hyderabad, telangana

On-site

As a Technical Lead / Data Architect, you will play a crucial role in our organization by leveraging your expertise in modern data architectures, cloud platforms, and analytics technologies. In this leadership position, you will be responsible for designing robust data solutions, guiding engineering teams, and ensuring successful project execution in collaboration with the project manager. Your key responsibilities will include architecting and designing end-to-end data solutions across multi-cloud environments such as AWS, Azure, and GCP. You will lead and mentor a team of data engineers, BI developers, and analysts to deliver on complex project deliverables. Additionally, you will define and enforce best practices in data engineering, data warehousing, and business intelligence. You will design scalable data pipelines using tools like Snowflake, dbt, Apache Spark, and Airflow, and act as a technical liaison with clients, providing strategic recommendations and maintaining strong relationships. To be successful in this role, you should have at least 15 years of experience in IT with a focus on data architecture, engineering, and cloud-based analytics. You must have expertise in multi-cloud environments and cloud-native technologies, along with deep knowledge of Snowflake, Data Warehousing, ETL/ELT pipelines, and BI platforms. Strong leadership and mentoring skills are essential, as well as excellent communication and interpersonal abilities to engage with both technical and non-technical stakeholders. In addition to the required qualifications, certifications in major cloud platforms and experience in enterprise data governance, security, and compliance are preferred. Familiarity with AI/ML pipeline integration would be a plus. We offer a collaborative work environment, opportunities to work with cutting-edge technologies and global clients, competitive salary and benefits, and continuous learning and professional development opportunities. 
Join us in driving innovation and excellence in data architecture and analytics.

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

As a senior-level Data Engineer with Machine Learning Analyst capabilities, you will play a crucial role in leading the architecture, development, and management of scalable data solutions. Your expertise in data architecture, big data pipeline development, and data quality enhancement will be key in processing large-scale datasets and supporting machine learning workflows. Your key responsibilities will include designing, developing, and maintaining end-to-end data pipelines for ingestion, transformation, and delivery across various business systems. You will ensure robust data quality, data lineage, data reconciliation, and governance practices. Additionally, you will architect and manage data warehouse and big data solutions supporting both structured and unstructured data. Optimizing and automating ETL/ELT processes for high-volume data environments will be essential, with a focus on processing 5B+ records. Collaborating with data scientists and analysts to support machine learning workflows and implementing streamlined DAAS workflows will also be part of your role. To succeed in this position, you must have at least 10 years of experience in data engineering, including data architecture and pipeline development. Your proven experience with Spark and Hadoop clusters for processing large-scale datasets, along with a strong understanding of ETL frameworks, data quality processes, and automation best practices, will be critical. Experience in data ingestion, lineage, governance, and reconciliation, as well as a solid understanding of data warehouse design principles and data modeling, are must-have skills. Expertise in automated data processing, especially for DAAS platforms, is essential. 
Desirable skills for this role include experience with Apache HBase, Apache NiFi, and other Big Data tools; knowledge of distributed computing principles and real-time data streaming; familiarity with machine learning pipelines and supporting data structures; and exposure to data cataloging and metadata management tools. Proficiency in Python, Scala, or Java for data engineering tasks is also beneficial. In addition to technical skills, soft skills such as a strong analytical and problem-solving mindset, excellent communication skills for collaboration across technical and business teams, and the ability to work independently, manage multiple priorities, and lead data initiatives are required. If you are excited about the opportunity to work as a Data Engineer with Machine Learning Analyst capabilities and possess the necessary skills and experience, we look forward to receiving your application.
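The data reconciliation practice this posting asks for (verifying that what was loaded matches what was extracted) can be illustrated with a toy source-vs-target check; the table contents and column names below are invented for the example:

```python
# Hypothetical reconciliation between a source extract and a warehouse
# load: compare row counts and a per-column checksum. XOR-folding the
# row hashes makes the checksum order-independent (a toy approach;
# identical duplicate values would cancel out in a real system).
from hashlib import sha256

def column_checksum(rows, key):
    """Order-independent checksum of one column across all rows."""
    digest = 0
    for row in rows:
        digest ^= int.from_bytes(sha256(repr(row[key]).encode()).digest()[:8], "big")
    return digest

def reconcile(source_rows, target_rows, keys):
    report = {"row_count_match": len(source_rows) == len(target_rows)}
    for key in keys:
        report[key] = column_checksum(source_rows, key) == column_checksum(target_rows, key)
    return report

src = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
tgt = [{"id": 2, "amount": 20}, {"id": 1, "amount": 10}]  # same data, different order
report = reconcile(src, tgt, ["id", "amount"])
```

At the scale the posting mentions (billions of records), the same idea is usually pushed down into the engine as aggregate queries rather than computed row by row in Python.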

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

hyderabad, telangana

On-site

Be a part of a dynamic team and excel in an environment that values diversity and creativity. Continue to sharpen your skills and ambition while pushing the industry forward. As a Data Architect at JPMorgan Chase within the Employee Platforms, you serve as a seasoned member of a team to develop high-quality data architecture solutions for various software applications and platforms. By incorporating leading best practices and collaborating with teams of architects, you are an integral part of carrying out critical technology solutions across multiple technical areas within various business functions in support of the firm's business objectives. In this role, you will be responsible for designing and implementing data models that support our organization's data strategy. You will work closely with Data Product Managers, Engineering teams, and Data Governance teams to ensure the delivery of high-quality data products that meet business needs and adhere to best practices.

Job responsibilities include:
- Executing data architecture solutions and technical troubleshooting with the ability to think beyond routine or conventional approaches to build solutions and break down problems.
- Collaborating with Data Product Managers to understand business requirements and translate them into data modeling specifications. Conducting interviews and workshops with stakeholders to gather detailed data requirements.
- Creating and maintaining data dictionaries, entity-relationship diagrams, and other documentation to support data models.
- Producing secure and high-quality production code and maintaining algorithms that run synchronously with appropriate systems.
- Evaluating data architecture designs and providing feedback on recommendations.
- Representing the team in architectural governance bodies.
- Leading the data architecture team in evaluating new technologies to modernize the architecture using existing data standards and frameworks.
- Gathering, analyzing, synthesizing, and developing visualizations and reporting from large, diverse data sets in service of continuous improvement of data frameworks, applications, and systems.
- Proactively identifying hidden problems and patterns in data and using these insights to drive improvements to coding hygiene and system architecture.
- Contributing to data architecture communities of practice and events that explore new and emerging technologies.

Required qualifications, capabilities, and skills:
- Formal training or certification in Data Architecture and 3+ years of applied experience.
- Hands-on experience in data platforms, cloud services (e.g., AWS, Azure, or Google Cloud), and big data technologies.
- Strong understanding of database management systems, data warehousing, and ETL processes.
- Excellent communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- Knowledge of data governance principles and best practices.
- Ability to evaluate current technologies to recommend ways to optimize data architecture.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming and database querying languages.
- Overall knowledge of the Software Development Life Cycle.
- Solid understanding of agile methodologies such as continuous integration and delivery, application resiliency, and security.
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.).

Preferred qualifications, capabilities, and skills:
- Experience with cloud-based data platforms (e.g., AWS, Azure, Google Cloud).
- Familiarity with big data technologies (e.g., Hadoop, Spark).
- Certification in data modeling or data architecture.
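One responsibility this posting names, maintaining data dictionaries alongside data models, can be sketched as schema-as-code so the dictionary is generated and diffed rather than hand-edited. The schema below is a made-up example, not JPMorgan's:

```python
# Illustrative schema-as-code: table/column definitions live in one
# structure, and the human-readable data dictionary is rendered from it.
SCHEMA = {
    "employee": {
        "employee_id": ("INTEGER", "Surrogate key, unique per employee"),
        "dept_id": ("INTEGER", "FK -> department.dept_id"),
        "hire_date": ("DATE", "Date the employee joined"),
    },
    "department": {
        "dept_id": ("INTEGER", "Surrogate key"),
        "name": ("VARCHAR(80)", "Display name of the department"),
    },
}

def render_dictionary(schema):
    """Render table.column | type | description lines per table."""
    lines = []
    for table, columns in schema.items():
        lines.append(f"# {table}")
        for col, (dtype, desc) in columns.items():
            lines.append(f"{table}.{col} | {dtype} | {desc}")
    return "\n".join(lines)

doc = render_dictionary(SCHEMA)
```

Keeping the descriptions next to the definitions is the point: a column cannot be added without the dictionary entry appearing in the same change.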

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

As an Associate Data Architect at Quantiphi, you will be part of a dynamic team that thrives on innovation and growth. Your role will involve designing and delivering big data pipelines for structured and unstructured data across diverse geographies, particularly focusing on assisting healthcare organizations in achieving their business objectives through the utilization of data ingestion technologies, cloud services, and DevOps practices. Your responsibilities will include collaborating with cloud engineers and clients to address large-scale data challenges by creating tools for migration, storage, and processing on Google Cloud. You will be instrumental in crafting cloud migration strategies for both cloud-based and on-premise applications, as well as diagnosing and resolving complex issues within distributed systems to enhance efficiency at scale. In this role, you will have the opportunity to design and implement cutting-edge solutions for data storage and computation for various clients. You will work closely with experts from different domains such as Cloud engineering, Software engineering, and ML engineering to develop platforms and applications that align with the evolving trends in the healthcare sector, including digital diagnosis, AI marketplace, and software as a medical product. Effective communication with cross-functional teams, including Infrastructure, Network, Engineering, DevOps, SiteOps, and cloud customers, will be essential to drive successful project outcomes. Additionally, you will play a key role in building advanced automation tools, monitoring solutions, and data operations frameworks across multiple cloud environments to streamline processes and enhance operational efficiency. A strong understanding of data modeling and governance principles will be crucial for this role, enabling you to contribute meaningfully to the development of scalable and sustainable data architectures. 
If you thrive in a fast-paced environment that values innovation, collaboration, and continuous learning, then a career as an Associate Data Architect at Quantiphi is the perfect fit for you. Join us and be part of a team of dedicated professionals who are passionate about driving positive change through technology and teamwork.

Posted 4 weeks ago

Apply

15.0 - 24.0 years

35 - 45 Lacs

Mumbai, Bengaluru, Mumbai (All Areas)

Work from Office

Greetings!!! This is regarding a job opportunity for a Data Architect with Datamatics Global Services Ltd.

Position: Data Architect
Website: https://www.datamatics.com/
Job Location: Mumbai (Andheri - Seepz) / Bangalore (Kalyani Neptune, Bannerghatta Road)

Job Overview:
We are seeking a Data Architect to lead end-to-end solutioning for enterprise data platforms while driving strategy, architecture, and innovation within our Data Center of Excellence (COE). This role requires deep expertise in Azure, Databricks, SQL, and Python, alongside strong pre-sales and advisory capabilities. The architect will serve as a trusted advisor, mentoring and guiding delivery teams, and defining scalable data strategies that align with business objectives.

Key Responsibilities:

Core Engineering - Data Architecture & Solutioning:
- Design and implement enterprise-wide data architectures, ensuring scalability, security, and performance.
- Lead end-to-end data solutioning, covering ingestion, transformation, governance, analytics, and visualization.
- Architect high-performance data pipelines leveraging Azure Data Factory, Databricks, SQL, and Python.
- Establish data governance frameworks, integrating Delta Lake, Azure Purview, and metadata management best practices.
- Optimize data models, indexing strategies, and high-volume query processing.
- Oversee data security, access controls, and compliance policies within cloud environments.
- Mentor engineering teams, guiding best practices in data architecture, pipeline development, and optimization.

Data COE & Thought Leadership:
- Define data architecture strategies, frameworks, and reusable assets for the Data COE.
- Drive best practices, standards, and innovation across data engineering and analytics teams.
- Act as a subject matter expert, shaping data strategy, scalability models, and governance frameworks.
- Lead data modernization efforts, advising on cloud migration, system optimization, and future-proofing architectures.
- Deliver technical mentorship, ensuring teams adopt cutting-edge data engineering techniques.
- Represent the Data COE in industry discussions, internal training, and thought leadership sessions.

Pre-Sales & Solution Advisory:
- Engage in pre-sales consulting, defining enterprise data strategies for prospects and existing customers.
- Craft solution designs and architecture blueprints, and contribute to proof-of-concept (PoC) implementations.
- Partner with sales and consulting teams to translate client needs into scalable data solutions.
- Provide strategic guidance on Azure, Databricks, and cloud adoption roadmaps.
- Present technical proposals and recommendations to executive stakeholders and customers.
- Stay ahead of emerging cloud data trends to enhance solution offerings.

Required Skills & Qualifications:
- 15+ years of experience in data architecture, engineering, and cloud data solutions.
- Proven expertise in Azure, Databricks, SQL, and Python as primary technologies.
- Proficiency in other relevant cloud and data engineering tools based on business needs.
- Deep knowledge of data governance, metadata management, and security policies.
- Strong pre-sales, consulting, and solution advisory experience in enterprise data platforms.
- Advanced skills in SQL optimization, data pipeline architecture, and high-scale analytics.
- Leadership experience in mentoring teams, defining best practices, and driving thought leadership.
- Expertise in Delta Lake, Azure Purview, and scalable data architectures.
- Strong stakeholder management skills across technical and business domains.

Preferred but Not Mandatory:
- Familiarity with Microsoft Fabric and Power BI data accessibility techniques.
- Hands-on experience with CI/CD for data pipelines, DevOps, and version control practices.

Additional Notes:
- The technologies listed above are primary but indicative.
- The candidate should have the flexibility to work with additional tools and platforms based on business needs.
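The "indexing strategies and high-volume query processing" responsibility in this posting can be illustrated generically with SQLite from the Python standard library (the role's stack is Azure/Databricks; SQLite is used here only because it is self-contained): the same query goes from a full table scan to an index lookup once a covering index exists.

```python
# Demonstrates why indexing strategy matters: EXPLAIN QUERY PLAN shows
# a full scan before the index and an index search after it.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT total FROM orders WHERE customer_id = ?"
plan_before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

# A covering index on (customer_id, total) serves the query entirely
# from the index, without touching the table rows.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id, total)")
plan_after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
```

The same reasoning carries over to warehouse engines, where the analogues are clustering keys, partitioning, and Z-ordering rather than B-tree indexes.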

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

kolkata, west bengal

On-site

You should have 3-4 years of experience in Data Integration and Data Transformation implementation, including business requirement gathering, design, configuration, data integration with an ETL tool, data testing and validation, and report development. Good documentation skills and data modelling experience are required. You will be the point of contact between the client and the technology development team. You should hold a BE/B.Tech or Master's degree. Strong BI functional and technical knowledge, data modelling, data architecture, ETL and reporting development, administration, performance tuning experience, and database and data warehousing knowledge are essential skills. Hands-on experience on at least 1-2 end-to-end ETL implementation projects is necessary. Strong knowledge and experience of EDW concepts and methodology are expected. Experience in client interaction and requirement gathering from clients is crucial. Knowledge of an ETL tool and multiple reporting/data visualization tools is an added advantage. Your responsibilities will include source system analysis, data analysis and profiling, creation of technical specifications, implementing process design and target data models, developing, testing, debugging, and documenting ETL and data integration processes, supporting existing applications and ETL processes, providing solutions to resolve departmental pain points, addressing performance or data quality issues, and creating and maintaining data integration processes for the Collections Analytics Program. As part of the Responsibility Framework, you are expected to Communicate with Impact & Empathy, Develop Self & Others through Coaching, Build & Sustain Relationships, Be Passionate about Client Service, Be Curious: Learn, Share & Innovate, and Be Open-Minded, Practical & Agile with Change. This ETL role is at the Mid to Senior Level in the IT industry, with 3-4 years of work experience required. The Annual CTC is open, with 3 vacancies available and a short notice period. The contact person for this job is TAG.

Posted 4 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

pune, maharashtra

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world. Experience in developing digital marketing/digital analytics solutions using Adobe products is essential for this role. You should have experience in Adobe Experience Cloud products and recent experience with Adobe Experience Platform or a similar CDP. Good knowledge of the Data Science workspace and building intelligent Services on AEP is required. You should also have a strong understanding of datasets in Adobe Experience Platform, including loading data into the Platform through data source connectors, APIs, and streaming ingestion connectors. Furthermore, experience in creating all required Adobe XDM (Experience Data Model) in JSON based on the approved data model for all loading data files is necessary. Knowledge on utilizing Adobe Experience Platform (AEP) UI & POSTMAN to automate all customer schema data lake & profile design setups within each sandbox environment is also expected. Additionally, you should have experience in configuration within Adobe Experience Platform for all necessary identities & privacy settings and creating new segments within AEP to meet customer use cases. It is important to be able to test/validate the segments with the required destinations. Managing customer data by using Real-Time Customer Data Platform (RTCDP) and analyzing customer data by using Customer Journey Analytics (CJA) are key responsibilities of this role. You are required to have experience with creating connections, data views, and dashboards in CJA. 
Hands-on experience in the configuration and integration of Adobe Marketing Cloud modules like Audience Manager, Analytics, Campaign, and Target is also essential. Adobe Experience Cloud tool certifications (Adobe Campaign, Adobe Experience Platform, Adobe Target, Adobe Analytics) are desirable for this position. Proven ability to communicate verbally and in writing in a high-performance, collaborative environment is expected. Experience with data analysis, modeling, and mapping to coordinate closely with Data Architect(s) is also a part of this role. At Capgemini, you can shape your career with a range of career paths and internal opportunities within the Capgemini group. Comprehensive wellness benefits, including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, or new parent support via flexible work, are provided. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications. Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. With over 55 years of heritage, Capgemini is trusted by clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions leveraging strengths from strategy and design to engineering, all fueled by market-leading capabilities in AI, generative AI, cloud, and data, combined with deep industry expertise and a partner ecosystem.
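The XDM-schema responsibility described in this posting can be sketched with a deliberately simplified JSON schema fragment and a required-field check. The structure below only mimics the shape of the task; the field names are invented and do not reflect Adobe's actual XDM classes or field groups:

```python
# Simplified, hypothetical XDM-style schema expressed as JSON, plus a
# check that a sample profile record carries the declared required fields.
import json

schema_json = """
{
  "title": "Retail Customer Profile",
  "type": "object",
  "required": ["personID", "emailAddress"],
  "properties": {
    "personID": {"type": "string"},
    "emailAddress": {"type": "string"},
    "loyaltyTier": {"type": "string"}
  }
}
"""

schema = json.loads(schema_json)

def missing_required(record, schema):
    """Fields the schema requires that the record does not supply."""
    return [f for f in schema["required"] if f not in record]

record = {"personID": "p-123", "loyaltyTier": "gold"}
gaps = missing_required(record, schema)
```

In practice this kind of check is what catches a data file that would fail ingestion before it is pushed to the Platform through a source connector or the API.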

Posted 4 weeks ago

Apply

12.0 - 19.0 years

17 - 32 Lacs

Pune

Work from Office

Must have: Python, Spark, and AWS. Good at problem solving, well-versed with overall project architecture, and hands-on coding experience.

Required Skills:
- Proficiency in multiple programming languages, ideally Python.
- Proficiency in at least one cluster computing framework (preferably Spark; alternatively Flink or Storm).
- Proficiency in at least one cloud data lakehouse platform (preferably AWS data lake services or Databricks; alternatively Hadoop), at least one relational data store (Postgres, Oracle, or similar), and at least one NoSQL data store (Cassandra, Dynamo, MongoDB, or similar).
- Proficiency in at least one scheduling/orchestration tool (preferably Airflow; alternatively AWS Step Functions or similar).
- Proficiency with data structures, data serialization formats (JSON, AVRO, Protobuf, or similar), big-data storage formats (Parquet, Iceberg, or similar), data processing methodologies (batch, micro-batching, and stream), one or more data modelling techniques (Dimensional, Data Vault, Kimball, Inmon, etc.), Agile methodology (developing PI plans and roadmaps), TDD (or BDD), and CI/CD tools (Jenkins, Git).
- Strong organizational, problem-solving, and critical thinking skills; strong documentation skills.

Preferred Skills:
- Proficiency in IaC (preferably Terraform; alternatively AWS CloudFormation).
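Of the data processing methodologies this posting lists (batch, micro-batching, and stream), micro-batching is easy to show in a few lines of plain Python: an unbounded-looking stream is consumed in fixed-size chunks instead of one monolithic load.

```python
# Toy micro-batching: consume a stream in fixed-size batches. This is
# the core idea behind, e.g., Spark Structured Streaming's trigger
# intervals, reduced to a generator.
from itertools import islice

def micro_batches(stream, batch_size):
    """Yield lists of up to batch_size items until the stream is exhausted."""
    it = iter(stream)
    while True:
        batch = list(islice(it, batch_size))
        if not batch:
            return
        yield batch

events = range(10)  # stand-in for an event stream
batches = list(micro_batches(events, 4))
```

Each yielded batch would be the unit of work handed to a transformation and checkpointed, which is what distinguishes micro-batching from true record-at-a-time streaming.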

Posted 1 month ago

Apply

4.0 - 9.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Be an essential element to a brighter future. We work together to transform essential resources into critical ingredients for mobility, energy, connectivity and health. Join our values-led organization committed to building a more resilient world with people and planet in mind. Our core values are the foundation that makes us successful for ourselves, our customers and the planet.

Overview
As part of the Global Data & Analytics Technology team within Corporate IT, the Enterprise Master Data Architect plays a strategic role in shaping and executing enterprise-wide master data initiatives. This role partners closely with business leaders, the Corporate Master Data Management team, and Business Relationship Managers to define and deliver scalable solutions using SAP Master Data Governance (MDG). We're looking for a forward-thinking architect with a strong blend of technical expertise and business acumen: someone who can balance innovation with execution and who thrives in a fast-paced, collaborative environment.

Key Responsibilities
- Collaborate with business stakeholders to define enterprise master data strategies and governance frameworks.
- Design and implement SAP MDG solutions that support the collection, processing, and stewardship of master data across domains.
- Lead the development and enforcement of data governance policies, standards, and best practices.
- Architect and deliver SAP-centric master data solutions that align with enterprise goals and compliance requirements.
- Provide technical leadership and mentorship to MDM team members and cross-functional partners.
- Ensure consistency, quality, and accessibility of master data across systems and business units.
- Drive continuous improvement in data architecture, modeling, and integration practices.

Qualifications
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- Proven experience designing and architecting enterprise Master Data solutions.
- 4+ years of hands-on experience with SAP MDG and SAP Data Architecture.
- Strong functional knowledge of master data domains: customer, vendor, product/material, and finance in S/4HANA or ECC.
- Experience with SAP Data Services and SAP Information Steward for data conversion, quality, and cleansing.
- Proficiency in defining systems strategy, requirements gathering, prototyping, testing, and deployment.
- Strong configuration and solution design skills.
- ABAP development experience required, including custom enhancements and data modeling.
- Experience with SAP S/4HANA 2021 or later preferred.
- Excellent communication, collaboration, and time management skills.
- Ability to lead cross-functional teams and manage multiple priorities in a dynamic environment.

Benefits of Joining Albemarle
- Competitive compensation
- Comprehensive benefits package
- A diverse array of resources to support you professionally and personally

We are partners to one another in pioneering new ways to be better for ourselves, our teams, and our communities. When you join Albemarle, you become our most essential element, and you can anticipate competitive compensation, a comprehensive benefits package, and resources that foster your well-being and fuel your personal growth. Help us shape the future, build with purpose and grow together.
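The data conversion, quality, and cleansing work this role describes normally happens inside SAP Data Services and Information Steward, but the underlying idea can be shown tool-agnostically. Below is a minimal Python sketch of a vendor-master deduplication rule; the field names (`vendor_id`, `name`, `tax_id`) are invented for illustration and are not an SAP schema.

```python
# Minimal sketch of a master-data cleansing rule: normalize vendor names
# and flag records that share a tax ID or normalized name as duplicate
# candidates. Field names are illustrative, not an SAP vendor-master schema.

def normalize_name(name: str) -> str:
    """Uppercase, strip punctuation, and drop common legal suffixes for matching."""
    cleaned = "".join(ch for ch in name.upper() if ch.isalnum() or ch == " ")
    tokens = [t for t in cleaned.split() if t not in {"LTD", "LLC", "INC", "GMBH"}]
    return " ".join(tokens)

def find_duplicate_candidates(vendors: list) -> list:
    """Return pairs of vendor_ids that share a tax ID or a normalized name."""
    seen = {}
    pairs = []
    for v in vendors:
        for key in (v["tax_id"], normalize_name(v["name"])):
            if key in seen and seen[key] != v["vendor_id"]:
                pairs.append((seen[key], v["vendor_id"]))
            else:
                seen[key] = v["vendor_id"]
    return pairs

vendors = [
    {"vendor_id": "V001", "name": "Acme Ltd.", "tax_id": "TX-1"},
    {"vendor_id": "V002", "name": "ACME LTD", "tax_id": "TX-2"},
    {"vendor_id": "V003", "name": "Globex Inc", "tax_id": "TX-3"},
]
print(find_duplicate_candidates(vendors))  # → [('V001', 'V002')]
```

In practice a rule like this would be one step in a governed cleansing pipeline, with matches routed to a data steward rather than merged automatically.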

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

maharashtra

On-site

Choosing Capgemini means choosing a company where you will be empowered to shape your career in the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

Your Role
- Develop chatbots using Dialogflow CX.
- Architect and build AI-powered chatbot applications using platforms such as Dialogflow CX.
- Design and develop user-centric conversation experiences involving chat, text, or voice.
- Labels used by Conversational Architects in Conversation Nodes can be treated as Pages in Dialogflow.

Your Profile
- Sound knowledge of cloud platforms (GCP/AWS/Azure).
- Experience integrating APIs using NodeJS and Python.
- Ability to construct intents, entities, and annotations in the Dialogflow tool.
- Ability to write API documentation outlining the endpoints customers need to implement CCAI on their end.
- Liaise with the customer and Data Architect on use-case requirements and API technical requirements.

What you'll love about working here
You can shape your career with us. We offer a range of career paths and internal opportunities within the Capgemini group, and you will get personalized career guidance from our leaders. You will receive comprehensive wellness benefits including health checks, telemedicine, insurance with top-ups, elder care, partner coverage, and new-parent support via flexible work. You will have the opportunity to learn on one of the industry's largest digital learning platforms, with access to 250,000+ courses and numerous certifications.

About Capgemini
Capgemini is a global business and technology transformation partner, helping organizations accelerate their dual transition to a digital and sustainable world while creating tangible impact for enterprises and society. It is a responsible and diverse group of 340,000 team members in more than 50 countries. With its strong over-55-year heritage, Capgemini is trusted by its clients to unlock the value of technology to address the entire breadth of their business needs. It delivers end-to-end services and solutions, from strategy and design to engineering, fueled by its market-leading capabilities in AI, generative AI, cloud and data, combined with deep industry expertise and a partner ecosystem.
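The intent and entity work this role describes ultimately surfaces as webhook fulfillment. As a hedged illustration, a Python handler might build its Dialogflow CX webhook response as below; the JSON shape (`fulfillmentResponse` / `messages` / `text`) follows the documented CX WebhookResponse format, but field names should be verified against the current reference, and the `order-status` tag is an invented example.

```python
# Sketch of a Dialogflow CX webhook fulfillment handler.
# JSON key names follow the CX WebhookResponse format; verify against
# current Google documentation before relying on them.
from typing import Optional

def build_cx_response(reply_text: str, parameters: Optional[dict] = None) -> dict:
    """Build a WebhookResponse dict with one text message and optional
    session parameters to pass back to the agent."""
    response = {
        "fulfillmentResponse": {
            "messages": [{"text": {"text": [reply_text]}}]
        }
    }
    if parameters:
        response["sessionInfo"] = {"parameters": parameters}
    return response

def handle_webhook(request_body: dict) -> dict:
    """Route on the fulfillment tag configured in the CX console (tag is hypothetical)."""
    tag = request_body.get("fulfillmentInfo", {}).get("tag", "")
    if tag == "order-status":
        return build_cx_response("Your order is on its way.",
                                 {"order_checked": True})
    return build_cx_response("Sorry, I can't help with that yet.")

print(handle_webhook({"fulfillmentInfo": {"tag": "order-status"}}))
```

In a deployed agent this handler would sit behind an HTTPS endpoint (for example a Cloud Function) registered as the webhook for the relevant Pages.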

Posted 1 month ago

Apply

2.0 - 18.0 years

12 - 16 Lacs

Hyderabad

Work from Office

Career Category: Engineering
Role Name: IS Architecture
Job Posting Title: Data Architect
Workday Job Profile: Principal IS Architect
Department Name: Digital, Technology & Innovation
Role GCF: 06A

About Amgen
Amgen harnesses the best of biology and technology to fight the world's toughest diseases and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

About the Role
The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Architect is a senior-level position responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of foundational data. This role will manage a team of Data Modelers.

Roles & Responsibilities
- Provide oversight to data modeling team members.
- Develop and maintain conceptual, logical, and physical data models to support business needs.
- Establish and enforce data standards, governance policies, and best practices.
- Design and manage metadata structures to enhance information retrieval and usability.
- Maintain comprehensive documentation of the architecture, including principles, standards, and models.
- Evaluate and recommend technologies and tools that best fit the solution requirements.
- Evaluate emerging technologies and assess their potential impact.
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency.

Basic Qualifications and Experience [GCF Level 6A]
- Doctorate degree and 2 years of experience in Computer Science, IT or a related field, OR
- Master's degree with 8-10 years of experience in Computer Science, IT or a related field, OR
- Bachelor's degree with 10-14 years of experience in Computer Science, IT or a related field, OR
- Diploma with 14-18 years of experience in Computer Science, IT or a related field.

Must-Have Skills
- Data Modeling: expert in creating conceptual, logical, and physical data models to represent information structures, with the ability to interview business subject matter experts and develop data models that serve their analysis needs.
- Metadata Management: knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality.
- Information Governance: familiarity with policies and procedures for managing information assets, including security, privacy, and compliance.
- Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, SparkSQL), including performance tuning of big data processing.

Good-to-Have Skills
- Experience with graph technologies such as Stardog, AllegroGraph, and MarkLogic.

Professional Certifications
- Certifications in Databricks are desired.

Soft Skills
- Excellent critical-thinking and problem-solving skills.
- Strong communication and collaboration skills.
- Demonstrated ability to function in a team setting.
- Demonstrated presentation skills.

Shift Information
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work evening or night shifts, as required based on business needs.

Equal Opportunity Statement
We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
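The modeling responsibilities in this posting span conceptual, logical, and physical layers. As a tool-agnostic sketch (entity and column names are invented for illustration, not an Amgen model), a logical model can be described declaratively and compiled to physical DDL:

```python
# Sketch: a tiny logical model described as data, compiled into physical
# CREATE TABLE statements. Entities and columns are illustrative only.

LOGICAL_MODEL = {
    "patient": {
        "patient_id": "BIGINT PRIMARY KEY",
        "birth_date": "DATE",
    },
    "prescription": {
        "rx_id": "BIGINT PRIMARY KEY",
        "patient_id": "BIGINT REFERENCES patient(patient_id)",
        "drug_code": "VARCHAR(32)",
    },
}

def to_ddl(model: dict) -> list:
    """Render each logical entity as one physical CREATE TABLE statement."""
    statements = []
    for entity, columns in model.items():
        cols = ",\n  ".join(f"{name} {dtype}" for name, dtype in columns.items())
        statements.append(f"CREATE TABLE {entity} (\n  {cols}\n);")
    return statements

for stmt in to_ddl(LOGICAL_MODEL):
    print(stmt)
```

Keeping the logical model as data rather than hand-written DDL is one way to keep conceptual, logical, and physical layers in sync; dedicated modeling tools perform the same translation with richer metadata.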

Posted 1 month ago

Apply

8.0 - 14.0 years

30 - 35 Lacs

Hyderabad

Work from Office

Role Description
The role is responsible for developing and maintaining the data architecture of the Enterprise Data Fabric. Data architecture includes the activities required for data flow design, data modeling, physical data design, and query performance optimization. The Data Architect is a senior-level position responsible for developing business information models by studying the business, our data, and the industry. This role involves creating data models to realize a connected data ecosystem that empowers consumers. The Data Architect drives cross-functional data interoperability, enables efficient decision-making, and supports AI usage of foundational data. This role will manage a team of Data Modelers.

Roles & Responsibilities
- Provide oversight to data modeling team members.
- Develop and maintain conceptual, logical, and physical data models to support business needs.
- Establish and enforce data standards, governance policies, and best practices.
- Design and manage metadata structures to enhance information retrieval and usability.
- Maintain comprehensive documentation of the architecture, including principles, standards, and models.
- Evaluate and recommend technologies and tools that best fit the solution requirements.
- Evaluate emerging technologies and assess their potential impact.
- Drive continuous improvement in the architecture by identifying opportunities for innovation and efficiency.

Basic Qualifications and Experience [GCF Level 6A]
- Doctorate degree and 2 years of experience in Computer Science, IT or a related field, OR
- Master's degree with 8-10 years of experience in Computer Science, IT or a related field, OR
- Bachelor's degree with 10-14 years of experience in Computer Science, IT or a related field, OR
- Diploma with 14-18 years of experience in Computer Science, IT or a related field.

Must-Have Skills
- Data Modeling: expert in creating conceptual, logical, and physical data models to represent information structures, with the ability to interview business subject matter experts and develop data models that serve their analysis needs.
- Metadata Management: knowledge of metadata standards, taxonomies, and ontologies to ensure data consistency and quality.
- Information Governance: familiarity with policies and procedures for managing information assets, including security, privacy, and compliance.
- Hands-on experience with big data technologies and platforms such as Databricks and Apache Spark (PySpark, SparkSQL), including performance tuning of big data processing.

Good-to-Have Skills
- Experience with graph technologies such as Stardog, AllegroGraph, and MarkLogic.

Professional Certifications
- Certifications in Databricks are desired.

Soft Skills
- Excellent critical-thinking and problem-solving skills.
- Strong communication and collaboration skills.
- Demonstrated ability to function in a team setting.
- Demonstrated presentation skills.

Shift Information
This position requires you to work a later shift and may be assigned a second or third shift schedule. Candidates must be willing and able to work evening or night shifts, as required based on business needs.

Equal Opportunity Statement
We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
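The metadata management skill this posting calls out can be illustrated outside any specific platform. Below is a hedged Python sketch (glossary terms and table names are invented) that checks every physical column against a governed business glossary, the kind of consistency rule a metadata structure is meant to enforce:

```python
# Sketch: verify that every physical column is backed by a governed
# business-glossary term. Glossary and tables are illustrative only.

GLOSSARY = {
    "patient_id": "Unique identifier of a patient",
    "drug_code": "Standardized code of a prescribed drug",
}

def ungoverned_columns(table_columns: dict) -> dict:
    """Return {table: [columns missing a glossary term]} for stewardship review."""
    gaps = {}
    for table, columns in table_columns.items():
        missing = [c for c in columns if c not in GLOSSARY]
        if missing:
            gaps[table] = missing
    return gaps

tables = {
    "prescription": ["patient_id", "drug_code", "internal_flag"],
    "patient": ["patient_id"],
}
print(ungoverned_columns(tables))  # → {'prescription': ['internal_flag']}
```

Catalog tools automate this mapping at scale, but the check itself reduces to exactly this kind of set comparison between physical metadata and the governed vocabulary.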

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies