
1817 Data Architecture Jobs - Page 36


11.0 - 18.0 years

30 - 45 Lacs

Hyderabad

Work from Office

Role & responsibilities
We are looking for an experienced Data Architect with deep expertise in Snowflake technologies to lead the design, development, and deployment of scalable data architectures. This role involves building robust data pipelines, optimizing data warehouses, and supporting complex data migrations, ensuring data quality, security, and governance across all layers.

Preferred candidate profile
- Data Modeling: Star, Snowflake, Data Vault, hybrid schemas, partitioning, clustering
- Databases: Snowflake, Oracle, SQL Server, Greenplum, PostgreSQL
- ETL/ELT Tools: Informatica IDMC, DataStage
- Big Data Tools: Hadoop, Hive
- Cloud Integration: AWS services (S3, EC2, Lambda, Glue)
- Programming Languages: Python, PySpark
- Schedulers: Control-M
- Data Security: RBAC, data masking, encryption, audit trails, compliance (HIPAA, GDPR)
- Automation: Advanced SQL, API integration, DevOps practices
- Data Governance: Data quality, lineage, cataloging, MDM, metadata management

Posted 1 month ago


7.0 - 12.0 years

20 - 35 Lacs

Hyderabad

Work from Office

Exciting opportunity for a Delivery Lead Data Architect to join a high-growth analytics environment. You will be responsible for leading end-to-end technical delivery across data platforms, ensuring robust architecture, performance, and cross-functional alignment.

Location: Gurugram (Hybrid)

Your Future Employer
A reputed analytics-driven organization focused on delivering innovative and scalable data solutions, known for its inclusive work culture and continuous learning environment.

Responsibilities
1) Leading design and delivery across the complete SDLC for data and analytics projects
2) Translating business requirements into scalable data architecture and models
3) Collaborating with engineering, BI, testing, and support teams for smooth execution
4) Guiding development of ETL pipelines and reporting workflows
5) Mentoring the team on best practices in data modeling, engineering, and architecture
6) Driving client communication, estimations, and technical workshops

Requirements
1) Bachelor's or Master's in Computer Science, IT, or a related field
2) 6+ years of experience in data architecture and delivery leadership
3) Proficiency in SQL, Python, data modeling, and ETL tools
4) Experience with cloud platforms (AWS, Azure, or GCP) and Power BI
5) Strong understanding of SDLC, DevOps, and managed services delivery
6) Excellent communication, stakeholder management, and team leadership skills

What's in it for you
1) Opportunity to lead enterprise-level data programs
2) Work across modern cloud-native technologies
3) Competitive compensation with growth opportunities
4) Inclusive and collaborative work environment

Posted 1 month ago


5.0 - 10.0 years

5 - 8 Lacs

Hyderabad

Work from Office

Key Responsibilities:
- Design, develop, and maintain QlikView applications and dashboards.
- Collaborate with business stakeholders to gather requirements and translate them into technical specifications.
- Perform data analysis and create data models to support business intelligence initiatives.
- Optimize QlikView applications for performance and scalability.
- Provide technical support and troubleshooting for QlikView applications.
- Ensure data accuracy and integrity in all QlikView applications.
- Integrate Snowflake with QlikView to enhance data processing and analytics capabilities.
- Stay updated with the latest QlikView features and best practices.
- Conduct training sessions for end users to maximize the utilization of QlikView applications.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 2-5 years of proven experience as a QlikView Developer.
- Strong knowledge of QlikView architecture, data modeling, and scripting.
- Proficiency in SQL and database management.
- Knowledge of Snowflake and its integration with QlikView.
- Excellent analytical and problem-solving skills.
- Ability to work independently and as part of a team.
- Strong communication and interpersonal skills.

Posted 1 month ago


9.0 - 14.0 years

10 - 20 Lacs

Mumbai, Bengaluru

Work from Office

Greetings from Future Focus Infotech!
We have multiple opportunities for a Data Architect.
Exp: 9+ years
Location: Mumbai / Bangalore
Job Type: This is a permanent position with Future Focus Infotech Pvt Ltd, and you will be deputed to our client. (Company URL: www.focusinfotech.com)

If you are interested in the above opportunity, send your updated CV along with the details below to reema.b@focusinfotech.com:
Total Years of Experience:
Current CTC:
Expected CTC:
Notice Period:
Current Location:
Available for interview on weekdays:
PAN Card:

Thanks & Regards,
Reema
reema.b@focusinfotech.com
8925798887

Posted 1 month ago


6.0 - 8.0 years

8 - 10 Lacs

Mumbai

Work from Office

Design and implement data architecture and models for Big Data solutions using MapR and Hadoop ecosystems. You will optimize data storage, ensure data scalability, and manage complex data workflows. Expertise in Big Data, Hadoop, and MapR architecture is required for this position.

Posted 1 month ago


4.0 - 6.0 years

6 - 8 Lacs

Hyderabad

Work from Office

Design and implement data architectures and models, focusing on data warehouses and Snowflake-based environments. Ensure that data is structured for efficient querying and analysis, aligning with business goals and performance requirements.

Posted 1 month ago


10.0 - 17.0 years

12 - 22 Lacs

Gurugram

Work from Office

We know the importance that food plays in people's lives and the power it has to bring people, families and communities together. Our purpose is to bring enjoyment to people's lives through great tasting food, in a way which reflects our values. McCain has recently committed to implementing regenerative agriculture practices across 100 percent of our potato acreage by 2030. Ask us more about our commitment to sustainability.

OVERVIEW
McCain is embarking on a digital transformation. As part of this transformation, we are making significant investments in our data platforms, common data models, data structures and data policies to increase the quality of our data and the confidence of our business teams in using this data to make better decisions and drive value. We have a new and ambitious global Digital & Data group, which serves as a resource to the business teams in our regions and global functions. We are currently recruiting an experienced Data Architect to build the enterprise data model for McCain.

JOB PURPOSE:
Reporting to the Data Architect Lead, the Global Data Architect will take a lead role in creating the enterprise data model for McCain Foods, bringing together data assets across agriculture, manufacturing, supply chain and commercial. This data model will be the foundation for our analytics program, which seeks to combine McCain's industry-leading operational data sets with third-party data sets to drive world-class analytics. Working with a diverse team of data governance experts, data integration architects, data engineers and our analytics team including data scientists, you will play a key role in creating the conceptual, logical and physical data models that underpin the Global Digital & Data team's activities.

JOB RESPONSIBILITIES:
- Develop an understanding of McCain's key data assets and work with the data governance team to document key data sets in our enterprise data catalog
- Work with business stakeholders to build a conceptual business model by understanding end-to-end business processes, challenges, and future business plans
- Collaborate with application architects to bring the analytics point of view into the design of end-user applications
- Develop the logical data model based on the business model and align it with business teams
- Work with technical teams to build the physical data model and data lineage, and keep all relevant documentation current
- Develop a process to manage all models and the appropriate controls
- With a use-case-driven approach, enhance and expand the enterprise data model based on legacy on-premises analytics products and new cloud data products, including advanced analytics models
- Design key enterprise conformed dimensions and ensure they are understood across data engineering teams (including third parties); keep the data catalog and wiki tools current
- Serve as the primary point of contact for new Digital and IT programs, ensuring alignment with the enterprise data model
- Play a key role in shaping McCain's cloud migration strategy, enabling advanced analytics and world-leading Business Intelligence
- Work in close collaboration with data engineers, ensuring data modeling best practices are followed

MEASURES OF SUCCESS:
- Demonstrated history of driving change in a large, global organization
- A true passion for well-structured and well-governed data; you know, and can explain to others, the real business risk of too many mapping tables
- You live for a well-designed and well-structured conformed dimension table
- Focus on use-case-driven prioritization; you are comfortable pushing business teams for requirements that connect to business value, and able to challenge requirements that will not achieve the business's goals
- Developing data models that are not just elegant but truly optimized for analytics, both advanced analytics use cases and dashboarding/BI tools
- A coaching mindset wherever you go, including with the business, data engineers and other architects
- An infectious enthusiasm for learning: about our business, deepening your technical knowledge and meeting our teams
- A "get things done" attitude: roll up your sleeves when necessary; work with and through others as needed

KEY QUALIFICATIONS & EXPERIENCE:
Data Design and Governance
- At least 5 years of experience with data modeling to support business processes
- Ability to design complex data models connecting internal and external data
- At least 8 years of experience with requirements analysis; experience working with business stakeholders on data design
- Experience working with real-time data
- Ability to draft accurate documentation that supports the project management effort and coding
- Nice to have: ability to profile data for data quality requirements
- Nice to have: experience with data catalog tools

Technical Skills
- At least 5 years of experience designing and working in data warehouse solutions and building data models; preference for S/4HANA knowledge
- At least 2 years of experience with visualization tools, preferably Power BI or similar
- At least 2 years designing and working in cloud data warehouse solutions; preference for Azure Databricks, Azure Synapse or earlier Microsoft solutions
- Experience with Visio, PowerDesigner, or similar data modeling tools
- Experience working in Azure or a similar cloud environment
- Must have: ability to develop SQL queries for assessing, manipulating, and accessing data stored in relational databases; hands-on experience in PySpark and Python
- Nice to have: experience with data profiling tools such as Informatica, Collibra or similar data quality tools
- Nice to have: working experience with MDX
- Nice to have: ability to understand and work with unstructured data
- Nice to have: at least one successful enterprise-wide cloud migration as the data architect or data modeler, mainly focused on building data models
- Nice to have: experience with manufacturing / digital manufacturing
- Nice to have: experience designing enterprise data models for analytics, specifically in a Power BI environment
- Nice to have: experience with machine learning model design (Python preferred)

Behaviors and Attitudes
- Comfortable working with ambiguity and defining a way forward; experience challenging current ways of working
- A documented history of successfully driving projects to completion
- Excellent interpersonal and communication skills
- Attention to detail
- Comfortable leading others through change

Posted 1 month ago


5.0 - 10.0 years

17 - 30 Lacs

Bengaluru

Work from Office

Position Overview
We are seeking a skilled Data Engineer to design, build, and maintain scalable data pipelines and infrastructure. The ideal candidate will have expertise in Data Architecture, Data Modeling, Data Governance, PySpark, and Databricks to support our data-driven initiatives. You will collaborate with data scientists, analysts, and business teams to ensure high-quality, reliable, and secure data solutions.

Key Responsibilities
- Design and implement scalable data pipelines for batch and real-time processing.
- Develop and maintain data models to support analytics and business intelligence.
- Ensure data governance best practices, including data quality, lineage, and compliance.
- Optimize and manage Databricks environments for efficient data processing.
- Write and optimize PySpark jobs for large-scale data transformations.
- Collaborate with stakeholders to define data architecture standards and best practices.
- Automate data workflows and improve ETL/ELT processes.
- Troubleshoot and resolve data-related issues in production environments.

Required Skills & Qualifications
Must-Have:
- Strong experience in Data Architecture and Data Modeling (relational, dimensional, NoSQL).
- Hands-on experience with PySpark for big data processing.
- Proficiency in Azure Databricks (or Databricks on other clouds).
- Knowledge of Data Governance frameworks (metadata management, data lineage, security).
- Expertise in SQL and experience with cloud data platforms (Azure, AWS, or GCP).
- Familiarity with ETL/ELT tools and workflow orchestration (e.g., Airflow, Azure Data Factory).
Nice-to-Have:
- Experience with Delta Lake, Snowflake, or Synapse.
- Knowledge of CI/CD pipelines for data engineering.
- Understanding of machine learning data pipelines.
- Certifications in Databricks, Azure Data Engineer, or AWS Big Data.

Posted 1 month ago


14.0 - 21.0 years

40 - 60 Lacs

Noida, Gurugram, Bengaluru

Hybrid

The opportunity
We're looking for an Associate Director, Data & AI Strategy. The main objective of the role is to develop and articulate a comprehensive and forward-looking Data & AI strategy that aligns with the overall business strategy and objectives. Skills: Data Engineering, Data Strategy & Governance with Go-to-Market.

Skills and attributes for success
- 15-17 years of total experience, with 10+ years in Data Strategy and architecture and 7+ years in AI strategy and implementation
- Strong knowledge of data architecture, data models and cutover strategies using industry-standard tools and technologies
- Architecture design and implementation experience with medium-to-complex on-prem-to-cloud migrations on any of the major cloud platforms (preferably AWS/Azure/GCP)
- 10+ years of solid hands-on professional experience creating and implementing data science engagements and helping create AI/ML products
- Proven track record of implementing machine-learning solutions, development in multiple languages and statistical analysis
- 7+ years of experience with Azure database offerings (relational, NoSQL, data warehouse)
- 7+ years of hands-on experience with various Azure services, preferably Azure Data Factory, Kafka, Azure Data Explorer, Storage, Azure Data Lake, Azure Synapse Analytics, Azure Analysis Services and Databricks
- Minimum of 8 years of hands-on database design, modeling and integration experience with relational data sources such as SQL Server, Oracle/MySQL, Azure SQL and Azure Synapse
- Familiarity with machine learning frameworks (like Keras or PyTorch) and libraries (like scikit-learn)
- Strong creative instincts related to data analysis and visualization, and an aggressive curiosity to learn the business methodology, data model and user personas
- Strong understanding of BI and DWH best practices, analysis, visualization, and the latest trends
- Experience with the software development lifecycle (SDLC) and principles of product development such as installation, upgrade and namespace management
- Willingness to mentor team members
- Solid analytical, technical and problem-solving skills
- Excellent written and verbal communication skills
- Strong project and people management skills, with experience serving global clients

To qualify for the role, you must have
- Master's degree in Computer Science, Business Administration or equivalent work experience
- A fact-driven, analytical mindset with excellent attention to detail
- Hands-on experience with data engineering tasks such as building analytical data records, and experience manipulating and analysing large volumes of data
- Relevant work experience of 15 to 17 years in a Big 4 or technology/consulting setup
- Ability to help incubate new finance analytics products by executing pilot and proof-of-concept projects to establish capabilities and credibility with users and clients; this may entail working either as an independent SME or as part of a larger team

Ideally, you'll also have
- Ability to think strategically and end to end, with a results-oriented mindset
- Ability to build rapport within the firm and win the trust of clients
- Willingness to travel extensively and to work at client sites / practice office locations
- Strong experience in SQL Server and MS Excel, plus at least one other SQL dialect (e.g. MS Access, PostgreSQL, Oracle PL/SQL, MySQL); strong in data structures and algorithms
- Experience interfacing with databases such as Azure databases, SQL Server, Oracle, Teradata, etc.
- Preferred exposure to JSON, Cloud Foundry, Pivotal, MATLAB, Spark, Greenplum, Cassandra, Amazon Web Services, Microsoft Azure, Google Cloud, Informatica, AngularJS, Python, etc.

What we look for
A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment. An opportunity to be part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY SaT practices globally with leading businesses across a range of industries.

What we offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations (Argentina, China, India, the Philippines, Poland and the UK) and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.

Posted 1 month ago


15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: Data Engineering
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to ensure the successful implementation of software solutions, performing maintenance and enhancements, and contributing to the overall development process. You will be responsible for delivering high-quality code while adhering to best practices and project timelines, ensuring that the applications meet the needs of the clients effectively.

Roles & Responsibilities:
- Collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills:
- Must-have: proficiency in Data Modeling Techniques and Methodologies.
- Good-to-have: experience with Data Engineering.
- Strong understanding of database design principles and data architecture.
- Experience with data integration and ETL processes.
- Familiarity with data warehousing concepts and technologies.
- Good understanding of the IFW industry data model; conversant with industry-standard data modeling tools.
- Information modeling on master data domains (e.g. Party, Agreement, Location) plus a canonical message model.
- Maintaining the logical and physical data models for a Master Data Management solution.
- Maintaining and updating the Data Element Catalog for consumers as applicable.
- Supporting business and technical stakeholders with any information required on data models and related artifacts.

Additional Information:
- The candidate should have a minimum of 7 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 month ago


10.0 - 15.0 years

1 - 6 Lacs

Hyderabad

Hybrid

Role: GCP Data Architect
Experience: 10+ years
Work location: Hyderabad (hybrid work from office)
Notice period: Immediate joiners to 30 days max (preference for candidates who can join within 15 days)
Shift timing: 2:30 PM to 11:30 PM (IST)

We are looking for a GCP Data Architect with deep technical expertise in cloud-native data platforms and architecture, who also brings experience in building practices/CoEs and engaging in pre-sales solutioning. This role will be instrumental in driving cloud data transformations for enterprise clients, shaping reusable accelerators, and leading conversations with key stakeholders.

Required Skills & Qualifications:
- 10+ years of overall experience, with 3+ years as an architect on GCP data platforms.
- Expertise in BigQuery, Cloud Storage, Dataflow, Pub/Sub, Looker, Data Catalog, and IAM.
- Hands-on experience with Terraform or similar IaC tools.
- Proficiency in Python, SQL, or Apache Beam for data processing pipelines.
- Solid grasp of data governance, security, lineage, and compliance frameworks.
- Strong understanding of hybrid/multi-cloud data strategies, data lakehouses, and real-time analytics.
- Demonstrated ability to build internal CoEs or practices with reusable assets and frameworks.
- Experience collaborating with CXOs, data leaders, and enterprise architects.
- Strong communication skills, written and verbal, for stakeholder and customer engagement.

Preferred Certifications:
- Google Cloud Professional Cloud Architect
- Google Cloud Professional Data Engineer

Posted 1 month ago


15.0 - 20.0 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, while also performing maintenance and enhancements to existing applications. You will be responsible for delivering high-quality code and contributing to the overall success of the projects you are involved in, ensuring that all components function seamlessly together.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities and foster a culture of continuous improvement.
- Mentor junior professionals to help them develop their skills and grow within the organization.

Professional & Technical Skills:
- Must-have: proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of database design principles and data architecture.
- Experience with data integration and ETL processes.
- Familiarity with data warehousing concepts and technologies.
- Ability to analyze and optimize data models for performance and scalability.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based in Mumbai.
- 15 years of full-time education is required.

Posted 1 month ago


15.0 - 20.0 years

4 - 8 Lacs

Hyderabad

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to ensure the successful execution of projects, performing maintenance and enhancements, and contributing to the development of innovative solutions that meet client needs. You will be responsible for managing your tasks effectively while ensuring high-quality deliverables and maintaining a focus on continuous improvement.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities and foster a culture of learning.
- Monitor project progress and provide timely updates to stakeholders to ensure alignment with project goals.

Professional & Technical Skills:
- Must-have: proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with performance tuning and optimization of data processing workflows.
- Familiarity with data quality and data governance principles.
- Ability to troubleshoot and resolve technical issues in a timely manner.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Ab Initio.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.

Posted 1 month ago


15.0 - 20.0 years

5 - 9 Lacs

Chennai

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in a dynamic environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to ensure the successful execution of projects, addressing challenges that arise during the development process, and contributing to the overall improvement of application performance and functionality. You will also be responsible for maintaining existing systems while implementing enhancements to meet evolving client needs, ensuring that the software solutions are robust and efficient.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills:
- Must-have: proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of database design principles and data architecture.
- Experience with data integration and ETL processes.
- Familiarity with data warehousing concepts and technologies.
- Ability to analyze and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based in Chennai.
- 15 years of full-time education is required.

Posted 1 month ago


15.0 - 20.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As a Software Development Engineer, you will engage in a variety of tasks that involve analyzing, designing, coding, and testing multiple components of application code across various clients. Your typical day will include collaborating with team members to ensure the successful implementation of software solutions, performing maintenance and enhancements, and contributing to the overall development work of the project. You will be responsible for ensuring that the application meets the required standards and functions effectively, while also addressing any issues that may arise during the development process.

Roles & Responsibilities:
- Act as an SME; collaborate with and manage the team to perform.
- Be responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills:
- Must-have: proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of database design principles and data architecture.
- Experience with data integration and ETL processes.
- Familiarity with data warehousing concepts and technologies.
- Ability to analyze and optimize data models for performance.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Data Modeling Techniques and Methodologies.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 1 month ago


15.0 - 20.0 years

5 - 9 Lacs

Pune

Work from Office

Project Role : Application Developer Project Role Description : Design, build and configure applications to meet business process and application requirements. Must have skills : Data Modeling Techniques and Methodologies Good to have skills : NAMinimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary :As a Software Development Engineer, you will engage in a variety of tasks that involve analyzing, designing, coding, and testing multiple components of application code across various clients. Your typical day will include collaborating with team members to ensure the quality and functionality of the software, addressing any issues that arise, and implementing enhancements to improve overall performance. You will also be responsible for maintaining existing applications and contributing to the development of new features, ensuring that all work aligns with project goals and client requirements. Roles & Responsibilities:- Expected to be an SME, collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Facilitate knowledge sharing sessions to enhance team capabilities and foster a culture of continuous improvement.- Monitor project progress and provide regular updates to stakeholders to ensure alignment with project objectives. Professional & Technical Skills: - Must To Have Skills: Proficiency in Data Modeling Techniques and Methodologies.- Strong understanding of database design principles and data architecture.- Experience with data integration and ETL processes.- Familiarity with data warehousing concepts and technologies.- Ability to work with various data modeling tools and software. 
Additional Information: The candidate should have a minimum of 5 years of experience in Data Modeling Techniques and Methodologies. This position is based at our Pune office. Qualification: 15 years of full-time education.

Posted 1 month ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Pune

Work from Office

Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to ensure the successful implementation of software solutions, performing maintenance and enhancements, and contributing to the overall development process. You will be responsible for delivering high-quality code while adhering to best practices and project timelines, ensuring that the applications meet the needs of the clients effectively.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills:
- Must-have: Proficiency in Ab Initio.
- Strong understanding of data integration and ETL processes.
- Experience with performance tuning and optimization of data processing.
- Familiarity with data warehousing concepts and methodologies.
- Ability to troubleshoot and resolve complex technical issues.

Additional Information: The candidate should have a minimum of 7.5 years of experience in Ab Initio. This position is based at our Pune office. Qualification: 15 years of full-time education.

Posted 1 month ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As an Application Developer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your typical day will involve collaborating with team members to ensure the successful implementation of software solutions, performing maintenance and enhancements, and contributing to the overall development process. You will be responsible for delivering high-quality code while adhering to best practices and project timelines, ensuring that the applications meet the needs of the clients effectively.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Continuously evaluate and improve development processes to increase efficiency.

Professional & Technical Skills:
- Must-have: Proficiency in Data Modeling Techniques and Methodologies.
- Strong understanding of database design principles and data architecture.
- Experience with data integration and ETL processes.
- Familiarity with data warehousing concepts and technologies.
- Ability to analyze and optimize data models for performance.

Additional Information: The candidate should have a minimum of 5 years of experience in Data Modeling Techniques and Methodologies. This position is based at our Hyderabad office. Qualification: 15 years of full-time education.

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Snowflake Data Warehouse, Manual Testing
Good-to-have skills: Data Engineering
Minimum 12 year(s) of experience is required.
Educational Qualification: 15 years full time education

Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the overall data architecture. You will be involved in various stages of the data platform lifecycle, ensuring that all components work harmoniously to support the organization's data needs and objectives. Your role will require you to navigate complex data environments, providing insights and recommendations that drive effective data management and governance practices.
Key Responsibilities:
a. Overall 12+ years of data experience, including 5+ years on Snowflake and 3+ years on DBT (Core and Cloud).
b. Played a key role in DBT-related discussions with teams and clients to understand business problems and solutioning requirements.
c. As a DBT SME, liaise with clients on business/technology/data transformation programs; orchestrate the implementation of planned initiatives and the realization of business outcomes.
d. Spearhead the team to translate business goals/challenges into practical data transformation and technology roadmaps and data architecture designs.
e. Strong experience in designing, architecting and managing (admin) Snowflake solutions and deploying data analytics solutions in Snowflake.
f. Strong inclination for practice building, including spearheading thought-leadership discussions and managing team activities.

Technical Experience:
a. Strong experience working as a Snowflake-on-cloud DBT Data Architect with thorough knowledge of the different services.
b. Ability to architect solutions from on-prem to cloud and create end-to-end data pipelines using DBT.
c. Excellent process knowledge in one or more of the following areas: Finance, Healthcare, Customer Experience.
d. Experience working on client proposals (RFPs), estimation, and POCs/POVs on new Snowflake features.
e. DBT (Core and Cloud) end-to-end migration experience, including refactoring SQL for modularity, DBT modeling (.sql or .py files) and dbt job scheduling on at least 2 projects.
f. Knowledge of the Jinja template language (macros) would be an added advantage.
g. Knowledge of special features such as DBT documentation, semantic layer creation, webhooks, etc.
h. DBT and cloud certification is important.
i. Develop, fine-tune, and integrate LLM models (OpenAI, Anthropic, Mistral, etc.) into enterprise workflows via Cortex AI.
j. Deploy AI agents capable of reasoning, tool use, chaining, and task orchestration for knowledge retrieval and decision support.
k. Guide the creation and management of GenAI assets such as prompts, embeddings, semantic indexes, agents, and custom bots.
l. Collaborate with data engineers, ML engineers, and the leadership team to translate business use cases into GenAI-driven solutions.
m. Provide mentorship and technical leadership to a small team of engineers working on GenAI initiatives.
n. Stay current with advancements in Snowflake, LLMs, and generative AI frameworks to continuously enhance solution capabilities.
o. Should have a good understanding of SQL and Python; the architectural concepts of Snowflake should also be clear.

Professional Attributes:
a. Client management, stakeholder management, collaboration, interpersonal and relationship-building skills.
b. Ability to create innovative solutions for key business challenges.
c. Eagerness to learn and develop oneself on an ongoing basis.
d. Structured communication: written, verbal and presentational.

Educational Qualification:
a. MBA (Technology/Data-related specializations)/MCA/advanced degrees in STEM.

Qualification: 15 years full time education
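The migration item above mentions refactoring SQL for modularity in DBT models. The idea can be illustrated with a stdlib-only Python sketch that mimics how dbt resolves model references so each transformation lives in its own small model (real dbt compiles Jinja `{{ ref('...') }}` placeholders in `.sql` files; the model names and placeholder syntax here are purely illustrative):

```python
# Illustrative sketch of dbt-style model composition: each model is a small
# SQL snippet, and downstream models reference upstream ones by name instead
# of duplicating their SQL. Model names and the {ref_...} syntax are made up.
MODELS = {
    "stg_orders": "SELECT id, amount FROM raw.orders",
    "fct_revenue": "SELECT SUM(amount) AS revenue FROM {ref_stg_orders}",
}

def compile_model(name: str) -> str:
    """Expand every {ref_<model>} placeholder into that model's SQL as a subquery."""
    sql = MODELS[name]
    for other in MODELS:
        sql = sql.replace("{ref_" + other + "}", f"({MODELS[other]})")
    return sql

print(compile_model("fct_revenue"))
```

Because `fct_revenue` only names `stg_orders` rather than inlining it, the staging logic can be refactored once and every downstream model picks up the change, which is the modularity the posting asks for.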

Posted 1 month ago

Apply

8.0 - 13.0 years

32 - 45 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Title: Data Architect
Location: Bangalore, Hyderabad, Chennai, Pune, Gurgaon (hybrid, 2-3 days WFO)
Experience: 8+ years

Position Overview: We are seeking a highly skilled and strategic Data Architect to design, build, and maintain the organization's data architecture. The ideal candidate will be responsible for aligning data solutions with business needs, ensuring data integrity, and enabling scalable and efficient data flows across the enterprise. This role requires deep expertise in data modeling, data integration, cloud data platforms, and governance practices.

Key Responsibilities:
- Architectural Design: Define and implement enterprise data architecture strategies, including data warehousing, data lakes, and real-time data systems.
- Data Modeling: Develop and maintain logical, physical, and conceptual data models to support analytics, reporting, and operational systems.
- Platform Management: Select and oversee implementation of cloud and on-premises data platforms (e.g., Snowflake, Redshift, BigQuery, Azure Synapse, Databricks).
- Integration & ETL: Design robust ETL/ELT pipelines and data integration frameworks using tools such as Apache Airflow, Informatica, dbt, or native cloud services.
- Data Governance: Collaborate with stakeholders to implement data quality, data lineage, metadata management, and security best practices.
- Collaboration: Work closely with data engineers, analysts, software developers, and business teams to ensure seamless and secure data access.
- Performance Optimization: Tune databases, queries, and storage strategies for performance, scalability, and cost-efficiency.
- Documentation: Maintain comprehensive documentation for data structures, standards, and architectural decisions.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- 5+ years of experience in data architecture, data engineering, or database development.
- Strong expertise in data modeling, relational and NoSQL databases (e.g., PostgreSQL, MySQL, MongoDB, Cassandra).
- Experience with modern data platforms and cloud ecosystems (AWS, Azure, or GCP).
- Hands-on experience with data warehousing solutions and tools (e.g., Snowflake, Redshift, BigQuery).
- Proficiency in SQL and data scripting languages (e.g., Python, Scala).
- Familiarity with data privacy regulations (e.g., GDPR, HIPAA) and security standards.

Tech Stack:
- AWS Cloud: S3, EC2, EMR, Lambda, IAM; Snowflake DB
- Databricks: Spark/PySpark, Python
- Good knowledge of Bedrock and Mistral AI
- RAG & NLP: LangChain and LangRAG
- LLMs: Anthropic Claude, Mistral, LLaMA, etc.
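The governance requirement above (GDPR/HIPAA familiarity) typically includes pseudonymizing PII before it leaves a secure zone. A minimal stdlib-only sketch of deterministic masking, so the same input always yields the same token and joins still work downstream (the key and field names are illustrative; in practice the key would live in a secrets manager):

```python
import hashlib
import hmac

# Hypothetical masking key -- in a real pipeline this comes from a secrets
# manager and is rotated, never hard-coded.
SECRET = b"rotate-me"

def mask_email(email: str) -> str:
    """Deterministically pseudonymize the local part of an email with a keyed
    HMAC, keeping the domain so aggregate analysis by domain still works."""
    local, _, domain = email.partition("@")
    digest = hmac.new(SECRET, local.encode(), hashlib.sha256).hexdigest()[:12]
    return f"{digest}@{domain}"

print(mask_email("jane.doe@example.com"))
```

Keyed hashing (HMAC) rather than a plain hash matters here: without the key, common email addresses could be re-identified by brute-forcing the hash.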

Posted 1 month ago

Apply

12.0 - 15.0 years

45 - 50 Lacs

Bengaluru

Hybrid

Azure Data Architect with 12+ years of experience in Azure Databricks, Power BI, ETL, ADF, SQL, and Data Lakes. Skilled in cloud data architecture, reporting, data pipelines, and governance. Azure certification preferred. Strong leadership and Agile experience.

Posted 1 month ago

Apply

8.0 - 10.0 years

4 - 7 Lacs

Chandigarh, New Delhi, Puducherry

Work from Office

Minimum 8 to 10 years of experience on eHANA/Native HANA applications.
- Must have good hands-on experience with eHANA/Native HANA.
- Expertise in eHANA object development, including HANA graphical calculation views, SQL-scripted views, scalar and table functions, and stored procedures.
- Strong experience with SQLScript, PL/SQL, stored procedures, function calls, designing tables, table functions and HANA views (calc views).
- Apply advanced data modelling techniques, including currency conversion, variables and input parameters.
- Implement decision automation using business rules.
- Knowledge of the concept of authorization in SAP HANA, and of implementing a security model using analytic privileges, SQL privileges, pre-defined roles, and schemas.
- Evaluate the impact of different implementation options such as table joins, aggregation, or filters.
- Good to have knowledge of data provisioning techniques (SLT/BODS).
- Should adhere to data architecture, modelling and coding guidelines.
- Preparation of design documents and/or technical documents.
- Able to identify performance bottlenecks in HANA objects and tune those objects to improve performance.
- Provide and develop best practices for application development on the eHANA platform.
- Design applications to deal with real-time reporting requirements.
- Provide scalable solutions for large data volumes (terabytes of data) and complex reporting needs.
- Perform hands-on analysis, design and deployment of solutions.
- Monitor, investigate and optimize data models and reporting performance on SAP HANA.
- Understand the performance implications of the various reporting tools and connection types.
- An advantage will be given for working experience with reporting tools such as SAP BusinessObjects (WebI, AO) and SAC; Tableau experience is also preferred.

Location: Chandigarh, Dadra & Nagar Haveli, Hyderabad, Jammu, Lakshadweep, New Delhi, Puducherry, Daman, Diu, Goa, Sikkim

Posted 1 month ago

Apply

6.0 - 10.0 years

4 - 8 Lacs

Pune

Work from Office

Position Overview/Summary: The Data Engineer will expand and optimize the data and data-pipeline architecture, as well as optimize data flow and collection for cross-functional teams. The Data Engineer will perform data architecture analysis, design, development and testing to deliver data applications, services, interfaces, ETL processes, reporting, and other workflow and management initiatives. The role will also follow modern SDLC principles, test-driven development, source code reviews, and change control standards in order to maintain compliance with policies. This role requires a highly motivated individual with strong technical ability, data capability, and excellent communication and collaboration skills, including the ability to develop and troubleshoot a diverse range of problems.

Responsibilities: Design and develop enterprise data architecture solutions using Hadoop and other data technologies such as Spark and Scala.
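The ETL work described above boils down to three stages: extract raw records, transform (validate and type-cast) them, and load the survivors into a store. A minimal stdlib-only sketch under assumed data (the `orders` table and the sample CSV are invented for illustration; a real pipeline would run on Spark or a warehouse, not SQLite):

```python
import csv
import io
import sqlite3

# Invented sample input: one record has a malformed amount on purpose.
raw = "order_id,amount\n1,100.5\n2,not_a_number\n3,42\n"

def extract(text: str) -> list[dict]:
    """Parse the raw CSV feed into dict rows."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows: list[dict]) -> list[tuple]:
    """Type-cast fields, dropping rows that fail validation."""
    clean = []
    for r in rows:
        try:
            clean.append((int(r["order_id"]), float(r["amount"])))
        except ValueError:
            continue  # a real pipeline would quarantine these for review
    return clean

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load validated rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders(order_id INTEGER, amount REAL)")
    conn.executemany("INSERT INTO orders VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone())  # (2, 142.5)
```

Keeping the three stages as separate functions is what makes the pipeline testable under the SDLC and test-driven-development practices the posting calls for.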

Posted 1 month ago

Apply

5.0 - 10.0 years

11 - 15 Lacs

Hyderabad

Work from Office

Stellantis is seeking a passionate, innovative, results-oriented Information Communication Technology (ICT) Manufacturing AWS Cloud Architect to join the team. As a Cloud Architect, the selected candidate will leverage business analysis, data management, and data engineering skills to develop sustainable data tools supporting Stellantis's Manufacturing Portfolio Planning. This role will collaborate closely with data analysts and business intelligence developers within the Product Development IT Data Insights team.

Job responsibilities include but are not limited to:
- Having deep expertise in the design, creation, management, and business use of large datasets across a variety of data platforms
- Assembling large, complex sets of data that meet non-functional and functional business requirements
- Identifying, designing and implementing internal process improvements, including re-designing infrastructure for greater scalability, optimizing data delivery, and automating manual processes
- Building the infrastructure required for optimal extraction, transformation and loading of data from various data sources using AWS, cloud and other SQL technologies
- Working with stakeholders to support their data infrastructure needs while assisting with data-related technical issues
- Maintaining high-quality ontology and metadata of data systems
- Establishing a strong relationship with the central BI/data engineering COE to ensure alignment in leveraging corporate-standard technologies, processes, and reusable data models
- Ensuring data security and developing traceable procedures for user access to data systems

Qualifications, Experience and Competency

Education: Bachelor's or Master's degree in Computer Science, or a related IT-focused degree

Experience (Essential):
- Overall 10-15 years of IT experience
- Develop, automate and maintain the build of AWS components and operating systems
- Work with application and architecture teams to conduct proofs of concept (POC) and implement the design in a production environment in AWS
- Migrate and transform existing workloads from on-premise to AWS
- Minimum 5 years of experience in data engineering or data architecture: concepts, approach, data lakes, data extraction, data transformation
- Proficient in ETL optimization; designing, coding, and tuning big data processes using Apache Spark or similar technologies
- Experience operating very large data warehouses or data lakes
- Investigate and develop new microservices and features using the latest technology stacks from AWS
- Self-starter with the desire and ability to quickly learn new technologies
- Strong interpersonal skills with the ability to communicate and build relationships at all levels
- Hands-on experience with AWS cloud technologies such as S3, AWS Glue, Glue Catalog, Athena, AWS Lambda, AWS DMS, PySpark, and Snowflake
- Experience building data pipelines and applications to stream and process large datasets at low latencies

Desirable:
- Familiarity with data analytics, engineering processes and technologies
- Ability to work successfully within a global and cross-functional team
- A passion for technology. We are looking for someone who is keen to leverage their existing skills while trying new approaches, and to share that knowledge with others to help grow the data and analytics teams at Stellantis to their full potential!

Specific Skill Requirements: AWS services (Glue, DMS, EC2, RDS, S3, VPCs and all core services, Lambda, API Gateway, CloudFormation, CloudWatch, Route 53, Athena, IAM), SQL, Qlik Sense, Python/Spark, and ETL optimization.

If you are interested, please share the below details along with your updated resume: First Name, Last Name, Date of Birth, Passport No. and Expiry Date, Alternate Contact Number, Total Experience, Relevant Experience, Current CTC, Expected CTC, Current Location, Preferred Location, Current Organization, Payroll Company, Notice Period, Holding Any Offer
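The Glue/Athena stack mentioned above relies on Hive-style key layouts in S3 so that crawlers and query engines can prune data by partition. A small sketch of building such keys (bucket prefix, table name, and file name are illustrative; this uses no AWS SDK, only string formatting):

```python
from datetime import date

def partitioned_key(prefix: str, table: str, d: date, filename: str) -> str:
    """Build a Hive-style partitioned S3 key (year=/month=/day=),
    the layout AWS Glue crawlers and Athena recognize as partitions."""
    return (f"{prefix}/{table}/"
            f"year={d.year:04d}/month={d.month:02d}/day={d.day:02d}/{filename}")

key = partitioned_key("raw", "orders", date(2024, 7, 5), "part-0000.parquet")
print(key)  # raw/orders/year=2024/month=07/day=05/part-0000.parquet
```

Writing every file under such keys means a query filtered on `year`/`month`/`day` scans only the matching prefixes, which is where most of the Athena cost and latency savings come from.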

Posted 1 month ago

Apply

7.0 - 10.0 years

20 - 25 Lacs

Pune

Work from Office

Role & Responsibilities:
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, stored procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use cases.
- Drive governance and compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications

Must-Have:
- 7+ years of data engineering/warehousing experience, including 4+ years of hands-on Snowflake design and development.
- Expert-level SQL plus strong data modeling (Dimensional, Data Vault) and ETL/ELT optimization skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor's in Computer Science, Engineering, or Information Systems (Master's preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred:
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion, dbt, Airflow, and Git.
- Snowflake certifications (SnowPro Core/Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
