
176 Dimensional Modeling Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 - 6.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Your Role

We are seeking an experienced Data Engineer with strong expertise in SQL, Spark SQL, Databricks (on Azure or AWS), Unity Catalog, and PySpark to design, build, and optimize modern data solutions. The ideal candidate will also bring in-depth knowledge of data warehousing concepts and best practices to support scalable, high-performance data platforms. In this role, you will play a key part in designing and optimizing these data solutions.

Requirements:
- 4-10 years of experience in Data Engineering / ETL Development.
- Strong expertise in SQL and Spark SQL (complex queries, optimization, performance tuning).
- Hands-on experience with Databricks on Azure or AWS (Delta Lake, Lakehouse).
- Proficiency in PySpark for data processing.
- Experience with Unity Catalog for data governance, security, and access management.
- Solid understanding of data warehousing principles, dimensional modeling, and best practices.
- Knowledge of Azure Data Services (ADLS, ADF, Synapse) or AWS Data Services (S3, Glue, Redshift, Athena, etc.) is a plus.

Your Profile
- Strong customer orientation, decision-making, problem-solving, communication, and presentation skills.
- Very good judgement and the ability to shape compelling solutions and solve unstructured problems with assumptions.
- Very good collaboration skills and the ability to interact with multicultural and multifunctional teams spread across geographies.
- Strong executive presence and spirit.
- Superb leadership and team-building skills, with the ability to build consensus and achieve goals through collaboration rather than direct line authority.
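Dimensional modeling, which this and most of the listings below ask for, centers on a star schema: a fact table of measures joined to denormalized dimension tables. A minimal sketch using SQLite as a stand-in for a warehouse engine such as Databricks or Redshift (all table and column names are invented for illustration):

```python
import sqlite3

# In-memory database standing in for a warehouse engine.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Star schema: one fact table of measures, keyed to denormalized dimensions.
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, name TEXT, category TEXT);
CREATE TABLE dim_date (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE fact_sales (
    product_key INTEGER REFERENCES dim_product(product_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    quantity    INTEGER,
    amount      REAL
);
""")
cur.executemany("INSERT INTO dim_product VALUES (?,?,?)",
                [(1, "Widget", "Hardware"), (2, "Gizmo", "Hardware"), (3, "Ebook", "Media")])
cur.executemany("INSERT INTO dim_date VALUES (?,?,?)",
                [(20240101, "2024-01-01", 2024), (20240102, "2024-01-02", 2024)])
cur.executemany("INSERT INTO fact_sales VALUES (?,?,?,?)",
                [(1, 20240101, 2, 20.0), (2, 20240101, 1, 35.0), (3, 20240102, 5, 50.0)])

# Typical analytic query: measures come from the fact table,
# descriptive labels come from the dimensions.
rows = cur.execute("""
    SELECT p.category, d.year, SUM(f.amount) AS revenue
    FROM fact_sales f
    JOIN dim_product p ON p.product_key = f.product_key
    JOIN dim_date d    ON d.date_key = f.date_key
    GROUP BY p.category, d.year
    ORDER BY revenue DESC
""").fetchall()
# rows -> [('Hardware', 2024, 55.0), ('Media', 2024, 50.0)]
```

The same shape carries over to Spark SQL or Synapse; only the engine and DDL dialect change.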

Posted 1 day ago

Apply

10.0 - 15.0 years

7 - 11 Lacs

Bengaluru

Work from Office

SymphonyAI is a global leader in AI-driven enterprise applications, transforming industries with cutting-edge artificial intelligence and machine learning solutions. We empower organizations across retail, CPG, financial services, manufacturing, media, enterprise IT, and the public sector by delivering data-driven insights that drive business value. Headquartered in Palo Alto, California, SymphonyAI has a wide range of products and a strong global presence, with operations in North America, Southeast Asia, the Middle East, and India. The company is dedicated to fostering a high-performance culture and maintaining its position as one of the largest and fastest-growing AI portfolios in the industry.

Job Description

About the Role: We are looking for a Data Warehouse Engineer with strong expertise across the Azure Data Platform to design, build, and maintain modern data warehouse and analytics solutions. This role requires hands-on experience with Azure Synapse Analytics, Data Factory, Data Lake, Azure Analysis Services, and Power BI. The ideal candidate will ensure seamless data ingestion, storage, transformation, analysis, and visualization, enabling the business to make data-driven decisions. 10-15 years of experience are expected.

Key Responsibilities

Data Ingestion & Orchestration:
- Design and build scalable ingestion pipelines using Azure Data Factory.
- Integrate data from multiple sources (SQL Server, relational databases, Azure SQL DB, Cosmos DB, Table Storage).
- Manage batch and real-time ingestion into Azure Data Lake Storage.

Data Storage & Modelling:
- Develop and optimize data warehouse solutions in Azure Synapse Analytics.
- Implement robust ETL/ELT processes to ensure data quality and consistency.
- Create data models for analytical and reporting needs.

Data Analysis & Security:
- Build semantic data models using Azure Analysis Services for enterprise reporting.
- Collaborate with BI teams to deliver well-structured datasets for reporting in Power BI.
- Implement Azure Active Directory for authentication, access control, and security best practices.

Visualization & Business Support:
- Support business teams in building insightful Power BI dashboards and reports.
- Translate business requirements into scalable and optimized BI solutions.
- Provide data-driven insights in a clear, business-friendly manner.

Optimization & Governance:
- Monitor system performance and optimize pipelines for efficiency and cost control.
- Establish standards for data governance, data quality, and metadata management.

Qualifications & Skills
- Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field.
- Proven experience as a Data Warehouse Engineer / Data Engineer with strong expertise in Azure Synapse Analytics, Azure Data Factory, Azure Data Lake Storage, Azure Analysis Services, Azure SQL Database / SQL Server, and Power BI (reporting & dashboarding).
- Strong proficiency in SQL and data modelling (star schema, snowflake schema, dimensional modelling).
- Knowledge of Azure Active Directory for authentication and role-based access control.
- Excellent problem-solving skills and the ability to optimize large-scale data solutions.
- Strong communication skills to collaborate effectively with both technical and business stakeholders.

Posted 1 day ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Hyderabad, Chennai

Hybrid

Data Modeller
Experience: 8+ years
Location: Hyderabad / Chennai

Role & responsibilities:
- Design and develop enterprise-grade data models (3NF, Dimensional, and Semantic) to support analytics and operational use cases.
- Collaborate with business and engineering teams to define data products aligned to business domains.
- Translate complex mortgage banking concepts into scalable and extensible models.
- Ensure alignment with modern data architecture and cloud platforms (e.g., Snowflake, dbt).
- Contribute to the creation of canonical models and reusable patterns for enterprise use.

Required Qualifications:
- 5+ years of experience in data modeling with a strong focus on mortgage or financial services.
- Hands-on experience with 3NF, Dimensional, and Semantic modeling.
- Strong understanding of data as a product and domain-driven design.
- Experience working in modern data ecosystems; familiarity with Snowflake, dbt, and BI tools is a plus.
- Excellent communication skills to work across business and technical teams.
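The 3NF-versus-dimensional distinction this listing draws can be made concrete: the same loan data normalized for operations, then flattened into one wide dimension for analytics. A hedged sketch in SQLite (entities and names are invented for illustration, not taken from the posting):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# 3NF (operational): each entity in its own table, no redundancy.
cur.executescript("""
CREATE TABLE borrower (borrower_id INTEGER PRIMARY KEY, name TEXT, state TEXT);
CREATE TABLE product  (product_id INTEGER PRIMARY KEY, product_name TEXT, rate_type TEXT);
CREATE TABLE loan (
    loan_id INTEGER PRIMARY KEY,
    borrower_id INTEGER REFERENCES borrower(borrower_id),
    product_id  INTEGER REFERENCES product(product_id),
    principal   REAL
);
""")
cur.executemany("INSERT INTO borrower VALUES (?,?,?)",
                [(1, "A. Rao", "TX"), (2, "B. Iyer", "CA")])
cur.executemany("INSERT INTO product VALUES (?,?,?)",
                [(10, "30yr Fixed", "fixed"), (11, "5/1 ARM", "adjustable")])
cur.executemany("INSERT INTO loan VALUES (?,?,?,?)",
                [(100, 1, 10, 250000.0), (101, 2, 11, 400000.0)])

# Dimensional (analytical): pre-join the entities into one wide dimension so
# analysts filter on descriptive attributes without runtime joins.
cur.execute("""
CREATE VIEW dim_loan AS
SELECT l.loan_id, b.name AS borrower_name, b.state,
       p.product_name, p.rate_type, l.principal
FROM loan l
JOIN borrower b ON b.borrower_id = l.borrower_id
JOIN product  p ON p.product_id  = l.product_id
""")
wide = cur.execute("SELECT state, rate_type, principal FROM dim_loan ORDER BY loan_id").fetchall()
# wide -> [('TX', 'fixed', 250000.0), ('CA', 'adjustable', 400000.0)]
```

The 3NF side is what the source systems keep; the dimensional side is what gets built in the warehouse.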

Posted 1 day ago

Apply

5.0 - 10.0 years

15 - 27 Lacs

Bengaluru

Work from Office

Hi, greetings from Preludesys India Pvt Ltd! We are hiring for one of our prestigious clients for the below position.

Job Posting: Data Modeler - SA
Notice Period: Immediate - 30 days

Role Overview: We are looking for an experienced Data Modeler with a strong foundation in dimensional data modeling and a proven ability to design and maintain conceptual, logical, and physical data models. The ideal candidate will have a minimum of 5+ years of experience in data modeling and architecture, preferably within the banking or financial services industry.

Key Responsibilities:
- Design, develop, and maintain dimensional data models to support analytics and reporting.
- Design conceptual, logical, and physical data models.
- Utilize AWS services for scalable data model design.
- Collaborate with business stakeholders, data architects, and engineers to ensure data models align with business rules and data governance standards.
- Translate business requirements into scalable and efficient data models.
- Maintain comprehensive documentation for data models, metadata, and data dictionaries.
- Ensure consistency and integrity of data models across systems and platforms.
- Partner with data engineering teams to implement models in AWS-based environments, including Redshift, Glue, and Lake Formation.

Required Skills and Qualifications:
- 5+ years of experience in data modeling, with a focus on dimensional modeling and data warehouse design.
- Proficiency in developing conceptual, logical, and physical data models.
- Strong understanding of data governance, data quality, and metadata management.
- Hands-on experience with AWS services such as Redshift, Glue, and Lake Formation.
- Familiarity with data modeling tools (e.g., ER/Studio, erwin, or similar).
- Excellent communication skills and the ability to work with cross-functional teams.

Preferred Qualifications:
- Experience in the banking or financial services sector.
- Knowledge of data lake architecture and modern data stack tools.
- AWS or data modeling certifications are a plus.
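Maintaining dimensional models over time, as this posting requires, usually means handling slowly changing dimensions. A minimal Type 2 sketch in plain Python (the schema and function are illustrative assumptions, not part of the posting): when a tracked attribute changes, the current row is closed out and a new current version is appended, preserving history.

```python
from datetime import date

# Each dimension row: natural key, tracked attribute, validity window, current flag.
dim_customer = [
    {"customer_id": "C1", "segment": "Retail", "valid_from": date(2023, 1, 1),
     "valid_to": None, "is_current": True},
]

def apply_scd2(dim, customer_id, new_segment, as_of):
    """Type 2 change: expire the current row, append a new current version."""
    for row in dim:
        if row["customer_id"] == customer_id and row["is_current"]:
            if row["segment"] == new_segment:
                return  # no change, nothing to do
            row["valid_to"] = as_of
            row["is_current"] = False
    dim.append({"customer_id": customer_id, "segment": new_segment,
                "valid_from": as_of, "valid_to": None, "is_current": True})

apply_scd2(dim_customer, "C1", "Private Banking", date(2024, 6, 1))

history = [(r["segment"], r["is_current"]) for r in dim_customer]
# history -> [('Retail', False), ('Private Banking', True)]
```

In Redshift or Glue the same logic would be a MERGE/upsert, but the validity-window idea is identical.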

Posted 2 days ago

Apply

10.0 - 14.0 years

25 - 37 Lacs

Pune, Bengaluru, Mumbai (all areas)

Hybrid

Role & responsibilities

Job Description Summary: The purpose of this role is to oversee the development of our database marketing solutions, using database technologies such as Microsoft SQL Server/Azure, Amazon Redshift, and Google BigQuery. The role will be involved in design, specifications, troubleshooting, and issue resolution. The ability to communicate with both technical and non-technical audiences is key.

Business Title: Associate Technical Architect
Years of Experience: 10+ years

Must-have skills:
1. Database (one or more of MS SQL Server, Oracle, Cloud SQL, Cloud Spanner, etc.)
2. Data warehouse (one or more of BigQuery, Snowflake, etc.)
3. ETL tooling (two or more of Cloud Data Fusion, Dataflow, Dataproc, Pub/Sub, Composer, Cloud Functions, Cloud Run, etc.)
4. Experience in cloud platforms - GCP
5. Python, PySpark, project & resource management
6. SVN, JIRA, automation workflow (Composer, Cloud Scheduler, Apache Airflow, Tidal, Tivoli, or similar)

Good-to-have skills:
1. UNIX shell scripting, Snowflake, Redshift; familiarity with NoSQL such as MongoDB, etc.
2. ETL tooling (Databricks / AWS Glue / AWS Lambda / Amazon Kinesis / Amazon Firehose / Azure Data Factory (ADF) / DBT / Talend / Informatica / IICS (Informatica Cloud))
3. Experience in cloud platforms - AWS / Azure
4. Client-facing skills

Job Description: The Technical Lead / Technical Consultant is a core role and the focal point of the project team, responsible for the whole technical solution and managing the day-to-day delivery. The role will focus on the technical solution architecture, detailed technical design, coaching of the development/implementation team, and governance of the technical delivery: technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best-practice advice. It involves interactions with internal stakeholders and clients to explain technology solutions, and a clear understanding of clients' business requirements through which to guide optimal design to meet their needs.

Key responsibilities:
- Design simple to medium data solutions for clients using cloud architecture on GCP.
- Strong understanding of DW, data mart, data modelling, data structures, databases, and data ingestion and transformation.
- Working knowledge of ETL as well as database skills.
- Strong understanding of relational and non-relational databases and when to use them.
- Leadership and communication skills to collaborate with local leadership as well as our global teams.
- Translate technical requirements into ETL/SQL application code.
- Document project architecture, explain detailed design to the team, and create low-level to high-level designs.
- Create technical documents for ETL and SQL developments using Visio, PowerPoint, and other MS Office packages.
- Engage with Project Managers, Business Analysts, and Application DBAs to implement ETL solutions.
- Perform mid- to complex-level tasks independently.
- Support clients, data scientists, and analytical consultants working on marketing solutions.
- Work with cross-functional internal teams and external clients.
- Strong project management and organization skills; ability to lead 1-2 projects with a team size of 2-3 members.
- Code management, including code review and deployment.
- Work closely with the QA/testing team to help identify and implement defect-reduction initiatives.
- Work closely with the architecture team to make sure architecture standards and principles are followed during development.
- Perform proofs of concept on new platforms and validate proposed solutions.
- Work with the team to establish and reinforce disciplined software development processes, standards, and error-recovery procedures.
- Must understand software development methodologies, including waterfall and agile.
- Distribute and manage SQL development work across the team.
- The candidate must be willing to work during overlapping hours with US-based teams to ensure effective collaboration and communication, typically between 6:00 PM and 11:00 PM IST, depending on project needs.

Education Qualification: Bachelor's or Master's degree in Computer Science.
Certification (must): Snowflake Associate / Core, or at minimum a basic-level certification in Azure.
Shift timing: GMT (UK shift) - 2 PM to 11 PM

Posted 2 days ago

Apply

6.0 - 10.0 years

22 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Curious about the role? What would your typical day look like?

As a Senior Data Engineer, you will work to solve some of the organizational data management problems that would enable them as a data-driven organization, seamlessly switching between the roles of individual contributor, team member, and data modeling lead as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might:
- Engage the clients and understand the business requirements to translate those into data models.
- Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
- Contribute to data modeling accelerators.
- Create and maintain the source-to-target data mapping document that includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries.
- Maintain data models, capture data models from existing databases, and record descriptive information.
- Use a data modelling tool to create appropriate data models.
- Contribute to building data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Collaborate with data engineers to design and develop data extraction and integration code modules.
- Partner with the data engineers to strategize ingestion logic and consumption patterns.

Job Requirement - Expertise and Qualifications

What do we expect?
- 6+ years of experience in the data space.
- Decent SQL skills.
- Significant experience in one or more RDBMSs (Oracle, DB2, and SQL Server).
- Real-time experience working with OLAP and OLTP database models (dimensional models).
- Good understanding of star schema, snowflake schema, and Data Vault modelling, as well as of any ETL tool, data governance, and data quality.
- An eye for analyzing data, and comfort with following agile methodology.
- A good understanding of any of the cloud services (Azure, AWS, or GCP) is preferred.

You are important to us; let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.
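Star, snowflake, and Data Vault come up repeatedly in these listings; the star-versus-snowflake difference is simply whether dimension attributes stay in one wide table or are normalized into sub-dimension tables. A small sketch (names invented for illustration) showing a snowflaked product dimension and the extra join it costs at query time:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
-- Snowflake schema: the product dimension is normalized, so 'category'
-- lives in its own table instead of being repeated on every product row.
CREATE TABLE dim_category (category_key INTEGER PRIMARY KEY, category_name TEXT);
CREATE TABLE dim_product (
    product_key  INTEGER PRIMARY KEY,
    product_name TEXT,
    category_key INTEGER REFERENCES dim_category(category_key)
);
CREATE TABLE fact_sales (product_key INTEGER, amount REAL);
""")
cur.executemany("INSERT INTO dim_category VALUES (?,?)", [(1, "Hardware"), (2, "Media")])
cur.executemany("INSERT INTO dim_product VALUES (?,?,?)",
                [(10, "Widget", 1), (11, "Gizmo", 1), (12, "Ebook", 2)])
cur.executemany("INSERT INTO fact_sales VALUES (?,?)",
                [(10, 20.0), (11, 35.0), (12, 50.0)])

# The snowflaked design needs one extra join per normalized level; a star
# schema would store category_name directly on dim_product and skip it.
rows = cur.execute("""
    SELECT c.category_name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product  p ON p.product_key  = f.product_key
    JOIN dim_category c ON c.category_key = p.category_key
    GROUP BY c.category_name ORDER BY c.category_name
""").fetchall()
# rows -> [('Hardware', 55.0), ('Media', 50.0)]
```

Snowflaking saves storage and enforces consistency; a star keeps queries simpler, which is why most reporting layers prefer it.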

Posted 2 days ago

Apply

10.0 - 14.0 years

30 - 37 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Curious about the role? What would your typical day look like?

As an Architect, you will work to solve some of the most complex and captivating data management problems that would enable them as a data-driven organization, seamlessly switching between the roles of individual contributor, team member, and data modeling architect as demanded by each project to define, design, and deliver actionable insights. On a typical day, you might:
- Engage the clients and understand the business requirements to translate those into data models.
- Analyze customer problems, propose solutions from a data-structural perspective, and estimate and deliver the proposed solutions.
- Create and maintain a Logical Data Model (LDM) and Physical Data Model (PDM) by applying best practices to provide business insights.
- Use a data modelling tool to create appropriate data models.
- Create and maintain the source-to-target data mapping document that includes documentation of all entities, attributes, data relationships, primary and foreign key structures, allowed values, codes, business rules, glossary terms, etc.
- Gather and publish data dictionaries.
- Ideate, design, and guide the teams in building automations and accelerators.
- Maintain data models, capture data models from existing databases, and record descriptive information.
- Contribute to building data warehouses and data marts (on cloud) while performing data profiling and quality analysis.
- Use version control to maintain versions of data models.
- Collaborate with data engineers to design and develop data extraction and integration code modules.
- Partner with the data engineers and testing practitioners to strategize ingestion logic, consumption patterns, and testing.
- Ideate on designing and developing the next-gen data platform by collaborating with cross-functional stakeholders.
- Work with the client to define, establish, and implement the right modelling approach as per the requirement.
- Help define the standards and best practices.
- Monitor the project progress to keep the leadership teams informed of milestones, impediments, etc.
- Coach team members and review code artifacts.
- Contribute to proposals and RFPs.

Job Requirement

What do we expect?
- 10+ years of experience in the data space.
- Decent SQL knowledge.
- Ability to suggest modeling approaches for a given problem.
- Significant experience in one or more RDBMSs (Oracle, DB2, and SQL Server).
- Real-time experience working with OLAP and OLTP database models (dimensional models).
- Comprehensive understanding of star schema, snowflake schema, and Data Vault modelling, as well as of any ETL tool, data governance, and data quality.
- An eye for analyzing data, and comfort with following agile methodology.
- A good understanding of any of the cloud services (Azure, AWS, or GCP) is preferred.
- Enthusiasm to coach team members, collaborate with various stakeholders across the organization, and take complete ownership of deliverables.
- Experience in contributing to proposals and RFPs.
- Good experience in stakeholder management.
- Decent communication skills and experience in leading a team.

You are important to us; let's stay connected! Every individual comes with a different set of skills and qualities, so even if you don't tick all the boxes for the role today, we urge you to apply, as there might be a suitable/unique role for you tomorrow. We are an equal-opportunity employer. Our diverse and inclusive culture and values guide us to listen, trust, respect, and encourage people to grow the way they desire.

Note: The designation will be commensurate with expertise and experience. Compensation packages are among the best in the industry.

Posted 2 days ago

Apply

5.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.

Key Responsibilities:
- Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
- Build and optimize MDX or DAX queries for advanced reporting needs.
- Create and manage data models (star/snowflake schemas) supporting business KPIs.
- Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
- Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
- Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
- Maintain data quality and consistency across data sources and reporting layers.
- Implement RLS/OLS and manage report security and governance in SSAS and Power BI.

Primary (required) skills:
- SSAS Tabular & Multidimensional.
- SQL Server (advanced SQL, views, joins, indexes).
- DAX & MDX.
- Data modeling & OLAP concepts.

Secondary skills:
- ETL tools (SSIS or equivalent).
- Power BI or similar BI/reporting tools.
- Performance tuning & troubleshooting in SSAS and SQL.
- Version control (TFS/Git), deployment best practices.
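Row-level security (RLS), which this posting asks for in SSAS and Power BI, boils down to filtering every query by the caller's identity. A toy sketch of the idea in plain Python/SQL terms (the roles and tables are invented; in SSAS Tabular the filter would actually be a DAX expression attached to a role):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE fact_sales (region TEXT, amount REAL);
INSERT INTO fact_sales VALUES ('EMEA', 100.0), ('EMEA', 50.0), ('APAC', 75.0);
""")

# RLS mapping: which rows each role may see. In SSAS Tabular this would be
# a DAX filter expression on a role, e.g. [region] = "EMEA".
role_filters = {"emea_analyst": "EMEA", "apac_analyst": "APAC"}

def total_sales(role):
    """Every query is silently filtered by the caller's role."""
    region = role_filters[role]
    (total,) = cur.execute(
        "SELECT COALESCE(SUM(amount), 0) FROM fact_sales WHERE region = ?",
        (region,),
    ).fetchone()
    return total

emea = total_sales("emea_analyst")   # sees only EMEA rows
apac = total_sales("apac_analyst")   # sees only APAC rows
# emea -> 150.0, apac -> 75.0
```

Object-level security (OLS) is the analogous idea applied to whole tables or columns rather than rows.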

Posted 3 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

The Lead IT Data Modeler plays a crucial role in developing data models and architectures that cater to present business requirements while also being scalable for future challenges. You will be responsible for providing advanced mentorship to both junior and senior data modelers, fostering a culture of continuous improvement and innovation in data management practices. Your expertise in query optimization, execution plans, and complex SQL for managing large datasets will be essential for this role. Proficiency in database systems like MySQL, PostgreSQL, Oracle, and Microsoft SQL Server is required, along with use of version control systems such as Git for efficient change management. Additionally, you will integrate data engineering practices like ETL, job orchestration, and monitoring to enhance data transformation processes, ensuring robust data systems that support critical business functions.

Your responsibilities will include, but are not limited to:
- Demonstrating proficiency in managing relational database systems like BigQuery, MySQL, PostgreSQL, Oracle, or Microsoft SQL Server.
- Utilizing SQL for querying and manipulating data, including creating, modifying, and optimizing database queries.
- Applying the fundamentals of data modeling, including entity-relationship diagrams (ERDs), data flow diagrams, and dimensional modeling.
- Maintaining clear and up-to-date data model diagrams, schema descriptions, and metadata through strong documentation skills.
- Using version control systems like Git to track changes to data models and related code.
- Developing complex SQL queries, common table expressions, and optimization techniques for managing and querying large datasets.
- Understanding data governance principles, data stewardship, and data cataloging.
- Optimizing database and query performance through techniques like query execution plans, indexing, and caching strategies.
- Mentoring junior data modelers and providing leadership in data modeling initiatives.
- Designing comprehensive data architectures aligned with business needs and supporting long-term data strategies.
- Designing data models covering the entire enterprise to ensure data consistency and alignment with business processes.
- Leading and mentoring a team of data modelers while providing guidance, training, and oversight.
- Developing and executing a strategic plan for data modeling initiatives in alignment with the organization's vision.
- Performing other job-related duties that align with the organization's vision, mission, and values within your scope of practice.

Qualifications:
- Education: Bachelor's degree or relevant experience.
- Preferred certification(s): any relevant IT certification.
- Experience: 5+ years of relevant and practical experience.

Special skills:
- Proficiency in designing comprehensive data models aligned with business processes.
- Advanced knowledge of SQL, data warehousing, and dimensional modeling.
- Skills in data governance, cataloging, and stewardship.
- Experience in enterprise-level data strategy and managing complex data architectures.

Soft skills:
- Visionary leadership: crafting scalable foundational data models for current and future business needs.
- Mentorship: providing advanced guidance to both junior and senior data modelers.
- Innovation: fostering a culture of continuous improvement in data management practices.
- Strategic communication: articulating complex data strategies to executive stakeholders.
- Change management: guiding teams through evolving data technologies and methodologies.
- Conflict resolution: addressing challenges within data modeling teams and cross-functional collaborations.
- Decision-making: influencing informed choices about data architecture and modeling approaches.
- Emotional intelligence: understanding and managing team dynamics for optimal performance.
- Negotiation: balancing technical requirements with business needs and resource constraints.
- Thought leadership: driving industry best practices in data modeling and architecture.

Relocation assistance eligible: No
Work shift: [Not specified]
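The query-optimization skills this listing stresses (execution plans, indexing) are easy to demonstrate: inspect the plan for the same query before and after adding an index. A SQLite sketch (engines such as MySQL or SQL Server expose the same idea via `EXPLAIN`; the table is invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE orders (order_id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)")
cur.executemany("INSERT INTO orders VALUES (?,?,?)",
                [(i, i % 100, float(i)) for i in range(1000)])

def plan(sql):
    # EXPLAIN QUERY PLAN reports how SQLite will execute the statement;
    # the last column of each row is the human-readable detail.
    return " ".join(row[3] for row in cur.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(amount) FROM orders WHERE customer_id = 42"

before = plan(query)   # full table scan ('SCAN orders'; wording varies by version)
cur.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = plan(query)    # index lookup ('SEARCH orders USING INDEX idx_orders_customer ...')

assert "SCAN" in before.upper()
assert "idx_orders_customer" in after
```

The same before/after discipline, reading the plan rather than guessing, is what "query execution plans, indexing, and caching strategies" means in practice.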

Posted 6 days ago

Apply

3.0 - 10.0 years

0 Lacs

Pune, Maharashtra

On-site

As a Data Engineering Manager at Saisystems Health Engineering & Product in Pune, you will play a crucial role in leading data development and analytics initiatives. Your responsibilities will include architecting and optimizing data solutions across relational and non-relational databases, managing a cross-functional data team, and implementing scalable BI and reporting frameworks. Your expertise in technical aspects, leadership skills, and ability to leverage data for business value will be essential for success in this role. You will be responsible for leading the development, implementation, and optimization of databases to support business needs, mentoring a team of database developers, data engineers, and BI professionals, designing and implementing scalable data models and reporting systems, and collaborating with stakeholders to deliver actionable insights. Additionally, you will administer database performance tuning, backup strategies, and security protocols, enforce data governance and quality standards, and manage Business Intelligence tools such as Power BI or Tableau. To excel in this role, you should have a minimum of 10 years of experience in database development and data management, with at least 3 years in a leadership role overseeing data or BI teams. Your technical skills should include a strong command of SQL, proficiency in BI tools, experience with ETL tools and data integration platforms, familiarity with data warehouse solutions and Azure cloud services, and knowledge of database security best practices. Preferred qualifications include experience with DevOps practices for data projects and certifications in BI platforms or cloud database technologies. Your performance will be evaluated based on the delivery of scalable data solutions, usability of dashboards and analytics, performance tuning improvements, data governance practices, and team growth through effective leadership. 
As a successful candidate, you should possess strong analytical and problem-solving skills, strategic planning abilities, effective leadership capabilities, business acumen, and excellent communication skills. Your passion for innovation, focus on building new data solutions, and willingness to work with modern technologies will be key to driving impactful and forward-thinking data initiatives in a collaborative environment. Join us in shaping our data strategy, building scalable analytics solutions, and driving innovation that supports critical business decisions and growth. This role offers a unique opportunity to work with modern cloud technologies and advanced BI tools, making a significant impact through data-driven insights and solutions.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Software Engineer/Senior Software Engineer at CGI, you will have the opportunity to work on Oracle Data Integrator (ODI) and Oracle Business Intelligence Enterprise Edition (OBIEE) projects. With 4 to 6 years of experience, you will be responsible for developing complex reports using various BI reporting tools such as OBIEE 12c. Your expertise in creating different types of views, configuring aggregate tables, and implementing dimension hierarchies will be essential in meeting project requirements. Your role will also involve working on ODI architecture, data modeling, and tuning SQL/PL-SQL queries for optimal performance.

To be successful in this position, you should have a university degree or equivalent experience, along with a minimum of 5 years of relevant experience. Mastery of OBIEE 11g/12c and ODI 11g within an Oracle BI Applications 11g context is required. Experience in ETL design and implementation, dimensional modeling, and structured implementation methodologies like Oracle's OUM will be beneficial. Additionally, knowledge of PeopleSoft, the HR domain, and the ability to work effectively in a team environment are assets that will contribute to your success in this role.

At CGI, we value ownership, teamwork, respect, and belonging, offering you the opportunity to play an integral role in bringing innovative solutions to life. You will be part of a collaborative environment where your contributions are recognized, and you have the chance to shape your career growth with the support of leaders who prioritize your well-being and professional development. If you are looking to join a dynamic team at one of the world's largest IT and business consulting services firms, CGI welcomes you to explore this exciting opportunity and be part of our journey towards achieving collective success.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Maharashtra

On-site

As a Senior Data Modeler based in Navi Mumbai, you will bring to the table 10-12 years of hands-on experience in Data Modeling. Your expertise will encompass strong data modeling skills at both project and enterprise levels, covering Conceptual, Logical, Physical, 3NF, and Dimensional modeling. Your proficiency in Dimensional Modeling will be evident as you craft conformed dimensions that can be effectively utilized across various Business Units. You will be adept at utilizing data modeling tools such as Power Designer, Oracle Designer, or Erwin.

Collaborating with Data Engineering and Data Analytics teams will be a significant part of your role, requiring your ability to write SQL queries, ideally ANSI SQL, to extract and manipulate data effectively. Your knowledge of AI technologies and trends will be an asset, along with a good understanding of platforms like Databricks and Azure Cloud. Experience in the Retail domain will be highly advantageous for this role.

Your role will demand strong analytic skills, excellent written and oral communication abilities, and the capacity to explain complex data structures to non-technical stakeholders. Your interpersonal, negotiating, and influencing skills will be crucial in driving successful data modeling initiatives within the organization. If you are ready to take on this challenging opportunity, with an immediate start available, we look forward to receiving your application.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

Celebal Technologies is a prominent software services company specializing in Data Science, Big Data, and Enterprise Cloud solutions. Our focus is on assisting organizations in gaining a competitive edge through intelligent data solutions driven by Robotics, Artificial Intelligence, and Machine Learning. We provide customized solutions to enhance productivity, efficiency, and accuracy for data-centric enterprises.

As a Lead Data Modeler at Celebal Technologies, you will play a crucial role in our team. This is a full-time remote position where you will be primarily responsible for tasks related to Data Governance, Data Modeling, Data Quality, Data Architecture, and Extract, Transform, Load (ETL) processes on a daily basis.

The ideal candidate for this role should possess the following qualifications:
- Proficiency in Data Governance, Data Quality, and Data Architecture.
- Knowledge of Data Modeling and ETL processes.
- Hands-on experience with managing large datasets.
- Sound understanding of data management principles.
- Expertise in data modeling, including Conceptual, Logical, Physical, 3NF, and Dimensional, at both project and enterprise levels.
- Familiarity with tools like Databricks, Azure, etc.
- Previous experience in the Retail domain would be highly advantageous.
- Proficiency in Dimensional Modeling and crafting conformed dimensions for Business Units.
- Ability to work with data modeling tools such as Power Designer, Oracle Designer, or Erwin.
- Strong problem-solving and analytical capabilities.
- Experience with relevant data modeling and governance tools and technologies.
- Professional certifications related to data modeling and governance are considered a plus.

If you are enthusiastic about working in a dynamic environment where you can contribute significantly to data-driven solutions, we encourage you to apply for the Lead Data Modeler position at Celebal Technologies.

Posted 1 week ago

Apply

3.0 - 10.0 years

0 Lacs

hyderabad, telangana

On-site

You will be responsible for providing consultancy services in Fusion Data Intelligence (FDI/FAW), Oracle Analytics Cloud (OAC), and Business Intelligence Publisher (BIP). Your role will involve working with clients remotely or in various cities including Bangalore, Pune, Hyderabad, Gurgaon, Kolkata, Chennai, and Mumbai.

To qualify for this role, you must hold a Bachelor's degree in Computer Science or Information Technology with a minimum of 3-10 years of relevant experience in FDI/FAW, OAC, and BIP. Experience with OBIEE/OBIA is considered beneficial, while expertise in OAC/FDI is a must-have requirement. Your skills should include proficiency in SQL, a minimum of 2 years of experience in semantic modeling, and a strong grasp of data warehousing, dimensional modeling, and ETL processes. Additionally, you should have a good understanding of OAC/FDI architecture, knowledge of provisioning and configuring an FDI instance, and the ability to provide architecture recommendations based on project requirements. It is essential that you possess functional knowledge of Fusion SaaS applications within domains such as ERP Fusion Finance, ERP Supply Chain, and ERP Fusion HCM. Furthermore, industry knowledge in sectors like healthcare, retail, or insurance is advantageous.

In this role, you will be expected to lead a team of 4-5 junior members, communicate effectively, and drive requirements from clients to design analytical solutions. Your ability to understand OOTB features in FDI and handle customizations in reports, subject areas, ETL processes, and security will be crucial in delivering successful outcomes.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

eClinical Solutions helps life sciences organizations around the world accelerate clinical development initiatives with expert data services and the elluminate Clinical Data Cloud, the foundation of digital trials. Together, the elluminate platform and digital data services give clients self-service access to all their data from one centralized location, plus advanced analytics that help them make smarter, faster business decisions.

The Senior Software Developer plays a crucial role in collaborating with the Product Manager, Implementation Consultants (ICs), and clients to understand requirements for meeting data analysis needs. This position requires good collaboration skills to provide guidance on analytics aspects to the team in various analytics-related activities. Key technical expectations include:
- Experience in Qlik Sense architecture design and proficiency in load script implementation and best practices
- Hands-on experience in Qlik Sense development, dashboarding, data modeling, and reporting techniques
- Skill in data integration through ETL processes from various sources, and in data transformation including the creation of QVD files and set analysis
- Ability to perform data modeling using Dimensional Modeling, Star schema, and Snowflake schema

The Senior Software Developer should possess strong SQL skills, particularly in SQL Server, to validate Qlik Sense dashboards and work on internal applications. Knowledge of deploying Qlik Sense applications using Qlik Management Console (QMC) is advantageous. Responsibilities include working with ICs, product managers, and clients to gather requirements; configuration, migration, and support of Qlik Sense applications; implementation of best practices; and staying updated on new technologies.

Candidates for this role should hold a Bachelor of Science / BTech / MTech / Master of Science degree in Computer Science or equivalent work experience. Effective verbal and written communication skills are essential. Additionally, candidates are required to have a minimum of 3-5 years of experience in implementing end-to-end business intelligence using Qlik Sense, with thorough knowledge of Qlik Sense architecture, design, development, testing, and deployment processes. Understanding of Qlik Sense best practices, relational database concepts, data modeling, SQL code writing, and ETL procedures is crucial. Technical expertise in Qlik Sense, SQL Server, and data modeling, along with experience with clinical trial data and SDTM standards, is beneficial.

This position offers the opportunity to accelerate skills and career growth within a fast-growing company while contributing to the future of healthcare. eClinical Solutions fosters an inclusive culture that values diversity and encourages continuous learning and improvement. The company is an equal opportunity employer committed to making employment decisions based on qualifications, merit, culture fit, and business needs.
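The star-versus-snowflake distinction named above comes down to whether dimension hierarchies stay normalized across tables (snowflake) or are flattened into one wide table (star), the shape BI tools like Qlik typically query fastest. A small, hypothetical sketch of flattening a snowflaked product/category pair into a star-style dimension; all names and data are made up:

```python
# Snowflaked source: product and category held in separate normalized tables.
categories = {1: {"category_id": 1, "category_name": "Beverages"},
              2: {"category_id": 2, "category_name": "Snacks"}}
products = [{"product_id": 10, "product_name": "Cold Brew", "category_id": 1},
            {"product_id": 11, "product_name": "Trail Mix", "category_id": 2}]

def flatten_to_star(products, categories):
    """Join the snowflaked tables once, so later queries need no extra hop."""
    return [{"product_id": p["product_id"],
             "product_name": p["product_name"],
             "category_name": categories[p["category_id"]]["category_name"]}
            for p in products]

dim_product = flatten_to_star(products, categories)
print(dim_product[0])  # {'product_id': 10, 'product_name': 'Cold Brew', 'category_name': 'Beverages'}
```

The trade-off is the classic one: the star form duplicates category names across product rows but removes a join from every downstream query.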

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

hyderabad, telangana

On-site

As a Senior SQL Developer at our company, you will play a crucial role in our BI & analytics team by expanding and optimizing our data and data queries. Your responsibilities will include optimizing data flow and collection for consumption by our BI & Analytics platform. You should be an experienced query builder and data wrangler with a passion for optimizing data systems from the ground up. Collaborating with software developers, database architects, data analysts, and data scientists, you will support data and product initiatives and ensure consistent, optimal data delivery architecture across ongoing projects. Your self-directed approach will be essential in supporting the data needs of multiple systems and products. If you are excited about enhancing our company's data architecture to support our upcoming products and data initiatives, this role is perfect for you.

Your essential functions will involve creating and maintaining optimal SQL queries, views, tables, and stored procedures. By working closely with various business units such as BI, Product, and Reporting, you will contribute to developing the data warehouse platform vision, strategy, and roadmap. Understanding physical and logical data models and ensuring high-performance access to diverse data sources will be key aspects of your role. Encouraging the adoption of organizational frameworks through documentation, sample code, and developer support will also be part of your responsibilities. Effective communication of progress and effectiveness of developed frameworks to department heads and managers will be essential.

To be successful in this role, you should possess a Bachelor's or Master's degree, or an equivalent combination of education and experience in a relevant field. Proficiency in T-SQL, data warehouses, star schema, data modeling, OLAP, SQL, and ETL is required. Experience in creating tables, views, and stored procedures is crucial. Familiarity with BI and reporting platforms, awareness of industry trends, and knowledge of multiple database platforms like SQL Server and MySQL are necessary. Proficiency in source control and project management tools such as Azure DevOps, Git, and JIRA is expected. Experience with SonarQube for clean T-SQL coding practices and DevOps best practices will be advantageous.

Applicants must have exceptional written and spoken communication skills and strong team-building abilities to contribute to making strategic decisions and advising senior management on technical matters. With at least 5+ years of experience in a data warehousing position, including working as a SQL Developer, and experience in the system development lifecycle, you should also have a proven track record in data integration, consolidation, enrichment, and aggregation. Strong analytical skills, attention to detail, organizational skills, and the ability to mentor junior colleagues will be crucial for success in this role.

This full-time position requires flexibility to support different time zones between 12 PM IST and 9 PM IST, Monday through Friday. You will work in a hybrid mode and spend at least 2 days working from the office in Hyderabad. Occasional evening and weekend work may be expected based on client needs or job-related emergencies. This job description may not cover all responsibilities and duties, which may change with or without notice.
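As an illustration of the Views mentioned among the essential functions, a view can expose a stable, pre-aggregated reporting contract over a fact table so BI tools never query raw rows directly. A minimal sketch using SQLite with hypothetical names (the posting's environment is SQL Server/T-SQL, but the concept carries over):

```python
import sqlite3

# Minimal sketch: a reporting view over a fact table gives BI tools a stable,
# pre-aggregated contract instead of raw rows (all names are hypothetical).
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE fact_orders (order_id INTEGER, region TEXT, amount REAL);
INSERT INTO fact_orders VALUES (1, 'East', 100.0), (2, 'East', 50.0), (3, 'West', 75.0);

CREATE VIEW v_sales_by_region AS
SELECT region, COUNT(*) AS orders, SUM(amount) AS revenue
FROM fact_orders
GROUP BY region;
""")

rows = conn.execute("SELECT * FROM v_sales_by_region ORDER BY region").fetchall()
print(rows)  # [('East', 2, 150.0), ('West', 1, 75.0)]
```

Because consumers depend only on the view's columns, the underlying fact table can be re-partitioned or re-indexed without breaking reports.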

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

dehradun, uttarakhand

On-site

As a Data Modeler, your primary responsibility will be to design and develop conceptual, logical, and physical data models supporting enterprise data initiatives. You will work with modern storage formats like Parquet and ORC, and build and optimize data models within Databricks Unity Catalog. Collaborating with data engineers, architects, analysts, and stakeholders, you will ensure alignment with ingestion pipelines and business goals. Translating business and reporting requirements into robust data architecture, you will follow best practices in data warehousing and Lakehouse design. Your role will involve maintaining metadata artifacts, enforcing data governance, quality, and security protocols, and continuously improving modeling processes. You should have over 10 years of hands-on experience in data modeling within Big Data environments. Your expertise should include OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices. Proficiency in modeling methodologies like Kimball, Inmon, and Data Vault is essential. Hands-on experience with modeling tools such as ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Experience in Databricks with Unity Catalog and Delta Lake is required, along with a strong command of SQL and Apache Spark for querying and transformation. Familiarity with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database, is beneficial. Exposure to Azure Purview or similar data cataloging tools is a plus. Strong communication and documentation skills are necessary for this role, as well as the ability to work in cross-functional agile environments. A Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field is required. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are a plus. 
Experience working in agile/scrum environments and exposure to enterprise data security and regulatory compliance frameworks like GDPR and HIPAA are advantageous.
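One concrete piece of the Kimball-style dimensional modeling cited above is the Type 2 slowly changing dimension: when an attribute changes, the current row is expired and a new version is appended, preserving history for point-in-time analysis. A minimal, hypothetical sketch in plain Python (column names and data are illustrative):

```python
from datetime import date

# Sketch of a Kimball-style Type 2 slowly changing dimension: instead of
# overwriting a changed attribute, expire the old row and insert a new
# current one, so history survives for point-in-time queries.
def apply_scd2(dim_rows, business_key, new_attrs, as_of):
    """Expire the current row for `business_key` and append a new version."""
    for row in dim_rows:
        if row["business_key"] == business_key and row["is_current"]:
            row["is_current"] = False
            row["end_date"] = as_of
    dim_rows.append({"business_key": business_key, **new_attrs,
                     "start_date": as_of, "end_date": None, "is_current": True})
    return dim_rows

dim_customer = [{"business_key": "C001", "city": "Pune",
                 "start_date": date(2020, 1, 1), "end_date": None, "is_current": True}]
apply_scd2(dim_customer, "C001", {"city": "Mumbai"}, date(2024, 6, 1))

current = [r for r in dim_customer if r["is_current"]]
print(len(dim_customer), current[0]["city"])  # 2 Mumbai
```

A fact row loaded in 2021 would still join to the Pune version via the date range, which is exactly what overwrite-style (Type 1) updates lose.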

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

karnataka

On-site

As a Business Intelligence Specialist at Adobe, you will have the opportunity to work closely with Business analysts to understand design specifications and translate requirements into technical models, dashboards, reports, and applications. Your role will involve collaborating with business users to cater to their ad-hoc requests and deliver scalable solutions on MSBI platforms. You will be responsible for system integration of data sources, creating technical documents, and ensuring data and code quality through standard methodologies and processes. To succeed in this role, you should have at least 3 years of experience in SSIS, SSAS, Data Warehousing, Data Analysis, and Business Intelligence. You should also possess advanced proficiency in Data Warehousing tools and technologies, including databases, SSIS, and SSAS, along with in-depth understanding of Data Warehousing principles and Dimensional Modeling techniques. Hands-on experience in ETL processes, database optimization, and query tuning is essential. Familiarity with cloud platforms such as Azure and AWS, as well as Python or PySpark and Databricks, would be beneficial. Experience in creating interactive dashboards using Power BI is an added advantage. In addition to technical skills, strong problem-solving and analytical abilities, quick learning capabilities, and excellent communication and presentation skills are important for this role. A Bachelor's degree in Computer Science, Information Technology, or an equivalent technical discipline is required. At Adobe, we value a free and open marketplace for all employees and provide internal growth opportunities for your career development. We encourage creativity, curiosity, and continuous learning as part of your career journey. To prepare for internal opportunities, update your Resume/CV and Workday profile, explore the Internal Mobility page on Inside Adobe, and check out tips to help you prep for interviews. 
The Talent Team will reach out to you within 2 weeks of applying for a role via Workday, and if you move forward in the interview process, inform your manager so they can support your career growth. Join Adobe to work in an exceptional environment with colleagues committed to helping each other grow through ongoing feedback. If you are looking to make an impact and grow your career, Adobe is the place for you. Discover more about employee experiences on the Adobe Life blog and explore the meaningful benefits we offer. For any accommodation needs during the application process, please contact accommodations@adobe.com.

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

kolkata, west bengal

On-site

As a Data Modeler specializing in Hybrid Data Environments, you will play a crucial role in designing, developing, and optimizing data models that facilitate enterprise-level analytics, insights generation, and operational reporting. You will collaborate with business analysts and stakeholders to comprehend business processes and translate them into effective data modeling solutions. Your expertise in traditional data stores such as SQL Server and Oracle DB, along with proficiency in Azure/Databricks cloud environments, will be essential in migrating and optimizing existing data models. Your responsibilities will include designing logical and physical data models that capture the granularity of data required for analytical and reporting purposes. You will establish data modeling standards and best practices to maintain data architecture integrity and collaborate with data engineers and BI developers to ensure data models align with analytical and operational reporting needs. Conducting data profiling and analysis to understand data sources, relationships, and quality will inform your data modeling process. Your qualifications should include a Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field, along with a minimum of 5 years of experience in data modeling. Proficiency in SQL, familiarity with data modeling tools, and understanding of Azure cloud services, Databricks, and big data technologies are essential. Your ability to translate complex business requirements into effective data models, strong analytical skills, attention to detail, and excellent communication and collaboration abilities will be crucial in this role. 
In summary, as a Data Modeler for Hybrid Data Environments, you will drive the development and maintenance of data models that support analytical and reporting functions, contribute to the establishment of data governance policies and procedures, and continuously refine data models to meet evolving business needs and leverage new data modeling techniques and cloud capabilities.
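The data profiling step described above can be as simple as counting nulls and distinct values per column to judge which fields can serve as keys or reliable dimension attributes before modeling begins. A toy sketch with made-up data:

```python
# Toy data-profiling pass: per column, count nulls and distinct non-null
# values. High distinctness with zero nulls suggests a key candidate;
# many nulls flag a column needing cleansing before modeling.
def profile(rows):
    columns = rows[0].keys()
    report = {}
    for col in columns:
        values = [r[col] for r in rows]
        non_null = [v for v in values if v is not None]
        report[col] = {"nulls": len(values) - len(non_null),
                       "distinct": len(set(non_null))}
    return report

source = [{"id": 1, "region": "East"},
          {"id": 2, "region": None},
          {"id": 3, "region": "East"}]
report = profile(source)
print(report)
# {'id': {'nulls': 0, 'distinct': 3}, 'region': {'nulls': 1, 'distinct': 1}}
```

Here `id` profiles as a key candidate (unique, no nulls) while `region` shows a data-quality gap to resolve before it becomes a dimension attribute.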

Posted 3 weeks ago

Apply

10.0 - 14.0 years

0 Lacs

dehradun, uttarakhand

On-site

You should have familiarity with modern storage formats like Parquet and ORC. Your responsibilities will include designing and developing conceptual, logical, and physical data models to support enterprise data initiatives. You will build, maintain, and optimize data models within Databricks Unity Catalog, developing efficient data structures using Delta Lake to optimize performance, scalability, and reusability. Collaboration with data engineers, architects, analysts, and stakeholders is essential to ensure data model alignment with ingestion pipelines and business goals. You will translate business and reporting requirements into a robust data architecture using best practices in data warehousing and Lakehouse design. Additionally, maintaining comprehensive metadata artifacts such as data dictionaries, data lineage, and modeling documentation is crucial. Enforcing and supporting data governance, data quality, and security protocols across data ecosystems will be part of your role. You will continuously evaluate and improve modeling processes. The ideal candidate will have 10+ years of hands-on experience in data modeling in Big Data environments. Expertise in OLTP, OLAP, dimensional modeling, and enterprise data warehouse practices is required. Proficiency in modeling methodologies including Kimball, Inmon, and Data Vault is expected. Hands-on experience with modeling tools like ER/Studio, ERwin, PowerDesigner, SQLDBM, dbt, or Lucidchart is preferred. Proven experience in Databricks with Unity Catalog and Delta Lake is necessary, along with a strong command of SQL and Apache Spark for querying and transformation. Experience with the Azure Data Platform, including Azure Data Factory, Azure Data Lake Storage, Azure Synapse Analytics, and Azure SQL Database is beneficial. Exposure to Azure Purview or similar data cataloging tools is a plus. Strong communication and documentation skills are required, with the ability to work in cross-functional agile environments. 
Qualifications for this role include a Bachelor's or Master's degree in Computer Science, Information Systems, Data Engineering, or a related field. Certifications such as Microsoft DP-203: Data Engineering on Microsoft Azure are desirable. Experience working in agile/scrum environments and exposure to enterprise data security and regulatory compliance frameworks (e.g., GDPR, HIPAA) are also advantageous.

Posted 3 weeks ago

Apply

15.0 - 19.0 years

0 Lacs

karnataka

On-site

Seeking an experienced Senior Business Intelligence Expert with deep expertise in PowerBI development and a proven track record of creating high-performance, visually compelling business intelligence solutions. The ideal candidate will have extensive experience in semantic modeling, data pipeline development, and API integration, with the ability to transform complex data into actionable insights through intuitive dashboards that follow consistent branding guidelines and utilize advanced visualizations.

As a Senior Business Intelligence Expert, you will be responsible for designing, developing, and maintaining enterprise-level PowerBI solutions that drive business decisions across the organization. Your expertise in data modeling, ETL processes, and visualization best practices will be essential in delivering high-quality BI assets that meet performance standards and provide exceptional user experiences.

Key Responsibilities:
- Lead the optimization and performance tuning of PowerBI reports, dashboards, and datasets to ensure fast loading times and efficient data processing.
- Enhance the BI user experience by implementing consistent branding, modern visual designs, and intuitive navigation across all PowerBI assets.
- Develop and maintain complex data models using PowerBI's semantic modeling capabilities to ensure data accuracy, consistency, and usability.
- Create and maintain data ingestion pipelines using Databricks, Python, and SQL to transform raw data into structured formats suitable for analysis.
- Design and implement automated processes for integrating data from various API sources.
- Collaborate with stakeholders to understand business requirements and translate them into effective BI solutions.
- Provide technical leadership and mentoring to junior BI developers.
- Document technical specifications, data dictionaries, and user guides for all BI solutions.

Required Qualifications:
- Minimum 15+ years of experience in business intelligence, data analytics, or related field.
- Expert-level proficiency with PowerBI Desktop, PowerBI Service, and PowerBI Report Server.
- Advanced knowledge of DAX, M language, and PowerQuery for sophisticated data modeling.
- Strong expertise in semantic modeling principles and best practices.
- Extensive experience with custom visualizations and complex dashboard design.
- Proficient in SQL for data manipulation and optimization.
- Experience with Python for data processing and ETL workflows.
- Proven track record of API integration and data ingestion from diverse sources.
- Strong understanding of data warehouse concepts and dimensional modeling.
- Bachelor's degree in Computer Science, Information Systems, or related field (or equivalent experience).

The ideal candidate will also possess knowledge and experience with emerging technologies and advanced PowerBI capabilities that can further enhance our BI ecosystem.

Nice to Have Skills:
- Experience implementing AI-powered analytics tools and integrating them with PowerBI.
- Proficiency with Microsoft Copilot Studio for creating AI-powered business applications.
- Expertise across the Microsoft Power Platform (Power Apps, Power Automate, Power Virtual Agents).
- Experience with third-party visualization tools such as Inforiver for enhanced reporting capabilities.
- Knowledge of writeback architecture and implementation in PowerBI solutions.
- Experience with PowerBI APIs for custom application integration and automation.
- Familiarity with DevOps practices for BI development and deployment.
- Certifications such as Microsoft Certified: Data Analyst Associate, Power BI Developer, or Azure Data Engineer.

This role offers an opportunity to work with cutting-edge business intelligence technologies while delivering impactful solutions that drive organizational success through data-driven insights. Come as You Are.
We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request an accommodation.

Posted 3 weeks ago

Apply

3.0 - 8.0 years

8 - 12 Lacs

Mumbai

Work from Office

Job Description

A "Reporting & Analytics Datasphere Consultant/BW4HANA" role involves designing, developing, and implementing data warehousing solutions using SAP Datasphere and BW/4HANA to extract, transform, and load data for comprehensive reporting and analytics. It requires expertise in data modeling, data quality, and creating visualizations to support business decision-making within an organization.

Key Responsibilities
- Requirement Gathering: Collaborate with stakeholders to understand business needs, identify data sources, and define reporting requirements for data warehousing solutions.
- Data Modeling: Design and build data models within Datasphere and BW/4HANA, including dimension and fact tables, to optimize data access and analysis.
- Data Extraction and Transformation (ETL): Develop ETL processes using Datasphere to extract data from various source systems, cleanse, transform, and load it into the data warehouse.
- Reporting Development: Create comprehensive reports and dashboards using SAP Analytics Cloud (SAC) or other visualization tools, leveraging data from the data warehouse to provide insights.
- Performance Optimization: Monitor system performance, identify bottlenecks, and implement optimizations to ensure efficient data processing and query execution.
- Data Quality Management: Establish data quality checks and monitoring processes to ensure data accuracy and integrity within the data warehouse.
- Implementation and Deployment: Deploy data warehouse solutions, including configuration, testing, and user training.
- Technical Support: Provide technical support to users on data warehouse queries, reporting issues, and system maintenance.

Required Skills
- Proficient in SAP BW/4HANA and Datasphere: Deep understanding of data modeling, data extraction, transformation, and loading functionalities within the platform.
- Data Warehousing Concepts: Strong knowledge of dimensional modeling, data mart design, and data quality best practices.
- Reporting and Visualization Tools: Expertise in using SAP Analytics Cloud (SAC) or other visualization tools to create interactive reports and dashboards.
- SQL and Programming Skills: Proficiency in SQL queries and potentially scripting languages like ABAP to manipulate and extract data.
- Business Acumen: Ability to translate business requirements into technical solutions and understand key business metrics.
- Communication Skills: Excellent communication skills to effectively collaborate with stakeholders and present technical concepts clearly.

At DXC Technology, we believe strong connections and community are key to our success. Our work model prioritizes in-person collaboration while offering flexibility to support wellbeing, productivity, individual work styles, and life circumstances. We're committed to fostering an inclusive environment where everyone can thrive.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites, or through unsolicited emails claiming to be from the company. These emails may request recipients to provide personal information or to make payments as part of their illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.

Posted 3 weeks ago

Apply

15.0 - 19.0 years

0 Lacs

karnataka

On-site

As a Senior Business Intelligence Expert, you will leverage your extensive experience in PowerBI development to create high-performance and visually compelling business intelligence solutions. Your expertise in semantic modeling, data pipeline development, and API integration will play a crucial role in transforming complex data into actionable insights through intuitive dashboards that adhere to consistent branding guidelines and utilize advanced visualizations. You will be responsible for designing, developing, and maintaining enterprise-level PowerBI solutions that drive key business decisions throughout the organization. Your proficiency in data modeling, ETL processes, and visualization best practices will be essential in delivering top-notch BI assets that meet performance standards and offer exceptional user experiences.

Key Responsibilities:
- Lead optimization and performance tuning of PowerBI reports, dashboards, and datasets to ensure fast loading times and efficient data processing.
- Enhance BI user experience by implementing consistent branding, modern visual designs, and intuitive navigation across all PowerBI assets.
- Develop and maintain complex data models using PowerBI's semantic modeling capabilities for data accuracy, consistency, and usability.
- Create and maintain data ingestion pipelines using Databricks, Python, and SQL to transform raw data into structured formats suitable for analysis.
- Design and implement automated processes for integrating data from various API sources.
- Collaborate with stakeholders to understand business requirements and translate them into effective BI solutions.
- Provide technical leadership and mentoring to junior BI developers.
- Document technical specifications, data dictionaries, and user guides for all BI solutions.

Required Qualifications:
- Minimum 15+ years of experience in business intelligence, data analytics, or related field.
- Good experience in Databricks.
- Expert-level proficiency with PowerBI Desktop, PowerBI Service, and PowerBI Report Server.
- Advanced knowledge of DAX, M language, and PowerQuery for sophisticated data modeling.
- Strong expertise in semantic modeling principles and best practices.
- Extensive experience with custom visualizations and complex dashboard design.
- Proficient in SQL for data manipulation and optimization.
- Experience with Python for data processing and ETL workflows.
- Proven track record of API integration and data ingestion from diverse sources.
- Strong understanding of data warehouse concepts and dimensional modeling.
- Bachelor's degree in Computer Science, Information Systems, or related field (or equivalent experience).

Nice to Have Skills:
- Experience implementing AI-powered analytics tools and integrating them with PowerBI.
- Proficiency with Microsoft Copilot Studio for creating AI-powered business applications.
- Expertise across the Microsoft Power Platform (Power Apps, Power Automate, Power Virtual Agents).
- Experience with third-party visualization tools such as Inforiver for enhanced reporting capabilities.
- Knowledge of writeback architecture and implementation in PowerBI solutions.
- Experience with PowerBI APIs for custom application integration and automation.
- Familiarity with DevOps practices for BI development and deployment.
- Certifications such as Microsoft Certified: Data Analyst Associate, Power BI Developer, or Azure Data Engineer.

This role offers an exciting opportunity to work with cutting-edge business intelligence technologies and deliver impactful solutions that drive organizational success through data-driven insights.
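On the API-integration requirement above, a common ingestion pattern is to page through a source endpoint until it reports no next page, accumulating records for downstream loading. A hypothetical sketch in which `fetch_page` stands in for a real HTTP call (e.g. via the requests library):

```python
# Hypothetical paginated-API ingestion loop. fetch_page simulates an endpoint
# that returns a batch of records plus a pointer to the next page (or None).
def fetch_page(page):
    data = {1: ([{"id": 1}, {"id": 2}], 2),
            2: ([{"id": 3}], None)}
    records, next_page = data[page]
    return {"records": records, "next_page": next_page}

def ingest_all(first_page=1):
    """Follow next_page pointers until the source signals the end."""
    records, page = [], first_page
    while page is not None:
        payload = fetch_page(page)
        records.extend(payload["records"])
        page = payload["next_page"]
    return records

rows = ingest_all()
print(len(rows))  # 3
```

In a real pipeline the loop body would also checkpoint the last page processed, so a failed run can resume instead of re-ingesting from the start.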

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

You are an experienced Data Modeller with specialized knowledge in designing and implementing data models for modern data platforms, specifically within the healthcare domain. Your expertise includes a deep understanding of data modeling techniques, healthcare data structures, and experience with Databricks Lakehouse architecture. Your role involves translating complex business requirements into efficient and scalable data models that support analytics and reporting needs. You will be responsible for designing and implementing logical and physical data models for Databricks Lakehouse implementations. Collaboration with business stakeholders, data architects, and data engineers is crucial to create data models that facilitate the migration from legacy systems to the Databricks platform while ensuring data integrity, performance, and compliance with healthcare industry standards. Key responsibilities include creating and maintaining data dictionaries, entity relationship diagrams, and model documentation. You will develop dimensional models, data vault models, and other relevant modeling approaches. Additionally, supporting the migration of data models, ensuring alignment with overall data architecture, and implementing data modeling best practices are essential aspects of your role. Your qualifications include extensive experience in data modeling for analytics and reporting systems, strong knowledge of dimensional modeling, data vault, and other methodologies. Proficiency in Databricks platform, Delta Lake architecture, healthcare data modeling, and industry standards is required. You should have experience in migrating data models from legacy systems, strong SQL skills, and understanding of data governance principles. 
Technical skills that you must possess include expertise in data modeling methodologies, Databricks platform, SQL, data definition languages, data warehousing concepts, ETL/ELT processes, performance tuning, metadata management, data cataloging, cloud platforms, big data technologies, and healthcare industry knowledge. Your knowledge should encompass healthcare data structures, terminology, coding systems, data standards, analytics use cases, regulatory requirements, clinical and operational data modeling challenges, and population health and value-based care data needs. Your educational background should include a Bachelor's degree in Computer Science, Information Systems, or a related field, with an advanced degree being preferred. Professional certifications in data modeling or related areas would be advantageous for this role.
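One convention from the Data Vault methodology mentioned above is deriving hub hash keys from normalized business keys, so the same patient identifier maps to the same hub record across loads and source systems. A hypothetical sketch (MD5 and uppercase-trim normalization are shown for brevity; real implementations vary in hash algorithm and normalization rules):

```python
import hashlib

# Sketch of a Data Vault hub: store one record per business key, identified
# by a deterministic hash of the normalized key. Keys and data are made up.
def hub_hash_key(business_key: str) -> str:
    """Hash of the trimmed, uppercased business key (illustrative recipe)."""
    return hashlib.md5(business_key.strip().upper().encode("utf-8")).hexdigest()

hub_patient = {}
for source_key in ["MRN-1001", " mrn-1001 ", "MRN-2002"]:
    hk = hub_hash_key(source_key)
    hub_patient.setdefault(hk, source_key.strip().upper())

print(len(hub_patient))  # 2 distinct patients despite 3 source records
```

The normalization step is what makes the key deterministic: two source systems that format the same MRN differently still land on one hub row.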

Posted 3 weeks ago

Apply

10.0 - 17.0 years

12 - 17 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office

POSITION OVERVIEW: We are seeking an experienced and highly skilled Data Engineer with deep expertise in Microsoft Fabric, MS-SQL, data warehouse architecture design, and SAP data integration. The ideal candidate will be responsible for designing, building, and optimizing data pipelines and architectures to support our enterprise data strategy. The candidate will work closely with cross-functional teams to ingest, transform, and make data (from SAP and other systems) available in our Microsoft Azure environment, enabling robust analytics and business intelligence.

KEY ROLES & RESPONSIBILITIES:
- Spearhead the design, development, deployment, testing, and management of strategic data architecture, leveraging cutting-edge technology stacks in cloud, on-prem, and hybrid environments.
- Design and implement an end-to-end data architecture within Microsoft Fabric / SQL, including Azure Synapse Analytics (incl. data warehousing); this would also encompass a Data Mesh architecture.
- Develop and manage robust data pipelines to extract, load, and transform data from SAP systems (e.g., ECC, S/4HANA, BW).
- Perform data modeling and schema design for enterprise data warehouses in Microsoft Fabric.
- Ensure data quality, security, and compliance standards are met throughout the data lifecycle.
- Enforce data security measures, strategies, protocols, and technologies, ensuring adherence to security and compliance requirements.
- Collaborate with BI, analytics, and business teams to understand data requirements and deliver trusted datasets.
- Monitor and optimize performance of data processes and infrastructure.
- Document technical solutions and develop reusable frameworks and tools for data ingestion and transformation.
Establish and maintain robust knowledge management structures, encompassing data architecture, data policies, platform usage policies, development rules, and more, ensuring adherence to best practices, regulatory compliance, and optimization across all data processes.
Implement microservices, APIs, and event-driven architecture to enable agility and scalability.
Create and maintain architectural documentation, diagrams, policies, standards, conventions, rules, and frameworks to enable effective knowledge sharing and handover.
Monitor and optimize the performance, scalability, and reliability of the data architecture and pipelines.
Track data consumption and usage patterns through automated, alert-driven tracking to ensure that infrastructure investment is effectively leveraged.

KEY COMPETENCIES:
Microsoft Certified: Fabric Analytics Engineer Associate, or an equivalent MS SQL certification.
Prior experience working in cloud environments (Azure preferred).
Understanding of SAP data structures and SAP integration tools such as SAP Data Services, SAP Landscape Transformation (SLT), or RFC/BAPI connectors.
Experience with DevOps practices and version control (e.g., Git).
Deep understanding of SAP architecture, data models, security principles, and platform best practices.
Strong analytical skills with the ability to translate business needs into technical solutions.
Experience with project coordination, vendor management, and Agile or hybrid project delivery methodologies.
Excellent communication, stakeholder management, and documentation skills.
Strong understanding of data warehouse architecture and dimensional modeling.
Excellent problem-solving and communication skills.

QUALIFICATIONS / EXPERIENCE / SKILLS
Qualifications:
Bachelor's degree in Computer Science, Information Systems, or a related field.
Certifications such as SQL, Administrator, or Advanced Administrator are preferred.
Expertise in data transformation using SQL, PySpark, and/or other ETL tools.
Strong knowledge of data governance, security, and lineage in enterprise environments.
Advanced knowledge of SQL, database procedures/packages, and dimensional modeling.
Proficiency in Python and/or Data Analysis Expressions (DAX) (preferred, not mandatory).
Familiarity with Power BI for downstream reporting (preferred, not mandatory).
Experience:
10 years of experience as a Data Engineer or in a similar role.
Skills:
Hands-on experience with Microsoft SQL (MS-SQL) and Microsoft Fabric, including Synapse (data warehousing, notebooks, Spark).
Experience integrating and extracting data from SAP systems, such as SAP ECC or S/4HANA, SAP BW, and SAP Core Data Services (CDS) Views or OData Services.
Knowledge of data protection laws across countries (preferred, not mandatory).
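The extract-load responsibilities this posting describes commonly follow a watermark-based incremental pattern: record the latest change timestamp loaded, and on each run pull only rows changed since then. The sketch below is a hedged illustration of that pattern only; the table names (`sales_doc`, `stg_sales_doc`, `etl_watermark`) are invented, and sqlite3 stands in for both the SAP-side source and the warehouse target, not any real SAP connector API.

```python
import sqlite3

# Stand-in source system (hypothetical SAP-like table with a change timestamp).
src = sqlite3.connect(":memory:")
src.executescript("""
CREATE TABLE sales_doc (doc_id INTEGER PRIMARY KEY, changed_at TEXT, amount REAL);
INSERT INTO sales_doc VALUES
  (1, '2024-01-01T09:00', 10.0),
  (2, '2024-01-02T09:00', 20.0),
  (3, '2024-01-03T09:00', 30.0);
""")

# Stand-in target: a staging table plus a watermark bookkeeping table.
tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE stg_sales_doc (doc_id INTEGER PRIMARY KEY, changed_at TEXT, amount REAL)")
tgt.execute("CREATE TABLE etl_watermark (table_name TEXT PRIMARY KEY, last_ts TEXT)")
tgt.execute("INSERT INTO etl_watermark VALUES ('sales_doc', '2024-01-01T23:59')")

def incremental_load() -> int:
    # 1. Read the high-water mark recorded by the previous run.
    (last_ts,) = tgt.execute(
        "SELECT last_ts FROM etl_watermark WHERE table_name = 'sales_doc'").fetchone()
    # 2. Extract only rows changed since then.
    rows = src.execute(
        "SELECT doc_id, changed_at, amount FROM sales_doc WHERE changed_at > ?",
        (last_ts,)).fetchall()
    # 3. Upsert into staging and advance the watermark in one transaction,
    #    so a failed run never loses or double-counts rows.
    with tgt:
        tgt.executemany(
            "INSERT INTO stg_sales_doc VALUES (?,?,?) "
            "ON CONFLICT(doc_id) DO UPDATE SET "
            "changed_at = excluded.changed_at, amount = excluded.amount",
            rows)
        if rows:
            tgt.execute(
                "UPDATE etl_watermark SET last_ts = ? WHERE table_name = 'sales_doc'",
                (max(r[1] for r in rows),))
    return len(rows)

loaded = incremental_load()
print(loaded)              # 2 rows newer than the stored watermark
print(incremental_load())  # 0 on an immediate re-run (the load is idempotent)
```

In a real Fabric or Synapse pipeline the same three steps map to a lookup activity (read watermark), a parameterized copy activity (delta extract), and a stored procedure or notebook step (upsert and advance the watermark).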

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies