
69 Dimensional Modeling Jobs

JobPe aggregates job listings for easy browsing; applications are submitted directly on the original job portal.

6.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Naukri logo

Job Role: A Data Modeler designs and creates data structures to support business processes and analytics, ensuring data integrity and efficiency. They translate business requirements into technical data models, focusing on accuracy, scalability, and consistency.
Responsibilities:
- Designing and developing data models: creating conceptual, logical, and physical models to represent data in a structured way.
- Translating business needs: working with stakeholders to understand business requirements and turn them into actionable data structures.
- Ensuring data integrity: implementing data validation rules and constraints to maintain the accuracy and reliability of data.
- Optimizing data models: tuning models for performance, scalability, and usability so data can be stored and retrieved efficiently.
- Collaborating with other teams: working with database administrators, data engineers, and business analysts to ensure data models align with business needs and technical requirements.
- Documenting data models: providing clear documentation of data structures and relationships, including entity-relationship diagrams and metadata.
Skills:
- Data modeling techniques: knowledge of approaches including normalization, denormalization, and dimensional modeling.
- Database technologies: understanding of relational databases, NoSQL databases, and other database systems.
- SQL: proficiency in writing SQL queries for database management and data manipulation.
- Data modeling tools: familiarity with tools such as PowerDesigner, ERwin, or Visio.
- Communication and collaboration: strong communication skills for working with diverse teams and stakeholders.
- Problem-solving: ability to identify and resolve data model performance issues and ensure data accuracy.
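The dimensional-modeling technique mentioned above can be illustrated with a tiny star schema: one fact table joined to its dimension tables, queried by rolling facts up along a dimension attribute. All table and column names here are invented for illustration.

```python
import sqlite3

# Minimal star schema: two dimensions and one fact table (hypothetical names).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_key INTEGER PRIMARY KEY, product_name TEXT, category TEXT);
CREATE TABLE dim_date    (date_key INTEGER PRIMARY KEY, full_date TEXT, year INTEGER);
CREATE TABLE fact_sales  (
    date_key    INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units_sold  INTEGER,
    revenue     REAL
);
""")
cur.executemany("INSERT INTO dim_product VALUES (?, ?, ?)",
                [(1, "Widget", "Hardware"), (2, "Gadget", "Software")])
cur.executemany("INSERT INTO dim_date VALUES (?, ?, ?)",
                [(20240101, "2024-01-01", 2024), (20240102, "2024-01-02", 2024)])
cur.executemany("INSERT INTO fact_sales VALUES (?, ?, ?, ?)",
                [(20240101, 1, 10, 100.0), (20240102, 1, 5, 50.0), (20240102, 2, 2, 80.0)])

# Typical dimensional query: aggregate the fact, slicing by a dimension attribute.
rows = cur.execute("""
    SELECT p.category, SUM(f.revenue)
    FROM fact_sales f
    JOIN dim_product p ON f.product_key = p.product_key
    GROUP BY p.category
    ORDER BY p.category
""").fetchall()
print(rows)  # [('Hardware', 150.0), ('Software', 80.0)]
```

The fact table holds only keys and measures; descriptive attributes live in the dimensions, which is what keeps such a model queryable at scale.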

Posted 10 hours ago

Apply

7.0 - 12.0 years

5 - 15 Lacs

Bengaluru

Work from Office

BE/B.Tech or equivalent. The data modeler designs, implements, and documents data architecture and data modeling solutions using relational, dimensional, and NoSQL databases. These solutions support enterprise information management, business intelligence, machine learning, data science, and other business interests.
- 7+ years of hands-on relational, dimensional, and/or analytic experience (using RDBMS, dimensional, and NoSQL data platform technologies, plus ETL and data ingestion protocols).
- Experience with data warehouse, data lake, and enterprise big data platforms in multi-data-center contexts required.
- Good knowledge of metadata management, data modeling, and related tools (Erwin, ER Studio, or others) required.
- Experience in team management, communication, and presentation.
- Responsible for developing conceptual, logical, and physical data models and implementing RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL).
- Oversee and govern the expansion of existing data architecture and the optimization of data query performance via best practices.
- Must be able to work independently and collaboratively.
Responsibilities:
- Implement business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning).
- Work with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models.
- Define and govern data modeling and design standards, tools, best practices, and related development for enterprise data models.
- Identify the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization.
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POCs.
- Work proactively and independently to address project requirements and articulate issues/challenges to reduce project delivery risks.
Required Skills: Data Modeling, Dimensional Modeling, Erwin, Data Management, RDBMS, SQL/NoSQL, ETL

Posted 11 hours ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Mumbai

Work from Office

Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.
Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Posted 13 hours ago

Apply

6.0 - 11.0 years

20 - 35 Lacs

Bengaluru

Remote

Greetings from BCforward INDIA TECHNOLOGIES PRIVATE LIMITED. Contract To Hire (C2H) role. Location: WFH. Payroll: BCforward. Work Mode: Hybrid.
JD - Axiom Developer
Skills: AxiomSL, Relational/Dimensional Modeling, Python
- Should have Axiom V9 and V10 experience
- Should have Free Form Report experience
- Should have US Regulatory Reporting experience
- Creation of custom workflows, report customization, configuration, and build with AxiomSL
- Must have end-to-end process knowledge
Please share your updated resume, PAN card soft copy, passport-size photo, and UAN history. Interested applicants can share an updated resume to g.sreekanth@bcforward.com. Note: Looking for immediate to 15-day joiners at most. All the best!

Posted 14 hours ago

Apply

5.0 - 7.0 years

9 - 13 Lacs

Bengaluru

Work from Office

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.
Key Responsibilities:
- Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
- Build and optimize MDX or DAX queries for advanced reporting needs.
- Create and manage data models (star/snowflake schemas) supporting business KPIs.
- Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
- Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
- Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
- Maintain data quality and consistency across data sources and reporting layers.
- Implement RLS/OLS and manage report security and governance in SSAS and Power BI.
Required Skills (Primary):
- SSAS Tabular & Multidimensional.
- SQL Server (advanced SQL, views, joins, indexes).
- DAX & MDX.
- Data modeling & OLAP concepts.
Secondary:
- ETL tools (SSIS or equivalent).
- Power BI or similar BI/reporting tools.
- Performance tuning & troubleshooting in SSAS and SQL.
- Version control (TFS/Git), deployment best practices.
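The KPI measures such cubes expose are typically ratios of sums evaluated per group, not averages of per-row ratios, which is the semantics a DAX measure like SUM(revenue)/SUM(units) gives. A plain-Python sketch of that grouping logic, with invented data and names:

```python
from collections import defaultdict

# Hypothetical fact rows: (category, units_sold, revenue).
fact_rows = [
    ("Hardware", 10, 100.0),
    ("Hardware", 5, 75.0),
    ("Software", 2, 80.0),
]

# Accumulate sums per group first, exactly as a cube aggregation would.
totals = defaultdict(lambda: [0, 0.0])  # category -> [total units, total revenue]
for category, units, revenue in fact_rows:
    totals[category][0] += units
    totals[category][1] += revenue

# The "average selling price" measure is revenue-sum over units-sum per group.
avg_price = {c: rev / units for c, (units, rev) in totals.items()}
print(avg_price)  # Software: 80.0 / 2 = 40.0
```

Averaging the per-row prices instead (10.0, 15.0 for Hardware) would give 12.5, not the correct 175/15; that distinction is why measures are defined over aggregates.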

Posted 1 day ago

Apply

12.0 - 16.0 years

1 - 1 Lacs

Hyderabad

Remote

We're Hiring: Azure Data Factory (ADF) Developer - Hyderabad. Location: Onsite at Canopy One Office, Hyderabad, or Remote. Type: Full-time/Part-time/Contract | Offshore role | Must be available to work in the Eastern Time Zone (EST).
We're looking for an experienced ADF Developer to join our offshore team supporting a major client. This role focuses on building robust data pipelines using Azure Data Factory (ADF) and working closely with client stakeholders on transformation logic and data movement.
Key Responsibilities:
- Design, build, and manage ADF data pipelines
- Implement transformations and aggregations based on mappings provided
- Work with data from the bronze (staging) area, pre-loaded via Boomi
- Collaborate with client-side data managers (based in EST) to deliver clean, reliable datasets
Requirements:
- Proven hands-on experience with Azure Data Factory
- Strong understanding of ETL workflows and data transformation
- Familiarity with data staging/bronze layer concepts
- Willingness to work Eastern Time Zone (EST) hours
Preferred Qualifications:
- Knowledge of Kimball data warehousing (huge advantage!)
- Experience working in an offshore coordination model
- Exposure to Boomi is a plus
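ADF pipelines themselves are configured in the Azure portal or as JSON, but the transformation step described above, casting and aggregating rows already landed in the bronze layer, looks like this in outline (data and field names are illustrative, not the client's):

```python
from datetime import date

# Rows as they might sit in the bronze/staging layer: everything is a string.
bronze_rows = [
    {"order_id": "A1", "amount": "100.50", "order_date": "2024-03-01"},
    {"order_id": "A2", "amount": "20.00",  "order_date": "2024-03-01"},
    {"order_id": "A3", "amount": "5.25",   "order_date": "2024-03-02"},
]

# Mapping-driven transform: cast types, then aggregate to one row per day,
# the kind of aggregation an ADF data flow would express declaratively.
daily_totals = {}
for row in bronze_rows:
    d = date.fromisoformat(row["order_date"])
    daily_totals[d] = daily_totals.get(d, 0.0) + float(row["amount"])

print(daily_totals)  # {date(2024, 3, 1): 120.5, date(2024, 3, 2): 5.25}
```

The point of the bronze layer is exactly this separation: ingestion lands raw strings untouched, and typing/aggregation happen downstream where they can be re-run.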

Posted 3 days ago

Apply

4.0 - 9.0 years

20 - 30 Lacs

Pune, Bengaluru

Hybrid

Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams and supporting key business operations. This involves supporting architecture design and improvements, ensuring data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions.
- Lead a team of developers; run sprint planning and execution to ensure timely deliveries.
Technical skills, qualifications, and experience required:
- Proficient in data modelling, with 5-10 years of experience.
- Experience with data modeling tools (Erwin) and building ER diagrams; hands-on experience with Erwin/Visio.
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling.
- Familiarity with manipulating datasets using Python.
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks).
- Exposure to UML tools such as Erwin/Visio.
- Familiarity with tools such as Azure DevOps, Jira, and GitHub.
- Analytical approaches using IE or other common notations.
- Strong hands-on experience in SQL scripting.
- Bachelor's/Master's degree in Computer Science or a related field.
- Experience leading agile scrum, sprint planning, and review sessions.
- Good communication and interpersonal skills to coordinate between business stakeholders and engineers.
- Strong results-orientation and time management.
- A true team player who is comfortable working in a global team.
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases.
- Autonomy, curiosity, and innovation capability.
- Comfortable working in a multidisciplinary team within a fast-paced environment.
* Immediate joiners preferred; outstation candidates will not be considered.

Posted 3 days ago

Apply

4.0 - 7.0 years

1 - 4 Lacs

Noida

Hybrid

Position Overview: The role is to help architect, build, and maintain a robust, scalable, and sustainable business intelligence platform. Assisted by the Data Team, this role will work with highly scalable systems, complex data models, and a large amount of transactional data.
Company Overview: BOLD is an established and fast-growing product company that transforms work lives. Since 2005, we've helped more than 10,000,000 people from all over America (and beyond!) reach higher and do better. A career at BOLD promises great challenges, opportunities, and culture. With our headquarters in Puerto Rico and offices in San Francisco and India, we're a global organization on a path to change the career industry.
Key Responsibilities:
- Architect, develop, and maintain a highly scalable data warehouse and build/maintain ETL processes.
- Use Python and Airflow to integrate data from across the business into the data warehouse.
- Integrate third-party data, such as Google Analytics, Google Ads, and Iterable, into the data warehouse.
Required Skills:
- Experience working as an ETL developer in a Data Engineering, Data Warehousing, or Business Intelligence team.
- Understanding of data integration/data engineering architecture, plus awareness of ETL standards, methodologies, guidelines, and techniques.
- Hands-on experience with the Python programming language and packages such as Pandas and NumPy.
- Strong understanding of SQL queries, aggregate functions, complex joins, and performance tuning.
- Good exposure to databases such as Snowflake, SQL Server, Oracle, or PostgreSQL (any one of these).
- Broad understanding of data warehousing and dimensional modelling concepts.
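As a rough sketch of the third-party integration described above: before loading the warehouse, a task (in Airflow this would be one task in a DAG) typically aligns the feeds on a shared key. Feed names and values below are invented for illustration.

```python
# Two hypothetical third-party feeds keyed by date, e.g. an analytics feed
# and an ad-spend feed that do not cover exactly the same days.
ga_sessions = {"2024-03-01": 1200, "2024-03-02": 1350}    # sessions per day
ad_spend    = {"2024-03-01": 300.0, "2024-03-03": 150.0}  # spend per day

# Full outer join on the date key, defaulting missing values, so the
# warehouse gets one conformed row per day.
all_dates = sorted(set(ga_sessions) | set(ad_spend))
warehouse_rows = [
    {"date": d,
     "sessions": ga_sessions.get(d, 0),
     "spend": ad_spend.get(d, 0.0)}
    for d in all_dates
]
print(warehouse_rows[0])  # {'date': '2024-03-01', 'sessions': 1200, 'spend': 300.0}
```

Keeping the join key (here, the date) consistent across feeds is the dimensional-modelling concern the listing alludes to: every source must conform to the same grain before it lands in the warehouse.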

Posted 4 days ago

Apply

4.0 - 9.0 years

20 - 35 Lacs

Hyderabad, Gurugram

Hybrid

Role & responsibilities
Job Summary: We are looking for experienced Data Modelers to support large-scale data engineering and analytics initiatives. The role involves developing logical and physical data models, working closely with business and engineering teams to define data requirements, and ensuring alignment with enterprise standards.
- Independently complete conceptual, logical, and physical data models for any supported platform, including SQL data warehouses, Spark, Databricks Delta Lakehouse, or other cloud data warehousing technologies.
- Govern data design/modelling documentation of metadata (business definitions of entities and attributes) and construct database objects for baseline and investment-funded projects, as assigned.
- Develop a deep understanding of business domains such as Customer, Sales, Finance, and Supplier, and of the enterprise technology inventory, to craft a solution roadmap that achieves business objectives and maximizes reuse.
- Drive collaborative reviews of data model design, code, data, and security features to drive data product development.
- Show expertise for data at all levels: low-latency, relational, and unstructured data stores; analytical stores and data lakes; the SAP data model.
- Develop reusable data models based on cloud-centric, code-first approaches to data management and data mapping.
- Partner with the data stewards team on data discovery and action by business customers and stakeholders.
- Provide and/or support data analysis, requirements gathering, solution development, and design reviews for enhancements to, or new, applications/reporting.
- Assist with data planning, sourcing, collection, profiling, and transformation.
- Support data lineage and mapping of source system data to canonical data stores.
- Create Source to Target Mappings (STTM) for ETL and BI developers.
Preferred candidate profile:
- Expertise in data modelling tools (ER/Studio, Erwin, IDM/ARDM models; CPG/Manufacturing/Sales/Finance/Supplier/Customer domains).
- Experience with at least one MPP database technology such as Databricks Lakehouse, Redshift, Synapse, Teradata, or Snowflake.
- Experience with version control systems like GitHub and deployment & CI tools.
- Experience with metadata management, data lineage, and data glossaries is a plus.
- Working knowledge of agile development, including DevOps and DataOps concepts.
- Working knowledge of SAP data models, particularly in the context of HANA and S/4HANA, and retail data like IRI and Nielsen Retail.
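A Source to Target Mapping (STTM) like the one mentioned above can be captured as data and applied mechanically, which is what makes it a useful hand-off artifact for ETL developers. The source/target column names and transformation rules below are hypothetical:

```python
# Each entry maps one source column to one target column with its rule.
sttm = [
    {"source": "CUST_NM",  "target": "customer_name", "rule": str.strip},
    {"source": "CUST_DOB", "target": "birth_date",    "rule": lambda v: v[:10]},  # keep date part
    {"source": "ACTV_FLG", "target": "is_active",     "rule": lambda v: v == "Y"},
]

# Applying the mapping to one source row yields the target row.
source_row = {"CUST_NM": "  Jane Doe ", "CUST_DOB": "1990-05-17T00:00:00", "ACTV_FLG": "Y"}
target_row = {m["target"]: m["rule"](source_row[m["source"]]) for m in sttm}
print(target_row)
# {'customer_name': 'Jane Doe', 'birth_date': '1990-05-17', 'is_active': True}
```

In practice the "rule" column of an STTM holds pseudo-SQL rather than callables, but the shape is the same: one declarative row per target column, reviewable independently of any pipeline code.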

Posted 1 week ago

Apply

16.0 - 22.0 years

40 - 55 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities
- Minimum 15 years of experience.
- Deep understanding of data architecture principles, data modelling, data integration, data governance, and data management technologies.
- Experience in data strategies and developing logical and physical data models on RDBMS, NoSQL, and cloud-native databases.
- Solid experience in one or more RDBMS systems (such as Oracle, DB2, SQL Server).
- Good understanding of relational, dimensional, and Data Vault modelling.
- Experience implementing two or more data models in a database with data security and access controls.
- Good experience with OLTP and OLAP systems.
- Excellent data analysis skills with demonstrable knowledge of standard datasets and sources.
- Good experience with one or more cloud DWs (e.g. Snowflake, Redshift, Synapse).
- Experience with one or more cloud platforms (e.g. AWS, Azure, GCP).
- Understanding of DevOps processes.
- Hands-on experience with one or more data modelling tools.
- Good understanding of one or more ETL tools and data ingestion frameworks.
- Understanding of data quality and data governance.
- Good understanding of NoSQL databases and modeling techniques.
- Good understanding of one or more business domains.
- Understanding of the big data ecosystem and of industry data models.
- Hands-on experience in Python.
- Experience leading large and complex teams; good understanding of agile methodology.
- Extensive expertise in leading data transformation initiatives, driving cultural change, and promoting a data-driven mindset across the organization.
- Excellent communication skills.
- Understand business requirements and translate them into conceptual, logical, and physical data models.
- Work as a principal advisor on data architecture across various data requirements: aggregation, data lakes, data models, data warehouses, etc.
- Lead cross-functional teams, define data strategies, and leverage the latest technologies in data handling.
- Define and govern data architecture principles, standards, and best practices to ensure consistency, scalability, and security of data assets across projects.
- Suggest the best modelling approach to the client based on their requirements and target architecture.
- Analyze and understand the datasets and guide the team in creating Source to Target Mappings and data dictionaries, capturing all relevant details.
- Profile the datasets to generate relevant insights.
- Optimize the data models and work with the data engineers to define ingestion logic, ingestion frequency, and data consumption patterns.
- Establish data governance practices, including data quality, metadata management, and data lineage, to ensure data accuracy, reliability, and compliance.
- Drive automation in modeling activities.
- Collaborate with business stakeholders, data owners, business analysts, and architects to design and develop the next-generation data platform.
- Closely monitor project progress and provide regular updates to leadership teams on milestones, impediments, etc.
- Guide and mentor team members, and review artifacts.
- Contribute to the overall data strategy and roadmaps.
- Propose and execute technical assessments and proofs of concept to promote innovation in the data space.

Posted 1 week ago

Apply

1.0 - 6.0 years

2 - 3 Lacs

Gurugram

Work from Office

- Inspect raw materials and products for dimensional, mechanical, & visual standards - Perform tensile, hardness, & surface finish tests - Use calipers, micrometers, & testers - Identify & correct quality issues - Work with teams to improve processes Required Candidate profile - Diploma/degree in Engg or related field - Min. 1 yr QC experience in fastener/metal manufacturing industry - Proficient in Vernier Caliper & Micrometer - Team player with attention to detail

Posted 1 week ago

Apply

10.0 - 20.0 years

25 - 40 Lacs

Hyderabad, Pune

Hybrid

Data Modeler / Lead - Healthcare Data Systems
Position Overview: We are seeking an experienced Data Modeler/Lead with deep expertise in health plan data models and enterprise data warehousing to drive our healthcare analytics and reporting initiatives. The candidate should have hands-on experience with modern data platforms and a strong understanding of healthcare industry data standards.
Key Responsibilities
Data Architecture & Modeling:
- Design and implement comprehensive data models for health plan operations, including member enrollment, claims processing, provider networks, and medical management.
- Develop logical and physical data models that support analytical and regulatory reporting requirements (HEDIS, Stars, MLR, risk adjustment).
- Create and maintain data lineage documentation and data dictionaries for healthcare datasets.
- Establish data modeling standards and best practices across the organization.
Technical Leadership:
- Lead data warehousing initiatives using modern platforms like Databricks or traditional ETL tools like Informatica.
- Architect scalable data solutions that handle large volumes of healthcare transactional data.
- Collaborate with data engineers to optimize data pipelines and ensure data quality.
Healthcare Domain Expertise:
- Apply deep knowledge of health plan operations, medical coding (ICD-10, CPT, HCPCS), and healthcare data standards (HL7, FHIR, X12 EDI).
- Design data models that support analytical, reporting, and AI/ML needs.
- Ensure compliance with healthcare regulations, including HIPAA/PHI and state insurance regulations.
- Partner with business stakeholders to translate healthcare business requirements into technical data solutions.
Data Governance & Quality:
- Implement data governance frameworks specific to healthcare data privacy and security requirements.
- Establish data quality monitoring and validation processes for critical health plan metrics.
- Lead efforts to standardize healthcare data definitions across multiple systems and data sources.
Required Qualifications
Technical Skills:
- 10+ years of experience in data modeling, with at least 4 years focused on healthcare/health plan data.
- Expert-level proficiency in dimensional modeling, data vault methodology, or other enterprise data modeling approaches.
- Hands-on experience with Informatica PowerCenter/IICS or the Databricks platform for large-scale data processing.
- Strong SQL skills and experience with Oracle Exadata and cloud data warehouses (Databricks).
- Proficiency with data modeling tools (Hackolade, ERwin, or similar).
Healthcare Industry Knowledge:
- Deep understanding of health plan data structures, including claims, eligibility, provider data, and pharmacy data.
- Experience with healthcare data standards and medical coding systems.
- Knowledge of regulatory reporting requirements (HEDIS, Medicare Stars, MLR reporting, risk adjustment).
- Familiarity with healthcare interoperability standards (HL7 FHIR, X12 EDI).
Leadership & Communication:
- Proven track record of leading data modeling projects in complex healthcare environments.
- Strong analytical and problem-solving skills with the ability to work with ambiguous requirements.
- Excellent communication skills with the ability to explain technical concepts to business stakeholders.
- Experience mentoring team members and establishing technical standards.
Preferred Qualifications:
- Experience with Medicare Advantage, Medicaid, or Commercial health plan operations.
- Cloud platform certifications (AWS, Azure, or GCP).
- Experience with real-time data streaming and modern data lake architectures.
- Knowledge of machine learning applications in healthcare analytics.
- Previous experience in a lead or architect role within healthcare organizations.

Posted 1 week ago

Apply

5.0 - 7.0 years

8 - 14 Lacs

Bengaluru

Work from Office

We are looking for an experienced SSAS Data Engineer with strong expertise in SSAS (Tabular and/or Multidimensional models), SQL, MDX/DAX, and data modeling. The ideal candidate will have a solid background in designing and developing BI solutions, working with large datasets, and building scalable SSAS cubes for reporting and analytics. Experience with ETL processes and reporting tools like Power BI is a strong plus.
Key Responsibilities:
- Design, develop, and maintain SSAS models (Tabular and/or Multidimensional).
- Build and optimize MDX or DAX queries for advanced reporting needs.
- Create and manage data models (star/snowflake schemas) supporting business KPIs.
- Develop and maintain ETL pipelines for efficient data ingestion (preferably using SSIS or similar tools).
- Implement KPIs, aggregations, partitioning, and performance tuning in SSAS cubes.
- Collaborate with data analysts, business stakeholders, and Power BI teams to deliver accurate and insightful reporting solutions.
- Maintain data quality and consistency across data sources and reporting layers.
- Implement RLS/OLS and manage report security and governance in SSAS and Power BI.
Required Skills (Primary):
- SSAS Tabular & Multidimensional.
- SQL Server (advanced SQL, views, joins, indexes).
- DAX & MDX.
- Data modeling & OLAP concepts.
Secondary:
- ETL tools (SSIS or equivalent).
- Power BI or similar BI/reporting tools.
- Performance tuning & troubleshooting in SSAS and SQL.
- Version control (TFS/Git), deployment best practices.

Posted 1 week ago

Apply

4.0 - 9.0 years

20 - 30 Lacs

Pune, Bengaluru

Hybrid

Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams and supporting key business operations. This involves supporting architecture design and improvements, ensuring data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions.
- Lead a team of developers; run sprint planning and execution to ensure timely deliveries.
Technical skills, qualifications, and experience required:
- Proficient in data modelling, with 4-10 years of experience.
- Experience with data modeling tools (Erwin) and building ER diagrams; hands-on experience with Erwin/Visio.
- Hands-on expertise in entity-relationship, dimensional, and NoSQL modelling.
- Familiarity with manipulating datasets using Python.
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks).
- Exposure to UML tools such as Erwin/Visio.
- Familiarity with tools such as Azure DevOps, Jira, and GitHub.
- Analytical approaches using IE or other common notations.
- Strong hands-on experience in SQL scripting.
- Bachelor's/Master's degree in Computer Science or a related field.
- Experience leading agile scrum, sprint planning, and review sessions.
- Good communication and interpersonal skills to coordinate between business stakeholders and engineers.
- Strong results-orientation and time management.
- A true team player who is comfortable working in a global team.
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases.
- Autonomy, curiosity, and innovation capability.
- Comfortable working in a multidisciplinary team within a fast-paced environment.
* Immediate joiners will be preferred.

Posted 1 week ago

Apply

4.0 - 6.0 years

8 - 18 Lacs

Bengaluru

Hybrid

Seeking detail-oriented BI Developers with MicroStrategy expertise to design, develop, and maintain BI solutions. Convert business needs into actionable insights using dashboards, reports, and analytics tools to support smart, data-driven decisions.

Posted 1 week ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Pune, Chennai

Work from Office

Data Modeler- Primary Skills: Data modeling (conceptual, logical, physical), relational/dimensional/data vault modeling, ERwin/IBM Infosphere, SQL (Oracle, SQL Server, PostgreSQL, DB2), banking domain data knowledge (Retail, Corporate, Risk, Compliance), data governance (BCBS 239, AML, GDPR), data warehouse/lake design, Azure cloud. Secondary Skills: MDM, metadata management, real-time modeling (payments/fraud), big data (Hadoop, Spark), streaming platforms (Confluent), M&A data integration, data cataloguing, documentation, regulatory trend awareness. Soft skills: Attention to detail, documentation, time management, and teamwork.

Posted 1 week ago

Apply

2.0 - 5.0 years

7 - 11 Lacs

Bengaluru

Work from Office

Req ID: 310007. We are currently seeking a Digital Engineering Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).
Data Modeler
Position Overview: The Data Modeler will be responsible for designing and implementing data models that support the organization's data management and analytics needs. This role involves collaborating with various stakeholders to understand data sources, relationships, and business requirements, and translating them into effective data structures.
Key Responsibilities:
- Collaborate with business analysts: understand different data sources and their relationships.
- Prepare a conformed dimension matrix: identify the different grains of facts, finalize dimensions, and harmonize data across sources.
- Create data models: develop Source to Target Mapping (STM) documentation and custom mappings (both technical and non-technical).
- Include transformation rules: ensure STMs include pseudo-SQL queries for transformation rules.
- Coordinate reviews: work with data architects, product owners, and enablement teams to review and approve models, STMs, and custom mappings.
- Engage with data engineers: clarify any questions related to STMs and custom mappings.
Required Technical Skills:
- Proficiency in SQL: strong understanding of SQL and database management systems.
- Data modeling tools: familiarity with tools such as ERwin, IBM InfoSphere Data Architect, or similar.
- Data warehousing concepts: solid knowledge of data warehousing principles, ETL processes, and OLAP.
- Data governance and compliance: understanding of data governance frameworks and compliance requirements.
Key Competencies:
- Analytical skills: ability to analyze complex data sets and derive meaningful insights.
- Attention to detail: ensure accuracy and consistency in data models.
- Communication skills: effectively collaborate with stakeholders and articulate technical concepts to non-technical team members.
- Project management skills: ability to prioritize tasks, manage timelines, and coordinate with cross-functional teams.
- Continuous learning and adaptability: commitment to ongoing professional development and adaptability to changing business needs and technologies.
Additional:
- Problem-solving abilities: innovative solutions to data integration, quality, and performance challenges.
- Knowledge of data modeling methodologies: entity-relationship modeling, dimensional modeling, normalization techniques.
- Familiarity with business intelligence tools: enhances the ability to design data structures that facilitate data analysis and visualization.
Preferred Qualifications:
- Experience in the SDLC: understanding of all phases of the Software Development Life Cycle.
- Certifications: relevant certifications in data modeling, data warehousing, or related fields.
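The conformed dimension matrix mentioned above (often called a bus matrix in Kimball-style design) can be sketched as a fact-by-dimension grid; the dimensions shared by every fact are the conformed ones that must be harmonized across sources. The fact and dimension names below are illustrative:

```python
# Which dimensions each fact table (business process) uses -- invented names.
facts = {
    "fact_orders":    {"dim_date", "dim_customer", "dim_product"},
    "fact_shipments": {"dim_date", "dim_customer", "dim_carrier"},
}

# Render the matrix: one row per fact, one column per dimension.
dimensions = sorted(set().union(*facts.values()))
for fact, dims in sorted(facts.items()):
    print(fact, ["X" if d in dims else "." for d in dimensions])

# Dimensions appearing in every fact are the conformed dimensions: these must
# carry identical keys and attribute definitions across all sources.
conformed = set.intersection(*facts.values())
print(sorted(conformed))  # ['dim_customer', 'dim_date']
```

Drill-across queries (combining orders with shipments, say) are only possible through those conformed dimensions, which is why the matrix is prepared before any STMs are written.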

Posted 2 weeks ago

Apply

7.0 - 12.0 years

25 - 35 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Role: Data Modeler/Senior Data Modeler
Exp: 5 to 12 yrs
Locations: Hyderabad, Pune, Bengaluru
Position: Permanent
Must-have skills:
- Strong SQL
- Strong data warehousing skills
- ER/relational/dimensional data modeling
- Data Vault modeling
- OLAP, OLTP
- Schemas & data marts
Good-to-have skills:
- ERwin / ER Studio
- Cloud platforms (AWS or Azure)
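Data Vault modeling, listed among the must-have skills above, separates business keys (hubs) from descriptive attributes (satellites), with rows typically joined on a hash of the normalized business key. A minimal sketch with invented table and source names:

```python
import hashlib
from datetime import datetime, timezone

def hash_key(business_key: str) -> str:
    """Hash of the normalized business key; Data Vault commonly uses MD5/SHA."""
    return hashlib.md5(business_key.strip().upper().encode()).hexdigest()

# Hub: just the business key plus its hash key (illustrative names).
hub_customer = {"hk_customer": hash_key("CUST-001"), "customer_bk": "CUST-001"}

# Satellite: descriptive attributes with load metadata, keyed to the hub.
sat_customer = {
    "hk_customer": hub_customer["hk_customer"],  # FK back to the hub
    "load_ts": datetime.now(timezone.utc).isoformat(),
    "record_source": "crm_extract",              # hypothetical source system
    "customer_name": "Jane Doe",
}
print(sat_customer["hk_customer"] == hub_customer["hk_customer"])  # True
```

Normalizing before hashing means the same customer arriving as "CUST-001" or " cust-001 " lands on the same hub row, while every descriptive change becomes a new timestamped satellite row, giving the full history the pattern is valued for.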

Posted 2 weeks ago

Apply

3.0 - 8.0 years

3 - 7 Lacs

Mumbai, Pune, Bengaluru

Work from Office

Locations: Bangalore, Mumbai, Pune, Hyderabad, Chennai, Kolkata, Delhi
- 3-8 years of IT experience in the development and implementation of Business Intelligence and Data Warehousing solutions using OBIEE/OAC/OAS.
- Knowledge of analysis, design, development, customization, implementation, and maintenance of OBIEE/OAS/OAC.
- Must have experience working in OAC, including security design and implementation in OAC.
- Must have good knowledge of and experience in RPD development and dimensional modelling, including but not limited to handling multiple-fact-table dimensional modelling, facts of different grain levels, MUDE environments, hierarchies, fragmentation, etc.
- Must have good knowledge of OAC DV.
- Knowledge of OAC/Oracle DV/FAW/OBIA & BI Publisher is a plus.
- Sound knowledge of writing SQL and debugging queries.
- Strong front-end skills in developing OBIEE/OAS/OAC reports and dashboards using different views.
- Good knowledge of performance tuning of reports (indexing, caching, aggregation, SQL modification, hints, etc.).
- Excellent communication skills; organized and effective in delivering high-quality solutions using OBIEE/OAS/OAC.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Surat

Work from Office


Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.
Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.
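The SQL-based data validation work this role describes can be sketched with two common data-quality checks: duplicate surrogate keys and nulls in a required column. This is a generic illustration against an in-memory SQLite table; the hypothetical `customer` table and its rules are assumptions for the example, not from the posting.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE customer (customer_id INTEGER, email TEXT)")
cur.executemany("INSERT INTO customer VALUES (?, ?)",
                [(1, "a@x.com"), (2, None), (2, "b@x.com"), (3, "c@x.com")])

# Check 1: keys that appear more than once (should be unique).
dupes = cur.execute("""
    SELECT customer_id, COUNT(*) FROM customer
    GROUP BY customer_id HAVING COUNT(*) > 1
""").fetchall()

# Check 2: rows violating a NOT NULL business rule on email.
missing = cur.execute(
    "SELECT COUNT(*) FROM customer WHERE email IS NULL"
).fetchone()[0]

print(dupes, missing)  # [(2, 2)] 1
```

In practice such checks are parameterized per table and run as part of the load pipeline, with violations routed to a rejects table for investigation rather than printed.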

Posted 2 weeks ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Chennai

Remote


Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.
Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Agra

Remote


Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.
Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Posted 2 weeks ago

Apply

7.0 - 9.0 years

5 - 9 Lacs

Kanpur

Remote


Key Responsibilities:
- Design and develop data models to support business intelligence and analytics solutions.
- Work with Erwin or Erwin Studio to create, manage, and optimize conceptual, logical, and physical data models.
- Implement dimensional modelling techniques for data warehousing and reporting.
- Collaborate with business analysts, data engineers, and stakeholders to gather and understand data requirements.
- Ensure data integrity, consistency, and compliance with Banking domain standards.
- Work with Snowflake to develop and optimize cloud-based data models.
- Write and execute complex SQL queries for data analysis and validation.
- Identify and resolve data quality issues and inconsistencies.
Required Skills & Qualifications:
- 7+ years of experience in data modelling and data analysis.
- Strong expertise in Erwin or Erwin Studio for data modeling.
- Experience with dimensional modelling and data warehousing (DWH) concepts.
- Proficiency in Snowflake and SQL for data management and querying.
- Previous experience in the Banking domain is mandatory.
- Strong analytical and problem-solving skills.
- Ability to work independently in a remote environment.
- Excellent verbal and written communication skills.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Pune, Gurugram, Jaipur

Work from Office


Role & Responsibilities:
- Very good in Erwin data modelling.
- Good knowledge of data quality and data cataloging.
- Good understanding of data lineage.
- Good understanding of data modelling in both OLTP and OLAP systems.
- Experience working with data warehouse and big data architectures.
- Very good in ANSI SQL.
- Good understanding of data visualization.
- Comfortable and experienced in data analysis.
- Good knowledge of data cataloging tools and data access policies.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


Role & Responsibilities:
- Very good in Erwin data modelling.
- Good knowledge of data quality and data cataloging.
- Good understanding of data lineage.
- Good understanding of data modelling in both OLTP and OLAP systems.
- Experience working with data warehouse and big data architectures.
- Very good in ANSI SQL.
- Good understanding of data visualization.
- Comfortable and experienced in data analysis.
- Good knowledge of data cataloging tools and data access policies.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies