
3 ER Diagrams Jobs


4.0 - 9.0 years

15 - 25 Lacs

Bengaluru

Remote


Job Description Summary: The Sr. Engineer, Data Modeler will shape and manage the data models and architecture that enable the organization to store, analyze, and leverage large-scale healthcare data efficiently. This includes developing and implementing reliable, scalable, and effective data models for data warehouse solutions using tools such as Fivetran, DBT, Snowflake, AWS, Atlan, Erwin ER Diagrams, and Sigma Computing. The role collaborates with a diverse set of stakeholders to develop a comprehensive data architecture that supports decision-making, reporting, analytics, and data governance, and requires significant experience with dimensional models, RDBMS, cloud platforms, and ETL processes. The role defines and designs data models that support data governance, data quality, and master data management (MDM), while working with stakeholders to implement data-driven solutions that enhance business outcomes in the healthcare sector. A strong focus is placed on creating a trusted data environment by ensuring accurate data mapping and implementing Golden Record practices.

Data Modeling & Architecture:
- Design and implement conceptual, logical, and physical data models, including Entity-Relationship (ER) models, Star Schema, Snowflake Schema, Data Vault modeling, and dimensional modeling.
- Lead the design of normalized and denormalized structures to meet business requirements and ensure optimal performance of the Data Warehouse and Data Marts.
- Collaborate with business and technical teams to map business requirements to data models, ensuring that Master Data Management (MDM) processes and Golden Record concepts are well defined.
- Build and maintain a comprehensive Business Glossary and Data Dictionary to standardize definitions and ensure consistency across the organization.
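The dimensional-modeling duties above center on star schemas: one fact table of additive measures joined to dimension tables via surrogate keys. A minimal sketch using SQLite; the table and column names (fact_claims, dim_patient, dim_date) are hypothetical illustrations, not taken from the listing:

```python
import sqlite3

# Minimal star schema sketch: a fact table surrounded by dimension tables,
# joined on surrogate keys. All names here are hypothetical examples.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE dim_patient (
    patient_key INTEGER PRIMARY KEY,   -- surrogate key
    patient_id  TEXT,                  -- natural/business key
    region      TEXT
);
CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY,      -- e.g. 20240115
    year     INTEGER,
    month    INTEGER
);
CREATE TABLE fact_claims (
    claim_key   INTEGER PRIMARY KEY,
    patient_key INTEGER REFERENCES dim_patient(patient_key),
    date_key    INTEGER REFERENCES dim_date(date_key),
    amount      REAL                   -- additive measure
);
""")
conn.executemany("INSERT INTO dim_patient VALUES (?, ?, ?)",
                 [(1, "P-001", "South"), (2, "P-002", "North")])
conn.execute("INSERT INTO dim_date VALUES (20240115, 2024, 1)")
conn.executemany("INSERT INTO fact_claims VALUES (?, ?, ?, ?)",
                 [(1, 1, 20240115, 120.0), (2, 2, 20240115, 80.0)])

# Typical dimensional query: slice the additive measure by a dimension.
rows = conn.execute("""
    SELECT p.region, SUM(f.amount)
    FROM fact_claims f JOIN dim_patient p USING (patient_key)
    GROUP BY p.region ORDER BY p.region
""").fetchall()
print(rows)  # [('North', 80.0), ('South', 120.0)]
```

The denormalized dimensions trade storage for simple, fast aggregation queries, which is the usual rationale for star schemas in a warehouse.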
Data Lineage & Mapping:
- Ensure that data lineage is accurately defined, visualized, and documented across the Data Warehouse environment.
- Oversee the data mapping process to track the flow of data from source to destination, ensuring consistency, integrity, and transparency of data throughout its lifecycle.

Data Governance & Quality:
- Implement data governance processes to manage data access, quality, security, and compliance.
- Define and enforce data quality standards and practices, including data cleansing, to ensure data integrity and accuracy within the data warehouse environment.
- Work with stakeholders to establish governance frameworks for data lineage, ensuring data traceability and transparency across the platform.
- Work with data architects and IT leadership to establish guidelines for data access, data security, and lifecycle management.

Real-Time Data Ingestion & Change Data Capture (CDC):
- Design and implement real-time data ingestion pipelines using Kafka, AWS Kinesis, or Snowpipe to enable streaming data integration into the data warehouse.
- Implement Change Data Capture (CDC) mechanisms to efficiently capture and propagate data changes from operational systems using tools such as Fivetran or AWS Lambda.
- Ensure low-latency processing, incremental updates, and data availability for real-time analytics and reporting.

Quality Assurance & Continuous Improvement:
- Ensure high standards for data quality through rigorous testing, data validation, and performance optimization.
- Continuously evaluate and improve data modeling processes, tools, and methodologies.

Automation & Process Improvement:
- Work with data engineers and development teams to improve data platform automation and enhance the data modeling lifecycle.
- Continuously monitor, test, and optimize data models and pipelines to ensure scalability, flexibility, and performance of the Data Warehouse.
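One common way to satisfy a CDC requirement like the one above is watermark-based incremental extraction: each run pulls only rows whose last-modified timestamp exceeds the previous run's high-water mark. A minimal sketch under assumed names (the orders table and its columns are hypothetical; production pipelines would typically use log-based CDC, Fivetran, or Snowpipe instead):

```python
import sqlite3

# Watermark-based incremental extraction: a simple CDC pattern that pulls
# only rows changed since the last successful run. Names are hypothetical.
src = sqlite3.connect(":memory:")
src.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, status TEXT, updated_at TEXT)"
)
src.executemany("INSERT INTO orders VALUES (?, ?, ?)", [
    (1, "new",     "2024-01-01T10:00:00"),
    (2, "shipped", "2024-01-02T09:30:00"),
    (3, "new",     "2024-01-03T08:15:00"),
])

def extract_changes(conn, high_water_mark):
    """Return rows modified after the watermark, plus the new watermark."""
    rows = conn.execute(
        "SELECT id, status, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (high_water_mark,),
    ).fetchall()
    # ISO-8601 strings compare correctly as text, so the last row carries
    # the new high-water mark; keep the old one if nothing changed.
    new_mark = rows[-1][2] if rows else high_water_mark
    return rows, new_mark

# Incremental run: only rows 2 and 3 were modified after the watermark.
changes, mark = extract_changes(src, "2024-01-01T12:00:00")
print(changes)
print(mark)  # '2024-01-03T08:15:00'
```

The watermark would be persisted between runs; the trade-off versus log-based CDC is that hard deletes and same-timestamp updates can be missed.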
Documentation & Reporting:
- Maintain clear and up-to-date documentation for data models, data lineage, data mappings, and architectural decisions.
- Create and present technical diagrams, such as Entity-Relationship Diagrams (ERDs), to stakeholders and ensure alignment with business objectives.
- Document all design decisions and data models, adhering to existing guidelines and ensuring clear communication across teams.
- Create presentations and visual data architecture diagrams for internal and external stakeholders.

Platform Design & Deployment:
- Develop the data architecture for the analytics platform on Snowflake and integrate it with other AWS tools for robust data management.
- Work closely with data engineers to automate data pipeline deployments and updates using Fivetran, DBT, and cloud-based solutions.

Stakeholder Collaboration:
- Partner with Product Managers and other technical teams to define requirements and deliver optimal data architecture solutions.
- Conduct regular meetings to communicate technical decisions and ensure alignment with business goals and strategy.
- Contribute to proposal creation and RFP submissions, ensuring technical feasibility and best practices.
- Perform other duties that support the overall objective of the position.

Education Required:
- Bachelor's degree in Computer Science, Information Technology, Data Science, or a related field.
- Master's degree or certifications in Data Architecture, Cloud Technologies, or related areas is a plus.
- Or any combination of education and experience that would provide the required qualifications for the position.

Experience Required:
- 6-10 years of hands-on experience in data modeling, data architecture, or information architecture with a focus on large-scale data warehouses.
- 6+ years of experience with dimensional models and relational database management systems (SQL Server, Oracle, DB2, etc.).
- 5+ years of experience with cloud technologies, especially AWS services and tools.
- Experience with ETL tools and automation (e.g., Fivetran, DBT).
- Experience with data governance, data quality frameworks, and metadata management.

Preferred:
- Experience in healthcare data modeling and data warehousing.
- Expertise in AWS environments.
- Hands-on experience with data integration and cloud automation tools.
- Familiarity with business intelligence tools (e.g., Sigma Computing).
- Understanding of healthcare-specific data governance, regulatory frameworks, and security compliance (e.g., HIPAA).

Knowledge, Skills & Abilities:
- Solid understanding of Data Vault, Star Schema, Snowflake Schema, and dimensional modeling.
- Proficiency in SQL and experience with cloud-based data warehouse solutions such as Snowflake.
- Familiarity with AWS cloud services, Sigma Computing, Atlan, and Erwin ER diagrams.
- Excellent communication skills to engage with both technical and non-technical stakeholders.
- Strong analytical and problem-solving skills to design scalable and efficient data models.
- Ability to take ownership of deliverables, manage multiple tasks, and work effectively within an Agile methodology.
- Proven leadership ability to coach and mentor junior team members.

Posted 6 days ago


4.0 - 9.0 years

20 - 30 Lacs

Pune, Bengaluru

Hybrid


Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams and supporting key business operations. This involves supporting architecture design and improvements, ensuring data integrity, and building data models.
- Design and implement agile, scalable, and cost-efficient solutions.
- Lead a team of developers; run sprint planning and execution to ensure timely deliveries.

Technical skills, qualifications, and experience required:
- Proficient in data modelling, with 5-10 years of experience.
- Experience with data modeling tools and building ER diagrams; hands-on experience with ERwin / Visio.
- Hands-on expertise in Entity-Relationship, Dimensional, and NoSQL modelling.
- Familiarity with manipulating datasets using Python.
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks).
- Exposure to UML tools such as Erwin / Visio.
- Familiarity with tools such as Azure DevOps, Jira, and GitHub.
- Analytical approaches using IE or other common notations.
- Strong hands-on experience in SQL scripting.
- Bachelor's/Master's degree in Computer Science or a related field.
- Experience leading agile scrum, sprint planning, and review sessions.
- Good communication and interpersonal skills; able to coordinate between business stakeholders and engineers.
- Strong results orientation and time management.
- True team player, comfortable working in a global team.
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases.
- Autonomy, curiosity, and innovation capability.
- Comfortable working in a multidisciplinary team within a fast-paced environment.

* Immediate joiners preferred; outstation candidates will not be considered.
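The entity-relationship modelling skills this listing asks for ultimately come down to translating entities and cardinalities from an ER diagram into keys and constraints. A minimal sketch using SQLite; the department/employee entities are hypothetical, chosen only to show a one-to-many relationship:

```python
import sqlite3

# A one-to-many ER relationship (Department 1..* Employee) translated into
# relational DDL: the "many" side carries a foreign key to the "one" side.
# Entity and attribute names are hypothetical illustrations.
conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # SQLite leaves FKs off by default
conn.executescript("""
CREATE TABLE department (
    dept_id INTEGER PRIMARY KEY,
    name    TEXT NOT NULL UNIQUE
);
CREATE TABLE employee (
    emp_id  INTEGER PRIMARY KEY,
    name    TEXT NOT NULL,
    dept_id INTEGER NOT NULL REFERENCES department(dept_id)
);
""")
conn.execute("INSERT INTO department VALUES (10, 'Data Platform')")
conn.execute("INSERT INTO employee VALUES (1, 'Asha', 10)")

# The FK constraint enforces the diagram's cardinality: an employee row
# cannot reference a department that does not exist.
try:
    conn.execute("INSERT INTO employee VALUES (2, 'Ravi', 99)")
    violated = False
except sqlite3.IntegrityError:
    violated = True
print(violated)  # True
```

In IE notation this is the crow's-foot relationship between the two entities; the DDL is the physical model derived from it.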

Posted 1 week ago


4.0 - 9.0 years

20 - 30 Lacs

Pune, Bengaluru

Hybrid


Job role & responsibilities:
- Understand operational needs by collaborating with specialized teams and supporting key business operations. This involves supporting architecture design and improvements, ensuring data integrity, and building data models.
- Design and implement agile, scalable, and cost-efficient solutions.
- Lead a team of developers; run sprint planning and execution to ensure timely deliveries.

Technical skills, qualifications, and experience required:
- Proficient in data modelling, with 4-10 years of experience.
- Experience with data modeling tools and building ER diagrams; hands-on experience with ERwin / Visio.
- Hands-on expertise in Entity-Relationship, Dimensional, and NoSQL modelling.
- Familiarity with manipulating datasets using Python.
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps, and Databricks).
- Exposure to UML tools such as Erwin / Visio.
- Familiarity with tools such as Azure DevOps, Jira, and GitHub.
- Analytical approaches using IE or other common notations.
- Strong hands-on experience in SQL scripting.
- Bachelor's/Master's degree in Computer Science or a related field.
- Experience leading agile scrum, sprint planning, and review sessions.
- Good communication and interpersonal skills; able to coordinate between business stakeholders and engineers.
- Strong results orientation and time management.
- True team player, comfortable working in a global team.
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases.
- Autonomy, curiosity, and innovation capability.
- Comfortable working in a multidisciplinary team within a fast-paced environment.

* Immediate joiners will be preferred.

Posted 2 weeks ago
