5.0 - 9.0 years
0 Lacs
ahmedabad, gujarat
On-site
We are seeking a Data Modeler with expertise in mortgage banking data to support a large-scale Data Modernization program. Your primary responsibilities will include designing and developing enterprise-grade data models (3NF, Dimensional, and Semantic) to serve both analytics and operational use cases. You will collaborate closely with business and engineering teams to define data products aligned with specific business domains, translating complex mortgage banking concepts into scalable, extensible models that meet the organization's requirements. You will also ensure that data models align with modern data architecture principles and are compatible with modern data platforms and tools such as Snowflake and DBT, and contribute canonical models and reusable patterns for enterprise-wide use.

To be successful in this role, you should possess the following qualifications:
- A minimum of 5 years of experience in data modeling, with a strong emphasis on mortgage or financial services.
- Hands-on experience developing 3NF, Dimensional, and Semantic models.
- A solid understanding of data-as-a-product and domain-driven design principles.
- Familiarity with modern data ecosystems and tools such as Snowflake, DBT, and BI tools is advantageous.
- Excellent communication skills to collaborate effectively with both business and technical teams.

This position requires working onsite in either Hyderabad or Ahmedabad.
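The Dimensional models mentioned above typically take the form of a star schema: a fact table referencing conformed dimensions by surrogate key. A minimal sketch, using hypothetical mortgage-banking table and column names (none of these come from the posting):

```python
from dataclasses import dataclass

# Hypothetical star schema for loan analytics. The fact table carries
# measures plus surrogate foreign keys; descriptive attributes live in
# the dimensions. All names here are illustrative only.
@dataclass(frozen=True)
class DimBorrower:
    borrower_key: int
    name: str
    credit_band: str

@dataclass(frozen=True)
class DimProduct:
    product_key: int
    product_name: str  # e.g. "30yr fixed"

@dataclass(frozen=True)
class FactLoanBalance:
    loan_id: str
    borrower_key: int      # FK -> DimBorrower
    product_key: int       # FK -> DimProduct
    date_key: int          # FK -> a date dimension (omitted here)
    principal_balance: float

borrowers = {1: DimBorrower(1, "A. Patel", "720-759")}
products = {10: DimProduct(10, "30yr fixed")}
fact = FactLoanBalance("LN-001", 1, 10, 20240131, 245_000.0)

# An analytics query is a join of the fact to its dimensions on keys:
row = (fact.loan_id,
       borrowers[fact.borrower_key].credit_band,
       products[fact.product_key].product_name,
       fact.principal_balance)
print(row)  # -> ('LN-001', '720-759', '30yr fixed', 245000.0)
```

A 3NF model of the same data would instead keep borrower and product attributes fully normalized for operational use; the Semantic layer then exposes business-friendly names on top of either shape.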
Posted 2 days ago
5.0 - 8.0 years
8 - 11 Lacs
Hyderabad, Bengaluru
Hybrid
Use data mappings and models provided by the data modeling team to build robust pipelines in Snowflake. Design and implement data pipelines with proper 2NF/3NF normalization standards. Expert-level SQL and experience with data transformation are required.

Required candidate profile:
- Expert-level SQL and experience with data transformation.
- Data architecture and normalization techniques (2NF/3NF).
- Experience with cloud-based data platforms and pipeline design.
- Experience with AWS data services.
- Carrier CAB process.
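The 2NF/3NF normalization standards referenced above boil down to removing partial and transitive dependencies. A minimal sketch in Python, using an invented loan feed where `branch_city` depends on `branch_id` rather than on the loan key (a transitive dependency that 3NF splits out):

```python
# Denormalized input: branch_city repeats for every loan at a branch.
# All column names are illustrative, not from any real schema.
raw = [
    {"loan_id": "L1", "branch_id": "B1", "branch_city": "Hyderabad", "amount": 100},
    {"loan_id": "L2", "branch_id": "B1", "branch_city": "Hyderabad", "amount": 200},
    {"loan_id": "L3", "branch_id": "B2", "branch_city": "Pune", "amount": 150},
]

# 3NF split: branch_city is determined by branch_id, so it moves to a
# branch relation keyed on branch_id; loans keep only the foreign key.
branches = {
    r["branch_id"]: {"branch_id": r["branch_id"], "branch_city": r["branch_city"]}
    for r in raw
}
loans = [
    {"loan_id": r["loan_id"], "branch_id": r["branch_id"], "amount": r["amount"]}
    for r in raw
]

# The city is now stored once per branch instead of once per loan.
print(len(branches), len(loans))  # -> 2 3
```

In a Snowflake pipeline the same split would be expressed as two target tables populated by the transformation layer, with the join key preserved for downstream reassembly.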
Posted 1 week ago
6.0 - 8.0 years
8 - 10 Lacs
Pune
Work from Office
Job Summary

We are looking for a seasoned Data Modeler / Data Analyst to design and implement scalable, reusable logical and physical data models on Google Cloud Platform, primarily BigQuery. You will partner closely with data engineers, analytics teams, and business stakeholders to translate complex business requirements into performant data models that power reporting, self-service analytics, and advanced data science workloads.

Key Responsibilities
- Gather and analyze business requirements and translate them into conceptual, logical, and physical data models on GCP (BigQuery, Cloud SQL, Cloud Spanner, etc.).
- Design star/snowflake schemas, data vaults, and other modeling patterns that balance performance, flexibility, and cost.
- Implement partitioning, clustering, and materialized views in BigQuery to optimize query performance and cost efficiency.
- Establish and maintain data modeling standards, naming conventions, and metadata documentation to ensure consistency across analytic and reporting layers.
- Collaborate with data engineers to define ETL/ELT pipelines and ensure data models align with ingestion and transformation strategies (Dataflow, Cloud Composer, Dataproc, dbt).
- Validate data quality and lineage; work with BI developers and analysts to troubleshoot performance issues or data anomalies.
- Conduct impact assessments for schema changes and guide version-control processes for data models.
- Mentor junior analysts/engineers on data modeling best practices and participate in code/design reviews.
- Contribute to capacity planning and cost-optimization recommendations for BigQuery datasets and reservations.

Must-Have Skills
- 6-8 years of hands-on experience in data modeling, data warehousing, or database design, including at least 2 years on GCP BigQuery.
- Proficiency in dimensional modeling, 3NF, and modern patterns such as data vault.
- Expert SQL skills with demonstrable ability to optimize complex analytical queries on BigQuery (partitioning, clustering, sharding strategies).
- Strong understanding of ETL/ELT concepts and experience with tools such as Dataflow, Cloud Composer, or dbt.
- Familiarity with BI/reporting tools (Looker, Tableau, Power BI, or similar) and how model design impacts dashboard performance.
- Experience with data governance practices: data cataloging, lineage, and metadata management (e.g., Data Catalog).
- Excellent communication skills to translate technical concepts into business-friendly language and collaborate across functions.

Good to Have
- Experience working on Azure Cloud (Fabric, Synapse, Delta Lake).

Education
Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, Statistics, or a related field. Equivalent experience will be considered.
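The partitioning and clustering responsibilities above rest on one idea: a date-partitioned BigQuery table only scans (and bills for) the partitions a filter touches. A minimal simulation of that pruning effect in Python, with invented table sizes:

```python
from datetime import date

# Toy model of a table with one partition per day in January 2024,
# each holding 1 MB. Sizes and dates are illustrative assumptions.
rows = [{"event_date": date(2024, 1, d), "bytes": 1_000_000} for d in range(1, 31)]

def scanned_bytes(partitions, predicate, partition_pruning):
    """Bytes read to answer a filtered query, with or without pruning."""
    if partition_pruning:
        # With pruning, partitions outside the filter are never read.
        partitions = [p for p in partitions if predicate(p)]
    return sum(p["bytes"] for p in partitions)

pred = lambda p: p["event_date"] >= date(2024, 1, 29)
full = scanned_bytes(rows, pred, partition_pruning=False)   # whole table
pruned = scanned_bytes(rows, pred, partition_pruning=True)  # last 2 days only
print(full, pruned)  # -> 30000000 2000000
```

Clustering plays the same role within a partition (block-level pruning on the clustered columns), which is why filter columns should drive both choices during model design.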
Posted 2 months ago