
Data Architect - ETL, Snowflake, DBT

Experience: 10 - 18 years

Salary: 22 - 27 Lacs

Posted: 1 month ago | Platform: Naukri


Work Mode: Remote

Job Type: Full Time

Job Description

Role: Data Architect / Data Modeler - ETL, Snowflake, DBT
Location: Remote
Duration: 14+ Months
Timings: 5:30pm IST to 1:30am IST
Note: Looking for immediate joiners

Job Summary:
We are seeking a seasoned Data Architect / Modeler with deep expertise in Snowflake, DBT, and modern data architectures, including Data Lake, Lakehouse, and Databricks platforms. The ideal candidate will be responsible for designing scalable, performant, and reliable data models and architectures that support analytics, reporting, and machine learning needs across the organization.

Key Responsibilities:
- Architect and design data solutions using Snowflake, Databricks, and cloud-native lakehouse principles.
- Lead the implementation of data modeling best practices (star/snowflake schemas, dimensional models) using DBT (a brief sketch follows the qualifications list below).
- Build and maintain robust ETL/ELT pipelines supporting both batch and real-time data processing.
- Develop data governance and metadata management strategies to ensure high data quality and compliance.
- Define data architecture frameworks, standards, and principles for enterprise-wide adoption.
- Work closely with business stakeholders, data engineers, analysts, and platform teams to translate business needs into scalable data solutions.
- Provide guidance on data lake and data warehouse integration, helping bridge structured and unstructured data needs.
- Establish data lineage and documentation, and maintain architecture diagrams and data dictionaries.
- Stay up to date with industry trends and emerging technologies in cloud data platforms, and recommend improvements.

Required Skills & Qualifications:
- 10+ years of experience in data architecture, data engineering, or data modeling roles.
- Strong experience with Snowflake, including performance tuning, security, and architecture.
- Hands-on experience with DBT (Data Build Tool) for building and maintaining data transformation workflows.
- Deep understanding of Lakehouse architecture, Data Lake implementations, and Databricks.
- Solid grasp of dimensional modeling, normalization/denormalization strategies, and data warehouse design principles.
- Experience with cloud platforms (e.g., AWS, Azure, or GCP).
- Proficiency in SQL and scripting languages (e.g., Python).
- Familiarity with data governance frameworks, data catalogs, and metadata management tools.
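
To illustrate the dimensional-modeling work described in the responsibilities, here is a minimal sketch of a DBT model. The model and staging names (dim_customers, stg_customers, stg_orders) are hypothetical and for illustration only; in DBT, ref() declares dependencies between models and config() controls how the result is materialized in Snowflake.

    -- models/marts/dim_customers.sql (hypothetical model; staging models stg_customers
    -- and stg_orders are assumed to exist)
    {{ config(materialized='table') }}

    with customers as (
        select * from {{ ref('stg_customers') }}
    ),

    orders as (
        select * from {{ ref('stg_orders') }}
    ),

    -- pre-aggregate order facts per customer for the dimension
    customer_orders as (
        select
            customer_id,
            min(order_date) as first_order_date,
            count(*)        as lifetime_order_count
        from orders
        group by customer_id
    )

    select
        c.customer_id,
        c.customer_name,
        c.region,
        co.first_order_date,
        co.lifetime_order_count
    from customers c
    left join customer_orders co
        on c.customer_id = co.customer_id

A model like this would typically be built with dbt run --select dim_customers, with column-level tests (unique, not_null) declared in the accompanying schema YAML to support the data-quality responsibilities above.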
