Snowflake with DBT

Experience: 2 - 6 years

Salary: 7 - 17 Lacs

Posted: 1 month ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

About the Role:
We are looking for a highly skilled and passionate Data Engineer to join our dynamic data team. In this role, you will be instrumental in designing, building, and optimizing our data infrastructure, with a strong emphasis on Snowflake for data warehousing and dbt (data build tool) for data transformation. You will work across various cloud environments (AWS, Azure, GCP), ensuring our data solutions are scalable, reliable, and efficient. This position requires a deep understanding of data warehousing principles, ETL/ELT methodologies, and a commitment to data quality and governance.

Responsibilities:
- Design, develop, and maintain robust, scalable data pipelines using various data integration tools and techniques within a cloud environment.
- Build and optimize data models and transformations in Snowflake using dbt, ensuring data accuracy, consistency, and performance.
- Manage and administer Snowflake environments, including performance tuning, cost optimization, and security configuration.
- Develop and implement data ingestion strategies from diverse source systems (APIs, databases, files, streaming data) into Snowflake.
- Write, optimize, and maintain complex SQL queries for extraction, transformation, and loading (ETL/ELT) processes.
- Implement data quality checks, validation rules, and monitoring solutions within dbt and Snowflake.
- Collaborate closely with data analysts, data scientists, and business stakeholders to understand data requirements and translate them into efficient data solutions.
- Promote and enforce data governance best practices, including metadata management, data lineage, and documentation.
- Participate in code reviews, contribute to architectural discussions, and champion best practices in data engineering and dbt development.
- Troubleshoot and resolve data-related issues, ensuring data availability and reliability.
- Stay current with industry trends and new technologies in the data engineering space, particularly around Snowflake, dbt, and cloud platforms.

Qualifications (Required):
- Bachelor's degree in Computer Science, Engineering, Information Technology, or a related quantitative field.
- 3 to 5 years of professional experience as a Data Engineer.
- Expert-level proficiency with Snowflake for data warehousing, including performance optimization and resource management.
- Extensive hands-on experience with dbt (data build tool) for data modeling, testing, and documentation.
- Strong proficiency in SQL, with the ability to write complex, optimized queries.
- Solid programming skills in Python for data manipulation, scripting, and automation.
- Experience with at least one major cloud platform (AWS, Azure, or GCP) and its core data services.
- Proven understanding of data warehousing concepts, dimensional modeling, and ETL/ELT principles.
- Experience with version control systems (e.g., Git).
- Excellent analytical, problem-solving, and debugging skills.
- Strong communication and collaboration abilities, with a capacity to work effectively with cross-functional teams.
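For candidates unfamiliar with the Snowflake-plus-dbt workflow this role centers on, a minimal dbt model illustrates the "build and optimize transformations" and "cost optimization" responsibilities above. This is a generic sketch, not this company's codebase; the model, source, and column names are all hypothetical:

```sql
-- models/staging/stg_orders.sql  (hypothetical model, source, and columns)
-- Incremental materialization avoids re-scanning the full source table on
-- every run, which keeps Snowflake warehouse costs down on large tables.
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    order_date,
    amount
from {{ source('raw', 'orders') }}

{% if is_incremental() %}
  -- On incremental runs, only process rows newer than what is already loaded.
  where order_date > (select max(order_date) from {{ this }})
{% endif %}
```

The data quality checks the posting mentions would typically live alongside this model in a `schema.yml` file (e.g. `not_null` and `unique` tests on `order_id`) and run via `dbt test`.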

Randomtrees

Technology - Machine Learning

Tech City
