Data Architect

10 - 15 years

37 - 45 Lacs

Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description

About the Role

We are looking for a Senior Data Engineer with an architect's mindset and a thought-leadership approach on the Data & Analytics team. This role goes beyond development: it's about end-to-end thinking, mentoring, and collaborating with a global team.

As a key technical leader, you will align data strategies with business needs and drive best practices in data modeling, governance, and performance optimization. You will play a pivotal role in streamlining tooling for Democratized Development within Snowflake- and DBT-based data platforms and in building high-impact data models.

You will drive performance optimization, cost efficiency, data governance, and model architecture, ensuring that our data infrastructure is scalable, secure, and high-performing.

This role requires deep expertise in cloud-based data engineering, a strong problem-solving mindset, and the ability to collaborate with business and analytics teams. You will optimize compute resources, improve data quality, and enforce governance standards, while supporting code reviews and mentoring team members to foster a high-performing data engineering culture.

Key Responsibilities

Data Acquisition & Pipeline Development

Develop and maintain scalable data pipelines for efficient data ingestion, transformation, and integration.

Work with Fivetran, Python, and other ETL/ELT tools to automate and optimize data acquisition from various sources (see the illustrative sketch after this list).

Ensure reliable data movement from SaaS platforms (e.g., Salesforce, Gong, Google Analytics) and operational databases into Snowflake.

Monitor and enhance pipeline performance, identifying areas for optimization and fault tolerance.

Evaluate and recommend new technologies for data ingestion, transformation, and orchestration.
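As a rough illustration of the ingestion work described above, the Python sketch below pulls records from a hypothetical SaaS endpoint and lands them in an assumed Snowflake staging table using the snowflake-connector-python library. The URL, credentials, warehouse, and table names are placeholders, and in practice this kind of extraction is often handled by Fivetran or an orchestrator rather than a standalone script.

```python
import json
import os

import requests
import snowflake.connector

# Placeholder endpoint and table names, for illustration only.
SOURCE_URL = "https://api.example-saas.test/v1/opportunities"
TARGET_TABLE = "RAW.SAAS.OPPORTUNITIES_STG"  # assumed staging table with a VARCHAR "payload" column


def extract(url: str) -> list[dict]:
    """Pull one batch of records from the source API."""
    resp = requests.get(
        url,
        headers={"Authorization": f"Bearer {os.environ['SOURCE_API_TOKEN']}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["records"]


def load(rows: list[dict]) -> None:
    """Insert raw records into the Snowflake staging table as JSON strings."""
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        warehouse="LOAD_WH",  # assumed warehouse name
    )
    try:
        conn.cursor().executemany(
            f"INSERT INTO {TARGET_TABLE} (payload) VALUES (%s)",
            [(json.dumps(row),) for row in rows],
        )
    finally:
        conn.close()


if __name__ == "__main__":
    load(extract(SOURCE_URL))
```

Landing raw JSON first and deferring structure to downstream DBT models keeps the ingestion layer thin and the transformation logic versioned and testable.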

Democratized Development & Tooling

Enable and streamline self-service data development for analysts and data practitioners.

Drive best practices in DBT and Snowflake for modular, reusable, and governed data modeling.

Design and maintain CI/CD pipelines to support version control, testing, and deployment in a modern data stack (a minimal example follows).
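A minimal sketch of such a CI gate, assuming a dbt project with an isolated "ci" target defined in profiles.yml; a real pipeline would typically run these steps inside GitHub Actions, GitLab CI, or a similar system.

```python
import subprocess
import sys

# Minimal CI gate: install package dependencies, then build and test the project.
# The "ci" target is an assumed profiles.yml entry pointing at an isolated schema.
STEPS = [
    ["dbt", "deps"],
    ["dbt", "build", "--target", "ci", "--fail-fast"],
]


def main() -> int:
    for cmd in STEPS:
        print("running:", " ".join(cmd))
        result = subprocess.run(cmd)
        if result.returncode != 0:
            return result.returncode  # stop on the first failing step
    return 0


if __name__ == "__main__":
    sys.exit(main())
```

Using `dbt build` rather than separate `run` and `test` invocations builds and tests each model in dependency order, so a failing test blocks its downstream models.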

Performance Optimization & Cost Efficiency

Optimize and tune Snowflake queries and workloads for performance and cost efficiency.

Implement warehouse resource scaling strategies to reduce compute costs while maintaining SLAs.

Monitor and analyze query performance, storage consumption, and data usage patterns to identify optimization opportunities (see the sketch below).
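For example, a lightweight review of expensive workloads might query Snowflake's ACCOUNT_USAGE.QUERY_HISTORY view, roughly as sketched below. The role and the 7-day window are assumptions, and ACCOUNT_USAGE views carry some ingestion latency.

```python
import os

import snowflake.connector

# Surface the longest-running queries of the past 7 days for review
# (pruning, clustering, warehouse sizing, or query rewriting).
TOP_QUERIES_SQL = """
    select query_id,
           warehouse_name,
           total_elapsed_time / 1000 as elapsed_s,
           bytes_scanned,
           left(query_text, 120) as query_preview
    from snowflake.account_usage.query_history
    where start_time >= dateadd(day, -7, current_timestamp())
    order by total_elapsed_time desc
    limit 20
"""


def main() -> None:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
        role="ACCOUNTADMIN",  # ACCOUNT_USAGE requires a suitably privileged role
    )
    try:
        for row in conn.cursor().execute(TOP_QUERIES_SQL):
            print(row)
    finally:
        conn.close()


if __name__ == "__main__":
    main()
```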

Data Quality, Governance & Security

Establish data quality monitoring frameworks, integrating automated validation and anomaly detection (a minimal example follows this list).

Enforce data governance policies, including access controls, lineage tracking, and compliance standards.

Work closely with security teams to implement data protection strategies in Snowflake.
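A minimal sketch of an automated validation step, assuming a hypothetical ANALYTICS.SALES.FCT_ORDERS table; in practice checks like these often live in dbt tests or a dedicated data-quality tool rather than a standalone script.

```python
import os

import snowflake.connector

# Illustrative checks against an assumed table; names and thresholds are placeholders.
CHECKS = {
    "orders_not_empty": (
        "select count(*) from ANALYTICS.SALES.FCT_ORDERS",
        lambda value: value > 0,
    ),
    "no_null_customer_ids": (
        "select count(*) from ANALYTICS.SALES.FCT_ORDERS where customer_id is null",
        lambda value: value == 0,
    ),
}


def run_checks() -> bool:
    conn = snowflake.connector.connect(
        account=os.environ["SNOWFLAKE_ACCOUNT"],
        user=os.environ["SNOWFLAKE_USER"],
        password=os.environ["SNOWFLAKE_PASSWORD"],
    )
    all_passed = True
    try:
        cur = conn.cursor()
        for name, (sql, passes) in CHECKS.items():
            value = cur.execute(sql).fetchone()[0]
            passed = passes(value)
            all_passed = all_passed and passed
            print(f"{name}: value={value} -> {'PASS' if passed else 'FAIL'}")
    finally:
        conn.close()
    return all_passed


if __name__ == "__main__":
    raise SystemExit(0 if run_checks() else 1)
```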

Data Model Development & Architecture Review

Design and develop scalable, well-structured data models in Snowflake.

Perform data model reviews to ensure consistency, efficiency, and alignment with business needs.

Collaborate with Analytics & BI teams to define metrics layers and transformation logic in DBT.

Collaboration & Agile Execution

Collaborate with BI teams to ensure data models meet reporting requirements in Power BI.

Partner with cross-functional teams including Sales, Marketing, Customer Support, Finance, and Product to deliver trusted, high-quality data solutions.

Work within an Agile framework, delivering iterative improvements to data infrastructure.

Stay ahead of industry trends in modern data engineering, Snowflake, and DBT to drive innovation.

Qualifications

Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.

5+ years of experience in Data Engineering with strong expertise in Snowflake.

2+ years of hands-on experience in DBT for data modeling and transformation.

2+ years of experience in Python, particularly for data pipelines and automation.

Strong expertise in SQL optimization and performance tuning.

Deep understanding of ETL/ELT architectures, data warehousing, and cloud data management best practices.

Experience implementing cost monitoring and optimization techniques in Snowflake.

Strong problem-solving skills and ability to troubleshoot complex data issues.

Excellent communication skills and ability to work in collaborative, cross-functional teams.

Experience with Agile methodologies and iterative data development processes is a plus.

Aeries Technology

Technology
