We are seeking an experienced and strategic Data Architect to design, build, and manage our modern enterprise data platform. The ideal candidate will be a subject matter expert in Microsoft Azure data services, with deep, hands-on experience architecting solutions using Snowflake as the core data cloud platform, dbt for data transformation, and Python for automation and advanced data engineering.
Key Responsibilities
- Cloud Data Architecture: Design and implement a scalable, end-to-end data architecture on Microsoft Azure, integrating services like Azure Data Factory (ADF), Azure Synapse Analytics, Azure Databricks, Azure Functions, and Azure Data Lake Storage (ADLS Gen2).
- Snowflake Platform Leadership: Architect, deploy, and govern our Snowflake Data Cloud environment. This includes designing data models (e.g., star schema, data vault), implementing role-based access control (RBAC), optimizing virtual warehouses for performance and cost, and managing data sharing and governance features.
- Modern Data Transformation: Lead the strategy and implementation of our data transformation layer using dbt (data build tool). Establish and enforce best practices for dbt project structure, coding standards, documentation, testing, and CI/CD integration.
- Python for Data Engineering: Utilize Python for complex data ingestion pipelines, API integrations, orchestration scripting, data quality automation, and support for advanced analytics and machine learning workflows (e.g., using Snowpark for Python).
- Data Governance & Security: Design and enforce data governance policies, data quality frameworks, and security protocols across the platform. Collaborate with security and compliance teams to ensure solutions meet enterprise standards (e.g., GDPR, CCPA).
- Technical Leadership & Mentoring: Provide technical guidance and mentorship to data engineers and developers. Lead architectural design reviews and champion best practices in data engineering and analytics engineering.
- Stakeholder Collaboration: Partner with business leaders, data analysts, and product managers to understand data needs and design solutions that deliver actionable insights.
- Innovation & Evaluation: Stay current with emerging technologies and industry trends. Evaluate and recommend new tools and technologies to enhance our data platform's capabilities.
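To make the data quality automation responsibility concrete, day-to-day work includes writing small Python checks like the following sketch. This is illustrative only: the function and field names are hypothetical, and in production a check like this would run inside an orchestrated pipeline (e.g., an ADF activity or a dbt test), not standalone.

```python
def check_required_fields(rows, required):
    """Return (row_index, field) pairs where a required field is missing or blank.

    A tiny data-quality gate over a list of dict records: flags fields that
    are absent, None, or empty/whitespace-only strings.
    """
    failures = []
    for i, row in enumerate(rows):
        for field in required:
            value = row.get(field)
            if value is None or (isinstance(value, str) and not value.strip()):
                failures.append((i, field))
    return failures

# Example run with hypothetical customer records:
rows = [
    {"customer_id": "C001", "email": "a@example.com"},
    {"customer_id": "", "email": "b@example.com"},   # blank required field
    {"customer_id": "C003"},                          # missing email entirely
]
print(check_required_fields(rows, ["customer_id", "email"]))
# → [(1, 'customer_id'), (2, 'email')]
```

In practice such checks would be surfaced through the platform's quality framework (alerts, quarantine tables) rather than printed, but the shape of the work is the same.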
Required Skills and Qualifications
- Experience: 10+ years in data engineering/architecture, including at least 3 years in a data architect role on a major cloud platform.
- Azure Expertise: Proven experience designing and building enterprise-grade data solutions on Microsoft Azure. Deep knowledge of Azure Data Factory, ADLS Gen2, Azure Synapse, and Azure security/networking concepts (e.g., VNet, Private Link).
- Snowflake Expertise: Expert-level, hands-on experience with Snowflake architecture, including data modeling, performance tuning, cost optimization, security administration (RBAC), and features like Snowpipe, Streams, and Tasks.
- dbt Proficiency: Extensive experience developing, deploying, and managing production-grade data transformation pipelines using dbt (dbt Core and/or dbt Cloud).
- Python Proficiency: Strong programming skills in Python for data engineering (e.g., Pandas, PySpark, SQLAlchemy) and experience building data pipelines and automation scripts.
- Data Modeling: Strong understanding of data warehousing concepts and data modeling techniques (e.g., Kimball, Inmon, Data Vault).
- SQL Mastery: Expert-level SQL skills for complex querying, data manipulation, and performance optimization.
- DevOps Mindset: Experience with CI/CD practices and tools (e.g., Azure DevOps, GitHub Actions) for data pipelines and infrastructure (IaC using Terraform or Bicep is a plus).
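As an illustration of the Snowflake security administration and DevOps expectations above, a candidate might script RBAC grants from a simple role specification. The sketch below is hypothetical (role, database, and schema names are invented); in our environment such grants would be applied through a governed IaC process (e.g., Terraform), not ad-hoc SQL.

```python
def build_grants(role, database, schemas, privileges):
    """Generate Snowflake GRANT statements giving a role read-style access.

    Emits USAGE on the database and each schema, plus the requested
    privileges on all tables in each schema. Sketch only: real deployments
    should also handle future grants and change review.
    """
    stmts = [f"GRANT USAGE ON DATABASE {database} TO ROLE {role};"]
    for schema in schemas:
        stmts.append(f"GRANT USAGE ON SCHEMA {database}.{schema} TO ROLE {role};")
        for priv in privileges:
            stmts.append(
                f"GRANT {priv} ON ALL TABLES IN SCHEMA {database}.{schema} TO ROLE {role};"
            )
    return stmts

# Hypothetical read-only analyst role over a marts schema:
for stmt in build_grants("ANALYST_RO", "ANALYTICS", ["MARTS"], ["SELECT"]):
    print(stmt)
```

Generating grants from a declarative spec like this keeps RBAC reviewable in version control, which is the pattern we expect the architect to champion.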