Job Description
Job Summary:
We are seeking an experienced Lead Snowflake Data Engineer to join our Data & Analytics team. This role involves designing, implementing, and optimizing Snowflake-based data solutions while providing strategic direction and leadership to a team of junior and mid-level data engineers. The ideal candidate will have deep expertise in Snowflake, cloud data platforms, ETL/ELT processes, and Medallion data architecture best practices. The lead data engineer role has a strong focus on performance optimization, security, scalability, and Snowflake credit control and management. This is a tactical role requiring independent, in-depth data analysis and data discovery to understand our existing source systems and fact and dimension data models, and to implement an enterprise data warehouse solution in Snowflake.

Essential Functions and Tasks:
• Lead the design, development, and maintenance of a scalable Snowflake data solution serving our enterprise data & analytics team.
• Architect and implement data pipelines, ETL/ELT workflows, and data warehouse solutions using Snowflake and related technologies.
• Optimize Snowflake database performance, storage, and security.
• Provide guidance on Snowflake best practices.
• Collaborate with cross-functional teams of data analysts, business analysts, data scientists, and software engineers to define and implement data solutions.
• Ensure data quality, integrity, and governance across the organization.
• Provide technical leadership and mentorship to junior and mid-level data engineers.
• Troubleshoot and resolve data-related issues, ensuring high availability and performance of the data platform.

Education and Experience Requirements:
• Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
• 7+ years of in-depth data engineering experience, including at least 3 years of dedicated experience engineering solutions in a Snowflake environment.
• Tactical expertise in ANSI SQL, performance tuning, and data modeling techniques.
• Strong experience with cloud platforms (preference for Azure) and their data services.
• Proficiency in ETL/ELT development using tools such as Azure Data Factory, dbt, Matillion, Talend, or Fivetran.
• Hands-on experience with scripting languages such as Python for data processing.
• Strong understanding of data governance, security, and compliance best practices.
• Snowflake SnowPro certification; preference for the engineering course path.
• Experience with CI/CD pipelines, DevOps practices, and Infrastructure as Code (IaC).
• Knowledge of streaming data processing frameworks such as Apache Kafka or Spark Streaming.
• Familiarity with BI and visualization tools such as Power BI.

Knowledge, Skills, and Abilities:
• Familiarity working in an agile scrum team, including sprint planning, daily stand-ups, backlog grooming, and retrospectives.
• Ability to self-manage large, complex deliverables and document user stories and tasks through Azure DevOps.
• Personal accountability for committed sprint user stories and tasks.
• Strong analytical and problem-solving skills with the ability to handle complex data challenges.
• Ability to read, understand, and apply state/federal laws, regulations, and policies.
• Ability to communicate with diverse personalities in a tactful, mature, and professional manner.
• Ability to remain flexible and work within a collaborative and fast-paced environment.
• Understand and comply with company policies and procedures.
• Strong oral, written, and interpersonal communication skills.
• Strong time management and organizational skills.

Physical Demands:
• 40 hours per week
• Occasional standing
• Occasional walking
• Sitting for prolonged periods of time
• Frequent hand and finger movement
• Communicate verbally and in writing
• Extensive use of computer keyboard and viewing of computer screen
• Specific vision abilities required by this job include close vision