Data Engineer
Our Enterprise Data Analytics (EDA) organization is looking for a Data Engineer to join our growing data engineering team. We are a globally distributed, remote-first team. You'll work in a collaborative Agile environment, using the latest engineering best practices, with involvement in all aspects of the software development lifecycle. You will craft and develop curated data products, applying standard architectural and data modeling practices. You will primarily develop data warehouse solutions using technologies such as dbt, Airflow, and Terraform.
What you get to do every single day:
- Collaborate with team members and business partners to collect business requirements, define successful analytics outcomes, and design data models
- Use engineering best practices such as version control, CI/CD, code review, and pair programming
- Design, build, and maintain ELT pipelines in the Enterprise Data Warehouse to ensure reliable business reporting
- Design and build ELT data models using SQL and dbt (a minimal sketch of this kind of pipeline follows this list)
- Build analytics solutions that provide practical insights into customer 360, finance, product, sales, and other key business domains
- Identify, design, and implement internal process improvements, such as automating manual processes and optimizing data delivery
- Work with data and analytics experts to strive for greater functionality in our data systems
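To give a concrete flavor of this work, here is a minimal sketch of an Airflow DAG that runs and then tests a dbt project on a daily schedule. It assumes Airflow 2.x; the DAG id, schedule, and the /opt/analytics/dbt project path are illustrative placeholders, not details from this posting.

    from datetime import datetime, timedelta

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    # Retry transient failures (e.g. warehouse timeouts) before alerting.
    default_args = {"retries": 2, "retry_delay": timedelta(minutes=5)}

    with DAG(
        dag_id="daily_dbt_build",  # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule_interval="@daily",
        catchup=False,
        default_args=default_args,
    ) as dag:
        # Build the dbt models in the warehouse (project path is a placeholder).
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/analytics/dbt",
        )
        # Run dbt tests so bad data fails the pipeline instead of reaching reports.
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/analytics/dbt",
        )
        dbt_run >> dbt_test

Orchestrating dbt from Airflow this way keeps transformation logic in version-controlled SQL models, while Airflow handles scheduling, retries, and dependencies.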
What you bring to the role:

Basic Qualifications
- 3+ years of data/analytics engineering experience building and maintaining data pipelines and ETL processes in big data environments
- Basic knowledge of modern as well as classic data modeling approaches (Kimball, Inmon, etc.)
- Experience with any of these programming languages: Python, Go, Java, or Scala (we primarily use Python)
- SQL knowledge and experience working with cloud columnar databases (we use Snowflake), plus working familiarity with a variety of databases
- Familiarity with processes supporting data transformation, data structures, metadata, dependency management, and workload management
- Experience performing root cause analysis on internal and external data and processes to answer specific business questions and identify opportunities for improvement
- Strong communication skills and adaptability to changing requirements and tech stacks
Preferred Qualifications
- Demonstrated experience in one or more business domains
- 1+ completed projects with dbt
- Proficient knowledge of SQL and/or Python
- Experience using Airflow as a data orchestration tool
What does our data stack look like:
- ELT (Snowflake, dbt, Airflow, Kafka, Hightouch)
- BI (Tableau, Looker)
- Infrastructure (GCP, AWS, Kubernetes, Terraform, GitHub Actions)