Experience: 4 - 9 years

Salary: 2 - 6 Lacs

Posted: 1 month ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

The Data Intelligence Platform team is responsible for housing the enterprise data within Lululemon. This data helps teams build intelligent applications, optimized processes, and personalized guest experiences. The team serves as a source of reference for cross-functional reporting, marketing, and data science teams. A successful candidate will be a problem solver and an expert in ETL programming/scripting, data modeling, data integration, and SQL, with exemplary communication skills. The candidate will need to be comfortable with ambiguity in a fast-paced and ever-changing environment, and able to think big while paying careful attention to detail. The candidate enjoys working with new technologies, can model multidimensional datasets, and can partner with cross-functional business teams to answer key business questions. Apart from building data pipelines, you will be an advocate for automation, performance tuning, and cost optimization. Be ready to question the status quo and bring forward intelligent solutions and proofs of concept.

Key Responsibilities

- Understand requirements and take ownership of the end-to-end delivery of data products/pipelines.
- Build robust and scalable data pipelines using ADF, Snowflake, Airflow, and PySpark.
- Embrace automation; reduce cost and human dependencies wherever possible.
- Build frameworks and reusable components.
- Bring a product development mindset to every aspect of data engineering work.

Qualifications

- Bachelor's degree in Computer Science, Mathematics, Statistics, or Operations Research.
- 4+ years of experience as a Data Engineer or in a similar role.
- Experience with Databricks, Airflow, Azure Data Factory, and PySpark.
- Proficiency in Python.
- SQL proficiency, with expertise in Snowflake.
- Experience with real-time data processing frameworks such as Kafka Streams.
- Strong knowledge of ETL/ELT processes, data modeling, data warehousing, and building ETL pipelines.
- Working knowledge of DevOps tools (Jenkins, Azure DevOps, GitLab) and version control (Git).
- Experience with at least one cloud platform: Azure, AWS, or GCP.
- Knowledge of data management fundamentals, data storage principles, and data architecture.
- Experience coordinating with a global engineering team.
- Excellent problem-solving skills, combined with the ability to present findings and insights clearly and compellingly, both verbally and in writing.
