Senior Data Engineer - Azure Data Services

Posted: 1 week ago | Platform: LinkedIn


Work Mode: Remote
Job Type: Full Time

Job Description

Job Title: Senior Data Engineer (Azure, ETL, Snowflake)
Experience: 7+ years
Location: Remote

Job Summary

We are seeking a highly skilled and experienced Senior Data Engineer with a strong background in ETL processes, cloud data platforms (Azure), Snowflake, SQL, and Python scripting. The ideal candidate will have hands-on experience building robust data pipelines, performing data ingestion from multiple sources, and working with modern data tools such as ADF, Databricks, Fivetran, and DBT.

Key Responsibilities

  • Develop and maintain end-to-end data pipelines using Azure Data Factory, Databricks, and Snowflake.
  • Write optimized SQL queries, stored procedures, and views to transform and retrieve data.
  • Perform data ingestion and integration from various formats including JSON, XML, Parquet, TXT, XLSX, etc.
  • Work on data mapping, modelling, and transformation tasks across multiple data sources.
  • Build and deploy custom connectors using Python, PySpark, or ADF (see the illustrative sketch after this list).
  • Implement and manage Snowflake as a data storage and processing solution.
  • Collaborate with cross-functional teams to ensure code promotion and versioning using GitHub.
  • Ensure smooth cloud migration and data pipeline deployment using Azure services.
  • Work with Fivetran and DBT for ingestion and transformation as required.
  • Participate in Agile/Scrum ceremonies and follow DevSecOps practices.
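
A purely illustrative sketch of the custom-connector work referenced above (not part of the role description): the Python snippet below pulls paginated JSON from a hypothetical REST endpoint and lands it as Parquet for downstream ingestion. The endpoint URL, pagination scheme, field layout, and file paths are placeholder assumptions, and pandas needs a Parquet engine such as pyarrow installed.

    # Illustrative only: a minimal custom-connector sketch.
    # The endpoint, pagination behaviour, and paths below are hypothetical.
    import requests
    import pandas as pd

    API_URL = "https://api.example.com/v1/orders"  # placeholder source endpoint

    def fetch_records(url: str, page_size: int = 100) -> list[dict]:
        """Pull paginated JSON records from a REST source (assumes an empty page ends the feed)."""
        records, page = [], 1
        while True:
            resp = requests.get(url, params={"page": page, "per_page": page_size}, timeout=30)
            resp.raise_for_status()
            batch = resp.json()
            if not batch:
                break
            records.extend(batch)
            page += 1
        return records

    def land_as_parquet(records: list[dict], path: str) -> None:
        """Flatten the JSON payload and land it as Parquet for downstream pipeline steps."""
        df = pd.json_normalize(records)
        df.to_parquet(path, index=False)

    if __name__ == "__main__":
        rows = fetch_records(API_URL)
        land_as_parquet(rows, "landing/orders.parquet")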

Mandatory Skills & Qualifications

  • 7+ years of experience in Data Engineering, ETL development, or similar roles.
  • Proficient in SQL with a strong understanding of joins, filters, and aggregations.
  • Solid programming skills in Python (functions, loops, API requests, JSON parsing, etc.).
  • Strong experience with ETL tools such as Informatica, Talend, Teradata, or DataStage.
  • Experience with Azure cloud services, specifically Azure Data Factory (ADF), Databricks, and Azure Data Lake.
  • Hands-on experience in Snowflake implementation (ETL or Storage Layer).
  • Familiarity with data modelling, data mapping, and pipeline creation.
  • Experience working with semi-structured/unstructured data formats (see the PySpark sketch after this list).
  • Working knowledge of GitHub for version control and code management.
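
The PySpark sketch below is likewise illustrative only, showing the kind of join, filter, and aggregation the SQL and Databricks requirements describe, applied to semi-structured landing files. All paths, column names, and the output location are assumed placeholders.

    # Illustrative only: a minimal PySpark transformation over hypothetical landing data.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_totals_sketch").getOrCreate()

    # Semi-structured inputs (JSON and Parquet are read the same way in Spark).
    orders = spark.read.parquet("landing/orders.parquet")
    customers = spark.read.json("landing/customers.json")

    # Join, filter, and aggregate -- the DataFrame equivalent of the SQL skills listed above.
    daily_totals = (
        orders.join(customers, on="customer_id", how="inner")
              .filter(F.col("status") == "COMPLETED")
              .groupBy("order_date", "country")
              .agg(F.sum("amount").alias("total_amount"),
                   F.count("*").alias("order_count"))
    )

    # Write the curated result for loading into the warehouse layer (e.g. Snowflake).
    daily_totals.write.mode("overwrite").parquet("curated/daily_totals")

    spark.stop()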

Good To Have / Preferred Skills

  • Experience using Fivetran and DBT for ingestion and transformation.
  • Knowledge of AWS or GCP cloud environments.
  • Familiarity with DevSecOps processes and CI/CD pipelines within Azure.
  • Proficiency in Excel and Macros.
  • Exposure to Agile methodologies (Scrum/Kanban).
  • Understanding of custom connector creation using PySpark or ADF.

Soft Skills

  • Strong analytical and problem-solving skills.
  • Effective communication and teamwork abilities.
  • Ability to work independently and take ownership of deliverables.
  • Detail-oriented with a commitment to quality.

Why Join Us?

  • Work on modern, cloud-based data platforms.
  • Exposure to a diverse tech stack and new-age data tools.
  • Flexible remote working opportunity aligned with a global team.
  • Opportunity to work on critical enterprise-level data solutions.
(ref:hirist.tech)
