Data Platform Engineer

2 - 5 years

9 - 13 Lacs

Posted: 17 hours ago | Platform: Naukri


Work Mode

Work from Office

Job Type

Full Time

Job Description


About The Role

Project Role :
Data Platform Engineer

Project Role Description :
Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with the Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must have skills : Databricks Unified Data Analytics Platform

Good to have skills :
NA
Minimum 5 year(s) of experience is required

Educational Qualification :
15 years full time education
Summary:
As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while engaging in discussions to refine and enhance the overall data architecture strategy. You will be involved in various stages of the data platform lifecycle, ensuring that all components work together seamlessly to support the organization's data needs and objectives.

Roles & Responsibilities:
- Develop high-quality, scalable ETL/ELT pipelines using Databricks technologies, including Delta Lake, Auto Loader, and Delta Live Tables (DLT).
- Demonstrate excellent programming and debugging skills in Python.
- Apply strong hands-on experience with PySpark to build efficient data transformation and validation logic.
- Be proficient in at least one cloud platform: AWS, GCP, or Azure.
- Create modular dbx functions for transformation, PII masking, and validation logic, reusable across DLT and notebook pipelines.
- Implement ingestion patterns using Auto Loader with checkpointing and schema evolution for structured and semi-structured data.
- Build secure and observable DLT pipelines with DLT Expectations, supporting Bronze/Silver/Gold medallion layering.
- Configure Unity Catalog: set up catalogs, schemas, and user/group access; enable audit logging; and define masking for PII fields.
- Enable secure data access across domains and workspaces via Unity Catalog External Locations, Volumes, and lineage tracking.
- Access and utilize data assets from the Databricks Marketplace to support enrichment, model training, or benchmarking.
- Collaborate with data-sharing stakeholders to implement Delta Sharing both internally and externally.
- Integrate Power BI/Tableau/Looker with Databricks using optimized connectors (ODBC/JDBC) and Unity Catalog security controls.
- Build stakeholder-facing SQL dashboards within Databricks to monitor KPIs, data pipeline health, and operational SLAs.
- Prepare GenAI-compatible datasets: manage vector embeddings, index with Databricks Vector Search, and use Feature Store with MLflow.
- Package and deploy pipelines using Databricks Asset Bundles through CI/CD pipelines in GitHub or GitLab.
- Troubleshoot, tune, and optimize jobs using the Photon engine and serverless compute, ensuring cost efficiency and SLA reliability.
- Experience with cloud-based services relevant to data engineering, data storage, data processing, data warehousing, real-time streaming, and serverless computing.
- Hands-on experience applying performance optimization techniques.
- An understanding of data modeling and data warehousing principles is essential.

Good to Have:
1. Certifications: Databricks Certified Professional or similar certifications.
2. Machine Learning: knowledge of machine learning concepts and experience with popular ML libraries.
3. Knowledge of big data processing tools (e.g., Spark, Hadoop, Hive, Kafka).
4. Data Orchestration: Apache Airflow.
5. Knowledge of CI/CD pipelines and DevOps practices in a cloud environment.
6. Experience with ETL tools such as Informatica, Talend, Matillion, or Fivetran.
7. Familiarity with dbt (Data Build Tool).

Education Qualification:
- 15 years of full-time education is required.
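The modular PII-masking and validation helpers mentioned in the responsibilities can be sketched in plain Python. This is an illustrative sketch only (all names here are hypothetical, not from the posting); in a real Databricks pipeline, such functions would typically be wrapped as PySpark UDFs, expressed as Unity Catalog column masks, or referenced from DLT Expectations:

```python
import hashlib
import hmac

# In Databricks, load the salt from a secret scope rather than hard-coding it.
SECRET_SALT = b"rotate-me-via-a-secret-scope"


def mask_email(email: str) -> str:
    """Keep the domain for analytics; mask the local part of the address."""
    local, _, domain = email.partition("@")
    return f"{local[0]}***@{domain}" if local and domain else "***"


def hash_pii(value: str) -> str:
    """Deterministic keyed hash, so masked values still join across tables."""
    return hmac.new(SECRET_SALT, value.encode("utf-8"), hashlib.sha256).hexdigest()


def validate_not_null(record: dict, required: list) -> bool:
    """Simple validation predicate, analogous to a DLT expectation clause."""
    return all(record.get(col) is not None for col in required)
```

Keeping such helpers in a shared module (rather than inline in notebooks) is what makes them reusable across both DLT pipelines and ad-hoc notebook jobs.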

Accenture

Professional Services

Dublin
