
2 Delta Sharing Jobs

JobPe aggregates listings from multiple job portals for easy access; applications are submitted directly on the original portal.

8.0 - 13.0 years

20 - 35 Lacs

Noida, Gurugram

Hybrid

Primary Skills (Databricks Engineer):
- 7+ years of total experience, including around 3 years leading and overseeing the design, development, and management of data infrastructure on the Databricks platform in an AWS/Azure cloud environment.
- Create new Databricks workspaces (premium, standard, serverless) and clusters, including right-sizing; drop unused workspaces.
- Delta Sharing: work with enterprise teams on connected data (data sharing).
- User management: create new security groups, add/delete users, and assign Unity Catalog permissions to the respective groups/teams (see the sketch after this list).
- Review and analyze Databricks logs and error messages; identify and address problems related to cluster configuration or job failures.
- Optimize Databricks notebooks and jobs for performance.
- Develop and test Databricks clusters to ensure stability and scalability.
- Outline the security and compliance obligations of Databricks.
- Create and maintain database standards and policies.
- Administer database objects to achieve optimum utilization.
- Mentor team members on cluster management, job optimization, and resource allocation within Databricks environments.
- Ensure adherence to compliance standards and maintain platform security.
- Drive adoption of advanced Databricks capabilities, such as Photon and Graviton instances, for improved efficiency.
- Regularly update and refine existing architectures to meet changing business and technology needs.
- Cloud computing expertise: a strong understanding of cloud computing services, including infrastructure, software, and platform as a service (IaaS, SaaS, and PaaS), plus proficiency in cloud platforms (AWS, Azure), networking, security, programming, scripting, database management, and automation tools.

Secondary / Good-to-Have Skills:
- DevOps experience, preferably Terraform and Git, for developing automation workflows.
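For a rough sense of the workspace-administration duties above, here is a minimal sketch using the Databricks SDK for Python (databricks-sdk). It assumes authentication comes from environment variables or a ~/.databrickscfg profile; the cluster name, size, catalog, and group are illustrative assumptions, not details from the listing.

```python
# Minimal sketch (assumed setup): right-size a cluster with auto-termination,
# then grant a security group access to a Unity Catalog catalog.
from databricks.sdk import WorkspaceClient
from databricks.sdk.service.catalog import (
    PermissionsChange, Privilege, SecurableType,
)

w = WorkspaceClient()  # picks up auth from env vars or ~/.databrickscfg

# Right-sizing: a small fixed-size cluster; auto-termination trims idle cost.
cluster = w.clusters.create_and_wait(
    cluster_name="etl-shared-small",  # hypothetical name
    spark_version=w.clusters.select_spark_version(latest=True, long_term_support=True),
    node_type_id=w.clusters.select_node_type(local_disk=True),
    num_workers=2,
    autotermination_minutes=30,
)
print(f"cluster ready: {cluster.cluster_id}")

# User management: grant USE CATALOG on a catalog to a team group
# ("main" and "data-engineers" are hypothetical names).
w.grants.update(
    securable_type=SecurableType.CATALOG,
    full_name="main",
    changes=[
        PermissionsChange(principal="data-engineers",
                          add=[Privilege.USE_CATALOG]),
    ],
)
```

In practice a role like this would run such operations from automation (e.g., Terraform or a CI job) rather than ad hoc, which is where the Terraform/Git skills come in.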

Posted 2 weeks ago


6.0 - 8.0 years

8 - 10 Lacs

Noida, Pune, Bengaluru

Hybrid

Work Mode: Hybrid (3 days work from office).
Locations: Bangalore, Noida, Pune, Mumbai, Hyderabad (candidates must be in Accion cities to collect assets and attend in-person meetings as required).

Key Requirements:

Technical Skills:
- Databricks expertise: 5+ years of hands-on experience in data engineering/ETL using Databricks on AWS/Azure cloud infrastructure; proficiency in Delta Lake, Unity Catalog, Delta Sharing, Delta Live Tables (DLT), MLflow, and Databricks SQL; experience with Databricks CI/CD tools (e.g., Bitbucket, GitHub Actions, Databricks CLI). A DLT sketch appears after this listing.
- Data warehousing & engineering: strong understanding of data warehousing concepts (Dimensional, SCD2, Data Vault, OBT, etc.); proven ability to implement highly performant data ingestion pipelines from multiple sources; experience integrating end-to-end Databricks pipelines to ensure data quality and consistency.
- Programming: strong proficiency in Python and SQL; basic working knowledge of API- or stream-based data extraction processes (e.g., Salesforce API, Bulk API).
- Cloud technologies: preferred experience with AWS services (e.g., S3, Athena, Glue, Lambda).
- Power BI: 3+ years of experience in Power BI and data warehousing for root cause analysis and business improvement opportunities.

Additional Skills:
- Working knowledge of data management principles (quality, governance, security, privacy, lifecycle management, cataloging).
- Nice to have: Databricks certifications and AWS Solution Architect certification.
- Nice to have: experience building data pipelines from business applications such as Salesforce, Marketo, NetSuite, and Workday.

Responsibilities:
- Develop, implement, and maintain highly efficient ETL pipelines on Databricks.
- Perform root cause analysis and identify opportunities for data-driven business improvements.
- Ensure quality, consistency, and governance of all data pipelines and repositories.
- Work in an Agile/DevOps environment to deliver iterative solutions.
- Collaborate with cross-functional teams to meet business requirements.
- Stay updated on the latest Databricks and AWS features, tools, and best practices.

Work Schedule: Regular, 11:00 AM to 8:00 PM; flexibility is required for project-based overlap.

Interested candidates should share their resumes with the following details:
- Current CTC
- Expected CTC
- Preferred location: Bangalore, Noida, Pune, Mumbai, Hyderabad
- Notice period
- Contact information
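To make the Databricks pipeline skills above concrete, here is a minimal Delta Live Tables sketch in Python. It assumes execution inside a Databricks DLT pipeline, where the dlt module and the spark session are provided by the runtime; the landing path and table names are illustrative assumptions, not taken from the listing.

```python
# Minimal DLT sketch (assumed names/paths): bronze ingestion with Auto Loader,
# then a silver table guarded by a data-quality expectation.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Bronze: raw orders landed incrementally via Auto Loader")
def orders_bronze():
    return (
        spark.readStream.format("cloudFiles")          # Auto Loader source
        .option("cloudFiles.format", "json")
        .load("s3://example-bucket/landing/orders/")   # hypothetical path
    )

@dlt.table(comment="Silver: cleansed orders with an ingestion timestamp")
@dlt.expect_or_drop("valid_order_id", "order_id IS NOT NULL")  # drop bad rows
def orders_silver():
    return (
        dlt.read_stream("orders_bronze")
        .withColumn("ingested_at", F.current_timestamp())
    )
```

Expectations such as expect_or_drop are one way DLT addresses the "data quality and consistency" requirement; depending on the expectation variant, failing rows can instead be logged or can fail the pipeline outright.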

Posted 3 weeks ago


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click


Download the Mobile App

Instantly access job listings, apply easily, and track applications.
