AWS Databricks

Experience: 6 - 10 years

Salary: 27 - 42 Lacs

Posted: 4 days ago | Platform: Naukri

Work Mode: Work from Office

Job Type: Full Time

Job Description

Job Summary

We are seeking an AWS Databricks professional with 6 to 10 years of experience to join our team. The ideal candidate will have expertise in Spark in Scala, Delta Sharing, and Databricks Unity Catalog. This role involves working with cutting-edge technologies such as the Databricks CLI, Delta Live Pipelines, and Structured Streaming. The candidate will play a crucial role in managing risk and ensuring data integrity using tools such as Apache Airflow, Amazon S3, and Python. The position is hybrid, with no travel required.

Responsibilities

  • Develop and maintain scalable data pipelines using Spark in Scala to ensure efficient data processing and analysis.
  • Implement Delta Sharing and Databricks Unity Catalog to manage and secure data access across the organization.
  • Utilize Databricks CLI and Delta Live Pipelines to automate data workflows and improve operational efficiency.
  • Design and execute Structured Streaming processes to handle real-time data ingestion and processing (an illustrative sketch follows this list).
  • Apply risk management strategies to identify and mitigate potential data-related risks.
  • Integrate Apache Airflow for orchestrating complex data workflows and ensuring seamless data operations.
  • Leverage Amazon S3 for data storage solutions, ensuring high availability and durability of data assets.
  • Utilize Python for scripting and automation tasks to enhance productivity and streamline processes.
  • Develop and optimize Databricks SQL queries to extract meaningful insights from large datasets.
  • Implement Databricks Delta Lake to ensure data reliability and consistency across various data sources.
  • Manage Databricks Workflows to automate and schedule data tasks, improving overall data management efficiency.
  • Collaborate with cross-functional teams to ensure alignment on data strategies and objectives.
  • Contribute to the continuous improvement of data practices and methodologies to support the company's mission.
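
Several of the responsibilities above come together in one common pattern: streaming raw files from Amazon S3 into a Delta table with Spark in Scala. The sketch below is illustrative only; the bucket paths, schema, and window size are hypothetical, and it simply shows one way Structured Streaming, Delta Lake, and S3 might be wired together on Databricks.

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions.{col, sum, window}

object HourlyRevenueStream {
  def main(args: Array[String]): Unit = {
    // On Databricks a SparkSession is provided; getOrCreate() also works in a standalone job.
    val spark = SparkSession.builder()
      .appName("hourly-revenue-stream")
      .getOrCreate()

    // Read a stream of JSON order events landing in an S3 prefix (path and schema are illustrative).
    val orders = spark.readStream
      .format("json")
      .schema("order_id STRING, amount DOUBLE, event_time TIMESTAMP")
      .load("s3://example-bucket/raw/orders/")

    // Windowed aggregation: revenue per hour, with a watermark so late events
    // are bounded and finalized windows can be emitted in append mode.
    val hourlyRevenue = orders
      .withWatermark("event_time", "1 hour")
      .groupBy(window(col("event_time"), "1 hour"))
      .agg(sum(col("amount")).alias("revenue"))

    // Write finalized windows to a Delta table; the checkpoint makes the stream
    // restartable with exactly-once output.
    hourlyRevenue.writeStream
      .format("delta")
      .outputMode("append")
      .option("checkpointLocation", "s3://example-bucket/checkpoints/hourly_revenue/")
      .start("s3://example-bucket/curated/hourly_revenue/")
      .awaitTermination()
  }
}
```

In practice a pipeline like this would be scheduled through Databricks Workflows or triggered from an Apache Airflow DAG, which is where the orchestration responsibilities above come in.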

Qualifications

  • Possess strong expertise in Spark in Scala and Databricks technologies.
  • Demonstrate proficiency in Delta Sharing and Databricks Unity Catalog.
  • Have experience with Databricks CLI and Delta Live Pipelines.
  • Show capability in Structured Streaming and risk management.
  • Be skilled in Apache Airflow and Amazon S3.
  • Have a strong command of Python for data-related tasks.
  • Be familiar with Databricks SQL and Delta Lake.
  • Understand Databricks Workflows and their applications.
  • Exhibit problem-solving skills and attention to detail.
  • Be able to work in a hybrid model with a focus on day shifts.
  • Have excellent communication and collaboration skills.
  • Be committed to continuous learning and professional development.
  • Be adaptable to changing technologies and business needs.

Certifications Required

Databricks Certified Associate Developer for Apache Spark

Cognizant

IT Services and IT Consulting

Teaneck, New Jersey
