Posted: 4 days ago
Remote | Full Time
Mode: Remote
Education and Work Experience Requirements:
Key Responsibilities:
Databricks Platform: Act as a subject matter expert for the Databricks platform within the Digital Capital team, providing technical guidance, best practices, and innovative solutions.
Databricks Workflows and Orchestration: Design and implement complex data pipelines using Azure Data Factory or Qlik Replicate.
End-to-End Data Pipeline Development: Design, develop, and implement highly scalable and
efficient ETL/ELT processes using Databricks notebooks (Python/Spark or SQL) and other
Databricks-native tools.
Delta Lake Expertise: Utilize Delta Lake to build a reliable data lake architecture, implementing ACID transactions, schema enforcement, and time travel, and optimizing data storage for performance (see the illustrative sketch after this list).
Spark Optimization: Optimize Spark jobs and queries for performance and cost efficiency within the Databricks environment, demonstrating a deep understanding of Spark architecture, partitioning, caching, and shuffle operations (see the tuning sketch after this list).
Data Governance and Security: Implement and enforce data governance policies, access
controls, and security measures within the Databricks environment using Unity Catalog and other
Databricks security features.
Collaborative Development: Work closely with data scientists, data analysts, and business stakeholders to understand data requirements and translate them into Databricks-based data solutions.
Monitoring and Troubleshooting: Establish and maintain monitoring, alerting, and logging for Databricks jobs and clusters, proactively identifying and resolving data pipeline issues.
Code Quality and Best Practices: Champion best practices for Databricks development, including version control (Git), code reviews, testing frameworks, and documentation.
Performance Tuning: Continuously identify and implement performance improvements for existing Databricks data pipelines and data models.
Cloud Integration: Experience integrating Databricks with other cloud services (e.g., Azure Data Lake Storage Gen2, Azure Synapse Analytics, Azure Key Vault) for a seamless data ecosystem.
Traditional Data Warehousing & SQL: Design, develop, and maintain schemas and ETL processes for traditional enterprise data warehouses. Demonstrate expert-level proficiency in SQL for complex data manipulation, querying, and optimization within relational database systems.
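To illustrate the Delta Lake and ETL/ELT responsibilities above, here is a minimal PySpark sketch of an incremental load into a Delta table. It assumes a Databricks notebook where the `spark` session is already provided; the storage paths, the `orders` dataset, and the `order_id` key are hypothetical placeholders, not details from this posting.

```python
from delta.tables import DeltaTable

# Hypothetical ADLS Gen2 paths; replace with your storage account and containers.
raw_path = "abfss://raw@<storage-account>.dfs.core.windows.net/orders/"
delta_path = "abfss://curated@<storage-account>.dfs.core.windows.net/orders_delta/"

# Read the latest raw batch (schema inferred here for brevity; prefer explicit schemas in production).
incoming = spark.read.format("json").load(raw_path)

if not DeltaTable.isDeltaTable(spark, delta_path):
    # Initial load: create the Delta table; later appends are schema-enforced.
    incoming.write.format("delta").mode("overwrite").save(delta_path)
else:
    # Incremental load: upsert on the business key with ACID guarantees.
    target = DeltaTable.forPath(spark, delta_path)
    (target.alias("t")
        .merge(incoming.alias("s"), "t.order_id = s.order_id")
        .whenMatchedUpdateAll()
        .whenNotMatchedInsertAll()
        .execute())

# Time travel: read an earlier version of the table for audits or rollback checks.
previous = spark.read.format("delta").option("versionAsOf", 0).load(delta_path)
```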
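For the Spark optimization responsibility, the following tuning sketch shows a few common levers: broadcasting a small dimension to avoid a shuffle, caching a reused result, controlling output file counts, and Databricks-side file compaction. The table names are hypothetical examples, and `spark` is again the notebook-provided session.

```python
from pyspark.sql import functions as F

# Hypothetical Unity Catalog / metastore tables.
orders = spark.read.table("sales.orders")        # large fact table
customers = spark.read.table("sales.customers")  # small dimension table

# Broadcast the small dimension so the join avoids a full shuffle of the fact table.
enriched = orders.join(F.broadcast(customers), "customer_id")

# Cache only because the enriched result feeds several downstream aggregations.
enriched.cache()

daily = (enriched
         .groupBy("order_date")
         .agg(F.sum("amount").alias("revenue")))

# Reduce the number of output files before writing; tune the count to the data volume.
(daily.coalesce(8)
      .write.format("delta")
      .mode("overwrite")
      .saveAsTable("sales.daily_revenue"))

# Databricks maintenance: compact small files and co-locate data for selective queries.
spark.sql("OPTIMIZE sales.daily_revenue ZORDER BY (order_date)")
```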
Additional Information:
Qualifications: BE, MS, M.Tech, or MCA.
Certifications: Databricks Certified Associate.
Pradeepit Consulting Services