
Databricks Pipeline Jobs (1 listing)

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

Experience: 3.0 - 6.0 years

Salary: ₹6 - 16 Lacs

Location: Pune

Work mode: Work from Office

Source: Naukri

Skills: Performance Testing, Databricks Pipeline

Key Responsibilities:
- Design and execute performance testing strategies specifically for Databricks-based data pipelines.
- Identify performance bottlenecks and provide optimization recommendations across Spark/Databricks workloads.
- Collaborate with development and DevOps teams to integrate performance testing into CI/CD pipelines.
- Analyze job execution metrics, cluster utilization, memory/storage usage, and latency across various stages of data pipeline processing.
- Create and maintain performance test scripts, frameworks, and dashboards using tools like JMeter, Locust, or custom Python utilities.
- Generate detailed performance reports and suggest tuning at the code, configuration, and platform levels.
- Conduct root cause analysis for slow-running ETL/ELT jobs and recommend remediation steps.
- Participate in production issue resolution related to performance and contribute to RCA documentation.

Technical Skills (Mandatory):
- Strong understanding of Databricks, Apache Spark, and performance tuning techniques for distributed data processing systems.
- Hands-on experience in Spark (PySpark/Scala) performance profiling, partitioning strategies, and job parallelization.
- 2+ years of experience in performance testing and load simulation of data pipelines.
- Solid skills in SQL, Snowflake, and analyzing performance via query plans and optimization hints.
- Familiarity with Azure Databricks, Azure Monitor, Log Analytics, or similar observability tools.
- Proficiency in scripting (Python/Shell) for test automation and pipeline instrumentation.
- Experience with DevOps tools such as Azure DevOps, GitHub Actions, or Jenkins for automated testing.
- Comfort working in Unix/Linux environments and writing shell scripts for monitoring and debugging.

Good to Have:
- Experience with job schedulers like Control-M, Autosys, or Azure Data Factory trigger flows.
- Exposure to CI/CD integration for automated performance validation.
- Understanding of network/storage I/O tuning parameters in cloud-based environments.
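The responsibilities above mention building custom Python utilities for performance test scripts and analyzing latency across pipeline stages. A minimal sketch of what such a stage-timing helper could look like is below; all names (`StageTimer`, `measure`, the `"extract"` stage) are hypothetical illustrations, not part of the posting or any specific framework:

```python
import time
from contextlib import contextmanager
from statistics import mean

class StageTimer:
    """Collects wall-clock latencies per named pipeline stage."""

    def __init__(self):
        # stage name -> list of observed durations in seconds
        self.samples = {}

    @contextmanager
    def measure(self, stage):
        # Time the wrapped block, recording even if it raises.
        start = time.perf_counter()
        try:
            yield
        finally:
            self.samples.setdefault(stage, []).append(
                time.perf_counter() - start
            )

    def report(self):
        # {stage: (runs, mean seconds, max seconds)} -- suitable
        # for feeding a dashboard or a performance report.
        return {
            s: (len(d), mean(d), max(d))
            for s, d in self.samples.items()
        }

timer = StageTimer()
for _ in range(3):
    with timer.measure("extract"):
        time.sleep(0.01)  # stand-in for real pipeline work
print(timer.report())
```

In a real harness, the `measure` blocks would wrap calls into the pipeline under test (e.g. triggering a Databricks job run and waiting for completion), and the report would be exported to whatever observability tool is in use.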

Posted 1 week ago
