
598 Azure Databricks Jobs

JobPe aggregates listings for easy access, but applications are made directly on the original job portal.

3.0 - 8.0 years

12 - 19 Lacs

Pune

Hybrid

Naukri logo

This role is only for Pune-local candidates (not for relocation candidates).
Role: Data Engineer (C2H)
Experience: 3-8 yrs
Location: Kharadi, Pune
Excellent communication skills required. Notice period: immediate joiner to 1 month.

Primary Skills: Python, document intelligence, NLP, unstructured data extraction (OpenAI and prompt engineering desirable)
Secondary Skills: Azure infrastructure experience and Databricks

Mandatory Skills:
1. Data Infrastructure & Engineering: designing, building, productionizing, and maintaining scalable and reliable data infrastructure and data products; experience with data modeling, pipeline idempotency, and operational observability.
2. Programming Languages: proficiency in one or more object-oriented programming languages such as Python, Scala, Java, or C#.
3. Database Technology: strong experience with SQL and NoSQL databases, query structures and design best practices, and scalability, readability, and reliability in database design.
4. Distributed Systems: experience implementing large-scale distributed systems in collaboration with senior team members.
5. Software Engineering Best Practices: technical design and reviews; unit testing, monitoring, and alerting; code versioning, code reviews, and documentation; CI/CD pipeline development and maintenance.
6. Security & Compliance: deploying secure and well-tested software and data assets; meeting privacy and compliance requirements.
7. Site Reliability Engineering: service reliability, on-call rotations, defining and maintaining SLAs; infrastructure as code and containerized deployments.

Job Description: Able to enrich data through transformation and joining with other datasets. Able to analyze data and derive statistical insights. Able to convey a story through data visualization. Able to build data pipelines for diverse interfaces. Good understanding of API workflows. Technical Skills: AWS Data Lake, AWS Data Hub, and the AWS cloud platform.

Interested candidates, share your resume at dipti.bhaisare@in.experis.com
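The "pipeline idempotency" requirement above is worth unpacking: a load step is idempotent when replaying the same batch leaves the target unchanged. A minimal sketch in Python using SQLite as a stand-in store (table and column names are hypothetical, not from the posting):

```python
import sqlite3

# Stand-in warehouse; in practice this would be Databricks/Delta Lake or Azure SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (event_id TEXT PRIMARY KEY, payload TEXT)")

def load_batch(rows):
    # Keying on event_id and using INSERT OR REPLACE makes the load idempotent:
    # replaying a batch overwrites existing rows instead of duplicating them.
    conn.executemany(
        "INSERT OR REPLACE INTO events (event_id, payload) VALUES (?, ?)", rows
    )
    conn.commit()

batch = [("e1", "a"), ("e2", "b")]
load_batch(batch)
load_batch(batch)  # replay the same batch, e.g. after a pipeline retry
count = conn.execute("SELECT COUNT(*) FROM events").fetchone()[0]  # still 2
```

The same idea carries over to Delta Lake, where MERGE INTO on a business key plays the role of the upsert.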

Posted 6 hours ago

Apply

8.0 - 13.0 years

20 - 30 Lacs

Chennai

Remote


Job Summary: We are seeking a highly skilled Azure Solution Architect to design, implement, and oversee cloud-based solutions on Microsoft Azure. The ideal candidate will have a deep understanding of cloud architecture, a strong technical background, and the ability to align Azure capabilities with business needs. You will lead the architecture and design of scalable, secure, and resilient Azure solutions across multiple projects.

Role & responsibilities: Design end-to-end data architectures on Azure using Microsoft Fabric, Data Lake (ADLS Gen2), Azure SQL/Synapse, and Power BI. Lead the implementation of data integration and orchestration pipelines using Azure Data Factory and Fabric Data Pipelines. Architect Lakehouse/Data Warehouse solutions for both batch and real-time processing, ensuring performance, scalability, and cost optimization. Establish data governance, lineage, and cataloging frameworks using Microsoft Purview and other observability tools. Enable data quality, classification, and privacy controls aligned with compliance and regulatory standards. Drive adoption of event-driven data ingestion patterns using Event Hubs, Event Grid, or Stream Analytics. Provide architectural oversight on reporting and visualization solutions using Power BI integrated with Fabric datasets and models. Define architecture standards, data models, and reusable components to accelerate project delivery. Collaborate with data stewards, business stakeholders, and engineering teams to define functional and non-functional requirements. Support CI/CD, infrastructure as code, and DevOps for data pipelines using Azure DevOps or GitHub Actions. Lead Proofs of Concept (PoCs) and performance evaluations for emerging Azure data services and tools. Monitor system performance, data flow, and health using Azure Monitor and Fabric observability capabilities.

Required Qualifications: Bachelor's degree in Computer Science, Data Engineering, or a related field. 5+ years of experience as a data architect or solution architect in cloud data environments. 3+ years of hands-on experience designing and implementing data solutions on Microsoft Azure. Strong hands-on expertise with: Azure Data Factory; Microsoft Fabric (Data Engineering, Data Warehouse, Real-Time Analytics, Power BI); Azure Data Lake (ADLS Gen2), Azure SQL, and Synapse Analytics; Power BI for enterprise reporting and data modeling. Experience with data governance and cataloging tools, ideally Microsoft Purview. Proficient in data modeling techniques (dimensional, normalized, or data vault). Strong understanding of security, RBAC, data encryption, Key Vault, and privacy requirements in Azure.

Preferred Qualifications: Microsoft Certified: Azure Solutions Architect Expert (AZ-305) or Azure Enterprise Data Analyst Associate (DP-500). Hands-on experience with Microsoft Fabric end-to-end implementation. Familiarity with medallion architecture, Delta Lake, and modern lakehouse principles. Experience in Agile/Scrum environments and stakeholder engagement across business and IT. Strong communication skills, with the ability to explain complex concepts to both technical and non-technical audiences.

Posted 7 hours ago

Apply

7.0 - 10.0 years

22 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Exp: 6 to 9 years. Role: Azure Data Engineer. Position: Permanent FTE. Company: Data Analytics MNC. Locations: Pune, Hyderabad, Bengaluru. Mode: Hybrid (2-3 days from office). MUST HAVE: - Very strong Python coding skills, at least 3 years hands-on - Excellent SQL skills, at least 3 years hands-on - Strong PySpark skills - In-depth hands-on experience in Azure Databricks & Data Factory - Strong knowledge of data warehousing. Important note: candidates must have PF in all companies worked for throughout their career, under one UAN.

Posted 7 hours ago

Apply

8.0 - 12.0 years

5 - 10 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


Teradata to Snowflake and Databricks migration on Azure Cloud; experience with data migration projects, including complex migrations to Databricks; strong expertise in ETL pipeline design and optimization, particularly for cloud environments and large-scale data migration.

Posted 8 hours ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office


We are seeking a skilled Business Analyst with 4-6 years of experience, including at least 2 years in Azure Data Engineering projects, for a 6-month remote full-time role. The ideal candidate will work closely with stakeholders to gather and analyze business and technical requirements, collaborate with Azure Data Engineers, and support design decisions across data integration, transformation, and storage layers. Strong SQL skills, an understanding of data governance, and experience with data platforms are essential. Excellent communication and stakeholder management skills are required. Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote.

Posted 9 hours ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Noida, Pune, Bengaluru

Work from Office


Hi All, Happiest Minds Technologies is looking for Azure Databricks Developers & Architects. The mode of interview will be face-to-face on 28th June. Interview locations: Pune, Bangalore, Noida.

1. Azure Databricks - 5 to 10 Yrs
As a Senior Azure Data Engineer, you will leverage Azure technologies to drive data transformation, analytics, and machine learning. You will design scalable Databricks data pipelines using PySpark, transforming raw data into actionable insights. Your role includes building, deploying, and maintaining machine learning models using MLlib or TensorFlow while optimizing cloud data integration from Azure Blob Storage, Data Lake, and SQL/NoSQL sources. You will execute large-scale data processing using Spark Pools, fine-tuning configurations for efficiency. The ideal candidate holds a Bachelor's or Master's in Computer Science, Data Science, or a related field, with 7+ years in data engineering and 3+ years specializing in Azure Databricks, PySpark, and Spark Pools. Proficiency in Python, PySpark, Pandas, NumPy, SciPy, Spark SQL, DataFrames, RDDs, Delta Lake, Databricks Notebooks, and MLflow is required, along with hands-on experience in Azure Data Lake, Blob Storage, and Synapse Analytics.

2. Azure Databricks Architect - 10 to 15 Yrs
Key Responsibilities: Architect and design end-to-end data solutions on Azure, with a focus on Databricks. Lead data architecture initiatives, ensuring alignment with best practices and business objectives. Collaborate with stakeholders to define data strategies, architectures, and roadmaps. Migrate and transform data from Oracle to Azure Data Lake. Ensure data solutions are secure, reliable, and scalable. Provide technical leadership and mentorship to junior team members.
Required Skills: Extensive experience with Azure Data Services, including Azure Data Factory and Azure SQL Data Warehouse. Deep expertise in Databricks, including Spark and Delta Lake. Strong understanding of data architecture principles and best practices. Proven track record of leading large-scale data projects and initiatives. Design data integration strategies, ensuring seamless integration between Azure services and on-premises/cloud applications. Optimize performance and cost efficiency for Databricks clusters, data pipelines, and storage systems. Monitor and manage cloud resources to ensure high availability, performance, and scalability. Should have experience in setting up and configuring Azure DevOps. Excellent communication and collaboration skills.
Interested candidates can send their resume to sreemoy.p.das@happiestminds.com

Posted 10 hours ago

Apply

7.0 - 12.0 years

20 - 27 Lacs

Bengaluru

Work from Office


TECHNICAL SKILLS AND EXPERIENCE
Most important: 7+ years of professional experience as a data engineer, with at least 4 utilizing cloud technologies. Proven experience building ETL or ELT data pipelines with Databricks in either Azure or AWS using PySpark. Strong experience with the Microsoft Azure data stack (Databricks, Data Lake Gen2, ADF, etc.). Strong SQL skills and proficiency in Python, adhering to standards such as PEP 8. Proven experience with unit testing and applying appropriate testing methodologies using libraries such as Pytest, Great Expectations, or similar. Demonstrable experience with CI/CD, including release and test automation tools and processes such as Azure DevOps, Terraform, PowerShell, and Bash scripting or similar. Strong understanding of data modeling, data warehousing, and OLAP concepts. Excellent technical documentation skills.
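The unit-testing expectation above (Pytest, Great Expectations, etc.) usually comes down to keeping pipeline transformations as pure functions so they can be tested without a Spark cluster. A hedged sketch; the function and field names are illustrative, not from the posting:

```python
def clean_amounts(records):
    """Drop rows with a missing amount and normalise amounts to float."""
    return [
        {**r, "amount": float(r["amount"])}
        for r in records
        if r.get("amount") not in (None, "")
    ]

# A pytest-style test: pytest would collect this automatically,
# but it also runs as a plain assertion.
def test_clean_amounts():
    raw = [{"id": 1, "amount": "10.5"}, {"id": 2, "amount": None}]
    assert clean_amounts(raw) == [{"id": 1, "amount": 10.5}]

test_clean_amounts()
```

In a Databricks/PySpark codebase the same pattern applies: factor DataFrame logic into functions that take and return DataFrames, and test them against small local inputs.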

Posted 11 hours ago

Apply

5.0 - 10.0 years

18 - 24 Lacs

Bengaluru

Work from Office


Hiring Senior Data Engineer (5+ yrs) with expertise in Azure Data Factory, Databricks, PySpark, and AWS to build scalable ETL pipelines. Location: Bangalore/Hyderabad/Chennai. Immediate to 30-day joiners. Share your resume to vadiraj@vtrickstech.com. Benefits: health insurance, provident fund.

Posted 12 hours ago

Apply

8.0 - 13.0 years

20 - 35 Lacs

Hyderabad

Work from Office


We are looking for an experienced Azure Data Engineer with 2+ years of hands-on experience in Azure Data Lake and Azure Data Factory. The ideal candidate will have a strong background in connecting data sources to the Data Lake, writing PySpark and SQL code, and building SSIS packages. Additionally, experience in data architecture, data modeling, and creating visualizations is essential.
Key Responsibilities: Work with Azure Data Lake and Azure Data Factory to design, implement, and manage data pipelines. Connect various data sources (applications, databases, etc.) to the Azure Data Lake for storage and processing. Write PySpark and SQL code and SSIS packages for data retrieval and transformation from different data sources. Design and develop efficient data architecture and data modeling solutions to support business requirements. Create data visualizations to communicate insights to stakeholders and decision-makers. Optimize data workflows and pipelines for better performance and scalability. Collaborate with cross-functional teams to ensure seamless data integration and delivery. Ensure data integrity, security, and compliance with best practices.
Skills and Qualifications: 2+ years of experience working with Azure Data Lake, Azure Data Factory, and related Azure services. Proficiency in writing PySpark and SQL code for data extraction and transformation. Experience in developing SSIS packages for data integration and automation. Strong understanding of data architecture and data modeling concepts. Experience in creating effective and insightful data visualizations using tools like Power BI or similar. Familiarity with cloud-based storage and computing concepts and best practices. Strong problem-solving skills with an ability to troubleshoot and optimize data workflows. Ability to collaborate effectively in a team environment and communicate with stakeholders.
Preferred Qualifications: Certifications in Azure (e.g., Azure Data Engineer or similar) would be a plus. Experience with other Azure tools like Azure Synapse, Databricks, etc.

Posted 13 hours ago

Apply

4.0 - 8.0 years

10 - 20 Lacs

Gurugram

Remote


The Role: We are seeking a Cloud Infrastructure and DevOps Engineer with a strong background in designing, automating, and maintaining secure, scalable cloud environments on Microsoft Azure. The ideal candidate will have hands-on experience with Azure Data Lake, Azure Data Factory, and Databricks, and will play a key role in supporting data platform operations through infrastructure-as-code, CI/CD automation, and monitoring best practices.
Key Responsibilities: Design and implement cloud infrastructure to support data engineering workloads across Azure Data Lake, Azure Data Factory, and Databricks. Develop and maintain infrastructure-as-code using tools like Terraform. Automate build, release, and deployment pipelines using Azure DevOps or GitHub Actions. Set up and maintain monitoring, alerting, and logging for Azure data services to ensure performance and reliability. Manage role-based access control (RBAC), service principals, and security configurations for Azure resources. Ensure high availability, disaster recovery, and backup configurations are in place across critical workloads. Collaborate with data engineers and architects to optimise pipeline orchestration and resource provisioning. Implement governance, cost optimization, and compliance across cloud environments. Provide ongoing support and enhancements post-deployment.
Qualifications and Experience: Strong hands-on experience with Microsoft Azure, especially deploying and managing Azure Data Lake, Azure Data Factory (ADF), and Azure Databricks. Proficiency in infrastructure-as-code (IaC) using tools such as Terraform. Experience building CI/CD pipelines in Azure DevOps or equivalent tools (GitHub Actions, Jenkins). Knowledge of containerization (Docker) and orchestration (Kubernetes or Azure Kubernetes Service - AKS) is advantageous. Familiarity with Azure networking, identity and access management, and security best practices. Comfortable working with scripting languages like PowerShell, Bash, or Python. Proven ability to analyse complex problems and deliver practical solutions. Strong written and verbal communication skills to interact with both technical and non-technical stakeholders.
Certifications: Microsoft Certified: Azure Administrator Associate (AZ-104) - highly preferred. Microsoft Certified: DevOps Engineer Expert (AZ-400) - highly desirable.

Posted 13 hours ago

Apply

0.0 - 5.0 years

0 Lacs

Pune

Remote


The candidate must be proficient in Python and its libraries and frameworks; good with data modeling, PySpark, MySQL concepts, Power BI, and AWS/Azure concepts; experienced in optimizing large transactional databases; and familiar with data visualization tools, Databricks, and FastAPI.

Posted 1 day ago

Apply

7.0 - 12.0 years

16 - 31 Lacs

Pune, Delhi / NCR, Mumbai (All Areas)

Hybrid


Job Title: Lead Data Engineer

Job Summary: The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout and maintenance of data integration initiatives. This role will contribute to implementation methodologies and best practices, and will work on project teams to analyse, design, develop and deploy business intelligence / data integration solutions to support a variety of customer needs. The position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings and initiatives through mentoring and coaching. It provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions with adherence to established data guidelines and approaches from a business unit or project perspective, whilst leveraging best-fit technologies (e.g., cloud, Hadoop, NoSQL) and approaches to address business and environmental challenges. The role works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports and business intelligence best practices, and is responsible for repeatable, lean and maintainable enterprise BI design across organizations, partnering effectively with the client team. We expect leadership not only in the conventional sense but also within the team: candidates should show qualities such as innovation, critical thinking, optimism/positivity, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing and approachability.

Responsibilities: Design, develop, test, and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow, Sqoop, etc. Create functional & technical documentation, e.g. ETL architecture documentation, unit testing plans and results, data integration specifications, data testing plans, etc. Provide a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical, and physical data models based on those needs. Perform data analysis to validate data models and to confirm the ability to meet business needs. May serve as project or DI lead, overseeing multiple consultants from various competencies. Stay current with emerging and changing technologies to best recommend and implement beneficial technologies and approaches for Data Integration. Ensure proper execution/creation of methodology, training, templates, resource plans and engagement review processes. Coach team members to ensure understanding on projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities when appropriate. Coordinate and consult with the project manager, client business staff, client technical staff and project developers on data architecture best practices and anything else that is data-related at the project or business unit levels. Architect, design, develop and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations and best-practice standards; toolsets include but are not limited to SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau and Qlik. Work with the report team to identify, design and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.

Required Qualifications: 10 years of industry implementation experience with data integration tools such as AWS services (Redshift, Athena, Lambda, Glue, S3), ETL, etc. 5-8 years of management experience required; 5-8 years of consulting experience preferred. Minimum of 5 years of data architecture, data modelling or similar experience. Bachelor's degree or equivalent experience; Master's degree preferred. Strong data warehousing, OLTP systems, data integration and SDLC background. Strong experience in orchestration, with working experience in cloud-native / 3rd-party ETL data load orchestration tools such as Data Factory, HDInsight, Data Pipeline, Cloud Composer or similar. Understanding of and experience with major data architecture philosophies (Dimensional, ODS, Data Vault, etc.). Understanding of modern data warehouse capabilities and technologies such as real-time, cloud and Big Data. Understanding of on-premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP). Strong experience in Agile processes (Scrum cadences, roles, deliverables) and working experience in Azure DevOps, JIRA or similar, with experience in CI/CD using one or more code management platforms. Strong Databricks experience, required to create notebooks in PySpark. Experience using major data modelling tools (examples: ERwin, ER/Studio, PowerDesigner, etc.). Experience with major database platforms (e.g. SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift, etc.). 3-5 years' development experience in decision support / business intelligence environments utilizing tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau, Looker, etc.

Preferred Skills & Experience: Knowledge of and working experience with Data Integration processes, such as Data Warehousing, EAI, etc. Experience in providing estimates for Data Integration projects, including testing, documentation, and implementation. Ability to analyse business requirements as they relate to data movement and transformation processes, and to research, evaluate and recommend alternative solutions. Ability to provide technical direction to other team members, including contractors and employees. Ability to contribute to conceptual data modelling sessions to accurately define business processes independently of data structures, and then combine the two together. Proven experience leading team members, directly or indirectly, in completing high-quality major deliverables with superior results. Demonstrated ability to serve as a trusted advisor that builds influence with client management beyond simply EDM. Can create documentation and presentations such that they "stand on their own". Can advise sales on evaluation of Data Integration efforts for new or existing client work. Can contribute to internal/external Data Integration proofs of concept. Demonstrates the ability to create new and innovative solutions to problems that have not previously been encountered. Ability to work independently on projects as well as collaborate effectively across teams. Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success. Strong team-building, interpersonal, analytical, problem identification and resolution skills. Experience working with multi-level business communities. Can effectively utilise SQL and/or an available BI tool to validate/elaborate business rules. Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems/issues. Effectively influences and, at times, oversees business and data analysis activities to ensure sufficient understanding and quality of data. Demonstrates a complete understanding of and utilises DSC methodology documents to efficiently complete assigned roles and associated tasks. Deals effectively with all team members and builds strong working relationships/rapport with them. Understands and leverages a multi-layer semantic model to ensure scalability, durability, and supportability of the analytic solution. Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytic standpoint.

Posted 2 days ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Kolkata, Pune, Bengaluru

Work from Office


About Client: Hiring for one of the most prestigious multinational corporations!
Job Title: Azure Data Engineer
Qualification: Any Graduate or above
Relevant Experience: 4 to 10 yrs
Must-Have Skills: Azure, ADB, PySpark
Roles and Responsibilities: Strong experience in implementation and management of a Lakehouse using Databricks and the Azure tech stack (ADLS Gen2, ADF, Azure SQL). Strong hands-on expertise with SQL, Python, Apache Spark and Delta Lake. Proficiency in data integration techniques, ETL processes and data pipeline architectures. Demonstrable experience using Git and building CI/CD pipelines for code management. Develop and maintain technical documentation for the platform. Ensure the platform is developed with software engineering, data analytics and data security practices in mind. Develop and optimize data processing and data storage systems, ensuring high performance, reliability, and security. Experience working in Agile methodology and knowledgeable in using ADO Boards for sprint deliveries. Excellent communication skills, able to communicate technical and business concepts clearly both verbally and in writing. Ability to work in a team environment and collaborate effectively with all levels by sharing ideas and knowledge.
Location: Kolkata, Pune, Mumbai, Bangalore, BBSR
Notice period: Immediate / 90 days
Shift Timing: General Shift
Mode of Interview: Virtual
Mode of Work: WFO
Thanks & Regards,
Bhavana B
Black and White Business Solutions Pvt. Ltd.
Bangalore, Karnataka, India.
Direct Number: 8067432454
bhavana.b@blackwhite.in | www.blackwhite.in

Posted 2 days ago

Apply

4.0 - 9.0 years

20 - 30 Lacs

Pune, Bengaluru

Hybrid


Job role & responsibilities: Understanding operational needs by collaborating with specialized teams; supporting key business operations, including supporting architecture design and improvements, ensuring data integrity, building data models, and designing and implementing agile, scalable, and cost-efficient solutions; leading a team of developers, implementing sprint planning and execution to ensure timely deliveries.
Technical skills, qualifications and experience required:
- Proficient in data modelling, with 5-10 years of experience
- Experience with data modelling tools (Erwin); building ER diagrams; hands-on experience with Erwin / Visio
- Hands-on expertise in entity-relationship, dimensional and NoSQL modelling
- Familiarity with manipulating datasets using Python
- Exposure to Azure cloud services (Azure Data Factory, Azure DevOps and Databricks)
- Exposure to UML tools like Erwin/Visio
- Familiarity with tools such as Azure DevOps, Jira and GitHub
- Analytical approaches using IE or other common notations
- Strong hands-on experience in SQL scripting
- Bachelor's/Master's degree in Computer Science or a related field
- Experience leading agile scrum, sprint planning and review sessions
- Good communication and interpersonal skills to coordinate between business stakeholders and engineers
- Strong results-orientation and time management
- A true team player, comfortable working in a global, multidisciplinary team within a fast-paced environment
- Ability to establish relationships with stakeholders quickly in order to collaborate on use cases
- Autonomy, curiosity and innovation capability
Immediate joiners will be preferred; outstation candidates will not be considered.

Posted 2 days ago

Apply

5.0 - 8.0 years

22 - 25 Lacs

Pune, Gurugram, Bengaluru

Hybrid


Key Responsibilities: Design and develop ETL/ELT pipelines using Azure Data Factory, Snowflake, and DBT. Build and maintain data integration workflows from various data sources to Snowflake. Write efficient and optimized SQL queries for data extraction and transformation. Work with stakeholders to understand business requirements and translate them into technical solutions. Monitor, troubleshoot, and optimize data pipelines for performance and reliability. Maintain and enforce data quality, governance, and documentation standards. Collaborate with data analysts, architects, and DevOps teams in a cloud-native environment. Must-Have Skills: Strong experience with Azure cloud platform services. Proven expertise in Azure Data Factory (ADF) for orchestrating and automating data pipelines. Proficiency in SQL for data analysis and transformation. Hands-on experience with Snowflake and SnowSQL for data warehousing. Practical knowledge of DBT (Data Build Tool) for transforming data in the warehouse. Experience working in cloud-based data environments with large-scale datasets. Good-to-Have Skills: Experience with Azure Data Lake, Azure Synapse, or Azure Functions. Familiarity with Python or PySpark for custom data transformations. Understanding of CI/CD pipelines and DevOps for data workflows. Exposure to data governance, metadata management, or data catalog tools. Knowledge of business intelligence tools (e.g., Power BI, Tableau) is a plus. Qualifications: Bachelor's or Master's degree in Computer Science, Data Engineering, Information Systems, or a related field. 5+ years of experience in data engineering roles using Azure and Snowflake. Strong problem-solving, communication, and collaboration skills.
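The ADF + Snowflake + DBT stack described above follows the ELT pattern: land raw data first, then transform it inside the warehouse with SQL models. A minimal sketch using Python's sqlite3 as a stand-in warehouse (in the role this would be Snowflake with dbt; all table names here are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# "Extract/Load": raw data lands untransformed, as ADF would copy it in.
conn.execute("CREATE TABLE raw_orders (id INTEGER, status TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO raw_orders VALUES (?, ?, ?)",
    [(1, "paid", 10.0), (2, "void", 5.0), (3, "paid", 7.5)],
)
# "Transform": a dbt model is essentially a named SELECT materialized downstream.
conn.execute("""
    CREATE TABLE stg_paid_orders AS
    SELECT id, amount FROM raw_orders WHERE status = 'paid'
""")
total = conn.execute("SELECT SUM(amount) FROM stg_paid_orders").fetchone()[0]  # 17.5
```

The design point is that transformations live in version-controlled SQL rather than in the ingestion tool, which is what makes the DBT layer testable and reviewable.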

Posted 3 days ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Key Responsibilities:
- Work on client projects to deliver AWS, PySpark, and Databricks based data engineering & analytics solutions.
- Build and operate very large data warehouses or data lakes.
- ETL optimization, design, coding, & tuning of big data processes using Apache Spark.
- Build data pipelines & applications to stream and process datasets at low latencies.
- Show efficiency in handling data: tracking data lineage, ensuring data quality, and improving discoverability of data.
Technical Experience:
- Minimum of 5 years of experience in Databricks engineering solutions on AWS cloud platforms using PySpark, Databricks SQL, and data pipelines using Delta Lake.
- Minimum of 5 years of experience in ETL, Big Data/Hadoop and data warehouse architecture & delivery.
Email: maya@mounttalent.com

Posted 3 days ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Bengaluru

Remote


Role & responsibilities: Looking for a Data Engineer experienced with Databricks, with strong proficiency in PySpark. Must have hands-on experience with Oracle or other relational databases. Proficient in Python, with awareness of web frameworks like Flask or Streamlit. Ability to build scalable data pipelines and support data-driven applications.

Posted 3 days ago

Apply

6.0 - 10.0 years

20 - 25 Lacs

Pune, Mumbai (All Areas)

Work from Office


Position: Data Engineer. Experience: 6+ yrs. Job Location: Pune / Mumbai.
Job Profile Summary:
- Azure Databricks and hands-on PySpark, with tuning
- Azure Data Factory pipelines for loading various data into ADB, with performance tuning
- Azure Synapse
- Azure Monitoring and Log Analytics (error handling in ADF pipelines and ADB)
- Logic Apps and Functions
- Performance tuning of Databricks, Data Factory and Synapse
- Databricks data loading (layers) and export (choosing connection options and the best approach for fast report access)

Posted 3 days ago

Apply

5.0 - 10.0 years

13 - 18 Lacs

Bengaluru

Work from Office

Date 1 Jun 2025 Location: Bangalore, KA, IN Company Alstom At Alstom, we understand transport networks and what moves people. From high-speed trains, metros, monorails, and trams, to turnkey systems, services, infrastructure, signalling and digital mobility, we offer our diverse customers the broadest portfolio in the industry. Every day, 80,000 colleagues lead the way to greener and smarter mobility worldwide, connecting cities as we reduce carbon and replace cars. Could you be the full-time JIVS Expert in our IS&T/Processes Solutions Architecture team we're looking for? Your future role Take on a new challenge and apply your extensive expertise in Azure Blob Storage and JIVS technology in a new cutting-edge field. You'll work alongside innovative, collaborative, and solution-focused teammates. You'll lead the optimization of data management and migration strategies, ensuring seamless transitions of data and maintaining database integrity. Day-to-day, you'll work closely with teams across the business (such as Business Stakeholders, IT Infrastructure, and Business Solutions), collaborate with partners relevant to Archiving Projects Delivery, and much more. You'll specifically take care of developing and implementing JIVS solutions, monitoring and maintaining the performance of JIVS applications, and utilizing Azure Blob Storage for efficient data management. We'll look to you for: Designing and managing JIVS solutions that align with organizational goals Collaborating with cross-functional teams to analyze system requirements Ensuring the reliability and scalability of JIVS applications Administering and maintaining database systems for high availability and security Executing data migration projects with precision Managing the decommissioning of applications, including data extraction and transfer Creating and maintaining comprehensive documentation All about you We value passion and attitude over experience. That's why we don't expect you to have every single skill.
Instead, we've listed some that we think will help you succeed and grow in this role: Bachelor's degree in Computer Science, Information Technology, or a related field Experience or understanding of JIVS implementations and management Knowledge of Azure Blob Storage services and best practices Familiarity with scripting languages and tools in JIVS and Azure environments A certification in database technologies or cloud database solutions is a plus Excellent problem-solving skills and collaborative teamwork abilities Strong communication skills, both verbal and written Things you'll enjoy Join us on a life-long transformative journey: the rail industry is here to stay, so you can grow and develop new skills and experiences throughout your career. You'll also: Enjoy stability, challenges and a long-term career free from boring daily routines Work with new security standards for rail signalling Collaborate with transverse teams and helpful colleagues Contribute to innovative projects that shape the future of mobility Utilise our flexible and dynamic working environment Steer your career in whatever direction you choose across functions and countries Benefit from our investment in your development, through award-winning learning opportunities Progress towards leadership and specialized technical roles Benefit from a fair and dynamic reward package that recognises your performance and potential, plus comprehensive and competitive social coverage (life, medical, pension) You don't need to be a train enthusiast to thrive with us. We guarantee that when you step onto one of our trains with your friends or family, you'll be proud. If you're up for the challenge, we'd love to hear from you! Important to note As a global business, we're an equal-opportunity employer that celebrates diversity across the 63 countries we operate in. We're committed to creating an inclusive workplace for everyone.

Posted 3 days ago

Apply

5.0 - 7.0 years

11 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

3 - 5 years of analytics experience in the Retail/CPG industry covering at least 2 of the below-mentioned areas: Customer (insights/trends), Campaigns/CRM, Loyalty, Marketing. To define, develop & support analytical solutions, along with extracting insights from data for improving business decisions of various retail functions (e.g. Marketing, Merchandising, Retail, etc.). The recruit will work directly with the on-shore Analytics team and with key business stakeholders of retail functions in each business unit to understand their business problems around merchandising and help deliver data-driven decision making. Analyze data to evaluate existing decision frameworks and apply analytical techniques to discover meaningful patterns. Develop and support sophisticated & innovative analytical solutions that generate actionable insights by utilizing diverse information. Provide high-end consulting to functional teams to help them sharpen their business strategy. Keep abreast of industry trends and emerging methodologies to continuously improve skill set. Contribute to knowledge sharing and improve team productivity through training/documentation of best practices. Exposure to Azure Databricks (good to have). Experience in Power BI (good to have). Desired Skills and Experience: Bachelor's or master's degree in any quantitative discipline (B.Tech/M.Tech in Computer Science/IT preferred). Candidates from an Economics, Statistics/Data Science, or Operations Research background from reputable institutions would also be preferred. Ability to handle large datasets and expertise in analytical tools such as SAS, SQL, R, Python, Spark, etc. Expertise in analytical techniques such as Linear Regression, Logistic Regression, Cluster Analysis, Market Basket Analysis, Product Bundling, Cross-sell/Upsell Analysis, etc.
Strong MS Office skills and data visualization competence. Excellent verbal and written presentation skills. Should be able to articulate thoughts and ideas properly (a structured approach to problem solving). Attention to detail and ability to work in a high-pressure environment. Strong drive and passion to deliver business impact through retail analytics. Strong business acumen and ability to translate data insights into meaningful business recommendations. Open to travel (up to 20%) on a need basis.

Posted 3 days ago

Apply

1.0 - 4.0 years

10 - 15 Lacs

Mumbai

Work from Office

Overview We are building cutting-edge software and data workflows to identify and analyze the exposure and impact to climate change and financially relevant ESG risks. We leverage artificial intelligence (AI) and alternative data to deliver dynamic investment-relevant insights to power your investment decisions. Clients from across the capital ecosystem use our integrated data, analytical tools, indexes and insights for a clear view of the impact of ESG and Climate risks to their investment portfolios. We are seeking an outstanding Software Engineer to join our ESG & Climate Application Development team in the Pune/Mumbai or Budapest offices. As part of a global team, you will collaborate in cross-functional teams to build and improve our industry-leading ESG and Climate solutions. Responsibilities Design, develop, test, and maintain software applications to meet project requirements. Collaborate with product managers and other stakeholders to gather and refine requirements. Participate in code reviews to maintain high coding standards and best practices. Troubleshoot and debug applications to resolve issues and improve performance. Document software designs, architectures, and processes for future reference. Support deployment and integration activities to ensure smooth implementation of software solutions. Qualifications Expected: Bachelor's degree in Computer Science, Mathematics, Engineering, a related field, or equivalent experience Strong communication, interpersonal and problem-solving skills Good hands-on working experience in Python or Java Experience building RESTful web services using FastAPI, Django or Flask Good understanding and hands-on experience with SQL/NoSQL databases Good understanding of the importance of testing in software development and the usage of unit testing frameworks like pytest/unittest Hands-on experience with cloud technologies – Google or Azure preferred – and experience in developing and managing microservices on cloud.
Experience with source code control systems, especially Git. Preferred: Hands-on experience with data engineering technologies like Azure Databricks, Spark, or similar frameworks. Some DevOps experience, knowledge of security best practices. Exposure to the use of AI and LLMs to solve business problems is an added advantage. What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients. Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community.
With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer committed to diversifying its workforce. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com

Posted 3 days ago

Apply

4.0 - 6.0 years

16 - 25 Lacs

Hyderabad

Remote

Experience Required: 4 to 6 years Mode of work: Remote (mandatory) Skills Required: Azure Data Factory, SQL, Databricks, Python/Scala Notice Period: Immediate joiners / Permanent (can join by July 4th, 2025) 4 to 6 years of experience with Big Data technologies. Experience with the Microsoft Azure cloud platform. Experience in SQL and with SQL-based database systems. Hands-on experience with Azure data services, such as Azure SQL Database, Azure Data Lake, and Azure Blob Storage. Experience with data integration and ETL (Extract, Transform, Load) processes. Experience with programming languages such as Python. Relevant certifications in Azure data services or data engineering are a plus. Interested candidates can share their resume, or refer a friend, to Pavithra.tr@enabledata.com for a quick response.

Posted 3 days ago

Apply

3.0 - 5.0 years

8 - 15 Lacs

Hyderabad

Work from Office

Understanding the requirements and developing ADF pipelines. Good knowledge of Databricks. Strong understanding of the existing ADF pipelines and enhancements. Deployment and monitoring of ADF jobs. Good understanding of SQL concepts and strong SQL query writing. Understanding and writing stored procedures. Performance tuning. Roles and Responsibilities: Understand business and data integration requirements. Design, develop, and implement scalable and reusable ADF pipelines for ETL/ELT processes. Leverage Databricks for advanced data transformations within ADF pipelines. Collaborate with data engineers to integrate ADF with Azure Databricks notebooks for big data processing. Analyze and understand existing ADF workflows. Implement improvements, optimize data flows, and incorporate new features based on evolving requirements. Manage deployment of ADF solutions across development, staging, and production environments. Set up monitoring, logging, and alerts to ensure smooth pipeline executions and troubleshoot failures. Write efficient and complex SQL queries to support data analysis and ETL tasks. Tune SQL queries for performance, especially in large-volume data scenarios. Design, develop, and maintain stored procedures for data transformation and business logic. Ensure procedures are optimized and modular for reusability and performance. Identify performance bottlenecks in queries and data processing routines. Apply indexing strategies, query refactoring, and execution plan analysis to enhance performance.

Posted 3 days ago

Apply

5.0 - 10.0 years

16 - 27 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

6-7 years of Data and Analytics experience with a minimum of 3 years in Azure Cloud. Excellent communication and interpersonal skills. Extensive experience in the Azure stack: ADLS, Azure SQL DB, Azure Data Factory, Azure Databricks, Azure Synapse, Cosmos DB, Analysis Services, Event Hub, etc. Experience in job scheduling using Oozie, Airflow, or any other ETL scheduler. Design and build production data pipelines from ingestion to consumption within a big data architecture, using Java, Python, or Scala. Good experience in designing & delivering data analytics solutions using Azure Cloud native services. Good experience in requirements analysis and solution architecture design, data modelling, ETL, data integration, and data migration design. Documentation of solutions (e.g. data models, configurations, and setup). Well versed with Waterfall, Agile, Scrum, and similar project delivery methodologies. Experienced in internal as well as external stakeholder management. Experience in MDM / DQM / Data Governance technologies like Collibra, Ataccama, Alation, or Reltio will be an added advantage. Azure Data Engineer or Azure Solution Architect certification will be an added advantage. Nice to have: working experience with Snowflake, Databricks, and the open-source big data stack (Hadoop, PySpark, Scala, Python, Hive, etc.).

Posted 3 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office

This role requires a deep understanding of data warehousing, business intelligence (BI), and data governance principles, with a strong focus on the Microsoft technology stack. Data Architecture: Develop and maintain the overall data architecture, including data models, data flows, and data quality standards. Design and implement data warehouses, data marts, and data lakes on the Microsoft Azure platform. Business Intelligence: Design and develop complex BI reports, dashboards, and scorecards using Microsoft Power BI. Data Engineering: Work with data engineers to implement ETL/ELT pipelines using Azure Data Factory. Data Governance: Establish and enforce data governance policies and standards. Primary Skills Experience: 15+ years of relevant experience in data warehousing, BI, and data governance. Proven track record of delivering successful data solutions on the Microsoft stack. Experience working with diverse teams and stakeholders. Required Skills and Experience: Technical Skills: Strong proficiency in data warehousing concepts and methodologies. Expertise in Microsoft Power BI. Experience with Azure Data Factory, Azure Synapse Analytics, and Azure Databricks. Knowledge of SQL and scripting languages (Python, PowerShell). Strong understanding of data modeling and ETL/ELT processes. Secondary Skills Soft Skills: Excellent communication and interpersonal skills. Strong analytical and problem-solving abilities. Ability to work independently and as part of a team. Strong attention to detail and organizational skills.

Posted 3 days ago

Apply

Exploring Azure Databricks Jobs in India

Azure Databricks is a popular cloud-based big data analytics platform that is widely used by organizations in India. As the demand for big data professionals continues to grow, the job market for Azure Databricks roles in India is also expanding rapidly. Job seekers with skills in Azure Databricks can find a plethora of opportunities across various industries in the country.

Top Hiring Locations in India

  1. Bangalore
  2. Hyderabad
  3. Pune
  4. Mumbai
  5. Gurgaon

Average Salary Range

The average salary range for Azure Databricks professionals in India varies based on experience levels. Entry-level professionals can expect to earn around INR 6-8 lakhs per annum, while experienced professionals with several years of experience can earn upwards of INR 15 lakhs per annum.

Career Path

A typical career progression in Azure Databricks moves from Junior Developer to Senior Developer to Tech Lead, and eventually into roles such as Lead Data Engineer or Data Architect.

Related Skills

In addition to Azure Databricks, professionals in this field are often expected to have skills in:

  • Apache Spark
  • SQL
  • Python
  • Data Warehousing concepts
  • Data visualization tools like Power BI or Tableau
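As a quick warm-up on the SQL side of that list, here is a small self-contained example. SQLite stands in for Databricks SQL here, and the sales table and figures are invented purely for illustration:

```python
import sqlite3

# Minimal SQL refresher: a GROUP BY aggregation of the kind used daily
# in Databricks SQL. SQLite stands in for the real engine, and the
# sales table and amounts are invented for this example.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("South", 120.0), ("South", 80.0), ("North", 50.0)],
)
# Total sales per region.
totals = dict(conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region"
))
conn.close()
# totals == {"North": 50.0, "South": 200.0}
```

The same query runs unchanged on a Databricks SQL warehouse; only the connection mechanics differ.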

Interview Questions

  • What is Azure Databricks and how does it differ from Apache Spark? (basic)
  • How do you optimize performance in Azure Databricks? (medium)
  • Explain the concept of Delta Lake in Azure Databricks. (medium)
  • What are the different types of clusters in Azure Databricks and when would you use each? (medium)
  • How do you handle security in Azure Databricks? (advanced)
  • Explain the process of job scheduling in Azure Databricks. (medium)
  • What are the advantages of using Azure Databricks over on-premises data processing solutions? (basic)
  • How do you handle schema evolution in Azure Databricks? (medium)
  • Explain the concept of Structured Streaming in Azure Databricks. (medium)
  • How does Azure Databricks integrate with other Azure services like Azure Data Lake Storage or Azure SQL Database? (advanced)
  • What are the different pricing tiers available for Azure Databricks and how do they differ? (medium)
  • Explain the role of a Workspace in Azure Databricks. (basic)
  • How do you troubleshoot performance issues in Azure Databricks? (medium)
  • What is the role of a Job in Azure Databricks and how do you create one? (basic)
  • How do you monitor and manage costs in Azure Databricks? (medium)
  • Explain the concept of Libraries in Azure Databricks. (basic)
  • How do you implement data encryption at rest and in transit in Azure Databricks? (advanced)
  • What are the different data storage options available in Azure Databricks? (basic)
  • How do you handle data skew in Azure Databricks? (medium)
  • Explain the concept of Autoscaling in Azure Databricks. (medium)
  • How do you perform ETL operations in Azure Databricks? (medium)
  • What are the best practices for data governance in Azure Databricks? (advanced)
  • How do you handle version control in Azure Databricks notebooks? (medium)
  • Explain the concept of Machine Learning in Azure Databricks. (medium)
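Several of the questions above (Delta Lake, schema evolution, ETL) revolve around upsert semantics. The sketch below is a pure-Python stand-in, not the real Databricks or Delta Lake API: it mimics the behaviour of `MERGE INTO ... WHEN MATCHED / WHEN NOT MATCHED` on a small in-memory "table", with rows and the key column invented for the example.

```python
# Pure-Python sketch of Delta Lake MERGE (upsert) semantics.
# This illustrates the behaviour only -- it is not the Databricks API.

def merge_upsert(target, updates, key="id"):
    """WHEN MATCHED -> update the existing row; WHEN NOT MATCHED -> insert."""
    index = {row[key]: dict(row) for row in target}
    for row in updates:
        # Merge the update over any existing row with the same key.
        index[row[key]] = {**index.get(row[key], {}), **row}
    return sorted(index.values(), key=lambda r: r[key])

customers = [{"id": 1, "name": "Asha"}, {"id": 2, "name": "Ravi"}]
changes = [{"id": 2, "name": "Ravi K"}, {"id": 3, "name": "Meena"}]
merged = merge_upsert(customers, changes)
# id 1 untouched, id 2 updated, id 3 inserted
```

In Databricks SQL the rough equivalent is `MERGE INTO customers USING changes ON customers.id = changes.id WHEN MATCHED THEN UPDATE SET * WHEN NOT MATCHED THEN INSERT *`, with Delta Lake additionally providing ACID guarantees and transaction-log history that the toy version has no notion of.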

Closing Remark

As you explore opportunities in the Azure Databricks job market in India, make sure to brush up on your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and a positive attitude, you can excel in your Azure Databricks career journey. Good luck!

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies