
14 Azure Fabric Jobs

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

6.0 - 11.0 years

3 - 12 Lacs

Hyderabad, Telangana, India

On-site

Foundit logo

Responsibilities:
- Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale
- Lead Data Engineers to define, build, and maintain the Data Platform
- Build a Data Lake in Azure Fabric, processing data from multiple sources
- Migrate the existing data store from Azure Synapse to Azure Fabric
- Implement data governance and access control
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards
- Present technical solutions, capabilities, considerations, and features in business terms
- Effectively communicate status, issues, and risks in a precise and timely manner
- Further develop critical initiatives such as Data Discovery, Data Lineage, and Data Quality
- Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations
- Build data systems, pipelines, analytical tools, and programs
- Conduct complex data analysis and report on results

Qualifications:
- 5+ years of experience as a data engineer or in a similar role with Azure Synapse, ADF, or relevant experience in Azure Fabric
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field
- Must have experience executing projects end to end; at least one data engineering project should have used Azure Synapse, ADF, or Azure Fabric
- Experience handling multiple data sources
- Technical expertise with data models, data mining, and segmentation techniques
- Deep understanding, both conceptual and practical, of at least one object-oriented library (Python, PySpark)
- Strong SQL skills and a good understanding of existing SQL warehouses and relational databases
- Strong Spark, PySpark, and Spark SQL skills and a good understanding of distributed processing frameworks
- Experience building large-scale batch and real-time data pipelines
- Ability to work independently and mentor junior resources
- Desire to lead and develop a team of Data Engineers across multiple levels
- Experience or knowledge in Data Governance
- Azure Cloud experience with data modeling, CI/CD, Agile methodologies, and Docker/Kubernetes
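The Spark/SQL pipeline work these listings describe reduces to read/transform/write steps against a warehouse. A minimal, hedged sketch of one such batch aggregation step, using Python's built-in sqlite3 as a stand-in for a real Fabric/Synapse SQL endpoint (the table and column names are invented for illustration):

```python
# Illustrative only: one aggregation stage of a batch pipeline, of the kind
# these roles describe. sqlite3 stands in for a Fabric/Synapse warehouse;
# the "sales" table and its columns are hypothetical.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?)",
    [("south", 120.0), ("south", 80.0), ("north", 50.0)],
)

# Transform step: total sales per region, largest first.
rows = conn.execute(
    "SELECT region, SUM(amount) AS total FROM sales "
    "GROUP BY region ORDER BY total DESC"
).fetchall()

print(rows)  # [('south', 200.0), ('north', 50.0)]
```

In PySpark the same step would be a `groupBy("region").sum("amount")` over a DataFrame; the shape of the work is identical.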

Posted 12 hours ago

Apply

7.0 - 12.0 years

2 - 11 Lacs

Hyderabad, Telangana, India

On-site

Foundit logo

Responsibilities:
- Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale
- Lead Data Engineers to define, build, and maintain the Data Platform
- Build a Data Lake in Azure Fabric, processing data from multiple sources
- Migrate the existing data store from Azure Synapse to Azure Fabric
- Implement data governance and access control
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards
- Present technical solutions, capabilities, considerations, and features in business terms
- Effectively communicate status, issues, and risks in a precise and timely manner
- Further develop critical initiatives such as Data Discovery, Data Lineage, and Data Quality
- Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations
- Build data systems, pipelines, analytical tools, and programs
- Conduct complex data analysis and report on results

Qualifications:
- 7+ years of experience as a data engineer or in a similar role with Azure Synapse, ADF, or relevant experience in Azure Fabric
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field

Posted 14 hours ago

Apply

5.0 - 10.0 years

2 - 12 Lacs

Hyderabad, Telangana, India

On-site

Foundit logo

Responsibilities:
- Collaborate with several stakeholders to deeply understand the needs of data practitioners and deliver at scale
- Lead Data Engineers to define, build, and maintain the Data Platform
- Build a Data Lake in Azure Fabric, processing data from multiple sources
- Migrate the existing data store from Azure Synapse to Azure Fabric
- Implement data governance and access control
- Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards
- Present technical solutions, capabilities, considerations, and features in business terms
- Effectively communicate status, issues, and risks in a precise and timely manner
- Further develop critical initiatives such as Data Discovery, Data Lineage, and Data Quality
- Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations
- Build data systems, pipelines, analytical tools, and programs
- Conduct complex data analysis and report on results

Qualifications:
- 5+ years of experience as a data engineer or in a similar role with Azure Synapse, ADF, or relevant experience in Azure Fabric
- Degree in Computer Science, Data Science, Mathematics, IT, or a similar field

Posted 14 hours ago

Apply

5.0 - 10.0 years

12 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Naukri logo

Design and implement robust, scalable data pipelines using Azure Data Factory, Azure Data Lake, and Azure SQL. Work extensively with Azure Fabric, Cosmos DB, and SQL Server to develop and optimize end-to-end data solutions. Perform database design, data modeling, and performance tuning to ensure system reliability and data integrity. Write and optimize complex SQL queries to support data ingestion, transformation, and reporting needs. Proactively implement SQL optimization and preventive maintenance strategies to ensure efficient database performance. Lead data migration efforts from on-premises to cloud or across Azure services. Collaborate with cross-functional teams to gather requirements and translate them into technical solutions. Maintain clear documentation and follow industry best practices for security, compliance, and scalability.

Required Skills:
- Proven experience working with Azure Fabric, SQL Server, Azure Data Factory, Azure Data Lake, and Cosmos DB
- Strong hands-on expertise in complex SQL queries, SQL query efficiency and optimization, database design and data modeling, and data migration techniques and performance tuning
- Solid understanding of cloud infrastructure and data integration patterns in Azure
- Experience working in agile environments with CI/CD practices

Nice to have: Microsoft Azure certifications related to Data Engineering or Azure Solutions

Location: Bengaluru, Hyderabad, Chennai, Pune, Noida, Mumbai
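The "SQL query efficiency and optimization" skill this listing names can be shown concretely: add an index and confirm the planner switches from a full scan to an index search. A hedged sketch using Python's bundled sqlite3 in place of SQL Server/Azure SQL (table, column, and index names are hypothetical):

```python
# Illustrative only: verifying that an index changes the query plan.
# sqlite3 stands in for SQL Server / Azure SQL; the schema is invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, total) VALUES (?, ?)",
    [(i % 100, float(i)) for i in range(1000)],
)

query = "SELECT SUM(total) FROM orders WHERE customer_id = ?"

# Without an index, the planner must scan the whole table.
before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")

# With the index, the plan becomes an index search on customer_id.
after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()
print(before[-1][-1])  # e.g. "SCAN orders"
print(after[-1][-1])   # e.g. "SEARCH orders USING INDEX idx_orders_customer ..."
```

The same discipline applies on Azure SQL with `SET SHOWPLAN` / execution plans; the point is to measure the plan, not guess.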

Posted 2 days ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Pune

Work from Office

Naukri logo

Azure Cloud Data Lead

Job Title: Azure Cloud Data Lead
Location: Pune, India
Experience: 7 - 12 Years
Work Mode: Full-time, Office-based

Company Overview: Smartavya Analytica is a niche Data and AI company based in Mumbai, established in 2017. We specialize in data-driven innovation, transforming enterprise data into strategic insights. With expertise spanning 25+ Data Modernization projects and experience handling large datasets of up to 24 PB in a single implementation, we have successfully delivered data and AI projects across multiple industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are specialists in Cloud, Hadoop, Big Data, AI, and Analytics, with a strong focus on Data Modernization for on-premises, private, and public cloud platforms. Visit us at: https://smart-analytica.com

Job Summary: We are looking for a highly experienced Azure Cloud Data Lead to oversee the architecture, design, and delivery of enterprise-scale cloud data solutions. This role demands deep expertise in Azure Data Services, strong hands-on experience with data engineering and governance, and a strategic mindset to guide cloud modernization initiatives across complex environments.

Key Responsibilities:
- Architect and design data lakehouses, data warehouses, and analytics platforms using Azure Data Services
- Lead implementations using Azure Data Factory (ADF), Azure Synapse Analytics, and Azure Fabric (OneLake ecosystem)
- Define and implement data governance frameworks, including cataloguing, lineage, security, and quality controls
- Collaborate with business stakeholders, data engineers, and developers to translate business requirements into scalable Azure architectures
- Ensure platform design meets performance, scalability, security, and regulatory compliance needs
- Guide migration of on-premises data platforms to Azure Cloud environments
- Create architectural artifacts: solution blueprints, reference architectures, governance models, and best-practice guidelines
- Collaborate with Sales/presales in customer meetings to understand business requirements and scope of work, and propose relevant solutions
- Drive MVPs/PoCs and capability demos for prospective customers and opportunities

Must-Have Skills:
- 7 - 12 years of experience in data architecture, data engineering, or analytics solutions
- Hands-on expertise in Azure Cloud services: ADF, Synapse, Azure Fabric (OneLake), and Databricks (good to have)
- Strong understanding of data governance, metadata management, and compliance frameworks (e.g., GDPR, HIPAA)
- Deep knowledge of relational and non-relational databases (SQL, NoSQL) on Azure
- Experience with security practices (IAM, RBAC, encryption, data masking) in cloud environments
- Strong client-facing skills with the ability to present complex solutions clearly

Preferred Certifications:
- Microsoft Certified: Azure Solutions Architect Expert
- Microsoft Certified: Azure Data Engineer Associate

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Foundit logo

Job Description: YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking to hire Microsoft Fabric professionals in the following areas:

Experience: 4-6 years

Job Description:
- Experience in Azure Fabric, Azure Data Factory, Azure Databricks, Azure Synapse, Azure SQL, and ETL
- Create pipelines, datasets, dataflows, and integration runtimes; monitor pipelines and trigger runs
- Extract, transform, and load data from source systems and process the data in Azure Databricks
- Create SQL scripts to perform complex queries
- Create Synapse pipelines to migrate data from Gen2 to Azure SQL
- Build data migration pipelines to the Azure cloud (Azure SQL)
- Migrate databases from on-prem SQL Server to an Azure dev environment using Azure DMS and Data Migration Assistant
- Experience using Azure Data Catalog
- Experience with big data batch processing, interactive processing, and real-time processing solutions

Certifications: Good to have

At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale. Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
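The migration duties this listing describes (Gen2 to Azure SQL, on-prem to cloud) generally follow a chunked extract-and-load pattern: read from the source in fixed-size batches and bulk-write each batch to the sink. A hedged, illustrative sketch in plain Python, with in-memory stand-ins for the real source and sink:

```python
# Illustrative only: the chunked-copy pattern behind a data migration
# pipeline. Plain Python lists stand in for ADLS Gen2 (source) and
# Azure SQL (sink); batch size and record shape are invented.
from typing import Iterable, Iterator, List

def batched(rows: Iterable[dict], size: int) -> Iterator[List[dict]]:
    """Yield rows in chunks of at most `size`."""
    batch: List[dict] = []
    for row in rows:
        batch.append(row)
        if len(batch) == size:
            yield batch
            batch = []
    if batch:
        yield batch

def migrate(source: Iterable[dict], sink: List[dict], batch_size: int = 500) -> int:
    """Copy all source rows to the sink in batches; return rows copied."""
    copied = 0
    for chunk in batched(source, batch_size):
        sink.extend(chunk)  # in a real pipeline: one bulk INSERT per chunk
        copied += len(chunk)
    return copied

rows = [{"id": i} for i in range(1203)]
target: List[dict] = []
print(migrate(rows, target))  # 1203
```

Batching keeps memory bounded and lets a failed chunk be retried without restarting the whole copy, which is why ADF/Synapse copy activities expose a batch-size knob.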

Posted 2 weeks ago

Apply

6.0 - 10.0 years

8 - 12 Lacs

Pune, Gurugram, Bengaluru

Work from Office

Naukri logo

Contractual hiring. Hiring manager profile: linkedin.com/in/yashsharma1608. Payroll: https://www.nyxtech.in/

1. Azure Data Engineer with Fabric

The Role: Lead Data Engineer (payroll client: Brillio)

About the Role:
- Experience: 6 to 8 years
- Location: Bangalore, Hyderabad, Pune, Chennai, Gurgaon (Hyderabad preferred)
- Notice: 15 days / 30 days
- Budget: 15 LPA
- Azure Fabric experience is mandatory

Skills: Azure OneLake, data pipelines, Apache Spark, ETL, Data Factory, Azure Fabric, SQL, Python/Scala

Key Responsibilities:
- Data Pipeline Development: Lead the design, development, and deployment of data pipelines using Azure OneLake, Azure Data Factory, and Apache Spark, ensuring efficient, scalable, and secure data movement across systems.
- ETL Architecture: Architect and implement ETL (Extract, Transform, Load) workflows, optimizing the process for data ingestion, transformation, and storage in the cloud.
- Data Integration: Build and manage data integration solutions that connect multiple data sources (structured and unstructured) into a cohesive data ecosystem. Use SQL, Python, Scala, and R to manipulate and process large datasets.
- Azure OneLake Expertise: Leverage Azure OneLake and Azure Synapse Analytics to design and implement scalable data storage and analytics solutions that support big data processing and analysis.
- Collaboration with Teams: Work closely with Data Scientists, Data Analysts, and BI Engineers to ensure that the data infrastructure supports analytical needs and is optimized for performance and accuracy.
- Performance Optimization: Monitor, troubleshoot, and optimize data pipeline performance to ensure high availability, fast processing, and minimal downtime.
- Data Governance & Security: Implement best practices for data governance, data security, and compliance within the Azure ecosystem, ensuring data privacy and protection.
- Leadership & Mentorship: Lead and mentor a team of data engineers, promoting a collaborative and high-performance team culture. Oversee code reviews, design decisions, and the implementation of new technologies.
- Automation & Monitoring: Automate data engineering workflows, job scheduling, and monitoring to ensure smooth operations. Use tools like Azure DevOps, Airflow, and other relevant platforms for automation and orchestration.
- Documentation & Best Practices: Document data pipeline architecture, data models, and ETL processes, and contribute to the establishment of engineering best practices, standards, and guidelines.
- Innovation: Stay current with industry trends and emerging technologies in data engineering, cloud computing, and big data analytics, driving innovation within the team.
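The ETL and governance responsibilities above often meet in a transform step with a data-quality gate: records that fail basic checks are quarantined rather than loaded. A minimal illustrative sketch in plain Python; the field names and validity rules are invented for the example:

```python
# Illustrative only: a transform step with a simple data-quality gate,
# of the kind an ETL workflow would run before loading. Field names
# ("id", "amount") and the validity rules are hypothetical.
def transform(records):
    """Normalize raw records; quarantine any that fail quality checks."""
    clean, rejected = [], []
    for r in records:
        amount = r.get("amount")
        if r.get("id") is None or not isinstance(amount, (int, float)) or amount < 0:
            rejected.append(r)  # quarantined for inspection, not loaded
            continue
        clean.append({"id": r["id"], "amount": round(float(amount), 2)})
    return clean, rejected

raw = [
    {"id": 1, "amount": 10.25},   # valid
    {"id": None, "amount": 5},    # missing key -> quarantined
    {"id": 2, "amount": -3},      # negative amount -> quarantined
]
clean, rejected = transform(raw)
print(clean)          # [{'id': 1, 'amount': 10.25}]
print(len(rejected))  # 2
```

Tracking the rejected set (rather than silently dropping rows) is what turns a transform into a governable one: reject counts become a pipeline quality metric.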

Posted 2 weeks ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Foundit logo

Key skills:
- GitHub Actions + DevOps
- Terraform / Bicep
- AKS cluster setup and deployment
- Cloud technology, majorly Azure: SQL Server, Fabric, Databricks, etc.
- Good to have: PowerShell
- Flawless communication

Job Description: We are seeking an experienced Sr. Azure GitHub DevOps Engineer to join our team, supporting a global enterprise client. In this role, you will be responsible for designing and optimizing DevOps pipelines, leveraging GitHub Actions and Azure DevOps tools to streamline software delivery and infrastructure automation. This role requires expertise in GitHub Actions, Azure-native services, and modern DevOps methodologies to enable seamless collaboration and ensure scalable, secure, and efficient cloud-based solutions.

Key Responsibilities:
- GitHub Actions Development: Design, implement, and optimize CI/CD workflows using GitHub Actions to support multi-environment deployments. Leverage GitHub Actions for automated builds, tests, and deployments, ensuring integration with Azure services. Create reusable GitHub Actions templates and libraries for consistent DevOps practices.
- GitHub Repository Administration: Manage GitHub repositories, branching strategies, and access permissions. Implement GitHub features like Dependabot, code scanning, and security alerts to enhance code quality and security.
- Azure DevOps Integration: Utilize Azure Pipelines in conjunction with GitHub Actions to orchestrate complex CI/CD workflows. Configure and manage Azure services such as Azure Kubernetes Service (AKS) for container orchestration; Azure Application Gateway and Azure Front Door for load balancing and traffic management; Azure Monitoring, Azure App Insights, and Azure Key Vault for observability, diagnostics, and secure secrets management; and Helm charts and Microsoft Bicep for Infrastructure as Code.
- Automation & Scripting: Develop robust automation scripts using PowerShell, Bash, or Python to streamline operational tasks. Automate monitoring, deployments, and environment-management workflows.
- Infrastructure Management: Oversee and maintain cloud environments with a focus on scalability, security, and reliability. Implement containerization strategies using Docker and orchestration via AKS.
- Collaboration: Partner with cross-functional teams to align DevOps practices with business objectives while maintaining compliance and security standards.
- Monitoring & Optimization: Deploy and maintain monitoring and logging tools to ensure system performance and uptime. Optimize pipeline execution times and infrastructure costs.
- Documentation & Best Practices: Document GitHub Actions workflows, CI/CD pipelines, and Azure infrastructure configurations. Advocate for best practices in version control, security, and DevOps methodologies.

Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field (preferred).
- Experience: 3+ years in DevOps engineering with a focus on GitHub Actions and Azure DevOps tools. Proven track record of designing CI/CD workflows using GitHub Actions in production environments. Extensive experience with Azure services, including AKS, Azure Front Door, Azure Application Gateway, Azure Key Vault, Azure App Insights, and Azure Monitoring. Hands-on experience with Infrastructure as Code tools, including Microsoft Bicep and Helm charts.
- Technical Skills: Deep understanding of GitHub Actions, workflows, and integrations with Azure services. Proficiency in PowerShell, Bash, and Python for creating automation scripts and custom GitHub Actions. Experience with Docker and Kubernetes, including Azure Kubernetes Service (AKS). Familiarity with securing CI/CD pipelines, secrets management, and cloud environments. Hands-on experience with Azure Monitoring, App Insights, and logging solutions to ensure system reliability.
- Soft Skills: Strong problem-solving and analytical abilities. Excellent communication and collaboration skills, with the ability to work in cross-functional and global teams. Detail-oriented with a commitment to delivering high-quality results.

Preferred Qualifications:
- Experience with DevOps practices within the financial or tax services industries.
- Familiarity with advanced GitHub features such as Dependabot, Security Alerts, and CodeQL.
- Knowledge of additional CI/CD platforms like Jenkins or CircleCI.
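The CI/CD workflow design this role centres on starts from a workflow YAML file. A minimal, hedged sketch of a build-and-deploy workflow of the kind described (the job names, `make test` step, secret name, resource group, and Bicep template are placeholders, not the client's actual pipeline):

```yaml
name: build-and-deploy
on:
  push:
    branches: [main]

jobs:
  build:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - name: Run tests                 # placeholder build/test step
        run: make test
      - name: Log in to Azure           # credentials kept in GitHub secrets
        uses: azure/login@v2
        with:
          creds: ${{ secrets.AZURE_CREDENTIALS }}
      - name: Deploy Bicep template     # placeholder resource group / template
        run: az deployment group create -g my-rg -f main.bicep
```

Reusable workflows and composite actions let the "templates and libraries" bullet above factor steps like the Azure login out of every repository.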

Posted 2 weeks ago

Apply

10.0 - 16.0 years

25 - 27 Lacs

Chennai

Work from Office

Naukri logo

We at Dexian India are looking to hire a Cloud Data PM with over 10 years of hands-on experience in AWS/Azure, DWH, and ETL. The role is based in Chennai, with a shift from 2:00 pm to 11:00 pm IST.

Key qualifications we seek in candidates include:
- Solid understanding of SQL and data modeling
- Proficiency in DWH architecture, including EDW/DM concepts and Star/Snowflake schema
- Experience in designing and building data pipelines on the Azure Cloud stack
- Familiarity with Azure Data Explorer, Data Factory, Databricks, Synapse Analytics, Azure Fabric, Azure Analysis Services, and Azure SQL Data Warehouse
- Knowledge of Azure DevOps and CI/CD pipelines
- Previous experience managing scrum teams and working as a Scrum Master or Project Manager on at least 2 projects
- Exposure to on-premises transactional database environments like Oracle, SQL Server, Snowflake, MySQL, and/or Postgres
- Ability to lead enterprise data strategies, including data lake delivery
- Proficiency in data visualization tools such as Power BI or Tableau, and statistical analysis using R or Python
- Strong problem-solving skills with a track record of deriving business insights from large datasets
- Excellent communication skills and the ability to provide strategic direction to technical and business teams
- Prior experience in presales, RFP and RFI responses, and proposal writing is mandatory
- Capability to explain complex data solutions clearly to senior management
- Experience in implementing, managing, and supporting data warehouse projects or applications
- Track record of leading full-cycle implementation projects related to Business Intelligence
- Strong team and stakeholder management skills
- Attention to detail, accuracy, and ability to meet tight deadlines
- Knowledge of application development, APIs, microservices, and integration components

Tools & Technology Experience Required:
- Strong hands-on experience in SQL or PL/SQL
- Proficiency in Python
- SSIS or Informatica (one of the tools is mandatory)
- BI: Power BI or Tableau (one of the tools is mandatory)

Posted 1 month ago

Apply

10.0 - 16.0 years

25 - 28 Lacs

Chennai, Tamil Nadu, India

On-site

Foundit logo

We at Dexian India are looking to hire a Cloud Data PM with over 10 years of hands-on experience in AWS/Azure, DWH, and ETL. The role is based in Chennai, with a shift from 2:00 pm to 11:00 pm IST.

Key qualifications we seek in candidates include:
- Solid understanding of SQL and data modeling
- Proficiency in DWH architecture, including EDW/DM concepts and Star/Snowflake schema
- Experience in designing and building data pipelines on the Azure Cloud stack
- Familiarity with Azure Data Explorer, Data Factory, Databricks, Synapse Analytics, Azure Fabric, Azure Analysis Services, and Azure SQL Data Warehouse
- Knowledge of Azure DevOps and CI/CD pipelines
- Previous experience managing scrum teams and working as a Scrum Master or Project Manager on at least 2 projects
- Exposure to on-premises transactional database environments like Oracle, SQL Server, Snowflake, MySQL, and/or Postgres
- Ability to lead enterprise data strategies, including data lake delivery
- Proficiency in data visualization tools such as Power BI or Tableau, and statistical analysis using R or Python
- Strong problem-solving skills with a track record of deriving business insights from large datasets
- Excellent communication skills and the ability to provide strategic direction to technical and business teams
- Prior experience in presales, RFP and RFI responses, and proposal writing is mandatory
- Capability to explain complex data solutions clearly to senior management
- Experience in implementing, managing, and supporting data warehouse projects or applications
- Track record of leading full-cycle implementation projects related to Business Intelligence
- Strong team and stakeholder management skills
- Attention to detail, accuracy, and ability to meet tight deadlines
- Knowledge of application development, APIs, microservices, and integration components

Tools & Technology Experience Required:
- Strong hands-on experience in SQL or PL/SQL
- Proficiency in Python
- SSIS or Informatica (one of the tools is mandatory)
- BI: Power BI or Tableau (one of the tools is mandatory)

Posted 1 month ago

Apply

9.0 - 14.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Naukri logo

Design, deploy, and optimize Azure-based data pipelines and architectures. Ensure scalability, data integrity, and CI/CD automation. Collaborate with analytics teams and lead data engineering initiatives across hybrid data platforms.

Required candidate profile: Bachelor's in CS/IT with 7-12 years of experience in Azure data engineering. Strong in ADF, Synapse, Databricks, and CI/CD. Able to mentor junior engineers and optimize large-scale data systems.

Posted 1 month ago

Apply

10.0 - 20.0 years

20 - 35 Lacs

Chennai

Work from Office

Naukri logo

- Hadoop, Spark, and Kafka in the cloud ecosystem (AWS, Azure, GCP)
- Data modeling, data integration, and data management best practices
- GDPR, HIPAA, and other industry regulations

Posted 1 month ago

Apply

15.0 - 20.0 years

20 - 30 Lacs

Pune

Work from Office

Naukri logo

We are seeking a seasoned Technical Project Manager to oversee and guide large service engagements, involving teams of 35-50 individuals. This role requires a balance of technical know-how, exceptional leadership abilities, and proven project management skills.

Posted 1 month ago

Apply

10.0 - 15.0 years

30 - 45 Lacs

Pune

Work from Office

Naukri logo

Azure Cloud Data Solutions Architect

Job Title: Azure Cloud Data Solutions Architect
Location: Pune, India
Experience: 10 - 15 Years
Work Mode: Full-time, Office-based
Company: Smartavya Analytica Private Limited

Company Overview: Smartavya Analytica is a niche Data and AI company based in Mumbai, established in 2017. We specialize in data-driven innovation, transforming enterprise data into strategic insights. With expertise spanning 25+ Data Modernization projects and experience handling large datasets of up to 24 PB in a single implementation, we have successfully delivered data and AI projects across multiple industries, including retail, finance, telecom, manufacturing, insurance, and capital markets. We are specialists in Cloud, Hadoop, Big Data, AI, and Analytics, with a strong focus on Data Modernization for on-premises, private, and public cloud platforms. Visit us at: https://smart-analytica.com

Job Summary: We are seeking an experienced Azure Cloud Data Solutions Architect to lead end-to-end architecture and delivery of enterprise-scale cloud data platforms. The ideal candidate will have deep expertise in Azure Data Services, Data Engineering, and Data Governance, with the ability to architect and guide cloud modernization initiatives.

Key Responsibilities:
- Architect and design data lakehouses, data warehouses, and analytics platforms using Azure Data Services
- Lead implementations using Azure Data Factory (ADF), Azure Synapse Analytics, and Azure Fabric (OneLake ecosystem)
- Define and implement data governance frameworks, including cataloguing, lineage, security, and quality controls
- Collaborate with business stakeholders, data engineers, and developers to translate business requirements into scalable Azure architectures
- Ensure platform design meets performance, scalability, security, and regulatory compliance needs
- Guide migration of on-premises data platforms to Azure Cloud environments
- Create architectural artifacts: solution blueprints, reference architectures, governance models, and best-practice guidelines
- Collaborate with Sales/presales in customer meetings to understand business requirements and scope of work, and propose relevant solutions
- Drive MVPs/PoCs and capability demos for prospective customers and opportunities

Must-Have Skills:
- 10-15 years of experience in data architecture, data engineering, or analytics solutions
- Hands-on expertise in Azure Cloud services: ADF, Synapse, Azure Fabric (OneLake), and Databricks (good to have)
- Strong understanding of data governance, metadata management, and compliance frameworks (e.g., GDPR, HIPAA)
- Deep knowledge of relational and non-relational databases (SQL, NoSQL) on Azure
- Experience with security practices (IAM, RBAC, encryption, data masking) in cloud environments
- Strong client-facing skills with the ability to present complex solutions clearly

Preferred Certifications:
- Microsoft Certified: Azure Solutions Architect Expert
- Microsoft Certified: Azure Data Engineer Associate

Posted 1 month ago

Apply