
8 OneLake Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Kochi, Kerala

On-site

You will work as a Senior Microsoft Fabric Developer on an immediate project to develop and implement an automated reporting solution using Microsoft Fabric. Your primary responsibilities include using Microsoft Fabric as the core platform, Azure Logic Apps for API integrations, Power BI for report creation within Fabric, Power Automate for report distribution, and OneLake for data storage.

The role requires deep expertise in Microsoft Fabric, with a focus on data integration, processing, and report development. You should have a strong background in Power BI, specifically within the Fabric environment, and proficiency in Azure Logic Apps for API integrations. Familiarity with Power Automate for workflow automation, an understanding of data modeling and ETL processes, and experience with SQL and data analysis are also essential.

Desired skills include knowledge of MSP operations and common tools, experience with Microsoft 365 security features and reporting, familiarity with PDF generation from Power BI reports, an understanding of data privacy and security best practices, and previous experience building reporting solutions for service providers.

Beyond technical skills, you are expected to have excellent written and verbal communication skills in English. You should be able to work independently, take initiative, and approach problem-solving proactively. Business acumen, cost-awareness, and a commitment to seeing the project through to successful completion are also key criteria. You must be available to overlap with Irish time zones for at least 4 hours per day.

If you meet these requirements and are ready to contribute to the successful completion of the project, we look forward to receiving your application.
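This listing centers on API integrations feeding a Fabric reporting layer. As a rough sketch of that kind of work (the payload shape and field names here are invented for illustration, not taken from the posting), this flattens one page of an OData-style API response into rows suitable for landing in OneLake:

```python
import json

# Hypothetical API payload shaped like a typical paged REST response;
# in the project described above this would arrive via Azure Logic Apps.
SAMPLE_RESPONSE = json.dumps({
    "value": [
        {"id": "1", "tenant": "contoso", "alerts": 3},
        {"id": "2", "tenant": "fabrikam", "alerts": 0},
    ],
    "@odata.nextLink": None,
})

def flatten_response(raw: str) -> list[dict]:
    """Turn one page of an OData-style response into flat report rows."""
    payload = json.loads(raw)
    return [
        {"id": item["id"], "tenant": item["tenant"], "alerts": item["alerts"]}
        for item in payload.get("value", [])
    ]

rows = flatten_response(SAMPLE_RESPONSE)
```

A real integration would also follow `@odata.nextLink` for paging and write the rows to OneLake rather than keeping them in memory.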

Posted 2 days ago

Apply

2.0 - 5.0 years

8 - 15 Lacs

Gurugram

Remote

Job Description: We are looking for a talented and driven MS Fabric Developer / Data Analytics Engineer with expertise in the Microsoft Fabric ecosystem, data transformation, and analytics. The ideal candidate will design, develop, and optimize data pipelines, work with real-time analytics, and implement best practices in data modeling and reporting.

Key Responsibilities:
- Work with MS Fabric components, including Data Lake, OneLake, Lakehouse, Warehouse, and Real-Time Analytics
- Develop and maintain data transformation scripts using Power Query, T-SQL, and Python
- Build scalable and efficient data models and pipelines for analytics and reporting
- Collaborate with BI teams and business stakeholders to deliver data-driven insights
- Implement best practices for data governance, performance tuning, and storage optimization
- Support real-time and near-real-time data streaming and transformation tasks

Required Skills:
- Hands-on experience with MS Fabric and associated data services
- Strong command of Power Query, T-SQL, and Python for data transformations
- Experience working in modern data lakehouse and real-time analytics environments

Good to Have:
- DevOps knowledge for automating deployments and managing environments
- Familiarity with Azure services and cloud data architecture
- Understanding of CI/CD pipelines for data projects
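The transformation work this role describes (Power Query, T-SQL, Python) often amounts to typing and cleaning raw records as they move from a raw landing zone to curated tables. A minimal Python sketch, with column names invented for illustration:

```python
from datetime import datetime

# Hypothetical raw records as they might land in a Lakehouse bronze layer;
# the columns are illustrative, not from the posting.
raw_rows = [
    {"order_id": "A-001", "amount": "1,250.50", "order_date": "2024-03-01"},
    {"order_id": "A-002", "amount": "980.00",   "order_date": "2024-03-02"},
]

def clean_row(row: dict) -> dict:
    """Typical bronze-to-silver cleanup: strip separators, type the columns."""
    return {
        "order_id": row["order_id"],
        "amount": float(row["amount"].replace(",", "")),
        "order_date": datetime.strptime(row["order_date"], "%Y-%m-%d").date(),
    }

silver = [clean_row(r) for r in raw_rows]
```

The same step would be a `Table.TransformColumnTypes` call in Power Query or `TRY_CAST` expressions in T-SQL; the logic, not the tool, is the point.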

Posted 4 days ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Hyderabad

Hybrid

Database Administrator (DBA) - T-SQL / Microsoft Fabric / Azure Data Services

Required Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related discipline
- 5+ years of hands-on experience as a DBA, with strong exposure to T-SQL, SSMS, and SQL Server 2019 or later
- Solid knowledge of Microsoft Fabric components and their interoperability with the Power Platform ecosystem
- Experience with Azure SQL Database, Azure Managed Instance, and Data Lake (Gen2 / OneLake)
- Strong understanding of RDBMS design, data normalization, and performance tuning techniques
- Hands-on experience with HA/DR mechanisms such as Always On Availability Groups, Log Shipping, and Azure Failover Groups
- Proficiency with monitoring and diagnostic tools: SQL Profiler, Extended Events, Azure Log Analytics, Query Performance Insight
- Experience implementing data privacy, encryption (e.g., TDE, Always Encrypted), firewall rules, and security auditing

Preferred Skills & Tools:
- Proficiency in Azure Data Factory (ADF), Azure Synapse, and Power BI Dataflows
- Familiarity with Microsoft Purview for data lineage and governance
- Hands-on experience with CI/CD pipelines for SQL using Azure DevOps YAML
- Understanding of Fabric workspace administration, capacity planning, and security roles
- Knowledge of NoSQL / Azure Cosmos DB is a plus
- Experience with monitoring tools like Grafana or Prometheus (especially in hybrid setups)
- Scripting experience in Python and/or PowerShell for automation
- Experience with ERP integrations and third-party data replication tools such as Fivetran, BryteFlow, and Qlik Replicate

Qualification: Bachelor's degree in Computer Science, Information Technology, Business Administration, or a related field

Skills: T-SQL, SSMS, SQL Server 2019, Microsoft Fabric, Power Platform, Azure SQL Database, Azure Managed Instance, Data Lake Gen2, OneLake, Always On Availability Groups, Log Shipping, Azure Failover Groups, SQL Profiler, Extended Events, Azure Log Analytics, Query Performance Insight, TDE, Always Encrypted, Azure Data Factory (ADF), Azure Synapse, Power BI Dataflows, Microsoft Purview, Azure DevOps, YAML, Grafana, Prometheus, Fivetran, BryteFlow, Qlik Replicate
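This DBA role asks for Python or PowerShell scripting for automation. As a small illustrative sketch (database names and the RPO threshold are invented, not from the posting), this is the shape of a typical automation check: flag databases whose last full backup has breached the recovery point objective.

```python
from datetime import datetime, timedelta

# Hypothetical backup timestamps; in practice these would come from
# msdb.dbo.backupset or an Azure Log Analytics query.
last_backups = {
    "SalesDB":   datetime(2024, 6, 1, 2, 0),
    "FinanceDB": datetime(2024, 5, 28, 2, 0),
}

def stale_backups(backups: dict, now: datetime, rpo: timedelta) -> list:
    """Return databases breaching the backup RPO, sorted for stable alerting."""
    return sorted(db for db, ts in backups.items() if now - ts > rpo)

overdue = stale_backups(last_backups, datetime(2024, 6, 2, 2, 0), timedelta(days=1))
```

A production version would email or page on a non-empty `overdue` list rather than just computing it.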

Posted 5 days ago

Apply

4.0 - 8.0 years

15 - 27 Lacs

Indore, Hyderabad

Hybrid

Data Engineer - D365 OneLake Integration Specialist

Position Overview: We are seeking an experienced Data Engineer with expertise in Microsoft D365 ERP and OneLake integration to support a critical acquisition integration project. The successful candidate will assess existing data integrations, collaborate with our data team to migrate pipelines to Snowflake using Matillion, and ensure seamless data flow for go-live-critical reports by November 2025.

Role & Responsibilities:
- Assessment & Documentation: Analyze and document existing D365 to OneLake/Fabric integrations and data flows
- Data Pipeline Migration: Collaborate with the current data team to redesign and migrate data integrations from D365 to Snowflake using Matillion
- Integration Architecture: Understand and map current Power BI reporting dependencies and data sources
- Go-Live Support: Identify critical reports for go-live and recommend optimal data integration strategies
- Technical Collaboration: Work closely with the existing data engineering team to leverage current Snowflake and Matillion expertise
- Knowledge Transfer: Document findings and provide recommendations on existing vs. new integration approaches
- ERP Implementation Support: Support the acquired company's ERP go-live timeline and requirements

Required Qualifications:

Technical Skills:
- 3+ years of experience with Microsoft Dynamics 365 ERP data integrations
- 2+ years of hands-on experience with the Microsoft OneLake and Fabric ecosystem
- Strong experience with the Snowflake data warehouse platform
- Proficiency in the Matillion ETL tool for data pipeline development
- Experience with Power BI data modeling and reporting architecture
- Strong SQL skills and data modeling expertise
- Knowledge of Azure Data Factory or similar cloud ETL tools
- Experience with REST APIs and data connector frameworks

Business & Soft Skills:
- Experience supporting ERP implementation projects and go-live activities
- Strong analytical and problem-solving skills for complex data integration challenges
- Excellent documentation and communication skills
- Ability to work in fast-paced, deadline-driven environments
- Experience in M&A integration projects (preferred)
- Project management skills and the ability to prioritize go-live-critical deliverables

Preferred Candidate Profile:
- Microsoft Azure certifications (DP-203, DP-900)
- Snowflake SnowPro certification
- Previous experience with acquisition integration projects
- Knowledge of financial and operational reporting requirements
- Familiarity with data governance and compliance frameworks
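The assessment phase this role describes, mapping Power BI report dependencies to decide which D365 pipelines must migrate first, can be sketched as a simple dependency union. Report and table names below are invented for illustration:

```python
# Hypothetical report-to-source mapping produced during the assessment;
# the report and D365 table names are illustrative only.
report_sources = {
    "AR Aging":        {"d365.CustTrans", "d365.CustTable"},
    "Inventory Value": {"d365.InventSum"},
    "Daily Sales":     {"d365.SalesLine", "d365.CustTable"},
}

# Reports the business has flagged as go-live critical.
go_live_reports = {"AR Aging", "Daily Sales"}

def critical_tables(mapping: dict, critical: set) -> set:
    """Union of source tables whose pipelines must be migrated first."""
    tables = set()
    for report in critical:
        tables |= mapping.get(report, set())
    return tables

must_migrate = critical_tables(report_sources, go_live_reports)
```

Ordering migration by this set keeps the Matillion-to-Snowflake work focused on the November 2025 go-live path instead of migrating everything at once.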

Posted 5 days ago

Apply

3.0 - 7.0 years

12 - 15 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office

We are looking for an experienced Data Engineer/BI Developer with strong hands-on expertise in Microsoft Fabric technologies, including OneLake, Lakehouse, Data Lake, Warehouse, and Real-Time Analytics, along with proven skills in Power BI, Azure Synapse Analytics, and Azure Data Factory (ADF). The ideal candidate should also possess working knowledge of DevOps practices for data engineering and deployment automation.

Key Responsibilities:
- Design and implement scalable data solutions using Microsoft Fabric components: OneLake, Data Lake, Lakehouse, Warehouse, and Real-Time Analytics
- Build and manage end-to-end data pipelines integrating structured and unstructured data from multiple sources
- Integrate Microsoft Fabric with Power BI, Synapse Analytics, and Azure Data Factory to enable modern data analytics solutions
- Develop and maintain Power BI datasets, dashboards, and reports using data from Fabric Lakehouses or Warehouses
- Implement data governance, security, and compliance policies within the Microsoft Fabric ecosystem
- Collaborate with stakeholders on requirements gathering, data modeling, and performance tuning
- Leverage Azure DevOps / Git for version control, CI/CD pipelines, and deployment automation of data artifacts
- Monitor, troubleshoot, and optimize data flows and transformations for performance and reliability

Required Skills:
- 3-8 years of experience in data engineering, BI development, or similar roles
- Strong hands-on experience with the Microsoft Fabric ecosystem: OneLake, Data Lake, Lakehouse, Warehouse, Real-Time Analytics
- Proficiency in Power BI for interactive reporting and visualization
- Experience with Azure Synapse Analytics, ADF (Azure Data Factory), and related Azure services
- Good understanding of data modeling, SQL, T-SQL, and Spark/Delta Lake concepts
- Working knowledge of DevOps tools and CI/CD processes for data deployment (Azure DevOps preferred)
- Familiarity with DataOps and version-control practices for data solutions

Preferred Qualifications:
- Microsoft certifications (e.g., DP-203, PL-300, or Microsoft Fabric certifications) are a plus
- Experience with Python, Notebooks, or KQL for Real-Time Analytics is advantageous
- Knowledge of data governance tools (e.g., Microsoft Purview) is a plus

Location: Remote - Bengaluru, Hyderabad, Delhi / NCR, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
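The Real-Time Analytics work this listing mentions is typically expressed in KQL as `summarize count() by bin(timestamp, 1m)`. A pure-Python sketch of that tumbling-window aggregation, with an invented event shape, shows the underlying idea:

```python
from collections import defaultdict
from datetime import datetime

# Hypothetical telemetry events; in Fabric these would stream into a
# KQL database rather than a Python list.
events = [
    {"ts": datetime(2024, 6, 1, 10, 0, 15), "device": "A", "reading": 3},
    {"ts": datetime(2024, 6, 1, 10, 0, 45), "device": "A", "reading": 5},
    {"ts": datetime(2024, 6, 1, 10, 1, 10), "device": "A", "reading": 2},
]

def per_minute_counts(stream: list) -> dict:
    """Count events per 1-minute bin, like KQL's summarize count() by bin()."""
    bins = defaultdict(int)
    for ev in stream:
        # Truncating to the minute assigns each event to its window.
        bins[ev["ts"].replace(second=0, microsecond=0)] += 1
    return dict(bins)

counts = per_minute_counts(events)
```

The KQL engine does this incrementally over an unbounded stream; the batch version above only illustrates the windowing semantics.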

Posted 1 week ago

Apply

8.0 - 13.0 years

8 - 17 Lacs

Chennai

Remote

MS Fabric (Data Lake, OneLake, Lakehouse, Warehouse, Real-Time Analytics) and integration with Power BI, Synapse, and Azure Data Factory. DevOps knowledge. Team-leading experience.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

5 - 10 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Hybrid

Role & Responsibilities

Job Description: We are seeking a skilled and experienced Microsoft Fabric Engineer to join our data engineering team. The ideal candidate will have a strong background in designing, developing, and maintaining data solutions using Microsoft Fabric, including experience across key workloads such as Data Engineering, Data Factory, Data Science, Real-Time Analytics, and Power BI. The role requires a deep understanding of Synapse Data Warehouse, OneLake, Notebooks, Lakehouse architecture, and Power BI integration within the Microsoft ecosystem.

Key Responsibilities:
- Design and implement scalable, secure data solutions using Microsoft Fabric
- Build and maintain data pipelines using Dataflows Gen2 and Data Factory
- Work with Lakehouse architecture and manage datasets in OneLake
- Develop notebooks (PySpark or T-SQL) for data transformation and processing
- Collaborate with data analysts to create interactive dashboards and reports using Power BI (within Fabric)
- Leverage Synapse Data Warehouse and KQL databases for structured and real-time analytics
- Monitor and optimize the performance of data pipelines and queries
- Ensure adherence to data quality, security, and governance practices
- Stay current with Microsoft Fabric updates and roadmap, recommending enhancements

Required Skills:
- 3+ years of hands-on experience with Microsoft Fabric or similar tools in the Microsoft data stack
- Strong proficiency with Data Factory (Fabric), Synapse Data Warehouse / SQL analytics endpoints, Power BI integration and DAX, Notebooks (PySpark, T-SQL), and Lakehouse and OneLake
- Understanding of data modeling, ETL/ELT processes, and real-time data streaming
- Experience with KQL (Kusto Query Language) is a plus
- Familiarity with Microsoft Purview, Azure Data Lake, or Azure Synapse Analytics is advantageous

Qualifications: Microsoft Fabric, OneLake, Data Factory, Data Lake, DataMesh
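A recurring step in the ETL/ELT pipelines this listing describes is the upsert ("merge") of incoming rows into a target table. In Fabric that is usually a T-SQL MERGE or a Delta Lake merge inside a PySpark notebook; a minimal pure-Python sketch of the same semantics, with invented keys and fields, looks like this:

```python
# Hypothetical target table keyed by customer_id, plus an incoming batch;
# the names are illustrative, not from the posting.
target = {
    "C-1": {"customer_id": "C-1", "city": "Hyderabad"},
    "C-2": {"customer_id": "C-2", "city": "Chennai"},
}
incoming = [
    {"customer_id": "C-2", "city": "Bengaluru"},  # matched key -> update
    {"customer_id": "C-3", "city": "Delhi"},      # new key     -> insert
]

def merge(target: dict, rows: list) -> dict:
    """Upsert incoming rows into the target, keyed by customer_id."""
    for row in rows:
        # Matched keys are overwritten; unmatched keys are inserted.
        target[row["customer_id"]] = row
    return target

merged = merge(target, incoming)
```

Engines add concurrency control and delete handling on top, but the matched/not-matched branching is the core of the operation.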

Posted 2 months ago

Apply