8.0 - 12.0 years
14 - 24 Lacs
Pune
Work from Office
Role & Responsibilities

Experience: 8-10 years in the Data and Analytics domain with expertise in the Microsoft data tech stack.
Leadership: Experience managing teams of 8-10 members.
Technical Skills:
- Expertise in tools such as Microsoft Fabric, Azure Synapse Analytics, Azure Data Factory, Power BI, SQL Server, and Azure Databricks.
- Strong understanding of data architecture, pipelines, and governance.
- Understanding of another data platform such as Snowflake, Google BigQuery, or Amazon Redshift is a plus and a good-to-have skill.
- Tech stack: DBT and Databricks or Snowflake; Microsoft BI - Power BI, Synapse, and Fabric.
Project Management: Proficiency in project management methodologies (Agile, Scrum, or Waterfall).

Key Responsibilities

Project Delivery & Management:
- Contribute to project delivery; help define the project plan and ensure delivery timelines are met.
- Maintain quality control and ensure client satisfaction at all stages.
Team Leadership & Mentorship:
- Lead, mentor, and manage a team of 5 to 8 professionals.
- Conduct performance evaluations and provide opportunities for skill enhancement.
- Foster a collaborative, high-performance work environment.
Client Engagement:
- Act as the primary point of contact on the technical front.
- Understand client needs and ensure expectations are met or exceeded.
- Conduct bi-weekly and monthly project reviews with the customer.
Technical Expertise & Innovation:
- Stay updated with the latest trends in Microsoft data technologies (Microsoft Fabric, Azure Synapse, Power BI, SQL Server, Azure Data Factory, etc.).
- Provide technical guidance and support to the team.

Regards,
Ruchita Shete
Busisol Sourcing Pvt. Ltd.
Tel No: 7738389588
Email id: ruchita@busisol.net
Posted 1 day ago
6.0 - 9.0 years
8 - 11 Lacs
Chennai
Work from Office
About the job:

Role: Microsoft Fabric Data Engineer
Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.

Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.

Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems (SAP ECC, S/4HANA, SAP BW, etc.) is a plus.
- Nice to have: ability to develop dashboards or reports using tools like Power BI.

Coding Fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.
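The ingest-transform-load work described above can be sketched in plain Python; this is an illustrative stand-in for the transformation step of a Fabric/PySpark notebook, not any employer's actual code, and all record and field names are hypothetical.

```python
# Illustrative sketch of a cleaning/transformation step of the kind a Fabric
# notebook might perform. Plain Python stands in for PySpark here; the
# record shape and field names are invented for the example.

def clean_bronze_records(records):
    """Deduplicate by id, drop rows missing required fields, normalize text."""
    seen = set()
    silver = []
    for row in records:
        key = row.get("id")
        if key is None or row.get("amount") is None:
            continue  # drop incomplete rows
        if key in seen:
            continue  # keep the first occurrence only
        seen.add(key)
        silver.append({
            "id": key,
            "customer": row.get("customer", "").strip().title(),
            "amount": float(row["amount"]),
        })
    return silver

bronze = [
    {"id": 1, "customer": "  acme corp ", "amount": "120.50"},
    {"id": 1, "customer": "acme corp", "amount": "120.50"},   # duplicate
    {"id": 2, "customer": "globex", "amount": None},          # incomplete
    {"id": 3, "customer": "initech", "amount": "42"},
]
print(clean_bronze_records(bronze))
```

In a real pipeline the same logic would typically be expressed over Spark DataFrames rather than Python lists; the shape of the work (dedupe, validate, normalize, load) is the same.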
Posted 2 days ago
6.0 - 9.0 years
9 - 13 Lacs
Kolkata
Work from Office
Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.

Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.

Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems (SAP ECC, S/4HANA, SAP BW, etc.) is a plus.
- Nice to have: ability to develop dashboards or reports using tools like Power BI.

Coding Fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.
Posted 2 days ago
8.0 - 10.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Role Responsibilities:
- Design and implement data pipelines using MS Fabric.
- Develop data models to support business intelligence and analytics.
- Manage and optimize ETL processes for data extraction, transformation, and loading.
- Collaborate with cross-functional teams to gather and define data requirements.
- Ensure data quality and integrity in all data processes.
- Implement best practices for data management, storage, and processing.
- Conduct performance tuning for data storage and retrieval for enhanced efficiency.
- Generate and maintain documentation for data architecture and data flow.
- Participate in troubleshooting data-related issues and implement solutions.
- Monitor and optimize cloud-based solutions for scalability and resource efficiency.
- Evaluate emerging technologies and tools for potential incorporation in projects.
- Assist in designing data governance frameworks and policies.
- Provide technical guidance and support to junior data engineers.
- Participate in code reviews and ensure adherence to coding standards.
- Stay updated with industry trends and best practices in data engineering.

Qualifications:
- 8+ years of experience in data engineering roles.
- Strong expertise in MS Fabric and related technologies.
- Proficiency in SQL and relational database management systems.
- Experience with data warehousing solutions and data modeling.
- Hands-on experience with ETL tools and processes.
- Knowledge of cloud computing platforms (Azure, AWS, GCP).
- Familiarity with Python or similar programming languages.
- Ability to communicate complex concepts clearly to non-technical stakeholders.
- Experience in implementing data quality measures and data governance.
- Strong problem-solving skills and attention to detail.
- Ability to work independently in a remote environment.
- Experience with data visualization tools is a plus.
- Excellent analytical and organizational skills.
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience in Agile methodologies and project management.
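The data-quality and governance duties listed above usually boil down to rule-based checks run before data is loaded. A minimal sketch, with entirely hypothetical rules and field names (no specific framework implied):

```python
# Minimal sketch of rule-based data-quality checks of the kind an ETL
# pipeline might run before loading. The rules and field names are
# hypothetical examples.

RULES = {
    "order_id": lambda v: isinstance(v, int) and v > 0,
    "qty": lambda v: isinstance(v, (int, float)) and v >= 0,
    "country": lambda v: isinstance(v, str) and len(v) == 2,
}

def quality_report(rows):
    """Count rule violations per field across all rows."""
    failures = {field: 0 for field in RULES}
    for row in rows:
        for field, check in RULES.items():
            if not check(row.get(field)):
                failures[field] += 1
    return failures

rows = [
    {"order_id": 1, "qty": 2, "country": "IN"},
    {"order_id": -5, "qty": 1, "country": "IND"},  # two violations
]
print(quality_report(rows))  # {'order_id': 1, 'qty': 0, 'country': 1}
```

In practice such checks often gate the load: rows that fail are quarantined for review rather than silently dropped.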
Posted 2 days ago
5.0 - 10.0 years
10 - 20 Lacs
Pune
Work from Office
Dear Candidate,

We are excited to share an opportunity at Avigna.AI for the position of Data Engineer. We're looking for professionals with strong data engineering experience who can contribute to building scalable, intelligent data solutions and have a passion for solving complex problems.

Position Details:
- Role: Data Engineer
- Location: Pune, Baner (Work from Office)
- Experience: 7+ years
- Working Days: Monday to Friday (9:00 AM - 6:00 PM)
- Education: Bachelor's or Master's in Computer Science, Engineering, Mathematics, or a related field
- Company Website: www.avigna.ai
- LinkedIn: Avigna.AI

Key Responsibilities:
- Design and develop robust data pipelines for large-scale data ingestion, transformation, and analytics.
- Implement scalable Lakehouse architectures using tools like Microsoft Fabric for structured and semi-structured data.
- Work with Python, PySpark, and Azure services to support data modelling, automation, and predictive insights.
- Develop custom KQL queries and manage data using Power BI, Azure Cosmos DB, or similar tools.
- Collaborate with cross-functional teams to integrate data-driven components with application backends and frontends.
- Ensure secure, efficient, and reliable CI/CD pipelines for automated deployments and data updates.
Skills & Experience Required:
- Strong proficiency in Python, PySpark, and cloud-native data tools
- Experience with Microsoft Azure services (e.g., App Services, Functions, Cosmos DB, Active Directory)
- Hands-on experience with Microsoft Fabric (preferred or good to have)
- Working knowledge of Power BI and building interactive dashboards for business insights
- Familiarity with CI/CD practices for automated deployments
- Exposure to machine learning integration into data workflows (nice to have)
- Strong analytical and problem-solving skills with attention to detail

Good to Have:
- Experience with KQL (Kusto Query Language)
- Background in simulation models or mathematical modeling
- Knowledge of Power Platform integration (Power Pages, Power Apps)

Benefits:
- Competitive salary
- Health insurance coverage
- Professional development opportunities
- Dynamic and collaborative work environment

Important Note: Kindly share your resume at talent@avigna.ai. When sharing your profile, please copy-paste the below content in the subject line:
Subject: Applying for Data Engineer role JOBID:ZR_14_JOB
Posted 2 days ago
6.0 - 9.0 years
9 - 13 Lacs
Mumbai
Work from Office
Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.

Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.

Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems (SAP ECC, S/4HANA, SAP BW, etc.) is a plus.
- Nice to have: ability to develop dashboards or reports using tools like Power BI.

Coding Fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.
Posted 2 days ago
4.0 - 9.0 years
5 - 10 Lacs
Pune
Hybrid
Design, develop, and deploy Power BI dashboards, data models, and reports. Collaborate with stakeholders to meet business needs through data visualization and BI solutions. Required Candidate profile 4–10 years’ experience in Power BI development. Skilled in data modelling, data visualization, and data analysis. Strong collaboration and communication skills are essential.
Posted 3 days ago
7.0 - 10.0 years
15 - 25 Lacs
Gurgaon, Haryana, India
On-site
The Cloud Data Architect will lead client engagements, guiding stakeholders toward optimized, cloud-native data architectures. This role will be pivotal in defining modernization strategies, designing future-state data platforms, and integrating Microsoft Fabric solutions.

Key Responsibilities:
- Lead client interviews and workshops to understand current and future data needs
- Conduct technical reviews of Azure infrastructure, including Databricks, Synapse Analytics, and Power BI
- Design scalable and optimized architecture solutions with a focus on Microsoft Fabric integration
- Define and refine data governance frameworks, including cataloguing, lineage, and quality standards
- Deliver strategic and actionable project outputs in line with client expectations
- Evaluate and ensure the quality and accuracy of deliverables
- Collaborate with business and domain stakeholders to capture and implement business logic
- Manage end-to-end project delivery, including coordination with client and internal teams
- Communicate effectively with global stakeholders across various channels
- Troubleshoot and resolve complex issues across dev, test, UAT, and production environments
- Ensure quality checks and adherence to Service Level Agreements and Turnaround Times

Required Skills and Experience:
- Bachelor's or Master's degree in Computer Science, Finance, Information Systems, or a related field
- Minimum 7 years of experience in data and cloud architecture roles
- Proven experience engaging with client stakeholders and leading solution architecture
- Deep expertise in the Azure data platform: Synapse, Databricks, Azure Data Factory, Azure SQL, Power BI
- Strong knowledge of data governance best practices, including data quality, cataloguing, and lineage
- Familiarity with Microsoft Fabric and its integration into enterprise environments
- Experience creating modernization roadmaps and designing target architectures
- Excellent verbal and written communication skills
- Strong analytical, organizational, and problem-solving abilities
- Self-starter capable of working independently and in team environments
- Experience delivering projects in agile development environments
- Project management and team leadership capabilities
Posted 6 days ago
3.0 - 5.0 years
10 - 12 Lacs
Bengaluru
Hybrid
Notice Period: Immediate

Key Responsibilities:
- Design and implement data models and schemas to support business intelligence and analytics.
- Perform data engineering tasks as needed to support analytical activities.
- Develop clear, concise, and insightful data visualizations and dashboards.
- Interpret complex datasets and communicate findings through compelling visualizations and storytelling.
- Work closely with stakeholders to understand data requirements and deliver actionable insights.
- Maintain documentation and ensure data quality and integrity.

Reporting Structure:
- Direct reporting to Senior Data Engineer
- Dotted-line reporting to CTO

Required Skills & Qualifications:
- Proficiency in Power BI, Microsoft Fabric, and Power Query.
- Experience designing and implementing data models and schemas.
- Familiarity with basic data engineering tasks.
- Advanced SQL skills for querying and analyzing data.
- Exceptional ability to translate complex data into clear, actionable insights.
- Strong ability to communicate complex data insights effectively to technical and non-technical audiences.

Preferred Skills:
- Experience with Python for data manipulation and analysis.
- Experience in the finance, tax, or professional services industries.
- Familiarity with Salesforce data models and integrations.
Posted 6 days ago
5.0 - 10.0 years
0 - 0 Lacs
Pune, Chennai
Hybrid
Ciklum is looking for a Senior Microsoft Fabric Data Engineer to join our team full-time in India.

We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.

About the role:
We are seeking a highly skilled and experienced Senior Microsoft Fabric Data Engineer to design, develop, and optimize advanced data solutions leveraging the Microsoft Fabric platform. You will be responsible for building robust, scalable data pipelines, integrating diverse and large-scale data sources, and enabling sophisticated analytics and business intelligence capabilities. This role requires extensive hands-on expertise with Microsoft Fabric, a deep understanding of Azure data services, and mastery of modern data engineering practices.
Responsibilities:
- Lead the design and implementation of highly scalable and efficient data pipelines and data warehouses using Microsoft Fabric and a comprehensive suite of Azure services (Data Factory, Synapse Analytics, Azure SQL, Data Lake)
- Develop, optimize, and oversee complex ETL/ELT processes for data ingestion, transformation, and loading from a multitude of disparate sources, ensuring high performance with large-scale datasets
- Ensure the highest level of data integrity, quality, and governance throughout the entire Fabric environment, establishing best practices for data management
- Collaborate extensively with stakeholders, translating intricate business requirements into actionable, resilient, and optimized data solutions
- Proactively troubleshoot, monitor, and fine-tune data pipelines and workflows for peak performance and efficiency, particularly in handling massive datasets
- Architect and manage workspace architecture, implement robust user access controls, and enforce data security in strict compliance with privacy regulations
- Automate platform tasks and infrastructure management using advanced scripting languages (Python, PowerShell) and Infrastructure as Code (Terraform, Ansible) principles
- Document comprehensive technical solutions, enforce code modularity, and champion best practices in version control and documentation across the team
- Stay at the forefront of Microsoft Fabric updates and new features, and contribute significantly to continuous improvement initiatives and the adoption of cutting-edge technologies

Requirements:
- Minimum of 5+ years of progressive experience in data engineering, with at least 3 years of hands-on, in-depth work on Microsoft Fabric and a wide array of Azure data services
- Exceptional proficiency in SQL, Python, and advanced data transformation tools (e.g., Spark, PySpark notebooks)
- Mastery of data warehousing concepts, dimensional modeling, and advanced ETL best practices
- Extensive experience with complex hybrid cloud and on-premises data integration scenarios
- Profound understanding of data governance, security protocols, and compliance standards
- Excellent problem-solving, analytical, and communication skills, with the ability to articulate complex technical concepts clearly to both technical and non-technical audiences

Desirable:
- Experience with Power BI, Azure Active Directory, and managing very large-scale data infrastructure
- Strong familiarity with Infrastructure as Code and advanced automation tools
- Bachelor's degree in Computer Science, Engineering, or a related field (or equivalent extensive experience)

What's in it for you?
- Care: your mental and physical health is our priority. We ensure comprehensive company-paid medical insurance, as well as financial and legal consultation
- Tailored education path: boost your skills and knowledge with our regular internal events (meetups, conferences, workshops), Udemy licence, language courses and company-paid certifications
- Growth environment: share your experience and level up your expertise with a community of skilled professionals, locally and globally
- Flexibility: hybrid work mode at Chennai or Pune
- Opportunities: we value our specialists and always find the best options for them. Our Resourcing Team helps change a project if needed to help you grow, excel professionally and fulfil your potential
- Global impact: work on large-scale projects that redefine industries with international and fast-growing clients
- Welcoming environment: feel empowered with a friendly team, open-door policy, informal atmosphere within the company and regular team-building events

About us:
At Ciklum, we are always exploring innovations, empowering each other to achieve more, and engineering solutions that matter. With us, you'll work with cutting-edge technologies, contribute to impactful projects, and be part of a One Team culture that values collaboration and progress.
India is a strategic innovation hub for Ciklum, with growing teams in Chennai and Pune leading advancements in EdgeTech, AR/VR, IoT, and beyond. Join us to collaborate on game-changing solutions and take your career to the next level.

Want to learn more about us? Follow us on Instagram, Facebook, LinkedIn.

Explore, empower, engineer with Ciklum! Experiences of tomorrow. Engineered together.

Interested already? We would love to get to know you! Submit your application. Can't wait to see you at Ciklum.
Posted 6 days ago
5.0 - 7.0 years
15 - 25 Lacs
Pune, Ahmedabad
Hybrid
Key Responsibilities:
- Design, develop, and optimize data pipelines and ETL/ELT workflows using Microsoft Fabric, Azure Data Factory, and Azure Synapse Analytics.
- Implement Lakehouse and Warehouse architectures within Microsoft Fabric, supporting medallion (bronze-silver-gold) data layers.
- Collaborate with business and analytics teams to build scalable and reliable data models (star/snowflake) using Azure SQL, Power BI, and DAX.
- Utilize Azure Analysis Services, Power BI Semantic Models, and Microsoft Fabric Dataflows for analytics delivery.
- Use strong hands-on Python skills for data transformation and processing.
- Apply CI/CD best practices and manage code through Git version control.
- Ensure data security, lineage, and quality using data governance best practices and Microsoft Purview (if applicable).
- Troubleshoot and improve performance of existing data pipelines and models.
- Participate in code reviews, testing, and deployment activities.
- Communicate effectively with stakeholders across geographies and time zones.

Required Skills:
- Hands-on experience with Microsoft Fabric (Lakehouse, Warehouse, Dataflows, Pipelines).
- Strong knowledge of Azure Synapse Analytics, Azure Data Factory, Azure SQL, and Azure Analysis Services.
- Proficiency in Power BI and DAX for data visualization and analytics modeling.
- Strong Python skills for scripting and data manipulation.
- Experience in dimensional modeling, star/snowflake schemas, and Kimball methodologies.
- Familiarity with CI/CD pipelines, DevOps, and Git-based versioning.
- Understanding of data governance, data cataloging, and quality management practices.
- Excellent verbal and written communication skills.
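The medallion (bronze-silver-gold) layering mentioned above can be illustrated with a tiny silver-to-gold aggregation step. Plain Python stands in for a Fabric pipeline here, and all table and column names are hypothetical.

```python
# Hypothetical sketch of the silver -> gold step in a medallion layout:
# cleaned silver rows are aggregated into a small gold summary suitable
# for reporting. Names are invented for the example.

from collections import defaultdict

def build_gold_sales(silver_rows):
    """Aggregate cleaned sales rows into revenue per (region, product)."""
    gold = defaultdict(float)
    for row in silver_rows:
        gold[(row["region"], row["product"])] += row["revenue"]
    return dict(gold)

silver = [
    {"region": "West", "product": "A", "revenue": 100.0},
    {"region": "West", "product": "A", "revenue": 50.0},
    {"region": "East", "product": "B", "revenue": 75.0},
]
print(build_gold_sales(silver))
# {('West', 'A'): 150.0, ('East', 'B'): 75.0}
```

In a star-schema model, the gold output would typically become a fact table keyed to region and product dimension tables rather than a flat dictionary.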
Posted 1 week ago
3.0 - 5.0 years
4 - 7 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
Key Responsibilities:
- Design and implement data models and schemas to support business intelligence and analytics.
- Perform data engineering tasks as needed to support analytical activities.
- Develop clear, concise, and insightful data visualizations and dashboards.
- Interpret complex datasets and communicate findings through compelling visualizations and storytelling.
- Work closely with stakeholders to understand data requirements and deliver actionable insights.
- Maintain documentation and ensure data quality and integrity.

Reporting Structure:
- Direct reporting to Senior Data Engineer
- Dotted-line reporting to CTO

Required Skills & Qualifications:
- Proficiency in Power BI, Microsoft Fabric, and Power Query.
- Experience designing and implementing data models and schemas.
- Familiarity with basic data engineering tasks.
- Advanced SQL skills for querying and analyzing data.
- Exceptional ability to translate complex data into clear, actionable insights.
- Strong ability to communicate complex data insights effectively to technical and non-technical audiences.
Posted 1 week ago
5.0 - 7.0 years
9 - 12 Lacs
Bengaluru
Work from Office
Join our growing Data & Analytics practice as a Data Analytics & Visualization Consultant and play a key role in designing, building, and governing enterprise-grade dashboards and low-code solutions that enable data-driven decision-making across the firm and for our clients. Candidates must be ready for a face-to-face interview in Bangalore.

We are looking for a hands-on, results-driven individual with proven expertise in Power BI, Power Apps, and SQL, along with exposure to modern cloud data ecosystems. Familiarity with Snowflake, Microsoft Fabric best practices, and Finance domain knowledge will be considered valuable assets. This role spans the full delivery lifecycle, including requirements gathering, data modelling, solution design, development, testing, deployment, and support.

Responsibilities:
- Collaborate with business stakeholders to gather and translate business requirements into technical solutions.
- Design and develop end-to-end Power BI dashboards, including data models, DAX calculations, row-level security, and performance optimization.
- Build and deploy Power Apps solutions to automate workflows and integrate with Microsoft 365 and data platforms.
- Write and optimize complex SQL queries to transform, clean, and extract data from Snowflake or Azure-based data platforms.
- Connect Power BI to Snowflake using best practices (ODBC, DirectQuery, Import modes).
- Author views and stored procedures on Azure SQL/Synapse to enable scalable and governed reporting.
- Understand and apply Microsoft Fabric concepts and infrastructure best practices for scalable BI and data integration.
- Develop workflows using Alteryx or similar data-prep tools as needed.
- Build data ingestion and transformation pipelines using Azure Data Factory or Synapse pipelines.
- Collaborate with data engineers to ensure data quality, integrity, and availability.
- Monitor and troubleshoot solutions, ensuring performance and reliability.
- Mentor junior team members and support internal knowledge-sharing initiatives.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Analytics, or a related field; Master's degree preferred.
- 5-7 years of experience in Business Intelligence or Analytics roles.
- Expertise in: Power BI (data modelling, DAX, visuals, optimization), Power Apps (canvas apps, connectors, integration), and SQL (query performance, views, procedures).
- Hands-on experience with Azure Data Factory / Synapse Pipelines and data-prep tools like Alteryx or equivalent.
- Strong communication skills, with the ability to present technical concepts to business stakeholders.
- Practical, solution-oriented mindset with strong problem-solving skills.
- Experience with Snowflake (architecture, best practices, optimization).
- Exposure to the Finance domain (e.g., FP&A, P&L dashboards, financial metrics).
- Experience with other BI tools like Tableau or QlikView is a plus.
- Familiarity with Microsoft Fabric and its infrastructure components.
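Authoring views over base tables so that consumers query a governed layer, as the posting describes for Azure SQL/Synapse, can be illustrated with SQLite standing in for the warehouse; table, column, and view names here are invented.

```python
# Illustrative only: a reporting view over a base table, shown with SQLite
# in place of Azure SQL/Synapse. Table, column, and view names are invented.

import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('North', 100), ('North', 50), ('South', 80);

    -- A governed reporting view: consumers query the view, not the raw table.
    CREATE VIEW v_sales_by_region AS
        SELECT region, SUM(amount) AS total_amount
        FROM sales
        GROUP BY region;
""")
print(conn.execute(
    "SELECT region, total_amount FROM v_sales_by_region ORDER BY region"
).fetchall())
# [('North', 150.0), ('South', 80.0)]
```

The design point is the same one the posting makes: BI tools connect to the view, so the aggregation logic lives in one governed place rather than being duplicated across dashboards.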
Posted 1 week ago
8.0 - 10.0 years
0 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Job Description:

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth - bringing real positive changes in an increasingly virtual world - and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring Microsoft Fabric Professionals in the following areas:

Position: Data Analytics Lead
Experience: 8+ Years

Responsibilities:
- Build, manage, and foster a high-functioning team of data engineers and data analysts.
- Collaborate with business and technical teams to capture and prioritize platform ingestion requirements.
- Experience of working with the manufacturing industry in building a centralized data platform for self-service reporting.
- Lead the data analytics team members, providing guidance, mentorship, and support to ensure their professional growth and success.
- Responsible for managing customer, partner, and internal data on the cloud and on-premises.
- Evaluate and understand current data technologies and trends and promote a culture of learning.
- Build an end-to-end data strategy, from collecting requirements from the business to modelling the data and building reports and dashboards.

Required Skills:
- Experience in data engineering and architecture, with a focus on developing scalable cloud solutions in Azure Synapse / Microsoft Fabric / Azure Databricks.
- Accountable for the data group's activities, including architecting, developing, and maintaining a centralized data platform covering operational data, the data warehouse, the data lake, Data Factory pipelines, and data-related services.
- Experience in designing and building operationally efficient pipelines, utilising core Azure components such as Azure Data Factory, Azure Databricks, and PySpark.
- Strong understanding of data architecture, data modelling, and ETL processes.
- Proficiency in SQL and PySpark.
- Strong knowledge of building Power BI reports and dashboards.
- Excellent communication skills.
- Strong problem-solving and analytical skills.

Required Technical/Functional Competencies:
- Domain/Industry Knowledge: Basic knowledge of the customer's business processes and relevant technology platform or product. Able to prepare process maps, workflows, business cases, and simple business models in line with customer requirements with assistance from SMEs, and apply industry standards/practices in implementation with guidance from experienced team members.
- Requirement Gathering and Analysis: Working knowledge of requirement management processes and requirement analysis processes, tools, and methodologies. Able to analyse the impact of a requested change/enhancement/defect fix and identify dependencies or interrelationships among requirements and transition requirements for an engagement.
- Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs in design and architecture, adhering to industry standards/practices. Analyzes various frameworks/tools, reviews code, and provides feedback on improvement opportunities.
- Architecture Tools and Frameworks: Working knowledge of industry architecture tools and frameworks. Able to identify the pros and cons of tools and frameworks available in the market, use them per customer requirements, and explore new tools/frameworks for implementation.
- Architecture Concepts and Principles: Working knowledge of architectural elements, SDLC, and methodologies. Able to provide architectural design/documentation at an application or function capability level, implement architectural patterns in solutions and engagements, and communicate architecture direction to the business.
- Analytics Solution Design: Knowledge of statistical and machine learning techniques like classification, linear regression modelling, clustering, and decision trees. Able to identify the cause of errors and their potential solutions.
- Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial and open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.

Required Behavioral Competencies:
- Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
- Collaboration: Shares information within the team, participates in team activities, and asks questions to understand other points of view.
- Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
- Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
- Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
- Drives Results: Sets realistic stretch goals for self and others to achieve and exceed defined goals/targets.
- Resolves Conflict: Displays sensitivity in interactions and strives to understand others' views and concerns.

Certifications: Mandatory

At YASH, you are empowered to create a career that will take you to where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale.
Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and ethical corporate culture
Posted 1 week ago
3.0 - 6.0 years
5 - 9 Lacs
Jaipur
Work from Office
We are looking for a skilled Power BI Developer to design, develop, and manage business intelligence solutions that provide actionable insights. You will work closely with stakeholders to understand data needs and transform complex datasets into intuitive, interactive dashboards and reports using Power BI. Key Responsibilities: Understand business requirements and translate them into technical BI solutions Design and develop Power BI dashboards, reports, and datasets Optimize data models for performance and scalability Use Python scripts within Power BI for advanced data manipulation, forecasting, and automation Integrate data from multiple sources (SQL, Excel, APIs, etc.) Work with DAX and Power Query for data transformation and calculations Collaborate with data engineers and business analysts to ensure data accuracy Maintain and support existing BI solutions and troubleshoot issues Stay current on Power BI updates and best practices Requirements: Bachelor's degree; MBA or relevant technical degree preferred, with 3-7 years of experience in Data Analytics. Excellent visualization skills, storytelling abilities, and familiarity with best practices for conveying insights in an intuitive and visually compelling manner. Experience in creating, collecting, analyzing, and communicating business insights from data. Strong background in retention, churn, engagement, and marketing analytics metrics. Knowledge of analytics self-service BI approaches and predictive analytics solutions is a strong advantage. Proficiency in SQL and experience working with the Microsoft Azure environment. Experience in statistical techniques (e.g., R, hypothesis testing). 2-3 years of hands-on experience with BI tools such as Tableau, Power BI, or Fabric. Knowledge of GCS or AWS, AI/ML will be a plus. Preferred Qualifications: Microsoft Power BI certification Experience with large data sets and performance tuning
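The retention and churn metrics this posting calls out reduce to simple ratios. As a purely illustrative sketch in plain Python (function names and figures are invented, not from the posting):

```python
def churn_rate(customers_at_start: int, customers_lost: int) -> float:
    """Share of the period's starting customers who were lost during it."""
    if customers_at_start <= 0:
        raise ValueError("customers_at_start must be positive")
    return customers_lost / customers_at_start

def retention_rate(customers_at_start: int, customers_lost: int) -> float:
    """Retention is the complement of churn for the same period."""
    return 1.0 - churn_rate(customers_at_start, customers_lost)

# Hypothetical monthly figures
print(churn_rate(2000, 150))  # 0.075
print(retention_rate(2000, 150))
```

In a Power BI context the same ratios would typically be expressed as DAX measures over customer tables rather than as standalone functions.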
Posted 1 week ago
5.0 - 6.0 years
13 - 14 Lacs
Bhopal, New Delhi, Pune
Hybrid
Design, build & maintain Power BI/Fabric semantic models, reports, dashboards & pipelines using advanced DAX, Power Query, KQL; optimize performance and refreshes, handle complex models, collaborate. Contact: 9063478484 / v.aparna@tekgenieservices.com
Posted 1 week ago
7.0 - 12.0 years
7 - 12 Lacs
Pune, Maharashtra, India
On-site
As part of a critical healthcare IT transformation, Xpress Wellness is migrating its data infrastructure from Google Cloud Platform (GCP) to Microsoft Azure, and building an end-to-end ETL and reporting system to deliver key KPIs via Power BI. We are seeking a hands-on Technical Lead - Azure Data Engineering to lead the data engineering workstream of the project. The ideal candidate will have deep expertise in Azure cloud data services, strong experience in data migration from GCP to Azure, and a solid understanding of data governance, compliance, and Azure storage architectures. Key Responsibilities: Lead the technical design and implementation of data pipelines and storage on Azure. Drive the GCP-to-Azure data migration strategy and execution. Oversee the development of scalable ETL/ELT processes using Azure Data Factory, Synapse, or Fabric. Ensure alignment with data governance and healthcare compliance standards. Collaborate with architects, data engineers, and Power BI developers to enable accurate KPI delivery. Provide technical mentorship to junior engineers and ensure best practices are followed. Act as the primary technical point of contact for data engineering-related discussions. Key Skills & Qualifications: 7-12 years of experience in data engineering, with 3-6 years in Azure Cloud. Strong experience with Azure Data Factory, Azure Data Lake, Microsoft Fabric, ETL, Synapse Analytics, and Azure Storage services. Hands-on experience in data migration projects from GCP to Azure. Knowledge of data governance, Microsoft Purview, privacy, and compliance (HIPAA preferred). Excellent communication and stakeholder management skills. Relevant Microsoft certifications are a plus. Our Commitment to Diversity & Inclusion: Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience and provide care and support for you and your loved ones.
As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer: Group Health Insurance covering a family of 4 Term Insurance and Accident Insurance Paid Holidays & Earned Leaves Paid Parental Leave Learning & Career Development Employee Wellness
Posted 1 week ago
2.0 - 5.0 years
8 - 13 Lacs
Pune, Maharashtra, India
On-site
We are looking for a proactive and detail-oriented Junior Data Engineer with 2 to 5 years of experience to join our cloud data transformation team. The candidate will work closely with the Data Engineering Lead and Solution Architect to support data migration, pipeline development, testing, and integration efforts on the Microsoft Azure platform. Key Responsibilities: Data Migration Support Assist in migrating structured and semi-structured data from GCP storage systems to Azure Blob Storage, Azure Data Lake, or Synapse. Help validate and reconcile data post-migration to ensure completeness and accuracy. ETL/ELT Development Build and maintain ETL pipelines using Azure Data Factory, Synapse Pipelines, or Microsoft Fabric. Support the development of data transformation logic (SQL/ADF/Dataflows). Ensure data pipelines are efficient, scalable, and meet defined SLAs. Data Modeling & Integration Support the design of data models to enable effective reporting in Power BI. Prepare clean, structured datasets ready for downstream KPI reporting and analytics use cases. Testing & Documentation Conduct unit and integration testing of data pipelines. Maintain documentation of data workflows, metadata, and pipeline configurations. Collaboration & Learning Collaborate with the Data Engineering Lead, BI Developers, and other team members. Stay current with Azure technologies and best practices under the guidance of senior team members. Qualifications: Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field. Experience: 2 to 5 years of hands-on experience in data engineering or analytics engineering roles. Exposure to at least one cloud platform (preferably Microsoft Azure). Technical Skills Required: Experience with SQL and data transformation logic. Familiarity with Azure data services like Azure Data Factory, Synapse Analytics, Blob Storage, Data Lake, or Microsoft Fabric.
Basic knowledge of ETL/ELT concepts and data warehousing principles, and familiarity with Unix shell scripting. Familiarity with Power BI datasets or Power Query is a plus. Good understanding of data quality and testing practices. Exposure to version control systems like Git. Soft Skills: Eagerness to learn and grow under the mentorship of experienced team members. Strong analytical and problem-solving skills. Ability to work in a collaborative, fast-paced team environment. Good written and verbal communication skills. Our Perks and Benefits: Our benefits and rewards program has been thoughtfully designed to recognize your skills and contributions, elevate your learning/upskilling experience and provide care and support for you and your loved ones. As an Apexon Associate, you get continuous skill-based development, opportunities for career advancement, and access to comprehensive health and well-being benefits and assistance. We also offer: Group Health Insurance covering a family of 4 Term Insurance and Accident Insurance Paid Holidays & Earned Leaves Paid Parental Leave Learning & Career Development Employee Wellness Job Location: Pune, India
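The post-migration validation and reconciliation work described in this role often boils down to comparing row counts and content fingerprints between source and target extracts. A minimal, hypothetical sketch in plain Python (row data and function names are invented; a real project would pull the extracts via the GCP and Azure storage APIs):

```python
import hashlib

def dataset_fingerprint(rows):
    """Order-insensitive fingerprint: hash each row, XOR the digests together.

    Lets us compare a source extract and a migrated target extract without
    requiring both systems to return rows in the same order.
    """
    acc = 0
    count = 0
    for row in rows:
        digest = hashlib.sha256("|".join(map(str, row)).encode()).digest()
        acc ^= int.from_bytes(digest, "big")
        count += 1
    return count, acc

def reconcile(source_rows, target_rows):
    """True when row counts and content fingerprints both match."""
    return dataset_fingerprint(source_rows) == dataset_fingerprint(target_rows)

# Hypothetical extracts from the source and migrated systems
source = [(1, "alice", 100), (2, "bob", 200)]
target = [(2, "bob", 200), (1, "alice", 100)]  # same rows, different order
print(reconcile(source, target))               # True
print(reconcile(source, [(1, "alice", 100)]))  # False
```

Note the XOR fingerprint is only a sketch: pairs of identical duplicate rows cancel each other out, so production reconciliation would typically also compare per-key hashes or column aggregates.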
Posted 1 week ago
2.0 - 5.0 years
5 - 15 Lacs
Hyderabad
Work from Office
Company Overview Accordion works at the intersection of sponsors and management teams throughout every stage of the investment lifecycle, providing hands-on, execution-focused support to elevate data and analytics capabilities. So, what does it mean to work at Accordion? It means joining 1,000+ analytics, data science, finance & technology experts in a high-growth, agile, and entrepreneurial environment while transforming how portfolio companies drive value. It also means making your mark on Accordion's future by embracing a culture rooted in collaboration and a firm-wide commitment to building something great, together. Headquartered in New York City with 10 offices worldwide, Accordion invites you to join our journey. Data & Analytics (Accordion | Data & Analytics) Accordion's Data & Analytics (D&A) team delivers cutting-edge, intelligent solutions to a global clientele, leveraging a blend of domain knowledge, sophisticated technology tools, and deep analytics capabilities to tackle complex business challenges. We partner with Private Equity clients and their Portfolio Companies across diverse sectors, including Retail, CPG, Healthcare, Media & Entertainment, Technology, and Logistics. The D&A team delivers data and analytical solutions designed to streamline reporting capabilities and enhance business insights across vast and complex data sets ranging from Sales, Operations, Marketing, Pricing, Customer Strategies, and more. Location: Hyderabad, Telangana Role Overview: Accordion is looking for a Senior Data Engineer with Database/Data Warehouse/Business Intelligence experience. He/she will be responsible for the design, development, configuration/deployment, and maintenance of the above technology stack. He/she must have an in-depth understanding of various tools & technologies in the above domain to design and implement robust and scalable solutions which address clients' current and future requirements at optimal cost.
The Senior Data Engineer should be able to understand various architectures and recommend the right fit depending on the use case of the project. A successful Senior Data Engineer should possess strong working business knowledge, familiarity with multiple tools and techniques along with industry standards and best practices in Business Intelligence and Data Warehousing environments. He/she should have strong organizational, critical thinking, and communication skills. What You will do: Understand the business requirements thoroughly to design and develop the BI architecture. Determine business intelligence and data warehousing solutions that meet business needs. Perform data warehouse design and modelling according to established standards. Work closely with the business teams to arrive at methodologies to develop KPIs and Metrics. Work with the Project Manager in developing and executing project plans within the assigned schedule and timeline. Develop standard reports and functional dashboards based on business requirements. Develop and deliver high-quality reports in a timely and accurate manner. Conduct training programs and knowledge transfer sessions for junior developers when needed. Recommend improvements to provide optimum reporting solutions. Ideally, you have: Undergraduate degree (B.E/B.Tech.) from tier-1/tier-2 colleges preferred. 2 - 5 years of experience in a related field. Proven expertise in SSIS, SSAS and SSRS (MSBI Suite). In-depth knowledge of databases (SQL Server, MySQL, Oracle etc.) and data warehouses (Azure Synapse, AWS Redshift, Google BigQuery, Snowflake etc.). In-depth knowledge of business intelligence tools (any one of Power BI, Tableau, Qlik, DOMO, Looker etc.). Good understanding of Azure (Data Factory & Pipelines, SQL Database & Managed Instances, DevOps, Logic Apps, Analysis Services), AWS (Glue, Aurora Database, Dynamo Database, Redshift, QuickSight). Proven abilities to take on initiative and be innovative.
Analytical mind with a problem-solving attitude. Why Explore a Career at Accordion: High growth environment: Semi-annual performance management and promotion cycles coupled with a strong meritocratic culture enable a fast track to leadership responsibility. Cross Domain Exposure: Interesting and challenging work streams across industries and domains that always keep you excited, motivated, and on your toes. Entrepreneurial Environment: Intellectual freedom to make decisions and own them. We expect you to spread your wings and assume larger responsibilities. Fun culture and peer group: Non-bureaucratic and fun working environment; strong peer environment that will challenge you and accelerate your learning curve. Other benefits for full time employees: Health and wellness programs that include employee health insurance covering immediate family members and parents, term life insurance for employees, free health camps for employees, discounted health services (including vision, dental) for employee and family members, free doctor consultations, counsellors, etc. Corporate Meal card options for ease of use and tax benefits. Team lunches, company sponsored team outings and celebrations. Cab reimbursement for women employees beyond a certain time of the day. Robust leave policy to support work-life balance. Specially designed leave structure to support women employees for maternity and related requests. Reward and recognition platform to celebrate professional and personal milestones. A positive & transparent work environment including various employee engagement and employee benefit initiatives to support personal and professional learning and development.
Posted 1 week ago
12.0 - 18.0 years
25 - 40 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Role & responsibilities Azure Cloud Services (PaaS & IaaS): Proficient in deploying and managing cloud-based solutions using Azure's Platform-as-a-Service and Infrastructure-as-a-Service offerings. Data Engineering & Analytics: Azure Synapse Analytics: Integrated big data and data warehousing capabilities for comprehensive analytics solutions. Azure Data Factory: Developed and orchestrated ETL/ELT pipelines for seamless data movement and transformation. Azure Databricks & PySpark: Engineered scalable data processing workflows and machine learning models. Azure Stream Analytics: Implemented real-time data stream processing for immediate insights. Microsoft Fabric: Utilized AI-powered analytics for unified data access and management. Business Intelligence & Reporting: Power BI & SSRS: Designed and developed interactive dashboards and reports for data visualization and decision-making. SQL Server Analysis Services (SSAS): Built OLAP cubes and tabular models for multidimensional data analysis. Data Governance & Security: Microsoft Purview: Established comprehensive data governance frameworks to ensure compliance and data integrity. DevOps & Automation: Azure DevOps: Implemented CI/CD pipelines and automated deployment processes for efficient software delivery. Preferred candidate profile Technical Skills: Cloud Computing: Azure Cloud Services (PaaS & IaaS), Active Directory, Application Insights, Azure Stream Analytics, Azure Search, Data Factory, Key Vault and SQL Azure, Azure Data Factory, Azure Analysis Services, Azure Synapse Analytics (DW), Azure Data Lake, PySpark, Microsoft Fabric Database & BI Tools: SQL, T-SQL, SSIS, SSRS, SQL Server Management Studio (SSMS) 2016/2014, SQL Server Job Agent, Import and Export Data, Linked Servers. Reporting Tools: SSRS, Power BI reports, Tableau, Excel
Posted 1 week ago
8.0 - 13.0 years
15 - 30 Lacs
Pune
Work from Office
Role & responsibilities Position Details: Role: Data Engineer Location: Pune, Baner (Work from Office) Experience: 6+ years Working Days: Monday to Friday (9:30 AM to 6:30 PM) Education: Bachelor's or Master's in Computer Science, Engineering, Mathematics, or related field Company Website: www.avigna.ai LinkedIn: Avigna.AI Key Responsibilities: Design and develop robust data pipelines for large-scale data ingestion, transformation, and analytics. Implement scalable Lakehouse architectures using tools like Microsoft Fabric for structured and semi-structured data. Work with Python, PySpark, and Azure services to support data modeling, automation, and predictive insights. Develop custom KQL queries and manage data using Power BI, Azure Cosmos DB, or similar tools. Collaborate with cross-functional teams to integrate data-driven components with application backends and frontends. Ensure secure, efficient, and reliable CI/CD pipelines for automated deployments and data updates. Skills & Experience Required: Strong proficiency in Python, PySpark, and cloud-native data tools Experience with Microsoft Azure services (e.g., App Services, Functions, Cosmos DB, Active Directory) Hands-on experience with Microsoft Fabric (preferred or good to have) Working knowledge of Power BI and building interactive dashboards for business insights Familiarity with CI/CD practices for automated deployments Exposure to machine learning integration into data workflows (nice to have) Strong analytical and problem-solving skills with attention to detail Good to Have: Experience with KQL (Kusto Query Language) Background in simulation models or mathematical modeling Knowledge of Power Platform integration (Power Pages, Power Apps) Benefits: Competitive salary. Health insurance coverage. Professional development opportunities. Dynamic and collaborative work environment.
Posted 1 week ago
8.0 - 13.0 years
14 - 24 Lacs
Bengaluru
Remote
Key Responsibilities: Design and implement data solutions using MS Fabric, including data pipelines, data warehouses, and data lakes Lead and mentor a team of data engineers, providing technical guidance and oversight Collaborate with stakeholders to understand data requirements and deliver data-driven solutions Develop and maintain large-scale data systems, ensuring data quality, integrity, and security Troubleshoot data pipeline issues and optimize data workflows for performance and scalability Stay up-to-date with MS Fabric features and best practices, applying knowledge to improve data solutions Requirements: 8+ years of experience in data engineering, with expertise in MS Fabric, Azure Data Factory, or similar technologies Strong programming skills in languages like Python, SQL, or C# Experience with data modeling, data warehousing, and data governance Excellent problem-solving skills, with ability to troubleshoot complex data pipeline issues Strong communication and leadership skills, with experience leading teams
Posted 1 week ago
5.0 - 9.0 years
15 - 25 Lacs
Bengaluru
Hybrid
Position: Data Engineer Skills Required: Experience in Python/PySpark, strong SQL Server. Good to have: Azure Databricks, Azure Data Factory (ADF), Azure Synapse, or Snowflake.
Posted 1 week ago
1.0 - 4.0 years
5 - 9 Lacs
Noida, Mohali
Work from Office
- Support the development of internal web applications and tools. - Help build and maintain backend services. - Contribute to frontend development using React.js or Vue.js. - Assist in setting up and managing cloud-based infrastructure.
Posted 1 week ago
7.0 - 12.0 years
8 - 18 Lacs
Kolkata
Remote
Position: Sr Azure Data Engineer Location: Remote Time: CET Time Role & responsibilities We are seeking a highly skilled Senior Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in Microsoft Azure, Microsoft Fabric, Azure SQL, Azure Synapse, Python, and Power BI. Knowledge of Oracle DB and data replication tools is preferred. This role involves designing, developing, and maintaining robust data pipelines and ensuring efficient data processing and integration across various platforms. The candidate understands the stated needs & requirements of the stakeholders and produces high-quality deliverables. Monitors own work to ensure delivery within the desired performance standards. Understands the importance of delivery within expected time, budget and quality standards and displays concern in case of deviation. Good communication skills and a team player. Design and Development: Architect, develop, and maintain scalable data pipelines using Microsoft Fabric and Azure services, including Azure SQL and Azure Synapse. Data Integration: Integrate data from multiple sources, ensuring data consistency, quality, and availability using data replication tools. Data Management: Manage and optimize databases, ensuring high performance and reliability. ETL Processes: Develop and maintain ETL processes to transform data into actionable insights. Data Analysis: Use Python and other tools to analyze data, create reports, and provide insights to support business decisions. Visualization: Develop and maintain dashboards and reports in Power BI to visualize complex data sets. Performance Tuning: Optimize database performance and troubleshoot any issues related to data processing and integration. Preferred candidate profile Minimum 7 years of experience in data engineering or a related field. Proven experience with Microsoft Azure services, including Microsoft Fabric, Azure SQL and Azure Synapse. Strong proficiency in Python for data analysis and scripting.
Extensive experience with Power BI for data visualization. Knowledge of Oracle DB and experience with data replication tools. Proficient in SQL and database management. Experience with ETL tools and processes. Strong understanding of data warehousing concepts and architectures. Familiarity with cloud-based data platforms and services. Analytical Skills: Ability to analyze complex data sets and provide actionable insights. Problem-Solving: Strong problem-solving skills and the ability to troubleshoot data-related issues.
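The ETL responsibilities described in this role follow the same extract-transform-load shape regardless of platform. A minimal, self-contained sketch using SQLite as a stand-in for the real source and target systems (table names and figures are invented; in the role itself this logic would typically live in Azure Data Factory or Synapse pipelines):

```python
import sqlite3

# In-memory databases stand in for the hypothetical source and target systems.
src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (id INTEGER, amount_cents INTEGER, status TEXT)")
src.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                [(1, 1250, "PAID"), (2, 900, "CANCELLED"), (3, 4300, "PAID")])

tgt = sqlite3.connect(":memory:")
tgt.execute("CREATE TABLE fact_orders (id INTEGER, amount REAL)")

# Extract: pull only the rows of interest from the source.
rows = src.execute(
    "SELECT id, amount_cents FROM orders WHERE status = 'PAID'").fetchall()

# Transform: convert cents to currency units.
transformed = [(order_id, cents / 100.0) for order_id, cents in rows]

# Load: write the transformed rows into the target fact table.
tgt.executemany("INSERT INTO fact_orders VALUES (?, ?)", transformed)
tgt.commit()

print(tgt.execute("SELECT id, amount FROM fact_orders ORDER BY id").fetchall())
# [(1, 12.5), (3, 43.0)]
```

The same three stages map onto pipeline activities (copy, data flow, sink) in the Azure tooling; the sketch only illustrates the flow of data between them.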
Posted 1 week ago