1620 Azure Databricks Jobs - Page 2


6.0 - 9.0 years

9 - 13 Lacs

Jaipur

Work from Office

About the job:
Role: Microsoft Fabric Data Engineer
Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.
Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.
Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems such as SAP ECC, S/4HANA, and SAP BW is a plus.
- Nice to have: the ability to develop dashboards or reports using tools like Power BI.
Coding fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.
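To make the pipeline skills above concrete, here is a minimal, hedged PySpark sketch of the ingest-transform-load pattern the role describes; the paths and table names are hypothetical, and the same code runs in a Fabric or Databricks notebook.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("fabric_ingest_demo").getOrCreate()

# Ingest raw sales files (hypothetical landing path) into a DataFrame.
raw = spark.read.json("Files/landing/sales/")

# Transform: standardize column names, stamp the load date, drop bad rows.
clean = (
    raw.withColumnRenamed("CustID", "customer_id")
       .withColumn("load_date", F.current_date())
       .dropna(subset=["customer_id", "amount"])
)

# Load into a Delta table that a Power BI model can sit on top of.
clean.write.format("delta").mode("append").saveAsTable("sales_clean")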

Posted 4 hours ago

Apply

10.0 - 20.0 years

35 - 60 Lacs

Mumbai, Navi Mumbai

Work from Office

We Are Hiring: Databricks Data Architect | Navi Mumbai (Onsite)
Are you passionate about designing scalable, enterprise-grade data platforms? Join Celebal Technologies and work on cutting-edge Azure Databricks solutions in the manufacturing and energy sectors!
Role: Databricks Data Architect
Experience: 10+ years
Location: Navi Mumbai (Onsite)
Notice Period: Immediate to 30 days preferred
About the Role: We are looking for an experienced Databricks Data Architect with strong expertise in Azure Databricks, data modeling, and big data solutions. You'll be responsible for architecting scalable, cloud-native data platforms that enable real-time and batch processing for advanced analytics and AI-driven insights.
Key Skills:
- Azure Databricks | Apache Spark | PySpark
- Delta Lake | Data Lakes | Data Warehouses | Lakehouse Architectures
- Kafka / Event Hub | Streaming & Batch Data Processing
- Data Modeling | ETL/ELT Pipelines | Metadata Management
- Data Governance | Security & Compliance
- Manufacturing / Energy Domain Experience (Preferred)
Why Join Us?
- Work on innovative big data and cloud-native solutions
- Exposure to manufacturing and energy sector projects
- Collaborative, growth-oriented work environment
- Be part of a fast-growing leader in data engineering and AI solutions
Interested? Let's connect! Send your resume to Latha.kolla@celebaltech.com / 8197451451.
#Hiring #Databricks #AzureDatabricks #DataArchitect #BigData #DataEngineering #Lakehouse #DeltaLake #Kafka #Azure #CloudComputing #Manufacturing #Energy #CelebalTechnologies #Careers #JobSearch
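As a hedged illustration of the streaming lakehouse pattern this ad names (Kafka/Event Hub feeding Delta Lake), the sketch below reads a Kafka-compatible stream into a bronze Delta table; the broker, topic, and table names are placeholders.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("stream_to_delta_demo").getOrCreate()

# Read an event stream (Azure Event Hubs also exposes a Kafka-compatible endpoint).
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
         .option("subscribe", "plant-telemetry")            # placeholder topic
         .load()
)

# Land the raw payload in a bronze Delta table with checkpointing for recovery.
query = (
    events.selectExpr("CAST(value AS STRING) AS payload", "timestamp")
          .writeStream.format("delta")
          .option("checkpointLocation", "/chk/telemetry_bronze")
          .toTable("telemetry_bronze")
)
query.awaitTermination()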

Posted 5 hours ago

Apply

12.0 - 22.0 years

35 - 55 Lacs

Bengaluru

Work from Office

Role: QA Lead - ETL
Experience: 12+ years
Location: Bangalore
Notice: within 1 month
Job Description: Quality Engineer - Automation - ETL, Complex SQL, Azure Databricks
The STAM project is seeking a skilled Software QA Test Engineer. STAM (Site Transaction Analytical Mobility) is a strategic Data Lakehouse solution within the Downstream Mobility Data & Analytics portfolio, designed to replace the current RSTS (SAP S/4HANA) system. RSTS captures transactional sales data from Shell retail stations and transforms it into actionable insights for reporting and analytics. The STAM solution leverages modern cloud technologies including Azure Event Hubs, Azure Databricks, Azure Data Factory, and Azure Data Lake Storage, with reporting via SAP BO and Power BI and external system integration through APIs.
Key Responsibilities:
- Develop and own QA deliverables, including the test strategy, test plan, and test cases (risk-based approach).
- Execute manual and automated test cases to validate data integrity and system functionality.
- Identify opportunities for test automation and contribute to the automation strategy.
- Design and execute tests for large-scale data migrations, ensuring data availability, accuracy, and quality.
- Collaborate with developers, data engineers, and business stakeholders to ensure testing aligns with business requirements and technical design.
Qualifications:
- Proven experience in software QA and testing, preferably on data platforms or cloud-based solutions.
- Strong understanding of data pipelines, ETL processes, and data validation techniques.
- Familiarity with Azure cloud services, especially Databricks and Data Factory.
- Experience with test automation tools and scripting is a plus.
- Excellent analytical and communication skills.
Roles and Responsibilities:
- Accountable for end-to-end testing deliverables across a portfolio and for demand management, bringing capable candidates into the team and ensuring every program's deliverables are of top quality. Maintain business continuity.
- Manage a small team of 10-12 members, including their goals, appraisals, and development plans supporting their aspirations.
- Demonstrate automation experience and the ability to manage and deliver top-quality testing for medium to large programs.
- Take a leadership position in automation, working with architects and Scrum team members to clarify requirements and ensure testability and the ability to automate. Provide feedback on both functional and technical design.
- Innovate with the latest tools and processes to improve functional, manual, and automation QA testing. Document best practices and mentor junior team members.
- Work on frameworks to ensure continuous integration and continuous deployment. Develop new proofs of concept for QA automation, ensuring continual improvement.
The role has frequent exposure to and interaction with senior levels of business leadership. Works independently under minimal managerial supervision, delivering assigned work by applying established procedures and frameworks; guidance is provided only for tasks with limited precedent. Defines the roadmap for the CC in collaboration with Portfolio/IT managers and drives alignment across all stakeholders.
Mandatory skills: Bachelor's degree in computer science or equivalent and 12+ years of experience as a QA lead. Agile testing and test automation (accelerator tools). Selenium C# with a BDD framework using SpecFlow; API automation; implementation of Selenium Grid; applying appropriate test measurements and metrics.
Optional skills: Leadership and the ability to drive change using influence and networking. Good team player and effective leader with a long-term vision. Flexible regarding time zones, coordinating with stakeholders in the UK/US.
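For a feel of the automated data-validation work described above, here is a minimal sketch of two reconciliation checks on Databricks; the table and column names are hypothetical stand-ins for the migration's source and target.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("etl_validation_demo").getOrCreate()

def test_row_counts_match(source: str, target: str) -> None:
    # Reconcile row counts between the source extract and the loaded target.
    src = spark.table(source).count()
    tgt = spark.table(target).count()
    assert src == tgt, f"Row count mismatch: {source}={src}, {target}={tgt}"

def test_no_null_keys(table: str, key: str) -> None:
    # A loaded fact table should never carry null business keys.
    nulls = spark.table(table).filter(f"{key} IS NULL").count()
    assert nulls == 0, f"{nulls} null {key} values found in {table}"

test_row_counts_match("rsts_sales_extract", "stam_sales")  # hypothetical tables
test_no_null_keys("stam_sales", "transaction_id")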

Posted 17 hours ago

Apply

7.0 - 12.0 years

0 - 0 Lacs

Bengaluru

Hybrid

We are seeking a Senior Azure Data Engineer with hands-on expertise in Azure Databricks, Azure Data Factory (ADF), Python/PySpark, and SQL. The ideal candidate should have strong experience designing and building scalable data pipelines in the Azure cloud.
Primary Responsibilities:
- Design and develop scalable ETL/ELT pipelines using ADF and Databricks (PySpark).
- Write efficient, production-grade PySpark and SQL code for data transformations.
- Build and maintain data models, ingestion frameworks, and processing pipelines.
- Collaborate with cross-functional teams to gather requirements and implement solutions.
- Ensure performance tuning, error handling, and monitoring of data workflows.
Mandatory Skills:
- Strong hands-on experience with Azure Databricks and Azure Data Factory (ADF).
- Proficiency in Python and PySpark.
- Excellent SQL skills for data extraction, transformation, and analysis.
- Good understanding of data lake architecture and best practices.
Good to Have / Secondary Skills:
- Experience with CI/CD pipelines and GitHub-based version control.
- Exposure to DevOps practices in data engineering.
- Knowledge of data quality frameworks and orchestration tools is a plus.
Soft Skills:
- Strong analytical and problem-solving skills.
- Effective communication and stakeholder collaboration.
- Agile mindset and ability to adapt to evolving data needs.
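A hedged sketch of the kind of production-grade PySpark transformation the posting asks for, written as a parameterized function an ADF pipeline might schedule as a Databricks job; all table names are illustrative.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adf_databricks_demo").getOrCreate()

def build_daily_revenue(orders_table: str, out_table: str) -> None:
    # Aggregate completed orders into a daily revenue table; a typical
    # unit of work triggered by an ADF notebook or job activity.
    daily = (
        spark.table(orders_table)
             .filter(F.col("status") == "COMPLETE")
             .groupBy(F.to_date("order_ts").alias("order_date"))
             .agg(F.sum("amount").alias("revenue"))
    )
    daily.write.format("delta").mode("overwrite").saveAsTable(out_table)

build_daily_revenue("raw_orders", "gold_daily_revenue")  # hypothetical names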

Posted 17 hours ago

Apply

3.0 - 8.0 years

9 - 13 Lacs

Hyderabad

Work from Office

Project Role: Data Platform Engineer
Project Role Description: Assists with the data platform blueprint and design, encompassing the relevant data platform components. Collaborates with Integration Architects and Data Architects to ensure cohesive integration between systems and data models.
Must-have skills: Databricks Unified Data Analytics Platform
Good-to-have skills: Microsoft Power Business Intelligence (BI), Microsoft Azure Databricks
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As a Data Platform Engineer, you will assist with the data platform blueprint and design, encompassing the relevant data platform components. Your typical day will involve collaborating with Integration Architects and Data Architects to ensure cohesive integration between systems and data models, while also engaging in discussions to refine and enhance the data architecture. You will analyze data requirements and translate them into effective solutions that align with the organization's overall data strategy. The role requires you to stay updated on the latest trends in data engineering and contribute to the continuous improvement of data processes and systems.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Engage in the design and implementation of data pipelines to support data integration and analytics.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
Professional & Technical Skills:
- Must-have skills: proficiency in the Databricks Unified Data Analytics Platform.
- Good-to-have skills: experience with Microsoft Azure Databricks and Microsoft Power Business Intelligence (BI).
- Strong understanding of data modeling concepts and best practices.
- Experience with ETL processes and data warehousing solutions.
- Familiarity with cloud-based data solutions and architectures.
Additional Information:
- The candidate should have a minimum of 3 years of experience with the Databricks Unified Data Analytics Platform.
- This position is based at our Hyderabad office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education

Posted 20 hours ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Pune

Work from Office

About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Azure Databricks
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure that application development aligns with business objectives, overseeing project timelines, and facilitating communication among stakeholders. You will also be responsible for troubleshooting issues and providing guidance to team members, ensuring that the applications meet the required standards and specifications. Your role will be pivotal in driving innovation and efficiency within the application development process, fostering a collaborative environment that encourages creativity and problem-solving.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate training and development opportunities for team members to enhance their skills.
- Monitor project progress and implement necessary adjustments to ensure timely delivery.
Professional & Technical Skills:
- Must-have skills: proficiency in Microsoft Azure Databricks.
- Good-to-have skills: experience with cloud computing platforms.
- Strong understanding of application development methodologies.
- Familiarity with data integration and ETL processes.
- Experience in performance tuning and optimization of applications.
Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Microsoft Azure Databricks.
- This position is based in Pune.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education

Posted 20 hours ago

Apply

7.0 - 10.0 years

3 - 7 Lacs

Noida

Work from Office

We are seeking a Lead Data Engineer with 7-10 years of experience to join our Data Platform team. This role reports to the Manager of Data Engineering and is involved in the planning, design, and implementation of our centralized data warehouse solution for ETL, reporting, and analytics across all applications within the company.
Qualifications:
- Deep knowledge of and experience working with Python/Scala and PySpark.
- Experienced in Azure Databricks, Azure Blob Storage, Azure Data Lake, and Delta Lake.
- Experience working with Spark SQL.
- Experience creating pipelines and Databricks dashboards.
- Experience with Azure cloud environments.
- Experience acquiring and preparing data from primary and secondary disparate data sources.
- Experience working on large-scale data product implementations, with responsibility for technical delivery.
- Good understanding of OOP and design patterns.
- Experience working with agile methodology preferred.
- Healthcare industry experience preferred.
Responsibilities:
- Collaborate with and across Agile teams to design, develop, test, implement, and support technical solutions.
- Work with teams that have deep experience in ETL processes, distributed microservices, and data science domains to understand how to centralize their data.
- Share your passion for experimenting with and learning new technologies.
- Perform thorough data analysis, uncover opportunities, and address business problems.
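Since the role leans on Spark SQL and Delta Lake, here is a small, hedged example of a Delta upsert (MERGE), assuming two existing Delta tables with hypothetical healthcare-flavored names.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta_merge_demo").getOrCreate()

# Upsert the day's staged patient records into a Delta dimension table.
spark.sql("""
    MERGE INTO dim_patient AS t
    USING staging_patient AS s
    ON t.patient_id = s.patient_id
    WHEN MATCHED THEN UPDATE SET *
    WHEN NOT MATCHED THEN INSERT *
""")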

Posted 21 hours ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Kolkata

Work from Office

Job Title: Sr. Data Engineer - Ontology & Knowledge Graph Specialist
Department: Platform Engineering
Summary: We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.
Responsibilities:
Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.
Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.
Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.
Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.
Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.
Qualifications:
Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.
Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
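For flavor, a minimal rdflib sketch of the RDF and SPARQL side of this role; the namespace, classes, and triples are hypothetical stand-ins for a BFO/CCO-aligned ontology.

from rdflib import Graph, Literal, Namespace, RDF

# Hypothetical namespace standing in for a BFO/CCO-aligned ontology.
EX = Namespace("http://example.com/ontology#")

g = Graph()
g.bind("ex", EX)

# Assert a few triples: an asset and the process it participates in.
g.add((EX.Pump42, RDF.type, EX.Asset))
g.add((EX.Pump42, EX.participatesIn, EX.CoolingProcess))
g.add((EX.Pump42, EX.label, Literal("Primary coolant pump")))

# SPARQL: which assets participate in which processes?
for asset, process in g.query(
    "SELECT ?asset ?process WHERE { ?asset ex:participatesIn ?process . }"
):
    print(asset, "->", process)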

Posted 21 hours ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Ahmedabad

Work from Office

Job Title: Sr. Data Engineer - Ontology & Knowledge Graph Specialist
Department: Platform Engineering
Summary: We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.
Responsibilities:
Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.
Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.
Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.
Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.
Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.
Qualifications:
Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.
Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.

Posted 21 hours ago

Apply

8.0 - 13.0 years

12 - 16 Lacs

Bengaluru

Hybrid

This position reports to: Senior Project Manager
In this role, you will have the opportunity to deliver and implement analytics strategy in one or more functional areas in alignment with business stakeholders. Each day, you will build and maintain large-scale statistical models that turn billions of data points into insights and actions. You will also showcase your expertise by driving data-driven strategy and decision-making for predictive analytics and optimization, leveraging Artificial Intelligence (AI) and Machine Learning (ML). The work model for the role is: #LI-Hybrid. This role contributes to the Process Automation business for the BA Function Operation Centers and Quality division, based in Bangalore, India.
You will be mainly accountable for:
- Front-End Development: Translate design mockups into responsive web pages or applications using HTML and CSS, ensuring pixel-perfect implementation.
- Application Development & AI Enablement: Build end-to-end frameworks, integrating data analytics and AI capabilities for enhanced functionality.
- Software Design & Support: Ensure effective design, development, validation, and support activities, aligning with customer needs and ABB standards.
- Machine Learning & Data Science: Develop and fine-tune ML models for predictive analytics, classification, and recommendation systems.
- Data Analysis & Visualization: Perform exploratory data analysis, preprocess data, and create compelling visualizations for non-technical stakeholders.
- Project Planning & Collaboration: Define solutions, manage projects, track progress, and collaborate with cross-functional teams to develop data science solutions.
- Continuous Learning & ML Deployment: Stay updated on the latest AI/ML trends and deploy ML models as REST APIs for seamless integration.
You will join a dynamic team, where you will be able to thrive.
Qualifications for the role:
- Educational Background & Experience: Engineering or master's degree in data science with 8+ years of AI/ML expertise.
- Front-End & JavaScript Skills: Proficiency in responsive web design and JavaScript frameworks like React, Angular, or Vue.js.
- AI/ML & Data Science Expertise: Strong Python skills, experience with PyTorch/TensorFlow, and knowledge of manufacturing/process industries.
- Cloud & Deployment Proficiency: Familiarity with Azure ML Studio, Azure Databricks, Docker, Kubernetes, and MLflow for scalable deployment.
- Analytical & Visualization Skills: Ability to analyze data, build predictive models, and create insightful visualizations.
- Communication & Collaboration: Strong ability to engage with clients, cross-functional teams, and stakeholders to drive innovation.
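A hedged sketch of the MLflow step in the deployment workflow mentioned above; the run name and toy data are illustrative, and on Databricks the tracking URI is preconfigured (elsewhere, set mlflow.set_tracking_uri first).

import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy training data standing in for real process telemetry.
X, y = make_classification(n_samples=500, n_features=8, random_state=42)

# Track a simple predictive model run; logged models can later be
# served behind a REST API, as the role describes.
with mlflow.start_run(run_name="predictive_maintenance_demo"):
    model = LogisticRegression(max_iter=200).fit(X, y)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, "model")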

Posted 21 hours ago

Apply

5.0 - 9.0 years

15 - 27 Lacs

Hyderabad, Gurugram

Work from Office

Must Have:
- Strong proficiency in Python and TypeScript for full-stack development.
- Backend development experience with Flask or equivalent Python web frameworks.
- Hands-on experience integrating AI features using LangChain, OpenAI SDKs, or similar.
- Understanding of core AI concepts such as Retrieval-Augmented Generation (RAG) and the Model Context Protocol (MCP).
- Frontend development skills with Next.js for building responsive, modern UIs.
- UI styling using Tailwind CSS and component libraries like shadcn/ui.
- State management using React Query, Zustand, or equivalent tools.
- Experience deploying scalable applications on Microsoft Azure.
- Ability to write clean, maintainable, and well-structured code.
Good to Have:
- Familiarity with Azure Databricks and Azure Data Factory.
- Experience building Microsoft Office Add-ins (Word, Excel, Outlook).
- Knowledge of prompt engineering and working with LLM APIs.
- Experience with Docker and CI/CD pipeline integration.
- Understanding of performance optimization and monitoring tools within Azure.
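Because the ad calls out Retrieval-Augmented Generation, here is a deliberately library-free toy sketch of the retrieval step; the bag-of-words "embedding" is purely illustrative, where a real system would call an embedding model API.

from collections import Counter
import math

DOCS = {
    "refunds": "Refunds are processed within 14 days of a return request.",
    "shipping": "Standard shipping takes 3 to 5 business days.",
}

def embed(text: str) -> Counter:
    # Toy 'embedding': word counts; real systems call a model API here.
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question: str) -> str:
    # Pick the most similar document; its text is then pasted into the
    # LLM prompt as grounding context (the retrieval in RAG).
    q = embed(question)
    best = max(DOCS, key=lambda k: cosine(q, embed(DOCS[k])))
    return DOCS[best]

print(retrieve("how long do refunds take?"))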

Posted 22 hours ago

Apply

6.0 - 10.0 years

9 - 13 Lacs

Ahmedabad

Work from Office

The Azure Databricks Engineer plays a critical role in establishing and maintaining an efficient data ecosystem within an organization. This position is integral to the development of data solutions leveraging the capabilities of Azure Databricks. The engineer will work closely with data scientists and analytics teams to facilitate the transformation of raw data into actionable insights. With increasing reliance on big data technologies and cloud-based solutions, having an expert on board is vital for driving data-driven decision-making. The Azure Databricks Engineer will also be responsible for optimizing data workflows, ensuring data quality, and deploying scalable data solutions that align with organizational goals. This role requires not only technical expertise in handling large volumes of data but also the ability to collaborate across various functional teams to enhance operational efficiency.
Responsibilities:
- Design and implement scalable data pipelines using Azure Databricks.
- Develop ETL processes to efficiently extract, transform, and load data.
- Collaborate with data scientists and analysts to define and refine data requirements.
- Optimize Spark jobs for performance and efficiency.
- Monitor and troubleshoot production workflows and jobs.
- Implement data quality checks and validation processes.
- Create and maintain technical documentation related to data architecture.
- Conduct code reviews to ensure best practices are followed.
- Integrate data from various sources, including databases, APIs, and third-party services.
- Utilize SQL and Python for data manipulation and analysis.
- Collaborate with DevOps teams to deploy and maintain data solutions.
- Stay updated on the latest trends in Azure Databricks and related technologies.
- Facilitate data visualization initiatives for better data-driven insights.
- Provide training and support to team members on data tools and practices.
- Participate in cross-functional projects to enhance data sharing and access.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 6 years of experience in data engineering or a related domain.
- Strong expertise in Azure Databricks and data lake concepts.
- Proficiency with SQL, Python, and Spark.
- Solid understanding of data warehousing concepts.
- Experience with ETL tools and frameworks.
- Familiarity with cloud platforms such as Azure, AWS, or Google Cloud.
- Excellent problem-solving and analytical skills.
- Ability to work collaboratively in a diverse team environment.
- Experience with data visualization tools such as Power BI or Tableau.
- Strong communication skills, with the ability to convey technical concepts to non-technical stakeholders.
- Knowledge of data governance and data quality best practices.
- Hands-on experience with big data technologies and frameworks.
- A relevant certification in Azure is a plus.
- Ability to adapt to changing technologies and evolving business requirements.
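One small, hedged example of the "optimize Spark jobs" responsibility: hinting a broadcast join so a small dimension table avoids shuffling the large fact table; the table names are hypothetical.

from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("spark_tuning_demo").getOrCreate()

facts = spark.table("fact_events")    # large table (hypothetical)
dims = spark.table("dim_event_type")  # small lookup table (hypothetical)

# Broadcasting the small side skips the shuffle of the large side,
# a common first step when tuning join-heavy Databricks jobs.
joined = facts.join(broadcast(dims), on="event_type_id", how="left")
joined.write.format("delta").mode("overwrite").saveAsTable("events_enriched")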

Posted 22 hours ago

Apply

4.0 - 6.0 years

9 - 13 Lacs

Gurugram

Work from Office

As a Senior Cloud Data Platform (Azure) Engineer at Incedo, you will be responsible for managing and optimizing the Microsoft Azure environment, ensuring its performance, scalability, and security. You will work closely with data analysts and data scientists to develop data pipelines and run data science experiments. You will be skilled in cloud computing platforms such as AWS or GCP and have experience with data warehousing technologies such as Azure Synapse Analytics. You will be responsible for configuring and optimizing the Azure environment, ensuring that data pipelines are efficient and accurate, and troubleshooting any issues that arise. You will also work with the security team to ensure that the Azure environment is secure and complies with relevant regulations.
Roles & Responsibilities:
- Designing, developing, and deploying cloud-based data platforms using Microsoft Azure.
- Integrating and processing large amounts of structured and unstructured data from various sources.
- Implementing and optimizing ETL processes and data pipelines.
- Developing and maintaining security and access controls.
- Collaborating with other teams to ensure the consistency and integrity of data.
- Troubleshooting and resolving data platform issues.
Technical Skills Requirements:
- Expertise in Azure services and tools such as Azure Data Factory, Azure Databricks, and Azure Synapse Analytics.
- Experience building scalable and reliable data pipelines using Azure services, Apache Spark, and related big data technologies.
- Familiarity with cloud-based infrastructure and deployment, specifically on Azure.
- Strong knowledge of programming languages such as Python, C#, and SQL.
- Excellent communication skills, with the ability to convey complex technical information to non-technical stakeholders in a clear and concise manner.
- Understanding of and alignment with the company's long-term vision.
- Ability to provide leadership, guidance, and support to team members, ensuring the successful completion of tasks and promoting a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team.
Qualifications:
- 4-6 years of work experience in a relevant field.
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred.

Posted 22 hours ago

Apply

6.0 - 10.0 years

9 - 13 Lacs

Kolkata

Work from Office

The Azure Databricks Engineer plays a critical role in establishing and maintaining an efficient data ecosystem within an organization. This position is integral to the development of data solutions leveraging the capabilities of Azure Databricks. The engineer will work closely with data scientists and analytics teams to facilitate the transformation of raw data into actionable insights. With increasing reliance on big data technologies and cloud-based solutions, having an expert on board is vital for driving data-driven decision-making. The Azure Databricks Engineer will also be responsible for optimizing data workflows, ensuring data quality, and deploying scalable data solutions that align with organizational goals. This role requires not only technical expertise in handling large volumes of data but also the ability to collaborate across various functional teams to enhance operational efficiency.
Responsibilities:
- Design and implement scalable data pipelines using Azure Databricks.
- Develop ETL processes to efficiently extract, transform, and load data.
- Collaborate with data scientists and analysts to define and refine data requirements.
- Optimize Spark jobs for performance and efficiency.
- Monitor and troubleshoot production workflows and jobs.
- Implement data quality checks and validation processes.
- Create and maintain technical documentation related to data architecture.
- Conduct code reviews to ensure best practices are followed.
- Integrate data from various sources, including databases, APIs, and third-party services.
- Utilize SQL and Python for data manipulation and analysis.
- Collaborate with DevOps teams to deploy and maintain data solutions.
- Stay updated on the latest trends in Azure Databricks and related technologies.
- Facilitate data visualization initiatives for better data-driven insights.
- Provide training and support to team members on data tools and practices.
- Participate in cross-functional projects to enhance data sharing and access.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 6 years of experience in data engineering or a related domain.
- Strong expertise in Azure Databricks and data lake concepts.
- Proficiency with SQL, Python, and Spark.
- Solid understanding of data warehousing concepts.
- Experience with ETL tools and frameworks.
- Familiarity with cloud platforms such as Azure, AWS, or Google Cloud.
- Excellent problem-solving and analytical skills.
- Ability to work collaboratively in a diverse team environment.
- Experience with data visualization tools such as Power BI or Tableau.
- Strong communication skills, with the ability to convey technical concepts to non-technical stakeholders.
- Knowledge of data governance and data quality best practices.
- Hands-on experience with big data technologies and frameworks.
- A relevant certification in Azure is a plus.
- Ability to adapt to changing technologies and evolving business requirements.

Posted 22 hours ago

Apply

6.0 - 9.0 years

9 - 13 Lacs

Pune

Work from Office

About the job:
Role: Microsoft Fabric Data Engineer
Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.
Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.
Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems such as SAP ECC, S/4HANA, and SAP BW is a plus.
- Nice to have: the ability to develop dashboards or reports using tools like Power BI.
Coding fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.

Posted 22 hours ago

Apply

6.0 - 9.0 years

9 - 13 Lacs

Chennai

Work from Office

Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.
Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.
Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems such as SAP ECC, S/4HANA, and SAP BW is a plus.
- Nice to have: the ability to develop dashboards or reports using tools like Power BI.
Coding fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.

Posted 22 hours ago

Apply

5.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Your role and responsibilities: You will be responsible for the design, integrity, and quality of our D&A solution, designing solutions on the Finance Snowflake platform to support business and functional requirements. You are responsible for ensuring that the Finance solution/service provides the required functionality and meets the cost expectations of the business, and you will collaborate on and contribute to project set-up and delivery as well as service operations optimization. The work model for the role is: #LI-Onsite. This role contributes to the Finance Services business, Finance Process Data Systems division, in Bangalore, India.
You will be mainly accountable for:
The Snowflake Solution Architect provides solution design for building and maintaining:
- Data ingestion and modeling in Snowflake from various data sources (S/4HANA, BW/4HANA, etc.)
- Transformation within Qlik Compose and the Snowflake architecture of instances/layers
- Data mart design that supports easy Power BI report and dashboard development
- The authorization and access rights concept
The Snowflake Solution Architect also:
- Assists in defining technical solutions in alignment with other D&A, AI, and Finance function stakeholders through cooperation and collaboration across teams.
- Supports ensuring the solution provides the required functionality and meets the requirements of functional stakeholders.
- Contributes to delivery as well as continuous improvement projects.
- Engages in requirements engineering to decide which requirements are covered by available standard functionality, which need enhancements, and which can be met with accepted workarounds.
- Supports validation and prioritization of incoming business demand in collaboration with the relevant IS Service Manager and/or Global IS Domain Architect.
- Assists in reviewing and signing off the functional and technical use case specifications.
Qualifications for the role:
- A B.Sc. or M.Sc. in Computer Science or a related discipline.
- At least 5 years of working experience in the information technology field.
- At least 3-5 years of experience with a Snowflake/Microsoft BI application landscape such as SQL Server, Power BI, Integration Services, and Analysis Services.
- Good understanding of the structure and logic of data warehousing and business intelligence methodologies.
- Experience with common development tools: Power BI, Microsoft Integration Services, Microsoft Analysis Services, Microsoft Azure solutions, Microsoft SQL data warehouse.
- Experience in UI development.
- Fluency in English; ideally, other language skills as well.
- Experience with DevOps methodology.
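A hedged sketch of querying a Snowflake data mart from Python with the official connector; the account, credentials, and table are placeholders, and in practice credentials would come from a secrets store.

import snowflake.connector

# Placeholder connection details; never hard-code real credentials.
conn = snowflake.connector.connect(
    account="myorg-myaccount", user="svc_finance", password="***",
    warehouse="FINANCE_WH", database="FINANCE", schema="MART",
)

# Shape data the way a Power BI dataset would consume it.
cur = conn.cursor()
cur.execute("""
    SELECT company_code, fiscal_period, SUM(amount) AS net_revenue
    FROM fact_gl_postings  -- hypothetical S/4HANA-sourced fact table
    GROUP BY company_code, fiscal_period
""")
for row in cur.fetchmany(5):
    print(row)
cur.close()
conn.close()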

Posted 23 hours ago

Apply

6.0 - 10.0 years

9 - 13 Lacs

Pune

Work from Office

The Azure Databricks Engineer plays a critical role in establishing and maintaining an efficient data ecosystem within an organization. This position is integral to the development of data solutions leveraging the capabilities of Azure Databricks. The engineer will work closely with data scientists and analytics teams to facilitate the transformation of raw data into actionable insights. With increasing reliance on big data technologies and cloud-based solutions, having an expert on board is vital for driving data-driven decision-making. The Azure Databricks Engineer will also be responsible for optimizing data workflows, ensuring data quality, and deploying scalable data solutions that align with organizational goals. This role requires not only technical expertise in handling large volumes of data but also the ability to collaborate across various functional teams to enhance operational efficiency.
Responsibilities:
- Design and implement scalable data pipelines using Azure Databricks.
- Develop ETL processes to efficiently extract, transform, and load data.
- Collaborate with data scientists and analysts to define and refine data requirements.
- Optimize Spark jobs for performance and efficiency.
- Monitor and troubleshoot production workflows and jobs.
- Implement data quality checks and validation processes.
- Create and maintain technical documentation related to data architecture.
- Conduct code reviews to ensure best practices are followed.
- Integrate data from various sources, including databases, APIs, and third-party services.
- Utilize SQL and Python for data manipulation and analysis.
- Collaborate with DevOps teams to deploy and maintain data solutions.
- Stay updated on the latest trends in Azure Databricks and related technologies.
- Facilitate data visualization initiatives for better data-driven insights.
- Provide training and support to team members on data tools and practices.
- Participate in cross-functional projects to enhance data sharing and access.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Minimum of 6 years of experience in data engineering or a related domain.
- Strong expertise in Azure Databricks and data lake concepts.
- Proficiency with SQL, Python, and Spark.
- Solid understanding of data warehousing concepts.
- Experience with ETL tools and frameworks.
- Familiarity with cloud platforms such as Azure, AWS, or Google Cloud.
- Excellent problem-solving and analytical skills.
- Ability to work collaboratively in a diverse team environment.
- Experience with data visualization tools such as Power BI or Tableau.
- Strong communication skills, with the ability to convey technical concepts to non-technical stakeholders.
- Knowledge of data governance and data quality best practices.
- Hands-on experience with big data technologies and frameworks.
- A relevant certification in Azure is a plus.
- Ability to adapt to changing technologies and evolving business requirements.

Posted 23 hours ago

Apply

6.0 - 9.0 years

9 - 13 Lacs

Ahmedabad

Work from Office

About the job:
Role: Microsoft Fabric Data Engineer
Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.
Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.
Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems such as SAP ECC, S/4HANA, and SAP BW is a plus.
- Nice to have: the ability to develop dashboards or reports using tools like Power BI.
Coding fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.

Posted 23 hours ago

Apply

6.0 - 9.0 years

9 - 13 Lacs

Kolkata

Remote

About the job:
Role: Microsoft Fabric Data Engineer
Experience: 6+ years as an Azure Data Engineer, including at least one end-to-end (E2E) implementation in Microsoft Fabric.
Responsibilities:
- Lead the design and implementation of Microsoft Fabric-centric data platforms and data warehouses.
- Develop and optimize ETL/ELT processes within the Microsoft Azure ecosystem, effectively utilizing relevant Fabric solutions.
- Ensure data integrity, quality, and governance throughout the Microsoft Fabric environment.
- Collaborate with stakeholders to translate business needs into actionable data solutions.
- Troubleshoot and optimize existing Fabric implementations for enhanced performance.
Skills:
- Solid foundational knowledge of data warehousing, ETL/ELT processes, and data modeling (dimensional, normalized).
- Design and implement scalable, efficient data pipelines using Data Factory in Fabric (Data Pipeline, Dataflow Gen2, etc.), PySpark notebooks, Spark SQL, and Python, covering data ingestion, transformation, and loading.
- Experience ingesting data from SAP systems such as SAP ECC, S/4HANA, and SAP BW is a plus.
- Nice to have: the ability to develop dashboards or reports using tools like Power BI.
Coding fluency:
- Proficiency in SQL, Python, or other languages for data scripting, transformation, and automation.

Posted 23 hours ago

Apply

12.0 - 14.0 years

25 - 30 Lacs

Chennai

Work from Office

The Solution Architect - Data Engineer will design, implement, and manage data solutions for the insurance business, leveraging expertise in Cognos, DB2, Azure Databricks, ETL processes, and SQL. The role involves working with cross-functional teams to design scalable data architectures and enable advanced analytics and reporting, supporting the company's finance, underwriting, claims, and customer service operations.
Key Responsibilities:
- Data Architecture & Design: Design and implement robust, scalable data architectures and solutions in the insurance domain using Azure Databricks, DB2, and other data platforms.
- Data Integration & ETL Processes: Lead the development and optimization of ETL pipelines to extract, transform, and load data from multiple sources, ensuring data integrity and performance.
- Cognos Reporting: Oversee the design and maintenance of Cognos reporting systems, developing custom reports and dashboards to support business users in finance, claims, underwriting, and operations.
- Data Engineering: Design, build, and maintain data models, data pipelines, and databases to enable business intelligence and advanced analytics across the organization.
- Cloud Infrastructure: Develop and manage data solutions on Azure, including Databricks for data processing, ensuring seamless integration with existing systems (e.g., DB2, legacy platforms).
- SQL Development: Write and optimize complex SQL queries for data extraction, manipulation, and reporting, with a focus on performance and scalability.
- Data Governance & Quality: Ensure data quality, consistency, and governance across all data solutions, implementing best practices and adhering to industry standards (e.g., GDPR, insurance regulations).
- Collaboration: Work closely with business stakeholders, data scientists, and analysts to understand business needs and translate them into technical solutions that drive actionable insights.
- Solution Architecture: Provide architectural leadership in designing data platforms, ensuring that solutions meet business requirements, are cost-effective, and can scale for future growth.
- Performance Optimization: Continuously monitor and tune the performance of databases, ETL processes, and reporting tools to meet service level agreements (SLAs).
- Documentation: Create and maintain comprehensive technical documentation, including architecture diagrams, ETL process flows, and data dictionaries.
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Solution Architect or Data Engineer in the insurance industry, with a strong focus on data solutions.
- Hands-on experience with Cognos (for reporting and dashboarding) and DB2 (for database management).
- Proficiency in Azure Databricks for data processing, machine learning, and real-time analytics.
- Extensive experience in ETL development, data integration, and data transformation processes.
- Strong knowledge of Python and SQL (advanced query writing, optimization, and troubleshooting).
- Experience with cloud platforms (Azure preferred) and hybrid data environments (on-premises and cloud).
- Familiarity with data governance and regulatory requirements in the insurance industry (e.g., Solvency II, IFRS 17).
- Strong problem-solving skills, with the ability to troubleshoot and resolve complex technical issues related to data architecture and performance.
- Excellent verbal and written communication skills, with the ability to work effectively with both technical and non-technical stakeholders.
Preferred Qualifications:
- Experience with other cloud-based data platforms (e.g., Azure Data Lake, Azure Synapse, AWS Redshift).
- Knowledge of machine learning workflows, leveraging Databricks for model training and deployment.
- Familiarity with insurance-specific data models and their use in finance, claims, and underwriting operations.
- Certifications in Azure Databricks, Microsoft Azure, DB2, or related technologies.
- Knowledge of additional reporting tools (e.g., Power BI, Tableau) is a plus.
Key Competencies:
- Technical Leadership: Ability to guide and mentor development teams in implementing best practices for data architecture and engineering.
- Analytical Skills: Strong analytical and problem-solving skills, with a focus on optimizing data systems for performance and scalability.
- Collaborative Mindset: Ability to work effectively in a cross-functional team, communicating complex technical solutions in simple terms to business stakeholders.
- Attention to Detail: Meticulous attention to detail, ensuring high-quality data output and system performance.
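To make the "complex SQL" requirement concrete, here is a hedged Spark SQL example of a claims query of the kind such a role tunes; the toy data stands in for a DB2 or warehouse table, and all names are hypothetical.

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("claims_sql_demo").getOrCreate()

# Toy claims data standing in for a real claims table.
spark.createDataFrame(
    [("P1", "C1", 1200.0, "2024-01-10"), ("P1", "C2", 800.0, "2024-03-02"),
     ("P2", "C3", 450.0, "2024-02-14")],
    ["policy_id", "claim_id", "claim_amount", "claim_date"],
).createOrReplaceTempView("claims")

# Keep each policy's most recent claim; window functions like this are
# a frequent performance-tuning target in claims reporting.
spark.sql("""
    SELECT policy_id, claim_id, claim_amount
    FROM (
        SELECT *, ROW_NUMBER() OVER (
                   PARTITION BY policy_id ORDER BY claim_date DESC) AS rn
        FROM claims
    ) AS ranked
    WHERE rn = 1
""").show()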

Posted 23 hours ago

Apply

3.0 - 7.0 years

0 - 1 Lacs

Pune, Chennai

Work from Office

Role & responsibilities:
- Design, develop, and maintain the company's data applications.
- Work with the product management team and analysts to build and deploy data-driven solutions.
- Develop and implement data models and data quality.
- Work with stakeholders to understand data needs and requirements.
- Provide technical leadership to the data engineering team.
- Stay up to date on the latest data engineering trends and technologies.
- Perform periodic code reviews to ensure that the code is rigorously designed, well coded, and effectively tuned for performance.
Experience: 3-7 years
Location: Pune, Chennai
Technical certification in multiple technologies is desirable.
Educational Qualifications: Engineering degree - BE / ME / BTech / MTech / B.Sc. / M.Sc.
Mandatory technical skills:
- Strong knowledge of SQL.
- Development experience on at least one ETL/ELT platform, building data pipelines (real-time, batch).
- Knowledge of distributed computing, big data concepts, and PySpark.
- Experience integrating and processing semi-structured/unstructured files, including data processing, data preparation, and data quality.
- Strong understanding of data warehouses and data lakes.
- Excellent communication and interpersonal skills.
- Ability to work under pressure and meet deadlines.
Good-to-have skills:
- Hands-on experience with at least one cloud hyperscaler's (AWS/Azure) data services.
- Hands-on experience in Azure Cloud (Azure HDInsight, Azure Data Factory, Synapse) or AWS Cloud (AWS EMR, Athena, Glue, Kinesis, Firehose, AWS Step Functions, Amazon QuickSight, Redshift).
- Experience designing and developing large-scale ETL data pipelines.
- Knowledge of data engineering tools and technologies.
- Experience handling healthcare clients and data.
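A small, hedged PySpark sketch of the semi-structured-file processing named in the mandatory skills; the paths and field names are hypothetical.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("semistructured_demo").getOrCreate()

# Read semi-structured JSON and flatten the nested fields needed downstream.
events = spark.read.json("/data/raw/events/")  # hypothetical path
flat = events.select(
    "event_id",
    F.col("payload.user.id").alias("user_id"),
    F.to_timestamp("payload.ts").alias("event_ts"),
)

# Basic data quality: drop records missing mandatory identifiers.
flat.dropna(subset=["event_id", "user_id"]).write.mode("overwrite").parquet(
    "/data/curated/events/"
)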

Posted 23 hours ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Microsoft Azure Architecture
Good-to-have skills: NA
Minimum 3 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to fulfill specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and troubleshooting to ensure that the applications function as intended, contributing to the overall success of the projects you are involved in.
Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions is required.
- Contribute to providing solutions to work-related problems.
- Assist in the documentation of application processes and workflows.
- Engage in continuous learning to stay updated with the latest technologies and best practices.
- Troubleshoot complex issues, implement solutions, and provide technical guidance to clients and team members.
- Play a key role in optimizing cloud infrastructure, ensuring security and compliance, and contributing to the development of best practices and documentation.
Professional & Technical Skills:
- Must-have skills: proficiency in Microsoft Azure Architecture.
- Strong understanding of cloud computing principles and services.
- Experience with application development frameworks and methodologies.
- Familiarity with DevOps practices and tools for continuous integration and deployment.
- Ability to troubleshoot and resolve application issues effectively.
- Willingness to learn, a positive attitude, and conflict-resolution skills.
- Ability to work in 24/7 shift rotation and handle on-call support.
- A strong foundation in Azure services and architecture, coupled with experience in troubleshooting, problem-solving, and communication; proficiency in Azure's compute, storage, networking, and identity services, as well as infrastructure-as-code (IaC) tools. Experience should also include data analysis and reporting capabilities, potentially with exposure to Azure Data Factory, Azure Databricks, or similar tools.
Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Azure Architecture.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education

Posted 23 hours ago

Apply

9.0 - 14.0 years

10 - 14 Lacs

Bengaluru

Work from Office

About The Role
Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must-have skills: Python (Programming Language), Kubernetes, Microsoft Azure Databricks
Good-to-have skills: NA
Minimum 7.5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education
Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team in implementing effective solutions. You will also engage in strategic planning to align application development with organizational goals, ensuring that all aspects of the project are executed efficiently and effectively.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Facilitate regular team meetings to discuss progress and address any roadblocks.
Professional & Technical Skills:
- Must-have skills: proficiency in Python (Programming Language), Microsoft Azure Databricks, and Kubernetes.
- Strong experience in application design and architecture.
- Familiarity with cloud computing concepts and services.
- Ability to troubleshoot and optimize application performance.
- Experience with version control systems such as Git.
Additional Information:
- The candidate should have a minimum of 9 years of experience in Python (Programming Language), Microsoft Azure Databricks, and Kubernetes.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.
Qualification: 15 years of full-time education

Posted 23 hours ago

Apply

2.0 - 4.0 years

3 - 7 Lacs

Noida

Work from Office

Position Summary: Perform Data Lake (Azure Databricks) operations on healthcare data from multiple sources. To succeed in this role, the candidate should be analytical and an excellent communicator. Experience in the healthcare industry is a plus, as is experience integrating data from disparate sources in MS SQL and Data Lake environments. You will be responsible for working with different stakeholders to accomplish business and operational goals.
Key Duties & Responsibilities:
- Data processing (ETL) using MS SQL, Data Lake (Azure Databricks), Python, Scala, and GitHub, with T-SQL stored procedures, views, and other database objects; import and export processing; data conversions; business process workflows; and metrics reporting.
- Providing client support services and enhancements.
- Controlling daily ticket resolution and prioritization as client and user volume increases; prioritizing issues based on client expectations, the volume of current tickets, and the visibility of issues across the enterprise.
- Analyzing the overall enterprise environment to find gaps, thinking outside the box to design and create functionality that will prove to be of value.
- Providing Data Lake (Databricks), Python, SQL, and Scala training to other technicians.
- Driving ticket resolution momentum and providing feedback to US leadership on where staff improvements can be made to better the overall productivity of the technicians.
- Managing Data Lake (Databricks), Python, Scala, and SQL database objects (stored procedures, views, synonyms, tables, and the overall schema), reporting, and administration.
Skills:
- 2-4 years of experience writing T-SQL and Data Lake (Databricks) code to triage issues, analyze data, and optimize database objects.
- 1-3 years of experience troubleshooting using T-SQL, Data Lake (Databricks), and GitHub.
- 1-2 years of experience in ETL flat-file/real-time message data loading.
Key Competencies:
- Takes full responsibility for meeting the client's level of satisfaction.
- Prioritizes work and sets realistic deadlines to ensure that important tasks are achieved on or ahead of time, with quality results.
- Shares own expertise with team members, while remaining open to others' ideas.
- Feels comfortable working in a changing environment.
- Identifies areas for process improvement and automation.
- Finds flexible and rapid solutions to meet the client's needs.
- Takes controlled risks, seeking support from team members when unsure.
- Helps team members with your expertise to achieve a common goal.

Posted 23 hours ago

Apply