
178 Azure Synapse Jobs - Page 2

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 8.0 years

20 - 25 Lacs

Hyderabad

Remote


Required Skills: Azure Synapse, Microsoft Fabric, Azure Data Factory (ADF), Azure Storage, PySpark, SQL, Azure Key Vault. Excellent communication skills are essential, as the role is client-facing and the L2 round of interviews will be conducted by the client.
Responsibilities:
• Design and implement scalable data pipelines using Microsoft Fabric, including Dataflows Gen2, Lakehouse, Notebooks and SQL endpoints.
• Develop ETL/ELT solutions using PySpark, T-SQL and Spark Notebooks within Fabric and Azure Synapse.
• Manage and optimize data storage and compute in OneLake, supporting Lakehouse and Warehouse use cases.
• Implement and manage Azure Key Vault for secure handling of secrets, credentials and connection strings.
• Configure and manage CI/CD pipelines for data engineering projects using Azure DevOps, including automated deployment of Fabric assets.
• Integrate data from diverse sources including SQL Server, Azure Blob Storage, REST APIs and on-premises systems.
• Collaborate closely with business teams and Power BI developers to ensure data models support reporting and self-service needs.
• Monitor and troubleshoot data pipeline performance, data quality and failure recovery.
• Contribute to architecture design, governance processes and performance tuning.
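As a rough illustration of the Key Vault and Lakehouse-ingestion responsibilities above, here is a minimal PySpark sketch of one ingestion step. It assumes a Spark environment with Delta support (Fabric, Synapse Spark or Databricks); the vault URL, secret name, storage path and table name are hypothetical placeholders, not details from this listing.

```python
# Minimal sketch: fetch a source-system secret from Azure Key Vault, then land a
# raw extract into a Delta (bronze) table. All names and paths are illustrative.
from azure.identity import DefaultAzureCredential
from azure.keyvault.secrets import SecretClient
from pyspark.sql import SparkSession, functions as F

credential = DefaultAzureCredential()
vault = SecretClient(vault_url="https://example-kv.vault.azure.net", credential=credential)
source_conn = vault.get_secret("source-sql-connection-string").value  # would feed a JDBC read (omitted)

spark = SparkSession.builder.getOrCreate()

raw = (
    spark.read.option("header", True)
    .csv("abfss://raw@examplestorage.dfs.core.windows.net/sales/")  # example landing path
    .withColumn("ingested_at", F.current_timestamp())
)
raw.write.format("delta").mode("append").saveAsTable("lakehouse.bronze_sales")
```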

Posted 6 days ago

Apply

6.0 - 11.0 years

12 - 17 Lacs

Gurugram

Work from Office


Job Responsibilities / skill set needed from the resource:
• Data Architecture and Management: Understanding of Azure SQL technology, including SQL databases, operational data stores, and data transformation processes.
• Azure Data Factory: Expertise in using Azure Data Factory for ETL processes, including creating and managing pipelines.
• Python Programming: Proficiency in writing Python scripts, particularly using the pandas library, for data cleaning and transformation tasks.
• Azure Functions: Experience with Azure Functions for handling and processing Excel files, making them suitable for database import.
• API Integration: Skills in integrating various data sources, including APIs, into the data warehouse.
• BPO experience is mandatory.
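To make the pandas-based cleaning and Excel-handling duties above concrete, here is a small, hedged sketch of the kind of routine involved; the file path, table name and connection string are illustrative placeholders, and hosting it inside an Azure Function is only implied, not shown.

```python
# Sketch: clean an uploaded Excel extract with pandas and load it into Azure SQL.
# File path, connection string and table name are illustrative placeholders.
import pandas as pd
from sqlalchemy import create_engine


def clean_and_load(excel_path: str, conn_str: str, table: str = "staging_orders") -> int:
    df = pd.read_excel(excel_path)                    # needs openpyxl for .xlsx files
    df.columns = [c.strip().lower().replace(" ", "_") for c in df.columns]
    df = df.dropna(how="all").drop_duplicates()       # basic cleansing
    df["load_date"] = pd.Timestamp.now(tz="UTC")

    engine = create_engine(conn_str)                  # e.g. an mssql+pyodbc connection URL
    df.to_sql(table, engine, if_exists="append", index=False)
    return len(df)
```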

Posted 1 week ago

Apply

5.0 - 9.0 years

1 - 1 Lacs

Visakhapatnam, Hyderabad, Vizianagaram

Work from Office


Role & responsibilities:
• 5+ years of experience in data engineering or a related field.
• Strong hands-on experience with Azure Synapse Analytics and Azure Data Factory (ADF).
• Proven experience with Databricks, including development in PySpark or Scala.
• Proficiency in DBT for data modeling and transformation.
• Expertise in analytics and reporting: able to develop Power BI models, build interactive BI reports, and set up row-level security (RLS) in Power BI.
• Expertise in SQL and performance-tuning techniques.
• Strong understanding of data warehousing concepts and ETL/ELT design patterns.
• Experience working in Agile environments and familiarity with Git-based version control.
• Strong communication and collaboration skills.
Preferred candidate profile:
• Experience with CI/CD tools and DevOps for data engineering.
• Familiarity with Delta Lake and Lakehouse architecture.
• Exposure to other Azure services such as Azure Data Lake Storage (ADLS), Azure Key Vault, and Azure DevOps.
• Experience with data quality frameworks or tools.

Posted 1 week ago

Apply

8.0 - 12.0 years

12 - 22 Lacs

Hyderabad, Secunderabad

Work from Office


Proficiency in SQL, Python, and data pipeline frameworks such as Apache Spark, Databricks, or Airflow. Hands-on experience with cloud data platforms (e.g., Azure Synapse, AWS Redshift, Google BigQuery). Strong understanding of data modeling, ETL/ELT, and data lake, data warehouse and data mart architectures. Knowledge of Azure Data Factory or AWS Glue. Experience in developing reports and dashboards using tools like Power BI, Tableau, or Looker.

Posted 1 week ago

Apply

6.0 - 11.0 years

20 - 27 Lacs

Pune

Hybrid


6+ years of experience as a Data Engineer; expertise in the Azure platform (Azure SQL DB, ADF and Azure Synapse); 5+ years of experience in database development using SQL; knowledge of data modeling, ETL processes and data warehouse design principles.

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Pune

Work from Office


As an Azure/SQL Data Analytics Consultant, expect to be:
• Working on projects that utilize products within the Microsoft Azure and SQL Data Analytics stack
• Satisfying the expectations and requirements of customers, both internal and external
Required candidate profile
Core: Azure Data Platform, SQL Server (T-SQL), Data Analytics (SSIS, SSAS, SSRS), Power BI, Synapse
Supporting: Azure ML, Azure infrastructure, Python, Data Factory
Principles: Data Modelling, Data Warehouse Theory

Posted 1 week ago

Apply

6.0 - 10.0 years

15 - 22 Lacs

Chennai

Work from Office


Job Title: Data Engineering Lead | Experience: 6-8 years | Location: Chennai | Work Mode: Work from Office (all 5 days) | Shift: General | Budget: up to 24 LPA | Immediate joiners required. Mail CVs to triveni2@elabsinfotech.com
Mandatory skills:
• Data Engineer with strong ETL experience
• Azure Data Factory, Azure Synapse and Databricks (all mandatory)
• Power BI (at least 1 year of experience)
• Azure Cloud
• Must have managed a team of at least 5
• Good communication skills

Posted 1 week ago

Apply

5.0 - 8.0 years

15 - 22 Lacs

Noida, Bengaluru, Delhi / NCR

Hybrid


Hi candidates, we have an opportunity with one of the leading IT consulting groups for a Data Engineer role. Interested candidates can mail their CVs to Abhishek.saxena@mounttalent.com
Job description - what we're looking for in a Data Engineer III:
• 5+ years of experience with ETL processes and data warehouse architecture
• 5+ years of experience with Azure data services, i.e. ADF, ADLS Gen 2, Azure SQL DB, Synapse, Azure Databricks, Microsoft Fabric
• 5+ years of experience designing business intelligence solutions
• Strong proficiency in SQL and Python/PySpark
• Implementation experience with the Medallion architecture and Delta Lake (or Lakehouse)
• Experience with cloud-based data platforms, preferably Azure
• Familiarity with big data technologies and data warehousing concepts
• Working knowledge of Azure DevOps and CI/CD (build and release)
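For context on the Medallion architecture and Delta Lake experience asked for above, here is a minimal sketch of one bronze-to-silver refinement hop in PySpark; the table names, columns and cleansing rules are assumptions for illustration, not part of the job description.

```python
# Sketch of one Medallion-architecture hop: refine a raw bronze Delta table into a
# cleansed, de-duplicated silver table. Table and column names are illustrative.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

bronze = spark.read.table("bronze.customer_events")

silver = (
    bronze
    .filter(F.col("customer_id").isNotNull())             # drop unusable rows
    .withColumn("event_ts", F.to_timestamp("event_ts"))   # enforce types
    .dropDuplicates(["customer_id", "event_ts"])          # tolerate re-loads
)

(silver.write.format("delta")
    .mode("overwrite")
    .option("overwriteSchema", "true")
    .saveAsTable("silver.customer_events"))
```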

Posted 1 week ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Mumbai

Work from Office


Senior Azure Data Engineer - L1 Support

Posted 1 week ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Pune

Work from Office


What You'll Do
The Global Analytics and Insights (GAI) team is seeking an experienced Data Visualization Manager to lead our data-driven decision-making initiatives. The ideal candidate will have a strong background in Power BI, expert-level SQL proficiency, demonstrated leadership and mentoring experience, and the ability to drive innovation, deliver actionable insights and manage complex projects. You will become an expert in Avalara's financial, marketing, sales, and operations data. This position reports to a Senior Manager.
What Your Responsibilities Will Be
• Define and execute the organization's BI strategy, ensuring alignment with business goals.
• Lead, mentor, and manage a team of BI developers and analysts, fostering continuous learning.
• Develop and implement robust data visualization and reporting solutions using Power BI.
• Optimize data models, dashboards, and reports to provide meaningful insights and support decision-making.
• Collaborate with business leaders, analysts, and cross-functional teams to gather and translate requirements into actionable BI solutions.
• Be a trusted advisor to business teams, identifying opportunities where BI can drive efficiencies and improvements.
• Ensure data accuracy, consistency, and integrity across multiple data sources.
• Stay updated with the latest advancements in BI tools, SQL performance tuning, and data visualization best practices.
• Define and enforce BI development standards, governance, and documentation best practices.
• Work closely with Data Engineering teams to define and maintain scalable data pipelines.
• Drive automation and optimization of reporting processes to improve efficiency.
What You'll Need to Be Successful
• 8+ years of experience in Business Intelligence, Data Analytics, or related fields.
• 5+ years of expert proficiency in Power BI, including DAX, Power Query, data modeling, and dashboard creation.
• 5+ years of strong SQL skills, with experience in writing complex queries, performance tuning, and working with large datasets.
• Familiarity with cloud-based BI solutions (e.g., Azure Synapse, AWS Redshift, Snowflake) is a plus.
• Understanding of ETL processes and data warehousing concepts.
• Strong problem-solving, analytical thinking, and decision-making skills.

Posted 1 week ago

Apply

7.0 - 12.0 years

8 - 18 Lacs

Kolkata

Remote


Position: Sr Azure Data Engineer | Location: Remote | Time: CET hours
Role & responsibilities
We are seeking a highly skilled Senior Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in Microsoft Azure, Microsoft Fabric, Azure SQL, Azure Synapse, Python, and Power BI; knowledge of Oracle DB and data replication tools is preferred. This role involves designing, developing, and maintaining robust data pipelines and ensuring efficient data processing and integration across various platforms. The candidate understands the stated needs and requirements of stakeholders, produces high-quality deliverables, monitors their own work to ensure delivery within the desired performance standards, and understands the importance of delivering within the expected time, budget and quality standards. Good communication skills and a team player.
• Design and Development: Architect, develop, and maintain scalable data pipelines using Microsoft Fabric and Azure services, including Azure SQL and Azure Synapse.
• Data Integration: Integrate data from multiple sources, ensuring data consistency, quality, and availability using data replication tools.
• Data Management: Manage and optimize databases, ensuring high performance and reliability.
• ETL Processes: Develop and maintain ETL processes to transform data into actionable insights.
• Data Analysis: Use Python and other tools to analyze data, create reports, and provide insights to support business decisions.
• Visualization: Develop and maintain dashboards and reports in Power BI to visualize complex data sets.
• Performance Tuning: Optimize database performance and troubleshoot issues related to data processing and integration.
Preferred candidate profile
• Minimum 7 years of experience in data engineering or a related field.
• Proven experience with Microsoft Azure services, including Microsoft Fabric, Azure SQL and Azure Synapse.
• Strong proficiency in Python for data analysis and scripting.
• Extensive experience with Power BI for data visualization.
• Knowledge of Oracle DB and experience with data replication tools.
• Proficient in SQL and database management.
• Experience with ETL tools and processes.
• Strong understanding of data warehousing concepts and architectures.
• Familiarity with cloud-based data platforms and services.
• Analytical skills: ability to analyze complex data sets and provide actionable insights.
• Problem-solving: strong problem-solving skills and the ability to troubleshoot data-related issues.

Posted 1 week ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

Chandigarh, Pune, Bengaluru

Work from Office


Responsibilities
A day in the life of an Infoscion:
• As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution and ensuring high customer satisfaction.
• You will understand requirements, create and review designs, validate the architecture and ensure high levels of service offerings to clients in the technology domain.
• You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews.
• You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management and adherence to organizational guidelines and processes.
• You will be a key contributor to building efficient programs/systems.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Technical and Professional Requirements:
Primary skills: Technology -> Cloud Platform -> Azure Analytics Services -> Azure Data Lake
Preferred skills: Technology -> Cloud Platform -> Azure Analytics Services -> Azure Data Lake

Posted 1 week ago

Apply

3.0 - 5.0 years

7 - 10 Lacs

Pune

Work from Office


Job Title: Data Engineer | Location: Pune, India (on-site) | Experience: 3-5 years | Employment Type: Full-time
Job Summary
We are looking for a hands-on Data Engineer who can design and build modern Lakehouse solutions on Microsoft Azure. You will own data ingestion from source-system APIs through Azure Data Factory into OneLake, curate bronze/silver/gold layers on Delta Lake, and deliver dimensional models that power analytics at scale.
Key Responsibilities
• Build secure, scalable Azure Data Factory pipelines that ingest data from APIs, files, and databases into OneLake.
• Curate raw data into Delta Lake tables on ADLS Gen2 using the Medallion (bronze/silver/gold) architecture, ensuring ACID compliance and optimal performance.
• Develop and optimize SQL/Spark SQL transformations in Microsoft Fabric Warehouse / Lakehouse environments.
• Apply dimensional-modelling best practices (star/snowflake schemas, surrogate keys, SCDs) to create analytics-ready datasets (a hedged SCD sketch follows this listing).
• Implement monitoring, alerting, lineage, and CI/CD (Git/Azure DevOps) for all pipelines and artifacts.
• Document data flows, data dictionaries, and operational runbooks.
Must-Have Technical Skills
• Microsoft Fabric & Lakehouse experience
• Microsoft Fabric Warehouse / Azure Synapse experience
• Data Factory: building, parameterizing, and orchestrating API-driven ingestion pipelines
• ADLS Gen2 + Delta Lake
• Strong SQL: advanced querying, tuning, and procedural extensions (T-SQL / Spark SQL)
• Data warehousing & dimensional modelling concepts
Good-to-Have Skills
• Python (PySpark, automation, data-quality checks)
• Unix/Linux shell scripting
• DevOps (Git, Azure DevOps)
Education & Certifications
• BE / B.Tech in Computer Science, Information Systems, or a related field
• Preferred: Microsoft DP-203 Azure Data Engineer Associate
Soft Skills
• Analytical, detail-oriented, and proactive problem solver
• Clear written and verbal communication; ability to simplify complex topics
• Collaborative and adaptable within agile, cross-functional teams
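Since this listing calls out dimensional modelling with surrogate keys and SCDs on Delta Lake, here is a minimal, hedged sketch of a Slowly Changing Dimension Type 2 load in PySpark with the delta-spark API; the table names, tracked columns and the two-step close-then-append approach are illustrative assumptions, not the employer's actual design.

```python
# Sketch of an SCD Type 2 load on a Delta dimension table, in two steps:
# (1) expire current rows whose tracked attributes changed, (2) append new versions.
# Table and column names are illustrative.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()
updates = spark.read.table("silver.customer")              # latest curated snapshot
dim = DeltaTable.forName(spark, "gold.dim_customer")

# 1) Close out current rows whose tracked attributes differ from the snapshot.
(dim.alias("d")
    .merge(updates.alias("u"), "d.customer_id = u.customer_id AND d.is_current = true")
    .whenMatchedUpdate(
        condition="d.email <> u.email OR d.city <> u.city",
        set={"is_current": "false", "valid_to": "current_timestamp()"})
    .execute())

# 2) Append new current versions for changed or brand-new customers.
still_current = spark.read.table("gold.dim_customer").filter("is_current = true")
new_rows = (updates.join(still_current, "customer_id", "left_anti")
            .withColumn("is_current", F.lit(True))
            .withColumn("valid_from", F.current_timestamp())
            .withColumn("valid_to", F.lit(None).cast("timestamp")))
new_rows.write.format("delta").mode("append").saveAsTable("gold.dim_customer")
```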

Posted 1 week ago

Apply

15.0 - 20.0 years

4 - 8 Lacs

Hyderabad

Work from Office


Project Role: Data Engineer
Project Role Description: Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems.
Must-have skills: Microsoft Azure Data Services
Good-to-have skills: NA
Minimum 2 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to effectively migrate and deploy data across various systems, contributing to the overall efficiency and reliability of data management within the organization.
Roles & Responsibilities:
• Expected to perform independently and become an SME.
• Active participation/contribution in team discussions is required.
• Contribute to providing solutions to work-related problems.
• Collaborate with cross-functional teams to gather requirements and deliver data solutions that meet business needs.
• Monitor and optimize data pipelines for performance and reliability.
Professional & Technical Skills:
• Must-have skills: Proficiency in Microsoft Azure Data Services.
• Good-to-have skills: Experience with Azure Data Factory, Azure SQL Database, and Azure Synapse Analytics.
• Strong understanding of data modeling and database design principles.
• Experience with data integration and ETL tools.
• Familiarity with data governance and data quality best practices.
Additional Information:
• The candidate should have a minimum of 2 years of experience in Microsoft Azure Data Services.
• This position is based at our Hyderabad office.
• 15 years of full-time education is required.

Posted 1 week ago

Apply

12.0 - 14.0 years

20 - 30 Lacs

Indore, Hyderabad

Work from Office


Microsoft Fabric Data Engineer
Experience: 12-14 years | Location: Hyderabad/Indore | Notice Period: Immediate
Primary skill: Microsoft Fabric | Secondary skill: Azure Data Factory (ADF)
• 12+ years of experience in Microsoft Azure data engineering for analytical projects.
• Proven expertise in designing, developing, and deploying high-volume, end-to-end ETL pipelines for complex models, including batch and real-time data integration frameworks using Azure, Microsoft Fabric and Databricks.
• Extensive hands-on experience with Azure Data Factory, Databricks (with Unity Catalog), Azure Functions, Synapse Analytics, Data Lake, Delta Lake, and Azure SQL Database for managing and processing large-scale data integrations.
• Experience in Databricks cluster optimization and workflow management to ensure cost-effective and high-performance processing.
• Sound knowledge of data modelling, data governance, data quality management, and data modernization processes.
• Develop architecture blueprints and technical design documentation for Azure-based data solutions.
• Provide technical leadership and guidance on cloud architecture best practices, ensuring scalable and secure solutions.
• Keep abreast of emerging Azure technologies and recommend enhancements to existing systems.
• Lead proofs of concept (PoCs) and adopt agile delivery methodologies for solution development and delivery.

Posted 1 week ago

Apply

3.0 - 8.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Project Role: Software Development Engineer
Project Role Description: Analyze, design, code and test multiple components of application code across one or more clients. Perform maintenance, enhancements and/or development work.
Must-have skills: Microsoft Azure Data Services
Good-to-have skills: NA
Minimum 3 year(s) of experience is required
Educational Qualification: 15 years of full-time education
Summary: As a Software Development Engineer, you will engage in a dynamic work environment where you will analyze, design, code, and test various components of application code across multiple clients. Your day will involve collaborating with team members to ensure the successful implementation of software solutions, while also performing maintenance and enhancements to existing applications. You will be responsible for delivering high-quality code and contributing to the overall success of the projects you are involved in, ensuring that client requirements are met effectively and efficiently.
Roles & Responsibilities:
• Expected to perform independently and become an SME.
• Active participation/contribution in team discussions is required.
• Contribute to providing solutions to work-related problems.
• Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
• Conduct code reviews to ensure adherence to best practices and coding standards.
Professional & Technical Skills:
• Must-have skills: Proficiency in Microsoft Azure Data Services; hands-on experience with Azure Synapse and PySpark.
• Strong understanding of cloud computing concepts and architecture.
• Experience with data integration and ETL processes.
• Familiarity with database management systems and data modeling.
• Ability to troubleshoot and resolve technical issues efficiently.
Additional Information:
• The candidate should have a minimum of 3 years of experience in Microsoft Azure Data Services.
• This position is based at our Bengaluru office.
• 15 years of full-time education is required.

Posted 1 week ago

Apply

2.0 - 7.0 years

5 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Job description: Hiring for an Azure developer with 2 to 9 years of experience.
Mandatory skills: Azure, ADF, ADB, Azure Synapse
Education: BE/B.Tech/BCA/B.Sc/MCA/M.Tech/M.Sc/MS
Location: Pan India
Responsibilities - a day in the life of an Infoscion:
• As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design and deployment.
• You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs.
• You will create requirement specifications from the business needs, and define the to-be processes and detailed functional designs based on requirements.
• You will support configuring solution requirements on the products; diagnose the root cause of any issues, seek clarifications, and then identify and shortlist solution alternatives.
• You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.
If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office


2 Data Engineers - Azure Synapse/ADF, Workiva
• Manage and maintain the associated Workiva connector, chains, tables and queries, making updates as needed as new metrics or requirements are identified.
• Develop functional and technical requirements for any changes impacting wData (Workiva Data).
• Configure and unit test any changes impacting wData (connector, chains, tables, queries).
• Promote wData changes.

Posted 1 week ago

Apply

4.0 - 9.0 years

15 - 30 Lacs

Gurugram, Chennai

Work from Office


Role & responsibilities
• Assume ownership of data engineering projects from inception to completion.
• Implement fully operational Unified Data Platform solutions in production environments using technologies like Databricks, Snowflake, Azure Synapse etc.
• Showcase proficiency in data modelling and data architecture.
• Utilize modern data transformation tools such as DBT (Data Build Tool) to streamline and automate data pipelines (nice to have).
• Implement DevOps practices for continuous integration and deployment (CI/CD) to ensure robust and scalable data solutions (nice to have).
• Maintain code versioning and collaborate effectively within a version-controlled environment.
• Familiarity with data ingestion and orchestration tools such as Azure Data Factory, Azure Synapse, AWS Glue etc.
• Set up processes for data management and templatized analytical modules/deliverables.
• Continuously improve processes with a focus on automation and partner with different teams to develop system capability.
• Proactively seek opportunities to help and mentor team members by sharing knowledge and expanding skills.
• Communicate effectively with internal and external stakeholders.
• Coordinate with cross-functional team members to ensure high quality in deliverables with no impact on timelines.
Preferred candidate profile
• Expertise in programming languages such as Python and advanced SQL.
• Working knowledge of data warehousing, data marts and business intelligence, with hands-on experience implementing fully operational data warehouse solutions in production environments.
• 3+ years of working knowledge of big data tools (Hive, Spark) along with ETL tools and cloud platforms.
• 3+ years of relevant experience in either Snowflake or Databricks; certification in Snowflake or Databricks is highly recommended.
• Proficient in data modelling and ELT techniques.
• Experienced with any of the ETL/data pipeline orchestration tools such as Azure Data Factory, AWS Glue, Azure Synapse, Airflow etc.
• Experience ingesting data from different data sources such as RDBMS, ERP systems, APIs etc.
• Knowledge of modern data transformation tools, particularly DBT (Data Build Tool), for streamlined and automated data pipelines (nice to have).
• Experience implementing DevOps practices for CI/CD to ensure robust and scalable data solutions (nice to have).
• Proficient in maintaining code versioning and effective collaboration within a version-controlled environment.
• Ability to work effectively as an individual contributor and in small teams; experience mentoring junior team members.
• Excellent problem-solving and troubleshooting ability, with experience supporting and working with cross-functional teams in a dynamic environment.
• Strong verbal and written communication skills, with the ability to communicate effectively and articulate results and issues to internal and client teams.

Posted 1 week ago

Apply

3.0 - 8.0 years

6 - 18 Lacs

Kochi

Work from Office


Looking for a Data Engineer with 3+ yrs exp in Azure Data Factory, Synapse, Data Lake, Databricks, SQL, Python, Spark, CI/CD. Preferred: DP-203 cert, real-time data tools (Kafka, Stream Analytics), data governance (Purview), Power BI.

Posted 1 week ago

Apply

2.0 - 4.0 years

10 - 15 Lacs

Pune

Work from Office


Role & responsibilities
• Develop and maintain data pipelines: Design, develop, and manage scalable ETL pipelines to process large datasets using PySpark, Databricks, and other big data technologies.
• Data integration and transformation: Work with various structured and unstructured data sources to build efficient data workflows and integrate them into a central data warehouse.
• Collaborate with data scientists and analysts: Work closely with the data science and business intelligence teams to ensure the right data is available for advanced analytics, machine learning, and reporting.
• Optimize performance: Optimize and tune data pipelines and ETL processes to improve data throughput and reduce latency, ensuring timely delivery of high-quality data.
• Automation and monitoring: Implement automated workflows and monitoring tools to ensure data pipelines are running smoothly and issues are proactively addressed.
• Ensure data quality: Build and maintain validation mechanisms to ensure the accuracy and consistency of the data (a hedged example follows this listing).
• Data storage and access: Work with data storage solutions (e.g., Azure, AWS, Google Cloud) to ensure effective data storage and fast access for downstream users.
• Documentation and reporting: Maintain proper documentation for all data processes and architectures to facilitate easier understanding and onboarding of new team members.
Skills and qualifications
• Experience: 5+ years of experience as a Data Engineer or in a similar role, with hands-on experience in designing, building, and maintaining ETL pipelines.
• Technologies: Proficient in PySpark for large-scale data processing; strong programming experience in Python, particularly for data engineering tasks; experience working with Databricks for big data processing and collaboration; hands-on experience with data storage solutions (e.g., AWS S3, Azure Data Lake, or Google Cloud Storage); solid understanding of ETL concepts, tools, and best practices; familiarity with SQL for querying and manipulating data in relational databases; experience with data orchestration tools such as Apache Airflow or Luigi is a plus.
• Data modeling and warehousing: Experience with data warehousing concepts and technologies (e.g., Redshift, Snowflake, or BigQuery); knowledge of data modeling, data transformations, and dimensional modeling.
• Soft skills: Strong analytical and problem-solving skills; excellent communication skills, capable of explaining complex data processes to non-technical stakeholders; ability to work in a fast-paced, collaborative environment and manage multiple priorities.
Preferred qualifications
• Bachelor's or master's degree in computer science, engineering, or a related field.
• Certification or experience with cloud platforms like AWS, Azure, or Google Cloud.
• Experience with Apache Kafka or other stream-processing technologies.
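As referenced above, here is a minimal sketch of the kind of PySpark data-quality gate such a pipeline might include; the column names, metrics and failure policy are illustrative assumptions only.

```python
# Sketch of a lightweight PySpark data-quality check for an ETL pipeline step.
# Column names, metrics and thresholds are illustrative assumptions.
from pyspark.sql import DataFrame, functions as F


def quality_metrics(df: DataFrame, key_cols: list, not_null_cols: list) -> dict:
    """Return simple quality metrics the pipeline can use to pass or fail a load."""
    total = df.count()
    duplicate_keys = total - df.dropDuplicates(key_cols).count()
    null_counts = {c: df.filter(F.col(c).isNull()).count() for c in not_null_cols}
    return {"row_count": total, "duplicate_keys": duplicate_keys, "null_counts": null_counts}


# Example usage inside a pipeline step (hypothetical DataFrame and columns):
# metrics = quality_metrics(orders_df, key_cols=["order_id"], not_null_cols=["order_id", "order_date"])
# if metrics["duplicate_keys"] or any(metrics["null_counts"].values()):
#     raise ValueError(f"Data quality check failed: {metrics}")
```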

Posted 1 week ago

Apply

8.0 - 12.0 years

6 - 14 Lacs

Mumbai, Hyderabad, Pune

Work from Office


Job Description: 5+ years in data engineering, with at least 2 years on Azure Synapse. Strong SQL, Spark, and Data Lake integration experience. Familiarity with Azure Data Factory, Power BI, and DevOps pipelines. Experience in AMS or managed-services environments is a plus.
Detailed JD:
• Design, develop, and maintain data pipelines using Azure Synapse Analytics.
• Collaborate with the customer to ensure SLA adherence and incident resolution.
• Optimize Synapse SQL pools for performance and cost.
• Implement data security, access control, and compliance measures.
• Participate in calibration and transition phases with client stakeholders.

Posted 1 week ago

Apply

5.0 - 9.0 years

12 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Job description: We are looking for Azure Data Engineers with a minimum of 5 to 9 years of experience.
Role & responsibilities
• Blend technical expertise (5 to 9 years of experience) with analytical problem-solving and collaboration with cross-functional teams.
• Design and implement Azure data engineering solutions (ingestion & curation).
• Create and maintain Azure data solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage.
• Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure.
• Create and maintain ETL (Extract, Transform, Load) operations utilizing Azure Data Factory or comparable technologies.
• Use Azure Data Factory and Databricks to assemble large, complex data sets.
• Ensure the quality, integrity, and dependability of the data by implementing data validation and cleansing procedures.
• Ensure data quality, security and compliance.
• Optimize Azure SQL databases for efficient query performance.
• Collaborate with data engineers and other stakeholders to understand requirements and translate them into scalable and reliable data platform architectures.

Posted 1 week ago

Apply

7.0 - 12.0 years

18 - 30 Lacs

Bengaluru

Work from Office


Urgently hiring for a Senior Azure Data Engineer
Job location: Bangalore | Minimum experience: 7+ years total, with at least 4 years relevant
Keywords: Databricks, PySpark, Scala, SQL, live/streaming data, batch data processing
Share your CV at siddhi.pandey@adecco.com or call 6366783349
Roles and responsibilities: The Data Engineer will work on data engineering projects for various business units, focusing on delivery of complex data management solutions by leveraging industry best practices. They work with the project team to build the most efficient data pipelines and data management solutions that make data easily available for consuming applications and analytical solutions. A Data Engineer is expected to possess strong technical skills.
Key characteristics
• Technology champion who constantly pursues skill enhancement and has an inherent curiosity to understand work from multiple dimensions.
• Interest and passion in big data technologies, and appreciation of the value an effective data management solution can bring.
• Has worked on real data challenges and handled high volume, velocity, and variety of data.
• Excellent analytical and problem-solving skills, with willingness to take ownership and resolve technical challenges.
• Contributes to community-building initiatives like CoE, CoP.
Mandatory skills
• Azure (master level)
• ELT, data modeling, data integration & ingestion, data manipulation and processing
• GitHub, GitHub Actions, Azure DevOps
• Data Factory, Databricks, SQL DB, Synapse, Stream Analytics, Glue, Airflow, Kinesis, Redshift, SonarQube, PyTest
Optional skills
• Experience in project management and running a scrum team.
• Experience working with BPC, Planning.
• Exposure to working with an external technical ecosystem.
• MkDocs documentation
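Given the emphasis on live/streaming plus batch data with Databricks and PySpark, here is a minimal, hedged Structured Streaming sketch; the Kafka broker, topic, checkpoint path and target table are illustrative placeholders, not details from the listing.

```python
# Sketch: a Structured Streaming job that reads events from Kafka and appends them
# to a Delta table with checkpointing. Broker, topic, paths and table names are
# illustrative placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")   # placeholder broker
    .option("subscribe", "orders")                       # placeholder topic
    .load()
    .select(
        F.col("value").cast("string").alias("payload"),
        F.col("timestamp").alias("event_time"),
    )
)

query = (
    events.writeStream.format("delta")
    .option("checkpointLocation", "abfss://chk@examplestorage.dfs.core.windows.net/orders/")
    .outputMode("append")
    .trigger(processingTime="1 minute")
    .toTable("bronze.orders_stream")
)
query.awaitTermination()
```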

Posted 1 week ago

Apply

8.0 - 13.0 years

7 - 12 Lacs

Noida

Work from Office


Job Overview
We are looking for a Data Engineer who will be part of our Analytics Practice and will be expected to work actively in a multi-disciplinary, fast-paced environment. This role requires a broad range of skills and the ability to step into different roles depending on the size and scope of the project; its primary responsibility is the acquisition, transformation, loading and processing of data from a multitude of disparate data sources, including structured and unstructured data, for advanced analytics and machine learning in a big data environment.
Responsibilities:
• Engineer a modern data pipeline to collect, organize, and process data from disparate sources.
• Perform data management tasks, such as data profiling, assessing data quality, and writing SQL queries to extract and integrate data.
• Develop efficient data collection systems and sound strategies for getting quality data from different sources.
• Consume and analyze data from the data pool to support inference, prediction and recommendation of actionable insights for business growth.
• Design and develop ETL processes using tools and scripting; troubleshoot and debug ETL processes; performance-tune and optimize the ETL processes.
• Provide support to new or existing applications while recommending best practices and leading projects to implement new functionality.
• Collaborate in design reviews and code reviews to ensure standards are met; recommend new standards for visualizations.
• Learn and develop new ETL techniques as required to keep up with contemporary technologies.
• Review solution requirements and architecture to ensure selection of appropriate technology, efficient use of resources and integration of multiple systems and technologies.
• Support presentations to customers and partners; advise on new technology trends and possible adoption to maintain competitive advantage.
Experience needed:
• 8+ years of related experience is required; a BS or Master's degree in Computer Science or a related technical discipline is required.
• ETL experience with data integration to support data marts, extracts and reporting; experience connecting to varied data sources.
• Excellent SQL coding experience with performance optimization for data queries; understands different data models such as normalized, de-normalized, star, and snowflake; has worked with transactional, temporal, time-series, and structured and unstructured data.
• Experience with Azure Data Factory and Azure Synapse Analytics.
• Has worked in big data environments, cloud data stores, different RDBMS and OLAP solutions.
• Experience in cloud-based ETL development processes, and in deployment and maintenance of ETL jobs.
• Familiar with the principles and practices involved in development and maintenance of software solutions and architectures and in service delivery; strong technical background, staying current with technology and industry developments.
• At least 3 years of demonstrated success in software engineering, release engineering, and/or configuration management.
• Highly skilled in scripting languages like PowerShell; substantial experience in the implementation and execution of CI/CD processes.
Additional:
• Demonstrated ability to successfully complete multiple, complex technical projects.
• Prior experience with application delivery using an onshore/offshore model.
• Experience with business processes across multiple master data domains in a services-based company.
• Demonstrates a rational and organized approach to tasks undertaken and an awareness of the need to achieve quality; demonstrates high standards of professional behavior in dealings with clients, colleagues and staff.
• Able to make sound and far-reaching decisions alone on major issues and to take full responsibility for them on a technical basis.
• Strong written communication skills; effective and persuasive in both written and oral communication.
• Experience gathering end-user requirements and writing technical documentation.
• Time management and multitasking skills to effectively meet deadlines under time-to-market pressure.

Posted 1 week ago

Apply

Exploring Azure Synapse Jobs in India

The Azure Synapse job market in India is currently experiencing a surge in demand as organizations increasingly adopt cloud solutions for their data analytics and business intelligence needs. With the growing reliance on data-driven decision-making, professionals with expertise in Azure Synapse are highly sought after in the job market.

Top Hiring Locations in India

  1. Bangalore
  2. Mumbai
  3. Hyderabad
  4. Pune
  5. Chennai

Average Salary Range

The average salary range for Azure Synapse professionals in India varies based on experience levels. Entry-level positions can expect to earn around INR 6-8 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.

Career Path

A typical career path in Azure Synapse may include roles such as Junior Developer, Senior Developer, Tech Lead, and Architect. As professionals gain experience and expertise in the platform, they can progress to higher-level roles with more responsibilities and leadership opportunities.

Related Skills

In addition to expertise in Azure Synapse, professionals in this field are often expected to have knowledge of SQL, data warehousing concepts, ETL processes, data modeling, and cloud computing principles. Strong analytical and problem-solving skills are also essential for success in Azure Synapse roles.

Interview Questions

  • What is Azure Synapse Analytics and how does it differ from Azure Data Factory? (medium)
  • Can you explain the differences between a Data Warehouse and a Data Lake? (basic)
  • How do you optimize data loading and querying performance in Azure Synapse? (advanced)
  • What is PolyBase in Azure Synapse and how is it used for data integration? (medium)
  • How do you handle security and compliance considerations in Azure Synapse? (advanced)
  • Explain the concept of serverless SQL pools in Azure Synapse. (medium)
  • What are the different components of an Azure Synapse workspace? (basic)
  • How do you monitor and troubleshoot performance issues in Azure Synapse? (advanced)
  • Describe your experience with building data pipelines in Azure Synapse. (medium)
  • Can you walk us through a recent project where you used Azure Synapse for data analysis? (advanced)
  • How do you ensure data quality and integrity in Azure Synapse? (medium)
  • What are the key features of Azure Synapse Link for Azure Cosmos DB? (advanced)
  • How do you handle data partitioning and distribution in Azure Synapse? (medium)
  • Discuss a scenario where you had to optimize data storage and processing costs in Azure Synapse. (advanced)
  • What are some best practices for data security in Azure Synapse? (medium)
  • How do you automate data integration workflows in Azure Synapse? (advanced)
  • Can you explain the role of Azure Data Lake Storage Gen2 in Azure Synapse? (medium)
  • Describe a situation where you had to collaborate with cross-functional teams on a data project in Azure Synapse. (advanced)
  • How do you ensure data governance and compliance in Azure Synapse? (medium)
  • What are the advantages of using Azure Synapse over traditional data warehouses? (basic)
  • Discuss your experience with real-time analytics and streaming data processing in Azure Synapse. (advanced)
  • How do you handle schema evolution and versioning in Azure Synapse? (medium)
  • What are some common challenges you have faced while working with Azure Synapse and how did you overcome them? (advanced)
  • Explain the concept of data skew and how it can impact query performance in Azure Synapse. (medium)
  • How do you stay updated on the latest developments and best practices in Azure Synapse? (basic)

Closing Remark

As the demand for Azure Synapse professionals continues to rise in India, now is the perfect time to upskill and prepare for exciting career opportunities in this field. By honing your expertise in Azure Synapse and related skills, you can position yourself as a valuable asset in the job market and embark on a rewarding career journey. Prepare diligently, showcase your skills confidently, and seize the numerous job opportunities waiting for you in the Azure Synapse domain. Good luck!
