5.0 - 7.0 years
6 - 8 Lacs
Hyderabad
Work from Office
Role Overview
We are seeking an experienced .NET Backend Developer with strong Azure Data Engineering skills to join our growing team in Hyderabad. You will work closely with cross-functional teams to build scalable backend systems, modern APIs, and data pipelines using cutting-edge tools like Azure Databricks and MS Fabric.

Technical Skills (Must-Have)
- Strong hands-on experience in C#, SQL Server, and OOP concepts
- Proficiency with .NET Core, ASP.NET Core, Web API, Entity Framework (v6 or above)
- Strong understanding of Microservices Architecture
- Experience with Azure Cloud technologies including Data Engineering, Azure Databricks, MS Fabric, Azure SQL, Blob Storage, etc.
- Experience with Snowflake or similar cloud data platforms
- Experience working with NoSQL databases
- Skilled in database performance tuning and design patterns
- Working knowledge of Agile methodologies
- Ability to write reusable libraries and modular, maintainable code
- Excellent verbal and written communication skills (especially with US counterparts)
- Strong troubleshooting and debugging skills

Nice to Have Skills
- Experience with Angular, MongoDB, NPM
- Familiarity with Azure DevOps CI/CD pipelines for build and release configuration
- Self-starter attitude with strong analytical and problem-solving abilities
- Willingness to work extra hours when needed to meet tight deadlines

Why Join Us
- Work with a passionate, high-performing team
- Opportunity to grow your technical and leadership skills in a dynamic environment
- Be part of global digital transformation initiatives with top-tier clients
- Exposure to real-world enterprise data systems
- Opportunity to work on cutting-edge Azure and cloud technologies
- Performance-based growth and internal mobility opportunities
Posted 1 week ago
7.0 - 12.0 years
8 - 14 Lacs
Bengaluru
Work from Office
Individual Accountabilities

Collaboration
- Collaborates with domain architects in the DSS, OEA, EUS, and HaN towers and, where appropriate, the respective business stakeholders in architecting data solutions for their data service needs.
- Collaborates with the Data Engineering and Data Software Engineering teams to effectively communicate the data architecture to be implemented.
- Contributes to prototype or proof-of-concept efforts.
- Collaborates with the InfoSec organization to understand corporate security policies and how they apply to data solutions.
- Collaborates with the Legal and Data Privacy organization to understand the latest policies so they may be incorporated into every data architecture solution.
- Suggests architecture designs in collaboration with the Ontologies and MDM teams.

Technical skills & design
- Significant experience working with structured and unstructured data at scale, and comfort with a variety of data stores (key-value, document, columnar, etc.) as well as traditional RDBMS and data warehouses.
- Deep understanding of modern data services in leading cloud environments; able to select and assemble data services with maximum cost efficiency while meeting business requirements for speed, continuity, and data integrity.
- Creates data architecture artifacts such as architecture diagrams, data models, and design documents.
- Guides domain architects on the value of a modern data and analytics platform.
- Researches, designs, tests, and evaluates new technologies, platforms, and third-party products.
- Working experience with Azure Cloud, Data Mesh, MS Fabric, Ontologies, MDM, IoT, BI solutions, and AI would be a strong asset.
- Expert troubleshooting skills and experience.

Leadership
- Mentors aspiring data architects, typically operating in data engineering and software engineering roles.

Key shared accountabilities
- Leads medium to large data services projects.
- Provides technical partnership to product owners.
- Shared stewardship, with domain architects, of the Arcadis data ecosystem.
- Actively participates in the Arcadis Tech Architect community.

Key profile requirements
- Minimum of 7 years of experience designing and implementing modern solutions as part of a variety of data ingestion and transformation pipelines.
- Minimum of 5 years of experience with best-practice design principles and approaches for a range of application styles and technologies, to help guide and steer decisions.
- Experience working in large-scale development and cloud environments.
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Hyderabad
Work from Office
We Are Hiring: Senior .NET Backend Developer with Azure Data Engineering Experience
Job Location: Hyderabad, India
Work Mode: Onsite Only
Experience: Minimum 6+ Years
Qualification: B.Tech, B.E, MCA, M.Tech

Role Overview
We are seeking an experienced .NET Backend Developer with strong Azure Data Engineering skills to join our growing team in Hyderabad. You will work closely with cross-functional teams to build scalable backend systems, modern APIs, and data pipelines using cutting-edge tools like Azure Databricks and MS Fabric.

Technical Skills (Must-Have)
- Strong hands-on experience in C#, SQL Server, and OOP concepts
- Proficiency with .NET Core, ASP.NET Core, Web API, Entity Framework (v6 or above)
- Strong understanding of Microservices Architecture
- Experience with Azure Cloud technologies including Data Engineering, Azure Databricks, MS Fabric, Azure SQL, Blob Storage, etc.
- Experience with Snowflake or similar cloud data platforms
- Experience working with NoSQL databases
- Skilled in database performance tuning and design patterns
- Working knowledge of Agile methodologies
- Ability to write reusable libraries and modular, maintainable code
- Excellent verbal and written communication skills (especially with US counterparts)
- Strong troubleshooting and debugging skills

Nice to Have Skills
- Experience with Angular, MongoDB, NPM
- Familiarity with Azure DevOps CI/CD pipelines for build and release configuration
- Self-starter attitude with strong analytical and problem-solving abilities
- Willingness to work extra hours when needed to meet tight deadlines

Why Join Us
- Work with a passionate, high-performing team
- Opportunity to grow your technical and leadership skills in a dynamic environment
- Be part of global digital transformation initiatives with top-tier clients
- Exposure to real-world enterprise data systems
- Opportunity to work on cutting-edge Azure and cloud technologies
- Performance-based growth and internal mobility opportunities

Tags: DotNetDeveloper BackendDeveloper AzureDataEngineering Databricks MSFabric Snowflake Microservices CSharpJobs HyderabadJobs FullTimeJob HiringNow EntityFramework ASPNetCore CloudEngineering SQLJobs DevOps DotNetCore BackendJobs SuzvaCareers DataPlatformDeveloper SoftwareJobsIndia
Posted 1 week ago
5.0 - 10.0 years
5 - 10 Lacs
Gurgaon / Gurugram, Haryana, India
On-site
Responsibilities
- Design, develop, and maintain cloud infrastructure using Azure and MS Fabric: architect and implement cloud solutions leveraging Microsoft Azure services and MS Fabric; ensure the infrastructure supports scalability, reliability, performance, and cost-efficiency.
- Integrate containerization and orchestration technologies: utilize Kubernetes and Docker; manage and optimize Azure Kubernetes Service (AKS) deployments.
- Implement DevOps practices and automation: develop CI/CD pipelines to automate code deployment and infrastructure provisioning; use automation tools and Terraform to streamline operations and reduce manual intervention.
- Collaborate with development teams to build and deploy cloud-native applications: provide guidance and support for designing and implementing cloud-native applications; ensure applications are optimized for cloud environments.
- Monitor, troubleshoot, and optimize cloud infrastructure: implement monitoring and alerting systems to ensure infrastructure health (an illustrative uptime check is sketched after this listing); optimize resource usage and performance to reduce costs and improve efficiency; develop cost optimization strategies for efficient use of Azure resources; troubleshoot and resolve issues quickly to minimize impact on users; ensure high availability and uptime of applications.
- Enhance system security and compliance: implement security best practices and ensure compliance with industry standards; perform regular security assessments and audits.

EDUCATION
- Bachelor's/Master's degree in computer science and information systems or related engineering.

BEHAVIORAL COMPETENCIES
- Outstanding technical leader with proven hands-on experience in configuring and deploying DevOps practices for successful delivery.
- Innovative and aligned with new product development technologies and methods.
- Excellent communication skills; able to guide, influence, and convince others in a matrix organization.
- Demonstrated teamwork and collaboration in a professional setting.
- Proven capabilities working with worldwide teams.
- Team player; prior experience working with European customers is preferred but not mandatory.

Required Skills & Experience
- 5 to 10 years in IT and/or digital companies or startups.
- Knowledge of Ansible.
- Extensive knowledge of cloud technologies, particularly Microsoft Azure and MS Fabric.
- Proven experience with containerization and orchestration tools such as Kubernetes and Docker.
- Experience with Azure Kubernetes Service (AKS), Terraform, and DevOps practices.
- Strong automation skills, including scripting and using automation tools.
- Proven track record in designing and implementing cloud infrastructure.
- Experience in optimizing cloud resource usage and performance.
- Proven experience in Azure cost optimization strategies.
- Proven experience ensuring uptime of applications and rapid troubleshooting in case of failures.
- Strong understanding of security best practices and compliance standards.
- Proven experience providing technical guidance to teams.
- Proven experience in managing customer expectations.
- Proven track record of driving decisions collaboratively, resolving conflicts, and ensuring follow-through.
- Extensive knowledge of software development and system operations.
- Proven experience in designing stable solutions, testing, and debugging.
- Demonstrated technical guidance with worldwide teams.
- Proficient in English; proficiency in French is a plus.

Performance Measurements:
- On-Time Delivery (OTD)
- Infrastructure Reliability and Availability
- Cost Optimization and Efficiency
- Application Uptime and Failure Resolution
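The listing above calls for monitoring and alerting to keep applications available. As a minimal illustration (not part of the original posting), the Python sketch below polls a set of hypothetical health endpoints with retries and flags anything that stays down; the URLs, timeout, and retry budget are assumptions.

```python
# Minimal uptime-check sketch (illustrative only; endpoint URLs and
# thresholds below are hypothetical, not taken from the posting).
import time
import requests

ENDPOINTS = [
    "https://app.example.com/health",   # hypothetical health endpoint
    "https://api.example.com/health",
]

def check(url: str, timeout: float = 5.0, retries: int = 3) -> bool:
    """Return True if the endpoint answers HTTP 200 within the retry budget."""
    for attempt in range(retries):
        try:
            if requests.get(url, timeout=timeout).status_code == 200:
                return True
        except requests.RequestException:
            pass
        time.sleep(2 ** attempt)  # simple exponential backoff between retries
    return False

if __name__ == "__main__":
    for url in ENDPOINTS:
        status = "UP" if check(url) else "DOWN -> raise alert"
        print(f"{url}: {status}")
```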
Posted 2 weeks ago
4.0 - 6.0 years
4 - 6 Lacs
Bengaluru / Bangalore, Karnataka, India
On-site
- Generate CPG business insights and reports using Excel, PowerPoint, and Power BI.
- Develop and maintain Power BI reports, dashboards, and visualizations that provide meaningful insights to stakeholders.
- Create comprehensive content and presentations to support business decisions and strategies.
- Extract and analyze data from NIQ, Circana, Spins, and other relevant sources.
- Work with cross-functional teams to develop and implement data-driven solutions, including data visualizations, reports, and dashboards.
- Manage analytics projects and work streams, and build dashboards and reports.
- Provide expert-level support to stakeholders on analytics and data visualization.
- Present findings and recommendations to stakeholders in a clear and concise manner.

Required Education: Bachelor's Degree
Preferred Education: Master's Degree

Required Technical and Professional Expertise
- B.Tech, Bachelor's, or Master's degree in Computer Science, Science, or relevant education.
- 4-6 years of experience in data analysis or a related field.
- Proficiency in MS Fabric, Power BI, SQL, and Excel, and experience with NIQ, Circana, and Spins.

Preferred Technical and Professional Experience
- Strong analytical skills and attention to detail.
- Excellent communication and presentation abilities.
- Ability to manage multiple tasks and meet deadlines.
- Experience in the CPG industry is preferred.
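As a small, hedged illustration of the kind of insight generation described above (not taken from the posting), the pandas sketch below computes year-over-year value-sales growth by brand from a hypothetical syndicated-data extract; the file name and column names are assumptions, not an actual NIQ/Circana/Spins schema.

```python
# Illustrative sketch: summarising a syndicated retail extract with pandas.
# File name and columns (brand, period, sales_value) are assumptions.
import pandas as pd

df = pd.read_csv("retail_extract.csv", parse_dates=["period"])

# Total value sales by brand and year
yearly = (
    df.assign(year=df["period"].dt.year)
      .groupby(["brand", "year"], as_index=False)["sales_value"].sum()
)

# Year-over-year growth within each brand
yearly["yoy_growth_pct"] = (
    yearly.sort_values("year")
          .groupby("brand")["sales_value"]
          .pct_change() * 100
)

print(yearly.sort_values(["brand", "year"]).to_string(index=False))
```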
Posted 2 weeks ago
4.0 - 7.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Design, develop, and implement data solutions using Microsoft Fabric, including data pipelines, data warehouses, and data marts. Develop data pipelines, data transformations, and data workflows using Microsoft Fabric.
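To make the pipeline work concrete, here is a minimal sketch of a Fabric-notebook-style step, assuming a PySpark runtime with Delta Lake support and hypothetical lakehouse folder and table names; it is an illustration under those assumptions, not the employer's actual pipeline.

```python
# Minimal Fabric-notebook-style pipeline step (illustrative only): read raw
# CSV files, clean them lightly, and land the result as a Delta table.
# Folder and table names are hypothetical; a runtime with Delta Lake support
# (e.g., a Microsoft Fabric or Databricks notebook) is assumed.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically in Fabric notebooks

raw = (
    spark.read.option("header", True)
         .csv("Files/raw/orders/")           # hypothetical lakehouse folder
)

cleaned = (
    raw.withColumn("order_date", F.to_date("order_date"))
       .withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
)

(cleaned.write
        .format("delta")
        .mode("overwrite")
        .saveAsTable("silver_orders"))       # hypothetical warehouse table
```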
Posted 2 weeks ago
3.0 - 8.0 years
9 - 19 Lacs
Bengaluru, Delhi / NCR
Work from Office
Key Responsibilities:
- Lead the implementation and optimization of Microsoft Purview across the client's data estate on MS Fabric/Azure cloud platforms (ADF, Databricks, etc.).
- Define and enforce data governance policies, data classification, sensitivity labeling, and data lineage to ensure readiness for GenAI use cases.
- Collaborate with data engineers, architects, and AI/ML teams to ensure data discoverability, compliance, and ethical AI readiness.
- Design and implement data cataloging strategies to support GenAI model training and inference.
- Provide guidance on data access controls, privacy, and regulatory compliance (e.g., GDPR, HIPAA).
- Conduct workshops and training sessions for client stakeholders on Purview capabilities and best practices.
- Monitor and report on data governance KPIs and GenAI readiness metrics.

Required Skills & Qualifications:
- Proven experience as a Microsoft Purview SME in enterprise environments.
- Strong knowledge of Microsoft Fabric, OneLake, and Synapse Data Engineering.
- Experience with data governance frameworks and metadata management.
- Hands-on experience with data classification, sensitivity labels, and data lineage tracking.
- Understanding of compliance standards and data privacy regulations.
- Excellent communication and stakeholder management skills.

Preferred Qualifications:
- Microsoft certifications in Azure Data, Purview, or Security & Compliance.
- Experience working with Azure OpenAI, Copilot integrations, or other GenAI platforms.
- Background in data science, AI ethics, or ML operations is a plus.
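Data classification and sensitivity labeling are central to this role. The toy Python sketch below is not the Microsoft Purview API and is not from the posting; it only illustrates the kind of rule-based column classification such policies encode, with regex patterns, labels, and the match threshold all assumed for the example.

```python
# Toy illustration of rule-based column classification (the kind of logic a
# sensitivity-labeling policy encodes). NOT the Microsoft Purview API; a
# self-contained sketch with assumed patterns and labels.
import re

RULES = {
    "Email": re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$"),
    "Phone": re.compile(r"^\+?\d[\d\s\-]{7,14}$"),
    "PAN":   re.compile(r"^[A-Z]{5}\d{4}[A-Z]$"),   # Indian PAN-style ID
}

def classify_column(sample_values, threshold=0.8):
    """Return the label whose pattern matches most sampled values, if any."""
    non_null = [v for v in sample_values if v]
    if not non_null:
        return None
    for label, pattern in RULES.items():
        hits = sum(bool(pattern.match(str(v))) for v in non_null)
        if hits / len(non_null) >= threshold:
            return label
    return None

print(classify_column(["a@b.com", "c@d.org", "e@f.net"]))   # -> Email
print(classify_column(["ABCDE1234F", "PQRST5678Z"]))        # -> PAN
```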
Posted 3 weeks ago
2.0 - 5.0 years
3 - 8 Lacs
Bengaluru
Work from Office
Job Title: Power BI Developer
Experience: 2-3 Years
Location: Bangalore - Indiranagar (Work from Office Only)
Employment Type: Full-Time

Job Description:
We are looking for a Power BI Developer with 2-3 years of hands-on experience in designing and developing BI reports and dashboards using Power BI. Candidates with experience in Microsoft Fabric will be given preference. Strong communication skills are essential, as the role involves close collaboration with cross-functional teams.

Key Responsibilities:
- Develop, design, and maintain interactive dashboards and reports in Power BI
- Work closely with stakeholders to gather requirements and translate them into effective data visualizations
- Optimize data models for performance and usability
- Implement row-level security and data governance best practices
- Stay updated with Power BI and MS Fabric capabilities and best practices

Requirements:
- 2-3 years of hands-on Power BI development experience
- Familiarity with Power Query, DAX, and data modeling techniques
- Experience in Microsoft Fabric is a plus
- Strong analytical and problem-solving skills
- Excellent verbal and written communication skills

Interested candidates, kindly share your CV and the details below to usha.sundar@adecco.com
1) Present CTC (Fixed + VP) -
2) Expected CTC -
3) No. of years' experience -
4) Notice Period -
5) Offer in hand -
6) Reason for change -
7) Present location -
Posted 4 weeks ago
3.0 - 4.0 years
5 - 6 Lacs
Hyderabad
Work from Office
Overview
This role serves as an Associate Analyst on the GTM Data Analytics COE project development team. It is one of the go-to resources for building and maintaining the key reports, data pipelines, and advanced analytics needed to bring insights to light for senior leaders and Sector and field end users.

Responsibilities
- The COE's core competencies are mastery of data visualization, data engineering, data transformation, and predictive and prescriptive analytics.
- Enhance data discovery, processes, testing, and data acquisition from multiple platforms.
- Apply detailed knowledge of PepsiCo's applications for root-cause problem-solving.
- Ensure compliance with PepsiCo IT governance rules and design best practices.
- Participate in project planning with stakeholders to analyze business opportunities and define end-to-end processes.
- Translate operational requirements into actionable data presentations.
- Support data recovery and integrity issue resolution between business and PepsiCo IT.
- Provide performance reporting for the GTM function, including ad-hoc requests, using internal shipment data systems.
- Develop on-demand reports and scorecards for improved agility and visualization.
- Collate and analyze large data sets to extract meaningful insights on performance trends and opportunities.
- Present insights and recommendations to the GTM Leadership team regularly.
- Manage expectations through effective communication with headquarters partners.
- Ensure timely and accurate data delivery per service level agreements (SLAs).
- Collaborate across functions to gather insights for action-oriented analysis.
- Identify and act on opportunities to improve work delivery.
- Implement process improvements, reporting standardization, and optimal technology use.
- Foster an inclusive and collaborative environment.
- Provide baseline support for monitoring SPA mailboxes, work intake, and other ad-hoc requests and queries.

Qualifications
- Undergraduate degree in Business or a related technology field
- 3-4 years of working experience in Power BI
- 1-2 years of working experience in SQL and Python

Preferred qualifications:
- Information technology or analytics experience is a plus
- Familiarity with Power BI/Tableau, Python, SQL, Teradata, Azure, MS Fabric
- Strong analytical, critical thinking, and problem-solving skills, with great attention to detail
- Strong time management skills; ability to multitask, set priorities, and plan
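Since the role combines SQL and Python for performance reporting, here is a self-contained illustrative sketch that builds a simple regional scorecard from an in-memory SQLite table; the shipment schema and figures are invented for demonstration and are not PepsiCo data.

```python
# Illustrative SQL-plus-Python scorecard sketch using an in-memory SQLite
# table; the shipment schema and numbers are made up for demonstration.
import sqlite3
import pandas as pd

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE shipments (region TEXT, month TEXT, cases_shipped INTEGER);
INSERT INTO shipments VALUES
  ('North', '2024-01', 1200), ('North', '2024-02', 1350),
  ('South', '2024-01',  980), ('South', '2024-02',  910);
""")

scorecard = pd.read_sql_query(
    """
    SELECT region,
           SUM(cases_shipped)            AS total_cases,
           ROUND(AVG(cases_shipped), 1)  AS avg_monthly_cases
    FROM shipments
    GROUP BY region
    ORDER BY total_cases DESC
    """,
    con,
)
print(scorecard.to_string(index=False))
```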
Posted 1 month ago
4.0 - 9.0 years
0 - 25 Lacs
Hyderabad, Pune, Greater Noida
Work from Office
Roles and Responsibilities:
- Design, develop, test, deploy, and maintain large-scale data pipelines using Azure Data Factory (ADF) to integrate various data sources into a centralized platform.
- Collaborate with cross-functional teams to gather requirements for data integrations and ensure seamless delivery of high-quality solutions.
- Develop complex SQL queries to extract insights from large datasets stored in relational databases such as PostgreSQL or MySQL.
- Troubleshoot issues related to data pipeline failures, identify root causes, and implement fixes to prevent future occurrences.

Job Requirements:
- 4-9 years of experience in designing and developing data integration solutions using ADF or similar tools like Informatica PowerCenter or Talend Open Studio.
- Strong understanding of Microsoft Azure services, including storage options (e.g., Blob Storage), compute resources (e.g., Virtual Machines), and networking concepts (e.g., VPN).
- Proficiency in writing complex SQL queries for querying large datasets stored in relational databases such as PostgreSQL or MySQL.
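One common way to troubleshoot pipeline failures like those described above is a post-load reconciliation check. The sketch below is a hedged illustration assuming a PostgreSQL target with hypothetical staging/reporting tables, connection details, and credentials; it is not tied to any specific ADF pipeline and requires a reachable database plus the psycopg2 package.

```python
# Post-load reconciliation sketch: compare staging vs. target row counts after
# a copy activity completes. Connection details and table names are
# hypothetical; requires a reachable PostgreSQL instance and psycopg2.
import psycopg2

CHECK_SQL = """
SELECT
    (SELECT COUNT(*) FROM staging.orders)  AS staged_rows,
    (SELECT COUNT(*) FROM reporting.orders WHERE load_date = CURRENT_DATE)
                                           AS loaded_rows;
"""

def reconcile():
    conn = psycopg2.connect(
        host="db.example.internal", dbname="analytics",
        user="etl_reader", password="***",          # hypothetical credentials
    )
    try:
        with conn.cursor() as cur:
            cur.execute(CHECK_SQL)
            staged, loaded = cur.fetchone()
            if staged != loaded:
                raise RuntimeError(
                    f"Row-count mismatch: staged={staged}, loaded={loaded}"
                )
            print(f"Load verified: {loaded} rows")
    finally:
        conn.close()

if __name__ == "__main__":
    reconcile()
```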
Posted 1 month ago
8 - 12 years
19 - 30 Lacs
Pune, Bengaluru
Work from Office
About Position:
We at Persistent are looking for a Data Engineering Lead with experience in MS Fabric, SQL, and Python, along with knowledge of data extraction and ETL processes.

Role: Data Engineering Lead
Location: Pune, Bangalore
Experience: 8+ years
Job Type: Full Time Employment

What You'll Do:
- Work with the business to understand requirements and translate them into low-level designs
- Design and implement robust, fault-tolerant, scalable, and secure data pipelines using PySpark notebooks in MS Fabric
- Review code of peers and mentor junior team members
- Participate in sprint planning and other agile ceremonies
- Drive automation and efficiency in data ingestion, data movement, and data access workflows
- Contribute ideas to help ensure that required standards and processes are in place, and actively look for opportunities to enhance standards and improve process efficiency

Expertise You'll Bring:
- Around 8 to 12 years of experience, with at least 1 year on MS Fabric and Azure cloud
- Leadership: ability to lead and mentor junior data engineers and help with planning and estimations
- Data migration: experience migrating and re-modeling large enterprise data from legacy warehouses to a Lakehouse (Delta Lake) on MS Fabric or Databricks
- Strong data engineering skills: proficiency in data extraction, transformation, and loading (ETL) processes, data modeling, and database management; experience setting up pipelines using Notebooks and ADF, and setting up monitoring and alert notifications
- Experience with data lake technologies: MS Fabric, Azure, Databricks, Python, orchestration tools such as Apache Airflow or Azure Data Factory, Azure Synapse along with stored procedures, and Azure Data Lake Storage
- Data integration knowledge: familiarity with data integration techniques, including batch processing, streaming, real-time data ingestion, auto-loader, change data capture, and creation of fact and dimension tables
- Programming skills: proficiency in SQL, Python, and PySpark for data manipulation and transformation
- DP-700 certification will be preferred

Benefits:
- Competitive salary and benefits package
- Culture focused on talent development with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment:
Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. We offer hybrid work options and flexible working hours to accommodate various needs and preferences. Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities. If you are a person with disabilities and have specific requirements, please inform us during the application process or at any time during your employment. We are committed to creating an inclusive environment where all employees can thrive.
Our company fosters a values-driven and people-centric work environment that enables our employees to:
- Accelerate growth, both professionally and personally
- Impact the world in powerful, positive ways, using the latest technologies
- Enjoy collaborative innovation, with diversity and work-life wellbeing at the core
- Unlock global opportunities to work and learn with the industry's best

Let's unleash your full potential at Persistent. "Persistent is an Equal Opportunity Employer and prohibits discrimination and harassment of any kind."
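For the change-data-capture and Lakehouse migration work mentioned in the listing above, a typical pattern is an incremental Delta Lake MERGE. The sketch below assumes a Spark runtime with Delta Lake available (as in MS Fabric or Databricks notebooks) and hypothetical bronze/silver table names; it is an illustration under those assumptions, not Persistent's actual implementation.

```python
# Illustrative incremental upsert into a Delta table. Table names are
# hypothetical; assumes a Spark runtime with Delta Lake available,
# as in MS Fabric or Databricks notebooks.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical change-data-capture feed landed earlier in the bronze layer.
updates = spark.read.table("bronze_customer_changes")

target = DeltaTable.forName(spark, "silver_customers")   # hypothetical target

(target.alias("t")
       .merge(updates.alias("s"), "t.customer_id = s.customer_id")
       .whenMatchedUpdateAll()      # apply changed attributes
       .whenNotMatchedInsertAll()   # add new customers
       .execute())
```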
Posted 1 month ago