
984 ADF Jobs - Page 7

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Title: Data Architect
Location: Hyderabad, Chennai & Bangalore
Experience: 10+ Years

Job Summary
We are seeking a highly experienced and strategic Data Architect to lead the design and implementation of robust data architecture solutions. The ideal candidate will have a deep understanding of data modelling, governance, integration, and analytics platforms. As a Data Architect, you will play a crucial role in shaping the data landscape across the organization, ensuring data availability, consistency, security, and quality.

Mandatory Skills
- Enterprise data architecture and modelling
- Cloud data platforms (Azure, AWS, GCP)
- Data warehousing and lakehouse architecture
- Data governance and compliance frameworks
- ETL/ELT design and orchestration
- Master Data Management (MDM)
- Databricks architecture and implementation

Key Responsibilities
- Lead, define, and implement end-to-end modern data platforms on public cloud using Databricks (a minimal lakehouse sketch follows this posting)
- Design and manage scalable data models and storage solutions
- Collaborate with enterprise architects, data architects, ETL developers & engineers, data scientists, and information designers to define required data structures, formats, pipelines, metadata, and workload orchestration capabilities
- Establish data standards, governance policies, and best practices
- Oversee the integration of new data technologies and tools
- Lead the development of data pipelines, marts, and lakes
- Ensure data solutions are compliant with security and regulatory standards
- Address aspects such as data privacy & security, data ingestion & processing, data storage & compute, analytical & operational consumption, data modelling, data virtualization, self-service data preparation & analytics, AI enablement, and API integrations
- Mentor data engineers and developers on best practices

Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field
- Relevant certifications in cloud platforms, data architecture, or governance

Technical Skills
- Data Modelling: conceptual, logical, and physical modelling (ERwin, PowerDesigner, etc.)
- Cloud: Azure (ADF, Synapse, Databricks), AWS (Redshift, Glue), GCP (BigQuery)
- Databases: SQL Server, Oracle, PostgreSQL, NoSQL (MongoDB, Cassandra)
- Data Integration: Informatica, Talend, Apache NiFi
- Big Data: Hadoop, Spark, Kafka
- Governance Tools: Collibra, Alation, Azure Purview
- Scripting: Python, SQL, Shell
- DevOps/DataOps practices and CI/CD tools

Soft Skills
- Strong leadership and stakeholder management
- Excellent communication and documentation skills
- Strategic thinking with problem-solving ability
- Collaborative and adaptive in cross-functional teams

Good to Have
- Experience in AI/ML data lifecycle support
- Exposure to industry data standards and frameworks (TOGAF, DAMA-DMBOK)
- Experience with real-time analytics and streaming data solutions

Work Experience
- Minimum 10 years in data engineering, architecture, or related roles
- At least 5 years of hands-on experience designing data platforms on Azure
- Demonstrated knowledge of 2 full project cycles using Databricks as an architect
- Experience supporting and working with cross-functional teams in a dynamic environment
- Advanced working SQL knowledge and experience working with relational databases and unstructured datasets
- Experience with stream-processing systems such as Storm and Spark Streaming

Compensation & Benefits
- Competitive salary and annual performance-based bonuses
- Comprehensive health and optional parental insurance
- Retirement savings plans and tax savings plans
- Work-life balance: flexible work hours

Key Result Areas (KRAs)
- Effective implementation of scalable and secure data architecture
- Governance and compliance adherence
- Standardization and optimization of data assets
- Enablement of self-service analytics and data democratization

Key Performance Indicators (KPIs)
- Architecture scalability and reusability metrics
- Time-to-delivery for data initiatives
- Data quality and integrity benchmarks
- Compliance audit outcomes
- Satisfaction ratings from business stakeholders

Contact: hr@bigtappanalytics.com
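For candidates wanting a concrete picture of the lakehouse layering this role describes, here is a minimal PySpark sketch of a bronze-to-silver Delta hop on Databricks. It is illustrative only: the table and column names (raw_orders, order_id, order_ts) are invented, and `spark` is the session a Databricks notebook provides.

```python
# Minimal sketch of a bronze -> silver hop in a Delta lakehouse on
# Databricks. Table/column names are hypothetical; `spark` is the
# session a Databricks notebook provides.
from pyspark.sql import functions as F

bronze = spark.read.table("bronze.raw_orders")   # raw ingested data

silver = (
    bronze
    .dropDuplicates(["order_id"])                        # basic quality rule
    .withColumn("order_ts", F.to_timestamp("order_ts"))  # normalize types
    .filter(F.col("order_id").isNotNull())
)

(silver.write
    .format("delta")
    .mode("overwrite")
    .saveAsTable("silver.orders"))
```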

Posted 1 week ago

Apply

40.0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Description
Analyze, design, develop, and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications.

Responsibilities

Preferred Qualifications:
Oracle Applications Lab (OAL) has a central role within Oracle. Its role is to work with Product Development and Oracle's internal business to deliver Oracle products for Oracle to use internally. OAL implements Oracle applications, databases, and middleware; supports Oracle applications for Oracle internally; and configures Oracle applications to meet Oracle's specific needs. OAL also provides a showcase for Oracle's products.

The role will involve:
- Working as part of a global team to implement and support new business applications for HR and Payroll
- Debugging and solving sophisticated problems and working closely with Oracle Product Development and other groups to implement solutions
- Developing and implementing product extensions and customizations
- Testing new releases
- Providing critical production support

Your skills should include:
- Experience in designing and supporting Oracle E-Business Suite and Fusion applications, preferably Oracle HRMS/Fusion HCM
- Strong Oracle technical skills: SQL, PL/SQL, Java, XML, ADF, SOA, etc.
- Communicating confidently with peers and management within technical and business teams

Detailed Description and Job Requirements:
Work with Oracle's world-class technology to develop, implement, and support Oracle's global infrastructure. As a member of the IT organization, help analyze existing complex programs and formulate logic for new complex internal systems. Prepare flowcharts, perform coding, and test/debug programs. Develop conversion and system implementation plans. Recommend changes to development, maintenance, and system standards. Job duties are varied and complex, requiring independent judgment. May have a project lead role. BS or equivalent experience in programming on enterprise or departmental servers or systems.

Qualifications
Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 week ago

Apply

7.0 years

0 Lacs

Chennai, Tamil Nadu, India

Remote


Job Description
Deevita is seeking a highly skilled and experienced Senior Data Engineer / Data Platform Specialist to join our team. This role involves building and maintaining robust data marts, developing scalable data models, and designing enterprise-grade data platform solutions. The ideal candidate will collaborate closely with cross-functional teams and play a key role in ensuring data accuracy, performance, and alignment with business goals. We are a team of talented engineers, fun to work with, looking to make a difference in healthcare. Join us as we work together to revolutionize the future of pharma with the latest technologies and a collaborative, fast-paced work environment.

Roles & Responsibilities:
- Design, build, and maintain data marts to support various business functions and carrier partners.
- Work closely with cross-functional stakeholders to define data modeling roadmaps.
- Collaborate with application/product teams to understand business logic and translate it into efficient data models.
- Optimize SQL queries and data structures for performance and scalability.
- Develop and manage ETL pipelines using Azure Logic Apps, Power Platform, and Azure Data Factory.
- Ensure data accuracy, integrity, and consistency through rigorous validation and cleansing processes (see the validation sketch after this posting).
- Establish and enforce data modeling standards and best practices across teams.
- Maintain thorough and up-to-date data model documentation.
- Design and develop advanced Power BI visualizations, dashboards, and paginated reports.
- Create and maintain Power Platform applications (Power Apps, Power Automate) for internal use cases.
- Provide technical support and troubleshooting for Power BI dashboards and SQL-based solutions.
- Ensure data privacy, security, and compliance in collaboration with the IT team.
- Analyze business requirements and propose best-fit data architecture solutions.
- Work with the engineering team to ensure delivery and performance of solutions.

Required Qualifications:
- 7-10 years of experience in designing and developing enterprise-grade data platform solutions involving SQL Server, Azure SQL Database, and Power BI.
- Master's or Bachelor's degree in Computer Science or an Engineering field (BE/B.Tech/ME/M.Tech/MCA).
- At least 7 years of software development experience with SQL Server / Azure SQL Database.
- At least 3 years of software development experience building data visualizations for business/enterprise customers using Power BI (Power BI Desktop, Power BI Report Builder, Power BI Service).
- At least 1 year of experience building applications using Power Platform (Power Apps, Power Automate).
- At least one Microsoft certification in Power BI.
- Hands-on and deeply technical, with exposure to the latest features of SQL Server; able both to code and to guide a group of junior/mid-level database engineers.
- Excellent English communication skills to interface directly with US clients, with proficiency across all modes of communication: listening, reading, writing, and speaking.
- Strong experience in performance tuning and optimization initiatives.

Preferred Qualifications:
- Healthcare/Pharma/Life science industry experience will be an added advantage.
- Azure development using ADF and data pipelines.

Benefits
- Industry-competitive compensation package.
- Exposure to advanced technologies and excellent career growth opportunities, both technical and organizational.
- Paid time off (EL, SL, CL) and health insurance coverage.
- Hybrid/remote work from home.

About us
DeeVita is a dynamic and growing organization that has provided advanced technology services and solutions to enterprise customers in the USA over the last decade. DeeVita specializes in data, analytics, AI, and cloud solutions, and in product development services for customers from startups to enterprises.
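One of the validation tasks this posting names, checking data accuracy between load stages on SQL Server / Azure SQL, could look like the following sketch. The connection string and table names (stg.claims, mart.claims) are assumptions for illustration.

```python
# Sketch: row-count reconciliation between a staging table and a data
# mart on SQL Server / Azure SQL. Connection details and table names
# are hypothetical.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=mydb;UID=user;PWD=secret"
)
cursor = conn.cursor()

def row_count(table: str) -> int:
    # `table` comes from a fixed internal list, never from user input
    cursor.execute(f"SELECT COUNT(*) FROM {table}")
    return cursor.fetchone()[0]

staged, loaded = row_count("stg.claims"), row_count("mart.claims")
if staged != loaded:
    raise ValueError(f"Load mismatch: staged={staged}, loaded={loaded}")
```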

Posted 1 week ago

Apply

7.0 years

5 - 9 Lacs

Hyderābād

On-site


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice. This team will be at the forefront of designing, developing, and deploying scalable data solutions on cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions.

Primary Responsibilities:
- Ingest data from multiple on-prem and cloud data sources using various tools and capabilities in Azure
- Design and develop Azure Databricks processes using PySpark/Spark-SQL
- Design and develop orchestration jobs using ADF and Databricks Workflows (see the parameter-passing sketch after this posting)
- Analyze data engineering processes under development and act as an SME to troubleshoot performance issues and suggest solutions for improvement
- Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, etc.
- Build a test framework for Databricks notebook jobs for automated testing before code deployment
- Design and build POCs to validate new ideas, tools, and architectures in Azure
- Continuously explore new Azure services and capabilities; assess their applicability to business needs
- Create detailed documentation for cloud processes, architecture, and implementation patterns
- Work with the data & analytics team to build and deploy efficient data engineering processes and jobs on Azure cloud
- Prepare case studies and technical write-ups to showcase successful implementations and lessons learned
- Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture
- Contribute to full lifecycle project implementations, from design and development to deployment and monitoring
- Ensure solutions adhere to security, compliance, and governance standards
- Monitor and optimize data pipelines and cloud resources for cost and performance efficiency
- Identify solutions to non-standard requests and problems
- Support and maintain the self-service BI warehouse
- Mentor and support existing on-prem developers in the cloud environment
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications:
- Undergraduate degree or equivalent experience
- 7+ years of overall experience in Data & Analytics engineering
- 5+ years of experience working with Azure, Databricks, ADF, and Data Lake
- 5+ years of experience working with a data platform or product using PySpark and Spark-SQL
- Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
- In-depth understanding of Azure architecture and the ability to come up with efficient designs and solutions
- Highly proficient in Python and SQL
- Proven excellent communication skills

Preferred Qualifications:
- Snowflake and Airflow experience
- Power BI development experience
- Experience or knowledge of health care concepts: E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission. #NIC
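A common pattern behind the ADF-plus-Databricks orchestration listed above is passing pipeline parameters into a notebook through widgets. The sketch below assumes invented parameter, path, and table names; `spark` and `dbutils` are the objects Databricks provides inside a notebook.

```python
# Sketch: a Databricks notebook receiving parameters from the ADF
# Databricks activity (or a Databricks Workflow task). Parameter,
# path, and table names are hypothetical.
dbutils.widgets.text("run_date", "")      # e.g. 2025-06-01
dbutils.widgets.text("source_path", "")   # e.g. an abfss:// URI

run_date = dbutils.widgets.get("run_date")
source_path = dbutils.widgets.get("source_path")

df = spark.read.format("parquet").load(source_path)
daily = df.where(df.ingest_date == run_date)

daily.write.mode("append").saveAsTable("bronze.daily_ingest")

# Hand a result string back to the calling pipeline activity.
dbutils.notebook.exit(f"rows={daily.count()}")
```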

Posted 1 week ago

Apply

2.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Profile: Sr. DW BI Developer
Location: Sector 64, Noida (Work from Office)

Position Overview: Working with the Finance Systems Manager, the role will ensure that the ERP system is available and fit for purpose. The ERP Systems Developer will develop the ERP system, provide comprehensive day-to-day support and training, and evolve the current ERP system for the future.

Key Responsibilities:
- As a Sr. DW BI Developer, participate in the design, development, customization, and maintenance of software applications.
- Analyse the different applications/products, then design and implement the DW using best practices.
- Apply rich data governance experience: data security, data quality, provenance/lineage.
- Maintain a close working relationship with the other application stakeholders.
- Develop secure and high-performance web applications.
- Apply knowledge of software development life-cycle methodologies, e.g. Iterative, Waterfall, Agile, etc.
- Design and architect future releases of the platform.
- Participate in troubleshooting application issues.
- Work jointly with other teams and partners handling different aspects of platform creation.
- Track advancements in software development technologies and apply them judiciously in the solution roadmap.
- Ensure all quality controls and processes are adhered to.
- Plan the major and minor releases of the solution.
- Ensure robust configuration management.
- Work closely with the Engineering Manager on different aspects of product lifecycle management.
- Demonstrate the ability to work independently in a fast-paced environment requiring multitasking and efficient time management.

Required Skills and Qualifications:
- End-to-end lifecycle experience with data warehousing, data lakes, and reporting.
- Experience maintaining/managing data warehouses.
- Design and development of large, scaled-out, real-time, high-performing Data Lake / Data Warehouse systems (including big data and cloud).
- Strong SQL and analytical skills.
- Experience in Power BI, Tableau, QlikView, Qlik Sense, etc.
- Experience with Microsoft Azure services.
- Experience developing and supporting ADF pipelines.
- Experience in Azure SQL Server / Databricks / Azure Analysis Services.
- Experience developing tabular models.
- Experience working with APIs.
- Minimum 2 years of experience in a similar role.
- Experience with data warehousing and data modelling.
- Strong experience in SQL.
- 2-6 years of total experience building DW/BI systems.
- Experience with ETL and working with large-scale datasets.
- Proficiency in writing and debugging complex SQL.
- Prior experience working with global clients.
- Hands-on experience with Kafka, Flink, Spark, Snowflake, Airflow, NiFi, Oozie, Pig, Hive, Impala, and Sqoop (see the streaming sketch after this posting).
- Storage such as HDFS, object storage (S3 etc.), RDBMS, MPP, and NoSQL databases.
- Experience with distributed data management, including data failover, high availability, and scalability across relational, NoSQL, and big data stores, along with data analysis, processing, and transformation.
- Experience in end-to-end project implementation in the cloud (Azure / AWS / GCP) as a DW BI Developer.
- Understanding of industry trends and products in DataOps, continuous intelligence, augmented analytics, and AI/ML.
- Prior experience working in clouds such as Azure, AWS, and GCP.

To know our Privacy Policy, please click on the link below or copy and paste the URL into your browser: https://gedu.global/wp-content/uploads/2023/09/GEDU-Privacy-Policy-22092023-V2.0-1.pdf
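Since the stack above spans Kafka and Spark, here is a minimal Structured Streaming read from a Kafka topic landed as Parquet. The broker address, topic, and paths are placeholders, and the job assumes the spark-sql-kafka connector package is available on the cluster.

```python
# Sketch: consuming a Kafka topic with Spark Structured Streaming and
# landing it as Parquet. Broker, topic, and paths are placeholders;
# the spark-sql-kafka connector must be on the cluster.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "clickstream")
    .load()
    .select(F.col("value").cast("string").alias("payload"))
)

query = (
    events.writeStream
    .format("parquet")
    .option("path", "/data/clickstream")
    .option("checkpointLocation", "/chk/clickstream")  # restart bookkeeping
    .start()
)
query.awaitTermination()
```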

Posted 1 week ago

Apply

8.0 years

0 Lacs

Kochi, Kerala, India

On-site


Job Role: Senior Dot Net Developer
Experience: 8+ years
Notice period: Immediate
Location: Trivandrum / Kochi

Introduction
Candidates should have 8+ years of experience in the IT industry with strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role, so strong communication skills are essential. This is for a US client, and the resource should be hands-on, with coding and Azure Cloud experience. Working hours are 8 hours, with 4 hours of overlap during the EST time zone (12 PM - 9 PM); this overlap is mandatory, as meetings happen during these hours.

Responsibilities include:
- Design, develop, enhance, document, and maintain robust applications using .NET Core 6/8+, C#, REST APIs, T-SQL, and modern JavaScript/jQuery
- Integrate and support third-party APIs and external services
- Collaborate across cross-functional teams to deliver scalable solutions across the full technology stack
- Identify, prioritize, and execute tasks throughout the Software Development Life Cycle (SDLC)
- Participate in Agile/Scrum ceremonies and manage tasks using Jira
- Understand technical priorities, architectural dependencies, risks, and implementation challenges
- Troubleshoot, debug, and optimize existing solutions with a strong focus on performance and reliability

Certifications:
- Microsoft Certified: Azure Fundamentals
- Microsoft Certified: Azure Developer Associate
- Other relevant certifications in Azure, .NET, or cloud technologies

Primary Skills:
- 8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework / EF Core, JavaScript, jQuery, and REST APIs
- Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
- Skilled in unit testing with xUnit and MSTest
- Strong in software design patterns, system architecture, and scalable solution design
- Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
- Strong problem-solving and debugging capabilities
- Ability to write reusable, testable, and efficient code
- Ability to develop and maintain frameworks and shared libraries to support large-scale applications
- Excellent technical documentation, communication, and leadership skills
- Microservices and Service-Oriented Architecture (SOA)
- Experience in API integrations
- 2+ years of hands-on experience with Azure Cloud Services, including Azure Functions, Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring

Secondary Skills:
- Familiarity with AngularJS, ReactJS, and other front-end frameworks
- Experience with Azure API Management (APIM)
- Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
- Experience with Azure Data Factory (ADF) and Logic Apps
- Exposure to application support and operational monitoring
- Azure DevOps CI/CD pipelines (Classic / YAML)

Posted 1 week ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site


Position: Data Engineer
Experience: 6+ yrs
Job Location: Pune / Mumbai

Job Profile Summary:
- Azure Databricks and hands-on PySpark, with tuning
- Azure Data Factory pipelines for loading various data into ADB, with performance tuning
- Azure Synapse
- Azure Monitoring and Log Analytics (error handling in ADF pipelines and ADB; see the sketch below)
- Logic Apps and Functions
- Performance tuning of Databricks, Data Factory, and Synapse
- Databricks data loading (layers) and export (choosing connection options and the best approach for fast report access)
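The error-handling point above usually amounts to making notebook failures visible to the calling ADF pipeline. A hedged sketch, with hypothetical table names:

```python
# Sketch: surfacing Databricks notebook errors to the calling ADF
# pipeline. Shipping logs to Log Analytics is handled by the
# workspace's diagnostic settings; here we just fail loudly so the
# ADF activity is marked Failed. Table names are hypothetical.
import logging

logger = logging.getLogger("adb_load")

try:
    df = spark.read.table("staging.events")
    df.write.mode("append").saveAsTable("curated.events")
except Exception as exc:
    logger.error("Load failed: %s", exc)
    raise  # re-raise so ADF's on-failure path (alert/retry) can run
```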

Posted 1 week ago

Apply

10.0 years

0 Lacs

Gurugram, Haryana, India

On-site


Acuity Knowledge Partners
Acuity Knowledge Partners (Acuity) is a leading provider of bespoke research, analytics and technology solutions to the financial services sector, including asset managers, corporate and investment banks, private equity and venture capital firms, hedge funds and consulting firms. Its global network of over 6,000 analysts and industry experts, combined with proprietary technology, supports more than 600 financial institutions and consulting companies to operate more efficiently and unlock their human capital, driving revenue higher and transforming operations. Acuity is headquartered in London and operates from 10 locations worldwide. The company fosters a diverse, equitable and inclusive work environment, nurturing talent, regardless of race, gender, ethnicity or sexual orientation. Acuity was established as a separate business from Moody's Corporation in 2019, following its acquisition by Equistone Partners Europe (Equistone). In January 2023, funds advised by global private equity firm Permira acquired a majority stake in the business from Equistone, which remains invested as a minority shareholder. For more information, visit www.acuitykp.com

Position Title: Associate Director
Experience Level: 10+ yrs
Department: IT
Location: Gurgaon

Role Overview
We are seeking a highly skilled Oracle Fusion Techno-Functional Consultant with deep expertise in both technical development and functional configuration across key Oracle Fusion modules (Finance, SCM, HCM, or Projects). The ideal candidate will be responsible for the end-to-end delivery of Oracle Fusion solutions, including system analysis, technical development, functional configuration, integration, testing, and user support. This role requires a strong understanding of Oracle Cloud architecture and hands-on experience in implementing and supporting Oracle Fusion applications in a global enterprise environment.

Key Responsibilities
- Collaborate with business stakeholders to gather and analyse requirements across modules such as Financials (AP, AR, GL, FA), SCM, HCM, or Projects.
- Configure Oracle Fusion modules to align with business needs, leveraging best practices.
- Lead and support Oracle Cloud module implementations, rollouts, and upgrades.
- Prepare functional design documents (FDDs) and provide input into solution design.
- Conduct functional testing, UAT support, and issue resolution.
- Facilitate knowledge transfer and user training sessions for key users and superusers.

Technical Responsibilities:
- Develop technical solutions including custom reports (BI Publisher, OTBI), integrations (OIC, REST/SOAP APIs), and extensions (using PaaS and VBCS).
- Write and review technical design documents (MD50, MD70) and conduct peer code reviews.
- Build and manage integrations between Oracle Fusion and third-party systems using Oracle Integration Cloud (OIC), BIP, FBDI, and HDL.
- Monitor and troubleshoot technical issues, including performance tuning and bug fixing.
- Ensure compliance with data governance, security, and system performance standards.

Project and Support Responsibilities:
- Participate in ERP enhancement projects, change requests, and day-to-day support activities.
- Serve as a subject matter expert and act as a liaison between IT and business units.
- Manage and document change control processes, and contribute to the creation of SOPs and support materials.
- Engage in continuous improvement initiatives to optimise system usage and performance.

Required Qualifications and Experience:
- Bachelor's degree in Computer Science, Information Systems, Finance, or a related discipline.
- 10+ years of techno-functional experience with Oracle Fusion Applications (Cloud ERP).
- Strong domain knowledge in at least one of the following: Finance, SCM, HCM, or Projects.
- Proven experience with configuration of Oracle Cloud modules and business process setup.
- Technical expertise in BI Publisher, OTBI, HDL, FBDI, Oracle Integration Cloud (OIC), REST/SOAP APIs, and SQL/PLSQL.
- Experience with Oracle Security, Role-Based Access Control (RBAC), and workflow configuration.
- Strong understanding of data migration strategies and tools.
- Excellent communication and stakeholder management skills.
- Oracle certifications in Cloud ERP modules (preferred).

Preferred Skills:
- Experience with Agile/Scrum methodologies.
- Exposure to Oracle Cloud quarterly patch impact assessments.
- Familiarity with tools like JIRA, ServiceNow, or equivalent for ticket management.
- Knowledge of VBCS, ADF, or other Oracle PaaS development frameworks is a plus.

Posted 1 week ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Country: India
Working Schedule: Full-Time
Work Arrangement: Hybrid
Relocation Assistance Available: No
Posted Date: 17-Jun-2025
Job ID: 9856

Description And Requirements
We are seeking a skilled and experienced Azure Data Factory / Synapse Engineer with expertise in SQL and pipelines to join our dynamic team. As an Azure Data Engineer, you will be responsible for developing and implementing dynamic pipelines for our data integration platform.
- Interact with stakeholders and the data engineering manager to understand ad-hoc and strategic data/project requirements and provide logical, long-term technical solutions.
- Work independently on basic to intermediate level data extraction, validation, and manipulation assignments using SQL, Python, and ADF/Synapse.
- Maintain and support day-to-day operations around DW management and cleanups on the Azure cloud platform.
- Write SQL scripts to update and verify data loads and perform data validations.
- Use Git and GitHub to log development work and manage deployments.
- Effectively manage evolving priorities and maintain clear communication with the stakeholders and/or Data Engineering Manager involved.

About MetLife
Recognized on Fortune magazine's list of the 2025 "World's Most Admired Companies" and Fortune World's 25 Best Workplaces™ for 2024, MetLife, through its subsidiaries and affiliates, is one of the world's leading financial services companies, providing insurance, annuities, employee benefits and asset management to individual and institutional customers. With operations in more than 40 markets, we hold leading positions in the United States, Latin America, Asia, Europe, and the Middle East. Our purpose is simple: to help our colleagues, customers, communities, and the world at large create a more confident future. United by purpose and guided by empathy, we're inspired to transform the next century in financial services. At MetLife, it's #AllTogetherPossible. Join us!

Posted 1 week ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


About the Company
JMAN Group is a growing technology-enabled management consultancy that empowers organizations to create value through data. Founded in 2010, we are a team of 450+ consultants based in London, UK, and a team of 300+ engineers in Chennai, India. Having delivered multiple projects in the US, we are now opening a new office in New York to help us support and grow our US client base. We approach business problems with the mindset of a management consultancy and the capabilities of a tech company. We work across all sectors, and have in-depth experience in private equity, pharmaceuticals, government departments and high-street chains. Our team is as cutting edge as our work. We pride ourselves on being great to work with: no jargon or corporate-speak, flexible to change and receptive to feedback. We have a huge focus on investing in the training and professional development of our team, to ensure they can deliver high-quality work and shape our journey to becoming a globally recognised brand. The business has grown quickly in the last 3 years with no signs of slowing down.

About the Role
7+ years of experience in managing Data & Analytics service delivery, preferably within a Managed Services or consulting environment.

Responsibilities
- Serve as the primary owner for all managed service engagements across all clients, ensuring SLAs and KPIs are met consistently.
- Continuously improve the operating model, including ticket workflows, escalation paths, and monitoring practices.
- Coordinate triaging and resolution of incidents and service requests raised by client stakeholders.
- Collaborate with client and internal cluster teams to manage operational roadmaps, recurring issues, and enhancement backlogs.
- Lead a 40+ member team of Data Engineers and Consultants across offices, ensuring high-quality delivery and adherence to standards.
- Support the transition from project mode to Managed Services, including knowledge transfer, documentation, and platform walkthroughs.
- Ensure documentation is up to date for architecture, SOPs, and common issues.
- Contribute to service reviews, retrospectives, and continuous improvement planning.
- Report on service metrics, root cause analyses, and team utilization to internal and client stakeholders.
- Participate in resourcing and onboarding planning in collaboration with engagement managers, resourcing managers and internal cluster leads.
- Act as a coach and mentor to junior team members, promoting skill development and a strong delivery culture.

Qualifications
- ETL or ELT: Azure Data Factory, Databricks, Synapse, dbt (any two - mandatory).
- Data Warehousing: Azure SQL Server / Redshift / BigQuery / Databricks / Snowflake (any one - mandatory).
- Data Visualization: Looker, Power BI, Tableau (basic understanding, to support stakeholder queries).
- Cloud: Azure (mandatory); AWS or GCP (good to have).
- SQL and Scripting: ability to read/debug SQL and Python scripts.
- Monitoring: Azure Monitor, Log Analytics, Datadog, or equivalent tools.
- Ticketing & Workflow Tools: Freshdesk, Jira, ServiceNow, or similar.
- DevOps: containerization technologies (e.g., Docker, Kubernetes), Git, CI/CD pipelines (exposure preferred).

Required Skills
- Strong understanding of data engineering and analytics concepts, including ELT/ETL pipelines, data warehousing, and reporting layers.
- Experience in ticketing, issue triaging, SLAs, and capacity planning for BAU operations.
- Hands-on understanding of SQL and scripting languages (Python preferred) for debugging/troubleshooting.
- Proficient with cloud platforms like Azure and AWS; familiarity with DevOps practices is a plus.
- Familiarity with orchestration and data pipeline tools such as ADF, Synapse, dbt, Matillion, or Fabric.
- Understanding of monitoring tools, incident management practices, and alerting systems (e.g., Datadog, Azure Monitor, PagerDuty).
- Strong stakeholder communication, documentation, and presentation skills.
- Experience working with global teams and collaborating across time zones.

Posted 1 week ago

Apply

8.0 years

0 Lacs

Trivandrum, Kerala, India

Remote


Role: Azure Data Engineer
Experience: 8+ years
Location: Remote (candidates from south India needed)
Notice: Immediate
Interested candidates, share your resume at sunilkumar@xpetize.com

Key skills:
- Design and develop warehouse solutions using Azure Synapse Analytics, ADLS, ADF, Databricks, Power BI, and Azure Analysis Services.
- Proficient in SSIS, SQL, and query optimization.
- Has worked in an onshore-offshore model, managing challenging scenarios.
- Expertise in working with large amounts of data (structured and unstructured), building data pipelines for ETL workloads, and generating insights using data science and analytics.
- Expertise in Azure and AWS cloud services, and in DevOps/CI/CD frameworks.
- Ability to work with ambiguity and vague requirements and transform them into deliverables.
- Good combination of technical and interpersonal skills, with strong written and verbal communication; detail-oriented, with the ability to work independently.
- Drive automation efforts across the data analytics team using Infrastructure as Code (IaC) with Terraform, configuration management, and Continuous Integration / Continuous Delivery (CI/CD) tools such as Jenkins.
- Help build and define architecture frameworks, best practices, and processes; collaborate on data warehouse architecture and technical design discussions.
- Expertise in Azure Data Factory, with familiarity building pipelines for ETL projects.
- Expert SQL knowledge and experience working with relational databases.
- Expertise in Python and ETL projects.
- Experience with Databricks will be an added advantage.

Posted 1 week ago

Apply

12.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site


Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice under the RMI - Optum Advisory umbrella. This team will be at the forefront of designing, developing, and deploying scalable data solutions on cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions.

Primary Responsibilities
- Design and implement secure, scalable, and cost-effective cloud data architectures using cloud services such as Azure Data Factory (ADF), Azure Databricks, Azure Storage, Key Vault, Snowflake, Synapse Analytics, MS Fabric/Power BI, etc.
- Define and lead data & cloud strategy, including migration plans, modernization of legacy systems, and adoption of new cloud capabilities
- Collaborate with clients to understand business requirements and translate them into optimal cloud architecture solutions, balancing performance, security, and cost
- Evaluate and compare cloud services (e.g., Databricks, Snowflake, Synapse Analytics) and recommend best-fit solutions based on project needs and organizational goals
- Lead the full lifecycle of data platform and product implementations, from planning and design to deployment and support
- Drive cloud migration initiatives, ensuring a smooth transition from on-premise systems while engaging and upskilling existing teams
- Lead and mentor a team of cloud and data engineers, fostering a culture of continuous learning and technical excellence
- Plan and guide the team in building Proofs of Concept (POCs), exploring new cloud capabilities, and validating emerging technologies
- Establish and maintain comprehensive documentation for cloud setup processes, architecture decisions, and operational procedures
- Work closely with internal and external stakeholders to gather requirements, present solutions, and ensure alignment with business objectives
- Ensure all cloud solutions adhere to security best practices, compliance standards, and governance policies
- Prepare case studies and share learnings from implementations to build organizational knowledge and improve future projects
- Build and analyze data engineering processes and act as an SME to troubleshoot performance issues and suggest solutions for improvement
- Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, Maven, etc.
- Build a test framework for Databricks notebook jobs for automated testing before code deployment (see the pytest sketch after this posting)
- Continuously explore new Azure services and capabilities; assess their applicability to business needs
- Create detailed documentation for cloud processes, architecture, and implementation patterns
- Contribute to full lifecycle project implementations, from design and development to deployment and monitoring
- Identify solutions to non-standard requests and problems
- Mentor and support existing on-prem developers in the cloud environment
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so

Required Qualifications
- Undergraduate degree or equivalent experience
- 12+ years of overall experience in Data & Analytics engineering
- 10+ years of solid experience working as an architect designing data platforms using Azure, Databricks, Snowflake, ADF, Data Lake, Synapse Analytics, Power BI, etc.
- 10+ years of experience working with a data platform or product using PySpark and Spark-SQL
- In-depth experience designing complex Azure architecture for various business needs and the ability to come up with efficient designs and solutions
- Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
- Experience in leading teams and people management
- Highly proficient, hands-on experience with Azure services, Databricks/Snowflake development, etc.
- Excellent communication and stakeholder management skills

Preferred Qualifications
- Snowflake and Airflow experience
- Power BI development experience
- Experience or knowledge of health care concepts: E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission. #NIC
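For the Databricks notebook test framework mentioned above, one workable approach is to keep transformations in plain functions and exercise them with pytest on a local SparkSession before deployment. The transform and data below are invented for illustration.

```python
# Sketch: unit-testing a PySpark transform with pytest before the
# Databricks notebook that wraps it is deployed. The transform
# (add_high_cost_flag) and the sample data are hypothetical.
import pytest
from pyspark.sql import SparkSession, functions as F

def add_high_cost_flag(df):
    # Business rule under test: claims above 10,000 are "high cost".
    return df.withColumn("is_high_cost", F.col("amount") > 10000)

@pytest.fixture(scope="session")
def spark():
    return SparkSession.builder.master("local[1]").getOrCreate()

def test_add_high_cost_flag(spark):
    df = spark.createDataFrame([(1, 500), (2, 20000)], ["id", "amount"])
    out = {r["id"]: r["is_high_cost"] for r in add_high_cost_flag(df).collect()}
    assert out == {1: False, 2: True}
```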

Posted 1 week ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


Job Purpose
This role includes designing and building AI/ML products at scale to improve customer understanding and sentiment analysis, recommend customer requirements, recommend optimal inputs, and improve process efficiency. This role will collaborate with product owners and business owners.

Key Responsibilities
- Lead a team of junior and experienced data scientists.
- Lead and participate in end-to-end ML project deployments requiring feasibility analysis, design, development, validation, and application of state-of-the-art data science solutions.
- Push the state of the art in applying data mining, visualization, predictive modelling, statistics, trend analysis, and other data analysis techniques to solve complex business problems, including lead classification, recommender systems, product life-cycle modelling, design optimization, and product cost and weight optimization (a lead-classification sketch follows this posting).
- Leverage and enhance applications using NLP, LLMs, OCR, image-based models, and deep learning neural networks for use cases including text mining, speech, and object recognition.
- Identify future development needs, advance new and emerging ML and AI technology, and set the strategy for the data science team.
- Cultivate a product-centric, results-driven data science organization.
- Write production-ready code, deploy real-time ML models, and expose ML outputs through APIs.
- Partner with data/ML engineers and vendor partners on input data pipeline development and ML model automation.
- Provide leadership to establish world-class ML lifecycle management processes.

Qualifications
MTech / BE / BTech / MSc in CS, Stats, or Maths

Experience
- Over 10 years of applied machine learning experience across machine learning, statistical modelling, predictive modelling, text mining, natural language processing (NLP), LLMs, OCR, image-based models, deep learning, and optimization.
- Expert Python programmer; SQL, C#; extremely proficient with the SciPy stack (e.g. NumPy, pandas, scikit-learn, matplotlib).
- Proficiency with open-source deep learning platforms such as TensorFlow, Keras, and PyTorch.
- Knowledge of the big data ecosystem: Apache Spark, Hadoop, Hive, EMR, MapReduce.
- Proficient in cloud technologies and services (Azure Databricks, ADF, Databricks MLflow).

Functional Competencies
- A demonstrated ability to mentor junior data scientists, and proven experience in collaborative work environments with external customers.
- Proficient in communicating technical findings to non-technical stakeholders.
- Holds routine peer code reviews of the team's ML work.
- Experience leading and/or collaborating with small to mid-sized teams.
- Experienced in building scalable, highly available distributed systems in production.
- Experienced in ML lifecycle management and MLOps tools and frameworks.
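A lead-classification problem like the one named above can be prototyped with the SciPy-stack tools this posting lists. The sketch below uses synthetic data and invented feature semantics purely for illustration; a real model would be trained on CRM features.

```python
# Sketch: a minimal lead-classification prototype with scikit-learn.
# The synthetic data and feature meanings are invented for illustration.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import classification_report
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 3))                   # e.g. engagement, recency, spend
y = (X[:, 0] + 0.5 * X[:, 2] > 0).astype(int)   # synthetic "converted" label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)
model = LogisticRegression().fit(X_tr, y_tr)
print(classification_report(y_te, model.predict(X_te)))
```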

Posted 1 week ago

Apply

3.0 years

0 Lacs

India

Remote


Title: Data Engineer
Location: Remote
Employment type: Full Time with BayOne

We're looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do
- Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory
- Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
- Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

Tech Stack
Azure | Databricks | PySpark | SQL

What We're Looking For
- 3+ years of experience in data engineering or analytics engineering
- Hands-on experience with cloud data platforms and large-scale data processing
- Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
- Minimum 3 years of experience with modern data engineering / data warehousing / data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc.; Azure experience is preferred over other cloud platforms.
- 5 years of proven experience with SQL, schema design, and dimensional data modelling.
- Solid knowledge of data warehouse best practices, development standards, and methodologies.
- Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
- Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL.
- An independent self-learner with a "let's get this done" approach and the ability to work in a fast-paced, dynamic environment.
- Excellent communication and teamwork abilities.

Nice-to-Have Skills:
- Knowledge of Event Hubs, IoT Hub, Azure Stream Analytics, Azure Analysis Services, and Cosmos DB.
- SAP ECC / S/4 and HANA knowledge.
- Intermediate knowledge of Power BI.
- Azure DevOps and CI/CD deployments; cloud migration methodologies and processes.

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, basis of disability, or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.

Posted 1 week ago

Apply

0.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu

On-site


Our software engineers at Fiserv bring an open and creative mindset to a global team developing mobile applications, user interfaces and much more to deliver industry-leading financial services technologies to our clients. Our talented technology team members solve challenging problems quickly and with quality. We're seeking individuals who can create frameworks, leverage developer tools, and mentor and guide other members of the team. Collaboration is key, and whether you are an expert in a legacy software system or are fluent in a variety of coding languages, you're sure to find an opportunity as a software engineer that will challenge you to perform exceptionally and deliver excellence for our clients.

Full-time
Entry, Mid, Senior
Yes (occasional), Minimal (if any)

Requisition ID: R-10363786
Date posted: 06/20/2025
End Date: 06/26/2025
City: Chennai
State/Region: Tamil Nadu
Country: India
Location Type: Onsite

Calling all innovators – find your future at Fiserv. We're Fiserv, a global leader in Fintech and payments, and we move money and information in a way that moves the world. We connect financial institutions, corporations, merchants, and consumers to one another millions of times a day – quickly, reliably, and securely. Any time you swipe your credit card, pay through a mobile app, or withdraw money from the bank, we're involved. If you want to make an impact on a global scale, come make a difference at Fiserv.

Job Title: Professional, Software Development Engineering

What does a great Software Development Engineer do?
As a Software Development Engineer, your focus will be on applying the principles of engineering to software development. The role focuses on the complex and large software systems that make up the core systems for the organization. You will be responsible for development, unit testing, and integration tasks within this highly visible, client-focused web services application. Development efforts will also include feature enhancements, client implementations, and bug fixes, as well as support of the production environment.

What you will do:
- Collaborate within a team environment in the development, testing, and support of software development project lifecycles.
- Develop web interfaces and underlying business logic.
- Prepare any necessary technical documentation.
- Track and report daily and weekly activities.
- Participate in code reviews and code remediation.
- Perform and develop proper unit tests and automation.
- Participate in a 24-hour on-call rotation to support previous releases of the product.
- Research problems discovered by QA or product support and develop solutions.
- Perform additional duties as determined by business needs and as directed by management.

What you will need to have:
- Bachelor's degree in Computer Science, Engineering or Information Technology, or equivalent experience.
- 3-5 years of experience in developing scalable and secure J2EE applications.
- Excellent knowledge of Java-based technologies (Core Java, JSP, AJAX, JSF, EJB, and the Spring Framework), Oracle SQL/PLSQL, and app servers like WebLogic and JBoss.
- Excellent knowledge of SOAP and REST web service implementations.
- Knowledge of UNIX environments is preferred.
- Experience with JSF UI component technologies (Oracle ADF & RichFaces) is preferred.
- Good analytical, organizational, and problem-solving abilities.
- Good at prioritizing tasks and committed to completing them.
- Strong team player / customer service orientation.
- Demonstrated ability to work with both end users and technical staff.
- Ability to track progress against assigned tasks, report status, and proactively identify issues.
- Ability to present information effectively in communications with peers and the project management team.
- Highly organized; works well in a fast-paced, fluid, and dynamic environment.

What would be great to have:
- Experience working in a Scrum development team
- Banking and financial services experience
- Java certifications

Thank you for considering employment with Fiserv. Please:
- Apply using your legal name
- Complete the step-by-step profile and attach your resume (either is acceptable, both are preferable)

Our commitment to Diversity and Inclusion: Fiserv is proud to be an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, national origin, gender, gender identity, sexual orientation, age, disability, protected veteran status, or any other category protected by law.

Note to agencies: Fiserv does not accept resume submissions from agencies outside of existing agreements. Please do not send resumes to Fiserv associates. Fiserv is not responsible for any fees associated with unsolicited resume submissions.

Warning about fake job posts: Please be aware of fraudulent job postings that are not affiliated with Fiserv. Fraudulent job postings may be used by cyber criminals to target your personally identifiable information and/or to steal money or financial information. Any communications from a Fiserv representative will come from a legitimate Fiserv email address.

Posted 1 week ago

Apply

5.0 - 7.0 years

0 Lacs

Rajkot, Gujarat, India

Remote


📢 Immediate Requirement – Senior Data Engineer (5 Positions)
📅 Joining Date: From 10th July 2025
💼 Experience: 5 to 7 years
💰 Compensation: ₹15 to ₹27 LPA
📍 Work Mode: Remote

We are hiring Senior Data Engineers with strong hands-on experience in SQL, Python, Spark, and cloud-based ETL tools such as AWS Glue, Lambda, Step Functions, Azure Data Factory, or Databricks.

✅ Key Responsibilities:
- Build and maintain automated data pipelines
- Ensure data integrity, validation, and transformation
- Work with large datasets and ensure data quality
- Collaborate with cross-functional teams

🎯 Mandatory Skills:
- SQL (Postgres/other RDBMS)
- Python for data processing
- Apache Spark
- Cloud ETL tools (AWS Glue, Lambda, Step Functions, Azure Data Factory, Databricks)

🌟 Nice to Have Skills:
- AWS / Azure Cloud
- GitHub, Jenkins
- Terraform

📩 To Apply, Please Share Your CV with the Following Details:
1) Personal Information: Full Name, Contact Number, Email ID
2) Job Preferences: Open to Contract Role (Yes/No), Open to Work from Home (Yes/No), Current Location, WFH Location (if different)
3) Experience Overview: Total Experience (Years), Cloud Experience (AWS/Azure – Glue, Lambda, Step Functions, ADF, Databricks), Spark (Years), Python for Data Processing (Years), SQL (Postgres/Other RDBMS), ETL/Data Pipelines (Automation & QA), Large Data Handling (Yes/No), Data Validation/Transformation (Yes/No)
4) Technical Skills Proficiency, rated Beginner / Intermediate / Expert for each: Python, Spark, SQL (Postgres), AWS (Glue, Lambda, Step Functions), Azure (ADF / Databricks), Terraform, GitHub (nice to have), Jenkins (nice to have)
5) Additional Skills / Certifications (if any)
6) Availability & Other Information: Current Notice Period, Last Working Day (if applicable), Reason for Job Change, Any Active Offers (Yes/No), Preferred Joining Timeline, Current CTC, Expected CTC

🕒 Interview Timing: Weekdays, between 11:00 AM and 1:00 PM
📨 Interested candidates, please send your resume along with the above details to: jobspinakin@gmail.com
📞 Mr. Pinak Upadhyay
📍 Pinakin Recruitment Consultancy (Since 2005)
📱 9033442745

Posted 1 week ago

Apply

10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Manager - MSM (Microsoft Sustainability Manager) Architect As an Architect on the GDS Consulting team within the Digital Engineering team, your primary responsibility will be to design and implement cutting-edge sustainability solutions for clients on a global scale. Your role involves leveraging your expertise to ensure these solutions align with industry best practices and deliver tangible value to clients. Your Key Responsibilities Oversees the design and deployment of the technical architecture, ensuring the appropriate expectations, principles, structures, tools, and responsibilities are in place to deliver excellence and risks are identified, managed, and mitigated. Analyse the chosen technologies against the implied target state and leverages good operational knowledge to identify technical and business gaps. Provides innovative and practical designs for the design and integration of new and existing solutions, which could include solutions for one or more functions of the enterprise, applying advanced technical capabilities. Collaborate with Service Lines, Sectors, Managed Services, Client Technology, Alliances and others to drive an integrated solution development and activation plan. Create sales and delivery collateral, online knowledge communities and support resources (e.g., client meeting decks, methods, delivery toolkits) with subject matter experts. Acts as an intermediary between the business / client community and the technical community, working with the business to understand and solve complex problems, presenting solutions and options in a simplified manner for clients / business. Microsoft Sustainability Manager configuration and customization: Analyse client needs and translate them into comprehensive MSM and Azure cloud solutions for managing emissions, waste, water, and other sustainability metrics. Configure and customize Microsoft Sustainability Manager to meet our specific data needs and reporting requirements. Develop automation routines and workflows for data ingestion, processing, and transformation. Integrate Sustainability Manager with other relevant data platforms and tools. Stay up to date on evolving ESG regulations, frameworks, and reporting standards. Power BI skills: Develop insightful dashboards and reports using Power BI to visualize and analyse key ESG metrics. Collaborate with stakeholders to identify data and reporting needs. Develop interactive reports and storytelling narratives to effectively communicate ESG performance. Designing and implementing data models: Lead the design and development of a robust data model to capture and integrate ESG data from various sources (internal systems, external datasets, etc.). Ensure the data model aligns with relevant ESG frameworks and reporting standards. Create clear documentation and maintain data lineage for transparency and traceability. Analyse and interpret large datasets relating to environmental, social, and governance performance. KPI (Key Performance Indicators) modelling and analysis: Define and develop relevant KPIs for tracking progress towards our ESG goals. 
Perform data analysis to identify trends, patterns, and insights related to ESG performance.
Provide data-driven recommendations for improving the ESG footprint and decision-making.

To qualify for the role, you must have
A bachelor's or master's degree.
A minimum of 10-14 years of experience, preferably with a background in a professional services firm.
3+ years of experience in data architecture or analytics, preferably in the sustainability or ESG domain.
Subject matter expertise in sustainability; relevant experience preferred (across any industry or competency).
Experience managing large, complex change management programs with multiple global stakeholders (required).
Strong knowledge of Power Platform (Core), Power Apps (Canvas & Model-driven), and Power Automate.
At least 6 years of relevant experience on Power Platform Core (Dataverse/CDS, Canvas apps, Model-driven apps, Power Portals/Power Pages) and Dynamics CRM / 365.
Strong, proven experience on Power Automate with an efficiency- and performance-driven solution approach.
Experience designing cloud-based solutions using Microsoft Azure technologies, including Azure Synapse, ADF, Azure Functions, etc.
Ability to effectively communicate with and manage diverse stakeholders across the business and enabling functions.
Prior experience in go-to-market efforts.
Strong understanding of data modelling concepts and methodologies.
Proven experience with Microsoft Azure and Power BI, including advanced functions and DAX scripting.
Excellent communication skills; consulting experience preferred.

Ideally, you will also have
Analytical ability to manage multiple projects and prioritize tasks into manageable work products.
Ability to operate independently or with minimum supervision.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
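As a loose illustration of the KPI modelling and trend analysis this role describes (the dataset, column names, and KPI definition below are hypothetical, not from the posting), a minimal pandas sketch computing an emissions-intensity KPI:

```python
import pandas as pd

# Hypothetical monthly ESG extract; in practice this would come from
# Microsoft Sustainability Manager / Dataverse rather than an inline frame.
df = pd.DataFrame({
    "month": pd.date_range("2024-01-01", periods=6, freq="MS"),
    "scope1_tco2e": [120.0, 115.0, 118.0, 110.0, 108.0, 105.0],
    "scope2_tco2e": [80.0, 78.0, 79.0, 75.0, 74.0, 72.0],
    "revenue_musd": [10.2, 9.8, 10.5, 10.1, 10.4, 10.7],
})

# Example KPI: emissions intensity = total emissions per $M of revenue.
df["total_tco2e"] = df["scope1_tco2e"] + df["scope2_tco2e"]
df["intensity"] = df["total_tco2e"] / df["revenue_musd"]

# Simple trend signal: month-over-month change in intensity.
df["intensity_mom_pct"] = df["intensity"].pct_change() * 100
print(df[["month", "intensity", "intensity_mom_pct"]])
```

In a real engagement the same measure would typically be defined once in the Power BI semantic model (as a DAX measure) so every report shares one definition.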

Posted 1 week ago

Apply

6.0 - 9.5 years

0 Lacs

Andhra Pradesh, India

On-site

Linkedin logo

At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.

Position Title: Full Stack Lead Developer
Experience: 6-9.5 Years

Job Overview
We are seeking a highly skilled and versatile polyglot Full Stack Developer with expertise in modern front-end and back-end technologies, cloud-based solutions, and AI/ML and Gen AI. The ideal candidate will have a strong foundation in full-stack development, cloud platforms (preferably Azure), and hands-on experience with Gen AI, AI, and machine learning technologies.

Key Responsibilities
Develop and maintain web applications using Angular/React.js, .NET, and Python.
Design, deploy, and optimize Azure-native PaaS and SaaS services, including but not limited to Function Apps, Service Bus, Storage Accounts, SQL Databases, Key Vault, ADF, Databricks, and REST APIs with OpenAPI specifications.
Implement security best practices for data in transit and at rest, and authentication best practices: SSO, OAuth 2.0, and Auth0.
Utilize Python for developing data processing and advanced AI/ML models using libraries like pandas, NumPy, and scikit-learn, and frameworks like LangChain, LlamaIndex, and the Azure OpenAI SDK.
Leverage agentic frameworks like Crew AI, AutoGen, etc.; be well versed in RAG and agentic architecture.
Apply strong design patterns: architectural, data, and object-oriented.
Leverage Azure serverless components to build highly scalable and efficient solutions.
Create, integrate, and manage workflows using Power Platform, including Power Automate, Power Pages, and SharePoint.
Apply expertise in machine learning, deep learning, and Generative AI to solve complex problems.

Primary Skills
Proficiency in React.js, .NET, and Python.
Strong knowledge of Azure Cloud Services, including serverless architectures and data security.
Experience with Python data analytics libraries: pandas, NumPy, scikit-learn, Matplotlib, Seaborn.
Experience with Python Generative AI frameworks: LangChain, LlamaIndex, Crew AI, AutoGen.
Familiarity with REST API design, Swagger documentation, and authentication best practices.

Secondary Skills
Experience with Power Platform tools such as Power Automate, Power Pages, and SharePoint integration.
Knowledge of Power BI for data visualization (preferred).

Preferred Knowledge Areas – Nice To Have
In-depth understanding of machine learning and deep learning, including supervised and unsupervised algorithms.

Qualifications
Bachelor's or master's degree in computer science, engineering, or a related field.
6-12 years of hands-on experience in full-stack development and cloud-based solutions.
Strong problem-solving skills and ability to design scalable, maintainable solutions.
Excellent communication and collaboration skills.
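The posting leans on RAG (retrieval-augmented generation). As a toy sketch of the retrieval step only, standing in scikit-learn TF-IDF similarity for the embedding and vector-store lookup a production LangChain or Azure OpenAI stack would use (the documents and query below are hypothetical):

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Tiny in-memory "knowledge base"; a real RAG pipeline would chunk
# documents and store embeddings in a vector database instead.
docs = [
    "Azure Function Apps run event-driven serverless code.",
    "Azure Service Bus provides reliable enterprise messaging.",
    "Azure Data Factory orchestrates ETL and data movement pipelines.",
]

vectorizer = TfidfVectorizer()
doc_vectors = vectorizer.fit_transform(docs)

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q_vec = vectorizer.transform([query])
    scores = cosine_similarity(q_vec, doc_vectors)[0]
    top = scores.argsort()[::-1][:k]
    return [docs[i] for i in top]

# The retrieved context would then be prepended to the LLM prompt.
print(retrieve("Which service orchestrates ETL pipelines?"))
```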

Posted 1 week ago

Apply

6.0 - 11.0 years

15 - 27 Lacs

Gurugram

Hybrid

Naukri logo

Hi, wishes from GSN! Pleasure connecting with you!

We have been in Corporate Search Services, identifying and bringing in stellar, talented professionals for our reputed IT and non-IT clients in India, and have been successfully meeting our clients' needs for the last 20 years. At present, GSN is hiring an SSIS Developer for one of our leading MNC clients. Please find the details below:

****** Looking for SHORT JOINERS ******

1. WORK LOCATION: Gurgaon
2. JOB ROLE: SSIS Developer
3. EXPERIENCE: 6+ Yrs
4. CTC RANGE: Rs. 15 to Rs. 27 LPA
5. WORK TYPE: WFO - HYBRID

Job Description:
Experience in MS SQL Server
Designing, creating, and maintaining databases
Creating stored procedures and functions
Hands-on writing of complex queries
Ability to debug SQL procedures
Ability to tune SQL Server performance
Understanding of indexes and partitions
Understanding of distributed database systems (Snowflake, Hyperscale)
Understanding of Azure Data Factory (ADF) concepts
SSIS

Note: Kindly go through the Google reviews at www.gsnhr.net. Feel free to contact us for any queries, and apply online for an IMMEDIATE response.

Thanks & Regards
Kaviya K
GSN CONSULTING
Mob: 9150016092
Email: kaviya@gsnhr.net
Web: www.gsnhr.net
Google Review: https://g.co/kgs/UAsF9W
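As a small, hedged example of the SQL Server work listed above (server, database, credentials, procedure name, and parameter are all hypothetical), calling a parameterized stored procedure from Python with pyodbc:

```python
import pyodbc

# Hypothetical connection details; substitute your own server and credentials.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myserver.example.com;DATABASE=SalesDB;"
    "UID=etl_user;PWD=secret"
)

cursor = conn.cursor()
# Parameterized call: injection-safe, and SQL Server can reuse the cached plan,
# which ties into the performance-tuning expectations above.
cursor.execute("EXEC dbo.usp_GetOrdersByRegion ?", "APAC")
for row in cursor.fetchall():
    print(row)

conn.close()
```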

Posted 1 week ago

Apply

4.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Summary

Position Summary
Senior Analyst – DMS Innovation Automation – Digital Data Analytics Innovation – Deloitte Support Services India Private Limited

Are you a quick learner, willing to work with new technologies? You have the chance to play an important role in the CoRe DDAI group through the Product Development team. This group's key areas of interest include unique solutions that enable the delivery of cutting-edge technology to the firm's business centers, along with their research, development, maintenance, and documentation.

Work you will do
Design, develop, and maintain end-to-end data pipelines using Azure Data Factory, Azure Data Lake, and related Azure data services.
Build and optimize ETL processes to ingest, transform, and load data from various sources (on-premises and cloud).
Collaborate with data architects, business analysts, and stakeholders to understand data requirements and deliver solutions.
Monitor, troubleshoot, and optimize data workflows for performance, reliability, and scalability.
Implement data quality and validation checks within data pipelines.
Document technical solutions, data flows, and process designs.
Ensure compliance with data security and privacy standards.
Mentor junior analysts and provide technical guidance as needed.
Demonstrate excellent skills in coding, testing, and debugging in SQL, SSIS, and ADF.
Upskill quickly in any newly recommended tools or technologies and deliver projects with them.
Apply quality standards and best practices throughout the project life cycle.
Perform code reviews for analyst and junior resources and provide necessary feedback.
Handle multiple and critical projects simultaneously.
Work in Agile software engineering practices, applying best practices across the full software development life cycle: coding standards, code reviews, source control, build processes, testing, and operations.
Develop solutions following established technical designs, application development standards, and quality processes.
Work independently, manage small engagements, or be part of larger engagements.
Communicate effectively with end users: gather requirements, contribute to design sessions, and support clients during UAT phases.
Continue improving knowledge of the MDM data model, global systems, and client attributes.
Maintain a sound understanding of all technology tools and their integration.
Understand technology trends in the industry and share them with team members.
Identify potential process gaps and collaborate closely with Assistant Managers to provision automated solutions or work on relevant POCs.

Required Education, Qualifications, and Experience
Required experience: B.Tech/M.Tech, MBA, lateral hire (4-6 years of experience).
Educational qualification: B.E/B.Tech or M.Tech.
Proficiency in one or more of the following technologies: DBMS concepts, with querying exposure to a relational database, preferably MS SQL Server; MS Azure SQL; SSIS; and Azure Data Factory.
Knowledge of a coding language like C#.NET or VB.NET is an added advantage.
Understanding of development methodology and lifecycle.
Excellent analytical and communication skills (written, verbal, and presentation).
Ability to work both independently and as part of a team with professionals at all levels.
Ability to prioritize tasks, work on multiple assignments, and raise concerns or questions where appropriate.
Seek information and ideas, and establish relationships with customers, to assess future opportunities.

Location: Hyderabad
Work hours: 2 p.m. – 11 p.m.

How You Will Grow
At Deloitte, we have invested a great deal to create a rich environment in which our professionals can grow. We want all our people to develop in their own way, playing to their own strengths as they hone their leadership skills. And, as a part of our efforts, we provide our professionals with a variety of learning and networking opportunities—including exposure to leaders, sponsors, coaches, and challenging assignments—to help accelerate their careers along the way. No two people learn in exactly the same way. So, we provide a range of resources, including live classrooms, team-based learning, and eLearning. Deloitte University (DU): The Leadership Center in India, our state-of-the-art, world-class learning center in the Hyderabad office, is an extension of the DU in Westlake, Texas, and represents a tangible symbol of our commitment to our people's growth and development. Explore DU: The Leadership Center in India.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Deloitte's culture
Our positive and supportive culture encourages our people to do their best work every day. We celebrate individuals by recognizing their uniqueness and offering them the flexibility to make daily choices that can help them to be healthy, centered, confident, and aware. We offer well-being programs and are continuously looking for new ways to maintain a culture that is inclusive, invites authenticity, leverages our diversity, and where our people excel and lead healthy, happy lives. Learn more about Life at Deloitte.

Corporate citizenship
Deloitte is led by a purpose: to make an impact that matters. This purpose defines who we are and extends to relationships with our clients, our people, and our communities. We believe that business has the power to inspire and transform. We focus on education, giving, skill-based volunteerism, and leadership to help drive positive social impact in our communities. Learn more about Deloitte's impact on the world.

Disclaimer: Please note that this description is subject to change based on business/project requirements and at the discretion of management.

#EAG-Core

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills.
Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment, and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 305195
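The data-quality and validation checks listed under "Work you will do" above often reduce to simple rule assertions run inside the pipeline before the write step. A minimal pandas sketch (the table and rules are hypothetical, not from the posting):

```python
import pandas as pd

def validate(df: pd.DataFrame) -> list[str]:
    """Run basic data-quality rules and return a list of failures."""
    failures = []
    if df["order_id"].isna().any():
        failures.append("order_id contains nulls")
    if df["order_id"].duplicated().any():
        failures.append("order_id contains duplicates")
    if (df["amount"] < 0).any():
        failures.append("amount contains negative values")
    return failures

# Hypothetical batch loaded by the pipeline before the write step.
batch = pd.DataFrame({"order_id": [1, 2, 2], "amount": [10.0, -5.0, 7.5]})
problems = validate(batch)
if problems:
    # In ADF/SSIS this would typically fail the activity or route bad rows
    # to a quarantine table rather than just printing.
    print("Validation failed:", problems)
```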

Posted 1 week ago

Apply

7.0 years

0 Lacs

India

On-site

Linkedin logo

Key Responsibilities:
Design end-to-end cloud architectures aligned with business objectives and technology strategy.
Lead the migration of applications and infrastructure to Azure or hybrid cloud environments.
Create detailed High-Level Designs (HLD) and Low-Level Designs (LLD).
Develop and implement landing zones, networking, security, and identity architectures.
Ensure compliance, governance, and cost optimization across all cloud deployments.
Collaborate with DevOps teams on CI/CD pipelines, IaC, monitoring, and automation.
Conduct technical assessments, architecture reviews, and POCs for new solutions.
Stay updated with the latest Azure and cloud services; act as a technical evangelist within the organization.

Required Skills & Experience:
7+ years of experience in solution architecture or cloud architecture roles.
Deep hands-on experience in Microsoft Azure: Azure VMs, App Services, Azure SQL, ADF, Azure AD, AKS, Bicep/Terraform.
Strong grasp of security best practices: identity management, RBAC, NSGs, private endpoints.
Solid understanding of networking: VPN, ExpressRoute, Azure Firewall, VNet peering.
Familiarity with DevOps practices: Git, Azure DevOps, CI/CD, Docker, Kubernetes.
Experience in application modernization and cloud migration strategies.
Proficiency in writing Infrastructure as Code (IaC) with ARM, Bicep, or Terraform.
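As a hedged, IaC-adjacent illustration, in Python rather than the Bicep/Terraform the role names (the subscription ID, group name, and tags are hypothetical, and the call shapes should be checked against current azure-mgmt-resource docs), ensuring a tagged resource group exists:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.resource import ResourceManagementClient

# Hypothetical subscription; DefaultAzureCredential picks up environment
# variables, managed identity, or an az-cli login.
subscription_id = "00000000-0000-0000-0000-000000000000"
client = ResourceManagementClient(DefaultAzureCredential(), subscription_id)

# Create-or-update semantics: declaratively ensure the group exists with
# governance tags, the same idea Bicep/Terraform apply at larger scale.
rg = client.resource_groups.create_or_update(
    "rg-landingzone-dev",
    {"location": "eastus", "tags": {"env": "dev", "owner": "platform-team"}},
)
print(rg.name, rg.location)
```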

Posted 1 week ago

Apply

5.0 years

0 Lacs

India

On-site

Linkedin logo

Requirements
A bachelor's degree in computer science or equivalent.
5+ years of experience as a Data Engineer.
Experience with Azure big data tools: Azure Databricks, Azure Synapse, HDInsight, ADLS.
Experience with relational SQL and NoSQL databases.
Excellent problem-solving and analytic skills for working on structured and unstructured datasets using Azure big data tools.
Experience with data pipeline and workflow management tools: ADF and Logic Apps.
Experience with Azure cloud services: VMs, Azure Databricks, Azure SQL DB, Azure Synapse.
Experience with stream-processing systems (e.g., Azure Stream Analytics).
Experience with Python scripting and strong knowledge of developing Python notebooks.

Mandatory Skills: Azure Data Engineering, PySpark, Python, Azure Databricks, Azure Data Factory, SQL.
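A minimal PySpark sketch of the notebook-style pipeline work this posting implies (paths and columns are hypothetical; on Databricks the SparkSession is preconfigured, but building one keeps the sketch self-contained):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

# Hypothetical raw-zone path (ADLS in production, a local dir here).
raw = spark.read.option("header", True).csv("/tmp/raw/orders")

# Typical cleanup: typed columns, dedup, derived partition column.
clean = (
    raw.withColumn("amount", F.col("amount").cast("double"))
       .dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_ts"))
)

# Curated-zone write, partitioned for downstream query pruning.
clean.write.mode("overwrite").partitionBy("order_date").parquet("/tmp/curated/orders")
```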

Posted 1 week ago

Apply

10.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Linkedin logo

Job Description

Job Title: Lead Technical Architect (Strategy & Optimization, Data Lake & Analytics) (Total positions: 1)
Experience: 10+ years
Location: Onsite/client-facing (Noida)
Reports To: Client stakeholders / Delivery Head
Budget: Max. 25 LPA

Responsibilities:
Manage project delivery: scope, timelines, budget, resource allocation, and risk mitigation.
Develop and maintain robust data ingestion pipelines (batch, streaming, API).
Provide architectural inputs during incident escalations and act as the final authority for RCA documentation and closure across ADF, Power BI, and Databricks, including initial data collection for RCA.
Define and enforce data governance, metadata, and quality standards across zones.
Monitor performance, optimize data formats (e.g., Parquet), and tune for cost-efficiency.
Tune query performance for Databricks and Power BI datasets using optimization techniques (e.g., caching, BI Engine, materialized views).
Lead and mentor a team of data engineers, fostering skills in Azure services and DevOps.
Guide schema designs for new datasets and integrations aligned with Diageo's analytics strategy.
Coordinate cross-functional stakeholders (security, DevOps, business) for aligned execution.
Oversee incident and change management with SLA adherence and continuous improvement.
Serve as the governance owner for SLA compliance, IAM policies, encryption standards, and data retention strategies.
Ensure compliance with policies (RBAC, ACLs, encryption) and regulatory audits.
Report project status, KPIs, and business value to senior leadership; lead monthly and quarterly reviews, presenting insights, improvements, and roadmap alignment to Diageo stakeholders.

Required Skills
Strong architecture-level expertise in the Azure data platform (ADLS, ADF, Databricks, Synapse, Power BI).
Deep understanding of data lake zone structuring, data lineage, metadata governance, and compliance (e.g., GDPR, ISO).
Expert in Spark, PySpark, SQL, JSON, and automation tooling (ARM, Bicep; Terraform optional).
Capable of aligning technical designs with business KPIs and change control frameworks.
Excellent stakeholder communication, team mentoring, and leadership capabilities.
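One of the cheaper query-performance levers named above is caching hot datasets between repeated aggregations. A small PySpark illustration (the path, filter, and columns are hypothetical):

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("tuning-demo").getOrCreate()

# Hypothetical curated dataset read from the lake.
sales = spark.read.parquet("/tmp/curated/orders")

# Cache a frequently re-queried subset so repeated dashboard-style
# aggregations hit memory instead of re-scanning Parquet files.
recent = sales.filter("order_date >= '2025-01-01'").cache()

# The first action materializes the cache; later ones reuse it.
print(recent.count())
recent.groupBy("order_date").sum("amount").show()
```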

Posted 1 week ago

Apply

14.0 years

0 Lacs

New Delhi, Delhi, India

On-site

Linkedin logo

Company Description
AWC Software is a 14+ year-old IT services provider specializing in Oracle E-Business Suite, OBIEE, Microsoft SharePoint, SAP, SFDC, Fusion Middleware, and ADF & BPM tools. With a team of around 600 employees, our Offshore Development Center in Noida supports enterprise clients and system integrators in India and abroad. We are an ISO 9001:2008-certified company, an Oracle Gold Partner, and a Microsoft Silver Partner, recognized as one of the World's Top 20 Most Promising Oracle Service Providers.

Role Description
This is a full-time, on-site role for a SAP FICO Consultant located in New Delhi. The consultant will be responsible for implementing and supporting SAP FICO solutions, integrating processes and data, configuring systems, and providing training and support to end users. They will also collaborate with cross-functional teams to design and optimize financial processes within SAP FICO.

Qualifications
Expertise in SAP FICO implementation and support
Knowledge of financial processes and accounting principles
Experience in configuring systems and integrating data
Ability to design and optimize financial processes within SAP FICO
Strong problem-solving and analytical skills
Excellent communication and interpersonal skills
Bachelor's degree in Finance, Accounting, Computer Science, or a related field
SAP certifications in FICO are a plus

Posted 1 week ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Indeed logo

Bangalore, Karnataka, India
Job ID: 768618

Join our Team

About this opportunity:
Data and Analytics is a function with the aim to craft and execute our Analytics and AI strategy and vision, and to ensure multi-functional execution for Ericsson to transform into an intelligent enterprise. AI and data are a key theme for Ericsson to become data driven, and we continue to build up elite competence in this area to increase performance, deliver intelligence, and ensure secure and scalable development, deployment, and continued operations of Analytics and AI solutions.

What you will do:
Facilitate requirement analysis with stakeholders for data collection, analytics, and machine learning requirements.
Work with the IT Digital Product Owner to ensure understanding of business requirements in data management for the respective functional area.
Identify and propose new opportunities for data services.
Define characteristics for the desired scale, redundancy, distribution, protection, etc. of different types of data services.
Work with developers and other FA architects on cross-functional data requirements.
Support technical lead scrum teams during development.
Define and design appropriate interfaces for data updates, retrieval queries, and recommended workflows based on the specific data use case.

The skills you bring:
Strong prior experience in Data Warehousing and BI, irrespective of technology.
Prior experience as an architect leading a delivery team.
Strong hands-on prior experience as a BI/ETL developer.

Education / experience required (in years):
We are looking for a candidate with 10+ years of experience as a Data Architect, Solution Architect, or Data Engineer who holds a graduate degree (B.E/B.Tech/M.Tech) in Computer Science and Information Systems or another quantitative field.

Software/tools — the following experience and skills are required:
Strong understanding of and experience with Snowflake architecture.
Strong data modelling experience.
Experience with batch and stream processing (for instance, Apache Spark or Storm) and relevant back-end languages such as Python, PySpark, and SQL.
Software development including Agile/Scrum, CI/CD, testing and configuration management, Git/Gerrit.
BI tools such as Tableau and Power BI, as well as web-based dashboards and solutions.

Any of the experiences and skills listed below is a merit; the more, the better:
Cloud-agnostic data platforms such as Snowflake, Databricks, and SAP Data Warehouse Cloud.
Ability to create designs and data models independently using Data Vault 2.0; big data technologies such as Hadoop, Hive, Pig, or MapR; or enterprise data warehousing initiatives.
Cloud-native data platforms (Azure Synapse Analytics, Amazon Redshift).
Azure services: Azure Data Factory (ADF), Azure Databricks (ADB).
Designing data products and publishing data using a data marketplace platform.
Authorization methods, including role-based access control and policy-based access control.
Data mesh implementation and a federated model for data product development.
Relational data modelling with 3NF.
Modern data architectures, including cloud native, microservices architecture, virtualization, Kubernetes, and containerization.
Messaging (for instance, Apache Kafka).
NoSQL DBs, like Cassandra and MongoDB.
RDBMSs such as Oracle, MS SQL, MariaDB, or PostgreSQL, both row-based and columnar.
Experience with ETL tools (Talend, Informatica, BODS, etc.).

Why join Ericsson?
At Ericsson, you'll have an outstanding opportunity: the chance to use your skills and imagination to push the boundaries of what's possible.
To build solutions never seen before to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.

What happens once you apply?
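As a toy illustration of the Spark-plus-Kafka stream processing named in the skills list above (the broker address and topic are hypothetical, and the Spark Kafka connector package must be on the classpath):

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("kafka-stream-demo").getOrCreate()

# Subscribe to a hypothetical topic; Kafka values arrive as raw bytes.
events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker1:9092")
         .option("subscribe", "telemetry")
         .load()
)

# Decode and count events per 1-minute window as a simple aggregation.
counts = (
    events.selectExpr("CAST(value AS STRING) AS value", "timestamp")
          .groupBy(F.window("timestamp", "1 minute"))
          .count()
)

# Console sink for demonstration; production would write to a lake or DB.
query = counts.writeStream.outputMode("complete").format("console").start()
query.awaitTermination()
```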

Posted 1 week ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies