
1301 ADF Jobs - Page 41

Set up a job alert
JobPe aggregates results for easy application access, but you apply directly on the employer's job portal.

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

About The Role
Gartner is looking for a well-rounded and motivated developer to join its Conferences Technology & Insight Analytics team, which is responsible for developing the reporting and analytics that support its Conference reporting operations.

What You Will Do
- Collaborate with business stakeholders to design and build advanced analytic solutions for the Gartner Conference Technology business.
- Execute the data strategy through design and development of data platforms that deliver Reporting, BI and Advanced Analytics solutions.
- Design and develop key analytics capabilities using MS SQL Server, Azure SQL Managed Instance, T-SQL and ADF on the Azure platform.
- Consistently improve and optimize T-SQL performance across the entire analytics platform.
- Create, build, and implement comprehensive data integration solutions using Azure Data Factory.
- Analyse and solve complex business problems, breaking the work down into actionable tasks.
- Develop, maintain, and document the data dictionary and data flow diagrams.
- Build and enhance the regression test suite that monitors nightly ETL jobs and identifies data issues.
- Work alongside project managers and cross-functional teams in a fast-paced Agile/Scrum environment.
- Build optimized solutions and designs to handle big data.
- Follow coding standards; build appropriate unit tests, integration tests and deployment scripts; and review project artifacts created by peers.
- Contribute to overall growth by suggesting improvements to the existing software architecture or introducing new technologies.

What You Will Need
5-6 years of experience as a Data Engineer. The candidate should have strong qualitative and quantitative problem-solving abilities and is expected to take ownership and accountability.
Must Have
- Strong experience with SQL, including diagnosing and resolving load failures, constructing hierarchical queries, and efficiently analysing existing SQL code to identify and resolve issues, using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance.
- Strong technical experience with database performance tuning, troubleshooting and query optimization.
- Technical experience with Azure Data Factory on the Azure platform: creating and managing complex ETL pipelines to extract, transform, and load data from various sources.
- Ability to enhance data workflows to improve performance, scalability, and cost-effectiveness.
- Experience with cloud platforms and Azure technologies such as Azure Analysis Services, Azure Blob Storage, Azure Data Lake, Azure Delta Lake, etc.
- Experience with data modelling, database design, and data warehousing and Data Lake concepts.
- Thorough documentation of data processes, configurations, and operational procedures.

Who You Are
- Graduate/post-graduate in BE/BTech, ME/MTech or MCA preferred.
- Excellent communication and prioritization skills.
- Able to work independently or within a team, proactively, in a fast-paced Agile/Scrum environment.
- Strong desire to improve your skills in software development, frameworks, and technologies.

Who are we?
At Gartner, Inc. (NYSE: IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 20,000 associates globally who support ~15,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.
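The "hierarchical queries" requirement above usually means recursive queries over parent-child data (in T-SQL, a recursive CTE). As an illustration only, not part of the posting, here is the same computation in plain Python: given (id, parent_id) rows, derive each node's depth in the hierarchy. All names and data here are hypothetical.

```python
# Minimal sketch of what a hierarchical (recursive) query computes,
# expressed in plain Python over an adjacency list.
from collections import defaultdict

rows = [  # (id, parent_id); parent_id None marks the root
    (1, None), (2, 1), (3, 1), (4, 2), (5, 2), (6, 3),
]

def hierarchy_depths(rows):
    children = defaultdict(list)
    roots = []
    for node, parent in rows:
        if parent is None:
            roots.append(node)
        else:
            children[parent].append(node)
    depths = {}
    stack = [(r, 0) for r in roots]   # seeded like the anchor member of a CTE
    while stack:
        node, depth = stack.pop()
        depths[node] = depth
        for child in children[node]:  # recursive member: join children, depth + 1
            stack.append((child, depth + 1))
    return depths

print(sorted(hierarchy_depths(rows).items()))
# → [(1, 0), (2, 1), (3, 1), (4, 2), (5, 2), (6, 2)]
```

The stack-based walk mirrors how the database expands the recursive member of a CTE until no new rows are produced.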
What makes Gartner a great place to work?
Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.

What do we offer?
Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.

The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status, and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities.
If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.

Job Requisition ID: 98336

By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy

For efficient navigation through the application, please use only the back button within the application, not the back arrow in your browser.

Posted 1 month ago

Apply

10.0 years

0 Lacs

Gurgaon, Haryana, India

Remote

We are looking for a skilled, cloud-native-oriented Data Platform Engineer to join our Core Infrastructure organization. You will be instrumental in evolving our database services from traditional IaaS to fully managed, cloud-native, automated platforms. You will design, build, and maintain robust data infrastructure using modern cloud services, with a strong focus on automation, performance, reliability, and security.

Responsibilities
- Platform Modernization: Exhibit deep knowledge of migration paths from IaaS (e.g., SQL Server on EC2) to PaaS platforms, with a proven ability to modernize database deployments while maintaining operational continuity. Influence modernization decisions by recommending PaaS offerings (e.g., RDS, Managed Instance) over traditional IaaS setups, and evaluate serverless options, NoSQL databases, or other cloud-native technologies based on workload and scalability needs.
- SQL Query Optimization: Review and optimize SQL queries across AWS RDS, Azure SQL Managed Instance, and SQL Server on EC2 to enhance database performance and efficiency, identifying and resolving bottlenecks to improve query execution times.
- Database Migrations and Upgrades: Plan and execute migrations from on-premises or IaaS-based SQL Server on EC2 to PaaS solutions such as AWS RDS and Azure SQL Managed Instance, ensuring seamless transitions with minimal downtime and full data integrity, in coordination with development and operations teams.
- Performance Monitoring and Tuning: Monitor and tune database performance using platform-native tools (e.g., Azure Monitor for Managed Instance, AWS CloudWatch for RDS), proactively optimizing resource utilization and scalability across cloud platforms while maintaining efficiency for SQL Server on EC2.
- High Availability and Security: Manage and optimize Azure SQL Managed Instance and AWS RDS for high availability, including multi-subnet Always On configurations, while ensuring secure operations with role-based access control (RBAC), encryption, and AWS KMS integration across all platforms, including SQL Server on EC2.
- Retention and Compliance: Implement database and application server retention policies aligned with cloud-native best practices, ensuring compliance across RDS, Managed Instance, and EC2-hosted SQL Server.
- On-Call Support: Participate in on-call rotations, including weekends, holidays, and after-business hours, to ensure 24/7 availability and rapid incident response.

Job Requirements
- 10+ years of experience as a Senior SQL Database Administrator (DBA) or Data Platform Engineer, with at least 3 years of focused expertise in cloud-native database environments, including Microsoft SQL Server (2016 to 2019), Azure SQL Database, and AWS RDS/Aurora.
- Proven hands-on experience with managed database services such as AWS RDS, AWS Aurora, and Azure SQL Database, including configuration, optimization, and scaling in production environments.
- Demonstrated expertise in migration paths from on-premises/IaaS to PaaS, with a track record of successfully transitioning SQL Server databases to cloud-native platforms like Azure SQL Database or AWS Aurora.
- Strong working knowledge of SQL query optimization in cloud environments, including tuning queries, transactions, and stored procedures to enhance performance across managed database services.
- Expertise in cloud database architecture, including indexing, partitioning, storage, and caching mechanisms optimized for PaaS platforms.
- Comprehensive understanding of cloud-native database performance, with advanced skills in query tuning, troubleshooting, and resolving bottlenecks in Azure SQL and AWS RDS environments.
- Knowledge of ETL/ELT processes, data lakes, and data warehousing concepts, leveraging cloud tools to support data engineering and analytics.
- Proficiency in Python or SQL scripting for database automation, optimization, and integration with cloud-native workflows.
- Familiarity with additional cloud database technologies (e.g., MySQL, PostgreSQL, or NoSQL databases on AWS/Azure) and a willingness to adapt to emerging cloud database solutions.
- Strong understanding of cloud security standards, including RBAC, encryption, and integration with AWS KMS or Azure Key Vault for secure database management.
- Experience with database migration tools, version control systems, and automated deployment pipelines to streamline cloud database operations.
- Excellent problem-solving skills and attention to detail, with a proactive approach to managing cloud-native database challenges.
- Strong communication and collaboration skills, enabling effective teamwork with developers, cloud architects, and operations teams in a cloud-first environment.
- Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.

Preferred Qualifications
- Certification in database administration (e.g., Microsoft Certified Database Administrator).
- Hands-on experience with Terraform, Infrastructure as Code (IaC), and automation tools (Ansible, Python or PowerShell).
- Familiarity with the data engineering stack (ADLS, ADF, Azure Databricks) and Python libraries.

Company Description
What we offer:
- A truly global working environment with a strong focus on diversity and inclusion that enables you to work with people of different cultures from all around the world.
- An excellent work-life balance, including flexible working times and remote working options wherever possible.
- A unique SoftwareOne working culture where we celebrate our success by “Work Hard & Party Harder”, which includes company parties, social events and team gatherings.
- A focus on people development with an extensive internal learning platform, including webinars, trainings, live learning, audiobooks, certification preparation, and more.
- Corporate benefits such as shopping discounts, a company pension plan, an employee share purchase program, etc., as per local country policies and guidelines.

To know more about us, click here: https://youtu.be/820Si9eFHT0?si=MBwrsRPRn0w9uI20
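The SQL query optimization work this role describes (finding and removing bottlenecks) follows the same pattern on any engine: read the execution plan, add or adjust an index, and confirm the plan changed. A minimal sketch using SQLite from the Python standard library; the orders table, its columns, and the index name are hypothetical, and on RDS or Managed Instance you would inspect the platform's execution plan instead.

```python
# Illustrative only: verify that an index turns a full table scan
# into an index search, using SQLite's EXPLAIN QUERY PLAN.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
con.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 100, i * 1.5) for i in range(10_000)])

def plan(sql):
    """Return the detail column of each EXPLAIN QUERY PLAN row."""
    return [row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql)]

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"
print(plan(query))   # before the index: a full scan of orders
con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(plan(query))   # after: a search using idx_orders_customer
```

The same before/after comparison, done against realistic data volumes, is how the "identifying and resolving bottlenecks" responsibility is usually evidenced.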

Posted 1 month ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Associate

Job Description & Summary
At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities
We are seeking a developer to design, develop, and maintain data ingestion processes for a data platform built on Microsoft technologies, ensuring data quality and integrity. The role involves collaborating with data architects and business analysts to implement solutions using tools such as ADF and Azure Databricks, and requires strong SQL skills. Key responsibilities include developing, testing, and optimizing ETL workflows and maintaining documentation. A B.Tech degree and 5+ years of ETL development experience in the Microsoft data track are required, along with demonstrated expertise in Agile methodologies, including Scrum, Kanban, or SAFe.

Mandatory Skill Sets: ETL Development
Preferred Skill Sets: Microsoft Stack
Years of Experience Required: 4+
Education Qualification: B.Tech/B.E./MCA
Degrees/Field of Study Required: Bachelor of Engineering
Required Skills: ETL Development
Optional Skills: Microsoft Technology Stack
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No

Posted 1 month ago

Apply

4.0 years

0 Lacs

India

Remote

Client: UK-based client
Availability: 8 hours per day
Shift: 11 AM IST to 8 PM IST
Experience: 7+ years
Mode: WFH (freelancing)

If you're interested, kindly share your CV to thara.dhanaraj@excelenciaconsulting.com or call 7358452333. A minimum of 4+ years of experience is required.

- Hands-on technical experience in Fusion HCM applications: Core HR, Absences, Payroll, OTL, Compensation, Benefits, Talent modules, and Fast Formulas.
- Hands-on Fusion HCM techno experience with data migration and integrations using FBL, HCM Data Loader, Spreadsheet Loader, and ADF-DI tools.
- Design and development of reports, extracts and interfaces using HCM Extract, OTBI and BIP tools.
- Strong in preparing Oracle SQL queries and finding the relevant tables.

If you are interested, please share your updated CV. Looking forward to hearing from you.

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Name: Senior Data Engineer - Azure
Years of Experience: 5

Job Description
We are looking for a skilled and experienced Senior Azure Developer to join our team! As part of the team, you will be involved in the implementation of ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: ADF, Databricks
Secondary Skills: DBT, Python, Databricks, Airflow, Fivetran, Glue, Snowflake

Role Description
This data engineering role requires creating and managing the technological infrastructure of a data platform: architecting, building, and managing data flows and pipelines; constructing data storages (NoSQL, SQL); and working with big data tools (Hadoop, Kafka) and integration tools that connect sources or other databases.

Role Responsibilities
- Translate functional specifications and change requests into technical specifications
- Translate business requirement documents, functional specifications, and technical specifications into code
- Develop efficient code with unit testing and code documentation
- Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving
- Set up the development environment and configure the development tools
- Communicate project status to all project stakeholders
- Manage, monitor, and ensure the security and privacy of data to satisfy business needs
- Contribute to the automation of modules wherever required
- Be proficient in written, verbal and presentation communication (English)
- Coordinate with the UAT team

Role Requirements
- Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
- Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
- Knowledgeable in Shell/PowerShell scripting
- Knowledgeable in relational databases, non-relational databases, data streams, and file stores
- Knowledgeable in performance tuning and optimization
- Experience in data profiling and data validation
- Experience in requirements gathering, documentation processes, and unit testing
- Understanding and implementing QA and various testing processes in the project
- Knowledge of any BI tool is an added advantage
- Sound aptitude, outstanding logical reasoning, and analytical skills
- Willingness to learn and take initiative
- Ability to adapt to a fast-paced Agile environment

Additional Requirements
- Demonstrated expertise as a Data Engineer specializing in Azure cloud services
- Highly skilled in Azure Data Factory, Azure Data Lake, Azure Databricks, and Azure Synapse Analytics
- Create and execute efficient, scalable, and dependable data pipelines using Azure Data Factory
- Use Azure Databricks for data transformation and processing
- Effectively oversee and enhance data storage solutions, with an emphasis on Azure Data Lake and other Azure storage services
- Construct and maintain workflows for data orchestration and scheduling using Azure Data Factory or equivalent tools
- Proficient in programming languages such as Python and SQL, and conversant with pertinent scripting languages
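Among the warehouse concepts this role requires is the slowly changing dimension. As a minimal sketch, not taken from the posting and with hypothetical column names, a Type 2 SCD preserves history by expiring the current row and inserting a new one whenever a tracked attribute changes:

```python
# Type 2 slowly changing dimension sketch, using plain Python dicts
# in place of a warehouse table.
from datetime import date

def scd2_upsert(dim_rows, key, new_attrs, today):
    """Expire the current row for `key` if its attributes changed,
    then insert a fresh current row."""
    current = [r for r in dim_rows if r["key"] == key and r["is_current"]]
    if current:
        row = current[0]
        if {k: row[k] for k in new_attrs} == new_attrs:
            return dim_rows                  # no change: nothing to do
        row["is_current"] = False            # expire the old version
        row["end_date"] = today
    dim_rows.append({"key": key, **new_attrs,
                     "start_date": today, "end_date": None,
                     "is_current": True})
    return dim_rows

dim = []
dim = scd2_upsert(dim, 101, {"city": "Chennai"}, date(2024, 1, 1))
dim = scd2_upsert(dim, 101, {"city": "Gurgaon"}, date(2024, 6, 1))
print([(r["city"], r["is_current"]) for r in dim])
# → [('Chennai', False), ('Gurgaon', True)]
```

In a warehouse the same logic is typically a MERGE plus an INSERT; the point is that old versions are closed out, never overwritten.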

Posted 1 month ago

Apply


3.0 - 7.0 years

10 - 20 Lacs

Noida, Gurugram, Delhi / NCR

Hybrid

Job Title: Senior Data Engineer
Salary: 8 to 24 LPA
Experience: 3 to 7 years
Location: Gurgaon (Hybrid)
Notice: Immediate to 30 days

Job Summary
We are looking for an experienced Senior Data Engineer with 5+ years of hands-on experience in cloud data engineering platforms, specifically AWS, Databricks, and Azure. The ideal candidate will play a critical role in designing, building, and maintaining scalable data pipelines and infrastructure to support our analytics and business intelligence initiatives.

Key Responsibilities
- Design, develop, and optimize scalable data pipelines using AWS services (e.g., S3, Glue, Redshift, Lambda).
- Build and maintain ETL/ELT workflows leveraging Databricks and Apache Spark for processing large datasets.
- Work extensively with Azure data services such as Azure Data Lake, Azure Synapse, Azure Data Factory, and Azure Databricks.
- Collaborate with data scientists, analysts, and stakeholders to understand data requirements and deliver high-quality data solutions.
- Ensure data quality, reliability, and security across multiple cloud platforms.
- Monitor and troubleshoot data pipelines, implement performance tuning, and optimize resource usage.
- Implement best practices for data governance, metadata management, and documentation.
- Stay current with emerging cloud data technologies and industry trends to recommend improvements.

Required Qualifications
- 5+ years of experience in data engineering with strong expertise in AWS, Databricks, and Azure cloud platforms.
- Hands-on experience with big data processing frameworks, particularly Apache Spark.
- Proficiency in building complex ETL/ELT pipelines and managing data workflows.
- Strong programming skills in Python, Scala, or Java.
- Experience working with structured and unstructured data in cloud storage solutions.
- Knowledge of SQL and experience with relational and NoSQL databases.
- Familiarity with CI/CD pipelines and DevOps practices in cloud environments.
- Strong analytical and problem-solving skills, with the ability to work independently and in teams.

Preferred Skills
- Experience with containerization and orchestration tools (Docker, Kubernetes).
- Familiarity with machine learning pipelines and tools.
- Knowledge of data modeling, data warehousing, and analytics architecture.
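The ETL/ELT responsibilities above reduce to the same extract-transform-load shape regardless of platform. A toy sketch in plain Python, where the file format, field names, and cleansing rule are all hypothetical; a production pipeline would express the same stages as Glue jobs, Databricks notebooks, or ADF activities:

```python
# Toy extract-transform-load run: parse raw records, drop invalid
# rows, cast fields, and "load" into an output list.

def extract(raw_lines):
    """Parse CSV-ish lines into dicts (extract stage)."""
    rows = []
    for line in raw_lines:
        order_id, amount = line.split(",")
        rows.append({"order_id": order_id.strip(), "amount": amount.strip()})
    return rows

def transform(rows):
    """Drop rows with unparseable amounts, cast the rest (transform stage)."""
    clean = []
    for r in rows:
        try:
            clean.append({"order_id": r["order_id"], "amount": float(r["amount"])})
        except ValueError:
            continue   # a real pipeline would quarantine, not silently skip
    return clean

def load(rows, sink):
    """Append validated rows to the sink (load stage); return the count."""
    sink.extend(rows)
    return len(rows)

raw = ["A-1, 19.99", "A-2, oops", "A-3, 5.00"]
sink = []
loaded = load(transform(extract(raw)), sink)
print(loaded, round(sum(r["amount"] for r in sink), 2))  # → 2 24.99
```

Keeping the three stages as separate functions is what makes pipelines monitorable and testable, which is the "data quality and troubleshooting" half of this job description.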

Posted 1 month ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
Analyze, design, develop, troubleshoot and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications.

As a member of the software engineering division, you will perform high-level design based on provided external specifications; specify, design and implement minor changes to existing software architecture; build highly complex enhancements; and resolve complex bugs. You will build and execute unit tests and unit plans, review integration and regression test plans created by QA, and communicate with QA and porting engineering as necessary to discuss minor changes to product functionality and to ensure quality and consistency across specific products. Duties and tasks are varied and complex, needing independent judgment. You are fully competent in your own area of expertise and may have a project lead role and/or supervise lower-level personnel.

BS or MS degree or equivalent experience relevant to the functional area, plus 4 years of software engineering or related experience. Career Level: IC3.

Responsibilities
The Fusion development team works on the design, development and maintenance of the Fusion Global HR, Talent, Configuration Workbench and Compensation product areas.
Qualifications
- Bachelor’s or Master’s degree (B.E./B.Tech./MCA/M.Tech./M.S.) from a reputed university.
- 1-8 years of experience in applications or product development.
- Career Level: IC3

Mandatory Skills
- Strong knowledge of object-oriented programming concepts.
- Product design and development experience in Java/J2EE technologies (JSP/Servlet) OR database fundamentals, SQL, PL/SQL.

Optional Skills
- Development experience on the Fusion Middleware platform.
- Familiarity with ADF and exposure to development in the cloud.
- Development experience in Oracle Applications / HCM functionality.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 month ago

Apply

5.0 - 10.0 years

11 - 21 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office

Skills: Azure Data Factory, Databricks, Scala. Location: PAN India. Experience: 5-12 years.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Rajkot, Gujarat, India

On-site

Job Description
Analyze, design, develop, troubleshoot and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications.

As a member of the software engineering division, you will perform high-level design based on provided external specifications; specify, design and implement minor changes to existing software architecture; build highly complex enhancements; and resolve complex bugs. You will build and execute unit tests and unit plans, review integration and regression test plans created by QA, and communicate with QA and porting engineering as necessary to discuss minor changes to product functionality and to ensure quality and consistency across specific products. Duties and tasks are varied and complex, needing independent judgment. You are fully competent in your own area of expertise and may have a project lead role and/or supervise lower-level personnel.

BS or MS degree or equivalent experience relevant to the functional area, plus 4 years of software engineering or related experience. Career Level: IC3.

Responsibilities
The Fusion development team works on the design, development and maintenance of the Fusion Global HR, Talent, Configuration Workbench and Compensation product areas.
Bachelor's or Master's degree (B.E./B.Tech./MCA/M.Tech./M.S.) from reputed universities. 1-8 years of experience in applications or product development. Mandatory Skills: strong knowledge of object-oriented programming concepts; product design and development experience in [Java / J2EE technologies (JSP/Servlet)] OR [database fundamentals, SQL, PL/SQL]. Optional Skills: development experience on the Fusion Middleware platform; familiarity with ADF and exposure to development in the cloud; development experience in Oracle Applications / HCM functionality. Qualifications Career Level - IC3 About Us As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veteran status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Oracle Management Level Senior Associate Job Description & Summary Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. However, ensuring a streamlined end-to-end Oracle Fusion technical landscape that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models, and best-in-class practices to deliver technology-enabled transformation to our clients. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities: Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer. Completed at least 2 full Oracle Cloud (Fusion) implementations. Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion). Extensively worked on BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC). Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC). Preferred skill sets: database structure for ERP/Oracle Cloud (Fusion). Years of experience required: minimum 4 years of Oracle Fusion experience. Educational Qualification: BE/BTech, MBA. Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Chartered Accountant Diploma, Master of Business Administration, Bachelor of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Oracle Fusion Applications Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 1 month ago

Apply

4.0 - 6.0 years

0 Lacs

Gurgaon

On-site

Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems, ensuring secure, efficient data storage and retrieval, enabling self-service data exploration, and supporting stakeholders with insightful reporting and analysis. Grade - T5 Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date. What your main responsibilities are: Accountabilities: Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity. Data Integration - Connect offline and online data to continuously improve the overall understanding of customer behavior and journeys for personalization; data pre-processing including collecting, parsing, managing, analyzing, and visualizing large sets of data. Data Quality Management - Cleanse the data and improve data quality and readiness for analysis; drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms. Data Transformation - Process data by cleansing it and transforming it into the proper storage structure for querying and analysis using ETL and ELT processes. Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations. Qualifications & Specifications: Master's/Bachelor's degree in Engineering/Computer Science/Math/Statistics or equivalent. Strong programming skills in Python/PySpark/SAS. Proven experience with large data sets and related technologies - Hadoop, Hive, distributed computing systems, Spark optimization. Experience with cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS Storage, Azure DevOps. Hands-on experience with Databricks, Delta Lake, Workflows.
Should have knowledge of DevOps processes and tools like Docker, CI/CD, Kubernetes, Terraform, Octopus. Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs. Experience with a BI tool like Power BI (good to have). Cloud migration experience (good to have). Cloud and data engineering certification (good to have). Experience working in an Agile environment. 4-6 years of relevant work experience is required. Experience with stakeholder management is an added advantage. What we are looking for Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline. Master's degree or PhD preferred. Knowledge, Skills and Abilities Fluency in English Analytical Skills Accuracy & Attention to Detail Numerical Skills Planning & Organizing Skills Presentation Skills Data Modeling and Database Design ETL (Extract, Transform, Load) Skills Programming Skills FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment, and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances. Our Company FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by "Fortune" magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe.
We can serve this global network due to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding. Our Philosophy The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle, and return these profits back into the business, and invest back in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being, and value their contributions to the company. Our Culture Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.

Posted 1 month ago

Apply

7.0 - 9.0 years

3 - 9 Lacs

Gurgaon

On-site

Job Description Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 16,700 stores in 31 countries serving more than 9 million customers each day. At Circle K, we are building a best-in-class global data engineering practice to support intelligent business decision-making and drive value across our retail ecosystem. As we scale our engineering capabilities, we're seeking a Lead Data Engineer to serve as both a technical leader and people coach for our India-based Data Enablement pod. This role will oversee the design, delivery, and maintenance of critical cross-functional datasets and reusable data assets while also managing a group of talented engineers in India. This position plays a dual role: contributing hands-on to engineering execution while mentoring and developing engineers in their technical careers. About the role The ideal candidate combines deep technical acumen, stakeholder awareness, and a people-first leadership mindset. You'll collaborate with global tech leads, managers, platform teams, and business analysts to build trusted, performant data pipelines that serve use cases beyond traditional data domains. Responsibilities Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms. Lead the technical execution of non-domain-specific initiatives (e.g.
reusable dimensions, TLOG standardization, enablement pipelines). Architect data models and reusable layers consumed by multiple downstream pods. Guide platform-wide patterns like parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks. Mentor and coach the team. Partner with product and platform leaders to ensure engineering consistency and delivery excellence. Act as an L3 escalation point for operational data issues impacting foundational pipelines. Own engineering best practices, sprint planning, and quality across the Enablement pod. Contribute to platform discussions and architectural decisions across regions. Job Requirements Education Bachelor's or Master's degree in Computer Science, Engineering, or a related field Relevant Experience 7-9 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark. Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse. Knowledge and Preferred Skills Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices. Solid grasp of data governance, metadata tagging, and role-based access control. Proven ability to mentor and grow engineers in a matrixed or global environment. Strong verbal and written communication skills, with the ability to operate cross-functionally. Certifications in Azure, Databricks, or Snowflake are a plus. Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management). Working knowledge of DevOps processes (CI/CD), the Git/Jenkins version control tools, Master Data Management (MDM), and data quality tools. Strong experience in ETL/ELT development, QA, and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance).
Hands-on experience with databases (Azure SQL DB, Snowflake, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting. ADF, Databricks, and Azure certification is a plus. Technologies we use: Databricks, Azure SQL DW/Synapse, Snowflake, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI #LI-DS1

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. Job Description: Job Title: Senior Business Analyst Experience Range: 8-12 Years Location: Chennai, Hybrid Employment Type: Full-Time About UPS UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation. About UPS Supply Chain Symphony™ The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making. About The Role We are seeking an experienced Senior Business Analyst to join our project team responsible for delivering a Microsoft Azure-hosted web application with Angular as the frontend and .NET 8 as the backend framework. The solution follows a micro-frontend and microservices architecture integrated with Azure SQL database. Additionally, the data engineering component involves Azure Data Factory (ADF), Databricks, and Cosmos DB. 
The Senior Business Analyst will play a pivotal role in bridging the gap between business stakeholders, development teams, and data engineering teams. This role involves eliciting and analyzing requirements, defining business processes, and ensuring alignment of project objectives with strategic goals. The candidate will also work closely with architects, developers, and testers to ensure comprehensive requirements coverage and successful project delivery. Key Responsibilities Requirements Elicitation and Analysis: Gather and document business and technical requirements through stakeholder interviews, workshops, and document analysis. Analyze complex data flows and business processes to define clear and concise requirements. Create detailed requirement specifications, user stories, and acceptance criteria for both web application and data engineering components. Business Process Design and Improvement: Define and document business processes, workflows, and data models. Identify areas for process optimization and automation within web and data solutions. Collaborate with stakeholders to design solutions that align with business objectives. Stakeholder Communication and Collaboration: Serve as a liaison between business stakeholders, development teams, and data engineering teams. Facilitate communication and collaboration to ensure stakeholder alignment and understanding. Conduct requirement walkthroughs, design reviews, and user acceptance testing sessions. Solution Validation and Quality Assurance: Ensure requirements traceability throughout the project lifecycle. Validate and test solutions to ensure they meet business needs and objectives. Collaborate with QA teams to define testing strategies and acceptance criteria. Primary Skills Business Analysis: Requirement gathering, process modeling, and gap analysis. Documentation: User stories, functional specifications, and acceptance criteria. Agile Methodologies: Experience in Agile/Scrum environments. 
Stakeholder Management: Effective communication and collaboration with cross-functional teams. Data Analysis: Ability to analyze and interpret complex data flows and business processes. Secondary Skills Cloud Platform: Familiarity with Microsoft Azure services. Data Engineering: Understanding of data pipelines, ETL processes, and data modeling. UX/UI Collaboration: Experience collaborating with UX/UI teams for optimal user experience. Communication Skills: Excellent verbal and written communication for stakeholder engagement. Soft Skills Strong problem-solving abilities and attention to detail. Excellent communication skills, both verbal and written. Effective time management and organizational capabilities. Ability to work independently and within a collaborative team environment. Strong interpersonal skills to engage with cross-functional teams. Educational And Preferred Qualifications Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. Relevant certifications such as: Certified Business Analysis Professional (CBAP), PMI Professional in Business Analysis (PMI-PBA), Microsoft Certified: Azure Fundamentals. Experience in cloud-native solutions and microservices architecture. Familiarity with Angular and .NET frameworks for web applications. About The Team As a Senior Business Analyst, you will be working with a dynamic, cross-functional team that includes developers, product managers, and other quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications. Employee Type: Permanent UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

Posted 1 month ago

Apply

7.0 years

1 - 10 Lacs

Noida

On-site

Optum is a global organization that delivers care, aided by technology to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together. Primary Responsibilities: Design, develop, and implement scalable data pipelines using Azure Databricks. Develop PySpark-based data transformations and integrate structured and unstructured data from various sources. Optimize Databricks clusters for performance, scalability, and cost-efficiency within the Azure ecosystem. Monitor, troubleshoot, and resolve performance bottlenecks in Databricks workloads. Manage orchestration and scheduling of end-to-end data pipelines using tools like Apache Airflow, ADF scheduling, and Logic Apps. Collaborate effectively with the Architecture team in designing solutions and with product owners in validating the implementations. Implement best practices to enable data quality, monitoring, logging, and alerting for failure scenarios and exception handling. Document step-by-step processes to troubleshoot potential issues and deliver cost-optimized cloud solutions. Provide technical leadership, mentorship, and best practices for junior data engineers. Stay up to date with Azure and Databricks advancements to continuously improve data engineering capabilities. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or
work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so Required Qualifications: B.Tech or equivalent 7+ years of overall experience in IT industry and 6+ years of experience in data engineering with 3+ years of hands-on experience in Azure Databricks Hands-on experience with Delta Lake, Lakehouse architecture, and data versioning Experience with CI/CD pipelines for data engineering solutions (Azure DevOps, Git) Solid knowledge of performance tuning, partitioning, caching, and cost optimization in Databricks Deep understanding of data warehousing, data modeling (Kimball/Inmon), and big data processing Solid expertise in the Azure ecosystem, including Azure Synapse, Azure SQL, ADLS, and Azure Functions Proficiency in PySpark, Python and SQL for data processing in Databricks Proven excellent written and verbal communication skills Proven excellent problem-solving skills and ability to work independently Proven ability to balance multiple and competing priorities and execute accordingly Proven highly self-motivated with excellent interpersonal and collaborative skills Proven ability to anticipate risks and obstacles and develop plans for mitigation Proven excellent documentation experience and skills Preferred Qualifications: Azure certifications DP-203, AZ-304 etc. Experience in infrastructure as code, scheduling as code, and automating operational activities using Terraform scripts At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone–of every race, gender, sexuality, age, location and income–deserves the opportunity to live their healthiest life. 
Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
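The orchestration responsibility described in this posting, whether implemented in Airflow, ADF triggers, or Logic Apps, ultimately reduces to running tasks in dependency order. A minimal sketch using Python's standard-library `graphlib` (the task names and dependency graph below are hypothetical):

```python
from graphlib import TopologicalSorter

# Hypothetical end-to-end pipeline: each task maps to the set of tasks
# that must complete before it can run.
pipeline = {
    "ingest_raw": set(),
    "cleanse": {"ingest_raw"},
    "build_dim_customer": {"cleanse"},
    "build_fact_claims": {"cleanse"},
    "publish_report": {"build_dim_customer", "build_fact_claims"},
}

def execution_order(dag: dict[str, set[str]]) -> list[str]:
    """Return one valid run order; raises graphlib.CycleError if the
    declared dependencies are circular (a common misconfiguration)."""
    return list(TopologicalSorter(dag).static_order())
```

An orchestrator like Airflow adds scheduling, retries, and alerting on top of exactly this ordering; the cycle check is also a cheap way to validate a pipeline definition before deployment.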

Posted 1 month ago

Apply

0 years

0 Lacs

Noida

On-site

As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. A primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues. Job Description: Hands-on experience in supporting, integrating, and development (for example: XML Publisher, BI Publisher Enterprise, Oracle Designer, SQL & PL/SQL, JDeveloper, Java, ADF, HTML and CSS with JavaScript) and extending Oracle Cloud (Financials, Distribution, Manufacturing, HCM). Experience (understanding of the data model, business process functionality, and its data flow) in Oracle Fusion and Oracle on-premise applications (Finance or Supply Chain). Developing integrations using OIC, VBCS, and REST APIs/web services. Expertise in PaaS offerings such as Oracle Analytics Cloud Services, Visual Builder Cloud Services, and Oracle Integration Services. Career Level - IC4 As an Advisory Systems Engineer, you are expected to be an expert member of the problem-solving/avoidance team and be highly skilled in solving extremely complex (often previously unknown), critical customer issues. Performing the assigned duties with a high level of autonomy and reporting to management on customer status and technical matters on a regular basis, you will be expected to work with very limited guidance from management. Further, the Advisory Systems Engineer is sought by customers and Oracle employees to provide expert technical advice.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Calcutta

On-site

Skill required: Tech for Operations - Microsoft Azure Cloud Services Designation: App Automation Eng Senior Analyst Qualifications: Any Graduation/12th/PUC/HSC Years of Experience: 5 to 8 years About Accenture Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song—all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com What would you do? In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover.
In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients' supply chains and customer experience. The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting, and maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience in troubleshooting Azure Data Factory environments, conducting code reviews, and bug fixing. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems. What are we looking for? • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. • Proven experience (5+ years) as an Azure Data Factory Support Engineer II. • Expertise in ADF with a deep understanding of its data-related capabilities. • Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments. • Proficient in SQL and experience with SQL database design. • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy. • Experience with ADF pipelines. • Excellent problem-solving and troubleshooting skills. • Experience in code review and debugging in a collaborative project setting. • Excellent verbal and written communication skills. • Ability to work in a fast-paced, team-oriented environment. • Strong understanding of the business and a passion for the mission of Service Supply Chain. • Hands-on experience with Jira, DevOps ticketing, and ServiceNow is good to have. Roles and Responsibilities: • Innovate, collaborate, build, create, and solve across ADF and associated systems. • Ensure systems meet business requirements and industry practices.
• Integrate new data management technologies and software engineering tools into existing structures. • Recommend ways to improve data reliability, efficiency, and quality. • Use large data sets to address business issues. • Use data to discover tasks that can be automated. • Fix bugs to ensure a robust and sustainable codebase. • Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance. • Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively. • Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines. • Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives. • Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure. • Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy. • Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance. • Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement. • Stay updated with the latest technologies and practices in ADF to continuously improve the support and maintenance of data systems. • Flexible working hours to include US time zones; this position may require you to work a rotational on-call schedule, evenings, weekends, and holiday shifts when the need arises. • Participate in the Demand Management and Change Management processes.
• Work in partnership with internal business, external 3rd-party technical teams, and functional teams as a technology partner in communicating and coordinating delivery of technology services from Technology for Operations (TfO).

Any Graduation, 12th/PUC/HSC

Posted 1 month ago

Apply

5.0 - 7.0 years

15 - 30 Lacs

Hyderabad, Bengaluru

Work from Office

Working in shifts on a rotation basis, especially to cover North America customers (EST time zone), is a must. Should have good communication and customer-facing skills. Location: Bangalore or Hyderabad, with 3 days/week in office. The candidate must have hands-on experience in EBS SCM modules: Inventory, Order Management, Pricing, and Shipping. 5 to 10 years of relevant working experience. EBS SCM Technical: Technical Support Professional, preferably with an implementation background in Oracle eBiz SCM applications (OM/Pricing/Shipping/OPM/Costing/AR). Proficiency in SQL and PL/SQL for database management, as well as knowledge of Oracle EBS architecture, forms, reports, workflow, and interface development. Oracle Application Framework (OAF): developing web-based applications and customizations. Custom application development: building custom modules and extensions to meet specific business needs. Good knowledge expected in at least one of the following Fusion technologies: ADF, BPEL, ODI, SOA, FBDI, reporting tools. Responsibilities include, but are not limited to, providing excellence in customer service support, diagnosis, replication, and resolution of technical issues in complex and critical service requests. The focus of this position is to provide customer service on a technical level and ultimately drive complete and total resolution of each issue reported by the customer. Troubleshooting & problem solving: debugging EBS applications (identifying and resolving issues within EBS applications); performance tuning (optimizing EBS performance for efficiency and responsiveness); system integration (ensuring seamless integration between EBS and other systems). Development framework: experience supporting/developing/testing web applications implemented using frameworks that expose business services via a Model/View/Controller paradigm, such as Oracle ADF, will be an added advantage.

Posted 1 month ago

Apply

7.0 - 12.0 years

0 - 2 Lacs

Pune, Ahmedabad, Gurugram

Work from Office

Urgent Hiring: Azure Data Engineer (Strong PySpark + SCD II/III Expert). Work Mode: Remote. Client-focused interview on PySpark + SCD II/III.

Key Must-Haves:
• Very strong hands-on PySpark coding
• Practical experience implementing Slowly Changing Dimensions (SCD) Type II and Type III
• Strong expertise in Azure data engineering (ADF, Databricks, Data Lake, Synapse)
• Proficiency in SQL and Python for scripting and transformation
• Strong understanding of data warehousing concepts and ETL pipelines

Good to Have:
• Experience with Microsoft Fabric
• Familiarity with Power BI
• Domain knowledge in Finance, Procurement, and Human Capital

Note: This role is highly technical. The client will focus interviews on PySpark coding and SCD Type II/III implementation. Only share profiles that are hands-on and experienced in these areas. Share strong, relevant profiles to: b.simrana@ekloudservices.com
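Since the client interview centers on SCD Type II, it helps to be able to sketch the pattern from scratch: expire the current row for a changed business key, then append a new versioned row. Below is a minimal plain-Python illustration of that merge logic; the column names (`customer_id`, `address`, `valid_from`, etc.) and the use of plain dicts instead of Spark DataFrames are assumptions for brevity — in the actual role this would typically be a DataFrame or Delta merge in Databricks.

```python
from datetime import date

def scd2_merge(dim_rows, incoming, key="customer_id", tracked=("address",),
               today=None):
    """Apply SCD Type II: expire changed current rows, append new versions.

    dim_rows: existing dimension rows with is_current/valid_from/valid_to.
    incoming: latest source snapshot, one dict per business key.
    """
    today = today or date.today().isoformat()
    out = list(dim_rows)
    current = {r[key]: r for r in out if r["is_current"]}
    for src in incoming:
        cur = current.get(src[key])
        if cur and all(cur[c] == src[c] for c in tracked):
            continue                      # tracked attributes unchanged
        if cur:                           # expire the previous version
            cur["is_current"] = False
            cur["valid_to"] = today
        out.append({**src, "is_current": True,
                    "valid_from": today, "valid_to": None})
    return out

dim = [{"customer_id": 1, "address": "Pune", "is_current": True,
        "valid_from": "2023-01-01", "valid_to": None}]
new = scd2_merge(dim, [{"customer_id": 1, "address": "Gurugram"}],
                 today="2024-06-01")
print(len(new))  # 2: expired old version plus the new current version
```

The same shape carries over to PySpark: the `current` lookup becomes a join on the business key, and the expire/append steps become a `MERGE` or union of DataFrames.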

Posted 1 month ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role Description Job Title: Java Backend Developer Location: Chennai / Trivandrum Experience: 4+ Years Job Summary We are looking for a skilled Java Backend Developer to join our growing Services Engineering team, contributing to the development of robust, scalable, and innovative solutions in a high-performance environment. This role involves working on Oracle Fusion BPM solutions, developing APIs, and supporting key business processes for a global credit card platform. You’ll play a key role in designing backend components that deliver real impact to end users. Key Responsibilities Design, develop, and maintain backend services and APIs using Java and related technologies. Collaborate within a self-organized engineering team to build features aligned with the product roadmap and business goals. Work within a Service-Oriented Architecture (SOA) methodology, focusing on Oracle Fusion BPM and associated workflows. Administer BPM architecture and configurations across enterprise software applications. Develop custom BPM workflows and ADF user interfaces based on business and user requirements. Write clean, maintainable, and well-documented code while ensuring high-quality deliverables through practices such as TDD, BDD, and pair programming. Support continuous integration and deployment, collaborating with DevOps and QA teams. Maintain and document customizations and extensions in Oracle Fusion and database components. Contribute to team-level innovation and efficiency improvements. Mandatory Skills 4+ years of professional experience in backend development using Java and Object-Oriented Programming (OOP) principles. Proficient in API development, RESTful services, and HTTP protocols. Experience in Oracle Fusion BPM development, including custom BPM workflows and ADF UI. Solid understanding of SOA methodologies and enterprise application integration. Proficiency in Oracle 12c DB, SQL, and relational database concepts. 
Strong collaboration and communication skills with the ability to work in a fast-paced team environment. Experience with TDD, BDD, and pair programming practices. Good-to-Have Skills: Experience working in cloud environments such as AWS. Exposure to regulated domains or financial services is a plus. Experience solving real-world problems in complex systems. Familiarity with modern CI/CD practices and DevOps tools. Keywords: Java, Oracle Fusion, BPM, ADF, API Development, SQL, Cloud (AWS), Software Engineering, Backend Development. Skills: Cloud Computing, Java, AWS Cloud

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Snowflake Developer Location: Gurugram Experience: 3 to 7 years Skillset: Snowflake, Azure, ADF (Azure Data Factory) Job Type: Full-Time Overview: We are looking for a talented Snowflake Developer with expertise in Snowflake, Azure, and Azure Data Factory (ADF) to join our dynamic team. In this role, you will be responsible for developing, implementing, and optimizing data pipelines and ETL processes. You will work on cloud-based data platforms to ensure the effective and seamless integration of data across systems. The ideal candidate will have a solid background in working with Snowflake and cloud data services, and be ready to travel to client locations as required. Key Responsibilities: • Design, develop, and implement data solutions using Snowflake and Azure technologies. • Develop and manage ETL pipelines using Azure Data Factory (ADF) for seamless data movement and transformation. • Collaborate with cross-functional teams to ensure that the data platform meets business needs and aligns with data architecture best practices. • Monitor, optimize, and troubleshoot data pipelines and workflows in Snowflake and Azure environments. • Implement data governance and security practices in line with industry standards. • Perform data validation and ensure data integrity across systems and platforms. • Ensure data integration and management processes are optimized for performance, scalability, and reliability. • Provide technical support and guidance to junior developers and team members. • Collaborate with the client to understand project requirements and ensure deliverables are met on time. • Be open to travelling to client locations as needed for project delivery and stakeholder engagements. Skills and Qualifications: • 3 to 7 years of hands-on experience in Snowflake development and data management. • Strong working knowledge of Azure (Azure Data Services, Azure Data Lake, etc.) and Azure Data Factory (ADF). 
• Expertise in designing and developing ETL pipelines and data transformation processes using Snowflake and ADF. • Proficiency in SQL and data modeling, with experience working with structured and semi-structured data. • Knowledge of data warehousing concepts and best practices in Snowflake. • Understanding of data security, privacy, and compliance requirements in cloud environments. • Experience with cloud-based data solutions and integration services. • Strong problem-solving and debugging skills. • Ability to work effectively with both technical and non-technical teams. • Good communication skills to collaborate with clients and team members. • Bachelor’s degree in Computer Science, Information Technology, or a related field. Preferred Skills: • Experience with other Azure services like Azure SQL Database, Azure Synapse Analytics, and Power BI. • Familiarity with data governance tools and data pipeline orchestration best practices. • Ability to optimize Snowflake queries and database performance. Why Join Us: • Work with cutting-edge cloud technologies like Snowflake and Azure. • Exposure to complex, large-scale data projects across industries. • Collaborative work environment that promotes innovation and learning. • Competitive salary and benefits package. • Opportunities for career growth and development.
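A recurring building block in ETL pipelines like the Snowflake/ADF ones described above is watermark-driven incremental extraction: pull only rows newer than the last successful load, then advance the watermark. The plain-Python sketch below uses the stdlib `sqlite3` module as a stand-in source; the `orders` table, its columns, and the watermark values are illustrative assumptions only, not the client's schema.

```python
import sqlite3

def extract_incremental(conn, watermark):
    """Pull only rows newer than the last successful watermark and
    return them together with the new high-water mark."""
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM orders WHERE updated_at > ? "
        "ORDER BY updated_at", (watermark,)).fetchall()
    new_wm = rows[-1][2] if rows else watermark   # advance only on success
    return rows, new_wm

# Stand-in source system with two rows straddling the last watermark.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INT, amount REAL, updated_at TEXT)")
conn.executemany("INSERT INTO orders VALUES (?,?,?)",
                 [(1, 10.0, "2024-01-01"), (2, 20.0, "2024-01-03")])
rows, wm = extract_incremental(conn, "2024-01-02")
print(len(rows), wm)  # 1 2024-01-03
```

In an ADF pipeline the watermark would typically live in a control table and feed a parameterized Copy activity; the logic, however, is exactly this.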

Posted 1 month ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Role Description: Develop and manage ETL workflows using Azure Data Factory (ADF). Design and implement data pipelines using PySpark on Azure Databricks. Work with Azure Synapse Analytics, Azure Data Lake, and Azure Blob Storage for data ingestion and transformation. Optimize Spark jobs for performance and scalability in Databricks. Automate data workflows and implement error handling & monitoring in ADF. Collaborate with data engineers, analysts, and business teams to understand data requirements. Implement data governance, security, and compliance best practices in Azure. Debug and troubleshoot PySpark scripts and ADF pipeline failures. Requirements: 4+ years of experience in ETL development with Azure Data Factory (ADF). Hands-on experience with Azure Databricks and PySpark for big data processing. Strong knowledge of Azure services. Proficiency in Python and PySpark for data transformation and processing. Experience with CI/CD pipelines for deploying ADF pipelines and Databricks notebooks. Strong expertise in SQL for data extraction and transformations. Knowledge of performance tuning in Spark and cost optimization on Azure. Skills: Azure Data Factory, PySpark, Azure
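The "error handling & monitoring" responsibility above, stripped of any specific service, usually means retry-with-backoff plus structured logging around each pipeline activity so failures are visible before they become downtime. A minimal, framework-free Python sketch of that pattern (the `flaky_copy` activity and the retry settings are hypothetical, not ADF's built-in retry policy):

```python
import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("pipeline")

def run_with_retry(activity, *, retries=3, backoff=0.01):
    """Run a pipeline activity, retrying transient failures with
    exponential backoff and logging each attempt for monitoring."""
    for attempt in range(1, retries + 1):
        try:
            return activity()
        except Exception as exc:
            log.warning("attempt %d failed: %s", attempt, exc)
            if attempt == retries:
                raise                      # surface to the orchestrator
            time.sleep(backoff * 2 ** (attempt - 1))

calls = {"n": 0}
def flaky_copy():
    """Hypothetical copy activity that fails once, then succeeds."""
    calls["n"] += 1
    if calls["n"] < 2:
        raise RuntimeError("transient source timeout")
    return "copied"

print(run_with_retry(flaky_copy))  # copied
```

ADF activities expose declarative retry counts and intervals, but the same pattern is what you hand-roll inside Databricks notebooks or custom activities.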

Posted 1 month ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

We’re hiring a Senior ML Engineer (MLOps) — 3-5 yrs Location: Kochi or Chennai What you’ll do Tame data → pull, clean, and shape structured & unstructured data. Orchestrate pipelines → Airflow / Step Functions / ADF… your call. Ship models → build, tune, and push to prod on SageMaker, Azure ML, or Vertex AI. Scale → Spark / Databricks for the heavy lifting. Automate everything → Docker, Kubernetes, CI/CD, MLflow, Seldon, Kubeflow. Pair up → work with engineers, architects, and business folks to solve real problems, fast. What you bring 3+ yrs hands-on MLOps (4-5 yrs total software experience). Proven chops on one hyperscaler (AWS, Azure, or GCP). Confidence with Databricks / Spark, Python, SQL, TensorFlow / PyTorch / Scikit-learn. You debug Kubernetes in your sleep and treat Dockerfiles like breathing. You prototype with open-source first, choose the right tool, then make it scale. Sharp mind, low ego, bias for action. Nice-to-haves SageMaker, Azure ML, or Vertex AI in production. Love for clean code, clear docs, and crisp PRs. Why Datadivr? Domain focus: we live and breathe F&B — your work ships to plants, not just slides. Small team, big autonomy: no endless layers; you own what you build. 📬 How to apply Shoot your CV + a short note on a project you shipped to careers@datadivr.com or DM me here. We reply to every serious applicant. Know someone perfect? Please share — good people know good people.
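The "Orchestrate pipelines" line above (Airflow / Step Functions / ADF) boils down to one idea: execute a DAG of tasks in dependency order. A toy illustration using Python's stdlib `graphlib` (the task names are invented for the example, not Datadivr's actual pipeline):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on, as in an Airflow DAG.
dag = {
    "extract": set(),
    "clean": {"extract"},
    "train": {"clean"},
    "deploy": {"train"},
}

def run_dag(dag, tasks):
    """Execute tasks in an order that respects every dependency edge."""
    order = list(TopologicalSorter(dag).static_order())
    return [tasks[name]() for name in order], order

# Stub task bodies; real ones would call Spark jobs, training, etc.
tasks = {name: (lambda n=name: f"{n}:ok") for name in dag}
results, order = run_dag(dag, tasks)
print(order)  # ['extract', 'clean', 'train', 'deploy']
```

Real orchestrators add retries, scheduling, and parallel branches on top, but a candidate who can reason about the DAG itself can pick up any of the three tools listed.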

Posted 1 month ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

The Technical Lead / Technical Consultant is a core role and focal point of the project team responsible for the whole technical solution and managing the day-to-day delivery. The role will focus on the technical solution architecture, detailed technical design, coaching of the development/implementation team, and governance of the technical delivery. Technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best practice advice. Interactions with internal stakeholders and clients to explain technology solutions and a clear understanding of client’s business requirements through which to guide optimal design to meet their needs. Job Description: Must-Have Skills: Database (one or more of MS SQL Server, Oracle, Cloud SQL, Cloud Spanner, etc.) Data Warehouse (one or more of Big Query, SnowFlake, etc.) ETL tool (two or more of Cloud Data Fusion, Dataflow, Dataproc, Pub/Sub, Composer, Cloud Functions, Cloud Run, etc.) Experience in Cloud platforms - GCP Python, PySpark, Project & resource management SVN, JIRA, Automation workflow (Composer, Cloud Scheduler, Apache Airflow, Tidal, Tivoli or similar) Good to have Skills: UNIX shell scripting, SnowFlake, Redshift, Familiar with NoSQL such as MongoDB, etc ETL tool (Databricks / AWS Glue / AWS Lambda / Amazon Kinesis / Amazon Firehose / Azure Data Factory / ADF / DBT / Talend, Informatica, IICS (Informatica cloud) ) Experience in Cloud platforms - AWS / Azure Client-facing skills Key Responsibilities: Ability to design simple to medium data solutions for clients by using cloud architecture using GCP Strong understanding of DW, data mart, data modeling, data structures, databases, and data ingestion and transformation. 
Working knowledge of ETL as well as database skills. Working knowledge of data modeling, data structures, databases, and ETL processes. Strong understanding of relational and non-relational databases and when to use them. Leadership and communication skills to collaborate with local leadership as well as our global teams. Translating technical requirements into ETL/SQL application code. Document project architecture, explain the detailed design to the team, and create low-level to high-level design. Create technical documents for ETL and SQL developments using Visio, PowerPoint, and other MS Office packages. Will need to engage with Project Managers, Business Analysts, and Application DBAs to implement ETL solutions. Perform mid- to complex-level tasks independently. Support Clients, Data Scientists, and Analytical Consultants working on marketing solutions. Work with cross-functional internal teams and external clients. Strong project management and organization skills. Ability to lead 1 – 2 projects of team size 2 – 3 team members. Code management systems which include Code review, deployment, cod. Work closely with the QA / Testing team to help identify/implement defect reduction initiatives. Work closely with the Architecture team to make sure architecture standards and principles are followed during development. Performing Proof of Concepts on new platforms/validating proposed solutions. Work with the team to establish and reinforce disciplined software development processes, standards, and error recovery procedures. Must understand software development methodologies including waterfall and agile. Distribute and manage SQL development work across the team. The candidate must be willing to work during overlapping hours with US-based teams to ensure effective collaboration and communication, typically between [e.g., 6:00 PM to 11:00 PM IST], depending on project needs.
Qualifications: Bachelor’s or Master's Degree in Computer Science with >= 7 years of IT experience Location: Bangalore Brand: Merkle Time Type: Full time Contract Type: Permanent

Posted 1 month ago

Apply

7.0 years

0 Lacs

New Delhi, Delhi, India

On-site

The Technical Lead / Technical Consultant is a core role and focal point of the project team responsible for the whole technical solution and managing the day-to-day delivery. The role will focus on the technical solution architecture, detailed technical design, coaching of the development/implementation team, and governance of the technical delivery. Technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best practice advice. Interactions with internal stakeholders and clients to explain technology solutions and a clear understanding of client’s business requirements through which to guide optimal design to meet their needs. Job Description: Must-Have Skills: Database (one or more of MS SQL Server, Oracle, Cloud SQL, Cloud Spanner, etc.) Data Warehouse (one or more of Big Query, SnowFlake, etc.) ETL tool (two or more of Cloud Data Fusion, Dataflow, Dataproc, Pub/Sub, Composer, Cloud Functions, Cloud Run, etc.) Experience in Cloud platforms - GCP Python, PySpark, Project & resource management SVN, JIRA, Automation workflow (Composer, Cloud Scheduler, Apache Airflow, Tidal, Tivoli or similar) Good to have Skills: UNIX shell scripting, SnowFlake, Redshift, Familiar with NoSQL such as MongoDB, etc ETL tool (Databricks / AWS Glue / AWS Lambda / Amazon Kinesis / Amazon Firehose / Azure Data Factory / ADF / DBT / Talend, Informatica, IICS (Informatica cloud) ) Experience in Cloud platforms - AWS / Azure Client-facing skills Key Responsibilities: Ability to design simple to medium data solutions for clients by using cloud architecture using GCP Strong understanding of DW, data mart, data modeling, data structures, databases, and data ingestion and transformation. 
Working knowledge of ETL as well as database skills. Working knowledge of data modeling, data structures, databases, and ETL processes. Strong understanding of relational and non-relational databases and when to use them. Leadership and communication skills to collaborate with local leadership as well as our global teams. Translating technical requirements into ETL/SQL application code. Document project architecture, explain the detailed design to the team, and create low-level to high-level design. Create technical documents for ETL and SQL developments using Visio, PowerPoint, and other MS Office packages. Will need to engage with Project Managers, Business Analysts, and Application DBAs to implement ETL solutions. Perform mid- to complex-level tasks independently. Support Clients, Data Scientists, and Analytical Consultants working on marketing solutions. Work with cross-functional internal teams and external clients. Strong project management and organization skills. Ability to lead 1 – 2 projects of team size 2 – 3 team members. Code management systems which include Code review, deployment, cod. Work closely with the QA / Testing team to help identify/implement defect reduction initiatives. Work closely with the Architecture team to make sure architecture standards and principles are followed during development. Performing Proof of Concepts on new platforms/validating proposed solutions. Work with the team to establish and reinforce disciplined software development processes, standards, and error recovery procedures. Must understand software development methodologies including waterfall and agile. Distribute and manage SQL development work across the team. The candidate must be willing to work during overlapping hours with US-based teams to ensure effective collaboration and communication, typically between [e.g., 6:00 PM to 11:00 PM IST], depending on project needs.
Qualifications: Bachelor’s or Master's Degree in Computer Science with >= 7 years of IT experience Location: Bangalore Brand: Merkle Time Type: Full time Contract Type: Permanent

Posted 1 month ago

Apply