
1444 ADF Jobs - Page 47

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
Analyze, design, develop, troubleshoot and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications. As a member of the software engineering division, you will perform high-level design based on provided external specifications; specify, design and implement minor changes to existing software architecture; build highly complex enhancements and resolve complex bugs; build and execute unit tests and unit plans; review integration and regression test plans created by QA; and communicate with QA and porting engineering as necessary to discuss minor changes to product functionality and to ensure quality and consistency across specific products. Duties and tasks are varied and complex, requiring independent judgment. Fully competent in own area of expertise. May have a project lead role and/or supervise lower-level personnel. BS or MS degree or equivalent experience relevant to the functional area. 4 years of software engineering or related experience. Career Level - IC3
Responsibilities
The Fusion development team works on the design, development and maintenance of the Fusion Global HR, Talent, Configuration Workbench and Compensation product areas. The day-to-day duties match the description above: high-level design from external specifications, minor architecture changes, complex enhancements and bug fixes, unit testing, review of QA test plans, and coordination with QA and porting engineering.
Bachelor's or Master's degree (B.E./B.Tech./MCA/M.Tech./M.S.) from a reputed university. 1-8 years of experience in applications or product development.
Mandatory Skills: Strong knowledge of object-oriented programming concepts; product design and development experience in either Java/J2EE technologies (JSP/Servlet) or database fundamentals, SQL and PL/SQL.
Optional Skills: Development experience on the Fusion Middleware platform; familiarity with ADF and exposure to development in the cloud; development experience in Oracle Applications / HCM functionality.
Qualifications: Career Level - IC3
About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability, protected veteran status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
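The mandatory skills above name object-oriented programming plus either Java/J2EE or database fundamentals (SQL, PL/SQL). As a rough, vendor-neutral illustration of the SQL side of that requirement, a join-and-aggregate query might look like the sketch below; sqlite3 stands in for an Oracle database here, and the schema is invented for the example.

```python
import sqlite3

# Illustrative SQL-fundamentals exercise of the kind this posting screens for
# (joins and aggregation); sqlite3 stands in for an Oracle database, and the
# dept/emp schema is an invented example.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dept (dept_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE emp  (emp_id INTEGER PRIMARY KEY, name TEXT,
                   salary REAL, dept_id INTEGER REFERENCES dept(dept_id));
INSERT INTO dept VALUES (10, 'HR'), (20, 'Engineering');
INSERT INTO emp VALUES (1, 'Asha', 50000, 10),
                       (2, 'Ravi', 70000, 20),
                       (3, 'Mina', 90000, 20);
""")
# Average salary per department, highest first.
rows = cur.execute("""
    SELECT d.name, AVG(e.salary) AS avg_salary
    FROM emp e JOIN dept d ON e.dept_id = d.dept_id
    GROUP BY d.name
    ORDER BY avg_salary DESC
""").fetchall()
print(rows)  # [('Engineering', 80000.0), ('HR', 50000.0)]
```

The same join/group-by shape carries over to Oracle SQL; PL/SQL then adds procedural constructs (packages, cursors, exception blocks) on top of it.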

Posted 1 month ago

Apply

5.0 - 10.0 years

11 - 21 Lacs

Pune, Bengaluru, Delhi / NCR

Work from Office

Skills: Azure Data Factory, Databricks, Scala. Location: PAN India. Experience: 5-12 years.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Rajkot, Gujarat, India

On-site

Job Description: Identical to the Oracle posting listed above for Hyderabad (Fusion HCM development, Career Level IC3): analyze, design, develop, troubleshoot and debug software programs for commercial or end-user applications, with the same responsibilities, mandatory and optional skills, qualifications, About Us text, and equal-opportunity statement.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Oracle. Management Level: Senior Associate.
Job Description & Summary
Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. Ensuring a streamlined end-to-end Oracle Fusion technical landscape that adapts seamlessly to the changing business environment is equally crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.
Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences or status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
" Responsibilities: Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer Completed at least 2 full Oracle Cloud (Fusion) Implementation Extensive Knowledge on database structure for ERP/Oracle Cloud (Fusion) Extensively worked on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC) Mandatory skill sets BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC) Preferred skill sets database structure for ERP/Oracle Cloud (Fusion) Year of experience required Minimum 4Years of Oracle fusion experience Educational Qualification BE/BTech MBA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Chartered Accountant Diploma, Master of Business Administration, Bachelor of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Oracle Fusion Applications Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date Show more Show less

Posted 1 month ago

Apply

4.0 - 6.0 years

0 Lacs

Gurgaon

On-site

Responsible for developing, optimizing, and maintaining business intelligence and data warehouse systems; ensuring secure, efficient data storage and retrieval; enabling self-service data exploration; and supporting stakeholders with insightful reporting and analysis. Grade - T5. Please note that the job will close at 12am on the posting close date, so please submit your application prior to the close date.
What your main responsibilities are:
Data Pipeline - Develop and maintain scalable data pipelines and build out new API integrations to support continuing increases in data volume and complexity.
Data Integration - Connect offline and online data to continuously improve overall understanding of customer behavior and journeys for personalization. Data pre-processing includes collecting, parsing, managing, analyzing and visualizing large sets of data.
Data Quality Management - Cleanse the data to improve data quality and readiness for analysis. Drive standards, define and implement/improve data governance strategies, and enforce best practices to scale data analysis across platforms.
Data Transformation - Process data by cleansing it and transforming it into the proper storage structure for querying and analysis using ETL and ELT processes.
Data Enablement - Ensure data is accessible and usable to the wider enterprise to enable a deeper and more timely understanding of operations.
Qualifications & Specifications: Master's/Bachelor's degree in Engineering/Computer Science/Math/Statistics or equivalent. Strong programming skills in Python/PySpark/SAS. Proven experience with large data sets and related technologies: Hadoop, Hive, distributed computing systems, Spark optimization. Experience on cloud platforms (preferably Azure) and their services: Azure Data Factory (ADF), ADLS storage, Azure DevOps. Hands-on experience with Databricks, Delta Lake, and Workflows.
Should have knowledge of DevOps processes and tools like Docker, CI/CD, Kubernetes, Terraform, Octopus. Hands-on experience with SQL and data modeling to support the organization's data storage and analysis needs. Experience with a BI tool like Power BI (good to have). Cloud migration experience (good to have). Cloud and data engineering certification (good to have). Experience working in an Agile environment. 4-6 years of relevant work experience is required. Experience with stakeholder management is an added advantage.
What we are looking for - Education: Bachelor's degree or equivalent in Computer Science, MIS, Mathematics, Statistics, or a similar discipline; Master's degree or PhD preferred. Knowledge, skills and abilities: fluency in English, analytical skills, accuracy and attention to detail, numerical skills, planning and organizing skills, presentation skills, data modeling and database design, ETL (Extract, Transform, Load) skills, programming skills.
FedEx was built on a philosophy that puts people first, one we take seriously. We are an equal opportunity/affirmative action employer and we are committed to a diverse, equitable, and inclusive workforce in which we enforce fair treatment and provide growth opportunities for everyone. All qualified applicants will receive consideration for employment regardless of age, race, color, national origin, genetics, religion, gender, marital status, pregnancy (including childbirth or a related medical condition), physical or mental disability, or any other characteristic protected by applicable laws, regulations, and ordinances.
Our Company: FedEx is one of the world's largest express transportation companies and has consistently been selected as one of the top 10 World's Most Admired Companies by Fortune magazine. Every day FedEx delivers for its customers with transportation and business solutions, serving more than 220 countries and territories around the globe.
We can serve this global network thanks to our outstanding team of FedEx team members, who are tasked with making every FedEx experience outstanding.
Our Philosophy: The People-Service-Profit philosophy (P-S-P) describes the principles that govern every FedEx decision, policy, or activity. FedEx takes care of our people; they, in turn, deliver the impeccable service demanded by our customers, who reward us with the profitability necessary to secure our future. The essential element in making the People-Service-Profit philosophy such a positive force for the company is where we close the circle: we return these profits back into the business and invest in our people. Our success in the industry is attributed to our people. Through our P-S-P philosophy, we have a work environment that encourages team members to be innovative in delivering the highest possible quality of service to our customers. We care for their well-being and value their contributions to the company.
Our Culture: Our culture is important for many reasons, and we intentionally bring it to life through our behaviors, actions, and activities in every part of the world. The FedEx culture and values have been a cornerstone of our success and growth since we began in the early 1970s. While other companies can copy our systems, infrastructure, and processes, our culture makes us unique and is often a differentiating factor as we compete and grow in today's global marketplace.
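The cleanse-and-transform duties described in this listing can be sketched in miniature with the standard library; the CSV fields and the two data-quality rules below are invented purely for illustration, not FedEx's actual schema.

```python
import csv
import io

# Minimal ETL cleansing sketch: parse raw CSV, reject rows that fail basic
# data-quality rules, and normalize types. All field names are invented.
RAW = """order_id,city,amount
1001,Gurgaon, 250.0
1002,,120.5
1003,Delhi,not_a_number
1004,Pune,75
"""

def cleanse(reader):
    """Drop rows with a missing city or unparseable amount; normalize types."""
    for row in reader:
        city = row["city"].strip()
        try:
            amount = float(row["amount"])
        except ValueError:
            continue          # data-quality rule: reject bad numerics
        if not city:
            continue          # data-quality rule: city is required
        yield {"order_id": int(row["order_id"]), "city": city, "amount": amount}

clean_rows = list(cleanse(csv.DictReader(io.StringIO(RAW))))
print(clean_rows)  # rows 1001 and 1004 survive
```

At production scale the same reject/normalize logic would typically live in a PySpark job or an ADF data flow rather than a generator, but the shape of the rules is the same.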

Posted 1 month ago

Apply

7.0 - 9.0 years

3 - 9 Lacs

Gurgaon

On-site

Job Description
Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 16,700 stores in 31 countries serving more than 9 million customers each day. At Circle K, we are building a best-in-class global data engineering practice to support intelligent business decision-making and drive value across our retail ecosystem. As we scale our engineering capabilities, we're seeking a Lead Data Engineer to serve as both a technical leader and people coach for our India-based Data Enablement pod. This role will oversee the design, delivery, and maintenance of critical cross-functional datasets and reusable data assets while also managing a group of talented engineers in India. This position plays a dual role: contributing hands-on to engineering execution while mentoring and developing engineers in their technical careers.
About the role
The ideal candidate combines deep technical acumen, stakeholder awareness, and a people-first leadership mindset. You'll collaborate with global tech leads, managers, platform teams, and business analysts to build trusted, performant data pipelines that serve use cases beyond traditional data domains.
Responsibilities
Design, develop, and maintain scalable pipelines across ADF, Databricks, Snowflake, and related platforms. Lead the technical execution of non-domain-specific initiatives (e.g. reusable dimensions, TLOG standardization, enablement pipelines). Architect data models and reusable layers consumed by multiple downstream pods. Guide platform-wide patterns like parameterization, CI/CD pipelines, pipeline recovery, and auditability frameworks. Mentor and coach the team. Partner with product and platform leaders to ensure engineering consistency and delivery excellence. Act as an L3 escalation point for operational data issues impacting foundational pipelines. Own engineering best practices, sprint planning, and quality across the Enablement pod. Contribute to platform discussions and architectural decisions across regions.
Job Requirements
Education: Bachelor's or master's degree in computer science, engineering, or a related field.
Relevant Experience: 7-9 years of data engineering experience with strong hands-on delivery using ADF, SQL, Python, Databricks, and Spark. Experience designing data pipelines, warehouse models, and processing frameworks using Snowflake or Azure Synapse.
Knowledge and Preferred Skills: Proficient with CI/CD tools (Azure DevOps, GitHub) and observability practices. Solid grasp of data governance, metadata tagging, and role-based access control. Proven ability to mentor and grow engineers in a matrixed or global environment. Strong verbal and written communication skills, with the ability to operate cross-functionally. Certifications in Azure, Databricks, or Snowflake are a plus. Strong knowledge of data engineering concepts (data pipeline creation, data warehousing, data marts/cubes, data reconciliation and audit, data management). Working knowledge of DevOps processes (CI/CD), Git/Jenkins version control, Master Data Management (MDM) and data quality tools. Strong experience in ETL/ELT development, QA and operations/support processes (RCA of production issues, code/data fix strategy, monitoring and maintenance). Hands-on experience with databases (Azure SQL DB, Snowflake, MySQL, Cosmos DB, etc.), file systems (Blob Storage), and Python/Unix shell scripting. ADF, Databricks and Azure certification is a plus.
Technologies we use: Databricks, Azure SQL DW/Synapse, Snowflake, Azure Tabular, Azure Data Factory, Azure Functions, Azure Containers, Docker, DevOps, Python, PySpark, Scripting (PowerShell, Bash), Git, Terraform, Power BI. #LI-DS1
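One of the platform-wide patterns this listing names is an auditability framework. A minimal, hedged sketch of that idea follows: each pipeline step is wrapped so its status and duration land in an audit log, and a failure stops the run with the log recording how far it got. Step names and the log shape are assumptions for illustration, not Circle K's actual framework.

```python
import time

# Illustrative auditability wrapper: run named pipeline steps, record each
# outcome, and halt on the first failure. Step names are invented.
def run_with_audit(steps, audit_log):
    for name, fn in steps:
        start = time.monotonic()
        try:
            fn()
            status = "success"
        except Exception as exc:
            status = f"failed: {exc}"
        audit_log.append({"step": name, "status": status,
                          "seconds": round(time.monotonic() - start, 3)})
        if status != "success":
            break  # stop the run; the log shows how far we got

def fail_transform():
    raise ValueError("bad row")  # simulated mid-pipeline failure

audit = []
run_with_audit([("extract", lambda: None),
                ("transform", fail_transform),
                ("load", lambda: None)], audit)
print([(e["step"], e["status"]) for e in audit])
# [('extract', 'success'), ('transform', 'failed: bad row')]
```

In a real ADF/Databricks setup the same record would typically be written to an audit table or log-analytics sink rather than an in-memory list.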

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level. Job Description: Job Title: Senior Business Analyst Experience Range: 8-12 Years Location: Chennai, Hybrid Employment Type: Full-Time About UPS UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation. About UPS Supply Chain Symphony™ The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making. About The Role We are seeking an experienced Senior Business Analyst to join our project team responsible for delivering a Microsoft Azure-hosted web application with Angular as the frontend and .NET 8 as the backend framework. The solution follows a micro-frontend and microservices architecture integrated with Azure SQL database. Additionally, the data engineering component involves Azure Data Factory (ADF), Databricks, and Cosmos DB. 
The Senior Business Analyst will play a pivotal role in bridging the gap between business stakeholders, development teams, and data engineering teams. This role involves eliciting and analyzing requirements, defining business processes, and ensuring alignment of project objectives with strategic goals. The candidate will also work closely with architects, developers, and testers to ensure comprehensive requirements coverage and successful project delivery. Key Responsibilities Requirements Elicitation and Analysis: Gather and document business and technical requirements through stakeholder interviews, workshops, and document analysis. Analyze complex data flows and business processes to define clear and concise requirements. Create detailed requirement specifications, user stories, and acceptance criteria for both web application and data engineering components. Business Process Design and Improvement: Define and document business processes, workflows, and data models. Identify areas for process optimization and automation within web and data solutions. Collaborate with stakeholders to design solutions that align with business objectives. Stakeholder Communication and Collaboration: Serve as a liaison between business stakeholders, development teams, and data engineering teams. Facilitate communication and collaboration to ensure stakeholder alignment and understanding. Conduct requirement walkthroughs, design reviews, and user acceptance testing sessions. Solution Validation and Quality Assurance: Ensure requirements traceability throughout the project lifecycle. Validate and test solutions to ensure they meet business needs and objectives. Collaborate with QA teams to define testing strategies and acceptance criteria. Primary Skills Business Analysis: Requirement gathering, process modeling, and gap analysis. Documentation: User stories, functional specifications, and acceptance criteria. Agile Methodologies: Experience in Agile/Scrum environments. 
Stakeholder Management: Effective communication and collaboration with cross-functional teams. Data Analysis: Ability to analyze and interpret complex data flows and business processes.
Secondary Skills - Cloud Platform: Familiarity with Microsoft Azure services. Data Engineering: Understanding of data pipelines, ETL processes, and data modeling. UX/UI Collaboration: Experience collaborating with UX/UI teams for optimal user experience. Communication Skills: Excellent verbal and written communication for stakeholder engagement.
Soft Skills: Strong problem-solving abilities and attention to detail. Excellent communication skills, both verbal and written. Effective time management and organizational capabilities. Ability to work independently and within a collaborative team environment. Strong interpersonal skills to engage with cross-functional teams.
Educational and Preferred Qualifications: Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field. Relevant certifications such as Certified Business Analysis Professional (CBAP), PMI Professional in Business Analysis (PMI-PBA), or Microsoft Certified: Azure Fundamentals. Experience in cloud-native solutions and microservices architecture. Familiarity with Angular and .NET frameworks for web applications.
About the Team: As a Senior Business Analyst, you will be working with a dynamic, cross-functional team that includes developers, product managers, and other quality engineers. You will be a key player in the quality assurance process, helping shape testing strategies and ensuring the delivery of high-quality web applications.
Employee Type: Permanent. UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

Posted 1 month ago

Apply

7.0 years

1 - 10 Lacs

Noida

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by diversity and inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health equity on a global scale. Join us to start Caring. Connecting. Growing together.
Primary Responsibilities:
Design, develop, and implement scalable data pipelines using Azure Databricks. Develop PySpark-based data transformations and integrate structured and unstructured data from various sources. Optimize Databricks clusters for performance, scalability, and cost-efficiency within the Azure ecosystem. Monitor, troubleshoot, and resolve performance bottlenecks in Databricks workloads. Manage orchestration and scheduling of end-to-end data pipelines using tools like Apache Airflow, ADF scheduling, and Logic Apps. Collaborate effectively with the architecture team in designing solutions and with product owners in validating the implementations. Implement best practices to enable data quality, monitoring, logging and alerting for failure scenarios and exception handling. Document step-by-step processes to troubleshoot potential issues and deliver cost-optimized cloud solutions. Provide technical leadership, mentorship, and best practices for junior data engineers. Stay up to date with Azure and Databricks advancements to continuously improve data engineering capabilities. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regard to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.
Required Qualifications:
B.Tech or equivalent. 7+ years of overall experience in the IT industry and 6+ years of experience in data engineering, with 3+ years of hands-on experience in Azure Databricks. Hands-on experience with Delta Lake, Lakehouse architecture, and data versioning. Experience with CI/CD pipelines for data engineering solutions (Azure DevOps, Git). Solid knowledge of performance tuning, partitioning, caching, and cost optimization in Databricks. Deep understanding of data warehousing, data modeling (Kimball/Inmon), and big data processing. Solid expertise in the Azure ecosystem, including Azure Synapse, Azure SQL, ADLS, and Azure Functions. Proficiency in PySpark, Python and SQL for data processing in Databricks. Proven excellent written and verbal communication skills. Proven excellent problem-solving skills and the ability to work independently. Proven ability to balance multiple competing priorities and execute accordingly. Highly self-motivated, with excellent interpersonal and collaborative skills. Proven ability to anticipate risks and obstacles and develop mitigation plans. Excellent documentation experience and skills.
Preferred Qualifications:
Azure certifications (DP-203, AZ-304, etc.). Experience in infrastructure as code, scheduling as code, and automating operational activities using Terraform scripts.
At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life.
Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
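This listing asks for alerting on failure scenarios and exception handling in data pipelines. A small illustrative retry-and-alert wrapper is sketched below; the alert() hook is a stand-in for a real notification channel (email, pager, webhook) and none of this represents Optum's actual implementation.

```python
import functools

# Illustrative retry-with-alerting pattern for flaky pipeline tasks.
alerts = []

def alert(message):
    alerts.append(message)  # placeholder for a real alerting integration

def with_retries(attempts=3):
    def decorate(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            for n in range(1, attempts + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception as exc:
                    alert(f"{fn.__name__} attempt {n} failed: {exc}")
                    if n == attempts:
                        raise  # exhausted retries: escalate the exception
        return wrapper
    return decorate

calls = {"n": 0}

@with_retries(attempts=3)
def flaky_load():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "loaded"

print(flaky_load(), len(alerts))  # loaded 2
```

Orchestrators such as Airflow and ADF provide retries and failure callbacks natively; a wrapper like this is only for logic that runs outside them.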

Posted 1 month ago

Apply

0 years

0 Lacs

Noida

On-site

As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues.
Job Description:
Hands-on experience in supporting, integrating and developing (for example: XML Publisher, BI Publisher Enterprise, Oracle Designer, SQL & PL/SQL, JDeveloper, Java, ADF, HTML and CSS with JavaScript) and extending Oracle Cloud (Financials, Distribution, Manufacturing, HCM). Experience (understanding of the data model, business process functionality and its data flow) in Oracle Fusion and Oracle on-premise applications (Finance or Supply Chain). Developing integrations using OIC, VBCS, and REST APIs/web services. Expertise in PaaS such as Oracle Analytics Cloud Services, Visual Builder Cloud Services and Oracle Integration Services. Career Level - IC4.
As an Advisory Systems Engineer, you are expected to be an expert member of the problem-solving/avoidance team and be highly skilled in solving extremely complex (often previously unknown) critical customer issues. Performing the assigned duties with a high level of autonomy and reporting to management on customer status and technical matters on a regular basis, you will be expected to work with very limited guidance from management. Further, the Advisory Systems Engineer is sought by customers and Oracle employees to provide expert technical advice.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Calcutta

On-site

Skill required: Tech for Operations - Microsoft Azure Cloud Services Designation: App Automation Eng Senior Analyst Qualifications: Any Graduation/12th/PUC/HSC Years of Experience: 5 to 8 years About Accenture Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com. What would you do? In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover.
In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients' supply chains and customer experience. The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting, and maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience in troubleshooting Azure Data Factory environments, conducting code reviews, and bug fixing. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems. What are we looking for? • Bachelor's or Master's degree in Computer Science, Information Technology, or a related field. • Proven experience (5+ years) as an Azure Data Factory Support Engineer II, with expertise in ADF and a deep understanding of its data-related libraries. • Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments. • Proficient in SQL, with experience in SQL database design. • Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy. • Experience with ADF pipelines. • Excellent problem-solving and troubleshooting skills. • Experience in code review and debugging in a collaborative project setting. • Excellent verbal and written communication skills. • Ability to work in a fast-paced, team-oriented environment. • Strong understanding of the business and a passion for the mission of Service Supply Chain. • Hands-on experience with Jira, DevOps ticketing, and ServiceNow is good to have. Roles and Responsibilities: • Innovate. Collaborate. Build. Create. Solve. • Maintain ADF and associated systems, ensuring they meet business requirements and industry practices.
• Integrate new data management technologies and software engineering tools into existing structures. • Recommend ways to improve data reliability, efficiency, and quality. • Use large data sets to address business issues. • Use data to discover tasks that can be automated. • Fix bugs to ensure a robust and sustainable codebase. • Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance. • Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively. • Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines. • Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives. • Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure. • Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy. • Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance. • Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement. • Stay updated with the latest technologies and practices in ADF to continuously improve the support and maintenance of data systems. • Flexible work hours, including coverage of US time zones; this position may require you to work a rotational on-call schedule, evenings, weekends, and holiday shifts when the need arises. • Participate in the Demand Management and Change Management processes.
• Work in partnership with internal business teams, external third-party technical teams, and functional teams as a technology partner, communicating and coordinating delivery of technology services from Technology for Operations (TfO). Qualifications: Any Graduation, 12th/PUC/HSC

Posted 1 month ago

Apply

5.0 - 7.0 years

15 - 30 Lacs

Hyderabad, Bengaluru

Work from Office

Working in shifts on a rotation basis, especially to cover North America customers (EST time zone), is a must. Should have good communication and customer-facing skills. Location: Bangalore or Hyderabad, 3 days/week in office. Candidates must have hands-on experience in the EBS SCM modules Inventory, Order Management, Pricing, and Shipping. 5 to 10 years of relevant working experience. EBS SCM Technical: Technical Support Professional, preferably with an implementation background in Oracle eBiz SCM applications (OM/Pricing/Shipping/OPM/Costing/AR). Proficiency in SQL and PL/SQL for database management, as well as knowledge of Oracle EBS architecture, forms, reports, workflow, and interface development. Oracle Application Framework (OAF): developing web-based applications and customizations. Custom application development: building custom modules and extensions to meet specific business needs. Good knowledge expected in at least one of the following Fusion technologies: ADF, BPEL, ODI, SOA, FBDI, reporting tools. Responsibilities include, but are not limited to, providing excellence in customer service support, diagnosis, replication, and resolution of technical issues in complex and critical service requests. The focus of this position is to provide customer service on a technical level and ultimately drive complete and total resolution of each issue reported by the customer. Troubleshooting & problem solving: debugging EBS applications, identifying and resolving issues within EBS applications. Performance tuning: optimizing EBS performance for efficiency and responsiveness. System integration: ensuring seamless integration between EBS and other systems. Development framework: experience supporting, developing, and testing web applications implemented using frameworks that expose business services via a Model/View/Controller paradigm, such as Oracle ADF, will be an added advantage.

Posted 1 month ago

Apply

7.0 - 12.0 years

0 - 2 Lacs

Pune, Ahmedabad, Gurugram

Work from Office

Urgent Hiring: Azure Data Engineer (Strong PySpark + SCD II/III Expert). Work Mode: Remote. Client-focused interview on PySpark + SCD II/III. Key Must-Haves: Very strong hands-on PySpark coding. Practical experience implementing Slowly Changing Dimensions (SCD) Type II and Type III. Strong expertise in Azure data engineering (ADF, Databricks, Data Lake, Synapse). Proficiency in SQL and Python for scripting and transformation. Strong understanding of data warehousing concepts and ETL pipelines. Good to Have: Experience with Microsoft Fabric. Familiarity with Power BI. Domain knowledge in Finance, Procurement, and Human Capital. Note: This role is highly technical. The client will focus interviews on PySpark coding and SCD Type II/III implementation. Only share profiles that are hands-on and experienced in these areas. Share strong, relevant profiles to: b.simrana@ekloudservices.com
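Slowly Changing Dimension Type II, which this posting screens for, keeps full history by expiring the current dimension row and inserting a new version whenever a tracked attribute changes. In a Databricks job this is typically expressed as a DataFrame or Delta MERGE; the stdlib-only Python sketch below (field names are hypothetical) shows just the core mechanics:

```python
from datetime import date

def apply_scd2(dimension, incoming, key, tracked, today=None):
    """Apply SCD Type II logic to an in-memory dimension table.

    dimension: list of dicts carrying 'is_current', 'valid_from', 'valid_to'.
    incoming:  list of dicts with the latest source values.
    key:       natural-key field name; tracked: fields that trigger a new version.
    """
    today = today or date.today()
    current = {row[key]: row for row in dimension if row["is_current"]}
    for rec in incoming:
        old = current.get(rec[key])
        if old and all(old[f] == rec[f] for f in tracked):
            continue  # no change in tracked attributes: keep the current version
        if old:
            # expire the existing version (close its validity window)
            old["is_current"] = False
            old["valid_to"] = today
        # insert the new current version with an open-ended window
        dimension.append({**rec, "is_current": True,
                          "valid_from": today, "valid_to": None})
    return dimension
```

A change to a tracked column (e.g. a customer's city) leaves the old row expired with `valid_to` set, and appends a fresh current row, so point-in-time queries over the history remain possible.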

Posted 1 month ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role Description Job Title: Java Backend Developer Location: Chennai / Trivandrum Experience: 4+ Years Job Summary We are looking for a skilled Java Backend Developer to join our growing Services Engineering team, contributing to the development of robust, scalable, and innovative solutions in a high-performance environment. This role involves working on Oracle Fusion BPM solutions, developing APIs, and supporting key business processes for a global credit card platform. You’ll play a key role in designing backend components that deliver real impact to end users. Key Responsibilities Design, develop, and maintain backend services and APIs using Java and related technologies. Collaborate within a self-organized engineering team to build features aligned with the product roadmap and business goals. Work within a Service-Oriented Architecture (SOA) methodology, focusing on Oracle Fusion BPM and associated workflows. Administer BPM architecture and configurations across enterprise software applications. Develop custom BPM workflows and ADF user interfaces based on business and user requirements. Write clean, maintainable, and well-documented code while ensuring high-quality deliverables through practices such as TDD, BDD, and pair programming. Support continuous integration and deployment, collaborating with DevOps and QA teams. Maintain and document customizations and extensions in Oracle Fusion and database components. Contribute to team-level innovation and efficiency improvements. Mandatory Skills 4+ years of professional experience in backend development using Java and Object-Oriented Programming (OOP) principles. Proficient in API development, RESTful services, and HTTP protocols. Experience in Oracle Fusion BPM development, including custom BPM workflows and ADF UI. Solid understanding of SOA methodologies and enterprise application integration. Proficiency in Oracle 12c DB, SQL, and relational database concepts. 
Strong collaboration and communication skills, with the ability to work in a fast-paced team environment. Experience with TDD, BDD, and pair programming practices. Good-to-Have Skills: Experience working in cloud environments such as AWS. Exposure to regulated domains or financial services is a plus. Experience solving real-world problems in complex systems. Familiarity with modern CI/CD practices and DevOps tools. Keywords: Java, Oracle Fusion, BPM, ADF, API Development, SQL, Cloud (AWS), Software Engineering, Backend Development. Skills: Cloud Computing, Java, AWS Cloud

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Title: Snowflake Developer Location: Gurugram Experience: 3 to 7 years Skillset: Snowflake, Azure, ADF (Azure Data Factory) Job Type: Full-Time Overview: We are looking for a talented Snowflake Developer with expertise in Snowflake, Azure, and Azure Data Factory (ADF) to join our dynamic team. In this role, you will be responsible for developing, implementing, and optimizing data pipelines and ETL processes. You will work on cloud-based data platforms to ensure the effective and seamless integration of data across systems. The ideal candidate will have a solid background in working with Snowflake and cloud data services, and be ready to travel to client locations as required. Key Responsibilities: • Design, develop, and implement data solutions using Snowflake and Azure technologies. • Develop and manage ETL pipelines using Azure Data Factory (ADF) for seamless data movement and transformation. • Collaborate with cross-functional teams to ensure that the data platform meets business needs and aligns with data architecture best practices. • Monitor, optimize, and troubleshoot data pipelines and workflows in Snowflake and Azure environments. • Implement data governance and security practices in line with industry standards. • Perform data validation and ensure data integrity across systems and platforms. • Ensure data integration and management processes are optimized for performance, scalability, and reliability. • Provide technical support and guidance to junior developers and team members. • Collaborate with the client to understand project requirements and ensure deliverables are met on time. • Be open to travelling to client locations as needed for project delivery and stakeholder engagements. Skills and Qualifications: • 3 to 7 years of hands-on experience in Snowflake development and data management. • Strong working knowledge of Azure (Azure Data Services, Azure Data Lake, etc.) and Azure Data Factory (ADF). 
• Expertise in designing and developing ETL pipelines and data transformation processes using Snowflake and ADF. • Proficiency in SQL and data modeling, with experience working with structured and semi-structured data. • Knowledge of data warehousing concepts and best practices in Snowflake. • Understanding of data security, privacy, and compliance requirements in cloud environments. • Experience with cloud-based data solutions and integration services. • Strong problem-solving and debugging skills. • Ability to work effectively with both technical and non-technical teams. • Good communication skills to collaborate with clients and team members. • Bachelor’s degree in Computer Science, Information Technology, or a related field. Preferred Skills: • Experience with other Azure services like Azure SQL Database, Azure Synapse Analytics, and Power BI. • Familiarity with data governance tools and data pipeline orchestration best practices. • Ability to optimize Snowflake queries and database performance. Why Join Us: • Work with cutting-edge cloud technologies like Snowflake and Azure. • Exposure to complex, large-scale data projects across industries. • Collaborative work environment that promotes innovation and learning. • Competitive salary and benefits package. • Opportunities for career growth and development.
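The ELT pattern this posting describes (ADF lands data in a Snowflake staging table, then transformations run inside Snowflake) is commonly implemented with a MERGE statement. A minimal sketch that generates such an upsert; the table and column names in the usage are hypothetical:

```python
def build_merge_sql(target, staging, key_cols, update_cols):
    """Generate a Snowflake MERGE (upsert) from a staging table into a target.

    key_cols drive the join; update_cols are refreshed on match and, together
    with the keys, inserted when no matching row exists.
    """
    on = " AND ".join(f"t.{c} = s.{c}" for c in key_cols)
    set_clause = ", ".join(f"t.{c} = s.{c}" for c in update_cols)
    cols = ", ".join(key_cols + update_cols)
    vals = ", ".join(f"s.{c}" for c in key_cols + update_cols)
    return (
        f"MERGE INTO {target} t USING {staging} s ON {on} "
        f"WHEN MATCHED THEN UPDATE SET {set_clause} "
        f"WHEN NOT MATCHED THEN INSERT ({cols}) VALUES ({vals})"
    )
```

Generating the statement from column lists like this keeps pipeline code free of hand-maintained SQL strings; in production the output would be executed through the Snowflake connector.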

Posted 1 month ago

Apply

4.0 years

0 Lacs

Kochi, Kerala, India

On-site

Role Description: Develop and manage ETL workflows using Azure Data Factory (ADF). Design and implement data pipelines using PySpark on Azure Databricks. Work with Azure Synapse Analytics, Azure Data Lake, and Azure Blob Storage for data ingestion and transformation. Optimize Spark jobs for performance and scalability in Databricks. Automate data workflows and implement error handling and monitoring in ADF. Collaborate with data engineers, analysts, and business teams to understand data requirements. Implement data governance, security, and compliance best practices in Azure. Debug and troubleshoot PySpark scripts and ADF pipeline failures. Requirements: 4+ years of experience in ETL development with Azure Data Factory (ADF). Hands-on experience with Azure Databricks and PySpark for big data processing. Strong knowledge of Azure services. Proficiency in Python and PySpark for data transformation and processing. Experience with CI/CD pipelines for deploying ADF pipelines and Databricks notebooks. Strong expertise in SQL for data extraction and transformations. Knowledge of performance tuning in Spark and cost optimization on Azure. Skills: Azure Data Factory, PySpark, Azure
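One duty above, implementing error handling and monitoring in ADF, usually means configuring retry counts and intervals on pipeline activities. A minimal Python sketch of the equivalent retry-with-backoff logic; the `activity` callable and settings are illustrative, not an ADF API:

```python
import time

def run_with_retry(activity, max_attempts=3, base_delay=2.0, sleep=time.sleep):
    """Retry a pipeline activity with exponential backoff.

    Mirrors the retry/interval settings on an ADF activity; 'activity' is any
    zero-argument callable (e.g. a function that triggers a Databricks notebook).
    The final failure is re-raised so alerting/monitoring can pick it up.
    """
    for attempt in range(1, max_attempts + 1):
        try:
            return activity()
        except Exception:
            if attempt == max_attempts:
                raise  # exhausted retries: surface the failure
            sleep(base_delay * 2 ** (attempt - 1))  # 2s, 4s, 8s, ...
```

Injecting `sleep` as a parameter keeps the backoff testable; transient failures (throttling, brief network blips) are absorbed while persistent failures still fail the run loudly.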

Posted 1 month ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

We’re hiring a Senior ML Engineer (MLOps), 3-5 yrs. Location: Kochi or Chennai. What you’ll do: Tame data → pull, clean, and shape structured & unstructured data. Orchestrate pipelines → Airflow / Step Functions / ADF… your call. Ship models → build, tune, and push to prod on SageMaker, Azure ML, or Vertex AI. Scale → Spark / Databricks for the heavy lifting. Automate everything → Docker, Kubernetes, CI/CD, MLflow, Seldon, Kubeflow. Pair up → work with engineers, architects, and business folks to solve real problems, fast. What you bring: 3+ yrs hands-on MLOps (4-5 yrs total software experience). Proven chops on one hyperscaler (AWS, Azure, or GCP). Confidence with Databricks / Spark, Python, SQL, TensorFlow / PyTorch / Scikit-learn. You debug Kubernetes in your sleep and treat Dockerfiles like breathing. You prototype with open source first, choose the right tool, then make it scale. Sharp mind, low ego, bias for action. Nice-to-haves: SageMaker, Azure ML, or Vertex AI in production. Love for clean code, clear docs, and crisp PRs. Why Datadivr? Domain focus: we live and breathe F&B — your work ships to plants, not just slides. Small team, big autonomy: no endless layers; you own what you build. 📬 How to apply: Shoot your CV + a short note on a project you shipped to careers@datadivr.com or DM me here. We reply to every serious applicant. Know someone perfect? Please share — good people know good people.

Posted 1 month ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

The Technical Lead / Technical Consultant is a core role and focal point of the project team responsible for the whole technical solution and managing the day-to-day delivery. The role will focus on the technical solution architecture, detailed technical design, coaching of the development/implementation team, and governance of the technical delivery. Technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best practice advice. Interactions with internal stakeholders and clients to explain technology solutions and a clear understanding of client’s business requirements through which to guide optimal design to meet their needs. Job Description: Must-Have Skills: Database (one or more of MS SQL Server, Oracle, Cloud SQL, Cloud Spanner, etc.) Data Warehouse (one or more of Big Query, SnowFlake, etc.) ETL tool (two or more of Cloud Data Fusion, Dataflow, Dataproc, Pub/Sub, Composer, Cloud Functions, Cloud Run, etc.) Experience in Cloud platforms - GCP Python, PySpark, Project & resource management SVN, JIRA, Automation workflow (Composer, Cloud Scheduler, Apache Airflow, Tidal, Tivoli or similar) Good to have Skills: UNIX shell scripting, SnowFlake, Redshift, Familiar with NoSQL such as MongoDB, etc ETL tool (Databricks / AWS Glue / AWS Lambda / Amazon Kinesis / Amazon Firehose / Azure Data Factory / ADF / DBT / Talend, Informatica, IICS (Informatica cloud) ) Experience in Cloud platforms - AWS / Azure Client-facing skills Key Responsibilities: Ability to design simple to medium data solutions for clients by using cloud architecture using GCP Strong understanding of DW, data mart, data modeling, data structures, databases, and data ingestion and transformation. 
Working knowledge of ETL as well as database skills. Working knowledge of data modeling, data structures, databases, and ETL processes. Strong understanding of relational and non-relational databases and when to use them. Leadership and communication skills to collaborate with local leadership as well as our global teams. Translating technical requirements into ETL/SQL application code. Document project architecture, explain the detailed design to the team, and create low-level to high-level designs. Create technical documents for ETL and SQL developments using Visio, PowerPoint, and other MS Office packages. Will need to engage with Project Managers, Business Analysts, and Application DBAs to implement ETL solutions. Perform mid- to complex-level tasks independently. Support Clients, Data Scientists, and Analytical Consultants working on marketing solutions. Work with cross-functional internal teams and external clients. Strong project management and organization skills; ability to lead 1 – 2 projects with a team size of 2 – 3 members. Manage code, including code review and deployment. Work closely with the QA/Testing team to help identify and implement defect reduction initiatives. Work closely with the Architecture team to make sure architecture standards and principles are followed during development. Perform Proofs of Concept on new platforms and validate proposed solutions. Work with the team to establish and reinforce disciplined software development processes, standards, and error recovery procedures. Must understand software development methodologies, including waterfall and agile. Distribute and manage SQL development work across the team. The candidate must be willing to work during overlapping hours with US-based teams to ensure effective collaboration and communication, typically between [e.g., 6:00 PM to 11:00 PM IST], depending on project needs.
Qualifications: Bachelor's or Master's degree in Computer Science with >= 7 years of IT experience. Location: Bangalore. Brand: Merkle. Time Type: Full time. Contract Type: Permanent

Posted 1 month ago

Apply

7.0 years

0 Lacs

New Delhi, Delhi, India

On-site

The Technical Lead / Technical Consultant is a core role and focal point of the project team responsible for the whole technical solution and managing the day-to-day delivery. The role will focus on the technical solution architecture, detailed technical design, coaching of the development/implementation team, and governance of the technical delivery. Technical ownership of the solution from bid inception through implementation to client delivery, followed by after-sales support and best practice advice. Interactions with internal stakeholders and clients to explain technology solutions and a clear understanding of client’s business requirements through which to guide optimal design to meet their needs. Job Description: Must-Have Skills: Database (one or more of MS SQL Server, Oracle, Cloud SQL, Cloud Spanner, etc.) Data Warehouse (one or more of Big Query, SnowFlake, etc.) ETL tool (two or more of Cloud Data Fusion, Dataflow, Dataproc, Pub/Sub, Composer, Cloud Functions, Cloud Run, etc.) Experience in Cloud platforms - GCP Python, PySpark, Project & resource management SVN, JIRA, Automation workflow (Composer, Cloud Scheduler, Apache Airflow, Tidal, Tivoli or similar) Good to have Skills: UNIX shell scripting, SnowFlake, Redshift, Familiar with NoSQL such as MongoDB, etc ETL tool (Databricks / AWS Glue / AWS Lambda / Amazon Kinesis / Amazon Firehose / Azure Data Factory / ADF / DBT / Talend, Informatica, IICS (Informatica cloud) ) Experience in Cloud platforms - AWS / Azure Client-facing skills Key Responsibilities: Ability to design simple to medium data solutions for clients by using cloud architecture using GCP Strong understanding of DW, data mart, data modeling, data structures, databases, and data ingestion and transformation. 
Working knowledge of ETL as well as database skills. Working knowledge of data modeling, data structures, databases, and ETL processes. Strong understanding of relational and non-relational databases and when to use them. Leadership and communication skills to collaborate with local leadership as well as our global teams. Translating technical requirements into ETL/SQL application code. Document project architecture, explain the detailed design to the team, and create low-level to high-level designs. Create technical documents for ETL and SQL developments using Visio, PowerPoint, and other MS Office packages. Will need to engage with Project Managers, Business Analysts, and Application DBAs to implement ETL solutions. Perform mid- to complex-level tasks independently. Support Clients, Data Scientists, and Analytical Consultants working on marketing solutions. Work with cross-functional internal teams and external clients. Strong project management and organization skills; ability to lead 1 – 2 projects with a team size of 2 – 3 members. Manage code, including code review and deployment. Work closely with the QA/Testing team to help identify and implement defect reduction initiatives. Work closely with the Architecture team to make sure architecture standards and principles are followed during development. Perform Proofs of Concept on new platforms and validate proposed solutions. Work with the team to establish and reinforce disciplined software development processes, standards, and error recovery procedures. Must understand software development methodologies, including waterfall and agile. Distribute and manage SQL development work across the team. The candidate must be willing to work during overlapping hours with US-based teams to ensure effective collaboration and communication, typically between [e.g., 6:00 PM to 11:00 PM IST], depending on project needs.
Qualifications: Bachelor's or Master's degree in Computer Science with >= 7 years of IT experience. Location: Bangalore. Brand: Merkle. Time Type: Full time. Contract Type: Permanent

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

1. Implementation of data pipelines and data workflows using big data tools such as Azure Data Factory, PySpark, SparkSQL, SQL, and Azure DevOps. 2. Hands-on PySpark programming. 3. Hands-on SQL (stored procedures, complex SQL queries) and data modelling.

Posted 1 month ago

Apply

12.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Description Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics and software. We also assist millions of people to accomplish their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com. Job Description The Lead Engineer (.NET/Azure) is a hands-on, contributor position, responsible for creating solutions and architectures for high-volume, high-transaction applications across the Experian Employer Services (EES) organization. The Lead Software Engineer (.NET/Azure) will write code, participate in code reviews, evaluate SAST findings, and collaborate closely with other members of the larger Experian Employer Services organization, to provide high-quality software solutions to our clients and partners. The Lead Engineer (.NET/Azure) will also mentor other engineers, delegating work, and evaluating acquired technologies and guiding the best way to incorporate these acquired technologies into the Experian Employer Services product ecosystem. You will be reporting to Director - Engineering. 
Responsibilities Analyze new feature requirements, including: architectural design considerations, software development best practices, test strategies, database design, security considerations, and cloud architecture considerations. Create new and modernize existing applications that look great across multiple devices. Create new and modernize existing APIs and partner integrations. Implement high-quality code and unit tests. Lead design sessions to define Azure-based technical solutions. Participate in code reviews and provide meaningful feedback. Adhere to Experian's Secure SDLC practices to ensure secure and compliant development. Resolve bugs identified by QA in a timely manner. Demonstrate functionality to the Product team for approval. Promote DevOps culture and work closely with IT as required. Assist other team members as needed. Delegate tasks to other team members and oversee work quality. Join the on-call rotation for any platform emergencies. Technical Requirements Extensive experience with C#, .NET Framework, .NET Core, Azure. Extensive experience with MS SQL Server, Azure SQL, T-SQL, relational database design. Extensive experience with frontend technologies (HTML, CSS, JavaScript, Angular, ReactJS). Extensive experience with Azure cloud solutions (IaaS, SaaS). Extensive experience with APIs, microservices, container development, and integrations. Extensive experience with ETL technologies like SSIS and ADF. Expert knowledge of the latest architectural patterns and cloud-native development. Experience with Azure DevOps CI/CD pipelines. Experience with Agile software methodologies. Experience with Entity Framework or another ORM. General Requirements 12+ years of experience in the Microsoft stack (.NET, .NET Core, C#, SQL Server) and architectural experience. 3+ years of team lead experience. 5+ years of Azure cloud experience. Qualifications Bachelor's in Computer Science, Master's preferred. Preferred Qualifications Azure certifications. Understanding of ML/AI concepts.
Strong understanding of DevOps practices and tools. Additional Information Our uniqueness is that we truly celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's strong people-first approach is award winning: Great Place To Work™ in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (globally 4.4 stars), to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity. Experian Careers - Creating a better tomorrow together. Benefits Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off. Find out what it's like to work for Experian by clicking here

Posted 1 month ago

Apply

0 years

0 Lacs

India

On-site

We are seeking an experienced Azure Data Architect to lead data architecture design and implementation on the Azure cloud. The ideal candidate will have deep expertise in Databricks and Azure Functions, and hold a valid Azure certification. Required Skills: Experience in data architecture and engineering. Strong hands-on experience with Azure Databricks and Spark. Proficiency in Azure Functions, Data Lake, Azure Synapse, ADF. Certified in Microsoft Azure (e.g., AZ-305, DP-203). Solid understanding of data modeling, governance, and performance tuning.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Experience: as listed against the skill below
Notice Period: Immediate joiners
Work Timings: 1 PM – 10 PM
Location: Gurgaon; work from office (hybrid mode) at the client location

Technical Role - Primary & Mandatory Skill:
- SQL + ADF (5+ years): strong experience in SQL development, along with experience in AWS cloud and good experience in ADF
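The SQL development skill above can be illustrated with a small, self-contained sketch using Python's built-in sqlite3 module (the table and data are invented for the example; a real engagement would run against Azure SQL, AWS RDS, or similar):

```python
import sqlite3

# In-memory database with a toy orders table, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "N", 100.0), (2, "N", 50.0), (3, "S", 75.0)],
)

# A typical aggregation query: total order value per region.
rows = conn.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region ORDER BY region"
).fetchall()
print(rows)  # [('N', 150.0), ('S', 75.0)]
```

Note the parameterized `?` placeholders in the inserts: binding values instead of string-formatting them into SQL is the standard defense against injection, regardless of the database engine.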

Posted 1 month ago

Apply

0 years

0 Lacs

India

On-site

Location - Bangalore/Gurgaon

Key Responsibilities:
- Lead the implementation and optimization of Microsoft Purview across the client's data estate on MS Fabric or the Azure cloud platform (ADF, Databricks, etc.).
- Define and enforce data governance policies, data classification, sensitivity labeling, and data lineage to ensure readiness for GenAI use cases.
- Collaborate with data engineers, architects, and AI/ML teams to ensure data discoverability, compliance, and ethical AI readiness.
- Design and implement data cataloging strategies to support GenAI model training and inference.
- Provide guidance on data access controls, privacy, and regulatory compliance (e.g., GDPR, HIPAA).
- Conduct workshops and training sessions for client stakeholders on Purview capabilities and best practices.
- Monitor and report on data governance KPIs and GenAI readiness metrics.

Required Skills & Qualifications:
- Proven experience as a Microsoft Purview SME in enterprise environments.
- Strong knowledge of Microsoft Fabric, OneLake, and Synapse Data Engineering.
- Experience with data governance frameworks and metadata management.
- Hands-on experience with data classification, sensitivity labels, and data lineage tracking.
- Understanding of compliance standards and data privacy regulations.
- Excellent communication and stakeholder management skills.

Preferred Qualifications:
- Microsoft certifications in Azure Data, Purview, or Security & Compliance.
- Experience working with Azure OpenAI, Copilot integrations, or other GenAI platforms.
- Background in data science, AI ethics, or ML operations is a plus.
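Purview's data classification is configured inside the service itself, but the idea behind pattern-based sensitivity labeling can be sketched with a hypothetical rule-based classifier (the rules, labels, and patterns below are invented for illustration and are not Purview's actual rule set):

```python
import re

# Hypothetical classification rules; real rules live in the governance service.
RULES = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "pan":   re.compile(r"\b[A-Z]{5}\d{4}[A-Z]\b"),  # Indian PAN-style identifier
}

def classify(value: str) -> list[str]:
    """Return the sensitivity labels whose pattern matches the value."""
    return [label for label, rx in RULES.items() if rx.search(value)]

print(classify("contact: jane.doe@example.com"))  # ['email']
print(classify("no sensitive data here"))         # []
```

In a real deployment, matches like these would drive label assignment, access policies, and lineage reporting rather than just returning a list.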

Posted 1 month ago

Apply

4.0 years

0 Lacs

Greater Kolkata Area

Remote

Position Title: Manager - Data Science
Location: Remote (hybrid option available if in Chennai)
Company: ADF Data Science Pvt. Ltd. - Analytics, Risk and R&D
Position Type: Full-Time

Job Summary
We are seeking skilled and motivated Data Scientists with 4+ years of experience in data science and good domain understanding. The ideal candidate will have a strong foundation in data science concepts, proficiency in analytical tools, and the ability to translate data insights into actionable business recommendations. This role requires a blend of technical expertise and business acumen, preferably in financial (credit, risk) domains, to drive data-driven decision making. This will be an individual contributor role, or a lead for a small team if relevant experience is present.

Qualifications
Education: Bachelor of Engineering or a master's degree in a quantitative area. It is mandatory that the candidate be from a tier-1 institute.

Experience
- 4+ years of experience in data science and business analytics projects.
- Exposure to credit risk analytics.
- Proven experience in data handling and analytics, with good exposure to statistical analysis and machine learning.

Technical Skills
- Expertise in programming languages such as Python and SQL.
- Expertise in machine learning algorithms.

Soft Skills
- Strong analytical and problem-solving skills.
- Excellent communication and interpersonal skills.
- Ability to lead a team.
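A core building block in the credit risk analytics mentioned above is the logistic link, which turns a linear risk score into a probability of default. A minimal sketch with illustrative (not fitted) coefficients:

```python
import math

def default_probability(score: float,
                        intercept: float = -2.0,
                        slope: float = 1.5) -> float:
    """Logistic link: map a linear risk score to a probability of default.

    The intercept and slope here are illustrative placeholders; in practice
    they come from a fitted model such as a logistic regression scorecard.
    """
    z = intercept + slope * score
    return 1.0 / (1.0 + math.exp(-z))

print(round(default_probability(0.0), 4))  # 0.1192
```

Because the logistic function is monotone, a higher linear score always maps to a higher default probability, which is what makes scorecard cutoffs well-defined.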

Posted 1 month ago

Apply

5.0 - 8.0 years

10 - 18 Lacs

Pune, Gurugram, Bengaluru

Hybrid

5-8 years of strong experience working on Microsoft Azure-based technologies and frameworks.
- Good experience in design and implementation of Azure-based solutions using .Net Core, C#, Function Apps, Azure Service Bus, Azure SQL Server and SSIS packages, among other Azure-related technologies.
- Experience designing and implementing REST+JSON web services.
- Expertise in the application of software design patterns, object-oriented practices, and the software development life cycle: testing, version control, deployment, production support and maintenance.
- Expertise in relational and non-relational database concepts, design, and database management systems.
- Ability to capture, document, and implement functional and non-functional requirements into technical solutions.
- Experience working in an Agile-driven development model and delivering projects within an agile environment.
- Ensures quality of deliverables within project timelines.
- Independently manages daily client communication, especially over calls.
- Drives the work towards completion with accuracy and timely deliverables.
- Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts.
- Experience in project management and team management.
- Good to have: Financial Services knowledge.
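The REST+JSON requirement above comes down to serializing and parsing JSON contracts reliably on both ends of the wire. A minimal sketch using Python's standard library (the payload shape is hypothetical; the role itself targets .Net Core/C#, where System.Text.Json plays the equivalent part):

```python
import json

# Hypothetical response payload for illustration; real contracts live in the API spec.
raw = '{"orderId": 42, "status": "shipped", "items": [{"sku": "A1", "qty": 2}]}'

order = json.loads(raw)                              # parse JSON into Python objects
total_qty = sum(item["qty"] for item in order["items"])  # aggregate over nested items
echo = json.dumps(order, sort_keys=True)             # re-serialize deterministically

print(order["status"], total_qty)  # shipped 2
```

Round-tripping payloads like this (parse, transform, re-serialize) is also the usual basis for contract tests against a REST endpoint.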

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies