
1437 Adf Jobs - Page 3

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

12.0 years

6 - 8 Lacs

Hyderābād

On-site

Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Oracle. Management Level: Senior Manager.

Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. However, ensuring a streamlined end-to-end Oracle Fusion technical landscape that adapts seamlessly to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities: Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer.
Completed at least 2 full Oracle Cloud (Fusion) implementations. Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion). Extensively worked on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration Cloud (OIC).

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration Cloud (OIC).
Preferred skill sets: database structure for ERP/Oracle Cloud (Fusion).
Years of experience required: minimum 12 years of Oracle Fusion experience.
Educational qualification: BE/BTech, MBA. Degrees/fields of study required: Bachelor of Technology, Master of Business Administration.
Required skills: Oracle Integration Cloud (OIC).
Optional skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Coaching and Feedback, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Influence, Innovation, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Professional Courage {+ 9 more}.
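
For readers unfamiliar with the FBDI workflow named in the skill sets above, here is a minimal, illustrative Python sketch of the packaging step: FBDI loads are CSV files prepared against the published template for a given interface table, zipped, and then uploaded and imported in Fusion. The file name, columns, and values below are hypothetical, not from this posting.

```python
import csv
import zipfile

# Hypothetical journal-import batch; the real column layout comes from the
# published FBDI template for the target interface table, not this sketch.
rows = [
    ("NEW", "Corporate Ledger", "2025/07/01", "USD", "1001", 2500.00),
    ("NEW", "Corporate Ledger", "2025/07/01", "USD", "1002", -2500.00),
]

with open("GlInterface.csv", "w", newline="") as f:
    csv.writer(f).writerows(rows)

# FBDI expects the data file(s) bundled as a zip before upload to Fusion.
with zipfile.ZipFile("gl_import.zip", "w", zipfile.ZIP_DEFLATED) as zf:
    zf.write("GlInterface.csv")
```

The upload and the subsequent import job (for example via an OIC integration calling the ERP adapter) are deliberately out of scope here.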

Posted 4 days ago

Apply

7.0 years

6 - 10 Lacs

Hyderābād

On-site

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Manager – Azure Data Architect

As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Manufacturing, Healthcare, Retail and Auto, Supply Chain, and Finance.

The opportunity: We're looking for Managers (Big Data Architects) with strong technology and data understanding and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your key responsibilities:
Develop standardized practices for delivering new products and capabilities using Big Data and cloud technologies, including data acquisition, transformation, analysis, modelling, governance and data management.
Interact with senior client technology leaders to understand their business goals; create and propose solutions, estimate effort, build architectures, and develop and deliver technology solutions.
Define and develop client-specific best practices around data management within a cloud environment.
Recommend design alternatives for data ingestion, processing and provisioning layers.
Design and develop data ingestion programs to process large data sets in batch mode using ADB, ADF, PySpark, Python and Synapse.
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.
Experience managing teams and delivering end to end, and experience building technical capability and teams to deliver.

Skills and attributes for success:
Strong understanding of and familiarity with all cloud ecosystem components.
Strong understanding of underlying cloud architectural concepts and distributed computing paradigms.
Experience in the development of large-scale data processing.
Hands-on programming experience in ADB, ADF, Synapse, Python, PySpark and SQL.
Hands-on expertise in cloud services like AWS and/or the Microsoft Azure ecosystem.
Solid understanding of ETL methodologies in a multi-tiered stack with data modelling and data governance.
Experience with BI and data analytics databases.
Experience converting business problems/challenges into technical solutions considering security, performance, scalability, etc.
Experience in enterprise-grade solution implementations and in performance-benchmarking enterprise applications.
Strong stakeholder, client, team, process and delivery management skills.

To qualify for the role, you must have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution; excellent written and verbal communication, both formal and informal; the ability to multi-task under pressure and work independently with minimal supervision; a team-player attitude in a cooperative and collaborative environment; and adaptability to new technologies and standards.
Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support. Minimum 7 years of hands-on experience in one or more of the above areas. Minimum 10 years of industry experience.

Ideally, you'll also have project management, client management and solutioning skills.

What we look for: people with technical experience and enthusiasm to learn new things in this fast-moving environment.

What working at EY offers: At EY, we're dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: support, coaching and feedback from some of the most engaging colleagues around; opportunities to develop new skills and progress your career; and the freedom and flexibility to handle your role in a way that's right for you.

EY | Building a better working world: EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
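
To make the batch-ingestion responsibility above concrete, here is a minimal PySpark sketch of the pattern the posting describes (read raw files, standardize, write to a curated lake zone). Storage paths, container names, and the order_id key are placeholders, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("batch-ingest").getOrCreate()

# Read a raw batch drop from the landing zone (path is a placeholder).
raw = (spark.read
       .option("header", "true")
       .option("inferSchema", "true")
       .csv("abfss://raw@<storageaccount>.dfs.core.windows.net/orders/"))

# Standardize: stamp the load date and drop duplicate business keys.
clean = (raw
         .withColumn("ingest_date", F.current_date())
         .dropDuplicates(["order_id"]))

# Land the curated output as partitioned Delta for downstream consumers.
(clean.write
 .format("delta")
 .mode("append")
 .partitionBy("ingest_date")
 .save("abfss://curated@<storageaccount>.dfs.core.windows.net/orders/"))
```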

Posted 4 days ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Gurgaon

On-site

Manager EXL/M/1390788 | Services, Gurgaon | Posted On: 18 Jul 2025 | End Date: 01 Sep 2025 | Required Experience: 5 - 10 Years

Basic Section: Number of Positions: 1. Band: C1 (Manager). Cost Code: D005894. Campus/Non Campus: Non Campus. Employment Type: Permanent. Requisition Type: New. Max CTC: 10.0000 - 25.0000. Complexity Level: Not Applicable. Work Type: Hybrid (working partly from home and partly from office). Organisational Group: Analytics. Sub Group: Insurance. Organization: Services. LOB: Consulting. SBU: Analytics. Country: India. City: Gurgaon. Center: EXL - Gurgaon Center 38. Skills: Azure. Minimum Qualification: B.Tech/B.E. Certification: No data available.

Job Description: Analytics – JD (Azure DE). EXL (NASDAQ: EXLS) is a leading operations management and analytics company that helps businesses enhance growth and profitability in the face of relentless competition and continuous disruption. Using our proprietary, award-winning Business EXLerator Framework™, which integrates analytics, automation, benchmarking, BPO, consulting, industry best practices and technology platforms, we look deeper to help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance. EXL serves the insurance, healthcare, banking and financial services, utilities, travel, transportation and logistics industries. Headquartered in New York, New York, EXL has more than 61,000 professionals in locations throughout the United States, Europe, Asia (primarily India and the Philippines), Latin America, Australia and South Africa.

EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting-edge analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients' decision making and embed analytics more deeply into their business processes. Our global footprint of nearly 12,000 data scientists and analysts assists client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, transportation and logistics industries. Please visit www.exlservice.com for more information about EXL Analytics.

Job Title: Consultant / Senior Consultant – Azure Data Engineering. Location: India – Gurgaon preferred. Industry: Insurance Analytics & AI Vertical.

Role Overview: We are seeking a hands-on Consultant / Senior Consultant with strong expertise in Azure-based data engineering to support end-to-end development and delivery of data pipelines for our insurance clients. The ideal candidate will have a deep understanding of Azure Data Factory, ADLS, Databricks (preferably with DLT and Unity Catalog), SQL, and Python, and be comfortable working in a dynamic, client-facing environment. This is a key offshore role requiring both technical execution and solution-oriented thinking to support modern data platform initiatives.
Collaborate with data scientists, analysts, and stakeholders to gather requirements and define data models that effectively support business requirements. Demonstrate decision-making, analytical and problem-solving abilities. Strong verbal and written communication skills to manage client discussions. Familiarity with Agile methodologies: daily scrum, sprint planning, backlog refinement.

Key Responsibilities & Skillsets:
Design and develop scalable and efficient data pipelines using Azure Data Factory (ADF) and Azure Data Lake Storage (ADLS).
Build and maintain Databricks notebooks for data ingestion, transformation, and quality checks, using Python and SQL.
Work with Delta Live Tables (DLT) and Unity Catalog (preferred) to improve pipeline automation, governance, and performance.
Collaborate with data architects, analysts, and onshore teams to translate business requirements into technical specifications.
Troubleshoot data issues, ensure data accuracy, and apply best practices in data engineering and DevOps.
Support the migration of legacy SQL pipelines to modern Python-based frameworks.
Ensure adherence to data security, compliance, and performance standards, especially within insurance domain constraints.
Provide documentation, status updates, and technical insights to stakeholders as required.
Excellent communication skills and stakeholder management.

Required Skills & Experience:
3–7 years of strong hands-on experience in data engineering with a focus on Azure cloud technologies.
Proficient in Azure Data Factory, Databricks, and ADLS Gen2, with working knowledge of Unity Catalog.
Strong programming skills in both SQL and Python, especially within Databricks notebooks; PySpark expertise is good to have.
Experience with Delta Lake / Delta Live Tables (DLT) is a plus.
Good understanding of ETL/ELT concepts, data modeling, and performance tuning.
Exposure to insurance or financial services data projects is highly preferred.
Strong communication and collaboration skills in an offshore delivery model.
Experience working in Agile/Scrum teams.
Familiarity with Azure DevOps, Git, and CI/CD practices.
Certifications in Azure Data Engineering (e.g., DP-203) or Databricks.

What we offer: EXL Analytics offers an exciting, fast-paced and innovative environment, which brings together a group of sharp and entrepreneurial professionals who are eager to influence business decisions. From your very first day, you get an opportunity to work closely with highly experienced, world-class analytics consultants. You can expect to learn many aspects of the businesses our clients engage in. You will also learn effective teamwork and time-management skills, key aspects of personal and professional growth. Analytics requires different skill sets at different levels within the organization. At EXL Analytics, we invest heavily in training you in all aspects of analytics as well as in leading analytical tools and techniques. We provide guidance and coaching to every employee through our mentoring program, wherein every junior-level employee is assigned a senior-level professional as an advisor. The sky is the limit for our team members. The unique experiences gathered at EXL Analytics set the stage for further growth and development in our company and beyond. "EOE/Minorities/Females/Vets/Disabilities". Workflow Type: Back Office.
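
Since the posting calls out Delta Live Tables and pipeline governance, a minimal illustrative DLT sketch follows. It runs inside a Databricks DLT pipeline (where the spark session and the dlt module are provided), not as a standalone script; the landing path, table names, and expectation rule are assumptions for illustration only.

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw policy records landed from ADLS")
def policies_raw():
    # Landing path is a placeholder for the client's actual container layout.
    return (spark.read.format("json")
            .load("abfss://landing@<storageaccount>.dfs.core.windows.net/policies/"))

@dlt.table(comment="Validated policy records")
@dlt.expect_or_drop("valid_policy_id", "policy_id IS NOT NULL")  # failing rows are dropped
def policies_clean():
    return dlt.read("policies_raw").withColumn("processed_at", F.current_timestamp())
```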

Posted 4 days ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

Diageo's ambition is to be one of the best performing, most trusted, and respected consumer products companies in the world. The strategy is to support premiumisation in developed and emerging countries by offering a broad portfolio across different consumer occasions and price points. This approach also plays a crucial role in shaping responsible drinking trends in markets where international premium spirits are an emerging category.

As a member of Diageo's Analytics & Insights team, you will be instrumental in designing, developing, and implementing analytics products to drive the company's competitive advantage and facilitate data-driven decisions. Your role will involve advancing the sophistication of analytics throughout Diageo, serving as a data evangelist to empower stakeholders, identifying meaningful insights from vast data sources, and communicating findings to drive growth, enhance consumer experiences, and optimize business processes. While the role does not entail budget ownership, understanding architecture resource costs is necessary. You will be supporting global initiatives and functions across various markets, working closely with key stakeholders to create possibilities, foster conditions for success, promote personal and professional growth, and maintain authenticity in all interactions.

The purpose of the role includes owning and developing a domain-specific data visualization product portfolio, ensuring compliance with technological and business priorities, and contributing to the end-to-end build of analytics products meeting enterprise standards. You will lead agile teams in developing robust BI solutions, provide technical guidance, oversee data flow, and collaborate with internal and external partners to deliver innovative solutions. Your top accountabilities will involve technical leadership in analytics product builds, optimization of data visualization architecture, BAU support, and feedback to enhance data model standards. Business acumen is essential, particularly in working with marketing data and building relationships with stakeholders to drive data-led innovation.

Required qualifications include multiple years of experience in BI solution development, a bachelor's degree in a relevant field, hands-on experience as a lead developer, proficiency in DAX and the M language, knowledge of Azure architecture, and expertise in data acquisition and processing. Additionally, experience with the Azure platform, technical documentation, DevOps solutions, and Agile methodologies, and a willingness to deepen solution architecture skills are vital. Experience with structured and unstructured datasets, design collaboration, user experience best practices, and visualization trends is advantageous. A dynamic personality, proficiency in English, and excellent communication skills are key for success in this role.

Posted 4 days ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve continuous improvement and optimisation of the managed services process, tools and services.

You are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt, take ownership and consistently deliver quality work that drives value for our clients and success as a team.

Skills: Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include, but are not limited to:
Apply a learning mindset and take ownership for your own development.
Appreciate diverse perspectives, needs, and feelings of others.
Adopt habits to sustain high performance and develop your potential.
Actively listen, ask questions to check understanding, and clearly express ideas.
Seek, reflect, act on, and give feedback.
Gather information from a range of sources to analyse facts and discern patterns.
Commit to understanding how the business works and building commercial awareness.
Learn and apply professional and technical standards (e.g. specific PwC tax and audit guidance), and uphold the Firm's code of conduct and independence requirements.

Role: Specialist. Tower: Data Analytics & Insights Managed Service. Experience: 1 - 3 years. Key Skills: Data Engineering. Educational Qualification: Bachelor's degree in computer science/IT or a relevant field. Work Location: Bangalore, India.

Job Description: As a Specialist, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution using Data, Analytics & Insights skills. PwC Professional skills and responsibilities for this management level include, but are not limited to:
Use feedback and reflection to develop self-awareness and personal strengths, and address development areas.
Be flexible to work in stretch opportunities/assignments.
Demonstrate critical thinking and the ability to bring order to unstructured problems.
Review ticket quality and deliverables; provide status reporting for the project.
Adhere to SLAs; experience in incident management, change management and problem management.
Review your work and that of others for quality, accuracy, and relevance.
Know how and when to use the tools available for a given situation, and explain the reasons for this choice.
Seek and embrace opportunities that give exposure to different situations, environments, and perspectives.
Use straightforward communication, in a structured way, when influencing and connecting with others.
Read situations and modify behavior to build quality relationships.
Uphold the firm's code of ethics and business conduct.
Demonstrate leadership capabilities by working with clients directly and leading the engagement.
Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
Be a good team player; take up cross-competency work and contribute to COE activities.
Handle escalation and risk management.
Position Requirements
Required Skills: Primary: ETL/ELT, SQL, Informatica, Python. Secondary: Azure/AWS/GCP, Talend, DataStage, etc.

Data Engineer:
Should have a minimum of 1 year of Operate/Managed Services/Production Support experience.
Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption such as Business Intelligence systems, analytics modelling, and data scientists.
Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
Should have experience building efficient ETL/ELT processes using industry-leading tools like Informatica, Talend, SSIS, SSRS, AWS, Azure, ADF, GCP, Snowflake, Spark, SQL, Python, etc.
Should have hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc.
Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage.
Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data.
Advanced working SQL knowledge, experience with relational databases and query authoring (SQL), and working familiarity with a variety of databases.
Scaling and optimizing schemas, and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments.
Should have experience with ITIL processes like incident management, problem management, knowledge management, release management, Data DevOps, etc.
Should have strong communication, problem-solving, quantitative and analytical abilities.

Nice to have: Certifications in cloud technology are an added advantage. Experience with visualization tools like Power BI, Tableau, Qlik, etc.

Managed Services - Data, Analytics & Insights: At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.
Within our global Managed Services platform, we provide the Data, Analytics & Insights Managed Service, where we focus on the evolution of our clients' data, analytics, insights and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective. As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive in a high-paced work environment and can work on a mix of critical Application Evolution Service offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort to helping win and support customer engagements, not only from a technical perspective but also from a relationship perspective.
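
As an illustration of the pipeline-monitoring duty described in the requirements above, here is a small, hedged sketch of a source-to-target reconciliation check, the kind of routine validation an Operate/Managed Services data engineer might automate. The connection objects and table name are assumptions; any DB-API 2.0 driver (pyodbc, psycopg2, etc.) fits this shape.

```python
def row_count(conn, table: str) -> int:
    # Works with any DB-API 2.0 connection; table names come from trusted
    # config, not user input, so plain interpolation is acceptable here.
    with conn.cursor() as cur:
        cur.execute(f"SELECT COUNT(*) FROM {table}")
        return cur.fetchone()[0]

def reconcile(src_conn, tgt_conn, table: str) -> None:
    # Compare post-load row counts; in an Operate setting a mismatch would
    # typically raise an incident ticket rather than a bare exception.
    src, tgt = row_count(src_conn, table), row_count(tgt_conn, table)
    if src != tgt:
        raise ValueError(f"{table}: source has {src} rows, target has {tgt}")
```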

Posted 4 days ago

Apply

40.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description: At Oracle Health, we put humans at the heart of every conversation. Our mission is to create a human-centric healthcare experience powered by unified global data. As a global leader, we're looking for a Data Engineer with BI experience to join an exciting project replacing existing data warehouse systems with Oracle's own data warehouse, which will store all internal corporate data and provide insights that help our teams make critical business decisions. Join us and create the future!

Responsibilities (Career Level - IC3):
Proficient in writing and optimising SQL queries for data extraction.
Translate client requirements into technical designs that junior team members can implement.
Develop code that aligns with the technical design and coding standards.
Review designs and code implemented by other team members; recommend better designs and more efficient code.
Conduct peer design and code reviews for early detection of defects and code-quality issues.
Document ETL processes and data flow diagrams.
Optimize data extraction and transformation processes for better performance.
Perform data quality checks and debug issues.
Conduct root cause analysis for data issues and implement fixes.
Collaborate with more experienced developers on larger projects, and collaborate with stakeholders on requirements.
Participate in requirements, design and implementation discussions.
Participate in learning and development opportunities to enhance technical skills.
Test the storage system after transferring the data.
Exposure to Business Intelligence platforms like OAC, Power BI or Tableau.

Technical skill set:
You must be strong in PL/SQL concepts such as tables, keys, and DDL and DML commands, and proficient in writing and debugging complex SQL queries, views and stored procedures.
Strong hands-on Python / PySpark programming.
As a Data Engineer, you must be strong in data modelling, ETL/ELT concepts, and programming/scripting in Python.
You must be proficient in the following ETL process automation tools: Oracle Data Integrator (ODI), Oracle Data Flow, and Oracle Database / Autonomous Data Warehouse.
Working knowledge of a cloud platform such as Oracle Cloud (preferred), Microsoft Azure or AWS.
You must be able to create technical designs, build prototypes, and build, maintain and optimise high-performing ETL data pipelines.
Good knowledge of Business Intelligence development tools like OAC and Power BI.
Good to have: Microsoft ADF, Data Lakes, Databricks.

Qualifications: Career Level - IC3.

About Us: As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process.
If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
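
To ground the SQL-plus-Python side of the role, here is a minimal, illustrative sketch using the python-oracledb driver with bind variables (which help Oracle reuse execution plans). The credentials, DSN, and table are placeholders, not anything from the posting.

```python
import datetime

import oracledb  # the python-oracledb driver for Oracle Database

# Placeholder credentials/DSN; real values would come from a vault or config.
with oracledb.connect(user="app_user", password="***",
                      dsn="dbhost/service_name") as conn:
    with conn.cursor() as cur:
        # Named bind variable (:cutoff) instead of string concatenation.
        cur.execute(
            "SELECT order_id, amount FROM sales_orders "
            "WHERE order_date >= :cutoff",
            cutoff=datetime.date(2025, 1, 1),
        )
        for order_id, amount in cur:
            print(order_id, amount)
```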

Posted 4 days ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Snowflake Developer: We're looking for a highly skilled Snowflake Developer with strong expertise in Snowflake Data Cloud to join our data engineering team. SnowPro certification is mandatory, and professionals with 4+ years of experience (including 3+ years on Snowflake) will be prioritized. If you're passionate about building scalable data solutions on modern cloud platforms, we'd love to connect.

Roles & Responsibilities:
Build and manage scalable data pipelines on the Snowflake Data Cloud.
Develop ETL/ELT workflows using tools like ADF, AWS Glue, Informatica, or Talend.
Orchestrate data workflows using Airflow, Control-M, or similar tools.
Write advanced SQL and Python scripts (Pandas, PySpark, Snowpark) for data transformation.
Optimize pipeline performance and ensure data quality and reliability.
Collaborate with cross-functional teams to deliver clean, structured data for analytics.
Work with data modeling tools (dbt, Erwin) and integration tools (Fivetran, Stitch) as needed.

Must-Have Skills:
Snowflake cloud platform: strong hands-on experience.
ETL/ELT tools: experience with one or more of Azure Data Factory, AWS Glue, Informatica, Talend, Qlik.
Orchestration: proficiency with tools like Apache Airflow, Control-M, Tidal.
Advanced SQL.
Python, including working with data frames using Pandas, PySpark, or Snowpark.
Data engineering concepts: strong knowledge of data pipelines, data wrangling, and optimization.

Good-to-Have Skills:
SQL scripting and procedural logic.
Data modeling tools (e.g., Erwin, dbt).
Integration tools like Fivetran, Stitch.

Note: Immediate and serving-notice-period candidates are preferred. (ref:hirist.tech)
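
For the Snowpark item in the skills list, a brief, illustrative sketch of a transformation expressed in Snowpark so the work executes inside Snowflake rather than on the client. Connection parameters and table names are placeholders.

```python
from snowflake.snowpark import Session
from snowflake.snowpark.functions import col, sum as sum_

# Placeholder connection parameters; real values belong in secure config.
session = Session.builder.configs({
    "account": "<account>", "user": "<user>", "password": "<password>",
    "warehouse": "<warehouse>", "database": "<database>", "schema": "<schema>",
}).create()

# The filter/aggregate below is pushed down and runs inside Snowflake.
orders = session.table("RAW_ORDERS")
daily = (orders
         .filter(col("STATUS") == "SHIPPED")
         .group_by(col("ORDER_DATE"))
         .agg(sum_(col("AMOUNT")).alias("TOTAL_AMOUNT")))

daily.write.save_as_table("DAILY_SHIPPED_TOTALS", mode="overwrite")
```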

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice. This team will be at the forefront of designing, developing, and deploying scalable data solutions on cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions.

Primary Responsibilities:
Ingest data from multiple on-prem and cloud data sources using various tools and capabilities in Azure.
Design and develop Azure Databricks processes using PySpark/Spark-SQL.
Design and develop orchestration jobs using ADF and Databricks Workflows.
Analyze data engineering processes under development and act as an SME to troubleshoot performance issues and suggest improvements.
Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, etc.
Build a test framework for Databricks notebook jobs for automated testing before code deployment.
Design and build POCs to validate new ideas, tools, and architectures in Azure.
Continuously explore new Azure services and capabilities; assess their applicability to business needs.
Create detailed documentation for cloud processes, architecture, and implementation patterns.
Work with the data & analytics team to build and deploy efficient data engineering processes and jobs on Azure cloud.
Prepare case studies and technical write-ups to showcase successful implementations and lessons learned.
Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture.
Contribute to full lifecycle project implementations, from design and development to deployment and monitoring.
Ensure solutions adhere to security, compliance, and governance standards.
Monitor and optimize data pipelines and cloud resources for cost and performance efficiency.
Identify solutions to non-standard requests and problems.
Support and maintain the self-service BI warehouse.
Mentor and support existing on-prem developers in the cloud environment.
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
Undergraduate degree or equivalent experience.
4+ years of experience in Data & Analytics engineering.
Experience working with Azure, Databricks, ADF, and Data Lake.
Experience working with a data platform or product using PySpark and Spark-SQL.
Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
In-depth understanding of Azure architecture and the ability to come up with efficient designs and solutions.
Highly proficient in Python and SQL.
Proven excellent communication skills.

Preferred Qualifications:
Snowflake, Airflow experience.
Power BI development experience.
Experience with or knowledge of health care concepts: E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission. #NIC
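
The responsibilities above mention building a test framework for Databricks notebook jobs; one common way to do that, sketched here under stated assumptions, is to factor notebook logic into plain functions and exercise them with pytest against a local SparkSession before deployment. The add_ingest_date transform is a hypothetical stand-in for real notebook code.

```python
import pytest
from pyspark.sql import SparkSession, functions as F

def add_ingest_date(df):
    # Stand-in for transformation logic factored out of a notebook.
    return df.withColumn("ingest_date", F.current_date())

@pytest.fixture(scope="session")
def spark():
    # Small local session so tests run in CI without a Databricks cluster.
    return SparkSession.builder.master("local[2]").appName("unit-tests").getOrCreate()

def test_adds_ingest_date_column(spark):
    df = spark.createDataFrame([(1, "a"), (2, "b")], ["id", "val"])
    out = add_ingest_date(df)
    assert "ingest_date" in out.columns
    assert out.count() == 2  # the transformation must not drop rows
```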

Posted 4 days ago

Apply

12.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice under the RMI - Optum Advisory umbrella. This team will be at the forefront of designing, developing, and deploying scalable data solutions on cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions.

Primary Responsibilities:
Design and implement secure, scalable, and cost-effective cloud data architectures using cloud services such as Azure Data Factory (ADF), Azure Databricks, Azure Storage, Key Vault, Snowflake, Synapse Analytics, MS Fabric/Power BI, etc.
Define and lead data & cloud strategy, including migration plans, modernization of legacy systems, and adoption of new cloud capabilities.
Collaborate with clients to understand business requirements and translate them into optimal cloud architecture solutions, balancing performance, security, and cost.
Evaluate and compare cloud services (e.g., Databricks, Snowflake, Synapse Analytics) and recommend best-fit solutions based on project needs and organizational goals.
Lead the full lifecycle of data platform and product implementations, from planning and design to deployment and support.
Drive cloud migration initiatives, ensuring a smooth transition from on-premise systems while engaging and upskilling existing teams.
Lead and mentor a team of cloud and data engineers, fostering a culture of continuous learning and technical excellence.
Plan and guide the team in building proofs of concept (POCs), exploring new cloud capabilities, and validating emerging technologies.
Establish and maintain comprehensive documentation for cloud setup processes, architecture decisions, and operational procedures.
Work closely with internal and external stakeholders to gather requirements, present solutions, and ensure alignment with business objectives.
Ensure all cloud solutions adhere to security best practices, compliance standards, and governance policies.
Prepare case studies and share learnings from implementations to build organizational knowledge and improve future projects.
Build and analyze data engineering processes and act as an SME to troubleshoot performance issues and suggest improvements.
Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, Maven, etc.
Build a test framework for Databricks notebook jobs for automated testing before code deployment.
Continuously explore new Azure services and capabilities; assess their applicability to business needs.
Create detailed documentation for cloud processes, architecture, and implementation patterns.
Contribute to full lifecycle project implementations, from design and development to deployment and monitoring. Identify solutions to non-standard requests and problems. Mentor and support existing on-prem developers in the cloud environment. Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
Undergraduate degree or equivalent experience.
12+ years of overall experience in Data & Analytics engineering.
10+ years of solid experience working as an architect designing data platforms using Azure, Databricks, Snowflake, ADF, Data Lake, Synapse Analytics, Power BI, etc.
10+ years of experience working with a data platform or product using PySpark and Spark-SQL.
In-depth experience designing complex Azure architectures for various business needs, and the ability to come up with efficient designs and solutions.
Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
Experience in team leadership and people management.
Highly proficient, hands-on experience with Azure services, Databricks/Snowflake development, etc.
Excellent communication and stakeholder management skills.

Preferred Qualifications:
Snowflake, Airflow experience.
Power BI development experience.
Experience with or knowledge of health care concepts: E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission. #NIC

Posted 4 days ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

As part of our strategic initiative to build a centralized capability around data and cloud engineering, we are establishing a dedicated Azure Cloud Data Engineering practice. This team will be at the forefront of designing, developing, and deploying scalable data solutions on cloud, primarily using the Microsoft Azure platform. The practice will serve as a centralized team, driving innovation, standardization, and best practices across cloud-based data initiatives. New hires will play a pivotal role in shaping the future of our data landscape, collaborating with cross-functional teams, clients, and stakeholders to deliver impactful, end-to-end solutions.

Primary Responsibilities:
Ingest data from multiple on-prem and cloud data sources using various tools and capabilities in Azure.
Design and develop Azure Databricks processes using PySpark/Spark-SQL.
Design and develop orchestration jobs using ADF and Databricks Workflows.
Analyze data engineering processes under development and act as an SME to troubleshoot performance issues and suggest improvements.
Develop and maintain CI/CD processes using Jenkins, GitHub, GitHub Actions, etc.
Build a test framework for Databricks notebook jobs for automated testing before code deployment.
Design and build POCs to validate new ideas, tools, and architectures in Azure.
Continuously explore new Azure services and capabilities; assess their applicability to business needs.
Create detailed documentation for cloud processes, architecture, and implementation patterns.
Work with the data & analytics team to build and deploy efficient data engineering processes and jobs on Azure cloud.
Prepare case studies and technical write-ups to showcase successful implementations and lessons learned.
Work closely with clients, business stakeholders, and internal teams to gather requirements and translate them into technical solutions using best practices and appropriate architecture.
Contribute to full lifecycle project implementations, from design and development to deployment and monitoring.
Ensure solutions adhere to security, compliance, and governance standards.
Monitor and optimize data pipelines and cloud resources for cost and performance efficiency.
Identify solutions to non-standard requests and problems.
Support and maintain the self-service BI warehouse.
Mentor and support existing on-prem developers in the cloud environment.
Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment).
The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
Undergraduate degree or equivalent experience.
4+ years of overall experience in Data & Analytics engineering.
4+ years of experience working with Azure, Databricks, ADF, and Data Lake.
Solid experience working with data platforms and products using PySpark and Spark-SQL.
Solid experience with CI/CD tools such as Jenkins, GitHub, GitHub Actions, Maven, etc.
In-depth understanding of Azure architecture and the ability to come up with efficient designs and solutions.
Highly proficient in Python and SQL.
Proven excellent communication skills.

Preferred Qualifications:
Snowflake, Airflow experience.
Power BI development experience.
Experience with or knowledge of health care concepts: E&I, M&R, C&S LOBs, Claims, Members, Provider, Payers, Underwriting.

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes, an enterprise priority reflected in our mission. #NIC

Posted 4 days ago

Apply

7.0 - 11.0 years

0 Lacs

Karnataka

On-site

You are an exceptional, innovative, and passionate individual looking to grow with NTT DATA, a trusted global innovator of business and technology services. If you aspire to be part of an inclusive, adaptable, and forward-thinking organization, this opportunity is for you. Currently, we are seeking a Sr. ETL Developer to join our team in Bangalore, Karnataka, India.

The ideal candidate should have strong hands-on experience in SQL and PL/SQL (procedures, functions), expert-level knowledge of ETL flows and jobs (ADF pipeline experience preferred), and experience in MS SQL Server (preferred), Oracle DB, PostgreSQL, and MySQL. A good understanding of data warehouses/data marts, data structures/models, integrity constraints, and performance tuning is also required. Knowledge of the insurance domain is preferred for this role. The desired total experience for this position is 7-10 years.

NTT DATA, with a revenue of $30 billion, is committed to helping clients innovate, optimize, and transform for long-term success. Serving 75% of the Fortune Global 100, NTT DATA is a Global Top Employer with diverse experts in more than 50 countries. The company offers business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation, and management of applications, infrastructure, and connectivity. NTT DATA is at the forefront of providing digital and AI infrastructure globally, and is part of the NTT Group, which invests over $3.6 billion annually in R&D to support organizations and societies in confidently moving into the digital future. Visit us at us.nttdata.com.

Posted 4 days ago

Apply

0 years

0 Lacs

India

Remote

Job Summary (List Format) – Information Technology - SQL Data Tester (Offshore, India)

Position: SQL Data Tester (Remote, India)
Contract: 6 months (possible extension)
Work Hours: Flexible, with 3-5 hours of overlap with CST

Key Responsibilities:
Design and develop automated SQL test cases to validate data transformations and data quality.
Perform end-to-end ETL testing, primarily using SSIS and other Microsoft BI tools.
Validate data integrity, accuracy, and performance across ETL pipelines and data platforms.
Collaborate with developers, data engineers, QA analysts, and DevOps teams to ensure comprehensive test coverage.
Integrate SQL and ETL testing into CI/CD pipelines for continuous delivery/deployment.
Identify, document, and track defects and inconsistencies in data.
Contribute to the development of automated test strategies and reusable SQL test frameworks.

Required Skills & Qualifications:
Strong experience in automated SQL testing, focusing on data validation and transformation.
Proficiency in ETL testing, especially with SSIS.
Experience working in CI/CD environments and automating test pipelines.
Good understanding of relational databases and data warehousing.
Excellent analytical and problem-solving skills.

Preferred/Nice to Have:
Experience with tSQLt for SQL Server unit testing.
Familiarity with Azure Data Factory (ADF) and cloud-based data integration.
Knowledge of SQL linting tools and best practices for code quality.
Exposure to Agile/Scrum methodologies.
Experience in a manufacturing environment.

Other Notes:
Candidates should be prepared for communication via WhatsApp, Viber, or Google Meet for interviews and coordination.
Priority is given to candidates with tSQLt, ADF, and SQL linting experience.
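
To illustrate what an automated SQL test case for transformation validation can look like in practice, here is a small, hedged pytest/pyodbc sketch; the connection string, schemas, and tables are assumptions, not details from the posting. In a tSQLt setup the same assertions would live in T-SQL test procedures instead.

```python
import pyodbc
import pytest

@pytest.fixture(scope="module")
def conn():
    # Placeholder connection string; real settings come from CI secrets/config.
    return pyodbc.connect(
        "DRIVER={ODBC Driver 18 for SQL Server};"
        "SERVER=<server>;DATABASE=<database>;Trusted_Connection=yes;"
    )

def scalar(conn, sql: str):
    # Helper: run a query that returns a single value.
    return conn.cursor().execute(sql).fetchone()[0]

def test_transformation_preserves_row_count(conn):
    staged = scalar(conn, "SELECT COUNT(*) FROM stg.orders")
    loaded = scalar(conn, "SELECT COUNT(*) FROM dw.fact_orders")
    assert staged == loaded

def test_no_null_business_keys(conn):
    nulls = scalar(conn, "SELECT COUNT(*) FROM dw.fact_orders WHERE order_key IS NULL")
    assert nulls == 0
```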

Posted 4 days ago

Apply

12.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Description: Excellent written and verbal communication skills. Must have 12-16 years of experience. We are looking only for candidates from an application development background.
Excellent project management skills, with experience leading large projects/programs with distributed teams of 20+.
Experience delivering projects using Waterfall, Agile (Scrum & Kanban) and Hybrid approaches.
Experience leading/managing development projects related to application development, infrastructure, migrations, etc.
Strong skills in project planning, monitoring and execution: project plans, communication plans, status reporting, risk and issue management, scope management, quality and productivity metrics, etc.
Good stakeholder engagement skills.
Excellent people development skills.
Experience leading/managing projects using the Microsoft technology stack or Data & AI (ETL, ADF, Power BI).
Experience working with US and UK customers, both offshore and onsite/onshore.
Excellent documentation skills: requirements specifications, backlogs, etc.
Experience in presales: SOWs, proposals, RFPs, RFQs, etc.
Experience executing Big Data projects.
Certifications in Scrum, project management, etc.
Hands-on experience with developing web technologies and/or data projects.

Roles & responsibilities:
Build and develop high-performing, motivated project teams by providing purpose and direction; lead by example.
Lead projects from requirements definition through deployment, identifying schedules, scope, budget estimations, and project implementation plans, including risk mitigation.
Collaborate efficiently with various stakeholders and define a communication plan to ensure proper communication and escalation mechanisms.
Monitor and execute projects to ensure they remain within scope, schedule, and defined budgets.
Establish and maintain relationships with appropriate client stakeholders, providing day-to-day contact on project status and changes.
Establish and maintain processes to manage scope over the project lifecycle, meeting project quality and performance standards.
Adhere to the processes and policies defined for project execution by the customer.

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About The Role: Are you looking for an exciting opportunity in solution architecture? Are you passionate about everything Azure Cloud? Then join us as a Senior Cloud Architect.

Your main responsibilities:
Design and deliver Azure solution architecture for application workloads.
Design disaster recovery and backup plans based on RTO, RPO and other non-functional requirements to deliver resilient solutions on the public cloud.
Assist engineering teams in delivering infrastructure architecture and designing cloud solutions.
Build and maintain relationships with application teams, understanding their context and assisting them in achieving their cloud transformation roadmap.
Engage with subject matter experts in Security, Enterprise Architecture and Governance teams to contribute to and develop the cloud technology roadmap and adhere to best practices.

About You: The following proven technical skills will be required:
Expertise in designing application workloads using cloud platform services (SaaS and PaaS).
Expertise in technology selection based on architecture decision records and other standards within the organization.
Expertise in Azure AI Services (Foundry, ML, OpenAI, Anomaly Detection, Bot Services, LUIS), AKS (Azure Kubernetes Service), App Services, Databricks, ADF (Azure Data Factory), ASB (Azure Service Bus), Event Hubs, Key Vault, Storage Accounts, Container Registry, Azure Functions, Redis, Logic Apps, Azure Firewall, VNET (Virtual Network), Private Endpoints, Service Endpoints, SQL Server, Cosmos DB, MongoDB.
Experience in designing IaC using Terraform on Azure DevOps.
Expertise in designing Azure disaster recovery and backup scenarios that meet NFRs.
Azure Well-Architected Framework or cloud design patterns.
Experience in one or more programming languages: .NET/C#, Java, Python or Ruby.
Experience in Azure landing zone design, platform automation design, DevSecOps tooling, network topology and connectivity, access management and privileged identity design, platform monitoring, security architecture, and high-availability architecture design.
Experience in DevOps methodology (preferably DevSecOps), both technically and organisationally, including continuous deployment, delivery pipelines and test environments.
Good stakeholder communication; the ability to work with Product Owners, Product Managers, Architects and engineering teams.
At ease working in a transformational and complex environment at a fast pace and getting things done.
Proficiency in English is required.

About Swiss Re: Swiss Re is one of the world's leading providers of reinsurance, insurance and other forms of insurance-based risk transfer, working to make the world more resilient. We anticipate and manage a wide variety of risks, from natural catastrophes and climate change to cybercrime. We cover both Property & Casualty and Life & Health. Combining experience with creative thinking and cutting-edge expertise, we create new opportunities and solutions for our clients. This is possible thanks to the collaboration of more than 14,000 employees across the world. Our success depends on our ability to build an inclusive culture encouraging fresh perspectives and innovative thinking. We embrace a workplace where everyone has equal opportunities to thrive and develop professionally regardless of their age, gender, race, ethnicity, gender identity and/or expression, sexual orientation, physical or mental ability, skillset, thought or other characteristics.
In our inclusive and flexible environment everyone can bring their authentic selves to work and their passion for sustainability. If you are an experienced professional returning to the workforce after a career break, we encourage you to apply for open positions that match your skills and experience. Keywords Reference Code: 133521

Posted 4 days ago

Apply

0 years

0 Lacs

India

On-site

Working hours: 02:00 PM to 11:00 PM IST
Skill priority: SQL testing, ETL, SSIS, and CI/CD experience. Nice to have: tSQLt, ADF, and SQL linting experience.
Position Overview: We are seeking a detail-oriented and highly skilled SQL Tester to join our Data & Analytics team. In this role, you will design and develop automated SQL test cases to validate data transformations and data quality, and validate data integrity, accuracy, and performance across our ETL pipelines and data platforms. You will work closely with data engineers, QA analysts, and DevOps teams to ensure high-quality data delivery in a CI/CD environment.
Key Responsibilities
• Design and develop automated SQL test cases to validate data transformations and data quality (see the sketch after this listing).
• Perform end-to-end testing of ETL processes, primarily using SSIS and other Microsoft BI tools.
• Collaborate with developers and analysts to understand data requirements and ensure test coverage.
• Integrate testing into CI/CD pipelines to support continuous delivery and deployment.
• Identify, document, and track defects and inconsistencies in data.
• Contribute to test automation strategies and reusable SQL test frameworks.
Required Skills & Qualifications
• Strong experience in automated SQL testing with a focus on data validation and transformation logic.
• Proficiency in ETL testing, particularly with SSIS.
• Experience working in CI/CD environments and integrating tests into automated pipelines.
• Solid understanding of relational databases and data warehousing concepts.
• Excellent analytical and problem-solving skills.
Nice to Have
• Experience with tSQLt for unit testing SQL Server code.
• Familiarity with Azure Data Factory (ADF) and cloud-based data integration.
• Knowledge of SQL linting tools and best practices for SQL code quality.
• Exposure to Agile/Scrum methodologies.
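To make the testing responsibility above concrete, here is a minimal, hypothetical sketch of an automated SQL data-validation test of the kind this role produces. It assumes pytest and pyodbc, a reachable SQL Server instance, and illustrative table and column names that are not from the posting.

```python
# Minimal sketch of automated SQL data-validation tests (all names are
# placeholders). Assumes pytest and pyodbc are installed and the connection
# string points at a reachable SQL Server warehouse.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=warehouse;DATABASE=dw;Trusted_Connection=yes"
)

def run_scalar(query: str):
    """Execute a query and return the single scalar value it produces."""
    conn = pyodbc.connect(CONN_STR)
    try:
        return conn.cursor().execute(query).fetchone()[0]
    finally:
        conn.close()

def test_row_counts_match_after_load():
    # The transformed target should contain exactly the rows staged by the ETL run.
    source = run_scalar("SELECT COUNT(*) FROM stg.orders")
    target = run_scalar("SELECT COUNT(*) FROM dw.fact_orders")
    assert source == target, f"Row count drift: staging={source}, warehouse={target}"

def test_no_null_business_keys():
    # Transformation logic must never emit a fact row without its business key.
    nulls = run_scalar("SELECT COUNT(*) FROM dw.fact_orders WHERE order_id IS NULL")
    assert nulls == 0
```

A suite like this can run as a step in a CI/CD pipeline so that every deployment gates on data quality, which is the pipeline integration the posting asks for.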

Posted 4 days ago

Apply

0 years

0 Lacs

India

Remote

Job Role: Information Technology - SQL Data Tester
Location: Remote
Job Type: 6+ months contract
** Only immediate joiners **
Job Summary: We are seeking a detail-oriented and highly skilled SQL Tester to join our Data & Analytics team. In this role, you will design and develop automated SQL test cases to validate data transformations and data quality, and validate data integrity, accuracy, and performance across our ETL pipelines and data platforms. You will work closely with data engineers, QA analysts, and DevOps teams to ensure high-quality data delivery in a CI/CD environment.
Key Responsibilities
• Design and develop automated SQL test cases to validate data transformations and data quality.
• Perform end-to-end testing of ETL processes, primarily using SSIS and other Microsoft BI tools.
• Collaborate with developers and analysts to understand data requirements and ensure test coverage.
• Integrate testing into CI/CD pipelines to support continuous delivery and deployment.
• Identify, document, and track defects and inconsistencies in data.
• Contribute to test automation strategies and reusable SQL test frameworks.
Required Skills & Qualifications
• Strong experience in automated SQL testing with a focus on data validation and transformation logic.
• Proficiency in ETL testing, particularly with SSIS.
• Experience working in CI/CD environments and integrating tests into automated pipelines.
• Solid understanding of relational databases and data warehousing concepts.
• Excellent analytical and problem-solving skills.
Nice to Have
• Experience with tSQLt for unit testing SQL Server code.
• Familiarity with Azure Data Factory (ADF) and cloud-based data integration.
• Knowledge of SQL linting tools and best practices for SQL code quality.
• Exposure to Agile/Scrum methodologies.

Posted 4 days ago

Apply

9.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

My client is an IT services consulting company looking to hire an Oracle OIC lead with strong team management skills for a customer-facing role.
Mode: Hybrid
Location: Chennai / Bengaluru
Experience: 9+ years, with 6 years in Oracle
Shift timings: 2:30 PM to 12:00 AM; cab facility provided
Key Responsibilities:
• Lead end-to-end OIC implementation projects, including integration design, development, and deployment.
• Work closely with business and functional teams to gather integration requirements.
• Design scalable and reusable integration patterns using OIC (app-driven, scheduled, file-based, REST/SOAP services; see the sketch after this listing).
• Develop integrations using OIC components: Integration, Process, Visual Builder, and Insight.
• Manage API creation and lifecycle, error handling, logging, and monitoring.
• Collaborate with Oracle SaaS and PaaS teams to ensure best practices in security and performance.
• Troubleshoot and resolve technical issues related to integrations.
Required Skills:
• 6+ years of hands-on experience with Oracle Integration Cloud (OIC)
• Experience integrating with third-party applications and Oracle ERP Cloud modules (Financials, HCM, SCM), as well as WMS and OTM
• Strong command of Oracle PaaS components including OIC, VBCS, and DBCS/ATP
• Proficiency in ERP Cloud integration tools and methods such as FBDI, HDL, ADFDI, BIP, OTBI
• Solid experience with web services, both SOAP and REST
• Development experience in Java, JavaScript, VBCS, and SQL (on DBCS)
• Experience in Java/J2EE, Oracle ADF, PL/SQL, BIP, OTBI, or SOA is an added advantage
• Strong client-facing skills and experience managing key stakeholders
Interested candidates, please mail your resume to tvarithaa.sudesh@antal.com
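For context on the REST-triggered integration style mentioned above, here is a heavily hedged sketch of invoking an OIC integration from Python. The hostname, integration identifier, resource path, and payload are hypothetical placeholders; real endpoint shapes and auth (typically OAuth 2.0 against the tenancy's identity domain) come from the specific OIC instance configuration.

```python
# Hypothetical sketch: calling a REST-triggered OIC integration endpoint.
# Host, path, token, and payload below are placeholders, not real values.
import requests

OIC_HOST = "https://myinstance.integration.ocp.oraclecloud.com"  # placeholder host
# Assumed endpoint shape for a REST-triggered integration; verify against
# the integration's metadata in the actual OIC instance.
ENDPOINT = f"{OIC_HOST}/ic/api/integration/v1/flows/rest/CREATE_INVOICE/1.0/invoices"

def submit_invoice(token: str, invoice: dict) -> dict:
    """POST a payload to the integration; surface the error body on failure."""
    resp = requests.post(
        ENDPOINT,
        json=invoice,
        headers={"Authorization": f"Bearer {token}"},
        timeout=30,
    )
    # A real integration would log and route failures per the OIC
    # error-handling design the role is responsible for.
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    print(submit_invoice("<oauth-token>", {"invoiceNumber": "INV-001", "amount": 1200.50}))
```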

Posted 4 days ago

Apply

15.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Excellent written and verbal communication skills. Looking for candidates in the 15-20 years experience range. Must-haves:
• Excellent project management skills, with experience leading large projects/programs with distributed teams of 20+
• Experience delivering projects using Waterfall, Agile (Scrum & Kanban) and Hybrid methodologies
• Experience in leading/managing development projects related to application development, infrastructure, migrations, etc.
• Strong skills in project planning, monitoring and execution: project plan, communication plan, status reporting, risk and issue management, scope management, quality and productivity metrics, etc.
• Good stakeholder engagement skills
• Excellent people development skills
• Experience in leading/managing projects using the Microsoft technology stack or Data & AI (ETL, ADF, Power BI)
• Experience working with US and UK customers, both offshore and onsite/onshore
• Excellent documentation skills: requirements specifications, backlog, etc.
• Experience in presales: SOWs, proposals, RFPs, RFQs, etc.
• Experience executing Big Data projects
• Certifications in Scrum, project management, etc.
• Hands-on experience developing web technologies and/or data projects
Location of assignment can be either Bengaluru or Hyderabad.
Roles & Responsibilities:
• Build and develop high-performing, motivated project teams by providing purpose and direction; lead by example
• Lead projects from requirements definition through deployment, identifying schedules, scope, budget estimates, and project implementation plans, including risk mitigation
• Collaborate efficiently with stakeholders and define a communication plan that ensures proper communication and escalation mechanisms
• Monitor and execute the project to ensure it remains within scope, schedule, and the defined budget
• Establish and maintain relationships with appropriate client stakeholders, providing day-to-day contact on project status and changes
• Establish and maintain processes to manage scope over the project lifecycle, meeting project quality and performance standards
• Adhere to the processes and policies defined for project execution by the customer.

Posted 4 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote

Entity: Technology
Job Family Group: IT&S Group
Job Description: A results-oriented Senior Architect with a proven track record of delivering end-to-end cloud solutions across the infrastructure, data, DevOps, and AI domains. Skilled in architecting, implementing, and governing secure, scalable, and high-performing Azure architectures that align with both technical requirements and business objectives. Brings deep expertise in Azure IaaS and PaaS services, DevOps automation using Azure DevOps, and AI integration through Azure OpenAI and Copilot Studio, enabling intelligent, modern, and future-ready enterprise solutions. Expertise spans Azure infrastructure management, CI/CD automation, Infrastructure as Code (ARM), Azure Data Factory (ADF) pipelines, and enterprise AI adoption. Demonstrated ability to build and support scalable, secure, and cost-optimized Azure environments aligned with governance and compliance standards. Strong background in SQL Server administration: deployment, upgrades (in-place and side-by-side), performance tuning, backup/restore strategies, high availability, and security hardening, both on Azure VMs and on PaaS SQL offerings. Experienced in migrating databases across environments using native tools, scripting, and automation workflows. Combines deep cloud expertise with solid development and scripting skills (PowerShell) to enable automation, integration, and operational excellence. Adept at collaborating with cross-functional teams, mentoring junior engineers, and aligning technical solutions with evolving business goals.
Key Accountabilities
• Design and manage scalable, secure, and highly available Azure infrastructure environments.
• Implement and maintain Azure IaaS resources such as Virtual Machines, NSGs, Load Balancers, VNETs, VPN Gateways, and ExpressRoute.
• Perform cost optimization, monitoring, backup/recovery, patching, and capacity planning.
• Implement governance using Azure Policies, RBAC, and Management Groups.
• Design and configure Azure PaaS services such as Azure App Services, Azure SQL, Azure Web Apps, Azure Functions, Storage Accounts, Key Vault, and Logic Apps, and ensure high-availability and DR strategies for PaaS components.
• Design multi-tier, cloud-native application architectures on Azure; troubleshoot PaaS performance and availability issues.
• Integrate Azure OpenAI capabilities into applications and business workflows; develop use cases such as chatbot assistants, intelligent search, summarization, document Q&A, etc. (see the sketch after this list).
• Leverage Copilot Studio to build and deploy enterprise AI copilots integrated with data sources; ensure responsible AI and compliance alignment.
• Design and manage data pipelines and orchestrations in ADF for ETL/ELT processes; integrate with Azure Data Lake, Azure SQL, Blob Storage, and on-prem data sources.
• Build and manage CI/CD pipelines using Azure DevOps; automate infrastructure deployment using ARM templates.
• Configure and manage release gates, approvals, secrets, and environments; implement Infrastructure as Code (IaC) and GitOps best practices.
• Implement identity management using Azure AD, MFA, and Conditional Access; manage secrets using Azure Key Vault and secure access via Managed Identities.
• Develop reusable, parameterized ARM template modules for consistent deployments; maintain template versioning using Git repositories.
• Use templates in Azure DevOps pipelines and automate deployment validations; align templates with security and compliance baselines (e.g., Azure Landing Zones).
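As an illustration of the Azure OpenAI accountability above, here is a minimal sketch using the official openai Python package's AzureOpenAI client for a summarization-style call. The endpoint, deployment name, and environment-variable wiring are assumptions for the sketch; production code would source secrets via Key Vault or Managed Identity rather than a raw key, as the listing itself prescribes.

```python
# Minimal sketch: calling an Azure OpenAI chat deployment for summarization.
# Endpoint and deployment name are placeholders; secrets should come from
# Key Vault / Managed Identity in production, not plain environment variables.
import os
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],  # e.g. https://<resource>.openai.azure.com
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

def summarize(document: str) -> str:
    """Ask the deployed model for a short summary of a document."""
    response = client.chat.completions.create(
        model="gpt-4o-summarizer",  # the *deployment* name; a placeholder here
        messages=[
            {"role": "system", "content": "Summarize the user's document in three bullet points."},
            {"role": "user", "content": document},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    print(summarize("Quarterly uptime was 99.95% across all regions..."))
```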
• Collaborate with architects, developers, data engineers, and security teams to design solutions.
• Lead technical discussions and present solutions to stakeholders.
• Mentor junior engineers and conduct code reviews.
• Stay updated on the Azure roadmap and guide service adoption.
Essential Education: Bachelor's (or higher) degree from a recognized institute of higher learning, ideally in Computer Science, MIS/IT, or another STEM-related subject.
Essential Experience And Job Requirements
Technical capability:
Primary skills: Azure IaaS, PaaS & core services; Azure OpenAI / Copilot Studio; SQL Server & Azure Data Factory (ADF)
Secondary skills: security & governance; monitoring & observability; DevOps & CI/CD
Business capability: service delivery & management; domain expertise in Legal and Ethics & Compliance
Leadership and EQ: For those in team leadership positions (whether activity or line management): always getting the basics right, from quality development conversations to recognition and ongoing performance feedback. Has the ability to develop, coach, mentor and inspire others. Ensures team compliance with BP's Code of Conduct and demonstrates strong leadership of BP's Leadership Expectations and Values & Behaviours. Creates an environment where people listen and speak openly about the good, the bad, and the ugly, so that everyone can understand and learn.
All role holders: Embraces a culture of change and agility, evolving continuously and adapting to our changing world. An effective team player who looks beyond their own area/organisational boundaries to consider the bigger picture and/or the perspective of others. Is self-aware and actively seeks input from others on impact and effectiveness. Applies judgment and common sense: able to use insight and good judgement to enable commercially sound, efficient and pragmatic decisions and solutions and to respond to situations as they arise. Ensures personal compliance with BP's Code of Conduct and demonstrates strong leadership of BP's Leadership Expectations and Values & Behaviours. Cultural fluency: actively seeks to understand cultural differences and sensitivities.
Travel Requirement: No travel is expected with this role.
Relocation Assistance: This role is not eligible for relocation.
Remote Type: This position is a hybrid of office/remote working.
Skills: Agility core practices, Analytics, API and platform design, Business Analysis, Cloud Platforms, Coaching, Communication, Configuration management and release, Continuous deployment and release, Data Structures and Algorithms (Inactive), Digital Project Management, Documentation and knowledge sharing, Facilitation, Information Security, iOS and Android development, Mentoring, Metrics definition and instrumentation, NoSql data modelling, Relational Data Modelling, Risk Management, Scripting, Service operations and resiliency, Software Design and Development, Source control and code management {+ 4 more}
Legal Disclaimer: We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status.
Individuals with an accessibility need may request an adjustment/accommodation related to bp’s recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.

Posted 4 days ago

Apply

7.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Analytics – JD (Azure DE) EXL (NASDAQ:EXLS) is a leading operations management and analytics company that helps businesses enhance growth and profitability in the face of relentless competition and continuous disruption. Using our proprietary, award-winning Business EXLerator Framework™, which integrates analytics, automation, benchmarking, BPO, consulting, industry best practices and technology platforms, we look deeper to help companies improve global operations, enhance data-driven insights, increase customer satisfaction, and manage risk and compliance. EXL serves the insurance, healthcare, banking and financial services, utilities, travel, transportation and logistics industries. Headquartered in New York, New York, EXL has more than 61,000 professionals in locations throughout the United States, Europe, Asia (primarily India and the Philippines), Latin America, Australia and South Africa. EXL Analytics provides data-driven, action-oriented solutions to business problems through statistical data mining, cutting-edge analytics techniques and a consultative approach. Leveraging proprietary methodology and best-of-breed technology, EXL Analytics takes an industry-specific approach to transform our clients' decision making and embed analytics more deeply into their business processes. Our global footprint of nearly 12,000 data scientists and analysts assists client organizations with complex risk minimization methods, advanced marketing, pricing and CRM strategies, internal cost analysis, and cost and resource optimization within the organization. EXL Analytics serves the insurance, healthcare, banking, capital markets, utilities, retail and e-commerce, travel, transportation and logistics industries. Please visit www.exlservice.com for more information about EXL Analytics.
Job Title: Consultant / Senior Consultant – Azure Data Engineering
Location: India – Gurgaon preferred
Industry: Insurance Analytics & AI Vertical
Role Overview: We are seeking a hands-on Consultant / Senior Consultant with strong expertise in Azure-based data engineering to support end-to-end development and delivery of data pipelines for our insurance clients. The ideal candidate will have a deep understanding of Azure Data Factory, ADLS, Databricks (preferably with DLT and Unity Catalog), SQL, and Python, and be comfortable working in a dynamic, client-facing environment. This is a key offshore role requiring both technical execution and solution-oriented thinking to support modern data platform initiatives. You will collaborate with data scientists, analysts, and stakeholders to gather requirements and define data models that effectively support business requirements; demonstrate decision-making, analytical and problem-solving abilities; bring strong verbal and written communication skills to manage client discussions; and be familiar with Agile practices such as daily scrum, sprint planning, and backlog refinement.
Key Responsibilities & Skillsets:
• Design and develop scalable and efficient data pipelines using Azure Data Factory (ADF) and Azure Data Lake Storage (ADLS).
• Build and maintain Databricks notebooks for data ingestion, transformation, and quality checks, using Python and SQL (see the sketch after this listing).
• Work with Delta Live Tables (DLT) and Unity Catalog (preferred) to improve pipeline automation, governance, and performance.
• Collaborate with data architects, analysts, and onshore teams to translate business requirements into technical specifications.
• Troubleshoot data issues, ensure data accuracy, and apply best practices in data engineering and DevOps.
• Support the migration of legacy SQL pipelines to modern Python-based frameworks.
• Ensure adherence to data security, compliance, and performance standards, especially within insurance domain constraints.
• Provide documentation, status updates, and technical insights to stakeholders as required.
• Excellent communication skills and stakeholder management.
Required Skills & Experience:
• 3–7 years of strong hands-on experience in data engineering with a focus on Azure cloud technologies.
• Proficient in Azure Data Factory, Databricks, and ADLS Gen2, with working knowledge of Unity Catalog.
• Strong programming skills in both SQL and Python, especially within Databricks notebooks. PySpark expertise is good to have.
• Experience in Delta Lake / Delta Live Tables (DLT) is a plus.
• Good understanding of ETL/ELT concepts, data modeling, and performance tuning.
• Exposure to insurance or financial services data projects is highly preferred.
• Strong communication and collaboration skills in an offshore delivery model.
Preferred Skills & Experience:
• Experience working in Agile/Scrum teams.
• Familiarity with Azure DevOps, Git, and CI/CD practices.
• Certifications in Azure Data Engineering (e.g., DP-203) or Databricks.
What we offer: EXL Analytics offers an exciting, fast-paced and innovative environment, which brings together a group of sharp and entrepreneurial professionals who are eager to influence business decisions. From your very first day, you get an opportunity to work closely with highly experienced, world-class analytics consultants. You can expect to learn many aspects of the businesses that our clients engage in. You will also learn effective teamwork and time-management skills, key aspects of personal and professional growth. Analytics requires different skill sets at different levels within the organization. At EXL Analytics, we invest heavily in training you in all aspects of analytics as well as in leading analytical tools and techniques. We provide guidance and coaching to every employee through our mentoring program, wherein every junior-level employee is assigned a senior-level professional as an advisor. The sky is the limit for our team members. The unique experiences gathered at EXL Analytics set the stage for further growth and development in our company and beyond. "EOE/Minorities/Females/Vets/Disabilities"
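To ground the Databricks/DLT responsibilities above, here is a minimal sketch of a Delta Live Tables pipeline definition with data-quality expectations. The table names, landing path, and quality rules are illustrative, and the code assumes it runs inside a Databricks DLT pipeline, where the dlt module and the spark session are provided by the runtime.

```python
# Minimal Delta Live Tables sketch. Runs only inside a Databricks DLT
# pipeline, where `dlt` and `spark` are provided. All names are placeholders.
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw policy records loaded from the landing zone (placeholder path).")
def policies_raw():
    # `spark` is injected by the DLT runtime; a batch read keeps the sketch simple.
    return spark.read.json("/mnt/landing/policies/")

@dlt.table(comment="Cleaned policies with basic quality gates applied.")
@dlt.expect_or_drop("valid_policy_id", "policy_id IS NOT NULL")   # drop rows missing the key
@dlt.expect_or_drop("positive_premium", "premium > 0")            # drop rows with bad amounts
def policies_clean():
    return dlt.read("policies_raw").withColumn("ingested_at", F.current_timestamp())
```

Expectations like these surface pass/drop counts in the DLT UI, which is one way the "quality checks" responsibility in the posting is typically met.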

Posted 5 days ago

Apply

8.0 years

0 - 0 Lacs

Thiruvananthapuram, Kerala

Remote

Candidates with 8+ years of experience in the IT industry and strong .NET/.NET Core/Azure Cloud Services/Azure DevOps skills. This is a client-facing role, so strong communication skills are required. This is for a US client, and the resource should be hands-on, with experience in coding and Azure Cloud.
Primary Skills:
• 8+ years of hands-on development experience with C#, .NET Core 6/8+, Entity Framework / EF Core, JavaScript, jQuery, and REST APIs
• Expertise in MS SQL Server, including complex SQL queries, stored procedures, views, functions, packages, cursors, tables, and object types
• Skilled in unit testing with xUnit and MSTest
• Strong in software design patterns, system architecture, and scalable solution design
• Ability to lead and inspire teams through clear communication, technical mentorship, and ownership
• 2+ years of hands-on experience with Azure Cloud Services, including Azure Functions (see the sketch after this listing), Azure Durable Functions, Azure Service Bus, Event Grid, Storage Queues, Blob Storage, Azure Key Vault, SQL Azure, Application Insights, and Azure Monitoring
• Familiarity with AngularJS, ReactJS, and other front-end frameworks
• Experience with Azure API Management (APIM)
• Knowledge of Azure containerization and orchestration (e.g., AKS/Kubernetes)
• Experience with Azure Data Factory (ADF) and Logic Apps
• Exposure to application support and operational monitoring
• Azure DevOps: CI/CD pipelines (Classic / YAML)
Immediate joiners only (max 15 days notice period).
Job Types: Full-time, Permanent
Pay: ₹12,801.85 - ₹58,383.10 per month
Benefits: Paid sick time, paid time off, Provident Fund, work from home
Schedule: Day shift, fixed shift, Monday to Friday, UK shift
Application Question(s): How many years of experience do you have? What is your current CTC? What is your expected CTC? What is your notice period? Do you have experience with Azure Functions?
Location: Trivandrum, Kerala (Required)
Work Location: In person
Speak with the employer: +91 9932724170
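The posting is .NET-centric; to keep this page's examples in a single language, here is the equivalent pattern sketched with the Azure Functions Python v2 programming model. The trigger and binding concepts map directly onto the C# isolated-worker functions the role targets; the route, names, and logic are placeholders.

```python
# Sketch of an HTTP-triggered Azure Function (Python v2 programming model).
# The same trigger/binding concepts apply to C#/.NET isolated-worker
# functions; route and handler logic here are placeholders.
import json
import azure.functions as func

app = func.FunctionApp(http_auth_level=func.AuthLevel.FUNCTION)

@app.route(route="orders", methods=["POST"])
def create_order(req: func.HttpRequest) -> func.HttpResponse:
    try:
        order = req.get_json()
    except ValueError:
        return func.HttpResponse("Body must be JSON", status_code=400)
    # A real implementation would enqueue to Service Bus or persist to SQL Azure here.
    return func.HttpResponse(
        json.dumps({"received": order.get("id")}),
        mimetype="application/json",
        status_code=202,
    )
```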

Posted 5 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

www.infobeans.com
1. Should have good experience with Azure technologies; SQL is a must.
2. Strong implementation knowledge of ADF, Azure Databricks, Azure Synapse, and SQL (see the sketch after this list).
3. Strong implementation knowledge of visualization tools such as Power BI or Tableau.
4. Good understanding of data modeling, scaling, and transformations.
5. Well-versed in data warehousing implementation and concepts.
6. Strong knowledge of database and ETL methodologies.
7. Strong communication skills.
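As a concrete taste of the ADF requirement above, here is a hedged sketch that triggers and polls an ADF pipeline run with the azure-mgmt-datafactory SDK. The subscription, resource group, factory, pipeline, and parameter names are all placeholders.

```python
# Hedged sketch: triggering and polling an ADF pipeline run via the
# azure-mgmt-datafactory SDK. All resource names below are placeholders.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUB, RG, FACTORY, PIPELINE = "<subscription-id>", "rg-data", "adf-prod", "pl_daily_load"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUB)

# Kick off a run, passing a pipeline parameter (name is illustrative).
run = client.pipelines.create_run(RG, FACTORY, PIPELINE, parameters={"load_date": "2024-01-31"})
print("Started run:", run.run_id)

# Poll until the run leaves the in-progress states.
while True:
    status = client.pipeline_runs.get(RG, FACTORY, run.run_id).status
    if status not in ("Queued", "InProgress"):
        break
    time.sleep(30)
print("Finished with status:", status)
```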

Posted 5 days ago

Apply

5.0 - 7.0 years

0 Lacs

Thiruvananthapuram

On-site

5 - 7 Years | 1 Opening | Trivandrum
Role description: Senior Data Streaming Engineer. Build and maintain a real-time, file-based streaming data platform leveraging open-source technologies. The ideal candidate will have experience with Kubernetes (K8s), Apache Kafka, and Java multithreading, and will be responsible for:
• Developing a highly performant, scalable streaming architecture optimized for high throughput and low memory overhead (see the sketch after this listing)
• Implementing auto-scaling solutions to support variable data loads efficiently
• Integrating reference-data enrichment workflows using Snowflake
• Ensuring system reliability and real-time processing across distributed environments
• Collaborating with cross-functional teams to deliver robust, cloud-native data solutions
• Building scalable and optimized ETL/ELT workflows leveraging Azure Data Factory (ADF) and Apache Spark within Databricks
Skills: Azure, Kafka, Java, Kubernetes
About UST: UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world's best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with its clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into its clients' organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
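To illustrate the consume-enrich-commit pattern at the heart of the role, here is a minimal, hypothetical Python sketch using confluent-kafka (the posting's platform work is in Java, but the loop has the same shape). The broker, topic, and Snowflake-backed enrichment lookup are placeholders.

```python
# Minimal consume-enrich-commit loop with confluent-kafka. Broker, topic,
# and the reference-data enrichment are placeholders for illustration.
import json
from confluent_kafka import Consumer

consumer = Consumer({
    "bootstrap.servers": "broker:9092",   # placeholder
    "group.id": "file-stream-enricher",
    "enable.auto.commit": False,          # commit only after successful processing
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["file-events"])       # placeholder topic

def enrich(event: dict) -> dict:
    # In the real platform this would join against reference data in Snowflake;
    # a stub keeps the sketch self-contained.
    event["region_name"] = {"01": "EMEA", "02": "APAC"}.get(event.get("region_code"), "UNKNOWN")
    return event

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            raise RuntimeError(msg.error())
        record = enrich(json.loads(msg.value()))
        # ... hand `record` to the downstream writer here ...
        consumer.commit(msg)              # manual commit gives at-least-once delivery
except KeyboardInterrupt:
    pass
finally:
    consumer.close()
```

Disabling auto-commit and committing per processed message trades a little throughput for the at-least-once guarantee a file-based platform usually needs; batching commits is the standard optimization when throughput dominates.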

Posted 5 days ago

Apply

0 years

1 - 2 Lacs

Raurkela

On-site

1-2 years of experience in fault diagnosis and parts replacement for all types of printers. Qualification: Graduate/Diploma. Indicative topics:
• How many types of printers are there? (DMP and laser printers; laser and MFP printers)
• Knowledge of the difference between 32-column and 80-column printers
• Experience with laser scanners and ADF scanners
• Diagnosing a paper jam in a printer
• Resolving blurred-printing issues
• Issues where the printer produces blank prints
• Issues where the printer does not pick up paper
• What the possible problems are if the printer does not print even after a print command is sent from the system

Posted 5 days ago

Apply

0 years

0 Lacs

India

On-site

Position Overview: We are seeking a detail-oriented and highly skilled SQL Tester to join our Data & Analytics team. In this role, you will design and develop automated SQL test cases to validate data transformations and data quality, and validate data integrity, accuracy, and performance across our ETL pipelines and data platforms. You will work closely with data engineers, QA analysts, and DevOps teams to ensure high-quality data delivery in a CI/CD environment.
Key Responsibilities
• Design and develop automated SQL test cases to validate data transformations and data quality.
• Perform end-to-end testing of ETL processes, primarily using SSIS and other Microsoft BI tools.
• Collaborate with developers and analysts to understand data requirements and ensure test coverage.
• Integrate testing into CI/CD pipelines to support continuous delivery and deployment.
• Identify, document, and track defects and inconsistencies in data.
• Contribute to test automation strategies and reusable SQL test frameworks.
Required Skills & Qualifications
• Strong experience in automated SQL testing with a focus on data validation and transformation logic.
• Proficiency in ETL testing, particularly with SSIS.
• Experience working in CI/CD environments and integrating tests into automated pipelines.
• Solid understanding of relational databases and data warehousing concepts.
• Excellent analytical and problem-solving skills.
Nice to Have
• Experience with tSQLt for unit testing SQL Server code.
• Familiarity with Azure Data Factory (ADF) and cloud-based data integration.
• Knowledge of SQL linting tools and best practices for SQL code quality.
• Exposure to Agile/Scrum methodologies.

Posted 5 days ago

Apply