
1325 ADF Jobs - Page 14

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

8.0 - 12.0 years

12 - 22 Lacs

Hyderabad, Bengaluru

Work from Office

Role & responsibilities
• 8-12 years of professional work experience in a relevant field
• Proficient in Azure Databricks, ADF, Delta Lake, SQL Data Warehouse, Unity Catalog, MongoDB, and Python
• Experience with or prior knowledge of semi-structured data, Structured Streaming, Azure Synapse Analytics, data lakes, and data warehouses
• Proficient in creating Azure Data Factory pipelines for ETL/ELT processing: copy activity, custom Azure development, etc.
• Lead a technical team of 4-6 resources
• Prior knowledge of Azure DevOps and CI/CD processes, including GitHub
• Good knowledge of SQL and Python for data manipulation, transformation, and analysis; knowledge of Power BI would be beneficial
• Understand business requirements to set functional specifications for reporting applications
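For readers unfamiliar with this stack, here is a minimal, hypothetical sketch of the kind of work the posting describes: landing semi-structured JSON in a Delta Lake table with PySpark Structured Streaming. Every path, schema, and table name below is a placeholder invented for illustration, not something taken from the listing.

```python
# Hedged sketch: incremental ingest of semi-structured JSON into Delta Lake
# with PySpark Structured Streaming. Paths and table names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import col, current_timestamp

spark = SparkSession.builder.appName("json-to-delta").getOrCreate()

raw = (
    spark.readStream.format("json")
    # Streaming sources need an explicit schema; this one is made up.
    .schema("id STRING, amount DOUBLE, event_ts TIMESTAMP")
    .load("/mnt/landing/events/")  # placeholder landing path
)

cleaned = (
    raw.filter(col("id").isNotNull())        # basic data-quality gate
    .withColumn("ingested_at", current_timestamp())
)

query = (
    cleaned.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events")  # placeholder
    .outputMode("append")
    .toTable("bronze.events")  # Unity Catalog-style name, placeholder
)
query.awaitTermination()
```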

Posted 2 weeks ago

Apply

6.0 - 8.0 years

3 - 8 Lacs

Bengaluru

On-site

Country/Region: IN Requisition ID: 26961 Work Model: Position Type: Salary Range: Location: INDIA - BENGALURU - HP
Title: Technical Lead - Data Engineering
Description: Area(s) of responsibility
Azure Data Lead - 5A (HP role: Senior Data Engineer)
Experience: 6 to 8 years
• Azure lead with experience in Azure ADF, ADLS Gen2, Databricks, PySpark, and advanced SQL
• Responsible for designing, implementing, and estimating secure, scalable, and highly available cloud-based solutions on Azure
• 4 years of experience in Azure Databricks and PySpark
• Experience in performance tuning
• Experience integrating different data sources with data warehouses and data lakes
• Experience in creating data warehouses and data lakes
• Understanding of data modelling and data architecture concepts
• Able to clearly articulate the pros and cons of various technologies and platforms
• Experience with supporting tools: GitHub, Jira, Teams, Confluence
• Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage AWS and Azure cloud platforms
Mandatory skillset: Azure Databricks, PySpark, and advanced SQL
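As an illustration of the performance-tuning skill this listing asks for, here is a minimal PySpark sketch of two common techniques: broadcasting a small dimension table to avoid shuffling the large side of a join, and repartitioning by the write key. All table and column names are hypothetical.

```python
# Hedged PySpark tuning sketch; table and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = SparkSession.builder.appName("perf-tuning-demo").getOrCreate()

facts = spark.table("silver.sales")   # large fact table (placeholder)
dims = spark.table("silver.stores")   # small dimension table (placeholder)

# Broadcasting the small side avoids a full shuffle of the fact table.
joined = facts.join(broadcast(dims), on="store_id", how="left")

# Repartition by the write key so output partitions map cleanly to files.
(
    joined.repartition("sale_date")
    .write.mode("overwrite")
    .partitionBy("sale_date")
    .saveAsTable("gold.sales_enriched")
)
```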

Posted 2 weeks ago

Apply

12.0 - 14.0 years

9 - 10 Lacs

Bengaluru

On-site

Country/Region: IN Requisition ID: 26981 Work Model: Position Type: Salary Range: Location: INDIA - BENGALURU - AUTOMOTIVE
Title: Project Manager
Description: Area(s) of responsibility
Job description - Azure Tech Project Manager
Experience required: 12-14 years
• The Project Manager will be responsible for driving project management activities in Azure Cloud Services (ADF, Azure Databricks, PySpark, ADLS Gen2)
• Strong understanding of Azure services process execution, from acquiring data from source systems through to visualization
• Experience in Azure DevOps
• Experience in data warehouses, data lakes, and visualizations
• Project management skills, including time and risk management, resource prioritization, and project structuring
• Responsible for end-to-end project execution and delivery across multiple clients
• Understands ITIL processes related to incident management, problem management, application lifecycle management, and operational health management
• Strong in Agile and the Jira tool
• Strong customer service, problem-solving, organizational, and conflict-management skills
• Able to prepare weekly/monthly reports for both internal and client management
• Able to help team members with technical issues
• A good learner, open to learning new functionality

Posted 2 weeks ago

Apply

0 years

1 - 9 Lacs

Noida

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

This role is crucial for us on the Cloud Data Engineering team (Data Exchange) for all the cloud development, migration, and support work related to WellMed Data Services. The team maintains and supports the EDW and IS cloud modernization at WellMed, which involves cloud development for data streaming using Apache Kafka, Kubernetes, Databricks, Snowflake, and SQL Server, with Airflow for monitoring and support, and Git and DevOps for pipeline automation.

Primary Responsibility:
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
• Hands-on experience in cloud data engineering
• Contributions to cloud development, migration, and support work
• Proven ability to independently maintain and support EDW and cloud modernization work, including SQL development, Azure cloud development, and ETL using Azure Data Factory
• Proven success implementing data streaming using Apache Kafka, Kubernetes, Databricks, Snowflake, and SQL Server, with Airflow for monitoring and support, and Git and DevOps for pipeline automation

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.
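Since this role centers on Airflow-based monitoring and support, here is a minimal, hypothetical Airflow 2.x DAG of the kind such a team might maintain: a daily ingestion task with retries. The DAG id and callable are invented for illustration only.

```python
# Hedged sketch of a daily Airflow DAG with retry behavior (Airflow 2.4+
# 'schedule' parameter). DAG id and task logic are placeholders.
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_ingest():
    # Placeholder for the real ingestion/streaming trigger logic.
    print("Triggering ingestion job...")


with DAG(
    dag_id="edw_daily_ingest",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
    default_args={
        "retries": 2,                            # retry transient failures
        "retry_delay": timedelta(minutes=10),
    },
) as dag:
    ingest = PythonOperator(task_id="run_ingest", python_callable=run_ingest)
```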

Posted 2 weeks ago

Apply

12.0 - 14.0 years

0 Lacs

Greater Bengaluru Area

On-site

Area(s) of responsibility
Job description - Azure Tech Project Manager
Experience required: 12-14 years
• The Project Manager will be responsible for driving project management activities in Azure Cloud Services (ADF, Azure Databricks, PySpark, ADLS Gen2)
• Strong understanding of Azure services process execution, from acquiring data from source systems through to visualization
• Experience in Azure DevOps
• Experience in data warehouses, data lakes, and visualizations
• Project management skills, including time and risk management, resource prioritization, and project structuring
• Responsible for end-to-end project execution and delivery across multiple clients
• Understands ITIL processes related to incident management, problem management, application lifecycle management, and operational health management
• Strong in Agile and the Jira tool
• Strong customer service, problem-solving, organizational, and conflict-management skills
• Able to prepare weekly/monthly reports for both internal and client management
• Able to help team members with technical issues
• A good learner, open to learning new functionality

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Key Responsibilities
• Build and optimize ETL/ELT pipelines using Databricks and ADF, ingesting data from diverse sources including APIs, flat files, and operational databases.
• Develop and maintain scalable PySpark jobs for batch and incremental data processing across Bronze, Silver, and Gold layers.
• Write clean, production-ready Python code for data processing, orchestration, and integration tasks.
• Contribute to the medallion architecture design and help implement data governance patterns across data layers.
• Collaborate with analytics, data science, and business teams to design pipelines that meet performance and data quality expectations.
• Monitor, troubleshoot, and continuously improve pipeline performance and reliability.
• Support CI/CD for data workflows using Git, Databricks Repos, and optionally Terraform for infrastructure-as-code.
• Document pipeline logic, data sources, schema transformations, and operational playbooks.

Required Qualifications
• 3-5 years of experience in data engineering roles with increasing scope and complexity.
• Strong hands-on experience with Databricks, including Spark, Delta Lake, and SQL-based transformations.
• Proficiency in PySpark and Python for large-scale data manipulation and pipeline development.
• Hands-on experience with Azure Data Factory for orchestrating data workflows and integrating with Azure services.
• Solid understanding of data modeling concepts and modern warehousing principles (e.g., star schema, slowly changing dimensions).
• Comfortable with Git-based development workflows and collaborative coding practices.

Preferred / Bonus Qualifications
• Experience with Terraform to manage infrastructure such as Databricks workspaces, ADF pipelines, or storage resources.
• Familiarity with Unity Catalog, Databricks Asset Bundles (DAB), or Delta Live Tables (DLT).
• Experience with Azure DevOps or GitHub Actions for CI/CD in a data environment.
• Knowledge of data governance, role-based access control, or data quality frameworks.
• Exposure to real-time ingestion using tools like Event Hubs, Azure Functions, or Autoloader.
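The Bronze-to-Silver incremental processing this posting mentions is commonly done with a Delta Lake MERGE (upsert). Here is a minimal, hypothetical sketch of that pattern; table names and the incremental predicate are placeholders.

```python
# Hedged sketch of a Bronze-to-Silver incremental upsert via Delta MERGE,
# a common medallion-architecture pattern. All names are placeholders.
from delta.tables import DeltaTable
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

updates = (
    spark.table("bronze.orders")
    .where("ingest_date = current_date()")  # incremental slice, placeholder
    .dropDuplicates(["order_id"])           # dedupe on the business key
)

silver = DeltaTable.forName(spark, "silver.orders")

(
    silver.alias("t")
    .merge(updates.alias("s"), "t.order_id = s.order_id")
    .whenMatchedUpdateAll()      # update changed rows
    .whenNotMatchedInsertAll()   # insert new rows
    .execute()
)
```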

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

1. Strategy, Framework, and Governance Operating Model
- Develop and maintain enterprise-wide data governance strategies, standards, and policies.
- Align governance practices with business goals like regulatory compliance and analytics readiness.
- Define roles and responsibilities within the governance operating model.
- Drive governance maturity assessments and lead change management initiatives.
2. Stakeholder Alignment & Organizational Enablement
- Collaborate across IT, legal, business, and compliance teams to align governance priorities.
- Define stewardship models and create enablement, training, and communication programs.
- Conduct onboarding sessions and workshops to promote governance awareness.
3. Architecture Design for Data Governance Platforms
- Design scalable and modular data governance architecture.
- Evaluate tools like Microsoft Purview, Collibra, Alation, BigID, and Informatica.
- Ensure integration with metadata, privacy, quality, and policy systems.
4. Microsoft Purview Solution Architecture
- Lead end-to-end implementation and management of Microsoft Purview.
- Configure RBAC, collections, metadata scanning, business glossary, and classification rules.
- Implement sensitivity labels, insider risk controls, retention, data map, and audit dashboards.
5. Metadata, Lineage & Glossary Management
- Architect metadata repositories and ingestion workflows.
- Ensure end-to-end lineage (ADF → Synapse → Power BI).
- Define governance over the business glossary and approval workflows.

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

India

Remote

This is a full-time remote contract position (so no freelancing or moonlighting is possible). You may need to provide a few hours of overlap with US time zones. You may need to go through a background verification process in which your claimed experience, education certificates, and references will be verified, so please do not apply if you are not comfortable with this verification process. This is a client-facing role, hence excellent communication in English is a must.

Minimum experience: 5+ years

About the role: Our client is about to start an ERP replacement. They plan to move away from the AWS platform to an Azure data lake feeding Snowflake. We need a resource who can be a Snowflake thought leader and who has Microsoft Azure data engineering expertise.

Key Responsibilities:
Data Ingestion & Orchestration (Transformation & Cleansing):
• Design and maintain Azure Data Factory (ADF) pipelines: extract data from sources like ERPs (SAP, Oracle), UKG, SharePoint, and REST APIs.
• Configure scheduled/event-driven loads: set up ADF for automated data ingestion.
• Transform and cleanse data: develop logic in ADF for Bronze-to-Silver layer transformations.
• Implement data quality checks: ensure accuracy and consistency.
Snowflake Data Warehousing:
• Design, develop, and optimize data models within Snowflake, including creating tables, views, and stored procedures for both the Silver and Gold layers.
• Implement ETL/ELT processes within Snowflake to transform curated Silver data into highly optimized analytical Gold structures.
• Performance tuning: optimize queries and data loads.
Data Lake Management:
• Implement Azure Data Lake Gen2 solutions following the medallion architecture (Bronze, Silver).
• Manage partitioning, security, and governance to ensure efficient and secure data storage.
Collaboration & Documentation:
• Partner with stakeholders to convert data needs into technical solutions, document pipelines and models, and uphold best practices through code reviews.
Monitoring & Support:
• Track pipeline performance, resolve issues, and deploy alerting/logging for proactive data integrity and issue detection.
Data Visualization:
• Proficient with tools like Power BI, DAX, and Power Query for creating insightful reports. Skilled in Python for data processing and analysis to support data engineering tasks.

Required Skills & Qualifications:
• 5+ years of experience in data engineering, data warehousing, or ETL development.
• Microsoft Azure proficiency: Azure Data Factory (ADF) experience designing, developing, and deploying complex data pipelines; Azure Data Lake Storage Gen2 hands-on experience with data ingestion, storage, and organization.
• Expertise in Snowflake data warehousing and ETL/ELT: understanding of Snowflake architecture; SQL proficiency for manipulation and querying; experience with Snowpipe, tasks, streams, and stored procedures.
• Strong understanding of data warehousing concepts and ETL/ELT principles.
• Data formats and integration: experience with various data formats (e.g., Parquet, CSV, JSON) and data integration patterns.
• Data visualization: experience with Power BI, DAX, and Power Query.
• Scripting: Python for data processing and analysis.
Soft Skills: problem-solving, attention to detail, communication, and collaboration.
Nice-to-Have Skills: version control (e.g., Git), Agile/Scrum methodologies, and data governance and security best practices.
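As an illustration of the Snowflake streams-plus-MERGE pattern this listing calls out, here is a minimal sketch using snowflake-connector-python. Account, credentials, and all object names are placeholders, not from the posting.

```python
# Hedged sketch: consume a Snowflake stream on a Silver table and MERGE the
# changes into a Gold dimension. Every identifier below is a placeholder.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # placeholder
    user="etl_user",        # placeholder
    password="...",         # use a secrets manager in practice
    warehouse="TRANSFORM_WH",
    database="ANALYTICS",
    schema="SILVER",
)

merge_sql = """
MERGE INTO GOLD.DIM_CUSTOMER AS t
USING SILVER.CUSTOMER_STREAM AS s   -- stream created on the Silver table
   ON t.CUSTOMER_ID = s.CUSTOMER_ID
WHEN MATCHED THEN UPDATE SET t.NAME = s.NAME, t.UPDATED_AT = CURRENT_TIMESTAMP()
WHEN NOT MATCHED THEN INSERT (CUSTOMER_ID, NAME, UPDATED_AT)
     VALUES (s.CUSTOMER_ID, s.NAME, CURRENT_TIMESTAMP())
"""

with conn.cursor() as cur:
    cur.execute(merge_sql)  # consuming the stream advances its offset
conn.close()
```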

Posted 2 weeks ago

Apply

5.0 years

0 Lacs

India

On-site

Years of experience: 5+ years

JD: High-Level Responsibilities:
• Design and develop scalable data pipelines using Azure Data Factory, incorporating SSIS packages where applicable.
• Write and optimize T-SQL queries for data transformation, validation, and loading.
• Collaborate with the customer's data architects to understand and modernize legacy data integration patterns.
• Perform relational database design and schema optimization for Azure SQL or Synapse targets.
• Support migration of on-premises or legacy ETL jobs into cloud-native Azure Integration Services.
• Conduct unit testing and troubleshoot data pipeline issues during sprint cycles.
• Provide support during UK overlap hours (up to 8 PM IST) to align with the customer team's collaboration windows.

Mapped Skills:
• Azure Data Factory development (SSIS helpful)
• T-SQL development
• Relational database design
• SQL Server Management Studio
• Azure Data Studio
• Azure Portal
• Visual Studio 2022
• Experience migrating existing integrations to AIS

Recommended Skills:
• Azure Synapse Analytics (often paired with ADF in modern pipelines)
• Data flow transformations in ADF
• Data lake concepts and Azure Data Lake Gen2
• Monitoring and debugging ADF pipelines
• Integration Runtime setup and optimization
• Azure Key Vault integration in ADF
• Performance tuning in SQL Server and Azure SQL DB
• Knowledge of the Delta Lake format if modern analytics is a goal
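The legacy-to-ADF migration work described here frequently reproduces a watermark-based incremental load. Below is a minimal, hypothetical sketch of that pattern in T-SQL via pyodbc; the connection string, schema, and table names are assumptions for illustration.

```python
# Hedged sketch of a watermark-driven incremental extract against SQL Server.
# Connection string and table names are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;"   # placeholder
    "DATABASE=staging;UID=etl;PWD=...;Encrypt=yes;"
)
cur = conn.cursor()

# Read the last processed watermark, then pull only newer rows.
cur.execute("SELECT last_modified FROM etl.watermark WHERE table_name = ?", "orders")
watermark = cur.fetchone()[0]

cur.execute(
    "SELECT order_id, amount, last_modified FROM dbo.orders WHERE last_modified > ?",
    watermark,
)
rows = cur.fetchall()

# ... transform/load rows, then advance the watermark before committing.
cur.execute(
    "UPDATE etl.watermark SET last_modified = SYSUTCDATETIME() WHERE table_name = ?",
    "orders",
)
conn.commit()
conn.close()
```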

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Snowflake Developer/Data Engineer
Location: Chennai (Hybrid)
Experience: 6+ years

About the Role
We are looking for a Snowflake Developer with 6+ years of hands-on experience in Snowflake, SnowSQL, Cortex, DBT, and data warehousing. The ideal candidate should have strong expertise in data modeling, transformation, and optimization, along with excellent communication skills to collaborate with business and technical teams.

Key Responsibilities
• Develop and optimize Snowflake data models, schemas, and performance-tuned queries.
• Write and execute SnowSQL scripts for data transformation and automation.
• Utilize Snowflake Cortex to integrate AI-driven analytics and insights.
• Implement DBT (Data Build Tool) for data transformation, testing, and orchestration.
• Design and maintain ADF data pipelines and ETL/ELT workflows.
• Collaborate with cross-functional teams to understand data needs and provide solutions.
• Ensure data security, governance, and best practices in Snowflake.
• Troubleshoot performance issues and implement tuning strategies.

Required Skills & Qualifications
• 6+ years of hands-on experience with Snowflake and cloud data warehousing.
• Strong expertise in SnowSQL and DBT; expertise in Cortex is a plus.
• Experience in data modeling, performance tuning, and query optimization.
• Hands-on experience with ETL/ELT processes and data pipelines.
• Strong understanding of SQL, data warehousing concepts, and cloud architecture.
• Experience integrating Snowflake with other BI/analytics tools.
• Excellent problem-solving skills and attention to detail.
• Strong communication skills to interact with business and technical stakeholders.
• Knowledge of or hands-on experience with Power BI and Fabric is a plus.
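For the DBT orchestration this role mentions, transformations are typically run through the dbt CLI; a minimal Python wrapper of that workflow is sketched below. The model selector name is hypothetical.

```python
# Hedged sketch: drive dbt transformations and tests from Python via the
# dbt CLI. The 'staging' selector is a placeholder model group.
import subprocess


def run_dbt(*args: str) -> None:
    """Run a dbt CLI command and fail loudly on a non-zero exit code."""
    result = subprocess.run(["dbt", *args], capture_output=True, text=True)
    print(result.stdout)
    result.check_returncode()


# Build the staging models, then run schema tests against them.
run_dbt("run", "--select", "staging")
run_dbt("test", "--select", "staging")
```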

Posted 2 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job description
Job Name: Senior Data Engineer - DBT & Snowflake
Years of Experience: 5

Job Description: We are looking for a skilled and experienced DBT/Snowflake developer to join our team! As part of the team, you will be involved in implementing ongoing and new initiatives for our company. If you love learning, thinking strategically, innovating, and helping others, this job is for you!

Primary Skills: DBT, Snowflake
Secondary Skills: ADF, Databricks, Python, Airflow, Fivetran, Glue

Role Description: This data engineering role requires creating and managing the technological infrastructure of a data platform; being in charge of or involved in architecting, building, and managing data flows and pipelines; and constructing data stores (NoSQL, SQL), tools for working with big data (Hadoop, Kafka), and integration tools to connect sources or other databases.

Role Responsibility:
• Translate functional specifications and change requests into technical specifications
• Translate business requirement documents, functional specifications, and technical specifications into related coding
• Develop efficient code with unit testing and code documentation
• Ensure accuracy and integrity of data and applications through analysis, coding, documenting, testing, and problem solving
• Set up the development environment and configure the development tools
• Communicate with all project stakeholders on project status
• Manage, monitor, and ensure the security and privacy of data to satisfy business needs
• Contribute to the automation of modules, wherever required
• Be proficient in written, verbal, and presentation communication (English)
• Coordinate with the UAT team

Role Requirement:
• Proficient in basic and advanced SQL programming concepts (procedures, analytical functions, etc.)
• Good knowledge and understanding of data warehouse concepts (dimensional modeling, change data capture, slowly changing dimensions, etc.)
• Knowledgeable in Shell/PowerShell scripting
• Knowledgeable in relational databases, non-relational databases, data streams, and file stores
• Knowledgeable in performance tuning and optimization
• Experience in data profiling and data validation
• Experience in requirements gathering and documentation processes and performing unit testing
• Understanding and implementing QA and various testing processes in the project
• Knowledge of any BI tool is an added advantage
• Sound aptitude, outstanding logical reasoning, and analytical skills
• Willingness to learn and take initiative
• Ability to adapt to a fast-paced Agile environment

Additional Requirements:
• Design, develop, and maintain scalable data models and transformations using DBT in conjunction with Snowflake; ensure the effective transformation and loading of data from diverse sources into the data warehouse or data lake.
• Implement and manage data models in DBT, guaranteeing accurate data transformation and alignment with business needs.
• Utilize DBT to convert raw, unstructured data into structured datasets, enabling efficient analysis and reporting.
• Write and optimize SQL queries within DBT to enhance data transformation processes and improve overall performance.
• Establish DBT best practices to improve performance, scalability, and reliability.
• Expertise in SQL and a strong understanding of data warehouse concepts and modern data architectures.
• Familiarity with cloud-based platforms (e.g., AWS, Azure, GCP).
• Migrate legacy transformation code into modular DBT data models.
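The data profiling and validation work this role lists often boils down to a few programmatic checks before loading data downstream. Here is a minimal pandas sketch; the file path, key column, and rules are illustrative assumptions.

```python
# Hedged data-profiling/validation sketch in pandas. The extract path and
# expected columns are placeholders invented for illustration.
import pandas as pd

df = pd.read_csv("extract/customers.csv")  # placeholder extract

# Simple profile: row count, per-column null counts, duplicate-key check.
print(f"rows: {len(df)}")
print(df.isnull().sum())

dupes = df["customer_id"].duplicated().sum()
assert dupes == 0, f"{dupes} duplicate customer_id values found"

# Guardrail check: no signup dates in the future.
signup = pd.to_datetime(df["signup_date"])
assert signup.le(pd.Timestamp.today()).all(), "future signup dates found"
```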

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

On-site

We are seeking highly motivated and skilled DevOps Support Engineers to join our team. The ideal candidates will have a strong background in modern DevOps tools and practices, with expertise in Kubernetes, Snowflake, Python, Azure, Azure Data Factory (ADF), and other relevant technologies. This role requires a blend of technical expertise, problem-solving skills, and a customer-focused mindset to ensure the smooth operation and scalability of our infrastructure.

Location: Off-Shore (India)
Positions: 4

Key Responsibilities:
1. Platform Support and Maintenance:
• Provide day-to-day operational support for our systems, ensuring high availability, performance, and reliability.
• Monitor, troubleshoot, and resolve issues related to Kubernetes clusters, Snowflake data pipelines, and Azure infrastructure.
• Collaborate with cross-functional teams to address incidents and implement robust solutions.
2. Infrastructure Automation and Optimization:
• Develop and maintain automation scripts and tools using Python to streamline deployment, monitoring, and scaling processes.
• Optimize Kubernetes cluster configurations, including resource allocation and scaling strategies.
• Implement best practices for cloud resource utilization on Azure to reduce costs and improve efficiency.
3. Data Pipeline Management:
• Support and enhance data pipelines built on Snowflake and Azure Data Factory (ADF).
• Monitor data flow, troubleshoot pipeline failures, and ensure data integrity and availability.
• Collaborate with data engineering teams to implement new data workflows and improve existing pipelines.
4. Security and Compliance:
• Ensure the platform adheres to security standards and compliance requirements.
• Perform regular audits of infrastructure and implement security patches as needed.
• Manage role-based access control (RBAC) and permissions in Kubernetes, Snowflake, and Azure environments.
5. Collaboration and Communication:
• Work closely with development, QA, and product teams to ensure seamless integration and deployment of new features.
• Participate in on-call rotations to provide 24/7 support for critical issues.
• Document processes, configurations, and troubleshooting guides to improve knowledge sharing across the team.

Required Skills and Qualifications:
1. Technical Expertise:
• Proficient in managing Kubernetes clusters, including deployment, scaling, and monitoring.
• Hands-on experience with Snowflake, including data modeling, query optimization, and pipeline management.
• Strong programming skills in Python for automation and scripting.
• Solid understanding of Azure cloud services, including compute, storage, networking, and identity management.
• Familiarity with Azure Data Factory (ADF) for building and managing ETL/ELT pipelines.
2. DevOps Practices:
• Experience with CI/CD pipelines and tools (e.g., Jenkins, GitHub Actions, Azure DevOps).
• Knowledge of infrastructure-as-code (IaC) tools such as Terraform or ARM templates.
• Proficiency in monitoring tools like Prometheus, Grafana, or Azure Monitor.
3. Soft Skills:
• Excellent problem-solving and analytical skills, with a proactive mindset.
• Strong communication skills to work effectively with cross-functional teams.
• Ability to prioritize tasks and manage multiple responsibilities in a fast-paced environment.
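A typical day-to-day support task for this role is checking Kubernetes cluster health from a script. Here is a minimal sketch using the official Kubernetes Python client; the namespace is a placeholder.

```python
# Hedged sketch: list unhealthy pods in a namespace with the official
# Kubernetes Python client. Namespace name is a placeholder.
from kubernetes import client, config

config.load_kube_config()  # or load_incluster_config() when run inside a pod
v1 = client.CoreV1Api()

pods = v1.list_namespaced_pod(namespace="data-platform")  # placeholder
for pod in pods.items:
    phase = pod.status.phase
    if phase not in ("Running", "Succeeded"):
        # container_statuses can be None while a pod is still scheduling
        restarts = sum(
            cs.restart_count for cs in (pod.status.container_statuses or [])
        )
        print(f"{pod.metadata.name}: phase={phase}, restarts={restarts}")
```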

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

India

Remote

Job Title: Lead Data Engineer
Experience: 8-10 years
Location: Remote
Mandatory: Prior hands-on experience with Fivetran integrations

About the Role:
We are seeking a highly skilled Lead Data Engineer with 8-10 years of deep expertise in cloud-native data platforms, including Snowflake, Azure, DBT, and Fivetran. This role will drive the design, development, and optimization of scalable data pipelines, leading a cross-functional team and ensuring data engineering best practices are implemented and maintained.

Key Responsibilities:
• Lead the design and development of data pipelines (batch and real-time) using Azure, Snowflake, DBT, Python, and Fivetran.
• Translate complex business and data requirements into scalable, efficient data engineering solutions.
• Architect multi-cluster Snowflake setups with an eye on performance and cost.
• Design and implement robust CI/CD pipelines for data workflows (Git-based).
• Collaborate closely with analysts, architects, and business teams to ensure data architecture aligns with organizational goals.
• Mentor and review the work of onshore/offshore data engineers.
• Define and enforce coding standards, testing frameworks, monitoring strategies, and data quality best practices.
• Handle real-time data processing scenarios where applicable.
• Own end-to-end delivery and documentation for data engineering projects.

Must-Have Skills:
• Fivetran: proven experience integrating and managing Fivetran connectors and sync strategies.
• Snowflake expertise: warehouse management, cost optimization, and query tuning; internal vs. external stages and loading/unloading strategies; schema design, security model, and user access.
• Python (advanced): modular, production-ready code for ETL/ELT, APIs, and orchestration.
• DBT: strong command of DBT for transformation workflows and modular pipelines.
• Azure: Azure Data Factory (ADF) and Databricks; integration with Snowflake and other services.
• SQL: expert-level SQL for transformations, validations, and optimizations.
• Version control: Git, branching, pull requests, and peer code reviews.
• CI/CD: DevOps/DataOps workflows for data pipelines.
• Data modeling: star schema, Data Vault, normalization/denormalization techniques.
• Strong documentation using Confluence, Word, Excel, etc.
• Excellent communication skills, verbal and written.

Good to Have:
• Experience with real-time data streaming tools (Event Hub, Kafka)
• Exposure to monitoring/data observability tools
• Experience with cost management strategies for cloud data platforms
• Exposure to Agile/Scrum-based environments
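The multi-cluster Snowflake warehouse management this posting highlights is usually configured with documented ALTER WAREHOUSE parameters. Below is a hedged sketch run through the Python connector; the warehouse name and credentials are placeholders, and multi-cluster warehouses require a Snowflake edition that supports them.

```python
# Hedged sketch of multi-cluster warehouse tuning with documented Snowflake
# ALTER WAREHOUSE parameters. All names/credentials are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account", user="admin_user", password="...",  # placeholders
    role="SYSADMIN",
)

with conn.cursor() as cur:
    # Scale out for concurrency while capping spend: suspend quickly when
    # idle, and let Snowflake add clusters only under queueing pressure.
    cur.execute("""
        ALTER WAREHOUSE ANALYTICS_WH SET
            WAREHOUSE_SIZE = 'MEDIUM',
            MIN_CLUSTER_COUNT = 1,
            MAX_CLUSTER_COUNT = 4,
            SCALING_POLICY = 'STANDARD',
            AUTO_SUSPEND = 60,      -- seconds of idleness before suspend
            AUTO_RESUME = TRUE
    """)
conn.close()
```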

Posted 2 weeks ago

Apply

0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Optum is a global organization that delivers care, aided by technology, to help millions of people live healthier lives. The work you do with our team will directly improve health outcomes by connecting people with the care, pharmacy benefits, data and resources they need to feel their best. Here, you will find a culture guided by inclusion, talented peers, comprehensive benefits and career development opportunities. Come make an impact on the communities we serve as you help us advance health optimization on a global scale. Join us to start Caring. Connecting. Growing together.

This role is crucial for us on the Cloud Data Engineering team (Data Exchange) for all the cloud development, migration, and support work related to WellMed Data Services. The team maintains and supports the EDW and IS cloud modernization at WellMed, which involves cloud development for data streaming using Apache Kafka, Kubernetes, Databricks, Snowflake, and SQL Server, with Airflow for monitoring and support, and Git and DevOps for pipeline automation.

Primary Responsibility
• Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies in regards to flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications
• Hands-on experience in cloud data engineering
• Contributions to cloud development, migration, and support work
• Proven ability to independently maintain and support EDW and cloud modernization work, including SQL development, Azure cloud development, and ETL using Azure Data Factory
• Proven success implementing data streaming using Apache Kafka, Kubernetes, Databricks, Snowflake, and SQL Server, with Airflow for monitoring and support, and Git and DevOps for pipeline automation

At UnitedHealth Group, our mission is to help people live healthier lives and make the health system work better for everyone. We believe everyone, of every race, gender, sexuality, age, location and income, deserves the opportunity to live their healthiest life. Today, however, there are still far too many barriers to good health which are disproportionately experienced by people of color, historically marginalized groups and those with lower incomes. We are committed to mitigating our impact on the environment and enabling and delivering equitable care that addresses health disparities and improves health outcomes - an enterprise priority reflected in our mission.

Posted 2 weeks ago

Apply

8.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Data Engineer – Azure Data Platform
Location: Padi, Chennai
Job Type: Full-Time

Role Overview:
We are looking for an experienced Data Engineer to join our Azure Data Platform team. The ideal candidate will have a deep understanding of Azure's data engineering and cloud technology stack. This role is pivotal in driving data-driven decision-making, operational analytics, and advanced manufacturing intelligence initiatives.

Key Responsibilities:
• Lead the design and implementation of data architectures that support operational analytics and advanced manufacturing intelligence, ensuring scalability and flexibility to handle increasing data volumes.
• Design, implement, and maintain scalable data and analytics platforms using Microsoft Azure services such as Azure Data Factory (ADF), Azure Data Lake Storage Gen2, and Azure Synapse Analytics.
• Develop and manage ETL processes, data pipelines, and batch jobs to ensure efficient data flow and transformation, optimizing pipeline runs and monitoring compute and storage usage.
• Implement metadata management solutions to ensure data quality and governance, leading to consistent data quality and integrity.
• Integrate data from key sources such as SAP, SQL Server, cloud databases, IoT, and other live streaming data into centralized data structures to support analytics and decision-making.
• Provide expertise on data ingestion (SAP, SQL), data transformation, and the automation of data pipelines in a manufacturing context.
• Ensure the data platform supports dashboarding and advanced analytics, enabling business users to independently create and evolve dashboards.
• Implement manufacturing-specific analytics solutions, including leadership and operational dashboards and other analytics solutions across our value chain, leveraging Azure's comprehensive toolset.
• Define and monitor KPIs, ensuring data quality and the accuracy of insights delivered to business stakeholders.
• Identify and manage project risks related to data security, system integration, and scalability.
• Independently maintain the data platform, ensuring its reliability and performance and implementing best practices for data security and compliance.
• Advise the Data Platform project manager and leadership team on best practices for data management and scaling needs, providing guidance on integrating data from IoT and other SaaS platforms, as well as newer systems as they come into the digital landscape.
• Work closely with data scientists to ensure data is available in the required format for their analyses, and collaborate with Power BI developers to support dashboarding and reporting needs.
• Create data marts for business users to facilitate self-service analytics.
• Mentor and train junior engineers, fostering their professional growth and development and providing guidance and support on best practices and technical challenges.

Qualifications & Experience:
• Education: Bachelor's degree in Engineering, Computer Science, or a related field.
• Experience: 8-10 years of experience, with a minimum of 5 years working on core data engineering responsibilities on a cloud platform. Project management experience is a big plus. Proven track record of implementing data-driven solutions in areas such as plant automation, operational analytics, quality control, and supply chain optimization.
• Technical Proficiency: Expertise in cloud-based data platforms, particularly within the Azure ecosystem (Azure Data Factory, Synapse Analytics, Databricks). Familiarity with SAP as a data source. Proficiency in programming languages such as SQL, Python, and R for analytics and reporting.
• Soft Skills: Strong analytical mindset with the ability to translate manufacturing challenges into data-driven insights and solutions. Excellent communication and organizational skills.

What We Offer:
• The opportunity to work on transformative data analytics projects that drive innovation and operational excellence in manufacturing.
• A collaborative and dynamic work environment focused on professional growth and career development.
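For orientation, triggering and polling an ADF pipeline programmatically is commonly done with the Azure SDK for Python (azure-identity plus azure-mgmt-datafactory). This is a hedged sketch with placeholder resource names; verify method names against your installed SDK version.

```python
# Hedged sketch: trigger and poll an ADF pipeline run via the Azure SDK for
# Python. Subscription, resource group, factory, and pipeline names are
# placeholders; confirm the API surface against your SDK version.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")  # placeholder

run = client.pipelines.create_run(
    resource_group_name="rg-data-platform",  # placeholder
    factory_name="adf-manufacturing",        # placeholder
    pipeline_name="pl_ingest_sap",           # placeholder
)

status = client.pipeline_runs.get(
    "rg-data-platform", "adf-manufacturing", run.run_id
)
print(status.status)  # e.g. InProgress, Succeeded, Failed
```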

Posted 2 weeks ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description

Your Responsibilities
We are seeking an experienced and highly motivated Sr Data Engineer - Data Ingestion to join our dynamic team. The ideal candidate will have strong hands-on experience with Azure Data Factory (ADF), a deep understanding of relational and non-relational data ingestion techniques, and proficiency in Python programming. You will be responsible for designing and implementing scalable data ingestion solutions that interface with Azure Data Lake Storage Gen 2 (ADLS Gen 2), Databricks, and various other Azure ecosystem services. The Data Ingestion Engineer will work closely with stakeholders to gather data ingestion requirements, create modularized ingestion solutions, and define best practices to ensure efficient, robust, and scalable data pipelines. This role requires effective communication skills, ownership, and accountability for the delivery of high-quality data solutions.

Data Ingestion Strategy & Development:
• Design, develop, and deploy scalable and efficient data pipelines in Azure Data Factory (ADF) to move data from multiple sources (relational, non-relational, files, APIs, etc.) into Azure Data Lake Storage Gen 2 (ADLS Gen 2), Azure SQL Database, and other target systems.
• Implement ADF activities (copy, lookup, execute pipeline, etc.) to integrate data from on-premises and cloud-based systems.
• Build parameterized and reusable pipeline templates in ADF to standardize the data ingestion process, ensuring maintainability and scalability of ingestion workflows.
• Integrate custom data transformation activities within ADF pipelines, utilizing Python, Databricks, or Azure Functions when required.

ADF Data Flows Design & Development:
• Leverage Azure Data Factory Data Flows for visually designing and orchestrating data transformation tasks, enabling complex ETL (Extract, Transform, Load) logic to process large datasets at scale.
• Design data flow transformations such as filtering, aggregation, joins, lookups, and sorting to process and transform data before loading it into target systems like ADLS Gen 2 or Azure SQL Database.
• Implement incremental loading strategies in Data Flows to ensure efficient and optimized data ingestion for large volumes of data while minimizing resource consumption.
• Develop reusable data flow components to streamline transformation processes, ensuring consistency and reducing development time for new data ingestion pipelines.
• Utilize debugging tools in Data Flows to troubleshoot, test, and optimize data transformations, ensuring accurate results and performance.

ADF Orchestration & Automation:
• Use ADF triggers and scheduling to automate pipeline execution based on time or events, ensuring timely and efficient data ingestion.
• Configure ADF monitoring and alerting capabilities to proactively track pipeline performance, handle failures, and address issues in a timely manner.
• Implement ADF version control practices using Git to manage code changes, collaborate effectively with other team members, and ensure code integrity.

Data Integration with Various Sources:
• Ingest data from diverse sources such as on-premises SQL Servers, REST APIs, cloud databases (e.g., Azure SQL Database, Cosmos DB), file-based systems (CSV, Parquet, JSON), and third-party services using ADF.
• Design and implement ADF linked services to securely connect to external data sources (databases, file systems, APIs, etc.).
• Develop and configure ADF datasets and dataflows to efficiently transform, clean, and load data into Azure Data Lake or other destinations.

Pipeline Monitoring and Optimization:
• Continuously monitor and optimize ADF pipelines to ensure they run with high performance and minimal cost, applying techniques like data partitioning, parallel processing, and incremental loading where appropriate.
• Implement data quality checks within the pipelines to ensure data integrity and handle data anomalies or errors in a systematic manner.
• Review pipeline execution logs and performance metrics regularly, and apply tuning recommendations to improve execution times and reduce operational costs.

Collaboration and Communication:
• Work closely with business and technical stakeholders to capture and translate data ingestion requirements into ADF pipeline designs.
• Provide ADF-specific technical expertise to both internal and external teams, guiding them in the use of ADF for efficient and cost-effective data pipelines.
• Document ADF pipeline designs, error handling strategies, and best practices to ensure the team can maintain and scale the solutions.
• Conduct training sessions or knowledge transfers with junior engineers or other team members on ADF best practices and architecture.

Security and Compliance:
• Ensure all data ingestion solutions built in ADF follow security and compliance guidelines, including encryption at rest and in transit, data masking, and identity and access management.
• Implement role-based access control (RBAC) and managed identities within ADF to manage access securely and reduce the risk of unauthorized access to sensitive data.

Integration with Azure Ecosystem:
• Leverage other Azure services, such as Azure Logic Apps, Azure Function Apps, and Azure Databricks, to augment the capabilities of ADF pipelines, enabling more advanced data processing, event-driven workflows, and custom transformations.
• Incorporate Azure Key Vault to securely store and manage sensitive data (e.g., connection strings, credentials) used in ADF pipelines.
• Integrate ADF with Azure Data Lake Analytics, Synapse Analytics, or other data warehousing solutions for advanced querying and analytics after ingestion.

Best Practices & Continuous Improvement:
• Develop and enforce best practices for building and maintaining ADF pipelines and data flows, ensuring the solutions are modular, reusable, and follow coding standards.
• Identify opportunities for pipeline automation to reduce manual intervention and improve operational efficiency.
• Regularly review and suggest new tools or services within the Azure ecosystem to enhance ADF pipeline performance and increase the overall efficiency of data ingestion workflows.

Incident and Issue Management:
• Actively monitor the health of the data pipelines, swiftly addressing any failures, data quality issues, or performance bottlenecks.
• Troubleshoot ADF pipeline errors, including issues within Data Flows, and work with other teams to root-cause issues related to data availability, quality, or connectivity.
• Participate in post-mortem analyses for any major incidents, documenting lessons learned and implementing preventative measures for the future.

Your Profile
Experience with Azure Data Services:
• Strong experience with Azure Data Factory (ADF) for orchestrating data pipelines.
• Hands-on experience with ADLS Gen 2, Databricks, and various data formats (e.g., Parquet, JSON, CSV).
• Solid understanding of Azure SQL Database, Azure Logic Apps, Azure Function Apps, and Azure Container Apps.
Programming and Scripting:
• Proficient in Python for data ingestion, automation, and transformation tasks.
• Ability to write clean, reusable, and maintainable code.
Data Ingestion Techniques:
• Solid understanding of relational and non-relational data models and their ingestion techniques.
• Experience working with file-based data ingestion, API-based data ingestion, and integrating data from various third-party systems.
Problem Solving & Analytical Skills
Communication Skills

#IncludingYou
Diversity, equity, inclusion and belonging are cornerstones of ADM's efforts to continue innovating, driving growth, and delivering outstanding performance. We are committed to attracting and retaining a diverse workforce and creating welcoming, truly inclusive work environments — environments that enable every ADM colleague to feel comfortable on the job, make meaningful contributions to our success, and grow their career. We respect and value the unique backgrounds and experiences that each person can bring to ADM because we know that diversity of perspectives makes us better, together. For more information regarding our efforts to advance Diversity, Equity, Inclusion & Belonging, please visit our website here: Diversity, Equity and Inclusion | ADM.

About ADM
At ADM, we unlock the power of nature to provide access to nutrition worldwide. With industry-advancing innovations, a complete portfolio of ingredients and solutions to meet any taste, and a commitment to sustainability, we give customers an edge in solving the nutritional challenges of today and tomorrow. We're a global leader in human and animal nutrition and the world's premier agricultural origination and processing company. Our breadth, depth, insights, facilities and logistical expertise give us unparalleled capabilities to meet needs for food, beverages, health and wellness, and more. From the seed of the idea to the outcome of the solution, we enrich the quality of life the world over. Learn more at www.adm.com.

Req/Job ID: 97477BR
Ref ID
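The parameterized, incremental file ingestion this posting describes maps naturally to Databricks Auto Loader (the cloudFiles source). Below is a hedged sketch; the ADLS Gen 2 paths and target table are placeholders, the `spark` session is the one a Databricks runtime provides, and the API is Databricks-specific.

```python
# Hedged Databricks Auto Loader sketch: incremental ingest from ADLS Gen 2
# into a Bronze Delta table. Paths/table names are placeholders; `spark`
# is predefined by the Databricks runtime.
landing = "abfss://landing@mystorageacct.dfs.core.windows.net/orders/"
bronze_ckpt = "abfss://bronze@mystorageacct.dfs.core.windows.net/_ckpt/orders/"

stream = (
    spark.readStream.format("cloudFiles")
    .option("cloudFiles.format", "json")
    .option("cloudFiles.schemaLocation", bronze_ckpt)  # schema tracking
    .load(landing)
)

(
    stream.writeStream.format("delta")
    .option("checkpointLocation", bronze_ckpt)
    .trigger(availableNow=True)   # drain the backlog, then stop
    .toTable("bronze.orders")     # placeholder target table
)
```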

Posted 2 weeks ago

Apply

40.0 years

0 Lacs

Hyderābād

On-site

Engineering Graduate / Post Graduate, preferably in Computer Science or MCA, having 2+ years of development experience in:
• Oracle and ADF based applications
• Knowledge of RDBMS and data modeling concepts
• Oracle database, knowledge of SQL, and PL/SQL
• Client-side web development languages (JavaScript, HTML, DHTML, and CSS)

Desirable:
• REST API implementation
• SOA (REST-based micro-services)
• Collaborative development (Gitflow, peer reviewing)
• Maven, SQL
• Continuous integration/delivery (Jenkins, Docker)

Diversity and Inclusion: An Oracle career can span industries, roles, countries and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work life in. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. In order to nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that inspires thought leadership and innovation.

Analyze, design, develop, troubleshoot, and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications. Give Java developers what they need to be successful building cloud-native applications, leveraging deep integrations with familiar tools like Spring, Maven, Kubernetes, and IntelliJ to get started quickly. As a member of the software engineering division, you will perform high-level design based on provided external specifications; specify, design and implement minor changes to existing software architecture; build highly complex enhancements and resolve complex bugs; and build and execute unit tests and unit plans. Review integration and regression test plans created by QA. Communicate with QA and porting engineering as necessary to discuss minor changes to product functionality and to ensure quality and consistency across specific products. Duties and tasks are varied and complex, needing independent judgment. Fully competent in own area of expertise. May have a project lead role and/or supervise lower-level personnel. BS or MS degree or equivalent experience relevant to the functional area. 4 years of software engineering or related experience.

Posted 2 weeks ago

Apply

4.0 years

10 - 10 Lacs

Bengaluru

On-site

Location: Bengaluru, KA, IN
Company: ExxonMobil

About us
At ExxonMobil, our vision is to lead in energy innovations that advance modern living and a net-zero future. As one of the world's largest publicly traded energy and chemical companies, we are powered by a unique and diverse workforce fueled by the pride in what we do and what we stand for. The success of our Upstream, Product Solutions and Low Carbon Solutions businesses is the result of the talent, curiosity and drive of our people. They bring solutions every day to optimize our strategy in energy, chemicals, lubricants and lower-emissions technologies. We invite you to bring your ideas to ExxonMobil to help create sustainable solutions that improve quality of life and meet society's evolving needs. Learn more about our What and our Why and how we can work together.

ExxonMobil's affiliates in India
ExxonMobil's affiliates have offices in India in Bengaluru, Mumbai and the National Capital Region. ExxonMobil's affiliates in India supporting the Product Solutions business engage in the marketing, sales and distribution of performance as well as specialty products across chemicals and lubricants businesses. The India planning teams are also embedded with global business units for business planning and analytics. ExxonMobil's LNG affiliate in India supporting the upstream business provides consultant services for other ExxonMobil upstream affiliates and conducts LNG market-development activities. The Global Business Center - Technology Center provides a range of technical and business support services for ExxonMobil's operations around the globe. ExxonMobil strives to make a positive contribution to the communities where we operate, and its affiliates support a range of education, health and community-building programs in India. Read more about our Corporate Responsibility Framework. To know more about ExxonMobil in India, visit ExxonMobil India and the Energy Factor India.

What role you will play in our team
Design, build, and maintain data systems, architectures, and pipelines to extract insights and drive business decisions. Collaborate with stakeholders to ensure data quality, integrity, and availability.

What you will do
• Design, develop, and maintain robust ETL pipelines using tools like Airflow, Azure Data Factory, Qlik Replicate, and Fivetran.
• Automate data extraction, transformation, and loading processes across cloud platforms (Azure, Snowflake).
• Build and optimize Snowflake data models in collaboration with system architects to support business needs.
• Develop and maintain CI/CD pipelines using GitHub and Azure DevOps (ADO).
• Create and manage data input and review screens in Sigma, including performance dashboards.
• Integrate third-party ETL tools for Cloud-to-Cloud (C2C) and On-Premises-to-Cloud (OP2C) data flows.
• Implement monitoring and alerting systems for pipeline health and data quality.
• Support data cleansing, enrichment, and curation to enable business use cases.
• Troubleshoot and resolve data issues, including missing or incorrect data, long-running queries, and Sigma screen problems.
• Collaborate with cross-functional teams to deliver data solutions for platforms like CEDAR.
• Manage Snowflake security, including roles, shares, and access controls.
• Optimize and tune SQL queries across Snowflake, MSSQL, Postgres, Oracle, and Azure SQL.
• Develop large-scale aggregate queries across multiple schemas and datasets.
About You

Skills and Qualifications

Core Technical Skills
• Languages: Proficient in Python, with experience in C#, C++, F#, or Java.
• Databases: Strong experience with SQL and NoSQL, including Snowflake, Azure SQL, PostgreSQL, MSSQL, Oracle.
• ETL Tools: Expertise in Airflow, Qlik Replicate, Fivetran, Azure Data Factory.
• Cloud Platforms: Deep knowledge of Azure services including Azure Data Explorer (ADX), ADF, Databricks.
• Data Modeling: Hands-on experience with Snowflake modeling, including stored procedures, UDFs, Snowpipe, streams, shares.
• Monitoring & Optimization: Skilled in query tuning, performance measurement, and pipeline monitoring.
• CI/CD: Experience managing pipelines using GitHub and Azure DevOps.

Additional Tools & Technologies
• Sigma: Experience designing and managing Sigma dashboards and screens (or a strong background in Power BI/Tableau with willingness to learn Sigma).
• Streamlit: Experience developing Streamlit apps using Python.
• DBT: Experience managing Snowflake with DBT scripting.

Preferred Qualifications
• 4+ years of hands-on experience as a Data Engineer.
• Proficiency in Snowflake with data modelling.
• Experience in change management and working in Agile environments.
• Prior experience in the energy industry is a plus.
• Bachelor's or Master's degree in Computer Science, IT, or related engineering disciplines with a minimum GPA of 7.0.

Your benefits
An ExxonMobil career is one designed to last. Our commitment to you runs deep: our employees grow personally and professionally, with benefits built on our core categories of health, security, finance and life. We offer you:
• Competitive compensation
• Medical plans, maternity leave and benefits, life, accidental death and dismemberment benefits
• Retirement benefits
• Global networking & cross-functional opportunities
• Annual vacations & holidays
• Day care assistance program
• Training and development program
• Tuition assistance program
• Workplace flexibility policy
• Relocation program
• Transportation facility
Please note benefits may change from time to time without notice, subject to applicable laws. The benefits programs are based on the Company's eligibility guidelines.

Stay connected with us
Learn more about ExxonMobil in India, visit ExxonMobil India and Energy Factor India. Follow us on LinkedIn and Instagram. Like us on Facebook. Subscribe to our channel on YouTube.

EEO Statement
ExxonMobil is an Equal Opportunity Employer: All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, age, national origin or disability status.

Business solicitation and recruiting scams
ExxonMobil does not use recruiting or placement agencies that charge candidates an advance fee of any kind (e.g., placement fees, immigration processing fees, etc.). Follow the LINK to understand more about recruitment scams in the name of ExxonMobil.

Nothing herein is intended to override the corporate separateness of local entities. Working relationships discussed herein do not necessarily represent a reporting connection, but may reflect a functional guidance, stewardship, or service relationship. Exxon Mobil Corporation has numerous affiliates, many with names that include ExxonMobil, Exxon, Esso and Mobil. For convenience and simplicity, those terms and terms like corporation, company, our, we and its are sometimes used as abbreviated references to specific affiliates or affiliate groups.
Abbreviated references describing global or regional operational organizations and global or regional business lines are also sometimes used for convenience and simplicity. Similarly, ExxonMobil has business relationships with thousands of customers, suppliers, governments, and others. For convenience and simplicity, words like venture, joint venture, partnership, co-venturer, and partner are used to indicate business relationships involving common activities and interests, and those words may not indicate precise legal relationships.

Competencies
(B) Adapts
(B) Applies Learning
(B) Analytical
(B) Collaborates
(B) Communicates Effectively
(B) Innovates

Job Segment: Sustainability, SQL, Database, Oracle, Computer Science, Energy, Technology
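Since this posting pairs Streamlit with Snowflake for dashboarding, here is a minimal, hypothetical sketch of a pipeline-health dashboard in that stack; the query, table, and credentials are invented for illustration, and the app would be launched with `streamlit run app.py`.

```python
# Hedged sketch: a small Streamlit dashboard reading from Snowflake.
# Account, credentials, and the OPS.PIPELINE_STATS table are placeholders.
import pandas as pd
import snowflake.connector
import streamlit as st

st.title("Pipeline health")  # illustrative dashboard title

conn = snowflake.connector.connect(
    account="my_account", user="report_user", password="...",  # placeholders
    warehouse="REPORT_WH", database="ANALYTICS",
)

df = pd.read_sql(
    "SELECT run_date, failed_runs, avg_runtime_s FROM OPS.PIPELINE_STATS "
    "ORDER BY run_date",
    conn,
)

st.line_chart(df.set_index("run_date")[["failed_runs"]])  # trend of failures
st.dataframe(df.tail(20))                                 # recent runs table
```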

Posted 2 weeks ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary
Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. However, ensuring a streamlined end-to-end Oracle Fusion technical landscape that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective.
As part of the Technology Consulting -Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new age operating model and best in class practices to deliver technology enabled transformation to our clients Responsibilities: Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer Completed at least 2 full Oracle Cloud (Fusion) Implementation Extensive Knowledge on database structure for ERP/Oracle Cloud (Fusion) Extensively worked on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC) Mandatory skill sets BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC) Preferred skill sets database structure for ERP/Oracle Cloud (Fusion) Year of experience required Minimum 2+ Years of Oracle fusion experience Educational Qualification BE/BTech MBA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Technology, Master of Business Administration, Bachelor of Engineering Degrees/Field Of Study Preferred: Certifications (if blank, certifications not specified) Required Skills Oracle Business Intelligence (BI) Publisher, Oracle Fusion Applications Optional Skills Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, Teamwork, Well Being Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date
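For readers unfamiliar with the FBDI workflow named in the skill sets above: an FBDI load packages the interface CSVs into a zip, pushes it to Fusion, and triggers the matching ESS import job. Below is a minimal, hedged Python sketch of that flow against the ERP Integrations REST resource. The host, credentials, document account, job name, and response field are placeholders and assumptions; endpoint versions and payload fields vary by release, so verify against the REST documentation for your pod.

```python
# Hedged sketch: submit an FBDI zip to Oracle Fusion via the ERP Integrations
# REST service. Host, credentials, DocumentAccount and JobName below are
# placeholders; check your release's REST docs for the exact payload shape.
import base64
import requests

FUSION_HOST = "https://your-pod.oraclecloud.com"  # placeholder pod URL
ENDPOINT = f"{FUSION_HOST}/fscmRestApi/resources/11.13.18.05/erpintegrations"

with open("ApInvoicesImport.zip", "rb") as f:
    payload = {
        "OperationName": "importBulkData",
        "DocumentContent": base64.b64encode(f.read()).decode("ascii"),
        "ContentType": "zip",
        "FileName": "ApInvoicesImport.zip",
        "DocumentAccount": "fin$/payables$/import$",  # assumed UCM account
        "JobName": "oracle/apps/ess/financials/payables/invoices/transactions,APXIIMPT",
    }

resp = requests.post(ENDPOINT, json=payload, auth=("integration.user", "password"))
resp.raise_for_status()
# The response carries an ESS request id to poll for import status; the exact
# field name (e.g. "ReqstId") should be confirmed against your release.
print(resp.json())
```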

Posted 2 weeks ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory | Industry/Sector: Not Applicable | Specialism: Oracle | Management Level: Senior Associate

Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Summary: Managing business performance in today's complex and rapidly changing business environment is crucial for any organization's short-term and long-term success. However, ensuring that end-to-end (E2E) Oracle Fusion technical components adapt seamlessly to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities:
Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer
Completed at least 2 full Oracle Cloud (Fusion) implementations
Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion)
Extensive hands-on work with BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration Cloud (OIC)

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration Cloud (OIC)
Preferred skill sets: Database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: Minimum 4 years of Oracle Fusion experience
Educational qualification: BE/BTech, MBA
Degrees/Fields of Study required: Bachelor of Engineering, Bachelor of Technology, Master of Business Administration
Required Skills: Oracle Integration Cloud (OIC)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being

Posted 2 weeks ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Data Engineer (Azure)
Experience: 5-8 Years
Work Mode: Onsite
Location: Pune

Overview: Our company is a strategic technology partner specializing in delivering reliable, turnkey AI solutions. Our experienced team of statisticians, data scientists, computer engineers, and product managers brings clarity and innovation to organizations by empowering decision-making through analytics and AI. We are committed to reshaping how AI/ML is leveraged in the service sector by delivering outcomes that truly matter. Our goal is to help businesses grow, transform, and achieve their objectives through best-in-class, value-driven data solutions.

Responsibilities:
Design and develop scalable data pipelines and solutions on Azure cloud services.
Implement and manage Data Warehousing and Data Lake solutions.
Perform ETL operations using SSIS, Azure Data Factory (ADF), and Synapse.
Carry out data transformation, ingestion, integration, and modeling tasks.
Optimize SQL queries and improve database performance.
Collaborate with stakeholders to ensure accurate data delivery and reporting.
Follow CI/CD best practices using Azure DevOps.
Ensure data governance, lineage, and security compliance.
Integrate data from various sources, including RESTful APIs and event-driven architectures.

Skills:
Expertise in Azure Data Factory, Synapse Analytics, SQL Server, Azure SQL, and Azure Storage
Proficiency in ETL processes using SSIS and ADF
Strong understanding of Data Warehousing and Data Lake architecture
Solid SQL skills with data modeling and optimization
Microsoft Certified: Azure Data Engineer Associate
Experience working with version control (Git) and CI/CD pipelines
Knowledge of data governance and privacy best practices
Excellent problem-solving and communication skills

Qualifications:
Bachelor's degree in Computer Science, Information Technology, or a related field
Experience in event-driven architectures and Data APIs
Prior experience in Agile environments
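To make the pipeline responsibilities above concrete, here is a minimal PySpark sketch of one ingestion step: reading raw CSVs from ADLS Gen2 and writing a partitioned Delta table. The storage account, container, and column names are placeholder assumptions, and the cluster is assumed to be Delta-enabled (e.g. Databricks or Synapse Spark) with storage access already configured. In ADF, a step like this would typically run as a Databricks notebook or Spark activity inside the pipeline.

```python
# Minimal sketch of a raw-to-curated ingestion step on Azure.
# Paths and column names are placeholders; access to the storage
# account is assumed to be configured on the cluster.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_ingest").getOrCreate()

raw_path = "abfss://raw@examplestorage.dfs.core.windows.net/orders/"
curated_path = "abfss://curated@examplestorage.dfs.core.windows.net/orders_delta/"

df = (spark.read
      .option("header", "true")
      .option("inferSchema", "true")
      .csv(raw_path))

# Basic cleanup: de-duplicate on an assumed key, type the timestamp,
# derive a partition column, and stamp lineage metadata.
clean = (df.dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .withColumn("_ingested_at", F.current_timestamp()))

(clean.write
      .format("delta")
      .mode("overwrite")
      .partitionBy("order_date")
      .save(curated_path))
```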

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description: Oracle Global Services Center (GSC) is a fast-growing cloud consulting team passionate about our customers' rapid and successful adoption of Oracle Cloud Solutions. Our flexible and innovative "Optimum Shore" approach helps our clients implement, maintain, and integrate their Oracle Cloud Applications and Technology environments while reducing overall total cost of ownership. We assemble an efficient team for each client by blending resources from onshore, near-shore, and offshore global delivery centers to match the right expertise, to the right solution, for the right cost. To support our rapid growth, we are seeking versatile consultants who bring a passion for providing an excellent client experience, enabling client success by developing innovative solutions. Our cloud solutions are redefining the world of business, empowering governments, and helping society evolve with the pace of change. Join a team of top-class consultants and help our customers achieve more than ever before.

This is a senior consulting position (Career Level IC2) operating independently, with some assistance and mentorship, within a project team or customer engagement, aligned with Oracle methodologies and practices. The role performs standard duties and tasks, with some variation, to implement Oracle products and technology to meet customer specifications.

Life at Oracle: We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veteran status, or any other characteristic protected by law. At Oracle, we don't just value differences; we celebrate them! We are committed to crafting a workplace where all kinds of people work together, because we believe innovation starts with diversity. https://www.oracle.com/corporate/careers/culture/diversity.html

Detailed Description: Operates independently to provide quality work products to an engagement. Performs multifaceted and complex tasks that need independent judgment. Applies Oracle methodology, company procedures, and leading practices. Demonstrates expertise to deliver solutions on complex engagements. May act as the functional team lead on projects. Efficiently collaborates with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for complex projects.

Detailed Requirements: The candidate is expected to have sound domain knowledge in HCM covering the hire-to-retire cycle, with 7 to 12 years of experience. They must have been part of at least 3 end-to-end HCM Cloud implementations, including at least 1 project as a lead.

Functional: The candidate must have knowledge of the Core HR module plus any of the following: Time and Labor, Absence Management, Payroll, Benefits, Compensation, Recruiting. The candidate should have worked in client-facing roles, interacting with customers in requirement-gathering workshops, design, configuration, testing and go-live. Engineering graduates with an MBA (HR) will be preferred.

Technical: In-depth understanding of the data model and business process functionality (and its data flow) in the HCM Cloud application and Oracle EBS / PeopleSoft HRMS. Experienced in Cloud HCM conversions, integrations (HCM Extracts & BIP), reporting (OTBI & BIP), Fast Formulas and personalization. An engineering degree in any field, an MCA degree, or equivalent experience. Proven experience with Fusion technologies including HDL, HCM Extracts, Fast Formulas, BI Publisher reports and Design Studio. Beyond the above, advanced knowledge of OIC, ADF, Java, PaaS, DBCS etc. would be an added advantage.

Other expectations: Good functional or technical leadership capability with strong planning and follow-up skills, mentorship, work allocation, monitoring, and status updates to the Project Coordinator. Strong written and verbal communication skills, personal drive, flexibility, teamwork, problem-solving, influencing and negotiating skills, organizational awareness and sensitivity, engagement delivery, continuous improvement, knowledge sharing and client management. Assist in the identification, assessment and resolution of complex technical issues/problems. Interact with the client frequently around specific work efforts/deliverables. The candidate should be open to domestic and international travel of both short and long duration.

About Us: As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
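Since HDL and HCM Extracts recur throughout the requirements above, one small illustration may help: HDL business-object files are pipe-delimited .dat files built from METADATA and MERGE lines, zipped, and uploaded for import. The sketch below generates a deliberately trimmed Worker.dat; the attribute list is illustrative only, so the real column set should come from the HDL business-object documentation for your release.

```python
# Illustrative sketch of generating an HCM Data Loader (HDL) Worker.dat.
# The METADATA/MERGE line shape follows HDL conventions, but this attribute
# list is trimmed for illustration and is not a complete Worker definition.
import io
import zipfile

metadata = ["SourceSystemOwner", "SourceSystemId", "PersonNumber", "StartDate"]
workers = [
    {"SourceSystemOwner": "LEGACY", "SourceSystemId": "EMP-1001",
     "PersonNumber": "1001", "StartDate": "2024/01/15"},
]

buf = io.StringIO()
buf.write("METADATA|Worker|" + "|".join(metadata) + "\n")
for w in workers:
    buf.write("MERGE|Worker|" + "|".join(w[m] for m in metadata) + "\n")

# HDL files are delivered to HCM as a zip (typically via the Data Exchange
# work area or a WebCenter Content upload).
with zipfile.ZipFile("Worker.zip", "w") as z:
    z.writestr("Worker.dat", buf.getvalue())
```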

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Hyderābād

On-site

As one of the world's leading asset managers, Invesco is dedicated to helping investors worldwide achieve their financial objectives. By delivering the combined power of our distinctive investment management capabilities, we provide a wide range of investment strategies and vehicles to our clients around the world. If you're looking for challenging work, smart colleagues, and a global employer with a social conscience, come explore your potential at Invesco. Make a difference every day!

Job Description

Job Purpose (Job Summary): The primary role of this position will be to join the Invesco technology team implementing a global Enterprise Content Management solution on the Alfresco platform. The vision is to utilize this platform for all of Invesco's content, including but not limited to: legal documents, marketing collateral, text snippets, digital assets, etc. This role will work as a technology team member responsible for building and maintaining the platform. Alfresco is one component of a technical transformation taking place, so it will be integrated with other emerging technologies within Invesco. Invesco technology is creating strategic advantage by being the best. We are highly motivated, not your usual financial services stereotype, and thrive within an industry resistant to change. As a team we respect and trust each other and celebrate our successes AND failures while continually looking for ways to improve.

Key Responsibilities/Duties:
Work closely with clients to understand business requirements and to gain a thorough understanding of the business processes.
Answer team member questions, assist team members with issues, and review team member work for quality.
Learn and adhere to department standards for design, development and quality control.
Complete major deliverables, including technical design documentation and implementation of business requirements.
Prepare for and support user acceptance testing.
Work as part of a project team, reporting progress and escalating issues to project management in a timely manner to ensure successful completion of projects/reviews.
Complete all tasks related to technical analysis, building and unit testing, quality assurance, system test and implementation in accordance with the IT development life cycle.
Provide post-implementation support.
Create and maintain documentation for systems and processes.
Assist with developing improvements to team processes and procedures.
Stay abreast of new developments and functionality of Alfresco and content management industry trends.

Work Experience:
Proficient in using Alfresco for content management
Must have an understanding of the Alfresco Content Services architecture
Must have experience extending Alfresco via ADF, content modeling, web scripts, behaviors, actions and scheduled jobs
Experience with and understanding of the development process in a DevOps environment (including exposure to CI/CD tools such as Jenkins and source code management tools, e.g. Git)
Excellent problem-solving skills
Good understanding of software engineering concepts and practices
Experience in developing and consuming REST or GraphQL APIs
Exposure to and interest in full-stack development in one of the following: Node.js, Python or Java
Debugging skills, with the ability to ask for help
Able to consume libraries, tools, and APIs from other teams to achieve the desired functionality
Experience with an Agile framework is preferred

Knowledge/Abilities:
Excellent verbal and written skills
Enjoy challenging and thought-provoking work and have a strong desire to learn and progress (motivated enough to self-learn)
Must demonstrate a strong customer-focused attitude and understand the fundamentals of customer service
Structured, disciplined approach to work, with attention to detail
Able to work under pressure and multi-task while remaining professional and courteous
Open-minded, flexible and willing to listen to other people's opinions
Ability to work as part of a distributed team in a self-directed way, with strong communication skills
Flexibility, teamwork, empathy, a sense of humor and the courage to challenge the status quo are must-haves in this environment
Adaptable and resourceful, capable of working under pressure to meet aggressive deadlines with limited resources
Able to work independently and with team members (brainstorming, team-building activities, etc.)

Formal Education and Experience Required (minimum requirement to perform job duties): A Bachelor's Degree in an IT-related program is preferred, along with a minimum of 3 years of related experience.

Working Conditions: Normal office environment with little exposure to noise, dust and temperature extremes. The ability to lift, carry or otherwise move objects of up to 10 pounds is also necessary. Normally works a regular schedule of hours, although hours may vary depending upon the project or assignment. Willingness to travel both domestically and internationally; frequency and duration to be determined by manager.

About Invesco: Invesco Ltd. is a leading independent global investment management firm, dedicated to helping investors worldwide achieve their financial objectives. By delivering the combined power of our distinctive investment management capabilities, Invesco provides a wide range of investment strategies and vehicles to our clients around the world. Operating in more than 20 countries, the firm is listed on the New York Stock Exchange under the symbol IVZ.

About Invesco Technology: Do you like working with top technology professionals where everyone has an opportunity to collaborate, share ideas and work on leading-edge technologies? Do you thrive in an environment where you are part of a team implementing innovative technology solutions for clients and employees? Invesco's Technology team is a global organization with 1300+ employees working together to serve the business. We value everyone's ideas and input and provide opportunities to develop skills. If this sounds like a team you want to be a part of, read on to learn more about the opportunity to join us.

Full Time / Part Time: Full time
Worker Type: Employee
Job Exempt (Yes / No): Yes

Workplace Model: At Invesco, our workplace model supports our culture and meets the needs of our clients while providing flexibility our employees value. As a full-time employee, compliance with the workplace policy means working with your direct manager to create a schedule where you will work in your designated office at least three days a week, with two days working outside an Invesco office.
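As a pointer to the kind of Alfresco REST work the role involves, here is a short Python sketch that uploads a document through the Content Services public v1 REST API. The base URL, credentials, folder path, and file name are placeholders; confirm the endpoint shape against the API Explorer for your Alfresco version.

```python
# Sketch: upload a file via the Alfresco Content Services v1 REST API.
# BASE, AUTH, and the relativePath are placeholders for illustration.
import requests

BASE = "https://alfresco.example.com/alfresco/api/-default-/public/alfresco/versions/1"
AUTH = ("admin", "admin")  # assumes basic auth is enabled

with open("contract.pdf", "rb") as f:
    resp = requests.post(
        f"{BASE}/nodes/-root-/children",
        auth=AUTH,
        files={"filedata": ("contract.pdf", f, "application/pdf")},
        data={
            "name": "contract.pdf",
            "nodeType": "cm:content",
            "relativePath": "Sites/legal/documentLibrary/contracts",
        },
    )
resp.raise_for_status()
print(resp.json()["entry"]["id"])  # node id of the newly created document
```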
Why Invesco: At Invesco, we act with integrity and do meaningful work to create impact for our stakeholders. We believe our culture is stronger when we all feel we belong, and we respect each other's identities, lives, health, and well-being. We come together to create better solutions for our clients, our business and each other by building on different voices and perspectives. We nurture and encourage each other to ensure our meaningful growth, both personally and professionally. We believe in a diverse, inclusive, and supportive workplace where everyone feels equally valued, and this starts at the top with our senior leaders having diversity and inclusion goals. Our global focus on diversity and inclusion has grown exponentially, and we encourage connection and community through our many employee-led Business Resource Groups (BRGs).

What's in it for you? As an organization we support personal needs and diverse backgrounds, and provide internal networks as well as opportunities to get involved in the community and in the world. Our benefits policy includes, but is not limited to: Competitive Compensation; Flexible, Hybrid Work; 30 days' Annual Leave + Public Holidays; Life Insurance; Retirement Planning; Group Personal Accident Insurance; Medical Insurance for Employee and Family; Annual Health Check-up; 26 weeks' Maternity Leave; Paternal Leave; Adoption Leave; Near-site Childcare Facility; Employee Assistance Program; Study Support; Employee Stock Purchase Plan; ESG Commitments and Goals; Business Resource Groups; Career Development Programs; Mentoring Programs; Invesco Cares; Dress for your Day.

At Invesco, we offer development opportunities that help you thrive as a lifelong learner in a constantly evolving business environment and ensure your continued growth. Our AI-enabled learning platform delivers curated content based on your role and interests. We also ensure our managers and leaders have many opportunities to advance their skills and competencies, which is pivotal in their continuous pursuit of performance excellence.

To know more about us:
About Invesco: https://www.invesco.com/corporate/en/home.html
About our Culture: https://www.invesco.com/corporate/en/about-us/our-culture.html
About our D&I policy: https://www.invesco.com/corporate/en/our-commitments/diversity-and-inclusion.html
About our CR program: https://www.invesco.com/corporate/en/our-commitments/corporate-responsibility.html
Apply for the role @ Invesco Careers: https://careers.invesco.com/india/

Posted 2 weeks ago

Apply

7.0 years

1 - 10 Lacs

Hyderābād

On-site

SUMMARY
The Database Developer III will play a critical role in engaging with stakeholders and technical team members for requirement gathering, creating data pipelines in ADF and SSIS, mapping, extraction, transformation, visualizations, and analytical data analysis. You will work closely with cross-functional teams, including IT and business stakeholders, to ensure seamless and efficient data flow, report generation, visualizations, and data analysis. You will collaborate with various departments to ensure data accuracy, integrity, and compliance with established data standards. This role reports to the BEST Data Services lead in our Business Enterprise Systems Technology department. A successful Database Developer must take a hands-on approach, ensuring the highest-quality solutions are provided to our business stakeholders, with accurate development, documentation, and adherence to deadlines. This role will also work with key stakeholders across the organization to drive enhancements to a successful implementation and ensure all master data meets requirements and is deployed and implemented properly.

PRIMARY RESPONSIBILITIES
Write complex SQL queries; perform data modeling and performance tuning.
Perform ETL operations using SSIS and other tools.
Make Python API calls and perform cloud ETL operations.
Work with cloud technologies such as Azure and GCP.
Collaborate with other stakeholders to ensure the architecture is aligned with business requirements.
Collaborate with senior leaders to determine business-specific application needs.
Provide technical leadership to the application development team.
Perform design and code reviews and ensure application design standards are maintained.
Compile and implement application development plans for new or existing applications, following SDLC practices.
Demonstrate application prototypes and integrate user feedback.
Write scripts and code for applications, as well as install and update applications.
Mentor junior application developers and provide end-users with technical support.
Perform application integration, maintenance, upgrades, and migration.
Document application development processes, procedures, and standards.
Integrate trends in application architecture into application development projects.
Streamline day-to-day activities to provide a more efficient production environment, lower costs, and deliver a secure, stable, and supportable environment.

REQUIRED KNOWLEDGE/SKILLS/ABILITIES
Bachelor's degree in Computer Science, Engineering, or a related field.
7+ years of experience in SQL Server (expert) and SSIS (expert) development.
Proficient in Databricks.
Proficient in Python API calls.
Basic knowledge of cloud technologies (Azure/GCP).
Strong knowledge of data modeling and data warehousing concepts.
Knowledge of Power BI is good to have.
Strong analytical and problem-solving skills.
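As a concrete illustration of the "Python API calls" plus SQL Server skills listed above, the following hedged sketch pulls JSON from a hypothetical REST endpoint and batch-inserts it into a staging table with pyodbc. The endpoint, connection string, table, and columns are all assumptions for illustration.

```python
# Hedged sketch of an API-to-SQL-Server load. Endpoint, credentials,
# and the staging table are placeholders.
import pyodbc
import requests

rows = requests.get("https://api.example.com/v1/orders", timeout=30).json()

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=sqlserver.example.com;DATABASE=Staging;"
    "UID=etl_user;PWD=secret"  # placeholder credentials
)
cursor = conn.cursor()
cursor.fast_executemany = True  # batch parameter arrays into fewer round trips

cursor.executemany(
    "INSERT INTO dbo.stg_orders (order_id, customer_id, amount) VALUES (?, ?, ?)",
    [(r["order_id"], r["customer_id"], r["amount"]) for r in rows],
)
conn.commit()
conn.close()
```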

Posted 2 weeks ago

Apply

0 years

0 Lacs

India

On-site

Simeio is a global identity and access management service provider focused on protecting organizations' key data and access requirements to business-critical systems and applications. Simeio provides services such as Access Management, IGA, PAM & CIAM, plus wider service offerings including support, upgrades, governance and application onboarding.

The Opportunity: The Senior Consultant is responsible for supporting delivery and support of high-quality Identity and Access Management services while adhering to Simeio's standards and best practices. The ideal candidate would have experience primarily in Access Management, and preferably in CIAM, to serve as a member of multiple client engagement teams that assist clients in employing proper information systems, resources, and controls to maximize efficiencies and minimize risk. The ability to take up challenges, adapt to business needs, and stay focused on delivering results is essential to the success of this role.

The Role: Senior Identity Specialist
Key Skills: Access Management, IBM Security Access Manager, Core Directory Services

Key Accountabilities:
Responsible for all development, customization and configuration activities for the Access Management Program. These configurations will be performed via the Ping suite, which provides customers with a single comprehensive platform for access requests.
In-depth knowledge of IBM Security Access Manager (ISAM) is required.
The candidate will be responsible for working on a variety of supplementary products such as Java, IBM LDAP, WebSphere, and WebSEAL to perform configuration and customization for the Customer Access Management Program.
Working experience setting up Single Sign-On (SSO), WebSEAL/reverse proxy, etc., including work with different types of junctions.
Should have worked on different policy configurations such as ACLs, POPs and authorization rules.
Good working knowledge of SAML 2.0, OAuth and OIDC setup for SSO and API protection, and of IBM Security Directory Server (LDAP).

The candidate will be responsible for the following specific life cycle duties:
Technical Analysis of Business Requirements and Design Interpretation: ensuring understanding of design in response to business requirements, and interpreting and successfully implementing solutions.
Technical Process Management: ensuring all development and implementation activities follow the technical process for compliance purposes.
Release Management: coordinating and building deployment processes to enable management of code/configuration releases.
Configuration and Customization: completing all assigned configuration and customization activities within, and in support of, the Oracle Identity Governance suite.
Quality Assurance Management: triaging and correcting testing defects.
Technical Process Documentation: creating and maintaining development and deployment documentation.
Knowledge of web services, disconnected systems and ADF would be an added advantage.

Must-Have Requisites: Familiarity with the following technologies and concepts is desired:
Directories (Active Directory, LDAP & X.500)
Extensive working experience on IBM ISAM 9 or above
Single Sign-On and Federation (Kerberos, SAML 2.0, OAuth 2.0, etc.)
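To ground the SSO and federation requirements above, here is a generic OAuth 2.0 client-credentials sketch in Python. The token endpoint, client id/secret, scope, and API URL are placeholders; the grant itself is the standard RFC 6749 machine-to-machine flow of the sort commonly fronted by WebSEAL/ISAM in deployments like the one described.

```python
# Generic OAuth 2.0 client-credentials flow; all URLs and credentials
# below are placeholders for illustration.
import requests

TOKEN_URL = "https://idp.example.com/oauth2/token"

resp = requests.post(
    TOKEN_URL,
    data={"grant_type": "client_credentials", "scope": "api.read"},
    auth=("my-client-id", "my-client-secret"),  # client authentication
    timeout=10,
)
resp.raise_for_status()
access_token = resp.json()["access_token"]

# The bearer token then authorizes downstream API calls:
api = requests.get(
    "https://api.example.com/v1/accounts",
    headers={"Authorization": f"Bearer {access_token}"},
    timeout=10,
)
api.raise_for_status()
```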
Good-to-Have Requisites:
Solid written and verbal communication
Capable of working on multiple projects simultaneously
Capable of solving complex problems
Capable of defining strategic and tactical solutions, and knowing when each applies

Educational Qualification: Bachelor's degree in computer science, systems analysis or a related study, or equivalent experience (Master's preferred).

About Simeio: Simeio has over 650 talented employees across the globe, with offices in the USA (Atlanta HQ and Texas), India, Canada, Costa Rica and the UK. Founded in 2007, and now backed by private equity company ZMC, Simeio is recognized as a top IAM provider by industry analysts. Alongside Simeio's identity orchestration tool 'Simeio IO', Simeio also partners with industry-leading IAM software vendors to provide access management, identity governance and administration, privileged access management and risk intelligence services across on-premise, cloud, and hybrid technology environments. Simeio provides services to numerous Fortune 1000 companies across all industries, including financial services, technology, healthcare, media, retail, public sector, utilities and education.

About Your Application: We review every application received and will get in touch if your skills and experience match what we're looking for. If you don't hear back from us within 10 days, please don't be too disappointed; we may keep your CV on our database for any future vacancies, and we would encourage you to keep an eye on our career opportunities as there may be other suitable roles.

Simeio is an equal opportunity employer. If you require assistance with completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employee selection process, please direct your inquiries to the recruitment team at recruitment@simeio.com or +1 404-882-3700.

Posted 2 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies