
77 Oracle ADF Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 11.0 years

5 - 9 Lacs

Bengaluru

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Oracle Cloud Visual Builder
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to understand their needs, developing innovative solutions, and ensuring that applications are aligned with business objectives. You will engage in problem-solving activities, participate in team meetings, and contribute to the overall success of projects by leveraging your expertise in application development.

Roles & Responsibilities:
- Expected to be an SME.
- Experience building different types of applications in VBCS using Business Objects and ORDS.
- Knowledge and experience of integration with other Oracle PaaS services.
- Experience integrating VBCS applications with Oracle SaaS applications.
- Good hands-on knowledge of JavaScript, CSS3, XML/JSON/WSDL, consuming web services (SOAP/REST), and testing tools (Postman/SoapUI/JMeter).
- Work experience developing SaaS extensions using VBCS.
- Hands-on knowledge of web service standards such as WSDL, XML, SOAP, REST, and JSON.
- Knowledge of Oracle Database and PL/SQL; good hands-on experience writing SQL queries.
- Conduct design reviews to provide guidance and quality assurance around best practices and frameworks.
- Excellent communication and interpersonal skills; good analytical and debugging skills.

Professional & Technical Skills:
- Must have: proficiency in Oracle Cloud Visual Builder.
- Overall 6+ years of experience in web application development (Oracle ADF).
- 2 to 4 years of experience in Oracle VBCS.
- Experience with GitHub, Oracle Developer Cloud, and UCD tools for build and deployment.
- Analyze requirements, determine the technical level of effort, and prepare technical designs and specifications.
- Conversant in deploying, troubleshooting, analyzing, and resolving technical problems.
- Good to have: experience with application integration techniques, a strong understanding of user interface design principles, familiarity with agile development methodologies, and experience troubleshooting and debugging applications.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Oracle Cloud Visual Builder.
- 15 years of full-time education is required.

Posted 7 hours ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Pune

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Oracle Cloud Visual Builder
Good-to-have skills: NA
Minimum experience: 3 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will engage in the design, construction, and configuration of applications tailored to specific business processes and application requirements. Your typical day will involve collaborating with team members to understand project needs, developing innovative solutions, and ensuring that applications are optimized for performance and usability. You will also participate in testing and troubleshooting to ensure that the applications function as intended, contributing to the overall success of the projects you are involved in.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Active participation and contribution in team discussions.
- Contribute to providing solutions to work-related problems.
- Analyze requirements, determine the technical level of effort, and prepare technical designs and specifications.
- Conversant in deploying, troubleshooting, analyzing, and resolving technical problems.
- Hands-on experience writing SQL queries.
- Conduct design reviews to provide guidance and quality assurance around best practices and frameworks.

Professional & Technical Skills:
- Overall 4+ years of experience in web application development (Oracle ADF).
- 2 to 3 years of experience in Oracle VBCS (Visual Builder Cloud Service).
- Good hands-on knowledge of JavaScript, CSS3, XML/JSON/WSDL, consuming web services (SOAP/REST), and testing tools (Postman/SoapUI/JMeter).
- Experience building different types of applications in VBCS using Business Objects and ORDS.
- Knowledge and experience of integration with other Oracle PaaS services.
- Experience integrating VBCS applications with Oracle SaaS applications.
- Work experience developing SaaS extensions using VBCS.
- Experience with web service standards such as WSDL, XML, SOAP, REST, and JSON.
- Knowledge of Oracle Database and PL/SQL.
- Experience with GitHub, Oracle Developer Cloud, and UCD tools for build and deployment.
- Good communication and interpersonal skills; good analytical and debugging skills.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Oracle Cloud Visual Builder.
- 15 years of full-time education is required.

Posted 7 hours ago

Apply

4.0 - 9.0 years

5 - 9 Lacs

Coimbatore

Work from Office


Project Role: Application Developer
Project Role Description: Design, build and configure applications to meet business process and application requirements.
Must-have skills: Oracle Cloud Visual Builder
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with various teams to gather requirements, developing application features, and ensuring that the applications align with business objectives. You will also engage in problem-solving activities, providing innovative solutions to enhance application performance and user experience, while maintaining a focus on quality and efficiency throughout the development process.

Roles & Responsibilities:
- Expected to be an SME.
- Analyze requirements, determine the technical level of effort, and prepare technical designs and specifications.
- Conversant in deploying, troubleshooting, analyzing, and resolving technical problems.
- Conduct design reviews to provide guidance and quality assurance around best practices and frameworks.

Professional & Technical Skills:
- Must have: proficiency in Oracle Cloud Visual Builder.
- Overall 4+ years of experience in web application development (Oracle ADF).
- 2 to 3 years of experience in Oracle VBCS (Visual Builder Cloud Service).
- Knowledge of Oracle Database and PL/SQL, and experience of integration with other Oracle PaaS services.
- Good hands-on knowledge of JavaScript, CSS3, XML/JSON/WSDL, consuming web services (SOAP/REST), and testing tools (Postman/SoapUI/JMeter).
- Experience with GitHub, Oracle Developer Cloud, and UCD tools for build and deployment.
- Work experience developing SaaS extensions using VBCS.
- Experience with web service standards such as WSDL, XML, SOAP, REST, and JSON.
- Hands-on experience writing SQL queries.
- Experience integrating VBCS applications with Oracle SaaS applications.
- Experience building different types of applications in VBCS using Business Objects and ORDS.
- Good communication and interpersonal skills; good analytical and debugging skills.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Oracle Cloud Visual Builder.
- 15 years of full-time education is required.

Posted 7 hours ago

Apply

7.0 - 12.0 years

30 - 45 Lacs

Noida

Work from Office


Job Description:
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- 5+ years of experience as a Full Stack Developer.
- Strong proficiency in ReactJS, Next.js, C#, .NET, and Azure cloud services.
- Understanding of AngularJS and a willingness to learn new technologies.
- Experience with Azure Functions, Azure CosmosDB, AKS, ADF, Logic Apps, Azure Event Hubs, APIM, and Front Door.
- Excellent problem-solving, analytical, and debugging skills.
- Strong communication and interpersonal skills.
- Ability to work independently and as part of a team.

Roles & Responsibilities:
- Development: Contribute to the development of robust and scalable applications using our primary tech stack, including ReactJS, Next.js, C#, .NET, Azure Functions, and Azure CosmosDB.
- Cross-Platform Compatibility: Demonstrate a solid understanding of AngularJS and be open to learning newer technologies as needed.
- Cloud Expertise: Possess a deep understanding of Azure cloud services, including Azure Functions, Azure CosmosDB, AKS, ADF, Logic Apps, Azure Event Hubs, APIM, and Front Door.
- Problem-Solving: Identify and resolve technical challenges effectively, leveraging your problem-solving skills and expertise.
- Collaboration: Work collaboratively with cross-functional teams to deliver high-quality solutions that meet business objectives.

Posted 1 day ago

Apply

5.0 - 10.0 years

9 - 19 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Key Responsibilities:
- Work on client projects to deliver AWS, PySpark, and Databricks based data engineering and analytics solutions.
- Build and operate very large data warehouses or data lakes.
- Optimize, design, code, and tune big data ETL processes using Apache Spark.
- Build data pipelines and applications to stream and process datasets at low latencies.
- Handle data efficiently: track data lineage, ensure data quality, and improve the discoverability of data.

Technical Experience:
- Minimum of 5 years of experience in Databricks engineering solutions on the AWS Cloud platform using PySpark, Databricks SQL, and data pipelines built on Delta Lake.
- Minimum of 5 years of experience in ETL, Big Data/Hadoop, and data warehouse architecture and delivery.

Email: maya@mounttalent.com
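
For illustration only, here is a minimal PySpark batch transform of the kind such Databricks-on-AWS pipelines typically run, landing cleaned data in Delta Lake. It assumes a Databricks cluster where the `delta` format is available; the S3 paths, column names, and table layout are hypothetical, not taken from the posting.

```python
# Minimal sketch: clean raw JSON landed on S3 and write a curated Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_etl").getOrCreate()

raw = spark.read.format("json").load("s3://example-landing/orders/")  # hypothetical source

cleaned = (
    raw.dropDuplicates(["order_id"])                       # basic data-quality step
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount") > 0)
)

(
    cleaned.write.format("delta")
           .mode("overwrite")
           .partitionBy("order_date")
           .save("s3://example-curated/orders_delta/")     # hypothetical Delta target
)
```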

Posted 4 days ago

Apply

4.0 - 9.0 years

20 - 25 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Interpret requirements and design solutions that satisfy the customer's needs as well as the need for standardization, usability, and maintainability across the different agent-facing applications. Break the solution down into components that can be assigned and tracked in pursuit of a fast and cost-effective delivery. Implement ADF Business Components, Web Services, and Oracle object calls that provide the data access layer for the agent application processes. Implement the ADF View Controller components, including task flows, beans, and JSF pages, that enable successful interaction between the agents and the applications. Supervise tasks performed by other members of the solution team to ensure a cohesive approach that satisfies both external (customer) and internal (technology) requirements.

Posted 4 days ago

Apply

5.0 - 10.0 years

5 - 15 Lacs

Hyderabad

Hybrid


Role & Responsibilities: Interpret requirements and design solutions that satisfy the customer's needs as well as the need for standardization, usability, and maintainability across the different agent-facing applications. Break the solution down into components that can be assigned and tracked in pursuit of a fast and cost-effective delivery. Implement ADF Business Components, Web Services, and Oracle object calls that provide the data access layer for the agent application processes. Implement the ADF View Controller components, including task flows, beans, and JSF pages, that enable successful interaction between the agents and the applications. Supervise tasks performed by other members of the solution team to ensure a cohesive approach that satisfies both external (customer) and internal (technology) requirements. Preferred candidate profile: any graduation or post-graduation.

Posted 4 days ago

Apply

3.0 - 5.0 years

8 - 15 Lacs

Hyderabad

Work from Office


Understand the requirements and develop ADF pipelines. Good knowledge of Databricks. Strong understanding of the existing ADF pipelines and their enhancements. Deployment and monitoring of ADF jobs. Good understanding of SQL concepts and strong SQL query-writing skills. Understanding and writing stored procedures. Performance tuning.

Roles and Responsibilities:
- Understand business and data integration requirements.
- Design, develop, and implement scalable and reusable ADF pipelines for ETL/ELT processes.
- Leverage Databricks for advanced data transformations within ADF pipelines.
- Collaborate with data engineers to integrate ADF with Azure Databricks notebooks for big data processing.
- Analyze and understand existing ADF workflows; implement improvements, optimize data flows, and incorporate new features based on evolving requirements.
- Manage deployment of ADF solutions across development, staging, and production environments.
- Set up monitoring, logging, and alerts to ensure smooth pipeline executions and troubleshoot failures.
- Write efficient and complex SQL queries to support data analysis and ETL tasks; tune SQL queries for performance, especially in large-volume data scenarios.
- Design, develop, and maintain stored procedures for data transformation and business logic; ensure procedures are optimized and modular for reusability and performance.
- Identify performance bottlenecks in queries and data processing routines; apply indexing strategies, query refactoring, and execution plan analysis to enhance performance.
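
As a rough illustration of the ADF-plus-Databricks pattern described above, the sketch below shows a notebook step an ADF pipeline might trigger: it copies a table from Azure SQL over JDBC into a Delta table. It assumes a Databricks notebook (where `spark` and `dbutils` are predefined); the server, database, table, and secret-scope names are assumptions, not details from the listing.

```python
# Illustrative notebook step invoked by an ADF pipeline (hypothetical names).
jdbc_url = "jdbc:sqlserver://example-srv.database.windows.net:1433;database=sales_db"

source = (
    spark.read.format("jdbc")
         .option("url", jdbc_url)
         .option("dbtable", "dbo.daily_sales")                    # hypothetical source table
         .option("user", dbutils.secrets.get("kv-scope", "sql-user"))
         .option("password", dbutils.secrets.get("kv-scope", "sql-pass"))
         .option("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")
         .load()
)

(
    source.write.format("delta")
          .mode("append")
          .saveAsTable("curated.daily_sales")                     # hypothetical target table
)
```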

Posted 4 days ago

Apply

2.0 - 3.0 years

7 - 11 Lacs

Noida

Work from Office


Work hours: 5 days a week; Saturday and Sunday fixed off. Please see the JD below:
- Knowledge of Oracle WebCenter Portal.
- Good knowledge of exposing ADF task flows in WebCenter.
- Experience in Oracle ADF / JSF.
- Strong working knowledge of ADF UI, task flows, ADF integration with web services, and ADF BC4J components.
- Strong understanding of the ADF/JSF page lifecycle.
- Good knowledge of CSS and stylesheets, and of designing UI templates for ADF/WebCenter applications.
- Proficient in Java and in building and invoking REST-based web services.
- Good understanding of RDBMS concepts; proficient in SQL and able to work with PL/SQL.
- Good understanding of web services and web services security.

Posted 5 days ago

Apply

10.0 - 11.0 years

8 - 12 Lacs

Noida

Work from Office


What We're Looking For:
- Solid understanding of data pipeline architecture, cloud infrastructure, and best practices in data engineering.
- Strong grip on SQL Server, Oracle, Azure SQL, and working with APIs.
- Skilled in data analysis: identify discrepancies and recommend fixes.
- Proficient in at least one programming language: Python, Java, or C#.
- Hands-on experience with Azure Data Factory (ADF), Logic Apps, and Runbooks.
- Knowledge of PowerShell scripting and the Azure environment.
- Excellent problem-solving, analytical, and communication skills.
- Able to collaborate effectively and manage evolving project priorities.

Roles and Responsibilities: Senior Data Engineer - Azure & Databricks
- Development and maintenance of data pipelines; modernisation of the cloud data platform.
- At least 8 years of experience in the data engineering space.
- At least 4 years of experience in Apache Spark / Databricks.
- At least 4 years of experience in Python and at least 7 years in SQL and the ETL stack.
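
For context on what "hands-on experience with Azure Data Factory" often looks like in code, here is a hedged sketch that triggers an ADF pipeline run and polls its status using the azure-mgmt-datafactory Python SDK. The subscription, resource group, factory, and pipeline names are placeholders.

```python
# Hedged sketch: trigger an ADF pipeline run and wait for completion.
import time

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

run = adf.pipelines.create_run(
    resource_group_name="example-rg",
    factory_name="example-adf",
    pipeline_name="pl_daily_load",             # hypothetical pipeline name
    parameters={"run_date": "2024-01-01"},
)

status = adf.pipeline_runs.get("example-rg", "example-adf", run.run_id).status
while status in ("Queued", "InProgress"):
    time.sleep(30)
    status = adf.pipeline_runs.get("example-rg", "example-adf", run.run_id).status
print(f"Pipeline finished with status: {status}")
```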

Posted 5 days ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Pune

Work from Office


Job Title: Data Engineer - Data Solutions Delivery + Data Catalog & Quality Engineer

About Advanced Energy: Advanced Energy Industries, Inc. (NASDAQ: AEIS) enables design breakthroughs and drives growth for leading semiconductor and industrial customers. Our precision power and control technologies, along with our applications know-how, inspire close partnerships and innovation in thin-film and industrial manufacturing. We are proud of our rich heritage and award-winning technologies, and we value the talents and contributions of all Advanced Energy employees worldwide.

Department: Data and Analytics
Team: Data Solutions Delivery Team

Job Summary: We are seeking a highly skilled Data Engineer to join our Data and Analytics team. As a member of the Data Solutions Delivery team, you will be responsible for designing, building, and maintaining scalable data solutions. The ideal candidate has extensive knowledge of Databricks, Azure Data Factory, and Google Cloud, along with strong data warehousing skills from data ingestion to reporting. Familiarity with the manufacturing and supply chain domains is highly desirable. The candidate should also be well versed in data engineering, data product, and data platform concepts, data mesh, and medallion architecture, and in establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview, with proven experience implementing data quality practices using tools like Great Expectations or Deequ.

Key Responsibilities:
- Design, build, and maintain scalable data solutions using Databricks, ADF, and Google Cloud.
- Develop and implement data warehousing solutions, including ETL processes, data modeling, and reporting.
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
- Ensure data integrity, quality, and security across all data platforms.
- Provide expertise in data engineering, data product, and data platform concepts.
- Implement data mesh principles and medallion architecture to build scalable data platforms.
- Establish and maintain enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
- Implement data quality practices using tools like Great Expectations, Deequ, etc.
- Work closely with the manufacturing and supply chain teams to understand domain-specific data requirements.
- Develop and maintain documentation for data solutions, data flows, and data models.
- Act as an individual contributor, picking up tasks from technical solution documents and delivering high-quality results.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer or in a similar role.
- In-depth knowledge of Databricks, Azure Data Factory, and Google Cloud.
- Strong data warehousing skills, including ETL processes, data modeling, and reporting.
- Familiarity with the manufacturing and supply chain domains.
- Proficiency in data engineering, data product, and data platform concepts, data mesh, and medallion architecture.
- Experience establishing enterprise data catalogs using tools like Ataccama, Collibra, or Microsoft Purview.
- Proven experience implementing data quality practices using tools like Great Expectations, Deequ, etc.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.

Preferred Qualifications:
- Master's degree in a related field.
- Experience with cloud-based data platforms and tools.
- Certification in Databricks, Azure, or Google Cloud.

As part of our total rewards philosophy, we believe in offering and maintaining competitive compensation and benefits programs for our employees to attract and retain a talented, highly engaged workforce. Our compensation programs are focused on equitable, fair pay practices, including market-based base pay and an annual pay-for-performance incentive plan, and we offer a strong benefits package in each of the countries in which we operate. Advanced Energy is committed to diversity in its workforce, including Equal Employment Opportunity for Minorities, Females, Protected Veterans, and Individuals with Disabilities. We are committed to protecting and respecting your privacy. We take your privacy seriously and will only use your personal information to administer your application in accordance with RA No. 10173, also known as the Data Privacy Act of 2012.

Posted 5 days ago

Apply

7.0 - 12.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Conduct technical analyses of existing data pipelines, ETL processes, and on-premises/cloud systems; identify technical bottlenecks, evaluate migration complexities, and propose optimizations.

Desired Skills and Experience:
- B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field.
- 7+ years of experience in data and cloud architecture, working with client stakeholders.
- Strong experience in Synapse Analytics, Databricks, ADF, Azure SQL (DW/DB), and SSIS.
- Strong experience in advanced PowerShell, batch scripting, and C# (.NET 3.0).
- Expertise in orchestration systems with ActiveBatch and Azure orchestration tools.
- Strong understanding of data warehousing, data lakes, and Lakehouse concepts.
- Excellent communication skills, both written and verbal.
- Extremely strong organizational and analytical skills with strong attention to detail.
- Strong track record of excellent results delivered to internal and external clients.
- Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts.
- Experience delivering projects within an agile environment.
- Experience in project management and team management.

Key responsibilities include:
- Understand and review PowerShell (PS), SSIS, batch scripts, and C# (.NET 3.0) codebases for data processes.
- Assess the complexity of trigger migration across ActiveBatch (AB), Synapse, ADF, and Azure Databricks (ADB).
- Define usage of Azure SQL DW, SQL DB, and Data Lake (DL) for various workloads, proposing transitions where beneficial.
- Analyze data patterns for optimization, including direct raw-to-consumption loading and zone elimination (e.g., stg/app zones).
- Understand requirements for external tables (Lakehouse).
- Lead project deliverables, ensuring actionable and strategic outputs, and ensure the quality of deliverables within project timelines.
- Develop a strong understanding of the equity market domain; collaborate with domain experts and business stakeholders to understand business rules and logic.
- Ensure effective, efficient, and continuous communication (written and verbal) with global stakeholders.
- Independently troubleshoot difficult and complex issues in dev, test, UAT, and production environments.
- Take responsibility for end-to-end delivery of projects, coordinate between the client and internal offshore teams, and manage client queries.
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, develop good internal working relationships, and maintain a flexible work ethic.
- Take responsibility for quality checks and adhere to the agreed Service Level Agreement (SLA) / Turnaround Time (TAT).

Posted 5 days ago

Apply

4.0 - 6.0 years

8 - 12 Lacs

Gurugram, Delhi / NCR

Work from Office


4-6 years of experience in Oracle ADF development. Should have hands-on experience in Oracle ADF, Spring Boot, and Oracle DB. Experience with Oracle Database, including SQL, PL/SQL, and database optimization techniques. Oracle ADF components such as JSF, EJBs, ADF BC, ADF Faces, and Task Flows.

Required candidate profile: Familiarity with Java, J2EE, RESTful services, and modern software development practices. Design, develop, and maintain web-based applications using Oracle ADF. Build robust, scalable, and secure ADF applications.

Posted 6 days ago

Apply

4.0 - 8.0 years

8 - 12 Lacs

Chennai

Work from Office


About ValGenesis: ValGenesis is a leading digital validation platform provider for life sciences companies. The ValGenesis suite of products is used by 30 of the top 50 global pharmaceutical and biotech companies to achieve digital transformation, total compliance, and manufacturing excellence/intelligence across their product lifecycle. Learn more about working for ValGenesis, the de facto standard for paperless validation in Life Sciences: https://www.youtube.com/watch?v=tASq7Ld0JsQ

About the Role: We are looking for an experienced database developer to join our engineering team and build enterprise applications for our global customers. If you are a technology enthusiast with a passion for developing enterprise cloud products with quality, security, and performance, we are eager to discuss the potential role with you.

Responsibilities:
- Database Development: Use expertise in MS SQL Server and PostgreSQL to design and develop efficient and scalable database solutions. Collaborate with development stakeholders to understand and implement database requirements. Write and optimize complex SQL queries, stored procedures, and functions. Perform database tuning and server configuration. Knowledge of both cloud and on-premises databases, and of SaaS-based application development.
- ETL Integration: Leverage experience with ETL tools such as ADF and SSIS to facilitate seamless data migration. Design and implement data extraction, transformation, and loading processes. Ensure data integrity during the ETL process and troubleshoot any issues that arise.
- Reporting: Develop and maintain SSRS reports based on customer needs. Collaborate with stakeholders to understand reporting requirements and implement effective solutions.
- Performance Tuning: Analyze database performance using Dynatrace, New Relic, or similar tools. Analyze query performance and implement tuning strategies to optimize database performance. Conduct impact analysis and resolve production issues within specified SLAs.
- Version Control and Collaboration: Use Git and SVN for version control of database scripts and configurations. Collaborate with cross-functional teams using tools such as Jira for story mapping, tracking, and issue resolution.
- Documentation: Document database architecture, processes, and configurations. Provide detailed RCA (Root Cause Analysis) for any database-related issues.

Requirements:
- 6-9 years of hands-on experience in software development.
- Extensive experience in stored procedure development and performance fine-tuning.
- Proficient in SQL, MS SQL Server, SSRS, and SSIS.
- Working knowledge of C# ASP.NET web application development.
- Ability to grasp new concepts and facilitate continuous learning.
- Strong sense of responsibility and accountability.

We're on a Mission: In 2005, we disrupted the life sciences industry by introducing the world's first digital validation lifecycle management system. ValGenesis VLMS® revolutionized compliance-based corporate validation activities and has remained the industry standard. Today, we continue to push the boundaries of innovation, enhancing and expanding our portfolio beyond validation with an end-to-end digital transformation platform. We combine our purpose-built systems with world-class consulting services to help every facet of GxP meet evolving regulations and quality expectations.

The Team You'll Join: Our customers' success is our success. We keep the customer experience centered in our decisions, from product to marketing to sales to services to support. Life sciences companies exist to improve humanity's quality of life, and we honor that mission. We work together: we communicate openly, support each other without reservation, and never hesitate to wear multiple hats to get the job done. We think big: innovation is the heart of ValGenesis, and that spirit drives product development as well as personal growth. We're in it to win it: we're on a path to becoming the number one intelligent validation platform in the market, and we won't settle for anything less than being a market leader.

How We Work: Our Chennai, Hyderabad, and Bangalore offices are onsite, 5 days per week. We believe that in-person interaction and collaboration fosters creativity and a sense of community, and is critical to our future success as a company. ValGenesis is an equal-opportunity employer that makes employment decisions on the basis of merit. Our goal is to have the best-qualified people in every job. All qualified applicants will receive consideration for employment without regard to race, religion, sex, sexual orientation, gender identity, national origin, disability, or any other characteristics protected by local law.
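
Since the role centers on stored procedures and SQL Server performance work, here is a minimal, hedged sketch of invoking a stored procedure from Python with pyodbc. The server, database, credentials, and procedure name are hypothetical, not taken from the posting.

```python
# Minimal sketch: call a (hypothetical) SQL Server stored procedure via pyodbc.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-sql.database.windows.net;"
    "DATABASE=reporting;"
    "UID=app_user;PWD=example-password;"
    "Encrypt=yes;"
)

with pyodbc.connect(conn_str) as conn:
    cursor = conn.cursor()
    # Parameterized call to the hypothetical procedure, then fetch the result set.
    cursor.execute("EXEC dbo.usp_monthly_kpis @month = ?", "2024-01")
    for row in cursor.fetchall():
        print(row)
```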

Posted 1 week ago

Apply

8.0 - 13.0 years

11 - 15 Lacs

Mumbai, Vikhroli

Work from Office


8+ years of experience in the Oracle EPM suite, delivering solutions on Planning and Budgeting (ePBCS) and PCMCS applications. Experience in financial and workforce planning. Worked on end-to-end implementations, from requirement gathering to user training. Experience in support and performance improvement. Knowledge of PBCS rules. Oracle certification is an added advantage.

Qualifications: Oracle EPM consultants with 6+ years of experience in the Oracle EPM suite.

Job Location: Mumbai (Vikhroli).

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office


As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: We are seeking a skilled Azure Data Engineer with 5+ years of experience, including 3+ years of hands-on experience with ADF/Databricks. The ideal candidate has Databricks, Data Lake, and Python programming skills, as well as experience deploying to Databricks and familiarity with Azure Data Factory.

Preferred technical and professional experience: Good communication skills. 3+ years of experience with ADF/DB/Data Lake. Ability to communicate results to technical and non-technical audiences.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Mumbai

Work from Office


As a Data Engineer at IBM, you will harness the power of data to unveil captivating stories and intricate patterns. You'll contribute to data gathering, storage, and both batch and real-time processing. Collaborating closely with diverse teams, you'll play an important role in deciding the most suitable data management systems and identifying the crucial data required for insightful analysis. As a Data Engineer, you'll tackle obstacles related to database integration and untangle complex, unstructured data sets.

In this role, your responsibilities may include:
- Implementing and validating predictive models, and creating and maintaining statistical models with a focus on big data, incorporating a variety of statistical and machine learning techniques.
- Designing and implementing enterprise search applications such as Elasticsearch and Splunk for client requirements.
- Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviours.
- Building teams or writing programs to cleanse and integrate data in an efficient and reusable manner, developing predictive or prescriptive models, and evaluating modeling results.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: We are seeking a skilled Azure Data Engineer with 5+ years of experience, including 3+ years of hands-on experience with ADF/Databricks. The ideal candidate has Databricks, Data Lake, and Python programming skills, as well as experience deploying to Databricks and familiarity with Azure Data Factory.

Preferred technical and professional experience: Good communication skills. 3+ years of experience with ADF/DB/Data Lake. Ability to communicate results to technical and non-technical audiences.

Posted 1 week ago

Apply

5.0 years

8 - 12 Lacs

Hyderabad

Work from Office


When our values align, there's no limit to what we can achieve. At Parexel, we all share the same goal: to improve the world's health. From clinical trials to regulatory, consulting, and market access, every clinical development solution we provide is underpinned by something special: a deep conviction in what we do. Each of us, no matter what we do at Parexel, contributes to the development of a therapy that ultimately will benefit a patient. We take our work personally, we do it with empathy, and we're committed to making a difference.

Key Accountabilities:
- Using Microsoft Azure data PaaS services, design, build, modify, and support data pipelines leveraging Databricks and Power BI in a medallion architecture setting.
- If necessary, create prototypes to validate proposed ideas and solicit input from stakeholders.
- Excellent grasp of and expertise with test-driven development and continuous integration processes.
- Analysis and design: convert high-level designs to low-level designs and implement them.
- Collaborate with team leads to define and clarify business requirements, estimate development costs, and finalize work plans.
- Create and run unit and integration tests on all created code throughout the development lifecycle.
- Benchmark application code proactively to prevent performance and scalability concerns.
- Collaborate with the Quality Assurance team on issue reporting, resolution, and change management.
- Support and troubleshooting: assist the Operations team with any environmental issues that arise during application deployment in the Development, QA, Staging, and Production environments, and assist other teams in resolving issues that may develop as a result of applications or the integration of multiple components.

Knowledge and Experience:
- Understanding of design concepts and architectural basics.
- Knowledge of performance engineering.
- Understanding of quality processes and estimation methods.
- Fundamental grasp of the project domain.
- Ability to translate functional and non-functional needs into system requirements.
- Ability to develop and code complex applications.
- Ability to create test cases and scenarios based on specifications.
- Solid knowledge of the SDLC and agile techniques.
- Awareness of current technology and trends.
- Logical thinking and problem-solving abilities, and the capacity to collaborate.
- Primary skills: cloud platform (Azure), Databricks, ADF, ADO. Sought: SQL, Python, Power BI. General knowledge: PowerApps, Java.
- 3-5 years of experience in software development, with a minimum of 2 years of cloud computing.

Education: Bachelor of Science in Computer Science, Engineering, or a related technical field.
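
The "medallion architecture setting" mentioned above refers to layering data as bronze (raw), silver (cleaned), and gold (business-ready) tables. Below is a minimal bronze-to-silver sketch in PySpark, assuming a Databricks workspace; the storage account, container, schema, and column names are placeholders, not details from the posting.

```python
# Minimal bronze -> silver medallion sketch (hypothetical paths and columns).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Bronze: land the raw files as-is, stamping an ingestion timestamp for lineage.
bronze = (
    spark.read.format("csv").option("header", True)
         .load("abfss://raw@examplelake.dfs.core.windows.net/site_visits/")
         .withColumn("_ingested_at", F.current_timestamp())
)
bronze.write.format("delta").mode("append").saveAsTable("bronze.site_visits")

# Silver: enforce types, deduplicate, and apply basic quality rules.
silver = (
    spark.table("bronze.site_visits")
         .withColumn("visit_date", F.to_date("visit_date"))
         .dropDuplicates(["visit_id"])
         .filter(F.col("site_id").isNotNull())
)
silver.write.format("delta").mode("overwrite").saveAsTable("silver.site_visits")
```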

Posted 1 week ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Hyderabad

Work from Office


Overview: The Data Science team develops Machine Learning (ML) and Artificial Intelligence (AI) projects. The specific scope of this role is to develop ML solutions in support of ML/AI projects using big analytics toolsets in a CI/CD environment. Analytics toolsets may include DS tools, Spark, Databricks, and other technologies offered by Microsoft Azure or open-source toolsets. This role will also help automate the end-to-end cycle with Azure Pipelines. You will be part of a collaborative, interdisciplinary team around data, where you will be responsible for our continuous delivery of statistical/ML models. You will work closely with process owners, product owners, and final business users, which will give you the right visibility into, and understanding of, the criticality of your developments.

Responsibilities:
- Deliver key Advanced Analytics/Data Science projects within time and budget, particularly around DevOps/MLOps and Machine Learning models in scope.
- Be an active contributor to code and development in projects and services.
- Partner with data engineers to ensure data access for discovery and proper data preparation for model consumption; partner with ML engineers working on industrialization.
- Communicate with business stakeholders in the process of service design, training, and knowledge transfer.
- Support large-scale experimentation and build data-driven models; refine requirements into modelling problems.
- Influence product teams through data-based recommendations; research state-of-the-art methodologies.
- Create documentation for learnings and knowledge transfer; create reusable packages or libraries.
- Ensure on-time, on-budget delivery that satisfies project requirements while adhering to enterprise architecture standards.
- Leverage big data technologies to help process data and build scaled data pipelines (batch to real time).
- Implement the end-to-end ML lifecycle with Azure Databricks and Azure Pipelines, and automate ML model deployments.

Qualifications:
- BE/B.Tech in Computer Science, Maths, or technical fields.
- Overall 2-4 years of experience working as a Data Scientist.
- 2+ years of experience building solutions in the commercial or supply chain space.
- 2+ years working in a team to deliver production-level analytic solutions.
- Fluent in Git (version control); understanding of Jenkins and Docker is a plus.
- Fluent in SQL syntax.
- 2+ years of experience in statistical/ML techniques to solve supervised (regression, classification) and unsupervised problems.
- 2+ years of experience in developing business-problem-related statistical/ML modeling with industry tools, with a primary focus on Python or PySpark development.
- Data science: hands-on experience and strong knowledge of building supervised and unsupervised machine learning models; knowledge of time series/demand forecast models is a plus.
- Programming skills: hands-on experience in statistical programming languages like Python and PySpark, and database query languages like SQL.
- Statistics: good applied statistical skills, including knowledge of statistical tests, distributions, regression, and maximum likelihood estimators.
- Cloud (Azure): experience in Databricks and ADF is desirable; familiarity with Spark, Hive, and Pig is an added advantage.
- Business storytelling and communicating data insights in a business-consumable format; fluent in one visualization tool.
- Strong communication and organizational skills, with the ability to deal with ambiguity while juggling multiple priorities.
- Experience with Agile methodology for teamwork and analytics product creation.
- Nice to have: experience in Reinforcement Learning, simulation and optimization problems, Bayesian methods, causal inference, NLP, Responsible AI, or distributed machine learning.
- Experience in DevOps, with hands-on experience with one or more cloud service providers (AWS, GCP, Azure preferred); model deployment experience is a plus.
- Experience with version control systems like GitHub and CI/CD tools; experience in exploratory data analysis.
- Knowledge of MLOps/DevOps and deploying ML models is preferred; experience using MLflow, Kubeflow, etc. is preferred; experience executing and contributing to MLOps automation infrastructure is good to have.
- Exceptional analytical and problem-solving skills; stakeholder engagement (BU, vendors).
- Experience building statistical models in the retail or supply chain space is a plus.
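
Because the role stresses the end-to-end ML lifecycle with Azure Databricks, Azure Pipelines, and MLflow, here is a short, hedged MLflow tracking example. The experiment path and the synthetic regression task are placeholders used purely for illustration.

```python
# Minimal MLflow tracking sketch (hypothetical experiment path and toy data).
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error
from sklearn.model_selection import train_test_split

X, y = make_regression(n_samples=500, n_features=8, noise=0.2, random_state=7)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=7)

mlflow.set_experiment("/Shared/demand_forecast_demo")   # hypothetical experiment path

with mlflow.start_run():
    model = RandomForestRegressor(n_estimators=200, random_state=7)
    model.fit(X_train, y_train)
    mae = mean_absolute_error(y_test, model.predict(X_test))
    mlflow.log_param("n_estimators", 200)
    mlflow.log_metric("mae", mae)
    # The logged model can later be registered and deployed by CI/CD (e.g. Azure Pipelines).
    mlflow.sklearn.log_model(model, "model")
```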

Posted 1 week ago

Apply

15.0 - 20.0 years

17 - 22 Lacs

Pune

Work from Office


Oracle EPM Architect

Experience: 15+ years (including a minimum of 5 years in Oracle EPM implementation/architecture).

Seeking a highly skilled Oracle EPM Architect to lead the design, implementation, and management of Oracle Enterprise Performance Management (EPM) solutions. This role requires deep expertise in the Oracle EPM Cloud and/or Hyperion stack, including strategic planning, solution design, and technical leadership across financial consolidation, planning, budgeting, and forecasting.

Responsibilities:
- Lead the architecture and design of Oracle EPM Cloud solutions, including modules such as Planning and Budgeting Cloud Service (PBCS/EPBCS), Financial Consolidation and Close (FCCS), Enterprise Data Management (EDM), and Profitability and Cost Management (PCM).
- Define and enforce best practices, integration standards, and governance models for EPM solutions.
- Engage with finance, IT, and business stakeholders to gather requirements and translate them into scalable EPM designs.
- Develop roadmaps, implementation strategies, and solution blueprints.
- Guide technical and functional consultants throughout the implementation lifecycle.
- Lead data integration efforts between Oracle EPM and ERP/other source systems.
- Ensure EPM solutions meet performance, security, compliance, and audit standards.
- Provide thought leadership in Oracle EPM innovations, product releases, and architecture trends.
- Support migration from on-premise Hyperion applications to EPM Cloud (if applicable).
- Conduct architecture reviews, performance tuning, and code quality assurance.
- Support post-go-live activities, including training, documentation, and optimization.

Posted 1 week ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Mumbai

Work from Office


Senior Azure Data Engineer - L1 Support

Posted 1 week ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Gurugram

Work from Office


Develop partnerships with key stakeholders in HR to understand the strategic direction, business processes, and business needs. Should be well versed in Agile, Scrum, and DevOps. Create technical solutions to meet business requirements and help Finance business users adopt best practices. Excellent verbal and written communication skills. Define user information requirements in Oracle E-Business Suite. Implement plans to test business and functional processes. Manage test scripts that support Oracle R12 financial applications. Lead technical acceptance testing (Unit, SIT, and QAT) of patches and upgrades. Deliver training content to users. The candidate must be ready to work from the office daily, and in shifts if required; no work from home is allowed.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Posted 1 week ago

Apply

15.0 - 20.0 years

10 - 14 Lacs

Hyderabad

Work from Office


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must-have skills: Microsoft Fabric
Good-to-have skills: NA
Minimum experience: 5 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. Your typical day will involve collaborating with various teams to ensure project milestones are met, facilitating discussions to address challenges, and guiding your team through the development process. You will also engage in strategic planning to align application development with organizational goals, ensuring that the solutions provided are effective and efficient.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform; responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Facilitate regular team meetings to discuss progress and address any roadblocks.

Professional & Technical Skills:
- Lead and manage a team of data engineers, providing guidance, mentorship, and support; foster a collaborative and innovative team culture.
- Work closely with stakeholders to understand data requirements and business objectives, and translate business requirements into technical specifications for the data warehouse.
- Lead the design of data models, ensuring they meet business needs and adhere to best practices; collaborate with the Technical Architect to design dimensional models for optimal performance.
- Design and implement data pipelines for ingestion, transformation, and loading (ETL/ELT) using Fabric Data Factory Pipelines and Dataflows Gen2.
- Develop scalable and reliable solutions for batch data integration across various structured and unstructured data sources.
- Oversee the development of data pipelines for smooth data flow into the Fabric Data Warehouse.
- Implement and maintain data solutions in the Fabric Lakehouse and Fabric Warehouse.
- Monitor and optimize pipeline performance, ensuring minimal latency and resource efficiency; tune data processing workloads for large datasets in the Fabric Warehouse and Lakehouse.
- Exposure to ADF and Databricks.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Microsoft Fabric.
- This position is based in Hyderabad.
- 15 years of full-time education is required.

Posted 1 week ago

Apply

5.0 - 8.0 years

5 - 10 Lacs

Kolkata

Work from Office


Skill required: Tech for Operations - Microsoft Azure Cloud Services
Designation: App Automation Eng Senior Analyst
Qualifications: Any Graduation / 12th / PUC / HSC
Years of Experience: 5 to 8 years

About Accenture: Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com.

What would you do: In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or managed service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients' supply chains and customer experience. The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting, and maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience in troubleshooting Azure Data Factory environments, conducting code reviews, and bug fixing. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems.

What are we looking for:
- Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
- Proven experience (5+ years) as an Azure Data Factory Support Engineer II.
- Expertise in ADF with a deep understanding of its data-related libraries.
- Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments.
- Proficient in SQL and experienced with SQL database design.
- Strong analytical skills with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
- Experience with ADF pipelines.
- Excellent problem-solving and troubleshooting skills.
- Experience in code review and debugging in a collaborative project setting.
- Excellent verbal and written communication skills.
- Ability to work in a fast-paced, team-oriented environment.
- Strong understanding of the business and a passion for the mission of Service Supply Chain.
- Hands-on experience with Jira, DevOps ticketing, and ServiceNow is good to have.

Roles and Responsibilities:
- Innovate, collaborate, build, create, and solve for ADF and associated systems.
- Ensure systems meet business requirements and industry practices; integrate new data management technologies and software engineering tools into existing structures.
- Recommend ways to improve data reliability, efficiency, and quality; use large data sets to address business issues and to discover tasks that can be automated.
- Fix bugs to ensure a robust and sustainable codebase.
- Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance.
- Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively.
- Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines.
- Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives.
- Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure.
- Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy.
- Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance.
- Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement.
- Stay updated with the latest ADF technologies and practices to continuously improve the support and maintenance of data systems.
- Flexible work hours to include US time zones; this position may require working a rotational on-call schedule, evenings, weekends, and holiday shifts when the need arises.
- Participate in the Demand Management and Change Management processes.
- Work in partnership with internal business, external third-party technical teams, and functional teams as a technology partner in communicating and coordinating delivery of technology services from Technology for Operations (TfO).

Qualification: Any Graduation, 12th/PUC/HSC
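
Since the role centers on monitoring and troubleshooting ADF pipelines, below is a hedged sketch that queries the last 24 hours of pipeline runs and prints any failures, using the azure-mgmt-datafactory SDK. The subscription, resource group, and factory names are placeholders.

```python
# Hedged monitoring sketch: list recent ADF pipeline runs and report failures.
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

adf = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

now = datetime.now(timezone.utc)
filters = RunFilterParameters(
    last_updated_after=now - timedelta(hours=24),
    last_updated_before=now,
)

runs = adf.pipeline_runs.query_by_factory("example-rg", "example-adf", filters)
for run in runs.value:
    if run.status == "Failed":
        print(run.pipeline_name, run.run_id, run.message)
```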

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office


Data Engineer - Azure Synapse/ADF, Workiva
- Manage and maintain the associated Workiva connector, chains, tables, and queries, making updates as needed when new metrics or requirements are identified.
- Develop functional and technical requirements for any changes impacting wData (Workiva Data).
- Configure and unit test any changes impacting wData (connector, chains, tables, queries).
- Promote wData changes.

Posted 1 week ago

Apply