
1188 Adf Jobs - Page 41

Set up a job alert
JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

4.0 - 9.0 years

6 - 14 Lacs

Hyderabad

Remote

Job description
Job Location: Hyderabad / Bangalore / Chennai / Kolkata / Noida / Gurgaon / Pune / Indore / Mumbai (Preferred: Hyderabad)
At least 4+ years of relevant hands-on development experience in an Azure Data Engineering role
Proficient in Azure technologies such as ADB, ADF, SQL (able to write complex SQL queries), PySpark, Python, Synapse, Delta Tables, Unity Catalog
Hands-on in Python, PySpark, or Spark SQL
Hands-on in Azure Analytics and DevOps
Taking part in Proofs of Concept (POCs) and pilot solution preparation
Ability to conduct data profiling, cataloguing, and mapping for the technical design and construction of technical data flows
Experience in business process mapping of data and analytics solutions
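The data profiling duty mentioned above (null counts, cardinality, and the like, feeding a catalog or mapping document) can be sketched in plain Python. This is a minimal illustration only; the record and column names are hypothetical and a real engagement would run this at scale in PySpark or a profiling tool.

```python
def profile_column(rows, column):
    """Compute basic profiling stats (row count, nulls, distinct values) for one column."""
    values = [row.get(column) for row in rows]
    non_null = [v for v in values if v is not None]
    return {
        "count": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
    }

# Hypothetical sample records
records = [
    {"city": "Hyderabad", "salary": 9},
    {"city": "Pune", "salary": None},
    {"city": "Hyderabad", "salary": 14},
]
print(profile_column(records, "city"))    # {'count': 3, 'nulls': 0, 'distinct': 2}
print(profile_column(records, "salary"))  # {'count': 3, 'nulls': 1, 'distinct': 2}
```

The same per-column summary, computed for every column of every source table, is the raw material for the cataloguing and mapping work the posting describes.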

Posted 1 month ago

Apply

5.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

TCS Hiring for Azure Admin + Azure Platform Engineer
Experience: 5 to 8 Years Only
Job Location: New Delhi, Kolkata, Mumbai, Pune, Bangalore
Required Technical Skill Set: Deployment through Terraform, Azure Administration, Data Factory, Databricks, Active Directory, Identity, Unity Catalog, Machine Learning, AI, and Access Management
3+ years of prior product/technical support customer-facing experience
Must have good working knowledge of Azure cloud technical support
Good-to-have technical skills and hands-on experience in the following areas:
Deployment through Terraform, PowerShell/CLI, Identity Management, Azure Resource Group Management, Azure PaaS services (e.g. ADF, Databricks, Storage Account)
Understanding of machine learning and AI concepts related to infrastructure
· Unity Catalog end-to-end process to migrate from Hive to UC
Excellent team player with good interpersonal and communication skills
Experience in the Life Science and Healthcare domain preferred
Roles & Responsibilities:
Resource group creation along with deployment of various components using Terraform templates
Management of user access to Azure PaaS products such as Azure SQL, WebApp, App Service, Storage Account, Databricks, Data Factory
Creation of Service Principals/AD groups and managing access to various applications through them
Troubleshooting issues regarding access, data visualizations, and permissions
Kind Regards,
Priyankha M

Posted 1 month ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Job Description: Customize and configure Oracle Fusion modules per business requirements. Develop and modify reports (BIP, OTBI, FRS, Hyperion Smart View), interfaces, extensions (Page Composer, Application Composer with Groovy scripting, Process Composer), workflows (Oracle BPM, AMX), forms (ADF, Java-based), VBCS, and page customizations to enhance functionality. Integrate Oracle Fusion applications with other business systems and third-party applications via Oracle Integration Cloud.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

What is Blend
Blend is a premier AI services provider, committed to co-creating meaningful impact for its clients through the power of data science, AI, technology, and people. With a mission to fuel bold visions, Blend tackles significant challenges by seamlessly aligning human expertise with artificial intelligence. The company is dedicated to unlocking value and fostering innovation for its clients by harnessing world-class people and data-driven strategy. We believe that the power of people and AI can have a meaningful impact on your world, creating more fulfilling work and projects for our people and clients. For more information, visit www.blend360.com
What is the Role
As a Senior Data Engineer, your role is to spearhead the data engineering teams and elevate the team to the next level. You will be responsible for laying out the architecture of the new project as well as selecting the tech stack associated with it. You will plan out the development cycles, deploying Agile where possible, and create the foundations for good data stewardship with our new data products. You will also set up a solid code framework that is built to purpose yet has enough flexibility to adapt to new business use cases: a tough but rewarding challenge!
What you'll be doing
Collaborate with several stakeholders to deeply understand the needs of data practitioners to deliver at scale
Lead Data Engineers to define, build, and maintain the Data Platform
Work on building a Data Lake in Azure Fabric, processing data from multiple sources
Migrate the existing data store from Azure Synapse to Azure Fabric
Implement data governance and access control
Drive development effort end-to-end for on-time delivery of high-quality solutions that conform to requirements, conform to the architectural vision, and comply with all applicable standards
Present technical solutions, capabilities, considerations, and features in business terms
Effectively communicate status, issues, and risks in a precise and timely manner
Further develop critical initiatives, such as Data Discovery, Data Lineage, and Data Quality
Lead the team and mentor junior resources; help your team members grow in their roles and achieve their career aspirations
Build data systems, pipelines, analytical tools, and programs
Conduct complex data analysis and report on results
What do we need from you?
5+ years of experience as a data engineer or in a similar role with Azure Synapse, ADF, or relevant experience in Azure Fabric
Degree in Computer Science, Data Science, Mathematics, IT, or a similar field
Must have experience executing projects end to end; at least one data engineering project should have used Azure Synapse, ADF, or Azure Fabric
Should be experienced in handling multiple data sources
Technical expertise with data models, data mining, and segmentation techniques
Deep understanding, both conceptually and in practice, of at least one object-oriented library (Python, PySpark)
Strong SQL skills and a good understanding of existing SQL warehouses and relational databases
Strong Spark, PySpark, and Spark SQL skills and a good understanding of distributed processing frameworks
Build large-scale batch and real-time data pipelines
Ability to work independently and mentor junior resources
Desire to lead and develop a team of Data Engineers across multiple levels
Experience or knowledge in Data Governance
Azure Cloud experience with data modeling, CI/CD, Agile methodologies, Docker/Kubernetes
What do you get in return?
Competitive Salary: Your skills and contributions are highly valued here, and we make sure your salary reflects that, rewarding you fairly for the knowledge and experience you bring to the table.
Dynamic Career Growth: Our vibrant environment offers you the opportunity to grow rapidly, providing the right tools, mentorship, and experiences to fast-track your career.
Idea Tanks: Innovation lives here. Our "Idea Tanks" are your playground to pitch, experiment, and collaborate on ideas that can shape the future.
Growth Chats: Dive into our casual "Growth Chats" where you can learn from the best, whether it's over lunch or during a laid-back session with peers; it's the perfect space to grow your skills.
Snack Zone: Stay fueled and inspired! In our Snack Zone, you'll find a variety of snacks to keep your energy high and ideas flowing.
Recognition & Rewards: We believe great work deserves to be recognized. Expect regular Hive-Fives, shoutouts, and the chance to see your ideas come to life as part of our reward program.
Fuel Your Growth Journey with Certifications: We're all about your growth groove! Level up your skills with our support as we cover the cost of your certifications.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

TCS Hiring for Azure Admin + Azure Platform Engineer
Experience: 5 to 8 Years Only
Job Location: New Delhi, Kolkata, Mumbai, Pune, Bangalore
Required Technical Skill Set: Deployment through Terraform, Azure Administration, Data Factory, Databricks, Active Directory, Identity, Unity Catalog, Machine Learning, AI, and Access Management
3+ years of prior product/technical support customer-facing experience
Must have good working knowledge of Azure cloud technical support
Good-to-have technical skills and hands-on experience in the following areas:
Deployment through Terraform, PowerShell/CLI, Identity Management, Azure Resource Group Management, Azure PaaS services (e.g. ADF, Databricks, Storage Account)
Understanding of machine learning and AI concepts related to infrastructure
· Unity Catalog end-to-end process to migrate from Hive to UC
Excellent team player with good interpersonal and communication skills
Experience in the Life Science and Healthcare domain preferred
Roles & Responsibilities:
Resource group creation along with deployment of various components using Terraform templates
Management of user access to Azure PaaS products such as Azure SQL, WebApp, App Service, Storage Account, Databricks, Data Factory
Creation of Service Principals/AD groups and managing access to various applications through them
Troubleshooting issues regarding access, data visualizations, and permissions
Kind Regards,
Priyankha M

Posted 1 month ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Position: Enterprise Architect / Solution Architect / Cloud Pre-Sales Solution Architect
Job Type: Permanent
Location: Pune
Experience: 10+ Years
Roles and Responsibilities
Design, develop, and implement solutions to complex business problems, collaborating with stakeholders to understand their needs and requirements, designing and implementing solutions that meet those needs, and creating solutions that balance technology risks against business delivery, driving consistency.
A seasoned IT professional with + years of IT experience, currently working as a Solution Architect in a Technologies COE/Practice team, understanding business requirements and designing solutions based on multiple clouds: Azure, AWS, Google. AWS Certified Associate/Professional Architect.
As AWS Architect for Belron-Safelite: produced target designs for 80 applications based on the as-is designs, signed off after presenting the target designs to stakeholders. Won appreciation from the customer, chiefly on the networking side, for use of the different AWS networking services.
As AWS Architect, led a team of developers configuring the infrastructure using CloudFormation Templates (CFT), including spinning up an EKS cluster for deployment of a critical application.
Azure Cloud T&T: TDDs for 21 different Azure Data & Analytics services (e.g. ADF, Event Grid, Event Hub, Synapse) approved by the customer and implemented. Provisioned different Azure Analytics services for customers using IaC tools.
Transitioned 8-9 Fortune 500 customers on Azure, AWS, and Google Cloud.
Implemented Cloudera Big Data Hadoop and Anaconda software for Malaysia's biggest financial customer.
Transition Lead for a 220-node Big Data Hadoop cluster for one of the USA's biggest financial customers.
Managed an Azure Landing Zone implementation for an EU customer.
Engaged with GTM/Sales/Pre-Sales teams for technical expertise. Azure Cloud strategy consultant.
Led T&T (Transitions & Transformations) for Azure PaaS services for customers across the globe.
Responsible for building the Big Data Hadoop Practice team. Expert in Azure PaaS Data & Analytics services.
Involved in propositions, pre-sales, TDD, HLD, LLD, solutioning and design, architecture, effort estimation, RFP/RFI, T&T (Transition & Transformation), and a core member of the interview panel for Big Data Analytics and cloud technologies.
Led the team implementing different Azure PaaS Data & Analytics services. Rich experience in preparing deployment plans for different Azure PaaS services and obtaining customer approval to provision the services. Worked closely with the IaC team to execute deployment plans.
Rich technical experience in architecting, designing, and implementing cloud-based data platform and analytics services. Currently spearheading the delivery of Azure Data Lake and modern data warehouse solutions. Developing solutions and planning, creating, and delivering compelling proof-of-concept demonstrations.
Professional IT delivery management experience with strong work ethics, approachability, and consistent commitment to team leadership and innovation. Responsible for driving teamwork, communication, collaboration, and commitment within IT and across teams.
Providing and implementing suggestions on cost optimization of client infrastructure.
Working on various Microsoft Azure services such as Azure Virtual Machines, Azure Networking, Azure Storage, Azure Migrate, Azure DevOps, Azure Data Lake, Azure Synapse Analytics, Azure Stream Analytics, Azure Databricks, Azure Backup, and Azure Active Directory.
Configuring Azure Firewall, Application Gateway with WAF, load balancers, and Traffic Manager to manage the security of the workload virtual network.
Managing and implementing roles, users, groups, RBAC, MFA, and Conditional Access Policies in Azure AD.
Working with various DevOps tools such as Repos, Dashboards, and GitHub for version control, plus containers: Docker and Kubernetes. Managing pods, ReplicaSets, deployments, and services in a Kubernetes cluster.
Building POC environments in Google and IBM Cloud. Provisioning different resources/resource groups via Terraform.
Worked as a Mainframe Consultant with Tech Mahindra (Satyam Computers Ltd.) for EU clients to implement ChangeMan/VisionPLUS.
Expertise in troubleshooting production issues, log analysis, and performance monitoring. Excellent knowledge of ITIL processes such as Incident, Problem, Change, Release, and Availability Management. Worked on various service management tools such as Remedy and ServiceNow for problem and incident management.
Responsible for transition and transformation of Hadoop projects, and for various Big Data Analytics and cloud propositions.
Big Data Hadoop with the world's biggest financial customers: Hadoop | HBase | Hive | Looker | Neo4j | OpenShift | Kubernetes | Docker | Rundeck | Prometheus | AWS | Azure | Shell | Python | Architect | Implementation | Troubleshooting | Solution

Posted 1 month ago

Apply

12.0 - 20.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

TCS Hiring for Hybrid Cloud Data Solutions Architect (GCP, Azure, AWS), Kolkata / Hyderabad
Experience: 12 to 20 Years Only
Job Location: Kolkata / Hyderabad Only
Required Technical Skill Set:
Experience: Minimum of 8 years in data architecture or 10 years in engineering roles related to hybrid/multi-cloud solutions, with a proven track record of successful client engagements.
Technical Skills: Proficiency in two or more cloud platforms (AWS, Azure, GCP) and related data services: Redshift, BigQuery, Synapse, Databricks, ADF, Glue, AWS EMR, Azure Insights, GCP Dataproc, GCP Dataflow, etc. Experience in architecting applications leveraging containerization (Docker, Kubernetes), cloud-native models (IaaS, PaaS, SaaS), hybrid/multi-cloud, hardware OEMs, network, security, microservices, FinOps, iPaaS and APIs, Infrastructure as Code (IaC) tools (Terraform, CloudFormation), and CI/CD pipelines. Strong knowledge of enterprise architecture principles.
Communication Skills: Excellent communication abilities to engage effectively with both technical and non-technical stakeholders and to articulate technical concepts.
Desirable Skill Set:
Knowledge of specific industry verticals (e.g., BFSI, Healthcare, Manufacturing, Telecom/Media).
Technical certifications related to cloud computing (e.g., AWS Certified Solutions Architect, Microsoft Certified: Azure Solutions Architect Expert). Relevant cloud certifications are preferred; must obtain certification within 90 days of employment.
Understanding of DevOps concepts. Ability to lead cross-functional teams effectively.
Key Responsibilities:
Strategy & Design: Develop a comprehensive data strategy for multi/hybrid cloud scenarios aligned with business goals. Design scalable, secure, and cost-effective data solutions. Evaluate and select cloud platforms (AWS, Azure, GCP, OCI, IBM, Nutanix, Neo Cloud, etc.) and third-party tools. Develop the blueprint and roadmap and drive implementation of data architecture and framework-related epics/user stories. Data modeling based on the business use cases.
Solution Design: Design the data ingestion layer and data movement from the ingestion layer to the operational/analytical layers. Design the data consumption layer (visualization, analytics, AI/ML, outbound data). Design the data governance track: framework design for data quality, data security, metadata, etc. Architect tailored cloud solutions that leverage best practices and meet specific client requirements, utilizing native data services from AWS, Azure, and Google Cloud. Ability to understand data pipelines and modern ways of automating them using cloud-based and on-premise technologies. Good knowledge of any RDBMS/NoSQL database with strong SQL writing skills. Good understanding of ML and AI concepts and ability to propose solutions to automate processes.
Technical Presentations: Conduct workshops and presentations to demonstrate solution feasibility and value, fostering trust and engagement with stakeholders.
Proof of Concept (POC): Lead the design and implementation of POCs to validate proposed solutions and products against features and cost.
Implementation & Management: Guide technical solution development in engagements related to legacy modernization and migration of applications and infrastructure to hybrid cloud, engineered cloud, etc. Guide and mentor the data development squads and review deliverables as required.
Kind Regards,
Priyankha M
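The "data governance track" this posting mentions usually starts with rule-based data quality checks applied to each ingested batch. A minimal plain-Python sketch of that idea (the rule names and row fields are hypothetical; production stacks would use a framework or Databricks expectations):

```python
def run_quality_checks(rows, rules):
    """Apply named predicate rules to each row; return indices of failing rows per rule."""
    failures = {name: [] for name in rules}
    for i, row in enumerate(rows):
        for name, predicate in rules.items():
            if not predicate(row):
                failures[name].append(i)
    return failures

# Hypothetical rules for a payments feed
rules = {
    "amount_non_negative": lambda r: r["amount"] >= 0,
    "currency_present": lambda r: bool(r.get("currency")),
}
rows = [
    {"amount": 100, "currency": "INR"},
    {"amount": -5, "currency": "INR"},
    {"amount": 20, "currency": ""},
]
print(run_quality_checks(rows, rules))
# {'amount_non_negative': [1], 'currency_present': [2]}
```

Summarizing the failure lists per batch gives the quality metrics that a governance framework would publish alongside the data.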

Posted 1 month ago

Apply

0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Role: Azure Data Engineer
Experience: 8-10 yrs
Location: Kolkata
Must-Have: ETL, Azure Data Factory, SSRS, MS Fabric, Python, PowerShell
Responsibility of / Expectations from the Role:
1. Azure Data Engineer
2. Develop full SDLC project plans to implement ETL solutions and identify resource requirements; good knowledge of SQL Server (complex queries, joins, etc.)
3. REST API, ADF pipelines, MS Fabric
4. SSIS and Azure Data Factory based ETL architecture
5. Good exposure to client communication and supporting requests from the customer
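A common building block behind the ADF pipeline work listed above is incremental (watermark-based) loading: copy only rows modified since the last run, then advance the watermark. A minimal plain-Python sketch of the pattern (field names and dates are hypothetical; in ADF this is typically expressed as a lookup plus a filtered copy activity):

```python
def incremental_load(source_rows, last_watermark):
    """Select rows modified after the stored watermark; return them with the new watermark."""
    # ISO-8601 date strings compare correctly as plain strings
    new_rows = [r for r in source_rows if r["modified"] > last_watermark]
    new_watermark = max((r["modified"] for r in new_rows), default=last_watermark)
    return new_rows, new_watermark

source = [
    {"id": 1, "modified": "2024-01-01"},
    {"id": 2, "modified": "2024-02-10"},
    {"id": 3, "modified": "2024-03-05"},
]
rows, wm = incremental_load(source, "2024-01-15")
print([r["id"] for r in rows], wm)  # [2, 3] 2024-03-05
```

Persisting `wm` between runs (in ADF, usually a control table) is what makes successive executions pick up only the delta.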

Posted 1 month ago

Apply

6.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role: Data Engineer
Location: Bangalore
Type: Full-time
Experience: 10+ yrs
Notice: Immediate
Job Description: Data Engineer (Azure, ADF, Databricks, PySpark, SCD, Unity Catalog, SQL)
Role Overview: Looking for a highly skilled, experienced Data Engineer with expertise in Azure Data Factory (ADF), Azure Databricks, Delta Tables, Unity Catalog, Slowly Changing Dimension Type 2 (SCD2), and PySpark. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines and ETL workflows while ensuring data integrity, scalability, and security within the Azure ecosystem.
Key Responsibilities:
Develop and optimize data pipelines using Azure Data Factory (ADF) and Azure Databricks for large-scale data processing.
Implement Slowly Changing Dimensions in Delta Tables to manage historical data changes effectively.
Leverage Unity Catalog for secure and organized data governance, cataloging, and access control across Databricks.
Write efficient PySpark code to process and transform large datasets, ensuring high performance and scalability.
Design and implement ETL/ELT solutions to integrate data from multiple sources into Delta Lake.
Monitor, debug, and optimize existing data pipelines to ensure smooth operations and minimal downtime.
Ensure data quality, consistency, and lineage tracking through best practices and automation.
Collaborate with data architects, analysts, and business teams to define requirements and implement data-driven solutions.
Required Skills & Qualifications:
6+ years of experience in Data Engineering with a focus on Azure technologies.
Expertise in Azure Data Factory (ADF) and Azure Databricks for ETL/ELT workflows.
Strong knowledge of Delta Tables and Unity Catalog for efficient data storage and management.
Experience with Slowly Changing Dimensions (SCD2) implementation in Delta Lake.
Proficiency in PySpark for large-scale data processing and transformation.
Hands-on experience with SQL and performance tuning for data pipelines.
Understanding of data governance, security, and compliance best practices in Azure.
Knowledge of CI/CD and DevOps practices for data pipeline automation.
Preferred Qualifications:
Experience with Azure Synapse Analytics, Data Lakes, and Power BI integration.
Knowledge of Kafka or Event Hubs for real-time data ingestion.
Certifications in Microsoft Azure (DP-203, DP-900) or Databricks are a plus.
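The SCD Type 2 requirement above is usually implemented as a Delta Lake MERGE in Databricks; the underlying expire-and-insert logic can be sketched in plain Python. This is an illustration only, with hypothetical column names (`key`, `attrs`, `start_date`, `end_date`, `is_current`), not the exact MERGE a given team would write:

```python
def apply_scd2(dimension, updates, load_date):
    """Expire current rows whose attributes changed, then insert new current versions."""
    current = {r["key"]: r for r in dimension if r["is_current"]}
    for upd in updates:
        row = current.get(upd["key"])
        if row is not None and row["attrs"] != upd["attrs"]:
            row["is_current"] = False      # close out the old version
            row["end_date"] = load_date
        if row is None or row["attrs"] != upd["attrs"]:
            dimension.append({             # open a new current version
                "key": upd["key"], "attrs": upd["attrs"],
                "start_date": load_date, "end_date": None, "is_current": True,
            })
    return dimension

dim = [{"key": 1, "attrs": {"city": "Pune"}, "start_date": "2024-01-01",
        "end_date": None, "is_current": True}]
dim = apply_scd2(dim, [{"key": 1, "attrs": {"city": "Mumbai"}}], "2024-06-01")
print(len(dim))  # 2: the old row is expired, a new current row is inserted
```

History is preserved because the old row stays in the table with a closed date range, which is exactly what the Delta Tables requirement in the posting is after.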

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Greetings from TCS.
Role: Azure Data Engineer
Experience: 8-12 yrs
Work Location: Noida/Bangalore/Kolkata/Pune/Mumbai/Hyderabad
Interview Mode: Virtual (MS Teams)
Job Description:
1. Lead back-end development and maintenance of a Data Quality product
2. Design and develop data pipelines using ADF and Databricks, and integrate with other Azure services
3. Experience in setting up DevOps pipelines
4. Databricks notebooks / Python programming skills
5. Knowledge of RDBMS databases like SQL Server and Azure SQL
Good-to-Have:
1. Able to take the lead in debugging and resolving infrastructure and engineering issues
2. Experience in Azure cloud build and automation using ARM templates
3. Good communication skills
If interested, please share your contact number and updated CV through DM; further details will be shared over a telephonic discussion.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role: Data Engineer
Required Technical Skill Set: SQL, Python, Hadoop, Spark, Azure Data Factory, Azure Data Lake Storage, Azure/GCP, Snowflake, Airflow, data pipelines, Jenkins/Jira/Git, CI/CD, Kubernetes/Docker
Desired Experience Range: 5 to 8 years
Location of Requirement: Bangalore, Chennai, Delhi, Kochi, Hyderabad
Desired Competencies (Technical/Behavioral Competency)
Must-Have
· Strong in SQL, Python, Hadoop, Spark
· Experience with cloud platforms (GCP/Azure/AWS)
· Experience working in an Agile delivery environment
· Experience with orchestration tools like Airflow and ADF
· Experience with real-time and streaming technology (e.g. Azure Event Hubs, Azure Functions, Kafka, Spark Streaming)
· Experience building automated data pipelines
· Experience performing data analysis and data exploration
· Experience working in a multi-developer environment, using version control like Git
· Strong critical thinking, communication, and problem-solving skills
Good-to-Have
· Understanding of DevOps best practices and CI/CD
· Understanding of containerization (e.g. Kubernetes, Docker)
· Healthcare domain knowledge
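Orchestration tools like Airflow and ADF, mentioned in the skill set above, schedule pipeline tasks so each one runs only after its dependencies finish. The core idea is a topological ordering of a task DAG, which the standard library can demonstrate directly (the task names here are a hypothetical extract/transform/validate/load pipeline):

```python
from graphlib import TopologicalSorter  # standard library, Python 3.9+

# Hypothetical DAG: each task maps to the set of tasks it depends on
deps = {
    "transform": {"extract"},
    "validate": {"extract"},
    "load": {"transform", "validate"},
}

order = list(TopologicalSorter(deps).static_order())
print(order)  # a valid order, e.g. extract first, load last
```

Airflow's scheduler does essentially this, plus retries, parallel execution of independent tasks (here `transform` and `validate`), and backfills.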

Posted 1 month ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Oracle Management Level Associate Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. In Oracle human capital at PwC, you will specialise in providing consulting services for Oracle human capital management (HCM) applications. You will analyse client requirements, implement HCM software solutions, and provide training and support for seamless integration and utilisation of Oracle HCM applications. Working in this area, you will enable clients to optimise their human resources processes, enhance talent management, and achieve their strategic objectives. *Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. 
We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. Job Description & Summary A career within PwC's Oracle Services Practice will provide you with the opportunity to help organizations use enterprise technology to achieve their digital technology goals and capitalize on business opportunities. We help our clients implement and effectively use Oracle offerings to solve their business problems and fuel success in the areas of finance operations, human capital management, supply chain management, reporting and analytics, and governance, risk and compliance. *Responsibilities: Participate in the implementation of Oracle HCM Cloud modules such as Core HR, Payroll, Benefits, Talent Management, Compensation, and others. Configure Oracle HCM Cloud applications to meet client requirements. Develop and customize reports using Oracle BI Publisher, OTBI, and other reporting tools. Create and modify HCM extracts, HDL (HCM Data Loader) scripts, and other data integration processes. Design and develop integrations using Oracle Integration Cloud (OIC) or other middleware solutions. *Mandatory skill sets: Design and develop integrations using Oracle Integration Cloud (OIC) or other middleware solutions. Modules: Absence, Time and Labour, Payroll, workforce planning, HR helpdesk, Oracle digital assistants, Oracle guided learning. *Preferred skill sets: Provide technical support and troubleshooting for Oracle HCM Cloud applications. Perform routine maintenance and upgrades to ensure optimal performance of the HCM system.
*Years of experience required: 2-4 yrs
*Education Qualification: BE/BTech/MBA/MCA/CA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Chartered Accountant Diploma, Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle Integration Cloud (OIC)
Optional Skills: Absence Management, Accepting Feedback, Active Listening, Benefits Administration, Business Analysis, Business Process Improvement, Change Management, Communication, Emotional Regulation, Empathy, Employee Engagement Strategies, Employee Engagement Surveys, Employee Relations Investigations, Human Capital Management, Human Resources (HR) Consulting, Human Resources (HR) Metrics, Human Resources (HR) Policies, Human Resources (HR) Project Management, Human Resources (HR) Transformation, Human Resources Management (HRM), Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF) {+ 21 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date

Posted 1 month ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Oracle Management Level Associate Job Description & Summary At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. In Oracle human capital at PwC, you will specialise in providing consulting services for Oracle human capital management (HCM) applications. You will analyse client requirements, implement HCM software solutions, and provide training and support for seamless integration and utilisation of Oracle HCM applications. Working in this area, you will enable clients to optimise their human resources processes, enhance talent management, and achieve their strategic objectives. *Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us . At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. 
We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. " Job Description & Summary A career within PwC's Oracle Services Practice, will provide you with the opportunity to help organizations use enterprise technology to achieve their digital technology goals and capitalize on business opportunities. We help our clients implement and effectively use Oracle offerings to solve their business problems and fuel success in the areas of finance operations, human capital management, supply chain management, reporting and analytics, and governance, risk and compliance . *Responsibilities: Participate in the implementation of Oracle HCM Cloud modules such as Core HR, Payroll, Benefits, Talent Management, Compensation, and others. Configure Oracle HCM Cloud applications to meet client requirements. Develop and customize reports using Oracle BI Publisher, OTBI, and other reporting tools. Create and modify HCM extracts, HDL (HCM Data Loader) scripts, and other data integration processes. Design and develop integrations using Oracle Integration Cloud (OIC) or other middleware solutions. * Mandatory skill sets Design and develop integrations using Oracle Integration Cloud (OIC) or other middleware solutions. Modules: Absence, Time and Labour, Payroll, workforce planning, HR helpdesk, Oracle digital assistants, Oracle guided learning *Preferred skill sets Provide technical support and troubleshooting for Oracle HCM Cloud applications. Perform routine maintenance and upgrades to ensure optimal performance of the HCM system. 
*Years of experience required 2 - 4 Yrs experience *Education Qualification BE/BTech/MBA/MCA/CA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Chartered Accountant Diploma, Master of Business Administration Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Oracle Integration Cloud (OIC) Optional Skills Absence Management, Accepting Feedback, Active Listening, Benefits Administration, Business Analysis, Business Process Improvement, Change Management, Communication, Emotional Regulation, Empathy, Employee Engagement Strategies, Employee Engagement Surveys, Employee Relations Investigations, Human Capital Management, Human Resources (HR) Consulting, Human Resources (HR) Metrics, Human Resources (HR) Policies, Human Resources (HR) Project Management, Human Resources (HR) Transformation, Human Resources Management (HRM), Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF) {+ 21 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
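The HDL (HCM Data Loader) scripting mentioned in this posting centres on pipe-delimited .dat files, each with a METADATA line naming the attributes followed by MERGE lines carrying the data. A rough, illustrative fragment for the Worker business object (the attribute set and values here are hypothetical; the exact attributes per object are defined in the Oracle HDL documentation):

```
METADATA|Worker|SourceSystemOwner|SourceSystemId|EffectiveStartDate|EffectiveEndDate|PersonNumber|ActionCode|StartDate
MERGE|Worker|HRC_SQLLOADER|EMP_1001|2024/01/01|4712/12/31|1001|HIRE|2024/01/01
```

The file is zipped and loaded via the HCM Data Loader import process; a MERGE line updates the record when the key matches and creates it otherwise.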

Posted 1 month ago

Apply

9.0 - 14.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Design, deploy, and optimize Azure-based data pipelines and architectures. Ensure scalability, data integrity, and CI/CD automation. Collaborate with analytics teams and lead data engineering initiatives across hybrid data platforms. Required Candidate profile: Bachelor’s in CS/IT with 7–12 years of experience in Azure data engineering. Strong in ADF, Synapse, Databricks, and CI/CD. Able to mentor junior engineers and optimize large-scale data systems.

Posted 1 month ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description Design, develop, troubleshoot and debug software programs for databases, applications, tools, networks etc. As a member of the software engineering division, you will take an active role in the definition and evolution of standard practices and procedures. You will be responsible for defining and developing software for tasks associated with the developing, designing and debugging of software applications or operating systems. Work is non-routine and very complex, involving the application of advanced technical/business skills in area of specialization. Leading contributor individually and as a team member, providing direction and mentoring to others. BS or MS degree or equivalent experience relevant to functional area. 7 years of software engineering or related experience. Career Level - IC4 Responsibilities Job Description: The Applications Technology Group is responsible for building the Framework & Technology foundation for Oracle E-Business Suite. Life Cycle Management falls under the Applications Technology Group, and this group is responsible for developing the products that handle installing, upgrading, maintaining, monitoring, bundling the technology stack, cloning, etc. This group carries forward the same responsibility in deploying Oracle E-Business Suite in Oracle Cloud Infrastructure as well. RESPONSIBILITIES: This position mainly involves developing products which automate the deployment of Oracle E-Business Suite in Oracle Cloud Infrastructure. Deliver configuration-level products for Oracle E-Business Suite to use with Oracle Application Server and Oracle Databases. Work closely with other lines of business including Applications, Application Server, and Database to identify and implement solutions for challenging technical problems. Contribute to the definition of standard practices and procedures for software development. Recommend and explain major changes to existing products, services and processes.
This is purely an individual contributor role. QUALIFICATIONS: Mandatory Skill Sets: Java back-end programming Experience with any of the Java-based UI frameworks (JET, ADF, OAFWK, UIX, Spring UI framework) PL/SQL programming Perl, Unix shell scripting Excellent analytical and problem-solving skills Self-motivated, driven, a great teammate and results-oriented Experience in the following would be helpful: Experience in developing system administration tools Other scripting languages like Python, Ruby Experience and exposure to automation platforms like Chef and Puppet Experience in automating deployment using the infrastructure provided by different cloud vendors Experience in some or all of the following Oracle technologies: Oracle Application Server, Oracle WebLogic Server, Oracle Database, or knowledge of Oracle E-Business Suite Academics: Any of the following qualifications with excellent academic credentials: BE or MS degree in computer science or equivalent, M.C.A. Experience: Looking for a minimum of 6+ years of experience. Design, develop, fix and debug software programs for databases, applications, tools, networks etc. Diversity and Inclusion: An Oracle career can span industries, roles, countries and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work life in. In order to nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that encourages thought leadership and innovation. We also encourage our employees to engage in the culture of giving back to the communities where we live and do business. At Oracle, we believe that innovation starts with diversity and inclusion, and to build the future we need talent from various backgrounds, perspectives, and abilities. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges.
We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Role Profile We are looking for a sophisticated and forward-looking Full Stack Jr. Engineer - Oracle Cloud OICS/VBCS/JAVA to join our engineering team. The successful candidate will be responsible for leading Oracle Cloud Fusion, OICS, VBCS, ADF and Java development, including setup, configuration, and management, as well as coordinating OCI support. The candidate must fully grasp the end-to-end configuration, technical dependencies, and overall behavioural characteristics of a large-scale Cloud Fusion implementation. Responsibilities include development and implementation of critical test cases with a focus on security, resiliency, scale, and performance. Partner with development teams and work towards addressing and fixing production issues on cloud, defining and implementing product improvements. Collaborate with various cloud operations teams to understand production issues and work to build a reproducible test case in the lab environment to present to the development team. Background LSEG has embarked on a Finance Transformation programme to deliver our Finance Vision and transform the way we work to build value and deliver balanced growth for the business. The programme is driving efficiencies and maximising benefits for LSEG by moving to a single cloud-based Enterprise Resource Planning and Performance Management platform (Oracle Fusion/EPMCS/EDMCS/OICS/ORMBCS). As a Full Stack Jr. Engineer - Oracle Cloud Fusion OICS, VBCS, ADF & Java you will be responsible for: Business Requirements Analysis: Collaborate with business stakeholders to elicit, analyse, and detail comprehensive reporting requirements, translating them into technical specifications.
Provide solutions as per business requirements and project plan estimations. Integration Design and Development: Design, develop, and implement sophisticated integrations between Oracle Cloud applications and external systems using Oracle Integration Cloud Service (OICS), SQL, PL/SQL, Python, ADF, Java and VBCS. Integration Architecture: Define and implement integration architectures that align with business requirements, ensuring scalability, performance, and security. API Development: Build and manage APIs using OICS to expose data and functionality from various systems, enabling streamlined integration. Data Mapping and Transformation: Design and implement data mapping rules and transformations to ensure accurate data flow between systems. Testing and Quality Assurance: Develop and execute comprehensive test plans to validate the accuracy, performance, and reliability of integrations, identifying and addressing any issues. Deployment and Support: Deploy integrations to production environments and provide ongoing support, resolving issues and implementing improvements as needed. Documentation and Knowledge Transfer: Create clear and concise documentation for integrations, including design specifications, user guides, and maintenance procedures. Support and Maintenance: Provide ongoing support for existing reports, addressing user inquiries, resolving issues, and implementing enhancements as needed to maintain report effectiveness. Manage and work with Oracle support and consulting teams, with a thorough understanding of the process involved in handling Oracle SRs. Work independently, tackle problems, and be willing to do what it takes to get things done. Ability to establish relationships and influence outside of authority while demonstrating Oracle expertise and resources.
Effective interpersonal skills (written and spoken) and strong problem-solving skills Ability to work in a fast-paced Agile development and rapid deployment environment Strong inclination towards test-driven development Extensive experience of Oracle Cloud and EBS R12/RMB architecture Hands-on knowledge of Cloud at Customer (Gen 1, Gen 2) or similar PaaS/IaaS experience Phenomenal teammate, able to work with different levels in the organization across multiple time zones. Naturally inquisitive, able to think creatively and offer solutions or alternative viewpoints. Knowledge/Skills: Solid understanding of Oracle Cloud Fusion, OICS, REST services, SQL, PL/SQL, Python, Java and VBCS. Finance business process knowledge of R2R, P2P, O2C and Oracle RMBCS. Data modelling; designed and implemented ETL interfaces/mappings to load data into a DWH as per business and reporting requirements. Agile, DevOps, SDLC industry-standard processes/methods. Certifications: Relevant Oracle Cloud Fusion certifications, such as Oracle Cloud Infrastructure (OCI) certifications or Oracle Fusion Applications certifications. Industry Expertise: Specific proven experience in areas such as finance, human capital management, supply chain, or enterprise resource planning. Skill in leading virtual teams, partnering across organisational boundaries, influencing development direction, and the ability to empower, grow and guide developers. Experience One full life-cycle implementation of OICS, REST services, Java, ADF, SOA, SQL, PL/SQL, Python, VBCS, Oracle Database ADW. Applications Seeded Customization Framework, OATS, PSR tools, Accessibility Testing tools preferred for Oracle Cloud Fusion ERP. Experience working within a Finance Technology function, preferably with a large Financial Services organisation. Experience in driving high availability, resiliency, and scalability of a global Oracle ERP Cloud Fusion applications landscape, delivering continuous improvements.
Experience in continuous delivery, deployment and monitoring of cloud-based services. Ability to work with multi-functional Directors and global leads Degree or equivalent experience in Computer Science, Software Engineering. LSEG is a leading global financial markets infrastructure and data provider. Our purpose is driving financial stability, empowering economies and enabling customers to create sustainable growth. Our purpose is the foundation on which our culture is built. Our values of Integrity, Partnership , Excellence and Change underpin our purpose and set the standard for everything we do, every day. They go to the heart of who we are and guide our decision making and everyday actions. Working with us means that you will be part of a dynamic organisation of 25,000 people across 65 countries. However, we will value your individuality and enable you to bring your true self to work so you can help enrich our diverse workforce. You will be part of a collaborative and creative culture where we encourage new ideas and are committed to sustainability across our global business. You will experience the critical role we have in helping to re-engineer the financial ecosystem to support and drive sustainable economic growth. Together, we are aiming to achieve this growth by accelerating the just transition to net zero, enabling growth of the green economy and creating inclusive economic opportunity. LSEG offers a range of tailored benefits and support, including healthcare, retirement planning, paid volunteering days and wellbeing initiatives. We are proud to be an equal opportunities employer. This means that we do not discriminate on the basis of anyone’s race, religion, colour, national origin, gender, sexual orientation, gender identity, gender expression, age, marital status, veteran status, pregnancy or disability, or any other basis protected under applicable law. 
In accordance with applicable law, we can reasonably accommodate applicants' and employees' religious practices and beliefs, as well as mental health or physical disability needs. Please take a moment to read this privacy notice carefully, as it describes what personal information London Stock Exchange Group (LSEG) (we) may hold about you, what it’s used for, how it’s obtained, your rights, and how to contact us as a data subject. If you are submitting as a Recruitment Agency Partner, it is essential and your responsibility to ensure that candidates applying to LSEG are aware of this privacy notice.
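The data mapping and transformation responsibility this role describes (mapping source fields to a target schema with a per-field transform) can be sketched in plain Python. The field names and transforms below are hypothetical illustrations, not taken from any LSEG or Oracle specification:

```python
# Hypothetical field mapping: target field -> (source field, transform function)
FIELD_MAP = {
    "invoice_id":   ("InvoiceNumber", str.strip),
    "amount_minor": ("GrossAmount",   lambda v: int(round(float(v) * 100))),
    "currency":     ("CurrencyCode",  str.upper),
}

def transform_record(source: dict) -> dict:
    """Apply the mapping rules to one source payload."""
    return {target: fn(source[src]) for target, (src, fn) in FIELD_MAP.items()}

record = {"InvoiceNumber": " INV-001 ", "GrossAmount": "12.50", "CurrencyCode": "gbp"}
print(transform_record(record))
# {'invoice_id': 'INV-001', 'amount_minor': 1250, 'currency': 'GBP'}
```

In OICS the same idea is expressed declaratively in the mapper rather than in code, but the rule set has the same shape: one target attribute, one source expression, one transform.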

Posted 1 month ago

Apply

13.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Director Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage. Why PWC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
" Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Responsibilities About the role: As a Director, you'll be taking the lead in designing and maintaining complex data ecosystems. Your experience will be instrumental in optimizing data processes, ensuring data quality, and driving data-driven decision-making within the organization. Responsibilities: Architecting and designing complex data systems and pipelines. Leading and mentoring junior data engineers and team members. Collaborating with cross-functional teams to define data requirements. Implementing advanced data quality checks and ensuring data integrity. Optimizing data processes for efficiency and scalability. Overseeing data security and compliance measures. Evaluating and recommending new technologies to enhance data infrastructure. Providing technical expertise and guidance for critical data projects. Required skills & experience: Proficiency in designing and building complex data pipelines and data processing systems. Leadership and mentorship capabilities to guide junior data engineers and foster skill development. Strong expertise in data modeling and database design for optimal performance. Skill in optimizing data processes and infrastructure for efficiency, scalability, and cost-effectiveness. Knowledge of data governance principles, ensuring data quality, security, and compliance. 
Familiarity with big data technologies like Hadoop, Spark, or NoSQL. Expertise in implementing robust data security measures and access controls. Effective communication and collaboration skills for cross-functional teamwork and defining data requirements. Skills: Cloud: Azure/GCP/AWS DE Technologies: ADF, BigQuery, AWS Glue etc. Data Lake: Snowflake, Databricks etc. Mandatory skill sets: Cloud: Azure/GCP/AWS DE Technologies: ADF, BigQuery, AWS Glue etc. Data Lake: Snowflake, Databricks etc. Preferred skill sets: Cloud: Azure/GCP/AWS DE Technologies: ADF, BigQuery, AWS Glue etc. Data Lake: Snowflake, Databricks etc. Years of experience required: 13+ years Education qualification: BE/BTech, ME/MTech, MBA, MCA Mandatory Skill Sets Technical Delivery Preferred Skill Sets Technical Delivery Years Of Experience Required 13 - 20 Education Qualification B.Tech / M.Tech / MBA / MCA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering, Master of Business Administration Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills Technical Delivery Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Coaching and Feedback, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion {+ 24 more} Desired Languages (If blank, desired languages not specified) Travel Requirements Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date
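The "advanced data quality checks" this role calls for usually reduce to a handful of assertions over each batch: required-field checks, key uniqueness, range checks. A minimal, hypothetical sketch in plain Python (column names are invented for illustration; in a pipeline these checks would run per batch before loading):

```python
def quality_report(rows, key="id", required=("id", "amount")):
    """Return counts of data-quality violations for a batch of records."""
    issues = {"missing_required": 0, "duplicate_keys": 0}
    seen = set()
    for row in rows:
        # Required-field check: any null in a mandatory column flags the row.
        if any(row.get(col) is None for col in required):
            issues["missing_required"] += 1
        # Uniqueness check on the business key.
        k = row.get(key)
        if k in seen:
            issues["duplicate_keys"] += 1
        seen.add(k)
    return issues

batch = [{"id": 1, "amount": 10.0}, {"id": 1, "amount": None}, {"id": 2, "amount": 5.0}]
print(quality_report(batch))  # {'missing_required': 1, 'duplicate_keys': 1}
```

Frameworks such as Spark or Great Expectations express the same rules at scale, but the contract is identical: a batch either passes the assertions or is quarantined.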

Posted 1 month ago

Apply

5.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Key Accountabilities JOB DESCRIPTION The Azure Data Support engineer focuses on data-related tasks in Azure. Manage, monitor, and ensure the security and privacy of data to satisfy business needs. Monitor real-time and batch processes to ensure data accuracy. Monitor Azure pipelines and troubleshoot where required. Enhance existing pipelines and Databricks notebooks as and when required. Be involved in development stages of new pipelines as and when required. Troubleshoot pipelines and real-time replication jobs, ensuring minimum data lag. Available to work on a shift basis to cover monitoring during weekends (one weekend out of three). Act as an ambassador for DP World at all times when working; promoting and demonstrating positive behaviours in harmony with DP World’s Principles, values and culture; ensuring the highest level of safety is applied in all activities; understanding and following DP World’s Code of Conduct and Ethics policies. Perform other related duties as assigned. JOB CONTEXT Responsible for monitoring and enhancing existing data pipelines using the Microsoft stack. Responsible for enhancement of existing data platforms. Experience with cloud platforms such as Azure, AWS, Google Cloud etc. Experience with Azure Data Lake, Azure Data Lake Analytics, Azure SQL Database, Azure Databricks and Azure SQL Data Warehouse. Good understanding of Spark architecture including Spark Core, Spark SQL, DataFrames, Spark Streaming, driver node, worker node, stages, executors and tasks. Good understanding of Big Data Hadoop and YARN architecture along with various Hadoop daemons such as Job Tracker, Task Tracker, Name Node, Data Node, Resource/Cluster Manager, and Kafka (distributed stream processing). Experience in database design and development with business intelligence using SQL Server 2014/2016, Integration Services (SSIS), DTS packages, SQL Server Analysis Services (SSAS), DAX, OLAP cubes, star schema and snowflake schema.
Monitoring of pipelines in ADF and experience with Azure SQL, Blob Storage, Azure SQL Data Warehouse. Experience in a support environment working with real-time data replication will be a plus. Qualification QUALIFICATIONS, EXPERIENCE AND SKILLS Bachelor’s/Master’s in Computer Science/IT or equivalent. Azure certifications will be an added advantage (certification in AZ-900 and/or AZ-204, AZ-303, AZ-304 or AZ-400, DP-200 & DP-201). ITIL certification a plus. Experience: 5 - 8 Years Must Have Skills Azure Data Lake, Data Factory, Azure Databricks Azure SQL Database, Azure SQL Data Warehouse Hadoop ecosystem Azure analytics services Programming: Python, R, Spark SQL Good To Have Skills MSBI (SSIS, SSAS, SSRS), Oracle, SQL, PL/SQL Data Visualization, Power BI Data Migration
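The pipeline monitoring and "minimum data lag" duty described in this posting often boils down to comparing each pipeline's last successful run against an allowed lag. A simplified, hypothetical sketch (in a real setup the watermarks would come from the ADF pipeline-run logs or its REST API, not a hard-coded dict):

```python
from datetime import datetime, timedelta

def lagging_pipelines(last_success: dict, now: datetime, max_lag: timedelta):
    """Return names of pipelines whose last successful run is older than max_lag."""
    return sorted(
        name for name, ts in last_success.items()
        if now - ts > max_lag
    )

now = datetime(2024, 1, 1, 12, 0)
watermarks = {
    "sales_ingest":  datetime(2024, 1, 1, 11, 50),  # 10 minutes ago - healthy
    "crm_replicate": datetime(2024, 1, 1, 9, 0),    # 3 hours ago - lagging
}
print(lagging_pipelines(watermarks, now, timedelta(hours=1)))  # ['crm_replicate']
```

The flagged list would then feed an alert channel; the support engineer's job is triaging why `crm_replicate` (a made-up name here) fell behind.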

Posted 1 month ago

Apply

8.0 - 13.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Databricks with Data Scientist experience (8 to 10 years) - India. 4 years of relevant work experience as a data scientist. Minimum 2 years of experience in Azure Cloud using Databricks services, PySpark, Natural Language API, MLflow. Experience in data transformation using PySpark and SQL. Skillset: Python, PySpark, Databricks, MLflow, ADF. Please share your CV to ranjitha@promantusinc.com Regards, Ranjitha 7619598141.

Posted 1 month ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm Greetings from SP Staffing Services Private Limited!! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested. Relevant Experience: 6 - 15 Yrs Location: Pan India Job Description: Primarily looking for a data engineer with expertise in processing data pipelines using Databricks Spark SQL on Hadoop distributions like AWS EMR, Databricks, Cloudera etc. Should be very proficient in doing large-scale data operations using Databricks and overall very comfortable using Python. Familiarity with AWS compute, storage and IAM concepts. Experience in working with S3 Data Lake as the storage tier. Any ETL background (Talend, AWS Glue etc.) is a plus but not required. Cloud warehouse experience (Snowflake etc.) is a huge plus. Carefully evaluates alternative risks and solutions before taking action. Optimizes the use of all available resources. Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices and procedures of the corporation, department and business unit. Skills: Hands-on experience on Databricks, Spark SQL, AWS Cloud platform, especially S3, EMR, Databricks, Cloudera etc.
Experience in shell scripting. Exceptionally strong analytical and problem-solving skills. Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses. Strong experience with relational databases and data access methods, especially SQL. Excellent collaboration and cross-functional leadership skills. Excellent communication skills, both written and verbal. Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment. Ability to leverage data assets to respond to complex questions that require timely answers. Working knowledge of migrating relational and dimensional databases to the AWS Cloud platform. Interested candidates can share your resume to sankarspstaffings@gmail.com with the below inline details. Over All Exp : Relevant Exp : Current CTC : Expected CTC : Notice Period :

Posted 1 month ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Warm Greetings from SP Staffing Services Private Limited!! We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested. Relevant Experience: 6 - 15 Yrs Location: Pan India Job Description: Candidate must be proficient in Databricks. Understands where to obtain information needed to make appropriate decisions. Demonstrates ability to break down a problem into manageable pieces and implement effective, timely solutions. Identifies the problem versus the symptoms. Manages problems that require involvement of others to solve. Reaches sound decisions quickly. Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices and procedures of the corporation, department and business unit. Roles & Responsibilities: Provides innovative and cost-effective solutions using Databricks. Optimizes the use of all available resources. Learns and adapts quickly to new technologies as per the business need. Develops a team of Operations Excellence, building tools and capabilities that the development teams leverage to maintain high levels of performance, scalability, security and availability. Skills: The candidate must have 7-10 yrs of experience in Databricks Delta Lake. Hands-on experience on Azure. Experience in Python scripting. Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses. Strong experience with relational databases and data access methods, especially SQL. Knowledge of Azure architecture and design. Interested candidates can share your resume to sankarspstaffings@gmail.com with the below inline details. Over All Exp : Relevant Exp : Current CTC : Expected CTC : Notice Period :
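The core Delta Lake operation in pipelines like the one this role describes is the keyed MERGE (upsert): rows matching on a key are updated, the rest are inserted. Its semantics can be illustrated with a plain-Python analogue; this is a conceptual sketch, not the Databricks API:

```python
def merge_upsert(target: dict, updates: list, key: str = "id") -> dict:
    """MERGE-style upsert: matched keys are overwritten, unmatched keys inserted."""
    merged = dict(target)  # leave the original 'table' untouched
    for row in updates:
        merged[row[key]] = row  # WHEN MATCHED -> update; WHEN NOT MATCHED -> insert
    return merged

table   = {1: {"id": 1, "qty": 5}, 2: {"id": 2, "qty": 3}}
changes = [{"id": 2, "qty": 9}, {"id": 3, "qty": 1}]
print(merge_upsert(table, changes))
# {1: {'id': 1, 'qty': 5}, 2: {'id': 2, 'qty': 9}, 3: {'id': 3, 'qty': 1}}
```

On Databricks the equivalent is `MERGE INTO target USING updates ON target.id = updates.id ...` over Delta tables, which additionally gives ACID guarantees and time travel that this in-memory analogue does not model.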

Posted 1 month ago

Apply

3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description Department Description: The Corporate Tools Engineering (CTE) team, part of Oracle’s SaaS Operations group, designs, develops, implements, maintains and upgrades the applications that Oracle requires to meet its constantly evolving business requirements and to support customers, and provides a showcase for Oracle's products. The team also develops and maintains applications used by the Product Development and Support organization and influences the direction and strategy for Oracle's products through collaborative and often groundbreaking involvement in implementation. Job Description: As a member of the CTE team, you will be involved in analyzing requirements working closely with the product management/business team, preparing technical designs, and performing development and unit testing to successfully implement key applications used by the entire Product Development and Customer Services organizations. From the technology standpoint you will work with Oracle’s world-class technology that includes the latest version of Oracle Database, SQL & PL/SQL, Middleware, and Application Development Frameworks involving Java/ADF, JavaScript, Web Services, JS frameworks etc. Required Skills: We are looking for outstanding individuals with a proven track record who are excellent teammates. Should be able to perform as a team member, must be a self-starter ready to work as an individual contributor, and should possess strong analytical and problem-solving capabilities. Handle your work independently and deliver quality solutions adhering to the defined timelines within the defined policies and procedures. The candidate will investigate, estimate, document, design and implement changes to new and existing software architectures.
Should have 3 to 10 years of work experience in design, development/customization, implementation and support of web based Database Applications , and should be proficient in UI Development , PL/SQL , and modern UI Frameworks . Knowledge of Oracle APEX is a plus. Strong skills and hands on experience in some of the areas like Data Modeling, Application Design, Oracle Database (SQL, PLSQL), Performance Tuning . Knowledge/Experience in Sophisticated Oracle Database Technologies like Oracle Streams & Advanced Queuing, Edition Based Redefinition will be a plus. Knowledge of Oracle Cloud Infrastructure and AI Models is much solicited. Knowledge of mobile application development (native / browser based) will be a plus as well. Knowledge of Chef, Jenkins, Ruby on rails, Puppet is added advantage Should be able to participate in technical designs, recommend solutions, follow best design and coding practices and also provide technical assistance to the team. Willingness to learn new things and ready to work on some highly transparent and critical applications. Good interpersonal skills, both written and verbal are required. Candidate should have outstanding analytical, debugging and fixing skills. Career Level - IC3 Responsibilities As a member of the software engineering division, you will assist in defining and developing software for tasks associated with the developing, debugging or designing of software applications or operating systems. Provide technical leadership to other software developers. Specify, design and implement modest changes to existing software architecture to meet changing needs. About Us As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. 
That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description Analyze, design, develop, troubleshoot, and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications. As a member of the software engineering division, you will perform high-level design based on provided external specifications; specify, design, and implement minor changes to existing software architecture; build highly complex enhancements; and resolve complex bugs. Build and execute unit tests and unit plans. Review integration and regression test plans created by QA. Communicate with QA and porting engineering as necessary to discuss minor changes to product functionality and to ensure quality and consistency across specific products. Duties and tasks are varied and complex, requiring independent judgment. Fully competent in own area of expertise; may have a project lead role and/or supervise lower-level personnel. BS or MS degree or equivalent experience relevant to the functional area, plus 4 years of software engineering or related experience. Career Level - IC3 Responsibilities Our Procurement Cloud is a key offering in the Oracle Applications Cloud suite. Procurement Cloud is a fast-growing division within Oracle Cloud Applications with a variety of customers, from a leading fast-food chain to the world's largest furniture maker. Procurement Cloud Development works on a range of sophisticated areas, from a complex search engine to time-critical auctions/bidding to core business functionality such as bulk order processing, to name a few. As a member of our team, you will use the latest technologies, including JDeveloper, ADF, Oracle 12c Database, Oracle SQL, BPEL, Oracle Text, BC4J, web services, and service-oriented architectures (SOA). In addition to gaining this technical experience, you will also be exposed to the business side of the industry.
Developers are involved in the entire development cycle, so you will have the chance to take part in activities such as working with the product management team to define the product's functionality and interacting with customers to resolve issues. Are you looking to be technically challenged and gain business experience? Do you want to be part of a team of upbeat, hard-working developers who know how to work and have fun at the same time? Look no further: join us and be the newest member of the Fusion Procurement Development team! Skills/languages: 1-8 years of experience building Java-based applications. Good programming skills and excellent analytical/logical skills. Able to craft a feature end to end. Can think out of the box, has practical knowledge of the given technologies, and can apply logic to tackle a technical problem even without prior background in it. Should be persistent. Experience in BPEL, Workflow System, ADF, REST implementation, AI/ML, or Scrum processes is a plus. Required: Java, OOP concepts, JavaScript/VBCS/JET. Optional: JDBC, XML, SQL, PL/SQL, Unix/Linux, REST, ADF, AI/ML, Scrum. Analyze, design, develop, fix, and debug software programs for commercial or end-user applications; write code, complete programming, and perform testing and debugging of applications.
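Oracle SQL and analytic queries figure prominently in the stack this listing describes. As a rough sketch of the kind of windowing query such roles exercise, here is a running-total example using Python's built-in sqlite3 module as a portable stand-in; the sales table and figures are invented for illustration, and the same SQL pattern runs on Oracle Database and most warehouse engines:

```python
import sqlite3

# Illustrative only: table and figures are invented. SQLite (bundled with
# Python) stands in here so the example runs anywhere; the windowing SQL
# itself is standard and works on Oracle and most warehouse engines.
con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE sales (region TEXT, month TEXT, amount INT)")
con.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("east", "2024-01", 100), ("east", "2024-02", 150),
     ("west", "2024-01", 80), ("west", "2024-02", 60)],
)
# Running total per region, ordered by month: a classic analytic query.
rows = con.execute("""
    SELECT region, month, amount,
           SUM(amount) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
    ORDER BY region, month
""").fetchall()
```

The `PARTITION BY` clause restarts the accumulation for each region, while `ORDER BY` inside the window makes `SUM` cumulative rather than a plain group total.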

Posted 1 month ago

Apply

3.0 years

0 Lacs

Andhra Pradesh, India

On-site

Skills And Experience Degree in Data Analytics/Computer Science or equivalent work experience. 3+ years' experience as an Azure Data Engineer or in a similar data engineering role using Microsoft Azure (Data Factory, Azure Functions, Databricks, etc.). Practical experience with Microsoft Fabric (ideally DP-700 certified). Proven experience in building and optimising big data pipelines and architectures. Strong analytical skills, especially when working with unstructured data. Proficient in designing and implementing well-written code. Experience with data pipeline testing, from source to presentation layer, including troubleshooting skills. Key Responsibilities Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure. Collaborate with data scientists and analysts to understand data needs and create effective data workflows. Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage. Using Azure Data Factory or comparable technologies, create and maintain ETL (Extract, Transform, Load) operations. Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data. Improve the scalability, efficiency, and cost-effectiveness of data pipelines. Monitor and resolve data pipeline problems to guarantee data consistency and availability.
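The data validation and cleansing responsibility above can be illustrated with a minimal sketch in plain Python. The record fields and rules here are hypothetical, not from the posting; in a real Azure pipeline this logic would more likely live in a Databricks notebook or an Azure Function invoked from Data Factory:

```python
from datetime import datetime
from typing import Optional

def clean_record(raw: dict) -> Optional[dict]:
    """Return a cleansed record, or None if the row fails validation."""
    # Reject rows missing mandatory keys (hypothetical schema).
    if not raw.get("order_id") or not raw.get("order_date"):
        return None
    # Normalise whitespace and casing on a free-text field.
    customer = (raw.get("customer") or "").strip().title()
    # Parse and validate the date; drop rows with unparseable dates.
    try:
        order_date = datetime.strptime(raw["order_date"], "%Y-%m-%d").date()
    except ValueError:
        return None
    return {
        "order_id": str(raw["order_id"]).strip(),
        "order_date": order_date.isoformat(),
        "customer": customer or "UNKNOWN",
    }

rows = [
    {"order_id": " A-1 ", "order_date": "2024-05-01", "customer": "  acme ltd "},
    {"order_id": "A-2", "order_date": "not-a-date", "customer": "Beta"},
    {"order_id": None, "order_date": "2024-05-02"},
]
cleaned = [r for r in (clean_record(x) for x in rows) if r is not None]
```

The same shape of rule (reject, normalise, coerce) scales up naturally to PySpark column expressions when the data volume outgrows single-process Python.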

Posted 1 month ago

Apply

3.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description Key Responsibilities: Develop And Maintain Web Applications Build dynamic, user-centric web applications using React, React Hooks, and modern web technologies like HTML5 and CSS3. Ensure that the application is scalable, maintainable, and easy to navigate for end users. Collaborate With Cross-Functional Teams Work closely with designers and product teams to bring UI/UX designs to life, ensuring the design vision is executed effectively using HTML and CSS. Ensure the application is responsive, performing optimally across all devices and browsers. State Management Utilize Redux to manage and streamline complex application state, ensuring seamless data flow and smooth user interactions. Component Development Develop reusable, modular, and maintainable React components using React Hooks and CSS/SCSS to style components effectively. Build component libraries and implement best practices to ensure code maintainability and reusability. Role Proficiency This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions. Outcomes Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions. Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery.
Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency. Validate results with user representatives, integrating the overall solution seamlessly. Develop and manage data storage solutions including relational databases, NoSQL databases, and data lakes. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools. Influence and improve customer satisfaction through effective data solutions. Measures Of Outcomes Adherence to engineering processes and standards. Adherence to schedule/timelines. Adherence to SLAs where applicable. Number of defects post delivery. Number of non-compliance issues. Reduction in recurrence of known defects. Quick turnaround of production bugs. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements. Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times). Average time to detect, respond to, and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches. Outputs Expected Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers. Documentation Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results. Configuration Define and govern the configuration management plan. Ensure compliance within the team. Testing Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team.
Provide clarifications and support to the testing team as needed. Domain Relevance Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise. Project Management Manage the delivery of modules effectively. Defect Management Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality. Estimation Create and provide input for effort and size estimation for projects. Knowledge Management Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team. Release Management Execute and monitor the release process to ensure smooth transitions. Design Contribution Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models. Customer Interface Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations. Team Management Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives. Certifications Obtain relevant domain and technology certifications to stay competitive and informed. Skill Examples Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications.
Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components. Knowledge Examples Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, and Azure ADF and ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques. Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering. Additional Comments "Frontend developer Required Skills and Experience: React.js Proficiency: - In-depth knowledge of React.js, JSX, React Hooks, and React Router. - Experience with state management using Redux or similar libraries. - Familiar with React performance optimization techniques, including lazy loading, memoization, and code splitting. - Experience with tools like react-testing-library and NPM packages (Vite, Yup, Formik). CSS Expertise: - Strong proficiency in CSS, including the use of third-party frameworks like Material-UI (MUI) and Tailwind CSS for styling. - Ability to create responsive, visually appealing layouts with modern CSS practices. JavaScript/ES6+ Expertise: - Strong command of modern JavaScript (ES6+), including async/await, destructuring, and class-based components. - Familiarity with other JavaScript frameworks and libraries such as TypeScript is a bonus.
Version Control: - Proficient with Git and platforms like GitHub or GitLab, including managing pull requests and version control workflows. API Integration: - Experienced in interacting with RESTful APIs. - Understanding of authentication mechanisms like JWT. Testing: - Able to write unit, integration, and end-to-end tests using tools such as react-testing-library. ------------------------------------------------------------------------------------------------------------------- Basic Qualifications: - At least 3 years of experience working with JavaScript frameworks, particularly React.js. - Experience working in cloud environments (AWS, Azure, Google Cloud) is a plus. - Basic understanding of backend technologies such as Python is advantageous." Skills: Cloud Services, Backend Systems, CSS
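The JWT authentication mechanism mentioned in the requirements above can be illustrated (in Python for brevity, though the role itself is JavaScript-focused) by decoding a token's payload with only the standard library. The token here is hand-built and unsigned; production code must verify the signature with a proper library such as PyJWT before trusting any claim:

```python
import base64
import json

def b64url_decode(segment: str) -> bytes:
    # JWT segments use unpadded base64url; restore padding before decoding.
    padding = "=" * (-len(segment) % 4)
    return base64.urlsafe_b64decode(segment + padding)

def jwt_claims(token: str) -> dict:
    """Return the payload claims of a JWT WITHOUT signature verification."""
    header_b64, payload_b64, _signature = token.split(".")
    return json.loads(b64url_decode(payload_b64))

def b64url_encode(obj: dict) -> str:
    # Helper to build a toy token so the example is self-contained.
    raw = json.dumps(obj, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

# Hypothetical, unsigned token: header.payload.signature
token = ".".join([
    b64url_encode({"alg": "HS256", "typ": "JWT"}),
    b64url_encode({"sub": "user-42", "role": "frontend-dev"}),
    "fake-signature",
])
claims = jwt_claims(token)
```

A frontend client typically stores such a token after login and sends it as an `Authorization: Bearer <token>` header on each API request; the decoding step above is what libraries do to read claims like expiry on the client side.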

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies