
1544 ADF Jobs - Page 36

JobPe aggregates results for easy access, but applications are submitted directly on the original job portal.

6.0 - 7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Designation: Sr. Consultant
Experience: 6 to 7 years
Location: Bengaluru
Skills required: Python, SQL, Databricks (including DLT, PySpark, Structured Streaming), ADF, performance and cost optimization.

Roles and Responsibilities:
- Capture business problems, value drivers, and functional/non-functional requirements, and translate them into functionality.
- Assess the risks, feasibility, opportunities, and business impact.
- Assess and model processes, data flows, and technology to understand current value and issues, and identify opportunities for improvement.
- Create and update clear documentation of requirements to align with the solution over the project lifecycle.
- Ensure traceability of requirements from business needs through testing and scope changes to the final solution.
- Interact with software suppliers, designers, and developers to understand software limitations, deliver elements of system and database design, and ensure that business requirements and use cases are handled.
- Configure and document software and processes, using agreed standards and tools.
- Create acceptance criteria and validate that solutions meet business needs by defining and coordinating testing.
- Create and present compelling business cases to justify solution value and establish approval, funding, and prioritization.
- Initiate, plan, execute, monitor, and control business analysis activities on projects within agreed parameters of cost, time, and quality.
- Lead stakeholder management activities and large design sessions.
- Lead teams to complete business analysis on projects.
- Work in Agile: understand Agile frameworks and tools; prior Agile project experience.

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Gurgaon/Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable, enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained competitive advantage. Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics (IDA) team, is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model, disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward a greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team’s efforts toward creating, enhancing, and stabilizing the enterprise data lake through the development of data pipelines. It requires a team player who can work well with members of other disciplines to deliver data in an efficient and strategic manner.

What You’ll Be DOING
What will your essential responsibilities include?
- Act as a data engineering expert and partner to Global Technology and data consumers in controlling the complexity and cost of the data platform, while enabling performance, governance, and maintainability of the estate.
- Understand current and future data consumption patterns and architecture (at a granular level), and partner with Architects to ensure optimal design of the data layers.
- Apply best practices in data architecture: for example, the balance between materialization and virtualization, the optimal level of de-normalization, caching and partitioning strategies, choice of storage and querying technology, and performance tuning.
- Lead and execute hands-on research into new technologies, formulating frameworks for assessing new technology against business benefit and implications for data consumers.
- Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in the data inventory and utilization of the Data Science Platform.
- Design prototypes and work in a fast-paced, iterative solution delivery model.
- Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with Delta tables; use Harness for the deployment pipeline.
- Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed.
- Diagnose system performance issues related to data processing and implement solutions to address them.
- Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.
- Maintain integrity and quality across all pipelines and environments.
- Understand and follow secure coding practices to ensure code is not vulnerable.
You will report to the Technical Lead.

What You Will BRING
We’re looking for someone who has these abilities and skills:

Required Skills And Abilities
- Effective communication skills.
- Bachelor’s degree in computer science, mathematics, statistics, finance, a related technical field, or equivalent work experience.
- Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying skills.
- Relevant years of programming experience using Databricks.
- Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS).
- Solid knowledge of network and firewall concepts.
- Solid experience writing, optimizing, and analyzing SQL.
- Relevant years of experience with Python.
- Ability to break down complex data requirements and architect solutions into achievable targets.
- Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile.
- Experience using Harness.
- Experience as a technical lead responsible for both individual and team deliveries.

Desired Skills And Abilities
- Experience in big data migration projects.
- Experience with performance tuning at both the database and big data platform level.
- Ability to interpret complex data requirements and architect solutions.
- Distinctive problem-solving and analytical skills combined with robust business acumen.
- Excellent grasp of the basics of Parquet and Delta files.
- Effective knowledge of the Azure cloud computing platform.
- Familiarity with reporting software; Power BI is a plus.
- Familiarity with DBT is a plus.
- Passion for data and experience working within a data-driven organization. You care about what you do, and what we do.

Who WE are
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business: property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What we OFFER
Inclusion
AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most inclusive workforce possible, and create a culture where everyone can bring their full selves to work and reach their highest potential.
It’s about helping one another — and our business — to move forward and succeed.
- Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion, with 20 Chapters around the globe.
- Robust support for Flexible Working Arrangements.
- Enhanced family-friendly leave benefits.
- Named to the Diversity Best Practices Index.
- Signatory to the UK Women in Finance Charter.
Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer.

Total Rewards
AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.

Sustainability
At AXA XL, sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars
Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems, the foundation of a sustainable planet and society, are essential to our future. We’re committed to protecting and restoring nature, from mangrove forests to the bees in our backyard, by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.
Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption.
We’re building a net-zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.
Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.
AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day, the Global Day of Giving.
For more information, please see axaxl.com/sustainability.

Posted 1 month ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Gurugram, Bengaluru

Hybrid

Warm greetings from SP Staffing!
Role: Azure Data Engineer
Experience required: 5 to 8 years
Work location: Bangalore/Gurgaon
Required skills: Azure Databricks, ADF, PySpark/SQL
Interested candidates can send resumes to nandhini.spstaffing@gmail.com

Posted 1 month ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Purpose
Lead client calls and guide clients toward optimized, cloud-native architectures, the future state of their data platform, strategic recommendations, and Microsoft Fabric integration.

Desired Skills And Experience
- B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science or a related field.
- 7+ years of experience in data and cloud architecture, working with client stakeholders.
- Azure data platform expertise: Synapse, Databricks, Azure Data Factory (ADF), Azure SQL (DW/DB), Power BI.
- Ability to define modernization roadmaps and target architecture.
- Strong understanding of data governance best practices for data quality, cataloguing, and lineage.
- Proven ability to lead client engagements and present complex findings.
- Excellent communication skills, both written and verbal.
- Extremely strong organizational and analytical skills with strong attention to detail.
- Strong track record of excellent results delivered to internal and external clients.
- Able to work independently without the need for close supervision, and collaboratively as part of cross-team efforts.
- Experience delivering projects in an agile environment.
- Experience in project management and team management.

Key responsibilities include:
- Lead all interviews and workshops to capture current and future needs.
- Direct the technical review of Azure infrastructure (Databricks, Synapse Analytics, Power BI) and critical on-premises systems.
- Produce architecture designs focusing on refined processing strategies and Microsoft Fabric.
- Understand and refine the data governance roadmap, including data cataloguing, lineage, and quality.
- Lead project deliverables, ensuring actionable and strategic outputs.
- Evaluate and ensure the quality of deliverables within project timelines.
- Develop a strong understanding of the equity market domain.
- Collaborate with domain experts and business stakeholders to understand business rules and logic.
- Ensure effective, efficient, and continuous communication, written and verbal, with global stakeholders.
- Independently troubleshoot difficult and complex issues in dev, test, UAT, and production environments.
- Take responsibility for end-to-end delivery of projects, coordinate between the client and internal offshore teams, and manage client queries.
- Demonstrate high attention to detail, work in a dynamic environment while maintaining high quality standards, and bring a natural aptitude for developing good internal working relationships and a flexible work ethic.
- Take responsibility for quality checks and adhere to the agreed Service Level Agreement (SLA) / Turn Around Time (TAT).

Posted 1 month ago

Apply

0 years

4 - 7 Lacs

Hyderābād

On-site

As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues.

The Fusion Supply Chain / Manufacturing Support Team is expanding to support our rapidly increasing customer base in the Cloud (SaaS), as well as growing numbers of on-premise accounts. The team partners with Oracle Development in supporting early adopters and many other new customers. This is a unique opportunity to be part of the future of Oracle Support and help shape the product and the organization to benefit our customers and our employees. This position is for supporting Fusion Applications, particularly the Fusion SCM modules: Fusion SCM Planning, Fusion SCM Manufacturing, and Fusion SCM Maintenance.
Responsibilities:
- Research, resolve, and respond to complex issues across the Application product lines and product boundaries in accordance with current standards.
- Demonstrate strong follow-through and consistently keep commitments to customers and employees.
- Ensure that every customer is handled with a consummately professional attitude and the highest possible level of service.
- Take ownership and responsibility for priority customer cases where and when required.
- Review urgent and critical incidents for quality; run queue reviews with analysts to ensure quality and efficiency of support.
- Report high-visibility cases, escalations, and customer trends to management; act as an information resource to the management team.
- Contribute to an environment that encourages information sharing, team-based resolution activity, cross-training, and an absolute focus on resolving customer cases as quickly and effectively as possible.
- Participate in projects that enhance the quality or efficiency of support, and in system and release testing as needed.
- Act as a role model and mentor for other analysts.
- Perform detailed technical analysis and troubleshooting using SQL, PL/SQL, Java, ADF, Redwood, VBCS, SOA, and REST APIs.
- Participate in after-hours support as required; work with Oracle Development/Support Development on product-related issues.
- Demonstrate core competencies: sound business judgment, creative and innovative problem solving, a strong work ethic, and doing whatever it takes to get the job done.

Business process and functional knowledge required for the Maintenance module:
- Asset Management: Oversee the entire lifecycle of physical assets to optimize utilization and visibility into maintenance operations. Track and manage enterprise-owned and customer-owned assets, including Install Base assets.
- Preventive Maintenance / Maintenance Programs: Define and generate daily preventive maintenance forecasts for affected assets within maintenance-enabled organizations. Use forecasts to create preventive maintenance work orders, reducing workload for planners and enhancing program auditing, optimization, and exception management.
- Work Definition: Identify and manage maintenance work areas based on physical, geographical, or logical groupings of work centers. Define and manage resources, work centers, and standard operations. Develop reusable operation templates (standard operations) detailing operation specifics and required resources, and apply them to multiple maintenance work definitions and work orders.
- Work Order Creation, Scheduling, and Dispatch: Track material usage and labour hours against planned activities. Manage component installation and removal. Conduct inspections and ensure seamless execution of work orders.
- Work Order Transactions: Apply knowledge of operation pull, assembly pull, and backflush concepts. Execute operation transactions to update dispatch status in count-point operations. Manage re-sequenced operations within work order processes. Charge maintenance work orders for utilized resources and ensure accurate transaction recording.

Technical skills required for the Maintenance module:
- SQL and PL/SQL
- REST APIs: creating them, the different methods, and testing via Postman
- Knowledge of the JSON format
- Knowledge of WSDL, XML, and SOAP web services
- Oracle SOA: composites, business events, debugging via SOA composite traces and logs
- Java and Oracle ADF
- Oracle Visual Builder Studio (good to have)
- Page Composer (Fusion Apps): customizing existing UI (good to have)
- Application Composer (Fusion Apps): sandboxes, creating custom objects and fields, dynamic page layouts, and object functions (good to have)

Career Level - IC3

As a Sr. Support Engineer, you will be the technical interface to customers for the resolution of problems related to the maintenance and use of Oracle products. You should have an understanding of all Oracle products in your competencies and in-depth knowledge of several products and/or platforms, be highly experienced in multiple platforms, and be able to complete assigned duties with minimal direction from management. In this position, you will routinely act independently while researching and developing solutions to customer issues.
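Since the listing asks for REST API and JSON skills, here is a small sketch of the kind of payload shaping and response parsing involved. The endpoint, field names, and response shape are invented for illustration; real Fusion SCM REST resources differ, and no network call is made here.

```python
# Illustrative sketch only: build a JSON body for a hypothetical
# maintenance work-order REST endpoint and parse a canned response.
import json

payload = {
    "WorkOrderType": "PREVENTIVE",
    "AssetNumber": "PUMP-001",  # hypothetical asset
    "Operations": [
        {"Sequence": 10, "StandardOperation": "INSPECT"},
        {"Sequence": 20, "StandardOperation": "LUBRICATE"},
    ],
}
body = json.dumps(payload)

# A client would POST `body` to the endpoint (e.g. via Postman or a
# HTTP library); here we just parse a canned response the way the
# calling code would.
canned_response = '{"WorkOrderId": 4711, "Status": "RELEASED"}'
result = json.loads(canned_response)
print(result["WorkOrderId"], result["Status"])
```

In practice the same round trip would be exercised interactively in Postman against the real resource before automating it.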

Posted 1 month ago

Apply

5.0 years

4 - 5 Lacs

Hyderābād

On-site

Job Description:
- At least 5 years of relevant hands-on development experience in an Azure Data Engineering role.
- Proficient in Azure technologies such as ADB (Azure Databricks), ADF, SQL (including the ability to write complex SQL queries), PySpark, Python, Synapse, Delta tables, and Unity Catalog.
- Hands-on with Python, PySpark, or Spark SQL.
- Hands-on with Azure Analytics and DevOps.
- Takes part in proofs of concept (POCs) and pilot solution preparation.
- Able to conduct data profiling, cataloguing, and mapping for technical design and construction of technical data flows.
- Experience in business process mapping of data and analytics solutions.

Recruitment fraud is a scheme in which fictitious job opportunities are offered to job seekers, typically through online services such as false websites, or through unsolicited emails claiming to be from the company. These emails may ask recipients to provide personal information or to make payments as part of the illegitimate recruiting process. DXC does not make offers of employment via social media networks, and DXC never asks for any money or payments from applicants at any point in the recruitment process, nor asks a job seeker to purchase IT or other equipment on our behalf. More information on employment scams is available here.
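As an example of the "complex SQL" this listing asks about, the sketch below runs a window-function query (a per-group running total) on an in-memory SQLite database. The schema and data are made up; the same SQL pattern carries over to Azure SQL and Databricks SQL.

```python
# Illustrative sketch only: a running total per region using a window
# function, on a throwaway in-memory SQLite database.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE sales (region TEXT, month TEXT, revenue INT);
    INSERT INTO sales VALUES
        ('north', '2024-01', 100), ('north', '2024-02', 120),
        ('south', '2024-01', 90),  ('south', '2024-02', 80);
""")

# Cumulative revenue per region, ordered by month -- a typical
# reporting query that goes beyond simple GROUP BY aggregation.
rows = con.execute("""
    SELECT region, month,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month)
               AS running_total
    FROM sales
    ORDER BY region, month
""").fetchall()
for r in rows:
    print(r)
```

Window functions require SQLite 3.25 or later, which ships with all recent Python builds.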

Posted 1 month ago

Apply

10.0 years

0 Lacs

India

On-site

Company Description
👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale, across all devices and digital mediums, and our people exist everywhere in the world (18,000+ experts across 38 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That's where you come in!

Job Description
REQUIREMENTS:
- Total experience: 10+ years.
- Strong experience in delivering data engineering projects with Python.
- Strong proficiency in Python for data analysis and scripting.
- Hands-on experience with Azure technologies (ADF, Synapse, etc.).
- Strong knowledge of ETL, data warehousing, and business intelligence.
- Proficient in designing and developing data integration workflows.
- Strong experience with Azure Synapse Analytics for data warehousing.
- Solid experience with Databricks for big data processing.
- Experience in managing complex and technical development projects in the areas of ETL, data warehousing, and BI.
- Excellent problem-solving skills, strong communication abilities, and a collaborative mindset.
- Relevant certifications in Azure or data engineering are a plus.

RESPONSIBILITIES:
- Understand the client's business use cases and technical requirements, and convert them into a technical design that elegantly meets the requirements.
- Map decisions to requirements and translate them to developers.
- Identify different solutions and narrow down the best option that meets the client's requirements.
- Define guidelines and benchmarks for NFR considerations during project implementation.
- Write and review design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
- Review architecture and design for extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensure that all relevant best practices are followed.
- Develop and design the overall solution for defined functional and non-functional requirements, and define the technologies, patterns, and frameworks to materialize it.
- Understand and relate technology integration scenarios and apply these learnings in projects.
- Resolve issues raised during code review through exhaustive, systematic analysis of the root cause, and be able to justify the decisions taken.
- Carry out POCs to make sure the suggested design/technologies meet the requirements.

Qualifications
Bachelor's or master's degree in computer science, information technology, or a related field.

Posted 1 month ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

We are looking for passionate engineers who design, develop, code, and customize software applications from product conception to the end-user interface. The person should be able to analyze and understand customer requirements and preferences, and incorporate these into the design and development process.

About You – Experience, Education, Skills, And Accomplishments
- Bachelor's degree or higher in a related field, such as Computer Engineering or Computer Science, plus at least 3 years of software development experience, or an equivalent combination of education and experience.
- At least 3 years' experience working with E-Business Suite, specifically with the Financials, Order Management, Service Contracts, Inventory, Accounts Receivable, and Advanced Pricing modules.
- At least 3 years' experience performance tuning in E-Business Suite.
- Experience developing custom components using OAF and ADF workflow, and developing solutions using Oracle APEX.
- Experience integrating data from Oracle EBS to Salesforce, working with AIM, and formulating strategies for implementation.
- Expert knowledge of Oracle Applications interfaces, tables, and APIs.
- Expertise in RICE (developing new Reports, Interfaces, Customizations, and Extensions, plus form personalization).

It would be great if you also have:
- Experience in web technologies like HTML, JavaScript, CSS, and jQuery.
- Proficiency in Java, with the ability to write clean, efficient, and maintainable code.
- Experience designing, developing, and maintaining Java applications.
- Sound knowledge of Object-Oriented Programming (OOP) concepts.
- (Optionally) Experience in AngularJS and Angular.

What will you be doing in this role?
- Write clean, efficient, and maintainable code in accordance with coding standards.
- Review others' code to ensure it is clean, efficient, and maintainable.
- Define the architecture of software solutions; suggest alternative methodologies or techniques for achieving desired results.
- Develop and maintain an understanding of the software development lifecycle and delivery methodology.
- Review and revise new procedures as needed for the continuing development of high-quality systems.
- Maintain knowledge of technical advances and evaluate new hardware/software for company use.
- Follow departmental policies, procedures, and work instructions.
- Work closely with higher-level engineers to increase functional knowledge.
- Automate tests and unit-test all assigned applications.
- Participate as a team member on various engineering projects.
- Write application technical documentation.

About The Team
The position is in the Finance team within the Enterprise Services organization, a dynamic and collaborative group focused on supporting the company's key finance applications, including order-to-cash functions, invoice delivery, cash collections, service contracts, third-party integrations, and the general ledger. This team ensures seamless and efficient financial processes, maintaining healthy cash flow and accurate financial reporting. The team is committed to continuous improvement, leveraging the latest technologies and best practices. Join a team that values collaboration, innovation, and excellence in supporting the company's financial operations and strategic goals.

At Clarivate, we are committed to providing equal employment opportunities for all qualified persons with respect to hiring, compensation, promotion, training, and other terms, conditions, and privileges of employment. We comply with applicable laws and regulations governing non-discrimination in all locations.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

About the Role
We are looking for a skilled and motivated Data Analyst with 2–5 years of experience to join our team. In this role, you will work closely with the product team to support strategic decision-making by delivering data-driven insights, dashboards, and performance reports. Your ability to transform raw data into actionable insights will directly impact how we build and improve our products.

Key Responsibilities
- Collaborate with the product team to understand data needs and define key performance indicators (KPIs).
- Develop and maintain insightful reports and dashboards using Power BI.
- Write efficient and optimized SQL queries to extract and manipulate data from multiple sources.
- Perform data analysis using Python and pandas for deeper trend analysis and data modeling.
- Present findings clearly through visualizations and written summaries to stakeholders.
- Ensure data quality and integrity across reporting pipelines.
- Contribute to ongoing improvements in data processes and tooling.

Required Skills & Experience
- 2–5 years of hands-on experience as a Data Analyst or in a similar role.
- Strong proficiency in SQL for querying and data manipulation.
- Experience building interactive dashboards with Power BI.
- Good command of Python, especially pandas, for data wrangling and analysis.
- Experience with Databricks or other big data tools.
- Understanding of the Medallion Architecture and its application in analytics pipelines.
- Strong communication and collaboration skills, especially in cross-functional team settings.

Good to Have
- Familiarity with data engineering practices, including data transformation using Databricks notebooks, Apache Spark SQL for distributed data processing, Azure Data Factory (ADF) for orchestration, and version control using Git.
- Exposure to product analytics, cohort analysis, or A/B testing methodologies.

Interested candidates, please share your resume with balaji.kumar@flyerssoft.com
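As a small illustration of the pandas trend analysis this role describes, the sketch below computes a weekly-active-users KPI from an event log. The data and column names are invented for the example.

```python
# Illustrative sketch only: weekly active users (distinct users per
# Monday-anchored week) from a made-up event log.
import pandas as pd

events = pd.DataFrame({
    "user_id": [1, 1, 2, 3, 2, 1],
    "ts": pd.to_datetime([
        "2024-03-04", "2024-03-05", "2024-03-06",
        "2024-03-11", "2024-03-12", "2024-03-13",
    ]),
})

# Bucket each event into the Monday that starts its week
# (dayofweek: Monday == 0), then count distinct users per bucket.
week_start = events["ts"] - pd.to_timedelta(events["ts"].dt.dayofweek, unit="D")
wau = events.groupby(week_start)["user_id"].nunique()
print(wau)
```

The resulting series is the kind of KPI that would then be surfaced in a Power BI dashboard or a written summary for stakeholders.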

Posted 1 month ago


6.0 years

0 Lacs

India

Remote

AI/ML Engineer – Senior Consultant

The AI Engineering Group is part of the Data Science & AI Competency Center and focuses on the technical and engineering aspects of DS/ML/AI solutions. We are looking for experienced AI/ML Engineers to join our team to help us bring AI/ML solutions into production, automate processes, and define reusable best practices and accelerators.

Duties:
The person we are looking for will become part of the Data Science and AI Competency Center, working in the AI Engineering team. The key duties are:
Building high-performing, scalable, enterprise-grade ML/AI applications in a cloud environment
Working with Data Science, Data Engineering and Cloud teams to implement Machine Learning models into production
Practical and innovative implementations of ML/AI automation, for scale and efficiency
Design, delivery and management of industrialized processing pipelines
Defining and implementing best practices in the ML model life cycle and ML operations
Implementing AI/MLOps frameworks and supporting Data Science teams in best practices
Gathering and applying knowledge of modern techniques, tools and frameworks in the area of ML architecture and operations
Gathering technical requirements and estimating planned work
Presenting solutions, concepts and results to internal and external clients
Being the technical leader on ML projects, defining tasks and guidelines and evaluating results
Creating technical documentation
Supporting and growing junior engineers

Must-have skills:
Good understanding of ML/AI concepts: types of algorithms, machine learning frameworks, model efficiency metrics, model life cycle, AI architectures
Good understanding of cloud concepts and architectures as well as working knowledge of selected cloud services, preferably GCP
Experience in programming ML algorithms and data processing pipelines using Python
At least 6-8 years of experience in production-ready code development
Experience in designing and implementing data pipelines
Practical experience with implementing ML solutions on GCP Vertex AI and/or Databricks
Good communication skills
Ability to work in a team and support others
Taking responsibility for tasks and deliverables
Great problem-solving skills and critical thinking
Fluency in written and spoken English

Nice-to-have skills & knowledge:
Practical experience with other programming languages: PySpark, Scala, R, Java
Practical experience with tools like Airflow, ADF or Kubeflow
Good understanding of CI/CD and DevOps concepts, and experience in working with selected tools (preferably GitHub Actions, GitLab or Azure DevOps)
Experience in applying and/or defining software engineering best practices
Experience productizing ML solutions using technologies like Docker/Kubernetes

We Offer:
Stable employment. On the market since 2008, 1300+ talents currently on board in 7 global sites.
100% remote. Flexibility regarding working hours. Full-time position.
Comprehensive online onboarding program with a “Buddy” from day 1.
Cooperation with top-tier engineers and experts.
Internal Gallup Certified Strengths Coach to support your growth.
Unlimited access to the Udemy learning platform from day 1.
Certificate training programs. Lingarians earn 500+ technology certificates yearly.
Upskilling support. Capability development programs, Competency Centers, knowledge sharing sessions, community webinars, 110+ training opportunities yearly.
Grow as we grow as a company. 76% of our managers are internal promotions.
A diverse, inclusive, and values-driven community.
Autonomy to choose the way you work. We trust your ideas.
Create our community together. Refer your friends to receive bonuses.
Activities to support your well-being and health.
Plenty of opportunities to donate to charities and support the environment.

Please click on this link to submit your application: https://system.erecruiter.pl/FormTemplates/RecruitmentForm.aspx?WebID=ac709bd295cc4008af7d0a7a0e465818
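The "industrialized processing pipelines" this role mentions typically reduce to small, composable, individually testable steps; here is a minimal plain-Python sketch of that idea (all step names and the record shape are invented for illustration, not taken from the posting):

```python
from functools import reduce
from typing import Callable

# A step takes a list of records and returns a transformed list.
Step = Callable[[list], list]

def make_pipeline(*steps: Step) -> Step:
    """Compose steps left to right so each stage can be tested in isolation."""
    return lambda records: reduce(lambda acc, step: step(acc), steps, records)

# Hypothetical stages: drop incomplete records, normalise a field, score it.
drop_incomplete = lambda rs: [r for r in rs if r.get("value") is not None]
normalise = lambda rs: [{**r, "value": r["value"] / 100.0} for r in rs]
score = lambda rs: [{**r, "high": r["value"] > 0.5} for r in rs]

pipeline = make_pipeline(drop_incomplete, normalise, score)
result = pipeline([{"value": 80}, {"value": None}, {"value": 20}])
print(result)  # [{'value': 0.8, 'high': True}, {'value': 0.2, 'high': False}]
```

Production frameworks such as Kubeflow or Vertex AI Pipelines formalize the same composition with orchestration, retries, and lineage on top.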

Posted 1 month ago


4.0 - 12.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Greetings from TCS! We are looking for Oracle EBS Technical.

Experience: 4 - 12 Years
Location: Kolkata
Must Have: Oracle EBS Technical

Responsibility of / Expectations from the Role:
Experience in full lifecycle software projects, including client/server and web applications, with responsibilities ranging from system analysis, design, development, and unit testing to documentation.
Other technical skills like SOA, ADF and UNIX shell scripting.
Extensive experience in writing packages, stored functions, stored procedures and triggers, and very strong PL/SQL skills.
Extensive work with Oracle APIs, Forms 6i/10g and Reports 6i/10g on Oracle Database 10g/11g, Discoverer, XML/BI Publisher, Workflows and Web ADI.
Good exposure to the Oracle AIM methodology, i.e. preparation of documents such as MD050, MD070, CV040, CV060, TE020 and MD120.
Ability to learn the domain knowledge related to the application in a short period of time.

Posted 1 month ago


5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Data Engineer – Databricks, Delta Live Tables, Data Pipelines
Location: Bhopal / Hyderabad / Pune (On-site)
Experience Required: 5+ Years
Employment Type: Full-Time

Job Summary:
We are seeking a skilled and experienced Data Engineer with a strong background in designing and building data pipelines using Databricks and Delta Live Tables. The ideal candidate should have hands-on experience in managing large-scale data engineering workloads and building scalable, reliable data solutions in cloud environments.

Key Responsibilities:
Design, develop, and manage scalable and efficient data pipelines using Databricks and Delta Live Tables.
Work with structured and unstructured data to enable analytics and reporting use cases.
Implement data ingestion, transformation, and cleansing processes.
Collaborate with Data Architects, Analysts, and Data Scientists to ensure data quality and integrity.
Monitor data pipelines and troubleshoot issues to ensure high availability and performance.
Optimize queries and data flows to reduce costs and increase efficiency.
Ensure best practices in data security, governance, and compliance.
Document architecture, processes, and standards.

Required Skills:
Minimum 5 years of hands-on experience in data engineering.
Proficient in Apache Spark, Databricks, Delta Lake, and Delta Live Tables.
Strong programming skills in Python or Scala.
Experience with cloud platforms such as Azure, AWS, or GCP.
Proficient in SQL for data manipulation and analysis.
Experience with ETL/ELT pipelines, data wrangling, and workflow orchestration tools (e.g., Airflow, ADF).
Understanding of data warehousing, big data ecosystems, and data modeling concepts.
Familiarity with CI/CD processes in a data engineering context.

Nice to Have:
Experience with real-time data processing using tools like Kafka or Kinesis.
Familiarity with machine learning model deployment in data pipelines.
Experience working in an Agile environment.
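Delta Live Tables itself only runs inside Databricks, but the layered bronze→silver→gold (medallion) pattern it implements can be sketched in plain Python; the table contents, the cleansing rule, and the aggregate below are illustrative assumptions, not a real DLT pipeline:

```python
# Bronze: raw ingested records, kept as-is (including bad rows).
bronze = [
    {"order_id": "1", "amount": "120.5", "country": "IN"},
    {"order_id": "2", "amount": "bad",   "country": "IN"},
    {"order_id": "3", "amount": "79.5",  "country": "US"},
]

def to_silver(rows):
    """Silver: cleansed, typed records; rows failing the check are dropped.
    In DLT this drop would be a declarative @dlt.expect_or_drop expectation."""
    out = []
    for r in rows:
        try:
            out.append({**r, "amount": float(r["amount"])})
        except ValueError:
            pass
    return out

def to_gold(rows):
    """Gold: business-level aggregate (revenue per country), ready for reporting."""
    totals = {}
    for r in rows:
        totals[r["country"]] = totals.get(r["country"], 0.0) + r["amount"]
    return totals

silver = to_silver(bronze)
gold = to_gold(silver)
print(gold)  # {'IN': 120.5, 'US': 79.5}
```

In an actual DLT pipeline each function becomes a declared table, and the framework handles ordering, incremental refresh, and data-quality metrics.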

Posted 1 month ago


3.0 years

0 Lacs

India

Remote

Title: Azure Data Engineer
Location: Remote
Employment type: Full Time with BayOne

We’re looking for a skilled and motivated Data Engineer to join our growing team and help us build scalable data pipelines, optimize data platforms, and enable real-time analytics.

What You'll Do:
Design, develop, and maintain robust data pipelines using tools like Databricks, PySpark, SQL, Fabric, and Azure Data Factory
Collaborate with data scientists, analysts, and business teams to ensure data is accessible, clean, and actionable
Work on modern data lakehouse architectures and contribute to data governance and quality frameworks

Tech Stack: Azure | Databricks | PySpark | SQL

What We’re Looking For:
3+ years of experience in data engineering or analytics engineering
Hands-on experience with cloud data platforms and large-scale data processing
Strong problem-solving mindset and a passion for clean, efficient data design

Job Description:
Minimum 3 years of experience in modern data engineering/data warehousing/data lake technologies on cloud platforms like Azure, AWS, GCP, Databricks, etc. Azure experience is preferred over other cloud platforms.
5 years of proven experience with SQL, schema design and dimensional data modelling
Solid knowledge of data warehouse best practices, development standards and methodologies
Experience with ETL/ELT tools like ADF, Informatica, Talend, etc., and data warehousing technologies like Azure Synapse, Microsoft Fabric, Azure SQL, Amazon Redshift, Snowflake, Google BigQuery, etc.
Strong experience with big data tools (Databricks, Spark, etc.) and programming skills in PySpark and Spark SQL
An independent self-learner with a “let’s get this done” approach and the ability to work in a fast-paced, dynamic environment
Excellent communication and teamwork abilities

Nice-to-Have Skills:
Knowledge of Event Hub, IoT Hub, Azure Stream Analytics, Azure Analysis Services, Cosmos DB
SAP ECC/S/4 and HANA knowledge
Intermediate knowledge of Power BI
Azure DevOps and CI/CD deployments, cloud migration methodologies and processes

BayOne is an Equal Opportunity Employer and does not discriminate against any employee or applicant for employment because of race, color, sex, age, religion, sexual orientation, gender identity, status as a veteran, or basis of disability or any federal, state, or local protected class. This job posting represents the general duties and requirements necessary to perform this position and is not an exhaustive statement of all responsibilities, duties, and skills required. Management reserves the right to revise or alter this job description.

Posted 1 month ago


8.0 years

0 Lacs

Greater Kolkata Area

On-site

Location: PAN India
Duration: 6 Months
Experience Required: 7–8 years

Job Summary
We are looking for an experienced SSAS Developer with strong expertise in developing both OLAP and Tabular models using SQL Server Analysis Services (SSAS), alongside advanced ETL development skills using tools like SSIS, Informatica, or Azure Data Factory. The ideal candidate will be well-versed in T-SQL, dimensional modeling, and building high-performance, scalable data solutions.

Key Responsibilities
Design, build, and maintain SSAS OLAP cubes and Tabular models
Create complex DAX and MDX queries for analytical use cases
Develop robust ETL workflows and pipelines using SSIS, Informatica, or ADF
Collaborate with cross-functional teams to translate business requirements into BI solutions
Optimize SSAS models for scalability and performance
Implement best practices in data modeling, version control, and deployment automation
Support dashboarding and reporting needs via Power BI, Excel, or Tableau
Maintain and troubleshoot data quality, performance, and integration issues

Must-Have Skills
Hands-on experience with SSAS (Tabular & Multidimensional)
Proficient in DAX, MDX, and T-SQL
Advanced ETL skills using SSIS / Informatica / Azure Data Factory
Knowledge of dimensional modeling (star & snowflake schema)
Experience with Azure SQL / MS SQL Server
Familiarity with Git and CI/CD pipelines

Nice to Have
Exposure to cloud data platforms (Azure Synapse, Snowflake, AWS Redshift)
Working knowledge of Power BI or similar BI tools
Understanding of Agile/Scrum methodology
Bachelor's degree in Computer Science, Information Systems, or equivalent

Posted 1 month ago


40.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
As a member of the Support organization, your focus is to deliver post-sales support and solutions to the Oracle customer base while serving as an advocate for customer needs. This involves resolving post-sales non-technical customer inquiries via phone and electronic means, as well as technical questions regarding the use of and troubleshooting for our Electronic Support Services. As a primary point of contact for customers, you are responsible for facilitating customer relationships with Support and providing advice and assistance to internal Oracle employees on diverse customer situations and escalated issues.

The Fusion Supply Chain / Manufacturing Support Team is expanding to support our rapidly increasing customer base in the Cloud (SaaS), as well as growing numbers of on-premise accounts. The team partners with Oracle Development in supporting early adopters and many other new customers. This is a unique opportunity to be part of the future of Oracle Support and help shape the product and the organization to benefit our customers and our employees. This position is for supporting Fusion Applications, particularly the Fusion SCM modules: Fusion SCM Planning, Fusion SCM Manufacturing and Fusion SCM Maintenance.

Career Level - IC3

Responsibilities
As a Sr. Support Engineer, you will be the technical interface to customers for resolution of problems related to the maintenance and use of Oracle products. You should have an understanding of all Oracle products in your competencies and in-depth knowledge of several products and/or platforms, be highly experienced in multiple platforms, and be able to complete assigned duties with minimal direction from management. In this position, you will routinely act independently while researching and developing solutions to customer issues.

Research, resolve and respond to complex issues across the Application product lines and product boundaries in accordance with current standards
Demonstrate strong follow-through and consistently keep commitments to customers and employees
Ensure that each and every customer is handled with a consummately professional attitude and the highest possible level of service
Take ownership and responsibility for priority customer cases where and when required
Review urgent and critical incidents for quality; queue reviews with analysts to ensure quality and efficiency of support
Report high-visibility cases, escalations and customer trends to management
Act as an information resource to the management team
Contribute to an environment that encourages information sharing, team-based resolution activity, cross-training and an absolute focus on resolving customer cases as quickly and effectively as possible
Participate in projects that enhance the quality or efficiency of support
Participate in system and release testing, as needed
Act as a role model and mentor for other analysts
Perform detailed technical analysis and troubleshooting using SQL, PL/SQL, Java, ADF, Redwood, VBCS, SOA and REST API
Participate in after-hours support as required
Work with Oracle Development/Support Development on product-related issues
Demonstrate core competencies (employ sound business judgment, creative and innovative ways to solve problems, a strong work ethic, and do whatever it takes to get the job done)

Business process and functional knowledge required for the Maintenance Module:
Asset Management: Oversee the entire lifecycle of physical assets to optimize utilization and visibility into maintenance operations. Track and manage enterprise-owned and customer-owned assets, including Install Base assets.
Preventive Maintenance/Maintenance Program: Define and generate daily preventive maintenance forecasts for affected assets within maintenance-enabled organizations. Utilize forecasts to create preventive maintenance work orders, reducing workload for planners and enhancing program auditing, optimization, and exception management.
Work Definition: Identify and manage Maintenance Work Areas based on physical, geographical, or logical groupings of work centers. Define and manage resources, work centers, and standard operations. Develop reusable operation templates (standard operations) detailing operation specifics and required resources. Apply standard operations to multiple maintenance work definitions and work orders.
Work Order Creation, Scheduling and Dispatch: Track material usage and labour hours against planned activities. Manage component installation and removal. Conduct inspections and ensure seamless execution of work orders.
Work Order Transactions: Apply knowledge of operation pull, assembly pull, and backflush concepts. Execute operation transactions to update dispatch status in count point operations. Manage re-sequenced operations within work order processes. Charge maintenance work orders for utilized resources and ensure accurate transaction recording.

Technical skills required for the Maintenance Module:
SQL and PL/SQL
REST API: creating, different methods and testing via Postman
Knowledge of JSON format
Knowledge of WSDL, XML and SOAP web services
Oracle SOA: composites, business events, debugging via SOA composite trace and logs
Java and Oracle ADF
Oracle Visual Builder Studio (good to have)
Page Composer (Fusion Apps): customize existing UI (good to have)
Application Composer (Fusion Apps): sandbox, creating custom objects and fields, dynamic page layouts and object functions (good to have)

Qualifications
Career Level - IC3

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.

Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs.

We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States.

Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
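The REST API and JSON skills the Oracle posting lists largely come down to building, sending, and parsing payloads; a minimal sketch using only Python's json module, where the work-order field names, status value, and the echoed response are hypothetical illustrations rather than Oracle's actual Fusion Maintenance API:

```python
import json

# Hypothetical work-order payload, like one POSTed to a maintenance REST endpoint.
payload = {
    "WorkOrderNumber": "WO-1001",
    "AssetNumber": "PUMP-07",
    "Status": "RELEASED",
    "Operations": [{"Sequence": 10, "WorkCenter": "MECH"}],
}
body = json.dumps(payload)  # serialized request body a client would send

# A real client would transmit `body` with urllib/requests and decode the
# server's reply the same way; here the round-trip stands in for that reply.
reply = json.loads(body)
print(reply["WorkOrderNumber"])  # WO-1001
```

Tools like Postman, which the posting names, exercise exactly this request/response cycle interactively before it is automated in code.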

Posted 1 month ago


20.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

We are seeking an experienced Enterprise Architect with expertise in SAP ECC and SuccessFactors to lead the development and maintenance of our enterprise architecture strategy. This strategic role involves collaborating with stakeholders, aligning technology with business needs, and ensuring scalable, secure, and efficient enterprise-level implementations.

About RWS Technology Services – India
RWS Technology Services provide end-to-end business technology solutions. Our team of experts provides a wide portfolio of services around digital technologies and technology operations to help organizations stay ahead of the curve, lower their total cost of ownership, and improve efficiencies.

How we help: RWS Technology Services offer state-of-the-art technology solutions across the product lifecycle management process, all the way from consulting, concept, design, and development to maintenance and optimization. We specialize in helping companies excel in the global, fast-paced technology landscape by supporting them in every aspect of customer interaction: globalization, digitization, customer experience management, business process automation, and technology infrastructure modernization.

Why choose RWS?
Innovative: RWS understands the needs of our customers to use the best talent, latest technologies, and solutions to help create connected customer experiences. We help our clients differentiate themselves by making their product engineering capabilities more data driven, powered by AI, and supported by cloud services and intelligent edge devices.
Tailored: RWS Technology Services has been delivering technology services and solutions to start-ups, mid-sized and Fortune 500 corporations for over 20 years now. Our technology experience across all key industries ensures tailored application development to meet the unique business needs of our clients. Our group is led by dedicated on-shore and off-shore project management teams of highly experienced professionals specializing in both agile and waterfall methodologies. We understand complex technology deployments and have a proven record of managing business-critical, time-sensitive, and highly secure deployments that scale with your business growth.

Key Responsibilities
Define and maintain the enterprise architecture strategy and roadmap.
Collaborate with stakeholders to translate business requirements into scalable technical solutions.
Ensure alignment with industry standards, IT best practices, and security frameworks.
Design and implement secure, scalable, and high-performing enterprise solutions.
Evaluate emerging technologies and recommend adoption where beneficial.
Establish and enforce technical standards, policies, and best practices.
Provide architectural guidance to development teams for optimal solution design.
Ensure solutions align with business continuity and disaster recovery plans.

Skills & Experience
RWS is looking for candidates with 15+ years of relevant experience who can join us on a part-time, freelance, or contract basis.
Bachelor’s degree in Computer Science, Information Technology, or a related field.
15+ years of experience in technology architecture, including 5+ years in an enterprise architect role.
Strong expertise in SAP ECC and SuccessFactors architecture, data models, and integrations.
Familiarity with Azure, ADF or AppFabric for data integration.
Experience with Power BI for data visualization.
Proficiency in cloud computing, microservices architecture, and containerization.
Experience with enterprise integration technologies such as ESBs and API gateways.
Strong understanding of IT security and experience designing secure solutions.
Experience in agile environments and DevOps methodologies.
Excellent communication, stakeholder management, and problem-solving skills.
Ability to work effectively in cross-functional, fast-paced environments.

Life at RWS
RWS is a content solutions company, powered by technology and human expertise. We grow the value of ideas, data and content by making sure organizations are understood. Everywhere. Our proprietary technology, 45+ AI patents and human experts help organizations bring ideas to market faster, build deeper relationships across borders and cultures, and enter new markets with confidence, growing their business and connecting them to a world of opportunities. It’s why over 80 of the world’s top 100 brands trust RWS to drive innovation, inform decisions and shape brand experiences. With 60+ global locations across five continents, our teams work with businesses across almost all industries. Innovating since 1958, RWS is headquartered in the UK and publicly listed on AIM, the London Stock Exchange regulated market (RWS.L).

RWS Values
We Partner, We Pioneer, We Progress – and we’ll Deliver together. For further information, please visit: RWS

RWS embraces DEI and promotes equal opportunity. We are an Equal Opportunity Employer and prohibit discrimination and harassment of any kind. RWS is committed to the principle of equal employment opportunity for all employees and to providing employees with a work environment free of discrimination and harassment. All employment decisions at RWS are based on business needs, job requirements and individual qualifications, without regard to race, religion, nationality, ethnicity, sex, age, disability, or sexual orientation. RWS will not tolerate discrimination based on any of these characteristics.

Recruitment Agencies: RWS Holdings PLC does not accept agency resumes. Please do not forward any unsolicited resumes to any RWS employees. Any unsolicited resume received will be treated as the property of RWS, and the Terms & Conditions associated with the use of such resume will be considered null and void.

RWS. Smarter content starts here.
www.rws.com

Posted 1 month ago


10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Description

Key Responsibilities
Develop and maintain supply chain analytics to monitor operational performance and trends.
Lead and participate in Six Sigma and supply chain improvement initiatives.
Ensure data integrity and consistency across all analytics and reporting platforms.
Design and implement reporting solutions for key supply chain KPIs.
Analyze KPIs to identify improvement opportunities and develop actionable insights.
Build and maintain repeatable, scalable analytics using business systems and BI tools.
Conduct scenario modeling and internal/external benchmarking.
Provide financial analysis to support supply chain decisions.
Collaborate with global stakeholders to understand requirements and deliver impactful solutions.

Qualifications
Bachelor’s degree in Engineering, Computer Science, Supply Chain, or a related field.
Relevant certifications in BI tools, Agile methodologies, or cloud platforms are a plus.
This position may require licensing for compliance with export controls or sanctions regulations.

Experience
8–10 years of total experience, with at least 6 years in a relevant analytics or supply chain role.
Proven experience in leading small teams and managing cross-functional projects.

Technical Skills
Expertise in SQL, SQL Server, SSIS, SSAS and Power BI.
Advanced DAX development for complex reporting needs.
Performance optimization for SQL and SSAS environments.
Cloud and data engineering: Azure Synapse, Azure Data Factory (ADF), Python, Snowflake.
Agile methodology: experience working in Agile teams and sprints.

Job: Supply Chain Planning
Organization: Cummins Inc.
Role Category: Hybrid
Job Type: Exempt - Experienced
ReqID: 2415717
Relocation Package: No

Posted 1 month ago


5.0 years

0 Lacs

Bhopal, Madhya Pradesh, India

On-site

At Iron Mountain we know that work, when done well, makes a positive impact for our customers, our employees, and our planet. That’s why we need smart, committed people to join us. Whether you’re looking to start your career or make a change, talk to us and see how you can elevate the power of your work at Iron Mountain. We provide expert, sustainable solutions in records and information management, digital transformation services, data centers, asset lifecycle management, and fine art storage, handling, and logistics. We proudly partner every day with our 225,000 customers around the world to preserve their invaluable artifacts, extract more from their inventory, and protect their data privacy in innovative and socially responsible ways. Are you curious about being part of our growth stor y while evolving your skills in a culture that will welcome your unique contributions? If so, let's start the conversation. About the role: As a Senior Executive – Digital Solutions at Iron Mountain, you will be primarily responsible for managing scanning and digitization projects at both customer sites and IMI facilities. This includes supervising and coordinating in-house teams as well as vendor resources, ensuring seamless, high-quality, and on-time project delivery aligned with the defined scope of work. You will also handle key project milestones such as Proof of Concept (POC), User Acceptance Testing (UAT), and Work Completion Certifications (WCC). Additionally, you will support vertical leads in achieving monthly, quarterly, and annual revenue targets. You should be collaborative, open to automation opportunities, and comfortable working with advanced scanning and production imaging equipment. Qualifications and Skills: Target-driven and self-motivated team player with a strong understanding of scanning, digitization, metadata handling, Document Management Systems (DMS), workflow processes, and automation of repetitive tasks. 
Prior experience managing scanning and digitization projects involving both in-house and outsourced/vendor teams. Minimum 2–5 years of relevant industry experience, preferably having led teams of 50+ members. Proficient in Google Sheets and skilled in MIS reporting. Education: Graduation is mandatory; an MBA in Operations is preferred. Familiarity with production scanners such as ADF, Overhead, Flatbed, BookEye, etc. Customer-focused mindset with a willingness to relocate based on project requirements. A proven track record in digitization projects will be an added advantage. Category: Operations Group Iron Mountain is a global leader in storage and information management services trusted by more than 225,000 organizations in 60 countries. We safeguard billions of our customers’ assets, including critical business information, highly sensitive data, and invaluable cultural and historic artifacts. Take a look at our history here. Iron Mountain helps lower cost and risk, comply with regulations, recover from disaster, and enable digital and sustainable solutions, whether in information management, digital transformation, secure storage and destruction, data center operations, cloud services, or art storage and logistics. Please see our Values and Code of Ethics for a look at our principles and aspirations in elevating the power of our work together. If you have a physical or mental disability that requires special accommodations, please let us know by sending an email to accommodationrequest@ironmountain.com. See the Supplement to learn more about Equal Employment Opportunity. Iron Mountain is committed to a policy of equal employment opportunity. We recruit and hire applicants without regard to race, color, religion, sex (including pregnancy), national origin, disability, age, sexual orientation, veteran status, genetic information, gender identity, gender expression, or any other factor prohibited by law. 
To view the Equal Employment Opportunity is the Law posters and the supplement, as well as the Pay Transparency Policy Statement, CLICK HERE Requisition: J0088899

Posted 1 month ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary: We are hiring an experienced Application Security Engineer specializing in Java ADF and Jasper Reports, with a strong track record of resolving Vulnerability Assessment and Penetration Testing (VAPT) findings. The ideal candidate must have secured complex enterprise applications, including online payments and eCommerce systems, particularly on legacy stacks such as Java 1.7, MySQL 5.5, and JBoss 7.1. This role is hands-on and remediation-focused, requiring a deep understanding of secure development and hardening in deprecated environments. Key Responsibilities: Lead remediation of high-priority VAPT findings in large-scale enterprise systems. Secure passwords and PII data at all stages: at view/input via masking, form validation, and secure front-end patterns; in transit via TLS, secure headers, and HTTPS enforcement; at rest via encryption and proper salting and hashing (e.g., bcrypt, SHA-256). Fix injection attacks (SQLi, XSS, LDAPi, command injection), CSRF, clickjacking, IDOR, and other OWASP Top 10 issues. Apply secure API integration practices: auth tokens, rate limiting, input validation. Harden session and cookie management (HttpOnly, Secure, SameSite attributes, session fixation prevention). Review and fix insecure code in ADF Faces, Task Flows, Bindings, BC4J, and Jasper Reports. Secure Jasper Reports generation and access (parameter validation, report-level authorization, export sanitization). Work hands-on with legacy platforms: Java 1.7, MySQL 5.5, JBoss 7.1 — applying secure remediation without disrupting production. Strengthen security of online payment/eCommerce systems with proven compliance (e.g., PCI-DSS). Maintain detailed remediation logs, documentation, and evidence for audits and compliance (GDPR, DPDPA, STQC, etc.).
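The "at rest" requirement above (a unique salt per credential plus a deliberately slow hash) can be sketched in a few lines. This is a minimal sketch using Python's standard-library PBKDF2-HMAC-SHA256 rather than the bcrypt the posting names, since bcrypt needs a third-party package; function names and the iteration count are illustrative:

```python
import hashlib
import hmac
import os

def hash_password(password: str, *, iterations: int = 600_000) -> tuple[bytes, bytes]:
    """Hash a password with a per-credential random salt using PBKDF2-HMAC-SHA256."""
    salt = os.urandom(16)  # never reuse a salt across credentials
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return salt, digest

def verify_password(password: str, salt: bytes, expected: bytes,
                    *, iterations: int = 600_000) -> bool:
    """Recompute the digest with the stored salt and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, iterations)
    return hmac.compare_digest(candidate, expected)
```

`hmac.compare_digest` is used so that timing differences do not leak how many bytes of the digest matched.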
Technical Skills: Java EE, Oracle ADF (ADF Faces, Task Flows, BC4J), Jasper Reports Studio/XML Strong debugging skills in Java 1.7, MySQL 5.5, JBoss 7.1 Secure development lifecycle practices with a focus on legacy modernization Strong grounding in OWASP Top 10, SANS 25, CVSS, and secure coding principles Experience in PII handling, data masking, salting, and hashing Proficiency in OAuth2, SAML, JWT, and RBAC security models Performance improvement and application profiling Expertise in analyzing application, system, and security logs to identify and fix issues Ability to ensure application stability and high availability Act as the champion/lead, guiding the team to fix issues PHP experience is a plus, especially in legacy web app environments Required Experience: 5–10+ years in application development and security Demonstrated experience remediating security vulnerabilities in eCommerce and payment platforms Ability to work independently in production environments with deprecated technologies Preferred Qualifications / Plus: B.E./B.Tech/MCA in Computer Science, IT, or Cybersecurity Use of AI tools to identify and fix issues is a real plus Any VAPT or Application Security Certification is a plus (e.g., CEH, OSCP, CSSLP, GWAPT, Oracle Certified Expert) Familiarity with compliance standards: PCI-DSS, GDPR, DPDPA, STQC Proficiency with security tools: Fortify, ZAP, SonarQube, Checkmarx, Burp Suite Soft Skills: Strong problem-solving and diagnostic capabilities, especially in large monolithic codebases Good documentation and communication skills for cross-functional collaboration Able to work under pressure, troubleshoot complex issues, and deliver secure code fixes rapidly

Posted 1 month ago

Apply

3.0 years

0 Lacs

Greater Chennai Area

On-site

Who You'll Work With Driving lasting impact and building long-term capabilities with our clients is not easy work. You are the kind of person who thrives in a high performance/high reward culture - doing hard things, picking yourself up when you stumble, and having the resilience to try another way forward. In return for your drive, determination, and curiosity, we'll provide the resources, mentorship, and opportunities you need to become a stronger leader faster than you ever thought possible. Your colleagues—at all levels—will invest deeply in your development, just as much as they invest in delivering exceptional results for clients. Every day, you'll receive apprenticeship, coaching, and exposure that will accelerate your growth in ways you won’t find anywhere else. When you join us, you will have Continuous learning Our learning and apprenticeship culture, backed by structured programs, is all about helping you grow while creating an environment where feedback is clear, actionable, and focused on your development. The real magic happens when you take the input from others to heart and embrace the fast-paced learning experience, owning your journey. A voice that matters From day one, we value your ideas and contributions. You’ll make a tangible impact by offering innovative ideas and practical solutions. We not only encourage diverse perspectives, but they are critical in driving us toward the best possible outcomes. Global community With colleagues across 65+ countries and over 100 different nationalities, our firm’s diversity fuels creativity and helps us come up with the best solutions for our clients. Plus, you’ll have the opportunity to learn from exceptional colleagues with diverse backgrounds and experiences. 
World-class benefits On top of a competitive salary (based on your location, experience, and skills), we provide a comprehensive benefits package, which includes medical, dental, mental health, and vision coverage for you, your spouse/partner, and children. Your Impact As a Data Engineer I at McKinsey & Company, you will play a key role in designing, building, and deploying scalable data pipelines and infrastructure that enable our analytics and AI solutions. You will work closely with product managers, developers, asset owners, and client stakeholders to turn raw data into trusted, structured, and high-quality datasets used in decision-making and advanced analytics. Your core responsibilities will include Developing robust, scalable data pipelines for ingesting, transforming, and storing data from multiple structured and unstructured sources using Python/SQL. Creating and optimizing data models and data warehouses to support reporting, analytics, and application integration. Working with cloud-based data platforms (AWS, Azure, or GCP) to build modern, efficient, and secure data solutions. Contributing to R&D projects and internal asset development. Contributing to infrastructure automation and deployment pipelines using containerization and CI/CD tools. Collaborating across disciplines to integrate data engineering best practices into broader analytical and generative AI (gen AI) workflows. Supporting and maintaining data assets deployed in client environments with a focus on reliability, scalability, and performance. Furthermore, you will have the opportunity to explore and contribute to solutions involving generative AI, such as vector embeddings, retrieval-augmented generation (RAG), semantic search, and LLM-based prompting, especially as we integrate gen AI capabilities into our broader data ecosystem. Your Qualifications and Skills Bachelor’s degree in computer science, engineering, mathematics, or a related technical field (or equivalent practical experience).
3+ years of experience in data engineering, analytics engineering, or a related technical role. Strong Python programming skills with demonstrated experience building scalable data workflows and ETL/ELT pipelines. Proficient in SQL with experience designing normalized and denormalized data models. Hands-on experience with orchestration tools such as Airflow, Kedro, or Azure Data Factory (ADF). Familiarity with cloud platforms (AWS, Azure, or GCP) for building and managing data infrastructure. Strong communication skills, especially around breaking down complex structures into digestible and relevant points for a diverse set of clients and colleagues, at all levels. High-value personal qualities including critical thinking and creative problem-solving skills; an ability to influence and work in teams. An entrepreneurial mindset and ownership mentality are a must; desire to learn and develop within a dynamic, self-led organization. Hands-on experience with containerization technologies (Docker, Docker Compose). Hands-on experience with automation frameworks (GitHub Actions, CircleCI, Jenkins, etc.). Exposure to generative AI tools or concepts (e.g., OpenAI, Cohere, embeddings, vector databases). Experience working in Agile teams and contributing to design and architecture discussions. Contributions to open-source projects or active participation in data engineering communities.
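The pipeline responsibilities above (ingest, transform, load with Python/SQL) reduce to a pattern like the following sketch. Table and column names are invented for illustration, and SQLite stands in for the cloud warehouse:

```python
import sqlite3

def run_pipeline(conn: sqlite3.Connection, raw_rows: list[tuple[str, str]]) -> int:
    """Minimal extract-transform-load step: stage raw rows, then load a cleaned table."""
    conn.execute("CREATE TABLE IF NOT EXISTS staging (name TEXT, amount TEXT)")
    conn.execute("CREATE TABLE IF NOT EXISTS facts (name TEXT, amount REAL)")
    # extract: land the raw data untouched in a staging table
    conn.executemany("INSERT INTO staging VALUES (?, ?)", raw_rows)
    # transform + load: cast amounts to numbers, drop rows that are not numeric
    conn.execute("""
        INSERT INTO facts
        SELECT name, CAST(amount AS REAL)
        FROM staging
        WHERE amount GLOB '[0-9]*'
    """)
    conn.commit()
    return conn.execute("SELECT COUNT(*) FROM facts").fetchone()[0]
```

The staging-then-transform split mirrors how warehouse pipelines keep raw landings replayable while downstream tables stay clean.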

Posted 1 month ago

Apply

7.0 - 12.0 years

16 - 27 Lacs

Hyderabad

Work from Office

Job Description Data Engineer We are seeking a highly skilled Data Engineer with extensive experience in Snowflake, Data Build Tool (dbt), SnapLogic, SQL Server, PostgreSQL, Azure Data Factory, and other ETL tools. The ideal candidate will have a strong ability to optimize SQL queries and a good working knowledge of Python. A positive attitude and excellent teamwork skills are essential. Role & responsibilities Data Pipeline Development: Design, develop, and maintain scalable data pipelines using Snowflake, dbt, SnapLogic, and ETL tools. SQL Optimization: Write and optimize complex SQL queries to ensure high performance and efficiency. Data Integration: Integrate data from various sources, ensuring consistency, accuracy, and reliability. Database Management: Manage and maintain SQL Server and PostgreSQL databases. ETL Processes: Develop and manage ETL processes to support data warehousing and analytics. Collaboration: Work closely with data analysts, data scientists, and business stakeholders to understand data requirements and deliver solutions. Documentation: Maintain comprehensive documentation of data models, data flows, and ETL processes. Troubleshooting: Identify and resolve data-related issues and discrepancies. Python Scripting: Utilize Python for data manipulation, automation, and integration tasks. Preferred candidate profile Proficiency in Snowflake, dbt, SnapLogic, SQL Server, PostgreSQL, and Azure Data Factory. Strong SQL skills with the ability to write and optimize complex queries. Knowledge of Python for data manipulation and automation. Knowledge of data governance frameworks and best practices Soft Skills: Excellent problem-solving and analytical skills. Strong communication and collaboration skills. Positive attitude and ability to work well in a team environment. Certifications: Relevant certifications (e.g., Snowflake, Azure) are a plus. Please forward your updated profile to the email address below: divyateja.s@prudentconsulting.com
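The SQL-optimization responsibility above often starts with checking whether the planner can use an index at all. A minimal sketch using SQLite's `EXPLAIN QUERY PLAN` (the table and index names are hypothetical; Snowflake and SQL Server expose the same idea through their own plan tools):

```python
import sqlite3

def uses_index(conn: sqlite3.Connection, query: str) -> bool:
    """Check via EXPLAIN QUERY PLAN whether SQLite will satisfy a query from an index."""
    plan = " ".join(row[-1] for row in conn.execute("EXPLAIN QUERY PLAN " + query))
    return "USING INDEX" in plan or "USING COVERING INDEX" in plan

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer_id INTEGER, total REAL)")
query = "SELECT total FROM orders WHERE customer_id = 42"
before = uses_index(conn, query)  # no index yet: the planner must scan the table
conn.execute("CREATE INDEX idx_orders_customer ON orders(customer_id)")
after = uses_index(conn, query)   # the planner now picks the index
```

Reading the plan before and after each change keeps optimization work measurable instead of guesswork.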

Posted 1 month ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Opening: Senior Data Engineer (Remote, Contract, 6 Months) | Experience: 6-8 Years. We are hiring a Senior Data Engineer for a 6-month remote contract position. The ideal candidate is highly skilled in building scalable data pipelines and working within the Azure cloud ecosystem, especially Databricks, ADF, and PySpark. You'll work closely with cross-functional teams to deliver enterprise-level data engineering solutions. Key Responsibilities: Build scalable ETL pipelines and implement robust data solutions in Azure. Manage and orchestrate workflows using ADF, Databricks, ADLS Gen2, and Key Vaults. Design and maintain secure and efficient data lake architecture. Work with stakeholders to gather data requirements and translate them into technical specs. Implement CI/CD pipelines for seamless data deployment using Azure DevOps. Monitor data quality, performance bottlenecks, and scalability issues. Write clean, organized, reusable PySpark code in an Agile environment. Document pipelines, architectures, and best practices for reuse. Must-Have Skills: Experience: 6+ years in Data Engineering. Tech Stack: SQL, Python, PySpark, Spark, Azure Databricks, ADF, ADLS Gen2, Azure DevOps, Key Vaults. Core Expertise: Data Warehousing, ETL, Data Pipelines, Data Modelling, Data Governance, Agile, SDLC, Containerization (Docker), clean coding practices. Good-to-Have Skills: Event Hubs, Logic Apps, Power BI, strong logic building and competitive programming background. Mode: Remote. Duration: 6 Months. Locations: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
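The data-quality monitoring responsibility above can be sketched as a small report function. In practice this would run as a PySpark job over a DataFrame, but the plain-Python version below shows the usual checks (row count, nulls per column, duplicate rows); the column names in the usage are invented:

```python
def data_quality_report(rows: list[dict]) -> dict:
    """Basic data-quality metrics: row count, null count per column, duplicate rows."""
    columns = rows[0].keys() if rows else []
    nulls = {c: sum(1 for r in rows if r.get(c) is None) for c in columns}
    seen, duplicates = set(), 0
    for r in rows:
        key = tuple(sorted(r.items()))  # order-independent fingerprint of the row
        if key in seen:
            duplicates += 1
        seen.add(key)
    return {"rows": len(rows), "nulls": nulls, "duplicates": duplicates}
```

Thresholds on these metrics (for example, failing the pipeline when `duplicates > 0`) are what turn the report into a monitoring gate.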

Posted 1 month ago

Apply

2.0 - 3.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Job Title: Middleware Administrator – L1 Roles and Responsibilities We are looking for a passionate candidate who can perform Middleware L1-level tasks. Eligibility: Middleware Administrator – L1: B.E./B.Tech/BCA (on-site), relevant certification preferred; 2-3 years of relevant experience, ITIL trained, OEM certified in at least one technology. Technology: Oracle Forms, Oracle Fusion Middleware Desired Skills & Experience ✓ Should be a team player ✓ Communication and problem-solving: should have good communication skills and the ability to solve problems ✓ Process knowledge: working knowledge of ITSM tools and knowledge of ITIL processes, i.e. SR, Incident, Change, Release & Problem Management, etc. ✓ Should have a collaborative approach and adaptability Technical Skills ✓ Oracle Applications: Oracle Forms 10g, Oracle SSO 10g, OID 10g, Oracle Portal 10g, Oracle Reports 10g, Internet Application Server (OAS) 10.1.2.2.0, Oracle Web Server (OWS) 10.1.2.2.0, Oracle WebCenter Portal 12.2.1.3, Oracle Access Manager 12.2.1.3, Oracle Internet Directory 12.2.1.3, Oracle WebLogic Server 12.2.1.3, Oracle HTTP Server 12.2.1.3, Oracle ADF 12.2.1.3 (Fusion Middleware), Oracle Forms 12.2.1.3, Oracle Reports 12.2.1.3, mobile apps, Tomcat, etc. ✓ Microsoft Applications: Windows IIS, portal, Web Cache, BizTalk application and DNS applications ✓ Operating systems: RHEL 7, 8, 9 ✓ Tools & Utilities: ITSM Tools (ServiceNow, Symphony SUMMIT), JIRA Key Responsibilities Application Monitoring Services ✓ Monitor application response times from the end-user perspective in real time and alert organizations when performance is unacceptable. By alerting the user to problems and intelligently segmenting response times, it should quickly expose problem sources and minimize the time necessary for resolution. ✓ It should allow specific application transactions to be captured and monitored separately.
This allows administrators to select the most important operations within business-critical applications to be measured and tracked individually. ✓ It should use baseline-oriented thresholds to raise alerts when application response times deviate from acceptable levels. This allows IT administrators to respond quickly to problems and minimize the impact on service delivery. ✓ It should automatically segment response-time information into network, server and local workstation components to easily identify the source of bottlenecks. ✓ Monitoring of applications, including Oracle Forms 10g, Oracle SSO 10g, OID 10g, Oracle Portal 10g, Oracle Reports 10g, Internet Application Server (OAS) 10.1.2.2.0, Oracle Web Server (OWS) 10.1.2.2.0, Oracle WebCenter Portal 12.2.1.3, Oracle Access Manager 12.2.1.3, Oracle Internet Directory 12.2.1.3, Oracle WebLogic Server 12.2.1.3, Oracle HTTP Server 12.2.1.3, Oracle ADF 12.2.1.3 (Fusion Middleware), Oracle Forms 12.2.1.3, Oracle Reports 12.2.1.3, mobile apps, Windows IIS, portal, Web Cache, BizTalk application and DNS applications, Tomcat, etc. ✓ Shutdown and start-up of applications, generation of MIS reports, monitoring of application load, user account management script execution, analysing system events, monitoring of error logs, etc. ✓ Compliance with the daily health checklist, portal updates ✓ Logging of system events and incidents ✓ SR and incident ticket updates in the Symphony iServe tool Application Release Management ✓ Scheduling, coordinating and managing releases for applications ✓ Take application code backup, place new code and restart the services
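The baseline-oriented thresholds described above can be sketched as follows: learn a mean and standard deviation from historical response times, then alert on any current sample that deviates beyond a configurable number of standard deviations. All numbers and names here are illustrative:

```python
from statistics import mean, stdev

def response_time_alerts(baseline: list[float], current: list[float],
                         sigmas: float = 3.0) -> list[float]:
    """Return the response times in `current` that exceed the baseline-derived
    threshold (mean plus `sigmas` standard deviations)."""
    mu, sd = mean(baseline), stdev(baseline)
    threshold = mu + sigmas * sd
    return [t for t in current if t > threshold]
```

Because the threshold is derived from each application's own history rather than a fixed number, a normally slow report server and a normally fast login page each get an appropriate alert level.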

Posted 1 month ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY GDS – Data and Analytics (D&A) – Azure Data Engineer - Senior As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance. The opportunity We’re looking for candidates with a strong technology and data understanding in the big data engineering space and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.
Your Key Responsibilities Develop & deploy big data pipelines in a cloud environment using Azure Cloud services ETL design, development and migration of existing on-prem ETL routines to a cloud service Interact with senior leaders, understand their business goals, contribute to the delivery of the workstreams Design and optimize model code for faster execution Skills And Attributes For Success Overall 3+ years of IT experience with 2+ years of relevant experience in Azure Data Factory (ADF) and good hands-on exposure to the latest ADF version Hands-on experience with Azure Functions & Azure Synapse (formerly SQL Data Warehouse) Should have project experience in Azure Data Lake / Blob (for storage) Should have a basic understanding of Batch Account configuration and the various control options Sound knowledge of Databricks & Logic Apps Should be able to coordinate independently with business stakeholders, understand the business requirements, and implement the requirements using ADF To qualify for the role, you must Be a computer science graduate or equivalent with 3-7 years of industry experience Have working experience in an Agile-based delivery methodology (preferable) Have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution Be an excellent communicator (written and verbal, formal and informal) Participate in all aspects of the Big Data solution delivery life cycle including analysis, design, development, testing, production deployment, and support Ideally, you’ll also have Client management skills What We Look For People with technical experience and enthusiasm to learn new things in this fast-moving environment What Working At EY Offers At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects.
Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer: Support, coaching and feedback from some of the most engaging colleagues around Opportunities to develop new skills and progress your career The freedom and flexibility to handle your role in a way that’s right for you EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Project Description: Our client is an EU subsidiary of a Global Financial Bank working in multiple markets and asset classes. The Bank's Data Store has been transformed into a Data Warehouse (DWH), which is the central source for Regulatory Reporting. It is also intended to be the core data integration platform, providing data not only for regulatory reporting but also for Risk Modelling, Portfolio Analysis, Ad Hoc Analysis & Reporting (Finance, Risk, other), MI Reporting, Data Quality Management, etc. Due to the high demand of regulatory requirements, a lot of regulatory projects are in progress to reflect regulatory requirements in existing regulatory reports and to develop new regulatory reports on MDS. Examples are IFRS9, AnaCredit, IRRBB, the new Deposit Guarantee Directive (DGSD), Bank Data Retrieval Portal (BDRP) and the Fundamental Review of the Trading Book (FRTB). The DWH/ETL Tester will work closely with the Development Team to design and build interfaces and integrate data from a variety of internal and external data sources into the new Enterprise Data Warehouse environment. The ETL Tester will be primarily responsible for testing the Enterprise Data Warehouse using automation within industry-recognized ETL standards, architecture, and best practices. Responsibilities: Testing the Bank's data warehouse system changes, testing the changes (user stories), supporting IT integration testing in TST and supporting business stakeholders with User Acceptance Testing. It is a hands-on position: you will be required to write and execute test cases and build test automation where applicable. Overall Purpose of Job - Test the MDS data warehouse system - Validate regulatory reports - Support IT and Business stakeholders during the UAT phase - Contribute to improvement of testing and development processes - Work as part of a cross-functional team and take ownership of tasks - Contribute to testing deliverables.
- Ensure the implementation of test standards and best practices for the agile model and contribute to their development. - Engage with internal stakeholders in various areas of the organization to seek alignment and collaboration. - Deal with external stakeholders/vendors. - Identify risks/issues and present associated mitigating actions, taking into account the criticality of the domain of the underlying business. - Contribute to continuous improvement of testing standard processes. Additional responsibilities include working closely with the systems analysts and the application developers, utilizing functional design documentation and technical specifications to facilitate the creation and execution of manual and automated test scripts, performing data analysis and creation of test data, tracking and helping resolve defects, and ensuring that all testing is conducted and documented in adherence with the bank's standards. Mandatory Skills: Data Warehouse (DWH), ETL, Test Management Mandatory Skills Description: Must have experience/expertise: Tester, Test Automation, Data Warehouse, Banking Technical: - At least 5 years of testing experience, of which at least 2 years in the finance industry, with good knowledge of Data Warehouse and RDBMS concepts. - Strong SQL scripting knowledge and hands-on experience with ETL and databases. - Expertise in new-age cloud-based Data Warehouse solutions: ADF, Snowflake, GCP, etc. - Hands-on expertise in writing complex SQL using multiple JOINs and highly complex functions to test various transformations and ETL requirements. - Knowledge of and experience in creating test automation for Database and ETL Testing regression suites. - Automation using Selenium with Python (or JavaScript), Python scripts, shell scripts. - Knowledge of framework design and REST API testing of databases using Python. - Experience using the Atlassian tool set, Azure DevOps, and code & version management: Git, Bitbucket, Azure Repos, etc.
- Help and provide inputs for the creation of a Test Plan that addresses the needs of cloud-based ETL pipelines. Non-Technical: - Able to work in an agile environment - Experience working on high-priority projects (high pressure on delivery) - Some flexibility outside 9-5 working hours (Netherlands time zone) - Able to work in a demanding environment with a pragmatic, "can do" attitude - Able to work independently and also to collaborate across the organization - Highly developed problem-solving skills with minimal supervision - Able to easily adapt to new circumstances/technologies/procedures - Stress-resistant and constructive, whatever the context - Able to align with existing standards and act with attention to detail. Nice-to-Have Skills Description: - Experience with financial regulatory reports - Experience in test automation for data warehouses (using Bamboo) Software skills: - Bitbucket - Bamboo - Azure Tech Stack - Azure Data Factory - WKFS OneSumX reporting generator - Analytics tools such as Power BI / Excel / SSRS / SSAS, WinSCP
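A typical DWH reconciliation check of the kind described above, comparing a source extract against the loaded target with a LEFT JOIN, can be sketched in Python against SQLite; the table and key names are placeholders for the real MDS tables:

```python
import sqlite3

def reconcile(conn: sqlite3.Connection, source: str, target: str, key: str) -> dict:
    """Compare a source and target table the way a DWH regression test would:
    row counts should match and no source key may be missing from the target."""
    src_count = conn.execute(f"SELECT COUNT(*) FROM {source}").fetchone()[0]
    tgt_count = conn.execute(f"SELECT COUNT(*) FROM {target}").fetchone()[0]
    # LEFT JOIN keeps every source row; a NULL target key means the row never loaded
    missing = conn.execute(
        f"SELECT COUNT(*) FROM {source} s LEFT JOIN {target} t "
        f"ON s.{key} = t.{key} WHERE t.{key} IS NULL"
    ).fetchone()[0]
    return {"source_rows": src_count, "target_rows": tgt_count, "missing_keys": missing}
```

In an automated regression suite, assertions on this dictionary (equal counts, zero missing keys) become the pass/fail criteria for each ETL run.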

Posted 1 month ago

Apply