
977 ADF Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

3.0 years

0 Lacs

Hyderābād

On-site

Source: Glassdoor

As one of the world’s leading asset managers, Invesco is dedicated to helping investors worldwide achieve their financial objectives. By delivering the combined power of our distinctive investment management capabilities, we provide a wide range of investment strategies and vehicles to our clients around the world. If you're looking for challenging work, smart colleagues, and a global employer with a social conscience, come explore your potential at Invesco. Make a difference every day!

Job Description
Job Purpose (Job Summary): The primary role of this position will be to join the Invesco technology team that will implement a global Enterprise Content Management solution on the Alfresco platform. The vision is to utilize this platform for all of Invesco’s content, including but not limited to: legal documents, marketing collateral, text snippets, digital assets, etc. This role will be working as a technology team member, responsible for the building and maintenance of the platform. Alfresco is one component of a technical transformation taking place, so it will be integrated with other emerging technologies within Invesco. Invesco technology is creating strategic advantage by being the best. We are highly motivated, not your usual financial services stereotype, and thrive within an industry resistant to change. As a team we respect and trust each other and celebrate our successes AND failures while continually looking for ways to improve.

Key Responsibilities/Duties: Work closely with clients to understand business requirements and to gain a thorough understanding of the business processes. Answer team member questions, assist team members with issues, and review team member work for quality. Learn and adhere to department standards for design, development and quality control. Complete major deliverables including technical design documentation and implementation of business requirements. Prepare for and support user acceptance testing. Work as part of a project team, reporting progress and escalating issues to project management in a timely manner to ensure successful completion of projects / reviews. Complete all tasks related to technical analysis, building and unit testing, quality assurance, system test and implementation in accordance with the IT development life cycle. Provide post-implementation support. Create and maintain documentation for systems and processes. Assist with developing improvements to team processes and procedures. Stay abreast of new developments and functionality of Alfresco and content management industry trends.

Work Experience: Proficient in using Alfresco for content management Must have an understanding of the Alfresco Content Services architecture Must have experience extending Alfresco via ADF, content modeling, web scripts, behaviors, actions and scheduled jobs Experience / understanding of the development process in a DevOps environment (including exposure to CI/CD tools such as Jenkins, source code management tools - e.g.
Git) Excellent problem-solving skills Good understanding of software engineering concepts and practices Experience in developing and consuming REST or GraphQL APIs Exposure and interest in full-stack development in one of the following: Node.js, Python or Java Debugging skills, with the ability to ask for help Consume libraries, tools, and APIs from other teams to achieve the desired functionality Experience with an Agile framework is preferred

Knowledge / Abilities: Excellent verbal and written skills Enjoy challenging and thought-provoking work and have a strong desire to learn and progress (motivated enough to self-learn). Must demonstrate a strong customer-focused attitude and understand the fundamentals of customer service Structured, disciplined approach to work, with attention to detail Be able to work under pressure and multi-task while remaining professional and courteous. Open-minded, flexible and willing to listen to other people’s opinions. Ability to work as part of a distributed team in a self-directed way with strong communication skills. Flexibility, teamwork, empathy, a sense of humor and the courage to challenge the status quo are must-haves in this environment Adaptable and resourceful, capable of working under pressure to meet aggressive deadlines with limited resources. Able to work independently and with team members (brainstorming, team-building activities, etc.).

Formal Education and Experience Required (minimum requirement to perform job duties): A Bachelor’s degree in an IT-related program is preferred, plus a minimum of 3 years of related experience.

Working Conditions: Normal office environment with little exposure to noise, dust and temperatures. The ability to lift, carry or otherwise move objects of up to 10 pounds is also necessary. Normally works a regular schedule of hours; however, hours may vary depending upon the project or assignment. Willingness to travel both domestically and internationally. Frequency and duration to be determined by manager.

About Invesco
Invesco Ltd. is a leading independent global investment management firm, dedicated to helping investors worldwide achieve their financial objectives. By delivering the combined power of our distinctive investment management capabilities, Invesco provides a wide range of investment strategies and vehicles to our clients around the world. Operating in more than 20 countries, the firm is listed on the New York Stock Exchange under the symbol IVZ.

About Invesco Technology
Do you like working with top technology professionals where everyone has an opportunity to collaborate, share ideas and work on leading-edge technologies? Do you thrive in an environment where you are part of a team implementing innovative technology solutions for clients and employees? Invesco’s Technology team is a global organization with 1,300+ employees working together to serve the business. We value everyone’s ideas and input and provide opportunities to develop skills. If this sounds like a team you want to be a part of, read on to learn more about the opportunity to join us.

Full Time / Part Time: Full time Worker Type: Employee Job Exempt (Yes / No): Yes

Workplace Model
At Invesco, our workplace model supports our culture and meets the needs of our clients while providing the flexibility our employees value. As a full-time employee, compliance with the workplace policy means working with your direct manager to create a schedule where you will work in your designated office at least three days a week, with two days working outside an Invesco office.
Why Invesco
At Invesco, we act with integrity and do meaningful work to create impact for our stakeholders. We believe our culture is stronger when we all feel we belong, and we respect each other’s identities, lives, health, and well-being. We come together to create better solutions for our clients, our business and each other by building on different voices and perspectives. We nurture and encourage each other to ensure our meaningful growth, both personally and professionally. We believe in a diverse, inclusive, and supportive workplace where everyone feels equally valued, and this starts at the top with our senior leaders having diversity and inclusion goals. Our global focus on diversity and inclusion has grown exponentially and we encourage connection and community through our many employee-led Business Resource Groups (BRGs).

What’s in it for you? As an organization we support personal needs and diverse backgrounds and provide internal networks, as well as opportunities to get involved in the community and in the world. Our benefits policy includes, but is not limited to: Competitive Compensation Flexible, Hybrid Work 30 days’ Annual Leave + Public Holidays Life Insurance Retirement Planning Group Personal Accident Insurance Medical Insurance for Employee and Family Annual Health Check-up 26 weeks Maternity Leave Paternal Leave Adoption Leave Near-site Childcare Facility Employee Assistance Program Study Support Employee Stock Purchase Plan ESG Commitments and Goals Business Resource Groups Career Development Programs Mentoring Programs Invesco Cares Dress for your Day

At Invesco, we offer development opportunities that help you thrive as a lifelong learner in a constantly evolving business environment and ensure your constant growth. Our AI-enabled learning platform delivers curated content based on your role and interests. We ensure our managers and leaders also have many opportunities to advance the skills and competencies that become pivotal in their continuous pursuit of performance excellence.

To know more about us: About Invesco: https://www.invesco.com/corporate/en/home.html About our Culture: https://www.invesco.com/corporate/en/about-us/our-culture.html About our D&I policy: https://www.invesco.com/corporate/en/our-commitments/diversity-and-inclusion.html About our CR program: https://www.invesco.com/corporate/en/our-commitments/corporate-responsibility.html

Apply for the role @ Invesco Careers: https://careers.invesco.com/india/
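
Because this role involves building on Alfresco Content Services and consuming REST APIs, the following is a minimal, hedged sketch (not taken from the posting) of listing repository nodes through the Alfresco v1 public REST API from Python. The host, credentials and node alias are placeholder assumptions.

```python
# Hedged sketch: list the children of a folder node via the Alfresco Content
# Services v1 public REST API. Host, credentials and node id are placeholders.
import requests

ALFRESCO_HOST = "https://alfresco.example.com"  # hypothetical instance
API = f"{ALFRESCO_HOST}/alfresco/api/-default-/public/alfresco/versions/1"

def list_children(node_id: str = "-root-") -> list[dict]:
    """Return the child node entries of the given node (folder)."""
    resp = requests.get(
        f"{API}/nodes/{node_id}/children",
        auth=("admin", "admin"),       # placeholder credentials
        params={"maxItems": 25},
        timeout=30,
    )
    resp.raise_for_status()
    return [e["entry"] for e in resp.json()["list"]["entries"]]

if __name__ == "__main__":
    for entry in list_children():
        print(entry["name"], entry["isFolder"])
```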

Posted 21 hours ago

Apply

7.0 years

1 - 10 Lacs

Hyderābād

On-site

Source: Glassdoor

SUMMARY
The Database Developer III will play a critical role in engaging with stakeholders and technical team members for requirement gathering, creating data pipelines in ADF and SSIS, mapping, extraction, transformation, visualizations, and analytical data analysis. You will work closely with cross-functional teams, including IT and business stakeholders, to ensure seamless and efficient data flow, report generation, visualizations, and data analysis. You will collaborate with various departments to ensure data accuracy, integrity, and compliance with established data standards. This role will report to the BEST Data Services lead in our Business Enterprise Systems Technology department. A successful Database Developer must take a hands-on approach, ensuring the highest quality solutions are provided to our business stakeholders, with accurate development, documentation, and adherence to deadlines. This role will also work with key stakeholders across the organization to drive enhancements to a successful implementation and ensure all master data meets requirements and is deployed and implemented properly.

PRIMARY RESPONSIBILITIES
Expertise in writing complex SQL queries, data modeling and performance tuning. Proficient in ETL operations using SSIS and other tools. Proficient in Python API calls and cloud ETL operations. Proficient in cloud technologies like Azure and GCP. Collaborating with other stakeholders to ensure the architecture is aligned with business requirements. Collaborating with senior leaders to determine business-specific application needs. Providing technical leadership to the application development team. Performing design and code reviews and ensuring application design standards are maintained. Compiling and implementing application development plans for new or existing applications. Must follow SDLC practices. Demonstrating application prototypes and integrating user feedback. Writing scripts and code for applications, as well as installing and updating applications. Mentoring junior application developers and providing end-users with technical support. Performing application integration, maintenance, upgrades, and migration. Documenting application development processes, procedures, and standards. Integrating trends in application architecture into application development projects. Streamlining day-to-day activities to provide a more efficient production environment. Lowering costs and gaining cost-effectiveness while providing a secure, stable, and supportable environment.

REQUIRED KNOWLEDGE/SKILLS/ABILITIES
Bachelor's degree in Computer Science, Engineering, or a related field. 7+ years of experience in SQL Server (expert) and SSIS (expert) development. Proficient in Databricks. Proficient in Python API calls. Basic knowledge of cloud technologies (Azure/GCP). Strong knowledge of data modeling and data warehousing concepts. Good-to-have knowledge of Power BI. Strong analytical and problem-solving skills.
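
Since this role centers on creating and running data pipelines in Azure Data Factory, here is a minimal, hedged sketch of one common pattern such work involves: triggering a pipeline run programmatically with the azure-mgmt-datafactory SDK. The subscription, resource group, factory, pipeline and parameter names are illustrative placeholders, not details from the posting.

```python
# Hedged sketch: start an Azure Data Factory pipeline run from Python.
# All resource names and the pipeline parameter are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUBSCRIPTION_ID = "<subscription-id>"
RESOURCE_GROUP = "rg-data-platform"      # placeholder
FACTORY_NAME = "adf-enterprise-etl"      # placeholder
PIPELINE_NAME = "pl_load_sales"          # placeholder

def trigger_pipeline() -> str:
    """Kick off a pipeline run and return its run ID."""
    client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)
    run = client.pipelines.create_run(
        RESOURCE_GROUP,
        FACTORY_NAME,
        PIPELINE_NAME,
        parameters={"load_date": "2024-01-31"},  # example parameter
    )
    return run.run_id

if __name__ == "__main__":
    print("Started run:", trigger_pipeline())
```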

Posted 21 hours ago

Apply

0 years

0 Lacs

India

On-site

Source: LinkedIn

Simeio is a global identity and access management service provider focused on protecting organizations' key data and access requirements to business-critical systems and applications. Simeio provides services such as Access Management, IGA, PAM & CIAM, while our wider service offerings include support, upgrades, governance and application onboarding.

The Opportunity
The Senior Consultant is responsible for supporting delivery and support of high-quality Identity and Access Management services while adhering to Simeio's standards and best practices. The ideal candidate would have experience primarily in Access Management, and preferably in CIAM, to serve as a member of multiple client engagement teams that assist clients in employing proper information systems, resources, and controls to maximize efficiencies and minimize risk. The ability to take up challenges, adapt to business needs, and stay focused on delivering results is essential to the success of this role.

The Role: Senior Identity Specialist
Key Skills: Access Management, IBM Security Access Manager, Core Directory Services

Key Accountabilities
Responsible for all development, customization and configuration activities for the Access Management Program. These configurations will be performed via the Ping suite, which provides customers with a single comprehensive platform for access requests. In-depth knowledge of IBM Security Access Manager (ISAM) is required. The candidate will be responsible for working on a variety of supplementary products such as Java, IBM LDAP, WebSphere, and WebSEAL to perform configuration and customization for the Customer Access Management Program. Working experience setting up Single Sign-On (SSO), WebSEAL/reverse proxy, etc. Work experience on different types of junctions. Should have worked on different policy configurations like ACLs, POPs and authorization rules. Good working knowledge of SAML 2.0, OAuth and OIDC setup for SSO and API protection, and of IBM Security Directory Server (LDAP). The candidate will be responsible for the following specific life cycle duties: Technical Analysis of Business Requirements and Design Interpretation – Ensuring understanding of design in response to business requirements and being able to interpret and successfully implement solutions. Technical Process Management – Ensuring all development and implementation activities follow technical process for compliance purposes. Release Management – Coordinating and building deployment processes to enable management of code/configuration releases. Configuration and Customization – Completing all assigned configuration and customization activities within and in support of the Oracle Identity Governance suite. Quality Assurance Management – Triaging and correcting testing defects. Technical Process Documentation – Creating and maintaining development and deployment documentation. Knowledge of web services, disconnected systems and ADF would be an added advantage.

Must Have Requisites: Familiarity with the following technologies and concepts is desired: Directories (Active Directory, LDAP & X.500) Extensive working experience on IBM ISAM 9 or above Single Sign-On and Federation (Kerberos, SAML 2.0, OAuth 2.0, etc.)
Good to Have Requisites: Solid written and verbal communication Capable of working on multiple projects simultaneously Capable of solving complex problems Capable of defining strategic and tactical solutions, and knowing when each applies Educational Qualification: Bachelor's degree in computer science, systems analysis or a related study, or equivalent experience (Masters preferred). About Simeio Simeio has over 650 talented employees across the globe. We have offices in USA (Atlanta HQ and Texas), India, Canada, Costa Rica and UK. Founded in 2007, and now backed by private equity company ZMC, Simeio is recognized as a top IAM provider by industry analysts. Alongside Simeio’s identity orchestration tool ‘Simeio IO’ - Simeio also partners with industry leading IAM software vendors to provide access management, identity governance and administration, privileged access management and risk intelligence services across on-premise, cloud, and hybrid technology environments. Simeio provides services to numerous Fortune 1000 companies across all industries including financial services, technology, healthcare, media, retail, public sector, utilities and education. Simeio is an equal opportunity employer. If you require assistance with completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employee selection process, please direct your inquiries to our recruitment team - [email protected]. Thank you About Your Application We review every application received and will get in touch if your skills and experience match what we’re looking for. If you don’t hear back from us within 10 days, please don’t be too disappointed – we may keep your CV on our database for any future vacancies and we would encourage you to keep an eye on our career opportunities as there may be other suitable roles. Simeio is an equal opportunity employer. If you require assistance with completing this application, interviewing, completing any pre-employment testing, or otherwise participating in the employee selection process, please direct your inquiries to any of the recruitment team at recruitment@simeio.com or +1 404-882-3700.
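
Because the posting emphasizes SSO and API protection with SAML 2.0, OAuth and OIDC, here is a minimal, hedged sketch of the OAuth 2.0 client-credentials grant that such API-protection setups typically rely on. The token endpoint, client ID, secret and scope are hypothetical placeholders, not values from the posting.

```python
# Hedged sketch: obtain an OAuth 2.0 access token via the client-credentials
# grant (RFC 6749). The identity provider URL and credentials are placeholders.
import requests

TOKEN_URL = "https://idp.example.com/oauth2/token"  # hypothetical issuer

def get_access_token(client_id: str, client_secret: str, scope: str) -> str:
    """Request a bearer token for a machine-to-machine (API) client."""
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
            "scope": scope,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]

if __name__ == "__main__":
    token = get_access_token("demo-client", "demo-secret", "api.read")
    print(token[:16], "...")
```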

Posted 21 hours ago

Apply

5.0 years

0 Lacs

Udaipur, Rajasthan, India

On-site

Source: LinkedIn

For a quick response, please fill out the form: Job Application Form 34043 - Data Scientist - Senior I - Udaipur https://docs.google.com/forms/d/e/1FAIpQLSeBy7r7b48Yrqz4Ap6-2g_O7BuhIjPhcj-5_3ClsRAkYrQtiA/viewform

3–5 years of experience in Data Engineering or similar roles Strong foundation in cloud-native data infrastructure and scalable architecture design Build and maintain reliable, scalable ETL/ELT pipelines using modern cloud-based tools Design and optimize Data Lakes and Data Warehouses for real-time and batch processing Ingest, transform, and organize large volumes of structured and unstructured data Collaborate with analysts, data scientists, and backend engineers to define data needs Monitor, troubleshoot, and improve pipeline performance, cost-efficiency, and reliability Implement data validation, consistency checks, and quality frameworks Apply data governance best practices and ensure compliance with privacy and security standards Use CI/CD tools to deploy workflows and automate pipeline deployments Automate repetitive tasks using scripting, workflow tools, and scheduling systems Translate business logic into data logic while working cross-functionally Strong in Python and familiar with libraries like pandas and PySpark Hands-on experience with at least one major cloud provider (AWS, Azure, GCP) Experience with ETL tools like AWS Glue, Azure Data Factory, GCP Dataflow, or Apache NiFi Proficient with storage systems like S3, Azure Blob Storage, GCP Cloud Storage, or HDFS Familiar with data warehouses like Redshift, BigQuery, Snowflake, or Synapse Experience with serverless computing like AWS Lambda, Azure Functions, or GCP Cloud Functions Familiar with data streaming tools like Kafka, Kinesis, Pub/Sub, or Event Hubs Proficient in SQL, and knowledge of relational (PostgreSQL, MySQL) and NoSQL (MongoDB, DynamoDB) databases Familiar with big data frameworks like Hadoop or Apache Spark Experience with orchestration tools like Apache Airflow, Prefect, GCP Workflows, or ADF Pipelines Familiarity with CI/CD tools like GitLab CI, Jenkins, Azure DevOps Proficient with Git, GitHub, or GitLab workflows Strong communication, collaboration, and problem-solving mindset Experience with data observability or monitoring tools (bonus points) Contributions to internal data platform development (bonus points) Comfort working in data mesh or distributed data ownership environments (bonus points) Experience building data validation pipelines with Great Expectations or similar tools (bonus points)
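
Since the posting highlights orchestration tools such as Apache Airflow and CI/CD-deployed workflows, the following is a minimal, hedged sketch of an ETL/ELT pipeline expressed as an Airflow DAG (Airflow 2.x style). The DAG id, schedule and task bodies are illustrative assumptions only.

```python
# Hedged sketch: a three-step ETL/ELT pipeline as an Airflow DAG.
# The DAG id, schedule and task logic are placeholders.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull raw files from object storage")      # placeholder step

def transform():
    print("clean and conform the raw data")           # placeholder step

def load():
    print("load curated tables into the warehouse")   # placeholder step

with DAG(
    dag_id="daily_orders_elt",          # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```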

Posted 22 hours ago

Apply

40.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Job Description
Engineering Graduate / Post Graduate, preferably in Computer Science or MCA, having 2+ yrs of development experience in: Oracle and ADF-based applications Knowledge of RDBMS and data modeling concepts Oracle database, knowledge of SQL, and PL/SQL Client-side web development languages (JavaScript, HTML, DHTML, and CSS)
Desirable: REST API implementation SOA (REST-based micro-services) Collaborative development (Gitflow, peer reviewing) Maven SQL - Continuous Integration/delivery (Jenkins, Docker)

Diversity and Inclusion: An Oracle career can span industries, roles, countries and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work life in. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. In order to nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, a workforce that inspires thought leadership and innovation.

Responsibilities
Analyze, design, develop, troubleshoot and debug software programs for commercial or end-user applications. Write code, complete programming and perform testing and debugging of applications. Java developers should be able to build cloud-native applications, leveraging deep integrations with familiar tools like Spring, Maven, Kubernetes, and IntelliJ to get started quickly. As a member of the software engineering division, you will perform high-level design based on provided external specifications. Specify, design and implement minor changes to existing software architecture. Build highly complex enhancements and resolve complex bugs. Build and execute unit tests and unit plans. Review integration and regression test plans created by QA. Communicate with QA and porting engineering as necessary to discuss minor changes to product functionality and to ensure quality and consistency across specific products. Duties and tasks are varied and complex, needing independent judgment. Fully competent in own area of expertise. May have a project lead role and/or supervise lower-level personnel. BS or MS degree or equivalent experience relevant to functional area. 4 years of software engineering or related experience.

Qualifications Career Level -

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process.
If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 day ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Exciting Opportunity Alert! 🌟 HTC Global Services is hiring Azure Data Engineers for our premium project. HTC Global Services is a leading CMM Level 5 global provider of innovative IT and Business Process Services and Solutions since 1990, with headquarters in Troy, Michigan, USA.

Position Title: Data Engineer (Azure stack)
Location: Chennai / Bangalore / Hyderabad

Maintaining data pipelines, storage solutions (e.g., data lakes, warehouses), and ETL frameworks Implement scalable solutions for orchestration (ADF, ADB, etc.) and data observability Infrastructure provisioning and support: Collaborate on setting up data storage (ADLS, DBs, etc.), compute clusters, and networking. Data ingestion and integration: Collaborate on APIs, event streaming (e.g., Event Hub), and data logging strategies. Schema design and versioning: Ensure compatibility and scalability in data exchange formats. Technical escalations: Serve as the point of contact for issues like service outages, quota limits, and performance bottlenecks. Requirements gathering: Translate product goals into data specifications. Data integration: Ingest and normalize external data feeds (APIs, flat files, real-time streams). SLAs and data quality: Maintain contracts and expectations for delivery timeliness, format consistency, and accuracy. Data sourcing and transformation: Build reliable pipelines for financial reporting, KPI dashboards, and forecasts. Data access audits: Share lineage, transformation logs, and access controls. Governance and access control: Implement policies for PII handling, encryption, and user role management. Strategic alignment: Communicate the impact and roadmap of data initiatives. Custom implementations: Collaborate on complex integrations and performance tuning. Executive dashboards: Deliver reliable, real-time insights and metrics to support decision-making. Knowledge transfer: Ensure that external teams document and train internal staff for long-term ownership. Working with the Data Science & BI Analytics team to understand the models, BI reports, and deployment. Data delivery: Provide APIs, dashboards, or file exports to enterprise clients. Support and troubleshooting: Handle escalations related to data issues or pipeline delays.

Interested candidates, please share your updated CV to mubeenakamal.basha@htcinc.com, mentioning your current CTC, expected CTC and notice period details. #HTCGlobalServices #ITJobs #hiring #lookingforjob #careers #jobs #immediatejoiner #recruitment #technology #jobseekers #interview #lookingforjobchange #newjob #Azure #Dataengineer #ADF #ADB #ADL #Databricks #DataLake
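
As this posting mentions event streaming with Event Hub for data ingestion, here is a minimal, hedged sketch of publishing events with the azure-eventhub Python SDK (v5 sync client). The connection string, hub name and payload are placeholders, not details from the posting.

```python
# Hedged sketch: send a small batch of events to Azure Event Hubs.
# Connection string, hub name and the event payload are placeholders.
import json
from azure.eventhub import EventHubProducerClient, EventData

CONN_STR = "<event-hubs-namespace-connection-string>"  # placeholder
EVENTHUB_NAME = "telemetry"                            # placeholder

def send_events(records: list[dict]) -> None:
    """Publish a list of dict records as one Event Hubs batch."""
    producer = EventHubProducerClient.from_connection_string(
        conn_str=CONN_STR, eventhub_name=EVENTHUB_NAME
    )
    with producer:
        batch = producer.create_batch()
        for record in records:
            batch.add(EventData(json.dumps(record)))
        producer.send_batch(batch)

if __name__ == "__main__":
    send_events([{"device": "sensor-01", "temp_c": 21.5}])
```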

Posted 1 day ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Exciting Opportunity Alert! 🌟 HTC Global Services is hiring Azure Data Engineers for our premium project. HTC Global Services is a leading CMM Level 5 global provider of innovative IT and Business Process Services and Solutions since 1990, with headquarters in Troy, Michigan, USA.

Position Title: Data Engineer (Azure stack)
Location: Chennai / Bangalore / Hyderabad

Maintaining data pipelines, storage solutions (e.g., data lakes, warehouses), and ETL frameworks Implement scalable solutions for orchestration (ADF, ADB, etc.) and data observability Infrastructure provisioning and support: Collaborate on setting up data storage (ADLS, DBs, etc.), compute clusters, and networking. Data ingestion and integration: Collaborate on APIs, event streaming (e.g., Event Hub), and data logging strategies. Schema design and versioning: Ensure compatibility and scalability in data exchange formats. Technical escalations: Serve as the point of contact for issues like service outages, quota limits, and performance bottlenecks. Requirements gathering: Translate product goals into data specifications. Data integration: Ingest and normalize external data feeds (APIs, flat files, real-time streams). SLAs and data quality: Maintain contracts and expectations for delivery timeliness, format consistency, and accuracy. Data sourcing and transformation: Build reliable pipelines for financial reporting, KPI dashboards, and forecasts. Data access audits: Share lineage, transformation logs, and access controls. Governance and access control: Implement policies for PII handling, encryption, and user role management. Strategic alignment: Communicate the impact and roadmap of data initiatives. Custom implementations: Collaborate on complex integrations and performance tuning. Executive dashboards: Deliver reliable, real-time insights and metrics to support decision-making. Knowledge transfer: Ensure that external teams document and train internal staff for long-term ownership. Working with the Data Science & BI Analytics team to understand the models, BI reports, and deployment. Data delivery: Provide APIs, dashboards, or file exports to enterprise clients. Support and troubleshooting: Handle escalations related to data issues or pipeline delays.

Interested candidates, please share your updated CV to mubeenakamal.basha@htcinc.com, mentioning your current CTC, expected CTC and notice period details. #HTCGlobalServices #ITJobs #hiring #lookingforjob #careers #jobs #immediatejoiner #recruitment #technology #jobseekers #interview #lookingforjobchange #newjob #Azure #Dataengineer #ADF #ADB #ADL #Databricks #DataLake
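
This listing also calls out setting up data storage on ADLS; the hedged sketch below shows one way a file might be landed in an ADLS Gen2 filesystem with the azure-storage-file-datalake SDK. The account URL, filesystem and paths are placeholder assumptions, not taken from the posting.

```python
# Hedged sketch: upload a local file to an Azure Data Lake Storage Gen2
# filesystem. Account, filesystem and path names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

ACCOUNT_URL = "https://mydatalake.dfs.core.windows.net"  # placeholder account

def upload_file(local_path: str, filesystem: str, remote_path: str) -> None:
    """Write the local file to <filesystem>/<remote_path>, overwriting if present."""
    service = DataLakeServiceClient(
        account_url=ACCOUNT_URL, credential=DefaultAzureCredential()
    )
    file_client = service.get_file_system_client(filesystem).get_file_client(remote_path)
    with open(local_path, "rb") as data:
        file_client.upload_data(data, overwrite=True)

if __name__ == "__main__":
    upload_file("orders.csv", "raw", "sales/2024/01/orders.csv")
```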

Posted 1 day ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Senior Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

*Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary
Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring a streamlined end-to-end (E2E) Oracle Fusion technical landscape that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.

*Responsibilities: Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer.
Completed at least 2 full Oracle Cloud (Fusion) Implementations Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion) Extensively worked on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC)

*Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC)
*Preferred skill sets: database structure for ERP/Oracle Cloud (Fusion)
*Years of experience required: Minimum 4 years of Oracle Fusion experience
*Educational Qualification: BE/BTech MBA

Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Engineering, Bachelor of Technology, Master of Business Administration Degrees/Field of Study preferred: Certifications (if blank, certifications not specified)
Required Skills: Oracle Integration Cloud (OIC)
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (If blank, desired languages not specified)
Travel Requirements:
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date:

Posted 1 day ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Source: LinkedIn

Summary
The Database Developer III will play a critical role in engaging with stakeholders and technical team members for requirement gathering, creating data pipelines in ADF and SSIS, mapping, extraction, transformation, visualizations, and analytical data analysis. You will work closely with cross-functional teams, including IT and business stakeholders, to ensure seamless and efficient data flow, report generation, visualizations, and data analysis. You will collaborate with various departments to ensure data accuracy, integrity, and compliance with established data standards. This role will report to the BEST Data Services lead in our Business Enterprise Systems Technology department. A successful Database Developer must take a hands-on approach, ensuring the highest quality solutions are provided to our business stakeholders, with accurate development, documentation, and adherence to deadlines. This role will also work with key stakeholders across the organization to drive enhancements to a successful implementation and ensure all master data meets requirements and is deployed and implemented properly.

Primary Responsibilities
Expertise in writing complex SQL queries, data modeling and performance tuning. Proficient in ETL operations using SSIS and other tools. Proficient in Python API calls and cloud ETL operations. Proficient in cloud technologies like Azure and GCP. Collaborating with other stakeholders to ensure the architecture is aligned with business requirements. Collaborating with senior leaders to determine business-specific application needs. Providing technical leadership to the application development team. Performing design and code reviews and ensuring application design standards are maintained. Compiling and implementing application development plans for new or existing applications. Must follow SDLC practices. Demonstrating application prototypes and integrating user feedback. Writing scripts and code for applications, as well as installing and updating applications. Mentoring junior application developers and providing end-users with technical support. Performing application integration, maintenance, upgrades, and migration. Documenting application development processes, procedures, and standards. Integrating trends in application architecture into application development projects. Streamlining day-to-day activities to provide a more efficient production environment. Lowering costs and gaining cost-effectiveness while providing a secure, stable, and supportable environment.

Required Knowledge/Skills/Abilities
Bachelor's degree in Computer Science, Engineering, or a related field. 7+ years of experience in SQL Server (expert) and SSIS (expert) development. Proficient in Databricks. Proficient in Python API calls. Basic knowledge of cloud technologies (Azure/GCP). Strong knowledge of data modeling and data warehousing concepts. Good-to-have knowledge of Power BI. Strong analytical and problem-solving skills.
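
For the pipeline-monitoring side of ADF work described here, below is a small, hedged sketch that polls an Azure Data Factory pipeline run until it finishes, using the azure-mgmt-datafactory SDK with placeholder resource names; the run ID would come from whatever triggered the pipeline.

```python
# Hedged sketch: poll an Azure Data Factory pipeline run until it reaches a
# terminal state. All identifiers below are placeholders.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

def wait_for_run(subscription_id: str, resource_group: str,
                 factory_name: str, run_id: str, poll_seconds: int = 30) -> str:
    """Block until the run succeeds, fails or is cancelled; return the status."""
    client = DataFactoryManagementClient(DefaultAzureCredential(), subscription_id)
    while True:
        run = client.pipeline_runs.get(resource_group, factory_name, run_id)
        if run.status in ("Succeeded", "Failed", "Cancelled"):
            return run.status
        time.sleep(poll_seconds)

# Example usage (placeholder values):
# status = wait_for_run("<subscription-id>", "rg-data-platform",
#                       "adf-enterprise-etl", "<run-id>")
# print("Run finished with status:", status)
```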

Posted 1 day ago

Apply

4.0 - 9.0 years

15 - 25 Lacs

Hyderabad

Remote

Source: Naukri

Role & responsibilities Preferred candidate profile

Posted 1 day ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Job Description
Oracle Global Services Center (GSC) is a fast-growing cloud consulting team passionate about our customers’ rapid and successful adoption of Oracle Cloud Solutions. Our flexible and innovative “Optimum Shore” approach helps our clients implement, maintain, and integrate their Oracle Cloud Applications and Technology environments while reducing overall total cost of ownership. We assemble an efficient team for each client by blending resources from onshore, near shore, and offshore global delivery centers to match the right expertise, to the right solution, for the right cost. To support our rapid growth, we are seeking versatile consultants who bring a passion for providing an excellent client experience, enabling client success by developing innovative solutions. Our cloud solutions are redefining the world of business, empowering governments, and helping society evolve with the pace of change. Join the team of top-class consultants and help our customers achieve more than ever before.

This is a senior consulting position, operating independently with some assistance and mentorship within a project team or customer engagement, aligned with Oracle methodologies and practices. Performs standard duties and tasks with some variation to implement Oracle products and technology to meet customer specifications.

Life at Oracle: We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veteran’s status or any other characteristic protected by law. At Oracle, we don’t just value differences—we celebrate them! We are committed to crafting a workplace where all kinds of people work together. We believe innovation starts with diversity. https://www.oracle.com/corporate/careers/culture/diversity.html

Career Level - IC2

Responsibilities
Detailed Description
Operates independently to provide quality work products to an engagement. Performs multifaceted and complex tasks that need independent judgment. Applies Oracle methodology, company procedures, and leading practices. Demonstrates expertise to deliver solutions on complex engagements. May act as the functional team lead on projects. Efficiently collaborates with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for complex projects.

Detail Requirements: The candidate is expected to have sound domain knowledge in HCM covering the hire-to-retire cycle, with 7 to 12 years of experience. They must have been part of at least 3 end-to-end HCM Cloud implementations, along with experience in at least 1 project as a lead.
FUNCTIONAL - The candidate must have knowledge of any of the following modules along with the Core HR module: Time and Labor, Absence Management, Payroll, Benefits, Compensation, Recruiting. The candidate should have been in client-facing roles and interacted with customers in requirement gathering workshops, design, configuration, testing and go-live. Engineering graduates with an MBA (HR) will be preferred.
TECHNICAL - In-depth understanding of the data model and business process functionality (and its data flow) in the HCM Cloud application and Oracle EBS / PeopleSoft AU (HRMS). Experience with Cloud HCM Conversions, integrations (HCM Extracts & BIP), Reporting (OTBI & BIP), Fast Formula & Personalization. Engineering Graduation in any field or MCA Degree or equivalent experience. Proven experience with Fusion technologies including HDL, HCM Extracts, Fast Formulas, BI Publisher Reports & Design Studio. Apart from the above experience, advanced knowledge in OIC, ADF, Java, PaaS, DBCS, etc. would be an added advantage. Good functional or technical leadership capability with strong planning and follow-up skills, mentorship, work allocation, monitoring and status updates to the Project Coordinator. Should have strong written and verbal communication skills, personal drive, flexibility, teamwork, problem-solving, influencing and negotiating skills and organizational awareness and sensitivity, engagement delivery, continuous improvement and sharing of knowledge and client management. Assist in the identification, assessment and resolution of complex technical issues/problems. Interact with the client frequently around specific work efforts/deliverables. The candidate should be open to domestic or international travel for short as well as long durations.

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges. We’ve partnered with industry-leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all.
Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 day ago

Apply

130.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

About Northern Trust
Northern Trust, a Fortune 500 company, is a globally recognized, award-winning financial institution that has been in continuous operation since 1889. Northern Trust is proud to provide innovative financial services and guidance to the world’s most successful individuals, families, and institutions by remaining true to our enduring principles of service, expertise, and integrity. With more than 130 years of financial experience and over 22,000 partners, we serve the world’s most sophisticated clients using leading technology and exceptional service. Northern Trust is looking for a Senior Lead Software Engineer to join its Technology Development Centre in Pune, India in the Investment Management team. This individual, as part of an agile development team, will be responsible for the analysis and design of the upcoming Alternatives business platform to meet business and technical requirements.

Major Duties
Analyse and build the data model from requirements of the Private Equity and Hedge Fund businesses of Northern Trust Asset Management for the upcoming Alts data warehouse. Analyse the source data while working with upstream teams and the 50 South development team to produce the schema. Build pipelines to extract required data from upstream systems and model it for reporting to clients and downstream systems. Break down requirements into domain, model and entity data for setup in the data warehouse. Able to define the Raw, Transform and Curate layers for data consumption. Liaise with various vendor products and internal applications to refine the requirements and help the technical team with solutions. Act as the first point of contact for clarification of any business gaps for the local tech team. Participate in data modeling discussions and ensure the data warehouse model meets business needs. A team player with the ability to own the design and code as per the requirements given. Communicate status (written and verbal) to the project team and management. Continuously look for ways to improve the application’s stability, scalability and user experience.

Experience / Skills
Bachelor's or equivalent degree in finance with a technology background. An experienced technical engineer (8-12 years) who can develop and maintain high-performance, reliable, and scalable Java microservice architecture applications. Strong ability to design and implement cloud-native applications on Microsoft Azure, utilizing services like Azure App Services, Azure Functions, Azure Kubernetes Service (AKS), ADF, and Azure networking concepts. Write clean, reusable, and well-documented code. Collaborate with cross-functional teams, including UI/UX designers, QA engineers, and product managers. Ensure applications adhere to high performance, scalability, and security standards. Leverage Azure DevOps for CI/CD pipelines and automation. Monitor, troubleshoot, and optimize performance for cloud-hosted applications. Integrate data storage solutions using Azure SQL, Snowflake, or other database technologies. Stay updated with emerging technologies and cloud trends to continuously enhance systems and solutions.

Required Skills
Strong expertise in Java (Java 8 and Java 17 or higher). Proficiency in frameworks like Spring Boot and microservice architecture. Experience in cloud-native development and deployment on Microsoft Azure. Hands-on experience with Azure services such as Azure App Services, Functions, Kubernetes (AKS), Azure DevOps, Blob Storage, and Service Bus. Knowledge of RESTful APIs, SOAP, and microservices architecture.
Solid understanding of database technologies (e.g., Azure SQL, MySQL, Cosmos DB, PostgreSQL). Experience with version control systems like Git. Familiarity with containerization tools such as Docker and orchestration tools like Kubernetes. Strong understanding of design patterns, algorithms, and data structures. Excellent problem-solving, debugging, and analytical skills. Design, develop and use data structures and data marts to support reporting. Good analytical and problem-solving skills Both attention to detail & ability to rise above details to see broader implications & recommend strategic solutions Self-starter; positive & adaptable in a continually changing environment Ability to work independently and with a team Proven interpersonal and communication skills with technical & business partners Strong understanding of building CI/CD pipelines for change management

Preferred/Recommended Skills
Familiarity with the change management process Financial domain knowledge – Investment Management, portfolio construction and risk management. Worked on projects streamlining the testing process by introducing automation, leveraging tools and setting goals to reduce time and effort. Experience with Azure Data Factory (ADF) for building and orchestrating data pipelines. Knowledge of messaging systems like Kafka. Certification in Microsoft Azure (e.g., Azure Developer Associate or Azure Solutions Architect). Familiarity with front-end technologies like JavaScript, Angular, or React.

Working With Us
As a Northern Trust partner, greater achievements await. You will be part of a flexible and collaborative work culture in an organization where financial strength and stability is an asset that emboldens us to explore new ideas. Movement within the organization is encouraged, senior leaders are accessible, and you can take pride in working for a company committed to assisting the communities we serve! Join a workplace with a greater purpose. We’d love to learn more about how your interests and experience could be a fit with one of the world’s most admired and sustainable companies! Build your career with us and apply today. #MadeForGreater

Reasonable accommodation
Northern Trust is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation for any part of the employment process, please email our HR Service Center at MyHRHelp@ntrs.com. We hope you’re excited about the role and the opportunity to work with us. We value an inclusive workplace and understand flexibility means different things to different people. Apply today and talk to us about your flexible working requirements and together we can achieve greater.

Posted 1 day ago

Apply

4.0 - 6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Source: LinkedIn

Job Description We are looking for a skilled Power BI and SQL Developer who can design and deliver robust, user-friendly dashboards and data solutions. The ideal candidate will have strong experience in Power BI development, SQL programming, and data modeling, with excellent communication and stakeholder management abilities. Key Responsibilities Design, develop, and maintain interactive Power BI dashboards and reports based on business requirements. Build efficient data models (star/snowflake schemas) and implement Row-Level Security (RLS) for secure reporting. Develop complex DAX expressions and calculations to support advanced analytics. Write and optimize complex SQL queries, stored procedures, and views to support data extraction and transformation. Collaborate with stakeholders to gather requirements, understand business needs, and translate them into effective data solutions. Work closely with cross-functional teams, providing regular updates and managing stakeholder expectations throughout the project lifecycle. Must-Have Skills - Power BI & SQL Development Proven experience in designing, developing, and maintaining Power BI dashboards and reports. Proficiency in creating optimized data models (star/snowflake schemas) and implementing Row-Level Security (RLS) for secure and scalable reporting. Strong command of advanced DAX expressions for complex data calculations and analytical insights. Expertise in writing and optimizing complex SQL queries, stored procedures, and database views. Solid understanding of query performance tuning, data normalization, and indexing strategies, along with monitoring and supporting Power BI reports. Excellent verbal and written communication skills. Proven ability to work with business stakeholders and cross-functional teams. Good-to-Have Skills - Azure Data Factory (ADF) & ETL Pipelines: Experience with Azure Data Factory (ADF) and building scalable ETL pipelines. Familiarity with Azure Synapse Analytics, Blob Storage, or other Azure data services. Ability to schedule and automate data refreshes and transformations in a cloud environment. Qualifications Bachelor’s degree in information technology, Computer Science, or related discipline. 4 to 6 years of hands-on experience in Power BI (including DAX, Power Query, and Data Modeling) and SQL development
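
As a rough illustration of the SQL side of this role, the hedged sketch below runs a reporting view and a stored procedure on SQL Server from Python via pyodbc. The server, database, credentials and object names are assumptions for illustration only and do not come from the posting.

```python
# Hedged sketch: exercise a (hypothetical) reporting view and stored procedure
# on SQL Server from Python using pyodbc. All identifiers are placeholders.
import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=sqlreports.example.com;DATABASE=Reporting;"  # placeholders
    "UID=report_user;PWD=********"
)

conn = pyodbc.connect(CONN_STR)
cur = conn.cursor()

# Pull a few rows from a hypothetical reporting view.
cur.execute("SELECT TOP 5 region, total_sales FROM dbo.vw_sales_summary")
for region, total_sales in cur.fetchall():
    print(region, total_sales)

# Refresh an aggregate table via a hypothetical stored procedure.
cur.execute("{CALL dbo.usp_refresh_sales_summary (?)}", "2024-01-31")
conn.commit()
conn.close()
```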

Posted 1 day ago

Apply

5.0 - 8.0 years

15 - 27 Lacs

Pune

Work from Office

Source: Naukri

We are seeking a skilled Data Engineer with hands-on experience in Azure Data Factory (ADF) and Snowflake development. The ideal candidate will have a solid background in SQL, data warehousing, and cloud data pipelines, with a keen ability to design, implement, and maintain robust data solutions that support business intelligence and analytics initiatives. Key Responsibilities: Design and develop scalable data pipelines using ADF and Snowflake Integrate data from various sources using SQL, GitHub, and cloud-native tools Apply data warehousing best practices and ensure optimal data flow and data quality Collaborate within Scrum teams and contribute to agile development cycles Liaise effectively with stakeholders across the globe to gather requirements and deliver solutions Support data modeling efforts and contribute to Python-based enhancements (as needed) Qualifications: Minimum 5 years of overall Data Engineering experience At least 2 years of hands-on experience with Snowflake At least 2 years of experience working with Azure Data Factory Strong understanding of Data Warehousing concepts and methodologies Experience with Data Modeling and proficiency in Python is a plus Familiarity with version control systems like GitHub Experience working in an agile (Scrum) environment
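
Given the posting's emphasis on Snowflake development alongside ADF, here is a small, hedged sketch of querying Snowflake from Python with the snowflake-connector-python package. The account locator, credentials, warehouse, database, schema and table are placeholders, not details from the posting.

```python
# Hedged sketch: run a query against Snowflake using snowflake-connector-python.
# Account, credentials, warehouse, database, schema and table are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.central-india.azure",  # placeholder account locator
    user="etl_svc",
    password="********",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    cur.execute("SELECT order_id, amount FROM raw_orders LIMIT 10")
    for order_id, amount in cur:
        print(order_id, amount)
finally:
    conn.close()
```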

Posted 1 day ago

Apply

2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Associate

Job Description & Summary
At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

*Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Job Description & Summary
Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring a streamlined end-to-end (E2E) Oracle Fusion technical landscape that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle Practice team, we leverage opportunities around digital disruption, new-age operating models and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities: Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer.
Completed at least 2 full Oracle Cloud (Fusion) Implementation Extensive Knowledge on database structure for ERP/Oracle Cloud (Fusion) Extensively worked on BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC) * Mandatory skill sets BI Publisher reports, FBDI/OTBI Cloud and Oracle Integration (OIC) *Preferred skill sets database structure for ERP/Oracle Cloud (Fusion) *Year of experience required Minimum 2+ Years of Oracle fusion experience *Educational Qualification BE/BTech MBA Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Bachelor of Technology, Master of Business Administration, Bachelor of Engineering Degrees/Field of Study preferred: Certifications (if blank, certifications not specified) Required Skills Oracle Business Intelligence (BI) Publisher, Oracle Fusion Applications Optional Skills Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, Teamwork, Well Being Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship? No Government Clearance Required? No Job Posting End Date

Posted 1 day ago

Apply

0 years

0 Lacs

Hyderābād

On-site

GlassDoor logo

Req ID: 321505. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Test Analyst to join our team in Bangalore, Karnātaka (IN-KA), India (IN).

Job Duties:
- Understand business requirements and develop test cases.
- Work with the tech team and the client to validate and finalise test cases.
- Use Jira or an equivalent test management tool to record test cases, expected results and outcomes, and to assign defects.
- Run tests in the testing phases - SIT and UAT.
- Test reporting and documentation.
- Basic knowledge of Snowflake, SQL, ADF (optional) and Fivetran (optional).

Minimum Skills Required:
- Test case development
- Jira knowledge to record test cases, expected results and outcomes, and to assign defects
- Test reporting and documentation
- Basic knowledge of Snowflake, SQL, ADF (optional) and Fivetran (optional)

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 1 day ago

Apply

0 years

0 Lacs

Gurgaon

On-site

GlassDoor logo

Req ID: 321505. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Test Analyst to join our team in Bangalore, Karnātaka (IN-KA), India (IN).

Job Duties:
- Understand business requirements and develop test cases.
- Work with the tech team and the client to validate and finalise test cases.
- Use Jira or an equivalent test management tool to record test cases, expected results and outcomes, and to assign defects.
- Run tests in the testing phases - SIT and UAT.
- Test reporting and documentation.
- Basic knowledge of Snowflake, SQL, ADF (optional) and Fivetran (optional).

Minimum Skills Required:
- Test case development
- Jira knowledge to record test cases, expected results and outcomes, and to assign defects
- Test reporting and documentation
- Basic knowledge of Snowflake, SQL, ADF (optional) and Fivetran (optional)

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 1 day ago

Apply

0 years

0 Lacs

Noida

On-site

GlassDoor logo

Req ID: 321505. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Test Analyst to join our team in Bangalore, Karnātaka (IN-KA), India (IN).

Job Duties:
- Understand business requirements and develop test cases.
- Work with the tech team and the client to validate and finalise test cases.
- Use Jira or an equivalent test management tool to record test cases, expected results and outcomes, and to assign defects.
- Run tests in the testing phases - SIT and UAT.
- Test reporting and documentation.
- Basic knowledge of Snowflake, SQL, ADF (optional) and Fivetran (optional).

Minimum Skills Required:
- Test case development
- Jira knowledge to record test cases, expected results and outcomes, and to assign defects
- Test reporting and documentation
- Basic knowledge of Snowflake, SQL, ADF (optional) and Fivetran (optional)

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Posted 1 day ago

Apply

4.0 - 7.0 years

15 - 30 Lacs

Hyderabad

Remote

Naukri logo

Experience Required: 5 to 7 years (mandatory)
Mode of work: Remote
Primary Skills: Azure Data Factory, SQL, Python/Scala
Notice Period: Immediate joiners / permanent (able to join by July 4th, 2025)

- 5 to 7 years of experience with Big Data technologies
- Experience with the Microsoft Azure cloud platform
- Design, implement, and maintain data pipelines for data ingestion, processing, and transformation in Azure Data Factory
- Create and maintain data storage solutions including Azure SQL Database, Azure Data Lake, and Azure Blob Storage
- Create and maintain ETL (Extract, Transform, Load) operations using Azure Data Factory or comparable technologies
- Implement data validation and cleansing procedures to ensure the quality, integrity, and dependability of the data
- Improve the scalability, efficiency, and cost-effectiveness of data pipelines
- Monitor and resolve data pipeline problems to guarantee consistency and availability of the data

Interested candidates can share their resume, or refer a friend, to Pavithra.tr@enabledata.com for a quick response.
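As an illustration of the data validation and cleansing step mentioned above, here is a minimal Python sketch using pandas; the column names and rules are hypothetical, not requirements from the listing.

```python
# Minimal sketch of a validation/cleansing step; column names are placeholders.
import pandas as pd

def clean_orders(df: pd.DataFrame) -> pd.DataFrame:
    df = df.drop_duplicates()                                              # remove exact duplicates
    df = df.dropna(subset=["order_id"])                                    # enforce a non-null business key
    df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")   # bad dates become NaT
    df["amount"] = pd.to_numeric(df["amount"], errors="coerce")            # bad numbers become NaN
    return df[df["amount"].notna()]                                        # drop rows failing the numeric check
```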

Posted 1 day ago

Apply

5.0 years

0 Lacs

India

Remote

Linkedin logo

This is a full-time remote contract position. You may need to provide a few hours of overlap with US time zones. You may need to go through a background verification process in which your claimed experience, education certificates and references will be verified, so please do not apply if you are not comfortable with this verification process. This is a client-facing role, hence excellent communication in English is a must.

Min. Experience: 5+ years

About the role: Our client is about to start an ERP replacement. They plan to move away from the AWS platform to an Azure data lake feeding Snowflake. We need a resource who can be a Snowflake thought leader and who has Microsoft Azure data engineering expertise.

Key Responsibilities:
- Data Ingestion & Orchestration (Transformation & Cleansing):
  - Design and maintain Azure Data Factory (ADF) pipelines: extract data from sources such as ERPs (SAP, Oracle), UKG, SharePoint, and REST APIs.
  - Configure scheduled/event-driven loads: set up ADF for automated data ingestion.
  - Transform and cleanse data: develop logic in ADF for Bronze-to-Silver layer transformations.
  - Implement data quality checks: ensure accuracy and consistency.
- Snowflake Data Warehousing:
  - Design, develop, and optimize data models within Snowflake, including creating tables, views, and stored procedures for both the Silver and Gold layers.
  - Implement ETL/ELT processes within Snowflake to transform curated Silver data into highly optimized analytical Gold structures (an illustrative sketch follows this listing).
  - Performance tuning: optimize queries and data loads.
- Data Lake Management:
  - Implement Azure Data Lake Gen2 solutions following the medallion architecture (Bronze, Silver).
  - Manage partitioning, security, and governance: ensure efficient and secure data storage.
- Collaboration & Documentation: Partner with stakeholders to convert data needs into technical solutions, document pipelines and models, and uphold best practices through code reviews.
- Monitoring & Support: Track pipeline performance, resolve issues, and deploy alerting/logging for proactive data integrity and issue detection.
- Data visualization: Proficient in tools such as Power BI, DAX, and Power Query for creating insightful reports. Skilled in Python for data processing and analysis to support data engineering tasks.

Required Skills & Qualifications:
- 5+ years of experience in data engineering, data warehousing, or ETL development.
- Microsoft Azure proficiency:
  - Azure Data Factory (ADF): experience in designing, developing, and deploying complex data pipelines.
  - Azure Data Lake Storage Gen2: hands-on experience with data ingestion, storage, and organization.
- Expertise in Snowflake data warehousing and ETL/ELT: understanding of Snowflake architecture, SQL proficiency for manipulation and querying, and experience with Snowpipe, tasks, streams, and stored procedures.
- Strong understanding of data warehousing concepts and ETL/ELT principles.
- Data formats & integration: experience with various data formats (e.g., Parquet, CSV, JSON) and data integration patterns.
- Data visualization: experience with Power BI, DAX, Power Query.
- Scripting: Python for data processing and analysis.
- Soft skills: problem-solving, attention to detail, communication, and collaboration.

Nice-to-Have Skills: Version control (e.g., Git), Agile/Scrum methodologies, and data governance and security best practices.
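Here is the sketch referenced above: a minimal Silver-to-Gold transformation pushed down to Snowflake, assuming the snowflake-connector-python package; connection parameters, schemas, and table names are placeholders, not details from the listing.

```python
# Minimal sketch: Silver-to-Gold ELT executed inside Snowflake (all names are placeholders).
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>", user="<user>", password="<password>",
    warehouse="WH_TRANSFORM", database="ANALYTICS", schema="GOLD",
)

# Rebuild a Gold aggregate table from the curated Silver layer
conn.cursor().execute("""
    CREATE OR REPLACE TABLE GOLD.DAILY_SALES AS
    SELECT order_date,
           SUM(amount)                 AS total_amount,
           COUNT(DISTINCT customer_id) AS customers
    FROM SILVER.ORDERS
    WHERE status = 'COMPLETED'
    GROUP BY order_date
""")
conn.close()
```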

Posted 1 day ago

Apply

4.0 years

0 Lacs

India

On-site

Linkedin logo

About PTR Global
PTR Global is a leader in providing innovative workforce solutions, dedicated to optimizing talent acquisition and management processes. Our commitment to excellence has earned us the trust of businesses looking to enhance their talent strategies. We cultivate a dynamic and collaborative environment that empowers our employees to excel and contribute to our clients' success.

Job Summary
We are seeking a highly skilled ETL Developer to join our team in Chennai. The ideal candidate will be responsible for designing, developing, and maintaining ETL processes, as well as data warehouse design and modeling, to support our data integration and business intelligence initiatives. This role requires proficiency in T-SQL, Azure Data Factory (ADF), and SSIS, along with excellent problem-solving and communication skills.

Responsibilities
- Design, develop, and maintain ETL processes to support data integration and business intelligence initiatives.
- Utilize T-SQL to write complex queries and stored procedures for data extraction and transformation.
- Implement and manage ETL processes using SSIS (SQL Server Integration Services).
- Design and model data warehouses to support reporting and analytics needs.
- Ensure data accuracy, quality, and integrity through effective testing and validation procedures.
- Collaborate with business analysts and stakeholders to understand data requirements and deliver solutions that meet their needs.
- Monitor and troubleshoot ETL processes to ensure optimal performance and resolve any issues promptly.
- Document ETL processes, workflows, and data mappings to ensure clarity and maintainability.
- Stay current with industry trends and best practices in ETL development, data integration, and data warehousing.

Must Haves
- Minimum 4+ years of experience as an ETL Developer or in a similar role.
- Proficiency in T-SQL for writing complex queries and stored procedures.
- Experience with SSIS (SQL Server Integration Services) for developing and managing ETL processes.
- Knowledge of ADF (Azure Data Factory) and its application in ETL processes.
- Experience in data warehouse design and modeling.
- Knowledge of Microsoft's Azure cloud suite, including Data Factory, Data Storage, Blob Storage, Power BI, and Power Automate.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
- Strong attention to detail and commitment to data quality.
- Bachelor's degree in Computer Science, Information Technology, or a related field is preferred.
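To illustrate the T-SQL extraction step described above, here is a minimal Python sketch using pyodbc and pandas; the connection string, table, and column names are placeholders, not details from the listing.

```python
# Minimal sketch: run a parameterized T-SQL extraction query (all names are placeholders).
import pandas as pd
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=<server>;DATABASE=<database>;UID=<user>;PWD=<password>;Encrypt=yes;"
)

# Parameterized query keeps the extraction step safe and repeatable
df = pd.read_sql(
    "SELECT order_id, order_date, amount FROM dbo.Orders WHERE order_date >= ?",
    conn,
    params=["2024-01-01"],
)
print(len(df), "rows extracted")
conn.close()
```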

Posted 1 day ago

Apply

7.0 years

0 Lacs

India

Remote

Linkedin logo

Senior Azure Developer (Remote / WFH)

Summary: As a Senior Azure Developer, you will lead the design, development, and implementation of complex cloud-based applications on the Microsoft Azure platform. You will provide technical leadership and mentor junior and mid-level developers.

Responsibilities:
- Lead the design and development of cloud-based applications.
- Collaborate with stakeholders to define project requirements.
- Write high-quality, scalable, and maintainable code.
- Conduct code reviews and provide technical guidance.
- Implement and manage CI/CD pipelines.
- Ensure the security and performance of applications.
- Troubleshoot and resolve advanced technical issues.
- Optimize application architecture and performance.
- Create and maintain detailed documentation.
- Stay updated with the latest Azure technologies and industry trends.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 7+ years of experience in cloud development.
- Expert understanding of Microsoft Azure services.
- Proficiency in programming languages such as C#, JavaScript, or Python.
- Excellent problem-solving and analytical skills.
- Strong communication and leadership abilities.
- Experience with Agile methodologies.

Preferred Certifications: Microsoft Certified - Azure DevOps Engineer Expert and Microsoft Certified - Azure Solutions Architect Expert

Required Knowledge and Skills:
- Expert knowledge of Azure services such as Azure App Service, Azure Functions, and Azure Storage.
- Leading the design and architecture of Azure-based applications, ensuring scalability, security, and performance.
- Proficiency in RESTful APIs and web services.
- Experience with version control systems like Git.
- Strong knowledge of SQL and NoSQL databases.
- In-depth understanding of DevOps practices.
- Experience with CI/CD pipelines.
- Strong understanding of networking concepts.
- Knowledge of security best practices in cloud environments.
- Ability to write clean, maintainable code.
- Experience with performance optimization.
- Hands-on experience writing automated test cases in the NUnit/xUnit/MSTest frameworks.
- Hands-on experience with Azure containerization services.
- Hands-on experience with ADF or Synapse.

Technologies, Coding Languages, and Methodologies:
- Microsoft Azure (Key Vault, Service Bus Queues, Storage Queues, Topics, Blob Storage, Azure Container services (Kubernetes, Docker), App Services [Web Apps, Logic Apps, Function Apps], Azure Functions (time-triggered, durable), Azure AI services)
- Azure SQL, Cosmos DB
- .NET Core (latest version)
- APIs, APIM
- Angular/React
- JavaScript, Python
- SQL, Azure SQL, Cosmos DB
- Azure containerization services (Docker, Kubernetes)
- ADF or Synapse
- NUnit/xUnit/MSTest frameworks
- Git
- Agile methodologies
- CI/CD pipelines
- IaC (Infrastructure as Code): ARM/Bicep/Terraform
- Azure DevOps

Outcomes:
- Lead the design and development of complex cloud-based applications.
- Collaborate effectively with stakeholders.
- Write high-quality and scalable code.
- Provide technical leadership and mentorship.
- Implement and manage CI/CD pipelines.
- Ensure application security and performance.
- Troubleshoot and resolve advanced technical issues.
- Optimize application architecture and performance.
- Create and maintain detailed documentation.
- Stay updated with Azure technologies and industry trends.
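As a small illustration of the Azure Functions skill set listed above, here is a minimal HTTP-triggered function using the Python v1 programming model; the accompanying function.json binding configuration is omitted, and the route and logic are hypothetical.

```python
# Minimal sketch of an HTTP-triggered Azure Function (Python v1 model).
# The function.json binding file that pairs with this handler is not shown.
import logging
import azure.functions as func

def main(req: func.HttpRequest) -> func.HttpResponse:
    logging.info("Processing request")
    name = req.params.get("name")
    if not name:
        return func.HttpResponse("Pass a 'name' query parameter.", status_code=400)
    return func.HttpResponse(f"Hello, {name}!", status_code=200)
```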

Posted 1 day ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Title: SSE – DevOps Engineer
Mode of work: Work from Office
Experience: 4 - 10 years

Know your team
At ValueMomentum's Engineering Center, we are a team of passionate engineers who thrive on tackling complex business challenges with innovative solutions while transforming the P&C insurance value chain. We achieve this through a strong engineering foundation and by continuously refining our processes, methodologies, tools, agile delivery teams, and core engineering archetypes. Our core expertise lies in six key areas: Cloud Engineering, Application Engineering, Data Engineering, Core Engineering, Quality Engineering, and Domain expertise. Join a team that invests in your growth. Our Infinity Program empowers you to build your career with role-specific skill development leveraging immersive learning platforms. You'll have the opportunity to showcase your talents by contributing to impactful projects.

Requirements - Must Have:
- 5+ years in DevOps with strong data pipeline experience.
- Build and maintain CI/CD pipelines for Azure Data Factory and Databricks notebooks.
- Deep expertise in Databricks, including the automation of unit, integration, and QA testing workflows.
- Strong data architecture skills, as the position involves implementing CI/CD pipelines for schema updates.
- Strong experience with Azure DevOps Pipelines, YAML builds, and release workflows.
- Proficiency in scripting languages such as Python, PowerShell, and Terraform.
- Working knowledge of Azure services: ADF, Databricks, DABs, ADLS Gen2, Key Vault, ADO.
- Maintain infrastructure-as-code practices.
- Collaborate with Data Engineers and Platform teams to maintain development, staging, and production environments.
- Monitor and troubleshoot pipeline failures and deployment inconsistencies.

About ValueMomentum
ValueMomentum is a leading solutions provider for the global property & casualty insurance industry, supported by deep domain and technology capabilities. We offer a comprehensive suite of advisory, development, implementation, and maintenance services across the entire P&C insurance value chain. This includes Underwriting, Claims, Distribution, and more, empowering insurers to stay ahead with sustained growth, high performance, and enhanced stakeholder value. Trusted by over 75 insurers, ValueMomentum is one of the largest standalone insurance-focused solutions providers to the US insurance industry.

Our culture – Our fuel
At ValueMomentum, we believe in making employees win by nurturing them from within, collaborating and looking out for each other.
- People first – Empower employees to succeed.
- Nurture leaders – Nurture from within.
- Enjoy wins – Recognize and celebrate wins.
- Collaboration – Foster a culture of collaboration and people-centricity.
- Diversity – Committed to diversity, equity, and inclusion.
- Fun – Create a fun and engaging work environment.
- Warm welcome – Provide a personalized onboarding experience.

Company Benefits
- Compensation: Competitive compensation package comparable to the best in the industry.
- Career Growth: Career development, comprehensive training & certification programs, and fast-track growth for high-potential associates.
- Benefits: Comprehensive health benefits and life insurance.
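As an illustration of scripting a Databricks step from a CI/CD workflow like the one described above, here is a minimal Python sketch that calls the Databricks Jobs 2.1 run-now REST endpoint; the workspace URL, token handling, job ID, and parameters are placeholders, not details from the listing.

```python
# Minimal sketch: trigger a Databricks job from a pipeline step (all values are placeholders).
import requests

host = "https://<workspace>.azuredatabricks.net"   # placeholder workspace URL
token = "<personal-access-or-AAD-token>"           # inject from a secure pipeline variable, never hard-code

resp = requests.post(
    f"{host}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {token}"},
    json={"job_id": 123, "notebook_params": {"env": "qa"}},  # hypothetical job and parameters
    timeout=30,
)
resp.raise_for_status()
print("Triggered run:", resp.json()["run_id"])
```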

Posted 1 day ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

Job Title: Senior Data Engineer
Location: Noida | Gurgaon | Hyderabad | Bangalore (Hybrid - 2 days/month in office)
Experience: 8+ years
Employment Type: Full-time | Hybrid
Skills: PySpark | Databricks | ADF | Big Data | Hadoop | Hive

About the Role:
We are looking for a highly experienced and results-driven Senior Data Engineer to join our growing team. This role is ideal for a data enthusiast who thrives in managing and optimizing big data pipelines using modern cloud and big data tools. You'll play a key role in designing scalable data architectures and enabling data-driven decision-making across the organization.

Key Responsibilities:
- Design, build, and maintain scalable and efficient data pipelines using PySpark and Databricks
- Develop ETL workflows and orchestrate data pipelines using Azure Data Factory (ADF)
- Work with structured and unstructured data across the Hadoop ecosystem (HDFS, Hive, Spark)
- Optimize data processing and storage for high performance and reliability
- Collaborate with data scientists, analysts, and business teams to ensure data availability and quality
- Implement data governance, data quality, and security best practices
- Monitor and troubleshoot production data pipelines and jobs
- Document technical solutions and standard operating procedures

Required Skills & Qualifications:
- 8+ years of hands-on experience in data engineering and big data technologies
- Proficiency in PySpark, Databricks, and Azure Data Factory (ADF)
- Strong experience with Big Data technologies: Hadoop, Hive, Spark, HDFS
- Solid understanding of data modeling, warehousing concepts, and performance tuning
- Familiarity with cloud data platforms, preferably Azure
- Strong SQL skills and experience in managing large-scale data systems
- Excellent problem-solving, debugging, and communication skills

Nice to Have:
- Experience with Delta Lake, Apache Airflow, or Kafka
- Exposure to CI/CD for data pipelines
- Knowledge of data lake architectures and data mesh principles
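For context on the PySpark/Databricks stack above, here is a minimal sketch of a batch aggregation job; the database, table, and column names are placeholders, not details from the listing.

```python
# Minimal sketch of a batch aggregation on Databricks (table and column names are placeholders).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily_sales_agg").getOrCreate()

daily = (
    spark.table("raw_db.orders")                     # source Hive/metastore table
    .filter(F.col("status") == "COMPLETED")
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("total_amount"),
        F.countDistinct("customer_id").alias("customers"),
    )
)

# Delta is the default table format on Databricks; made explicit here for clarity
daily.write.format("delta").mode("overwrite").saveAsTable("curated_db.daily_sales")
```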

Posted 2 days ago

Apply

7.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Linkedin logo

About ProcDNA
ProcDNA is a global consulting firm. We fuse design thinking with cutting-edge technology to create game-changing Commercial Analytics and Technology solutions for our clients. We're a passionate team of 275+ across 6 offices, all growing and learning together since our launch during the pandemic. Here, you won't be stuck in a cubicle - you'll be out in the open water, shaping the future with brilliant minds. At ProcDNA, innovation isn't just encouraged; it's ingrained in our DNA. Ready to join our epic growth journey?

What We Are Looking For
You'll be driving the adoption of the latest technologies in our solutions, bringing in thought leadership to guide clients on complex data management problems, and driving business performance. You will work with the leadership team to bring subject matter expertise in areas such as Big Data, ETL, Reporting, CRM, Data Warehousing, MDM, DevOps, Software Development, etc. We are seeking an individual who not only possesses the requisite expertise but also thrives in the dynamic landscape of a fast-paced global firm.

What You'll Do
- Lead end-to-end data management solution projects for multiple clients across data engineering and BI technologies.
- Create project management plans and ensure adherence to project timelines.
- Integrate multiple data sources into one visualization to tell a story.
- Interact with customers to understand their business problems and provide best-in-class analytics solutions.
- Interact with Data Platform leaders and understand data flows that integrate into Tableau/analytics.
- Understand data governance, quality, and security, and integrate analytics with these enterprise platforms.
- Interact with UX/UI global functions and design best-in-class visualizations for customers, harnessing all product capabilities.

Requirements
- Must have 7 - 10 years of data warehousing and data engineering experience.
- Experience in interacting with Life Science clients directly, discussing requirements, and stakeholder management.
- Experience in requirement gathering and designing enterprise warehouse solutions from scratch.
- Hands-on experience with ETL tools like ADF, Databricks, and Informatica; experience with data pipeline and workflow management tools such as Azkaban, Luigi, and Airflow; experience with data warehouses: SQL/NoSQL, Amazon Redshift, Snowflake, Apache Hive, HDFS, etc.
- BI tools knowledge and experience in leading the implementation of dashboards.
- Deep understanding of data governance and data quality management frameworks.
- Strong communication and presentation skills with a strong problem-solving attitude.
- Excellent analytical, problem-solving, and debugging skills, with a strong ability to quickly learn and comprehend business processes and problems to effectively develop technical solutions to their requirements.

Skills: MDM, SQL, HDFS, data warehousing, big data, DevOps, cloud, Amazon Redshift, Snowflake, pharmaceutical consulting, data management, Apache Hive, Azure, reporting, problem-solving, Luigi, Informatica, analytical skills, presentation skills, data governance, ADF, data engineering, CRM, Databricks, BI technologies, Airflow, team management, business technology, AWS, Azkaban, software development, ETL, client management, data quality management, life science

Posted 2 days ago

Apply

Exploring ADF Jobs in India

The job market for ADF (Application Development Framework) professionals in India is witnessing significant growth, with numerous opportunities available for job seekers in this field. ADF is a popular framework used for building enterprise applications, and companies across various industries are actively looking for skilled professionals to join their teams.

Top Hiring Locations in India

Here are 5 major cities in India where there is a high demand for ADF professionals:
- Bangalore
- Hyderabad
- Pune
- Chennai
- Mumbai

Average Salary Range

The estimated salary range for ADF professionals in India varies based on experience levels:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum

Career Path

In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.

Related Skills

In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.

Interview Questions

Here are sample interview questions for ADF roles, categorized by difficulty level:

Basic:
- What is ADF and what are its key features?
- What is the difference between ADF Faces and ADF Task Flows?

Medium:
- Explain the lifecycle of an ADF application.
- How do you handle exceptions in ADF applications?

Advanced:
- Discuss the advantages of using ADF Business Components.
- How would you optimize performance in an ADF application?

Closing Remark

As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies