
1611 ADF Jobs - Page 47

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

5.0 - 8.0 years

15 - 30 Lacs

Pune

Hybrid

Skills: Data Engineer, Azure Data Factory (ADF), SQL, Power BI, SSRS, SSIS, SSAS, ETL, Databricks, Data Integration, Data Modeling

Posted 1 month ago

Apply

5.0 years

0 Lacs

Jodhpur, Rajasthan, India

On-site

Power BI Development (Mandatory Requirements)
- Design and develop interactive dashboards, KPIs, and reports using Power BI.
- Create data visualizations and reports based on business requirements.
- Optimize Power BI datasets and reports for performance.
- Publish reports to Power BI Service and manage user access and permissions.
- Collaborate with business stakeholders to gather requirements and translate them into effective dashboards.

Data Engineering
- Develop and maintain robust ETL/ELT pipelines using tools like ADF, SSIS, or custom scripts. (Nice to have)
- Work with large datasets from multiple sources (SQL, APIs, cloud storage, etc.). (Must have)
- Create and maintain data models and data warehouses (Azure Synapse, Snowflake, etc.). (Nice to have)
- Implement data quality checks, logging, and performance monitoring; see the sketch below. (Must have)
- Collaborate with data architects and analysts to define data standards and governance. (Must have)

Required Skills & Experience (Must Have)
- 4-5 years of professional experience in data engineering and Power BI development.
- Strong experience with SQL and relational databases (e.g., SQL Server, PostgreSQL, MySQL).
- Proficient in Power BI (DAX, Power Query, data modeling).
- Hands-on experience with Azure Data Factory, Synapse Analytics, or other cloud-based data tools.
- Knowledge of Python or other scripting languages for data processing (optional but preferred).
- Strong understanding of data warehousing concepts and dimensional modelling.
- Excellent problem-solving, communication, and documentation skills.
- Strong business analytics skills.
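As a rough illustration of the data-quality-check duty above, a minimal pandas sketch might look like the following. This is not the employer's code; the table and column names (order_id, amount) are hypothetical.

```python
import logging
import pandas as pd

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("dq-checks")

def run_quality_checks(df: pd.DataFrame) -> pd.DataFrame:
    """Apply simple data-quality rules before loading to the warehouse."""
    before = len(df)

    # Rule 1: the primary key must be present and unique.
    df = df.dropna(subset=["order_id"]).drop_duplicates(subset=["order_id"])

    # Rule 2: amounts must be non-negative.
    bad_amounts = int((df["amount"] < 0).sum())
    df = df[df["amount"] >= 0]

    # Basic logging so pipeline runs can be monitored.
    log.info("rows in=%d, rows out=%d, negative amounts dropped=%d",
             before, len(df), bad_amounts)
    return df

# Tiny in-memory example.
orders = pd.DataFrame({"order_id": [1, 1, 2, None], "amount": [10.0, 10.0, -5.0, 3.0]})
clean = run_quality_checks(orders)
```

In practice the same rules would run inside the ADF/SSIS pipeline stage, with the log output feeding whatever monitoring the team uses.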

Posted 1 month ago

Apply

3.0 - 8.0 years

7 - 11 Lacs

Mumbai

Work from Office

Technology Governance is as integral a part of the Technology team as the business-facing, infrastructure, and development teams. It is responsible for driving compliance with the various internal processes and policies within the Technology team, ensuring effective governance and smooth execution. As a manager in the Technology Governance team, the main role will be to establish and track data-governance-related activities:
- Support the implementation and maintenance of data governance policies, procedures, and standards specific to the banking industry.
- Hands-on experience creating and maintaining activities associated with data life cycle management and various data governance activities.
- Develop, update, and maintain the data dictionary for critical banking data assets, ensuring accurate definitions, attributes, and classifications (see the sketch below).
- Interface with business units and IT teams to standardize terminology across systems for consistency and clarity.
- Document end-to-end data lineage for key banking data processes (e.g., customer data, transaction data, risk management data).
- Create and maintain documentation of metadata, data dictionaries, and lineage for ongoing governance processes.
- Prepare reports and dashboards for data quality scores and lineage status.

Technical skills:
- Experience in data governance activities (preparation of data dictionary and data lineage documents).
- Proficient in writing database queries in any one database (SQL Server, Oracle, MySQL, PostgreSQL).
- Experience in data life cycle management.
- Understanding of data privacy and security frameworks specific to banking, such as PCI DSS and the DPDP Act.
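To ground the data dictionary work described above, here is a small illustrative sketch that harvests column metadata from a database into dictionary-style entries. SQLite is used only so the example is self-contained and runnable; a bank would point the same idea at SQL Server, Oracle, or PostgreSQL via their catalog views.

```python
import sqlite3

# Self-contained demo database; real usage would connect to the bank's RDBMS.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (cust_id INTEGER PRIMARY KEY, pan_no TEXT, dob TEXT)")

data_dictionary = []
tables = conn.execute("SELECT name FROM sqlite_master WHERE type = 'table'").fetchall()

for (table,) in tables:
    # PRAGMA table_info returns (cid, name, type, notnull, default, pk) per column.
    for cid, name, col_type, notnull, default, pk in conn.execute(f"PRAGMA table_info({table})"):
        data_dictionary.append({
            "table": table,
            "column": name,
            "type": col_type,
            "nullable": not notnull,
            "primary_key": bool(pk),
            "definition": "",  # business definition to be filled in by the data steward
        })

for entry in data_dictionary:
    print(entry)
```

Harvesting the technical metadata automatically leaves only the business definitions and classifications for the governance team to maintain by hand.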

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 13 Lacs

Pune

Work from Office

Key Responsibilities
- Lead the design, development, and deployment of Oracle Fusion SaaS solutions, particularly in Supply Chain and Finance.
- Build and maintain integrations using Oracle Integration Cloud (OIC), REST/SOAP web services, and middleware tools.
- Customize and extend Fusion applications using BI Publisher, OTBI, FBDI, HDL, and ADF.
- Translate business requirements into technical specifications and detailed solution designs.
- Support the full development lifecycle, including change management, documentation, testing, and deployment.
- Participate in formal design/code reviews and ensure adherence to coding standards.
- Collaborate with IT service providers to ensure quality, performance, and scalability of outsourced work.
- Provide Level 3 support for critical technical issues.
- Stay current with emerging Oracle technologies and contribute to continuous improvement initiatives.

Experience
- 5+ years of hands-on experience in Oracle Fusion SaaS development and technical implementation.
- Proven experience with Oracle Fusion Supply Chain and Finance modules.
- Intermediate level of relevant work experience (3-5 years minimum).

Skills & Technical Expertise
- Strong knowledge of Oracle SaaS architecture, data models, and PaaS extensions.
- Proficiency in Oracle Integration Cloud (OIC) and REST/SOAP APIs.
- Experience with Oracle tools: BI Publisher, OTBI, FBDI, HDL, ADF.
- Ability to analyze and revise existing systems for improvements.
- Familiarity with SDLC, version control, and automation tools.

Qualifications
- Bachelor's degree in Computer Science, Information Technology, Business, or a related field (or equivalent experience).
- Relevant certifications in Oracle Fusion or related technologies are a plus.
- Compliance with export controls or sanctions regulations may be required.

Core Competencies
- Customer Focus: builds strong relationships and delivers customer-centric solutions.
- Global Perspective: applies a broad, global lens to problem-solving.
- Manages Complexity: navigates complex information to solve problems effectively.
- Manages Conflict: resolves conflicts constructively and efficiently.
- Optimizes Work Processes: continuously improves processes for efficiency and effectiveness.
- Values Differences: embraces diverse perspectives and cultures.

Technical Competencies
- Solution Design & Configuration: designs scalable, secure, and maintainable solutions.
- Solution Functional Fit Analysis: evaluates how well components interact to meet business needs.
- Solution Modeling & Validation Testing: creates models and tests solutions to ensure they meet requirements.
- Performance Tuning & Data Modeling: optimizes application and database performance.
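For flavor, integrations like those above often boil down to calling Fusion REST endpoints. The sketch below is a hedged illustration only: the host, credentials, and resource path are placeholders (Fusion exposes Procurement resources under the fscmRestApi framework), and a production integration would more typically run through OIC.

```python
import requests

# Placeholder host and credentials for a hypothetical Fusion pod.
BASE = "https://your-pod.fa.example.oraclecloud.com"
AUTH = ("integration.user", "secret")

# Query a purchase-order resource via the Fusion REST framework,
# asking only for a few fields and limiting the page size.
resp = requests.get(
    f"{BASE}/fscmRestApi/resources/11.13.18.05/purchaseOrders",
    params={"limit": 5, "fields": "OrderNumber,Supplier,Status"},
    auth=AUTH,
    timeout=30,
)
resp.raise_for_status()

for po in resp.json().get("items", []):
    print(po)
```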

Posted 1 month ago

Apply

2.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description
Engineering graduate / postgraduate, preferably in Computer Science or MCA, with 2+ years of development experience in:
- Oracle and ADF-based applications
- RDBMS and data modeling concepts
- Oracle Database, SQL, and PL/SQL
- Client-side web development languages (JavaScript, HTML, DHTML, and CSS)

Desirable:
- REST API implementation
- SOA (REST-based microservices)
- Collaborative development (Gitflow, peer reviewing)
- Maven, SQL
- Continuous Integration/Delivery (Jenkins, Docker)

Diversity and Inclusion: An Oracle career can span industries, roles, countries, and cultures, giving you the opportunity to flourish in new roles and innovate, while blending work and life. Oracle has thrived through 40+ years of change by innovating and operating with integrity while delivering for the top companies in almost every industry. To nurture the talent that makes this happen, we are committed to an inclusive culture that celebrates and values diverse insights and perspectives, and a workforce that inspires thought leadership and innovation.

Responsibilities
Analyze, design, develop, troubleshoot, and debug software programs for commercial or end-user applications. Write code, complete programming, and perform testing and debugging of applications. Java developers need to be successful at building cloud-native applications; leverage deep integrations with familiar tools like Spring, Maven, Kubernetes, and IntelliJ to get started quickly. As a member of the software engineering division, you will perform high-level design based on provided external specifications, specify, design, and implement minor changes to existing software architecture, build highly complex enhancements, and resolve complex bugs. Build and execute unit tests and unit plans. Review integration and regression test plans created by QA. Communicate with QA and porting engineering as necessary to discuss minor changes to product functionality and to ensure quality and consistency across specific products. Duties and tasks are varied and complex, needing independent judgment. Fully competent in own area of expertise. May have a project lead role and/or supervise lower-level personnel. BS or MS degree or equivalent experience relevant to the functional area; 4 years of software engineering or related experience.

Qualifications
Career Level -

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process.
If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 month ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Chennai, Bengaluru

Work from Office

Immediate hiring for Azure Data Engineers / Leads at Hexaware Technologies.
Primary skill set: Azure Databricks, PySpark
Required total experience: 4 to 12 years
Location: Chennai and Bangalore only
Work mode: 5 days work from office
Shift timing: 1 pm to 10 pm
Notice: immediate and early joiners preferred

Job Description
Primary: Azure Databricks, ADF, PySpark/Python (a typical transformation step is sketched below)
Must have:
- 6+ years of IT experience in data warehousing and ETL
- Hands-on data experience with cloud technologies on Azure: ADF, Synapse, PySpark/Python
- Ability to understand design and source-to-target mapping (STTM) documents and create specification documents
- Flexibility to operate from client office locations
- Able to mentor and guide junior resources as needed
Nice to have:
- Any relevant certifications
- Banking experience in Risk & Regulatory, Commercial, or Credit Cards/Retail

Interested candidates, kindly share your updated resume to ramyar2@hexaware.com with the details below.
Full Name:
Contact No:
Total Exp:
Rel Exp in PL/SQL:
Current & Joining Location:
Notice Period (if serving, mention LWD):
Current CTC:
Expected CTC:
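A minimal PySpark sketch of the source-to-target mapping work this posting describes, assuming hypothetical storage paths and column names:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("stg-load").getOrCreate()

# Ingest raw CSV data (the ADLS path is an illustrative placeholder).
raw = spark.read.option("header", True).csv("abfss://raw@account.dfs.core.windows.net/orders/")

# Apply a simple source-to-target mapping: rename, cast, derive, filter.
staged = (
    raw.withColumnRenamed("ord_id", "order_id")
       .withColumn("amount", F.col("amount").cast("decimal(18,2)"))
       .withColumn("load_date", F.current_date())
       .filter(F.col("order_id").isNotNull())
)

# Write to the staging zone in Parquet for downstream Synapse/ADF consumption.
staged.write.mode("overwrite").parquet("abfss://staging@account.dfs.core.windows.net/orders/")
```

In a Databricks job the STTM document would drive exactly these rename/cast/derive rules, one column at a time.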

Posted 1 month ago

Apply

7.0 years

0 Lacs

Rajkot, Gujarat, India

On-site

Job Description
Oracle Global Services Center (GSC) is a fast-growing cloud consulting team passionate about our customers' rapid and successful adoption of Oracle Cloud Solutions. Our flexible and innovative "Optimum Shore" approach helps our clients implement, maintain, and integrate their Oracle Cloud Applications and Technology environments while reducing total cost of ownership. We assemble an efficient team for each client by blending resources from onshore, near-shore, and offshore global delivery centers to match the right expertise, to the right solution, for the right cost. To support our rapid growth, we are seeking versatile consultants who bring a passion for providing an excellent client experience, enabling client success by developing innovative solutions. Our cloud solutions are redefining the world of business, empowering governments, and helping society evolve with the pace of change. Join the team of top-class consultants and help our customers achieve more than ever before.

This is a senior consulting position, operating independently with some assistance and mentorship to a project team or customer, aligned with Oracle methodologies and practices. The role performs standard duties and tasks with some variation to implement Oracle products and technology to meet customer specifications. Career Level - IC2

Detailed Description
Operates independently to provide quality work products to an engagement. Performs multifaceted and complex tasks that need independent judgment. Applies Oracle methodology, company procedures, and leading practices. Demonstrates expertise to deliver solutions on complex engagements. May act as the functional team lead on projects. Efficiently collaborates with management of customer organizations. Participates in business development activities. Develops and configures detailed solutions for complex projects.

Detailed Requirements
The candidate is expected to have sound domain knowledge in HCM covering the hire-to-retire cycle, with 7 to 12 years of experience. They must have been part of at least 3 end-to-end HCM Cloud implementations, including at least 1 project as a lead.

Functional: The candidate must have knowledge of the Core HR module plus any of the following: Time and Labor, Absence Management, Payroll, Benefits, Compensation, Recruiting. The candidate should have been in client-facing roles and interacted with customers in requirement-gathering workshops, design, configuration, testing, and go-live. Engineering graduates with an MBA (HR) will be preferred.

Technical: In-depth understanding of the data model, business process functionality, and data flow in the HCM Cloud application and Oracle EBS / PeopleSoft (HRMS). Experienced in Cloud HCM conversions, integrations (HCM Extracts and BIP), reporting (OTBI and BIP), Fast Formula, and personalization. Engineering graduation in any field, an MCA degree, or equivalent experience. Proven experience with Fusion technologies including HDL, HCM Extracts, Fast Formulas, BI Publisher reports, and Design Studio. Beyond the above, advanced knowledge of OIC, ADF, Java, PaaS, DBCS, etc. would be an added advantage. Good functional and technical leadership capability with strong planning and follow-up skills: mentorship, work allocation, monitoring, and status updates to the project coordinator. Should have strong written and verbal communication skills, personal drive, flexibility, teamwork, problem-solving, influencing and negotiating skills, organizational awareness and sensitivity, engagement delivery, continuous improvement, knowledge sharing, and client management. Assist in the identification, assessment, and resolution of complex technical issues/problems. Interact with the client frequently around specific work efforts/deliverables. The candidate should be open to domestic and international travel for short as well as long durations.

Life at Oracle: We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation. At Oracle, we don't just value differences, we celebrate them. We are committed to crafting a workplace where all kinds of people work together, and we believe innovation starts with diversity. https://www.oracle.com/corporate/careers/culture/diversity.html

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio, including Software and Red Hat. Curiosity and a constant quest for knowledge serve as the foundation to success in IBM Consulting. In your role, you'll be encouraged to challenge the norm, investigate ideas outside of your role, and come up with creative solutions resulting in groundbreaking impact for a wide network of clients. Our culture of evolution and empathy centers on long-term career growth and development opportunities in an environment that embraces your unique skills and experience.

Your Role and Responsibilities
- Take full ownership of and deliver a component or functionality.
- Support the team to deliver project features with high quality and provide technical guidance.
- Work effectively both individually and with team members toward customer satisfaction and success.

Preferred Education: Master's Degree
Required Technical and Professional Expertise: SQL, ADF, Azure Databricks
Preferred Technical and Professional Experience: PostgreSQL, MSSQL; Eureka, Hystrix, Zuul/API Gateway; in-memory storage

Posted 1 month ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Gurgaon/Bangalore, India. AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable, enabling AXA XL's executive leadership team to maximize benefits and facilitate sustained competitive advantage. Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics (IDA) team, is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model, disrupting the insurance market.

As we develop an enterprise-wide data and digital strategy that moves us toward greater focus on the use of data and data-driven insights, we are seeking an Assistant Scientist for our Data Engineering team. The role will support the team's efforts toward creating, enhancing, and stabilizing the enterprise data lake through the development of data pipelines. It requires a team player who can work well with members of other disciplines to deliver data in an efficient and strategic manner.

What You'll Be Doing
What will your essential responsibilities include?
- Collaborate with cross-functional teams to understand data requirements and business needs.
- Utilize SQL to query and manipulate data from various sources, ensuring data accuracy and integrity.
- Develop and maintain data pipelines using DBT and Databricks to facilitate data processing and transformation.
- Assist in deploying and managing data solutions on Azure Cloud, ensuring optimal performance and security.
- Create and maintain documentation for data processes, data models, and system architecture.
- Participate in data quality checks and troubleshooting to resolve data-related issues.
- Maintain integrity and quality across all pipelines and environments.
- Understand and follow secure coding practices to ensure the code is not vulnerable.
- Bring ideas to the table that help streamline and rationalize BTL jobs.
- Lead small teams of strategic partner/vendor team members.
- Work with business users to bring requests to closure.

You will report to the Division Lead of Data Modeling.

What You Will Bring
We're looking for someone who has these abilities and skills:

Required Skills and Abilities
- Bachelor's degree in computer science, mathematics, statistics, finance, a related technical field, or equivalent work experience.
- Proficiency in SQL for database querying and management.
- Excellent programming skills in Python, with experience in data manipulation and analysis.
- Hands-on experience designing and developing ETL pipelines.
- Relevant years of exposure to, and good proficiency in, data warehousing concepts.
- Proficient in SQL and database design concepts.
- Good knowledge of unit testing and documentation (low-level design).

Desired Skills and Abilities
- Knowledge of DBT/Databricks is a plus.
- Knowledge of Git is a plus.
- Knowledge of the Azure cloud computing platform, with Azure Synapse and ADLS, is a plus.
- Knowledge of Databricks, ADF, and PySpark is a plus.
- Passion for data and experience working within a data-driven organization. You care about what you do, and what we do.
- Knowledge of GitHub and build management.

Who We Are
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals, and even some inspirational individuals, we don't just provide re/insurance, we reinvent it. How?
By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business: property, casualty, professional, financial lines, and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com.

What We Offer

Inclusion: AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture enables business growth and is critical to our success. That's why we have made a strategic commitment to attract, develop, advance, and retain the most inclusive workforce possible, and to create a culture where everyone can bring their full selves to work and reach their highest potential. It's about helping one another, and our business, to move forward and succeed.
- Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability, and inclusion, with 20 chapters around the globe
- Robust support for flexible working arrangements
- Enhanced family-friendly leave benefits
- Named to the Diversity Best Practices Index
- Signatory to the UK Women in Finance Charter
Learn more at axaxl.com/about-us/inclusion-and-diversity.

Total Rewards: AXA XL's Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle, and financial security. It provides competitive compensation and personalized, inclusive benefits that evolve as you do. We're committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence.

Sustainability: At AXA XL, sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 sustainability strategy, called "Roots of Resilience", focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations.

Our Pillars
Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems, the foundation of a sustainable planet and society, are essential to our future. We're committed to protecting and restoring nature, from mangrove forests to the bees in our backyard, by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans.
Addressing climate change: The effects of a changing climate are far-reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. We're building a net-zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions.
Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We're training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting.
AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL's "Hearts in Action" programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day, the Global Day of Giving. For more information, please see axaxl.com/sustainability.

Posted 1 month ago

Apply

6.0 - 9.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Module Lead - Power BI
Date: Jun 5, 2025
Job Requisition Id: 61183
Location: Hyderabad, TG, IN; Indore, MP, IN; Pune, MH, IN

YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we're a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth: bringing real positive changes in an increasingly virtual world, and it drives us beyond generational gaps and disruptions of the future.

We are looking forward to hiring Power BI professionals in the following areas:

Experience: 6-9 years

Job Description: Power BI Developer with Azure ADF
- Work on Power BI reports: develop new reports or fix data issues in existing reports, and support users with data validation.
- Support the data team in understanding functional requirements.
- Strong experience in SQL and writing complex DAX queries.
- Understand existing report requirements and capture new report specifications.
- Coordinate among various groups to understand report KPIs.
- Participate in data requirement sessions and develop Power BI reports.
- Provide solutioning and design prototypes for use-case reports.
- Specialization in different reporting tools.
- Responsible for report feature assessment and building the report matrix.
- Certifications: mandatory

At YASH, you are empowered to create a career that will take you where you want to go, while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale.
Our Hyperlearning workplace is grounded upon four principles:
- Flexible work arrangements, free spirit, and emotional positivity
- Agile self-determination, trust, transparency, and open collaboration
- All support needed for the realization of business goals
- Stable employment with a great atmosphere and an ethical corporate culture
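Roles like this often automate around the Power BI Service as well as authoring reports. As a sketch, the snippet below lists reports in a workspace via the Power BI REST API; the workspace ID is a placeholder, and in a real setup the access token would come from Azure AD (e.g., via MSAL) rather than a hard-coded string.

```python
import requests

# Placeholders: supply a real Azure AD token and workspace GUID.
ACCESS_TOKEN = "<aad-access-token>"
WORKSPACE_ID = "<workspace-guid>"

resp = requests.get(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}/reports",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    timeout=30,
)
resp.raise_for_status()

# Each entry carries the report name and its web URL, useful for audits
# of what is published where.
for report in resp.json().get("value", []):
    print(report["name"], report["webUrl"])
```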

Posted 1 month ago

Apply

8.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

This role is for one of our clients.
Industry: Technology, Information and Media
Seniority level: Mid-Senior level
Min Experience: 8 years
Location: Mumbai
Job Type: full-time

About The Role
We're seeking a Lead Cloud Data Architect to design and lead the execution of enterprise-grade data platforms in a modern cloud environment. This is a strategic and hands-on role, ideal for a data expert who thrives on building scalable, high-performance ecosystems across Azure cloud services, big data platforms, and enterprise analytics tools. You'll be instrumental in transforming our data infrastructure by driving architectural decisions, standardizing data governance, and ensuring secure, scalable, and accessible data systems. Your work will directly enable advanced analytics, reporting, and data science across the organization.

What You'll Be Doing
Cloud Data Architecture: Design modern, scalable data solutions on Microsoft Azure, integrating components such as Azure Data Lake, Azure Synapse, Azure SQL, and Azure Databricks. Build architecture blueprints to support streaming, batch, and real-time data processing.
Data Pipelines & Engineering: Build and orchestrate robust, fault-tolerant data pipelines using Azure Data Factory (ADF), Databricks, and custom ETL frameworks. Drive the transformation of structured, semi-structured, and unstructured data into optimized formats for downstream use.
Big Data & Analytics Integration: Utilize Azure Databricks for large-scale data processing, machine learning data prep, and distributed data transformations. Enable seamless data flows from lake to warehouse to visualization.
Data Governance & Quality: Implement robust data governance protocols including lineage, cataloging, classification, and access management. Ensure security, compliance, and adherence to regulatory data policies (GDPR, HIPAA, etc.).
BI & Reporting Enablement: Collaborate with analytics and reporting teams to provide highly available, performant, and clean data to tools like Power BI. Standardize KPIs, metric definitions, and data sources for enterprise-wide reporting consistency.
Collaboration & Leadership: Engage with product owners, business leaders, and engineering teams to define long-term data strategies. Mentor engineers, review architectures, and set coding and documentation standards.

What You Bring
- 8-10 years of experience in data architecture or engineering, including 5+ years designing solutions on Microsoft Azure.
- Expertise in Azure services: ADF, Azure Data Lake, Databricks, Synapse, Azure SQL, and Power BI.
- Proven experience building and optimizing ETL/ELT pipelines, with a deep understanding of data transformation frameworks.
- Proficiency in Python, SQL, or Scala for scripting, data wrangling, and logic workflows.
- In-depth understanding of data modeling, data warehouse concepts, performance tuning, and storage optimization.
- Familiarity with DevOps and CI/CD pipelines in the context of data engineering.
- Strong collaboration, documentation, and communication skills, with the ability to work cross-functionally.

Nice to Have
- Microsoft certifications such as Azure Data Engineer Associate or Solutions Architect.
- Hands-on experience with Azure Purview, Collibra, or other data governance and catalog tools.
- Familiarity with Apache Spark, version control systems, containerization (Docker), and orchestration (Kubernetes).
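For a flavor of the lake-to-warehouse flows described above, here is a minimal PySpark/Delta sketch of a raw-to-refined step. The paths, column names, and bronze/silver naming are illustrative assumptions, not this employer's actual design, and the cluster must have the Delta Lake (delta-spark) package available.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze-to-silver").getOrCreate()

# Bronze: raw events landed as-is (hypothetical ADLS mount path).
bronze = spark.read.format("delta").load("/mnt/lake/bronze/events")

# Silver: deduplicated, typed, and filtered for downstream analytics.
silver = (
    bronze.dropDuplicates(["event_id"])
          .withColumn("event_ts", F.to_timestamp("event_ts"))
          .filter(F.col("event_type").isNotNull())
)

silver.write.format("delta").mode("overwrite").save("/mnt/lake/silver/events")
```

Keeping each refinement step small and idempotent like this is what makes the fault-tolerant orchestration (ADF retries, reruns) workable in practice.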

Posted 1 month ago

Apply

1.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Job Title: Campaign Associate
Location: Noida
Company: Adfluence Hub
Industry: Influencer Marketing
Employment Type: Full-time

About Us: At Adfluence Hub, we pride ourselves on being a leading influencer marketing agency that delivers impactful and authentic campaigns for our clients. We are seeking a dedicated and dynamic Campaign Manager to join our team and contribute to our mission of excellence.

Position Overview: We are seeking a highly skilled Campaign Manager to oversee the strategic execution of large-scale micro- and nano-influencer campaigns. This role requires a detail-oriented and results-driven professional who can manage the full campaign lifecycle, from influencer identification to execution and performance analysis.

Key Responsibilities:
Influencer Sourcing & Relationship Management: You will be responsible for identifying and onboarding relevant influencers, both micro and macro, ensuring they meet our standards for audience quality and engagement. Building and maintaining a robust network of micro-influencers is crucial for efficient campaign scaling. You'll negotiate competitive pricing, achieve monthly sign-up targets, and cultivate long-term relationships for continued collaboration.
Campaign Execution & Coordination: You will develop and execute influencer marketing strategies aligned with client goals, working closely with internal teams to define objectives and timelines. Precision in managing contracts, deliverables, and payments is essential. You'll ensure brand compliance and oversee all aspects of campaign execution, from content approvals to final rollouts.
Analytics & Performance Tracking: Utilizing data-driven insights, you'll track and analyze campaign performance, focusing on ROI, engagement, and conversions. You'll leverage analytics tools along with the ADF tech platform to monitor influencer impact and optimize campaigns, delivering post-campaign reports with actionable insights for continuous improvement.
Process Optimization & Automation: You will implement streamlined workflows for influencer onboarding and campaign execution, leveraging tools like Google Spreadsheets to automate tracking and reporting. Collaborating with platform and tech teams, you'll enhance influencer recruitment and campaign scalability.

Key Performance Indicators (KPIs):
- Timely campaign execution
- Comprehensive tracker maintenance
- Influencer satisfaction levels
- Campaign performance metrics
- Influencer onboarding efficiency

Qualifications & Skills:
Experience: Minimum 1+ years of experience in influencer marketing, with a focus on micro-influencer campaigns. Experience in the beauty and personal care industry is a plus.
Core Competencies:
- Influencer Relationship Management: ability to build and maintain strong influencer partnerships.
- Project Management: strong organizational and time-management skills, capable of managing multiple campaigns simultaneously.
- Communication & Negotiation: excellent verbal and written communication skills, with proven negotiation abilities.
- Strategic Thinking: ability to develop and execute data-driven influencer marketing strategies.
- Data Analysis: ability to interpret campaign metrics and optimize accordingly.
Technical Skills: Proficiency in Google Spreadsheets, analytics tools, basic video editing, and email communication platforms.
Professional Attributes: Results-driven and highly motivated, with a commitment to achieving campaign objectives. Proactive and adaptable, capable of thriving in a fast-paced environment.
Strong attention to detail and a commitment to quality. Ability to work well within a team.

Company Culture: At Adfluence Hub, we value creativity, collaboration, and a positive work environment. We believe in fostering growth and development, both professionally and personally, and strive to create an inclusive and supportive workplace for all our team members.

How to Apply: If you are passionate about influencer marketing and possess the skills to drive impactful campaigns, we would love to hear from you. Please submit your resume. Join us and be part of a team that values innovation, collaboration, and campaign success!

Posted 1 month ago

Apply

0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand what business questions can be answered and how to unlock the answers.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, or status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination or harassment based on the above considerations.

Responsibilities
- Design, implement, manage, and optimize data pipelines in Azure Data Factory per customer business requirements (a minimal pipeline-definition sketch appears below).
- Design and develop SparkSQL/PySpark code in Databricks.
- Integrate different Azure services and external systems to implement data analytics solutions.
- Design and develop code in Azure Logic Apps, Azure Functions, Azure SQL, Synapse, etc.
- Implement best practices in ADF, Databricks, other Azure data engineering services, and target databases to maximize job performance, ensure code reusability, and minimize implementation and maintenance cost.
- Ingest structured, semi-structured, and unstructured data into ADLS/Blob Storage in batch, near real time, or real time from different source systems, including RDBMSs, ERPs, file systems, storage services, APIs, event producers, and NoSQL DBs.
- Develop advanced code using SQL/Python/Scala/Spark/data engineering tools/other query languages to process data per business requirements.
- Develop data ingestion, integration, and transformation frameworks to ensure best-in-class data services.
Understand the client's business requirements to design data models accordingly. Design data warehouses, data lakes, and other modern analytics systems to provide batch, near-real-time, and real-time data analytics capabilities to customers.

Mandatory Skill Sets: ADF + Python, Azure data engineering
Preferred Skill Sets: ADF + Python, Azure data engineering
Years of Experience Required: 3-8
Education Qualification: B.Tech/MBA/MCA
Degrees/Field of Study Required: Bachelor of Engineering, Master of Business Administration
Required Skills: Python (Programming Language)
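To make the ADF pipeline work concrete, here is a minimal sketch using the azure-mgmt-datafactory Python SDK to define a copy pipeline. The subscription, resource group, factory, and dataset names are hypothetical placeholders; the datasets and linked services are assumed to exist already, and exact model classes can vary slightly between SDK versions.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference, BlobSource, BlobSink,
)

# Hypothetical identifiers; supply your own subscription and resource names.
SUBSCRIPTION_ID = "<subscription-id>"
RG, FACTORY = "rg-data", "adf-demo"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# One copy activity moving data between two pre-defined datasets.
copy_raw = CopyActivity(
    name="CopyRawToStaging",
    inputs=[DatasetReference(type="DatasetReference", reference_name="RawBlob")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="StagingBlob")],
    source=BlobSource(),
    sink=BlobSink(),
)

client.pipelines.create_or_update(
    RG, FACTORY, "IngestPipeline", PipelineResource(activities=[copy_raw])
)
```

Defining pipelines in code like this (rather than only in the ADF portal) is one common way teams get the reusability and review practices the responsibilities above call for.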

Posted 1 month ago

Apply

3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

This role is for one of our clients.
Industry: Technology, Information and Media
Seniority level: Associate level
Min Experience: 3 years
Location: Mumbai
Job Type: full-time

About The Role
We are looking for an enthusiastic and data-driven Associate Data Visualization Specialist to join our data & insights team. This role is ideal for someone with a strong foundation in data analytics who enjoys crafting clear, interactive, and impactful visual stories that drive smarter decisions across the business. As part of a collaborative team, you will work with large datasets, business users, and technical teams to create intuitive dashboards and reports. You'll support the development of analytical tools and bring clarity to complex data by transforming it into meaningful insights using Power BI, Tableau, and Azure Databricks.

What You'll Be Doing
Data Visualization & Dashboarding: Create compelling, easy-to-understand dashboards and visualizations using Power BI or Tableau. Translate raw data into clear stories that guide business decisions and highlight opportunities. Ensure consistency, accuracy, and visual best practices in every dashboard.
Data Preparation & Transformation: Support data transformation tasks, collaborating with data engineers and analysts. Use tools such as Azure Data Factory (ADF) to ingest and process structured and semi-structured data. Clean, shape, and enrich datasets to ensure readiness for visualization.
Big Data & Cloud Integration: Utilize Azure Databricks and PySpark to analyze large datasets and extract relevant insights. Assist in managing and optimizing data pipelines within the Azure ecosystem.
Optimization & QA: Monitor the performance and accuracy of dashboards, identifying and resolving issues proactively. Optimize queries and visuals to reduce latency and improve responsiveness.
Stakeholder Engagement: Work with business teams to understand data needs and define success metrics. Provide training, documentation, and walkthroughs on dashboard usage to non-technical stakeholders.

What You Bring
- 2-3 years of experience in data visualization, business intelligence, or analytics.
- Strong knowledge of Power BI and/or Tableau, with a portfolio of dashboards or visual projects.
- Hands-on experience with Python and PySpark for data analysis and manipulation.
- Familiarity with Azure Databricks, Azure SQL, and Azure Data Factory (ADF).
- Good understanding of ETL concepts, KPI design, and storytelling with data.
- Analytical mindset, attention to detail, and ability to work independently and collaboratively.

Nice to Have
- Strong communication skills to explain data concepts to non-technical users.
- A curious, self-starter attitude with eagerness to grow in cloud data and analytics technologies.
- Basic understanding of data governance, metadata, and security practices.
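As a sketch of the "shape data for a dashboard" step this role describes, the PySpark snippet below builds a small KPI table that a Power BI or Tableau report could consume. The dataset, paths, and column names are made-up examples.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kpi-prep").getOrCreate()

# Hypothetical curated sales dataset already landed in the lake.
sales = spark.read.parquet("/mnt/curated/sales")

# Shape a compact KPI table for the dashboard:
# monthly revenue and distinct order counts per region.
kpis = (
    sales.withColumn("month", F.date_trunc("month", "order_date"))
         .groupBy("region", "month")
         .agg(
             F.sum("amount").alias("revenue"),
             F.countDistinct("order_id").alias("orders"),
         )
)

# A small, pre-aggregated serving table keeps the BI tool responsive.
kpis.write.mode("overwrite").parquet("/mnt/serving/sales_kpis")
```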

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About the Company: Our client is a leading Indian multinational IT services and consulting firm. It provides digital transformation, cloud computing, data analytics, enterprise application integration, infrastructure management, and application development services. The company caters to over 700 clients across industries such as banking and financial services, manufacturing, technology, media, retail, and travel & hospitality. Its industry-specific solutions are designed to address complex business challenges by combining domain expertise with deep technical capabilities, backed by a global workforce of over 80,000 professionals and a presence in more than 50 countries.

Job Title: Python Developer
Locations: PAN India
Experience: 5-10 years (relevant)
Employment Type: Contract to Hire
Work Mode: Work from Office
Notice Period: Immediate to 15 days

Job Description:
- Design, implement, and manage cloud-based applications using Azure services.
- Develop Infrastructure as Code to automate the provisioning and management of cloud resources.
- Host websites and Python-based APIs on Azure Web Apps.
- Write and maintain scripts for automation and deployment.
- Primary skills: Python and Azure application management; expertise in Azure Data Factory; familiarity with Snowflake data warehousing; performance optimisation of data workflows.
- Data collection: collect, aggregate, and manage data from various sources, including APIs, S3 buckets, Excel files, CSV files, Blob storage, and SharePoint. Flatten and transform JSON data and model it appropriately for downstream processes (see the sketch below).
- Data transformation and processing: utilise tools and technologies to perform data transformations and ensure data quality. Develop and maintain data pipelines and ETL processes to move data from source to target systems.
- Data flow development: design and implement data flows in Azure Data Factory (ADF) to support data transformations. Collaborate with other teams to define data transformation requirements and ensure successful data flow execution.
- Scripting and automation: write and optimise SQL queries and stored procedures for data extraction, transformation, and loading. Develop Python scripts for data manipulation, processing, and integration tasks.
- Data warehouse management: work with Snowflake and other data warehousing tools to design and maintain data models and schemas. Ensure data availability, integrity, and security in the data warehouse environment.
- Performance optimisation: monitor and optimise the performance of data pipelines and data integration processes. Identify and resolve performance bottlenecks in data processing workflows.
- Documentation and reporting: document data integration processes, workflows, and best practices. Generate reports and provide insights based on data analysis to support business decision-making.
- Collaboration and communication: collaborate with cross-functional teams to understand data requirements and provide data-related support. Communicate effectively with stakeholders to ensure data solutions meet business needs.

Skill sets required: data integration from multiple sources. Primary: Python, Azure application management, Snowflake, SQL.
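Since the posting emphasizes flattening JSON for downstream modeling, a minimal pandas sketch of that step might look like this; the payload shape is a made-up example standing in for an API or Blob storage response.

```python
import pandas as pd

# Hypothetical nested payload, e.g. fetched from an API or Blob storage.
payload = [
    {"id": 1, "customer": {"name": "Acme", "tier": "gold"},
     "orders": [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}]},
]

# Flatten to one row per order, carrying parent attributes along.
flat = pd.json_normalize(
    payload,
    record_path="orders",
    meta=["id", ["customer", "name"], ["customer", "tier"]],
)
print(flat)
#   sku  qty  id customer.name customer.tier
# 0  A1    2   1          Acme          gold
# 1  B2    1   1          Acme          gold
```

The resulting flat table is what would then be staged into Snowflake or shaped further in ADF.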

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Greater Kolkata Area

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Senior Associate

Job Description & Summary: A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. Creating business intelligence from data requires an understanding of the business, the data, and the technology used to store and analyse that data. Using our Rapid Business Intelligence Solutions, data visualisation, and integrated reporting dashboards, we can deliver agile, highly interactive reporting and analytics that help our clients run their business more effectively and understand what business questions can be answered and how to unlock the answers.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences, or status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Key Responsibilities:
Data Lake and Lakehouse Implementation:
- Design, implement, and manage Data Lake and Lakehouse architectures. (Must have)
- Develop and maintain scalable data pipelines and workflows. (Must have)
- Utilize Azure Data Lake Services (ADLS) for data storage and management. (Must have)
- Knowledge of Medallion Architecture and the Delta format. (Must have)
Data Processing and Transformation:
- Use PySpark for data processing and transformations. (Must have)
- Implement Delta Live Tables for real-time data processing and analytics; see the sketch after this posting. (Good to have)
- Ensure data quality and consistency across all stages of the data lifecycle. (Must have)
Data Management and Governance:
- Employ Unity Catalog for data governance and metadata management. (Good to have)
- Ensure robust data security and compliance with industry standards. (Must have)
Data Integration:
- Extract, transform, and load (ETL) data from multiple sources (Must have), including SAP (Good to have), Dynamics 365 (Good to have), and other systems.
- Utilize Azure Data Factory (ADF) and Synapse Analytics for data integration and orchestration. (Must have)
- Performance optimization of jobs. (Must have)
Data Storage and Access:
- Implement and manage Azure Data Lake Storage (ADLS) for large-scale data storage. (Must have)
- Optimize data storage and retrieval processes for performance and cost-efficiency. (Must have)
Collaboration and Communication:
- Work closely with data scientists, analysts, and other stakeholders to understand data requirements. (Must have)
- Provide technical guidance and mentorship to junior team members. (Good to have)
Continuous Improvement:
- Stay updated with the latest industry trends and technologies in data engineering and cloud computing. (Good to have)
- Continuously improve data processes and infrastructure for efficiency and scalability. (Must have)

Required Skills and Qualifications:
Technical Skills:
- Proficient in PySpark and Python for data processing and analysis.
- Strong experience with Azure Data Lake Services (ADLS) and Data Lake architecture.
- Hands-on experience with Databricks for data engineering and analytics.
- Knowledge of Unity Catalog for data governance.
- Expertise in Delta Live Tables for real-time data processing.
- Familiarity with Azure Fabric for data integration and orchestration.
- Proficient in Azure Data Factory (ADF) and Synapse Analytics for ETL and data warehousing.
- Experience pulling data from multiple sources such as SAP, Dynamics 365, and others.
Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
- Ability to work independently and as part of a team.
- Attention to detail and commitment to data accuracy and quality.

Certifications Required:
- Certification in Azure Data Engineering or relevant Azure certifications: DP-203 (Must have)
- Databricks Certified Data Engineer Associate (Must have)
- Databricks Certified Data Engineer Professional (Good to have)

Mandatory skill sets: Azure DE, PySpark, Databricks
Preferred skill sets: Azure DE, PySpark, Databricks
Years of experience required: 5-10 years
Educational Qualification: BE, B.Tech, MCA, M.Tech
Degrees/Field of Study Required: Bachelor of Engineering, Master of Engineering
Required Skills: Microsoft Azure
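Since Delta Live Tables appears as a good-to-have, here is a minimal DLT sketch. Note that it only runs inside a Databricks DLT pipeline (the `spark` session and the `dlt` module are provided by that runtime), and the source path and expectation rule are illustrative assumptions.

```python
import dlt
from pyspark.sql import functions as F

@dlt.table(comment="Raw orders landed from ADLS (illustrative path)")
def bronze_orders():
    # `spark` is injected by the Databricks runtime in a DLT pipeline.
    return spark.read.format("json").load("/mnt/raw/orders")

@dlt.table(comment="Cleansed orders for analytics")
@dlt.expect_or_drop("valid_amount", "amount > 0")  # declarative data-quality rule
def silver_orders():
    return dlt.read("bronze_orders").where(F.col("order_id").isNotNull())
```

The expectation decorator is what ties this to the posting's data-quality requirement: rows failing the rule are dropped and counted in the pipeline's quality metrics.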

Posted 1 month ago

Apply

4.0 years

6 - 9 Lacs

Hyderābād

On-site

Analyze, design, develop, troubleshoot, and debug software programs for commercial or end-user applications; write code, complete programming, and perform testing and debugging of applications. As a member of the software engineering division, you will perform high-level design based on provided external specifications, specify, design, and implement minor changes to existing software architecture, build highly complex enhancements, and resolve complex bugs. You will build and execute unit tests and unit plans, review integration and regression test plans created by QA, and communicate with QA and porting engineering as necessary to discuss minor changes to product functionality and to ensure quality and consistency across specific products. Duties and tasks are varied and complex, needing independent judgment. Fully competent in own area of expertise; may have a project lead role and/or supervise lower-level personnel. BS or MS degree or equivalent experience relevant to functional area, plus 4 years of software engineering or related experience. Career Level - IC3

Our Procurement Cloud is a key offering from the Oracle Applications Cloud Suite. Procurement Cloud is a fast-growing division within Oracle Cloud Applications and has a variety of customers, from a leading fast-food chain to the world's largest furniture maker. Procurement Cloud Development works on sophisticated areas ranging from a complex search engine to a time-critical auctions/bidding process to core business functionalities like bulk order processing, to name a few. As a member of our team, you will use the latest technologies, including JDeveloper, ADF, Oracle 12c Database, Oracle SQL, BPEL, Oracle Text, BC4J, web services, and service-oriented architectures (SOA). In addition to gaining this technical experience, you will also be exposed to the business side of the industry. Developers are involved in the entire development cycle, so you will have the chance to take part in activities such as working with the product management team to define the product's functionality and interacting with customers to resolve issues. So are you looking to be technically challenged and gain business experience? Do you want to be part of a team of upbeat, hard-working developers who know how to work and have fun at the same time? Well, look no further. Join us and be the newest member of Fusion Procurement Development!

Skills/languages: 1-8 years of experience building Java-based applications. Good programming skills and excellent analytical/logical skills. Able to craft a feature from end to end. Can think outside the box, has practical knowledge of the given technologies, and can apply logic to tackle a technical problem even without prior background in it. Should be persistent. Experience in BPEL, Workflow System, ADF, REST implementation, AI/ML, and Scrum processes is a plus.
Required: Java, OOPS concepts, JavaScript/VBCS/JET
Optional: JDBC, XML, SQL, PL/SQL, Unix/Linux, REST, ADF, AI/ML, Scrum

Posted 1 month ago

Apply

10.0 - 19.0 years

8 - 9 Lacs

Thiruvananthapuram

On-site

10 - 19 Years | 10 Openings | Trivandrum

Role description
Role Proficiency: This role requires proficiency in developing data pipelines, including coding and testing for ingesting, wrangling, transforming, and joining data from various sources. The ideal candidate should be adept in ETL tools like Informatica, Glue, Databricks, and DataProc, with strong coding skills in Python, PySpark, and SQL. This position demands independence and proficiency across various data domains. Expertise in data warehousing solutions such as Snowflake, BigQuery, Lakehouse, and Delta Lake is essential, including the ability to calculate processing costs and address performance issues. A solid understanding of DevOps and infrastructure needs is also required.

Outcomes: Act creatively to develop pipelines/applications by selecting appropriate technical options, optimizing application development, maintenance, and performance through design patterns and reusing proven solutions. Support the Project Manager in day-to-day project execution and account for the developmental activities of others. Interpret requirements and create optimal architecture and design solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code using best standards; debug and test solutions to ensure best-in-class quality. Tune performance of code and align it with the appropriate infrastructure, understanding cost implications of licenses and infrastructure. Create data schemas and models effectively. Develop and manage data storage solutions, including relational databases, NoSQL databases, Delta Lakes, and data lakes. Validate results with user representatives, integrating the overall solution. Influence and enhance customer satisfaction and employee engagement within project teams.

Measures of Outcomes: Adherence to engineering processes and standards; adherence to schedule/timelines; adherence to SLAs where applicable; number of defects post delivery; number of non-compliance issues; reduction of recurrence of known defects; quick turnaround of production bugs; completion of applicable technical/domain certifications; completion of all mandatory training requirements; efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times); average time to detect, respond to, and resolve pipeline failures or data issues; number of data security incidents or compliance breaches.

Outputs Expected:
Code: Develop data processing code with guidance, ensuring performance and scalability requirements are met. Define coding standards, templates, and checklists. Review code for team and peers.
Documentation: Create/review templates, checklists, guidelines, and standards for design/process/development. Create/review deliverable documents, including design documents, architecture documents, infra costing, business requirements, source-target mappings, test cases, and results.
Configure: Define and govern the configuration management plan. Ensure compliance from the team.
Test: Review/create unit test cases, scenarios, and execution. Review test plans and strategies created by the testing team. Provide clarifications to the testing team.
Domain Relevance: Advise data engineers on the design and development of features and components, leveraging a deeper understanding of business needs. Learn more about the customer domain and identify opportunities to add value. Complete relevant domain certifications.
Manage Project: Support the Project Manager with project inputs. Provide inputs on project plans or sprints as needed. Manage the delivery of modules.
Manage Defects: Perform defect root cause analysis (RCA) and mitigation. Identify defect trends and implement proactive measures to improve quality.
Estimate: Create and provide input for effort and size estimation and plan resources for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, business components, and data models.
Interface with Customer: Clarify requirements and provide guidance to the Development Team. Present design options to customers. Conduct product demos. Collaborate closely with customer architects to finalize designs.
Manage Team: Set FAST goals and provide feedback. Understand team members' aspirations and provide guidance and opportunities. Ensure team members are upskilled. Engage the team in projects. Proactively identify attrition risks and collaborate with BSE on retention measures.
Certifications: Obtain relevant domain and technology certifications.

Skill Examples: Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning. Experience in data warehouse design and cost improvements. Apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Communicate and explain design/development aspects to customers. Estimate time and resource requirements for developing/debugging features/components. Participate in RFP responses and solutioning. Mentor team members and guide them in relevant upskilling and certification.

Knowledge Examples: Knowledge of various ETL services used by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/Dataflow, Azure ADF, and ADLS. Proficiency in SQL for analytics and windowing functions. Understanding of data schemas and models. Familiarity with domain-related data. Knowledge of data warehouse optimization techniques. Understanding of data security concepts. Awareness of patterns, frameworks, and automation practices.
Skills: Scala, Python, PySpark

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact—touching billions of lives in the process.
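The posting's knowledge examples call out SQL windowing functions. As a hedged illustration only, here is a small self-contained PySpark sketch of that technique: ranking each customer's orders by recency and keeping the latest. The column names and sample rows are invented for the example.

```python
from pyspark.sql import SparkSession, Window
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("window-demo").getOrCreate()

# Tiny invented dataset standing in for a real orders table.
orders = spark.createDataFrame(
    [(1, "2024-01-05", 120.0), (1, "2024-02-01", 80.0), (2, "2024-01-20", 200.0)],
    ["customer_id", "order_date", "amount"],
)

# Window over each customer, newest order first; row_number() picks the latest.
w = Window.partitionBy("customer_id").orderBy(F.col("order_date").desc())
latest = (orders
          .withColumn("rn", F.row_number().over(w))
          .filter(F.col("rn") == 1)
          .drop("rn"))
latest.show()
```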

Posted 1 month ago

Apply

0 years

0 Lacs

Andhra Pradesh

On-site

To be responsible for data modelling, design, and development of the batch and real-time extraction, load, transform (ELT) processes, and the setup of the data integration framework, ensuring best practices are followed during integration development.

Bachelor's degree in CS/IT or related field (minimum)
Azure Data Engineer (ADF, ADLS, MS Fabric), Databricks
Azure DevOps, Confluence

About Virtusa
Teamwork, quality of life, professional and personal development: values that Virtusa is proud to embody. When you join us, you join a team of 27,000 people globally that cares about your growth — one that seeks to provide you with exciting projects, opportunities and work with state-of-the-art technologies throughout your career with us. Great minds, great potential: it all comes together at Virtusa. We value collaboration and the team environment of our company, and seek to provide great minds with a dynamic place to nurture new ideas and foster excellence. Virtusa was founded on principles of equal opportunity for all, and so does not discriminate on the basis of race, religion, color, sex, gender identity, sexual orientation, age, non-disqualifying physical or mental disability, national origin, veteran status or any other basis covered by appropriate law. All employment is decided on the basis of qualifications, merit, and business need.

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

Goregaon, Maharashtra, India

On-site

Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Data, Analytics & AI
Management Level: Manager

Job Description & Summary: At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

Key Responsibilities:
Designs, implements, and maintains reliable and scalable data infrastructure. Writes, deploys, and maintains software to build, integrate, manage, maintain, and quality-assure data. Designs, develops, and delivers large-scale data ingestion, data processing, and data transformation projects on the Azure cloud. Mentors and shares knowledge with the team through design reviews, discussions, and prototypes. Works with customers to deploy, manage, and audit standard processes for cloud products. Adheres to and advocates for software and data engineering standard processes (e.g. technical design and review, unit testing, monitoring, alerting, source control, code review, and documentation). Deploys secure and well-tested software that meets privacy and compliance requirements; develops, maintains, and improves CI/CD pipelines. Ensures service reliability and follows site-reliability engineering standard processes: on-call rotations for services they maintain, with responsibility for defining and maintaining SLAs. Designs, builds, deploys, and maintains infrastructure as code. Containerizes server deployments. Works as part of a cross-disciplinary team, closely with other data engineers, software engineers, data scientists, data managers, and business partners in a Scrum/Agile setup.

Job Requirements:
Education: Bachelor's or higher degree in Computer Science, Engineering, Information Systems, or other quantitative fields.
Experience: 8 to 12 years of relevant experience. Deep and hands-on experience designing, planning, productionizing, maintaining, and documenting reliable and scalable data infrastructure and data products in complex environments. Hands-on experience with: Spark for data processing (batch and/or real-time); configuring Delta Lake on Azure Databricks; languages: SQL, PySpark, Python; cloud platforms: Azure, including Azure Data Factory (must), Azure Data Lake (must), Azure SQL DB (must), Synapse (must), SQL Pools (must), and Databricks (good to have); designing data solutions in Azure, including data distributions and partitions, scalability, cost management, disaster recovery, and high availability; Azure DevOps (or similar tools) for source control and building CI/CD pipelines. Experience designing and implementing large-scale distributed systems. Customer management and front-ending, and the ability to lead large organizations through influence.

Desirable Criteria: Strong customer management: owns the delivery for the Data track with customer stakeholders. Continuous learning and improvement attitude.

Key Behaviors:
Empathetic: Cares about our people, our community, and our planet.
Curious: Seeks to explore and excel.
Creative: Imagines the extraordinary.
Inclusive: Brings out the best in each other.

Mandatory Skill Sets ('Must have' knowledge, skills, and experiences): Synapse, ADF, Spark, SQL, PySpark, Spark SQL
Preferred Skill Sets ('Good to have' knowledge, skills, and experiences): Cosmos DB, data modeling, Databricks, Power BI, and experience building analytics solutions with SAP as a data source for ingestion pipelines.
Depth: The candidate should have in-depth, hands-on experience with end-to-end solution design in Azure Data Lake, ADF pipeline development and debugging, various file formats, Synapse, and Databricks, with excellent coding skills in PySpark and SQL, strong logic-building capabilities, and sound knowledge of workload optimization.

Years of Experience Required: 8 to 12 years of relevant experience
Education Qualification: BE, B.Tech, ME, M.Tech, MBA, MCA (60% and above)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering, Bachelor of Technology, Master of Engineering
Required Skills: Apache Synapse
Optional Skills: Microsoft Power Business Intelligence (BI)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
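Since the role centers on Delta Lake on Azure Databricks and on data distributions and partitions, here is a minimal, hypothetical sketch of a partitioned Delta write. The ADLS paths, dataset, and partition column are illustrative assumptions only, not PwC's actual design.

```python
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-partition-demo").getOrCreate()

# Hypothetical raw events landed as JSON on ADLS.
events = spark.read.json("abfss://lake@example.dfs.core.windows.net/raw/events/")

# Partitioning on a low-cardinality date column lets downstream queries
# prune whole directories instead of scanning the full table.
(events.write.format("delta")
       .mode("append")
       .partitionBy("event_date")
       .save("abfss://lake@example.dfs.core.windows.net/curated/events"))
```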

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Skill: Azure (ADB, ADF, Azure Synapse), SQL, PySpark
Experience: 5-12 years
Location: PAN India
Notice period: Immediate to 30 days only

Posted 1 month ago

Apply

5.0 - 8.0 years

0 Lacs

Mohali district, India

On-site

Company Description
Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details
Position: Support Engineer – AI & Data
Experience: 5-8 Years
Work Mode: Onsite
Location: Pune or Mohali

Job Overview
We are seeking a motivated and talented Support Engineer to join our AI & Data Team. This job offers a unique opportunity to gain hands-on experience with the latest tools and technologies, quality documentation preparation, and Software Development Lifecycle responsibilities. If you are passionate about technology and eager to apply your knowledge in a real-world setting, this role is perfect for you.

Key Responsibilities
Collaborate with the AI & Data Team to support various projects. Utilize MS Office tools for documentation and project management tasks. Assist in the development, testing, deployment, and support of BI solutions. Participate in ITIL process management. Prepare and maintain high-quality documentation for various processes and projects. Stay updated with the latest industry trends and technologies to contribute innovative ideas.

Essential Requirements
Experience in SQL, Azure Data Factory (ADF), and data modeling is a must. Experience in Logic Apps and Azure integrations is nice to have. Good communication skills; the role involves connecting with stakeholders directly. Strong critical thinking and problem-solving skills. Certification in any industry-relevant skill is an advantage.

Preferred Skills and Qualifications
Strong understanding of software development and testing principles. Familiarity with data warehousing concepts and technologies. Excellent written and verbal communication skills. Ability to work both independently and as part of a team. Attention to detail and strong organizational skills.

What We Offer
Hands-on experience with the latest digital tools and technologies. Exposure to real-world projects and industry best practices. Opportunities to prepare and contribute to quality documentation. Experience in SDET responsibilities, enhancing your software testing and development skills. Mentorship from experienced professionals in the field.

Skills: management, development, AI, MS Office, data modeling, Azure, testing, data, software development lifecycle, documentation, ITIL process management, Azure Data Factory, ITIL, SQL, data warehousing, Logic Apps, Azure integrations
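The essential requirements pair SQL with data modeling. As a hedged, self-contained sketch of what a minimal dimensional model looks like in plain SQL (run here against an in-memory SQLite database so it works anywhere), the table and column names below are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# A tiny star schema: one dimension table and one fact table referencing it.
conn.executescript("""
    CREATE TABLE dim_customer (
        customer_key  INTEGER PRIMARY KEY,
        customer_name TEXT NOT NULL
    );
    CREATE TABLE fact_sales (
        sale_id      INTEGER PRIMARY KEY,
        customer_key INTEGER NOT NULL REFERENCES dim_customer(customer_key),
        sale_date    TEXT NOT NULL,
        amount       REAL NOT NULL
    );
""")
conn.execute("INSERT INTO dim_customer VALUES (1, 'Acme Ltd')")
conn.execute("INSERT INTO fact_sales VALUES (1, 1, '2024-03-01', 99.5)")

# Analytical query: aggregate facts by dimension attribute.
print(conn.execute(
    "SELECT c.customer_name, SUM(f.amount) "
    "FROM fact_sales f JOIN dim_customer c USING (customer_key) "
    "GROUP BY c.customer_name"
).fetchall())
```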

Posted 1 month ago

Apply

5.0 - 10.0 years

3 - 6 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Sr Data Engineer - Detailed JD (Roles and Responsibilities)
Education: Bachelor's degree in Computer Science or Engineering.
Candidate should have 5+ years of experience in data engineering or any related data solutions role.
Hands-on experience solutioning and implementing analytical capabilities using the Azure Data Analytics platform, including Azure Data Factory, Azure Logic Apps, Azure Functions, Azure Storage, Azure SQL Data Warehouse/Synapse, and Azure Data Lake.
Candidate should be capable of supporting all phases of analytical development, from identification of key business questions through data collection and ETL.
Good experience developing data solutions on Lakehouse platforms like Dremio is an added benefit.
Strong knowledge of data modelling and data design is a plus.
Microsoft data certification is a plus.

Mandatory skills: Azure Data Factory
Desired skills: Azure Data Factory, Data Modeling
Domain: Financial Services
Work Location: Any location (PAN India), WFO - Hybrid
Working in shifts: No
Years of Experience: 5+ years

Posted 1 month ago

Apply

5.0 - 8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Company Description
Optimum Data Analytics is a strategic technology partner delivering reliable turnkey AI solutions. Our streamlined approach to development ensures high-quality results and client satisfaction. We bring experience and clarity to organizations, powering every human decision with analytics & AI. Our team consists of statisticians, computer science engineers, data scientists, and product managers. With expertise, flexibility, and cultural alignment, we understand the business, analytics, and data management imperatives of your organization. Our goal is to change how AI/ML is approached in the service sector and deliver outcomes that matter. We provide best-in-class services that increase profit for businesses and deliver improved value for customers, helping businesses grow, transform, and achieve their objectives.

Job Details
Position: Support Engineer – AI & Data
Experience: 5-8 Years
Work Mode: Onsite
Location: Pune or Mohali

Job Overview
We are seeking a motivated and talented Support Engineer to join our AI & Data Team. This job offers a unique opportunity to gain hands-on experience with the latest tools and technologies, quality documentation preparation, and Software Development Lifecycle responsibilities. If you are passionate about technology and eager to apply your knowledge in a real-world setting, this role is perfect for you.

Key Responsibilities
Collaborate with the AI & Data Team to support various projects. Utilize MS Office tools for documentation and project management tasks. Assist in the development, testing, deployment, and support of BI solutions. Participate in ITIL process management. Prepare and maintain high-quality documentation for various processes and projects. Stay updated with the latest industry trends and technologies to contribute innovative ideas.

Essential Requirements
Experience in SQL, Azure Data Factory (ADF), and data modeling is a must. Experience in Logic Apps and Azure integrations is nice to have. Good communication skills; the role involves connecting with stakeholders directly. Strong critical thinking and problem-solving skills. Certification in any industry-relevant skill is an advantage.

Preferred Skills and Qualifications
Strong understanding of software development and testing principles. Familiarity with data warehousing concepts and technologies. Excellent written and verbal communication skills. Ability to work both independently and as part of a team. Attention to detail and strong organizational skills.

What We Offer
Hands-on experience with the latest digital tools and technologies. Exposure to real-world projects and industry best practices. Opportunities to prepare and contribute to quality documentation. Experience in SDET responsibilities, enhancing your software testing and development skills. Mentorship from experienced professionals in the field.

Skills: management, development, AI, MS Office, data modeling, Azure, testing, data, software development lifecycle, documentation, ITIL process management, Azure Data Factory, ITIL, SQL, data warehousing, Logic Apps, Azure integrations

Posted 1 month ago

Apply

6.0 years

0 Lacs

Udaipur, Rajasthan, India

On-site

Job Description: We are looking for a highly skilled and experienced Data Engineer with 4–6 years of hands-on experience in designing and implementing robust, scalable data pipelines and infrastructure. The ideal candidate will be proficient in SQL and Python and have a strong understanding of modern data engineering practices. You will play a key role in building and optimizing data systems, enabling data accessibility and analytics across the organization, and collaborating closely with cross-functional teams including Data Science, Product, and Engineering.

Key Responsibilities:
· Design, develop, and maintain scalable ETL/ELT data pipelines using SQL and Python (see the sketch after this posting)
· Collaborate with data analysts, data scientists, and product teams to understand data needs
· Optimize queries and data models for performance and reliability
· Integrate data from various sources, including APIs, internal databases, and third-party systems
· Monitor and troubleshoot data pipelines to ensure data quality and integrity
· Document processes, data flows, and system architecture
· Participate in code reviews and contribute to a culture of continuous improvement

Required Skills:
· 4–6 years of experience in data engineering, data architecture, or backend development with a focus on data
· Strong command of SQL for data transformation and performance tuning
· Experience with Python (e.g., pandas, Spark, ADF)
· Solid understanding of ETL/ELT processes and data pipeline orchestration
· Proficiency with RDBMS (e.g., PostgreSQL, MySQL, SQL Server)
· Experience with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery)
· Familiarity with version control (Git), CI/CD workflows, and containerized environments (Docker, Kubernetes)
· Basic programming skills
· Excellent problem-solving skills and a passion for clean, efficient data systems

Preferred Skills:
· Experience with cloud platforms (AWS, Azure, GCP) and services like S3, Glue, Dataflow, etc.
· Exposure to enterprise solutions (e.g., Databricks, Synapse)
· Knowledge of big data technologies (e.g., Spark, Kafka, Hadoop)
· Background in real-time data streaming and event-driven architectures
· Understanding of data governance, security, and compliance best practices
· Prior experience working in an agile development environment

Educational Qualifications: Bachelor's degree in Computer Science, Information Technology, or a related field.
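As a hedged illustration of the SQL-plus-Python ETL work this posting describes, here is a tiny self-contained extract-transform-load sketch using pandas and an in-memory SQLite database. The dataset, table, and column names are invented placeholders; a real pipeline would read from APIs or source databases instead.

```python
import sqlite3
import pandas as pd

# Extract: stand-in for data pulled from an API or source system.
raw = pd.DataFrame({
    "user_id": [1, 1, 2, 3],
    "amount": ["10.5", "10.5", "n/a", "42.0"],
})

# Transform: cast types, drop unparseable rows, deduplicate.
clean = (raw.assign(amount=pd.to_numeric(raw["amount"], errors="coerce"))
            .dropna(subset=["amount"])
            .drop_duplicates())

# Load: write the result to a SQL table and verify with a query.
conn = sqlite3.connect(":memory:")
clean.to_sql("payments", conn, index=False, if_exists="replace")
print(pd.read_sql(
    "SELECT user_id, SUM(amount) AS total FROM payments GROUP BY user_id", conn))
```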

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
