1444 ADF Jobs - Page 35

JobPe aggregates listings for easy access, but applications are made directly on the original job portal.

2.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Profile: Sr. DW BI Developer
Location: Sector 64, Noida (Work from Office)

Position Overview: Working with the Finance Systems Manager, the role will ensure that the ERP system is available and fit for purpose. The ERP Systems Developer will develop the ERP system, provide comprehensive day-to-day support and training, and evolve the current ERP system for the future.

Key Responsibilities: As a Sr. DW BI Developer, the candidate will participate in the design, development, customization, and maintenance of software applications. The person should analyse the different applications/products, and design and implement the data warehouse using best practices, with rich experience in data governance, data security, data quality, and provenance/lineage. The candidate will also maintain a close working relationship with the other application stakeholders. Experience of developing secure, high-performance web applications. Knowledge of software development lifecycle methodologies, e.g., Iterative, Waterfall, Agile. Designing and architecting future releases of the platform. Participating in troubleshooting application issues. Jointly working with other teams and partners handling different aspects of platform creation. Tracking advancements in software development technologies and applying them judiciously in the solution roadmap. Ensuring all quality controls and processes are adhered to. Planning the major and minor releases of the solution. Ensuring robust configuration management. Working closely with the Engineering Manager on different aspects of product lifecycle management. Ability to work independently in a fast-paced environment requiring multitasking and efficient time management.

Required Skills and Qualifications: End-to-end lifecycle of data warehousing, data lakes, and reporting. Experience maintaining and managing data warehouses. Responsible for the design and development of large, scaled-out, real-time, high-performing data lake / data warehouse systems (including big data and cloud). Strong SQL and analytical skills. Experience in Power BI, Tableau, QlikView, Qlik Sense, etc. Experience in Microsoft Azure services. Experience in developing and supporting ADF pipelines (see the sketch after this listing). Experience in Azure SQL Server / Databricks / Azure Analysis Services. Experience in developing tabular models. Experience in working with APIs. Minimum 2 years of experience in a similar role. Experience with data warehousing and data modelling. Strong experience in SQL. 2-6 years of total experience in building DW/BI systems. Experience with ETL and working with large-scale datasets. Proficiency in writing and debugging complex SQL. Prior experience working with global clients. Hands-on experience with Kafka, Flink, Spark, Snowflake, Airflow, NiFi, Oozie, Pig, Hive, Impala, and Sqoop. Storage such as HDFS, object storage (e.g., S3), RDBMS, MPP, and NoSQL databases. Experience with distributed data management and failover, including databases (relational, NoSQL, big data), data analysis, data processing, data transformation, high availability, and scalability. Experience in end-to-end project implementation in cloud (Azure / AWS / GCP) as a DW BI Developer. Rich experience in data governance, data security, data quality, and provenance/lineage. Understanding of industry trends and products in DataOps, continuous intelligence, augmented analytics, and AI/ML.
Prior experience working in clouds such as Azure, AWS, and GCP. Prior experience working with global clients. To view our Privacy Policy, please click the link below or copy-paste the URL into your browser: https://gedu.global/wp-content/uploads/2023/09/GEDU-Privacy-Policy-22092023-V2.0-1.pdf
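To illustrate what developing and supporting ADF pipelines can look like in practice, here is a minimal, illustrative Python sketch that triggers and polls a pipeline run with the azure-mgmt-datafactory SDK; the subscription, resource group, factory, pipeline name, and parameter are hypothetical placeholders, not details from this posting.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")

# Trigger a pipeline run (resource names and parameter are placeholders).
run = client.pipelines.create_run(
    resource_group_name="rg-dw-bi",
    factory_name="adf-finance",
    pipeline_name="pl_load_warehouse",
    parameters={"load_date": "2024-01-01"},
)

# Poll the run for its status; a support workflow would loop until terminal.
status = client.pipeline_runs.get("rg-dw-bi", "adf-finance", run.run_id)
print(status.status)  # e.g. "InProgress", "Succeeded", "Failed"
```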

Posted 1 month ago

Apply

0.0 - 5.0 years

0 Lacs

Ahmedabad, Gujarat

On-site

Ahmedabad, Gujarat | Full Time

Job Overview: We are looking for a skilled and experienced Data Engineer to join our team. The ideal candidate will have a strong background in Azure Data Factory, Databricks, PySpark, Python, Azure SQL, and other Azure cloud services, and will be responsible for building and managing scalable data pipelines, data lakes, and data warehouses. Experience with Azure Synapse Analytics, Microsoft Fabric, or Power BI will be considered a strong advantage.

Key Responsibilities: Design, develop, and manage robust and scalable ETL/ELT pipelines using Azure Data Factory and Databricks (a sketch of this kind of work follows the listing). Work with PySpark and Python to transform and process large datasets. Build and maintain data lakes and data warehouses on Azure Cloud. Collaborate with data architects, analysts, and stakeholders to gather and translate requirements into technical solutions. Ensure data quality, consistency, and integrity across systems. Optimize performance and cost of data pipelines and cloud infrastructure. Implement best practices for security, governance, and monitoring of data pipelines. Maintain and document data workflows and architecture.

Required Skills & Qualifications: 3–5 years of experience in data engineering. Strong hands-on experience with Azure Data Factory (ADF), Azure Databricks, Azure SQL, PySpark and Python, and Azure Storage (Blob, Data Lake Gen2). Hands-on experience with data warehouse/lakehouse/data lake architecture. Familiarity with Delta Lake, MLflow, and Unity Catalog is a plus. Good understanding of SQL and performance tuning. Knowledge of CI/CD in Azure for data pipelines. Excellent problem-solving skills and ability to work independently.

Preferred Skills: Experience with Azure Synapse Analytics. Familiarity with Microsoft Fabric. Working knowledge of Power BI for data visualization and dashboarding. Exposure to DevOps and infrastructure as code (IaC) in Azure. Understanding of data governance and security best practices. Databricks certification (e.g., Databricks Certified Data Engineer Associate/Professional).
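To illustrate the pipeline work referenced above, here is a minimal, illustrative PySpark sketch of a Databricks transformation that an ADF pipeline might orchestrate, reading raw CSV from a data lake and writing a Delta table; the paths and column names are hypothetical assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("bronze_to_silver").getOrCreate()

# Read raw CSV landed in the lake by an ADF copy activity (path assumed).
raw = (spark.read
       .option("header", True)
       .csv("abfss://raw@examplelake.dfs.core.windows.net/sales/"))

# Basic cleansing: dedupe, type the date column, drop null amounts.
clean = (raw.dropDuplicates(["order_id"])
            .withColumn("order_date", F.to_date("order_date"))
            .filter(F.col("amount").isNotNull()))

# Persist as a Delta table for downstream consumption (path assumed).
(clean.write
      .format("delta")
      .mode("overwrite")
      .save("abfss://silver@examplelake.dfs.core.windows.net/sales/"))
```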

Posted 1 month ago

Apply

8.0 years

0 Lacs

Itanagar, Arunachal Pradesh, India

On-site

Job Description: We are seeking a skilled and motivated Data Engineer with 8+ years of experience to join our team. The ideal candidate should be experienced in data engineering on Snowflake, Azure ADF, Microsoft MDS, SQL, and data pipelines, with a focus on developing and maintaining Data Analytics solutions. You will collaborate with cross-functional teams to deliver high-quality data solutions that meet business requirements.

Required Skills And Experience: Bachelor's or Master's degree in computer science, data science, engineering, or a related field. 8+ years of experience in data engineering or related fields. Strong proficiency in SQL, Snowflake, stored procedures, and views. Hands-on experience with Snowflake SQL, ADF (Azure Data Factory), and Microsoft MDS (Master Data Services). Knowledge of data warehousing concepts. Experience with cloud platforms (Azure). Understanding of data modeling and data warehousing principles. Strong problem-solving and analytical skills, with attention to detail. Excellent communication and collaboration skills.

Bonus Skills: Exposure to CI/CD practices using Microsoft Azure DevOps. Basic knowledge or understanding of Power BI.

Key Responsibilities: Design, develop, and maintain scalable and efficient data pipelines using Azure Data Factory (ADF). Build and optimize data models and data warehousing solutions within Snowflake. Develop and maintain data integration processes, ensuring data quality and integrity. Utilize strong SQL skills to query, transform, and analyze data within Snowflake. Develop and manage stored procedures and views in Snowflake (see the sketch after this listing). Implement and manage master data using Microsoft Master Data Services (MDS). Collaborate with data analysts and business stakeholders to understand data requirements and deliver effective data solutions. Ensure the performance, reliability, and security of data pipelines and data warehousing systems. Troubleshoot and resolve data-related issues in a timely manner. Stay up-to-date with the latest advancements in data engineering technologies, particularly within the Snowflake and Azure ecosystems. Contribute to the documentation of data pipelines, data models, and ETL processes. (ref:hirist.tech)
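To illustrate the Snowflake work referenced above, here is a minimal Python sketch using the snowflake-connector-python library to run an upsert and call a stored procedure; all identifiers (account, warehouse, tables, procedure name) are hypothetical placeholders.

```python
import snowflake.connector

# Connection details are placeholders; in production they would come from
# a secrets store (e.g. Azure Key Vault), not literals.
conn = snowflake.connector.connect(
    account="<account>", user="<user>", password="<password>",
    warehouse="LOAD_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()
try:
    # Upsert staged rows into a target dimension (hypothetical tables).
    cur.execute("""
        MERGE INTO DIM_CUSTOMER t
        USING STG_CUSTOMER s ON t.CUSTOMER_ID = s.CUSTOMER_ID
        WHEN MATCHED THEN UPDATE SET t.NAME = s.NAME
        WHEN NOT MATCHED THEN INSERT (CUSTOMER_ID, NAME)
                              VALUES (s.CUSTOMER_ID, s.NAME)
    """)
    # Call a maintenance stored procedure (hypothetical name).
    cur.execute("CALL REFRESH_REPORTING_VIEWS()")
finally:
    cur.close()
    conn.close()
```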

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Key Responsibilities

Azure Synapse Development: Design, develop, and optimize data solutions within Azure Synapse Analytics, leveraging its capabilities for data warehousing, data lakes, and big data processing.

Data Pipeline Development (ADF): Build, manage, and monitor scalable and efficient data pipelines using Azure Data Factory (ADF) for data ingestion, transformation, and orchestration.

Data Warehousing & Modelling: Apply expertise in data warehousing principles and various data modelling techniques to design and implement robust data structures.

Snowflake & Stored Procedures: Work extensively with Snowflake, including data loading, transformation, and query optimization. Develop and maintain complex stored procedures in various database environments.

ETL/ELT Processes: Implement and enhance ETL (Extract, Transform, Load) and ELT (Extract, Load, Transform) processes to move data efficiently between disparate systems.

Data Quality & Monitoring: Implement and ensure adherence to data quality frameworks. Utilize data monitoring tools to ensure data integrity and reliability.

Job Scheduling: Configure and manage job scheduling for automated data workflows and pipeline execution.

Data Format Handling: Work proficiently with various data formats including JSON, XML, CSV, and Parquet (see the sketch after this listing).

Agile Collaboration: Participate actively in an Agile development environment, using tools like JIRA for task management and collaboration.

Communication: Clearly communicate technical concepts and solutions to team members and stakeholders, maintaining a formal and professional style.

Skills: Good experience in Azure Synapse Analytics, Azure Data Factory, and Snowflake. Strong experience with stored procedures. Data engineering fundamentals: experience with ETL/ELT processes, data warehousing, and data modelling. Data quality and operations: experience with data quality frameworks, monitoring tools, and job scheduling. Data formats: knowledge of JSON, XML, CSV, and Parquet. Agile: experience with Agile methodology and tools like JIRA. Language: fluent in English (strong written, verbal, and presentation skills). Communication: good communication and formal-writing skills. (ref:hirist.tech)
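To illustrate the data-format handling this listing names, here is a minimal, illustrative pandas sketch that normalises JSON, XML, and CSV inputs into Parquet; the file names and columns are hypothetical (pandas.read_xml requires pandas 1.3+ with lxml, and to_parquet requires pyarrow or fastparquet).

```python
import pandas as pd

# Read the same logical feed from three source formats (hypothetical files).
orders_json = pd.read_json("orders.json")   # use lines=True for line-delimited JSON
orders_xml = pd.read_xml("orders.xml")      # requires pandas >= 1.3 and lxml
orders_csv = pd.read_csv("orders.csv")

# Align on a common column set before combining (columns are assumptions).
cols = ["order_id", "customer_id", "amount"]
combined = pd.concat(
    [df[cols] for df in (orders_json, orders_xml, orders_csv)],
    ignore_index=True,
)

# Write columnar Parquet for downstream Synapse/Snowflake loads.
combined.to_parquet("orders.parquet", index=False)
```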

Posted 1 month ago

Apply

3.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Azure Data Engineer Job Description

Role: Azure Data Engineer. Experience: Minimum 3-5 years. Location: Spaze ITech Park, Sector-49, Gurugram. Working Days: Monday to Friday (9:00 AM - 6:00 PM). Joining: < 15 days.

About Us: Panamoure is a UK-based group with an offshore office in Gurgaon, India. We are known to be the ultimate business and technology change partner for our clients, including PE groups and ambitious mid-market businesses. Panamoure is a fast-paced and dynamic management consultancy delivering business and technology change services to the UK's fastest growing companies. Our ability to deliver exceptional quality to our clients has seen us grow rapidly over the last 36 months, and we have ambitious plans to scale substantially further. As part of this growth we are looking to expand both our UK and India teams with bright, ambitious, and talented individuals who want to learn and grow with the business.

Primary Skills: The Azure Data Engineer will be responsible for developing, maintaining, and optimizing data pipelines and SQL databases using Azure Data Factory (ADF), Microsoft Fabric, and other Azure services. The role requires expertise in SQL Server, ETL/ELT processes, and data modeling to support business intelligence and operational applications. The ideal candidate will collaborate with cross-functional teams to deliver reliable, scalable, and high-performing data solutions.

Key Responsibilities: Design, develop, and manage SQL databases, tables, stored procedures, and T-SQL queries (see the sketch after this listing). Develop and maintain Azure Data Factory (ADF) pipelines to automate data ingestion, transformation, and integration. Build and optimize ETL/ELT processes to transfer data between Azure Data Lake, SQL Server, and other systems. Design and implement Microsoft Fabric lakehouses for structured and unstructured data storage. Build scalable ETL/ELT pipelines to move and transform data across Azure Data Lake, SQL Server, and external data sources. Develop and implement data modeling strategies using star schema, snowflake schema, and dimensional models to support analytics use cases. Integrate Azure Data Lake Storage (ADLS) with Microsoft Fabric for scalable, secure, and cost-effective data storage. Monitor, troubleshoot, and optimize data pipelines using Azure Monitor, Log Analytics, and Fabric monitoring capabilities. Ensure data integrity, consistency, and security following data governance frameworks such as Azure Purview. Collaborate with DevOps teams to implement CI/CD pipelines for automated data pipeline deployment. Utilize Azure Monitor, Log Analytics, and Application Insights for pipeline monitoring and performance optimization. Stay updated on Azure Data Services and Microsoft Fabric innovations, recommending enhancements for performance and scalability.

Desired Candidate Profile: 4+ years of experience in data engineering with strong expertise in SQL development. Proficiency in SQL Server, T-SQL, and query optimization techniques. Hands-on experience with Azure Data Factory (ADF), Azure Synapse Analytics, and Azure SQL Database. Solid understanding of ETL/ELT processes, data integration patterns, and data transformation. Practical experience with Microsoft Fabric components: Fabric Dataflows for self-service data preparation; Fabric lakehouses for unified data storage; Fabric Synapse Real-Time Analytics for streaming data insights; Fabric Direct Lake mode with Power BI for optimized performance. Strong understanding of Azure Data Lake Storage (ADLS) for efficient data management. Proficiency in Python or Scala for data transformation tasks. Experience with Azure DevOps, Git, and CI/CD pipeline automation. Knowledge of data governance practices, including data lineage, sensitivity labels, and RBAC. Experience with Infrastructure-as-Code (IaC) using Terraform or ARM templates. Understanding of data security protocols such as data encryption and network security groups (NSGs). Familiarity with streaming services like Azure Event Hub or Kafka is a plus. Excellent problem-solving, communication, and team collaboration skills. Azure Data Engineer Associate (DP-203) and Microsoft Fabric Analytics certifications are desirable.

What We Offer: Opportunity to work with modern data architectures and Microsoft Fabric innovations. Competitive salary and benefits package, tailored to experience and qualifications. Opportunities for professional growth and development in a supportive and collaborative environment. A culture that values diversity, creativity, and a commitment to excellence.

Benefits And Perks: Provident Fund. Health Insurance. Flexible Timing. Office lunch provided.

We look forward to adding a skilled Azure Data Engineer to our team! (ref:hirist.tech)
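To illustrate the stored-procedure and T-SQL work referenced above, here is a minimal sketch of the kind of T-SQL upsert an ADF stored-procedure activity might wrap, driven here from Python via pyodbc; the connection string, tables, and columns are hypothetical placeholders.

```python
import pyodbc

# Connection string is a placeholder; credentials would normally come from
# Azure Key Vault or a managed identity.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example.database.windows.net;DATABASE=dw;UID=etl;PWD=<secret>"
)
cursor = conn.cursor()

# Hypothetical dimension upsert of the kind an ADF pipeline would invoke.
cursor.execute("""
    MERGE dbo.DimProduct AS t
    USING staging.Product AS s ON t.ProductKey = s.ProductKey
    WHEN MATCHED THEN UPDATE SET t.ProductName = s.ProductName
    WHEN NOT MATCHED THEN INSERT (ProductKey, ProductName)
                          VALUES (s.ProductKey, s.ProductName);
""")
conn.commit()
cursor.close()
conn.close()
```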

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Senior .NET Developer. Experience: 8+ years. Notice Period: Immediate joiners preferred. Location: Trivandrum / Bangalore / Chennai.

Skill Set: C#/.NET technologies, SQL, Azure Cloud Services, .NET Core 6/8.0, microservices, SOA, API integrations, CI/CD, Azure, JavaScript, React JS, Angular, APIM.

Job Overview: We are looking for a highly experienced and hands-on Senior .NET Developer with over 8 years of experience in software development, particularly with .NET Core, Azure Cloud Services, and Azure DevOps. This is a client-facing role for a US-based client and requires excellent communication skills, strong technical expertise, and the ability to work in a fast-paced, agile environment.

Key Responsibilities: Design, develop, enhance, and maintain application features using .NET Core 6/8+, C#, REST APIs, T-SQL, and AngularJS/ReactJS. Provide application support and handle API integrations with third-party services. Participate in full software development lifecycle (SDLC) activities, including planning, coding, testing, deployment, and maintenance. Collaborate with cross-functional teams to define, design, and deliver new features. Write clean, scalable, and efficient code following best practices and architectural standards. Lead and mentor junior developers by reviewing code and guiding them through technical challenges. Troubleshoot and debug applications and ensure software meets all quality and security standards. Create and maintain technical documentation. Ensure active participation in daily stand-ups and meetings aligned with the US EST time zone.

Primary Skills & Requirements: 8+ years of hands-on development experience in C#, .NET technologies, and SQL. Minimum 2 years of experience in Azure Cloud Services. Proficiency in .NET Core 6/8+, Entity Framework/Core, microservices architecture, Azure DevOps, and Service-Oriented Architecture (SOA). Strong understanding of REST APIs, unit testing (xUnit, MSTest), and CI/CD pipelines (YAML or Classic). Expertise in MS SQL Server: writing SQL queries, stored procedures, views, functions, and performance tuning. Experience with application support and third-party API integrations. Familiarity with software development methodologies like Agile/Scrum using tools like JIRA. Strong debugging, problem-solving, and analytical skills. Excellent written and verbal communication skills.

Azure Skills: Azure Messaging Services: Service Bus, Event Grid, Event Hub. Azure Storage: Blobs, Tables, Queues. Azure Functions / Durable Functions. Azure Data Factory (ADF), Logic Apps. CI/CD with Azure DevOps. Monitoring: Application Insights, Azure Monitoring. Security: Azure Key Vault. SQL Azure.

Secondary/Desirable Skills: Frontend development with JavaScript, ReactJS, Angular, jQuery. Azure API Management (APIM). Containerization and orchestration: Azure Kubernetes Service (AKS), Docker. (ref:hirist.tech)

Posted 1 month ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

Position Summary: Netoyed is seeking a highly skilled Data Integration and Power BI Developer. This developer will be responsible for quickly connecting four external APIs, building Azure Data Factory pipelines for data ingestion and transformation, and delivering interactive dashboards and reports in Power BI. This role is essential to meeting project milestones for Juniper Landscaping's Power BI Integration initiative.

Key Responsibilities

API Integration and Data Ingestion: Develop and configure secure connections to the following APIs: Paycom, Acumatica, Aspire, Procuify (see the sketch after this listing). Handle data extraction (structured and unstructured) and secure processing.

Data Pipeline Development: Build scalable ETL pipelines using Azure Data Factory (ADF). Implement automation for continuous data refresh and ingestion. Ensure proper data validation, error handling, and pipeline optimization.

Dashboard and Report Creation: Design and build operational, analytical, and executive dashboards in Power BI. Implement dynamic visualizations, KPIs, drill-throughs, and real-time updates. Apply role-based security and user access controls within Power BI.

Project Collaboration: Work closely with Netoyed project leads to ensure deliverables align with client expectations. Provide regular updates on progress, blockers, and completion milestones. Deliver technical documentation for API integrations, pipelines, and dashboards.

Required Skills and Experience: 3+ years of professional experience with Azure Data Factory and data ingestion projects. 3+ years building dashboards and reports in Power BI. Proven ability to connect to REST/SOAP APIs and process JSON/XML responses securely. Strong skills in Azure SQL Database, Azure Data Lake Storage, and data processing. Proficiency with DAX, Power Query (M language), and data modeling best practices. Excellent troubleshooting, problem-solving, and documentation skills. Ability to work independently and meet aggressive short-term deadlines.

Preferred Qualifications: Experience with Azure DevOps for CI/CD of data pipelines and reporting solutions. Familiarity with Microsoft Fabric or lakehouse architecture is a plus. Knowledge of data integration from HR, financial, or operational systems (e.g., Paycom, Acumatica). Microsoft certifications such as Azure Data Engineer Associate or Power BI Data Analyst are a plus.

Job Details: Start Date: Immediate to within 2 weeks. Hours: Estimated 30-40 hours per week (project workload driven). Work Mode: Netoyed office, US EST hours. Tools Provided: Access to required Azure resources, API documentation, and Power BI workspace environments. (ref:hirist.tech)
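For context on the API-ingestion step referenced above, here is a minimal, illustrative Python sketch of pulling paginated JSON from a REST API before landing it for an ADF pipeline; the endpoint, auth header, and pagination scheme are hypothetical placeholders, not details of the Paycom, Acumatica, Aspire, or Procuify APIs.

```python
import json
import requests

BASE_URL = "https://api.example.com/v1/records"  # placeholder endpoint
HEADERS = {"Authorization": "Bearer <token>"}    # placeholder auth

records, page = [], 1
while True:
    resp = requests.get(
        BASE_URL, headers=HEADERS,
        params={"page": page, "per_page": 100}, timeout=30,
    )
    resp.raise_for_status()
    batch = resp.json()
    if not batch:  # assumed pagination contract: an empty page ends the scan
        break
    records.extend(batch)
    page += 1

# Land the raw extract for the ADF pipeline to pick up.
with open("records.json", "w") as f:
    json.dump(records, f)
```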

Posted 1 month ago

Apply

2.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

About Gartner IT: Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team.

About The Role: Data warehousing engineer with technical expertise, capable of collaborating with the team to create a data platform strategy and implement the solution.

What You’ll Do: Participate in design and implementation of the data warehousing solution. Participate in the end-to-end delivery of solutions, from gathering requirements to implementation, testing, and continuous improvement post-rollout, using Agile Scrum methodologies.

What You’ll Need: 2-4 years of experience in software programming and/or data warehousing in an Agile Scrum environment.

Must Have: Strong experience in SQL, ADF, and Synapse/Databricks. ETL process design, including techniques for addressing slowly changing dimensions (see the sketch after this listing), differential fact-journaling (i.e., storage optimization for fact data), semi-additive measures and related concerns, and rolldown distributions. SQL query optimization.

Who You Are: Bachelor’s degree in computer science or information systems, or equivalent experience in the field of software development. Effective time management skills and ability to meet deadlines, delivering project work on time, within budget, and with high quality. Excellent communication skills interacting with technical and business audiences. Excellent organization, multitasking, and prioritization skills. Must possess a willingness and aptitude to embrace new technologies/ideas and master concepts rapidly.

Don’t meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this, or other roles.

Who are we? At Gartner, Inc. (NYSE: IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we’ve grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That’s why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.

What makes Gartner a great place to work? Our sustained success creates limitless opportunities for you to grow professionally and flourish personally. We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.

What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.

The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company’s career webpage as a result of your disability. You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.

Job Requisition ID: 99949

By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence. Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy

For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.
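The "Must Have" list above names slowly changing dimensions as a core ETL technique. As a point of reference, here is a minimal, illustrative sketch of a Type 2 SCD merge using Databricks Delta, one common way to implement it; the table and column names are hypothetical, and the two-step form is a simplification of what production pipelines do.

```python
from delta.tables import DeltaTable
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Hypothetical staging table of changed customer rows.
updates = spark.table("staging.customer_updates")

dim = DeltaTable.forName(spark, "dw.dim_customer")

# Step 1: close out current rows whose tracked attribute changed.
(dim.alias("t")
    .merge(updates.alias("s"),
           "t.customer_id = s.customer_id AND t.is_current = true")
    .whenMatchedUpdate(
        condition="t.address <> s.address",
        set={"is_current": "false", "end_date": "current_date()"})
    .execute())

# Step 2: append the incoming rows as the new current versions.
# (Simplified: real SCD2 jobs usually combine both steps in one merge
# with a staged union, and only insert rows that actually changed.)
new_rows = (updates
            .withColumn("is_current", F.lit(True))
            .withColumn("start_date", F.current_date())
            .withColumn("end_date", F.lit(None).cast("date")))
new_rows.write.format("delta").mode("append").saveAsTable("dw.dim_customer")
```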

Posted 1 month ago

Apply

5.0 - 10.0 years

4 - 9 Lacs

Hyderabad

Work from Office

Skills: ADF, ETL, REST API. Microsoft Power Platform / Snowflake certification is a plus. Power BI / Talend and integration. Power Apps. SAP S/4HANA ABAP, OData, REST and SOAP. Job Location: Hyderabad.

Posted 1 month ago

Apply

3.0 - 8.0 years

9 - 16 Lacs

Pune

Work from Office

We are looking for a skilled Azure Data Engineer to design, develop, and optimize data pipelines for one of the following skill combinations: 1) SQL + ETL + Azure + Python + PySpark + Databricks; 2) SQL + ADF + Azure; 3) SQL + Python + PySpark. Strong proficiency in SQL for data manipulation and querying is required.

Required Candidate Profile: Python and PySpark for data engineering tasks. Experience with Databricks for big data processing and analytics. Knowledge of data modeling, warehousing, and governance. CI/CD pipelines for data deployment.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Line of Service: Advisory. Industry/Sector: Not Applicable. Specialism: Oracle. Management Level: Senior Associate.

Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.

Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.

At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.

Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. Ensuring a streamlined end-to-end Oracle Fusion technical landscape that seamlessly adapts to the changing business environment is equally crucial from a process and compliance perspective.
As part of the Technology Consulting - Business Applications - Oracle practice team, we leverage opportunities around digital disruption, new-age operating models, and best-in-class practices to deliver technology-enabled transformation to our clients.

Responsibilities: Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer. Completed at least 2 full Oracle Cloud (Fusion) implementations. Extensive knowledge of database structure for ERP/Oracle Cloud (Fusion). Extensively worked on BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC).

Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC).

Preferred skill sets: Database structure for ERP/Oracle Cloud (Fusion).

Years of experience required: Minimum 4 years of Oracle Fusion experience.

Education Qualification: BE/BTech, MBA.

Education (if blank, degree and/or field of study not specified). Degrees/Field of Study required: Bachelor of Engineering, Master of Business Administration, Bachelor of Technology. Degrees/Field of Study preferred: Certifications (if blank, certifications not specified).

Required Skills: Oracle Integration Cloud (OIC).

Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Transformation, Communication, Creativity, Design Automation, Embracing Change, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Learning Agility, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Self-Awareness, Strategic Technology Planning, Teamwork, Well Being.

Desired Languages (if blank, desired languages not specified). Travel Requirements: Available for Work Visa Sponsorship? Government Clearance Required? Job Posting End Date:

Posted 1 month ago

Apply

3.0 - 9.0 years

0 Lacs

Andhra Pradesh, India

On-site

At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. As a Guidewire developer at PwC, you will specialise in developing and customising applications using the Guidewire platform. Guidewire is a software suite that provides insurance companies with tools for policy administration, claims management, and billing. You will be responsible for designing, coding, and testing software solutions that meet the specific needs of insurance organisations.

Driven by curiosity, you are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt to working with a variety of clients and team members, each presenting varying challenges and scope. Every experience is an opportunity to learn and grow. You are expected to take ownership and consistently deliver quality work that drives value for our clients and success as a team. As you navigate through the Firm, you build a brand for yourself, opening doors to more opportunities.

Skills: Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to: Apply a learning mindset and take ownership for your own development. Appreciate diverse perspectives, needs, and feelings of others. Adopt habits to sustain high performance and develop your potential. Actively listen, ask questions to check understanding, and clearly express ideas. Seek, reflect, act on, and give feedback. Gather information from a range of sources to analyse facts and discern patterns. Commit to understanding how the business works and building commercial awareness. Learn and apply professional and technical standards (e.g., refer to specific PwC tax and audit guidance), uphold the Firm's code of conduct and independence requirements.

Total Experience: 3 to 9 years.

Education Qualification: BTech/BE/MTech/MS/MCA.

Preferred Skill Set / Roles and Responsibilities: Hands-on experience in Azure Databricks and ADF; Guidewire. Works with the business in identifying detailed analytical and operational reporting/extract requirements. Experience in Python is a must-have. Able to create complex Microsoft SQL / ETL / SSIS queries. Participates in sprint development, test, and integration activities. Creates detailed source-to-target mappings. Creates and validates data dictionaries. Writes and validates data translation and migration scripts. Communicates with the business to gather business requirements. Performs gap analysis between existing (legacy) and new (GW) data-related solutions. Works with Informatica ETL developers.

Posted 1 month ago

Apply

0 years

6 - 8 Lacs

Hyderābād

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Databricks Developer! In this role, the Databricks Developer is responsible for solving real-world cutting-edge problems to meet both functional and non-functional requirements.

Responsibilities: Maintain close awareness of new and emerging technologies and their potential application for service offerings and products. Work with architects and lead engineers on solutions that meet functional and non-functional requirements. Demonstrated knowledge of relevant industry trends and standards. Demonstrate strong analytical and technical problem-solving skills. Must have experience in the data engineering domain.

Qualifications we seek in you!

Minimum qualifications: Bachelor's degree or equivalency (CS, CE, CIS, IS, MIS, or engineering discipline) or equivalent work experience. Must have excellent coding skills in either Python or Scala, preferably Python. Must have experience in the data engineering domain. Must have implemented at least 2 projects end-to-end in Databricks. Must have experience with Databricks components such as Delta Lake, dbConnect, DB API 2.0, and Databricks workflows orchestration. Must be well versed with the Databricks Lakehouse concept and its implementation in enterprise environments. Must have a good understanding of how to create complex data pipelines. Must have good knowledge of data structures and algorithms. Must be strong in SQL and Spark SQL. Must have strong performance optimization skills to improve efficiency and reduce cost. Must have worked on both batch and streaming data pipelines (see the sketch following the qualifications). Must have extensive knowledge of the Spark and Hive data processing frameworks. Must have worked on a cloud platform (Azure, AWS, GCP) and common services like ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases. Must be strong in writing unit and integration tests. Must have strong communication skills and have worked in teams of size 5-plus. Must have a great attitude towards learning new skills and upskilling existing skills.

Preferred Qualifications: Good to have Unity Catalog and basic governance knowledge. Good to have an understanding of Databricks SQL endpoints. Good to have CI/CD experience to build pipelines for Databricks jobs. Good to have worked on a migration project to build a unified data platform. Good to have knowledge of DBT. Good to have knowledge of Docker and Kubernetes.
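As a point of reference for the batch-and-streaming requirement above, here is a minimal, illustrative PySpark Structured Streaming sketch of the kind of Databricks pipeline this role describes; the storage paths and event schema are hypothetical assumptions.

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StringType, DoubleType, TimestampType

spark = SparkSession.builder.getOrCreate()

# Explicit schema, since streaming readers cannot infer one (fields assumed).
schema = (StructType()
          .add("event_id", StringType())
          .add("amount", DoubleType())
          .add("event_time", TimestampType()))

# Incrementally read JSON event files landing in a (hypothetical) lake path.
events = (spark.readStream
          .schema(schema)
          .json("abfss://events@examplelake.dfs.core.windows.net/in/"))

# Continuously append to a Delta table; the checkpoint makes the stream
# restartable with exactly-once sink semantics.
query = (events.writeStream
         .format("delta")
         .option("checkpointLocation", "/checkpoints/events")
         .outputMode("append")
         .start("abfss://events@examplelake.dfs.core.windows.net/delta/events/"))
```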
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. For more information, visit www.genpact.com. Follow us on Twitter, Facebook, LinkedIn, and YouTube. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.

Job: Lead Consultant. Primary Location: India-Hyderabad. Schedule: Full-time. Education Level: Bachelor's / Graduation / Equivalent. Job Posting: Jun 9, 2025, 9:15:16 AM. Unposting Date: Ongoing. Master Skills List: Digital. Job Category: Full Time.

Posted 1 month ago

Apply

0 years

0 Lacs

New Delhi, Delhi, India

Remote

We are seeking a proactive and business-oriented Data Functional Consultant with strong experience in Azure Data Factory and Azure Databricks. This role bridges the gap between business stakeholders and technical teams—translating business needs into scalable data solutions, ensuring effective data management, and enabling insights-driven decision-making. The ideal candidate is not a pure developer or data engineer but someone who understands business processes, data flows, and stakeholder priorities, and can help drive value from data platforms using cloud-native Azure services.

What You’ll Do: Collaborate closely with business stakeholders to gather, understand, and document functional data requirements. Translate business needs into high-level data design, data workflows, and process improvements. Work with data engineering teams to define and validate ETL/ELT logic and data pipeline workflows using Azure Data Factory and Databricks. Facilitate functional workshops and stakeholder meetings to align on data needs and business KPIs. Act as a bridge between business teams and data engineers to ensure accurate implementation and delivery of data solutions. Conduct data validation and UAT, and support users in adopting data platforms and self-service analytics. Maintain functional documentation, data dictionaries, and mapping specifications. Assist in defining data governance, data quality, and master data management practices from a business perspective. Monitor data pipeline health and help triage issues from a functional/business-impact standpoint.

What You’ll Bring: Proven exposure to Azure Data Factory (ADF) for orchestrating data workflows. Practical experience with Azure Databricks for data processing (functional understanding, not necessarily coding). Strong understanding of data warehousing, data modeling, and business KPIs. Experience working in agile or hybrid project environments. Excellent communication and stakeholder management skills. Ability to translate complex technical details into business-friendly language. Familiarity with tools like Power BI, Excel, or other reporting solutions is a plus. Background in the banking or finance industries is a bonus.

What We Offer: At Delphi, we are dedicated to creating an environment where you can thrive, both professionally and personally. Our competitive compensation package, performance-based incentives, and health benefits are designed to ensure you're well-supported. We believe in your continuous growth and offer company-sponsored certifications, training programs, and skill-building opportunities to help you succeed. We foster a culture of inclusivity and support, with remote work options and a fully supported work-from-home setup to ensure your comfort and productivity. Our positive and inclusive culture includes team activities, wellness and mental health programs to ensure you feel supported.

Posted 1 month ago

Apply

3.0 years

10 Lacs

Gurgaon

Remote

Data Engineer – Azure

This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: business owners, business analytics, data engineering teams, application development, end users, and management teams.

You Will: Design and build resilient and efficient data pipelines for batch and real-time streaming. Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools. Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches. Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities. Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations. Execute projects with an Agile mindset. Build software frameworks to solve data problems at scale.

Technical Requirements: 3+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse. Prior experience using DBT and Power BI will be a plus. Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services such as firewall, storage, and key vault is required. Strong programming/scripting experience using SQL, Python, and Spark. Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket. Experience with Agile development methods in data-oriented projects.

Other Requirements: Highly motivated self-starter and team player with demonstrated success in prior roles. Track record of success working through technical challenges within enterprise organizations. Ability to prioritize deals, training, and initiatives through highly effective time management. Excellent problem-solving, analytical, presentation, and whiteboarding skills. Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems. Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations. Certifications in Azure data engineering and related technologies.

"Remote postings are limited to candidates residing within the country specified in the posting location."

About Rackspace Technology: We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions. We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.
More on Rackspace Technology: Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.

Posted 1 month ago

Apply

7.0 years

10 Lacs

Gurgaon

Remote

Senior Data Engineer – Azure

This is a hands-on data platform engineering role that places significant emphasis on consultative data engineering engagements with a wide range of customer stakeholders: business owners, business analytics, data engineering teams, application development, end users, and management teams.

You Will: Design and build resilient and efficient data pipelines for batch and real-time streaming. Architect and design data infrastructure on cloud using Infrastructure-as-Code tools. Collaborate with product managers, software engineers, data analysts, and data scientists to build scalable and data-driven platforms and tools. Provide technical product expertise, advise on deployment architectures, and handle in-depth technical questions around data infrastructure, PaaS services, design patterns, and implementation approaches. Collaborate with enterprise architects, data architects, ETL developers and engineers, data scientists, and information designers to lead the identification and definition of required data structures, formats, pipelines, metadata, and workload orchestration capabilities. Address aspects such as data privacy and security, data ingestion and processing, data storage and compute, analytical and operational consumption, data modeling, data virtualization, self-service data preparation and analytics, AI enablement, and API integrations. Lead a team of engineers to deliver impactful results at scale. Execute projects with an Agile mindset. Build software frameworks to solve data problems at scale.

Technical Requirements: 7+ years of data engineering experience leading implementations of large-scale lakehouses on Databricks, Snowflake, or Synapse. Prior experience using DBT and Power BI will be a plus. 3+ years of experience architecting solutions for developing data pipelines from structured and unstructured sources for batch and real-time workloads. Extensive experience with Azure data services (Databricks, Synapse, ADF) and related Azure infrastructure services such as firewall, storage, and key vault is required. Strong programming/scripting experience using SQL, Python, and Spark. Strong data modeling and data lakehouse concepts. Knowledge of software configuration management environments and tools such as JIRA, Git, Jenkins, TFS, Shell, PowerShell, and Bitbucket. Experience with Agile development methods in data-oriented projects.

Other Requirements: Highly motivated self-starter and team player with demonstrated success in prior roles. Track record of success working through technical challenges within enterprise organizations. Ability to prioritize deals, training, and initiatives through highly effective time management. Excellent problem-solving, analytical, presentation, and whiteboarding skills. Track record of success dealing with ambiguity (internal and external) and working collaboratively with other departments and organizations to solve challenging problems. Strong knowledge of technology and industry trends that affect data analytics decisions for enterprise organizations. Certifications in Azure data engineering and related technologies.

"Remote postings are limited to candidates residing within the country specified in the posting location."

About Rackspace Technology: We are the multicloud solutions experts. We combine our expertise with the world’s leading technologies — across applications, data and security — to deliver end-to-end solutions.
We have a proven record of advising customers based on their business challenges, designing solutions that scale, building and managing those solutions, and optimizing returns into the future. Named a best place to work, year after year according to Fortune, Forbes and Glassdoor, we attract and develop world-class talent. Join us on our mission to embrace technology, empower customers and deliver the future.

More on Rackspace Technology: Though we’re all different, Rackers thrive through our connection to a central goal: to be a valued member of a winning team on an inspiring mission. We bring our whole selves to work every day. And we embrace the notion that unique perspectives fuel innovation and enable us to best serve our customers and communities around the globe. We welcome you to apply today and want you to know that we are committed to offering equal employment opportunity without regard to age, color, disability, gender reassignment or identity or expression, genetic information, marital or civil partner status, pregnancy or maternity status, military or veteran status, nationality, ethnic or national origin, race, religion or belief, sexual orientation, or any legally protected characteristic. If you have a disability or special need that requires accommodation, please let us know.

Posted 1 month ago

Apply

0 years

5 - 7 Lacs

Chennai

On-site

Flex is the diversified manufacturing partner of choice that helps market-leading brands design, build and deliver innovative products that improve the world. A career at Flex offers the opportunity to make a difference and invest in your growth in a respectful, inclusive, and collaborative environment. If you are excited about a role but don't meet every bullet point, we encourage you to apply and join us to create the extraordinary.

Job Description: The Business Intelligence Developer will be Chennai-based (with very occasional travel to sites), responsible for maintaining and improving the organisation’s business intelligence systems to ensure that they function reliably and in accordance with user needs.

What a typical day looks like: Support the Senior Business Intelligence Developer to maintain and improve the Power BI suite of reports for the business. Develop, test, review, and help deploy automated reports and dashboards using Power BI and other reporting tools. Understand business requirements to set functional specifications for reporting applications. Exhibit an understanding of database concepts such as relational database architecture and multidimensional database design. Design data models that transform raw data into insightful knowledge by understanding business requirements in the context of BI. Develop technical specifications from business needs, and accurately scope the work to help set realistic deadlines for completion. Make charts and data documentation that include descriptions of the techniques, parameters, models, and relationships. Use Power BI Desktop to create dashboards, KPI scorecards, and visual reports. Examine, comprehend, and study business needs as they relate to business intelligence. Design and map data models to transform raw data into insightful information. Create dynamic and eye-catching dashboards and reports using Power BI. Make necessary tactical and technological adjustments to enhance current business intelligence systems. Integrate data, alter data, and connect to data sources for business intelligence.

The experience we’re looking to add to our team: Knowledge of SSRS, T-SQL, Power Query, MDX, Power BI, and DAX, and of systems on the MS SQL Server BI stack. Good communication skills to work effectively with stakeholders, end users, and all levels of the organisation who request reports. Ability to run a Power BI project end to end through all stages, from requirements gathering to report deployment. Exceptional analytical thinking skills for converting data into illuminating reports and dashboards. Some knowledge of data warehousing, data gateway, and data preparation projects. Good knowledge of Power BI, and desirably knowledge of the SSAS, SSRS, and SSIS components of the Microsoft Business Intelligence stack. Articulating, representing, and analysing solutions with the team while documenting, creating, and modelling them. Strong understanding of DAX, intermediate knowledge of SQL and M-Query, and a basic understanding of Python. A basic understanding of ADF or Fabric pipelines. Comprehensive understanding of data modelling, administration, and visualisation. Capacity to perform in an atmosphere where agility and continual improvement are prioritised. Awareness of BI technologies (e.g., Microsoft Power BI, Oracle BI). Expertise in SQL queries, SSRS, and SQL Server Integration Services (SSIS). Troubleshooting and problem-solving skills. Demonstrates basic functional, technical, and people and/or process management skills, as well as customer (external and internal) relationship skills. Demonstrates skills in a functional/technical area. Use of the following tools may be required: office skills including telephones, data entry, and office software such as word processing, spreadsheets, presentation packages, and database systems.

What you’ll receive for the great work you provide: Health Insurance. Paid Time Off.

#BB04

Job Category: IT

Flex pays for all costs associated with the application, interview or offer process; a candidate will not be asked for any payment related to these costs. Flex does not accept unsolicited resumes from headhunters, recruitment agencies or fee-based recruitment services. Flex is an Equal Opportunity Employer and employment selection decisions are based on merit, qualifications, and abilities. Flex does not discriminate in employment opportunities or practices based on: age, race, religion, color, sex, national origin, marital status, sexual orientation, gender identity, veteran status, disability, pregnancy status or any other status protected by law. Flex provides reasonable accommodation so that qualified applicants with a disability may participate in the selection process. Please advise us of any accommodations you request to express interest in a position by e-mailing: accessibility@flex.com. Please state your request for assistance in your message. Only reasonable accommodation requests related to applying for a specific position within Flex will be reviewed at the e-mail address. Flex will contact you if it is determined that your background is a match to the required skills for this position. Thank you for considering a career with Flex.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Calcutta

On-site

Skill required: Tech for Operations - Microsoft Azure Cloud Services
Designation: App Automation Eng Senior Analyst
Qualifications: Any Graduation/12th/PUC/HSC
Years of Experience: 5 to 8 years
About Accenture
Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com.
What would you do?
In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients’ supply chains and customer experience. The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting and maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience in troubleshooting Azure Data Factory environments, conducting code reviews, and bug fixing. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems.
What are we looking for?
• Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
• Proven experience (5+ years) as an Azure Data Factory Support Engineer II, with expertise in ADF and a deep understanding of its data-related libraries.
• Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments.
• Proficiency in SQL and experience with SQL database design.
• Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
• Experience with ADF pipelines.
• Excellent problem-solving and troubleshooting skills.
• Experience in code review and debugging in a collaborative project setting.
• Excellent verbal and written communication skills.
• Ability to work in a fast-paced, team-oriented environment.
• Strong understanding of the business and a passion for the mission of Service Supply Chain.
• Hands-on experience with Jira, DevOps ticketing, or ServiceNow is good to have.
Roles and Responsibilities:
• Innovate, collaborate, build, create and solve across ADF and associated systems; ensure systems meet business requirements and industry practices.
• Integrate new data management technologies and software engineering tools into existing structures.
• Recommend ways to improve data reliability, efficiency, and quality.
• Use large data sets to address business issues.
• Use data to discover tasks that can be automated.
• Fix bugs to ensure a robust and sustainable codebase.
• Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance.
• Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively.
• Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines.
• Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives.
• Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure (see the monitoring sketch after this posting).
• Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy.
• Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance.
• Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement.
• Stay updated with the latest technologies and practices in ADF to continuously improve the support and maintenance of data systems.
• Flexible work hours, to include US time zones; this position may require you to work a rotational on-call schedule, evenings, weekends, and holiday shifts when the need arises.
• Participate in the Demand Management and Change Management processes.
• Work in partnership with internal business, external third-party technical teams and functional teams as a technology partner in communicating and coordinating delivery of technology services from Technology for Operations (TfO).
Any Graduation, 12th/PUC/HSC
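For illustration only, a minimal sketch of the kind of pipeline-run monitoring the role describes, using the azure-mgmt-datafactory Python SDK. This is not Accenture's internal tooling; the subscription, resource group, factory and pipeline names are hypothetical placeholders.

```python
# Minimal monitoring sketch: start an ADF pipeline run and poll it to completion.
# Assumes azure-identity and azure-mgmt-datafactory are installed; all resource
# names below are hypothetical.
import time
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

SUB_ID, RG = "<subscription-id>", "<resource-group>"
FACTORY, PIPELINE = "<factory-name>", "<pipeline-name>"

client = DataFactoryManagementClient(DefaultAzureCredential(), SUB_ID)
run = client.pipelines.create_run(RG, FACTORY, PIPELINE)  # kick off a run

while True:
    state = client.pipeline_runs.get(RG, FACTORY, run.run_id)
    if state.status not in ("Queued", "InProgress"):
        break
    time.sleep(30)  # poll every 30 seconds

# On failure, state.message carries the activity error for triage/ticketing.
print(run.run_id, state.status, state.message or "")
```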

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Role Description: Data Engineer
Responsibilities/Tasks
Work with Technical Leads and Architects to analyse solutions.
Translate complex business requirements into tangible data requirements through collaborative work with both business and technical subject matter experts.
Develop / modify data models with an eye towards high performance, scalability, flexibility and usability.
Ensure data models are in alignment with the overall architecture standards.
Create source-to-target mapping documentation.
Serve as data flow and enrichment “owner”, with deep expertise in data dynamics and the ability to recognize and elevate improvement opportunities early in the process (a minimal enrichment sketch follows this posting).
Work with product owners to understand business reporting requirements and deliver appropriate insights on a regular basis.
Responsible for system configuration to deliver reports, data visualizations, and other solution components.
Skills Required
More than 5 years of software development experience.
Proficient in Azure services - Azure Data Factory, Synapse, Data Lake, Databricks.
Nice to have: experience in C#/.NET.
Experience querying, analysing, or managing data required.
Experience within the healthcare insurance industry.
Experience in data cleansing, data engineering, data enrichment, and data warehousing/Business Intelligence preferred.
Strong analytical, problem-solving and planning skills.
Strong organizational and presentation skills.
Excellent interpersonal and communication skills.
Ability to multi-task in a fast-paced environment.
Flexibility to adapt readily to changing business needs in a fast-paced environment.
Team player who is delivery-oriented and takes responsibility for the team’s success.
Enthusiastic, can-do attitude with the drive to continually learn and improve.
Knowledge of Agile and/or Scrum methodologies.
Skills: Data Engineering, Azure, ADB, ADF
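For illustration only, a minimal PySpark sketch of the source-to-target cleansing and enrichment work this posting describes, written for a Databricks-style environment. The paths, table and column names (claims, members) are hypothetical; the healthcare-insurance flavour simply echoes the posting's industry note.

```python
# Minimal enrichment sketch: cleanse a raw extract and join reference data,
# then land it in the target table named by a source-to-target mapping.
# All paths and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-enrichment").getOrCreate()

claims = spark.read.parquet("/mnt/datalake/raw/claims")        # source extract
members = spark.read.parquet("/mnt/datalake/curated/members")  # reference data

enriched = (
    claims
    .withColumn("claim_date", F.to_date("claim_date"))  # standardize types
    .dropDuplicates(["claim_id"])                       # basic cleansing
    .join(members.select("member_id", "plan_code"), "member_id", "left")
)

# Target table per the mapping document.
enriched.write.mode("overwrite").saveAsTable("analytics.claims_enriched")
```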

Posted 1 month ago

Apply

5.0 - 10.0 years

9 - 18 Lacs

Mumbai, Mumbai Suburban, Navi Mumbai

Work from Office

Job description
Hiring for Oracle Fusion Technical - Mumbai (WFO)
Experience: 5+ Years
Work location: Mumbai (WFO)
Notice Period: Immediate to 15 Days Max
About Clover InfoTech:
With 30 years of IT excellence, Clover Infotech is a leading global IT services and consulting company. Our 5000+ experts specialize in Oracle, Microsoft, and Open Source technologies, delivering solutions in application and technology modernization, cloud enablement, data management, automation, and assurance services. We help enterprises on their transformation journey by implementing business-critical applications and supporting technology infrastructure through a proven managed services model. Our SLA-based delivery ensures operational efficiency, cost-effectiveness, and enhanced information security. We proudly partner with companies ranging from Fortune 500 companies to emerging enterprises and new-age startups, offering technology-powered solutions that accelerate growth and drive success.
Job Description: Oracle Fusion Technical
Responsible for providing technical solutions and implementing them.
Should have worked extensively on Fusion reporting tools such as BIP, OTBI and FRS.
Should have expertise in Oracle BPM and workflows.
Strong knowledge of FBDI, ADF, FBL and web services (SOAP & REST); a minimal REST call sketch follows this posting.
Ready to join immediately or within 30 days.
Ready to work from the Clover office or client locations.
Total relevant experience in Oracle ERP (EBS + Fusion) of more than 5 years, with a minimum of 3 years in Fusion.
Preference given to candidates with an Oracle Fusion certification.
Has completed a minimum of 2 Fusion implementation projects.
Should be ready to work in different time zones.
Should have managed a team of technical resources and coordinated with functional teams.
Should be ready to travel as and when required and hold a valid passport.
Should be able to train and mentor a team in Oracle Fusion.
Thank you for considering this opportunity with Clover InfoTech. We look forward to hearing from you!
Best Regards,
Talent Acquisition Team
https://www.cloverinfotech.com
Office Locations
India: Mumbai | Navi Mumbai | Pune | Gurugram | Bengaluru | Chennai
International: UAE | USA | Canada | Singapore
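For illustration only, a heavily hedged sketch of calling an Oracle Fusion REST web service, as the posting mentions. The host, resource name and credentials are hypothetical, and real resource paths and API versions vary by Fusion release, so treat this as a shape, not a reference.

```python
# Minimal sketch: query a (hypothetical) Oracle Fusion REST resource.
# Host, resource, version and credentials are placeholders; consult the
# Fusion REST documentation for the real resource catalogue.
import requests

BASE = "https://your-pod.oraclecloud.com/fscmRestApi/resources/11.13.18.05"

resp = requests.get(
    f"{BASE}/invoices",                       # hypothetical resource
    params={"limit": 25, "onlyData": "true"},
    auth=("integration.user", "<password>"),  # basic auth for illustration
    timeout=30,
)
resp.raise_for_status()
for item in resp.json().get("items", []):
    print(item)
```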

Posted 1 month ago

Apply

29.0 years

0 Lacs

Vishakhapatnam, Andhra Pradesh, India

On-site

Company Description
Miracle Software Systems is a global IT services company delivering true value to businesses for the past 29 years. We optimize and transform businesses into high-performance platforms, enabling digitization and business growth. With over 2600 employees worldwide, Miracle serves 42 of today’s Fortune 100 companies, with 1000+ satisfied customers and 1400+ successful projects. We provide services across Cloud, Application Development, Data and Analytics, among others, and are known for our Always-Available, Innovation-First approach, making us a trusted partner in digital journeys. We have alliances with leading IT firms such as SAP, IBM, AWS, RedHat, Microsoft, and UiPath.
Role Description
This is an on-site full-time role for a Sr. Azure Data Engineer located in Vishakhapatnam. The Sr. Azure Data Engineer (6+ years) will be a highly skilled data engineer with extensive experience in Microsoft Azure, particularly in ADF and Fabric pipeline development, and a strong understanding of the Medallion Architecture (Bronze, Silver, Gold layers; a minimal Bronze-to-Silver sketch follows this posting). The ideal candidate will be responsible for designing and optimizing end-to-end data pipelines across Lakehouses and Warehouses in Microsoft Fabric, and will work closely with business and engineering teams to define scalable, governed data models.
Responsibilities:
Develop and manage complex data pipelines using Azure Data Factory (ADF) and Microsoft Fabric.
Implement and maintain Medallion Architecture layers (Bronze, Silver, Gold).
Design governed, scalable data models tailored to business requirements.
Develop and optimize PySpark-based data processing for large-scale data transformations.
Integrate with reporting tools such as Power BI for seamless data visualization.
Ensure robust data governance, security, and performance in large-scale Fabric deployments.
Required Skills:
Strong expertise in Azure Data Factory (ADF) and Microsoft Fabric.
Hands-on experience with OneLake, Lakehouse Explorer, and Power BI integration.
Solid understanding of data governance, security, and performance tuning.
SAP knowledge is required.
Proficiency in PySpark is mandatory.
Interested candidates can share an updated resume to skoditala@miraclesoft.com or reach me at 08919401201.
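For illustration only, a minimal Bronze-to-Silver sketch of the Medallion Architecture the posting names, using PySpark with Delta tables as in Fabric or Databricks Lakehouses. The paths and columns are hypothetical, and the snippet assumes a Delta-enabled Spark session.

```python
# Minimal Medallion sketch: promote raw Bronze data to a cleansed Silver table.
# Paths, table names and columns are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-demo").getOrCreate()

# Bronze: raw ingested records, kept as-is.
bronze = spark.read.format("delta").load("Tables/bronze_orders")

# Silver: cleansed, conformed, deduplicated.
silver = (
    bronze
    .filter(F.col("order_id").isNotNull())
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .dropDuplicates(["order_id"])
)
silver.write.format("delta").mode("overwrite").save("Tables/silver_orders")

# Gold would then aggregate Silver into business-ready marts (e.g. daily revenue).
```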

Posted 1 month ago

Apply

10.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Role Description
Role Proficiency: Act creatively to develop applications by selecting appropriate technical options, optimizing application development, maintenance and performance by employing design patterns and reusing proven solutions. Account for others' developmental activities, assisting the Project Manager in day-to-day project execution.
Outcomes:
Interpret the application feature and component designs and develop them in accordance with specifications.
Code, debug, test, document and communicate product component and feature development stages.
Validate results with user representatives, integrating and commissioning the overall solution.
Select and create appropriate technical options for development, such as reusing, improving or reconfiguring existing components, while creating own solutions for new contexts, optimizing efficiency, cost and quality.
Influence and improve customer satisfaction.
Influence and improve employee engagement within the project teams.
Set FAST goals for self/team; provide feedback on FAST goals of team members.
Measures of Outcomes:
Adherence to engineering process and standards (coding standards).
Adherence to project schedule/timelines.
Number of technical issues uncovered during the execution of the project.
Number of defects in the code.
Number of defects post delivery.
Number of non-compliance issues.
Percent of voluntary attrition.
On-time completion of mandatory compliance trainings.
Outputs Expected:
Code: Code as per the design. Define coding standards, templates and checklists. Review code for team and peers.
Documentation: Create/review templates, checklists, guidelines and standards for design/process/development. Create/review deliverable documents, design documentation, requirements, test cases and results.
Configure: Define and govern the configuration management plan. Ensure compliance from the team.
Test: Review/create unit test cases, scenarios and execution. Review the test plan created by the testing team. Provide clarifications to the testing team.
Domain Relevance: Advise software developers on design and development of features and components with a deeper understanding of the business problem being addressed for the client. Learn more about the customer domain and identify opportunities to provide value addition to customers. Complete relevant domain certifications.
Manage Project: Support the Project Manager with inputs for the projects. Manage delivery of modules. Manage complex user stories.
Manage Defects: Perform defect RCA and mitigation. Identify defect trends and take proactive measures to improve quality.
Estimate: Create and provide input for effort and size estimation and plan resources for projects.
Manage Knowledge: Consume and contribute to project-related documents, SharePoint libraries and client universities. Review the reusable documents created by the team.
Release: Execute and monitor the release process.
Design: Contribute to the creation of design (HLD, LLD, SAD)/architecture for applications, features, business components and data models.
Interface with Customer: Clarify requirements and provide guidance to the development team. Present design options to customers. Conduct product demos. Work closely with customer architects to finalize the design.
Manage Team: Set FAST goals and provide feedback. Understand the aspirations of team members and provide guidance, opportunities, etc. Ensure team members are upskilled and engaged in the project. Proactively identify attrition risks and work with BSE on retention measures.
Certifications: Obtain relevant domain and technology certifications.
Skill Examples:
Explain and communicate the design/development to the customer.
Perform and evaluate test results against product specifications.
Break down complex problems into logical components.
Develop user interfaces and business software components.
Use data models.
Estimate the time, effort and resources required for developing/debugging features/components.
Perform and evaluate tests in the customer or target environments.
Make quick decisions on technical/project-related challenges.
Manage a team, mentor, and handle people-related issues in the team.
Maintain high motivation levels and positive dynamics within the team.
Interface with other teams, designers and other parallel practices.
Set goals for self and team; provide feedback to team members.
Create and articulate impactful technical presentations.
Follow a high level of business etiquette in emails and other business communication.
Drive conference calls with customers and answer customer questions.
Proactively ask for and offer help.
Ability to work under pressure, determine dependencies and risks, facilitate planning, and handle multiple tasks.
Build confidence with customers by meeting deliverables on time with a quality product.
Knowledge Examples:
Appropriate software programs/modules.
Functional and technical designing.
Programming languages - proficient in multiple skill clusters.
DBMS.
Operating systems and software platforms.
Software Development Life Cycle.
Agile - Scrum or Kanban methods.
Integrated development environments (IDE).
Rapid application development (RAD).
Modelling technology and languages.
Interface definition languages (IDL).
Broad knowledge of the customer domain and deep knowledge of the sub-domain where the problem is solved.
Additional Comments:
Key Responsibilities:
Design, build, and manage end-to-end ETL/ELT workflows using Azure Data Factory (ADF) to support supply chain data movement and transformation.
Integrate data from multiple sources such as ERP systems, logistics platforms, warehouses, APIs, and third-party providers into Azure Data Lake or Synapse Analytics.
Ensure high-performance, scalable, and secure data pipelines aligned with business and compliance requirements.
Collaborate with business analysts, data architects, and supply chain SMEs to understand data needs and implement effective solutions.
Write and optimize complex SQL queries, stored procedures, and data transformation logic.
Monitor, troubleshoot, and optimize ADF pipelines for latency, throughput, and reliability.
Support data validation, quality assurance, and governance processes.
Document data flows, transformation logic, and technical processes.
Work in an Agile/Scrum delivery model to support iterative development and rapid delivery.
Required Skills & Experience:
9-10 years of experience in Data Engineering and ETL development, with at least 3-5 years in Azure Data Factory.
Strong knowledge of Azure Data Lake, Azure SQL DB, Azure Synapse, Blob Storage, and Data Flows in ADF.
Proficiency in SQL, T-SQL, and performance tuning of queries.
Experience working with structured, semi-structured (JSON, XML), and unstructured data (a minimal JSON-flattening sketch follows this posting).
Exposure to supply chain data sources like ERP (e.g., SAP, Oracle), TMS, WMS, or inventory/order management systems.
Experience with Git, Azure DevOps, or other version control and CI/CD tools.
Basic understanding of Databricks, Python, or Spark is a plus.
Familiarity with data quality, metadata management, and lineage tools.
Bachelor's degree in Computer Science, Engineering, or a related field.
Preferred Qualifications:
Experience in Supply Chain Analytics or Operations.
Knowledge of forecasting, inventory planning, procurement, logistics, or demand planning data flows.
Certification as a Microsoft Azure Data Engineer Associate is preferred.
Soft Skills:
Strong problem-solving and analytical skills.
Ability to communicate effectively with business and technical stakeholders.
Experience working in Agile/Scrum teams.
Proactive, self-motivated, and detail-oriented.
Skills: Azure Data Factory, Azure Data Lake, Blob Storage
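For illustration only, a minimal PySpark sketch of working with the semi-structured (JSON) data the posting mentions: flattening nested supply-chain order records from a landing zone. The storage paths and field names are hypothetical.

```python
# Minimal sketch: flatten nested JSON order records into tabular order lines.
# Paths and fields are hypothetical.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("json-flatten").getOrCreate()

orders = spark.read.json("abfss://raw@lake.dfs.core.windows.net/orders/*.json")

flat = (
    orders
    .withColumn("line", F.explode("lines"))  # one row per order line
    .select(
        "order_id",
        F.col("customer.id").alias("customer_id"),
        F.col("line.sku").alias("sku"),
        F.col("line.qty").cast("int").alias("qty"),
    )
)

flat.write.mode("append").parquet(
    "abfss://curated@lake.dfs.core.windows.net/order_lines"
)
```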

Posted 1 month ago

Apply

5.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Skill required: Tech for Operations - Microsoft Azure Cloud Services
Designation: App Automation Eng Senior Analyst
Qualifications: Any Graduation/12th/PUC/HSC
Years of Experience: 5 to 8 years
About Accenture
Accenture is a global professional services company with leading capabilities in digital, cloud and security. Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world’s largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. We embrace the power of change to create value and shared success for our clients, people, shareholders, partners and communities. Visit us at www.accenture.com.
What would you do?
In our Service Supply Chain offering, we leverage a combination of proprietary technology and client systems to develop, execute, and deliver BPaaS (business process as a service) or Managed Service solutions across the service lifecycle: Plan, Deliver, and Recover. In this role, you will partner with business development and act as a Business Subject Matter Expert (SME) to help build resilient solutions that will enhance our clients’ supply chains and customer experience. The Senior Azure Data Factory (ADF) Support Engineer II will be a critical member of our Enterprise Applications Team, responsible for designing, supporting and maintaining robust data solutions. The ideal candidate is proficient in ADF and SQL and has extensive experience in troubleshooting Azure Data Factory environments, conducting code reviews, and bug fixing. This role requires a strategic thinker who can collaborate with cross-functional teams to drive our data strategy and ensure the optimal performance of our data systems.
What are we looking for?
Bachelor’s or Master’s degree in Computer Science, Information Technology, or a related field.
Proven experience (5+ years) as an Azure Data Factory Support Engineer II, with expertise in ADF and a deep understanding of its data-related libraries.
Strong experience in Azure cloud services, including troubleshooting and optimizing cloud-based environments.
Proficiency in SQL and experience with SQL database design.
Strong analytical skills, with the ability to collect, organize, analyze, and disseminate significant amounts of information with attention to detail and accuracy.
Experience with ADF pipelines.
Excellent problem-solving and troubleshooting skills.
Experience in code review and debugging in a collaborative project setting.
Excellent verbal and written communication skills.
Ability to work in a fast-paced, team-oriented environment.
Strong understanding of the business and a passion for the mission of Service Supply Chain.
Hands-on experience with Jira, DevOps ticketing, or ServiceNow is good to have.
Roles and Responsibilities:
Innovate, collaborate, build, create and solve across ADF and associated systems; ensure systems meet business requirements and industry practices.
Integrate new data management technologies and software engineering tools into existing structures.
Recommend ways to improve data reliability, efficiency, and quality.
Use large data sets to address business issues.
Use data to discover tasks that can be automated.
Fix bugs to ensure a robust and sustainable codebase.
Collaborate closely with the relevant teams to diagnose and resolve issues in data processing systems, ensuring minimal downtime and optimal performance.
Analyze and comprehend existing ADF data pipelines, systems, and processes to identify and troubleshoot issues effectively.
Develop, test, and implement code changes to fix bugs and improve the efficiency and reliability of data pipelines.
Review and validate change requests from stakeholders, ensuring they align with system capabilities and business objectives.
Implement robust monitoring solutions to proactively detect and address issues in ADF data pipelines and related infrastructure.
Coordinate with data architects and other team members to ensure that changes are in line with the overall architecture and data strategy.
Document all changes, bug fixes, and updates meticulously, maintaining clear and comprehensive records for future reference and compliance.
Provide technical guidance and support to other team members, promoting a culture of continuous learning and improvement.
Stay updated with the latest technologies and practices in ADF to continuously improve the support and maintenance of data systems.
Flexible work hours, to include US time zones; this position may require you to work a rotational on-call schedule, evenings, weekends, and holiday shifts when the need arises.
Participate in the Demand Management and Change Management processes.
Work in partnership with internal business, external third-party technical teams and functional teams as a technology partner in communicating and coordinating delivery of technology services from Technology for Operations (TfO).

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Flex is the diversified manufacturing partner of choice that helps market-leading brands design, build and deliver innovative products that improve the world. We believe in the power of diversity and inclusion and cultivate a workplace culture of belonging that views uniqueness as a competitive edge and builds a community that enables our people to push the limits of innovation to make great products that create value and improve people's lives. A career at Flex offers the opportunity to make a difference and invest in your growth in a respectful, inclusive, and collaborative environment. If you are excited about a role but don't meet every bullet point, we encourage you to apply and join us to create the extraordinary.
The “Business Intelligence Developer” will be Remote based (with very occasional travel to sites) and responsible for maintaining and improving the organisation’s business intelligence systems to ensure that they function reliably and in accordance with user needs.
What a typical day looks like:
Support the Senior Business Intelligence Developer to maintain and improve the Power BI suite of reports for the business.
Develop, test, review and help deploy automated reports and dashboards using Power BI and other reporting tools.
Understand business requirements to set functional specifications for reporting applications.
Exhibit an understanding of database concepts such as relational database architecture and multidimensional database design.
Design data models that transform raw data into insightful knowledge by understanding business requirements in the context of BI.
Develop technical specifications from business needs, and accurately scope the work to help set realistic deadlines for completion.
Create charts and data documentation that includes descriptions of the techniques, parameters, models, and relationships.
Use Power BI Desktop to create dashboards, KPI scorecards, and visual reports.
Examine, comprehend, and study business needs as they relate to business intelligence.
Design and map data models to transform raw data into insightful information.
Create dynamic and eye-catching dashboards and reports using Power BI.
Make necessary tactical and technological adjustments to enhance current business intelligence systems.
Integrate data, transform data, and connect to data sources for business intelligence.
The experience we’re looking to add to our team:
Knowledge of SSRS and T-SQL, Power Query, MDX, Power BI, and DAX, and of systems on the MS SQL Server BI stack.
Good communication skills, to work effectively with stakeholders, end users and all levels of the organisation who request reports.
Ability to run a Power BI project end to end through all stages, from requirements gathering to report deployment.
Exceptional analytical thinking skills for converting data into illuminating reports and dashboards.
Some knowledge of data warehousing, data gateway, and data preparation projects.
Good knowledge of Power BI, and desirably knowledge of the SSAS, SSRS, and SSIS components of the Microsoft Business Intelligence stack.
Articulating, representing, and analysing solutions with the team while documenting, creating, and modelling them.
Strong understanding of DAX, intermediate knowledge of SQL and M-Query, and a basic understanding of Python.
A basic understanding of ADF or Fabric Pipelines.
Comprehensive understanding of data modelling, administration, and visualisation.
Capacity to perform in an atmosphere where agility and continual development are prioritised.
Awareness of BI technologies (e.g., Microsoft Power BI, Oracle BI).
Expertise in SQL queries, SSRS, and SQL Server Integration Services (SSIS).
Troubleshooting and problem-solving skills.
Demonstrates basic functional, technical, and people and/or process management skills, as well as customer (external and internal) relationship skills. Demonstrates skills in the functional/technical area. Use of the following tools may be required: Office Skills: telephones, data entry, and office software, including word processing, spreadsheets, presentation packages and database systems.
What you’ll receive for the great work you provide:
Health Insurance
Paid Time Off
#BB04
Site
Flex is an Equal Opportunity Employer and employment selection decisions are based on merit, qualifications, and abilities. We celebrate diversity and do not discriminate based on: age, race, religion, color, sex, national origin, marital status, sexual orientation, gender identity, veteran status, disability, pregnancy status, or any other status protected by law. We're happy to provide reasonable accommodations to those with a disability for assistance in the application process. Please email accessibility@flex.com and we'll discuss your specific situation and next steps (NOTE: this email does not accept or consider resumes or applications; it is only for disability assistance. To be considered for a position at Flex, you must complete the application process first).

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Senior Data Functional Consultant - Fully Remote - 6 Month Contract
Role: Senior Data Functional Consultant
Client Location: Dubai
Work Location: Fully Remote
Duration: 6 months, extendable
Monthly Rate: $2000 USD
Our Dubai-based client is seeking a proactive and business-oriented Data Functional Consultant with strong experience in Azure Data Factory and Azure Databricks. This role bridges the gap between business stakeholders and technical teams: translating business needs into scalable data solutions, ensuring effective data management, and enabling insights-driven decision-making. The ideal candidate is not a pure developer or data engineer but someone who understands business processes, data flows, and stakeholder priorities, and can help drive value from data platforms using cloud-native Azure services.
Experience Required:
• Proven exposure to Azure Data Factory (ADF) for orchestrating data workflows.
• Practical experience with Azure Databricks for data processing (functional understanding, not necessarily coding).
• Strong understanding of data warehousing, data modeling, and business KPIs.
• Experience working in agile or hybrid project environments.
• Excellent communication and stakeholder management skills.
• Ability to translate complex technical details into business-friendly language.
• Familiarity with tools like Power BI, Excel, or other reporting solutions is a plus.
• Background in the banking or finance industries is a bonus.
Responsibilities:
• Collaborate closely with business stakeholders to gather, understand, and document functional data requirements.
• Translate business needs into high-level data design, data workflows, and process improvements.
• Work with data engineering teams to define and validate ETL/ELT logic and data pipeline workflows using Azure Data Factory and Databricks.
• Facilitate functional workshops and stakeholder meetings to align on data needs and business KPIs.
• Act as a bridge between business teams and data engineers to ensure accurate implementation and delivery of data solutions.
• Conduct data validation, UAT, and support users in adopting data platforms and self-service analytics (a minimal validation sketch follows this posting).
• Maintain functional documentation, data dictionaries, and mapping specifications.
• Assist in defining data governance, data quality, and master data management practices from a business perspective.
• Monitor data pipeline health and help triage issues from a functional/business impact standpoint.
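For illustration only, a minimal sketch of the kind of UAT data validation a functional consultant might run: reconciling a delivered extract against documented business rules. The file name, columns and rules are hypothetical.

```python
# Minimal UAT validation sketch: check a delivered extract against agreed rules.
# File, columns and rules are hypothetical.
import pandas as pd

df = pd.read_csv("uat_extract_customers.csv")

checks = {
    "no duplicate customer_id": df["customer_id"].is_unique,
    "no null country codes": df["country_code"].notna().all(),
    "balances non-negative": (df["balance"] >= 0).all(),
}

for rule, passed in checks.items():
    print(f"{'PASS' if passed else 'FAIL'}: {rule}")
```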

Posted 1 month ago

Apply
Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies