2.0 - 6.0 years
4 - 8 Lacs
Noida, Pune, Bengaluru
Work from Office
Type: Permanent, Work from Office
Location: Chennai
Budget: Competitive, as per industry standards
Looking for: Immediate Joiners
Responsibilities:
- Design and develop customizations, extensions, and reports in Oracle Fusion applications.
- Collaborate with functional consultants to understand business requirements and provide technical solutions.
- Develop and implement integrations using Oracle Integration Cloud (OIC), BI Publisher, and other tools.
- Debug and resolve issues in Oracle Fusion modules.
- Maintain technical documentation for solutions provided.
- Ensure compliance with best practices in Fusion application development.
Skills Required:
- Hands-on experience in Oracle Fusion technical development.
- Strong skills in Oracle Integration Cloud (OIC), BI Publisher, and ADF.
- Good understanding of Oracle Fusion modules (Finance, SCM, HCM, etc.).
- Strong problem-solving and communication skills.
Location: Remote - Delhi/NCR, Bangalore/Bengaluru, Hyderabad/Secunderabad, Chennai, Pune, Kolkata, Ahmedabad, Mumbai
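The OIC integration work above typically exposes integration flows as REST endpoints. As a rough illustration only, with a hypothetical instance URL, flow identifier, payload, and credentials, a downstream system might invoke such an endpoint like this:

```python
import requests

# Hypothetical OIC instance and flow; real values come from the
# integration's "Run" details in the OIC console.
OIC_BASE = "https://myoic-instance.integration.ocp.oraclecloud.com"
ENDPOINT = f"{OIC_BASE}/ic/api/integration/v1/flows/rest/INVOICE_SYNC/1.0/invoices"

payload = {"invoiceNumber": "INV-1001", "amount": 2500.00, "currency": "INR"}

# OIC REST endpoints commonly accept basic auth or OAuth2; basic auth shown.
resp = requests.post(
    ENDPOINT,
    json=payload,
    auth=("integration.user", "password"),  # placeholder credentials
    timeout=30,
)
resp.raise_for_status()
print(resp.status_code, resp.json())
```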
Posted 5 hours ago
4.0 years
0 Lacs
India
Remote
Welcome to Veradigm! Our Mission is to be the most trusted provider of innovative solutions that empower all stakeholders across the healthcare continuum to deliver world-class outcomes. Our Vision is a Connected Community of Health that spans continents and borders. With the largest community of clients in healthcare, Veradigm is able to deliver an integrated platform of clinical, financial, connectivity and information solutions to facilitate enhanced collaboration and exchange of critical patient information.
Veradigm Life
Veradigm is here to transform health, insightfully. Veradigm delivers a unique combination of point-of-care clinical and financial solutions, a commitment to open interoperability, a large and diverse healthcare provider footprint, along with industry-proven expert insights. We are dedicated to simplifying the complicated healthcare system with next-generation technology and solutions, transforming healthcare from the point of patient care to everyday life. For more information, please explore Veradigm.com.
Job Description for Sr Software Engineer
Job Title: Sr Software Engineer
Job Responsibilities
What will your job look like: The primary purpose of this role is to perform specification, design, coding, testing, and documentation in the areas of development and maintenance. You will be responsible for creating low-level designs for complex software modules and subsystems, and for providing technical guidance to the team, ensuring the successful implementation of advanced software solutions. The ideal candidate will excel at translating business requirements into detailed and comprehensive functional requirements, thereby significantly contributing to the success of our projects.
An Ideal Candidate Will Have
- 4+ years of experience as a software engineer.
- SQL database experience (Redshift, PostgreSQL, MySQL, Snowflake, or similar). Key areas include understanding database design principles, writing efficient queries, and utilizing advanced features. Specific items include database design, data manipulation (CRUD operations), querying data (SELECT statements with various clauses like WHERE, GROUP BY, ORDER BY, and JOINs), data modeling, and understanding database concepts like primary and foreign keys.
- Excellent programming skills in ADF (Azure Data Factory pipelines), including data movement, data transformation, authentication, and control activities.
- Excellent programming skills in Python, Java, C#, C++, or a similar language.
- At least 1 year working as a software developer on large distributed systems and client-server architectures.
- 3+ years of Python development using frameworks like Flask, Django, Jinja, SQLAlchemy.
- Experience building and deploying applications using Amazon Web Services or similar cloud infrastructure.
- Software development in the life sciences industry preferred.
- Validated software development in a regulated environment preferred.
- Development/testing of ETL.
- Experience with Apache HTTP, NGINX, Tomcat, or Jetty.
- Experience with standard build tools and version control systems (e.g., Git, Jenkins).
- Broad understanding of internet protocols and network programming.
Benefits
Veradigm believes in empowering our associates with the tools and flexibility to bring the best version of themselves to work. Through our generous benefits package with an emphasis on work/life balance, we give our employees the opportunity to allow their careers to flourish.
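For a concrete picture of the SQL skills this posting lists (keys, JOINs, WHERE/GROUP BY/ORDER BY), here is a self-contained sketch; the table and column names are purely illustrative:

```python
import sqlite3

# In-memory database with a primary-key/foreign-key pair.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE patients (id INTEGER PRIMARY KEY, name TEXT, region TEXT);
    CREATE TABLE visits (
        id INTEGER PRIMARY KEY,
        patient_id INTEGER REFERENCES patients(id),  -- foreign key
        charge REAL
    );
    INSERT INTO patients VALUES (1, 'Asha', 'South'), (2, 'Ravi', 'North');
    INSERT INTO visits VALUES (1, 1, 120.0), (2, 1, 80.0), (3, 2, 250.0);
""")

# JOIN + WHERE + GROUP BY + ORDER BY in one query.
rows = conn.execute("""
    SELECT p.region, COUNT(*) AS n_visits, SUM(v.charge) AS total
    FROM visits v
    JOIN patients p ON p.id = v.patient_id
    WHERE v.charge > 50
    GROUP BY p.region
    ORDER BY total DESC
""").fetchall()
print(rows)  # [('North', 1, 250.0), ('South', 2, 200.0)]
```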
- Quarterly Company-Wide Recharge Days
- Flexible Work Environment (Remote/Hybrid Options)
- Peer-based incentive “Cheer” awards
- “All in to Win” bonus program
- Tuition Reimbursement Program
To know more about the benefits and culture at Veradigm, please visit the links below:
https://veradigm.com/about-veradigm/careers/benefits/
https://veradigm.com/about-veradigm/careers/culture/
Veradigm is proud to be an equal opportunity workplace dedicated to pursuing and hiring a diverse and inclusive workforce. Thank you for reviewing this opportunity! Does this look like a great match for your skill set? If so, please scroll down and tell us more about yourself!
Posted 7 hours ago
4.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Oracle
Management Level: Associate
Job Description & Summary: At PwC, our people in business application consulting specialise in consulting services for a variety of business applications, helping clients optimise operational efficiency. These individuals analyse client needs, implement software solutions, and provide training and support for seamless integration and utilisation of business applications, enabling clients to achieve their strategic objectives. Those in Oracle technology at PwC will focus on utilising and managing the Oracle suite of software and technologies for various purposes within an organisation. You will be responsible for tasks such as installation, configuration, administration, development, and support of Oracle products and solutions.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: Managing business performance in today’s complex and rapidly changing business environment is crucial for any organization’s short-term and long-term success. However, ensuring a streamlined end-to-end Oracle Fusion technical landscape that seamlessly adapts to the changing business environment is crucial from a process and compliance perspective. As part of the Technology Consulting - Business Applications - Oracle practice team, we leverage opportunities around digital disruption, new-age operating models, and best-in-class practices to deliver technology-enabled transformation to our clients.
Responsibilities:
- Extensive experience in Oracle ERP/Fusion SaaS/PaaS project implementations as a technical developer.
- Completed at least 2 full Oracle Cloud (Fusion) implementations.
- Extensive knowledge of the database structure for ERP/Oracle Cloud (Fusion).
- Extensive work on BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC).
Mandatory skill sets: BI Publisher reports, FBDI/OTBI Cloud, and Oracle Integration Cloud (OIC)
Preferred skill sets: database structure for ERP/Oracle Cloud (Fusion)
Years of experience required: Minimum 4 years of Oracle Fusion experience
Education qualification: BE/BTech, MBA
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Oracle Fusion Middleware (OFM)
Optional Skills: Accepting Feedback, Active Listening, Business Transformation, Communication, Design Automation, Emotional Regulation, Empathy, Inclusion, Intellectual Curiosity, Optimism, Oracle Application Development Framework (ADF), Oracle Business Intelligence (BI) Publisher, Oracle Cloud Infrastructure, Oracle Data Integration, Process Improvement, Process Optimization, Strategic Technology Planning, Teamwork, Well Being
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 7 hours ago
5.0 - 8.0 years
13 - 16 Lacs
Gāndhīnagar
On-site
Company Name: PIB Techco India Pvt Ltd
Location: Gandhinagar, Gujarat
Job Title: Sr. DevOps Engineer
Requirements:
Must have: We are seeking a highly skilled DevOps Engineer with 5-8 years of professional hands-on experience, particularly in managing Azure DevOps CI/CD pipelines and automating deployments across cloud-based data solutions. The ideal candidate should be capable of handling end-to-end deployment processes for Azure DevOps projects involving Azure Data Factory (ADF), Databricks, SQL, Python, Azure Data Lake Storage (ADLS), and Power BI repositories.
Key Responsibilities:
- Design, implement, and manage automated deployment pipelines for ADF, Databricks notebooks, SQL scripts, Python-based data processing, and Power BI projects.
- Manage build and release pipelines for various environments including Dev, UAT, and Production.
- Enable environment consistency across Dev, UAT, and Production with automated application deployments using Azure CI/CD pipelines, PowerShell, and CLI scripts.
- Proficiency in Python, Bash, or PowerShell.
- Collaborate with DataOps and data engineering teams to enable smooth integration and deployment across Dev, UAT, and Production environments.
- Monitor pipeline health and performance, troubleshoot deployment failures, and ensure version control and rollback mechanisms are in place.
- Support end-to-end project delivery including requirement gathering, pipeline design, development, testing automation, deployment, and post-deployment support.
- Implement robust branching strategies, Git workflows, and automated testing frameworks.
- Maintain version control practices using Azure DevOps Repos.
- Monitor, log, and troubleshoot deployment issues using Azure Monitor, Log Analytics, or cloud-native tools.
Nice to have:
- Familiarity with Azure Data Factory (ADF), Databricks, SQL, Python, Azure Data Lake Storage (ADLS), Power BI repositories, Docker, Kubernetes, or managed services like AKS/EKS.
- Experience working with Agile methodologies, Test-Driven Development (TDD), and implementing CI/CD pipelines using tools like Azure DevOps Pipelines or AWS CodePipeline.
- Exposure to data modelling tools like Erwin or ER/Studio to support DevOps in metadata and schema management.
- Exposure to leading reporting and visualization tools such as Power BI, particularly in automating report deployment and integration workflows.
- Experience with API integrations and supporting infrastructure-as-code for connecting various systems and services.
Job Types: Full-time, Permanent
Pay: ₹1,300,000.00 - ₹1,600,000.00 per year
Benefits: Health insurance, Provident Fund
Schedule: Day shift, Monday to Friday
Work Location: In person
Application Deadline: 01/07/2025
Expected Start Date: 01/07/2025
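As a hedged sketch of the scripted Dev/UAT/Prod promotion this role describes, the snippet below drives the Azure CLI from Python to deploy an ARM template; the resource group naming scheme and template paths are hypothetical:

```python
import subprocess

def deploy(environment: str) -> None:
    """Deploy the data-platform ARM template to one environment."""
    resource_group = f"rg-dataplatform-{environment}"  # hypothetical naming
    subprocess.run(
        [
            "az", "deployment", "group", "create",
            "--resource-group", resource_group,
            "--template-file", "arm/adf_template.json",
            "--parameters", f"@arm/params.{environment}.json",
        ],
        check=True,  # fail the release step if the deployment fails
    )

# Promote through environments in order; a real pipeline would gate
# each stage on tests and approvals rather than a plain loop.
for env in ("dev", "uat", "prod"):
    deploy(env)
```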
Posted 8 hours ago
0 years
7 - 9 Lacs
Calcutta
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Senior Associate
Job Description & Summary: At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: We are seeking a Data Engineer to design, develop, and maintain data ingestion processes to a data platform built using Microsoft technologies, ensuring data quality and integrity. The role involves collaborating with data architects and business analysts to implement solutions using tools like ADF and Azure Databricks, and requires strong SQL skills.
Responsibilities: Key responsibilities include developing, testing, and optimizing ETL workflows and maintaining documentation. ETL development experience in the Microsoft data track is required. Work with the business team to translate business requirements into technical requirements. Demonstrated expertise in Agile methodologies, including Scrum, Kanban, or SAFe.
Mandatory skill sets:
- Strong proficiency in Azure Databricks, including Spark and Delta Lake.
- Experience with Azure Data Factory, Azure Data Lake Storage, and Azure SQL Database.
- Proficiency in data integration and ETL processes and T-SQL.
- Experience working in Python for data engineering.
- Experience working with Postgres databases.
- Experience working with graph databases.
- Experience in architecture design and data modelling.
Good-to-have skill sets:
- Unity Catalog / Purview
- Familiarity with Fabric/Snowflake service offerings
- Visualization tool: Power BI
Preferred skill sets: Hands-on knowledge of Python, PySpark, and strong SQL knowledge. ETL and data warehousing are a must.
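To illustrate the Databricks/Delta Lake proficiency this posting asks for, here is a minimal PySpark ETL sketch; the storage paths and column names are placeholders, and on Azure Databricks the session comes preconfigured with Delta support:

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Ingest raw CSVs from a hypothetical landing zone in ADLS.
raw = (spark.read.option("header", True)
       .csv("abfss://landing@mylake.dfs.core.windows.net/orders/"))

# Light cleanup: typed columns and deduplicated keys.
cleaned = (raw
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .dropDuplicates(["order_id"]))

# Write as a Delta table; overwrite keeps the example idempotent.
(cleaned.write.format("delta")
 .mode("overwrite")
 .save("abfss://curated@mylake.dfs.core.windows.net/orders_delta/"))
```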
Relevant certifications (any one, e.g., Databricks Data Engineer Associate, Microsoft Certified: Azure Data Engineer Associate, Azure Solution Architect) are mandatory.
Years of experience required: 5+ years
Education qualification: Bachelor's degree in Computer Science, IT, or a related field.
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Data Engineering
Optional Skills: Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, Android, API Management, Appian (Platform), Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C#.NET, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Creativity {+ 46 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
Posted 8 hours ago
3.0 years
7 - 9 Lacs
Calcutta
On-site
Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Data, Analytics & AI
Management Level: Associate
Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities:
Role: Senior Associate
Experience: 3-6 years
Location: Kolkata
Technical Skills:
- Strong expertise in Azure Databricks, Azure Data Factory (ADF), PySpark, SQL Server, and Python.
- Solid understanding of Azure Functions and their application in data processing workflows.
- Understanding of DevOps practices and CI/CD pipelines for data solutions.
- Experience with other ETL tools such as Informatica Intelligent Cloud Services (IICS) is a plus.
- Strong problem-solving skills and the ability to work independently and collaboratively in a fast-paced environment.
- Excellent communication skills to effectively convey technical concepts to non-technical stakeholders.
Key Responsibilities:
- Develop, maintain, and optimize scalable data pipelines using Azure Databricks, Azure Data Factory (ADF), and PySpark.
- Collaborate with data architects and business stakeholders to translate requirements into technical solutions.
- Implement and manage data integration processes using SQL Server and Python.
- Design and deploy Azure Functions to support data processing workflows.
- Monitor and troubleshoot data pipeline performance and reliability issues.
- Ensure data quality, security, and compliance with industry standards and best practices.
- Document technical specifications and maintain clear and concise project documentation.
Mandatory skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark.
Preferred skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark.
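For the Azure Functions duty above, a minimal sketch using the Python v2 programming model is shown below; the "raw" container path is hypothetical, and the connection name is the default storage binding setting:

```python
import logging
import azure.functions as func

app = func.FunctionApp()

# Blob trigger: fires when a file lands in the (hypothetical) raw container.
@app.blob_trigger(arg_name="blob", path="raw/{name}",
                  connection="AzureWebJobsStorage")
def process_new_file(blob: func.InputStream):
    data = blob.read()
    # A real workflow might validate the file here and then hand off to
    # ADF or Databricks; this stub just logs what arrived.
    logging.info("Received %s (%d bytes)", blob.name, len(data))
```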
Years of experience required: 3-6 years
Education qualification: B.E./B.Tech/M.E./M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 11 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 8 hours ago
3.0 years
7 - 9 Lacs
Calcutta
On-site
Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Operations
Management Level: Senior Associate
Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: Senior Associate - Azure Data Engineer
Responsibilities:
Role: Senior Associate
Experience: 3-6 years
Location: Kolkata
Technical Skills:
- Strong expertise in Azure Databricks, Azure Data Factory (ADF), PySpark, SQL Server, and Python.
- Solid understanding of Azure Functions and their application in data processing workflows.
- Understanding of DevOps practices and CI/CD pipelines for data solutions.
- Experience with other ETL tools such as Informatica Intelligent Cloud Services (IICS) is a plus.
- Strong problem-solving skills and the ability to work independently and collaboratively in a fast-paced environment.
- Excellent communication skills to effectively convey technical concepts to non-technical stakeholders.
Key Responsibilities:
- Develop, maintain, and optimize scalable data pipelines using Azure Databricks, Azure Data Factory (ADF), and PySpark.
- Collaborate with data architects and business stakeholders to translate requirements into technical solutions.
- Implement and manage data integration processes using SQL Server and Python.
- Design and deploy Azure Functions to support data processing workflows.
- Monitor and troubleshoot data pipeline performance and reliability issues.
- Ensure data quality, security, and compliance with industry standards and best practices.
- Document technical specifications and maintain clear and concise project documentation.
Mandatory skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark.
Preferred skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark.
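The monitoring responsibility above can be scripted against the ADF management SDK. A hedged sketch follows; the subscription, resource group, and factory names are placeholders:

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

client = DataFactoryManagementClient(
    DefaultAzureCredential(), "<subscription-id>"  # placeholder
)

# Pull the last 24 hours of pipeline runs and surface failures.
now = datetime.now(timezone.utc)
filters = RunFilterParameters(
    last_updated_after=now - timedelta(hours=24),
    last_updated_before=now,
)
runs = client.pipeline_runs.query_by_factory("rg-data", "adf-prod", filters)
for run in runs.value:
    if run.status == "Failed":
        print(run.pipeline_name, run.run_id, run.message)
```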
Years of experience required: 3-6 years
Education qualification: B.E./B.Tech/M.E./M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Microsoft Azure, PySpark
Optional Skills: Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 8 hours ago
5.0 - 10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Role: Azure Data Engineer
Experience: 5-10 years
Notice Period: Immediate to 15 days
Location: Hyderabad
We are seeking a highly skilled Data Engineer to join our dynamic team.
Job Description
Mandatory skills: Databricks, BI, and ADP
- Proficient in the Azure Data Platform (Storage, ADF, Databricks, DevOps).
- Strong SQL skills and data model design (MS SQL, Databricks SQL).
- Experience with Azure SQL Database, Azure Cosmos DB, and Azure Blob Storage.
- Expertise in designing and implementing ETL processes using SSIS, Python, or PowerShell.
- Fabric/Power BI (full lifecycle of models/reports: design, test, deployment, performance optimization/monitoring).
- Familiarity with data modeling principles and techniques.
- Excellent understanding of data security and compliance regulations.
- Proficiency in Azure DevOps for continuous integration and deployment.
- Ability to work in a fast-paced, collaborative environment.
Regards, ValueLabs
Posted 8 hours ago
6.0 - 10.0 years
0 - 3 Lacs
Pune, Chennai, Bengaluru
Hybrid
- Experience with cloud database platforms (e.g. Azure SQL Database, Snowflake) or on-prem database platforms like MS SQL Server, with good SQL skills.
- Experience working with Azure Data Factory (ADF).
- Experience with data modelling.
- Experience working with a variety of stakeholders including product owners, delivery managers, and architects.
- Experience with performance tuning and relational database design, particularly in the development of business intelligence solutions.
- Experience with data migration strategies from on-premise to cloud.
- Experience with data warehouse concepts.
- Experience communicating and documenting technical design proposals.
- Experience with DataOps: automating the promotion and release of data engineering artefacts, automating testing, and pipeline optimisation.
- Experience with data masking policies, GDPR, auditing access, and securing sensitive data sets.
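One concrete form the data-masking requirement can take is SQL Server / Azure SQL dynamic data masking. The sketch below applies it from Python; the connection string, table, and role names are hypothetical:

```python
import pyodbc

# Placeholder connection details for an Azure SQL database.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=dw;"
    "UID=admin_user;PWD=<secret>"
)
cur = conn.cursor()

# Mask email addresses for non-privileged readers (aXX@XXXX.com style).
cur.execute("""
    ALTER TABLE dbo.Customers
    ALTER COLUMN Email ADD MASKED WITH (FUNCTION = 'email()');
""")

# Grant an analyst role read access without unmasking rights.
cur.execute("GRANT SELECT ON dbo.Customers TO analyst_role;")
conn.commit()
```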
Posted 8 hours ago
5.0 years
0 Lacs
Mumbai Metropolitan Region
On-site
The BI Data Engineer is a key role within the Enterprise Data team. We are looking for an expert Azure data engineer with deep data engineering, ADF integration, and database development experience. This is a unique opportunity to be involved in delivering leading-edge business analytics using the latest cutting-edge BI tools, such as cloud-based databases, self-service analytics, and leading visualisation tools, enabling the company’s aim to become a fully digital organisation.
Job Description:
Key Responsibilities:
- Build enterprise data engineering and integration solutions using the latest Azure platform: Azure Data Factory, Azure SQL Database, Azure Synapse, and Azure Fabric.
- Development of enterprise ETL and integration routines using ADF.
- Evaluate emerging data engineering technologies, standards, and capabilities.
- Partner with business stakeholders, product managers, and data scientists to understand business objectives and translate them into technical solutions.
- Work with DevOps, engineering, and operations teams to implement CI/CD pipelines and ensure smooth deployment of data engineering solutions.
Required Skills and Experience
Technical Expertise:
- Expertise in the Azure platform including Azure Data Factory, Azure SQL Database, Azure Synapse, and Azure Fabric.
- Exposure to Databricks and lakehouse architecture & technologies.
- Extensive knowledge of data modeling, ETL processes, and data warehouse design principles.
- Experience in machine learning and AI services in Azure.
Professional Experience:
- 5+ years of experience in database development using SQL.
- 5+ years of integration and data engineering experience.
- 5+ years of experience using Azure SQL DB, ADF, and Azure Synapse.
- 2+ years of experience using Power BI.
- Comprehensive understanding of data modelling.
- Relevant certifications in data engineering, machine learning, or AI.
Key Competencies:
- Expertise in data engineering and database development.
- Familiarity with the Microsoft Fabric technologies including OneLake, Lakehouse, and Data Factory.
- Strong understanding of data governance, compliance, and security frameworks.
- Proven ability to drive innovation in data strategy and cloud solutions.
- A deep understanding of business intelligence workflows and the ability to align technical solutions to them.
- Strong database design skills, including an understanding of both normalised-form and dimensional-form databases.
- In-depth knowledge and experience of data-warehousing strategies and techniques, e.g. Kimball data warehousing.
- Experience in cloud-based data integration tools like Azure Data Factory.
- Experience in Azure DevOps or JIRA is a plus.
- Experience working with finance data is highly desirable.
- Familiarity with agile development techniques and objectives.
Location: DGS India - Mumbai - Thane Ashar IT Park
Brand: Dentsu
Time Type: Full time
Contract Type: Permanent
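To make the Kimball dimensional-modelling expectation above concrete, here is a minimal star-schema sketch (one additive fact table keyed to two dimensions); all names are illustrative, not a prescribed model:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_date (
        date_key     INTEGER PRIMARY KEY,   -- e.g. 20250630
        full_date    TEXT,
        fiscal_year  INTEGER
    );
    CREATE TABLE dim_account (
        account_key  INTEGER PRIMARY KEY,
        account_code TEXT,
        account_name TEXT
    );
    -- Fact table: one row per account per day, additive measure only.
    CREATE TABLE fact_gl_balance (
        date_key     INTEGER REFERENCES dim_date(date_key),
        account_key  INTEGER REFERENCES dim_account(account_key),
        balance      REAL
    );
""")
conn.close()
```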
Posted 9 hours ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Oracle ADF
Good-to-have skills: Python (Programming Language), Node.js, React.js
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years full-time education
Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with team members to understand project specifications, developing application features, and ensuring that the applications are aligned with business needs. You will also engage in testing and troubleshooting to enhance application performance and user experience, while continuously seeking opportunities for improvement and innovation in application development processes.
Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Analyze user requirements and translate them into technical specifications.
Professional & Technical Skills:
- Must-Have Skills: Proficiency in Oracle ADF.
- Good-To-Have Skills: Experience with Python (Programming Language), Node.js, React.js.
- Strong understanding of application development methodologies.
- Experience with database management and SQL.
- Familiarity with web services and RESTful APIs.
Additional Information:
- The candidate should have a minimum of 5 years of experience in Oracle ADF.
- This position is based in Pune.
- 15 years of full-time education is required.
Posted 10 hours ago
0 years
0 Lacs
India
On-site
About NTT DATA
NTT DATA is a $30+ billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. We invest over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure, and connectivity. We are also one of the leading providers of digital and AI infrastructure in the world. NTT DATA is part of NTT Group and headquartered in Tokyo. Visit us at nttdata.com.
Job Description for Data Engineer
Experience with the following key deployment automations is required:
- 5+ years of experience in Azure Data Factory (ADF)
- Azure SQL
- PySpark
- Delta Lake (Databricks)
- Databricks deployment pipelines
- Automation of ADF deployments
- Automation of database deployments (Azure SQL, Delta Lake)
- Automation of Databricks deployments
- Deployment of Python, Django, and React-based microservices to Azure services such as Function Apps, Container Apps, Web Apps, and Azure Kubernetes Service (AKS)
Job Mode: Hybrid (2 days a week)
Job Location: Any NTT DATA office
Note: Preference is for candidates who can join within a 30-day time frame.
#NTTData #LI-NorthAmerica
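One building block of the Databricks deployment automation listed above is the Workspace API. The sketch below imports a notebook via REST; the host, token, and paths are placeholders:

```python
import base64
import requests

HOST = "https://adb-1234567890123456.7.azuredatabricks.net"  # placeholder
TOKEN = "<databricks-pat-or-aad-token>"  # placeholder

# Read a local notebook source file and base64-encode it for the API.
with open("notebooks/etl_orders.py", "rb") as f:
    content = base64.b64encode(f.read()).decode()

# Import (or overwrite) the notebook in the production workspace folder.
resp = requests.post(
    f"{HOST}/api/2.0/workspace/import",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={
        "path": "/Production/etl_orders",
        "format": "SOURCE",
        "language": "PYTHON",
        "content": content,
        "overwrite": True,
    },
    timeout=30,
)
resp.raise_for_status()
```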
Posted 13 hours ago
6.0 - 10.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About Company: Our client is a leading Indian multinational IT services and consulting firm. It provides digital transformation, cloud computing, data analytics, enterprise application integration, infrastructure management, and application development services. The company caters to over 700 clients across industries such as banking and financial services, manufacturing, technology, media, retail, and travel & hospitality. Its industry-specific solutions are designed to address complex business challenges by combining domain expertise with deep technical capabilities, backed by a global workforce of over 80,000 professionals and a presence in more than 50 countries.
Job Title: Python Developer
Locations: PAN India
Experience: 6-10 years (relevant)
Employment Type: Contract to Hire
Work Mode: Work From Office
Notice Period: Immediate to 15 days
Job Description:
- Design, implement, and manage cloud-based applications using Azure services.
- Develop infrastructure as code to automate the provisioning and management of cloud resources.
- Host websites and Python-based APIs on Azure Web Apps.
- Write and maintain scripts for automation and deployment.
- Primary skills in Python and Azure application management.
- Expertise in Azure Data Factory.
- Familiarity with Snowflake data warehousing.
- Performance optimization of data workflows.
- Collect, aggregate, and manage data from various sources including APIs, S3 buckets, Excel files, CSV files, Blob storage, and SharePoint. Flatten and transform JSON data and model it appropriately for downstream processes.
Data Transformation and Processing: Utilize tools and technologies to perform data transformations and ensure data quality. Develop and maintain data pipelines and ETL processes to move data from source to target systems.
Data Flow Development: Design and implement data flows in Azure Data Factory (ADF) to support data transformations. Collaborate with other teams to define data transformation requirements and ensure successful data flow execution.
Scripting and Automation: Write and optimize SQL queries and stored procedures for data extraction, transformation, and loading. Develop Python scripts for data manipulation, processing, and integration tasks.
Data Warehouse Management: Work with Snowflake and other data warehousing tools to design and maintain data models and schemas. Ensure data availability, integrity, and security in the data warehouse environment.
Performance Optimization: Monitor and optimize the performance of data pipelines and data integration processes. Identify and resolve performance bottlenecks in data processing workflows.
Documentation and Reporting: Document data integration processes, workflows, and best practices. Generate reports and provide insights based on data analysis to support business decision-making.
Collaboration and Communication: Collaborate with cross-functional teams to understand data requirements and provide data-related support. Communicate effectively with stakeholders to ensure data solutions meet business needs.
Skill sets required: Data integration from multiple sources. Primary: Python, Azure application management, Snowflake, SQL.
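The JSON-flattening task mentioned above is commonly done with pandas. An illustrative sketch, with toy data standing in for an API or blob payload:

```python
import pandas as pd

payload = [
    {"order_id": 1, "customer": {"id": 10, "city": "Pune"},
     "lines": [{"sku": "A1", "qty": 2}, {"sku": "B2", "qty": 1}]},
    {"order_id": 2, "customer": {"id": 11, "city": "Noida"},
     "lines": [{"sku": "A1", "qty": 5}]},
]

# One row per order line, with order- and customer-level fields repeated,
# yielding a tabular shape suitable for ADF/Snowflake ingestion.
flat = pd.json_normalize(
    payload,
    record_path="lines",
    meta=["order_id", ["customer", "id"], ["customer", "city"]],
)
print(flat)
#   sku  qty  order_id  customer.id customer.city
# 0  A1    2         1           10          Pune
# 1  B2    1         1           10          Pune
# 2  A1    5         2           11         Noida
```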
Posted 15 hours ago
3.0 years
0 Lacs
India
On-site
This is a hands-on Data Platform Engineering role with a strong focus on consultative data engagements across business and technical teams.
Responsibilities
- Design and implement resilient data pipelines for batch and real-time processing.
- Work closely with product managers, engineers, analysts, and data scientists to deliver scalable data platforms.
- Provide guidance on architecture, infrastructure, and implementation best practices.
- Collaborate with architects and developers to define data structures, pipelines, and orchestration strategies.
- Ensure data privacy, processing, modeling, analytics, AI integration, and API connectivity.
- Embrace Agile principles for project execution.
- Develop frameworks to solve data challenges at scale.
Technical Skills Required
- 3+ years in data engineering with experience in lakehouse implementations (Databricks, Snowflake, or Synapse).
- Hands-on with the Azure data stack (Databricks, Synapse, ADF) and supporting services (Key Vault, Storage, Firewall).
- Proficient in SQL, Python, and Spark.
- Familiar with tools like JIRA, Git, Jenkins, TFS, Shell, PowerShell, Bitbucket.
- Experience in Agile environments; familiarity with DBT and Power BI is a plus.
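As a hedged sketch of the "real-time" half of the pipeline work above, here is a minimal Spark Structured Streaming job that reads JSON files as they arrive and appends them to a Delta table; the mount paths assume a Databricks-style workspace:

```python
from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.getOrCreate()

# Streaming file sources require an explicit schema.
schema = StructType([
    StructField("event_id", StringType()),
    StructField("value", DoubleType()),
])

stream = (spark.readStream
          .schema(schema)
          .json("/mnt/landing/events/"))      # hypothetical landing path

query = (stream.writeStream
         .format("delta")
         .option("checkpointLocation", "/mnt/checkpoints/events/")
         .outputMode("append")
         .start("/mnt/curated/events_delta/"))  # hypothetical target
```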
Posted 23 hours ago
10.0 years
0 Lacs
Greater Kolkata Area
On-site
About Lexmark: Founded in 1991 and headquartered in Lexington, Kentucky, Lexmark is recognized as a global leader in print hardware, service, software solutions and security by many of the technology industry’s leading market analyst firms. Lexmark creates cloud-enabled imaging and IoT solutions that help customers in more than 170 countries worldwide quickly realize business outcomes. Lexmark’s digital transformation objectives accelerate business transformation, turning information into insights, data into decisions, and analytics into action.
Lexmark India, located in Kolkata, is one of the research and development centers of Lexmark International Inc. The India team works on cutting-edge technologies and domains like cloud, AI/ML, data science, IoT, and cyber security, creating innovative solutions for our customers and helping them minimize the cost and IT burden of providing a secure, reliable, and productive print and imaging environment. At our core, we are a technology company, deeply committed to building our own R&D capabilities and leveraging emerging technologies and partnerships to bring together a library of intellectual property that can add value to our customers' business. Caring for our communities and creating growth opportunities by investing in talent are woven into our culture. It’s how we care, grow, and win together.
Job Description/Responsibilities: We are looking for a highly skilled and strategic Data Architect with deep expertise in the Azure data ecosystem. This role requires a strong command of Azure Databricks, Azure Data Lake, Azure Data Factory, data warehouse design, SQL optimization, and AI/ML integration. The Data Architect will design and oversee robust, scalable, and secure data architectures to support advanced analytics and machine learning workloads.
Qualification: BE/ME/MCA with 10+ years of IT experience.
Must-Have Skills/Skill Requirements:
- Define and drive the overall Azure-based data architecture strategy aligned with enterprise goals.
- Architect and implement scalable data pipelines, data lakes, and data warehouses using Azure Data Lake, ADF, and Azure SQL/Synapse.
- Provide technical leadership on Azure Databricks (Spark, Delta Lake, Notebooks, MLflow, etc.) for large-scale data processing and advanced analytics use cases.
- Integrate AI/ML models into data pipelines and support the end-to-end ML lifecycle (training, deployment, monitoring).
- Collaborate with cross-functional teams including data scientists, DevOps engineers, and business analysts.
- Evaluate and recommend tools, platforms, and design patterns for data and ML infrastructure.
- Mentor data engineers and junior architects on best practices and architectural standards.
- Strong experience with data modeling, ETL/ELT frameworks, and data warehousing concepts.
- Proficient in SQL, Python, PySpark.
- Solid understanding of AI/ML workflows and tools.
- Exposure to Azure DevOps.
- Excellent communication and stakeholder management skills.
How to Apply: Are you an innovator? Here is your chance to make your mark with a global technology leader. Apply now!
Global Privacy Notice: Lexmark is committed to appropriately protecting and managing any personal information you share with us. Click here to view Lexmark's Privacy Notice.
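The ML-lifecycle support this role mentions (MLflow ships with Databricks) often starts with run tracking. A minimal sketch with a toy dataset and model, illustrative only:

```python
import mlflow
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

# Toy data standing in for a real training set.
X, y = make_classification(n_samples=200, n_features=5, random_state=42)

with mlflow.start_run(run_name="toy-classifier"):
    model = LogisticRegression(max_iter=200).fit(X, y)
    mlflow.log_param("max_iter", 200)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    # Store the model as a versioned artifact for later deployment.
    mlflow.sklearn.log_model(model, "model")
```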
Posted 1 day ago
3.0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: FS X-Sector
Specialism: Data, Analytics & AI
Management Level: Associate
Job Description & Summary: At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities:
Role: Senior Associate
Experience: 3-6 years
Location: Kolkata
Technical Skills:
- Strong expertise in Azure Databricks, Azure Data Factory (ADF), PySpark, SQL Server, and Python.
- Solid understanding of Azure Functions and their application in data processing workflows.
- Understanding of DevOps practices and CI/CD pipelines for data solutions.
- Experience with other ETL tools such as Informatica Intelligent Cloud Services (IICS) is a plus.
- Strong problem-solving skills and the ability to work independently and collaboratively in a fast-paced environment.
- Excellent communication skills to effectively convey technical concepts to non-technical stakeholders.
Key Responsibilities:
- Develop, maintain, and optimize scalable data pipelines using Azure Databricks, Azure Data Factory (ADF), and PySpark.
- Collaborate with data architects and business stakeholders to translate requirements into technical solutions.
- Implement and manage data integration processes using SQL Server and Python.
- Design and deploy Azure Functions to support data processing workflows.
- Monitor and troubleshoot data pipeline performance and reliability issues.
- Ensure data quality, security, and compliance with industry standards and best practices.
- Document technical specifications and maintain clear and concise project documentation.
Mandatory skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark.
Preferred skill sets: Azure Databricks, Azure Data Factory (ADF), and PySpark.
Years of experience required: 3-6 years
Education qualification: B.E./B.Tech/M.E./M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering, Master of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Microsoft Azure
Optional Skills: Accepting Feedback, Active Listening, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis, Intellectual Curiosity, Java (Programming Language), Market Development {+ 11 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements: Not Specified
Available for Work Visa Sponsorship? No
Government Clearance Required? No
Job Posting End Date
Posted 1 day ago
0 years
0 Lacs
Greater Kolkata Area
On-site
Line of Service: Advisory
Industry/Sector: Not Applicable
Specialism: Microsoft
Management Level: Senior Associate
Job Description & Summary: At PwC, our people in software and product innovation focus on developing cutting-edge software solutions and driving product innovation to meet the evolving needs of clients. These individuals combine technical experience with creative thinking to deliver innovative software products and solutions. Those in software engineering at PwC will focus on developing innovative software solutions to drive digital transformation and enhance business performance. In this field, you will use your knowledge to design, code, and test cutting-edge applications that revolutionise industries and deliver exceptional user experiences.
Why PwC: At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Job Description & Summary: We are seeking a Data Engineer to design, develop, and maintain data ingestion processes to a data platform built using Microsoft technologies, ensuring data quality and integrity. The role involves collaborating with data architects and business analysts to implement solutions using tools like ADF and Azure Databricks, and requires strong SQL skills.
Responsibilities: Key responsibilities include developing, testing, and optimizing ETL workflows and maintaining documentation. ETL development experience in the Microsoft data track is required. Work with the business team to translate business requirements into technical requirements. Demonstrated expertise in Agile methodologies, including Scrum, Kanban, or SAFe.
Mandatory skill sets:
- Strong proficiency in Azure Databricks, including Spark and Delta Lake.
- Experience with Azure Data Factory, Azure Data Lake Storage, and Azure SQL Database.
- Proficiency in data integration and ETL processes and T-SQL.
- Experience working in Python for data engineering.
- Experience working with Postgres databases.
- Experience working with graph databases.
- Experience in architecture design and data modelling.
Good-to-have skill sets:
- Unity Catalog / Purview
- Familiarity with Fabric/Snowflake service offerings
- Visualization tool: Power BI
Preferred skill sets: Hands-on knowledge of Python, PySpark, and strong SQL knowledge. ETL and data warehousing are a must.
Relevant certifications (any one, e.g., Databricks Data Engineer Associate, Microsoft Certified: Azure Data Engineer Associate, Azure Solution Architect) are mandatory.
Years of experience required: 5+ years
Education qualification: Bachelor's degree in Computer Science, IT, or a related field.
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills: Data Engineering
Optional Skills: Acceptance Test Driven Development (ATDD), Accepting Feedback, Active Listening, Analytical Thinking, Android, API Management, Appian (Platform), Application Development, Application Frameworks, Application Lifecycle Management, Application Software, Business Process Improvement, Business Process Management (BPM), Business Requirements Analysis, C#.NET, C++ Programming Language, Client Management, Code Review, Coding Standards, Communication, Computer Engineering, Computer Science, Continuous Integration/Continuous Delivery (CI/CD), Creativity {+ 46 more}
Desired Languages (if blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
Posted 1 day ago
5.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
Job Family: Data Science & Analysis (India)
Travel Required: None
Clearance Required: None
What You Will Do
- Design, develop, and maintain robust, scalable, and efficient data pipelines and ETL/ELT processes.
- Lead and execute data engineering projects from inception to completion, ensuring timely delivery and high quality.
- Build and optimize data architectures for operational and analytical purposes.
- Collaborate with cross-functional teams to gather and define data requirements.
- Implement data quality, data governance, and data security practices.
- Manage and optimize cloud-based data platforms (Azure/AWS).
- Develop and maintain Python/PySpark libraries for data ingestion, processing, and integration with both internal and external data sources.
- Design and optimize scalable data pipelines using Azure Data Factory and Spark (Databricks).
- Work with stakeholders, including the Executive, Product, Data, and Design teams, to assist with data-related technical issues and support their data infrastructure needs.
- Develop frameworks for data ingestion, transformation, and validation.
- Mentor junior data engineers and guide best practices in data engineering.
- Evaluate and integrate new technologies and tools to improve data infrastructure.
- Ensure compliance with data privacy regulations (HIPAA, etc.).
- Monitor performance and troubleshoot issues across the data ecosystem.
- Automate deployment of data pipelines using GitHub Actions / Azure DevOps.
What You Will Need
- Bachelor's or master's degree in Computer Science, Information Systems, Statistics, Math, Engineering, or a related discipline.
- Minimum 5+ years of solid hands-on experience in data engineering and cloud services.
- Extensive working experience with advanced SQL and a deep understanding of SQL.
- Good experience in Azure Data Factory (ADF), Databricks, Python, and PySpark.
- Good experience in modern data storage concepts: data lake, lakehouse.
- Experience in other cloud services (AWS) and data processing technologies is an added advantage.
- Ability to enhance, develop, and resolve defects in ETL processes using cloud services.
- Experience handling large volumes (multiple terabytes) of incoming data from clients and third-party sources in various formats such as text, CSV, EDI X12 files, and Access databases.
- Experience with software development methodologies (Agile, Waterfall) and version control tools.
- Highly motivated, strong problem solver, self-starter, and fast learner with demonstrated analytic and quantitative skills.
- Good communication skills.
What Would Be Nice To Have
- AWS ETL platform: Glue, S3
- One or more programming languages such as Java, .NET
- Experience in the US health care domain and insurance claim processing.
What We Offer
Guidehouse offers a comprehensive, total rewards package that includes competitive compensation and a flexible benefits package that reflects our commitment to creating a diverse and supportive workplace.
About Guidehouse
Guidehouse is an Equal Opportunity Employer - Protected Veterans, Individuals with Disabilities or any other basis protected by law, ordinance, or regulation. Guidehouse will consider for employment qualified applicants with criminal histories in a manner consistent with the requirements of applicable law or ordinance, including the Fair Chance Ordinance of Los Angeles and San Francisco.
If you have visited our website for information about employment opportunities, or to apply for a position, and you require an accommodation, please contact Guidehouse Recruiting at 1-571-633-1711 or via email at RecruitingAccommodation@guidehouse.com. All information you provide will be kept confidential and will be used only to the extent required to provide needed reasonable accommodation. All communication regarding recruitment for a Guidehouse position will be sent from Guidehouse email domains including @guidehouse.com or guidehouse@myworkday.com. Correspondence received by an applicant from any other domain should be considered unauthorized and will not be honored by Guidehouse. Note that Guidehouse will never charge a fee or require a money transfer at any stage of the recruitment process and does not collect fees from educational institutions for participation in a recruitment event. Never provide your banking information to a third party purporting to need that information to proceed in the hiring process. If any person or organization demands money related to a job opportunity with Guidehouse, please report the matter to Guidehouse’s Ethics Hotline. If you want to check the validity of correspondence you have received, please contact recruiting@guidehouse.com. Guidehouse is not responsible for losses incurred (monetary or otherwise) from an applicant’s dealings with unauthorized third parties. Guidehouse does not accept unsolicited resumes through or from search firms or staffing agencies. All unsolicited resumes will be considered the property of Guidehouse and Guidehouse will not be obligated to pay a placement fee.
Posted 1 day ago
3.0 years
0 Lacs
India
On-site
- Extensive technical experience writing complex SQL queries in SQL Server.
- 3 years of experience in IICS (Informatica Intelligent Cloud Services) and Snowflake, with strong SQL skills.
- Experience working with Azure technologies such as Dataverse, ADF, ADLS, Power Platform, and Synapse.
- Strong experience in all phases of the project life cycle, from requirements gathering to implementation.
- Expertise in extracting and loading data from source/target systems such as Snowflake, Oracle SQL, flat files, and XML sources.
- Experienced in using transformations such as Expression, Router, Filter, Lookup, Hierarchy Builder, Hierarchy Parser, Business Service, Update Strategy, Union, Joiner, and Aggregator.
- Strong technical experience building data integration processes by constructing mappings, mapping tasks, task flows, schedules, and parameter files.
- Experienced in developing data integration mappings using REST APIs and configuring Swagger files and REST connections.
- Experienced with performance optimization, error handling, debugging, and monitoring.
- Experienced in writing complex SQL queries, with knowledge of SQL analytical functions (see the short example after this posting).
Posted 1 day ago
3.0 - 6.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Description
Hiring Locations: Chennai, Trivandrum, Kochi
Experience Range: 3 to 6 years

The L1 Data Ops Analyst / Data Pipeline Developer is responsible for developing, testing, and maintaining robust data pipelines and monitoring operational dashboards to ensure smooth data flow. This role demands proficiency in data engineering tools, SQL, and cloud platforms, with the ability to work independently and in 24x7 shift environments. The candidate should be capable of analyzing data, troubleshooting issues using SOPs, and collaborating effectively across support levels.

Key Responsibilities
Development & Engineering:
- Design, code, test, and implement scalable and efficient data pipelines.
- Develop features in accordance with requirements and low-level design.
- Write optimized, clean code using Python, PySpark, SQL, and ETL tools.
- Conduct unit testing and validate data integrity (a minimal validation sketch follows this posting).
- Maintain comprehensive documentation of work.

Monitoring & Support:
- Monitor dashboards, pipelines, and databases across assigned shifts.
- Identify, escalate, and resolve anomalies using defined SOPs.
- Collaborate with L2/L3 teams to ensure timely issue resolution.
- Analyze trends and anomalies using SQL and Excel.

Process Adherence & Contribution:
- Follow configuration and release management processes.
- Participate in estimation, knowledge sharing, and defect management.
- Adhere to SLA and compliance standards.
- Contribute to internal documentation and knowledge bases.

Mandatory Skills
- Strong command of SQL for data querying and analysis.
- Proficiency in Python or PySpark for data manipulation.
- Experience in ETL tools (any of the following): Informatica, Talend, Apache Airflow, AWS Glue, Azure ADF, GCP Dataproc/Dataflow.
- Experience working with cloud platforms (AWS, Azure, or GCP).
- Hands-on experience with data validation and performance tuning.
- Working knowledge of data schemas and data modeling.

Good To Have Skills
- Certification in Azure, AWS, or GCP (foundational or associate level).
- Familiarity with monitoring tools and dashboard platforms.
- Understanding of data warehouse concepts.
- Exposure to BigQuery, ADLS, or similar services.

Soft Skills
- Excellent written and verbal communication in English.
- Strong attention to detail and analytical skills.
- Ability to work in a 24x7 shift model, including night shifts.
- Ability to follow SOPs precisely and escalate issues appropriately.
- Self-motivated, requiring minimal supervision.
- Team player with good interpersonal skills.

Outcomes Expected
- Timely and error-free code delivery.
- Consistent adherence to engineering processes and release cycles.
- Documented and trackable issue handling with minimal escalations.
- Certification and training compliance.
- High availability and uptime of monitored pipelines and dashboards.

Skills: SQL, Data Analysis, MS Excel, Dashboards
Posted 1 day ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.

Your Key Responsibilities
As a Senior AI/ML Engineer you will:
- Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights and build comprehensive Risk & Controls monitoring mechanisms.
- Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables.
- Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data led Risk technology platform.

Skills and Summary of Accountabilities
- Develop robust AI/ML models to drive valuable insights and enhance decision-making across multiple business lines.
- Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and Vector Databases, comprehensive DevOps services, and full-stack application development.
- Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
- Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
- Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
- Apply advanced machine learning techniques (such as SVM, Decision Trees, Neural Networks, Random Forests, and Gradient Boosting) to complex problems.
- Ensure adherence to ethical AI guidelines and data governance policies.
- Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
- Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services.
- Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring the organization remains at the forefront of innovation and maintains a competitive edge in the industry.
- Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
- Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution which can then be delivered.
- Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives.
- Willingness to learn and apply new, cutting-edge AI technologies to deliver insight-driven business solutions.

To qualify for the role you must have
- 4+ years of working experience in large-scale AI/ML models and data science.
- Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
- Proficiency in AI/ML programming languages like Python, R, and SQL.
- Proficiency with a deep learning framework such as TensorFlow, PyTorch, or Keras.
- Expertise in machine learning algorithms and data mining techniques (such as SVM, Decision Trees, and deep learning algorithms).
- Ability to implement monitoring and logging tools to ensure AI model performance and reliability.
- Strong programming skills in Python, including machine learning libraries such as scikit-learn, Pandas, and NumPy (a minimal model-training sketch follows this posting).
- Ability to utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
- Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML/AI services, and ADF.
- Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps, and SharePoint.
- Strong understanding of large enterprise applications such as SAP, Oracle ERP, and Microsoft Dynamics.

Ideally, you'll also have
- A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or a related discipline.
- Relevant certifications (considered a plus).
- A self-driven and creative approach to problem-solving, enjoying the fast-paced world of software development and performing well in a team.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
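As a minimal sketch of the scikit-learn workflow this posting names, the following trains and evaluates a gradient-boosting classifier. The synthetic dataset is a stand-in for real risk/controls features, which are not specified here.

# Minimal sketch, assuming scikit-learn is available; synthetic data stands
# in for the (unspecified) real risk-monitoring features.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import roc_auc_score

# Synthetic binary-classification data as a stand-in for a risk task.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

# Gradient boosting is one of the techniques the posting names explicitly.
model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

# Evaluate with AUC, a common metric for imbalanced risk problems.
print("AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))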
Posted 1 day ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Role: Data Engineer (Full time / Contract)
Experience: 2 to 6 years
Mode of Work: WFO only
Location: Chennai

Job Description
Key Skills: SQL, ETL tools, ADF, ADB, SSIS, reporting tools

Key Requirements
The day-to-day development activities will need knowledge of the concepts below.
- Expert-level knowledge of RDBMS (SQL Server), with a clear understanding of SQL query writing, object creation and management, and performance and optimisation of DB/DWH operations.
- Good understanding of transactional and dimensional data modelling: star schema, facts/dimensions, relationships (a small star-schema query sketch follows this posting).
- Good understanding of ETL concepts and exposure to tools such as Azure Data Factory, Azure Databricks, and Airflow.
- In-depth expertise in Azure Data Factory and Databricks, including building scalable data pipelines, orchestrating complex workflows, implementing dynamic and parameterized pipelines, and optimizing Spark-based data transformations for large-scale integrations.
- Hands-on experience with Databricks Unity Catalog for centralized data governance, fine-grained access control, auditing, and managing data assets securely across multiple workspaces.
- Should have worked on at least one full development lifecycle of an end-to-end ETL project (involving the ETL tools mentioned above).
- Ability to write and review test cases, write test code, and validate code.
- Good understanding of SDLC practices such as source control, version management, use of Azure DevOps, and CI/CD practices.

Project Context
- Should have the skill to fully understand the context and use-case of a project and have a personal vision for it, playing the role of interfacing with the customer directly on a daily basis.
- Should be able to converse with functional users and convert requirements into tangible processes/models and documentation using available templates.
- Should be able to provide the customer with consultative options on the best way to execute projects.
- Should have a good understanding of project dynamics: scoping, setting estimates, setting timelines, working around timelines in case of exceptions, etc.

Preferred Skills
- Knowledge of Python is a bonus.
- Knowledge of SSIS is a bonus.
- Knowledge of Azure DevOps and source control/repos is good to have.
Posted 1 day ago
3.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Azure Data Engineer – Senior
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions such as Banking, Insurance, Manufacturing and Auto, Healthcare, Retail, Supply Chain, and Finance.

The opportunity
We’re looking for candidates with strong technology and data understanding in the big data engineering space and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
- Develop and deploy big data pipelines in a cloud environment using Azure cloud services.
- ETL design, development, and migration of existing on-prem ETL routines to cloud services (a minimal pipeline-trigger sketch follows this posting).
- Interact with senior leaders, understand their business goals, and contribute to the delivery of the workstreams.
- Design and optimize model codes for faster execution.

Skills and Attributes for Success
- Overall 3+ years of IT experience, with 2+ years of relevant experience in Azure Data Factory (ADF) and good hands-on exposure to the latest ADF version.
- Hands-on experience with Azure Functions and Azure Synapse (formerly SQL Data Warehouse).
- Project experience with Azure Data Lake / Blob (for storage).
- Basic understanding of Batch account configuration and its various control options.
- Sound knowledge of Databricks and Logic Apps.
- Able to coordinate independently with business stakeholders, understand the business requirements, and implement them using ADF.

To qualify for the role, you must
- Be a computer science graduate or equivalent with 3-7 years of industry experience.
- Have working experience in an Agile-based delivery methodology (preferable).
- Have a flexible and proactive/self-motivated working style with strong personal ownership of problem resolution.
- Be an excellent communicator (written and verbal, formal and informal).
- Participate in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support.

Ideally, you’ll also have
- Client management skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange.
Plus, we offer:
- Support, coaching and feedback from some of the most engaging colleagues around
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
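For the ADF pipeline work this posting centers on, here is a minimal sketch of triggering a pipeline run from Python. It assumes the azure-identity and azure-mgmt-datafactory packages; every name below (subscription, resource group, factory, pipeline, parameter) is a placeholder, not a real resource.

# Minimal sketch of triggering an ADF pipeline run via the Azure Python SDK;
# all resource names are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

credential = DefaultAzureCredential()
client = DataFactoryManagementClient(credential, "<subscription-id>")

# Kick off a parameterized pipeline run and capture its run id.
run = client.pipelines.create_run(
    resource_group_name="<resource-group>",
    factory_name="<data-factory>",
    pipeline_name="copy_onprem_to_lake",   # hypothetical migrated ETL routine
    parameters={"load_date": "2025-01-01"},
)

# Poll the run status (Queued / InProgress / Succeeded / Failed).
status = client.pipeline_runs.get("<resource-group>", "<data-factory>", run.run_id)
print(status.status)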
Posted 1 day ago
4.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.

Your Key Responsibilities
As a Senior AI/ML Engineer you will:
- Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights and build comprehensive Risk & Controls monitoring mechanisms.
- Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables.
- Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data led Risk technology platform.

Skills and Summary of Accountabilities
- Develop robust AI/ML models to drive valuable insights and enhance decision-making across multiple business lines.
- Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and Vector Databases, comprehensive DevOps services, and full-stack application development.
- Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
- Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
- Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
- Apply advanced machine learning techniques (such as SVM, Decision Trees, Neural Networks, Random Forests, and Gradient Boosting) to complex problems.
- Ensure adherence to ethical AI guidelines and data governance policies.
- Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
- Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services.
- Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring the organization remains at the forefront of innovation and maintains a competitive edge in the industry.
- Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
- Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution which can then be delivered.
- Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives.
- Willingness to learn and apply new, cutting-edge AI technologies to deliver insight-driven business solutions.

To qualify for the role you must have
- 4+ years of working experience in large-scale AI/ML models and data science.
- Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
- Proficiency in AI/ML programming languages like Python, R, and SQL.
- Proficiency with a deep learning framework such as TensorFlow, PyTorch, or Keras.
- Expertise in machine learning algorithms and data mining techniques (such as SVM, Decision Trees, and deep learning algorithms).
- Ability to implement monitoring and logging tools to ensure AI model performance and reliability.
- Strong programming skills in Python, including machine learning libraries such as scikit-learn, Pandas, and NumPy (see the model-training sketch under the identical Chennai listing above).
- Ability to utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
- Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML/AI services, and ADF.
- Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps, and SharePoint.
- Strong understanding of large enterprise applications such as SAP, Oracle ERP, and Microsoft Dynamics.

Ideally, you'll also have
- A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or a related discipline.
- Relevant certifications (considered a plus).
- A self-driven and creative approach to problem-solving, enjoying the fast-paced world of software development and performing well in a team.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 day ago
4.0 - 8.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Greetings,
TCS is conducting an in-person interview drive in Kolkata on 28-Jun-25.

Job Role: Data Architect
Experience: 4-8 years
Location: Kolkata

Job Description
- Languages: Java, Python, Scala
- AWS: S3, EMR, Glue, Redshift, Athena, Lambda
- Azure: Blob, ADLS, ADF, Synapse, Power BI
- Google Cloud: BigQuery, Dataproc, Looker
- Snowflake
- Databricks
- CDH: Hive, Spark, HDFS, Kafka, etc.
- ETL: Informatica, DBT, Matillion
Posted 1 day ago
The job market for ADF (Application Development Framework) professionals in India is witnessing significant growth, with numerous opportunities available for job seekers in this field. ADF is a popular framework used for building enterprise applications, and companies across various industries are actively looking for skilled professionals to join their teams.
Here are 5 major cities in India where there is a high demand for ADF professionals: - Bangalore - Hyderabad - Pune - Chennai - Mumbai
The estimated salary range for ADF professionals in India varies based on experience levels: - Entry-level: INR 4-6 lakhs per annum - Mid-level: INR 8-12 lakhs per annum - Experienced: INR 15-20 lakhs per annum
In the ADF job market in India, a typical career path may include roles such as Junior Developer, Senior Developer, Technical Lead, and Architect. As professionals gain more experience and expertise in ADF, they can progress to higher-level positions with greater responsibilities.
In addition to ADF expertise, professionals in this field are often expected to have knowledge of related technologies such as Java, Oracle Database, SQL, JavaScript, and web development frameworks like Angular or React.
Here are sample interview questions for ADF roles, categorized by difficulty level:
- Basic:
  - What is ADF, and what are its key features?
  - What is the difference between ADF Faces and ADF Task Flows?
- Medium:
  - Explain the lifecycle of an ADF application.
  - How do you handle exceptions in ADF applications?
- Advanced:
  - Discuss the advantages of using ADF Business Components.
  - How would you optimize performance in an ADF application?
As you explore job opportunities in the ADF market in India, make sure to enhance your skills, prepare thoroughly for interviews, and showcase your expertise confidently. With the right preparation and mindset, you can excel in your ADF career and secure rewarding opportunities in the industry. Good luck!