
1487 ADF Jobs - Page 15

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0 years

5 - 9 Lacs

Bengaluru

On-site

Req ID: 330864. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Senior DevOps Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Duties:
- DevOps experience establishing and managing CI/CD pipelines to automate the build, test, and deployment processes.
- Experience provisioning and managing infrastructure resources in the cloud using tools like Terraform.
- Experience with Azure Databricks, Azure DevOps tools, Terraform / Azure Resource Manager, and containerization and orchestration with Docker and Kubernetes.
- Version control experience: Git or Azure Repos.
- Scripting automation: Azure CLI / PowerShell.

Must have: Proficiency in cloud technologies: Azure, Azure Databricks, ADF, CI/CD pipelines, Terraform, HashiCorp Vault, GitHub, Git.
Preferred: Containerization and orchestration with Docker and Kubernetes; IAM, RBAC, OAuth, change management, SSL certificates; knowledge of security best practices and compliance frameworks like GDPR or HIPAA.
Minimum Skills Required: the same DevOps, cloud, and security skills listed above under Job Duties, Must have, and Preferred. (A short pipeline-automation sketch in Python follows this listing.)

About NTT DATA
NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.
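
For context on the kind of CI/CD automation this role describes, here is a minimal sketch of triggering and checking an Azure Data Factory pipeline run from Python with the azure-mgmt-datafactory SDK; the subscription, resource group, factory, and pipeline names are placeholders, not details from this posting.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers -- substitute your own environment's values.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "rg-data-platform"      # hypothetical resource group
FACTORY_NAME = "adf-example"             # hypothetical data factory
PIPELINE_NAME = "pl_ingest_daily"        # hypothetical pipeline

# DefaultAzureCredential works with az login, managed identity, or env vars,
# which is convenient inside a CI/CD agent.
client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a pipeline run, passing runtime parameters.
run = client.pipelines.create_run(
    RESOURCE_GROUP,
    FACTORY_NAME,
    PIPELINE_NAME,
    parameters={"load_date": "2024-01-01"},
)
print(f"Started pipeline run: {run.run_id}")

# Poll the run status (a CI gate might fail the build on 'Failed').
status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
print(f"Current status: {status.status}")
```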

Posted 2 weeks ago

Apply

10.0 - 15.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Designation: Data Architect. Location: Pune. Experience: 10-15 years.

Job Description
Role & Responsibilities:
The architect should have experience in architecting large-scale analytics solutions using native services such as Azure Synapse, Data Lake, Data Factory, HDInsight, Databricks, Azure Cognitive Services, Azure ML, and Azure Event Hub. Assist with the creation of a robust, sustainable architecture that supports requirements and provides for expansion with secured access. Experience in building and running large data environments for BFSI clients. Work with customers, end users, technical architects, and application designers to define the data requirements and data structure for BI/analytics solutions. Design conceptual and logical models for the data lake, data warehouse, data mart, and semantic layer (data structure, storage, and integration). Lead the database analysis, design, and build effort. Communicate physical database designs to the lead data architect/database administrator. Evolve data models to meet new and changing business requirements. Work with business analysts to identify and understand requirements and source data systems.

Skills Required
Big Data Technologies: Expert in big data technologies on Azure/GCP.
ETL Platforms: Experience with ETL platforms like ADF, Glue, Ab Initio, Informatica, Talend, Airflow.
Data Visualization: Experience in data visualization tools like Tableau, Power BI, etc.
Data Engineering & Management: Experience in a data engineering, metadata management, database modeling and development role.
Streaming Data Handling: Strong experience in handling streaming data with Kafka. (A minimal streaming-ingestion sketch follows this listing.)
Data API Understanding: Understanding of data APIs and web services.
Data Security: Experience in data security, data archiving/backup, and encryption, and in defining the standard processes for the same.
DataOps/MLOps: Experience in setting up DataOps and MLOps.
Integration: Work with other architects to ensure that all components work together to meet objectives and performance goals as defined in the requirements.
Data Science Coordination: Coordinate with the data science teams to identify future data needs and requirements and create pipelines for them.

Soft Skills
Communication, leading the team, and taking ownership of and accountability for a successful engagement. Participate in quality management reviews. Manage customer expectations and business user interactions. Deliver key research (MVP, POC) with an efficient turnaround time to help make strong product decisions. Demonstrate understanding of and expertise in modern technologies, architecture, and design. Mentor the team to deliver modular, scalable, and high-performance code.
Innovation: Be a change agent on key innovation and research to keep the product and team at the cutting edge of technical and product innovation. (ref:hirist.tech)
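
As an illustration of the Kafka streaming skill mentioned above, here is a minimal PySpark Structured Streaming sketch that reads a topic and lands it in a Delta table; the broker, topic, and paths are hypothetical, and it assumes the spark-sql-kafka connector is on the classpath (Delta is available out of the box on Databricks).

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-ingest-sketch").getOrCreate()

# Hypothetical broker and topic names -- replace with real ones.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker-1:9092")
    .option("subscribe", "trade-events")
    .option("startingOffsets", "latest")
    .load()
)

# Kafka delivers key/value as binary; cast to string before parsing.
parsed = events.select(
    col("key").cast("string"),
    col("value").cast("string"),
    col("timestamp"),
)

# Stream into a Delta table; the checkpoint gives restartable, exactly-once sinks.
query = (
    parsed.writeStream.format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/trade-events")
    .outputMode("append")
    .start("/mnt/bronze/trade_events")
)
query.awaitTermination()
```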

Posted 2 weeks ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Primary skills:
Technology -> AWS -> DevOps
Technology -> Cloud Integration -> Azure Data Factory (ADF)
Technology -> Cloud Platform -> AWS Database
Technology -> Cloud Platform -> Azure DevOps -> Azure Pipelines
Technology -> DevOps -> Continuous Integration - Mainframe

A day in the life of an Infoscion
As part of the Infosys consulting team, your primary role would be to actively aid the consulting team in different phases of the project, including problem definition, effort estimation, diagnosis, solution generation, design, and deployment. You will explore alternatives to the recommended solutions based on research that includes literature surveys, information available in public domains, vendor evaluation information, etc., and build POCs. You will create requirement specifications from the business needs, define the to-be processes, and produce detailed functional designs based on requirements. You will support configuring solution requirements on the products; understand any issues, diagnose their root cause, seek clarifications, and then identify and shortlist solution alternatives. You will also contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Ability to work with clients to identify business challenges and contribute to client deliverables by refining, analyzing, and structuring relevant data
Awareness of the latest technologies and trends
Logical thinking and problem-solving skills, along with an ability to collaborate
Ability to assess current processes, identify improvement areas, and suggest technology solutions
Knowledge of one or two industry domains

Posted 2 weeks ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description
Customer Success Services (CSS)

Are you passionate about problem solving? If you are enthusiastic about learning cutting-edge technologies, are interested in innovation, and are customer-centric, we want you with us! Oracle is a technology leader that's changing how the world does business, and our Customer Success Services (CSS) team supports over 6,000 companies around the world. We're looking for an experienced and self-motivated Sr. / Sr. Principal Support Engineer - EBS Apps Developer. Join the team of highly skilled technical experts who build and maintain our clients' technical landscapes through tailored support services.

The EBS Oracle Applications developer is an experienced technical professional who understands business solutions, industry best practices, multiple business processes, and technology designs within the Oracle Applications supporting products and technologies. The candidate should have experience in the implementation or support of large to medium Oracle Applications implementation projects, and should be able to operate independently to deliver quality work products and to perform varied and complex duties and tasks that require independent judgment.

Your Opportunity
We are looking for flexible and open-minded experts, able to work with different technologies and address complex architectures across on-premises, cloud, or hybrid environments. We look for engineers who can learn quickly, who are willing to work with new and innovative products and solutions, and who can interact and collaborate with people in different teams globally to always provide the best-tailored solution to Oracle customers. CSS offers a professional context where engineers can constantly develop themselves and stay in touch with the most innovative technologies, both on-premises and in cloud environments.

Skills
Strong technical knowledge of Oracle Applications, SQL, and PL/SQL is a must.
Strong knowledge of OAF, XML, Oracle Forms and Reports, AME, WF, and APEX is a must.
Java, ADF, JET, and PaaS skills.
Relevant Oracle technical certification.
Good understanding of the functional side of the developed code (preferably in Oracle Financials and HRMS).
Strong analytical and problem-solving skills.
Technical troubleshooting experience.
(A small sketch of calling an EBS database from Python appears after this listing.)

Our Ideal Candidate
In addition to the technical capabilities, our ideal candidate:
Is flexible to work in shifts, including night shifts, since the job involves working with customers in different time zones.
Can independently design, develop, and test CEMLI objects.
Is technically strong in development, with experience in EBS Financial modules.
Can investigate, analyze, design, and develop solutions for enhancements and developments related to CEMLIs.
Can identify the impact of patches and determine the functional and technical steps required to minimize disruption to the business.
Reports progress, status, risks, and issues on development on a regular basis.
Can manage the complete development pipeline, as well as the scope, time, cost, and delivery of all CEMLIs.
Can lead the support team in incident and problem management and come up with innovative solutions in a short span of time.
Can understand customer requirements and user stories and implement practical solutions.
Has hands-on knowledge of and expertise in Oracle EBS R12 and Fusion/SaaS modules.
Has good knowledge of business processes and application setups, and of the impact of one setup on another.

Requirements
Minimum 10 years of relevant experience.
Excellent problem-solving and troubleshooting skills.
Ability to work effectively in a team, collaborating with stakeholders to solve business needs.
Strong communication and teamwork skills.
Self-driven and result-oriented.
Collaborate with product owners, QA teams, and stakeholders to understand requirements, work on user stories/backlog items, and ensure high-quality delivery.
Ability to keep track of schedules and ensure on-time delivery of assigned tasks, optimizing pace and meeting deadlines.
Participate in stand-up meetings and provide progress updates regularly.
Experience in understanding customer requirements.
Good knowledge of business processes and application setups.
Good technical expertise in EBS/integrations architecture.
Fluent English (additional languages will also be valued).
Availability to travel and work onsite at customers not less than 50% of the time.
Availability to work 24x7 (on-call).

Responsibilities
Work on developing technical solutions to meet business requirements gathered and documented by functional consultants.
Identify and resolve key issues related to code change requirements and bug fixes.
Support Oracle ERP products and services from the technical side, in line with the contractual agreement.
Work with support to resolve customers' SRs.
Conduct knowledge transfer sessions both within the Oracle team and to end users.
Work closely with the functional team and delivery leaders to provide development work estimates and drive excellence in technical work.
Develop and manage the technical relationship with designated account(s) in order to maximize the value of CSS to the customer.
Develop and maintain trusted relationships with the other Oracle contacts within designated account(s) and relevant third parties.
Act as the primary technical point of contact for Oracle Support.
Safeguard customer satisfaction, and renewal, through quality delivery and added value.
Engage directly in architectural tasks and collaborate with colleagues to implement best practices specific to the projects.
Detect and address performance challenges, security issues, and other technical concerns proactively.
Analyze, troubleshoot, and solve, whenever feasible, the issues the customer may face using Oracle products.
Identify required and recommended actions on customer systems as the main output of service delivery, based on your own knowledge and experience.
Escalate customer issues to the Technical Account Manager at the right time, where relevant.
Ensure adherence to internal methodology, tools, and quality standards.
Actively participate in services development.
Actively collaborate with other engineers in the team or in other teams to share knowledge and experience that can benefit CSS business results.

Career Level - IC4

About Us
As a world leader in cloud solutions, Oracle uses tomorrow's technology to tackle today's challenges. We've partnered with industry leaders in almost every sector, and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That's why we're committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We're committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans' status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
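
Since the role leans heavily on SQL and PL/SQL against E-Business Suite, here is a minimal sketch, in Python with the python-oracledb driver, of querying a table and invoking a stored procedure; the connection details, table, and procedure names are hypothetical, not taken from the posting.

```python
import oracledb

# Hypothetical connection details -- substitute your EBS database's values.
conn = oracledb.connect(
    user="apps_ro",
    password="secret",
    dsn="ebs-db.example.com:1521/EBSPROD",
)

with conn.cursor() as cur:
    # Plain SQL query against a hypothetical custom (CEMLI-style) table.
    cur.execute(
        "SELECT invoice_id, invoice_amount FROM xx_ap_invoices WHERE status = :s",
        s="PENDING",
    )
    for invoice_id, amount in cur.fetchall():
        print(invoice_id, amount)

    # Invoke a hypothetical PL/SQL procedure with an OUT parameter.
    result = cur.var(oracledb.NUMBER)
    cur.callproc("xx_invoice_pkg.validate_invoice", [12345, result])
    print("validation code:", result.getvalue())

conn.close()
```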

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Kolkata, West Bengal, India

Remote

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting.

Your Key Responsibilities
Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage.
ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources.
Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices.
Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity.
Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot pipeline failures and address data latency or quality issues.
Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data.
DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation.
Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow.
Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency.

Skills And Attributes For Success
Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks
Solid understanding of ETL/ELT design and implementation principles
Strong SQL and PySpark skills for data transformation and validation
Exposure to Python for automation and scripting
Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred)
Experience working with Power BI or Tableau for data visualization and reporting support
Strong problem-solving skills, attention to detail, and commitment to data quality
Excellent communication and documentation skills to interface with technical and business teams
Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing

To qualify for the role, you must have
4–6 years of experience in DataOps or Data Engineering roles
Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
Experience working with Informatica CDI or similar data integration tools
Scripting and automation experience in Python/PySpark
Ability to support data pipelines in a rotational on-call or production support environment
Comfort working in a remote/hybrid and cross-functional team setup

Technologies and Tools
Must haves:
Azure Databricks: experience in data transformation and processing using notebooks and Spark.
Azure Data Lake: experience working with hierarchical data storage in Data Lake.
Azure Synapse: familiarity with distributed data querying and data warehousing.
Azure Data Factory: hands-on experience in orchestrating and monitoring data pipelines.
ETL process understanding: knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques.
Good to have:
Power BI or Tableau for reporting support
Monitoring/logging using Azure Monitor or Log Analytics
Azure DevOps and Git for CI/CD and version control
Python and/or PySpark for scripting and data handling
Informatica Cloud Data Integration (CDI) or similar ETL tools
Shell scripting or command-line data handling
SQL (across distributed and relational databases)
(A short illustrative code sketch follows this listing.)

What We Look For
Enthusiastic learners with a passion for data ops and practices.
Problem solvers with a proactive approach to troubleshooting and optimization.
Team players who can collaborate effectively in a remote or hybrid work environment.
Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
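
To make the data-quality responsibility above concrete, here is a minimal PySpark profiling-and-validation sketch of the kind this role describes; the Data Lake path, columns, and threshold are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq-checks-sketch").getOrCreate()

# Hypothetical Data Lake path and schema.
df = spark.read.parquet("abfss://curated@examplelake.dfs.core.windows.net/holdings/")

total = df.count()

# Basic profiling: null counts per column.
null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
)
null_counts.show()

# Validation rules: key must be present, quantity must be non-negative.
failed = df.filter(F.col("security_id").isNull() | (F.col("quantity") < 0))
failure_rate = failed.count() / total if total else 0.0

# Fail the job loudly if too many rows break the rules.
if failure_rate > 0.01:
    raise ValueError(f"Data quality check failed: {failure_rate:.2%} bad rows")
print(f"Data quality OK: {failure_rate:.2%} bad rows out of {total}")
```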

Posted 2 weeks ago

Apply

0 years

6 - 8 Lacs

Bengaluru

On-site

 Manage and maintain the RBI ADF/reporting system to ensure the timely and accurate submission of regulatory returns as and when required.
 Act as Money Laundering Reporting Officer and perform all duties and responsibilities to ensure adherence to RBI rules and regulatory bodies.
 Liaise with the RBI / FIU and other regulatory bodies as required to ensure compliance with RBI rules and regulations and other requirements of a legal nature.
 Provide managers of the other teams with appropriate and up-to-date information or data immediately upon request.
 Authorize and release payment orders filtered by the OFAC filtering system.
 Make returns of bank audits posted by the audit company.
 Work closely with the Chief Executive Officer in overseeing compliance procedures and advise on risk management.
 Assist the Chief Executive Officer with the development of the entity-wide budget for compliance efforts, including identifying resource gaps and directing resources appropriately, whether within the department or in other areas of the Bank.
 Create processes and manuals according to KEB Hana Bank policy, to be reviewed periodically.
 Manage daily and monthly audits.
 Manage audits set up by H.O.
 Act in the capacity of Internal Auditor, ensuring that regular audits are performed of all departments of the branch.
 Train all staff on internal control and AML and report to H.O.
 Establish and execute a yearly Compliance Plan and report results to H.O.
 Monitor the internal control process and submit a Monthly Compliance Report to H.O.
 Preview and assess new and renewal contracts, proposals for launching new banking products/services, and submissions of the bank's internal data to external parties.
 Manage and maintain a close relationship with regulators for cooperation.

Job Type: Full-time
Pay: ₹650,000.00 - ₹800,000.00 per year
Schedule: Day shift
Work Location: In person

Posted 2 weeks ago

Apply

10.0 years

26 - 30 Lacs

Chennai

On-site

We are looking for an Associate Division Manager for one of our major clients. This role involves designing and building AI/ML products at scale to improve customer understanding and sentiment analysis, recommend customer requirements, recommend optimal inputs, and improve process efficiency. The role will collaborate with product owners and business owners.

Key Responsibilities:
Lead a team of junior and experienced data scientists.
Lead and participate in end-to-end ML project deployments that require feasibility analysis, design, development, validation, and application of state-of-the-art data science solutions.
Push the state of the art in the application of data mining, visualization, predictive modelling, statistics, trend analysis, and other data analysis techniques to solve complex business problems, including lead classification, recommender systems, product life-cycle modelling, design optimization, and product cost and weight optimization. (A small lead-classification sketch follows this listing.)

Functional Responsibilities:
Leverage and enhance applications utilizing NLP, LLM, OCR, image-based models, and deep learning neural networks for use cases including text mining and speech and object recognition.
Identify future development needs, advance new and emerging ML and AI technology, and set the strategy for the data science team.
Cultivate a product-centric, results-driven data science organization.
Write production-ready code and deploy real-time ML models; expose ML outputs through APIs.
Partner with data/ML engineers and vendor partners on input data pipeline development and ML model automation.
Provide leadership to establish world-class ML lifecycle management processes.

Qualification: MTech / BE / BTech / MSc in CS.

Experience:
Over 10 years of applied machine learning experience in the fields of machine learning, statistical modelling, predictive modelling, text mining, natural language processing (NLP), LLMs, OCR, image-based models, and deep learning.
Expert Python programmer: SQL, C#, extremely proficient with the SciPy stack (e.g. NumPy, pandas, scikit-learn, matplotlib).
Proficiency with open-source deep learning platforms like TensorFlow, Keras, and PyTorch.
Knowledge of the big data ecosystem: Apache Spark, Hadoop, Hive, EMR, MapReduce.
Proficient in cloud technologies and services (Azure Databricks, ADF, Databricks MLflow).

Functional Competencies:
A demonstrated ability to mentor junior data scientists, and proven experience in collaborative work environments with external customers.
Proficient in communicating technical findings to non-technical stakeholders.
Holds routine peer code reviews of ML work done by the team.
Experience in leading and/or collaborating with small to mid-sized teams.
Experienced in building scalable, highly available distributed systems in production.
Experienced in ML lifecycle management and MLOps tools and frameworks.

Job type: FTE. Location: Chennai.
Job Type: Contractual / Temporary
Pay: ₹2,633,123.63 - ₹3,063,602.96 per year
Schedule: Monday to Friday
Education: Bachelor's (Preferred)
Work Location: In person
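
To illustrate the lead-classification use case named above, here is a minimal scikit-learn sketch that trains a text classifier over a TF-IDF representation; the toy data is invented purely for illustration.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import Pipeline

# Toy training data (invented): inquiry text and whether the lead converted.
texts = [
    "requested pricing for enterprise plan",
    "asked for a product demo next week",
    "unsubscribed from the newsletter",
    "complained about spam emails",
]
labels = [1, 1, 0, 0]  # 1 = qualified lead, 0 = not qualified

# TF-IDF features feeding a logistic regression classifier.
model = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),
    ("clf", LogisticRegression()),
])
model.fit(texts, labels)

# Score a new inquiry; in production this would sit behind an API.
proba = model.predict_proba(["can we schedule a pricing call"])[0, 1]
print(f"qualified-lead probability: {proba:.2f}")
```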

Posted 2 weeks ago

Apply

0 years

8 - 9 Lacs

Chennai

On-site

Before applying for a job, select your preferred language from the options available at the top right of this page. Discover your next opportunity within one of the world's 500 largest organizations. Explore innovative opportunities, discover our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, motivation, autonomy, or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow.

Job description:

About UPS
UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™
The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About the role
We are seeking a Data Developer to join our data engineering team, responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. You will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure. (A minimal Cosmos DB sketch follows this listing.)

Primary Skills
Data Engineering: Azure Data Factory (ADF), Azure Databricks.
Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB).
Data Modeling: NoSQL data modeling, data warehousing concepts.
Performance Optimization: Data pipeline performance tuning and cost optimization.
Programming Languages: Python, SQL, PySpark.

Secondary Skills
DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation.
Security and Compliance: Implementing data security and governance standards.
Agile Methodologies: Experience in Agile/Scrum environments.

Soft Skills
Strong problem-solving abilities and attention to detail.
Excellent communication skills, both verbal and written.
Effective time management and organizational capabilities.
Ability to work independently and within a collaborative team environment.
Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications
Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
Relevant certifications in Azure and Data Engineering, such as:
Microsoft Certified: Azure Data Engineer Associate
Microsoft Certified: Azure Solutions Architect Expert
Databricks Certified Data Engineer Associate or Professional

Contract type: permanent (CDI). At UPS, equal opportunity, fair treatment, and an inclusive work environment are key values to which we are committed.
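
As a taste of the Cosmos DB work described above, here is a minimal azure-cosmos sketch that upserts and queries items; the account endpoint, database, container, and partition key are hypothetical.

```python
from azure.cosmos import CosmosClient, PartitionKey

# Hypothetical account endpoint and key -- use your own, or AAD credentials.
client = CosmosClient(
    url="https://example-account.documents.azure.com:443/",
    credential="<account-key>",
)

db = client.create_database_if_not_exists("supplychain")
container = db.create_container_if_not_exists(
    id="shipments",
    partition_key=PartitionKey(path="/accountId"),
)

# Upsert is idempotent: insert if new, replace if the id already exists.
container.upsert_item({
    "id": "shp-1001",
    "accountId": "acct-42",
    "status": "IN_TRANSIT",
    "weightKg": 12.5,
})

# Parameterized query scoped to one logical partition (cheap in RUs).
items = container.query_items(
    query="SELECT c.id, c.status FROM c WHERE c.accountId = @acct",
    parameters=[{"name": "@acct", "value": "acct-42"}],
    partition_key="acct-42",
)
for item in items:
    print(item)
```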

Posted 2 weeks ago

Apply

5.0 - 8.0 years

15 - 25 Lacs

Gurugram, Bengaluru

Hybrid

Warm greetings from SP Staffing!!
Role: Azure Data Engineer
Experience Required: 5 to 8 yrs
Work Location: Bangalore/Gurgaon
Required Skills: Azure Databricks, ADF, PySpark/SQL
Interested candidates can send resumes to nandhini.spstaffing@gmail.com

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Kanayannur, Kerala, India

Remote

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting.

Your Key Responsibilities
Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage.
ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources.
Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices.
Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity.
Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot pipeline failures and address data latency or quality issues.
Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data.
DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation.
Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow.
Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency.

Skills And Attributes For Success
Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks
Solid understanding of ETL/ELT design and implementation principles
Strong SQL and PySpark skills for data transformation and validation
Exposure to Python for automation and scripting
Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred)
Experience working with Power BI or Tableau for data visualization and reporting support
Strong problem-solving skills, attention to detail, and commitment to data quality
Excellent communication and documentation skills to interface with technical and business teams
Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing

To qualify for the role, you must have
4–6 years of experience in DataOps or Data Engineering roles
Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
Experience working with Informatica CDI or similar data integration tools
Scripting and automation experience in Python/PySpark
Ability to support data pipelines in a rotational on-call or production support environment
Comfort working in a remote/hybrid and cross-functional team setup

Technologies and Tools
Must haves:
Azure Databricks: experience in data transformation and processing using notebooks and Spark.
Azure Data Lake: experience working with hierarchical data storage in Data Lake.
Azure Synapse: familiarity with distributed data querying and data warehousing.
Azure Data Factory: hands-on experience in orchestrating and monitoring data pipelines.
ETL process understanding: knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques.
Good to have:
Power BI or Tableau for reporting support
Monitoring/logging using Azure Monitor or Log Analytics
Azure DevOps and Git for CI/CD and version control
Python and/or PySpark for scripting and data handling
Informatica Cloud Data Integration (CDI) or similar ETL tools
Shell scripting or command-line data handling
SQL (across distributed and relational databases)
(A short illustrative code sketch follows this listing.)

What We Look For
Enthusiastic learners with a passion for data ops and practices.
Problem solvers with a proactive approach to troubleshooting and optimization.
Team players who can collaborate effectively in a remote or hybrid work environment.
Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
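
For the monitoring-and-troubleshooting side of this listing, here is a minimal sketch of querying recent Azure Data Factory pipeline runs with the azure-mgmt-datafactory SDK; the subscription, resource group, and factory names are placeholders.

```python
from datetime import datetime, timedelta, timezone

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import RunFilterParameters

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # placeholder
RESOURCE_GROUP = "rg-dataops"                              # placeholder
FACTORY_NAME = "adf-dataops"                               # placeholder

client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Look at the last 24 hours of pipeline runs.
now = datetime.now(timezone.utc)
filters = RunFilterParameters(
    last_updated_after=now - timedelta(days=1),
    last_updated_before=now,
)
runs = client.pipeline_runs.query_by_factory(RESOURCE_GROUP, FACTORY_NAME, filters)

# Surface failures first -- the runs an on-call engineer cares about.
for run in sorted(runs.value, key=lambda r: r.status != "Failed"):
    print(run.run_id, run.pipeline_name, run.status)
```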

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Trivandrum, Kerala, India

Remote

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting.

Your Key Responsibilities
Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage.
ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources.
Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices.
Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity.
Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot pipeline failures and address data latency or quality issues.
Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data.
DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation.
Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow.
Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency.

Skills And Attributes For Success
Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks
Solid understanding of ETL/ELT design and implementation principles
Strong SQL and PySpark skills for data transformation and validation
Exposure to Python for automation and scripting
Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred)
Experience working with Power BI or Tableau for data visualization and reporting support
Strong problem-solving skills, attention to detail, and commitment to data quality
Excellent communication and documentation skills to interface with technical and business teams
Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing

To qualify for the role, you must have
4–6 years of experience in DataOps or Data Engineering roles
Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
Experience working with Informatica CDI or similar data integration tools
Scripting and automation experience in Python/PySpark
Ability to support data pipelines in a rotational on-call or production support environment
Comfort working in a remote/hybrid and cross-functional team setup

Technologies and Tools
Must haves:
Azure Databricks: experience in data transformation and processing using notebooks and Spark.
Azure Data Lake: experience working with hierarchical data storage in Data Lake.
Azure Synapse: familiarity with distributed data querying and data warehousing.
Azure Data Factory: hands-on experience in orchestrating and monitoring data pipelines.
ETL process understanding: knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques.
Good to have:
Power BI or Tableau for reporting support
Monitoring/logging using Azure Monitor or Log Analytics
Azure DevOps and Git for CI/CD and version control
Python and/or PySpark for scripting and data handling
Informatica Cloud Data Integration (CDI) or similar ETL tools
Shell scripting or command-line data handling
SQL (across distributed and relational databases)
(A short illustrative code sketch follows this listing.)

What We Look For
Enthusiastic learners with a passion for data ops and practices.
Problem solvers with a proactive approach to troubleshooting and optimization.
Team players who can collaborate effectively in a remote or hybrid work environment.
Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
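
For the Data Lake management piece of this listing, here is a minimal sketch of uploading a file into a hierarchical namespace with the azure-storage-file-datalake SDK; the storage account, container, and paths are hypothetical.

```python
from azure.identity import DefaultAzureCredential
from azure.storage.filedatalake import DataLakeServiceClient

# Hypothetical storage account with hierarchical namespace enabled.
service = DataLakeServiceClient(
    account_url="https://examplelake.dfs.core.windows.net",
    credential=DefaultAzureCredential(),
)

# File systems map to containers; directories are real objects in ADLS Gen2.
fs = service.get_file_system_client("raw")
directory = fs.get_directory_client("benchmarks/2024/01")
directory.create_directory()  # ensure the target directory exists

# Upload a local file, overwriting any previous version at that path.
file_client = directory.get_file_client("benchmark_prices.csv")
with open("benchmark_prices.csv", "rb") as fh:
    file_client.upload_data(fh, overwrite=True)
print("uploaded:", file_client.path_name)
```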

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

Remote

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting.

Your Key Responsibilities
Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage.
ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources.
Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices.
Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity.
Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot pipeline failures and address data latency or quality issues.
Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data.
DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation.
Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow.
Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency.

Skills And Attributes For Success
Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks
Solid understanding of ETL/ELT design and implementation principles
Strong SQL and PySpark skills for data transformation and validation
Exposure to Python for automation and scripting
Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred)
Experience working with Power BI or Tableau for data visualization and reporting support
Strong problem-solving skills, attention to detail, and commitment to data quality
Excellent communication and documentation skills to interface with technical and business teams
Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing

To qualify for the role, you must have
4–6 years of experience in DataOps or Data Engineering roles
Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
Experience working with Informatica CDI or similar data integration tools
Scripting and automation experience in Python/PySpark
Ability to support data pipelines in a rotational on-call or production support environment
Comfort working in a remote/hybrid and cross-functional team setup

Technologies and Tools
Must haves:
Azure Databricks: experience in data transformation and processing using notebooks and Spark.
Azure Data Lake: experience working with hierarchical data storage in Data Lake.
Azure Synapse: familiarity with distributed data querying and data warehousing.
Azure Data Factory: hands-on experience in orchestrating and monitoring data pipelines.
ETL process understanding: knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques.
Good to have:
Power BI or Tableau for reporting support
Monitoring/logging using Azure Monitor or Log Analytics
Azure DevOps and Git for CI/CD and version control
Python and/or PySpark for scripting and data handling
Informatica Cloud Data Integration (CDI) or similar ETL tools
Shell scripting or command-line data handling
SQL (across distributed and relational databases)
(A short illustrative code sketch follows this listing.)

What We Look For
Enthusiastic learners with a passion for data ops and practices.
Problem solvers with a proactive approach to troubleshooting and optimization.
Team players who can collaborate effectively in a remote or hybrid work environment.
Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
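
For the Azure Monitor / Log Analytics item in this listing, here is a minimal sketch of pulling recent ADF failure logs with the azure-monitor-query package; the workspace ID is a placeholder, and the ADFPipelineRun table assumes diagnostic settings route ADF logs to Log Analytics.

```python
from datetime import timedelta

from azure.identity import DefaultAzureCredential
from azure.monitor.query import LogsQueryClient, LogsQueryStatus

WORKSPACE_ID = "00000000-0000-0000-0000-000000000000"  # placeholder workspace

client = LogsQueryClient(DefaultAzureCredential())

# KQL over the ADFPipelineRun table (populated via diagnostic settings).
query = """
ADFPipelineRun
| where Status == 'Failed'
| project TimeGenerated, PipelineName, RunId, FailureType
| order by TimeGenerated desc
"""

response = client.query_workspace(WORKSPACE_ID, query, timespan=timedelta(days=1))

if response.status == LogsQueryStatus.SUCCESS:
    for table in response.tables:
        for row in table.rows:
            print(dict(zip(table.columns, row)))
else:
    print("Query returned partial results:", response.status)
```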

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Noida, Uttar Pradesh, India

Remote

At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
We are seeking a highly skilled and motivated Senior DataOps Engineer with strong expertise in the Azure data ecosystem. You will play a crucial role in managing and optimizing data workflows across Azure platforms such as Azure Data Factory, Data Lake, Databricks, and Synapse. Your primary focus will be on building, maintaining, and monitoring data pipelines, ensuring high data quality, and supporting critical data operations. You'll also support visualization, automation, and CI/CD processes to streamline data delivery and reporting.

Your Key Responsibilities
Data Pipeline Management: Build, monitor, and optimize data pipelines using Azure Data Factory (ADF), Databricks, and Azure Synapse for efficient data ingestion, transformation, and storage.
ETL Operations: Design and maintain robust ETL processes for batch and real-time data processing across cloud and on-premise sources.
Data Lake Management: Organize and manage structured and unstructured data in Azure Data Lake, ensuring performance and security best practices.
Data Quality & Validation: Perform data profiling, validation, and transformation using SQL, PySpark, and Python to ensure data integrity.
Monitoring & Troubleshooting: Use logging and monitoring tools to troubleshoot pipeline failures and address data latency or quality issues.
Reporting & Visualization: Work with Power BI or Tableau teams to support dashboard development, ensuring the availability of clean and reliable data.
DevOps & CI/CD: Support data deployment pipelines using Azure DevOps, Git, and CI/CD practices for version control and automation.
Tool Integration: Collaborate with cross-functional teams to integrate Informatica CDI or similar ETL tools with Azure components for seamless data flow.
Collaboration & Documentation: Partner with data analysts, engineers, and business stakeholders, while maintaining SOPs and technical documentation for operational efficiency.

Skills And Attributes For Success
Strong hands-on experience in Azure Data Factory, Azure Data Lake, Azure Synapse, and Databricks
Solid understanding of ETL/ELT design and implementation principles
Strong SQL and PySpark skills for data transformation and validation
Exposure to Python for automation and scripting
Familiarity with DevOps concepts, CI/CD workflows, and source control systems (Azure DevOps preferred)
Experience working with Power BI or Tableau for data visualization and reporting support
Strong problem-solving skills, attention to detail, and commitment to data quality
Excellent communication and documentation skills to interface with technical and business teams
Strong knowledge of asset management business operations, especially in data domains like securities, holdings, benchmarks, and pricing

To qualify for the role, you must have
4–6 years of experience in DataOps or Data Engineering roles
Proven expertise in managing and troubleshooting data workflows within the Azure ecosystem
Experience working with Informatica CDI or similar data integration tools
Scripting and automation experience in Python/PySpark
Ability to support data pipelines in a rotational on-call or production support environment
Comfort working in a remote/hybrid and cross-functional team setup

Technologies and Tools
Must haves:
Azure Databricks: experience in data transformation and processing using notebooks and Spark.
Azure Data Lake: experience working with hierarchical data storage in Data Lake.
Azure Synapse: familiarity with distributed data querying and data warehousing.
Azure Data Factory: hands-on experience in orchestrating and monitoring data pipelines.
ETL process understanding: knowledge of data extraction, transformation, and loading workflows, including data cleansing, mapping, and integration techniques.
Good to have:
Power BI or Tableau for reporting support
Monitoring/logging using Azure Monitor or Log Analytics
Azure DevOps and Git for CI/CD and version control
Python and/or PySpark for scripting and data handling
Informatica Cloud Data Integration (CDI) or similar ETL tools
Shell scripting or command-line data handling
SQL (across distributed and relational databases)
(A short illustrative code sketch follows this listing.)

What We Look For
Enthusiastic learners with a passion for data ops and practices.
Problem solvers with a proactive approach to troubleshooting and optimization.
Team players who can collaborate effectively in a remote or hybrid work environment.
Detail-oriented professionals with strong documentation skills.

What We Offer
EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. We'll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career.
Continuous learning: You'll develop the mindset and skills to navigate whatever comes next.
Success as defined by you: We'll provide the tools and flexibility, so you can make a meaningful impact, your way.
Transformative leadership: We'll give you the insights, coaching and confidence to be the leader the world needs.
Diverse and inclusive culture: You'll be embraced for who you are and empowered to use your voice to help others find theirs.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
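
For the batch ETL responsibility in this listing, here is a minimal PySpark sketch of the cleanse-and-load pattern it describes: read raw data, standardize it, and write a curated Delta table. The paths, columns, and partition date are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("batch-etl-sketch").getOrCreate()

# Extract: raw CSV landed in the lake (hypothetical path and schema).
raw = (
    spark.read.option("header", True)
    .csv("abfss://raw@examplelake.dfs.core.windows.net/pricing/2024-01-01/")
)

# Transform: trim strings, normalize types, drop obvious duplicates.
clean = (
    raw.withColumn("security_id", F.trim(F.col("security_id")))
    .withColumn("price", F.col("price").cast("decimal(18,6)"))
    .withColumn("as_of_date", F.to_date(F.col("as_of_date"), "yyyy-MM-dd"))
    .dropDuplicates(["security_id", "as_of_date"])
)

# Load: overwrite just one date's slice of the curated Delta table.
(
    clean.write.format("delta")
    .mode("overwrite")
    .option("replaceWhere", "as_of_date = '2024-01-01'")
    .save("abfss://curated@examplelake.dfs.core.windows.net/pricing/")
)
```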

Posted 2 weeks ago

Apply

6.0 years

0 Lacs

Greater Kolkata Area

On-site

Job Title: Lead ADF – Microsoft BI & Data Warehouse Lead
Skills: minimum 6+ years in ADF for ETL, Microsoft BI, DW, T-SQL/SQL, procedure writing, Azure/SQL
Job Description (Azure Cloud API integration is mandatory, with 3+ years' experience)

Overview
We are looking for an experienced Microsoft BI & Data Warehouse Lead to design, develop, and maintain robust data warehouse and ETL solutions using the Microsoft technology stack. The ideal candidate will have extensive expertise in SQL Server development, Azure Data Factory (ADF), and other Microsoft-based data integration tools such as Azure Functions. As a lead, you will play a crucial role in driving data strategy, ensuring system performance, and delivering scalable data solutions that meet organizational needs.

Responsibilities
Data Warehouse Development:
Design and implement scalable and efficient data warehouse solutions.
Develop complex SQL Server-based solutions, including T-SQL queries, stored procedures, and performance tuning.
Optimize SQL Server databases, develop T-SQL scripts, and improve query performance.
ETL Development and Maintenance:
Build and optimize ETL workflows using Azure Data Factory (ADF) for data integration from multiple sources.
Ensure high-performance data pipelines for large-scale data processing.
Integrate and automate data processes using Azure Functions to extend ETL capabilities.
Cloud Integration:
Implement cloud-native solutions leveraging Azure SQL Database, Azure Functions, and Synapse Analytics.
Support hybrid data integration scenarios combining on-premises and Azure services.
Data Governance and Quality:
Establish and maintain robust data quality frameworks and governance standards.
Ensure consistency, accuracy, and security of data across all platforms.
Leadership and Collaboration:
Lead a team of BI and data professionals, providing mentorship and technical direction.
Partner with stakeholders to understand business requirements and deliver data-driven solutions.
Define project goals, timelines, and resources for successful execution.
Be flexible to support multiple IT platforms.
Manage day-to-day activities: Jira requests, SQL execution, access requests, resolving alerts and updating tickets.

Requirements
Education: Bachelor's degree (B.Tech/B.E.) in Computer Science, Information Technology, or a related field.
Skills (mandatory):
Strong experience with SQL Server development (T-SQL, indexing, optimization).
In-depth knowledge of BI-DW concepts and data modelling.
Extensive experience with Azure Data Factory (ADF) for ETL.
Advanced knowledge of Azure SQL Database and cloud technologies.
Excellent problem-solving, analytical, and leadership skills.
Strong communication and teamwork abilities.
Ability to translate business requirements into technical solutions.
Experience with cloud-based data platforms and migrations.
Familiarity with DevOps for CI/CD in data integration pipelines.
Skills (good to have):
Experience with additional cloud platforms (e.g., AWS, Google Cloud).
Advanced skills in data visualization tools (e.g., Power BI, SSRS, Tableau).
Proficiency in Python or other scripting languages.
Experience with Databricks is a plus.
Certifications: relevant certifications in SQL, BI-DW, or cloud platforms are highly desirable. Microsoft Certified: Azure Data Fundamentals or any cloud certification (preferred).
Additional mandatory skill for the Lead role: client-facing experience with US-based clients, requiring strong communication skills. (ref:hirist.tech)
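A hedged sketch of the day-to-day "SQL execution" side of this role, using the pyodbc driver from Python; the server, database, credentials, and procedure name are all hypothetical.

import pyodbc

CONN_STR = (
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=example-dw.database.windows.net;"   # hypothetical Azure SQL server
    "DATABASE=ExampleDW;UID=etl_service;PWD=example-password;Encrypt=yes;"
)

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()
    # Run a hypothetical warehouse load procedure with a date parameter.
    cur.execute("EXEC dbo.usp_LoadDailySales @LoadDate = ?", "2024-01-31")
    conn.commit()
    # Quick performance-tuning aid: top statements by total elapsed time.
    cur.execute(
        "SELECT TOP 5 qs.total_elapsed_time, st.text "
        "FROM sys.dm_exec_query_stats AS qs "
        "CROSS APPLY sys.dm_exec_sql_text(qs.sql_handle) AS st "
        "ORDER BY qs.total_elapsed_time DESC"
    )
    for row in cur.fetchall():
        print(row.total_elapsed_time, row.text[:80])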

Posted 2 weeks ago

Apply

0 years

0 Lacs

Greater Kolkata Area

On-site

Job Title: Sr Lead ADF – Cloud API Integration
Skills: minimum 8 years in ADF for ETL, Microsoft BI, DW, T-SQL/SQL, procedure writing, Azure/SQL

Job Description
Overview
We are looking for an experienced Microsoft BI & Data Warehouse Lead to design, develop, and maintain robust data warehouse and ETL solutions using the Microsoft technology stack. The ideal candidate will have extensive expertise in SQL Server development, Azure Data Factory (ADF), and other Microsoft-based data integration tools such as Azure Functions. As a lead, you will play a crucial role in driving data strategy, ensuring system performance, and delivering scalable data solutions that meet organizational needs.

Responsibilities
Data Warehouse Development:
Design and implement scalable and efficient data warehouse solutions.
Develop complex SQL Server-based solutions, including T-SQL queries, stored procedures, and performance tuning.
Optimize SQL Server databases, develop T-SQL scripts, and improve query performance.
ETL Development and Maintenance:
Build and optimize ETL workflows using Azure Data Factory (ADF) for data integration from multiple sources.
Ensure high-performance data pipelines for large-scale data processing.
Integrate and automate data processes using Azure Functions to extend ETL capabilities.
Cloud Integration:
Implement cloud-native solutions leveraging Azure SQL Database, Azure Functions, and Synapse Analytics.
Support hybrid data integration scenarios combining on-premises and Azure services.
Data Governance and Quality:
Establish and maintain robust data quality frameworks and governance standards.
Ensure consistency, accuracy, and security of data across all platforms.
Leadership and Collaboration:
Lead a team of BI and data professionals, providing mentorship and technical direction.
Partner with stakeholders to understand business requirements and deliver data-driven solutions.
Define project goals, timelines, and resources for successful execution.
Be flexible to support multiple IT platforms.
Manage day-to-day activities: Jira requests, SQL execution, access requests, resolving alerts and updating tickets.

Requirements
Education: Bachelor's degree (B.Tech/B.E.) in Computer Science, Information Technology, or a related field.
Skills (mandatory):
Strong experience with SQL Server development (T-SQL, indexing, optimization).
In-depth knowledge of BI-DW concepts and data modelling.
Extensive experience with Azure Data Factory (ADF) for ETL.
Advanced knowledge of Azure SQL Database and cloud technologies.
Excellent problem-solving, analytical, and leadership skills.
Strong communication and teamwork abilities.
Ability to translate business requirements into technical solutions.
Experience with cloud-based data platforms and migrations.
Familiarity with DevOps for CI/CD in data integration pipelines.
Skills (good to have):
Experience with additional cloud platforms (e.g., AWS, Google Cloud).
Advanced skills in data visualization tools (e.g., Power BI, SSRS, Tableau).
Proficiency in Python or other scripting languages.
Experience with Databricks is a plus.
Certifications: relevant certifications in SQL, BI-DW, or cloud platforms are highly desirable. Microsoft Certified: Azure Data Fundamentals or any cloud certification (preferred).
Notice period: less than 15 days, or immediate joiners only. (ref:hirist.tech)
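Since this role mandates Azure cloud API integration, here is a minimal, assumption-laden sketch of triggering and checking an ADF pipeline run with the azure-mgmt-datafactory SDK; the subscription, resource group, factory, and pipeline names are placeholders, not values from the posting.

from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholder identifiers; substitute real values.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RG, FACTORY, PIPELINE = "rg-data-platform", "adf-example", "pl_daily_load"

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Kick off a pipeline run with a runtime parameter.
run = adf.pipelines.create_run(RG, FACTORY, PIPELINE, parameters={"LoadDate": "2024-01-31"})

# Poll the run status (Queued / InProgress / Succeeded / Failed).
status = adf.pipeline_runs.get(RG, FACTORY, run.run_id)
print(status.status)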

Posted 2 weeks ago

Apply

7.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Role Description
Job Title: Senior Data Engineer
Experience: 7+ years
Employment Type: Full-time

Job Summary
We are seeking a skilled Senior Data Engineer with strong experience in building scalable data pipelines and transforming complex data across enterprise platforms. The ideal candidate should have hands-on expertise in Databricks, PySpark, SQL, and ETL/ELT tools such as Informatica, AWS Glue, or Dataproc. Experience with cloud data warehouses like Snowflake, BigQuery, or Delta Lake, and a strong understanding of data security, compliance, and DevOps are essential. Domain knowledge in banking, financial services, or cybersecurity is highly desirable.

Key Responsibilities
Design, build, and optimize secure data pipelines for large-scale data processing.
Develop ETL/ELT jobs and implement Data Quality (DQ) rules within Databricks and Aurora platforms.
Collaborate with Data Architects, DQ Analysts, and Cyber SMEs in Agile POD teams.
Manage data modeling, performance tuning, and infrastructure cost optimization.
Support data governance, DQ controls (e.g., BCBS 239, DUSE, DMOVE), and compliance reporting.
Document architecture and test strategies, and ensure code quality and scalability.

Required Skills
Strong proficiency in Databricks, PySpark, and SQL
Experience with ETL tools (e.g., Glue, Dataproc, ADF, Informatica)
Cloud experience with AWS, Azure, or GCP
Hands-on data modeling, DQ implementation, and performance tuning
Understanding of data security, encryption, and risk controls
Excellent communication and stakeholder collaboration skills

Preferred Qualifications
Bachelor's degree in Computer Science, Engineering, or a related field
Experience in banking, financial services, or cybersecurity domains
Familiarity with DUSE/DMOVE frameworks and cybersecurity metrics reporting
Certification in cloud or data engineering tools is a plus

Skills: Databricks, PySpark, SQL, ETL
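A minimal sketch of the Databricks/PySpark pipeline work this posting describes, writing curated data to Delta Lake; the paths, schema, and DQ rules are assumptions for illustration only.

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

orders = (
    spark.read.json("/mnt/raw/orders/")           # hypothetical raw landing zone
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .withColumn("order_date", F.to_date("order_ts"))
    .dropDuplicates(["order_id"])                 # DQ rule: one row per order
    .filter(F.col("amount") >= 0)                 # DQ rule: non-negative amounts
)

(orders.write.format("delta")                     # Delta is available by default on Databricks
    .mode("overwrite")
    .partitionBy("order_date")                    # partitioning aids pruning and cost control
    .save("/mnt/curated/orders/"))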

Posted 2 weeks ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Greetings from TCS!!! TCS is hiring for Azure Data Engineer.

Job Role: Azure Data Engineer
Experience Range: 8+ years
Job Location: Noida / Chennai
Interview Mode: Virtual (MS Teams)

Responsibilities of / expectations from the job:
Developing, managing and optimizing robust and reliable data pipelines using Azure-native capabilities.
Implement ADF workflows that perform data ingestion, data integration/ETL, statistical model executions, etc.
Creating architecture for data solutions with high-performance characteristics.
Bringing data and analytics products to production.
Implement CI/CD pipelines for data solutions.
Build dashboards for data stewards and business reporting.
Design and build RDBMS data models.

Added advantage:
Python
Azure Data Engineer certification

TCS Eligibility Criteria:
BE/B.Tech/MCA/M.Sc./MS with a minimum of 3 years of relevant IT experience post qualification.
Only full-time courses will be considered.

Referrals are always welcome!!! Kindly don't apply if you have already attended an interview within the last month.

Thanks & Regards,
Jerin L Varghese

Posted 2 weeks ago

Apply

4.0 - 9.0 years

13 - 23 Lacs

Pune, Chennai, Bengaluru

Hybrid

Skills: ADF, Snowflake, SQL
Interested candidates, please share your resume at juisagars@hexaware.com with the details below:
Total experience
Relevant experience
Current company
Current CTC
Expected CTC
Notice period / LWD

Posted 2 weeks ago

Apply

12.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Sr. Manager – Azure Data Architect
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (Big Data Architects) with strong technology and data understanding and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Develop standardized practices for delivering new products and capabilities using Big Data and cloud technologies, including data acquisition, transformation, analysis, modelling, governance and data management skills.
Interact with senior client technology leaders, understand their business goals, create and propose solutions, estimate effort, build architectures, and develop and deliver technology solutions.
Define and develop client-specific best practices around data management within a cloud environment.
Recommend design alternatives for data ingestion, processing and provisioning layers.
Design and develop data ingestion programs to process large data sets in batch mode using ADB, ADF, PySpark, Python, and Synapse.
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.
Have managed teams and have experience in end-to-end delivery.
Have experience building technical capability and teams to deliver.

Skills And Attributes For Success
Strong understanding of and familiarity with all Cloud ecosystem components
Strong understanding of underlying Cloud architectural concepts and distributed computing paradigms
Experience in the development of large-scale data processing
Hands-on programming experience in ADB, ADF, Synapse, Python, PySpark, SQL
Hands-on expertise in cloud services like AWS and/or the Microsoft Azure ecosystem
Solid understanding of ETL methodologies in a multi-tiered stack with data modelling and data governance
Experience with BI and data analytics databases
Experience in converting business problems/challenges to technical solutions considering security, performance, scalability, etc.
Experience in enterprise-grade solution implementations
Experience in performance benchmarking enterprise applications
Strong stakeholder, client, team, process and delivery management skills

To qualify for the role, you must have
A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
Excellent verbal and written communication skills, formal and informal
Ability to multi-task under pressure and work independently with minimal supervision
A team-player mindset: enjoy working in a cooperative and collaborative team environment
Adaptability to new technologies and standards
Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support
Minimum 12 years of hands-on experience in one or more of the above areas
Minimum 14 years of industry experience

Ideally, you’ll also have
Project management skills
Client management skills
Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
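As an illustration of the real-time ingestion responsibility above, a minimal Spark Structured Streaming sketch reading from Kafka; the broker, topic, and sink paths are assumptions, not part of the posting.

from pyspark.sql import SparkSession

# Requires the spark-sql-kafka connector package on the cluster.
spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker1.example.com:9092")  # assumed broker
    .option("subscribe", "live-events")                             # assumed topic
    .option("startingOffsets", "latest")
    .load()
    .selectExpr("CAST(key AS STRING) AS key",
                "CAST(value AS STRING) AS payload",
                "timestamp")
)

query = (
    events.writeStream.format("delta")       # assumed Delta "bronze" sink
    .option("checkpointLocation", "/mnt/checkpoints/live-events/")
    .outputMode("append")
    .start("/mnt/bronze/live-events/")
)
query.awaitTermination()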

Posted 3 weeks ago

Apply

14.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description
Our Analytics and Insights Managed Services team brings a unique combination of industry expertise, technology, data management and managed services experience to create sustained outcomes for our clients and improve business performance. We empower companies to transform their approach to analytics and insights and to optimize processes for efficiency and client satisfaction. The role requires a deep understanding of IT services, operational excellence, and client-centric solutions.

Job Requirements and Preferences:
Minimum Degree Required: Bachelor's degree in Information Technology, Data Science, Computer Science, Statistics, or a related field (Master's degree preferred).
Minimum Years of Experience: 14 years, with at least 3 years in a managerial or leadership role. Proven experience in managing data analytics services for external clients, preferably within a managed services or consulting environment.
Technical Skills: experience and know-how working with a combination/subset of the tools and technologies listed below.
Proficiency in data analytics tools (e.g., Power BI, Tableau, QlikView), data integration tools (ETL, Informatica, Talend, Snowflake, etc.) and programming languages (e.g., Python, R, SAS, SQL).
Strong understanding of Data & Analytics cloud platform services (e.g., AWS, Azure, GCP) like AWS Glue, EMR, ADF, Redshift, Synapse, BigQuery, etc., and big data technologies (e.g., Hadoop, Spark).
Familiarity with traditional data warehousing tools like Teradata, Netezza, etc.
Familiarity with machine learning, AI, and automation in data analytics.
Certification in data-related disciplines preferred.
Leadership: demonstrated ability to lead teams, manage complex projects, and deliver results.
Communication: excellent verbal and written communication skills, with the ability to present complex information to non-technical stakeholders.

Roles & Responsibilities:
Demonstrates intimate abilities and/or a proven record of success as a team leader, emphasizing the following:
Client Relationship Management:
Serve as the focal point for client interactions, maintaining strong relationships.
Manage client escalations and ensure timely resolution of issues.
Act as the face of the team for strategic client discussions, governance and regular cadence with the client.
Service Delivery Management:
Responsibly lead end-to-end delivery of managed data analytics services to clients, ensuring projects meet business requirements, timelines, and quality standards.
Deliver minor enhancements and bug fixes aligned to the client's service delivery model.
Good experience setting up incident management and problem management processes for the engagement.
Collaborate with cross-functional teams, including data engineers, data scientists, and business analysts, to deliver end-to-end solutions.
Monitor, manage and report service-level agreements (SLAs) and key performance indicators (KPIs).
Solid financial acumen with experience in budget management.
Problem-solving and decision-making skills, with the ability to think strategically.
Operational Excellence & Practice Growth:
Implement and oversee standardized processes, workflows, and best practices to ensure efficient operations.
Utilize tools and systems for service monitoring, reporting, and automation to improve service delivery.
Drive innovation and automation in data integration, processing, analysis, and reporting workflows.
Keep up to date with industry trends, emerging technologies, and regulatory requirements impacting managed services.
Risk and Compliance:
Ensure data security, privacy, and compliance with relevant standards and regulations.
Ensure all managed services are delivered in compliance with relevant regulatory requirements and industry standards.
Proactively identify and mitigate operational risks that could affect service delivery.
Team Leadership & Development:
Lead and mentor a team of service managers and technical professionals to ensure high performance and continuous development.
Foster a culture of collaboration, accountability, and excellence within the team.
Ensure the team is trained on the latest industry best practices, tools, and methodologies.
Capacity management; experience with practice development; strong understanding of agile practices, cloud platforms, and infrastructure management.
Pre-Sales Experience:
Collaborate with sales teams to identify opportunities for growth and expansion of services.
Experience in solutioning responses and operating models, including estimation frameworks, content contribution, and solution architecture in responding to RFPs.

Posted 3 weeks ago

Apply

8.0 - 12.0 years

25 - 40 Lacs

Hyderabad

Remote

Job Description
Job Title: SQL Database Architect (SQL Server, Azure SQL Server, SSIS, SSRS, Data Migration, Azure Data Factory, Power BI)

Job Summary:
We are seeking a highly skilled SQL Database Architect with expertise in SQL Server, Azure SQL Server, SSIS, SSRS, Data Migration, and Power BI to design, develop, and maintain scalable database solutions. The ideal candidate will have experience in database architecture, data integration, ETL processes, cloud-based solutions, and business intelligence reporting. Excellent communication and documentation skills are essential for collaborating with cross-functional teams and maintaining structured database records.

Key Responsibilities:
Database Design & Architecture: Develop highly available, scalable, and secure database solutions using Azure SQL Server.
ETL & Data Integration: Design and implement SSIS packages for data movement, transformations, and automation.
Data Migration: Oversee database migration projects, including on-premises-to-cloud and cloud-to-cloud transitions, data conversion, and validation processes.
Azure Data Factory (ADF): Build and manage data pipelines, integrating various data sources and orchestrating ETL workflows in cloud environments.
Reporting & Business Intelligence: Develop SSRS reports and leverage Power BI for creating interactive dashboards and data visualizations.
Performance Optimization: Analyze and optimize query performance, indexing strategies, and database configurations.
Cloud Integration: Architect Azure-based database solutions, including Azure SQL Database, Managed Instances, and Synapse Analytics.
Security & Compliance: Ensure data security, encryption, and compliance with industry standards.
Backup & Disaster Recovery: Design and implement backup strategies, high availability, and disaster recovery solutions.
Automation & Monitoring: Utilize Azure Monitor, SQL Profiler, and other tools to automate and monitor database performance.
Collaboration & Communication: Work closely with developers, BI teams, DevOps, and business stakeholders, explaining complex database concepts in a clear and concise manner.
Documentation & Best Practices: Maintain comprehensive database documentation, including design specifications, technical workflows, and troubleshooting guides.

Required Skills & Qualifications:
Expertise in SQL Server and Azure SQL Database.
Experience with SSIS for ETL processes, data transformations, and automation.
Proficiency in SSRS for creating, deploying, and managing reports.
Strong expertise in data migration, including cloud and on-premises database transitions.
Power BI skills for developing dashboards, reports, and data visualizations.
Database modeling, indexing, and query optimization expertise.
Knowledge of cloud-based architecture, including Azure SQL Managed Instance.
Proficiency in T-SQL, stored procedures, triggers, and database scripting.
Understanding of security best practices, including role-based access control (RBAC).
Excellent communication skills to explain database solutions to technical and non-technical stakeholders.
Strong documentation skills to create and maintain database design specs, process documents, and reports.

Preferred Qualifications:
Knowledge of CI/CD pipelines for database deployments.
Familiarity with Power BI and other data visualization tools.
Experience with Azure Data Factory and Synapse Analytics for advanced data engineering workflows.

Qualifications
Bachelor's or Master's degree in Business, Computer Science, Engineering, or a related field.
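A small, illustrative sketch of the migration validation step mentioned above: reconciling row counts between an on-premises source and an Azure SQL target with pyodbc. Server, database, and table names are assumptions.

import pyodbc

SOURCE = ("DRIVER={ODBC Driver 18 for SQL Server};"
          "SERVER=onprem-sql01;DATABASE=SalesDB;Trusted_Connection=yes;")
TARGET = ("DRIVER={ODBC Driver 18 for SQL Server};"
          "SERVER=example-sql.database.windows.net;DATABASE=SalesDB;"
          "UID=migrator;PWD=example-password;Encrypt=yes;")

def row_count(conn_str: str, table: str) -> int:
    # Table names come from the fixed list below, not user input.
    with pyodbc.connect(conn_str) as conn:
        return conn.cursor().execute(f"SELECT COUNT(*) FROM {table}").fetchone()[0]

for table in ("dbo.Customers", "dbo.Orders", "dbo.OrderLines"):   # assumed tables
    src, tgt = row_count(SOURCE, table), row_count(TARGET, table)
    flag = "OK" if src == tgt else "MISMATCH"
    print(f"{table}: source={src} target={tgt} {flag}")

Row counts are only a first-pass check; a fuller validation would also compare checksums or sampled rows per table.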

Posted 3 weeks ago

Apply

1.0 - 3.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At PwC, our people in managed services focus on a variety of outsourced solutions and support clients across numerous functions. These individuals help organisations streamline their operations, reduce costs, and improve efficiency by managing key processes and functions on their behalf. They are skilled in project management, technology, and process optimization to deliver high-quality services to clients. Those in managed service management and strategy at PwC will focus on transitioning and running services, along with managing delivery teams, programmes, commercials, performance and delivery risk. Your work will involve the continuous improvement and optimization of the managed services process, tools and services.

You are a reliable, contributing member of a team. In our fast-paced environment, you are expected to adapt, take ownership and consistently deliver quality work that drives value for our clients and success as a team.

Skills
Examples of the skills, knowledge, and experiences you need to lead and deliver value at this level include but are not limited to:
Apply a learning mindset and take ownership for your own development.
Appreciate diverse perspectives, needs, and feelings of others.
Adopt habits to sustain high performance and develop your potential.
Actively listen, ask questions to check understanding, and clearly express ideas.
Seek, reflect, act on, and give feedback.
Gather information from a range of sources to analyse facts and discern patterns.
Commit to understanding how the business works and building commercial awareness.
Learn and apply professional and technical standards (e.g. refer to specific PwC tax and audit guidance), and uphold the Firm's code of conduct and independence requirements.

Role: Specialist
Tower: Data Analytics & Insights Managed Service
Experience: 1 - 3 years
Key Skills: Data Engineering
Educational Qualification: Bachelor's degree in computer science/IT or relevant field
Work Location: Bangalore, India

Job Description
As a Specialist, you'll work as part of a team of problem solvers, helping to solve complex business issues from strategy to execution by using Data, Analytics & Insights skills. PwC Professional skills and responsibilities for this management level include but are not limited to:
Use feedback and reflection to develop self-awareness and personal strengths, and address development areas.
Be flexible to work in stretch opportunities/assignments.
Demonstrate critical thinking and the ability to bring order to unstructured problems.
Review ticket quality and deliverables; provide status reporting for the project.
Adhere to SLAs, with experience in incident management, change management and problem management.
Review your work and that of others for quality, accuracy, and relevance.
Know how and when to use the tools available for a given situation, and explain the reasons for this choice.
Seek and embrace opportunities that give exposure to different situations, environments, and perspectives.
Use straightforward communication, in a structured way, when influencing and connecting with others.
Read situations and modify behavior to build quality relationships.
Uphold the firm's code of ethics and business conduct.
Demonstrate leadership capabilities by working with clients directly and leading the engagement.
Work in a team environment that includes client interactions, workstream management, and cross-team collaboration.
Be a good team player; take up cross-competency work and contribute to COE activities.
Handle escalation and risk management.
Position Requirements
Required Skills:
Primary skills: ETL/ELT, SQL, Informatica, Python
Secondary skills: Azure/AWS/GCP, Talend, DataStage, etc.

Data Engineer
Should have a minimum of 1 year of Operate/Managed Services/Production Support experience.
Should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines to ingest, store, collect, standardize, and integrate data for downstream consumption by Business Intelligence systems, analytics modelling, data scientists, etc.
Designing and implementing data pipelines to extract, transform, and load (ETL) data from various sources into data storage systems, such as data warehouses or data lakes.
Should have experience in building efficient ETL/ELT processes using industry-leading tools like Informatica, Talend, SSIS, SSRS, AWS, Azure, ADF, GCP, Snowflake, Spark, SQL, Python, etc.
Should have hands-on experience with data analytics tools like Informatica, Collibra, Hadoop, Spark, Snowflake, etc.
Monitoring and troubleshooting data pipelines and resolving issues related to data processing, transformation, or storage.
Implementing and maintaining data security and privacy measures, including access controls and encryption, to protect sensitive data.
Advanced working SQL knowledge and experience working with relational databases, query authoring (SQL), as well as working familiarity with a variety of databases.
Scaling and optimizing schemas and performance-tuning SQL and ETL pipelines in data lake and data warehouse environments.
Should have experience with ITIL processes like incident management, problem management, knowledge management, release management, data DevOps, etc.
Should have strong communication, problem-solving, quantitative and analytical abilities.

Nice To Have
Certifications in cloud technology are an added advantage.
Experience in visualization tools like Power BI, Tableau, Qlik, etc.

Managed Services - Data, Analytics & Insights
At PwC we relentlessly focus on working with our clients to bring the power of technology and humans together and create simple, yet powerful solutions. We imagine a day when our clients can simply focus on their business, knowing that they have a trusted partner for their IT needs. Every day we are motivated and passionate about making our clients better. Within our Managed Services platform, PwC delivers integrated services and solutions that are grounded in deep industry experience and powered by the talent that you would expect from the PwC brand. The PwC Managed Services platform delivers scalable solutions that add greater value to our clients' enterprises through technology and human-enabled experiences. Our team of highly skilled and trained global professionals, combined with the use of the latest advancements in technology and process, allows us to provide effective and efficient outcomes. With PwC's Managed Services our clients are able to focus on accelerating their priorities, including optimizing operations and accelerating outcomes. PwC brings a consultative-first approach to operations, leveraging our deep industry insights combined with world-class talent and assets to enable transformational journeys that drive sustained client outcomes. Our clients need flexible access to world-class business and technology capabilities that keep pace with today's dynamic business environment.
Within our global Managed Services platform, we provide a Data, Analytics & Insights Managed Service in which we focus on the evolution of our clients' data, analytics, insights and cloud portfolio. Our focus is to empower our clients to navigate and capture the value of their application portfolio while cost-effectively operating and protecting their solutions. We do this so that our clients can focus on what matters most to their business: accelerating growth that is dynamic, efficient and cost-effective.

As a member of our Data, Analytics & Insights Managed Service team, we are looking for candidates who thrive working in a high-paced work environment, capable of working on a mix of critical Application Evolution Service offerings and engagements, including help desk support, enhancement and optimization work, as well as strategic roadmap and advisory-level work. It will also be key to lend experience and effort to helping win and support customer engagements from not only a technical perspective, but also a relationship perspective.
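For the production-support side of this role (incident handling around pipeline failures), a standard-library-only sketch of a retry wrapper for an ETL step; the step itself is a hypothetical stub.

import logging
import time

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("etl")

def run_with_retry(step, attempts=3, delay_seconds=60):
    """Run one pipeline step, retrying transient failures before escalating."""
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception:
            log.exception("%s failed (attempt %d of %d)", step.__name__, attempt, attempts)
            if attempt == attempts:
                raise  # final failure: surface to the incident-management process
            time.sleep(delay_seconds)

def load_customers():
    # Hypothetical extract/transform/load body for one source system.
    log.info("loading customers...")

run_with_retry(load_customers)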

Posted 3 weeks ago

Apply

12.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY GDS – Data and Analytics (D&A) – Sr. Manager – Azure Data Architect
As part of our EY-GDS D&A (Data and Analytics) team, we help our clients solve complex business challenges with the help of data and technology. We dive deep into data to extract the greatest value and discover opportunities in key businesses and functions like Banking, Insurance, Healthcare, Retail, Manufacturing and Auto, Supply Chain, and Finance.

The opportunity
We’re looking for Managers (Big Data Architects) with strong technology and data understanding and proven delivery capability. This is a fantastic opportunity to be part of a leading firm as well as a growing Data and Analytics team.

Your Key Responsibilities
Develop standardized practices for delivering new products and capabilities using Big Data and cloud technologies, including data acquisition, transformation, analysis, modelling, governance and data management skills.
Interact with senior client technology leaders, understand their business goals, create and propose solutions, estimate effort, build architectures, and develop and deliver technology solutions.
Define and develop client-specific best practices around data management within a cloud environment.
Recommend design alternatives for data ingestion, processing and provisioning layers.
Design and develop data ingestion programs to process large data sets in batch mode using ADB, ADF, PySpark, Python, and Synapse.
Develop data ingestion programs to ingest real-time data from live sources using Apache Kafka, Spark Streaming and related technologies.
Have managed teams and have experience in end-to-end delivery.
Have experience building technical capability and teams to deliver.

Skills And Attributes For Success
Strong understanding of and familiarity with all Cloud ecosystem components
Strong understanding of underlying Cloud architectural concepts and distributed computing paradigms
Experience in the development of large-scale data processing
Hands-on programming experience in ADB, ADF, Synapse, Python, PySpark, SQL
Hands-on expertise in cloud services like AWS and/or the Microsoft Azure ecosystem
Solid understanding of ETL methodologies in a multi-tiered stack with data modelling and data governance
Experience with BI and data analytics databases
Experience in converting business problems/challenges to technical solutions considering security, performance, scalability, etc.
Experience in enterprise-grade solution implementations
Experience in performance benchmarking enterprise applications
Strong stakeholder, client, team, process and delivery management skills

To qualify for the role, you must have
A flexible and proactive/self-motivated working style with strong personal ownership of problem resolution
Excellent verbal and written communication skills, formal and informal
Ability to multi-task under pressure and work independently with minimal supervision
A team-player mindset: enjoy working in a cooperative and collaborative team environment
Adaptability to new technologies and standards
Participation in all aspects of the Big Data solution delivery life cycle, including analysis, design, development, testing, production deployment, and support
Minimum 12 years of hands-on experience in one or more of the above areas
Minimum 14 years of industry experience

Ideally, you’ll also have
Project management skills
Client management skills
Solutioning skills

What We Look For
People with technical experience and enthusiasm to learn new things in this fast-moving environment.

What Working At EY Offers
At EY, we’re dedicated to helping our clients, from start-ups to Fortune 500 companies — and the work we do with them is as varied as they are. You get to work with inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees and you will be able to control your own development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
Support, coaching and feedback from some of the most engaging colleagues around
Opportunities to develop new skills and progress your career
The freedom and flexibility to handle your role in a way that’s right for you

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 3 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Before applying for a job, select your language preference from the options available at the top right of this page. Discover your next opportunity within an organization that ranks among the 500 largest companies in the world. Envision innovative opportunities, experience our rewarding culture, and work with talented teams that push you to grow every day. We know what it takes to lead UPS into the future: passionate people with a unique combination of skills. If you have the qualities, drive, autonomy or leadership to lead teams, there are roles suited to your aspirations and skills, today and tomorrow.

About UPS
Job description:
UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™
The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About The Role
We are seeking a Data Developer to join our data engineering team, responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. You will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure.

Primary Skills
Data Engineering: Azure Data Factory (ADF), Azure Databricks.
Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB).
Data Modeling: NoSQL data modeling, data warehousing concepts.
Performance Optimization: Data pipeline performance tuning and cost optimization.
Programming Languages: Python, SQL, PySpark.

Secondary Skills
DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation.
Security and Compliance: Implementing data security and governance standards.
Agile Methodologies: Experience in Agile/Scrum environments.

Soft Skills
Strong problem-solving abilities and attention to detail.
Excellent communication skills, both verbal and written.
Effective time management and organizational capabilities.
Ability to work independently and within a collaborative team environment.
Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications
Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
Relevant certifications in Azure and Data Engineering, such as:
Microsoft Certified: Azure Data Engineer Associate
Microsoft Certified: Azure Solutions Architect Expert
Databricks Certified Data Engineer Associate or Professional

Contract type: permanent (CDI). At UPS, equal opportunity, fair treatment and an inclusive work environment are key values to which we are committed.
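To make the Cosmos DB / NoSQL modeling requirement concrete, a minimal azure-cosmos sketch of an upsert and a point read; the account, key, database, container, and partition key are assumptions, not details from the posting.

from azure.cosmos import CosmosClient

client = CosmosClient(
    url="https://example-account.documents.azure.com:443/",  # assumed account
    credential="example-account-key",                        # assumed key
)
container = (client.get_database_client("supplychain")
                   .get_container_client("shipments"))

# Assumed model: container partitioned on /accountId so point reads stay cheap.
container.upsert_item({
    "id": "SHIP-0001",
    "accountId": "ACME",
    "status": "IN_TRANSIT",
    "lastEvent": "DEPARTED_FACILITY",
})

item = container.read_item(item="SHIP-0001", partition_key="ACME")
print(item["status"])

Choosing a high-cardinality partition key is the main NoSQL modeling decision here; it spreads load and keeps reads by id plus partition key at the lowest request-unit cost.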

Posted 3 weeks ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Before you apply to a job, select your language preference from the options available at the top right of this page. Explore your next opportunity at a Fortune Global 500 organization. Envision innovative possibilities, experience our rewarding culture, and work with talented teams that help you become better every day. We know what it takes to lead UPS into tomorrow—people with a unique combination of skill + passion. If you have the qualities and drive to lead yourself or teams, there are roles ready to cultivate your skills and take you to the next level.

Job Description
About UPS
UPS is a global leader in logistics, offering a broad range of solutions that include transportation, distribution, supply chain management, and e-commerce. Founded in 1907, UPS operates in over 220 countries and territories, delivering packages and providing specialized services worldwide. Our mission is to enable commerce by connecting people, places, and businesses, with a strong focus on sustainability and innovation.

About UPS Supply Chain Symphony™
The UPS Supply Chain Symphony™ platform is a cloud-based solution that seamlessly integrates key supply chain components, including shipping, warehousing, and inventory management, into a unified platform. This solution empowers businesses by offering enhanced visibility, advanced analytics, and customizable dashboards to streamline global supply chain operations and decision-making.

About The Role
We are seeking a Data Developer to join our data engineering team, responsible for building and maintaining complex data solutions using Azure Data Factory (ADF), Azure Databricks, and Cosmos DB. The role involves designing and developing scalable data pipelines, implementing data transformations, and ensuring high data quality and performance. You will work closely with data architects, testers, and analysts to deliver robust data solutions that support strategic business initiatives. The ideal candidate should possess deep expertise in big data technologies, data integration, and cloud-native data engineering solutions on Microsoft Azure.

Primary Skills
Data Engineering: Azure Data Factory (ADF), Azure Databricks.
Cloud Platform: Microsoft Azure (Data Lake Storage, Cosmos DB).
Data Modeling: NoSQL data modeling, data warehousing concepts.
Performance Optimization: Data pipeline performance tuning and cost optimization.
Programming Languages: Python, SQL, PySpark.

Secondary Skills
DevOps and CI/CD: Azure DevOps, CI/CD pipeline design and automation.
Security and Compliance: Implementing data security and governance standards.
Agile Methodologies: Experience in Agile/Scrum environments.

Soft Skills
Strong problem-solving abilities and attention to detail.
Excellent communication skills, both verbal and written.
Effective time management and organizational capabilities.
Ability to work independently and within a collaborative team environment.
Strong interpersonal skills to engage with cross-functional teams.

Educational Qualifications
Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field.
Relevant certifications in Azure and Data Engineering, such as:
Microsoft Certified: Azure Data Engineer Associate
Microsoft Certified: Azure Solutions Architect Expert
Databricks Certified Data Engineer Associate or Professional

Employee Type: Permanent
UPS is committed to providing a workplace free of discrimination, harassment, and retaliation.

Posted 3 weeks ago

Apply