
1301 ADF Jobs - Page 10

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 8.0 years

0 Lacs

Bihar

On-site

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients’ most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

Mandatory Qualifications:
- Oracle ADF: Proven expertise in developing applications using ADF, ADF Faces, and ADF Task Flows.
- Java/J2EE: Strong foundation in Java programming and J2EE technologies for building enterprise-level applications.
- Web Technologies: Proficiency in JavaScript, HTML5, and CSS for front-end development.
- Web Services: Experience with building and consuming web services.
- MVC Architecture: Understanding of and experience with implementing Model-View-Controller (MVC) patterns using ADF.
- JDeveloper: Proficiency in using JDeveloper for development and debugging.
- Application Server: Experience with the WebLogic application server.
- Experience with systems support.
- Degree in Information Technology.
Wipro is an egalitarian company that offers employment opportunities to all people, running a selection process that does not consider race, gender, nationality, ancestry, disability, sexual orientation, or any other status protected by applicable law.

Job Description

Mandatory Skills: Oracle Public Sector Revenue Management.
Experience: 5-8 years.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention. Of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA - as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.

Posted 1 week ago

Apply

9.0 - 14.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Hi all, please find the JD below.

Position: Oracle Fusion Technical
Location: Hyderabad
Experience: 8 to 15 years
Position type: Full-time/C2H
Education: BTech/MTech/MCA/BSc/BCA

Job Description: Technical Consultant
- Should have worked on FBDI templates
- Worked on either SCM or CRM
- Oracle SOA/BPEL/OIC is a must
- Should have knowledge of Data Integrator
- Working experience with OTBI/BI Publisher
- Working experience with data loading, data migration, or conversions
- Knowledge of Business Units, Legal Entities, and Chart of Accounts
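The FBDI requirement above refers to Oracle Fusion's File-Based Data Import, in which data is prepared as CSV files and zipped into a package for upload. A minimal Python sketch of building such a package; the file name `ItemImport.csv` and the column layout are invented for illustration, not a real FBDI template.

```python
import csv
import io
import zipfile

# Illustrative rows to import (made-up item data, not a real FBDI layout)
rows = [
    ["ITEM-001", "Widget", "100"],
    ["ITEM-002", "Gadget", "250"],
]

# Write the rows as CSV in memory
buf = io.StringIO()
csv.writer(buf).writerows(rows)

# Zip the CSV into an upload package, as FBDI loads expect
archive = io.BytesIO()
with zipfile.ZipFile(archive, "w") as zf:
    zf.writestr("ItemImport.csv", buf.getvalue())

# Read the package back to confirm its contents
with zipfile.ZipFile(archive) as zf:
    names = zf.namelist()
    payload = zf.read("ItemImport.csv").decode()

print(names)  # ['ItemImport.csv']
```

In a real load, the zip would then be uploaded to Fusion and processed by the relevant import job; the packaging step itself is this simple.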

Posted 1 week ago

Apply

0 years

20 - 25 Lacs

Pune, Maharashtra, India

On-site

We are a fast-growing data-analytics consultancy dedicated to the Life Sciences / Pharmaceutical commercial analytics space. Our teams build cloud-native data platforms that power sales, marketing, and patient-centric insights for leading global pharma brands, delivering compliant, high-impact solutions at enterprise scale.

Role & Responsibilities
- Architect, build, and tune Snowflake data warehouses and ELT pipelines (SQL, Streams, Tasks, UDFs, Stored Procedures) to meet complex commercial-analytics workloads.
- Integrate diverse pharma data sources (Veeva, Salesforce, IQVIA, Symphony, RWD, patient-services feeds) via Fivetran, ADF, or Python-based frameworks, ensuring end-to-end data quality.
- Establish robust data models (star, snowflake, Data Vault) optimized for sales reporting, market-share analytics, and AI/ML use-cases.
- Drive governance & compliance (HIPAA, GDPR, GxP) through fine-grained access controls, masking, lineage, and metadata management.
- Lead code reviews, mentor engineers, and resolve performance bottlenecks while right-sizing compute for cost efficiency.
- Partner with business stakeholders to translate commercial objectives into scalable data solutions and actionable insights.

Skills & Qualifications

Must-Have
- 7+ years of data-engineering / warehousing experience, including 4+ years of hands-on Snowflake design and development.
- Expert-level SQL plus strong data-modeling (Dimensional, Data Vault) and ETL/ELT optimisation skills.
- Proficiency in Python (or similar) for automation, API integrations, and orchestration.
- Proven governance/security acumen within regulated industries (HIPAA, GDPR, PII).
- Bachelor’s in Computer Science, Engineering, or Information Systems (Master’s preferred).
- Strong client-facing communication and problem-solving ability in fast-paced, agile environments.

Preferred
- Direct experience with pharma commercial datasets (sales, CRM, claims, MDM, adherence KPIs).
- Cloud-platform depth (AWS, Azure, or GCP) and familiarity with tools such as Matillion/dbt/Airflow and Git.
- Snowflake certifications (SnowPro Core / Advanced) plus Tableau, Power BI, or Qlik connectivity know-how.

Skills: Snowflake, AWS, analytics, sales, SQL, data, ETL/ELT optimization, Python, data warehousing, Azure, data modeling, data governance, cloud
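The data-modelling responsibility above centres on star schemas for market-share analytics. A minimal sketch of the idea using SQLite as a stand-in for a Snowflake warehouse; the table names and sales figures are invented for illustration.

```python
import sqlite3

# Tiny star-schema slice: one dimension, one fact table (made-up data)
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, brand TEXT);
CREATE TABLE fact_sales (
    product_id INTEGER REFERENCES dim_product(product_id),
    units_sold INTEGER
);
INSERT INTO dim_product VALUES (1, 'BrandA'), (2, 'BrandB');
INSERT INTO fact_sales VALUES (1, 60), (1, 40), (2, 100);
""")

# Market share = a brand's units as a fraction of total units sold
cur.execute("""
SELECT p.brand,
       1.0 * SUM(f.units_sold) / (SELECT SUM(units_sold) FROM fact_sales) AS share
FROM fact_sales f
JOIN dim_product p USING (product_id)
GROUP BY p.brand
ORDER BY p.brand
""")
rows = cur.fetchall()
print(rows)  # [('BrandA', 0.5), ('BrandB', 0.5)]
```

The same shape scales to real warehouses: facts stay narrow and additive, dimensions carry the descriptive attributes used for grouping.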

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Looking for a dynamic, energetic, and self-driven Azure Data Engineer to design, implement, and maintain scalable data solutions. The role focuses on integrating data from multiple sources across all business units within India for Saint-Gobain to enable seamless analytics and decision-making.

Key Responsibilities:
- Designing, building, and managing data pipelines using Azure Data Factory and Azure Databricks
- Creating and managing tables in Azure Databricks Unity Catalog schemas and writing complex SQL queries for ETL and unit testing
- Monitoring and troubleshooting pipeline failures and fixing issues

Key Technical Skills: Azure Databricks (preferably with knowledge of Unity Catalog, Foreign Catalog, SQL warehouses, and PySpark), ADLS Gen2, ADF. Good to have: Microsoft DP-203 and/or DP-900 certification (not mandatory).

Key Soft Skills: Strong team player with excellent interpersonal skills; business-process depth in a manufacturing set-up is preferred.

Experience: 3-5 years of relevant experience in Azure data engineering or similar roles.

Women candidates are encouraged to apply.
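The pipeline duties above (build in ADF, monitor, recover from failures) can be sketched in plain Python. The activity functions and the retry wrapper below are illustrative assumptions mirroring ADF's activity retry policy, not ADF's actual API.

```python
import time

def run_with_retry(activity, *args, retries=3, delay=0.0):
    """Re-run a failing activity, as an ADF retry policy would."""
    for attempt in range(1, retries + 1):
        try:
            return activity(*args)
        except Exception:
            if attempt == retries:
                raise  # surface the failure for monitoring/alerting
            time.sleep(delay)

# Hypothetical extract -> transform -> load activities with made-up data
def extract():
    return [{"unit": "glass", "sales": 120}, {"unit": "gypsum", "sales": 80}]

def transform(rows):
    # e.g. keep only units above a sales threshold
    return [r for r in rows if r["sales"] >= 100]

loaded = []
def load(rows):
    loaded.extend(rows)
    return len(rows)

rows = run_with_retry(extract)
count = run_with_retry(load, run_with_retry(transform, rows))
print(count)  # 1 row loaded
```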

Posted 1 week ago

Apply

0.0 - 8.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Job Profile – Lead Data Engineer

Does working with data on a day-to-day basis excite you? Are you interested in building robust data architecture to identify data patterns and optimise data consumption for our customers, who will forecast and predict what actions to undertake based on data? If this is what excites you, then you’ll love working in our intelligent automation team. Schneider AI Hub is leading the AI transformation of Schneider Electric by building AI-powered solutions. We are looking for a savvy Data Engineer to join our growing team of AI and machine learning experts. You will be responsible for expanding and optimizing our data and data pipeline architecture, as well as optimizing data flow and collection for cross-functional teams. The ideal candidate is an experienced data pipeline builder and data wrangler who enjoys optimizing data systems and building them from the ground up. The Data Engineer will support our software engineers, data analysts and data scientists on data initiatives and will ensure optimal data delivery architecture is consistent throughout ongoing projects. They must be self-directed and comfortable supporting the data needs of multiple teams, systems and products.

Responsibilities
- Create and maintain optimal data pipeline architecture; assemble large, complex data sets that meet functional and non-functional requirements.
- Design the right schema to support the functional requirements and consumption patterns.
- Design and build production data pipelines from ingestion to consumption.
- Create the necessary preprocessing and postprocessing for various forms of data for training/retraining and inference ingestions as required.
- Create data visualization and business intelligence tools for stakeholders and data scientists for necessary business/solution insights.
- Identify, design, and implement internal process improvements: automating manual data processes, optimizing data delivery, etc.
- Ensure our data is separated and secure across national boundaries through multiple data centers.

Requirements and Skills
- You should have a bachelor’s or master’s degree in computer science, information technology or other quantitative fields.
- You should have at least 8 years working as a data engineer supporting large data transformation initiatives related to machine learning, with experience in building and optimizing pipelines and data sets.
- Strong analytic skills related to working with unstructured datasets.
- Experience with Azure cloud services: ADF, ADLS, HDInsight, Databricks, App Insights, etc.
- Experience in handling ETLs using Spark.
- Experience with object-oriented/object-function scripting languages: Python, PySpark, etc.
- Experience with big data tools: Hadoop, Spark, Kafka, etc.
- Experience with data pipeline and workflow management tools: Azkaban, Luigi, Airflow, etc.
- You should be a good team player, committed to the success of the team and the overall project.

About Us
Schneider Electric™ creates connected technologies that reshape industries, transform cities and enrich lives. Our 144,000 employees thrive in more than 100 countries. From the simplest of switches to complex operational systems, our technology, software and services improve the way our customers manage and automate their operations. Great people make Schneider Electric a great company. We seek out and reward people for putting the customer first, being disruptive to the status quo, embracing different perspectives, continuously learning, and acting like owners. We want our employees to reflect the diversity of the communities in which we operate. We welcome people as they are, creating an inclusive culture where all forms of diversity are seen as a real value for the company. We’re looking for people with a passion for success, on the job and beyond.

Primary Location: IN-Karnataka-Bangalore
Schedule: Full-time
Unposting Date: Ongoing
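The workflow-management tools named above (Azkaban, Luigi, Airflow) all run tasks in dependency order. A toy sketch of that core idea using Python's standard-library topological sorter; the task names are made up for illustration.

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on (a tiny DAG)
dag = {
    "ingest": set(),
    "clean": {"ingest"},
    "train": {"clean"},
    "report": {"clean"},
}

# A valid execution order: every task appears after all of its dependencies
order = list(TopologicalSorter(dag).static_order())
print(order)
```

Real orchestrators add scheduling, retries, and parallelism on top, but dependency resolution is the heart of all of them.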

Posted 1 week ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

On-site

About The Role
We are seeking a highly skilled and experienced Senior Power BI Engineer to join our Data & Analytics team. You will play a pivotal role in designing, developing, and optimizing advanced Power BI reports and dashboards that empower business users with actionable insights. The ideal candidate will have deep expertise in Power BI, data modelling, and visualization, combined with a strong understanding of modern cloud data platforms, especially within the Azure ecosystem.

Key Responsibilities
- Design, develop, and maintain scalable and high-performing Power BI reports and dashboards based on the Global Data Warehouse and Lakehouse environment.
- Collaborate closely with data engineers, data scientists, and business stakeholders to translate business requirements into technical BI solutions.
- Optimize Power BI datasets leveraging Azure Synapse Analytics and Databricks-processed data to ensure efficient query performance.
- Develop and maintain robust data models, DAX calculations, and custom visualizations in Power BI to deliver actionable insights.
- Implement best practices for report design, data security (Row-Level Security), and governance to ensure compliance and data integrity.
- Troubleshoot, debug, and resolve performance and data quality issues in Power BI reports and datasets.
- Mentor and provide technical guidance to junior BI developers and analysts.
- Stay up to date with the latest Power BI features and Azure Synapse ecosystem enhancements to continuously improve BI solutions.
- Support end-user training and documentation to promote self-service BI adoption.

Required Qualifications
- Bachelor’s or master’s degree in computer science, information systems, data science, or a related field.
- 5+ years of experience in Power BI report development and data visualization.
- Strong proficiency in Power BI Desktop, Power Query (M), DAX, and Power BI Service.
- Hands-on experience with Azure Synapse Analytics, Azure Data Factory (ADF), and Databricks.
- Deep understanding of data warehousing concepts, dimensional modelling, and ETL/ELT processes.
- Experience optimizing performance of Power BI reports connected to large-scale data warehouses and lakehouses.
- Knowledge of security implementations within Power BI, including Row-Level Security (RLS) and workspace permissions.
- Strong SQL skills for data querying and debugging.
- Excellent problem-solving skills and ability to work effectively with cross-functional teams.
- Strong communication skills to engage with business users and technical teams.

Preferred Qualifications
- Microsoft Power BI certification (e.g., DA-100 / PL-300).
- Experience with other Azure data services (Azure Data Lake Storage, Azure Synapse Pipelines).
- Familiarity with Python or Spark for data processing in Databricks.
- Exposure to Agile development methodologies and CI/CD pipelines for BI.
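Row-Level Security, mentioned above, filters rows by the report viewer's identity before any measure is evaluated. A pure-Python sketch of the concept; the user-to-region mapping and figures are made up, and this is not Power BI's actual RLS syntax (which is defined as DAX filter rules on roles).

```python
# Made-up sales table and user-to-region assignments
sales = [
    {"region": "East", "amount": 100},
    {"region": "West", "amount": 250},
    {"region": "East", "amount": 50},
]
user_region = {"alice@example.com": "East", "bob@example.com": "West"}

def visible_total(user):
    """Total sales as seen by a given user: rows are filtered first,
    then the measure (a sum here) is computed over what remains."""
    region = user_region[user]
    return sum(r["amount"] for r in sales if r["region"] == region)

print(visible_total("alice@example.com"))  # 150
print(visible_total("bob@example.com"))    # 250
```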

Posted 1 week ago

Apply

1.0 - 5.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Job Title: Data Science Analyst, S&C GN
Management Level: Analyst
Location: Bangalore / Gurugram / Mumbai / Hyderabad / Chennai
Must-have skills: Gen AI / ML, SQL, Python, Azure / AWS, MLOps
Good-to-have skills: Experience in data science projects focused on Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products

Job Summary: We are seeking a highly skilled and motivated Data Science Analyst to work on innovative projects and drive impactful solutions in domains such as Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products. This role requires hands-on technical expertise and client delivery management skills to execute cutting-edge projects in Generative AI, data science, and cloud-based analytics.

Key Responsibilities

1. Data Science and Engineering
- Perform advanced analytics using Python, SQL, and PySpark with machine learning frameworks.
- Develop predictive models, recommendation systems, and optimization solutions tailored to business needs.
- Manage and preprocess large, complex datasets, ensuring efficient pipelines and advanced feature engineering across structured and unstructured data.
- Build MLOps pipelines for model training/retraining, monitoring, and scalability.

2. Dashboard and Reporting
- Develop dashboards, reports, and insights in Power BI/Tableau to track the impact of deployed models on business outcomes.
- Present results and recommendations to stakeholders, leveraging data storytelling to drive decision-making.

3. Cloud Platform Expertise
- Design and implement end-to-end data science workflows on cloud platforms (e.g., AWS, Azure, GCP) for business-critical projects.
- Leverage cloud-native tools and services (e.g., Databricks, ADF, Lambda, Glue, Azure ML) for training, deploying, and monitoring machine learning models at scale.

4. Generative AI Expertise
- Lead the development of Generative AI-based applications and solutions leveraging frameworks like LangChain and LlamaIndex.
- Drive model evaluation strategies using advanced metrics (e.g., BLEU, ROUGE, FID) and iteratively optimize performance for production-grade applications.
- Architect deployment solutions, including API development and seamless integration with existing systems.

Required Qualifications
- Experience: 1-5 years in data science.
- Education: Bachelor’s/master’s degree in computer science, statistics, applied mathematics, or a related field.
- Industry Knowledge: Experience in Utilities, Oil & Gas, Mining, Manufacturing, Chemicals, and Forest Products preferred.

Technical Skills
- Programming: Proficiency in Python, SQL, and PySpark.
- GenAI Expertise: Hands-on experience building GenAI-based applications and solutions, and deploying GenAI applications in production.
- Cloud Platforms: Experience with Azure / AWS / GCP.
- Visualization Tools: Power BI / Tableau.

Preferred Skills
- Strong analytical and problem-solving skills with a results-oriented mindset.
- Good communication and stakeholder management capabilities.
- Very good at generating business insights and presenting them to stakeholders.

About Our Company | Accenture
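The evaluation metrics named above (BLEU, ROUGE) are built on n-gram overlap between generated text and a reference. A minimal unigram-precision sketch of that core idea; real BLEU additionally combines several n-gram orders with count clipping and a brevity penalty.

```python
from collections import Counter

def unigram_precision(candidate, reference):
    """Fraction of candidate tokens that also appear in the reference,
    with each token's credit clipped by its count in the reference."""
    cand_tokens = candidate.split()
    if not cand_tokens:
        return 0.0
    ref_counts = Counter(reference.split())
    matched = sum(min(c, ref_counts[tok])
                  for tok, c in Counter(cand_tokens).items())
    return matched / len(cand_tokens)

score = unigram_precision("the cat sat on the mat", "the cat is on the mat")
print(round(score, 3))  # 5 of 6 candidate tokens match -> 0.833
```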

Posted 1 week ago

Apply

4.0 years

0 Lacs

Gurugram, Haryana, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.

Your Key Responsibilities
As a Senior AI/ML Engineer, you will:
- Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights and build comprehensive Risk & Controls monitoring mechanisms.
- Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables.
- Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data-led Risk technology platform.

Skills and Summary of Accountabilities:
- Develop robust AI/ML models to drive valuable insights and enhance decision-making across multiple business lines.
- Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and Vector Databases, comprehensive DevOps services, and full-stack application development.
- Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
- Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
- Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
- Apply advanced machine learning techniques (like SVM, Decision Trees, Neural Networks, Random Forests, Gradient Boosting, etc.) to complex problems.
- Ensure adherence to ethical AI guidelines and data governance policies.
- Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
- Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services.
- Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring EY remains at the forefront of innovation and maintains a competitive edge in the industry.
- Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
- Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution which can then be delivered.
- Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives.
- Willingness to learn and apply new and cutting-edge AI technologies to deliver insight-driven business solutions.

To qualify for the role you must have:
- 4+ years of working experience in large-scale AI/ML models and data science.
- Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
- Proficiency in AI/ML programming languages like Python, R, and SQL.
- Proficiency with a deep learning framework such as TensorFlow, PyTorch, or Keras.
- Expertise in machine learning algorithms and data mining techniques (like SVM, Decision Trees, and deep learning algorithms).
- Experience implementing monitoring and logging tools to ensure AI model performance and reliability.
- Strong programming skills in Python, including machine learning libraries such as Scikit-learn, Pandas, and NumPy.
- Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
- Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML, AI, ADF, etc.
- Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps & SharePoint.
- Strong understanding of large enterprise applications like SAP, Oracle ERP & Microsoft Dynamics.

Ideally, you'll also have:
- A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or related disciplines.
- Relevant certifications (considered a plus).
- A self-driven and creative approach to problem-solving, enjoying the fast-paced world of software development and performing well in a team.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
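The techniques listed in this posting (SVM, Decision Trees, Gradient Boosting) all learn a decision rule from labelled data. As a minimal illustration, here is a one-feature decision stump, the building block of tree ensembles, fitted to made-up data; the feature and label meanings are invented.

```python
def fit_stump(xs, ys):
    """Pick the threshold on x that misclassifies the fewest points
    under the rule: predict 1 if x >= threshold, else 0."""
    best = None
    for t in sorted(set(xs)):
        preds = [1 if x >= t else 0 for x in xs]
        errs = sum(p != y for p, y in zip(preds, ys))
        if best is None or errs < best[1]:
            best = (t, errs)
    return best[0]

xs = [1.0, 2.0, 3.0, 8.0, 9.0, 10.0]   # feature (e.g. a hypothetical risk score)
ys = [0,   0,   0,   1,   1,   1]      # label (e.g. control breach yes/no)
threshold = fit_stump(xs, ys)
print(threshold)  # 8.0 separates the two classes with zero error
```

Gradient boosting repeatedly fits such weak learners to the residual errors of the current ensemble, which is why a correct stump fitter is the natural starting point.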

Posted 1 week ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.

Your Key Responsibilities
As a Senior AI/ML Engineer, you will:
- Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights and build comprehensive Risk & Controls monitoring mechanisms.
- Create executive reporting, proposals, and marketing material, ensuring the highest-quality deliverables.
- Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data-led Risk technology platform.

Skills and Summary of Accountabilities:
- Develop robust AI/ML models to drive valuable insights and enhance decision-making across multiple business lines.
- Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and Vector Databases, comprehensive DevOps services, and full-stack application development.
- Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
- Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
- Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
- Apply advanced machine learning techniques (like SVM, Decision Trees, Neural Networks, Random Forests, Gradient Boosting, etc.) to complex problems.
- Ensure adherence to ethical AI guidelines and data governance policies.
- Utilize expertise in prompt engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
- Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI- and analytics-driven improvements in the bank's operations and services.
- Stay up to date with the latest advancements in AI, analytics, and prompt engineering, ensuring EY remains at the forefront of innovation and maintains a competitive edge in the industry.
- Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
- Intellectual strength and flexibility to resolve complex problems and rationalise them into a workable solution which can then be delivered.
- Develop current and relevant client propositions, delivering timely, high-quality output against stated project objectives.
- Willingness to learn and apply new and cutting-edge AI technologies to deliver insight-driven business solutions.

To qualify for the role you must have:
- 4+ years of working experience in large-scale AI/ML models and data science.
- Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
- Proficiency in AI/ML programming languages like Python, R, and SQL.
- Proficiency with a deep learning framework such as TensorFlow, PyTorch, or Keras.
- Expertise in machine learning algorithms and data mining techniques (like SVM, Decision Trees, and deep learning algorithms).
- Experience implementing monitoring and logging tools to ensure AI model performance and reliability.
- Strong programming skills in Python, including machine learning libraries such as Scikit-learn, Pandas, and NumPy.
- Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
- Ability to automate tasks through Python scripting, databases, and other advanced technologies such as Databricks, Synapse, ML, AI, ADF, etc.
- Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps & SharePoint.
- Strong understanding of large enterprise applications like SAP, Oracle ERP & Microsoft Dynamics.

Ideally, you'll also have:
- A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or related disciplines.
- Relevant certifications (considered a plus).
- A self-driven and creative approach to problem-solving, enjoying the fast-paced world of software development and performing well in a team.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, we’re all in to shape your future with confidence. We’ll help you succeed in a globally connected powerhouse of diverse teams and take your career wherever you want it to go. Join EY and help to build a better working world.

EY_Consulting_Microsoft_Fabric - Manager

As part of our EY-DnA team, you will be responsible for designing, developing, and maintaining distributed systems using Microsoft Fabric, including OneLake, Azure Data Factory (ADF), Azure Synapse, Notebooks, Data Warehouse, and Lakehouse. You will play a crucial role in architecting and implementing enterprise data platforms and data management practices, ensuring the delivery of high-quality solutions that meet business requirements. You will collaborate with system architects, business analysts, and stakeholders to understand their requirements and convert them into technical designs. Your role will involve designing, building, testing, deploying, and maintaining robust integration architectures, services, and workflows.

In this role, you will:
- Design, develop, and implement ETL pipelines using Azure Data Factory to extract, transform, and load data from various sources into target systems.
- Architect and implement Azure Synapse, Data Warehouse, and Lakehouse solutions, ensuring scalability, performance, and reliability.
- Utilize Notebooks and Spark for data analysis, processing, and visualization to derive actionable insights from large datasets.
- Define and implement enterprise data platform architecture, including the creation of gold, silver, and bronze datasets for downstream use.
- Apply hands-on development experience in cloud-based big data technologies, including Azure, Power Platform, and Microsoft Fabric/Power BI, leveraging languages such as SQL, PySpark, DAX, Python, and Power Query.
- Design and develop BI reports and dashboards by understanding the business requirements, designing the data model, and developing visualizations that provide actionable insights.
- Collaborate effectively with key stakeholders and other developers to understand business requirements, provide technical expertise, and deliver solutions that meet project objectives.
- Mentor other developers in the team, sharing knowledge, best practices, and emerging technologies to foster continuous learning and growth.
- Stay updated on industry trends and advancements in Microsoft Fabric and related technologies, incorporating new tools and techniques to enhance development processes and outcomes.

Skills and attributes for success:
- 7-11 years of experience in developing data solutions using the Microsoft Azure cloud platform.
- Strong experience with Azure Data Factory and ETL pipelines.
- Strong experience with Azure Synapse, Data Warehouse, and Lakehouse implementations.
- Strong experience with Notebooks and Spark.
- Background in architecting and implementing enterprise data platforms and data management practices, including gold, silver, and bronze datasets for downstream use.
- Hands-on experience in cloud-based big data technologies, including Azure, Power Platform, and Microsoft Fabric/Power BI, using languages such as SQL, PySpark, DAX, Python, and Power Query.
- Experience creating Business Intelligence (BI) reports and crafting complex Data Analysis Expressions (DAX) for metrics.

Ideally, you’ll also have:
- Exceptional communication skills and the ability to articulate ideas clearly and concisely.
- Capability to work independently as well as lead a team effectively.

EY | Building a better working world
EY is building a better working world by creating new value for clients, people, society and the planet, while building trust in capital markets. Enabled by data, AI and advanced technology, EY teams help clients shape the future with confidence and develop answers for the most pressing issues of today and tomorrow. EY teams work across a full spectrum of services in assurance, consulting, tax, strategy and transactions. Fueled by sector insights, a globally connected, multi-disciplinary network and diverse ecosystem partners, EY teams can provide services in more than 150 countries and territories.
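The gold/silver/bronze layering described above (often called a medallion architecture) moves data from raw landing, to validated records, to business-level aggregates. A pure-Python sketch of the idea with invented sensor readings; in Fabric this would typically live in Lakehouse tables processed by Notebooks or pipelines.

```python
# Bronze: raw records exactly as ingested, including malformed rows
bronze = [
    {"device": "a", "reading": "12.5"},
    {"device": "a", "reading": "bad"},   # malformed row stays in bronze
    {"device": "b", "reading": "7.0"},
]

# Silver: typed, validated rows; bad records are skipped rather than
# failing the whole load
silver = []
for row in bronze:
    try:
        silver.append({"device": row["device"],
                       "reading": float(row["reading"])})
    except ValueError:
        pass

# Gold: business-level aggregate (mean reading per device)
grouped = {}
for row in silver:
    grouped.setdefault(row["device"], []).append(row["reading"])
gold = {d: sum(v) / len(v) for d, v in grouped.items()}

print(gold)  # {'a': 12.5, 'b': 7.0}
```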

Posted 1 week ago

Apply

4.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.

Your Key Responsibilities
As a Senior AI/ML Engineer, you will:
Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights for a comprehensive Risk & Controls monitoring mechanism.
Create executive reporting, proposals, and marketing material, ensuring the highest quality deliverables.
Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data led Risk technology platform.

Skills and Summary of Accountabilities:
Develop robust AI/ML models to drive valuable insights and enhance the decision-making process across multiple business lines.
Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and Vector Databases, comprehensive DevOps services, and full-stack application development.
Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
Apply advanced machine learning techniques (such as SVM, Decision Trees, Neural Networks, Random Forests, Gradient Boosting, etc.) to complex problems.
Ensure adherence to ethical AI guidelines and data governance policies.
Utilize expertise in Prompt Engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI and analytics-driven improvements in the bank's operations and services.
Stay up to date with the latest advancements in AI, analytics, and Prompt Engineering, ensuring the organization remains at the forefront of innovation and maintains a competitive edge in the industry.
Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
Intellectual strength and flexibility to resolve complex problems and rationalise these into a workable solution which can then be delivered.
Develop current and relevant client propositions, delivering timely high-quality output against stated project objectives.
Willingness to learn and apply new and cutting-edge AI technologies to deliver insight-driven business solutions.

To qualify for the role you must have:
4+ years of working experience in large-scale AI/ML models and data science.
Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
Proficiency in AI/ML programming languages like Python, R, and SQL.
Proficiency with a deep learning framework such as TensorFlow, PyTorch or Keras.
Expertise in machine learning algorithms and data mining techniques (like SVM, Decision Trees, and deep learning algorithms).
Experience implementing monitoring and logging tools to ensure AI model performance and reliability.
Strong programming skills in Python, including machine learning libraries such as Scikit-learn, Pandas, NumPy, etc.
Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Ability to automate tasks through Python scripting, databases, and other advanced technologies like Databricks, Synapse, ML, AI, ADF, etc.
Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps, and SharePoint.
Strong understanding of large enterprise applications like SAP, Oracle ERP, and Microsoft Dynamics.

Ideally, you'll also have:
A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or related disciplines. Relevant certifications are considered a plus.
A self-driven and creative problem-solver who enjoys the fast-paced world of software development and can perform well in a team.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
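Of the techniques the posting names, gradient boosting is the least self-explanatory. The sketch below is a toy pure-Python illustration of the core idea, repeatedly fitting weak learners (here depth-1 "stumps") to the residuals of the running prediction; a real project would use scikit-learn's GradientBoostingRegressor or XGBoost instead. All data and function names are invented for illustration.

```python
# Toy gradient boosting for 1-D regression: each round fits a decision
# stump to the current residuals and adds a damped copy to the ensemble.

def fit_stump(xs, residuals):
    """Find the split threshold minimising squared error; return a predictor."""
    best = None
    for t in sorted(set(xs)):
        left = [r for x, r in zip(xs, residuals) if x <= t]
        right = [r for x, r in zip(xs, residuals) if x > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = (sum((r - lm) ** 2 for r in left)
               + sum((r - rm) ** 2 for r in right))
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def gradient_boost(xs, ys, n_rounds=50, lr=0.1):
    base = sum(ys) / len(ys)            # start from the mean prediction
    pred = [base] * len(xs)
    stumps = []
    for _ in range(n_rounds):
        residuals = [y - p for y, p in zip(ys, pred)]
        stump = fit_stump(xs, residuals)
        stumps.append(stump)
        pred = [p + lr * stump(x) for p, x in zip(pred, xs)]
    return lambda x: base + lr * sum(s(x) for s in stumps)

xs = [1, 2, 3, 4, 5, 6, 7, 8]
ys = [1.0, 1.1, 0.9, 1.0, 3.0, 3.1, 2.9, 3.0]   # noisy step function
model = gradient_boost(xs, ys)
```

The learning rate deliberately under-corrects each round, which is what gives boosting its regularising effect.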

Posted 1 week ago

Apply

4.0 years

0 Lacs

Kanayannur, Kerala, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.

Your Key Responsibilities
As a Senior AI/ML Engineer, you will:
Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights for a comprehensive Risk & Controls monitoring mechanism.
Create executive reporting, proposals, and marketing material, ensuring the highest quality deliverables.
Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data led Risk technology platform.

Skills and Summary of Accountabilities:
Develop robust AI/ML models to drive valuable insights and enhance the decision-making process across multiple business lines.
Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and Vector Databases, comprehensive DevOps services, and full-stack application development.
Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
Apply advanced machine learning techniques (such as SVM, Decision Trees, Neural Networks, Random Forests, Gradient Boosting, etc.) to complex problems.
Ensure adherence to ethical AI guidelines and data governance policies.
Utilize expertise in Prompt Engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI and analytics-driven improvements in the bank's operations and services.
Stay up to date with the latest advancements in AI, analytics, and Prompt Engineering, ensuring the organization remains at the forefront of innovation and maintains a competitive edge in the industry.
Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
Intellectual strength and flexibility to resolve complex problems and rationalise these into a workable solution which can then be delivered.
Develop current and relevant client propositions, delivering timely high-quality output against stated project objectives.
Willingness to learn and apply new and cutting-edge AI technologies to deliver insight-driven business solutions.

To qualify for the role you must have:
4+ years of working experience in large-scale AI/ML models and data science.
Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
Proficiency in AI/ML programming languages like Python, R, and SQL.
Proficiency with a deep learning framework such as TensorFlow, PyTorch or Keras.
Expertise in machine learning algorithms and data mining techniques (like SVM, Decision Trees, and deep learning algorithms).
Experience implementing monitoring and logging tools to ensure AI model performance and reliability.
Strong programming skills in Python, including machine learning libraries such as Scikit-learn, Pandas, NumPy, etc.
Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Ability to automate tasks through Python scripting, databases, and other advanced technologies like Databricks, Synapse, ML, AI, ADF, etc.
Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps, and SharePoint.
Strong understanding of large enterprise applications like SAP, Oracle ERP, and Microsoft Dynamics.

Ideally, you'll also have:
A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or related disciplines. Relevant certifications are considered a plus.
A self-driven and creative problem-solver who enjoys the fast-paced world of software development and can perform well in a team.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

4.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

The opportunity
You will help our clients navigate the complex world of modern data science and analytics. We'll look to you to provide advice to our clients on how best to design, implement, stabilise, and optimise internal controls utilising cutting-edge and scalable AI & Big Data technologies.

Your Key Responsibilities
As a Senior AI/ML Engineer, you will:
Build and leverage cutting-edge Gen AI & Big Data platforms to deliver insights for a comprehensive Risk & Controls monitoring mechanism.
Create executive reporting, proposals, and marketing material, ensuring the highest quality deliverables.
Work within the team and leverage your knowledge and expertise to solve some of the most intriguing technical challenges in the design and development of a next-generation AI & Data led Risk technology platform.

Skills and Summary of Accountabilities:
Develop robust AI/ML models to drive valuable insights and enhance the decision-making process across multiple business lines.
Strong technical knowledge of Big Data platforms, AI technologies around Large Language Models and Vector Databases, comprehensive DevOps services, and full-stack application development.
Lead the design and development of bespoke machine learning algorithms to achieve business objectives.
Use predictive modeling to increase and optimize customer experiences, revenue generation, ad targeting, and other business outcomes.
Good understanding of machine learning, NLP, LLM, GenAI, deep learning, and generative AI techniques.
Apply advanced machine learning techniques (such as SVM, Decision Trees, Neural Networks, Random Forests, Gradient Boosting, etc.) to complex problems.
Ensure adherence to ethical AI guidelines and data governance policies.
Utilize expertise in Prompt Engineering to design and implement innovative solutions for workflow orchestration, leveraging tools such as Python, Airflow, and Terraform.
Collaborate closely with cross-functional teams, including data scientists, analysts, and other engineers, to identify opportunities for AI and analytics-driven improvements in the bank's operations and services.
Stay up to date with the latest advancements in AI, analytics, and Prompt Engineering, ensuring the organization remains at the forefront of innovation and maintains a competitive edge in the industry.
Maintain and improve existing AI/ML architectures and ensure they continue to deliver significant results.
Intellectual strength and flexibility to resolve complex problems and rationalise these into a workable solution which can then be delivered.
Develop current and relevant client propositions, delivering timely high-quality output against stated project objectives.
Willingness to learn and apply new and cutting-edge AI technologies to deliver insight-driven business solutions.

To qualify for the role you must have:
4+ years of working experience in large-scale AI/ML models and data science.
Deep understanding of statistics, AI/ML algorithms, and predictive modeling.
Proficiency in AI/ML programming languages like Python, R, and SQL.
Proficiency with a deep learning framework such as TensorFlow, PyTorch or Keras.
Expertise in machine learning algorithms and data mining techniques (like SVM, Decision Trees, and deep learning algorithms).
Experience implementing monitoring and logging tools to ensure AI model performance and reliability.
Strong programming skills in Python, including machine learning libraries such as Scikit-learn, Pandas, NumPy, etc.
Experience using tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Ability to automate tasks through Python scripting, databases, and other advanced technologies like Databricks, Synapse, ML, AI, ADF, etc.
Good understanding of Git, JIRA, change/release management, build/deploy, CI/CD, Azure DevOps, and SharePoint.
Strong understanding of large enterprise applications like SAP, Oracle ERP, and Microsoft Dynamics.

Ideally, you'll also have:
A Bachelor's degree or above in mathematics, information systems, statistics, computer science, data science, or related disciplines. Relevant certifications are considered a plus.
A self-driven and creative problem-solver who enjoys the fast-paced world of software development and can perform well in a team.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 1 week ago

Apply

8.0 - 10.0 years

5 - 50 Lacs

Pune, Maharashtra, India

On-site

We are looking for a Senior Oracle EBS Technical Consultant to join our team. In this role, you will be responsible for providing technical expertise in Oracle E-Business Suite (EBS) and related technologies. You will work closely with clients to understand their business requirements and provide solutions to meet their needs.

Responsibilities
Collaborating with clients to gather and document technical requirements.
Designing, developing, and implementing customizations and extensions to Oracle EBS.
Providing technical support and troubleshooting for Oracle EBS.
Conducting code reviews and ensuring best practices are followed.
Leading technical workshops and training sessions for clients and internal team members.
Keeping up to date with the latest technologies and industry trends in Oracle EBS.
Experience in the preparation of Technical Design documents.
Ability to work independently and progress the build of a CEMLI/RICE object from a technical design document.

Technical Skills
Hands-on experience with data conversions/migrations, inbound/outbound interfaces, reports, forms, and customizations.
Experience in implementation and RICE customizations of Oracle Applications 11i/R12.
Expertise in SQL, PL/SQL, and performance tuning.
Expertise in Oracle Forms (development and personalization), BI Publisher reports, and Oracle Workflow. OAF experience is preferable.
Sound knowledge of using Oracle APIs for interfaces to Oracle Financials and AOL/Sys Admin components.
Good knowledge of functional flows in FIN (GL, Fixed Assets, Cash Management, AP/AR) and SCM (Procurement, Inventory, Order Management).
Solid understanding of Oracle EBS database/table structures and the integration and impacts between modules.
Ability to design and document solutions for complex problems.

Qualifications
Bachelor’s or master’s degree in computer science, information technology, or a related field.
Minimum of 8-10 years of experience in Oracle EBS technical development.
Strong technical skills in Oracle EBS R12 and related technologies such as Oracle Forms, Oracle Reports, PL/SQL, Oracle Workflow, OAF, ADF, and XML Publisher.
Strong analytical and problem-solving skills.
Excellent communication and interpersonal skills.
Ability to work independently and in a team environment.
Capable of working in a fast-paced, dynamic, team-oriented environment.

If you are a motivated individual with a passion for Oracle EBS technology and a desire to work in a dynamic, fast-paced environment, we encourage you to apply for this position. We offer competitive salary, comprehensive benefits, and opportunities for career growth and advancement.

Skills:- Oracle EBS and Technical support

Posted 1 week ago

Apply

3.0 - 8.0 years

3 - 40 Lacs

Pune, Maharashtra, India

On-site

Responsibilities
We are seeking an experienced Oracle EBS Technical Consultant to join our team. In this role, you will be responsible for providing technical expertise in Oracle E-Business Suite (EBS) and related technologies. You will work closely with clients to understand their business requirements and provide solutions to meet their needs. Your primary responsibilities will include:
Designing, developing, and implementing customizations and extensions to Oracle EBS.
Providing technical support and troubleshooting for Oracle EBS.
Ensuring best practices are followed.
Keeping up to date with the latest technologies and industry trends in Oracle EBS.
Experience in the preparation of Technical Design documents.
Ability to work independently and progress the build of a CEMLI/RICE object from a technical design document.

Technical Skills
Hands-on experience with data conversions/migrations, inbound/outbound interfaces, reports, forms, and customizations.
Experience in implementation and RICE customizations of Oracle Applications 11i/R12.
Expertise in SQL, PL/SQL, and performance tuning.
Expertise in Oracle Forms (development and personalization), BI Publisher reports, and Oracle Workflow. OAF (Oracle Application Framework) experience is preferable.
Sound knowledge of using Oracle APIs for interfaces to Oracle Financials and AOL/Sys Admin components.
Good knowledge of functional flows in FIN (GL, Fixed Assets, Cash Management, AP/AR) and SCM (Procurement, Inventory, Order Management).
Solid understanding of Oracle EBS database/table structures and the integration and impacts between modules.
Ability to design and document solutions for complex problems.

Qualifications
Bachelor's or Master's degree in Computer Science, Information Technology, or a related field.
Minimum of 3-8 years of experience in Oracle EBS technical development.
Strong technical skills in Oracle EBS R12 and related technologies such as Oracle Forms, Oracle Reports, PL/SQL, Oracle Workflow, OAF, ADF, and XML Publisher.
Strong analytical and problem-solving skills.
Excellent communication and interpersonal skills.
Ability to work independently and in a team environment.

If you are a motivated individual with a passion for Oracle EBS technology and a desire to work in a dynamic, fast-paced environment, we encourage you to apply for this position. We offer competitive salary, comprehensive benefits, and opportunities for career growth and advancement.

Skills:- Oracle EBS and Technical support

Posted 1 week ago

Apply

5.0 - 10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Azure Data Engineer
Experience Range: 5 to 10 years
Location: Pune
Skills: Azure Data Engineering, C#, Fabric, ADF, Synapse, Python, SQL

Should have good hands-on experience building ETL flows using an ETL tool such as ADF.
Should be able to understand the requirements and design the data flow diagram of the ETL process end to end.
Should have good hands-on experience writing complex SQL queries and with advanced SQL concepts such as indexes, partitions, filegroups, and transactions.
Should be able to understand the business requirements and develop end-to-end data pipelines using the required tools/technologies.
Excellent troubleshooting and good communication skills, with good attention to detail.
Should have knowledge of designing optimized data processing based on the volume of data.
Able to create documentation that clearly explains the purpose of the data flow and its intended use.
Able to make regular modifications to existing production code for error correction and adding new features.
Experience using Visual Studio and SQL Server Management Studio.
Strong understanding of data warehousing concepts such as dimensions, facts, schemas, data loading processes, dimensional modeling, and data mining.
Flexible to learn and adopt tools/technologies used in the project.
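The advanced SQL concepts listed above, indexes in particular, can be demonstrated with the stdlib sqlite3 module; the table, column, and index names below are illustrative, not from the posting.

```python
# Sketch: an index turns a full-table scan into an indexed search for an
# equality filter, which EXPLAIN QUERY PLAN makes visible.
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, region TEXT, amount REAL)")
cur.executemany(
    "INSERT INTO orders (region, amount) VALUES (?, ?)",
    [("east", 10.0), ("west", 20.0), ("east", 30.0), ("west", 5.0)])

# The index lets the engine seek rows by region instead of scanning the table.
cur.execute("CREATE INDEX idx_orders_region ON orders (region)")

# The plan names the index chosen for the filtered aggregate.
plan = cur.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT SUM(amount) FROM orders WHERE region = 'east'").fetchall()
total = cur.execute(
    "SELECT SUM(amount) FROM orders WHERE region = 'east'").fetchone()[0]
```

The same seek-vs-scan trade-off is what partitioning exploits at a coarser granularity in SQL Server and Synapse.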

Posted 1 week ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Data Engineer - ETL
Bangalore, India

AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable - enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained competitive advantage. Our Chief Data Office, also known as our Innovation, Data Intelligence & Analytics team (IDA), is focused on driving innovation by optimizing how we leverage data to drive strategy and create a new business model - disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team’s efforts towards creating, enhancing, and stabilizing the Enterprise data lake through the development of data pipelines. This role requires a person who is a team player and can work well with team members from other disciplines to deliver data in an efficient and strategic manner.

What You’ll Be Doing
What will your essential responsibilities include?
Act as a data engineering expert and partner to Global Technology and data consumers in controlling complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate.
Understand current and future data consumption patterns and architecture (at a granular level), and partner with Architects to ensure optimal design of data layers.
Apply best practices in data architecture: for example, the balance between materialization and virtualization, the optimal level of de-normalization, caching and partitioning strategies, choice of storage and querying technology, and performance tuning.
Lead and execute hands-on research into new technologies, formulating frameworks for assessing new technology against business benefit and implications for data consumers.
Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in data inventory and utilization of the Data Science Platform.
Design prototypes and work in a fast-paced, iterative solution delivery model.
Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with Delta tables. Use Harness for the deployment pipeline.
Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed.
Diagnose system performance issues related to data processing and implement solutions to address them.
Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture.
Maintain integrity and quality across all pipelines and environments.
Understand and follow secure coding practices to make sure code is not vulnerable.
You will report to the Application Manager.

What You Will Bring
We’re looking for someone who has these abilities and skills:

Required Skills and Abilities
Effective communication skills.
Bachelor’s degree in computer science, mathematics, statistics, finance, a related technical field, or equivalent work experience.
Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying skills.
Relevant years of programming experience using Databricks.
Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS).
Solid knowledge of network and firewall concepts.
Solid experience writing, optimizing, and analyzing SQL.
Relevant years of experience with Python.
Ability to break complex data requirements and architect solutions into achievable targets.
Robust familiarity with Software Development Life Cycle (SDLC) processes and workflow, especially Agile.
Experience using Harness.
Technical lead responsible for both individual and team deliveries.

Desired Skills and Abilities
Worked in big data migration projects.
Worked on performance tuning at both the database and big data platforms.
Ability to interpret complex data requirements and architect solutions.
Distinctive problem-solving and analytical skills combined with robust business acumen.
Excellent basics on Parquet files and Delta files.
Effective knowledge of the Azure cloud computing platform.
Familiarity with reporting software - Power BI is a plus.
Familiarity with dbt is a plus.
Passion for data and experience working within a data-driven organization.
You care about what you do, and what we do.

Who We Are
AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com

What We Offer
Inclusion
AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential.
It’s about helping one another — and our business — to move forward and succeed. Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe Robust support for Flexible Working Arrangements Enhanced family friendly leave benefits Named to the Diversity Best Practices Index Signatory to the UK Women in Finance Charter Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer. Total Rewards AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides dynamic compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence. Sustainability At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations. Our Pillars Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans. Addressing climate change: The effects of a changing climate are far reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. 
We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions. Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting. AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving. For more information, please see axaxl.com/sustainability.
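The Delta-table work mentioned in the role above typically centres on MERGE (upsert) semantics: update the row when the key matches, insert it when it does not. Spark is not needed to show the idea; the snippet below is a plain-Python sketch of that rule (function and field names are illustrative; in Databricks this would be done with DeltaTable's merge API).

```python
# Pure-Python sketch of Delta-style MERGE: matched rows are updated,
# unmatched rows are inserted.
def merge_upsert(target, updates, key):
    """Upsert rows from `updates` into `target`, matching on `key`."""
    by_key = {row[key]: dict(row) for row in target}
    for row in updates:
        by_key[row[key]] = dict(row)   # update if matched, insert otherwise
    return sorted(by_key.values(), key=lambda r: r[key])

target = [{"id": 1, "amount": 100}, {"id": 2, "amount": 200}]
updates = [{"id": 2, "amount": 250}, {"id": 3, "amount": 300}]
merged = merge_upsert(target, updates, "id")
```

In a real pipeline the same logic runs transactionally against the Delta log, which is what makes repeated pipeline runs idempotent.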

Posted 1 week ago

Apply

0 years

0 Lacs

Gurugram, Haryana, India

On-site

Data Engineer- ETL Bangalore, India AXA XL recognizes data and information as critical business assets, both in terms of managing risk and enabling new business opportunities. This data should not only be high quality, but also actionable - enabling AXA XL’s executive leadership team to maximize benefits and facilitate sustained industrious advantage. Our Chief Data Office also known as our Innovation, Data Intelligence & Analytics team (IDA) is focused on driving innovation through optimizing how we leverage data to drive strategy and create a new business model - disrupting the insurance market. As we develop an enterprise-wide data and digital strategy that moves us toward greater focus on the use of data and data-driven insights, we are seeking a Data Engineer. The role will support the team’s efforts towards creating, enhancing, and stabilizing the Enterprise data lake through the development of the data pipelines. This role requires a person who is a team player and can work well with team members from other disciplines to deliver data in an efficient and strategic manner. What You’ll Be Doing What will your essential responsibilities include? Act as a data engineering expert and partner to Global Technology and data consumers in controlling complexity and cost of the data platform, whilst enabling performance, governance, and maintainability of the estate. Understand current and future data consumption patterns, architecture (granular level), partner with Architects to make sure optimal design of data layers. Apply best practices in Data architecture. For example, balance between materialization and virtualization, optimal level of de-normalization, caching and partitioning strategies, choice of storage and querying technology, performance tuning. Leading and hands-on execution of research into new technologies. Formulating frameworks for assessment of new technology vs business benefit, implications for data consumers. 
Act as a best-practice expert and blueprint creator for ways of working such as testing, logging, CI/CD, observability, and release, enabling rapid growth in data inventory and utilization of the Data Science Platform. Design prototypes and work in a fast-paced, iterative solution delivery model. Design, develop, and maintain ETL pipelines using PySpark in Azure Databricks with Delta tables. Use Harness for the deployment pipeline. Monitor the performance of ETL jobs, resolve any issues that arise, and improve performance metrics as needed. Diagnose system performance issues related to data processing and implement solutions to address them. Collaborate with other teams to ensure successful integration of data pipelines into the larger system architecture. Maintain integrity and quality across all pipelines and environments. Understand and follow secure coding practices to ensure code is not vulnerable. You will report to the Application Manager. What You Will Bring We’re looking for someone who has these abilities and skills: Required Skills And Abilities Effective communication skills. Bachelor’s degree in computer science, mathematics, statistics, finance, a related technical field, or equivalent work experience. Relevant years of extensive work experience in various data engineering and modeling techniques (relational, data warehouse, semi-structured, etc.), application development, and advanced data querying skills. Relevant years of programming experience using Databricks. Relevant years of experience using the Microsoft Azure suite of products (ADF, Synapse, and ADLS). Solid knowledge of network and firewall concepts. Solid experience writing, optimizing, and analyzing SQL. Relevant years of experience with Python. Ability to break down complex data requirements and architect solutions into achievable targets. Robust familiarity with Software Development Life Cycle (SDLC) processes and workflows, especially Agile. Experience using Harness. 
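The Delta-table ETL work described above centers on upsert (MERGE) semantics: update rows that already exist in the target, insert rows that do not. In a real Azure Databricks job this would typically be expressed with `DeltaTable.merge` or a `MERGE INTO` statement in PySpark; the pure-Python sketch below only illustrates the semantics on plain dictionaries (the function name `delta_style_merge` and the sample records are hypothetical):

```python
def delta_style_merge(target, updates, key):
    """Illustrate Delta Lake MERGE semantics: rows in `updates` that match an
    existing `key` overwrite the target row; unmatched rows are inserted."""
    merged = {row[key]: row for row in target}
    for row in updates:
        merged[row[key]] = row
    return sorted(merged.values(), key=lambda r: r[key])


# Hypothetical sample data: yesterday's table plus today's incremental load.
table = [{"id": 1, "amount": 10}, {"id": 2, "amount": 20}]
increment = [{"id": 2, "amount": 25}, {"id": 3, "amount": 30}]
result = delta_style_merge(table, increment, key="id")
```

Running the merge with the same increment twice yields the same result, which is the idempotency property that makes MERGE-based pipelines safe to re-run after a failure.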
Technical lead responsible for both individual and team deliveries. Desired Skills And Abilities Experience in big data migration projects. Experience in performance tuning at both the database and big data platform levels. Ability to interpret complex data requirements and architect solutions. Distinctive problem-solving and analytical skills combined with robust business acumen. Strong fundamentals in Parquet and Delta file formats. Effective knowledge of the Azure cloud computing platform. Familiarity with reporting software - Power BI is a plus. Familiarity with DBT is a plus. Passion for data and experience working within a data-driven organization. You care about what you do, and what we do. Who WE Are AXA XL, the P&C and specialty risk division of AXA, is known for solving complex risks. For mid-sized companies, multinationals and even some inspirational individuals we don’t just provide re/insurance, we reinvent it. How? By combining a comprehensive and efficient capital platform, data-driven insights, leading technology, and the best talent in an agile and inclusive workspace, empowered to deliver top client service across all our lines of business − property, casualty, professional, financial lines and specialty. With an innovative and flexible approach to risk solutions, we partner with those who move the world forward. Learn more at axaxl.com What we OFFER Inclusion AXA XL is committed to equal employment opportunity and will consider applicants regardless of gender, sexual orientation, age, ethnicity and origins, marital status, religion, disability, or any other protected characteristic. At AXA XL, we know that an inclusive culture and a diverse workforce enable business growth and are critical to our success. That’s why we have made a strategic commitment to attract, develop, advance and retain the most diverse workforce possible, and create an inclusive culture where everyone can bring their full selves to work and can reach their highest potential. 
It’s about helping one another — and our business — to move forward and succeed. Five Business Resource Groups focused on gender, LGBTQ+, ethnicity and origins, disability and inclusion with 20 Chapters around the globe Robust support for Flexible Working Arrangements Enhanced family friendly leave benefits Named to the Diversity Best Practices Index Signatory to the UK Women in Finance Charter Learn more at axaxl.com/about-us/inclusion-and-diversity. AXA XL is an Equal Opportunity Employer. Total Rewards AXA XL’s Reward program is designed to take care of what matters most to you, covering the full picture of your health, wellbeing, lifestyle and financial security. It provides dynamic compensation and personalized, inclusive benefits that evolve as you do. We’re committed to rewarding your contribution for the long term, so you can be your best self today and look forward to the future with confidence. Sustainability At AXA XL, Sustainability is integral to our business strategy. In an ever-changing world, AXA XL protects what matters most for our clients and communities. We know that sustainability is at the root of a more resilient future. Our 2023-26 Sustainability strategy, called “Roots of resilience”, focuses on protecting natural ecosystems, addressing climate change, and embedding sustainable practices across our operations. Our Pillars Valuing nature: How we impact nature affects how nature impacts us. Resilient ecosystems - the foundation of a sustainable planet and society - are essential to our future. We’re committed to protecting and restoring nature - from mangrove forests to the bees in our backyard - by increasing biodiversity awareness and inspiring clients and colleagues to put nature at the heart of their plans. Addressing climate change: The effects of a changing climate are far reaching and significant. Unpredictable weather, increasing temperatures, and rising sea levels cause both social inequalities and environmental disruption. 
We're building a net zero strategy, developing insurance products and services, and mobilizing to advance thought leadership and investment in societal-led solutions. Integrating ESG: All companies have a role to play in building a more resilient future. Incorporating ESG considerations into our internal processes and practices builds resilience from the roots of our business. We’re training our colleagues, engaging our external partners, and evolving our sustainability governance and reporting. AXA Hearts in Action: We have established volunteering and charitable giving programs to help colleagues support causes that matter most to them, known as AXA XL’s “Hearts in Action” programs. These include our Matching Gifts program, Volunteering Leave, and our annual volunteering day - the Global Day of Giving. For more information, please see axaxl.com/sustainability.

Posted 1 week ago

Apply

7.0 - 12.0 years

25 - 40 Lacs

Dubai, Hyderabad

Work from Office

TechECS Hiring: Oracle EBS HRMS Technical Consultant Dubai! Location: Dubai, UAE (On-site) Experience: 7+ years Oracle EBS (11i & R12) technical experience, with strong HRMS (Core HR, Payroll, OTL) expertise. Key Responsibilities Design, develop, and implement Oracle EBS R12 solutions for HRMS (Core HR, Payroll, OTL). Create & maintain RICEW components (Reports, Interfaces, Conversions, Extensions, Workflows). Customize Oracle Forms, Reports, BI Publisher/XML, and OAF/ADF pages. Build and personalize Oracle Workflow & Advanced Workflow (AME). Develop robust PL/SQL: packages, procedures, triggers; tune performance. Implement Web ADI, XML, shell scripting, APIs, and interfaces for data integration. Lead technical assessments, debugging, system migrations (legacy to R12), and production support. Mentor junior developers and document technical specifications and user guides. Collaborate with functional EHCM teams and business stakeholders to translate requirements. Optionally, handle Web Services/SOA integrations. Must-Have Technical Skills Oracle EBS 11i & R12: Core HR, Payroll, OTL Oracle PL/SQL (advanced queries, procedures, triggers) Oracle Forms & Reports, BI Publisher/XML Oracle Workflow / Advanced Workflow (AME) RICEW development Oracle Application Framework (OAF) & ADF Web ADI, XML, API/Interface development UNIX / Shell scripting Performance tuning and production debugging Why Join Tech ECS in Dubai? Work on high-impact Oracle EBS HRMS projects in a vibrant, international environment Competitive UAE-based compensation and benefits package Opportunity to lead and innovate in system customizations, migrations, and support If interested, share your updated resume to mounika.paladugula@techecs.com Regards, Mounika Paladugula - TAG || Tech ECS Do check out the job opportunity on LinkedIn - https://www.linkedin.com/posts/activity-7346065109640716288-x2JU?utm_source=share&utm_medium=member_desktop&rcm=ACoAABr1upkBsyV2seyDBpl4tvLVvGKgRzqTL44

Posted 1 week ago

Apply

15.0 - 17.0 years

0 Lacs

Mulshi, Maharashtra, India

On-site

Area(s) of responsibility Experience: 15 to 17 years. Cloud Architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing. Responsible for designing, implementing, and estimating secure, scalable, and highly available cloud-based solutions on AWS and Azure Cloud. Experience in Azure Databricks and ADF, Azure Synapse, PySpark, and Snowflake services. Participate in pre-sales activities, including RFP and proposal writing. Experience with integration of different data sources with data warehouses and data lakes is required. Experience in creating data warehouses and data lakes for reporting, AI, and machine learning. Understanding of data modelling and data architecture concepts. Participate in proposal and capability presentations. Ability to clearly articulate the pros and cons of various technologies and platforms. Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms. Define and implement cloud governance and best practices. Identify and implement automation opportunities to increase operational efficiency. Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies.

Posted 1 week ago

Apply

15.0 - 17.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Area(s) of responsibility Experience: 15 to 17 years. Cloud Architect with experience in Azure and Snowflake, along with experience in RFP and proposal writing. Responsible for designing, implementing, and estimating secure, scalable, and highly available cloud-based solutions on AWS and Azure Cloud. Experience in Azure Databricks and ADF, Azure Synapse, PySpark, and Snowflake services. Participate in pre-sales activities, including RFP and proposal writing. Experience with integration of different data sources with data warehouses and data lakes is required. Experience in creating data warehouses and data lakes for reporting, AI, and machine learning. Understanding of data modelling and data architecture concepts. Participate in proposal and capability presentations. Ability to clearly articulate the pros and cons of various technologies and platforms. Collaborate with clients to understand their business requirements and translate them into technical solutions that leverage Snowflake and Azure cloud platforms. Define and implement cloud governance and best practices. Identify and implement automation opportunities to increase operational efficiency. Conduct knowledge sharing and training sessions to educate clients and internal teams on cloud technologies.

Posted 1 week ago

Apply

3.0 years

0 Lacs

Kochi, Kerala, India

Remote

We are looking for a passionate and experienced Data Science/Engineering Trainer to join our training team. This role is ideal for professionals who love sharing knowledge, mentoring aspiring data engineers, and contributing to the growth of future tech talent. Job Description Deliver in-person classroom/online training on Data Science, Data Engineering, and related tools and technologies. Design course content, hands-on labs, case studies, and capstone projects aligned with industry standards. Teach tools and concepts such as: Python, SQL, Pandas, NumPy, Scikit-learn, TensorFlow, Keras, Power BI/Tableau Big Data (Spark, Hadoop) and Data Pipelines Cloud Platforms: AWS (Glue, S3, Redshift), Azure (ADF, Synapse), or GCP (BigQuery, Cloud Composer) ETL tools, Data Warehousing, MLOps, and DevOps Conduct assessments, code/project reviews, and support students with guidance and doubt resolution. Mentor students on interview preparation, career paths, and hands-on project development. Continuously update training materials based on technology trends and feedback. Required Skills: 3+ years of experience in Data Science or Cloud Data Engineering roles (training/development). Strong coding skills in Python and SQL. Hands-on knowledge of at least one cloud platform (AWS, Azure, or GCP). Solid understanding of the ML lifecycle, data lake/warehouse concepts, and ETL/ELT. Excellent presentation and communication skills. Nice to have: Certifications in AWS, GCP, or Azure (Data Engineering/AI/ML). What we Offer: Competitive salary and performance-based incentives. Work with senior architects and domain experts. Opportunity to contribute to real projects and mentor students for live industry challenges. Collaborative and growth-focused work environment. Flexibility to work from home.

Posted 1 week ago

Apply

0 years

6 - 8 Lacs

Hyderābād

On-site

Ready to shape the future of work? At Genpact, we don’t just adapt to change—we drive it. AI and digital innovation are redefining industries, and we’re leading the charge. Genpact’s AI Gigafactory, our industry-first accelerator, is an example of how we’re scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models to agentic AI, our breakthrough solutions tackle companies’ most complex challenges. If you thrive in a fast-moving, tech-driven environment, love solving real-world problems, and want to be part of a team that’s shaping the future, this is your moment. Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions – we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Inviting applications for the role of Senior Principal Consultant - Databricks Architect! In this role, the Databricks Architect is responsible for providing technical direction and leading a group of one or more developers to address a goal. Responsibilities Architect and design solutions to meet functional and non-functional requirements. Create and review architecture and solution design artifacts. Evangelize re-use through the implementation of shared assets. Enforce adherence to architectural standards/principles, global product-specific guidelines, usability design standards, etc. Proactively guide engineering methodologies, standards, and leading practices. Guide engineering staff and review as-built configurations during the construction phase. 
Provide insight and direction on roles and responsibilities required for solution operations. Identify, communicate, and mitigate risks, assumptions, issues, and decisions throughout the full lifecycle. Consider the art of the possible, compare various architectural options based on feasibility and impact, and propose actionable plans. Demonstrate strong analytical and technical problem-solving skills. Ability to analyze and operate at various levels of abstraction. Ability to balance what is strategically right with what is practically realistic. Grow the Data Engineering business by helping customers identify opportunities to deliver improved business outcomes, designing and driving the implementation of those solutions. Grow and retain the Data Engineering team with the appropriate skills and experience to deliver high-quality services to our customers. Support and develop our people, including learning & development, certification, and career development plans. Provide technical governance and oversight for solution design and implementation. Should have the technical foresight to understand new technologies and advancements. Lead the team in the definition of best practices and repeatable methodologies in Cloud Data Engineering, including data storage, ETL, data integration & migration, data warehousing, and data governance. Should have technical experience in Azure, AWS, and GCP Cloud Data Engineering services and solutions. Contribute to sales and pre-sales activities including proposals, pursuits, demonstrations, and proof-of-concept initiatives. Evangelize the Data Engineering service offerings to both internal and external stakeholders. Develop whitepapers, blogs, webinars, and other thought leadership material. Develop go-to-market and service offering definitions for Data Engineering. Work with Learning & Development teams to establish appropriate learning and certification paths for the domain. 
Expand the business within existing accounts and help clients by building and sustaining strategic executive relationships, doubling up as their trusted business technology advisor. Position differentiated and custom solutions to clients, based on market trends, the specific needs of the clients, and the supporting business cases. Build new data capabilities, solutions, assets, accelerators, and team competencies. Manage multiple opportunities through the entire business cycle simultaneously, working with cross-functional teams as necessary. Qualifications we seek in you! Minimum qualifications Excellent technical architecture skills, enabling the creation of future-proof, complex global solutions. Excellent interpersonal communication and organizational skills are required to operate as a leading member of global, distributed teams that deliver quality services and solutions. Ability to rapidly gain knowledge of the organizational structure of the firm to facilitate work with groups outside of the immediate technical team. Knowledge of and experience in the IT methodologies and life cycles that will be used. Familiarity with solution implementation/management, service/operations management, etc. Leadership skills to inspire and persuade others. Maintains close awareness of new and emerging technologies and their potential application for service offerings and products. Bachelor’s degree or equivalency (CS, CE, CIS, IS, MIS, or an engineering discipline) or equivalent work experience. Experience in a solution architecture role using service and hosting solutions such as private/public cloud IaaS, PaaS, and SaaS platforms. Experience in architecting and designing technical solutions for cloud-centric solutions based on industry standards using IaaS, PaaS, and SaaS capabilities. Must have strong hands-on experience with various cloud services such as ADF/Lambda, ADLS/S3, security, monitoring, and governance. Must have experience designing platforms on Databricks. 
Hands-on experience designing and building Databricks-based solutions on any cloud platform. Hands-on experience designing and building solutions powered by DBT models and integrating them with Databricks. Must be very good at designing end-to-end solutions on cloud platforms. Must have good knowledge of Data Engineering concepts and related cloud services. Must have good experience in Python and Spark. Must have good experience in setting up development best practices. Intermediate-level knowledge of data modelling is required. Good to have knowledge of Docker and Kubernetes. Experience with claims-based authentication (SAML/OAuth/OIDC), MFA, RBAC, SSO, etc. Knowledge of cloud security controls including tenant isolation, encryption at rest, encryption in transit, key management, vulnerability assessments, application firewalls, SIEM, etc. Experience building and supporting mission-critical technology components with DR capabilities. Experience with multi-tier system and service design and development for large enterprises. Extensive, real-world experience designing technology components for enterprise solutions and defining solution architectures and reference architectures with a focus on cloud technologies. Exposure to infrastructure and application security technologies and approaches. Familiarity with requirements gathering techniques. Preferred qualifications Must have designed the end-to-end architecture of a unified data platform covering all aspects of the data lifecycle, from data ingestion and transformation to serving and consumption. Must have excellent coding skills in either Python or Scala, preferably Python. Must have experience in the Data Engineering domain. Must have designed and implemented at least 2-3 projects end-to-end in Databricks. 
Must have experience with Databricks and its various components: Delta Lake, dbConnect, DB API 2.0, SQL Endpoint (Photon engine), Unity Catalog, Databricks workflows orchestration, security management, platform governance, and data security. Must have knowledge of the new features available in Databricks and their implications, along with various possible use cases. Must have followed various architectural principles to design what is best suited to each problem. Must be well versed with the Databricks Lakehouse concept and its implementation in enterprise environments. Must have a strong understanding of data warehousing and the various governance and security standards around Databricks. Must have knowledge of cluster optimization and its integration with various cloud services. Must have a good understanding of creating complex data pipelines. Must be strong in SQL and Spark SQL. Must have strong performance optimization skills to improve efficiency and reduce cost. Must have worked on designing both batch and streaming data pipelines. Must have extensive knowledge of the Spark and Hive data processing frameworks. Must have worked on any cloud (Azure, AWS, GCP) and the most common services like ADLS/S3, ADF/Lambda, CosmosDB/DynamoDB, ASB/SQS, and cloud databases. Must be strong in writing unit test cases and integration tests. Must have strong communication skills and have worked with cross-platform teams. Must have a great attitude towards learning new skills and upskilling existing skills. Responsible for setting best practices around Databricks CI/CD. Must understand composable architecture to take full advantage of Databricks capabilities. Good to have REST API knowledge. Good to have an understanding of cost distribution. Good to have worked on a migration project to build a unified data platform. Good to have knowledge of DBT. Experience with DevSecOps, including Docker and Kubernetes. 
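A recurring design point in the batch and streaming pipeline work listed above is incremental processing: each run should pick up only the records newer than the last checkpoint, so that re-running a job neither drops nor duplicates data. In Databricks, Structured Streaming and Auto Loader manage checkpoints for you; the pure-Python sketch below only shows the idea in miniature (the function name and record shape are hypothetical):

```python
def run_micro_batch(records, checkpoint):
    """Process one micro-batch: select records newer than the checkpoint
    and advance the checkpoint to the newest timestamp seen."""
    batch = [r for r in records if r["ts"] > checkpoint]
    new_checkpoint = max((r["ts"] for r in batch), default=checkpoint)
    return batch, new_checkpoint


# Hypothetical event stream; a second run over the same data is a no-op.
events = [{"ts": 1, "v": "a"}, {"ts": 2, "v": "b"}, {"ts": 3, "v": "c"}]
first, cp = run_micro_batch(events, checkpoint=0)
second, cp = run_micro_batch(events, checkpoint=cp)
```

Because the checkpoint only ever advances, replaying the source after a failure processes just the unseen tail, which is the property both batch watermarking and streaming checkpointing rely on.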
Software development full-lifecycle methodologies, patterns, frameworks, libraries, and tools. Knowledge of programming and scripting languages such as JavaScript, PowerShell, Bash, SQL, Java, Python, etc. Experience with data ingestion technologies such as Azure Data Factory, SSIS, Pentaho, and Alteryx. Experience with visualization tools such as Tableau and Power BI. Experience with machine learning tools such as MLflow, Databricks AI/ML, Azure ML, AWS SageMaker, etc. Experience in distilling complex technical challenges to actionable decisions for stakeholders and guiding project teams by building consensus and mediating compromises when necessary. Experience coordinating the intersection of complex system dependencies and interactions. Experience in solution delivery using common methodologies, especially SAFe Agile, but also Waterfall, Iterative, etc. Demonstrated knowledge of relevant industry trends and standards. Why join Genpact? Be a transformation leader – Work at the cutting edge of AI, automation, and digital innovation Make an impact – Drive change for global enterprises and solve business challenges that matter Accelerate your career – Get hands-on experience, mentorship, and continuous learning opportunities Work with the best – Join 140,000+ bold thinkers and problem-solvers who push boundaries every day Thrive in a values-driven culture – Our courage, curiosity, and incisiveness - built on a foundation of integrity and inclusion - allow your ideas to fuel progress Come join the tech shapers and growth makers at Genpact and take your career in the only direction that matters: Up. Let’s build tomorrow together. 
Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training. Job Senior Principal Consultant Primary Location India-Hyderabad Schedule Full-time Education Level Bachelor's / Graduation / Equivalent Job Posting Jul 1, 2025, 6:40:20 AM Unposting Date Ongoing Master Skills List Digital Job Category Full Time

Posted 1 week ago

Apply

6.0 years

7 - 10 Lacs

Hyderābād

On-site

Job Requirement: Minimum 6 years or more of experience as a .NET full stack developer. Developer should have experience in .NET Core application development and debugging. Developer should have experience in ReactJs development, such as SPAs and micro-frontends. Developer should have experience in SQL Server development, with debugging experience. Developer should have experience in Agile and the complete SDLC life cycle. Developer should have hands-on experience in Azure services like AKS, ADF, and APIM. Developer should have hands-on DevOps experience, such as build and deployment pipeline creation. Developer should have API development experience with REST or GraphQL. Careers with Optum. Here's the idea. We built an entire organization around one giant objective; make the health system work better for everyone. So when it comes to how we use the world's large accumulation of health-related information, or guide health and lifestyle choices or manage pharmacy benefits for millions, our first goal is to leap beyond the status quo and uncover new ways to serve. Optum, part of the UnitedHealth Group family of businesses, brings together some of the greatest minds and most advanced ideas on where health care has to go in order to reach its fullest potential. 
For you, that means working on high performance teams against sophisticated challenges that matter. Optum, incredible ideas in one incredible company and a singular opportunity to do your life's best work. SM UnitedHealth Group is an Equal Employment Opportunity employer under applicable law and qualified applicants will receive consideration for employment without regard to race, national origin, religion, age, color, sex, sexual orientation, gender identity, disability, or protected veteran status, or any other characteristic protected by local, state, or federal laws, rules, or regulations. UnitedHealth Group is a drug-free workplace. Candidates are required to pass a drug test before beginning employment.

Posted 1 week ago

Apply

0 years

2 - 9 Lacs

Gurgaon

On-site

About Gartner IT: Join a world-class team of skilled engineers who build creative digital solutions to support our colleagues and clients. We make a broad organizational impact by delivering cutting-edge technology solutions that power Gartner. Gartner IT values its culture of nonstop innovation, an outcome-driven approach to success, and the notion that great ideas can come from anyone on the team. About the role: Gartner is looking for a well-rounded and driven leader to become a part of its Conferences Technology & Insight Analytics team, which is tasked with creating the reporting and analytics to support its Conference reporting operations. What you will do: Provide technical leadership and guidance to software development teams, ensuring alignment with project objectives and adherence to industry best practices. Lead and mentor a team of software engineers, delegating responsibilities, offering support, and promoting a collaborative environment. Collaborate with business stakeholders to design and build advanced analytics solutions for the Gartner Conference Technology business. Execute our data strategy through the design and development of data platforms to deliver Reporting, BI, and Advanced Analytics solutions. Design and develop key analytics capabilities using MS SQL Server, Azure SQL Managed Instance, T-SQL, and ADF on the Azure platform. Consistently improve and optimize T-SQL performance across the entire analytics platform. Create, build, and implement comprehensive data integration solutions utilizing Azure Data Factory. Analyse and solve complex business problems, breaking the work down into actionable tasks. Develop, maintain, and document the data dictionary and data flow diagrams. Build and enhance the regression test suite to monitor nightly ETL jobs and identify data issues. Work alongside project managers and cross-functional teams to support a fast-paced Agile/Scrum environment. 
Build optimized solutions and designs to handle big data. Follow coding standards; build appropriate unit tests, integration tests, and deployment scripts; and review project artifacts created by peers. Contribute to overall growth by suggesting improvements to the existing software architecture or introducing new technologies. What you will need Strong IT professional with in-depth knowledge of designing and developing end-to-end BI & Analytics projects in a global enterprise environment. The candidate should have strong qualitative and quantitative problem-solving abilities and is expected to take ownership and accountability. Must have: Strong experience with SQL, including diagnosing and resolving load failures, constructing hierarchical queries, and efficiently analysing existing SQL code to identify and resolve issues, using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance. Ability to create and modify various database objects such as stored procedures, views, tables, triggers, and indexes using Microsoft Azure SQL Database, SQL Server, and Azure SQL Managed Instance. Deep understanding of writing advanced SQL code (analytic functions). Strong technical experience with database performance tuning, troubleshooting, and query optimization. Strong technical experience with Azure Data Factory on the Azure platform. Create and manage complex ETL pipelines to extract, transform, and load data from various sources using Azure Data Factory. Monitor and troubleshoot data pipeline issues to ensure data integrity and availability. Enhance data workflows to improve performance, scalability, and cost-effectiveness. Establish best practices for data governance and security within data pipelines. Experience with cloud platforms and Azure technologies like Azure Analysis Services, Azure Blob Storage, Azure Data Lake, Azure Delta Lake, etc. Experience with data modelling, database design, data warehousing concepts, and data lakes. 
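A regression suite that monitors nightly ETL jobs, as described above, usually begins with simple reconciliation checks such as comparing each load's row count against a baseline. A minimal sketch of such a tolerance-based check follows; the function name, table names, baseline figures, and 5% threshold are illustrative assumptions, not the actual process:

```python
def row_count_ok(expected, actual, tolerance=0.05):
    """Return True when the nightly load's row count is within `tolerance`
    (a fraction) of the expected baseline; flag it for review otherwise."""
    if expected == 0:
        return actual == 0
    return abs(actual - expected) / expected <= tolerance


# Illustrative baselines: (expected rows, rows actually loaded last night).
checks = {"fact_attendance": (120_000, 118_500), "dim_event": (900, 2_000)}
failures = [name for name, (exp, act) in checks.items() if not row_count_ok(exp, act)]
```

In practice the expected counts would come from a metadata table or a trailing average of prior loads rather than hard-coded constants, so the baseline adapts as data volumes grow.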
Ensure thorough documentation of data processes, configurations, and operational procedures.
Good to have:
Experience with dataset ingestion, data model creation, reports, and dashboards using Power BI.
Experience with Python and Azure Functions for data processing.
Experience with other reporting tools such as SSRS and Tableau.
Demonstrated ability to use Git, Jenkins, and other change management tools.
Good knowledge of database performance tuning, troubleshooting, and query optimization.
Who you are:
A graduate or postgraduate in BE/BTech, ME/MTech, or MCA is preferred.
An IT professional with 7-10 years of experience in data analytics, cloud technologies, and ETL development.
Excellent communication and prioritization skills.
Able to work proactively, independently or within a team, in a fast-paced Agile/Scrum environment.
A strong desire to keep improving your skills in software development, frameworks, and technologies.
Don't meet every single requirement? We encourage you to apply anyway. You might just be the right candidate for this or other roles. #LI-NS4
Who are we? At Gartner, Inc. (NYSE: IT), we guide the leaders who shape the world. Our mission relies on expert analysis and bold ideas to deliver actionable, objective insight, helping enterprise leaders and their teams succeed with their mission-critical priorities. Since our founding in 1979, we've grown to more than 21,000 associates globally who support ~14,000 client enterprises in ~90 countries and territories. We do important, interesting and substantive work that matters. That's why we hire associates with the intellectual curiosity, energy and drive to want to make a difference. The bar is unapologetically high. So is the impact you can have here.
What makes Gartner a great place to work? Our sustained success creates limitless opportunities for you to grow professionally and flourish personally.
We have a vast, virtually untapped market potential ahead of us, providing you with an exciting trajectory long into the future. How far you go is driven by your passion and performance. We hire remarkable people who collaborate and win as a team. Together, our singular, unifying goal is to deliver results for our clients. Our teams are inclusive and composed of individuals from different geographies, cultures, religions, ethnicities, races, genders, sexual orientations, abilities and generations. We invest in great leaders who bring out the best in you and the company, enabling us to multiply our impact and results. This is why, year after year, we are recognized worldwide as a great place to work.
What do we offer? Gartner offers world-class benefits, highly competitive compensation and disproportionate rewards for top performers. In our hybrid work environment, we provide the flexibility and support for you to thrive — working virtually when it's productive to do so and getting together with colleagues in a vibrant community that is purposeful, engaging and inspiring. Ready to grow your career with Gartner? Join us.
The policy of Gartner is to provide equal employment opportunities to all applicants and employees without regard to race, color, creed, religion, sex, sexual orientation, gender identity, marital status, citizenship status, age, national origin, ancestry, disability, veteran status, or any other legally protected status and to seek to advance the principles of equal employment opportunity. Gartner is committed to being an Equal Opportunity Employer and offers opportunities to all job seekers, including job seekers with disabilities. If you are a qualified individual with a disability or a disabled veteran, you may request a reasonable accommodation if you are unable or limited in your ability to use or access the Company's career webpage as a result of your disability.
You may request reasonable accommodations by calling Human Resources at +1 (203) 964-0096 or by sending an email to ApplicantAccommodations@gartner.com.
Job Requisition ID: 101327
By submitting your information and application, you confirm that you have read and agree to the country or regional recruitment notice linked below applicable to your place of residence.
Gartner Applicant Privacy Link: https://jobs.gartner.com/applicant-privacy-policy
For efficient navigation through the application, please only use the back button within the application, not the back arrow within your browser.

Posted 1 week ago
