5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a PySpark Data Engineer, you will play a crucial role in developing robust data processing and transformation solutions within our data platform. Your responsibilities will include designing, implementing, and maintaining PySpark-based applications to handle complex data processing tasks, ensuring data quality, and integrating with diverse data sources. To excel in this role, you should possess strong PySpark development skills, experience with big data technologies, and the ability to thrive in a fast-paced, data-driven environment.

Your primary responsibilities will involve designing, developing, and testing PySpark-based applications to process, transform, and analyze large-scale datasets from various sources such as relational databases, NoSQL databases, batch files, and real-time data streams. You will implement efficient data transformation and aggregation techniques using PySpark and relevant big data frameworks, as well as develop robust error handling and exception management mechanisms to maintain data integrity and system resilience within Spark jobs. Additionally, optimizing PySpark jobs for performance through techniques like partitioning, caching, and tuning of Spark configurations will be essential.

Collaboration will be key in this role, as you will work closely with data analysts, data scientists, and data architects to understand data processing requirements and deliver high-quality data solutions. By analyzing and interpreting data structures, formats, and relationships, you will implement effective data transformations using PySpark and work with distributed datasets in Spark to ensure optimal performance for large-scale data processing and analytics.

In terms of data integration and ETL processes, you will design and implement ETL (Extract, Transform, Load) processes to ingest and integrate data from various sources, ensuring consistency, accuracy, and performance.
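The partitioning technique mentioned above can be illustrated outside Spark itself. Below is a plain-Python sketch of hash partitioning, the mechanism behind `DataFrame.repartition(col)`; the dataset and column names are made up for illustration:

```python
from collections import defaultdict

def hash_partition(rows, key, num_partitions):
    """Route each row to a partition by hashing its key column, so all
    rows sharing a key land in the same partition."""
    partitions = defaultdict(list)
    for row in rows:
        partitions[hash(row[key]) % num_partitions].append(row)
    return dict(partitions)

orders = [
    {"customer": "acme", "amount": 120},
    {"customer": "globex", "amount": 75},
    {"customer": "acme", "amount": 40},
]
parts = hash_partition(orders, "customer", num_partitions=4)
# Co-locating all rows for a key is what lets a groupBy-style
# aggregation run per partition without a further shuffle.
```

Caching and shuffle-partition tuning build on the same idea: once data is laid out by key, repeated aggregations avoid recomputing the shuffle.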
Integration of PySpark applications with data sources such as SQL databases, NoSQL databases, data lakes, and streaming platforms will also be part of your responsibilities.

To excel in this role, you should possess a Bachelor's degree in Computer Science, Information Technology, or a related field, along with 5+ years of hands-on experience in big data development, preferably with exposure to data-intensive applications. A strong understanding of data processing principles, techniques, and best practices in a big data environment is essential, as is proficiency in PySpark, Apache Spark, and related big data technologies for data processing, analysis, and integration. Experience with ETL development and data pipeline orchestration tools such as Apache Airflow and Luigi will be advantageous. Strong analytical and problem-solving skills, along with excellent communication and collaboration abilities, will also be critical for success in this role.
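The error-handling and data-integrity pattern this posting describes usually means quarantining bad records instead of failing the whole batch. A minimal sketch, with hypothetical field names and validation rules:

```python
def transform_record(raw):
    """Parse and validate one raw record; raises on bad data."""
    amount = float(raw["amount"])          # ValueError on junk input
    if amount < 0:
        raise ValueError("negative amount")
    return {"customer": raw["customer"].strip().lower(), "amount": amount}

def run_batch(raw_records):
    """Transform a batch, routing failures to a quarantine list so
    one bad row cannot abort the whole job."""
    loaded, quarantined = [], []
    for raw in raw_records:
        try:
            loaded.append(transform_record(raw))
        except (KeyError, ValueError) as exc:
            quarantined.append({"record": raw, "error": str(exc)})
    return loaded, quarantined

good, bad = run_batch([
    {"customer": " Acme ", "amount": "12.5"},
    {"customer": "Globex", "amount": "oops"},   # quarantined, not fatal
])
```

In a real Spark job the same split is typically expressed as two filtered DataFrames, but the control flow is the same.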
Posted 16 hours ago
10.0 - 14.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for a highly motivated and experienced Data and Analytics Senior Architect to lead our Master Data Management (MDM) and Data Analytics team. As the Data and Analytics Architect Lead, you will be responsible for defining and implementing the overall data architecture strategy to ensure alignment with business goals and support data-driven decision-making. Your role will involve designing scalable, secure, and efficient data systems, including databases, data lakes, and data warehouses. You will evaluate and recommend tools and technologies for data integration, processing, storage, and analytics while staying updated on industry trends.

You will lead a high-performing team, fostering a collaborative and innovative culture, and ensuring data integrity, consistency, and availability across the organization. You will manage the existing MDM solution and data platform, based on Microsoft Data Lake Gen 2, Snowflake as the DWH, and Power BI, managing data from core applications. Additionally, you will drive further development to handle additional data and capabilities to support our AI journey. The ideal candidate will possess strong leadership skills, a deep understanding of data management and technology principles, and the ability to collaborate effectively across different departments and functions.

**Principal Duties and Responsibilities:**

**Team Leadership:**
- Lead, mentor, and develop a high-performing team of data analysts and MDM specialists.
- Foster a collaborative and innovative team culture that encourages continuous improvement and efficiency.
- Provide technical leadership and guidance to the development teams and oversee the implementation of IT solutions.

**Architecture:**
- Define the overall data architecture strategy, aligning it with business goals and ensuring it supports data-driven decision-making.
- Identify, evaluate, and establish shared enabling technical capabilities for the division in collaboration with IT to ensure consistency, quality, and business value.
- Design and oversee the implementation of data systems, including databases, data lakes, and data warehouses, ensuring they are scalable, secure, efficient, and cost-effective.
- Evaluate and recommend tools and technologies for data integration, processing, storage, and analytics, staying updated on industry trends.

**Strategic Planning:**
- Develop and implement the MDM and analytics strategy aligned with the overall team and organizational goals.
- Work with the Enterprise Architect to align on the overall strategy and application landscape, ensuring MDM and data analytics fit into the ecosystem.
- Identify opportunities to enhance data quality, governance, and analytics capabilities.

**Project Management:**
- Oversee project planning, execution, and delivery to ensure timely and successful completion of initiatives.
- Monitor project progress and cost, identify risks, and implement mitigation strategies.

**Stakeholder Engagement:**
- Collaborate with cross-functional teams to understand data needs and deliver solutions that support business objectives.
- Serve as a key point of contact for data-related inquiries and support requests.
- Develop business cases and proposals for IT investments and present them to senior management and stakeholders.

**Data/Information Governance:**
- Establish and enforce data/information governance policies and standards to ensure compliance and data integrity.
- Champion best practices in data management and analytics across the organization.

**Reporting and Analysis:**
- Utilize data analytics to derive insights and support decision-making processes.
- Document and present findings and recommendations to senior management.

**Knowledge, Skills and Abilities Required:**
- Bachelor's degree in Computer Science, Data Science, Information Management, or a related field; master's degree preferred.
- 10+ years of experience in data management, analytics, or a related field, with at least 2 years in a leadership role.
- Strong knowledge of master data management concepts, data governance, data technology, and analytics tools.
- Proficiency in data modeling, ETL processes, database management, big data technologies, and data integration techniques.
- Excellent project management skills with a proven track record of delivering complex projects on time and within budget.
- Strong analytical, problem-solving, and decision-making abilities.
- Exceptional communication and interpersonal skills.
- Team player, result-oriented, structured, with attention to detail and a strong work ethic.

**Special Competencies Required:**
- Proven leader with excellent structural skills, good at documenting and presenting.
- Strong executional skills to make things happen, not just generate ideas.
- Experience working with analytics tools and data ingestion platforms.
- Experience working with MDM solutions, preferably TIBCO EBX.
- Experience working with Jira/Confluence.

**Additional Information:**
- Office, remote, or hybrid working.
- Ability to work across variable time zones.
- International travel may be required.
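A core MDM concept behind this role, building a single golden record from conflicting source systems, can be sketched in a few lines. This is an illustrative survivorship rule (latest non-null value wins) with invented source records, not any specific MDM product's behavior:

```python
def golden_record(source_records):
    """Merge per-source records into one master record: for each
    attribute, the most recently updated non-null value survives."""
    merged = {}
    # Sort oldest first, so later (newer) records overwrite earlier values.
    for rec in sorted(source_records, key=lambda r: r["updated"]):
        for field, value in rec.items():
            if field != "updated" and value is not None:
                merged[field] = value
    return merged

crm = {"name": "ACME Corp", "phone": None, "updated": "2024-01-10"}
erp = {"name": "Acme Corporation", "phone": "555-0100", "updated": "2024-03-02"}
master = golden_record([erp, crm])
# → {"name": "Acme Corporation", "phone": "555-0100"}
```

Real MDM platforms add match/merge rules and stewardship workflows on top, but survivorship logic of this shape sits at the center.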
Posted 20 hours ago
10.0 - 14.0 years
0 Lacs
kolkata, west bengal
On-site
You are a highly skilled and strategic Data Architect with deep expertise in the Azure Data ecosystem. Your role will involve defining and driving the overall Azure-based data architecture strategy aligned with enterprise goals. You will architect and implement scalable data pipelines, data lakes, and data warehouses using Azure Data Lake, ADF, and Azure SQL/Synapse. Providing technical leadership on Azure Databricks for large-scale data processing and advanced analytics use cases is a crucial aspect of your responsibilities. Integrating AI/ML models into data pipelines and supporting the end-to-end ML lifecycle, including training, deployment, and monitoring, will be part of your day-to-day tasks.

Collaboration with cross-functional teams such as data scientists, DevOps engineers, and business analysts is essential. You will evaluate and recommend tools, platforms, and design patterns for data and ML infrastructure while mentoring data engineers and junior architects on best practices and architectural standards. Your role will require a strong background in data modeling, ETL/ELT frameworks, and data warehousing concepts. Proficiency in SQL, Python, and PySpark and a solid understanding of AI/ML workflows and tools are necessary. Exposure to Azure DevOps and excellent communication and stakeholder management skills are also key requirements.

As a Data Architect at Lexmark, you will play a vital role in designing and overseeing robust, scalable, and secure data architectures to support advanced analytics and machine learning workloads. If you are an innovator looking to make your mark with a global technology leader, apply now to join our team in Kolkata, India.
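The scalable pipelines described here commonly rely on an incremental, high-watermark load, the same pattern ADF uses for incremental copies. A Spark-free sketch under assumed column names and timestamp formats:

```python
def incremental_load(source_rows, target, watermark):
    """Copy only rows modified after the last watermark into the
    target, then advance the watermark to the newest row seen."""
    new_rows = [r for r in source_rows if r["modified"] > watermark]
    target.extend(new_rows)
    return max((r["modified"] for r in new_rows), default=watermark)

target, wm = [], "2024-01-01"
source = [
    {"id": 1, "modified": "2024-01-01"},   # at or before watermark: skipped
    {"id": 2, "modified": "2024-02-15"},
]
wm = incremental_load(source, target, wm)
# Re-running against an unchanged source loads nothing (idempotent).
wm = incremental_load(source, target, wm)
```

Persisting the watermark between runs (in a control table or pipeline variable) is what makes daily loads restartable.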
Posted 3 days ago
14.0 - 18.0 years
0 Lacs
pune, maharashtra
On-site
We are hiring for the position of AVP - Databricks, with a minimum of 14 years of experience. The role is based in Bangalore/Hyderabad/NCR/Kolkata/Mumbai/Pune.

As an AVP - Databricks, your responsibilities will include leading and managing Databricks-based project delivery to ensure solutions are designed, developed, and implemented according to client requirements and industry standards. You will act as the subject matter expert on Databricks, providing guidance on architecture, implementation, and optimization to teams. Collaboration with architects and engineers to design optimal solutions for data processing, analytics, and machine learning workloads is also a key aspect of the role. You will serve as the primary point of contact for clients to ensure alignment between business requirements and technical delivery.

The qualifications we seek include a Bachelor's degree in Computer Science, Engineering, or a related field (Master's or MBA preferred), along with relevant years of experience in IT services with a specific focus on Databricks and cloud-based data engineering.

Preferred qualifications/skills include proven experience in leading end-to-end delivery, solution design, and architecture of data engineering or analytics solutions on Databricks. Strong experience with cloud technologies such as AWS, Azure, and GCP, data pipelines, and big data tools is desirable. Hands-on experience with Databricks, Spark, Delta Lake, MLflow, and related technologies is a plus. Expertise in data engineering concepts, including ETL, data lakes, data warehousing, and distributed computing, will be beneficial for this role.
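Of the technologies named here, Delta Lake's MERGE (upsert) is the one that most often needs explaining: matched keys are updated, unmatched rows are inserted. A plain-Python sketch of that semantics with invented data, not Delta Lake's actual implementation:

```python
def merge_upsert(target, updates, key):
    """MERGE semantics: rows whose key already exists in the target
    are updated; rows with new keys are inserted."""
    index = {row[key]: row for row in target}
    for row in updates:
        index[row[key]] = row          # update-or-insert in one step
    return list(index.values())

current = [{"id": 1, "qty": 10}, {"id": 2, "qty": 5}]
changes = [{"id": 2, "qty": 7}, {"id": 3, "qty": 1}]
result = merge_upsert(current, changes, key="id")
# id 2 is updated in place; id 3 is appended; id 1 is untouched.
```

In Delta Lake the same operation runs transactionally over Parquet files, which is what makes it safe for concurrent pipelines.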
Posted 3 days ago
5.0 - 9.0 years
0 Lacs
thane, maharashtra
On-site
As the BI/BW Lead at DMart, you will lead and manage a dedicated SAP BW team to ensure the timely delivery of reports, dashboards, and analytics solutions. Your role will involve managing the team effectively and overseeing all SAP BW operational support tasks and development projects with a focus on high quality and efficiency. You will be responsible for maintaining the stability and performance of the SAP BW environment, managing daily support activities, and ensuring seamless data flow and reporting across the organization. Acting as the bridge between business stakeholders and your technical team, you will play a crucial role in enhancing DMart's data ecosystem.

You should possess a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. While SAP BW certifications are preferred, they are not mandatory.

Key Responsibilities:
- Lead and manage the SAP BW & BOBJ team, ensuring efficient workload distribution and timely task completion.
- Oversee the daily operational support of the SAP BW & BOBJ environment to maintain stability and performance.
- Provide direction and guidance to the team for issue resolution, data loads, and reporting accuracy.
- Serve as the primary point of contact for business users and internal teams regarding SAP BW support and enhancements.
- Ensure the team follows best practices in monitoring, error handling, and performance optimization.
- Drive continuous improvement of support processes, tools, and methodologies.
- Proactively identify risks and bottlenecks in data flows and take corrective actions.
- Ensure timely delivery of data extracts, reports, and dashboards for critical business decisions.
- Provide leadership in system upgrades, patching, and data model improvements.
- Facilitate knowledge sharing and skill development within the SAP BW team.
- Maintain high standards of data integrity and security in the BW environment.

Professional Skills:
- Strong functional and technical understanding of SAP BW / BW on HANA and BOBJ.
- At least 5 years of working experience with SAP Analytics.
- Solid knowledge of ETL processes and data extraction.
- Experience with data lake platforms such as Snowflake, BigQuery, and Databricks, and dashboard tools such as Power BI and Tableau, is advantageous.
- Experience in Retail, CPG, or SCM is a plus.
- Experience in managing SAP BW support activities and coordinating issue resolution.
- Strong stakeholder management skills with the ability to translate business needs into technical actions.
- Excellent problem-solving and decision-making abilities under pressure.
Posted 3 days ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
At PwC, our team in managed services specializes in providing outsourced solutions and supporting clients across various functions. We help organizations enhance their operations, reduce costs, and boost efficiency by managing key processes and functions on their behalf. Our expertise lies in project management, technology, and process optimization, allowing us to deliver high-quality services to our clients. In managed service management and strategy at PwC, the focus is on transitioning and running services, managing delivery teams, programs, commercials, performance, and delivery risk. Your role will involve continuous improvement and optimization of managed services processes, tools, and services. As a Managed Services - Data Engineer Senior Associate at PwC, you will be part of a team of problem solvers dedicated to addressing complex business issues from strategy to execution using Data, Analytics & Insights Skills. Your responsibilities will include using feedback and reflection to enhance self-awareness and personal strengths, acting as a subject matter expert in your chosen domain, mentoring junior resources, and conducting knowledge sharing sessions. You will be required to demonstrate critical thinking, ensure quality of deliverables, adhere to SLAs, and participate in incident, change, and problem management. Additionally, you will be expected to review your work and that of others for quality, accuracy, and relevance, as well as demonstrate leadership capabilities by working directly with clients and leading engagements. The primary skills required for this role include ETL/ELT, SQL, SSIS, SSMS, Informatica, and Python, with secondary skills in Azure/AWS/GCP, Power BI, Advanced Excel, and Excel Macro. 
As a Data Ingestion Senior Associate, you should have extensive experience in developing scalable, repeatable, and secure data structures and pipelines, designing and implementing ETL processes, monitoring and troubleshooting data pipelines, implementing data security measures, and creating visually impactful dashboards for data reporting. You should also have expertise in writing and analyzing complex SQL queries, be proficient in Excel, and possess strong communication, problem-solving, quantitative, and analytical abilities.

In our Managed Services platform, we focus on leveraging technology and human expertise to deliver simple yet powerful solutions to our clients. Our team of skilled professionals, combined with advanced technology and processes, enables us to provide effective outcomes and add greater value to our clients' enterprises. We aim to empower our clients to focus on their business priorities by providing flexible access to world-class business and technology capabilities that align with today's dynamic business environment.

If you are a candidate who thrives in a high-paced work environment, capable of handling critical Application Evolution Service offerings, engagement support, and strategic advisory work, then we are looking for you to join our team in the Data, Analytics & Insights Managed Service at PwC. Your role will involve working on a mix of help desk support, enhancement and optimization projects, and strategic roadmap initiatives, while also contributing to customer engagements from both a technical and relationship perspective.
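The "complex SQL queries" called out above often hinge on subqueries. A self-contained example using Python's built-in sqlite3 with an in-memory database and made-up tables, finding customers whose total spend exceeds the average order amount:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (customer TEXT, amount REAL);
    INSERT INTO orders VALUES
        ('acme', 120), ('globex', 30), ('initech', 90);
""")

# Scalar subquery: each customer's total is compared against the
# average single-order amount across all orders (here, 80).
rows = conn.execute("""
    SELECT customer, SUM(amount) AS total
    FROM orders
    GROUP BY customer
    HAVING SUM(amount) > (SELECT AVG(amount) FROM orders)
    ORDER BY total DESC
""").fetchall()
# → [('acme', 120.0), ('initech', 90.0)]
```

The same shape (aggregate compared against a scalar subquery) carries over unchanged to SQL Server or Informatica pushdown SQL.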
Posted 4 days ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.

As part of our Analytics and Insights Consumption team, you'll analyze data to drive useful insights for clients to address core business issues or to drive strategic outcomes. You'll use visualization, statistical and analytics models, AI/ML techniques, ModelOps, and other techniques to develop these insights. Candidates with 8+ years of hands-on experience are invited to join our team as we embark on a journey to drive innovation and change through data-driven solutions.

Responsibilities:
- Lead and manage a team of software engineers in developing, implementing, and maintaining advanced software solutions for GenAI projects.
- Engage with senior leadership and cross-functional teams to gather business requirements, identify opportunities for technological enhancements, and ensure alignment with organizational goals.
- Design and implement sophisticated event-driven architectures to support real-time data processing and analysis.
- Oversee the use of containerization technologies such as Kubernetes to promote efficient deployment and scalability of software applications.
- Supervise the development and management of extensive data lakes, ensuring effective storage and handling of large volumes of structured and unstructured data.
- Champion the use of Python as the primary programming language, setting high standards for software development within the team.
- Facilitate close collaboration between software engineers, data scientists, data engineers, and DevOps teams to ensure seamless integration and deployment of GenAI models.
- Maintain a cutting-edge knowledge base in GenAI technologies to drive innovation and continually enhance software engineering processes.
- Translate complex business needs into robust technical solutions, contributing to strategic decision-making processes.
- Establish and document software engineering processes, methodologies, and best practices, promoting a culture of excellence.
- Ensure continuous professional development of the team by maintaining and acquiring new solution architecture certificates and adhering to industry best practices.
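The event-driven architectures in the responsibilities above center on a broker that decouples publishers from subscribers: components share only a topic name, never a direct reference. A minimal in-process sketch (topic and event names are hypothetical; production systems would use Kafka or a cloud message bus):

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe broker: publishers never call
    subscribers directly, only the bus routes events by topic."""
    def __init__(self):
        self._handlers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._handlers[topic].append(handler)

    def publish(self, topic, event):
        for handler in self._handlers[topic]:
            handler(event)

bus = EventBus()
audit_log = []
bus.subscribe("doc.ingested", lambda e: audit_log.append(e["id"]))
bus.publish("doc.ingested", {"id": "doc-42"})
```

Because subscribers are registered independently, new consumers (say, a GenAI embedding job) can attach to "doc.ingested" without touching the producer.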
Posted 5 days ago
14.0 - 18.0 years
0 Lacs
karnataka
On-site
As the AVP Databricks Squad Delivery Lead, you will play a crucial role in overseeing project delivery, team leadership, architecture reviews, and client engagement. Your primary responsibility will be to optimize Databricks implementations across cloud platforms such as AWS, Azure, and GCP while leading cross-functional teams.

You will lead and manage the end-to-end delivery of Databricks-based solutions. Your expertise as a subject matter expert (SME) in Databricks architecture, implementation, and optimization will be essential. Collaborating with architects and engineers, you will design scalable data pipelines and analytics platforms. Additionally, you will oversee Databricks workspace setup, performance tuning, and cost optimization. Acting as the primary point of contact for client stakeholders, you will ensure effective communication and alignment between business goals and technical solutions. Driving innovation within the team, you will implement best practices, tools, and technologies to enhance project delivery.

The ideal candidate should possess a Bachelor's degree in Computer Science, Engineering, or equivalent (Master's or MBA preferred). Hands-on experience in delivering data engineering/analytics projects using Databricks and managing cloud-based data pipelines on AWS, Azure, or GCP is a must. Strong leadership skills and excellent client-facing communication are essential for this role.

Preferred skills include proficiency with Spark, Delta Lake, MLflow, and distributed computing. Expertise in data engineering concepts such as ETL, data lakes, and data warehousing is highly desirable. Certifications in Databricks or cloud platforms (AWS/Azure/GCP) and Agile/Scrum or PMP certification are considered advantageous.
Posted 5 days ago
7.0 - 11.0 years
0 Lacs
haryana
On-site
As an Assistant Manager - MIS Reporting at Axis Max Life Insurance, you will play a crucial role in driving the business intelligence team towards a data-driven culture and leading the transformation towards automation and real-time insights. You will be responsible for ensuring the accurate and timely delivery of reports and dashboards while coaching and mentoring a team of professionals to enhance their skills and capabilities.

Your key responsibilities will include handling distribution reporting requirements, supporting CXO reports and dashboards, driving data democratization, collaborating to design data products, and partnering with the data team to build the necessary data infrastructure. You will lead a team of 10+ professionals, including partners, and work closely with distribution leaders to understand the key metrics and information needs that business intelligence products must serve. Additionally, you will define the vision and roadmap for the business intelligence team, championing a data culture within Max Life and accelerating the journey towards becoming a data-driven organization.

To excel in this role, you should possess a Master's degree in a quantitative field, along with at least 7-8 years of relevant experience working with business reporting teams in the financial services sector. Proficiency in tools like Python and Power BI is essential, as is demonstrated experience in working with senior leadership, standardizing and automating business reporting, and technical proficiency in the BI tech stack. Strong interpersonal skills, excellent verbal and written communication abilities, and a deep understanding of data architecture, data warehousing, and data lakes are also required for this position.

If you are passionate about leading change, driving efficiency, and rationalizing information overload, we are looking for you to join our team at Axis Max Life Insurance.
Posted 5 days ago
14.0 - 18.0 years
0 Lacs
karnataka
On-site
We are hiring for the role of AVP - Databricks, requiring a minimum of 14 years of experience. The job location can be Bangalore, Hyderabad, NCR, Kolkata, Mumbai, or Pune.

As an AVP - Databricks, your responsibilities will include leading and managing Databricks-based project delivery to ensure that all solutions meet client requirements, best practices, and industry standards. You will serve as a subject matter expert (SME) on Databricks, providing guidance to teams on architecture, implementation, and optimization. Collaboration with architects and engineers to design optimal solutions for data processing, analytics, and machine learning workloads will also be part of your role. Additionally, you will act as the primary point of contact for clients, ensuring alignment between business requirements and technical delivery.

We are looking for a candidate with a Bachelor's degree in Computer Science, Engineering, or a related field (Master's or MBA preferred) and relevant years of experience in IT services, specifically in Databricks and cloud-based data engineering. Proven experience in leading end-to-end delivery and solution architecting of data engineering or analytics solutions on Databricks is a plus. Strong expertise in cloud technologies such as AWS, Azure, and GCP, data pipelines, and big data tools is desired. Hands-on experience with Databricks, Spark, Delta Lake, MLflow, and related technologies is a requirement. An in-depth understanding of data engineering concepts, including ETL, data lakes, data warehousing, and distributed computing, will be beneficial for this role.
Posted 5 days ago
5.0 - 9.0 years
0 Lacs
thane, maharashtra
On-site
As the BI/BW Lead at DMart, you will be responsible for leading and managing a dedicated SAP BW team to ensure timely delivery of reports, dashboards, and analytics solutions. Your role will focus on managing the team effectively and ensuring that all SAP BW operational support tasks and development projects are completed with high quality and efficiency. You will also be responsible for the stability and performance of the SAP BW environment, overseeing daily support activities and ensuring seamless data flow and reporting across the organization. Acting as the bridge between business stakeholders and your technical team, you will play a pivotal role in maintaining and enhancing DMart's data ecosystem.

Your educational qualifications should include a Bachelor's or Master's degree in Computer Science, Information Systems, Engineering, or a related field. While SAP BW certifications are preferred, they are not mandatory.

Key Responsibilities:
- Lead and manage the SAP BW & BOBJ team to ensure efficient workload distribution and timely completion of tasks.
- Oversee the daily operational support of the SAP BW & BOBJ environment, ensuring stability and performance.
- Provide direction and guidance to the team for issue resolution, data loads, and reporting accuracy.
- Act as the primary point of contact for business users and internal teams regarding SAP BW support and enhancements.
- Ensure the team follows best practices in monitoring, error handling, and performance optimization.
- Drive the continuous improvement of support processes, tools, and methodologies.
- Proactively identify potential risks and bottlenecks in data flows and take corrective actions.
- Ensure timely delivery of data extracts, reports, and dashboards for business-critical decisions.
- Provide leadership in system upgrades, patching, and data model improvements.
- Facilitate knowledge sharing and skill development within the SAP BW team.
- Maintain high standards of data integrity and security in the BW environment.

Professional Skills:
- Strong functional and technical understanding of SAP BW / BW on HANA and BOBJ.
- At least 5 years of working experience with SAP Analytics.
- Solid understanding of ETL processes and data extraction.
- Experience with data lake platforms such as Snowflake, BigQuery, and Databricks, and dashboard tools such as Power BI and Tableau, would be an added advantage.
- Experience working in Retail, CPG, or SCM would be an added advantage.
- Experience in managing SAP BW support activities and coordinating issue resolution.
- Strong stakeholder management skills with the ability to translate business needs into technical actions.
- Excellent problem-solving and decision-making abilities under pressure.
Posted 5 days ago
7.0 - 14.0 years
0 Lacs
hyderabad, telangana
On-site
As a Business Analyst with 7-14 years of experience, you will be responsible for various tasks, including creating Business Requirement Documents (BRDs) and Functional Requirement Documents (FRDs), stakeholder management, User Acceptance Testing (UAT), applying data warehouse concepts, writing SQL queries and subqueries, and using data visualization tools such as Power BI or MicroStrategy. It is essential that you have a deep understanding of the investment domain, specifically areas like capital markets, asset management, and wealth management.

Your primary responsibilities will involve working closely with stakeholders to gather requirements, analyzing data, and testing systems to ensure they meet business needs. Additionally, you should have a strong background in investment management or financial services, with experience in areas like asset management, investment operations, and insurance. Your familiarity with concepts like Critical Data Elements (CDEs), data traps, and reconciliation workflows will be beneficial in this role.

Technical expertise in BI and analytics tools like Power BI, Tableau, and MicroStrategy is required, along with proficiency in SQL. You should also possess excellent communication skills, analytical thinking capabilities, and the ability to engage effectively with stakeholders. Experience working in Agile/Scrum environments with cross-functional teams is highly valued.

In terms of technical skills, you should demonstrate proven analytical problem-solving abilities, with deep knowledge of investment data platforms such as GoldenSource, NeoXam, RIMES, and JPM Fusion. Expertise in cloud data technologies like Snowflake, Databricks, and AWS/GCP/Azure data services is essential. An understanding of data governance frameworks, metadata management, and data lineage is crucial, along with compliance standards in the investment management industry.

Hands-on experience with Investment Books of Record (IBORs) like BlackRock Aladdin, CRD, Eagle STAR (ABOR), Eagle PACE, and Eagle DataMart is preferred. Familiarity with investment data platforms including GoldenSource, FINBOURNE, NeoXam, RIMES, and JPM Fusion, as well as cloud data platforms like Snowflake and Databricks, will be advantageous. Your background in data governance, metadata management, and data lineage frameworks will be essential in ensuring data accuracy and compliance within the organization.
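The reconciliation workflows mentioned in this posting boil down to comparing the same positions across two systems (e.g. an IBOR and an accounting book) and reporting breaks. A sketch with invented position data and a simple quantity-difference rule:

```python
def reconcile(ibor, abor, tolerance=0.0):
    """Compare position quantities between two systems keyed by
    security; return breaks as (security, ibor_qty, abor_qty)."""
    breaks = []
    for security in sorted(set(ibor) | set(abor)):
        q1, q2 = ibor.get(security, 0.0), abor.get(security, 0.0)
        if abs(q1 - q2) > tolerance:
            breaks.append((security, q1, q2))
    return breaks

ibor_positions = {"AAPL": 1000.0, "MSFT": 500.0}
abor_positions = {"AAPL": 1000.0, "MSFT": 450.0, "GOOG": 25.0}
breaks = reconcile(ibor_positions, abor_positions)
# → [('GOOG', 0.0, 25.0), ('MSFT', 500.0, 450.0)]
```

Production workflows layer on market values, tolerances per asset class, and break-ageing, but the core comparison is this join-and-diff.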
Posted 5 days ago
2.0 - 15.0 years
0 Lacs
noida, uttar pradesh
On-site
You are a highly skilled and experienced professional tasked with leading and supporting data warehousing and data center architecture initiatives. Your expertise in data warehousing, data lakes, data integration, and data governance, along with hands-on experience in ETL tools and cloud platforms such as AWS, Azure, GCP, and Snowflake, will be crucial for this role. You are expected to have strong presales experience, technical leadership capabilities, and the ability to manage complex enterprise deals across various geographies.

Your main responsibilities will include architecting and designing scalable data warehousing and data lake solutions, leading presales engagements, creating and presenting proposals and solution designs to clients, collaborating with cross-functional teams, estimating efforts and resources for customer requirements, driving Managed Services opportunities and enterprise deal closures, engaging with clients globally, ensuring alignment of solutions with business goals and technical requirements, and maintaining high standards of documentation and presentation for client-facing materials.

To excel in this role, you must possess a Bachelor's or Master's degree in Computer Science, Information Technology, or a related field; certifications in AWS, Azure, GCP, or Snowflake are advantageous. You should have experience working in consulting or system integrator environments, a strong understanding of data warehousing, data lakes, data integration, and data governance, hands-on experience with ETL tools, exposure to cloud environments, a minimum of 2 years of presales experience, experience in enterprise-level deals and Managed Services, the ability to handle multi-geo engagements, excellent presentation and communication skills, and a solid grasp of effort estimation techniques for customer requirements.
Posted 5 days ago
5.0 - 15.0 years
0 Lacs
noida, uttar pradesh
On-site
HCLTech is seeking a Data and AI Principal / Senior Manager (Generative AI) for their Noida location. As a global technology company with a workforce of over 218,000 employees in 59 countries, HCLTech specializes in digital, engineering, cloud, and AI solutions. The company collaborates with clients across various industries such as Financial Services, Manufacturing, Life Sciences, Healthcare, Technology, Telecom, Retail, and Public Services, offering innovative technology services and products. With consolidated revenues of $13.7 billion as of the 12 months ending September 2024, HCLTech aims to drive progress and transformation for its clients globally. Key Responsibilities: In this role, you will be responsible for providing hands-on technical leadership and oversight, including leading the design of AI and GenAI solutions, machine learning pipelines, and data architectures. You will actively contribute to coding, solution design, and troubleshooting critical components, collaborating with Account Teams, Client Partners, and Domain SMEs to ensure technical solutions align with business needs. Additionally, you will mentor and guide engineers across various functions to foster a collaborative and high-performance team environment. As part of the role, you will design and implement system and API architectures, integrating microservices, RESTful APIs, cloud-based services, and machine learning models seamlessly into GenAI and data platforms. You will lead the integration of AI, GenAI, and Agentic applications, NLP models, and large language models into scalable production systems. You will also architect ETL pipelines, data lakes, and data warehouses using tools like Apache Spark, Airflow, and Google BigQuery, and drive deployment using cloud platforms such as AWS, Azure, and GCP. Furthermore, you will lead the design and deployment of machine learning models using frameworks like PyTorch, TensorFlow, and scikit-learn, ensuring accurate and reliable outputs. 
You will develop prompt engineering techniques for GenAI models and implement best practices for ML model performance monitoring and continuous training. The role also involves expertise in CI/CD pipelines, Infrastructure-as-Code, cloud management, stakeholder communication, agile development, performance optimization, and scalability strategies. Required Qualifications: - 15+ years of hands-on technical experience in software engineering, with at least 5+ years in a leadership role managing cross-functional teams in AI, GenAI, machine learning, data engineering, and cloud infrastructure. - Proficiency in Python and experience with Flask, Django, or FastAPI for API development. - Extensive experience in building and deploying ML models using TensorFlow, PyTorch, scikit-learn, and spaCy, and integrating them into AI frameworks. - Familiarity with ETL pipelines, data lakes, data warehouses, and data processing tools like Apache Spark, Airflow, and Kafka. - Strong expertise in CI/CD pipelines, containerization, Infrastructure-as-Code, and API security for high-traffic systems. If you are interested in this position, please share your profile with the required details including Overall Experience, Skills, Current and Preferred Location, Current and Expected CTC, and Notice Period to paridhnya_dhawankar@hcltech.com.
Posted 6 days ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. The ideal candidate should possess a solid background in data architecture, cloud data platforms, and Snowflake implementation, along with practical experience in end-to-end data pipeline and data warehouse design. In this role, you will be responsible for leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. You will also be tasked with defining data modeling standards, best practices, and governance frameworks. Collaborating with stakeholders to comprehend data requirements and translating them into robust architectural solutions will be a key part of your responsibilities. Furthermore, you will be required to design and optimize ETL/ELT pipelines utilizing tools like Snowpipe, Azure Data Factory, Informatica, or DBT. Implementing data security, privacy, and role-based access controls within Snowflake is also essential. Providing guidance to development teams on performance tuning, query optimization, and cost management within Snowflake will be part of your duties. Additionally, ensuring high availability, fault tolerance, and compliance across data platforms will be crucial. Mentoring developers and junior architects on Snowflake capabilities is an important aspect of this role. Qualifications and Experience: - 8+ years of overall experience in data engineering, BI, or data architecture, with a minimum of 3+ years of hands-on Snowflake experience. - Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization. - Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP). - Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion. - Good understanding of data lakes, data mesh, and modern data stack principles. 
- Experience with CI/CD for data pipelines, DevOps, and data quality frameworks. - Solid knowledge of data governance, metadata management, and cataloging. Desired Skills: - Snowflake certification (e.g., SnowPro Core/Advanced Architect). - Familiarity with Apache Airflow, Kafka, or event-driven data ingestion. - Knowledge of data visualization tools such as Power BI, Tableau, or Looker. - Experience in healthcare, BFSI, or retail domain projects. Please note that this job description is sourced from hirist.tech.
Posted 6 days ago
3.0 - 7.0 years
0 Lacs
maharashtra
On-site
As a Software Engineer in the Direct Platform Quality team at Morningstar's Enterprise Data Platform (EDP), you will play a crucial role in developing and maintaining data quality solutions to enhance Morningstar's client experience. You will collaborate with a quality engineering team to automate the creation of client scorecards and conduct data-specific audit and benchmarking activities. By partnering with key stakeholders like Product Managers and Senior Engineers, you will contribute to the development and execution of data quality control suites. Your responsibilities will include developing and deploying quality solutions using best practices of Software Engineering, building applications and services for Data Quality Benchmarking and Data Consistency Solutions, and adding new features as per the Direct Platform Quality initiatives' product roadmap. You will also be required to participate in periodic calls during US or European hours and adhere to coding standards and guidelines. To excel in this role, you should have a minimum of 3 years of hands-on experience in software engineering with a focus on building and deploying applications for data analytics. Proficiency in Python, Object Oriented Programming, SQL, and AWS Cloud is essential, with AWS certification being a plus. Additionally, expertise in big data open-source technologies, Analytics & ML/AI, public cloud services, and cloud-native architectures is required. Experience in working on Data Analytics and Data Quality projects for AMCs, Banks, Hedge Funds, and designing complex data pipelines in a Cloud Environment will be advantageous. An advanced degree in engineering, computer science, or a related field is preferred, along with experience in the Financial Domain. Familiarity with Agile software engineering practices and mutual fund, fixed income, and equity data is beneficial.
At Morningstar, we believe in continuous learning and expect you to stay abreast of software engineering, cloud and data science, and financial research trends. Your contributions to the technology strategy will lead to the development of superior products, streamlined processes, effective communication, and faster delivery times. As our products have a global reach, a global mindset is essential for success in this role. Morningstar is committed to providing an equal opportunity work environment. Our hybrid work model allows for remote work with regular in-person collaboration, fostering a culture of flexibility and connectivity among global colleagues. Join us at Morningstar to be part of a dynamic team that values innovation, collaboration, and personal growth.
Posted 6 days ago
5.0 - 15.0 years
0 Lacs
noida, uttar pradesh
On-site
HCLTech is looking for a Data and AI Principal / Senior Manager (Generative AI) to join their team in Noida. As a global technology company with a strong presence in 59 countries and over 218,000 employees, HCLTech is a leader in digital, engineering, cloud, and AI services. They collaborate with clients in various industries such as Financial Services, Manufacturing, Life Sciences, Healthcare, Technology, Telecom, Media, Retail, and Public Services. With consolidated revenues of $13.7 billion, HCLTech aims to provide industry-leading capabilities to drive progress for their clients. In this role, you will be responsible for providing hands-on technical leadership and oversight. This includes leading the design of AI, GenAI solutions, machine learning pipelines, and data architectures to ensure performance, scalability, and resilience. You will actively contribute to coding, code reviews, and solution design, while working closely with Account Teams, Client Partners, and Domain SMEs to align technical solutions with business needs. Mentoring and guiding engineers across various functions will be an essential aspect of this role, fostering a collaborative and high-performance team environment. Your role will also involve designing and implementing system and API architectures, integrating AI, GenAI, and Agentic applications into production systems, and architecting ETL pipelines, data lakes, and data warehouses using industry-leading tools. You will drive the deployment and scaling of solutions using cloud platforms like AWS, Azure, and GCP, while leading the integration of machine learning models into end-to-end production workflows. Additionally, you will be responsible for leading CI/CD pipeline efforts, infrastructure automation, and ensuring robust integration with cloud platforms. Stakeholder communication, promoting Agile methodologies, and optimizing performance and scalability of applications will be key responsibilities. 
The ideal candidate will have at least 15 years of hands-on technical experience in software engineering, with a focus on AI, GenAI, machine learning, data engineering, and cloud infrastructure. If you meet the qualifications and are passionate about driving innovation in AI and data technologies, we invite you to share your profile with us. Kindly email your details to paridhnya_dhawankar@hcltech.com including your overall experience, skills, current and preferred location, current and expected CTC, and notice period. We look forward to hearing from you and exploring the opportunity to work together at HCLTech.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
haryana
On-site
As SM - MIS Reporting at Axis Max Life Insurance in the BPMA department, you will play a crucial role in leading the reporting function for all distribution functions. Your responsibilities will include defining the vision and roadmap for the business intelligence team, championing a data culture within Max Life, and driving the transformation towards automation and real-time insights. You will lead a team of 10+ professionals, including partners, and coach and mentor them to continuously enhance their skills and capabilities. Your key responsibilities will involve handling distribution reporting requirements across functions and job families to support strategic priorities and performance management. You will ensure the timely and accurate delivery of reports and dashboards, identify opportunities to automate reporting processes, and collaborate with the data team to design and build data products for the distribution teams. Additionally, you will work towards driving a data democratization culture and developing the data infrastructure necessary for efficient analysis and reporting. To qualify for this role, you should possess a Master's degree in a quantitative field, along with at least 7-8 years of relevant experience in working with business reporting teams. Experience in the financial services sector, proficiency in Python and PowerBI, and familiarity with BI tech stack tools like SQL Server Reporting Services and SAP BO are preferred. You should also have a strong understanding of data architecture, data warehousing, and data lakes, as well as excellent interpersonal, verbal, and written communication skills. Join us at Axis Max Life Insurance to be part of a dynamic team that is focused on leveraging data-driven insights to enhance business performance and drive strategic decision-making.
Posted 1 week ago
3.0 - 7.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Engineer specializing in Snowflake architecture, you will be responsible for designing and implementing scalable data warehouse architectures, including schema modeling and data partitioning. Your role will involve leading or supporting data migration projects to Snowflake from on-premise or legacy cloud platforms. You will be developing ETL/ELT pipelines and integrating data using various tools such as DBT, Fivetran, Informatica, and Airflow. It will be essential to define and implement best practices for data modeling, query optimization, and storage efficiency within Snowflake. Collaboration with cross-functional teams, including data engineers, analysts, BI developers, and stakeholders, will be crucial to align architectural solutions effectively. Ensuring data governance, compliance, and security by implementing RBAC, masking policies, and access control within Snowflake will also be part of your responsibilities. Working closely with DevOps teams to enable CI/CD pipelines, monitoring, and infrastructure as code for Snowflake environments is essential. Your role will involve optimizing resource utilization, monitoring workloads, and managing the cost-effectiveness of the platform. Staying updated with Snowflake features, cloud vendor offerings, and best practices will be necessary to drive continuous improvement in data architecture. Qualifications & Skills: - Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. - 5+ years of experience in data engineering, data warehousing, or analytics architecture. - 3+ years of hands-on experience in Snowflake architecture, development, and administration. - Strong knowledge of cloud platforms such as AWS, Azure, or GCP. - Solid understanding of SQL, data modeling, and data transformation principles. - Experience with ETL/ELT tools, orchestration frameworks, and data integration. - Familiarity with data privacy regulations (GDPR, HIPAA, etc.) and compliance. 
Additional Qualifications: - Snowflake certification (SnowPro Core / Advanced). - Experience in building data lakes, data mesh architectures, or streaming data platforms. - Familiarity with tools like Power BI, Tableau, or Looker for downstream analytics. - Experience with Agile delivery models and CI/CD workflows. This role offers an exciting opportunity to work on cutting-edge data architecture projects and collaborate with diverse teams to drive impactful business outcomes.
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge. As part of our Analytics and Insights Consumption team, you'll analyze data to drive useful insights for clients to address core business issues or to drive strategic outcomes. You'll use visualization, statistical and analytics models, AI/ML techniques, ModelOps, and other techniques to develop these insights. Candidates with 8+ years of hands-on experience are preferred for this role. Lead and manage a team of software engineers in developing, implementing, and maintaining advanced software solutions for GenAI projects. Engage with senior leadership and cross-functional teams to gather business requirements, identify opportunities for technological enhancements, and ensure alignment with organizational goals. Design and implement sophisticated event-driven architectures to support real-time data processing and analysis. Oversee the use of containerization technologies such as Kubernetes to promote efficient deployment and scalability of software applications. Supervise the development and management of extensive data lakes, ensuring effective storage and handling of large volumes of structured and unstructured data. Champion the use of Python as the primary programming language, setting high standards for software development within the team.
Facilitate close collaboration between software engineers, data scientists, data engineers, and DevOps teams to ensure seamless integration and deployment of GenAI models. Maintain a cutting-edge knowledge base in GenAI technologies to drive innovation and enhance software engineering processes continually. Translate complex business needs into robust technical solutions, contributing to strategic decision-making processes. Establish and document software engineering processes, methodologies, and best practices, promoting a culture of excellence. Ensure continuous professional development of the team by maintaining and acquiring new solution architecture certificates and adhering to industry best practices.
Posted 1 week ago
3.0 - 8.0 years
0 Lacs
chennai, tamil nadu
On-site
We are seeking a highly skilled and experienced Snowflake Architect to take charge of designing, developing, and deploying enterprise-grade cloud data solutions. As the ideal candidate, you should possess a robust background in data architecture, cloud data platforms, and Snowflake implementation. Hands-on experience in end-to-end data pipeline and data warehouse design is essential for this role. Your responsibilities will include leading the architecture, design, and implementation of scalable Snowflake-based data warehousing solutions. You will be tasked with defining data modeling standards, best practices, and governance frameworks. Designing and optimizing ETL/ELT pipelines using tools such as Snowpipe, Azure Data Factory, Informatica, or DBT will be a key aspect of your role. Collaboration with stakeholders to understand data requirements and translating them into robust architectural solutions will also be expected. Additionally, you will be responsible for implementing data security, privacy, and role-based access controls within Snowflake. Guiding development teams on performance tuning, query optimization, and cost management in Snowflake is crucial. Ensuring high availability, fault tolerance, and compliance across data platforms will also fall under your purview. Mentoring developers and junior architects on Snowflake capabilities is another important aspect of this role. In terms of Skills & Experience, we are looking for candidates with at least 8+ years of overall experience in data engineering, BI, or data architecture, and a minimum of 3+ years of hands-on Snowflake experience. Expertise in Snowflake architecture, data sharing, virtual warehouses, clustering, and performance optimization is highly desirable. Strong proficiency in SQL, Python, and cloud data services (e.g., AWS, Azure, or GCP) is required. Hands-on experience with ETL/ELT tools like ADF, Informatica, Talend, DBT, or Matillion is also necessary. 
A good understanding of data lakes, data mesh, and modern data stack principles is preferred. Experience with CI/CD for data pipelines, DevOps, and data quality frameworks is a plus. Solid knowledge of data governance, metadata management, and cataloging is beneficial. Preferred qualifications include holding a Snowflake certification (e.g., SnowPro Core/Advanced Architect), familiarity with Apache Airflow, Kafka, or event-driven data ingestion, knowledge of data visualization tools such as Power BI, Tableau, or Looker, and experience in healthcare, BFSI, or retail domain projects. If you meet these requirements and are ready to take on a challenging and rewarding role as a Snowflake Architect, we encourage you to apply.
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
haryana
On-site
As the Senior Solution Architect reporting to the Director of Consulting, you play a crucial role in creating innovative solutions for both new and existing clients. Your main focus will be on utilizing data to drive the architecture and strategy of Digital Experience Platforms (DXP). You will be pivotal in shaping the technology and software architecture, particularly emphasizing how data-driven insights influence the design and execution of client projects. Your expertise will steer the development of solutions deeply rooted in CMS, CDP, CRM, loyalty, and analytics-intensive platforms, integrating ML and AI capabilities. The core of our approach revolves around harnessing data to craft composable, insightful, and efficient DXP solutions. You are a client-facing professional, often engaging directly with potential customers to shape technically and commercially viable solutions. As a mentor, you lead by example and encourage others to expand their horizons. A self-starter in your field, you are keen on enhancing your skill set in the ever-evolving Omnichannel solutions landscape. Collaborative by nature, you thrive on working with various cross-functional teams on a daily basis. An effective communicator, you possess the ability to capture attention, strategize, and troubleshoot on the fly with both internal and external team members. Your mastery of written language allows you to deliver compelling technical proposals to both new and existing clients. In your role, you will be responsible for discussing technical solutions with current and potential clients, as well as internal teams, to introduce innovative ideas for creating functional and appealing digital environments. You will contribute to our clients' digital transformation strategies based on industry best practices and act as a subject matter expert in business development activities.
Furthermore, you will collaborate closely with product vendor partners, client partners, strategists, and delivery engagement leaders to tailor solutions to meet clients' explicit and implicit needs. Your tasks will involve architecting technical solutions that meet clients' requirements, selecting and evaluating technology frameworks, and addressing complex business problems with comprehensive assessments. You will also be responsible for articulating the transition from the current to future state, breaking down intricate business and technical strategies into manageable requirements for teams to execute. Your role will also involve conceptualizing and sharing knowledge and thought leadership within the organization, researching and presenting new technology trends, participating in project requirement discovery, scoping, and providing estimations for phased program delivery. The ideal candidate for the Solutions Architect position will have 12+ years of experience in designing, developing, and supporting large-scale web applications, along with expertise in cloud-native capabilities of AWS, GCP, or Azure. They should also possess hands-on experience in data architectures, storage solutions, data processing workflows, CDPs, data lakes, analytics solutions, and customer-facing applications. Additionally, experience in client-facing technology consulting, E-Commerce platforms, and knowledge of digital marketing trends and best practices are desired. A bachelor's degree in a relevant field is required. The statements within this job description outline the essential functions of this role, the necessary level of knowledge and skills, and the extent of responsibility. It should not be viewed as an exhaustive list of job requirements. Individuals may be assigned other duties as needed to cover absences, balance organizational workload, or work in different functional areas.
Material is a global company known for partnering with top brands worldwide and launching innovative products. We value inclusion, collaboration, and expertise in our work, with a commitment to understanding human behavior and applying a scientific approach. We offer a learning-focused community dedicated to creating impactful experiences and making a difference in people's lives. Additionally, we provide professional development, a hybrid work mode, health and family insurance, ample leave allowances, and wellness programs, ensuring a supportive and enriching work environment.
Posted 1 week ago
10.0 - 14.0 years
0 Lacs
chennai, tamil nadu
On-site
We are looking for a highly motivated and experienced Data and Analytics Senior Architect to lead our Master Data Management (MDM) and Data Analytics team. As the Architect Lead, you will be responsible for defining and implementing the overall data architecture strategy to ensure alignment with business goals and support data-driven decision-making. Your role will involve designing scalable, secure, and efficient data systems, including databases, data lakes, and data warehouses. You will evaluate and recommend tools and technologies for data integration, processing, storage, and analytics while staying updated on industry trends. Additionally, you will lead a high-performing team, foster a collaborative and innovative culture, and ensure data integrity, consistency, and availability across the organization. Our existing MDM solution is based on Microsoft Azure Data Lake Gen2, Snowflake as the DWH, and Power BI managing data from most of our core applications. You will be managing the existing solution and driving further development to handle additional data and capabilities, as well as supporting our AI journey. The ideal candidate will possess strong leadership skills, a deep understanding of data management and technology principles, and the ability to collaborate effectively across different departments and functions. **Principal Duties and Responsibilities:** **Team Leadership:** - Lead, mentor, and develop a high-performing team of data analysts and MDM specialists. - Foster a collaborative and innovative team culture that encourages continuous improvement and efficiency. - Provide technical leadership and guidance to the development teams and oversee the implementation of IT solutions. **Architect:** - Define the overall data architecture strategy, aligning it with business goals and ensuring it supports data-driven decision-making.
- Identify, evaluate, and establish shared enabling technical capabilities for the division in collaboration with IT to ensure consistency, quality, and business value. - Design and oversee the implementation of data systems, including databases, data lakes, and data warehouses, ensuring they are scalable, secure, efficient, and cost-effective. - Evaluate and recommend tools and technologies for data integration, processing, storage, and analytics, staying updated on industry trends. **Strategic Planning:** - Take part in developing and implementing the MDM and analytics strategy aligned with the overall team and organizational goals. - Collaborate with the Enterprise Architect to align on the overall strategy and application landscape, ensuring that MDM and data analytics fit into the overall ecosystem. - Identify opportunities to enhance data quality, governance, and analytics capabilities. **Project Management:** - Oversee project planning, execution, and delivery to ensure timely and successful completion of initiatives and support. - Monitor project progress and cost, identify risks, and implement mitigation strategies. **Stakeholder Engagement:** - Collaborate with cross-functional teams to understand data needs and deliver solutions that support business objectives. - Serve as a key point of contact for data-related inquiries and support requests. - Actively develop business cases and proposals for IT investments and present them to senior management, executives, and stakeholders. **Data/Information Governance:** - Establish and enforce data/information governance policies and standards to ensure compliance and data integrity. - Champion best practices in data management and analytics across the organization. **Reporting and Analysis:** - Utilize data analytics to derive insights and support decision-making processes. - Document and present findings and recommendations to senior management and stakeholders.
**Knowledge, Skills and Abilities Required:** - Bachelor's degree in Computer Science, Data Science, Information Management, or a related field; master's degree preferred. - 10+ years of experience in data management, analytics, or a related field, with at least 2 years in a leadership role. - Management advisory skills, such as strategic thinking, problem-solving, business acumen, stakeholder management, and change management. - Strong knowledge of master data management concepts, data governance, data technology, data modeling, ETL processes, database management, big data technologies, and data integration techniques. - Excellent project management skills with a proven track record of delivering complex projects on time and within budget. - Strong analytical, problem-solving, and decision-making abilities. - Exceptional communication and interpersonal skills, with the ability to engage and influence stakeholders at all levels. - Team player, result-oriented, structured, attention to detail, drive for accuracy, and strong work ethic. **Special Competencies Required:** - Proven leader with excellent structural skills, good at documenting as well as presenting. - Strong executional skills to make things happen: not only generating ideas, but also getting things done that create value for the entire organization. - Proven experience in working with analytics tools as well as data ingestion and platforms like Power BI, Azure Data Lake, Snowflake, etc. - Experience working with an MDM solution, preferably TIBCO EBX. - Experience working with Jira/Confluence. **Additional Information:** - Office, remote, or hybrid working. - Ability to function within variable time zones. - International travel may be required. Join us at the ASSA ABLOY Group, where our innovations make physical and virtual spaces safer, more secure, and easier to access. As an employer, we value results and empower our people to build their career around their aspirations and our ambitions.
We foster diverse, inclusive teams and welcome different perspectives and experiences.
Posted 1 week ago
9.0 - 13.0 years
0 Lacs
karnataka
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. As a Lead Data Engineer at EY, you will play a crucial role in leading large-scale solution architecture design and optimization to provide streamlined insights to partners throughout the business. You will lead a team of mid-level and senior data engineers who collaborate with the visualization team on data quality and troubleshooting needs. Your key responsibilities will include implementing data processes for the data warehouse and internal systems, leading a team of junior and senior data engineers in executing data processes, managing data architecture, designing ETL processes, cleaning, aggregating, and organizing data from various sources, and transferring it to data warehouses. You will be responsible for leading the development, testing, and maintenance of data pipelines and platforms to enable data quality utilization within business dashboards and tools. Additionally, you will support team members and direct reports in refining and validating data sets, create, maintain, and support the data platform and infrastructure, and collaborate with various teams to understand data requirements and design solutions that enable advanced analytics, machine learning, and predictive modeling. To qualify for this role, you must have a Bachelor's degree in Engineering, Computer Science, Data Science, or related field, along with 9+ years of experience in software development, data engineering, ETL, and analytics reporting development.
You should possess expertise in building and maintaining data and system integrations using dimensional data modeling and optimized ETL pipelines, as well as experience with modern data architectures and frameworks such as data mesh, data fabric, and data product design. Other essential skills include proficiency in data engineering programming languages such as Python; distributed data technologies such as PySpark; cloud platforms and tools such as Kubernetes and AWS services; relational SQL databases; DevOps; and continuous integration. You should have a deep understanding of database architecture and administration, excellent written and verbal communication skills, strong organizational skills, problem-solving abilities, and the capacity to work in a fast-paced environment while adapting to changing business priorities.

Desired skills for this role include a Master's degree in Engineering, Computer Science, Data Science, or a related field, as well as experience in a global working environment. Travel requirements may include access to transportation to attend meetings and the ability to travel regionally and globally.

Join EY in building a better working world, where diverse teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate across various sectors.
Posted 1 week ago
12.0 - 16.0 years
0 Lacs
haryana
On-site
You will play a crucial role as a Senior Solution Architect, reporting directly to the Director of Consulting, within an innovative team dedicated to crafting cutting-edge solutions for both new and existing clients. Your primary focus will be harnessing the power of data to drive the architecture and strategy of Digital Experience Platforms (DXPs). Your expertise will be instrumental in shaping the technology and software architecture, with a particular emphasis on using data-driven insights to inform the design and execution of client initiatives.

Your responsibilities will revolve around developing solutions deeply rooted in CMS, CDP, CRM, loyalty, and analytics-intensive platforms, integrating Machine Learning (ML) and Artificial Intelligence (AI) capabilities. The core of our approach is leveraging data to create adaptable, insightful, and impactful DXP solutions.

You will be client-facing, actively engaging with potential customers to shape technically and commercially viable solutions. As a mentor, you will lead by example, challenging your peers to expand their horizons. A self-starter in your field, you remain eager to enhance your skill set within the ever-evolving realm of omnichannel solutions. Collaboration is key, as you thrive on working with diverse cross-functional teams daily. An effective communicator, you can command attention, strategize, and troubleshoot on the spot when working with both internal and external team members. Your proficiency in written communication allows you to deliver compelling technical proposals to both prospective and existing clients.

Your day-to-day responsibilities will include:
- Engaging in technical solution strategy discussions with current and potential clients, as well as internal teams, to introduce innovative ideas for creating functional and appealing digital environments.
- Collaborating closely with product vendor partners, strategists, and delivery engagement leaders to tailor solutions to meet clients' explicit and underlying needs.
- Designing the technical architecture of various solutions to meet clients' requirements, selecting and evaluating technology frameworks, and solving intricate business problems through thorough analysis.
- Articulating the transition from the current state to the future state while considering the business's future needs, security policies, and requirements.
- Sharing knowledge and thought leadership within the organization, presenting recommendations on technical direction and team professional development, and staying abreast of new technology trends.
- Participating in technical project requirement discovery and scoping, providing phased program delivery recommendations, and contributing to project delivery estimations based on your suggestions.

To excel in the role of Solution Architect, you should possess the following competencies:
- 12+ years of experience in designing, developing, and supporting large-scale web applications.
- Proficiency in developing modern applications using the cloud-native capabilities of AWS, GCP, or Azure.
- Extensive experience in designing and implementing efficient data architectures, storage solutions, and data processing workflows, with a focus on stream processing, event queuing capabilities, and advanced CI/CD pipelines.
- Hands-on experience with CDPs, data lakes, and analytics solutions, along with customer-facing applications and content management systems.
- Experience in client-facing technology consulting roles, understanding business needs and translating them into solutions for customer acquisition, engagement, and retention.
- Knowledge of current digital marketing trends and best practices, along with the ability to define conceptual technology solutions and articulate the value of technology to drive creative marketing platforms.
In addition to engaging, high-impact work, Material offers a vibrant company culture and a range of benefits, including professional development opportunities, a hybrid work model, health and family insurance coverage, ample leave entitlements, wellness programs, and more.
Posted 1 week ago