6.0 - 8.0 years
9 - 14 Lacs
Chennai
Work from Office
This position provides input, support, and performs full systems life cycle management activities (e.g., analyses, technical requirements, design, coding, testing, implementation of systems and applications software). He/She participates in component and data architecture design, technology planning, and testing for Applications Development (AD) initiatives to meet business requirements. This position provides input to applications development project plans and integrations. He/She collaborates with teams and supports emerging technologies to ensure effective communication and achievement of objectives. This position provides knowledge and support for applications development, integration, and maintenance. He/She provides input to department and project teams on decisions supporting projects.
Responsibilities:
- Performs systems analysis and design
- Designs and develops moderate to highly complex applications
- Develops application documentation
- Produces integration builds
- Performs maintenance and support
- Supports emerging technologies and products
Qualifications:
- Minimum 6-8 years of experience with Java Spring Boot and RESTful web service client/server development
- Proficient with SQL, PL/SQL, and Oracle databases
- Proficient with AMQ
- Nice to have: OpenShift, Azure DevOps Server, Jenkins CI/CD pipelines
- Knowledge of code quality inspection tools, dependency management systems, and software vulnerability detection and remediation
- Familiarity with Agile development and sprint ceremonies
- Must be detail-oriented
- Excellent verbal and written communication skills
- Bachelor's Degree or international equivalent in Computer Science, Information Systems, Mathematics, Statistics, or a related field - Preferred
Posted 3 weeks ago
5.0 - 10.0 years
25 - 30 Lacs
Kolkata, Mumbai, New Delhi
Work from Office
Coursera was launched in 2012 by Andrew Ng and Daphne Koller, with a mission to provide universal access to world-class learning. It is now one of the largest online learning platforms in the world, with 175 million registered learners as of March 31, 2025. Coursera partners with over 350 leading universities and industry leaders to offer a broad catalog of content and credentials, including courses, Specializations, Professional Certificates, and degrees. Coursera's platform innovations enable instructors to deliver scalable, personalized, and verified learning experiences to their learners. Institutions worldwide rely on Coursera to upskill and reskill their employees, citizens, and students in high-demand fields such as GenAI, data science, technology, and business. Coursera is a Delaware public benefit corporation and a B Corp. Join us in our mission to create a world where anyone, anywhere can transform their life through access to education. We're seeking talented individuals who share our passion and drive to revolutionize the way the world learns.
At Coursera, we are committed to building a globally diverse team and are thrilled to extend employment opportunities to individuals in any country where we have a legal entity. We require candidates to possess eligible working rights and have a compatible timezone overlap with their team to facilitate seamless collaboration. Coursera has a commitment to enabling flexibility and workspace choices for employees. Our interviews and onboarding are entirely virtual, providing a smooth and efficient experience for our candidates. As an employee, we enable you to select your main way of working, whether it's from home, one of our offices or hubs, or a co-working space near you.
Job Overview: Does architecting high-quality, scalable data pipelines powering business-critical applications excite you? How about working with cutting-edge technologies alongside some of the brightest and most collaborative individuals in the industry? Join us in our mission to bring the best learning to every corner of the world! We're looking for a passionate and talented individual with a keen eye for data to join the Data Engineering team at Coursera! Data Engineering plays a crucial role in building a robust and reliable data infrastructure that enables data-driven decision-making, as well as various data analytics and machine learning initiatives within Coursera. In addition, Data Engineering today owns many external-facing data products that drive revenue and boost partner and learner satisfaction.
You firmly believe in Coursera's potential to make a significant impact on the world, and align with our core values:
- Learners first: Champion the needs, potential, and progress of learners everywhere.
- Play for team Coursera: Excel as an individual and win as a team. Put Coursera's mission and results before personal goals.
- Maximize impact: Increase leverage by focusing on things that produce bigger results with less effort.
- Learn, change, and grow: Move fast, take risks, innovate, and learn quickly. Invite and offer feedback with respect, courage, and candor.
- Love without limits: Celebrate the diversity and dignity of every one of our employees, learners, customers, and partners.
Your responsibilities:
- Architect scalable data models and construct high-quality ETL pipelines that act as the backbone of our core data lake, with cutting-edge technologies such as Airflow, DBT, Databricks, Redshift, and Spark. Your work will lay the foundation for our data-driven culture.
- Design, build, and launch self-serve analytics products. Your creations will empower our internal and external customers, providing them with rich insights to make informed decisions.
- Be a technical leader for the team. Your guidance in technical and architectural designs for major team initiatives will inspire others. Help shape the future of Data Engineering at Coursera and foster a culture of continuous learning and growth.
- Partner with data scientists, business stakeholders, and product engineers to define, curate, and govern high-fidelity data.
- Develop new tools and frameworks in collaboration with other engineers. Your innovative solutions will enable our customers to understand and access data more efficiently, while adhering to high standards of governance and compliance.
- Work cross-functionally with product managers, engineers, and business teams to enable major product and feature launches.
Your skills:
- 5+ years of experience in data engineering with expertise in data architecture and pipelines
- Strong programming skills in Python
- Proficient with relational databases, data modeling, and SQL
- Experience with big data technologies (e.g., Hive, Spark, Presto)
- Familiarity with batch and streaming architectures preferred
- Hands-on experience with some of: AWS, Databricks, Delta Lake, Airflow, DBT, Redshift, Datahub, Elementary
- Knowledgeable on data governance and compliance best practices
- Ability to communicate technical concepts clearly and concisely
- Independence and passion for innovation and learning new technologies
If this opportunity interests you, you might like these courses on Coursera: Big Data Specialization, Data Warehousing for Business Intelligence, IBM Data Engineering Professional Certificate. #LI-SP2
Coursera is an Equal Employment Opportunity Employer and considers all qualified applicants without regard to race, color, religion, sex, sexual orientation, gender identity, age, marital status, national origin, protected veteran status, disability, or any other legally protected class. For California Candidates, please review our CCPA Applicant Notice here. For our Global Candidates, please review our GDPR Recruitment Notice here. #LI-Remote
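As a flavor of the orchestration work this posting describes, below is a minimal Airflow DAG sketch (Airflow 2.4+ style) that chains an extract task into a load task. The DAG id, task logic, and data are hypothetical assumptions for illustration, not Coursera's actual pipelines.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract() -> list[dict]:
    # Placeholder extract: a real task would pull from an API or source table.
    return [{"course_id": 1, "enrollments": 250}]

def load(ti) -> None:
    # Pull the upstream result via XCom; print stands in for a warehouse write.
    rows = ti.xcom_pull(task_ids="extract")
    print(f"loading {len(rows)} rows into the warehouse")

with DAG(
    dag_id="example_enrollments_daily",  # hypothetical name
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```

In practice the transform step would typically be delegated to DBT or a Databricks job triggered from the DAG, keeping Airflow purely for orchestration.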
Posted 3 weeks ago
5.0 - 10.0 years
25 - 30 Lacs
Hyderabad
Work from Office
At Intercontinental Exchange (ICE), we engineer technology, exchanges and clearing houses that connect companies around the world to global capital and derivative markets. With a leading-edge approach to developing technology platforms, we have built market infrastructure in all major trading centers, offering customers the ability to manage risk and make informed decisions globally. By leveraging our core strengths in technology, we continue to identify new ways to serve our customers and transform global markets.
AIP Suites (Data Modernization to Snowflake) builds an analytics-ready data architecture where data from source systems such as PDM (Product Data Management) and RDO is ingested into Snowflake for centralized storage and modeling. These models support ICE BI, which consumes Snowflake data for analytics and dashboarding. This design ensures clean separation between raw ingestion, transformation, analytics, and service-based consumption, supporting scalable and future-proof data-driven operations.
ICE Mortgage Technology is seeking a Data Engineer who will be responsible for designing and optimizing SQL queries, developing stored procedures, and participating in the migration and modernization of legacy applications to support IMT (ICE Mortgage Technology) products. The candidate should have a strong background in SQL and stored procedures.
Responsibilities
- Provides Snowflake-based data warehouse design and development for projects involving new data integration, migration, and enhancement of existing pipelines.
- Designs and develops data transformation logic using SQL, Snowflake stored procedures, and Python-based scripts for ETL/ELT workloads.
- Builds and maintains robust data pipelines to support reporting, analytics, and application data needs.
- Creates and maintains Snowflake objects like tables, views, streams, tasks, file formats, and external stages.
- Participates in project meetings with data engineers, analysts, business users, and product owners to understand and implement technical requirements.
- Writes technical design documentation based on business requirements and data architecture principles.
- Develops and/or reviews unit testing protocols for SQL scripts, procedures, and data pipelines using automation frameworks.
- Completes documentation and procedures for pipeline deployment, operational handover, and monitoring.
- May mentor or guide junior developers and data engineers.
- Stays current with Snowflake features, best practices, and industry trends in cloud data platforms.
- Performs additional related duties as assigned.
Knowledge and Experience
- Bachelor's Degree or the equivalent combination of education, training, or work experience.
- 5+ years of professional experience in data engineering or database development.
- Strong hands-on experience with: writing complex SQL queries and stored procedures; database stored procedures, functions, views, and schema design; using Streams, Tasks, Time Travel, and Cloning.
- Proficiency in database performance tuning and optimization (clustering, warehouse sizing, caching, etc.).
- Experience configuring external stages to integrate with cloud storage (AWS S3, Azure Blob, etc.).
- Experience writing Python/shell scripts for data processing (where needed).
- Knowledge of Snowflake and Tidal is an added advantage.
- Proficiency in using Git and working within Agile/Scrum SDLC environments.
- Familiarity working in a Software Development Life Cycle (SDLC) leveraging Agile principles.
- Excellent analytical, decision-making, and problem-solving skills.
- Ability to multitask in a fast-paced environment with a focus on timeliness, documentation, and communication with peers and business users.
- Strong verbal and written communication skills to engage both technical and non-technical audiences at various organizational levels.
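To make the Snowflake objects named above concrete, here is a minimal sketch using the snowflake-connector-python package to create a stream on a source table and a task that drains it on a schedule. The account details and every object name are placeholders, not ICE's actual environment.

```python
import snowflake.connector

# Placeholder credentials and identifiers: substitute real values.
conn = snowflake.connector.connect(
    account="xy12345", user="etl_user", password="***",
    warehouse="ETL_WH", database="ANALYTICS", schema="STAGING",
)
cur = conn.cursor()

# A stream records inserts/updates/deletes on the source table...
cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE raw_orders")

# ...and a scheduled task drains it into a clean table when data is present.
cur.execute("""
    CREATE OR REPLACE TASK merge_orders_task
      WAREHOUSE = ETL_WH
      SCHEDULE = '15 MINUTE'
    WHEN SYSTEM$STREAM_HAS_DATA('ORDERS_STREAM')
    AS
      INSERT INTO orders_clean
      SELECT order_id, amount FROM orders_stream
""")
cur.execute("ALTER TASK merge_orders_task RESUME")  # tasks start suspended
conn.close()
```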
Posted 3 weeks ago
7.0 - 9.0 years
8 - 14 Lacs
Visakhapatnam
Work from Office
Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.
Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
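For a sense of the pipeline work such a role involves, below is a minimal PySpark sketch of a filter-and-aggregate transform. The input rows, column names, and app name are hypothetical, chosen only to illustrate the pattern.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example_etl").getOrCreate()

# Hypothetical input; a real job would read from a data lake or message bus.
events = spark.createDataFrame(
    [("u1", "click", 3), ("u1", "view", 7), ("u2", "click", 1)],
    ["user_id", "event_type", "count"],
)

# A typical transform step: filter bad/irrelevant records, then aggregate.
daily_clicks = (
    events
    .filter(F.col("event_type") == "click")
    .groupBy("user_id")
    .agg(F.sum("count").alias("clicks"))
)

daily_clicks.show()  # in production this would be written out, e.g. to Parquet
spark.stop()
```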
Posted 3 weeks ago
7.0 - 9.0 years
8 - 14 Lacs
Surat
Work from Office
Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.
Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 3 weeks ago
7.0 - 9.0 years
8 - 14 Lacs
Varanasi
Work from Office
Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.
Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 3 weeks ago
9.0 - 14.0 years
15 - 30 Lacs
Gurugram
Remote
Job description: Data Modeler - AI/ML Enablement
Remote | Contract/Freelancer | Duration: 1 to 2 Months | Start: Immediate | Experience: 8+ Years
We're looking for experienced Data Modelers with a strong background in one or more industries: Telecom, Banking/Finance, Media, or Government only.
Key Responsibilities:
- Design conceptual/logical/physical data models
- Collaborate with AI/ML teams to structure data for model training
- Build ontologies, taxonomies, and data schemas
- Ensure compliance with industry-specific data regulations
Must-Have Skills & Experience:
- 7+ years of hands-on experience in data modeling (conceptual, logical, and physical models).
- Proficiency in data modeling tools like Erwin, ER/Studio, or PowerDesigner.
- Strong understanding of data domains like customer, transaction, network, media, or case data.
- Familiarity with AI/ML pipelines and an understanding of how structured data supports model training.
- Knowledge of data governance, quality, and compliance standards (e.g., GDPR, PCI-DSS).
- Ability to work independently and deliver models quickly in a short-term contract environment.
Posted 3 weeks ago
14.0 - 24.0 years
35 - 50 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Key Responsibilities:
- Platform Architecture Design: Lead the design and architecture of the digital platform, ensuring that the data infrastructure is scalable, secure, and reliable. Focus on utilizing AWS services (e.g., S3, Redshift, Glue, Lambda, Kinesis) and Databricks to build a robust, cloud-based data architecture.
- Data Integration & ETL Pipelines: Architect and implement ETL/ELT pipelines to integrate data from multiple sources (e.g., transactional databases, third-party services, APIs) into the platform, using AWS Glue, Databricks, and other tools for efficient data processing.
- Cloud Strategy & Deployment: Implement cloud-native solutions, leveraging AWS tools and Databricks for data storage, real-time processing, machine learning, and analytics. Design the platform to be cost-efficient, highly available, and easily scalable.
- Data Modelling: Develop and maintain data models for the platform that support business intelligence, reporting, and analytics. Ensure the data model design aligns with business requirements and the overall architecture of the platform.
- Machine Learning & Analytics Enablement: Work with data scientists and analysts to ensure that the architecture supports advanced analytics and machine learning workflows, enabling faster time to insights and model deployment.
- Data Security & Governance: Implement data governance frameworks to ensure data privacy, compliance, and security in the digital platform. Use AWS security tools and best practices to safeguard sensitive data and manage access control.
- Platform Performance & Optimization: Monitor and optimize platform performance, including the efficiency of data processing, data retrieval, and analytics workloads. Ensure low-latency and high-throughput data pipelines.
- Collaboration & Stakeholder Management: Collaborate closely with stakeholders across data engineering, data science, and business teams to align the platform architecture with business needs and evolving technological requirements.
Skills & Qualifications:
Required:
- Bachelor's/Master's degree in Computer Science, Engineering, or a related field.
- 10+ years of experience in data architecture, data engineering, or a related field, with a strong background in designing scalable, cloud-based data platforms.
- Extensive experience with AWS services such as S3, Redshift, Glue, Lambda, Kinesis, and RDS, with a deep understanding of cloud architecture patterns.
- Strong proficiency in Databricks, including experience with Apache Spark, Delta Lake, and MLflow for building data pipelines, managing large datasets, and supporting machine learning workflows.
- Expertise in data modelling techniques, including designing star/snowflake schemas, dimensional models, and ensuring data consistency and integrity across the platform.
- Experience with ETL/ELT processes, integrating data from a variety of sources, and optimizing data flows for performance.
- Proficiency in programming languages such as Python and SQL for data manipulation, automation, and data pipeline development.
- Strong knowledge of data governance and security practices, including data privacy regulations (GDPR, CCPA) and tools like AWS IAM, AWS KMS, and AWS CloudTrail.
- Experience with CI/CD pipelines and automation tools for deployment, testing, and monitoring of data architecture and pipelines.
Preferred:
- Experience with real-time streaming data solutions such as Apache Kafka or AWS Kinesis within the Databricks environment.
- Experience with data lake management, particularly using AWS Lake Formation and Databricks Delta Lake for large-scale, efficient data storage and management.
Soft Skills:
- Strong communication skills, with the ability to explain complex technical concepts to business leaders and stakeholders.
- Excellent problem-solving skills with the ability to architect complex, scalable data solutions.
- Leadership abilities with a proven track record of mentoring and guiding data teams.
- Collaborative mindset, capable of working effectively with cross-functional teams, including engineering, data science, and business stakeholders.
- Attention to detail, with a focus on building high-quality, reliable, and scalable data solutions.
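As one concrete flavor of the AWS-plus-Databricks stack described above, the sketch below lands a small batch as a partitioned Delta table and queries it back. It assumes a Databricks-style runtime where Delta Lake is available as the table format; the schema and table names are placeholders.

```python
from pyspark.sql import SparkSession

# Assumes a Databricks-style runtime with Delta Lake configured.
spark = SparkSession.builder.appName("delta_ingest_example").getOrCreate()

# Hypothetical raw extract; a real pipeline would read from S3/Kinesis
# via Glue or Auto Loader rather than create data inline.
orders = spark.createDataFrame(
    [(1, "2025-01-01", 120.0), (2, "2025-01-01", 80.0)],
    ["order_id", "order_date", "amount"],
)

# Land the batch as a partitioned Delta table: ACID writes plus time travel.
(
    orders.write.format("delta")
    .mode("append")
    .partitionBy("order_date")
    .saveAsTable("bronze.orders")  # placeholder schema/table name
)

# Downstream consumers (BI, ML) read the same governed table.
spark.sql(
    "SELECT order_date, SUM(amount) AS revenue FROM bronze.orders GROUP BY order_date"
).show()
```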
Posted 3 weeks ago
13.0 - 20.0 years
20 - 25 Lacs
Mumbai
Work from Office
Tata STRIVE: Tata STRIVE is an initiative of the TCIT, aimed at actively bridging the gap between vocational education and industry needs. Tata STRIVE runs various programmes to skill youth from underprivileged backgrounds, enabling gainful livelihood for each aspirant, differentiated by its innovations in technology, pedagogy, and methodology.
Role: Lead Strategy & Architecture
Objective: The Lead Strategy & Architecture serves as the channel that enables the organisation's vision to be implemented year on year by prioritizing the immediate annual objectives and ensuring the framework for the same is laid down.
Major Deliverables:
- Define the IT strategy & roadmap aligned to Tata STRIVE business plans
- Conduct independent research and analysis to identify what can be added to Tata STRIVE
- Prepare an annual roadmap by prioritizing specific areas of change and performance
- Focus on building internal team capabilities and provide process inputs
- Design a strategy document that captures the annual roadmap with support and justification for the proposed action plan
- Periodically review the roadmap and ensure alignment of objectives at various levels
- Introduce new technology concepts for enriching stakeholder experience
- Collaborate with internal teams for creating the STRIVE business plan
Reporting To: Head Technology & Innovation
Location: Mumbai
Essential Attributes:
- Managing enterprise architecture (business, information & technology architecture) for a small or medium organization
- Technical knowledge across multiple technologies, tools & frameworks
- Understanding of the skill development ecosystem
- Knowledge about economics
- Social and communication skills
- Team leadership and training
- Project management
- People skills
- Strategic thinking
Desired Attributes: Innovation, creativity, diplomacy and patience, listening, mentoring
Qualification: Bachelor's degree in engineering. A management qualification (MBA) and an enterprise architecture certification (TOGAF) are add-ons.
Desired Experience (years): 15+
No. of direct reports: 2-10
Posted 3 weeks ago
7.0 - 9.0 years
8 - 14 Lacs
Ludhiana
Work from Office
Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.
Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 3 weeks ago
7.0 - 9.0 years
8 - 14 Lacs
Lucknow
Work from Office
Job Summary: We are looking for a seasoned Tech Anchor with deep expertise in Big Data technologies and Python to lead technical design, development, and mentoring across data-driven projects. This role demands a strong grasp of scalable data architecture, problem-solving capabilities, and hands-on experience with distributed systems and modern data frameworks.
Key Responsibilities:
- Provide technical leadership across Big Data and Python-based projects
- Architect, design, and implement scalable data pipelines and processing systems
- Guide teams on best practices in data modeling, ETL/ELT development, and performance optimization
- Collaborate with data scientists, analysts, and stakeholders to ensure effective data solutions
- Conduct code reviews and mentor junior engineers to improve code quality and skills
- Evaluate and implement new tools and frameworks to enhance data capabilities
- Troubleshoot complex data-related issues and support production deployments
- Ensure compliance with data security and governance standards
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
Kochi, Kerala
On-site
At EY, you'll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture, and technology to become the best version of you. And we're counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. EY's Consulting Services is a unique, industry-focused business unit that provides a broad range of integrated services leveraging deep industry experience with strong functional and technical capabilities and product knowledge. EY's financial services practice offers integrated Consulting services to financial institutions and other capital markets participants, including commercial banks, retail banks, investment banks, broker-dealers & asset management firms, and insurance firms from leading Fortune 500 Companies. Within EY's Consulting Practice, the Data and Analytics team solves big, complex issues and capitalizes on opportunities to deliver better working outcomes that help expand and safeguard businesses, now and in the future. This way, we help create a compelling business case for embedding the right analytical practice at the heart of clients' decision-making. We're looking for a candidate with 10-12 years of expertise in data science, data analysis, and visualization to act as a Technical Lead to a larger team in the EY GDS DnA team, working on various Data and Analytics projects. Your key responsibilities include:
- Understanding of insurance domain knowledge (P&C or Life or both)
- Being responsible for the development of the conceptual, logical, and physical data models, and the implementation of RDBMS, operational data stores (ODS), data marts, and data lakes on target platforms (SQL/NoSQL)
- Overseeing and governing the expansion of existing data architecture and the optimization of data query performance via best practices
- Working independently and collaboratively
- Implementing business and IT data requirements through new data strategies and designs across all data platforms (relational, dimensional, and NoSQL) and data tools (reporting, visualization, analytics, and machine learning)
- Working with business and application/solution teams to implement data strategies, build data flows, and develop conceptual/logical/physical data models
- Defining and governing data modeling and design standards, tools, best practices, and related development for enterprise data models
- Identifying the architecture, infrastructure, and interfaces to data sources, tools supporting automated data loads, security concerns, analytic models, and data visualization
- Hands-on modeling, design, configuration, installation, performance tuning, and sandbox POC
- Working proactively and independently to address project requirements and articulating issues/challenges to reduce project delivery risks.
Skills and attributes for success include:
- Strong communication, presentation, and team-building skills
- Experience in executing and managing research and analysis of companies and markets
- BE/BTech/MCA/MBA with 8-12 years of industry experience with machine learning, visualization, data science, and related offerings
- At least 4-8 years of experience in BI and Analytics
- Ability to deliver end-to-end data solutions spanning analysis, mapping, profiling, ETL architecture, and data modeling
- Knowledge and experience of at least one insurance domain engagement (Life or Property & Casualty)
- Good experience using CA Erwin or other similar modeling tools
- Strong knowledge of relational and dimensional data modeling concepts
- Experience in data management analysis
- Experience with unstructured data is an added advantage
- Ability to effectively visualize and communicate analysis results
- Experience with big data and cloud preferred
- Experience, interest, and adaptability to working in an Agile delivery environment.
Ideally, you'll also have:
- Good exposure to any ETL tools
- Good knowledge of P&C insurance
- Led a team of at least 4 members
- Experience in the Insurance and Banking domains
- Prior client-facing skills; self-motivated and collaborative.
What we look for: A team of people with commercial acumen, technical experience, and enthusiasm to learn new things in this fast-moving environment. An opportunity to be a part of a market-leading, multi-disciplinary team of 1400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY Consulting practices globally with leading businesses across a range of industries. At EY, we're dedicated to helping our clients, from startups to Fortune 500 companies, and the work we do with them is as varied as they are. You get to work on inspiring and meaningful projects. Our focus is education and coaching alongside practical experience to ensure your personal development. We value our employees, and you will be able to control your development with an individual progression plan. You will quickly grow into a responsible role with challenging and stimulating assignments. Moreover, you will be part of an interdisciplinary environment that emphasizes high quality and knowledge exchange. Plus, we offer:
- Support, coaching, and feedback from engaging colleagues
- Opportunities to develop new skills and progress your career
- The freedom and flexibility to handle your role in a way that's right for you.
EY exists to build a better working world, helping to create long-term value for clients, people, and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform, and operate. Working across assurance, consulting, law, strategy, tax, and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Pune, Maharashtra
On-site
As a Cloud Architect with expertise in Azure and Snowflake, you will be responsible for designing and implementing secure, scalable, and highly available cloud-based solutions on AWS and Azure Cloud. Your role will involve utilizing your experience in Azure Databricks, ADF, Azure Synapse, PySpark, and Snowflake Services. Additionally, you will participate in pre-sales activities, including RFP and proposal writing. Your experience with integrating various data sources with Data Warehouse and Data Lake will be crucial for this role. You will also be expected to create Data warehouses and data lakes for Reporting, AI, and Machine Learning purposes, while having a solid understanding of data modelling and data architecture concepts. Collaboration with clients to comprehend their business requirements and translating them into technical solutions that leverage Snowflake and Azure cloud platforms will be a key aspect of your responsibilities. Furthermore, you will be required to clearly articulate the advantages and disadvantages of different technologies and platforms, as well as participate in Proposal and Capability presentations. Defining and implementing cloud governance and best practices, identifying and implementing automation opportunities for increased operational efficiency, and conducting knowledge sharing and training sessions to educate clients and internal teams on cloud technologies are additional duties associated with this role. Your expertise will play a vital role in ensuring the success of cloud projects and the satisfaction of clients.,
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Maharashtra
On-site
The role covers the following support from a Business Analyst (BA) perspective. As a Business Analyst, you will be responsible for understanding data and data architecture, metrics, and their significance in the design and delivery of new dashboards. You should be able to identify and engage data owners. Your role will require you to translate business requirements into technical requirements and conduct process mapping, with experience in Risk Transformation. Additionally, you should possess knowledge of requirement gathering and BRD/FRD documentation. Relevant experience of controls is necessary to ensure the quality and completeness of the solution and its data. You should also have the ability to challenge current practices and enhance dashboard designs through innovative design and technology utilization. Collaboration with cross-line-of-business working groups to design dashboards and approve solutions will be part of your responsibilities. Demonstrable experience in designing dashboards to meet the needs of diverse user groups is essential, along with a solid understanding of risk management and knowledge of the latest dashboard technologies and trends.
Other Desirable Experience:
- Programme delivery on strategic programmes
- Interaction with executive-level management
- TOM development
- Regulatory requirements mapping and gap analysis
- Governance, reporting, and MI/dashboarding delivery
- Knowledge of counterparty credit risk, exposure calculation methodologies (simulation, aggregation, limit monitoring), and experience in implementing modelled and non-modelled calculation algorithms
- Previous experience in capturing and analyzing the daily movement of EAD numbers for financing products, calculating counterparty credit risk
- Experience in validating counterparty exposure on a daily, monthly, and quarterly basis using various metrics, including exposure metrics (PFE, EPE, EEPE, EAD, etc.) and VaR computation using both the Internal Model Method (IMM) and standardized approaches like CEM
- Hands-on experience of exposure calculation (EAD/PFE) at portfolio level for both modelled (IMM) and non-modelled (CEM/SA-CCR, Credit VaR, CEF) transactions
- Working knowledge of calculating and reporting default risk for traded products
- Understanding of adjustments at the counterparty level where traded product exposure (derivatives, debt, and equity financing) was found to be erroneous and material, to mitigate impact on risk monitoring, CVA, and RWA
- Some exposure to credit risk reporting platforms and risk engines
Skills and Qualifications:
- CFA/FRM certification is a plus
- Strong analytical skills and a statistical background
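Since the posting leans heavily on exposure metrics such as PFE and EPE, a worked illustration may help: the sketch below estimates a PFE profile by Monte Carlo simulation, taking PFE as the 95th percentile and EPE as the mean of positive mark-to-market paths. The driftless Brownian model, volatility, and horizon are illustrative assumptions, not any bank's actual methodology.

```python
import numpy as np

# Illustrative only: simulate mark-to-market of a single trade and derive
# exposure profiles at monthly time steps over one year.
rng = np.random.default_rng(seed=42)

n_paths, n_steps = 10_000, 12        # assumed simulation grid
dt, sigma, v0 = 1 / 12, 0.20, 0.0    # hypothetical vol; trade starts at-the-money

# Trade value paths: driftless Brownian motion around zero, for simplicity.
shocks = rng.standard_normal((n_paths, n_steps)) * sigma * np.sqrt(dt)
values = np.cumsum(shocks, axis=1) + v0

exposure = np.maximum(values, 0.0)   # counterparty exposure = positive MTM only

pfe_95 = np.percentile(exposure, 95, axis=0)  # Potential Future Exposure profile
epe = exposure.mean(axis=0)                   # Expected Positive Exposure profile

print("PFE(95%) peak:", round(float(pfe_95.max()), 4))
print("EPE average :", round(float(epe.mean()), 4))
```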
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
As the Senior Director - Enterprise Head of Architecture at AstraZeneca, you will be responsible for overseeing the architecture across the Enterprise Capabilities & Solutions (ECS) landscape. Your role will involve collaborating with other Heads of Architecture to align ECS architectures with AstraZeneca's business and IT strategies. You will play a key role in defining AZ standards, patterns, and roadmaps, ensuring that ECS architectures adhere to them. Additionally, you will partner with key business teams, IT Leadership, and other Heads of Architecture to shape AstraZeneca's Digital and Architectural landscape in line with business strategies and visions. Your main accountabilities will include providing Enterprise Design thinking and support to ECS, defining architecture strategies and standards for the ECS Segment, leading the development and execution of Data & Analytics specific architecture capability and strategy, managing a team of Architects across various technology domains, and driving the development of global enterprise standards for the central Architecture & Technology Strategy function. You will also be responsible for analyzing business data and AI priorities & strategies, refining architecture strategies and principles, and ensuring continuous alignment of capabilities with current business priorities and objectives. Furthermore, as a Senior Director, you will act as an authority to the architecture community and the business regarding strategic architecture decisions for ECS. You will lead the development of Architecture Roadmaps and Blueprints, contribute to multi-functional decision-making bodies, and champion ECS EA initiatives across IT and business. Your role will involve engaging with external architecture authorities to stay updated with the latest trends and technologies in ECS, acting as a strategic architectural advisor for senior IT and Business management, and ensuring that ECS Architectures align with Global strategies and solutions. To be successful in this role, you should have a Bachelor's degree or equivalent experience in Computer Science or Data Management related field, extensive experience and knowledge of ECS solutions, the ability to influence and deliver strategic vision, direction, and planning, a blend of data architecture, analysis, and engineering skills, experience in known industry architectural patterns, understanding of cloud-based containerization strategies for hybrid cloud environments, and knowledge of appropriate data structure and technology based on business use case. Desirable skills/experience include a post-graduate degree in MIS or Data Management, extensive experience in a data architect role, experience in Agile data definition scrums, knowledge of Cloud Economics and Forecasting, and experience using metadata cataloguing tools. Join AstraZeneca in our unique and daring world where we redefine our ability to develop life-changing medicines through a combination of brand new science and leading digital technology platforms. Be part of our digital and data-led enterprise journey where you can innovate, take ownership, explore new solutions, experiment with innovative technology, and address challenges in a modern technology environment. If you are ready to make a difference, apply now!,
Posted 3 weeks ago
18.0 - 22.0 years
0 Lacs
Haryana
On-site
You are a highly motivated and experienced Enterprise Architect with a strong focus on Product Engineering, Product Development, and Cloud Native Product Architecture. You will play a critical role in shaping the technical vision and architecture of our product portfolio, ensuring alignment with business strategy and long-term scalability. Collaborating with engineering leads, product managers, engineers, and other stakeholders, you will define and evolve the product architecture roadmap, driving innovation and delivering exceptional customer value. Your major responsibilities will include defining and championing the overall product architecture vision, strategy, and roadmap, considering scalability, performance, security, maintainability, and cost-effectiveness. Leading the design and evolution of key product architectures, providing guidance and direction to development teams on architectural best practices and patterns will be a key aspect of your role. You should have expertise in well-architected patterns across logical and deployment views of the architecture and the ability to apply them appropriately to shape the solution. Strong depth in cloud engineering and full-stack cloud solutions is required, including front-end, back-end, APIs, microservices, workflows, and automation. Researching, evaluating, and recommending appropriate technologies, platforms, and frameworks to support product development and innovation is another important responsibility. You will collaborate effectively with product managers, engineers, business stakeholders, and other architects to ensure alignment on architectural decisions. Establishing and enforcing architectural principles, standards, and guidelines across the product development organization is crucial. You should have a good understanding of AI-driven workflows and automation, as well as data architecture, insights, reporting, and solution architecture. Your role will also involve identifying and addressing technical debt within the product architecture, developing strategies for its mitigation, and mentoring and coaching development teams on architectural best practices. Staying abreast of industry trends, emerging technologies, and the competitive landscape is essential to identify opportunities for innovation and improvement. Creating and maintaining architectural documentation, including high-level designs, architectural diagrams, and API specifications, will be part of your responsibilities. You should have deep understanding of enterprise architecture principles, frameworks, and their application in large-scale product development, as well as proven ability to define and evolve product architecture roadmaps aligning with business strategy and long-term scalability goals. Extensive experience in building and managing scalable cloud platforms, cloud engineering, cloud-native application development, API management and integration, DevOps, data platforms, AI-enabled automation, and modern full-stack development skills will be required for this role. Additionally, you should be able to evaluate and select appropriate technologies, manage technical debt, lead and mentor engineering teams, and propose pragmatic solutions. With a minimum of 18+ years of experience and credible exposure to Cloud Engineering, Cloud Native Apps and Platforms, and Enterprise Architecture, you should also have a minimum of 10+ years of architecture experience with 1 major Cloud Platform, preferably Azure Cloud. 
An architecture certification with 1 major Cloud Platform, especially Azure Cloud, is highly desirable. Experience within the Creative Production and Creative Technology domain, and a high-level understanding of creative processes, is also highly desirable. Location: DGS India - Pune - Kharadi EON Free Zone Brand: Dentsu Creative Time Type: Full time Contract Type: Permanent,
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
Maharashtra
On-site
As an experienced Data Architect with a focus on advanced analytics and Generative AI solutions, your role will involve architecting and delivering cutting-edge analytics and visualization solutions utilizing Databricks, Generative AI frameworks, and modern BI tools. You will be responsible for designing and implementing Generative AI solutions, integrating frameworks like Microsoft Copilot, and developing reference architectures for leveraging Databricks Agent capabilities. In this position, you will lead pre-sales engagements, conduct technical discovery sessions, and provide solution demos. You will collaborate with various stakeholders to align analytics solutions with business objectives and promote best practices for AI/BI Genie and Generative AI-driven visualization platforms. Additionally, you will guide the deployment of modern data architectures that integrate AI-driven decision support with popular BI tools such as Power BI, Tableau, or ThoughtSpot. Your role will also involve serving as a trusted advisor to clients, helping them transform their analytics and visualization strategies with Generative AI innovation. You will mentor and lead teams of consultants, ensuring high-quality solution delivery, reusable assets, and continuous skill development. Staying current on Databricks platform evolution, GenAI frameworks, and next-generation BI trends will be crucial for proactively advising clients on emerging innovations. To be successful in this role, you should have at least 8 years of experience in data analytics, data engineering, or BI architecture roles, with a minimum of 3 years of experience delivering advanced analytics and Generative AI solutions. Hands-on expertise with the Databricks platform, familiarity with Generative AI frameworks, and strong skills in visualization platforms are essential. Pre-sales experience, consulting skills, and knowledge of data governance and responsible AI principles are also required. Preferred qualifications include Databricks certifications, certifications in major cloud platforms, experience with GenAI prompt engineering, exposure to knowledge graphs and semantic search frameworks, industry experience in financial services, healthcare, or manufacturing, and familiarity with MLOps and end-to-end AI/ML pipelines. Your primary skills should include Data Architecture, with additional expertise in Power BI, AI/ML Architecture, Analytics Architecture, and BI & Visualization Development. Joining Infogain, a human-centered digital platform and software engineering company, will provide you with opportunities to work on cutting-edge projects for Fortune 500 companies and digital natives across various industries, utilizing technologies such as cloud, microservices, automation, IoT, and artificial intelligence.,
Posted 3 weeks ago
14.0 - 18.0 years
0 Lacs
Maharashtra
On-site
You will be leading the architectural design for a migration project, utilizing Azure services, SQL, Databricks, and PySpark to develop scalable, efficient, and reliable solutions. Your responsibilities will include designing and implementing advanced data transformation and processing tasks using Databricks, PySpark, and ADF. You should have a strong understanding of data integration, ETL, and data warehousing concepts. It will be essential to design, deploy, and manage Databricks clusters for data processing, ensuring performance and cost efficiency. Troubleshooting cluster performance issues when necessary is also part of your role. You will mentor and guide developers on using PySpark for data transformation and analysis, sharing best practices and reusable code patterns. Having experience in end-to-end architecture for SAS to PySpark migration will be beneficial. Documenting architectural designs, migration plans, and best practices to ensure alignment and reusability within the team and across the organization is a key aspect of this position. You should be experienced in delivering end-to-end solutions and effectively managing project execution. Collaborating with stakeholders to translate business requirements into technical specifications and designing robust data pipelines, storage solutions, and transformation workflows will be part of your responsibilities. Supporting UAT and production deployment planning is also required. Strong communication and collaboration skills are essential for this role. Experience: 14-16 Years Skills: Primary Skill: Data Architecture Sub Skill(s): Data Architecture Additional Skill(s): ETL, Data Architecture, Databricks, PySpark About the Company: Infogain is a human-centered digital platform and software engineering company based out of Silicon Valley. They engineer business outcomes for Fortune 500 companies and digital natives in various industries using technologies such as cloud, microservices, automation, IoT, and artificial intelligence. Infogain accelerates experience-led transformation in the delivery of digital platforms. The company is a Microsoft Gold Partner and Azure Expert Managed Services Provider. Infogain, an Apax Funds portfolio company, has offices in multiple locations worldwide.,
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
As a Senior Database Administrator at NTT DATA, you will play a crucial role in ensuring the availability, integrity, and performance of complex and critical data assets. Working closely with cross-functional teams, you will support data-driven applications, troubleshoot issues, and implement robust backup and recovery strategies. Your expertise will be instrumental in controlling access to database environments through permissions and privileges. Key Responsibilities: - Install, configure, and maintain complex database management systems (DBMS) such as Oracle, MySQL, PostgreSQL, and others. - Collaborate with software developers/architects to design and optimize database schemas and data models. - Write database documentation, data standards, data flow diagrams, and standard operating procedures. - Monitor database performance, identify bottlenecks, and optimize queries for optimal performance. - Design and implement backup and disaster recovery strategies for data availability and business continuity. - Work with Change Control and Release Management to commission new applications and customize existing ones. - Plan and execute database software upgrades and patches to ensure system security and up-to-date functionality. - Implement security measures to safeguard databases from unauthorized access, breaches, and data loss. - Conduct security audits and vulnerability assessments to maintain compliance with data protection standards. - Collaborate with cross-functional teams to support database-related initiatives and provide technical support to end-users. Knowledge, Skills, and Attributes: - Proficiency in database administration tasks, SQL, database security, backup and recovery strategies. - Ability to monitor database performance, manage multiple projects, and communicate complex IT information effectively. - Strong problem-solving and analytical skills to troubleshoot database-related issues. - Familiarity with data architecture, data services, and application development lifecycle. - Experience working with unstructured datasets and extracting value from large datasets. Academic Qualifications and Certifications: - Bachelor's degree in computer science, engineering, information technology, or related field. - Relevant certifications such as MCSE DBA, Oracle Certified Professional, MySQL Database Administrator, PostgreSQL Certified Professional. - Completion of database management courses covering database administration, data modeling, SQL, and performance tuning. Required Experience: - Demonstrated experience as a Database Administrator within an IT organization. - Experience with database backup and recovery practices, health assessment reports, and managing databases. Workplace Type: - Hybrid Working About NTT DATA: NTT DATA is a trusted global innovator of business and technology services, serving Fortune Global 100 clients. Committed to innovation and long-term success, NTT DATA invests in R&D to drive organizations confidently into the digital future. With a diverse global team and extensive partner ecosystem, NTT DATA offers consulting, AI, industry solutions, and application management services. As a leading provider of digital and AI infrastructure, NTT DATA is part of NTT Group and headquartered in Tokyo. NTT DATA is an Equal Opportunity Employer.,
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Hyderabad, Telangana
On-site
The AIML Architect-Dataflow, BigQuery position is a critical role within our organization, focusing on designing, implementing, and optimizing data architectures in Google Cloud's BigQuery environment. You will combine advanced data analytics with artificial intelligence and machine learning techniques to create efficient data models that enhance decision-making processes across various departments. Your responsibilities will include building data pipeline solutions that utilize BigQuery and Dataflow functionalities to ensure high performance, scalability, and resilience in our data workflows. Collaboration with data engineers, data scientists, and application developers is essential to align with business goals and technical vision. You must possess a deep understanding of cloud-native architectures and be enthusiastic about leveraging cutting-edge technologies to drive innovation, efficiency, and insights from extensive datasets. You should have a robust background in data processing and AI/ML methodologies, capable of translating complex technical requirements into scalable solutions that meet the evolving needs of the organization. Key Responsibilities - Design and architect data processing solutions using Google Cloud BigQuery and Dataflow. - Develop data pipeline frameworks supporting batch and real-time analytics. - Implement machine learning algorithms for extracting insights from large datasets. - Optimize data storage and retrieval processes to improve performance. - Collaborate with data scientists to build scalable models. - Ensure data quality and integrity throughout the data lifecycle. - Work closely with cross-functional teams to align data workflows with business objectives. - Conduct technical evaluations and assessments of new tools and technologies. - Manage large-scale data migrations to cloud environments. - Document architecture designs and maintain technical specifications. - Provide mentorship and guidance to junior data engineers and analysts. - Stay updated on industry trends in cloud computing and data engineering. - Design and implement security best practices for data access and storage. - Monitor and troubleshoot data pipeline performance issues. - Conduct training sessions on BigQuery best practices for team members. Required Qualifications - Bachelor's or Master's degree in Computer Science, Data Science, or related field. - 5+ years of experience in data architecture and engineering. - Proficiency in Google Cloud Platform, especially BigQuery and Dataflow. - Strong understanding of data modeling and ETL processes. - Experience in implementing machine learning solutions in cloud environments. - Solid programming skills in Python, Java, or Scala. - Expertise in SQL and other query optimization techniques. - Experience with big data workloads and distributed computing. - Familiarity with modern data processing frameworks and tools. - Strong analytical and problem-solving skills. - Excellent communication and team collaboration abilities. - Proven track record of managing comprehensive projects from inception to completion. - Ability to work in a fast-paced, agile environment. - Understanding of data governance, compliance, and security. - Experience with data visualization tools is a plus. - Certifications in Google Cloud or relevant technologies are advantageous. 
Skills: Cloud Computing, SQL Proficiency, Dataflow, AIML, Scala, Data Governance, ETL Processes, Python, Machine Learning, Java, Google Cloud Platform, Data Architecture, Data Modeling, BigQuery, Data Engineering, Data Visualization Tools
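To ground the BigQuery-plus-Dataflow pairing named above in code, here is a minimal Apache Beam pipeline in Python that parses JSON events and appends them to a BigQuery table. The bucket, project, dataset, and field names are hypothetical; running on Dataflow would additionally require the usual --runner and GCP options.

```python
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

# Hypothetical resources: the bucket path and table are placeholders.
SOURCE = "gs://example-bucket/events/*.json"
TABLE = "example-project:analytics.events"

def parse_event(line: str) -> dict:
    """Parse one JSON line, keeping only the fields the table expects."""
    record = json.loads(line)
    return {"user_id": record["user_id"], "event": record["event"]}

def run() -> None:
    options = PipelineOptions()  # pass --runner=DataflowRunner etc. in practice
    with beam.Pipeline(options=options) as p:
        (
            p
            | "Read" >> beam.io.ReadFromText(SOURCE)
            | "Parse" >> beam.Map(parse_event)
            | "Write" >> beam.io.WriteToBigQuery(
                TABLE,
                schema="user_id:STRING,event:STRING",
                write_disposition=beam.io.BigQueryDisposition.WRITE_APPEND,
            )
        )

if __name__ == "__main__":
    run()
```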
Posted 3 weeks ago
18.0 - 22.0 years
0 Lacs
Haryana
On-site
As an Enterprise Architect in the Financial Services domain, you will play a crucial role in leading the design and integration of enterprise solutions that are in line with the business objectives. With over 18 years of experience, you will focus on designing scalable, reliable, and secure solution architectures that meet both the technical requirements and the overarching business strategy, specifically within the Capital Markets domain. Your responsibilities will include collaborating with technical architects, engineering teams, and key stakeholders to ensure successful implementation, bridging the gap between business objectives and technical execution. You will be responsible for aligning high-level solution architectural designs with the organization's immediate and long-term strategies, with a focus on cloud technologies and microservices integration patterns. Your role will involve leading the design of end-to-end solutions, making decisions on build vs buy, selecting COTS solutions, and prioritizing technical debt based on architectural principles and roadmap. Additionally, you will work closely with technical teams to ensure that the architectural vision is implemented successfully, balancing business needs with technical constraints. Your expertise in Capital Markets, COTS solutions, cloud technologies (such as AWS and Azure), integration patterns, and security governance will be vital in this role. You will also be required to have experience in working on RFP design, build vs buy decision making, and collaborating with business stakeholders to understand their problems and find suitable solutions. Strong communication skills, leadership abilities, and a collaborative mindset will be essential for fostering partnerships with internal and external teams to ensure solution delivery aligns with enterprise standards. Ideally, you should have a strong understanding of financial services regulations and compliance, along with a background in financial systems solution architecture. A degree in computer science, information technology, or related fields, as well as certification in TOGAF 9.1/9.2, would be advantageous. Your ability to articulate the architectural vision clearly and guide decision-making processes across business and technical teams will be key to your success in this role.,
Posted 3 weeks ago
8.0 - 12.0 years
0 Lacs
Pune, Maharashtra
On-site
This role is based in Balewadi, Pune and requires over 8 years of experience.

Technical skills and core competencies include a strong understanding of data architecture and data models, experience leading data-driven projects, and expertise in data modelling paradigms such as Kimball, Inmon, data marts, Data Vault, and Medallion. You have solid experience with cloud-based data strategies and big data technologies, preferably on AWS. You are adept at designing ETL data pipelines, with expert knowledge of ingestion, transformation, and data quality.

Hands-on experience in SQL is a must, including a deep understanding of PostgreSQL development, query optimization, and index design. You should be able to read and write intermediate to complex SQL, with thorough knowledge of Postgres PL/pgSQL for complex warehouse workflows, and be able to apply advanced SQL and statistical concepts through SQL; experience with PostgreSQL extensions such as PostGIS is desired. Expertise in writing ETL pipelines that combine Python and SQL is required (a minimal sketch follows below), along with familiarity with Python data manipulation libraries such as Pandas, Polars, and DuckDB. Experience designing data visualizations with tools such as Tableau and Power BI is desirable.

Your responsibilities include participating in the design and development of features in the existing data warehouse, providing leadership in connecting the engineering, product, and analytics/data science teams, designing, implementing, and updating existing and new batch ETL pipelines, defining and implementing data architecture, and partnering with both engineers and data analysts to build reliable datasets. You will work with various data orchestration tools (Apache Airflow, Dagster, Prefect, and others) and should be passionate about your job and enjoy a fast-paced international start-up environment. A background in the telecom industry is a plus but not a requirement. You love automating and enjoy monitoring.
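As a rough illustration of the "Python + SQL" pipeline work this posting describes, here is a minimal sketch assuming a PostgreSQL warehouse reachable through SQLAlchemy; the connection string, schemas, tables, and columns are all hypothetical.

```python
# Minimal Python + SQL ETL sketch: extract with SQL, transform with pandas,
# load back into PostgreSQL. All names below are hypothetical placeholders.
import pandas as pd
from sqlalchemy import create_engine, text

engine = create_engine("postgresql+psycopg2://etl_user:secret@localhost:5432/warehouse")

# Extract: push the date filter down into PostgreSQL rather than into Python.
extract_sql = text("""
    SELECT order_id, customer_id, amount, created_at
    FROM raw.orders
    WHERE created_at >= :since
""")
with engine.connect() as conn:
    orders = pd.read_sql(extract_sql, conn, params={"since": "2024-01-01"})

# Transform: a simple daily revenue aggregate per customer.
orders["order_date"] = orders["created_at"].dt.date
daily = (
    orders.groupby(["customer_id", "order_date"], as_index=False)["amount"]
    .sum()
    .rename(columns={"amount": "daily_revenue"})
)

# Load: rebuild the reporting table, then index it for fast lookups.
daily.to_sql("customer_daily_revenue", engine, schema="mart",
             if_exists="replace", index=False)
with engine.begin() as conn:
    conn.execute(text(
        "CREATE INDEX IF NOT EXISTS idx_cdr_customer "
        "ON mart.customer_daily_revenue (customer_id, order_date)"
    ))
```

Pushing the filter into the SQL keeps the extract small, and recreating the index after the load reflects the posting's emphasis on query optimization and index design.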
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
Karnataka
On-site
You will be responsible for demonstrating thorough knowledge and a proven record of success in executing the functional and technical aspects of SAP Master Data Governance (MDG) projects following industry best practices, including data modelling, process modelling, UI modelling, business validation rules modelling, derivations, Data Replication Framework (DRF), and workflow creation and maintenance.

Your role requires a good understanding of the SAP MDG technical framework, including BADI, BAPI/RFC/FM, workflows, BRF+, Enterprise Services, IDoc, Floorplan Manager, WebDynPro, Fiori, and the MDG API framework. Knowledge of SAP data dictionary tables, views, relationships, and the corresponding data architecture for ECC and S/4HANA across SAP master and transactional data entities is essential, including excellent functional knowledge of core master data objects such as customer, vendor, and material.

Hands-on experience configuring customer, vendor, finance, and product/material master data in MDG is necessary, including data harmonization (de-duplication and mass changes) and data replication (key/value mapping, SOA web services, ALE/IDoc). Effective communication with customers and partners to understand specific enterprise data needs is a key aspect of this role. You should possess excellent written and verbal communication skills, with the ability to convey ideas in technical, business, and user-friendly language, along with an appetite for acquiring new knowledge, adapting quickly, and contributing to fast innovation.

The ideal candidate will have a minimum of 5 years of experience in SAP Master Data Governance (MDG), including at least 2 full-cycle implementations. Implementation experience of SAP MDG in key domains such as Customer, Supplier, Material, and Finance master data is required, along with hands-on experience with SAP Fiori, SAP MDG mass processing, consolidation, central governance, workflow, and BRF+. Experience with RDG is considered an added advantage.
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
Pune, Maharashtra
On-site
Siemens Energy is looking for a highly skilled and experienced Senior MLOps Engineer to join the Digital Core function and contribute significantly to the data architecture and strategy within the organization. You will work closely with stakeholders to develop our machine learning environment, collaborating with business stakeholders and other teams to prioritize backlog items, offer consultancy on AI/ML solutions, support test automation, build CI/CD pipelines, and work on PoCs/MVPs using various hyperscaler offerings. If you are passionate about the environment and climate change and eager to be a part of the energy transition, then the Siemens Energy Data Analytics & AI team is the place for you. We are seeking innovative, enthusiastic, and versatile data, digital, and AI professionals to drive us forward on this exciting journey of energy transformation.

Your responsibilities will include onboarding new AI/ML use cases on AWS and Google Cloud Platform, defining MLOps architecture for AWS, GCP, or cloud-agnostic deployments, working with AI/ML services such as AWS SageMaker and GCP AutoML, developing PoCs/MVPs using AWS, GCP, and other MLOps services, implementing CI/CD pipelines using GitLab CI, writing infrastructure as code with AWS CDK scripts (a minimal sketch follows below), providing consultancy to stakeholders on AI/ML solutions, supporting test automation, and deploying code to production environments.

To be successful in this role, you should have a Bachelor's degree in Computer Science, Mathematics, Engineering, Physics, or a related field (a Master's degree is a plus), around 10 years of hands-on experience in ML/AI development and operations, expertise in the ML lifecycle and MLOps, proficiency in Python coding and Linux administration, experience with CI/CD pipelines and DevOps processes, familiarity with JIRA and Confluence, and excellent interpersonal and communication skills. Certification in AWS or GCP in the ML/AI area is preferred.

The Data Platforms and Services organization at Siemens Energy is committed to becoming a data-driven organization to support customers in transitioning to a more sustainable world. By using innovative technologies and treating data as a strategic asset, we aim to make sustainable, reliable, and affordable energy a reality.

Siemens Energy values diversity and inclusion, welcoming applications from individuals of all backgrounds. We believe that through diversity we generate power; our combined creative energy is fueled by over 130 nationalities. We celebrate character and do not discriminate based on differences, upholding equal opportunities for all. If you are ready to make an impact in shaping the future of energy and are committed to innovation, decarbonization, and energy transformation, join us at Siemens Energy.

For more information about Siemens Energy and our commitment to diversity, visit: https://www.siemens-energy.com/employeevideo
To explore job opportunities at Siemens Energy, visit: https://jobs.siemens-energy.com/jobs
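By way of example, here is a minimal sketch of the AWS CDK infrastructure-as-code work mentioned above, assuming CDK v2 in Python; the stack, bucket, and role names are hypothetical. It provisions an encrypted artifact bucket and an execution role that SageMaker jobs can assume.

```python
# Minimal AWS CDK v2 (Python) sketch: S3 bucket for ML artifacts plus an IAM
# role for SageMaker. All resource names are hypothetical placeholders.
import aws_cdk as cdk
from aws_cdk import aws_iam as iam, aws_s3 as s3
from constructs import Construct


class MlArtifactsStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Versioned, encrypted bucket for model artifacts and training data.
        bucket = s3.Bucket(
            self, "ArtifactsBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
        )

        # Execution role that SageMaker jobs assume to read/write artifacts.
        role = iam.Role(
            self, "SageMakerExecutionRole",
            assumed_by=iam.ServicePrincipal("sagemaker.amazonaws.com"),
        )
        bucket.grant_read_write(role)


app = cdk.App()
MlArtifactsStack(app, "MlArtifactsStack")
app.synth()
```

Running `cdk deploy` would synthesize and apply this stack; the same pattern extends to training pipelines, endpoints, and monitoring resources.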
Posted 3 weeks ago
10.0 - 14.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
We are looking for a highly motivated and experienced Data and Analytics Senior Architect to lead our Master Data Management (MDM) and Data Analytics team. As the Data and Analytics Architect Lead, you will define and implement the overall data architecture strategy, ensuring alignment with business goals and supporting data-driven decision-making. Your role will involve designing scalable, secure, and efficient data systems, including databases, data lakes, and data warehouses. You will evaluate and recommend tools and technologies for data integration, processing, storage, and analytics while staying updated on industry trends.

You will lead a high-performing team, fostering a collaborative and innovative culture, and ensuring data integrity, consistency, and availability across the organization. You will manage the existing MDM solution and data platform, which is based on Microsoft Data Lake Gen 2, Snowflake as the data warehouse, and Power BI, managing data from core applications, and you will drive further development to handle additional data and capabilities in support of our AI journey. The ideal candidate will possess strong leadership skills, a deep understanding of data management and technology principles, and the ability to collaborate effectively across departments and functions.

**Principal Duties and Responsibilities:**

**Team Leadership:**
- Lead, mentor, and develop a high-performing team of data analysts and MDM specialists.
- Foster a collaborative and innovative team culture that encourages continuous improvement and efficiency.
- Provide technical leadership and guidance to the development teams and oversee the implementation of IT solutions.

**Architecture:**
- Define the overall data architecture strategy, aligning it with business goals and ensuring it supports data-driven decision-making.
- Identify, evaluate, and establish shared enabling technical capabilities for the division in collaboration with IT to ensure consistency, quality, and business value.
- Design and oversee the implementation of data systems, including databases, data lakes, and data warehouses, ensuring they are scalable, secure, efficient, and cost-effective.
- Evaluate and recommend tools and technologies for data integration, processing, storage, and analytics, staying updated on industry trends.

**Strategic Planning:**
- Develop and implement the MDM and analytics strategy in line with overall team and organizational goals.
- Work with the enterprise architect to align on the overall strategy and application landscape, ensuring MDM and data analytics fit into the ecosystem.
- Identify opportunities to enhance data quality, governance, and analytics capabilities.

**Project Management:**
- Oversee project planning, execution, and delivery to ensure timely and successful completion of initiatives.
- Monitor project progress and cost, identify risks, and implement mitigation strategies.

**Stakeholder Engagement:**
- Collaborate with cross-functional teams to understand data needs and deliver solutions that support business objectives.
- Serve as a key point of contact for data-related inquiries and support requests.
- Develop business cases and proposals for IT investments and present them to senior management and stakeholders.

**Data/Information Governance:**
- Establish and enforce data/information governance policies and standards to ensure compliance and data integrity.
- Champion best practices in data management and analytics across the organization.

**Reporting and Analysis:**
- Use data analytics to derive insights and support decision-making.
- Document and present findings and recommendations to senior management.

**Knowledge, Skills and Abilities Required:**
- Bachelor's degree in Computer Science, Data Science, Information Management, or a related field; master's degree preferred.
- 10+ years of experience in data management, analytics, or a related field, with at least 2 years in a leadership role.
- Strong knowledge of master data management concepts, data governance, data technology, and analytics tools.
- Proficiency in data modeling, ETL processes, database management, big data technologies, and data integration techniques.
- Excellent project management skills with a proven track record of delivering complex projects on time and within budget.
- Strong analytical, problem-solving, and decision-making abilities.
- Exceptional communication and interpersonal skills.
- Team player; result-oriented and structured, with attention to detail and a strong work ethic.

**Special Competencies Required:**
- Proven leader with excellent structuring skills, good at documenting and presenting.
- Strong execution skills to make things happen, not just generate ideas.
- Experience working with analytics tools and data ingestion platforms.
- Experience working with MDM solutions, preferably TIBCO EBX.
- Experience working with Jira/Confluence.

**Additional Information:**
- Office, remote, or hybrid working.
- Ability to function across variable time zones.
- International travel may be required.
Posted 3 weeks ago