2438 Data Governance Jobs - Page 26

5.0 - 7.0 years

7 - 9 Lacs

Pune

Work from Office

So, what's the role all about? We are seeking a skilled Senior Data Engineer to join our Actimize Watch Data Analytics team. In this role, you will collaborate closely with the Data Science team, Business Analysts, and SMEs to monitor and optimize the performance of machine learning models. You will be responsible for running various analytics on data stored in S3 using advanced Python techniques, generating performance reports and visualizations in Excel, and showcasing model performance and stability metrics through BI tools such as Power BI and QuickSight.

How will you make an impact?

Data Integration and Management: Design, develop, and maintain robust Python scripts to support analytics and machine learning model monitoring. Ensure data integrity and quality across various data sources, primarily focusing on data stored in AWS S3. Check the integrity and correctness of data for new customers being onboarded to Actimize Watch.

Analytics and Reporting: Work closely with Data Scientists, BAs, and SMEs to understand model requirements and monitoring needs. Perform complex data analysis and visualization using Jupyter Notebooks, leveraging advanced Python libraries and techniques. Generate comprehensive model performance and stability reports and showcase them in BI tools. Standardize diverse analytics processes through automation and innovative approaches.

Model Performance Monitoring: Implement monitoring solutions to track the performance and drift of machine learning models in production for various clients. Analyze model performance over time and identify potential issues or areas for improvement. Develop automated alerts and dashboards to provide real-time insights into model health.

Business Intelligence and Visualization: Create and maintain dashboards in BI tools such as Tableau, Power BI, and QuickSight to visualize model performance metrics. Collaborate with stakeholders to ensure the dashboards meet business needs and provide actionable insights. Continuously improve visualization techniques to enhance the clarity and usability of the reports.

Collaboration and Communication: Work closely with cross-functional teams, including Data Scientists, Product Managers, Business Analysts, and SMEs, to understand requirements and deliver solutions. Communicate findings and insights effectively to both technical and non-technical stakeholders. Provide support and training to team members on data engineering and analytics best practices and tools.

Have you got what it takes? 5 to 7 years of experience in data engineering, with a focus on analytics, data science, and machine learning model monitoring. Proficiency in Python and experience with Jupyter Notebooks for data analysis. Strong experience with AWS services, particularly S3 and related data processing tools. Expertise in Excel for reporting and data manipulation. Hands-on experience with BI tools such as Tableau, Power BI, and QuickSight. Solid understanding of machine learning concepts and model performance metrics. Strong Python and SQL skills for querying and manipulating large datasets. Excellent problem-solving and analytical skills. Ability to work in a fast-paced, collaborative environment. Strong communication skills with the ability to explain technical concepts to non-technical stakeholders.

Preferred Qualifications: Experience with other AWS services such as Glue, as well as BI tools like QuickSight and Power BI. Familiarity with CI/CD pipelines and automation tools. Knowledge of data governance and security best practices.
What's in it for you? Enjoy NiCE-FLEX! Requisition ID: 7900 Reporting into: Tech Manager Role Type: Individual Contributor About NiCE
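
To make the model-monitoring workflow above concrete, here is a minimal sketch (not from the posting): it loads model scores from S3 with boto3 and pandas, then computes a Population Stability Index (PSI), a common drift metric for the kind of stability reporting described. The bucket names, keys, and `score` column are illustrative assumptions.

```python
import boto3
import numpy as np
import pandas as pd
from io import BytesIO

def load_scores(bucket: str, key: str) -> pd.Series:
    """Read a Parquet file of model scores from S3 (bucket/key are hypothetical)."""
    obj = boto3.client("s3").get_object(Bucket=bucket, Key=key)
    return pd.read_parquet(BytesIO(obj["Body"].read()))["score"]

def psi(baseline: pd.Series, current: pd.Series, bins: int = 10) -> float:
    """Population Stability Index between two score distributions."""
    edges = np.unique(np.quantile(baseline, np.linspace(0, 1, bins + 1)))
    edges[0], edges[-1] = -np.inf, np.inf          # catch out-of-range scores
    b = np.histogram(baseline, bins=edges)[0] / len(baseline)
    c = np.histogram(current, bins=edges)[0] / len(current)
    b, c = np.clip(b, 1e-6, None), np.clip(c, 1e-6, None)  # avoid log(0)
    return float(np.sum((c - b) * np.log(c / b)))

if __name__ == "__main__":
    base = load_scores("analytics-bucket", "models/fraud/baseline.parquet")  # hypothetical paths
    curr = load_scores("analytics-bucket", "models/fraud/current.parquet")
    print(f"PSI = {psi(base, curr):.3f}")  # > 0.2 is a common rule of thumb for material drift
```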

Posted 1 week ago

Apply

2.0 - 4.0 years

45 - 50 Lacs

Gurugram

Work from Office

In this role, you will be part of a collaborative, inclusive, and high-performing engineering team where your ideas matter. We ship thoughtfully, review code with care, and support each other's growth. Expect mentorship from senior engineers, real ownership of projects and products, and the kind of team dynamic where people enjoy solving problems together. We build a culture where developers stick around - not because they are stuck, but because they are challenged, respected, and given room to thrive.

What you'll Do: We are looking for a passionate Software Engineer to join our dynamic team. In this role, you will dive deep into our engineering practices to gain a thorough understanding of the systems that power our products. Your primary responsibilities will include rapidly learning the architecture and technologies, contributing to feature development with high velocity, and ensuring that all code meets strong quality and compliance standards. You will collaborate closely with cross-functional teams to deliver immediate value, continuously improving software and driving innovation.

What you'll Bring: 2-4 years of experience as a full stack developer (React, .NET, Cloud, DevOps tools). Good understanding of software engineering principles. Experience using automation and AI tools for build, test, and deploy. Understanding of data structures, algorithms, and version control, and of using automated tools for testing and deployment. Ability to write clean, maintainable, secure, and testable code. Strong problem decomposition, debugging, and troubleshooting skills. Cloud and deployment basics. Ability to communicate clearly and remain curious, adaptable, and empathetic in a team.

Posted 1 week ago

Apply

10.0 - 15.0 years

15 - 19 Lacs

Pune

Work from Office

At Solidatus, we're changing how organisations understand their data. We're an award-winning, venture-backed software company often called the "Git for Metadata." Our platform helps businesses harvest, model, and visualise complex data lineage flows. Our unique lineage-first approach, combined with active AI development, provides organisations with unparalleled clarity and robust control over their data's journey and meaning. As a growing B2B SaaS business, we have a great, collaborative culture. Join us as we expand globally and define the future of data understanding!

Role Overview: We are seeking an experienced Data Architect to lead the design and implementation of data lineage solutions that align with our clients' business objectives. This role involves collaborating with cross-functional teams to ensure the integrity, accuracy, and timeliness of their data lineage solution. You will work directly with our clients, helping them get the maximum value from our product and ensuring they achieve their contractual goals.

Key Responsibilities: Design and implement robust data lineage solutions that support business intelligence, analytics, and data governance initiatives. Collaborate with stakeholders to understand data lineage requirements and translate them into technical and business solutions. Develop and maintain lineage data models, semantic metadata systems, and data dictionaries. Ensure data quality, security, and compliance with relevant regulations. Understand Solidatus implementation and data lineage modelling best practices and ensure they are followed at our clients. Stay abreast of emerging technologies and industry trends to continuously improve data lineage architecture practices.

Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Proven experience in data architecture, with a focus on large-scale data systems, at more than one company. Proficiency in data modelling, database design, and data warehousing concepts. Experience with cloud platforms (e.g., AWS, Azure, GCP) and big data technologies (e.g., Hadoop, Spark). Strong understanding of data governance, data quality, and data security principles. Excellent communication and interpersonal skills, with the ability to work effectively in a collaborative environment.

Why Join Solidatus? Be part of an innovative company shaping the future of data management. Collaborate with a dynamic and talented team in a supportive work environment. Opportunities for professional growth and career advancement. Flexible working arrangements, including hybrid work options. Competitive compensation and benefits package.
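
As a rough illustration of what "lineage-first" means in practice (my sketch, not Solidatus code): lineage is naturally a directed graph of datasets and transformations, and questions such as "which upstream sources feed this report?" become graph traversals. The dataset names below are invented.

```python
from collections import defaultdict

class LineageGraph:
    """Toy lineage model: edges point from a source dataset to what it feeds."""
    def __init__(self):
        self.downstream = defaultdict(set)
        self.upstream = defaultdict(set)

    def add_edge(self, source: str, target: str) -> None:
        self.downstream[source].add(target)
        self.upstream[target].add(source)

    def trace_upstream(self, node: str) -> set:
        """All transitive sources feeding `node` (impact analysis runs the other way)."""
        seen, stack = set(), [node]
        while stack:
            for parent in self.upstream[stack.pop()]:
                if parent not in seen:
                    seen.add(parent)
                    stack.append(parent)
        return seen

g = LineageGraph()
g.add_edge("crm.accounts", "staging.accounts")        # names are illustrative
g.add_edge("staging.accounts", "mart.revenue_report")
g.add_edge("erp.invoices", "mart.revenue_report")
print(g.trace_upstream("mart.revenue_report"))
# {'crm.accounts', 'staging.accounts', 'erp.invoices'}
```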

Posted 1 week ago

Apply

4.0 - 9.0 years

25 - 30 Lacs

Pune

Work from Office

About KPI Partners: KPI Partners is a leading provider of business intelligence and analytics solutions. We are committed to delivering innovative solutions that empower organizations to make data-driven decisions. Our team is passionate about leveraging the latest technologies to transform raw data into actionable insights.

Position Summary: We are seeking a talented and experienced Lead Data Engineer with expertise in Unity Catalog to join our dynamic team. The ideal candidate will play a crucial role in architecting and implementing data engineering solutions, ensuring data quality, governance, and accessibility across our organization. You will work collaboratively with cross-functional teams to facilitate effective data management and integration.

Key Responsibilities:
- Design, develop, and implement data pipelines and ETL processes using Unity Catalog.
- Manage and optimize data workflows to improve performance and reliability.
- Ensure data quality and governance by implementing best practices and standards.
- Collaborate with data scientists, analysts, and other stakeholders to understand data requirements and provide technical solutions.
- Mentor and lead junior data engineers, providing guidance and support in their professional development.
- Stay up-to-date with industry trends and emerging technologies related to data engineering and analytics.
- Participate in architecture discussions and contribute to the overall data strategy of the organization.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Proven experience as a Data Engineer with a focus on data cataloging and management, specifically Unity Catalog.
- Strong programming skills in languages such as Python, SQL, or Scala.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Knowledge of big data technologies and frameworks such as Apache Spark, Hadoop, or similar.
- Familiarity with data warehousing concepts and tools.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Strong communication skills and the ability to work collaboratively within a team.

What We Offer:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- A collaborative and innovative work environment.
- The chance to work on cutting-edge technologies and impactful projects.
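
For readers unfamiliar with Unity Catalog, a hedged sketch of the governance side of this role: Unity Catalog organizes data in a three-level namespace (catalog.schema.table) and controls access through SQL GRANTs. This assumes a Databricks notebook where `spark` is provided by the runtime; all catalog, schema, table, and group names are hypothetical.

```python
# Assumes a Databricks notebook attached to a Unity Catalog metastore,
# where the `spark` session is supplied by the runtime.
statements = [
    "CREATE CATALOG IF NOT EXISTS analytics_dev",
    "CREATE SCHEMA IF NOT EXISTS analytics_dev.sales",
    # Register a managed table under the three-level namespace.
    """CREATE TABLE IF NOT EXISTS analytics_dev.sales.orders (
           order_id BIGINT, customer_id BIGINT, amount DECIMAL(12, 2), order_ts TIMESTAMP
       )""",
    # Unity Catalog privileges: callers need USE CATALOG / USE SCHEMA to reach SELECT.
    "GRANT USE CATALOG ON CATALOG analytics_dev TO `data-analysts`",
    "GRANT USE SCHEMA ON SCHEMA analytics_dev.sales TO `data-analysts`",
    "GRANT SELECT ON TABLE analytics_dev.sales.orders TO `data-analysts`",
]
for stmt in statements:
    spark.sql(stmt)
```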

Posted 1 week ago

Apply

7.0 - 10.0 years

25 - 30 Lacs

Bengaluru

Work from Office

We are seeking an experienced Power BI and Visualization Architect/Lead with expert-level skills in designing, developing, and optimizing data visualizations and analytics solutions. The ideal candidate will have a deep understanding of Power BI visuals, Power BI Service administration, complex DAX solutions, and data modeling best practices to support scalable and efficient business insights. This role demands technical expertise combined with logical thinking, common sense, and programming skills to elevate our reporting capabilities to the next level.

Responsibilities: Design, develop, and implement complex Power BI reports, dashboards, and data visualizations with optimal data models to ensure scalability and performance. Develop and manage complex DAX calculations, data relationships, and aggregation strategies for efficient reporting. Perform Power BI Service administration tasks, including workspace management, security setup, gateway configuration, and data refresh automation. Assess and improve the existing Power BI environment, optimizing performance, reducing refresh times, and ensuring data model efficiency. Leverage programming skills (e.g., Python, SQL, or Power Automate) to enhance Power BI's capabilities, automate tasks, and improve data workflows. Drive data integration by connecting multiple data sources into Power BI with efficient data modeling. Collaborate with business stakeholders to gather requirements and translate them into actionable insights using best-in-class visualization techniques. Identify continuous learning opportunities to upskill team members by mentoring staff and promoting Power BI best practices. Introduce innovative solutions that exploit Power BI's advanced features, such as parameterized reports, AI insights, and custom visuals.

Requirements: Extensive hands-on experience with Power BI development, including advanced data modeling and DAX solutions. Strong knowledge of Power BI Service administration and best practices for performance optimization. Expertise in integrating data from diverse sources such as SQL, Excel, APIs, and cloud platforms. Expertise in using data models and Power Query within the Power BI environment. Solid experience with data governance, security frameworks, and role-based access controls within Power BI. Ability to develop solutions that align with DataOps best practices for automation, testing, and scalability. Strong problem-solving skills with a focus on logical thinking and solution design. Excellent communication and collaboration skills to engage with technical teams and business stakeholders effectively.

Preferred Skills: Experience with programming languages such as Python, Power Automate, or R to extend Power BI capabilities. Experience with other visualization tools such as Tableau, Qlik Sense, Looker, and Excel. Familiarity with Azure Data Services, Databricks, SQL, Power Query, or Power BI Embedded. Proven ability to lead initiatives, influence stakeholders, and drive innovation in data visualization strategies.
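
One slice of the "data refresh automation" responsibility, sketched against the Power BI REST API. This is an illustrative pattern, not the employer's setup: it assumes a service principal that has been granted access to the workspace, and every ID below is a placeholder.

```python
import msal
import requests

TENANT_ID = "<tenant-id>"        # all values below are placeholders
CLIENT_ID = "<app-client-id>"
CLIENT_SECRET = "<app-secret>"
WORKSPACE_ID = "<workspace-guid>"
DATASET_ID = "<dataset-guid>"

# Acquire an app-only token for the Power BI REST API.
app = msal.ConfidentialClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
    client_credential=CLIENT_SECRET,
)
token = app.acquire_token_for_client(
    scopes=["https://analysis.windows.net/powerbi/api/.default"]
)["access_token"]

# Trigger an asynchronous dataset refresh (the service answers HTTP 202 on success).
resp = requests.post(
    f"https://api.powerbi.com/v1.0/myorg/groups/{WORKSPACE_ID}"
    f"/datasets/{DATASET_ID}/refreshes",
    headers={"Authorization": f"Bearer {token}"},
    json={"notifyOption": "NoNotification"},
)
resp.raise_for_status()
print("Refresh queued:", resp.status_code)
```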

Posted 1 week ago

Apply

1.0 - 5.0 years

5 - 6 Lacs

Bengaluru

Work from Office

Job Requirements Summary: We're seeking an experienced MLOps Engineer to build and maintain our computer vision infrastructure on AWS. The ideal candidate will develop model training pipelines, build a comprehensive image data lake with advanced search capabilities, implement active learning pipelines for efficient annotation, and create frameworks enabling customers to deploy their own deep learning models. This role combines MLOps expertise with data engineering to create scalable, production-ready computer vision systems.

Responsibilities: Design and implement end-to-end computer vision ML training pipelines on AWS SageMaker for model training, validation, deployment, and monitoring. Architect and build a scalable image data lake solution enabling multi-modal search capabilities (structured metadata, image-to-image, text-to-image), along with data upload capability from edge devices. Develop vector embedding pipelines for visual content using AWS services and deep learning frameworks. Create APIs for seamless integration with third-party annotation services and automated dataset creation. Implement active learning pipelines that intelligently select high-value images for annotation, optimizing annotation ROI. Build data quality and validation frameworks to ensure consistency across the annotation lifecycle. Develop infrastructure automation using AWS CloudFormation/CDK for scalable deep learning workflows. Establish monitoring systems for data drift, annotation quality, and model performance. Create skeleton frameworks and templates enabling customers to deploy their own deep learning models. Optimize storage and retrieval mechanisms for large-scale image repositories.

Work Experience Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5+ years of experience in MLOps or ML Engineering with a focus on computer vision applications. Experience building data lakes or large-scale data repositories for unstructured data. Strong understanding of vector databases, embedding models, and similarity search algorithms. Hands-on experience with AWS services (S3, SageMaker, Lambda, Step Functions, Glue). Proficiency in Python and experience with PyTorch or TensorFlow. Experience implementing active learning systems for optimizing annotation workflows. Knowledge of RESTful API design and integration with third-party services. Familiarity with annotation tools and workflows for computer vision datasets. Experience with containerization (Docker) and orchestration (Kubernetes/EKS). Understanding of data governance and security best practices for sensitive image data.
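
To make the active-learning idea concrete (an illustrative sketch, not the company's pipeline): the simplest selection strategy is least-confidence sampling, which routes to annotators the unlabeled images the current model is least sure about.

```python
import numpy as np

def least_confidence_selection(probs: np.ndarray, image_ids: list, budget: int) -> list:
    """Pick the `budget` images whose top-class probability is lowest.

    probs: (n_images, n_classes) softmax outputs from the current model.
    Returns image ids to send for annotation, most uncertain first.
    """
    confidence = probs.max(axis=1)   # model's confidence = top class probability
    order = np.argsort(confidence)   # ascending: least confident first
    return [image_ids[i] for i in order[:budget]]

# Toy example: 4 unlabeled images, 3 classes, annotation budget of 2.
probs = np.array([
    [0.98, 0.01, 0.01],   # confident -> skip
    [0.40, 0.35, 0.25],   # uncertain -> annotate
    [0.34, 0.33, 0.33],   # most uncertain -> annotate
    [0.90, 0.05, 0.05],
])
ids = ["img_001", "img_002", "img_003", "img_004"]  # hypothetical ids
print(least_confidence_selection(probs, ids, budget=2))  # ['img_003', 'img_002']
```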

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 7 Lacs

Mumbai, Gurugram, Chennai

Work from Office

We're looking for a mid-level Marketing Operations Analyst to join our distributed team and support campaign execution across both B2B and B2C initiatives. You'll play a key role in building and launching campaigns that drive engagement and lead generation across international markets.

What You'll Do: Build and launch email campaigns, forms, landing pages, and automation flows in Marketo. Apply standard processes for tokenization, dynamic content, segmentation, email scripting (e.g., Velocity), and program cloning. Set up and handle lead scoring rules, champion programs, and triggered workflows in alignment with global campaign strategy. Implement campaign tracking (e.g., UTMs, SFDC campaign association, program statuses) for downstream attribution and reporting; a small worked example follows this posting. Troubleshoot issues related to deliverability, rendering, data flow, and sync errors with the CRM. Work closely with marketers in North America, EMEA, and APAC to gather campaign requirements and ensure regional nuances (e.g., time zones, languages, regulations) are supported. Attend campaign intake meetings, status updates, and global syncs across time zones as needed. Coordinate closely with design, web, and localization teams to ensure timely campaign asset delivery and quality.

Marketing Operations Best Practices: Ensure consistent use of campaign templates, folder structures, naming conventions, and data governance protocols. Maintain campaign calendars, project trackers, and intake tickets using project management tools. Support continuous process improvement by flagging inefficiencies or common problems and suggesting automation or documentation improvements. Uphold compliance with global privacy standards (e.g., GDPR, CAN-SPAM) and suppression list management.

B2B & B2C Support: Implement B2B campaigns focused on lead generation, champion programs, ABM, and event follow-up. Support B2C lifecycle and retention campaigns such as upsell/cross-sell journeys and transactional communications.

What We're Looking For: 3-5 years of experience in Marketing Operations or Marketing Automation roles. Hands-on experience with Marketo, HubSpot, Eloqua, or similar, including email builds, workflows, segmentation, and reporting setup. Familiarity with HTML/CSS for email editing, and use of tokens or scripting for personalization. Experience working with Salesforce CRM or similar platforms for campaign tracking and lead flow. Strong communication and time management skills to operate effectively in a global, remote team. Comfortable balancing multiple campaign requests and shifting timelines. Experience supporting B2B marketing teams. Exposure to marketing technologies like 6sense, PathFactory, or others. Understanding of lead lifecycle management, data hygiene, and audience segmentation strategy. Familiarity with deliverability tools and compliance frameworks. Marketo Certified Professional and/or Marketo Certified Expert preferred; Excel and reporting skills are good to have.

Work and life: Find your happy medium at Amex GBT. Flexible benefits are tailored to each country and start the day you do. These include health and welfare insurance plans, retirement programs, parental leave, adoption assistance, and wellbeing resources to support you and your immediate family. Travel perks: get a choice of deals each week from major travel providers on everything from flights to hotels to cruises and car rentals.
Develop the skills you want when the time is right for you, with access to over 20,000 courses on our learning platform, leadership courses, and new job openings available to internal candidates first. We strive to champion Inclusion in every aspect of our business at Amex GBT. You can connect with colleagues through our global INclusion Groups, centered around common identities or initiatives, to discuss challenges, obstacles, achievements, and drive company awareness and action. And much more.
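
The worked example referenced in the posting above: UTM parameters are a public convention for campaign tracking, so tagging links can be automated. A small sketch with invented campaign names:

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def add_utm(url: str, source: str, medium: str, campaign: str, content: str = "") -> str:
    """Append standard UTM parameters so downstream attribution can group clicks."""
    parts = urlsplit(url)
    params = {"utm_source": source, "utm_medium": medium, "utm_campaign": campaign}
    if content:
        params["utm_content"] = content
    query = (parts.query + "&" if parts.query else "") + urlencode(params)
    return urlunsplit((parts.scheme, parts.netloc, parts.path, query, parts.fragment))

# Hypothetical email campaign link:
print(add_utm("https://example.com/demo", "marketo", "email", "q3-apac-webinar", "cta-button"))
# https://example.com/demo?utm_source=marketo&utm_medium=email&utm_campaign=q3-apac-webinar&utm_content=cta-button
```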

Posted 1 week ago

Apply

15.0 - 20.0 years

40 - 100 Lacs

Bengaluru

Hybrid

Hiring: Investment Management and Risk Data Product Owner - ISS Data (Associate Director)

Role: The Investment and Risk & Attribution Data Product Owner role is instrumental in the creation and execution of a future-state design for investment and risk data across our key business areas. The successful candidate will have in-depth knowledge of all data domains that service investment management, risk, and attribution capabilities within the asset management industry. The role sits within the ISS Delivery Data Analysis chapter, fully aligned to delivery of the cross-functional ISS Data Programme in Technology, and the candidate will leverage their extensive industry knowledge to build a future-state platform in collaboration with Business Architecture, Data Architecture, and business stakeholders. The role is also to maintain strong relationships with the various business contacts to ensure a superior service to our clients.

Key Responsibilities

Leadership and Management: Lead the Investment and Risk data outcomes and capabilities for the ISS Data Programme. Realign existing resources and provide coaching and line management for junior data analysts within the chapter; influence and motivate them for high performance. Define the data product vision and strategy with end-to-end thought leadership. Lead data product documentation, enable peer reviews, obtain analysis effort estimations, maintain the backlog, and support end-to-end planning. Be a catalyst of change for improving efficiencies and innovation.

Data Quality and Integrity: Define data quality use cases for all the required data sets and contribute to the technical frameworks of data quality. Align the functional solution with best-practice data architecture and engineering.

Coordination and Communication: Communicate at senior management level to influence senior tech and business stakeholders globally and get alignment on the roadmaps. Act as an advocate for the ISS Data Programme. Coordinate with internal and external teams to communicate with those impacted by data flows. Collaborate closely with Data Governance, Business Architecture, and Data Owners. Conduct workshops within the scrum teams and across business teams, effectively document the minutes, and drive the actions.

Essential Skills Required: Strong leadership and senior-management-level communication, internal and external client management, and influencing skills. At least 15 years of proven experience as a senior business/technical/data analyst within technology and/or business change, delivering data-led business outcomes within the financial services/asset management industry. 5-10 years as a data product owner adhering to agile methodology, delivering data solutions using industry-leading data platforms such as Snowflake, State Street Alpha Data, Refinitiv Eikon, SimCorp Dimension, BlackRock Aladdin, FactSet, etc. In-depth knowledge of how data vendor solutions such as Rimes, Bloomberg, MSCI, and FactSet support Investment, Risk, Performance, and Attribution business needs. Outstanding knowledge of the data life cycle that drives investment management, such as research, order management, trading, risk, and attribution.

In-depth expertise in data and calculations across the investment industry, covering the below:
Financial data: This includes information on asset prices, market trends, economic indicators, interest rates, and other financial metrics that help in evaluating asset performance and making investment decisions.
Asset-specific data: This includes data related to financial instruments reference data, like asset specifications, maintenance records, usage history, and depreciation schedules.
Market data: This includes data like security prices, exchange rates, index constituents, and licensing restrictions on them.
Risk data: This includes data related to risk factors such as market risk, credit risk, operational risk, and compliance risk.
Performance & Attribution data: This includes data on fund performance returns and attribution using various methodologies, like Time Weighted Returns and transaction-based performance attribution.

Should possess problem-solving skills, attention to detail, and critical thinking. Technical Skills: Hands-on SQL, Advanced Excel, Python, ML (optional), and knowledge of end-to-end tech solutions involving data platforms. Knowledge of data management, data governance, and data engineering practices. Hands-on experience with data modelling techniques such as dimensional and data vault. Willingness to own and drive things, and collaboration across business and tech stakeholders.
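
For context, the Time Weighted Returns mentioned above follow a standard calculation: sub-period returns are geometrically linked so that external cash flows do not distort the result. A small sketch with invented figures:

```python
def time_weighted_return(period_returns: list[float]) -> float:
    """Geometrically link sub-period returns: TWR = prod(1 + r_i) - 1."""
    twr = 1.0
    for r in period_returns:
        twr *= 1.0 + r
    return twr - 1.0

# Three monthly sub-period returns of +2%, -1%, +3% (illustrative figures):
print(f"{time_weighted_return([0.02, -0.01, 0.03]):.4%}")  # 4.0094%
```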

Posted 1 week ago

Apply

11.0 - 20.0 years

20 - 30 Lacs

Goregaon, Mumbai (All Areas)

Work from Office

An Opportunity to Work with One of India's Leading Credit Card Tech Innovators - BOBCARD (A Bank of Baroda Subsidiary)

Education: BE/B.Tech, BCA/MCA, BSc/MSc in Computer Science, IT, or related field. Experience: 10-17 years. Location: Goregaon, Mumbai (5 days a week from office). Domain: Fintech/BFSI/NBFC (mandatory).

The AVP - Data Management (IT) will be responsible for leading the organization's data strategy, ensuring data integrity, governance, and analytics to drive informed business decisions. This role requires a deep understanding of data management best practices, analytical tools, and the ability to work cross-functionally to enhance business performance through data-driven insights.

Key Responsibilities:
Strategic Data Leadership: Develop and implement a comprehensive data strategy that aligns with organizational objectives. Drive data-driven decision-making by integrating analytics into business processes.
Data Governance & Compliance: Establish and enforce data governance policies to ensure data accuracy, integrity, and security. Ensure compliance with regulatory requirements and industry best practices.
Analytics & Business Intelligence: Design and deploy analytical solutions to enhance operational efficiency and uncover new business opportunities. Develop and maintain dashboards, reports, and predictive models to support key business functions.
Collaboration & Stakeholder Engagement: Work closely with business leaders to understand their data needs and provide actionable insights. Partner with IT teams to optimize data infrastructure and implement advanced analytics capabilities.
Technology Evaluation & Innovation: Stay updated on emerging data analytics trends and recommend tools and technologies to enhance capabilities. Evaluate and implement modern data management platforms to improve efficiency and scalability.
Performance Measurement & Reporting: Define and track key performance indicators (KPIs) to assess the success of data initiatives. Present analytical findings to executive leadership, showcasing the impact of data-driven strategies.
Budget & Resource Management: Oversee budget allocation for data management initiatives, ensuring optimal resource utilization. Identify cost-effective solutions to enhance data capabilities within the organization.

Desired Candidate Profile: Proven experience in developing and executing data analytics strategies that drive business value. Strong knowledge of data analytics tools and technologies (e.g., Tableau, Power BI, SQL, Python, R) and data management platforms (e.g., Hadoop, Snowflake). Experience with data governance frameworks, compliance regulations, and industry best practices. Excellent analytical and problem-solving skills with the ability to interpret complex datasets and derive actionable insights. Strong communication and interpersonal skills, capable of engaging with both technical and non-technical stakeholders. Demonstrated ability to lead cross-functional teams and drive collaboration between IT and business units.

You can apply directly at our careers page: https://bobcard.turbohire.co/job/publicjobs/MTPIC24RMImQZp%2FAEd6tTfczRMW72U%2FDf5ixj3NpXri1y4_Kcc5Y3ZCXNdjnRNAx

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

karnataka

On-site

You are a highly skilled and experienced Azure Data Architect & Senior Data Engineer who will be responsible for designing, implementing, and maintaining robust data solutions that support the business objectives. You have a deep understanding of data engineering principles, cloud technologies, and the Azure platform, and you possess expertise in data modeling, integration, design, and data management. You can produce relevant data models and design the organization's data flow. You have experience with Azure cloud solutions related to data acquisition, transformation, storage, analysis, and loading, including SQL application data and unstructured data. Your proficiency extends to SQL database languages such as T-SQL, Databricks SQL, PL/SQL, and MySQL; you understand the advantages and drawbacks of each and can set them up effectively and securely across different data platforms and environments.

You excel at building and maintaining data pipelines and workflows using tools like Azure Data Factory, Azure Functions, and others. You develop ETL/ELT processes to extract, transform, and load data from various sources into data warehouses and data lakes. Implementing data integration solutions using Azure Synapse Analytics, Azure SQL Database, and other Azure data storage options is also within your skill set.

In addition, you develop and maintain comprehensive data architecture blueprints and roadmaps aligned with the business strategy. You assess and optimize existing data systems and processes for efficiency and scalability. Designing and implementing data governance frameworks to ensure data quality and integrity is part of your responsibilities.

You have strong proficiency in Azure data services, including Azure Data Factory, Azure Synapse Analytics, Azure SQL Database, Azure Data Lake Storage, Azure Cosmos DB, and Azure Databricks. You are an expert in data modeling, data warehousing, and data lake concepts. Your knowledge also extends to data engineering tools and technologies, such as ETL/ELT tools, data integration frameworks, and data quality tools. Programming languages like Python, SQL, and Scala are familiar to you, and you have an understanding of data governance and compliance frameworks such as GDPR and CCPA.

You hold a Bachelor's degree in Computer Science, Engineering, or a related field, with 10+ years of experience in data engineering and data architecture roles, including at least 5+ years of hands-on experience with Azure data services such as Azure Synapse, Azure SQL Database, Azure Cosmos DB, Azure Data Lake, Azure Data Factory, and Azure Databricks, or related services. Proven experience working with Azure cloud technologies, strong analytical and problem-solving skills, excellent communication and interpersonal skills, and the ability to work independently and as part of a team are among your key qualifications. Certifications related to Azure data technologies, such as Azure Certified Data Engineer, are considered a plus.
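
A minimal sketch of the kind of ETL/ELT step this posting describes, written as PySpark of the sort you might run on Azure Databricks. The ADLS Gen2 account, containers, and schema are invented, and the cluster is assumed to already have credentials configured for the lake.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-elt").getOrCreate()

# Hypothetical ADLS Gen2 paths (abfss://<container>@<account>.dfs.core.windows.net/...)
raw_path = "abfss://raw@examplelake.dfs.core.windows.net/sales/orders/"
curated_path = "abfss://curated@examplelake.dfs.core.windows.net/sales/orders_daily/"

# Extract landing-zone JSON -> transform (cleanse, aggregate) -> load as Delta.
orders = (
    spark.read.json(raw_path)
    .where(F.col("order_id").isNotNull())          # basic data-quality gate
    .withColumn("order_date", F.to_date("order_ts"))
)
daily = orders.groupBy("order_date").agg(
    F.count("*").alias("order_count"),
    F.sum("amount").alias("revenue"),
)
daily.write.format("delta").mode("overwrite").save(curated_path)
```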

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

chennai, tamil nadu

On-site

As an Azure Data Engineering Director, you will play a pivotal role in leading the data strategy and operations for our EDP Cloud Fabric. Your expertise will be essential in establishing resilience through a multi-cloud model and enabling key capabilities such as Power BI and OpenAI from Microsoft. Collaborating with leads across GTIS, CSO, and CTO, you will accelerate the introduction and adoption of new designs on Azure.

Your key responsibilities will include defining and executing a comprehensive data strategy aligned with business objectives, leveraging Azure services for innovation in data processing, analytics, and insights delivery. You will architect and manage large-scale data platforms using Azure tools like Azure Data Factory, Azure Synapse Analytics, Databricks, and Cosmos DB, optimizing data engineering pipelines for performance, scalability, and cost-efficiency. Furthermore, you will establish robust data governance frameworks to ensure compliance with industry regulations; oversee data quality, security, and consistency across all platforms; and build, mentor, and retain a high-performing data engineering team. Collaboration with cross-functional stakeholders to bridge technical and business objectives will be a key aspect of your role. You will also ensure data readiness for AI/ML initiatives, drive the adoption of real-time insights through event-driven architectures, streamline ETL/ELT processes for faster data processing and reduced downtime, and identify and implement cutting-edge Azure technologies to create new revenue streams through data-driven innovation.

In this role, you will be accountable for building and maintaining data architectures and pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to build and deploy machine learning models. You will manage a business function, provide input to strategic initiatives, and lead a large team or sub-function, embedding a performance culture aligned with the organization's values. Additionally, you will provide expert advice to senior management, manage resourcing and budgeting, and foster compliance within the function.

As a Senior Leader, you are expected to demonstrate a clear set of leadership behaviors, including listening and authenticity, energizing and inspiring others, aligning across the enterprise, and developing colleagues. Upholding the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, along with the Barclays Mindset of Empower, Challenge, and Drive, will be essential in creating an environment for colleagues to thrive and deliver to an excellent standard.

Posted 1 week ago

Apply

0.0 - 4.0 years

0 Lacs

haryana

On-site

As the Financial Services (FSO) division of Ernst & Young, we offer you the unique opportunity to be part of a professional services organization dedicated exclusively to the financial services marketplace. Joining our multi-disciplinary teams from around the world, you will play a crucial role in delivering a global perspective. Aligned with key industry groups such as asset management, banking and capital markets, insurance, and private equity, we offer integrated advisory, assurance, tax, and transaction services. Through diverse experiences, world-class learning opportunities, and individually tailored coaching, you will undergo continuous professional development. Our focus is on developing exceptional leaders who collaborate effectively to fulfill our commitments to all stakeholders, thereby contributing significantly to building a better working world for our people, clients, and communities. Excited to be a part of this journey? This is just the beginning, as the exceptional EY experience will stay with you for a lifetime.

As a future FSO Technology Consultant at EY, you will be part of a team that helps clients navigate complex industry challenges and leverage technology to enhance business operations. Your role will involve addressing business and strategic challenges such as business and solution architecture, digital transformation, project management, and design of digital operating models. Additionally, you will work on technical matters including data science, advanced analytics, IoT, data governance, blockchain, artificial intelligence, and robotic process automation. Joining our team means working on critical projects within the financial services landscape, with opportunities to transition between teams as both you and our dynamic business continue to grow and evolve. Your contributions will be instrumental in propelling EY to new heights.

We are currently seeking individuals for the following positions:
- Cybersecurity
- Digital
- Platform
- Data & Analytics

To qualify for a role in our team, you must have:
- A Bachelor's or Master's Degree in (Business) Engineering, Computer Science, Information Systems Management, Mathematics, (applied) Economics, or a related field, with an interest in cutting-edge technologies.
- Strong analytical skills.
- Knowledge of project management methodologies, including agile, traditional, and hybrid approaches.
- Proficiency in English at an advanced level.
- Experience in team leadership.
- Exceptional oral and written communication abilities.

If you believe you meet the above criteria, we encourage you to apply at your earliest convenience. The exceptional EY experience awaits you, ready for you to shape and build upon.

Posted 1 week ago

Apply

13.0 - 17.0 years

0 Lacs

punjab

On-site

The Deputy General Manager - Finance Master Data Lead at Bunge, reporting to the Global EDM Head, plays a crucial role in leading and supporting enterprise master data programs to drive Bunge's strategic initiatives encompassing Digital programs, Data Integration, and S4 HANA implementation for the Finance data domain, including Cost center, GL, Profit center, and Company code. In this role, you will be accountable for overseeing the development of business solutions, ensuring data quality, and successfully implementing designed solutions across all geographic regions and Bunge's businesses. Your responsibilities will include collaborating with multiple stakeholders from Business, IT, and other areas globally to define and achieve mutually agreed outcomes within the master data domains.

As a techno-functional expert in Master Data Management for the Finance data type, you will work closely with Business Subject Matter Experts to understand detailed requirements for providing effective solutions, Business Data Owners to ensure alignment with domain priorities, and business functional area leaders to ensure scope and deliverables meet business needs. You will collaborate with IT teams to gather requirements effectively, work alongside technical Solution Architects to ensure alignment with the solution direction, engage with Bunge Business Services leaders to ensure globally standardized solutions, and coordinate with Delivery Partner teams to ensure high-quality delivery that meets business expectations.

Key functions of this role include spearheading end-to-end business requirements, engaging with stakeholders to gather requirements, defining project scope and deliverables, leading business UAT, managing large, complex global projects, and ensuring successful implementation of master data solutions without disrupting business processes. Additionally, you will build relationships with internal and external service providers, guide project teams, maintain an in-depth understanding of master data processes, lead Continuous Improvement initiatives, and contribute to strategic directions governing master data.

To be successful in this role, you must have a minimum of 13-15 years of professional data management experience, including at least 8-10 years of experience in providing business solutions and working with SAP HANA and MDG/MDM. An educational background in CA, M.Com, ICWA, B.Tech, or MBA Finance is preferred. Moreover, you should possess hands-on knowledge of technologies such as SAP MDG, S4 HANA, Data Lake, Data Model, and MDM, as well as expertise in Master Data Management for the Finance data type and experience with Data Dictionaries and Metadata management. Strong leadership, service delivery, and project management skills are essential for this role, along with the ability to work effectively in virtual teams across different cultures and time zones.

Bunge, a world leader in sourcing, processing, and supplying oilseed and grain products, aims to create sustainable products and opportunities for farmers and consumers worldwide. With a global network and a dedicated workforce, Bunge is committed to feeding and fueling a growing world while upholding its values of innovation and sustainability.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

You will be the visionary Group Data Product Manager (GPM) for AI/ML & Metadata Management, responsible for leading the development of advanced AI/ML-powered metadata solutions. Your primary focus will be on establishing a cohesive and intuitive Data Platform tailored to a variety of user roles, including data engineers, producers, and consumers. Your role involves integrating various tools to create a unified platform that will significantly improve data discoverability, governance, and operational efficiency at scale.

Posted 1 week ago

Apply

10.0 - 18.0 years

2 - 3 Lacs

Hyderabad

Work from Office

Experience needed: 12-18 years. Type: Full-Time. Mode: WFO. Shift: General Shift IST. Location: Hyderabad. NP: Immediate joiner to 30 days.

Job Summary: We are looking for an experienced and visionary Data Architect - Azure Data & Analytics to lead the design and delivery of scalable, secure, and modern data platform solutions leveraging Microsoft Azure and Microsoft Fabric. This role requires deep technical expertise in the Azure Data & Analytics ecosystem, strong experience in designing cloud-native architectures, and a strategic mindset to modernize enterprise data platforms.

Key Responsibilities: Architect and design Modern Data Platform solutions on Microsoft Azure, including ingestion, transformation, storage, and visualization layers. Lead implementation and integration of Microsoft Fabric, including OneLake, Direct Lake mode, and Fabric workloads (Data Engineering, Data Factory, Real-Time Analytics, Power BI). Define enterprise-level data architecture, including data lakehouse patterns, delta lakes, data marts, and semantic models. Collaborate with business stakeholders, data engineers, and BI teams to translate business needs into scalable cloud data solutions. Design solutions using Azure-native services such as Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage (Gen2), Azure SQL, and Azure Event Hubs. Establish best practices for data security, governance, DevOps, CI/CD pipelines, and cost optimization. Guide implementation teams on architectural decisions and technical best practices across the data lifecycle. Develop reference architectures and reusable frameworks for accelerating data platform implementations. Stay updated on Microsoft's data platform roadmap and proactively identify opportunities to enhance the data strategy. Assist in developing RFPs, architecture assessments, and solution proposals.

Required Skills & Qualifications: Proven 12-18 years of experience, including designing and implementing cloud-based modern data platforms on Microsoft Azure. Deep knowledge and understanding of the Microsoft Fabric architecture, including Data Factory, Data Engineering, Synapse Real-Time Analytics, and Power BI integration. Expertise in Azure Data Services: Azure Synapse, Data Factory, Azure SQL, ADLS Gen2, Azure Functions, Azure Purview, Event Hubs, etc. Experience with data warehousing, lakehouse architectures, ETL/ELT, and data modeling. Experience with data governance, security, role-based access (Microsoft Entra/Azure AD), and compliance frameworks. Strong leadership and communication skills to influence both technical and non-technical stakeholders. Familiarity with DevOps and infrastructure-as-code (e.g., ARM templates, Bicep, Terraform) is a plus.

Preferred Qualifications: Microsoft Certified: Azure Solutions Architect Expert, Azure Data Engineer Associate, or Microsoft Fabric certification. Experience with real-time data streaming, IoT, or machine learning pipelines in Azure. Familiarity with multi-cloud data strategies or hybrid deployments is an advantage.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

As a candidate for this position, you should have a strong understanding of Service-Oriented Architecture (SOA) and microservices, particularly in the context of a cloud data platform. Your knowledge of full-stack development, encompassing front-end and back-end technologies such as Angular, React, and Node.js, will be crucial for collaborating on data access and visualization layers. In addition, experience with relational databases (e.g., PostgreSQL, MySQL), NoSQL databases, and columnar databases like BigQuery is required for effective database management. You will be responsible for understanding data governance frameworks and implementing Role-Based Access Control (RBAC), encryption, and data masking in cloud environments to ensure data security. Familiarity with CI/CD pipelines, Infrastructure as Code (IaC) tools like Terraform, and automation frameworks will be essential for streamlining processes. Strong analytical skills are necessary for troubleshooting complex data platform and microservices issues. You should hold a B.Tech or M.Tech degree to qualify for this role.

As part of your key job responsibilities, you will design and build scalable data pipelines and microservices on Google Cloud Platform (GCP) to support real-time and batch processing. Designing and implementing SOA- and microservices-based architectures, ensuring modular, flexible, and maintainable data solutions, will be a core focus of your role. Utilizing your full-stack expertise, you will contribute to the seamless integration of front-end and back-end components, enabling robust data access and UI-driven data exploration. You will lead data ingestion and integration efforts from various sources into the data platform, standardizing and optimizing data for analytics. Leveraging GCP services like BigQuery, Dataflow, Pub/Sub, and Cloud Functions, among others, you will build and manage data platforms aligned with business needs. You will implement and manage data governance, access controls, and security best practices while utilizing GCP's native security features. Continuous monitoring and improvement of the performance, scalability, and efficiency of data pipelines and storage solutions will be required. Collaboration with data architects, software engineers, and cross-functional teams to define best practices, design patterns, and frameworks for cloud data engineering is essential. Lastly, automating data platform processes to enhance reliability, reduce manual intervention, and improve operational efficiency will be a key focus area in this role.
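
To ground the Pub/Sub-plus-BigQuery ingestion pattern described above, here is a small hypothetical sketch using the first-generation Cloud Functions Python signature; the project, dataset, and message fields are invented.

```python
import base64
import json

from google.cloud import bigquery

TABLE_ID = "my-project.analytics.events"  # hypothetical project.dataset.table
client = bigquery.Client()

def ingest_event(event, context):
    """Background Cloud Function triggered by a Pub/Sub message.

    Decodes the message payload and streams one row into BigQuery.
    """
    payload = json.loads(base64.b64decode(event["data"]).decode("utf-8"))
    row = {
        "event_id": payload["id"],                  # field names are assumptions
        "event_type": payload.get("type", "unknown"),
        "occurred_at": payload["ts"],
    }
    errors = client.insert_rows_json(TABLE_ID, [row])  # streaming insert
    if errors:
        # Raising makes Pub/Sub redeliver the message (at-least-once semantics).
        raise RuntimeError(f"BigQuery insert failed: {errors}")
```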

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

haryana

On-site

You will play a crucial role in enhancing the analytics capabilities for our businesses. Your responsibilities will include engaging with key stakeholders to comprehend Fidelity's sales, marketing, client services, and propositions context. You will collaborate with internal teams, such as the data support team and technology team, to develop new tools, capabilities, and solutions. Additionally, you will work closely with IS Operations to expedite the development and sharing of customized data sets.

Maximizing the adoption of cloud-based data management services will be a significant part of your role. This involves setting up sandbox analytics environments using platforms like Snowflake, AWS, Adobe, and Salesforce. You will also support data visualization and data science applications to enhance business operations.

In terms of stakeholder management, you will work with key stakeholders to understand business problems and translate them into suitable analytics solutions, and you are expected to facilitate smooth execution, delivery, and implementation of these solutions through effective engagement. Your role will also involve collaborating with the team to share knowledge and best practices, including coaching on deep learning and machine learning methodologies. Taking independent ownership of projects and initiatives within the team is crucial, demonstrating leadership and accountability. Furthermore, you will be responsible for developing and evaluating tools, methodologies, and infrastructure to address long-term business challenges; this may involve enhancing modelling software, methodologies, data requirements, and optimization environments to elevate the team's capabilities.

To excel in this role, you should possess 5 to 8 years of overall experience in analytics, with at least 4 years of experience in SQL, Python, open-source machine learning libraries, and deep learning. Experience working in an AWS environment, preferably using Snowflake, is preferred. Proficiency in analytics applications such as Python, SAS, and SQL, and in interpreting statistical results, is necessary. Knowledge of Spark, Hadoop, and big data platforms will be advantageous.
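
A tiny, hypothetical illustration of the Snowflake-on-AWS analytics work mentioned above, using the snowflake-connector-python package. Every credential and table name is a placeholder; in practice these would come from a secrets manager rather than source code.

```python
import snowflake.connector

# Placeholders only - real values would come from a secrets manager.
conn = snowflake.connector.connect(
    account="xy12345.ap-southeast-1",
    user="ANALYTICS_SVC",
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES",
    schema="PUBLIC",
)
try:
    cur = conn.cursor()
    cur.execute(
        """
        SELECT region, COUNT(*) AS clients, SUM(aum) AS total_aum
        FROM client_holdings          -- hypothetical table
        GROUP BY region
        ORDER BY total_aum DESC
        """
    )
    df = cur.fetch_pandas_all()  # requires the pandas extra: snowflake-connector-python[pandas]
    print(df.head())
finally:
    conn.close()
```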

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

hyderabad, telangana

On-site

As a Principal Data Engineer at Skillsoft, you will play a crucial role in driving the advancement of our enterprise data infrastructure by designing and implementing the logic and structure for how data is set up, cleansed, and stored for organizational usage. You will be responsible for developing a Knowledge Management strategy to support Skillsoft's analytical objectives across various business areas.

Your role will involve building robust systems and reusable code modules to solve problems, working with the latest open-source tools and platforms to build data products, and collaborating with Product Owners and cross-functional teams in an agile environment. Additionally, you will champion the standardization of processes for data elements used in analysis, establish forward-looking data and technology objectives, manage a small team through project deliveries, and design rich data visualizations and interactive tools to communicate complex ideas to stakeholders. Furthermore, you will evangelize the Enterprise Data Strategy & Execution Team mission, identify opportunities to influence decision-making with supporting data and analysis, and seek additional data resources that align with strategic objectives.

To qualify for this role, you should possess a degree in Data Engineering, Information Technology, CIS, CS, or a related field, along with 7+ years of experience in Data Engineering/Data Management. You should have expertise in building cloud data applications, cloud computing, data engineering/analysis programming languages, and SQL Server. Proficiency in data architecture and data modeling, and experience with technology stacks for Metadata Management, Data Governance, and Data Quality, are essential. Additionally, experience working cross-functionally across an enterprise organization and in an Agile methodology environment is preferred. Your strong business acumen, analytical skills, technical abilities, and problem-solving skills will be critical in this role. Experience with app and web analytics data, CRM, and ERP systems data is a plus.

Join us at Skillsoft and be part of our mission to democratize learning and help individuals unleash their edge. If you find this opportunity intriguing, we encourage you to apply and be part of our team dedicated to leadership, learning, and success at Skillsoft. Thank you for considering this role.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

hyderabad, telangana

On-site

As one of the world's leading asset managers, Invesco is dedicated to helping investors worldwide achieve their financial objectives. By delivering the combined power of our distinctive investment management capabilities, we provide a wide range of investment strategies and vehicles to our clients around the world. If you're looking for challenging work, smart colleagues, and a global employer with a social conscience, come explore your potential at Invesco. Make a difference every day! --- As one of the world's leading asset managers, we are solely dedicated to delivering an investment experience that helps people get more out of life. If you're looking for challenging work, smart colleagues, and a global employer with a social conscience, explore your potential at Invesco. --- The Department: Invesco understands data and the products created from data are the lifeblood of our business. The Distribution Data Office (DDO) is a linchpin of an ecosystem of data-first functions that will work together seamlessly to enable Invesco to achieve the true value of data and data products. DDO will empower Invesco to leverage data as a strategic asset by making quality, trusted data and content available to the right people, at the right time, in the right format, in the most efficient way possible to enable both large transformations and day-to-day operations. --- Your Role: The Data Product Owner sits within the Distribution Data Office. They maintain good relationships with internal partners and collaborate with their peers in business units within the Distribution domain. They develop a strong understanding of the distribution data, including both how it is produced and consumed across domains and how to apply strategy and practices to drive Distributions data capabilities forward. They have a good understanding of strategy, capabilities, and stated data operating model and associated roles and responsibilities, and work with Data Governance to assure this op model is established and optimized across all roles related to Distribution data. All work is done in partnership with their counterparts in Investments and Corporate Shared Services to ensure consistency of the data services and deliverables, as well as in partnership with their Technology counterparts. --- The Data Product Owner executes on the defined data strategy and roadmaps to improve the quality, availability, and usage of their data. They are accountable for the development and execution of detailed deliverables for the areas directly related to distribution data and participate in large transformational projects as well as internal initiatives and BAU/continuous enhancement. They have a good understanding of their business constituents" needs and use cases as it relates to distribution data, and they work with other functions to understand how this data ties to other data throughout the enterprise to ultimately create business-facing data products. They translate these use cases and requirements into actionable backlogs of work for themselves and the delivery teams, performing complex hands-on-keys work and collaborating with the broader delivery teams, both business analysts and technical engineering squads. 
This individual:
- Must be comfortable working in an agile development environment
- Can handle multiple requests from varied stakeholders in a way that maintains clear priorities, adjusting when needed
- Is an effective translator of business needs into technical requirements
- Has an inquisitive and innovative mindset with a demonstrated ability to recognize opportunities to create distinctive value
- Demonstrates an ability to build relationships, collaborate, mentor, motivate, and influence internal and external teams
- Is able to work independently when needed, take initiative, and complete projects on time and with great attention to detail

You will be responsible for:
- Executing and driving distribution data deliverables within the defined overall data strategy and roadmap (functional analyses, requirements gathering and translation into specific data deliverables, ensuring adherence to architectural and technical standards, etc.) as related to their projects
- Partnering with business and technology peers to ensure priorities are agreed and deliverables executed with diligence and in accordance with business needs
- Articulating dependencies across projects requiring distribution data as an input, escalating risks to leadership as required
- Capturing business use cases and requirements related to distribution data, translating them into data requirements, and partnering with Technology in functional and technical design discussions to ensure the end data product meets business needs
- Decomposing requirements into executable Epics and high-level User Stories for themselves and any impacted squads
- Working with leadership to ensure that data-related business requirements align with broader business goals
- Performing business analysis and detailed requirements-capturing activities within projects and delivery squads as required
- Understanding the impacts of necessary technology and architecture priorities, such as a cloud-first strategy and tech-debt remediation, and ensuring they are reflected in solution recommendations
- Providing data-related input to Business Change Management activities within their projects to drive business user engagement and adoption and mitigate impact
- Acting as a change agent and driving adoption of data capabilities within their projects and with direct business partners
- Interfacing with direct business partners and individual users and articulating the value of data deliverables
- Playing a consultative role to identify opportunities for the people, processes, and tools within the delivery team to improve efficiency and effectiveness
- Coaching individual team members as needed to optimize the efficiency of the delivery team
- Identifying opportunities for continuous improvement of data management processes to reduce complexity, improve data quality, and increase efficiency throughout the data delivery lifecycle

The experience you bring:
- Minimum of 2 years of Product Ownership with asset management data, or related experience
- Knowledge of data capabilities, practices, and frameworks, including concepts related to master data management, data governance, business intelligence, and analytics, and their practical applications in delivering data products
- Knowledge of common data platforms and how data technologies function, with a lens towards practical application to business needs
- History of working on large, complex projects, preferably within an Agile framework
- Intellectual curiosity to gain a deep understanding of commercial business drivers and client needs in the investment management industry, and of how efficient use of data and content can facilitate meeting those needs
- Excellent interpersonal skills and a demonstrated ability to work effectively with project colleagues, peers within the Distribution Data Office, and peers across the enterprise
- Exceptional intellectual horsepower and a passion for excellence
- Comfortable dealing with ambiguity
- Solid business acumen, including the ability to think strategically, exhibit sound business judgment, and demonstrate a strong drive for results

Nice to haves:
- Background in Financial Services or Asset Management is a plus
- Background in Sales and Marketing data is a plus
- Experience working with multiple delivery squads is a plus
- Hands-on data engineering or data delivery experience is a plus

What's in it for you: Our people are at the very core of our success, and we strive to provide employees with a competitive total rewards package, which includes:
- 401(k) matching
- Flex time off
- Health and wellness benefits
- Work flexibility programs
- Parental leave benefits

The above information has been designed to indicate the general nature and level of work performed by employees within this role. It is not designed to contain, or be interpreted as, a comprehensive inventory of all duties, responsibilities, and qualifications required of employees assigned to this job. The job holder may be required to perform other duties as deemed appropriate by their manager from time to time.

Full Time / Part Time: Full time
Worker Type: Employee
Job Exempt (Yes / No): Yes

Workplace Model: At Invesco, our workplace model supports our culture and meets the needs of our clients while providing the flexibility our employees value. As a full-time employee, compliance with the workplace policy means working with your direct manager to create a schedule where you work in your designated office at least three days a week, with two days working outside an Invesco office.

Why Invesco: At Invesco, we act with integrity and do meaningful work to create an impact for our stakeholders. We believe our culture is stronger when we all feel we belong, and we respect each other's identities, lives, health, and well-being. We come together to create better solutions for our clients, our business, and each other by building on different voices and perspectives. We nurture and encourage each other to ensure meaningful growth, both personally and professionally.

We believe in a diverse, inclusive, and supportive workplace where everyone feels equally valued, and this starts at the top, with our senior leaders having diversity and inclusion goals. Our global focus on diversity and inclusion has grown exponentially, and we encourage connection and community through our many employee-led Business Resource Groups (BRGs).

What's in it for you: As an organization, we support personal needs and diverse backgrounds, and provide internal networks as well as opportunities to get involved in the community and in the world.
Our benefits policy includes, but is not limited to:
- Competitive Compensation
- Flexible, Hybrid Work
- 30 days Annual Leave + Public Holidays
- Life Insurance
- Retirement Planning
- Group Personal Accident Insurance
- Medical Insurance for Employee and Family
- Annual Health Check-up
- 26 weeks Maternity Leave
- Paternity Leave
- Adoption Leave
- Near-site Childcare Facility
- Employee Assistance Program
- Study Support
- Employee Stock Purchase Plan
- ESG Commitments and Goals
- Business Resource Groups
- Career Development Programs
- Mentoring Programs
- Invesco Cares
- Dress for your Day

At Invesco, we offer development opportunities that help you thrive as a lifelong learner in a constantly evolving business environment and ensure your continued growth. Our AI-enabled learning platform delivers curated content based on your role and interests. We also ensure our managers and leaders have many opportunities to advance their skills and competencies, which become pivotal in their continuous pursuit of performance excellence.

To know more about us:
About Invesco: https://www.invesco.com/corporate/en/home.html
About our Culture: https://www.invesco.com/corporate/en/about-us/our-culture.html
About our D&I policy: https://www.invesco.com/corporate/en/our-commitments/diversity-and-inclusion.html
About our CR program: https://www.invesco.com/corporate/en/our-commitments/corporate-responsibility.html

Apply for the role @ Invesco Careers: https://careers.invesco.com/india/

Posted 1 week ago

Apply

6.0 - 10.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

FCM is one of the world's largest travel management companies and a trusted partner for national and multinational organizations. With a 24/7 reach in 97 countries, FCM's flexible technology anticipates and resolves client needs, backed by experts offering in-depth local knowledge and a commitment to duty of care. As part of the ASX-listed Flight Centre Travel Group, FCM delivers the best market-wide rates, unique added-value benefits, and exclusive solutions. A leader in the travel tech space, FCM offers proprietary client solutions and provides specialist services through FCM Consulting and FCM Meetings & Events.

We are seeking a skilled Azure Data Engineer to join our dynamic team. The ideal candidate will have extensive experience in data engineering, working with Azure cloud services, and designing and implementing scalable data solutions. You will play a crucial role in developing, optimizing, and maintaining data pipelines and architectures, ensuring data quality and availability across various platforms.

Key Responsibilities:
- Design, develop, and maintain data pipelines and ETL processes using Azure Data Factory, Azure Databricks, and Azure Synapse Analytics.
- Build and optimize data storage solutions using Azure Data Lake, Azure SQL Database, and Azure Cosmos DB.
- Collaborate with data scientists, analysts, and business stakeholders to understand data requirements and deliver solutions.
- Implement data quality checks, data governance, and security best practices across data platforms (a minimal sketch of this kind of check follows this listing).
- Monitor, troubleshoot, and optimize data workflows for performance and scalability.
- Develop and maintain data models, data cataloging, and metadata management.
- Automate data integration and transformation processes using Azure DevOps and CI/CD pipelines.
- Stay up to date with emerging Azure technologies and data engineering trends.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 6+ years of experience in data engineering with a focus on Azure cloud services.
- Proficiency in Azure Data Factory, Azure Databricks, Azure Synapse Analytics, and Azure SQL Database.
- Strong experience with SQL, Python, or other scripting languages.
- Familiarity with data modeling, ETL design, and big data tools such as Hadoop or Spark.
- Experience with data warehousing concepts, data lakes, and data pipelines.
- Understanding of data governance, data quality, and security best practices.
- Excellent problem-solving skills and ability to work in a fast-paced, collaborative environment.

Preferred Skills:
- Azure certification (e.g., Microsoft Certified: Azure Data Engineer Associate) is a plus.
- Experience with Azure Logic Apps, Azure Functions, and API Management.
- Knowledge of Power BI, Tableau, or other data visualization tools.
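To give a concrete flavor of the data-quality responsibility above, here is a minimal, illustrative PySpark sketch of the kind of quality-gated pipeline step that might run on Azure Databricks. The storage paths, column names, and rules are hypothetical assumptions, not FCM's actual pipeline.

```python
# Illustrative only: read raw records from a lake path, split them into
# valid and rejected sets by simple quality rules, and write each onward.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("bookings_quality_check").getOrCreate()

# Hypothetical raw zone in Azure Data Lake Storage Gen2 (abfss:// scheme).
raw = spark.read.parquet("abfss://raw@examplelake.dfs.core.windows.net/bookings/")

# Basic quality rules: required key present, amount non-negative.
is_valid = F.col("booking_id").isNotNull() & (F.col("amount") >= 0)

clean = raw.filter(is_valid).dropDuplicates(["booking_id"])
rejected = raw.filter(~is_valid)

# Quarantine failures so data-quality issues stay visible downstream.
rejected.write.mode("append").parquet(
    "abfss://quarantine@examplelake.dfs.core.windows.net/bookings/"
)
clean.write.mode("overwrite").parquet(
    "abfss://curated@examplelake.dfs.core.windows.net/bookings/"
)
```

In practice, a step like this would typically run as one activity inside an Azure Data Factory or Databricks workflow rather than as a standalone script.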

Posted 1 week ago

Apply

7.0 - 12.0 years

0 Lacs

Maharashtra

On-site

As a Lead Data Engineer, you will leverage your 7 to 12+ years of hands-on experience in SQL database design, data architecture, ETL, Data Warehousing, Data Mart, Data Lake, Big Data, Cloud (AWS), and Data Governance domains. Your expertise in a modern programming language such as Scala, Python, or Java, with a preference for Spark/PySpark, will be crucial in this role.

The role requires experience with configuration management and version control tools such as Git, along with familiarity working within a CI/CD framework. Experience in building frameworks will be considered a significant advantage. A minimum of 8 years of recent hands-on SQL programming experience in a Big Data environment is necessary, with a preference for experience in Hadoop/Hive. Proficiency in PostgreSQL, RDBMS, NoSQL, and columnar databases will be beneficial.

Your hands-on experience with AWS cloud data engineering components, including API Gateway, Glue, IoT Core, EKS, ECS, S3, RDS, Redshift, and EMR, will play a vital role in developing and maintaining ETL applications and data pipelines using big data technologies. Experience with Apache Kafka, Spark, and Airflow is a must-have for this position (a small orchestration sketch follows this listing).

If you are excited about this opportunity and possess the required skills and experience, please share your CV with us at omkar@hrworksindia.com. We look forward to potentially welcoming you to our team.

Regards, Omkar
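As a rough illustration of the Kafka/Spark/Airflow stack named above, here is a minimal Airflow DAG that schedules a daily Spark batch job. The DAG id, schedule, and script path are hypothetical, and the plain spark-submit call stands in for whatever submission mechanism a real deployment uses.

```python
# Illustrative sketch: orchestrate one daily Spark batch job from Airflow.
# Assumes Airflow 2.4+ (for the `schedule` argument); all names are made up.
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="daily_events_batch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # one scheduled run per day
    catchup=False,      # skip backfilling runs missed while the DAG was paused
) as dag:
    # Submit the Spark job; a production setup might instead use the
    # SparkSubmitOperator from the Apache Spark provider package.
    run_spark_job = BashOperator(
        task_id="run_spark_job",
        bash_command="spark-submit --master yarn /opt/jobs/transform_events.py",
    )
```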

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Maharashtra

On-site

As a Data Engineer at Blis, you will be part of a globally recognized and award-winning team that specializes in big data analytics and advertising. We collaborate with iconic brands like McDonald's, Samsung, and Mercedes-Benz, providing precise audience insights to help them target their ideal customers effectively. Upholding ethical data practices and privacy rights is at the core of our operations, and we are committed to ensuring outstanding performance and reliability in all our systems.

Working at Blis means being part of an international company with a diverse culture, spanning four continents and comprising over 300 team members. Headquartered in the UK, we are financially successful and poised for continued growth, offering you an exciting opportunity to contribute to our journey.

Your primary responsibility as a Data Engineer will involve designing and implementing high-performance data pipelines on Google Cloud Platform (GCP) to handle massive amounts of data efficiently. With a focus on scalability and automation, you will play a crucial role in building secure pipelines that can process over 350GB of data per hour and respond to 400,000 decision requests each second. Your expertise will be instrumental in driving improvements in data architecture, optimizing resource utilization, and delivering fast, accurate insights to stakeholders.

Collaboration is key at Blis, and you will work closely with product and engineering teams to ensure that our data infrastructure evolves to support new initiatives seamlessly. Additionally, you will mentor and support team members, fostering a collaborative environment that encourages knowledge sharing, innovation, and professional growth.

To excel in this role, you should have at least 5 years of hands-on experience with large-scale data systems, with a strong focus on designing and maintaining efficient data pipelines. Proficiency in Apache Druid and Imply platforms, along with expertise in cloud-based services like GCP, is essential. You should also have a solid understanding of Python for building and optimizing data flows, as well as experience with data governance and quality assurance practices.

Furthermore, familiarity with event-driven architectures, tools like Apache Airflow, and distributed processing frameworks such as Spark will be beneficial. Your ability to apply complex algorithms and statistical techniques to large datasets, along with experience working with relational databases and non-interactive reporting solutions, will be a valuable asset in this role.

Joining the Blis team means engaging in high-impact work in a data-intensive environment, collaborating with brilliant engineers, and being part of an innovative culture that prioritizes client obsession and agility. With a global reach and a commitment to diversity and inclusion, Blis offers a dynamic work environment where your contributions can make a tangible difference in the world of advertising technology.
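For flavor only, here is a tiny sketch of one GCP pipeline step of the sort this role involves: loading processed events from Cloud Storage into BigQuery with the google-cloud-bigquery client. The project, bucket, dataset, and table names are hypothetical assumptions, not Blis's actual infrastructure.

```python
# Illustrative only: append a day's processed Parquet files to a BigQuery table.
from google.cloud import bigquery

client = bigquery.Client()

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.PARQUET,
    write_disposition=bigquery.WriteDisposition.WRITE_APPEND,
)

load_job = client.load_table_from_uri(
    "gs://example-events/processed/2024-01-01/*.parquet",  # hypothetical bucket
    "example-project.analytics.events",                    # hypothetical table
    job_config=job_config,
)
load_job.result()  # block until the load finishes; raises on failure
```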

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

About the Company: Birlasoft is a powerhouse where domain expertise, enterprise solutions, and digital technologies converge to redefine business processes. The company takes pride in its consultative and design-thinking approach, driving societal progress by enabling customers to run businesses with unmatched efficiency and innovation. As part of the CK Birla Group, a multibillion-dollar enterprise, Birlasoft boasts a 12,500+ professional team committed to upholding the Group's 162-year legacy. The company's core values prioritize Diversity, Equity, and Inclusion (DEI) initiatives, along with Corporate Sustainable Responsibility (CSR) activities, demonstrating a dedication to building inclusive and sustainable communities. Join Birlasoft in shaping a future where technology seamlessly aligns with purpose.

About the Role: Birlasoft is seeking a seasoned Data Functional Analyst with a strong background in the Projects domain to join the team.

Responsibilities:
- Ensure data quality during migration from Oracle EBS to Oracle Fusion within the Projects domain, maintaining data integrity and accuracy.
- Collaborate with business stakeholders to gather data quality requirements and align them with organizational goals.
- Implement data quality improvement strategies, providing support and guidance to the technical team.
- Monitor quality-improvement progress, address issues promptly, and facilitate stakeholder sessions for a smooth transition to Oracle Fusion.

Qualifications: Any Graduation

Required Skills:
- In-depth knowledge of data models, methodologies, and best practices, especially for Projects-domain data such as project definitions, tasks, budgets, resources, time and expenses, billing, project controls, and performance reporting.
- Good understanding of data quality standards and the ability to implement and enforce them.
- Strong interpersonal skills and relevant Oracle certifications in the Projects domain.
- Prior experience in maintaining high-level data quality during a data migration through data cleansing, standardization, new data construction, validation, and de-duplication (a small sketch of this kind of work follows this listing).
- Strong understanding of data systems, databases, and data governance.

Preferred Skills:
- Knowledge of industry best practices in data management.
- Experience with data modeling and database design.
- Ability to communicate complex data concepts effectively.

Equal Opportunity Statement: Birlasoft is committed to diversity and inclusivity in all aspects of the company.
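To make the cleansing and de-duplication bullet above concrete, here is a small, purely illustrative pandas sketch of that kind of migration hygiene. The file names, column names, and rules are hypothetical, not Birlasoft's or Oracle's actual migration tooling.

```python
# Illustrative only: standardize, validate, and de-duplicate a project extract.
import pandas as pd

projects = pd.read_csv("ebs_project_extract.csv")  # hypothetical EBS extract

# Standardize the business key before matching.
projects["project_number"] = projects["project_number"].str.strip().str.upper()

# Validate: set aside rows missing a mandatory attribute.
missing_owner = projects["project_owner"].isna()
rejects = projects[missing_owner]

# De-duplicate on the business key, keeping the most recently updated row.
clean = (
    projects[~missing_owner]
    .sort_values("last_update_date")
    .drop_duplicates(subset="project_number", keep="last")
)

clean.to_csv("fusion_project_load.csv", index=False)   # ready for Fusion load
rejects.to_csv("rejected_projects.csv", index=False)   # for remediation
```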

Posted 1 week ago

Apply

3.0 - 12.0 years

0 Lacs

Karnataka

On-site

As a Data Governance Consultant at KPMG in Bangalore, you will play a key role in developing and implementing data governance strategies and frameworks. Your responsibilities will include leading data quality management initiatives, managing metadata, collaborating with stakeholders on change management processes, and providing guidance on best practices to internal teams.

To succeed in this role, you should have a minimum of 3 years of experience in a data governance role, with total experience ranging from 3 to 12 years. Proficiency in data governance concepts, frameworks, and data quality management principles is essential. Experience with metadata management tools and change management processes will be beneficial. Excellent communication and stakeholder management skills, and the ability to work effectively in a cross-functional team environment, are also required.

KPMG offers a competitive salary package, health insurance coverage, opportunities for professional development and growth, and a dynamic and collaborative work environment. If you have a background in data governance practices and tools and are passionate about driving data quality and implementing data governance frameworks, we invite you to join our team in Bangalore. This is an Equal Opportunity Employer, and we encourage candidates with a full-time education background in B.E/B.Tech/BCA/BBA/MBA/MCA to apply.

Posted 1 week ago

Apply