
53 Data Platform Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

8.0 - 10.0 years

15 - 25 Lacs

Bengaluru

Work from Office

About The Role
The MDM and Data Platforms Manager will oversee the strategy, design, implementation, and governance of enterprise master data and data platforms for ELGi. The role will drive enterprise-wide adoption of MDM frameworks, optimize data platforms for analytics and operational use, and ensure seamless data integration across the organization to support business operations and digital transformation.

Role & responsibilities

Master Data Management (MDM)
- Develop and implement the enterprise-wide MDM strategy, ensuring alignment with business goals and data governance frameworks.
- Establish and enforce standards, policies, and processes for master data domains (e.g., product, customer, supplier, and asset data).
- Oversee the design, implementation, and maintenance of MDM solutions and tools (e.g., Informatica MDM, SAP MDG, or equivalent).
- Lead initiatives to cleanse, de-duplicate, and standardize master data across systems to ensure quality, accuracy, and consistency.
- Collaborate with business stakeholders to define data ownership, governance roles, and data stewardship processes.
- Monitor and measure MDM effectiveness using KPIs, driving continuous improvement in data quality and lifecycle management.

Data Platforms Strategy and Management
- Lead the design, implementation, and operation of cloud-based data platforms, data lakes, and modern data architectures (e.g., Azure Data Services, AWS, or Snowflake).
- Ensure data platforms meet scalability, performance, and security requirements to support enterprise data, analytics, and operational needs.
- Drive initiatives for real-time data ingestion, storage, and transformation pipelines using tools such as Databricks, Apache Spark, or Kafka.
- Oversee data integration across enterprise systems (ERP, CRM, MES, PLM, etc.) to ensure a seamless flow of trusted data for analytics and operations.
- Enable self-service data access through optimized data catalogs, APIs, and visualization tools to empower business teams.

Data Governance and Quality Management
- Partner with the data governance office to implement data quality frameworks and automated controls that ensure the accuracy, completeness, and timeliness of enterprise data.
- Define and monitor master data KPIs, ensuring data governance and ownership are embedded into business operations.
- Drive compliance with data regulations and policies (e.g., GDPR, CCPA) across MDM and data platform environments.

Collaboration and Business Enablement
- Collaborate with IT, business leaders, and data teams to identify opportunities for leveraging master data and data platforms to improve operations and decision-making.
- Partner with analytics and application teams to ensure data platforms meet reporting, AI/ML, and operational requirements.
- Act as the subject matter expert for MDM and data platforms, providing guidance to cross-functional teams and senior stakeholders.

Team Leadership and Delivery
- Lead and mentor a team of data engineers, MDM analysts, and platform administrators, ensuring delivery of high-quality solutions.
- Drive continuous upskilling of the team on modern tools, technologies, and best practices.
- Manage project delivery, budgets, and resources for MDM and data platform initiatives, ensuring on-time, within-scope execution.

Skills
- Strong experience in implementing and managing MDM solutions (e.g., Informatica MDM, SAP MDG, TIBCO EBX).
- Expertise in cloud-based data platforms (Azure Data Services, AWS Redshift, Snowflake) and big data technologies.
- Proficiency in ETL tools, real-time data processing, and data pipeline design (e.g., Apache Spark, Databricks, Talend, or Kafka).
- Deep knowledge of data modeling, metadata management, and data integration strategies.
- Familiarity with enterprise systems (ERP, CRM, MES) and their master data structures.
- Proven ability to develop and execute strategies for master data management and modern data platform implementation.
- Strong analytical skills to measure data quality, platform performance, and business value delivered.
- Experience leading cross-functional teams and collaborating with business stakeholders to drive enterprise data initiatives.
- Strong communication, stakeholder management, and problem-solving skills.
- Knowledge of data governance frameworks, data quality principles, and regulatory compliance requirements.

Experience and Educational Qualifications
- 8-10 years of experience in data management, with a focus on MDM, data platforms, and governance.
- Proven track record of implementing and optimizing enterprise MDM solutions and cloud-based data platforms.
- Experience working in a global manufacturing environment with large, complex datasets and enterprise systems.
- Bachelor's degree in Computer Science, Data Science, Information Systems, or a related field (Master's degree preferred).
- Certifications (preferred): MDM certifications (e.g., Informatica MDM, SAP MDG); cloud certifications (e.g., Azure Data Engineer, AWS Data Analytics, Snowflake Architect); data governance or quality certifications (e.g., CDMP, DAMA-DMBOK).
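The cleanse/de-duplicate/standardize responsibility mentioned in this posting is easy to picture with a small sketch. This is illustrative Python, not ELGi's actual tooling: the record fields, normalization rules, and "first record wins" survivorship rule are all assumptions.

```python
# Minimal sketch of master-data standardization and de-duplication.
# Field names, normalization rules, and survivorship rule are illustrative.

def normalize(value: str) -> str:
    """Standardize a free-text field: trim, collapse whitespace, upper-case."""
    return " ".join(value.split()).upper()

def dedupe(records: list) -> list:
    """Keep one 'golden' record per normalized (name, city) key."""
    golden = {}
    for rec in records:
        key = (normalize(rec["name"]), normalize(rec["city"]))
        # Survivorship rule: the first record wins; later duplicates are dropped.
        golden.setdefault(key, rec)
    return list(golden.values())

customers = [
    {"name": "Acme  Corp", "city": "Bengaluru"},
    {"name": "ACME CORP", "city": "bengaluru "},
    {"name": "Globex", "city": "Pune"},
]
print(len(dedupe(customers)))  # 2 golden records survive
```

Real MDM tools add fuzzy matching and configurable survivorship policies, but the shape of the problem is the same normalize-then-match loop.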

Posted 1 day ago

Apply

13.0 - 22.0 years

0 - 0 Lacs

Bengaluru

Hybrid

This role will be based in Bangalore, India. At LinkedIn, we trust each other to do our best work where it works best for us and our teams. This role offers a hybrid work option, meaning you can both work from home and commute to a LinkedIn office, depending on what's best for you and when it is important for your team to be together.

Responsibilities:
- Deliver impact by driving innovation while building and shipping software at scale
- Provide architectural guidance and mentorship to up-level the engineering organization
- Actively improve the level of craftsmanship at LinkedIn by developing best practices and defining best strategies
- Design products/services/tools and code that can be used by others while upholding the operational impact of all decisions
- Identify problems and opportunities and lead teams to architect, design, implement, and operationalize systems
- Work closely with and influence product and/or technology partners regularly to help define the roadmap
- Resolve conflicts between teams within the organization to get alignment and build team culture
- Review others' work and share knowledge

Basic Qualifications:
- BA/BS degree in Computer Science or a related technical discipline, or equivalent practical experience
- 14+ years of industry experience in software design, development, and algorithm-related solutions
- 14+ years of experience programming in object-oriented languages such as Java, C++, Python, Go, or C#, and/or functional languages such as Scala or other relevant coding languages
- Hands-on experience developing distributed systems, large-scale systems, databases, and/or backend APIs
- 5+ years of experience in an architect or technical leadership position

Preferred Qualifications:
- 5+ years of experience with large-scale distributed systems and client-server architectures
- Experience leading a large organization
- Experience leading by influence, formulating and driving technical strategies across various organizations

Suggested Skills:
- Distributed systems
- Backend Systems Infrastructure
- Java

Posted 2 days ago

Apply


5.0 - 10.0 years

18 - 22 Lacs

Hyderabad, Bengaluru

Work from Office

We're Hiring: PL/SQL Developer (5-10+ Years Experience) | Contract Roles | Work From Office (WFO)

Are you an experienced PL/SQL Developer looking for your next challenge? Join our dynamic team and contribute to exciting projects across various domains. We are hiring for multiple contract positions in Work From Office (WFO) mode.

Role Summary: We are seeking a skilled PL/SQL Developer to design, develop, and maintain database applications using Oracle PL/SQL. The ideal candidate will have strong experience in writing complex queries and procedures and in performance tuning to support business applications and data processing.

Key Responsibilities:
- Develop and maintain PL/SQL packages, procedures, functions, and triggers.
- Optimize SQL queries for performance and scalability.
- Design and implement database solutions to support application development.
- Collaborate with application developers and business analysts to understand requirements.
- Perform data analysis, extraction, and transformation tasks.
- Ensure data integrity and security across systems.
- Debug and troubleshoot database-related issues.
- Participate in code reviews and provide feedback on database design.
- Create and maintain technical documentation for database processes.

Required Skills:
- Strong proficiency in Oracle PL/SQL and SQL.
- Experience with Oracle database architecture and tools.
- Knowledge of performance tuning and query optimization.
- Familiarity with data modeling and relational database design.
- Experience with version control tools (e.g., Git).
- Good understanding of the software development lifecycle (SDLC).
- Strong analytical and problem-solving skills.
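As a rough illustration of the query-optimization work this role involves, here is a Python sketch using the standard library's sqlite3 as a stand-in for Oracle; the table, data, and index names are invented, and Oracle's own tooling (EXPLAIN PLAN, AWR) would be used in practice.

```python
# Illustrative query-tuning sketch: sqlite3 stands in for Oracle here, purely
# to show how an index changes a query plan; all names and data are invented.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, amount REAL)"
)
conn.executemany(
    "INSERT INTO orders (customer_id, amount) VALUES (?, ?)",
    [(i % 100, i * 1.5) for i in range(1000)],
)

query = "SELECT * FROM orders WHERE customer_id = 7"

# Without an index, this predicate forces a scan of the whole table.
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())

# An index on the filtered column lets the planner seek instead of scan.
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
print(conn.execute("EXPLAIN QUERY PLAN " + query).fetchall())
```

The same reasoning carries over to Oracle: inspect the plan, find full scans on selective predicates, and index (or rewrite) accordingly.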

Posted 2 days ago

Apply

4.0 - 9.0 years

20 - 30 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Define the overall AWS architecture for DTA (compute, networking, storage, data plane, messaging, CI/CD). Create ArchiMate deployment/instance diagrams and map infrastructure to logical components.

Required candidate profile: Implement IaC (Terraform / AWS CloudFormation / CDK) for repeatable, auditable environments (dev/test/stage/prod). Build automated provisioning pipelines and guardrails.
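The IaC requirement here can be sketched in plain Python: one function renders a CloudFormation template per environment, which is what keeps dev/test/stage/prod repeatable and auditable. The resource names and properties below are invented for illustration, not the DTA environment's actual layout.

```python
# Minimal IaC sketch: emit a CloudFormation template as JSON from Python.
# Bucket names and properties are illustrative assumptions.
import json

def make_template(env: str) -> dict:
    """One parameterized template reused across dev/test/stage/prod."""
    return {
        "AWSTemplateFormatVersion": "2010-09-09",
        "Description": f"Data platform storage for the {env} environment",
        "Resources": {
            "DataBucket": {
                "Type": "AWS::S3::Bucket",
                "Properties": {
                    "BucketName": f"dta-data-{env}",
                    # Guardrail baked into every environment: block public access.
                    "PublicAccessBlockConfiguration": {
                        "BlockPublicAcls": True,
                        "RestrictPublicBuckets": True,
                    },
                },
            }
        },
    }

for env in ("dev", "test", "stage", "prod"):
    print(json.dumps(make_template(env), indent=2)[:80])
```

Terraform or CDK would express the same idea with richer abstractions; the point is that environments differ only by a parameter, never by hand edits.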

Posted 3 days ago

Apply

2.0 - 6.0 years

0 Lacs

Karnataka

On-site

Trelleborg is a world leader in engineered polymer solutions for almost every industry on the planet. Our talents have brought us to where we are today by specializing in polymer engineering that drives innovation and application possibilities. Collaborating closely with leading industry brands, we accelerate performance, drive businesses forward, and shape the industry for the benefit of humankind in the exciting years ahead. Join us to be a part of Shaping Industry from the Inside.

If you are a talented individual aspiring to build business skills, gain experience, and tackle exciting challenges, Trelleborg offers you the opportunity to grow your career and shape the industry from within. We are currently seeking an IoT Infrastructure Analyst to join our IT Innovation Campus in Bangalore. This role involves assisting in the implementation of Cloud and IoT systems across the Trelleborg group of companies. You will contribute to defining and validating platform architectures proposed by BA IT and Group IT development teams. As an ideal candidate, you should possess a strong background in the development, deployment, architectural design, and support of large Cloud/IoT solutions, including global-scale implementations, and have experience in project design roles.

Your responsibilities will include:
- Liaising with stakeholders to translate requirements and ensure implementation according to best architectural standards
- Maintaining and enhancing corporate standards related to IoT and Cloud Infrastructure services
- Assisting BA IT developers in aligning with Group-approved design patterns
- Building quick prototypes using various IoT tool sets to showcase concept viability
- Performing documentation, supporting implementation, and adhering to new and improved IT and OT processes following Trelleborg IT standards
- Identifying and diagnosing issues specific to Azure Services or the Azure Platform

The ideal candidate should have education and experience in technical computer and networking service support. An Associate's degree in computer science or a Bachelor's degree is preferred, although equivalent combinations of education, experience, and certification will be considered. A minimum of 2 years of experience in a global IT organization supporting Microsoft and Cloud Infrastructure Platforms is required, with a preference for Azure Cloud experience. Essential experience includes scaling cloud infrastructure services for enterprise applications, working in a culturally diversified, fast-paced environment, developing and deploying IoT devices in a global enterprise environment, implementing, documenting, and operationally maintaining IT infrastructure, interacting with and managing external suppliers and service providers, and understanding cloud computing technologies across Windows and Linux.

At Trelleborg, we prioritize your career progression and personal development. Your contributions are valued, and your professional aspirations are actively supported in our vibrant and dynamic work environment. Committed to innovation, excellence, and ecological sustainability, we ensure that your efforts contribute not only to our organizational achievements but also to global technological advancements. Join us at Trelleborg to face stimulating challenges, ensure your growth, and flourish in your career.

Trelleborg is an equal opportunity employer that celebrates diversity and is dedicated to creating an inclusive environment for all employees. Reasonable accommodations will be provided to individuals with disabilities during the job application process, the interview process, crucial job functions, and other benefits and privileges of employment upon request. For any inquiries, feel free to reach out to our HR team: Contact: Ashwini, Email: ashwini.venkatesh@trelleborg.com. Join us at Trelleborg, where our people are #ShapingIndustryfromtheInside

Posted 4 days ago

Apply

12.0 - 20.0 years

0 - 0 Lacs

Pune

On-site

As a hiring partner for many IT organizations, we are hiring for a CDP Platform Data Architect. This is a direct, full-time role on the payroll of the hiring organization. Interested candidates can share a Word-format resume with CTC, notice period, and location details at: info@unimorphtech.com

Role: Sr. Data Architect - CDP / Customer Data Platform Solution Architect
Location: Pune
Experience: 12+ Yrs

# Roles and Responsibilities: Sr. Data Architect (CDP | ETL | Big Data | Cloud)
- As a Sr. Data Architect, the candidate is responsible for performing and managing all data activities and transformation.
- Sound knowledge of CDP (customer data platform), data models, design, and integration.
- Analyze and understand the incoming data into the CDP; transform and migrate structured and unstructured data from the CDP to an RDBMS.
- Understand the data patterns and data flow into the CDP.
- Design the process, roadmap, and strategy, and execute the data migration.
- Sound knowledge as a cloud data architect, including designing data infrastructure strategies, data modeling, migration, etc.
- Perform on-prem to cloud migration (AWS, Azure, GCP).
- Sound knowledge of big data techniques and tools.
- Sound knowledge of GDPR, data security, data encryption, and data tokenization.
- Good to have: knowledge of data management and DBA skills, including operations and maintenance.
- Execute operations on multiple RDBMSs: Oracle, SQL Server, Postgres, MariaDB, MySQL, Snowflake, etc.
- Sound knowledge of NoSQL (MongoDB, GraphDB, and cloud NoSQL DBs).
- Responsible for day-to-day DBA activities and assisting the team.
- Lead performance tuning and benchmarking activities.
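The data-tokenization item in this posting can be sketched as follows. The in-memory "vault" dictionaries and the token format are assumptions purely for illustration; production systems keep the vault in a hardened, access-controlled service.

```python
# Sketch of data tokenization for sensitive values (PII), as mentioned
# alongside GDPR above. The in-memory vault is illustrative only.
import secrets

_vault = {}    # token -> original value
_reverse = {}  # original value -> token, so tokens are stable per value

def tokenize(value: str) -> str:
    """Replace a sensitive value with a random token, recoverable only via the vault."""
    if value in _reverse:
        return _reverse[value]
    token = "tok_" + secrets.token_hex(8)
    _vault[token] = value
    _reverse[value] = token
    return token

def detokenize(token: str) -> str:
    """Look the original value back up; only code with vault access can do this."""
    return _vault[token]

t = tokenize("alice@example.com")
print(t)
```

Unlike encryption, the token carries no mathematical relationship to the original value, which is why tokenization is attractive for scoping down GDPR-sensitive data flows.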

Posted 5 days ago

Apply

0.0 - 4.0 years

0 Lacs

Karnataka

On-site

Build the future of the AI Data Cloud. Join the Snowflake team. At Snowflake, we are at the forefront of the data revolution, committed to building the world's greatest data and applications platform. Our get it done culture allows everyone at Snowflake to have an equal opportunity to innovate on new ideas, create work with a lasting impact, and excel in a culture of collaboration. Snowflake's pre-sales organization is actively seeking an Associate Solution Engineer to join the Sales Engineering training program Snowmaker. Snowmaker's purpose is to develop aspiring technical talent through a combination of education and mentorship. This six-month program offers comprehensive technical and sales skills training through a combination of classroom sessions, shadowing, and mentoring from sales and pre-sales leaders and peers. In this role, you will have the opportunity to learn Snowflake's technology portfolio, the needs and business challenges of customers in every industry, and Snowflake's sales process to address them. You'll do so while applying your technical aptitude, exceptional communication, and creative problem-solving skills on a daily basis. Upon successful completion of the program, you will join our regional Sales Engineering organization and contribute to the success of the team. 
Upon successful completion of training, you will get to: - Present Snowflake technology and vision to executives and technical contributors at prospects and customers - Leverage knowledge of a domain or industry to align Snowflake's value to the customer's business and technical problems - Work hands-on with SEs, prospects, and customers to demonstrate and communicate the value of Snowflake technology throughout the sales cycle, from demo to proof of concept to design and implementation - Maintain a deep understanding of competitive and complementary technologies and vendors and how to position Snowflake in relation to them - Collaborate with Product Management, Engineering, and Marketing to continuously improve Snowflake's products and marketing - Provide ongoing, post-sales, technical guidance to the customer's technical team to drive customer utilization of Snowflake and digital transformation success - Contribute to global and regional Sales Engineering initiatives On day one, we will expect you to have: - A deep interest in translating customer needs and problems into technical solutions - A passion for technology, a willingness to learn, and the ability to work in a fast-paced work environment - Ability to present technical topics to a variety of audiences (technical, business, executive) via a whiteboard session or using presentations and demos - A university degree in Computer Science, Engineering, Mathematics, or related fields, or equivalent experience is preferred - Industry or internship experience with a focus on data analytics, pre-sales, solution architecture, or data engineering - Hands-on experience with SQL, Python, Scala, Spark, Java, cloud technology, data platform, or data analytics experience is a bonus - Have a strong desire to pursue a career in Sales Engineering Snowflake is growing fast, and we're scaling our team to help enable and accelerate our growth. 
We are looking for people who share our values, challenge ordinary thinking, and push the pace of innovation while building a future for themselves and Snowflake. How do you want to make your impact? For jobs located in the United States, please visit the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

The Data Quality Analyst will collaborate with business stakeholders, Data Science, and wider data teams to enhance data quality throughout the organization and ensure data credibility in its usage. You will be responsible for developing a robust framework for data quality to uphold data integrity for regulatory and strategic needs. You will identify and address potential data quality issues at all stages of the data lifecycle and monitor data quality performance using tools and processes to maintain the highest standards.

In this role, you will work in close coordination with data stewards to resolve data integrity issues and guarantee the delivery of high-quality data. Additionally, you will closely collaborate with the data platform team and stakeholders to contribute to the implementation of the data quality framework and roadmap. It is essential to align data quality initiatives with the overall data governance strategies.

As a Data Quality Analyst, you will perform detailed root cause analysis of data issues and provide recommendations for preventing future defects. You will propose enhancements to streamline processes and enhance data management. You will also be responsible for implementing data quality rules in data quality tools to ensure compliance with enterprise data quality standards and requirements.

Furthermore, you will advocate for high-quality data, ensuring that valuable data is governed, compliant, and delivers optimal value by identifying and resolving issues. You will also play a key role in contributing to data management KPI reporting by maintaining data quality scores.
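Implementing data quality rules and maintaining data quality scores, as this role describes, often boils down to a rule registry evaluated against incoming rows. A minimal sketch, with invented rule names, fields, and sample rows:

```python
# Minimal data-quality sketch: named rules scored against a batch of rows.
# Rule names, fields, and the sample data are illustrative assumptions.

RULES = {
    "customer_id_present": lambda row: row.get("customer_id") not in (None, ""),
    "amount_non_negative": lambda row: row.get("amount", 0) >= 0,
}

def run_checks(rows):
    """Return a data-quality score per rule: the fraction of rows passing."""
    scores = {}
    for name, rule in RULES.items():
        passed = sum(1 for row in rows if rule(row))
        scores[name] = passed / len(rows)
    return scores

rows = [
    {"customer_id": "C1", "amount": 10.0},
    {"customer_id": "", "amount": -5.0},
    {"customer_id": "C3", "amount": 2.5},
]
print(run_checks(rows))
```

Commercial data-quality tools express the same idea declaratively (completeness, validity, timeliness rules) and track the scores over time for the KPI reporting mentioned above.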

Posted 3 weeks ago

Apply

3.0 - 6.0 years

7 - 10 Lacs

Hyderabad

Remote

Job Type: C2H (Contract to Hire)

As a Data Engineer, you will work in a diverse, innovative team, responsible for designing, building, and optimizing the data infrastructure and pipelines for our new healthcare company's data platform. You'll architect and construct our core data backbone on a modern cloud stack, enabling the entire organization to turn complex data into life-saving insights. In this role, you will have the opportunity to solve challenging technical problems, mentor team members, and collaborate with innovative people to build a scalable, reliable, and world-class data ecosystem from the ground up.

Core Responsibilities (essential job duties and responsibilities):
- Design, develop, and maintain data replication streams and data flows to bring data from various SAP and non-SAP sources into Snowflake.
- Implement curated datasets on a modern data warehouse and data hub.
- Interface directly with business and systems subject matter experts to understand analytic needs and determine logical data model requirements.
- Work closely with data architects and senior analysts to identify common data requirements and develop shared solutions.
- Support data integration engineers and data modelers.
- Support and maintain data warehouse, ETL, and analytics platforms.

Required Skills and Experience:
- Data warehouse and ETL background
- Advanced SQL programming capabilities
- Background in preparing data for analysis and reporting
- Familiarity with data governance principles and tools
- Success in a highly dynamic environment, with the ability to shift priorities with agility
- Ability to go from whiteboard discussion to code
- Willingness to explore and implement new ideas and technologies
- Ability to effectively communicate with technical and non-technical audiences
- Ability to work independently with minimal supervision

Minimum Qualifications:
- 4+ years of experience with SQL; Snowflake strongly preferred.
- 3+ years of experience with SAP Datasphere.
- 2+ years of experience working directly with subject matter experts in both business and technology domains.
- 2+ years of experience with ERP data, preferably SAP S/4, MS Dynamics, and/or BPCS.
- 1+ year of experience with Salesforce, Workday, Concur, or any other enterprise application.

Nice to have:
- Experience with machine learning tools and processes
- Hands-on experience with Python
- Experience with Infrastructure as Code (IaC) principles and tools (e.g., Terraform, CloudFormation)

Education: Bachelor's in Computer Science, Information Systems, Engineering, a science discipline, or similar.
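The replicate-then-curate flow this posting describes can be sketched as a tiny extract-transform-load pipeline. MATNR, WERKS, and MENGE are conventional SAP field names (material, plant, quantity), but the rows and the curated target schema here are invented for illustration.

```python
# Tiny extract-transform-load sketch for SAP-style source data.
# The rows and curated schema are illustrative assumptions.

def extract():
    # Stand-in for a replication stream out of a source system.
    return [
        {"MATNR": "000123", "WERKS": "IN01", "MENGE": "10"},
        {"MATNR": "000456", "WERKS": "IN02", "MENGE": "3"},
    ]

def transform(rows):
    # Curate: rename cryptic source fields, strip padding, cast types.
    return [
        {
            "material_id": r["MATNR"].lstrip("0"),
            "plant": r["WERKS"],
            "quantity": int(r["MENGE"]),
        }
        for r in rows
    ]

def load(rows, target):
    # Stand-in for a Snowflake COPY/INSERT into a curated table.
    target.extend(rows)
    return len(rows)

warehouse = []
loaded = load(transform(extract()), warehouse)
print(loaded, warehouse[0])
```

In practice the extract step is a replication tool and the load step is a warehouse bulk-load, but the curation layer in the middle is exactly this kind of renaming, cleaning, and typing.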

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As a Business Requirements Analyst at LSEG, you will play a crucial role in the development and communication of Business Requirements for the Data Platform. Your responsibilities will include collaborating with Product Managers to create detailed maps of platform use cases, defining functional taxonomy, workflows, object definitions, and states, as well as specifying volumetric and non-functional requirements. You will work closely with programme collaborators to establish clear business success criteria and maintain a traceability model to ensure regulatory compliance. Throughout the delivery lifecycle, you will support design decisions, development, and testing activities, acting as a proxy-product owner when necessary. Building trust with product management and delivery Squads will be key to enabling flawless delivery. To excel in this role, you should have industry experience in developing business requirements for major transformations in financial services. Proficiency in standard Business Analysis methods and tools such as use case mapping, requirements gathering, data modeling, and user story mapping is essential. Your ability to thrive in a technically sophisticated and evolving environment, respond calmly to changing requirements, and exhibit strong planning and organization skills will be critical. Excellent communication skills, both written and verbal, are necessary for presenting and explaining issues logically. Furthermore, you should be a dedicated team player capable of working independently and making clear decisions in complex situations. Desired skills for this role include meticulous attention to detail, determination to focus on key business outcomes, experience in an agile DevOps environment, technical business analysis experience in the Financial Services industry, and familiarity with Microsoft cloud products and services. Prior experience in a large consulting firm is preferable. 
At LSEG, we are committed to fostering a diverse and inclusive organization that values individuality and encourages new ideas. Joining us means being part of a global team of 25,000 people across 70 countries, where your unique perspective will be welcomed. We strive to create a collaborative and creative culture focused on sustainability and driving economic growth. Additionally, we offer a range of benefits and support, including healthcare, retirement planning, paid volunteering days, and wellbeing initiatives.

As a Recruitment Agency Partner, it is your responsibility to ensure that candidates applying to LSEG are aware of our privacy notice, which outlines how we handle personal information and your rights as a data subject.

Posted 3 weeks ago

Apply

8.0 - 13.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Technical Skills:
- Deep knowledge of cloud technologies (AWS, Azure) and cloud-native architectures.
- Strong expertise in system integration patterns, microservices architecture, RESTful APIs, and the functioning of middleware technologies.
- Experience with Business Intelligence (BI) tools and data platforms (e.g., Power BI, Data Fabric, Talend ETL, etc.).
- Knowledge of enterprise software, databases, and data modeling (SQL, NoSQL).
- Proficiency in development frameworks and programming languages such as PHP, React, HTML, CSS, and JavaScript.
- Exposure to programming languages relevant to BI applications desired (e.g., Python, R, or .NET).
- Experience developing and deploying solutions in AWS Cloud is preferred.

Design and Architecture:
- Analytical mindset with the ability to analyze complex issues and provide effective solutions.
- Strong understanding of system design patterns, enterprise architecture frameworks (e.g., TOGAF), and architectural principles.

Experience & Education: Bachelor's degree in Computer Science, Engineering, or a related field; a Master's degree is a plus.

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

Are you someone with a relentless drive for perfection, always seeking to make things better? If so, you'll find a kindred spirit in Ford Quality. We're passionate about continuous improvement, constantly striving to deliver the highest-quality products and services our customers deserve. Join us and become a key player in driving operational excellence. You'll contribute to innovative, proprietary initiatives like our Global Product Development System, Quality Operating System, and New Model Launch processes. This role offers fantastic cross-functional exposure, as you'll collaborate closely with integrated teams across Manufacturing, Product Development, Purchasing, Marketing, Sales, and Service.

In this exciting role, you'll be at the heart of our data-driven decision-making, analyzing vast amounts of data to pinpoint opportunities for improvement. Your insights will directly enhance quality performance and elevate the customer experience with our products. We truly believe that data holds immense power to help us create exceptional products and experiences that delight our customers. By providing actionable, persistent insights from a high-quality data platform, you'll empower our business and engineering teams to make even more impactful decisions.

Posted 4 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Karnataka

On-site

As the Senior Product Manager for Platform at Enterpret, you will play a crucial role in defining and executing the strategic vision, product strategy, and roadmap for the core platform. This platform serves as the foundation of Enterpret's product, consolidating customer feedback from various sources and transforming it into valuable insights through the Knowledge Graph infrastructure and Adaptive Taxonomy engine, among others.

Your key responsibilities will include driving the strategy by leading the product vision, roadmap, and overall platform development. You will be responsible for delivering a robust and performant platform that provides near real-time, high-quality, predictive insights to customers while ensuring developer productivity and customer satisfaction at scale. Collaboration with engineering and product leadership is essential to make architectural decisions that enhance performance, scalability, reliability, security, and cost efficiency. You will also work cross-functionally to understand platform needs across different product teams, align on roadmap dependencies, and ensure the platform continues to support and accelerate overall product development.

Translating complex technical concepts into clear product requirements and owning key success metrics such as latency, scalability, reliability, cost, and internal developer velocity will be part of your role. Additionally, you will invest in improving developer experience through observability, documentation, and tooling to facilitate faster and higher-quality development by Enterpret teams. As a champion of platform-as-a-product, you will promote the platform's capabilities internally and externally, ensuring that shared services are well-understood, adopted, and designed with a customer-centric and metrics-driven approach. Your role will be instrumental in driving Enterpret's platform to new heights and maintaining its position as a key asset in delivering trusted insights to customers.

Posted 4 weeks ago

Apply

10.0 - 20.0 years

25 - 40 Lacs

Noida, Hyderabad/Secunderabad, Bangalore/Bengaluru

Hybrid

Dear candidate, we found your profile suitable for our current opening. Please go through the JD below for a better understanding of the role.

Job Description
Role: Technical Architect / Senior TA
Experience: 10 - 15 years
Employment Type: Full-time
Mode of work: Hybrid (3 days WFO)
Work Location: Hyderabad/Bangalore/Noida/Pune/Kolkata

Role Overview: We are looking for a skilled Business Analyst with strong domain experience in Real Estate Investment Trusts (REITs), specifically in Mortgage-Backed Securities (MBS) and/or Capital Allocation. The ideal candidate will have exposure to data platforms or application development and be capable of translating business needs into actionable insights and technical requirements.

Key Responsibilities:
- Understand and analyze business processes in the REIT domain (MBS, capital allocation).
- Collaborate with stakeholders to gather and document requirements.
- Define domain models and mappings for data platforms.
- Work closely with data engineering and application development teams.
- Support product and platform enhancements through data-driven insights.
- Participate in stakeholder meetings, L2 interviews, and managerial discussions.

Required Skills:
- Strong domain knowledge in REITs, MBS, and/or Capital Allocation.
- Experience as a Business Analyst or in a similar analytical role.
- Exposure to data platforms or application development projects.
- Ability to define domain models and mappings.
- Good understanding of SQL (basic knowledge is acceptable; writing skills are preferred).
- Excellent communication and documentation skills.

Preferred Qualifications:
- Experience in financial services, investment platforms, or real estate analytics.
- Familiarity with tools like Power BI, Tableau, Snowflake, or similar.
- Comfortable working in remote teams and cross-functional environments.

How to Apply: Please share your updated resume highlighting relevant REIT domain experience, BA skills, and exposure to data platforms or app development. For organisation details, please visit https://www.tavant.com/. If interested, please send your resume to dasari.gowri@tavant.com.

Regards,
Dasari Krishna Gowri
Associate Manager - HR
www.tavant.com

Posted 4 weeks ago

Apply

9.0 - 14.0 years

19 - 34 Lacs

Bengaluru

Hybrid

This role reports to the Senior Product Manager. Master Reference Data (MRD) is the centralised taxonomy and data governance solution that defines how Euromonitor structures and combines its various data sources. It is the single source of truth that lays out the definitions of our taxonomy, enables seamless data integration across all our systems, and unlocks value for our clients by enabling all our data sources to be combined in any possible way.

The Senior Data Business Analyst will serve as the critical link between business stakeholders and technical teams: building in-depth knowledge of our various data sources; understanding our taxonomy challenges, client needs, ETL processes, and business objectives; analysing and documenting requirements; and working closely with architects and software engineers to design solutions for our data warehouse and master reference data that deliver scalable, high-quality data solutions that solve real user problems and align with business objectives.

Key responsibilities
1. Requirement Gathering and Analysis: Independently lead sessions with stakeholders and the senior product manager. Navigate complex requirements with autonomy. Gather, analyse, and document business requirements. Translate business requirements into functional specifications with clear acceptance criteria.
2. Solution Design and Implementation: Collaborate with architects and software engineers to clarify requirements and design solutions. Reconcile conflicting requirements from multiple stakeholders and design solutions that balance priorities and meet shared objectives. Conduct user acceptance testing (UAT) and coordinate with stakeholders for feedback and sign-off. Ensure consistency and traceability of data across systems.
3. Stakeholder and Team Management: Participate in sprint planning, backlog grooming, and all other ceremonies. Discuss alternatives, cost-benefit, and trade-offs, and make informed recommendations to ensure solutions aligned with requirements and objectives are delivered on time and on budget. Build strong relationships with stakeholders at all levels. Manage stakeholder expectations and provide regular updates. Communicate progress, issues, and solutions effectively.
4. Documentation and Training: Create and maintain comprehensive, detailed documentation, ensuring it is up to date and accessible. Provide training and support to end users.
5. Process Improvement: Utilise process modelling techniques to develop detailed process models and workflows. Implement process improvement frameworks to systematically identify and address inefficiencies in business processes.

The ideal candidates will demonstrate:
• A minimum of 8 years of experience as a Business Analyst, with recent experience specifically in data warehouse or data platform products, and demonstrated expertise in capturing and translating complex data requirements into functional and non-functional requirements with clear acceptance criteria and test cases within Agile teams.
• A deep understanding of data platform technologies, ETL processes, and dimensional modelling (must-have).
• Excellent communication and organisational skills; oral and written fluency in English.
• Proficiency in business analysis tools and methodologies, with the ability to produce high-quality documentation and artifacts that help stakeholders and the team understand requirements.
• The ability to manage multiple projects and priorities simultaneously and to deal with ambiguity and the conflicting interests of different stakeholders.

Desirable attributes:
• Experience with Azure DevOps.
• Basic/intermediate knowledge of data programming languages such as SQL, Python, or R.
• Experience with data visualization tools, preferably Power BI.
• A degree in Computer Science, Information Systems, Statistics, or a related field; a master's degree is a plus.

Posted 1 month ago

Apply

5.0 - 8.0 years

15 - 20 Lacs

Hyderabad

Work from Office

Job Title: Product Manager
Location: Hyderabad, India
Work Model: Hybrid (onsite 3 days/week)
Team Function: Product
Level: Mid-Senior

Job Description: We are seeking a skilled Product Manager to support our enterprise data initiatives. This is a hybrid offshore role based in Hyderabad, India, where you will collaborate with global cross-functional teams to optimize and manage menu data across our brand platforms.

Key Responsibilities:
- Partner with stakeholders across Data, Technology, and Business units to understand menu-related data needs and translate them into functional product requirements.
- Define user stories, acceptance criteria, and roadmaps aligned with business priorities and scalable data management practices.
- Collaborate with engineering and data teams to ensure successful integration, delivery, and QA of product solutions.
- Drive data consistency, governance, and standardization of menu attributes across systems and platforms.
- Serve as a subject matter expert on menu data management tools and workflows.
- Identify opportunities for process improvements and automation.

Qualifications:
- 4-7 years of experience in product management or data platform/product roles.
- Experience working with offshore/global teams in a hybrid model.
- Ability to define and prioritize product backlogs in collaboration with cross-functional teams.
- Proficiency in JIRA, Confluence, and other Agile product management tools.
- Strong analytical skills with the ability to translate business needs into technical requirements.
- Excellent communication and stakeholder management skills.

Preferred Skills:
- Experience in the restaurant, foodservice, or retail industry.
- Familiarity with data governance, data modeling, or enterprise data platforms.
- Knowledge of the Inspire Brands ecosystem or similar enterprise environments.

Posted 1 month ago

Apply

0.0 - 4.0 years

0 Lacs

karnataka

On-site

Build the future of the AI Data Cloud by joining the Snowflake team. Snowflake is at the forefront of the data revolution, committed to creating the world's greatest data and applications platform. Our "get it done" culture ensures that everyone at Snowflake has an equal opportunity to innovate on new ideas, create work with a lasting impact, and excel in a collaborative environment.

Snowflake's pre-sales organization is actively seeking an Associate Sales Engineer to join Snowmaker, our Sales Engineering training program. The purpose of Snowmaker is to nurture aspiring technical talent through a blend of education and mentorship. This six-month program provides comprehensive technical and sales skills training through classroom sessions, shadowing, and mentoring by sales and pre-sales leaders and peers. As an Associate Sales Engineer, you will have the chance to familiarize yourself with Snowflake's technology portfolio, understand the needs and business challenges of customers from various industries, and grasp Snowflake's sales process to address them. You will apply your technical aptitude, exceptional communication skills, and creative problem-solving abilities on a daily basis. Upon successful completion of the program, you will join our regional Sales Engineering team and contribute to its success.

Upon successful completion of the training, your responsibilities will include:
- Presenting Snowflake technology and vision to executives and technical contributors at prospects and customers
- Leveraging knowledge of a domain or industry to align Snowflake's value with the customers' business and technical problems
- Working hands-on with SEs, prospects, and customers to demonstrate and communicate the value of Snowflake technology throughout the sales cycle
- Maintaining a deep understanding of competitive and complementary technologies and vendors to position Snowflake effectively
- Collaborating with Product Management, Engineering, and Marketing to enhance Snowflake's products and marketing
- Providing post-sales technical guidance to the customers' technical teams to drive customer utilization of Snowflake and digital transformation success
- Contributing to global and regional Sales Engineering initiatives

On day one, we expect you to have:
- A deep interest in translating customer needs and problems into technical solutions
- A passion for technology, a willingness to learn, and the ability to thrive in a fast-paced work environment
- The ability to present technical topics to various audiences via whiteboard sessions, presentations, and demos
- A university degree in Computer Science, Engineering, Mathematics, or a related field; equivalent experience is preferred
- Industry or internship experience focusing on data analytics, pre-sales, solution architecture, or data engineering
- Hands-on experience with SQL, Python, Scala, Spark, Java, cloud technology, data platforms, or data analytics (bonus)
- A strong desire to pursue a career in Sales Engineering

Snowflake is experiencing rapid growth, and we are expanding our team to support and accelerate our development. We are seeking individuals who share our values, challenge conventional thinking, and drive innovation while building a successful future for themselves and Snowflake. Join us and make an impact today!

For jobs in the United States, please refer to the job posting on the Snowflake Careers Site for salary and benefits information: careers.snowflake.com

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

ahmedabad, gujarat

On-site

The Senior Data Engineer role requires 5 to 8 years of experience, with expertise in Azure Synapse and deep Azure data engineering skills. You will be responsible for designing and implementing data technology and modern data platform solutions within the Azure environment. Your key responsibilities will include collaborating with Data Architects, Presales Architects, and Cloud Engineers to deliver high-quality solutions, mentoring junior team members, and conducting research to stay updated with the latest industry trends. You will also be expected to develop and enforce best practices in data engineering and platform development.

We are looking for candidates with substantial experience in data engineering and Azure data services, strong analytical and problem-solving skills, proven experience working with diverse customers, and expertise in developing data pipelines, APIs, file formats, and databases. Familiarity with technologies such as Synapse, ADLS Gen2, Databricks, Azure Data Factory, Azure SQL, Key Vault, and Azure Security is essential. Experience with CI/CD practices, specifically within Azure DevOps, and agile delivery methods is preferred.

This is a full-time position based in Ahmedabad, India, with a hybrid work mode. The work schedule is Monday to Friday, day shifts. As a Senior Data Engineer, you will have the opportunity to contribute to the development of cutting-edge data solutions, support various teams within the organization, and play a key role in mentoring and guiding junior team members.

To apply for this position, please provide your notice period, current annual salary, expected annual salary, and current city of residence. The ideal candidate will have a minimum of 6 years of experience with Azure data services, Azure Synapse, Databricks, and Azure Data Factory. If you have a passion for data engineering, a drive for continuous learning, and a desire to work with innovative technologies, we encourage you to apply for this exciting opportunity.

Posted 1 month ago

Apply

10.0 - 14.0 years

0 Lacs

hyderabad, telangana

On-site

As an IT Project Manager/Architect for Data Platform & Monitoring within Global Operations and Supply Chain IT, your primary responsibility is to lead the architecture, technical implementation, and overall management of the data platform and monitoring program. Your role is critical in the planning and execution of a strategic program that includes developing a centralized data platform to consolidate manufacturing systems data across all sites and implementing robust observability and monitoring capabilities for global manufacturing systems and applications. Success in this role demands strong coordination and communication skills to work seamlessly across cross-functional teams, ensuring alignment with organizational objectives, timelines, and delivery standards. You will lead a team of 10-15 Global Operations Supply Chain team members in the core manufacturing and supply chain digital platform domain.

Your responsibilities will include developing a comprehensive project plan; defining project scope, goals, and objectives; identifying potential risks; leading a diverse cross-functional project team; establishing a collaborative environment; and working closely with business stakeholders to gather and document functional and technical requirements for the IT systems implementation. You will also lead the implementation of manufacturing IT systems, provide updates to the leadership team, and coordinate cross-functional teams and stakeholders to gather business and technical requirements, translating them into a clear, actionable three-year data platform roadmap.

Minimum qualifications for this role include a Bachelor's degree (required, with an advanced degree preferred), a minimum of 10 years of relevant experience in IT project or program management roles, and 4+ years of experience managing teams of 10+ members. Prior experience in regulated or validated industries is a strong plus. Strong documentation, organizational, and communication skills are essential, along with familiarity with project management tools and the ability to understand the customer's business problem and design effective solutions. Also required are a proven ability to deliver quality results within defined timelines, an understanding of application lifecycle processes and system integration concepts, and the ability to thrive in a fast-paced, team-oriented environment.

Skills needed for this role:
- A strong background in IT project management, especially in manufacturing or supply chain domains.
- Experience leading multi-function, cross-team collaboration between IT and Business.
- Experience managing program timelines, risks, status, and escalations.
- The ability to understand and work within established processes and tools.
- Solid knowledge of the SDLC and Agile/Waterfall/Hybrid project management principles.
- Experience with project management tools like DevOps.
- Strong knowledge of MS PowerPoint, MS Excel, and MS Project.
- Experience managing project costing, budget forecasting, and resource management.
- Working knowledge of manufacturing IT systems such as ERP and MES.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 12 Lacs

Navi Mumbai

Work from Office

Objectives aligned to this role: one place to manage all Informatica-related development jobs, along with setup support.

What you would do:
- Analyse database models and requirements: Informatica developers analyse a business's database storage and warehousing capabilities and assess the company's data requirements. They review data storage and access procedures and use Informatica tools to update, test, and provide solutions for data issues.
- Develop technical documents for Informatica systems: Informatica developers maintain up-to-date documentation of implementation, troubleshooting, and ETL processes related to Informatica systems. They also keep documents on specific issues and how they were resolved, including coding information and extraction and transformation processes.
- Integrate Informatica systems: one of the main roles of an Informatica developer is to develop target systems using Informatica software tools. These developers must integrate this system with a company's existing systems, troubleshoot any issues, and smoothly implement the Informatica cloud data management product.
- Develop Informatica workflows: the developer's core job is to follow the technical documentation and develop Informatica ETL workflows that pull data from the source, transform it, and load it into the target system.
- Conduct data quality tests: it is up to Informatica developers to regularly check the quality of stored data. They oversee mappings and workflows, check data integrity and accuracy, and perform data cleansing procedures as needed.

Whom we are looking for: in addition to the expected technical skills required of an Informatica developer, these professionals should be team leaders with strong analytical, creative, and time management skills. After examining several job postings, we found that employers tend to favor candidates who display the following abilities:
- 2 to 5 years of overall experience in Informatica ETL development.
- Computer skills: a thorough knowledge of computer programming, coding, and various operating and database systems is a must for Informatica developers.
- Time management: Informatica developers should have the ability to quickly develop data warehousing systems and solve any issues to ensure the continued accuracy of business data.
- Creativity: the ability to create mappings from scratch often calls for strong creative skills on the part of an Informatica developer.
- Analytical thinking: Informatica developers should be able to analyse data needs and options and understand the needs of various clients.
- Troubleshooting: when data warehousing systems are down, it falls to Informatica developers to quickly assess the problem and provide a solution.
- Team collaboration: Informatica developers rarely work alone; they typically interact closely with database managers and other IT specialists when maintaining, storing, and retrieving data.

Technical skills required:
- Informatica PowerCenter tools (Workflow Manager, Workflow Monitor, Designer)
- Programming languages (SQL, XML)
- Data platforms (Oracle, Teradata, Hadoop)

IMMEDIATE JOINERS ONLY (15 DAYS OR LESS)

Posted 1 month ago

Apply

8.0 - 12.0 years

0 Lacs

hyderabad, telangana

On-site

About McDonald's: McDonald's Corporation, one of the world's largest employers with locations in more than 100 countries, is offering corporate opportunities in Hyderabad. The global offices of McDonald's are dynamic innovation and operations hubs, aimed at expanding the company's global talent base and in-house expertise. The newly established office in Hyderabad will bring together knowledge across business, technology, analytics, and AI, accelerating McDonald's ability to deliver impactful solutions for the business and customers worldwide.

Position Overview: McDonald's is looking for an exceptional Senior Data Product Engineering SRE to take charge of the development and operational excellence of data products that provide insights and drive crucial business decisions. This role requires a unique combination of a product engineering mindset, data platform expertise, and site reliability engineering practices to create, scale, and maintain customer-facing data products and internal analytics platforms. The Senior Data Product Engineering SRE will be responsible for the end-to-end reliability of data products, from ingestion to user experience, ensuring they deliver business value at scale.

Key Responsibilities:
- Define and implement a product reliability strategy for customer-facing analytics, dashboards, and data APIs.
- Collaborate with Product Management to translate business requirements into scalable, reliable data product architectures.
- Establish product metrics, KPIs, and success criteria for data products serving both external and internal customers.
- Lead cross-functional initiatives to enhance data product adoption, engagement, and customer satisfaction.
- Develop and maintain data products, including real-time dashboards, analytics APIs, and embedded analytics solutions.
- Design user-centric data experiences focusing on performance, reliability, and scalability.
- Implement A/B testing frameworks and experimentation platforms for data product optimization.
- Set and maintain SLAs for data product availability, latency, and accuracy.
- Implement comprehensive monitoring for user-facing data products, encompassing frontend and backend metrics.
- Create automated testing frameworks for data product functionality, performance, and data quality.
- Lead incident response for data product issues that impact customer experience.
- Monitor and optimize data product performance from an end-user perspective, including page load times and query response times.
- Implement user feedback collection and product analytics to drive continuous improvement.
- Collaborate closely with Product, Engineering, Data Science, and Customer Success teams.
- Establish engineering practices for data product development, encompassing code reviews and deployment processes.
- Influence the product roadmap with technical feasibility and reliability considerations.
- Advocate for data product best practices throughout the organization.
- Strike a balance between innovation, operational stability, and customer commitments.
- Collaborate with Product Management on feature prioritization and requirements.

Required Qualifications:
- 8+ years of experience in product engineering, data engineering, or SRE roles.
- 5+ years of experience building customer-facing data products, analytics platforms, or business intelligence solutions.
- 3+ years in senior or lead positions with direct team management experience.
- Proven track record of delivering data products that drive measurable business impact.
- Expertise in the product development lifecycle, from ideation to launch and optimization.
- Advanced experience building user-facing applications and APIs.
- Deep expertise with analytics databases (Redshift, BigQuery, ClickHouse), real-time processing (Kafka, Spark Streaming), and BI tools (Tableau, Looker, Power BI).
- Proficiency in React, Vue.js, or Angular for building data visualization interfaces.
- Advanced skills in Python, Java, or Node.js for API development and data services.
- Expert-level SQL skills and experience optimizing queries for interactive analytics workloads.
- Extensive experience with AWS or GCP data and compute services.
- Strong product sense with the ability to balance technical constraints with user needs.
- Experience with product analytics tools (Amplitude, Mixpanel, Google Analytics) and metrics-driven development.
- Ability to understand business requirements and translate them into technical solutions.
- Strong technical writing skills for customer-facing documentation and API specifications.
- Experience with agile product development methodologies (Scrum, Kanban, Design Thinking).
- Proven track record of building and scaling product engineering teams.

Work Location: Hyderabad, India
Work Pattern: Full-time role
Work Mode: Hybrid

Posted 1 month ago

Apply

3.0 - 8.0 years

15 - 25 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Skill Name: InterSystems IRIS
Work Location: USI location
Experience: 3 to 5 years

A resource with hands-on experience in the InterSystems IRIS Data Platform, proficient in ObjectScript, SQL, and integration technologies (REST, SOAP). FHIR is a must-have. Experience with data modeling, performance tuning, and deploying IRIS on Linux/Windows is required. Skills in Python, Java, .NET, and Docker are a plus.

Rounds of interview: R1 and R2 (client round if required)
Mode of interview: Virtual/In-person
Work timing: 11 AM to 8 PM
Work Mode: Hybrid

Posted 1 month ago

Apply

5.0 - 9.0 years

0 Lacs

chennai, tamil nadu

On-site

You will be the visionary Group Data Product Manager (GPM) for AI/ML & Metadata Management, responsible for leading the development of advanced AI/ML-powered metadata solutions. Your primary focus will be on establishing a cohesive, intuitive Data Platform tailored to a variety of user roles, including data engineers, producers, and consumers. Your role involves integrating various tools to create a unified platform that significantly improves data discoverability, governance, and operational efficiency at scale.

Posted 1 month ago

Apply

10.0 - 18.0 years

2 - 3 Lacs

Hyderabad

Work from Office

Experience needed: 12-18 years
Type: Full-time
Mode: WFO
Shift: General shift (IST)
Location: Hyderabad
Notice period: immediate to 30 days

Job Summary: We are looking for an experienced and visionary Data Architect - Azure Data & Analytics to lead the design and delivery of scalable, secure, and modern data platform solutions leveraging Microsoft Azure and Microsoft Fabric. This role requires deep technical expertise in the Azure Data & Analytics ecosystem, strong experience in designing cloud-native architectures, and a strategic mindset to modernize enterprise data platforms.

Key Responsibilities:
- Architect and design modern data platform solutions on Microsoft Azure, including ingestion, transformation, storage, and visualization layers.
- Lead implementation and integration of Microsoft Fabric, including OneLake, Direct Lake mode, and Fabric workloads (Data Engineering, Data Factory, Real-Time Analytics, Power BI).
- Define enterprise-level data architecture, including data lakehouse patterns, delta lakes, data marts, and semantic models.
- Collaborate with business stakeholders, data engineers, and BI teams to translate business needs into scalable cloud data solutions.
- Design solutions using Azure-native services such as Azure Data Factory, Azure Synapse Analytics, Azure Data Lake Storage Gen2, Azure SQL, and Azure Event Hubs.
- Establish best practices for data security, governance, DevOps, CI/CD pipelines, and cost optimization.
- Guide implementation teams on architectural decisions and technical best practices across the data lifecycle.
- Develop reference architectures and reusable frameworks to accelerate data platform implementations.
- Stay updated on Microsoft's data platform roadmap and proactively identify opportunities to enhance data strategy.
- Assist in developing RFPs, architecture assessments, and solution proposals.

Requirements
Required Skills & Qualifications:
- 12-18 years of proven experience, including designing and implementing cloud-based modern data platforms on Microsoft Azure.
- Deep knowledge of Microsoft Fabric architecture, including Data Factory, Data Engineering, Synapse Real-Time Analytics, and Power BI integration.
- Expertise in Azure data services: Azure Synapse, Data Factory, Azure SQL, ADLS Gen2, Azure Functions, Azure Purview, Event Hubs, etc.
- Experience with data warehousing, lakehouse architectures, ETL/ELT, and data modeling.
- Experience in data governance, security, role-based access (Microsoft Entra/Azure AD), and compliance frameworks.
- Strong leadership and communication skills to influence both technical and non-technical stakeholders.
- Familiarity with DevOps and infrastructure-as-code (e.g., ARM templates, Bicep, Terraform) is a plus.

Preferred Qualifications:
- Microsoft Certified: Azure Solutions Architect Expert, Azure Data Engineer Associate, or Microsoft Fabric certification.
- Experience with real-time data streaming, IoT, or machine learning pipelines in Azure.
- Familiarity with multi-cloud data strategies or hybrid deployments is an advantage.

Posted 1 month ago

Apply