6.0 - 10.0 years
8 - 13 Lacs
Bengaluru
Work from Office
Job Title: SAP MDG Techno-Functional Consultant
Location: Chennai, Bangalore, Hyderabad, Mumbai, Jaipur, Gurugram, Noida
Experience: 6-10 Years
Shift Timings: 1-10 PM
Roles & Responsibilities:
- Lead the design and implementation of SAP MDG solutions, ensuring alignment with enterprise data management strategies and business objectives.
- Collaborate with business stakeholders to gather, analyze, and validate complex requirements for master data governance across domains such as Material, Customer, Vendor, and Finance.
- Oversee the mapping of SAP MDG functional capabilities to business needs, and drive the development of scalable, future-ready governance processes.
- Define and guide the creation of process models, including entity types, change requests, business rules, and workflows within SAP MDG.
- Lead the configuration and enhancement of SAP MDG data models, workflows, and user interfaces to meet organizational and project-specific requirements.
- Provide technical leadership and oversight in creating functional specifications, test plans, and scripts for unit testing, integration, and UAT phases.
- Troubleshoot and resolve complex technical and functional issues, providing strategic solutions to ensure system integrity and data quality.
- Mentor and guide project team members, ensuring adherence to best practices in MDG implementation and support.
- Coordinate post-go-live support activities and lead training sessions for end users and business teams to promote adoption and effective use of the MDG solution.
- Act as the primary point of contact for stakeholders and cross-functional teams, ensuring timely communication and alignment throughout the project lifecycle.
- Bring a strong understanding of MDG ABAP programming, as the role involves guiding and designing solutions for the development team.
Posted 1 month ago
6.0 - 10.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Job Title: SAP MDG Techno-Functional Lead
Location: Chennai, Bangalore, Hyderabad, Mumbai, Jaipur, Gurugram, Noida
Experience: 6-10 Years
Shift Timings: 1-10 PM
Roles & Responsibilities:
- Lead the design and implementation of SAP MDG solutions, ensuring alignment with enterprise data management strategies and business objectives.
- Collaborate with business stakeholders to gather, analyze, and validate complex requirements for master data governance across domains such as Material, Customer, Vendor, and Finance.
- Oversee the mapping of SAP MDG functional capabilities to business needs, and drive the development of scalable, future-ready governance processes.
- Define and guide the creation of process models, including entity types, change requests, business rules, and workflows within SAP MDG.
- Lead the configuration and enhancement of SAP MDG data models, workflows, and user interfaces to meet organizational and project-specific requirements.
- Provide technical leadership and oversight in creating functional specifications, test plans, and scripts for unit testing, integration, and UAT phases.
- Troubleshoot and resolve complex technical and functional issues, providing strategic solutions to ensure system integrity and data quality.
- Mentor and guide project team members, ensuring adherence to best practices in MDG implementation and support.
- Coordinate post-go-live support activities and lead training sessions for end users and business teams to promote adoption and effective use of the MDG solution.
- Act as the primary point of contact for stakeholders and cross-functional teams, ensuring timely communication and alignment throughout the project lifecycle.
- Bring a strong understanding of MDG ABAP programming, as the role involves guiding and designing solutions for the development team.
Posted 1 month ago
10.0 - 17.0 years
30 - 45 Lacs
Kolkata
Work from Office
About Client: Hiring for one of the most prestigious multinational corporations!
Job Title: Data Architect
Qualification: Any Graduate or Above
Relevant Experience: 10 to 17 Years
Required Technical Skill Set: Data Governance, Data Analyst
Must-Have: Data Governance, Data Analyst
Location: Kolkata
CTC Range: 35-45 LPA
Notice Period: Any
Shift Timing: N/A
Mode of Interview: Virtual
Mode of Work: WFO (Work From Office)
Contact: Pooja Singh K S, IT Staffing Analyst, Black and White Business Solutions Pvt Ltd, Bangalore, Karnataka, India. pooja.singh@blackwhite.in | www.blackwhite.in
Posted 1 month ago
5.0 - 8.0 years
25 - 35 Lacs
Chennai
Remote
Skills:
- Snowflake database administration: designing data architecture, database networking, maintaining data security, and configuring user access (see the sketch below)
- Python, Git, SQL, AWS
- Snowflake data pipelines, data structures, data platforms, data integration, and data governance
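To illustrate the user-access configuration work described above, here is a minimal sketch using the snowflake-connector-python library; the account, database, schema, role, and user names are hypothetical placeholders, not a prescribed setup.

```python
# Hypothetical role-based access setup for a Snowflake database.
# Account, credentials, and object names below are illustrative placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345",        # placeholder account identifier
    user="ADMIN_USER",
    password="***",
    role="SECURITYADMIN",     # a role allowed to create roles and grants
)

statements = [
    "CREATE ROLE IF NOT EXISTS ANALYST_RO",
    "GRANT USAGE ON DATABASE ANALYTICS TO ROLE ANALYST_RO",
    "GRANT USAGE ON SCHEMA ANALYTICS.PUBLIC TO ROLE ANALYST_RO",
    "GRANT SELECT ON ALL TABLES IN SCHEMA ANALYTICS.PUBLIC TO ROLE ANALYST_RO",
    "GRANT ROLE ANALYST_RO TO USER JANE_DOE",
]

cur = conn.cursor()
try:
    for stmt in statements:
        cur.execute(stmt)  # each grant is a separate DDL statement
finally:
    cur.close()
    conn.close()
```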
Posted 1 month ago
3.0 - 8.0 years
12 - 20 Lacs
Hyderabad
Work from Office
Skills:
- Data Governance, Data Quality, Data Management
- Collibra, Alation
- MDM and BI/reporting processes
- Data privacy regulations (GDPR, CCPA, etc.)
- Agile project environments
Posted 1 month ago
5.0 - 10.0 years
15 - 22 Lacs
Hyderabad
Work from Office
Skills:
- Data engineering, data management, Snowflake, SQL, data modeling, and cloud-native data architecture
- AWS, Azure, or Google Cloud (Snowflake on cloud platforms)
- ETL tools such as Informatica, Talend, or dbt
- Python or shell scripting
Posted 1 month ago
1.0 - 6.0 years
4 - 9 Lacs
Chennai
Hybrid
Dear All,

We are hiring Master Data Management (MDM) Consultants for multiple client projects across domains.

Note: Only one position requires SAP S/4HANA experience; candidates without SAP experience are also encouraged to apply. Candidates with a notice period of 30 days or less will be prioritized; those with longer notice periods may still be considered based on fit.

Work Location & Mode:
- Role 1: Hybrid (Chennai-based candidates preferred)
- Role 2: On-site at customer office (Chennai)

Roles & Responsibilities:
- Support the migration of master data from multiple systems into a centralized ERP platform.
- Assist in data consolidation and standardization to maintain a single source of truth.
- Participate in data migration and integration activities.
- Create, maintain, and ensure the quality of Material, Customer, and Vendor master data.
- Perform data validation, cleansing, and quality checks to maintain high data integrity.
- Collaborate with cross-functional teams to gather and define business data requirements.
- Identify and resolve data quality issues and process inefficiencies.
- Conduct data analysis and prepare reports to support business decision-making.
- Support data governance policies and ensure compliance with data standards.
- Contribute to the development and enforcement of data management best practices.

Soft Skills:
- Excellent communication and collaboration abilities.
- Strong analytical skills with attention to detail and a focus on accuracy and quality.
- Ability to manage multiple, changing priorities while working effectively in a team environment.
- Excellent problem-solving skills.

Desired Candidate Profile:
- 1 to 6 years of experience in Master Data Management or related data-focused roles.
- Passion for working with data and driving data quality initiatives.
- Bachelor's degree in Engineering (any discipline).
- Proficiency in Microsoft Excel; exposure to Power BI is an advantage.
- For SAP-specific roles: experience with SAP MDM on SAP S/4HANA is preferred.
- Exposure to Java, Python, or SQL is a plus.
- Familiarity with PLM tools such as Teamcenter or Windchill is a plus.

If you are passionate about data and excited to work on challenging data transformation projects, apply now and be part of a dynamic and growing team!
Posted 1 month ago
15.0 - 22.0 years
40 - 50 Lacs
Bengaluru
Hybrid
Role & Responsibilities:
- Shape technical strategy (e.g., build vs. buy decisions, technical road-mapping) in collaboration with architects.
- Evaluate and identify appropriate technology stacks, platforms, and vendors, including web application frameworks and cloud providers, for solution development.
- Attend team ceremonies as required, in particular feature refinement and cross-team iteration reviews/demos.
- Drive the resolution of technical impediments.
- Own the success of foundational enablers.
- Champion research and innovation.
- Lead scaled agile ceremonies and activities, such as quarterly reviews, quarterly increment planning, and OKR writing.
- Collaborate with the Platform Owner on the writing and prioritization of technical capabilities and enablers.
- Present platform delivery metrics, OKR health, and platform finance status to executive audiences.
- Collaborate with other Technical Leads.
- Create and maintain the technical roadmap for in-scope products and services on the platform.

Preferred Candidate Profile:
- Total work experience of 15-22 years.
- B.E./B.Tech or equivalent engineering degree; a Master's degree or equivalent experience in Marketing, Business, or Finance is an added advantage.
- 10+ years of experience in technical architecture, solution design, and platform engineering.
- Strong experience in MDM, Data Quality, and Data Governance practices; experience with a tool stack such as Informatica MDM SaaS, Informatica Data Quality, and Collibra is a plus.
- Good experience with major cloud platforms and cloud data tools, including but not limited to AWS, Microsoft Azure, Kafka, CDC, Tableau, and data virtualization tools.
- Good experience in ETL and BI solution development and tool stacks; Informatica ETL experience is a plus.
- Good experience in data architecture, SQL, NoSQL, REST APIs, data security, and AI concepts.
- Familiarity with agile methodologies and data factory operations processes, including tools such as Confluence, Jira, and Miro.
- Strong knowledge of industry standards and regulations related to data management, such as HIPAA, PCI-DSS, and GDPR.
Posted 1 month ago
10.0 - 15.0 years
20 - 25 Lacs
Hyderabad
Work from Office
We are looking for a dedicated Principal Clinical Data Scientist to oversee all aspects of data management, coding, CDD, and DAP for several studies or medium to large-sized projects. This role is crucial in ensuring that pharmaceutical drug development plans at Novartis Global Drug Development are executed efficiently with high-quality deliverables. The successful candidate will conceptualize and implement scalable training delivery models and platforms, follow and oversee Good Clinical Practices (GCP), and ensure consistency across assigned programs to aid efficiencies for submissions.

Key Responsibilities:
- Lead functional activities for medium to large-sized projects in phase I to IV clinical studies within the Novartis Global Development Organization.
- Coordinate activities of Data Managers, both internally and externally.
- Make data management decisions and propose strategies at the study or project level.
- Ensure the application of consistent data management processes, increasing standardization and documentation across assigned projects/programs.
- Provide and implement data management solutions, ensuring knowledge sharing.
- Lead process and training deliverables within multiple platforms, franchises, or therapeutic areas, developing strategies for effective training and knowledge retention.
- Represent Data Operations (DO) in all audits and inspections, centralizing and aligning the team in audit preparation, readiness, and response.

Essential Requirements:
- Proven experience in clinical data management, data architecture, data governance, data integration, data profiling, data quality, data science, data strategy, and master data.
- Strong project management skills with the ability to manage multiple tasks and projects simultaneously.
- Cross-cultural experience and functional breadth in managing clinical data management activities.
- Ability to recognize and resolve protocol issues that may impact database design, data validation, and analysis/reporting.
- Excellent communication and interpersonal skills to build and maintain effective working relationships with cross-functional teams.
- Proficiency in using data management tools to generate listings for data review and provide these to study teams.
Posted 1 month ago
2.0 - 7.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Responsibilities:
- Design, develop, and maintain interactive dashboards, reports, and data visualizations using IBM Cognos Analytics.
- Translate business requirements into effective BI solutions and technical specifications.
- Work with data sources to build and optimize data models (Framework Manager, Data Modules).
- Collaborate with cross-functional teams to understand reporting needs and data availability.
- Maintain and enhance existing Cognos BI reports and packages.
- Ensure accuracy, performance, and usability of BI solutions.
- Provide support, troubleshooting, and performance tuning for Cognos reports and environments.
- Document BI solutions, including requirements, specifications, and user guides.
- Adhere to data governance and security standards in BI development.
Posted 1 month ago
3.0 - 7.0 years
15 - 19 Lacs
Bengaluru
Work from Office
We are seeking a talented Data Engineer with strong expertise in Databricks, specifically in Unity Catalog, PySpark, and SQL, to join our data team. You'll play a key role in building secure, scalable data pipelines and implementing robust data governance strategies using Unity Catalog (see the sketch below).

Key Responsibilities:
- Design and implement ETL/ELT pipelines using Databricks and PySpark.
- Work with Unity Catalog to manage data governance, access controls, lineage, and auditing across data assets.
- Develop high-performance SQL queries and optimize Spark jobs.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs.
- Ensure data quality and compliance across all stages of the data lifecycle.
- Implement best practices for data security and lineage within the Databricks ecosystem.
- Participate in CI/CD, version control, and testing practices for data pipelines.

Required Skills:
- Proven experience with Databricks and Unity Catalog (data permissions, lineage, audits).
- Strong hands-on skills with PySpark and Spark SQL.
- Solid experience writing and optimizing complex SQL queries.
- Familiarity with Delta Lake, data lakehouse architecture, and data partitioning.
- Experience with cloud platforms such as Azure or AWS.
- Understanding of data governance, RBAC, and data security standards.

Preferred Qualifications:
- Databricks Certified Data Engineer Associate or Professional.
- Experience with tools such as Airflow, Git, Azure Data Factory, or dbt.
- Exposure to streaming data and real-time processing.
- Knowledge of DevOps practices for data engineering.

Mandatory skill sets: Databricks
Preferred skill sets: Databricks
Education qualification: BE/BTech, ME/MTech, MBA, MCA
Degrees/Fields of Study required: Master of Computer Applications, Master of Business Administration, Bachelor of Engineering, Master of Engineering, Bachelor of Technology
Required Skills: Databricks Platform
Optional Skills: Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Coaching and Feedback, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling
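As an illustration of the pipeline and governance work described above, the following is a minimal sketch of a PySpark job that writes to a Unity Catalog table and applies an access grant. The catalog, schema, table, path, and group names are hypothetical, and it assumes a Databricks workspace with Unity Catalog enabled.

```python
# Minimal sketch: transform raw data and publish it to a governed
# Unity Catalog table (three-level namespace: catalog.schema.table).
# All names and paths below are hypothetical placeholders.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.getOrCreate()

# Read raw data and apply a simple transformation.
raw = spark.read.format("json").load("/Volumes/main/raw/orders/")  # placeholder path
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("ingested_at", F.current_timestamp())
)

# Write to a Unity Catalog table.
clean.write.mode("overwrite").saveAsTable("main.sales.orders_clean")

# Governance: grant read access to an analyst group via Unity Catalog.
spark.sql("GRANT SELECT ON TABLE main.sales.orders_clean TO `data-analysts`")
```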
Posted 1 month ago
5.0 - 10.0 years
15 - 19 Lacs
Hyderabad
Work from Office
Summary: Within the Operations Data domain team, this role is responsible for the design and implementation of Data Management, Business Process Design and Governance processes, including the Data Operating Model, reporting to the LDC Core Data Lead and working in close collaboration with the Enterprise Data Owner (EDO) team and members of the functional teams. This role will focus on establishing and developing the Novartis data capability in collaboration with the functions, as well as leading the implementation within LDC scope.

About the Role

Major accountabilities:
- Accountable for overall global Material master business process design and improvement activities, in alignment with business goals and priorities, in close collaboration with the respective solution teams and business process owners.
- Accountable and responsible for ensuring consistency and completeness of the end-to-end design of Material master business processes and the underlying data design.
- Accountable and responsible for designing the Material master data management processes to be comparable with best-in-class processes, and for identifying process improvement opportunities in line with Novartis guidelines and business-specific requirements.
- Accountable for identifying digital solutioning options in close collaboration with the respective IT teams to ensure business user acceptance, enablement of business process automation capabilities, and best practices for data management processes.
- Drive the overall plan for implementation and adoption of the Material master business process design in LDC releases, in close collaboration with the Core and Adopt teams.
- Responsible for gathering and implementing data requirements coming from Business Functions (domain pillars in LDC projects), GPO, EDO, and other dependent projects/programs.
- Facilitate cultural change by improving data literacy across the business through training, education, and increasing data consumption.
- Act as a point of reference and contact for all queries related to Material master process and data design.
- Drive the transition into new ways of working defined by the Enterprise Operating Model within LDC scope.

Key performance indicators:
- Delivery of key milestones and deliverables of the program on time and with quality, with full buy-in and support of country and global teams.

Minimum Requirements:
- Education: Master's university degree or higher.
- Work experience: at least 5 years in a regional or global role in a material/product data-related functional area such as Material master data management, Product master data management, or Supply chain master data in a cross-functional setting.
- Solid understanding of cross-functional master data management business process design and best practices in master data governance.
- Experience with SAP MDG, SAP S/4HANA, materials management, and related data concepts; experience with SAP PLM / SAP EHS and/or Specification Management is an additional advantage.
- Proven track record of detailing data concepts for material master, both from a conceptual and an operational governance perspective.
- Proven track record of driving discussions and facilitating cross-functional decision-making in a matrix organization.
- Experienced in collaborating with diverse project teams, including functional solution teams, data governance teams, and system configuration workstreams.
- Additional project management training or a certification/designation is desirable.

Skills:
- Business acumen: very good understanding of Material master data models in connection with the operational significance of key data elements and cross-functional elements of data governance.
- Curious and forward-looking: looks for examples both inside and outside the company to identify fit-for-purpose design.
- Data savvy: proven experience analysing the as-is state and proposing solutions that are fit for purpose.
- Technical and process knowledge: knowledge and understanding of driving data-driven business process definition and governance.
- Collaboration and influencing skills: outspoken and experienced in interacting and driving solutioning in a cross-functional matrix organization; excellent interpersonal communication skills to drive conversations with virtual and diverse audiences.
- Languages: fluency in business English is a must.
Posted 1 month ago
12.0 - 15.0 years
14 - 18 Lacs
Hyderabad
Work from Office
Lead the technical vision and strategy for the Data Engineering Center of Excellence (CoE) across cloud platforms (GCP, Azure, AWS), cloud-agnostic platforms (Databricks, Snowflake), and legacy systems. This leadership role will establish architectural standards and best practices while providing pre-sales leadership for strategic opportunities.

Key Responsibilities:
- Define and drive the technical vision and roadmap for the Data Engineering CoE.
- Establish cross-cloud architecture standards and best practices, with emphasis on Azure, GCP, and AWS.
- Lead pre-sales activities for strategic opportunities, particularly AWS-, Azure-, and GCP-focused clients.
- Build the CoE's accelerator development framework.
- Mentor and guide pillar architects across all platforms.
- Drive platform selection decisions and integration strategies.
- Establish partnerships with key technology providers, especially cloud providers.
- Define governance models for the CoE implementation.
- Represent the organization as a technical thought leader in client engagements.

Requirements:
- 12+ years of data engineering experience, with 6+ years in leadership roles.
- Deep expertise in Google Cloud Platform data services (BigQuery, Dataflow, Dataproc).
- Strong knowledge of other cloud platforms (Azure Fabric/Synapse, Data Factory, ...)
Posted 1 month ago
5.0 - 8.0 years
7 - 10 Lacs
Hyderabad
Work from Office
Responsibilities and Accountabilities:
- Design and build cloud-based data infrastructure and data pipelines using cloud services such as AWS (see the sketch at the end of this posting).
- Develop, test, and deploy data integration processes that move data from various sources into cloud-based data warehouses or data lakes.
- Collaborate with data scientists, business analysts, and other stakeholders to identify data requirements and develop appropriate data solutions.
- Implement and manage data governance policies, data quality, and data security measures to ensure data accuracy, consistency, and privacy.
- Manage and monitor cloud-based data infrastructure and data pipelines to ensure data availability, scalability, and reliability.
- Take business specification requirements through design, development, testing, and deployment.
- Develop a strong understanding of business requirements, working with business users/business analysts to define technical and process requirements.
- Build effective working relationships with team members and cross-functional colleagues.

About Experian
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, create marketing solutions, and gain deeper insights into the automotive market, all using our unique combination of data, analytics, and software. We also assist millions of people to realize their financial goals and help them save time and money. We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com.

Experience and Skills:
- Qualified to degree level in science or engineering preferable.
- Cloud experience: S3, Step Functions, Glue, and Airflow.
- Good Python development skills for data transfers and extractions (ELT and ETL).
- 5 to 8 years of development experience building data pipelines using cloud technologies.
- 5+ years of experience in the architecture of modern data warehousing.
- Excellent problem-solving, design, debugging, and testing skills.
- Ability to work with multiple different personality types while managing workload within a fast-paced, energetic, and dynamic workplace.

Additional Information
Our uniqueness is that we truly celebrate yours. Experian's culture and people are key differentiators. We take our people agenda very seriously and focus on what truly matters: DEI, work/life balance, development, authenticity, engagement, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's strong people-first approach is award-winning: Great Place To Work in 24 countries, FORTUNE Best Companies to Work For, and Glassdoor Best Places to Work (globally 4.4 stars), to name a few. Check out Experian Life on social media or our Careers Site to understand why.

Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, colour, sexuality, physical ability, or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.
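To illustrate the AWS pipeline stack named above (S3, Glue, Airflow), here is a minimal hedged sketch of an Airflow DAG that triggers a Glue job with boto3 and checks for output in S3; the job name, bucket, prefix, and region are hypothetical placeholders, not Experian resources.

```python
# Hypothetical Airflow DAG: run an AWS Glue ETL job, then verify that the
# curated output prefix exists in S3. Resource names are illustrative.
from datetime import datetime

import boto3
from airflow import DAG
from airflow.operators.python import PythonOperator


def run_glue_job():
    glue = boto3.client("glue", region_name="us-east-1")
    # start_job_run is asynchronous; production code would poll get_job_run.
    response = glue.start_job_run(JobName="daily_orders_etl")
    print("Started Glue run:", response["JobRunId"])


def check_output():
    s3 = boto3.client("s3", region_name="us-east-1")
    listing = s3.list_objects_v2(Bucket="example-data-lake", Prefix="curated/orders/")
    assert listing.get("KeyCount", 0) > 0, "no curated output found"


with DAG(
    dag_id="orders_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",  # `schedule` in newer Airflow releases
    catchup=False,
) as dag:
    extract_load = PythonOperator(task_id="run_glue_job", python_callable=run_glue_job)
    validate = PythonOperator(task_id="check_output", python_callable=check_output)
    extract_load >> validate
```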
Posted 1 month ago
15.0 - 17.0 years
40 - 45 Lacs
Bengaluru
Work from Office
To propel our next phase of growth, ServiceNow is investing in a Financial Services Industry Solutions organization. We're building a team of collaborative individuals who are passionate about the opportunity to transform Financial Services using ServiceNow's powerful digital workflow platform. Join a team that re-shapes Financial Services, partnering with leading financial institutions and the most inspiring Fintechs in the world. We are always evolving. We are passionate about our product, and we live for our customers. We have high expectations, and a career at ServiceNow means challenging yourself to always be better.

We are looking for a Sr. Principal, Financial Services Architect to be part of our Financial Services Industry Solutions organization. You are a Financial Services industry expert with knowledge of the Financial Services solution stack across the application, workflow, and data/integration layers. You understand the core Financial Services business processes and the surrounding technology ecosystem, including systems of record, third-party data providers, and the Fintech/Regtech landscape. You have a technical vision for ServiceNow's opportunity to transform Financial Services and will work with customers and partners on solution architectures that advance this vision. You will work with pre-sales technical resources to gather solution requirements and help them with solution roadmaps and architecture. You will have visibility within ServiceNow and an opportunity to deliver strong results during sales cycles while achieving quarterly and annual sales goals for an assigned territory.

What you get to do in this role:
- Solution Architecture: Architect Financial Services solutions built on the ServiceNow platform, both using current horizontal applications and platform services and driving requirements for Financial Services-specific product development. Consider future-state and transitional architectures in Financial Services.
- Asset Creation: Create reference architectures and technical enablement guides for Financial Services solutions. Work with Solution Consulting teams to develop demo prototypes for solutions.
- Field and Partner Enablement: Enable internal pre-sales technical resources and the System Integrator and Independent Software Vendor partner ecosystem on Financial Services solutions and architecture. Build a Financial Services community of technical ambassadors enabled on the ServiceNow Financial Services message and solution architecture.
- Customer Engagement: Work with customers to create technology roadmaps around Financial Services solutions and to dive into architecture components. Bring deep technical knowledge of Financial Services to improve strategic opportunities and identify new use cases.
- Market-Facing Engagement: Present to large customer audiences and build relationships with CxOs.

To be successful in this role you have:
- Experience in leveraging or critically thinking about how to integrate AI into work processes, decision-making, or problem-solving. This may include using AI-powered tools, automating workflows, analyzing AI-driven insights, or exploring AI's potential impact on the function or industry.
- A minimum of 15 years of related work experience.
- Experience working with the ServiceNow Platform and ServiceNow customers, and an in-depth understanding of the ServiceNow architecture and platform.
- Understanding of top Financial Services regulations globally, with the ability to assess their impact on the architecture of a ServiceNow solution in Financial Services.
- Experience working with sales, with the ability to work as an extended part of account teams.
- Ability to provide expertise and work with internal ServiceNow product teams.
- Ability to interact at multiple levels within a customer account (Enterprise Architects, Technical Architects, Directors, VPs, and CxOs).
- Ability to travel up to 30% of the time.
- Knowledge of enterprise integration, service-oriented architectures, and micro-services.
- Knowledge of security, data privacy, and data governance within Financial Services.
- Instant customer credibility with a record of building customer relationships.
Posted 1 month ago
8.0 - 13.0 years
14 - 18 Lacs
Bengaluru
Work from Office
The Solution Architect / Data Engineer will design, implement, and manage data solutions for the insurance business, leveraging expertise in Cognos, DB2, Azure Databricks, ETL processes, and SQL. The role involves working with cross-functional teams to design scalable data architectures and enable advanced analytics and reporting, supporting the company's finance, underwriting, claims, and customer service operations.

Key Responsibilities:
- Data Architecture & Design: Design and implement robust, scalable data architectures and solutions in the insurance domain using Azure Databricks, DB2, and other data platforms.
- Data Integration & ETL Processes: Lead the development and optimization of ETL pipelines to extract, transform, and load data from multiple sources, ensuring data integrity and performance.
- Cognos Reporting: Oversee the design and maintenance of Cognos reporting systems, developing custom reports and dashboards to support business users in finance, claims, underwriting, and operations.
- Data Engineering: Design, build, and maintain data models, data pipelines, and databases to enable business intelligence and advanced analytics across the organization.
- Cloud Infrastructure: Develop and manage data solutions on Azure, including Databricks for data processing, ensuring seamless integration with existing systems (e.g., DB2, legacy platforms).
- SQL Development: Write and optimize complex SQL queries for data extraction, manipulation, and reporting purposes, with a focus on performance and scalability.
- Data Governance & Quality: Ensure data quality, consistency, and governance across all data solutions, implementing best practices and adhering to industry standards (e.g., GDPR, insurance regulations).
- Collaboration: Work closely with business stakeholders, data scientists, and analysts to understand business needs and translate them into technical solutions that drive actionable insights.
- Solution Architecture: Provide architectural leadership in designing data platforms, ensuring that solutions meet business requirements, are cost-effective, and can scale for future growth.
- Performance Optimization: Continuously monitor and tune the performance of databases, ETL processes, and reporting tools to meet service level agreements (SLAs).
- Documentation: Create and maintain comprehensive technical documentation, including architecture diagrams, ETL process flows, and data dictionaries.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Information Systems, or a related field.
- Proven experience as a Solution Architect or Data Engineer in the insurance industry, with a strong focus on data solutions.
- Hands-on experience with Cognos (for reporting and dashboarding) and DB2 (for database management).
- Proficiency in Azure Databricks for data processing, machine learning, and real-time analytics.
- Extensive experience in ETL development, data integration, and data transformation processes.
- Strong knowledge of Python and SQL (advanced query writing, optimization, and troubleshooting).
- Experience with cloud platforms (Azure preferred) and hybrid data environments (on-premises and cloud).
- Familiarity with data governance and regulatory requirements in the insurance industry (e.g., Solvency II, IFRS 17).
- Strong problem-solving skills, with the ability to troubleshoot and resolve complex technical issues related to data architecture and performance.
- Excellent verbal and written communication skills, with the ability to work effectively with both technical and non-technical stakeholders.

Preferred Qualifications:
- Experience with other cloud-based data platforms (e.g., Azure Data Lake, Azure Synapse, AWS Redshift).
- Knowledge of machine learning workflows, leveraging Databricks for model training and deployment.
- Familiarity with insurance-specific data models and their use in finance, claims, and underwriting operations.
- Certifications in Azure Databricks, Microsoft Azure, DB2, or related technologies.
- Knowledge of additional reporting tools (e.g., Power BI, Tableau) is a plus.

Key Competencies:
- Technical Leadership: Ability to guide and mentor development teams in implementing best practices for data architecture and engineering.
- Analytical Skills: Strong analytical and problem-solving skills, with a focus on optimizing data systems for performance and scalability.
- Collaborative Mindset: Ability to work effectively in a cross-functional team, communicating complex technical solutions in simple terms to business stakeholders.
- Attention to Detail: Meticulous attention to detail, ensuring high-quality data output and system performance.
Posted 1 month ago
2.0 - 5.0 years
7 - 11 Lacs
Hyderabad
Work from Office
Data Quality:
- Define and measure data quality metrics: establish metrics for accuracy, completeness, validity, consistency, timeliness, and reliability (see the sketch below).
- Continuous monitoring and remediation: regularly monitor data quality, conduct audits, perform root cause analysis for recurring data issues, and implement preventive measures and remediation plans.
- Data profiling: develop and maintain comprehensive data profiles to understand data characteristics.
- Data validation: create and implement validation rules to ensure that incoming data conforms to expected formats and values.
- Data cleansing: design and execute data cleansing processes to correct errors and inconsistencies, enhancing overall data quality and reliability.

Data Governance:
- Establish a governance framework: implement and enforce data governance practices to ensure compliance with regulatory requirements and corporate policies, ensuring data is managed according to best practices.
- Metadata management: develop and maintain a comprehensive metadata repository to document data definitions, lineage, and usage, ensuring it is kept up to date and accessible to end users.
- Understand user needs: collaborate with business users to identify data needs, pain points, and requirements, ensuring the data is fit for its intended use.
- Identify improvement areas: continuously seek opportunities for process improvement in data governance and quality management.
- User roles and access requirements: understand user roles and access requirements for systems, so that similar protection can be implemented in the analytical solutions.
- Row-level security: work with the data & analytics team to establish row-level security for analytical solutions, ensuring data is accessible only to authorised users.

Continuous Improvement:
- Establish naming conventions: define business-friendly table and column names, along with synonyms, to make data easily accessible using AI.
- Create synonyms: implement synonyms to simplify data access and enhance data readability.
- Establish KPIs for data governance and quality efforts, and create regular reports for stakeholders to track progress and demonstrate the value of data governance initiatives.
- Establish a feedback loop where users can report data quality issues and suggest improvements.
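The data quality metrics described above (completeness, validity, uniqueness, and so on) can be computed with simple rule-based checks; below is a minimal pandas sketch in which the column names and the validity rule are hypothetical examples.

```python
# Minimal sketch of rule-based data-quality metrics over a small table.
# Column names and the email-validity rule are illustrative only.
import pandas as pd

df = pd.DataFrame(
    {
        "customer_id": [1, 2, 2, 4],
        "email": ["a@x.com", None, "b@y.com", "not-an-email"],
    }
)

metrics = {
    # Completeness: share of non-null values in a column.
    "email_completeness": df["email"].notna().mean(),
    # Validity: share of values matching an expected pattern.
    "email_validity": df["email"]
        .str.contains(r"^[^@\s]+@[^@\s]+\.[^@\s]+$", na=False)
        .mean(),
    # Uniqueness: share of distinct keys (1.0 means no duplicates).
    "customer_id_uniqueness": df["customer_id"].nunique() / len(df),
}

for name, value in metrics.items():
    print(f"{name}: {value:.2f}")  # feed these into monitoring reports
```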
Posted 1 month ago
8.0 - 12.0 years
13 - 17 Lacs
Noida
Work from Office
Job Title: Data Architect
Location: Jewar Airport, Noida
Experience: 8+ Years

We are looking for a Data Architect to oversee our organization's data architecture, governance, and product lifecycle. The role focuses on managing data layers, maintaining data governance frameworks, and creating data products aligned with business objectives.

Key Responsibilities:
- Design and maintain the Lakehouse architecture, including data lake setup and management.
- Create and maintain data products, ensuring their alignment with business needs.
- Develop and enforce data governance policies, including the maintenance of a data catalog.
- Design data models and define database development standards.
- Automate workflows using Python, CI/CD pipelines, and unit tests (see the sketch below).

Required Skills and Experience:
- Extensive experience in data architecture and data platform management.
- Expertise in data governance, data modeling, and database development.
- Proficiency in Python for automation and pipeline development.
- Familiarity with Azure data services and data processing pipelines.
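As a small illustration of the "Python, CI/CD pipelines, and unit tests" responsibility above, here is a hedged pytest sketch of the kind of test a CI pipeline could run on every commit; the helper function and its rules are hypothetical, not the employer's actual codebase.

```python
# Illustrative unit test for a small, hypothetical pipeline helper.
import pytest


def normalize_record(record: dict) -> dict:
    """Trim string values and lower-case keys used downstream in a data product."""
    return {
        key.lower(): value.strip() if isinstance(value, str) else value
        for key, value in record.items()
    }


def test_normalize_record_trims_and_lowercases():
    raw = {"Name": "  Alice ", "AGE": 30}
    assert normalize_record(raw) == {"name": "Alice", "age": 30}


def test_normalize_record_rejects_non_dict():
    with pytest.raises(AttributeError):
        normalize_record(["not", "a", "dict"])  # lists have no .items()
```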
Posted 1 month ago
15.0 - 20.0 years
10 - 15 Lacs
Bengaluru
Work from Office
Job Title: Data Analytics Lead
Experience: 15-20 Years
Location: Bengaluru

Key Responsibilities:

Team Leadership & Delivery Excellence:
- Lead a cross-functional team comprising data architects, analysts, business SMEs, and technologists to deliver high-impact data analytics solutions.
- Define and enforce best practices for efficient, scalable, and high-quality delivery.
- Inspire a culture of collaboration, accountability, and continuous improvement within the team.

Strategic Data Leadership:
- Develop and execute a data strategy aligned with client business objectives, ensuring seamless integration of analytics into decision-making processes.
- Collaborate with stakeholders to translate business needs into actionable data solutions, influencing strategic decisions.

Technical and Architectural Expertise:
- Architect and oversee data platforms, including SQL Server, Snowflake, and Power BI, ensuring optimal performance, scalability, and governance.
- Lead initiatives in data architecture, data modeling, and data warehouse (DWH) development, tailored to alternative investment strategies.
- Evaluate emerging technologies, such as big data and advanced analytics tools, and recommend their integration into client solutions.
- Champion data quality, integrity, and security, aligning with compliance standards in private equity and alternative investments.

Performance & Metrics:
- Define and monitor KPIs to measure team performance and project success, ensuring timely delivery and measurable impact.
- Collaborate with stakeholders to refine reporting, dashboarding, and visualization for decision support.

Governance & Compliance:
- Establish robust data governance frameworks in partnership with client stakeholders.
- Ensure adherence to regulatory requirements impacting private markets investments, including fund accounting and compliance.

What's on offer:
- Competitive and above-market salary.
- Flexible hybrid work schedule with tools for both office and remote productivity.
- Hands-on exposure to cutting-edge technology and global financial markets.
- Opportunity to collaborate directly with international teams in New York and London.

Candidate Profile:

Experience:
- 15+ years of progressive experience in program or project management within the capital markets and financial services sectors.
- Demonstrated expertise in SQL Server, Snowflake, Power BI, ETL processes, and Azure cloud data platforms.
- Hands-on experience with big data technologies and modern data architecture.
- Proven track record in delivering projects emphasizing data quality, integrity, and accuracy.
- Deep understanding of private markets, including areas such as private equity, private credit, CLOs, compliance, and regulations governing alternative investments.

Leadership & Collaboration:
- Exceptional problem-solving skills and decision-making abilities in high-pressure, dynamic environments.
- Experience leading multi-disciplinary teams to deliver large-scale data initiatives.
- Strong client engagement and communication skills, fostering alignment and trust with stakeholders.

Preferred Certifications:
- Relevant certifications (e.g., CFA, Snowflake Certified Architect, or Microsoft Power BI Certified).

Education:
- Bachelor's degree in Computer Science, IT, Finance, Economics, or a related discipline.
- Advanced degrees (MBA, MS in Computer Science, or related fields) preferred.

Interview Process:
- Initial recruiter call.
- Interview with the technical team and the delivery and account leadership team at ThoughtFocus.
- Interview with the client stakeholders.
- Final HR discussion.
Posted 1 month ago
0.0 - 5.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer - DBT (Data Build Tool)
Experience: 0-5 Years
Location: Bengaluru

Job Responsibilities:
- Assist in the design and implementation of a Snowflake-based analytics solution (data lake and data warehouse) on AWS.
- Contribute to requirements definition, source data analysis and profiling, the logical and physical design of the data lake and data warehouse, and the design of data integration and publication pipelines.
- Develop Snowflake deployment and usage best practices.
- Help educate the rest of the team on the capabilities and limitations of Snowflake.
- Build and maintain data pipelines adhering to suggested enterprise architecture principles and guidelines.
- Design, build, test, and maintain data management systems.
- Work in sync with internal and external team members such as data architects, data scientists, and data analysts to handle all sorts of technical issues.
- Act as a technical leader within the team.
- Work in an Agile/Lean model and deliver quality deliverables on time.
- Translate complex functional requirements into technical solutions.

Expertise and Qualifications

Essential skills, education, and experience:
- B.E./B.Tech./MCA or equivalent degree, along with 4-7 years of experience in data engineering.
- Strong experience with DBT concepts such as model building and configurations, incremental load strategies, macros, and DBT tests (see the sketch below).
- Strong experience in SQL.
- Strong experience in AWS.
- Creation and maintenance of an optimal data pipeline architecture for the ingestion and processing of data.
- Creation of the necessary infrastructure for ETL jobs from a wide range of data sources using Talend, DBT, S3, and Snowflake.
- Experience with data storage technologies such as Amazon S3, SQL, and NoSQL.
- Data modeling technical awareness.
- Experience working with stakeholders in different time zones.

Good to have:
- AWS data services development experience.
- Working knowledge of big data technologies.
- Experience collaborating with data quality and data governance teams.
- Exposure to reporting tools such as Tableau.
- Apache Airflow, Apache Kafka.
- Payments domain knowledge: in-depth understanding of CRM, accounting, etc.
- Regulatory reporting exposure.

Other skills:
- Good communication skills; team player; problem solver.
- Willing to learn new technologies, share ideas, and assist other team members as needed.
- Strong analytical and problem-solving skills; ability to define problems, collect data, establish facts, and draw conclusions.
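To illustrate the DBT incremental-load concept mentioned above, here is a minimal sketch written as a dbt Python model (supported in dbt 1.3+ on Snowflake, where Python models run on Snowpark); the model, column, and key names are hypothetical, and the same pattern is more commonly written as a SQL model.

```python
# models/orders_incremental.py -- hypothetical dbt *Python* model (dbt >= 1.3
# on Snowflake) illustrating an incremental load strategy. `dbt` and
# `session` are injected by dbt; `session` is a Snowpark session.
def model(dbt, session):
    dbt.config(materialized="incremental", unique_key="order_id")

    df = dbt.ref("stg_orders")  # hypothetical upstream staging model

    if dbt.is_incremental:
        # On incremental runs, keep only rows newer than what is already
        # loaded into this model's table (dbt.this).
        max_loaded = session.sql(
            f"select max(updated_at) from {dbt.this}"
        ).collect()[0][0]
        if max_loaded is not None:
            df = df.filter(df["updated_at"] > max_loaded)

    return df
```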
Posted 1 month ago
3.0 - 4.0 years
1 - 5 Lacs
Bengaluru
Work from Office
Job Title: Data Quality Engineer
Experience: 3-4 Years
Location: Bangalore

We are seeking a detail-oriented and highly motivated Data Quality Engineer to join our growing data team. In this role, you will be responsible for designing, implementing, and maintaining data quality frameworks to ensure the accuracy, completeness, consistency, and reliability of enterprise data. You will work closely with business stakeholders, data stewards, and data engineers to enforce data governance policies and utilize tools like Ataccama to support enterprise data quality initiatives. We only need immediate joiners.

Key Responsibilities:
- Design and implement robust data quality frameworks and rules using Ataccama ONE or similar data quality tools.
- Develop automated data quality checks and validation routines to proactively detect and remediate data issues.
- Collaborate with business and technical teams to define data quality metrics, thresholds, and standards.
- Support the data governance strategy by identifying critical data elements and ensuring alignment with organizational policies.
- Monitor, analyze, and report on data quality trends, providing insights and recommendations for continuous improvement.
- Work with data stewards to resolve data issues and ensure adherence to data quality best practices.
- Support metadata management, data lineage, and data profiling activities.
- Document processes, data flows, and data quality rules to facilitate transparency and reproducibility.
- Conduct root cause analysis on data issues and implement corrective actions to prevent recurrence.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Information Systems, Data Science, or a related field.
- 3+ years of experience in a Data Quality, Data Governance, or Data Engineering role.
- Hands-on experience with Ataccama ONE or similar data quality tools, including rule creation, data profiling, and issue management.
- Strong knowledge of data governance frameworks, principles, and best practices.
- Proficiency in SQL and data analysis, with the ability to query complex datasets.
- Experience with data management platforms and enterprise data ecosystems.
- Excellent problem-solving skills and attention to detail.
- Strong communication and stakeholder engagement skills.

Preferred Qualifications:
- Experience with cloud data platforms (e.g., Snowflake, AWS, Azure).
- Familiarity with data catalog tools (e.g., Collibra, Alation).
- Knowledge of industry data standards and regulatory requirements (e.g., GDPR, HIPAA).
Posted 1 month ago
4.0 - 8.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Sustainability Data Quality Manager

Duties & responsibilities:
The Sustainability Data Quality Manager will join JLL's Sustainability Data and Reporting team to support our data management, platform, compliance, and reporting functions. The role will report to the team's Regional Lead but work across our global client portfolio and multiple stakeholder groups to deliver regular data quality analytics reviews and reports, coordinating the resolution of issues found. The position is required to work collaboratively across internal global business lines, including JLL's Client Account, Technology, and Operations teams, to help manage stakeholder expectations and maintain high-quality service delivery. The candidate will have experience in delivering multiple programs of work in parallel and applying a strategy that learns from and leverages challenges and opportunities observed across the board.

The role will be responsible for several tasks, including:
- Partnering with business and technology teams to design, implement, and optimize data quality processes that support business operations and analytics.
- Developing a detailed understanding of key tracking and reporting platforms, including internal tools and how we support our clients in measuring their sustainability performance.
- Coordinating and managing adherence to the QA/QC process for key business groups.
- Identifying, assessing, and documenting data quality issues and their impact on business operations.
- Developing a detailed understanding of the data structures within client data.

Performance objectives:
- Actively manage concurrent projects, with a strong talent for project coordination.
- Regularly communicate in a clear and non-technical way to internal JLL users.
- Be an integral part of the data and reporting team, completing a full review of data quality practices, identifying trends and potential issues, and communicating with others to implement changes and improvements as necessary.
- Identify support, training, and management processes that can be improved to increase scalability and efficiency.

Key skills:
- Ability to see patterns and trends within and across large data sets, applying an understanding of sustainability performance.
- Strong organizational and analytical skills; process-driven, with an orientation toward continuous improvement.
- Ability to clearly identify issues with data and raise them with the appropriate stakeholder.
- Ability to meet milestone dates and raise concerns early and often.
- Able to determine root causes of data discrepancies and recommend long-term solutions.

Candidate specification:
- 5+ years' experience in a similar role.
- High proficiency in Microsoft Excel and data management.
- Knowledge of other analytical tools such as Power BI, Tableau, Python, or SQL.
- Excellent communication skills, including the ability to identify and describe data anomalies and provide solutions accordingly.
- Lateral thinking and problem-solving skills.
- Ability to multi-task and manage priorities to meet deadlines.
- Familiarity with sustainability and carbon emissions reporting is a strong advantage.
- Project management experience is an advantage.
- This role requires high attention to detail and a strong process-driven approach.

Location: On-site - Bengaluru, KA / Mumbai, MH
Scheduled Weekly Hours: 40

If this job description resonates with you, we encourage you to apply even if you don't meet all of the requirements. We're interested in getting to know you and what you bring to the table!

JLL Privacy Notice: Jones Lang LaSalle (JLL), together with its subsidiaries and affiliates, is a leading global provider of real estate and investment management services. We take our responsibility to protect the personal information provided to us seriously. Generally, the personal information we collect from you is for the purposes of processing in connection with JLL's recruitment process. We endeavour to keep your personal information secure with an appropriate level of security and keep it for as long as we need it for legitimate business or legal reasons, after which we will delete it safely and securely. For candidates in the United States, please see a full copy of our Equal Employment Opportunity and Affirmative Action policy. JLL is an Equal Opportunity Employer and is committed to working with and providing reasonable accommodations to individuals with disabilities. If you need a reasonable accommodation because of a disability for any part of the employment process, including the online application and/or overall selection process, you may contact us at Accommodation Requests. This email is only to request an accommodation; please direct any other general recruiting inquiries to our Contact Us page.
Posted 1 month ago
3.0 - 6.0 years
9 - 14 Lacs
Mumbai
Work from Office
Role Overview:
We are looking for a Talend Data Catalog Specialist to drive enterprise data governance initiatives by implementing Talend Data Catalog and integrating it with Apache Atlas for unified metadata management within a Cloudera-based data lakehouse. The role involves establishing metadata lineage, glossary harmonization, and governance policies to enhance trust, discovery, and compliance across the data ecosystem.

Key Responsibilities:
- Set up and configure Talend Data Catalog to ingest and manage metadata from source systems, the data lake (HDFS), Iceberg tables, the Hive metastore, and external data sources.
- Develop and maintain business glossaries, data classifications, and metadata models.
- Design and implement bi-directional integration between Talend Data Catalog and Apache Atlas to enable metadata synchronization, lineage capture, and policy alignment across the Cloudera stack (see the sketch below).
- Map technical metadata from Hive/Impala to business metadata defined in Talend.
- Capture end-to-end lineage of data pipelines (e.g., from ingestion in PySpark to consumption in BI tools) using Talend and Atlas.
- Provide impact analysis for schema changes, data transformations, and governance rule enforcement.
- Support the definition and rollout of enterprise data governance policies (e.g., ownership, stewardship, access control).
- Enable role-based metadata access, tagging, and data sensitivity classification.
- Work with data owners, stewards, and architects to ensure data assets are well documented, governed, and discoverable.
- Provide training to users on leveraging the catalog for search, understanding, and reuse.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- 6-12 years in data governance or metadata management, with at least 2-3 years in Talend Data Catalog.
- Talend Data Catalog, Apache Atlas, Cloudera CDP, Hive/Impala, Spark, HDFS, SQL.
- Business glossary, metadata enrichment, lineage tracking, stewardship workflows.
- Hands-on experience in Talend-Atlas integration, either through REST APIs, Kafka hooks, or metadata bridges.
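As an illustration of the REST-API style of Talend-Atlas integration mentioned above, here is a minimal sketch that registers a table entity through Apache Atlas's v2 REST API; the host, credentials, entity type choice, and attribute values are hypothetical placeholders.

```python
# Hypothetical sketch of pushing a table entity into Apache Atlas via its
# v2 REST API -- the kind of call a catalog-to-Atlas metadata bridge might
# make. Host, credentials, and attribute values are placeholders.
import requests

ATLAS = "http://atlas.example.com:21000"

entity = {
    "entity": {
        "typeName": "hive_table",
        "attributes": {
            "qualifiedName": "sales.orders@cluster1",  # Atlas unique name
            "name": "orders",
            "description": "Curated orders table registered from the catalog",
        },
    }
}

resp = requests.post(
    f"{ATLAS}/api/atlas/v2/entity",
    json=entity,
    auth=("admin", "***"),  # placeholder credentials
    timeout=30,
)
resp.raise_for_status()
print(resp.json().get("guidAssignments"))  # GUIDs Atlas assigned to the entity
```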
Posted 1 month ago
2.0 - 6.0 years
6 - 11 Lacs
Hyderabad
Work from Office
As a senior SAP Consultant, you will serve as a client-facing practitioner, working collaboratively with clients to deliver high-quality solutions, and be a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.

Your primary responsibilities include:
- Strategic SAP solution focus: working across the technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
- Comprehensive solution delivery: involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Overall 5-12 years of relevant experience in SAP BODS/BOIS/SDI/SDQ, and 3+ years of SAP functional experience specializing in the design and configuration of SAP BODS/HANA SDI modules.
- Experience gathering business requirements; able to create requirement specifications based on architecture, design, and detailing of processes.
- Able to prepare mapping sheets combining functional and technical expertise.
- Data migration experience from different legacy systems to SAP or non-SAP systems, including migration from SAP ECC to S/4HANA using Migration Cockpit or other methods.
- In addition to data migration experience, experience with, or strong knowledge of, BOIS (BO Information Steward) for data profiling and data governance.

Preferred technical and professional experience:
- BODS administration experience or knowledge.
- Working or strong knowledge of SAP Data Hub.
- Experience with, or strong knowledge of, HANA SDI (Smart Data Integration) as an ETL tool, with the ability to develop flowgraphs to validate and transform data.
- Ability to develop workflows and data flows based on specifications using various stages in BODS.
Posted 1 month ago
4.0 - 9.0 years
9 - 13 Lacs
Mumbai
Work from Office
We are seeking a skilled Python Developer with expertise in Django, Flask, and API development to join our growing team. The Python Developer will be responsible for designing and implementing backend services, APIs, and integrations that power our core platform. The ideal candidate should have a strong foundation in Python programming, experience with the Django and/or Flask frameworks, and a proven track record of delivering robust and scalable solutions.

Primary Skill Responsibilities:
- Design, develop, and maintain backend services and APIs using Python frameworks such as Django and Flask (see the sketch below).
- Collaborate with front-end developers, product managers, and stakeholders to translate business requirements into technical solutions.
- Build and integrate RESTful APIs for seamless communication between our applications and external services.

Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent experience.
- 5+ years of professional experience as a Python Developer, with a focus on backend development.

Secondary Skills: Amazon Elastic File System (EFS), Amazon Redshift, Amazon S3, Apache Spark, Ataccama DQ Analyzer, AWS Apache Airflow, AWS Athena, Azure Data Factory, Azure Data Lake Storage Gen2 (ADLS), Azure Databricks, Azure Event Hub, Azure Stream Analytics, Azure Synapse Analytics, BigID, C++, Cloud Storage, Collibra Data Governance (DG), Collibra Data Quality (DQ), Data Lake Storage, Data Vault Modeling, Databricks, DataProc, DDI, Dimensional Data Modeling, EDC AXON, Electronic Medical Record (EMR), Extract, Transform & Load (ETL), Financial Services Logical Data Model (FSLDM), Google Cloud Platform (GCP) BigQuery, Google Cloud Platform (GCP) Bigtable, Google Cloud Platform (GCP) Dataproc, HQL, IBM InfoSphere Information Analyzer, IBM Master Data Management (MDM), Informatica Data Explorer, Informatica Data Quality (IDQ), Informatica Intelligent Data Management Cloud (IDMC), Informatica Intelligent MDM SaaS, Inmon methodology, Java, Kimball Methodology, Metadata Encoding & Transmission Standards (METS), Metasploit, Microsoft Excel, Microsoft Power BI, NewSQL, NoSQL, OpenRefine, OpenVAS, Performance Tuning, Python, R, RDD Optimization, SAS, SQL, Tableau, Tenable Nessus, TIBCO Clarity.
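As a small illustration of the backend API work described above, here is a minimal Flask sketch of a RESTful resource; the route, payload shape, and in-memory store are hypothetical stand-ins for a real service.

```python
# Minimal sketch of a RESTful backend endpoint in Flask (2.x).
# The resource name and in-memory store are illustrative placeholders.
from flask import Flask, jsonify, request

app = Flask(__name__)
_policies = {}  # in-memory stand-in for a real database


@app.post("/api/policies")
def create_policy():
    payload = request.get_json(force=True)
    policy_id = len(_policies) + 1
    _policies[policy_id] = {"id": policy_id, **payload}
    return jsonify(_policies[policy_id]), 201


@app.get("/api/policies/<int:policy_id>")
def get_policy(policy_id: int):
    policy = _policies.get(policy_id)
    if policy is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(policy)


if __name__ == "__main__":
    app.run(debug=True)  # development server only
```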
Posted 1 month ago