2326 Data Governance Jobs - Page 29

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

7.0 - 8.0 years

5 - 8 Lacs

Bengaluru

Remote

Employment Type: Contract (Remote)

Job Summary: We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and working in distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven with the ability to work independently in a remote setup.
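The SQL expectations in this posting (complex joins, window functions) can be illustrated with a minimal sketch; here it uses Python's built-in sqlite3 module (which supports window functions since SQLite 3.25), and the table and column names are invented for illustration, not taken from the role.

```python
import sqlite3

# Hypothetical sales table; names and figures are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("South", "2024-01", 100.0), ("South", "2024-02", 150.0),
     ("North", "2024-01", 80.0), ("North", "2024-02", 120.0)],
)

# A typical window-function pattern: a running total per region.
rows = conn.execute("""
    SELECT region, month, amount,
           SUM(amount) OVER (
               PARTITION BY region ORDER BY month
           ) AS running_total
    FROM sales
    ORDER BY region, month
""").fetchall()

for r in rows:
    print(r)
```

The `PARTITION BY` clause restarts the cumulative sum for each region, the kind of query this role would tune and modularize.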

Posted 1 week ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Mumbai

Work from Office

Position Overview: We are seeking an experienced Data Catalog Lead to drive the implementation and ongoing development of an enterprise data catalog using Collibra. This role focuses specifically on healthcare payer industry requirements, including complex regulatory compliance, member data privacy, and multi-system data integration challenges unique to health plan operations.

Key Responsibilities:

Data Catalog Implementation & Development:
- Configure and customize Collibra workflows, data models, and governance processes to support health plan business requirements.
- Develop automated data discovery and cataloging processes for healthcare data assets including claims, eligibility, provider networks, and member information.
- Design and implement data lineage tracking across complex healthcare data ecosystems spanning core administration systems, data warehouses, and analytics platforms.

Healthcare-Specific Data Governance:
- Build specialized data catalog structures for healthcare data domains including medical coding systems (ICD-10, CPT, HCPCS), pharmacy data (NDC codes), and provider taxonomies.
- Configure data classification and sensitivity tagging for PHI (Protected Health Information) and PII data elements in compliance with HIPAA requirements.
- Implement data retention and privacy policies within Collibra that align with healthcare regulatory requirements and member consent management.
- Develop metadata management processes for regulatory reporting datasets (HEDIS, Medicare Stars, MLR reporting, risk adjustment).

Technical Integration & Automation:
- Integrate Collibra with healthcare payer core systems including claims processing platforms, eligibility systems, provider directories, and clinical data repositories.
- Implement automated data quality monitoring and profiling processes that populate the data catalog with technical and business metadata.
- Configure Collibra's REST APIs to enable integration with existing data governance tools and business intelligence platforms.

Required Qualifications:

Collibra Platform Expertise:
- 8+ years of hands-on experience with Collibra Data Intelligence Cloud platform implementation and administration.
- Expert knowledge of Collibra's data catalog, data lineage, and data governance capabilities.
- Proficiency in Collibra workflow configuration, custom attribute development, and role-based access control setup.
- Experience with Collibra Connect for automated metadata harvesting and system integration.
- Strong understanding of Collibra's REST APIs and custom development capabilities.

Healthcare Payer Industry Knowledge:
- 4+ years of experience working with healthcare payer/health plan data environments.
- Deep understanding of healthcare data types including claims (professional, institutional, pharmacy), eligibility, provider data, and member demographics.
- Knowledge of healthcare industry standards including HL7, X12 EDI transactions, and FHIR specifications.
- Familiarity with healthcare regulatory requirements (HIPAA, ACA, Medicare Advantage, Medicaid managed care).
- Understanding of healthcare coding systems (ICD-10-CM/PCS, CPT, HCPCS, NDC, SNOMED CT).

Technical Skills:
- Strong SQL skills and experience with healthcare databases (claims databases, clinical data repositories, member systems).
- Knowledge of cloud platforms (AWS, Azure, GCP) and their integration with Collibra cloud services.
- Understanding of data modeling principles and healthcare data warehouse design patterns.

Data Governance & Compliance:
- Experience implementing data governance frameworks in regulated healthcare environments.
- Knowledge of data privacy regulations (HIPAA, state privacy laws) and their implementation in data catalog tools.
- Understanding of data classification, data quality management, and master data management principles.
- Experience with audit trail requirements and compliance reporting in healthcare organizations.

Preferred Qualifications:

Advanced Healthcare Experience:
- Experience with specific health plan core systems (such as HealthEdge, Facets, QNXT, or similar platforms).
- Knowledge of Medicare Advantage, Medicaid managed care, or commercial health plan operations.
- Understanding of value-based care arrangements and their data requirements.
- Experience with clinical data integration and population health analytics.

Technical Certifications & Skills:
- Collibra certification (Data Citizen, Data Steward, or Technical User).
- Experience with additional data catalog tools (Alation, Apache Atlas, IBM Watson Knowledge Catalog).
- Knowledge of data virtualization tools and their integration with data catalog platforms.
- Experience with healthcare interoperability standards and API management.

Posted 1 week ago

Apply

5.0 - 10.0 years

7 - 11 Lacs

Hyderabad

Work from Office

As a Senior SAP Consultant, you will serve as a client-facing practitioner, working collaboratively with clients to deliver high-quality solutions and acting as a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and its associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.

Your primary responsibilities include:
- Strategic SAP Solution Focus: Working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
- Comprehensive Solution Delivery: Involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Overall 5-12 years of relevant experience in SAP BODS/BOIS/SDI/SDQ and 3+ years of SAP functional experience specializing in the design and configuration of SAP BODS/HANA SDI modules.
- Experience gathering business requirements; able to create requirement specifications based on architecture, design, and detailing of processes.
- Able to prepare mapping sheets combining functional and technical expertise.
- All BODS consultants should primarily have data migration experience from different legacy systems to SAP or non-SAP systems, including data migration from SAP ECC to S/4HANA using the Migration Cockpit or other methods.
- In addition to data migration experience, consultants should have experience or strong knowledge of BOIS (BO Information Steward) for data profiling and data governance.

Preferred technical and professional experience:
- BODS administration experience/knowledge.
- Working or strong knowledge of SAP Data Hub.
- Experience with or strong knowledge of HANA SDI (Smart Data Integration) as an ETL tool, including the ability to develop flowgraphs to validate and transform data.
- Ability to develop workflows and data flows based on specifications using various stages in BODS.

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office

Backend Rebate Claim Process and Inventory Management:
- Comprehensive understanding of the point of sale (POS) and backend rebate processes.
- Ensure timely and accurate receipt of POS files through various channels (EDI, manual).
- Perform basic data reconciliations promptly to ensure data completeness across distributors.
- Highlight discrepancies to distributors and track missing products (e.g., serial numbers).
- Track rebates and discounts, updating them in the system for accurate margins and revenue reporting.
- Report on various regular and ad-hoc items related to product sales.
- Ensure data integrity through error correction in various input files (POS, rebate, etc.).
- Distribute weekly POS and claim errors to relevant teams.
- Perform inventory reconciliation and understand the virtual inventory model.

Distributor Master Data Governance, Business Model Review & Approval, Process, and ID Changes:
- Manage business models for new and existing distributors, including name changes, tax ID changes, and address changes.
- Oversee new business models and activities.
- Create CLM templates for distributor account changes.
- Maintain credit party relationships (Sold To - Bill To), interbranch relationships, and sales locations (authorized distributor non-stocking locations in specific countries) as they pertain to distributors.
- Serve as the primary support contact for distributor or distribution-related queries.
- Engage cross-functionally with Customer Master Data, Theatre Sales Support, SFDC, Legal, OM, Logistics, AR, Global Service, Compliance, and GTS.
- Maintain the verified Sold To and Ship To list for distributors.
- Provide input to the process owner for SOX control related to Distributor Ship To changes.
- Maintain a centralized Tcode list and CMR/DMR templates.

Qualifications:
- Minimum of 3-5 years of experience in business unit financial support; prior experience in a channel sales environment dealing with distributors and channel partners preferred.
- Strong interpersonal skills necessary to work effectively across multiple functions and levels of management.
- Outstanding verbal and written communication skills across multiple functions and levels of the organization.
- Project and program management experience.

Education: Degree in Finance, Accounting, or a related field; MBA preferred but not required.
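The basic POS reconciliation described above can be sketched in a few lines of Python; the field names and serial numbers below are hypothetical, purely for illustration of the matched/missing/unexpected breakdown.

```python
# Minimal sketch of a POS-file reconciliation across distributors.
# Field names and data are hypothetical, not from any real feed.

def reconcile_pos(expected_serials, received_rows):
    """Compare expected serial numbers against those in a received POS file.

    Returns (matched, missing, unexpected) as sorted lists.
    """
    received_serials = {row["serial"] for row in received_rows}
    expected = set(expected_serials)
    matched = sorted(expected & received_serials)
    missing = sorted(expected - received_serials)       # to chase with the distributor
    unexpected = sorted(received_serials - expected)    # to flag for error correction
    return matched, missing, unexpected

# Example: one distributor's file is missing serial "SN-003".
expected = ["SN-001", "SN-002", "SN-003"]
received = [{"serial": "SN-001"}, {"serial": "SN-002"}, {"serial": "SN-999"}]
matched, missing, unexpected = reconcile_pos(expected, received)
print(missing)      # serials to follow up on with the distributor
print(unexpected)   # serials not on the expected list
```

Real reconciliations would also compare quantities and rebate amounts, but the set-difference pattern is the core of the discrepancy report.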

Posted 1 week ago

Apply

4.0 - 10.0 years

6 - 12 Lacs

Bengaluru

Work from Office

Job Title: Subject Matter Expert - Microsoft Purview Data Governance
Location: India (preferred)

Job Summary: We are seeking experienced and detail-oriented Subject Matter Experts (SMEs) to lead the implementation of sensitive data discovery and labeling initiatives across SharePoint Online and OneDrive environments. The ideal candidate will have deep expertise in Microsoft Purview and a strong understanding of data governance, compliance, and information protection frameworks.

Key Responsibilities:
- Design, configure, and execute sensitive data discovery scans using Microsoft Purview.
- Develop and apply sensitivity labels and retention labels to files stored in SharePoint Online and OneDrive.
- Collaborate with compliance, security, and IT teams to define data classification policies and labeling taxonomy.
- Monitor and fine-tune scanning rules and policies to ensure accurate detection of sensitive information (e.g., PII, PHI, financial data).
- Generate reports and dashboards to track labeling coverage, policy effectiveness, and compliance posture.
- Provide training and documentation to stakeholders on labeling practices and Purview capabilities.
- Stay current with Microsoft 365 compliance features and recommend improvements to data governance strategies.

Required Qualifications:
- Proven experience with Microsoft Purview (formerly Microsoft Information Protection & Compliance).
- Strong knowledge of the Microsoft 365 compliance center, sensitivity labels, data loss prevention (DLP), and retention policies.
- Hands-on experience with SharePoint Online and OneDrive for Business administration.
- Familiarity with regulatory compliance standards (e.g., GDPR, HIPAA, ISO 27001).
- Ability to interpret and implement data classification frameworks.
- Excellent communication and documentation skills.

Preferred Qualifications:
- Microsoft certifications such as SC-400, MS-500, or SC-900.
- Experience with PowerShell scripting for automation and reporting.
- Background in data privacy, cybersecurity, or information security.

Posted 1 week ago

Apply

10.0 - 15.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Tech Mahindra Ltd. is looking for a Data Architect - Data Governance - Collibra & Purview to join our dynamic team and embark on a rewarding career journey. A Data Architect is a professional responsible for designing, building, and maintaining an organization's data architecture. The role involves:
1. Designing and implementing data models, data integration solutions, and data management systems that ensure data accuracy, consistency, and security.
2. Developing and maintaining data dictionaries, metadata, and data lineage documents to ensure data governance and compliance.
3. A strong technical background in data architecture and management, as well as excellent communication skills.
4. Strong problem-solving skills and the ability to think critically, which are essential for identifying and implementing solutions to complex data issues.

Posted 1 week ago

Apply

2.0 - 6.0 years

5 - 7 Lacs

Ahmedabad, Aurangabad

Work from Office

Job Title: Alteryx Engineer
Location: Bangalore/Mumbai/Ahmedabad/Aurangabad
Experience Required: 2-5 years
Domain: Manufacturing

Job Description: We are seeking a highly skilled Alteryx Engineer with 2-5 years of experience, specifically within the manufacturing domain, to join our dynamic team. The ideal candidate will have a strong background in data preparation, blending, and advanced analytics, coupled with practical experience in the manufacturing industry. This role involves designing, developing, and deploying robust Alteryx workflows to automate data processes, generate insights, and support strategic business decision-making.

Key Responsibilities:
- Workflow Development: Design, develop, and maintain efficient and scalable Alteryx workflows to extract, transform, and load (ETL) data from various sources, ensuring data quality and integrity.
- Data Blending & Transformation: Perform complex data blending, cleansing, and transformation operations using Alteryx Designer to prepare data for analysis and reporting.
- Automation: Implement and manage automated data pipelines and analytical processes using Alteryx Server to streamline data delivery and reduce manual efforts.
- Data Analysis: Analyze complex datasets within Alteryx to identify trends, patterns, and insights that drive strategic decisions and operational improvements.
- Data Integration: Work with diverse data sources, including SAP, flat files (Excel, CSV), APIs, and other enterprise systems, to ensure accurate and timely data availability within Alteryx workflows.
- Collaboration: Collaborate closely with business stakeholders, including production, supply chain, and quality assurance teams, to gather requirements, understand their data needs, and translate them into effective Alteryx solutions.
- Reporting & Output: Configure Alteryx workflows to generate various outputs, including data extracts for reporting tools, analytical datasets, and automated reports.
- Troubleshooting: Diagnose, resolve, and optimize issues related to Alteryx workflows, data connections, and performance promptly.

Required Skills:
- Experience: 2-5 years of hands-on experience in Alteryx workflow development, data preparation, and automation, with a strong focus on the manufacturing domain.
- Technical Proficiency: Strong proficiency in Alteryx Designer for building complex analytical workflows. Experience with Alteryx Server for deploying, scheduling, and managing workflows is highly desirable.
- Data Management: Hands-on experience with SQL and relational databases for querying, data extraction, and understanding database structures. Experience extracting and integrating data from SAP systems using Alteryx connectors or other relevant methods is crucial.
- Analytical Skills: Strong analytical and problem-solving skills with the ability to interpret complex data, identify root causes, and provide actionable insights.
- Communication: Excellent communication skills with the ability to present complex technical information clearly to both technical and non-technical audiences.
- Problem-Solving: Proven ability to troubleshoot issues, optimize workflow performance, and resolve data-related challenges effectively in a fast-paced environment.
- Domain Knowledge: Familiarity with manufacturing processes, operational metrics, supply chain data, and key performance indicators (KPIs) is highly desirable.

Preferred Skills:
- Alteryx Certification (e.g., Alteryx Designer Core, Advanced, or Expert) is a significant plus.
- Knowledge of other BI tools (e.g., Tableau, Power BI) or data analysis techniques and programming languages (e.g., Python, R) for advanced analytics is advantageous.
- Experience with data governance and best practices in Alteryx development.
- Direct experience with SAP modules relevant to manufacturing (e.g., FICO, Production Planning, Materials Management, Sales and Distribution) is a strong asset.

Posted 1 week ago

Apply

7.0 - 12.0 years

22 - 27 Lacs

Pune

Hybrid

Job Title: AVP Data Designer
Location: Pune
Package: up to 27 LPA

Key Responsibilities:
- Translate business data needs into scalable data models, schemas, and flows.
- Lead the design and implementation of logical and physical data models across platforms.
- Conduct data profiling and quality analysis to ensure data integrity.
- Collaborate with cross-functional teams to define data requirements and ensure smooth integration with existing systems.
- Maintain and update metadata, data dictionaries, and design specifications.
- Support the bank's Data & Analytics strategy by enabling use-case-driven data solutions.
- Ensure data solutions comply with governance, risk, and security frameworks.
- Optimize data structures for performance, scalability, and business insight generation.

Must-Have Skills:
- 5-8 years of experience in data design, data modeling, or data architecture.
- Proficiency in SQL and working with databases like Oracle, MySQL, and SQL Server.
- Hands-on experience with Kafka, AWS, or other cloud/data streaming platforms.
- Strong understanding of data profiling, quality checks, and remediation.
- Excellent communication skills; able to work with both technical and non-technical teams.

Nice-to-Have:
- Bachelor's degree in Data Science, Computer Science, or a related field.
- Knowledge of data warehousing and ETL concepts.
- Experience in the financial services or financial crime domain.
- Familiarity with data governance tools and frameworks.
- Exposure to tools like Power BI, Tableau, or data catalog platforms.

For more details, call Kanika on 9953939776 or email your resume to kanika@manningconsulting.in.

Posted 2 weeks ago

Apply

7.0 - 12.0 years

30 - 45 Lacs

Bengaluru

Work from Office

Experience required: 7+ years with a data governance Informatica tool.

Key Responsibilities:
- Data Governance Framework Development: Develop, implement, and maintain data governance frameworks, policies, and standards to ensure high-quality, consistent, and secure data across the organization. Collaborate with business units and stakeholders to define and enforce data governance policies, ensuring alignment with business goals and regulatory requirements.
- Data Quality Management: Define and enforce data quality standards, monitoring key data quality metrics. Identify, analyze, and resolve data quality issues across various data sources and platforms. Work with cross-functional teams to implement data quality improvement initiatives.
- Data Lineage & Metadata Management: Implement and maintain data lineage and metadata management solutions to ensure visibility and traceability of data throughout its lifecycle. Work with data architects and engineers to establish and document data flows, transformations, and dependencies.
- Data Security & Compliance: Ensure that data governance practices comply with relevant regulatory requirements (e.g., GDPR, CCPA, HIPAA). Implement data security controls to protect sensitive data and manage access to sensitive information.
- Stakeholder Collaboration: Partner with data architects, data engineers, data scientists, and business analysts to ensure alignment between technical and business needs for data governance. Provide training and support for teams on data governance policies, best practices, and tools.
- Data Governance Tools & Technologies: Lead the implementation and optimization of data governance tools and platforms. Continuously evaluate emerging tools and technologies to improve data governance processes.
- Reporting & Documentation: Develop and maintain comprehensive data governance documentation and reports. Provide regular updates to senior management on the status of data governance initiatives, risks, and areas for improvement.

Requirements:
- Experience: 7+ years of experience in data governance, data management, or related fields. Proven track record in implementing data governance frameworks and policies at an enterprise level. In-depth knowledge of data governance concepts, including data quality, data lineage, metadata management, and data security.
- Technical Skills: Experience with data governance tools such as Collibra, Informatica, Alation, or similar. Strong understanding of databases, data warehousing, and big data platforms (e.g., Hadoop, Spark). Familiarity with data integration, ETL processes, and data modeling. Proficiency in SQL and other scripting languages (e.g., Python, Shell).
- Regulatory Knowledge: Solid understanding of data privacy and compliance regulations (GDPR, CCPA, HIPAA, etc.). Ability to assess and mitigate compliance risks related to data handling.
- Soft Skills: Excellent communication and interpersonal skills. Strong problem-solving skills and the ability to collaborate across teams. Ability to manage multiple projects and deadlines in a fast-paced environment.

Posted 2 weeks ago

Apply

2.0 - 7.0 years

7 - 17 Lacs

Chennai

Work from Office

Desired Qualifications / Required Experience and Skills:

Notice Period: 0 to 30 days

Domain Expertise:
- 2-10 years of hands-on experience in sanctions screening framework development, tuning, and validation.
- Data Science/Data Modelling plus experience in ML/AI, OMR Model or OMR Reporting, Quant Modelling or Quantitative Analysis, and Sanctions Screening/Transaction Monitoring/Financial Crime experience is a must.
- Familiarity with leading screening platforms (e.g., FircoSoft, Bridger, Actimize, Oracle Watchlist Screening) and list management practices.
- In-depth understanding of global sanctions regimes (OFAC, EU, UN, HMT) and related regulatory expectations.
- Experience in integrating sanctions screening models with broader AML/CFT frameworks.
- Exposure to AI/ML techniques for entity resolution or fuzzy matching optimization.
- Prior involvement in regulatory examinations or independent validations of screening tools.

Technical Proficiency:
- Strong programming and scripting skills (Python, R, SQL, SAS).
- Experience in data modeling, scoring logic calibration, and large-scale dataset analysis.

Analytical Thinking:
- Ability to conduct root cause analysis on alert quality issues.
- Strong quantitative and qualitative problem-solving capabilities.

Communication:
- Strong written and verbal communication skills, including the ability to explain technical models to non-technical stakeholders.
- Ability to craft data-backed narratives and present recommendations with clarity.

Education: Bachelor's Degree / Master's Degree

Location and way of working: Base location is Chennai; this profile involves working from the client location.

Interested candidates, please reach out with relevant experience. Email: Gentella.VijayaDurga@adecco.com, Vijaya Durga M, +91 9686640353
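The fuzzy name matching mentioned above can be illustrated with a minimal sketch using Python's standard-library difflib; the watchlist entries and threshold below are invented for illustration and are far simpler than what real screening platforms do (transliteration, aliasing, entity resolution).

```python
from difflib import SequenceMatcher

# Hypothetical watchlist; real list management is much richer.
WATCHLIST = ["John Alexander Doe", "Acme Trading FZE"]

def screen(name, threshold=0.85):
    """Return watchlist entries whose similarity ratio meets the threshold."""
    name_norm = name.lower().strip()
    hits = []
    for entry in WATCHLIST:
        score = SequenceMatcher(None, name_norm, entry.lower()).ratio()
        if score >= threshold:
            hits.append((entry, round(score, 2)))
    return hits

# A slightly misspelled name should still raise an alert;
# an unrelated name should pass cleanly.
alerts = screen("Jon Alexander Doe")
print(alerts)
```

Tuning the threshold is exactly the alert-quality trade-off the posting describes: lower it and false positives rise, raise it and true matches with spelling variants slip through.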

Posted 2 weeks ago

Apply

10.0 - 15.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Job description: Maddisoft has the following immediate opportunity; let us know if you or someone you know would be interested. Full-time role. Send in your updated resume along with your LinkedIn profile ASAP, without which applications will not be considered. Call us NOW!

Job Title: SAP Data Steward
Location: Hyderabad, India

The SAP Data Steward is responsible for creating, maintaining, and deactivating master data and data attributes in SAP, with a focus on data migration. The Data Steward plays an essential role in monitoring existing and new data and ensuring timely, high-quality creation of new data in the system.
- Bachelor's degree, or Associate's degree with an additional 12+ years of work experience, or an equivalent combination of education and experience required.
- Excellent attention to detail and an exceptional interest in creating order and consistency required.
- 10+ years of experience in data management and/or data governance activities and responsibilities.
- Experience working with SAP ECC required.
- Demonstrated expert-level experience and capability with MS Excel required.
- High degree of initiative and ownership, as well as a proven history of delivering results while working with several different departments in a fast-paced environment, required.
- Experience creating and running business reports and data queries preferred.
- Confident user of Microsoft Office (Word, Excel, Outlook, PowerPoint, Teams).
- Experience working with teams across multiple functions.
- Ability to multi-task and work under tight timelines required.
- Excellent verbal and written communication skills.

Posted 2 weeks ago

Apply

4.0 - 9.0 years

7 - 17 Lacs

Chennai

Work from Office

Data Modelling Skillset: Data Science/Data Modelling plus experience in ML/AI, OMR Model or OMR Reporting, Quant Modelling or Quantitative Analysis, and Sanctions Screening/Transaction Monitoring/Financial Crime experience is a must.
Location: Chennai (mandatory); open to candidates willing to relocate to Chennai.
Notice Period: 0 to 30 days
Experience Range: 2 to 10 years (SE to Manager)
Note: interested candidates can call me at 8050295397 or email parul.rathore@adecco.com

Posted 2 weeks ago

Apply

8.0 - 10.0 years

12 - 14 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

We are seeking an SAP MDG Consultant with 8-10 years of experience in SAP MDG (Master Data Governance). The consultant should have a strong techno-functional background, with expertise in MDG data model build and the Business Partner, Finance, and MM domains. The role involves implementing MDG BRF+, managing mass changes, and understanding the Data Replication Framework. Knowledge of data distribution using BD64, partner profiles, and RFCs is required. Exposure to ALE IDocs for master data and debugging skills in SAP (especially ABAP) are a big plus. The candidate should be comfortable with remote work and be willing to travel to Manila in January to run onboarding sessions for the new support team.

Location: Remote, Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad

Posted 2 weeks ago

Apply

8.0 - 10.0 years

10 - 12 Lacs

Bengaluru

Work from Office

Location: Bangalore/Hyderabad/Pune. Experience level: 8+ years. About the Role: We are looking for a technical, hands-on Lead Data Engineer to help drive the modernization of our data transformation workflows. We currently rely on legacy SQL scripts orchestrated via Airflow, and we are transitioning to a modular, scalable, CI/CD-driven DBT-based data platform. The ideal candidate has deep experience with DBT and modern data stack design, and has previously led similar migrations, improving code quality, lineage visibility, performance, and engineering best practices. Key Responsibilities: Lead the migration of legacy SQL-based ETL logic to DBT-based transformations. Design and implement a scalable, modular DBT architecture (models, macros, packages). Audit and refactor legacy SQL for clarity, efficiency, and modularity. Improve CI/CD pipelines for DBT: automated testing, deployment, and code-quality enforcement. Collaborate with data analysts, platform engineers, and business stakeholders to understand current gaps and define future data pipelines. Own Airflow orchestration redesign where needed (e.g., DBT Cloud/API hooks or airflow-dbt integration). Define and enforce coding standards, review processes, and documentation practices. Coach junior data engineers on DBT and SQL best practices. Provide lineage and impact-analysis improvements using DBT's built-in tools and metadata. Must-Have Qualifications: 8+ years of experience in data engineering. Proven success in migrating legacy SQL to DBT, with visible results. Deep understanding of DBT best practices, including model layering, Jinja templating, testing, and packages. Proficiency in SQL performance tuning, modular SQL design, and query optimization. Experience with Airflow (Composer, MWAA), including DAG refactoring and task orchestration. Hands-on experience with modern data stacks (e.g., Snowflake, BigQuery).
Familiarity with data testing and CI/CD for analytics workflows. Strong communication and leadership skills; comfortable working cross-functionally. Nice-to-Have: Experience with DBT Cloud or DBT Core integrations with Airflow. Familiarity with data governance and lineage tools (e.g., dbt docs, Alation). Exposure to Python (for custom Airflow operators/macros or utilities). Previous experience mentoring teams through modern data-stack transitions.
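The model layering this role centres on (staging models feeding marts, resolved in dependency order the way DBT's ref() does) can be sketched in plain Python. This is an illustrative sketch only; the model names and SQL strings are invented, not from the posting or from DBT itself.

```python
# Hypothetical sketch of DBT-style model layering: each "model" declares
# its upstream dependencies, and we resolve a safe build order.
from graphlib import TopologicalSorter

MODELS = {
    "stg_orders": {"deps": [], "sql": "SELECT id, amount FROM raw_orders"},
    "stg_customers": {"deps": [], "sql": "SELECT id, name FROM raw_customers"},
    # A mart model depends on the staging layer, as it would via ref().
    "mart_revenue": {
        "deps": ["stg_orders", "stg_customers"],
        "sql": "SELECT ... FROM stg_orders JOIN stg_customers",
    },
}

def build_order(models):
    # Topological sort guarantees dependencies build before dependents.
    ts = TopologicalSorter({name: m["deps"] for name, m in models.items()})
    return list(ts.static_order())
```

In a real DBT project this ordering comes for free from `ref()` calls in the model SQL; the sketch only shows why a modular, layered architecture makes lineage and impact analysis tractable.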

Posted 2 weeks ago

Apply

4.0 - 5.0 years

3 - 5 Lacs

Kolkata

Remote

Job Title: Data Protection Officer (Contract-Based | Hourly | On-Call). Location: Remote / India (with availability for EU/UK time-zone coordination). Type: Contractual | Hourly Basis | As-needed Engagement. Experience: 4-5 years of relevant experience. Job Summary: We are looking for an experienced and independent Data Protection Officer (DPO) to support our organization in ensuring compliance with the General Data Protection Regulation (GDPR) and other applicable data privacy laws. This is a contract-based, hourly paid position, and the DPO will be engaged on an as-needed basis. The role requires flexibility to provide consultation, conduct reviews, and respond to data protection matters when required. Key Responsibilities: Serve as the point of contact for UK/EU residents, supervisory authorities, and internal teams regarding data protection issues. Identify and evaluate the company's data processing activities. Provide expert advice on conducting Data Protection Impact Assessments (DPIAs). Monitor compliance with GDPR and applicable local data protection laws. Review and advise on data management procedures and internal policies. Offer consultation on incident response and the handling of privacy breaches. Track regulatory changes and provide recommendations to maintain compliance. Maintain and update a register of processing operations, including risk-prone processes for prior checks. Support internal awareness and training initiatives regarding data protection obligations. Requirements: Work experience in data protection and legal compliance is a must. Solid knowledge of GDPR and data protection laws. Ability to handle confidential information.
Ensure that controllers and data subjects are informed about their data protection rights, obligations, and responsibilities, and raise awareness about them. Create a register of processing operations within the institution and notify the EDPS of those that present specific risks (so-called prior checks). Ethical, with the ability to remain impartial and report all non-compliance. Organizational skills with attention to detail. Experience: 4-5 years of expertise in managing international data protection compliance programs and implementing data governance policies, technology compliance standards and programs, and privacy-by-design frameworks. To be successful in this role, you should have in-depth knowledge of GDPR and local data protection laws and be familiar with our industry and the nature of its data processing activities.

Posted 2 weeks ago

Apply

8.0 - 12.0 years

10 - 15 Lacs

Mumbai

Work from Office

Transform raw data into strategic insights. Conduct analyses to support decisions in demand forecasting, SKU profiling, procurement, warehouse management, and logistics. Optimize inventory levels and enhance overall supply chain efficiency. Required Candidate profile Bachelor’s or Master’s degree in Data Science, Business Analytics, or Supply Chain Management. Advanced skills in Excel, SQL, and BI/visualisation tools (e.g., Power BI, Tableau) Oversee MIS platforms

Posted 2 weeks ago

Apply

2.0 - 5.0 years

4 - 5 Lacs

Dhule

Work from Office

The Development Lead will oversee the design, development, and delivery of advanced data solutions using Azure Databricks, SQL, and data visualization tools like Power BI. The role involves leading a team of developers, managing data pipelines, and creating insightful dashboards and reports to drive data-driven decision-making across the organization. The individual will ensure best practices are followed in data architecture, development, and reporting while maintaining alignment with business objectives. Key Responsibilities: Data Integration & ETL Processes: Design, build, and optimize ETL pipelines to manage the flow of data from various sources into data lakes, data warehouses, and reporting platforms. Data Visualization & Reporting: Lead the development of interactive dashboards and reports using Power BI, ensuring that business users have access to actionable insights and performance metrics. SQL Development & Optimization: Write, optimize, and review complex SQL queries for data extraction, transformation, and reporting, ensuring high performance and scalability across large datasets. Azure Cloud Solutions: Implement and manage cloud-based solutions using Azure services (Azure Databricks, Azure SQL Database, Data Lake) to support business intelligence and reporting initiatives. Collaboration with Stakeholders: Work closely with business leaders and cross-functional teams to understand reporting and analytics needs, translating them into technical requirements and actionable data solutions. Quality Assurance & Best Practices: Implement and maintain best practices in development, ensuring code quality, version control, and adherence to data governance standards. Performance Monitoring & Tuning: Continuously monitor the performance of data systems, reporting tools, and dashboards to ensure they meet SLAs and business requirements. 
Documentation & Training: Create and maintain comprehensive documentation for all data solutions, including architecture diagrams, ETL workflows, and data models. Provide training and support to end-users on Power BI reports and dashboards. Required Qualifications: Bachelor's or Master's degree in Computer Science, Information Systems, or a related field. Proven experience as a Development Lead or Senior Data Engineer with expertise in Azure Databricks, SQL, Power BI, and data reporting/visualization. Hands-on experience in Azure Databricks for large-scale data processing and analytics, including Delta Lake, Spark SQL, and integration with Azure Data Lake. Strong expertise in SQL for querying, data transformation, and database management. Proficiency in Power BI for developing advanced dashboards, data models, and reporting solutions. Experience in ETL design and data integration across multiple systems, with a focus on performance optimization. Knowledge of Azure cloud architecture, including Azure SQL Database, Data Lake, and other relevant services. Experience leading agile development teams, with a strong focus on delivering high-quality, scalable solutions. Strong problem-solving skills, with the ability to troubleshoot and resolve complex data and reporting issues. Excellent communication skills, with the ability to interact with both technical and non-technical stakeholders. Preferred Qualifications: Knowledge of additional Azure services (e.g., Azure Synapse, Data Factory, Logic Apps) is a plus. Experience in Power BI for data visualization and custom calculations.
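The ETL pipelines this role owns typically enforce row-level quality checks before data reaches dashboards. A minimal sketch of that idea follows; the field names and the currency-conversion rate are assumptions for illustration, not from the posting, and a real Databricks pipeline would express this in Spark rather than plain Python.

```python
# Hypothetical sketch: a transform step that validates rows and routes
# bad records to a rejected list for later inspection.
def transform(rows):
    clean, rejected = [], []
    for row in rows:
        # Quality rule (assumed): amount must be present and non-negative.
        if row.get("amount") is None or row["amount"] < 0:
            rejected.append(row)
            continue
        # Derived column (illustrative rate, not a real FX quote).
        clean.append({**row, "amount_inr": round(row["amount"] * 83.0, 2)})
    return clean, rejected
```

Keeping rejected rows rather than silently dropping them is what makes the "data quality and governance" part of the role auditable.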

Posted 2 weeks ago

Apply

4.0 - 6.0 years

1 - 2 Lacs

Mumbai, New Delhi, Bengaluru

Work from Office

Responsibilities: Design and implement scalable data pipelines to ingest, process, and analyze large volumes of structured and unstructured data from various sources. Develop and optimize data storage solutions, including data warehouses, data lakes, and NoSQL databases, to support efficient data retrieval and analysis. Implement data processing frameworks and tools such as Apache Hadoop, Spark, Kafka, and Flink to enable real-time and batch data processing. Collaborate with data scientists and analysts to understand data requirements and develop solutions that enable advanced analytics, machine learning, and reporting. Ensure data quality, integrity, and security by implementing best practices for data governance, metadata management, and data lineage. Monitor and troubleshoot data pipelines and infrastructure to ensure reliability, performance, and scalability. Develop and maintain ETL (Extract, Transform, Load) processes to integrate data from various sources and transform it into usable formats. Stay current with emerging technologies and trends in big data and cloud computing, and evaluate their applicability to enhance our data engineering capabilities. Document data architectures, pipelines, and processes to ensure clear communication and knowledge sharing across the team. Strong programming skills in Java, Python, or Scala. Strong understanding of data modeling, data warehousing, and ETL processes. Minimum 4 to maximum 6 years of relevant experience. Strong understanding of Big Data technologies and their architectures, including Hadoop, Spark, and NoSQL databases. Locations: Mumbai, Delhi NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote
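The ingest-transform-load pattern described above can be sketched as a chain of generators; record fields and the filtering rule are illustrative assumptions, and production pipelines would use Spark, Kafka, or Flink rather than plain Python, but the staging discipline is the same.

```python
# Sketch (assumed structure): a batch pipeline composed of small stages.
import io
import json

def ingest(stream):
    # Parse one JSON record per line, a common raw-event format.
    for line in stream:
        line = line.strip()
        if line:
            yield json.loads(line)

def transform(records):
    # Keep only well-formed events; normalise the event name.
    for r in records:
        if r.get("event"):
            yield {"event": r["event"].lower(), "ts": r.get("ts", 0)}

def load(records, sink):
    # "Load" into any list-like sink; return the row count for monitoring.
    n = 0
    for r in records:
        sink.append(r)
        n += 1
    return n
```

Because each stage is lazy, records stream through one at a time, which is the same property that lets real batch frameworks scale this pattern to large volumes.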

Posted 2 weeks ago

Apply

3.0 - 5.0 years

12 - 14 Lacs

Gurugram

Work from Office

We are seeking an experienced Salesforce Marketing Cloud Consultant to implement, integrate, and optimize Salesforce Marketing Cloud across multiple channels such as Email, SMS, WhatsApp, and Push notifications. Responsibilities include configuring SFMC components like Email Studio, Mobile Studio, Contact Builder, Audience Builder, Automation Studio, Journey Builder, Reporting, and Einstein features. The role involves developing automated processes, writing SQL queries, and handling API integrations. The consultant will collaborate with cross-functional teams, providing support, troubleshooting, and guiding clients in best practices, while ensuring system stability and performance. This role requires a strong understanding of marketing automation and data governance.

Posted 2 weeks ago

Apply

4.0 - 9.0 years

10 - 12 Lacs

Bengaluru, Doddakannell, Karnataka

Work from Office

We are seeking a highly skilled Data Engineer with expertise in ETL techniques, programming, and big data technologies. The candidate will play a critical role in designing, developing, and maintaining robust data pipelines, ensuring data accuracy, consistency, and accessibility. This role involves collaboration with cross-functional teams to enrich and maintain a central data repository for advanced analytics and machine learning. The ideal candidate should have experience with cloud-based data platforms, data modeling, and data governance processes. Location: Bengaluru, Doddakannell, Karnataka, Sarjapur Road

Posted 2 weeks ago

Apply

3.0 - 7.0 years

6 - 10 Lacs

Mumbai

Work from Office

Job description for Power BI. Role Description: Power BI. Competencies: Digital: Microsoft Power BI. Experience (years): 2-4. Essential skills: Power BI. Key Responsibilities: Power BI Report and Dashboard Development: Design, develop, and deploy interactive Power BI dashboards and reports for business users. Leverage Power BI features such as DAX (Data Analysis Expressions), Power Query, and Power BI Service to develop insightful analytics. Create custom visualizations and interactive reports that help business leaders understand key metrics. Data Modeling and Transformation: Build and maintain optimized data models to support business requirements and improve report performance. Develop ETL (Extract, Transform, Load) processes using Power Query and integrate data from multiple sources (SQL Server, Excel, Azure, etc.). Data Integration: Connect Power BI to various data sources (e.g., SQL Server, Excel, SharePoint, Azure, API integrations) to pull data and create real-time reports. Design and implement data pipelines for seamless data flow and processing. Collaboration with Stakeholders: Work with business analysts, managers, and end-users to gather reporting requirements and translate them into Power BI solutions. Deliver training sessions to end-users, ensuring they can efficiently navigate Power BI reports and dashboards. Performance Tuning: Optimize Power BI reports for speed and efficiency, ensuring they can handle large datasets and deliver fast results. Troubleshoot and resolve performance issues related to reports and dashboards. Governance and Security: Implement role-based security to control access to specific data based on user roles. Maintain and monitor Power BI workspaces, ensuring appropriate governance and data access controls. Documentation: Create and maintain documentation for developed reports, dashboards, and data models. Ensure that reports are properly version-controlled and aligned with data governance practices.
Continuous Improvement: Stay up to date with the latest Power BI features, best practices, and industry trends. Suggest improvements and enhancements to existing reports and dashboards based on user feedback and business needs.
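The role-based security mentioned above restricts each role to a slice of the data. In Power BI this is expressed as row-level security (RLS) DAX filters rather than application code; the sketch below only illustrates the underlying idea, and the role names and predicates are hypothetical.

```python
# Hypothetical sketch of row-level security: each role maps to a row
# predicate, and unknown roles see nothing (deny by default).
ROLE_FILTERS = {
    "sales_mumbai": lambda row: row["region"] == "Mumbai",
    "admin": lambda row: True,
}

def rows_for_role(rows, role):
    pred = ROLE_FILTERS.get(role)
    if pred is None:
        return []  # deny by default for unrecognised roles
    return [r for r in rows if pred(r)]
```

The deny-by-default choice mirrors the governance stance the posting asks for: access is granted by explicit rule, never by omission.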

Posted 2 weeks ago

Apply

3.0 - 5.0 years

5 - 6 Lacs

Bengaluru

Work from Office

Req ID: 331269. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking an Informatica Admin to join our team in Bangalore, Karnataka (IN-KA), India (IN). Informatica Cloud Data Governance & Catalog (CDGC): Glossary creation, metadata management, data classification. Data lineage, policy definition, and domain configuration. Informatica Administration: User/role management, Secure Agent installation & maintenance. Job monitoring, repository backups, system troubleshooting. Environment configuration and version upgrades. Informatica Data Quality (IDQ): Data profiling, rule specification, transformations. Scorecards, DQ metrics, accuracy, and completeness checks. Exception handling and remediation. Additionally, it would be beneficial if the candidate has knowledge and experience in: Scripting: Shell scripting (Bash/Korn), Python scripting for automation. Experience in building monitoring and housekeeping scripts. Cloud Knowledge: Familiarity with Azure, AWS, or GCP. Working with cloud-hosted Informatica services. DevOps & CI/CD: Azure DevOps: Creating and managing pipelines, repos, and releases. Integration with Informatica for automated deployments.
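The monitoring and housekeeping scripts mentioned under Scripting typically automate retention tasks. A minimal sketch, assuming a simple delete-logs-older-than-N-days policy (the directory layout and retention window are illustrative, not from the posting):

```python
# Sketch of a housekeeping script: remove .log files older than a
# retention window and report what was purged.
import os
import time

def purge_old_logs(directory, max_age_days=14, now=None):
    now = time.time() if now is None else now
    cutoff = now - max_age_days * 86400
    removed = []
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        # Only target log files whose last-modified time predates the cutoff.
        if name.endswith(".log") and os.path.getmtime(path) < cutoff:
            os.remove(path)
            removed.append(name)
    return sorted(removed)
```

Returning the list of purged files makes the script auditable, which matters when the same pattern is wired into scheduled jobs or Azure DevOps pipelines.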

Posted 2 weeks ago

Apply

2.0 - 7.0 years

25 - 30 Lacs

Bengaluru

Work from Office

Role: Product Manager, Campaign Management Location: Bengaluru What you'll do We're MiQ, a global programmatic media partner for marketers and agencies. Our people are at the heart of everything we do, so you will be too. No matter the role or the location, we're all united in the vision to lead the programmatic industry and make it better. As a Product Manager in our Product department, you'll have the chance to: Research and stakeholder management: Gain a deep understanding of client needs by meeting with stakeholders, as well as global MiQ team leads, to ensure the global validity of roadmap additions. Coordinate resources across a range of departments of the business in order to develop great products, in particular working closely with product stakeholders across the product and tech team. Collaborate with product stakeholders to research, validate and prioritize new features that align with business priorities. Planning and delivery: Plan, prioritise and project-manage product development features to accelerate the development of features. Convert global client needs into technical features, and then convert the resulting features into marketable benefits. Work closely with the global product marketing team to align product enhancements internally with sales and marketing collateral externally. Integrate usability studies, research and market research into product requirements. Lead the ideation, technical development, launch and continued adoption of features you build. Drive product development by partnering with a team of platform product managers, engineers, data scientists, and UX/UI designers. Partner evaluation and collaboration: Identify and work closely with key external partners to ensure that the MiQ product roadmap is additive. Measuring and reporting on success: Understand, collect and analyse metrics that inform the success of the product. Drive adoption and measure the success of new measurement offerings and features.
Spread knowledge and train teams on new developments and evangelize MiQ's capabilities. What impact will you create? At MiQ, the Product Manager is tasked with building on our existing capabilities for campaign management, identifying gaps and opportunities in our offering, working with MiQ's technology and product teams to execute a development roadmap for this capability, and working to scale features across our regional markets. Who are your stakeholders? The key set of people from product stakeholders across the product and tech team, such as platform product managers, engineers, data scientists, and UX/UI designers, will be your major stakeholders. What you'll bring 2+ years of advertising platform technology experience in product management or a related capacity, buy-side experience preferred. In-depth knowledge of the programmatic landscape and the data and technology that underpin it. A fail-fast-and-learn mentality in experimenting with new concepts and industry opportunities. Detail-oriented, with an ability to prioritise projects/tasks simultaneously and to completion. Alignment with MiQ's core values. A can-do attitude to provide energy, drive, and enthusiasm. We've highlighted some key skills, experience and requirements for this role. But please don't worry if you don't meet every single one. Our talent team strives to find the best people. They might see something in your background that's a fit for this role, or another opportunity at MiQ. If you have a passion for the role, please still apply. MiQ Values Our values are so much more than statements. They unite MiQers in every corner of the world. They shape the way we work and the decisions we make. And they inspire us to stay true to ourselves and to aim for better. Our values are there to be embraced by everyone, so that we naturally live and breathe them. Just like inclusivity, our values flow through everything we do - no matter how big or small.
We do what we love - Passion. We figure it out - Determination. We anticipate the unexpected - Agility. We always unite - Unite. We dare to be unconventional - Courage. What's in it for you? Our Center of Excellence is the very heart of MiQ, and it's where the magic happens. It means everything you do and everything you create will have a huge impact across our entire global business. MiQ is incredibly proud to foster a welcoming culture. We do everything possible to make sure everyone feels valued for what they bring. With global teams committed to diversity, equity, and inclusion, we're always moving towards becoming an even better place to work. Benefits Every region and office has specific perks and benefits, but every person joining MiQ can expect: A hybrid work environment. New hire orientation with job-specific onboarding and training. Internal and global mobility opportunities. Competitive healthcare benefits. Bonus and performance incentives. Generous annual PTO and paid parental leave, with two additional paid days to acknowledge holidays, cultural events, or inclusion initiatives. Employee resource groups designed to connect people across all MiQ regions, drive action, and support our communities. Apply today! Equal Opportunity Employer Role: Associate Product Manager Location: Bengaluru What you'll do We're MiQ, a global programmatic media partner for marketers and agencies. Our people are at the heart of everything we do, so you will be too. No matter the role or the location, we're all united in the vision to lead the programmatic industry and make it better.
As an Associate Product Manager in our Data Management product stream, you'll have the chance to: Ideate, vision, validate and help build key data management features, which are at the core of MiQ's data-driven programmatic media offering. Help build tools, frameworks, SDKs, APIs, pipelines, data cubes and data formats for other teams to consume data platforms and data products for building business-critical product features. Collaborate with product stakeholders to research, validate and prioritize new features that align with business priorities. Identify and work closely with key external partners to ensure that the MiQ product roadmap is additive. Integrate usability studies, research and market analysis into product requirements. Lead the ideation, technical development, launch and continued adoption of features you build. Drive product development with a team of world-class engineers and data scientists. Define, collect and analyse metrics that inform the success of the product. Who are your stakeholders? The key set of people who will be consuming the data management features that you will help build are our analysts, data scientists and product teams at MiQ; hence they will be your major stakeholders. In a few scenarios, we also have client-facing data platform requirements delivered directly by the Data Management team - e.g. the 1PD onboarding platform and data cubes. Here the stakeholders are clients and client-facing teams like Sales and Client Services, who represent the clients internally within MiQ. What you'll bring Prior experience in product management for a minimum of 2+ years, preferably as a Technical Product Manager of data platforms/products meant for developers or data scientists (e.g. API products). Prior software development experience is preferable but not mandatory; however, you should have at least tried out a number of hobby projects, such as app creation or data pipelines, on the cloud. Prior exposure to ETL solutions, big data toolsets, DataOps and data governance, Jupyter Notebooks, Cloudera, Apache NiFi. Exposure to and understanding of more than one of the following is preferable: data platforms like Databricks, Snowflake, etc.; data formats like Delta, Apache Iceberg; newer data paradigms like Data as a Product, Medallion Architecture, Data Mesh, etc. Understanding of cloud platforms like AWS or GCP. Prior experience in handling data pipeline optimization product scenarios for failures, speed and efficiency. Prior experience of stakeholder interaction, managing conflicting requirements/priorities across stakeholders. Extensive experience in handling and maintaining relationships with vendors, both for product issues and for build/buy decisioning. Exposure to ad-tech is preferable but not mandatory. Willingness to try new technologies and ability to grasp and learn quickly. Comfort with numbers and a quantitative approach to solving problems. What impact will you create? MiQ has a petabyte-scale data platform serving our data-driven programmatic offering. As an Associate Product Manager in the Data Management team, you will build key data management features which will have an impact on the bottom line of the company, enabling sophisticated data- and AI-driven programmatic media capabilities and elevating MiQ's offering. You will get an opportunity to democratise data and AI within MiQ and contribute to the data culture organically in a product-led way. Apply today! Equal Opportunity Employer!

Posted 2 weeks ago

Apply

6.0 - 11.0 years

25 - 30 Lacs

Bengaluru

Work from Office

We are seeking an experienced Senior Data Engineer to join our data team. As a Data Engineer at ThoughtSpot, you will be responsible for designing, building, and maintaining the data infrastructure that powers our analytics and drives data-driven decision-making for leadership. You will work closely with business teams to ensure our data systems are robust, scalable, and efficient. We have a rapidly expanding list of happy customers who love our product, and we're growing to serve even more. What you'll do: Design, develop, and maintain scalable data pipelines to process large volumes of data from various sources. Work closely with our business teams to process and curate analytics-ready data. Ensure data quality and consistency through rigorous testing and validation processes. Monitor and troubleshoot data pipeline performance and resolve any issues. What you bring: 6+ years of experience in data engineering, building data infrastructure and pipelines. Experience building and maintaining large data pipelines and data infrastructure. Advanced working SQL knowledge and experience with relational databases and query authoring (SQL), as well as working familiarity with a variety of databases/warehouses. Experience with ETL tools like Hevo, Matillion, etc. Experience with data analytics products; ThoughtSpot experience is good to have. Experience with cloud services such as AWS, GCP, Azure, etc. Knowledge of data warehousing concepts and experience with an EDW like Databricks, Snowflake, or Redshift. Proficiency in programming languages such as Python and data processing libraries such as Pandas. Understanding of data governance, data quality, and security best practices. Knowledge of good development practices such as testing, code reviews, and git. Ability to work independently and coordinate with different stakeholders. You love building and leading exceptional teams in a fast-paced, entrepreneurial environment.
You have a strong bias for action and are resourceful. You bring amazing problem-solving skills and an ability to identify, quantify, debug, and remove bottlenecks and functional issues. Great communication skills, both verbal and written, and an interest in working with a diverse set of peers and customers. Alignment with ThoughtSpot values. What makes ThoughtSpot a great place to work? ThoughtSpot is the experience layer of the modern data stack, leading the industry with our AI-powered analytics and natural language search. We hire people with unique identities, backgrounds, and perspectives; this balance-for-the-better philosophy is key to our success. When paired with our culture of Selfless Excellence and our drive for continuous improvement (2% done), ThoughtSpot cultivates a respectful culture that pushes norms to create world-class products. If you're excited by the opportunity to work with some of the brightest minds in the business and make your mark on a truly innovative company, we invite you to read more about our mission and apply to the role that's right for you. ThoughtSpot for All: Building a diverse and inclusive team isn't just the right thing to do for our people, it's the right thing to do for our business. We know we can't solve complex data problems with a single perspective. It takes many voices, experiences, and areas of expertise to deliver the innovative solutions our customers need. At ThoughtSpot, we continually celebrate the diverse communities that individuals cultivate to empower every Spotter to bring their whole authentic self to work. We're committed to being real and continuously learning when it comes to equality, equity, and creating space for underrepresented groups to thrive. Research shows that in order to apply for a job, women feel they need to meet 100% of the criteria, while men usually apply after meeting 60%. Regardless of how you identify, if you believe you can do the job and are a good match, we encourage you to apply.
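The SQL query authoring this role calls for often involves window functions. A small example run against in-memory SQLite (which supports window functions from version 3.25 onward); the table and data are illustrative, not from the posting:

```python
# Sketch: rank each customer's orders by amount using a window function.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (customer TEXT, amount REAL)")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [("a", 10.0), ("a", 30.0), ("b", 20.0)])

# RANK() restarts per customer thanks to PARTITION BY.
rows = con.execute("""
    SELECT customer, amount,
           RANK() OVER (PARTITION BY customer ORDER BY amount DESC) AS rnk
    FROM orders
""").fetchall()
```

The same PARTITION BY / ORDER BY shape carries over directly to warehouses like Snowflake, Redshift, or Databricks SQL.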

Posted 2 weeks ago

Apply