
2293 Data Governance Jobs - Page 31

JobPe aggregates listings so they are easy to find, but applications are submitted directly on the original job portal.

3.0 - 8.0 years

15 - 30 Lacs

Pune

Hybrid

Dear Candidate, this is with reference to Senior Business Intelligence Analyst openings at Wolters Kluwer, Pune. Kindly share your resume at jyoti.salvi@wolterskluwer.com.

Job Specifications:
Skillset requirement: Data Governance professionals with experience in Collibra; experience in Microsoft Purview is highly preferred.
Experience range: candidates with 2 to 8 years of relevant Data Governance experience.
Primary skills: Data Governance, Microsoft Purview, Collibra.

Responsibilities:
- Architect, design, and implement data governance solutions using Microsoft Purview.
- Manage the data lifecycle, including data retention, deletion, and archiving strategies, using Microsoft Purview Data Lifecycle Management.
- Assist transitions to Microsoft Purview services, including setting up data lifecycle management and eDiscovery configurations.
- Maintain accurate documentation of configurations, processes, and procedures related to Microsoft Purview.
- Implement data governance policies and procedures to ensure compliance with regulatory requirements and organizational standards.
- Ensure data quality and compliance by applying expertise in MDM and data governance principles, including governance frameworks and practices, so that master data remains relevant, high quality, secure, and compliant.
- Develop and implement data integration solutions for metadata, data lineage, and data quality.

Posted 2 weeks ago


3.0 - 8.0 years

10 - 20 Lacs

Bengaluru

Hybrid

Job Title: Data Governance & Quality Specialist
Experience: 3-8 years
Location: Bangalore (Hybrid)
Domain: Financial Services
Notice Period: Immediate to 30 days

What You'll Do:
- Define and enforce data-governance policies (BCBS 239/GDPR) across credit-risk datasets
- Design, monitor, and report on data-quality KPIs; perform profiling and root-cause analysis in SAS/SQL
- Collaborate with data stewards, risk teams, and auditors to remediate data issues
- Develop governance artifacts: data-lineage maps, stewardship RACI, council presentations

Must Have:
- 3-8 years in data-governance or data-quality roles (financial services)
- Advanced SAS for data profiling and reporting; strong SQL skills
- Hands-on experience with governance frameworks and regulatory requirements
- Excellent stakeholder-management and documentation abilities

Nice to Have:
- Experience with Collibra, Informatica, or Talend
- Exposure to credit-risk model inputs (PD/LGD/EAD)
- Automation via SAS macros or Python scripting

If interested, please share your resume at sunidhi.manhas@portraypeople.com

Posted 2 weeks ago


2.0 - 6.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Cigna is a global health services company dedicated to improving the health and well-being of those we serve. Through our divisions, Cigna Healthcare and Evernorth Health Services, we provide a wide range of services and solutions that enhance the lives of our clients, customers, and patients. Part of Cigna Healthcare, International Health delivers a diverse range of health services and solutions globally, ensuring access to quality care and support. Our International Health Technology team is at the forefront of technological innovation, ensuring seamless integration of systems and processes across global operations. We leverage advanced technologies to enhance service delivery and support strategic goals, meeting the evolving needs of our international community.

Role Overview: We are seeking an experienced Enterprise Architecture Coordinator to support the development and implementation of enterprise architecture strategies that align with business goals. The role involves collaborating with senior architects and stakeholders to ensure the seamless integration of systems and processes, driving technological innovation within the organization.

Key Responsibilities:
- Assist in aligning IT initiatives with business strategy, ensuring that technology solutions support overarching business objectives and contribute to long-term success.
- Support the establishment and enforcement of architectural standards and governance policies, ensuring compliance with industry best practices and regulatory requirements.
- Assist in adopting and integrating emerging technologies, ensuring investments support the architecture runway and enhance the organization's technological capabilities.
- Help translate business needs into actionable technical solutions by collaborating with cross-functional teams and leveraging advanced technologies.
- Recommend enhancements to existing solutions for improved efficiency, scalability, and performance.
- Assist in developing and maintaining governance frameworks to ensure consistent application of architectural principles and practices.
- Work closely with senior architects and team members on seamless integration of new systems and processes, ensuring minimal disruption to operations.
- Maintain and update architectural documentation, ensuring that all architectural artifacts are accurate, up to date, and accessible to relevant stakeholders.
- Apply SAFe principles to support agile transformation initiatives, emphasizing lean-agile leadership, technical agility, product delivery, enterprise solution delivery, and portfolio management.
- Support the Enterprise Architecture framework using tools like LeanIX, ensuring accurate and up-to-date architectural data and insights that inform decision-making.
- Identify and assess technology risks, and develop mitigation strategies to ensure the security, reliability, and resilience of IT systems.
- Foster a culture of innovation by exploring and recommending new technologies and methodologies that can enhance the organization's IT capabilities and drive competitive advantage.

Required Skills and Qualifications:
- Strong understanding of data governance, digital transformation, and integration architecture in the health insurance industry.
- Excellent communication skills for conveying technical concepts to both technical and non-technical stakeholders.
- Strong analytical and problem-solving skills to identify and resolve issues effectively.
- Willingness to learn and adapt to new technologies and methodologies, changing business needs, and technological advancements.
- Knowledge of frameworks (TOGAF, Zachman), modelling and design tools, system integration, security architecture, and cloud services.

Additional Information:
- Ability to travel internationally as needed.
- Ability to work effectively in a globally distributed team environment.
- Commitment to continuous learning and professional development.

Posted 2 weeks ago


6.0 - 11.0 years

12 - 16 Lacs

Pune

Work from Office

What You'll Do: We are looking for a Senior Analyst based in the Pune office. You will empower the HR team to harness the full potential of data, enabling them to make informed decisions that drive organizational success. Partnering with the engineering team, you will ensure seamless integration and availability of people data across various systems and tools. You will collaborate closely with both HR stakeholders and tech teams to design, develop, and maintain scalable data pipelines using tools such as Snowflake, dbt, and Fivetran, while implementing and optimizing ELT processes to ensure data accuracy and reliability. You will report to the Lead Project Manager.

What Your Responsibilities Will Be: On a daily basis, you will build and monitor data pipelines to ensure seamless data flow and availability for analysis. You will troubleshoot and resolve any data-related issues, minimizing disruptions to HR operations. You will partner with the engineering team to integrate people data across systems, ensuring it is handled securely and in accordance with data governance standards, making it accessible for cross-functional use. You will develop and maintain SQL queries and scripts to support data analysis and reporting needs, ensuring data integrity and security across all HR-related data systems. Additionally, you will document processes, provide technical support and training to HR team members on tools, and improve data infrastructure and processes to enhance performance and scalability.

What You'll Need to Be Successful: You have 6+ years of experience as a Data Engineer or in a similar role. You possess documentation skills, ensuring models are understandable and maintainable by all stakeholders. You have strong proficiency in SQL and deep experience with modern data stack tools (Snowflake, dbt, Fivetran, GitLab). You can translate business requirements into data models. You have experience with ELT processes and data pipeline development. You have knowledge of data security and privacy regulations and have implemented techniques such as row-level security or data masking to keep data safe (see the sketch below). You hold a Bachelor's degree in Computer Science, Information Technology, or a related field. Highly desirable: experience with Workday reporting, calculated fields, and RaaS. A plus: experience with ICIMS, Workramp, Gallup, Espresa, Adaptive, or Salesforce; with Power BI, particularly data marts within Power BI; or with a scripting language such as Python to automate ELT processes.
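For illustration, a minimal sketch of the kind of data-masking control the posting mentions, applied through the Snowflake Python connector. The database, table, role, and credentials are placeholders, not details from the posting:

    import snowflake.connector

    # Placeholder credentials; in practice these come from a secrets manager.
    conn = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
    )
    cur = conn.cursor()

    # Column masking policy: only the (hypothetical) HR_ANALYST role sees raw salaries.
    cur.execute("""
        CREATE OR REPLACE MASKING POLICY hr_salary_mask AS (val NUMBER)
        RETURNS NUMBER ->
        CASE WHEN CURRENT_ROLE() = 'HR_ANALYST' THEN val ELSE NULL END
    """)
    cur.execute("""
        ALTER TABLE hr_db.people.employees
        MODIFY COLUMN salary SET MASKING POLICY hr_salary_mask
    """)

Row-level security follows the same pattern with Snowflake's CREATE ROW ACCESS POLICY statement attached to a table instead of a column.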

Posted 2 weeks ago


4.0 - 9.0 years

15 - 25 Lacs

Pune

Work from Office

Role & Responsibilities:

Design & Configuration:
- Develop and configure MDM and data quality tools such as Syndigo, Snowflake, and Alation.
- Perform data quality functions including audits, assessments, entity resolution, data profiling, scorecard development, and exception-management configuration (see the profiling sketch below).
- Configure workflows for Products and Customer data in Syndigo and Winshuttle.
- Support the development of Master Data and IT system architecture roadmaps to improve e-commerce strategy, supply chain efficiency, and customer experience.
- Collaborate in an agile environment to design and build data solutions, ensuring thorough end-to-end testing.
- Implement data orchestration pipelines, data sourcing, cleansing, and quality control processes.
- Contribute to software verification plans, quality assurance procedures, and IT standards, procedures, and processes.
- Document and maintain data pipeline architecture and software functionality.

Training & Support:
- Develop and maintain technical design documentation, and create and maintain relevant project documentation throughout the SDLC.
- Create clear and concise training documentation.
- Lead end-to-end delivery of technical solutions.
- Support the installation, maintenance, and upgrades of MDM tools.
- Create test cases and support user testing throughout relevant test cycles.

Preferred Candidate Profile:
- Bachelor's degree in computer science, systems analysis, or a related study, or equivalent experience. Requires mastery-level knowledge of the job area obtained through advanced education, experience, or both.
- Minimum of 4 years of experience in data management and data quality solutions, of which a minimum of 3 years of experience in the Syndigo application is required.
- Data quality related certifications (e.g., GS1, CIMP, IQCP, ICP, CDMP, and CMMI Enterprise Data Management) are a plus.
- Conceptual understanding of SAP and PLMs, with a strong understanding of key business processes involving Customer & Product master data.
- In-depth knowledge of SQL, Python, and Snowflake; knowledge of cloud-based solutions and Alation/data governance tools preferred.
- Strong analytical and problem-solving skills; knowledge of Agile/Waterfall methodologies.
- Ability to communicate effectively, orally and in writing, with IT and business stakeholders.
- Propensity to learn innovative technologies and approaches and use this knowledge to enhance Smith & Nephew's strategy, standard practices, and processes.
- Ability to prioritize tasks, adapt to frequent changes in priorities, and work in a distributed team setting and a fast-paced environment.

Perks and Benefits: Major medical coverage with policy exclusions and insurance non-medical limit; educational assistance; flexible personal/vacation time off, privilege leave, floater leave; parents/parents-in-law insurance (employer contribution of INR 8,000 annually), Employee Assistance Program, parental leave; hybrid work model; hands-on, team-customized mentorship. Extra perks: free cab transport facility for all employees; one-time meal provided to all employees as per shift; night shift allowance for shift-based roles.
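As a rough illustration of the profiling and scorecard work described above, a minimal pandas sketch (column names and sample data are invented) that scores completeness and uniqueness per column:

    import pandas as pd

    def scorecard(df: pd.DataFrame) -> pd.DataFrame:
        # Two common data-quality dimensions, scored per column.
        return pd.DataFrame({
            "completeness": 1 - df.isna().mean(),   # share of non-null values
            "uniqueness": df.nunique() / len(df),   # distinct-value ratio
        })

    # Profile a hypothetical product-master extract.
    products = pd.DataFrame({
        "sku": ["A1", "A2", "A2", None],
        "description": ["bolt", "nut", "nut", "washer"],
    })
    print(scorecard(products))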

Posted 2 weeks ago


10.0 - 15.0 years

15 - 20 Lacs

Pune

Work from Office

Notice Period: Immediate

About the role: We are hiring a Senior Snowflake Data Engineer with 10+ years of experience in cloud data warehousing and deep expertise on the Snowflake platform. The ideal candidate should have strong skills in SQL, ETL/ELT, data modeling, and performance tuning, along with a solid understanding of Snowflake architecture, security, and cost optimization.

Roles & Responsibilities:
- Collaborate with data engineers, product owners, and QA teams to translate business needs into efficient Snowflake-based data models and pipelines.
- Design, build, and optimize data solutions leveraging Snowflake features such as virtual warehouses, data sharing, cloning, and time travel.
- Develop and maintain robust ETL/ELT pipelines using tools like Talend, Snowpipe, Streams, Tasks, and Python (see the sketch below).
- Ensure optimal performance of SQL queries, warehouse sizing, and cost-efficient design strategies.
- Implement best practices for data quality, security, and governance, including RBAC, network policies, and masking.
- Contribute to code reviews and development standards to ensure high-quality deliverables.
- Support analytics and BI teams with data exploration and visualization using tools like Tableau or Power BI.
- Maintain version control using Git and follow Agile development practices.

Required Skills:
- Snowflake Expertise: Deep knowledge of Snowflake architecture and core features.
- SQL Development: Advanced proficiency in writing and optimizing complex SQL queries.
- ETL/ELT: Hands-on experience with ETL/ELT design using Snowflake tools and scripting (Python).
- Data Modeling: Proficient in dimensional modeling, data vault, and best practices within Snowflake.
- Automation & Scripting: Python or a similar scripting language for data workflows.
- Cloud Integration: Familiarity with Azure and its services integrated with Snowflake.
- BI & Visualization: Exposure to Tableau, Power BI, or other similar platforms.
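A minimal sketch of the Streams-and-Tasks pattern named in the responsibilities, run through the Snowflake Python connector. All object names, the schedule, and the credentials are illustrative assumptions:

    import snowflake.connector

    cur = snowflake.connector.connect(
        account="my_account", user="my_user", password="my_password",
    ).cursor()

    # A stream captures row changes on the staging table...
    cur.execute("CREATE OR REPLACE STREAM orders_stream ON TABLE raw.orders")

    # ...and a scheduled task merges those changes into the reporting table.
    cur.execute("""
        CREATE OR REPLACE TASK merge_orders
          WAREHOUSE = etl_wh
          SCHEDULE = '5 MINUTE'
        AS
        MERGE INTO analytics.orders t
        USING orders_stream s ON t.order_id = s.order_id
        WHEN MATCHED THEN UPDATE SET t.status = s.status
        WHEN NOT MATCHED THEN INSERT (order_id, status) VALUES (s.order_id, s.status)
    """)
    cur.execute("ALTER TASK merge_orders RESUME")  # tasks are created suspended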

Posted 2 weeks ago


8.0 - 13.0 years

0 - 1 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Job Position Title: Project Manager - SFDC

Responsibilities: The Project Manager will be responsible for overseeing the planning, execution, and delivery of the Data Cloud project in Salesforce. This role requires strong leadership skills, extensive experience with Salesforce, and a deep understanding of data management best practices. The Project Manager will work closely with cross-functional teams to ensure that project goals are met on time and within budget.

Key Responsibilities:
- Lead and manage the full lifecycle of the Data Cloud project in Salesforce, including planning, execution, monitoring, and closure.
- Develop detailed project plans, timelines, and budgets to guide project execution and ensure alignment with business objectives.
- Coordinate with key stakeholders, including business analysts, developers, and IT teams, to gather requirements and define project scope.
- Oversee the design, configuration, and integration of Salesforce Data Cloud to meet organizational data management needs.
- Identify and mitigate project risks, and implement contingency plans to ensure successful project delivery.
- Monitor project progress and performance, providing regular status updates to stakeholders and senior management.
- Facilitate effective communication and collaboration among project team members and stakeholders.
- Ensure compliance with data governance and security policies throughout the project lifecycle.
- Conduct project evaluations and post-implementation reviews to identify lessons learned and areas for improvement.
- Stay current with Salesforce updates, industry trends, and best practices in data management and cloud solutions.

Qualifications:
- Bachelor's (BE/BTech) degree in Information Technology, Computer Science, Business Administration, or a related field; a Master's degree is a plus.
- Proven experience (5+ years) in project management, specifically managing Salesforce projects.
- Strong understanding of Salesforce Data Cloud and data management principles.
- Project Management Professional (PMP) certification or equivalent is preferred.
- Excellent leadership, communication, and interpersonal skills.
- Ability to manage multiple projects simultaneously and prioritize tasks effectively.
- Strong problem-solving skills and attention to detail.
- Experience with Agile/Scrum methodologies is a plus.
- Proficiency in project management software and tools.

Mandatory skill sets: project management experience. Preferred skill sets: PMO. Years of experience required: 8+ years. Education qualification: B.E./B.Tech.

Posted 2 weeks ago


7.0 - 12.0 years

25 - 30 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Job Description:
- Strong change and project management skills; stakeholder management, communications, and reporting.
- Domain knowledge in data management, data governance, and data quality management. Subject matter expertise required in more than one of the following areas: data management, data governance, data quality measurement and reporting, data quality issues management.
- Liaise with IWPB markets and stakeholders to coordinate delivery of organizational DQ governance objectives, and provide consultative support to facilitate progress.
- Conduct analysis of the IWPB DQ portfolio to identify thematic trends and insights, to effectively advise stakeholders in managing their respective domains.
- Proficiency in MI reporting and visualization is strongly preferred, as is proficiency in change and project management.
- Ability to prepare programme update materials and present them to senior stakeholders, with prompt responses to issues and escalations.
- Strong communication and stakeholder management skills: should be able to work effectively and maintain strong working relationships as an integral part of a larger team.
- 8+ years of relevant experience preferred.

Posted 2 weeks ago


6.0 - 7.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Job Title: Technical Team Lead
Location: TechM Blr ITC06 07
Years of Experience: 5-7 years

Job Summary: We are seeking a highly skilled and motivated Technical Team Lead with a strong background in SAP Archiving. The ideal candidate will lead a team of technical professionals, ensuring the successful delivery of projects while maintaining high standards of quality and efficiency. This role requires a deep understanding of SAP Archiving processes and technologies, as well as the ability to mentor and guide team members in best practices.

Responsibilities:
- Lead and manage a team of technical professionals, providing guidance and support in SAP Archiving projects.
- Design, implement, and optimize SAP Archiving solutions to enhance system performance and data management.
- Collaborate with cross-functional teams to gather requirements and ensure alignment with business objectives.
- Conduct regular code reviews and provide constructive feedback to team members.
- Monitor project progress, identify risks, and implement mitigation strategies to ensure timely delivery.
- Stay updated with the latest SAP Archiving trends and technologies, and share knowledge with the team.
- Facilitate training sessions and workshops to enhance team skills in SAP Archiving.
- Prepare and present project status reports to stakeholders and management.

Mandatory Skills:
- Strong expertise in SAP Archiving, including knowledge of archiving objects, data retention policies, and data retrieval processes.
- Proven experience in leading technical teams and managing projects in a fast-paced environment.
- Excellent problem-solving skills and the ability to troubleshoot complex technical issues.
- Strong communication and interpersonal skills, with the ability to work collaboratively with diverse teams.
- Experience with SAP modules and integration points related to archiving.

Preferred Skills:
- Familiarity with SAP S/4HANA and its archiving capabilities.
- Knowledge of data governance and compliance standards related to data archiving.
- Experience with project management methodologies (Agile, Scrum, etc.).
- Certifications in SAP or related technologies.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 5-7 years of experience in SAP Archiving and technical team leadership.
- Proven track record of successful project delivery and team management.

If you are a passionate leader with a strong background in SAP Archiving and are looking to take the next step in your career, we encourage you to apply for this exciting opportunity.

Posted 2 weeks ago


10.0 - 15.0 years

20 - 25 Lacs

Bengaluru

Work from Office

SAP MDG: Experience in SAP MDG EhP6 & MDG 7.0/8.0 (preferably 9.0), with 10+ years of experience. Extensive ECC and/or S/4HANA experience; worked on at least 2 MDG projects.
- Expertise in implementing the SAP MDG solution for masters like Customer, Vendor, and Material.
- Expertise in Data Model Enhancement, Data Transfer (DIF/DEF), Data Replication Framework (DRF), and Business Rules Framework plus (BRFplus).
- Experience in configuring rule-based workflow and in integrating business process requirements with the technical implementation of SAP Master Data Governance.
- Experience in user interface modelling (design and creation of UI, value restriction, defining navigation elements of type hyperlink or push button, data quality, validation and derivation rules).
- Experience in process modelling (entity, business activity change, request type, workflow, edition type, relationship, data replication techniques, SOA service, ALE connection, key & value mapping, data transfer, export & import master data, convert master data).
- Expert knowledge in activation and configuration of the MDG modules & components.
- SAP ERP logistics knowledge (SAP modules SD or MM), especially master data, is required.

Posted 2 weeks ago


10.0 - 15.0 years

20 - 25 Lacs

Bengaluru

Work from Office

We are seeking an experienced Data Platform Reliability Engineer to lead our efforts in designing, implementing, and maintaining highly reliable data infrastructure. The ideal candidate will bring extensive expertise in building enterprise-grade data platforms with a focus on reliability engineering, governance, and SLA/SLO design. This role will be instrumental in developing advanced monitoring solutions, including LLM-powered systems, to ensure the integrity and availability of our critical data assets.

Platform Architecture and Design:
- Design and architect scalable, fault-tolerant data platforms leveraging modern technologies like Snowflake, Databricks, and cloud-native services
- Establish architectural patterns that ensure high availability and resiliency across data systems
- Develop technical roadmaps for platform evolution with reliability as a core principle

Reliability Engineering:
- Implement comprehensive SLA/SLO frameworks for data services (see the sketch after this list)
- Design and execute chaos engineering experiments to identify and address potential failure modes
- Create automated recovery mechanisms for critical data pipelines and services
- Establish incident management processes and runbooks

Monitoring and Observability:
- Develop advanced monitoring solutions, including LLM-powered anomaly detection
- Design comprehensive observability strategies across the data ecosystem
- Implement proactive alerting systems to identify issues before they impact users
- Create dashboards and visualization tools for reliability metrics

Data Quality and Governance:
- Establish data quality monitoring processes and tools
- Implement data lineage tracking mechanisms
- Develop automated validation protocols for data integrity
- Collaborate with data governance teams to ensure compliance with policies

Innovation and Improvement:
- Research and implement AI/ML approaches to improve platform reliability
- Lead continuous improvement initiatives for data infrastructure
- Mentor team members on reliability engineering best practices
- Stay current with emerging technologies and reliability patterns in the data platform space

Qualifications:
- 10+ years of experience in data platform engineering or related fields
- Proven expertise with enterprise data platforms (Snowflake, Databricks, etc.)
- Strong background in reliability engineering, SRE practices, or similar disciplines
- Experience implementing data quality monitoring frameworks
- Knowledge of AI/ML applications for system monitoring and reliability
- Excellent communication skills and ability to translate technical concepts to diverse stakeholders
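As a small illustration of SLO thinking for data services, a sketch of an error-budget check; the SLO target and run counts are invented numbers, not from the posting:

    def error_budget_remaining(successes: int, total: int, slo: float = 0.995) -> float:
        # Fraction of the error budget left in the window; negative means the SLO is breached.
        allowed_failures = (1 - slo) * total
        actual_failures = total - successes
        return (allowed_failures - actual_failures) / allowed_failures

    # 11 failed pipeline runs out of 2,990 against a 99.5% SLO -> roughly 26% of budget left.
    print(error_budget_remaining(successes=2_979, total=2_990))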

Posted 2 weeks ago


3.0 - 6.0 years

20 - 25 Lacs

Hyderabad

Work from Office

Blend is hiring a Senior Data Scientist (Generative AI) to spearhead the development of advanced AI-powered classification and matching systems on Databricks. You will contribute to flagship programs like the Diageo AI POC by building RAG pipelines, deploying agentic AI workflows, and scaling LLM-based solutions for high-precision entity matching and MDM modernization.

Key Responsibilities:
- Design and implement end-to-end AI pipelines for product classification, fuzzy matching, and deduplication using LLMs, RAG, and Databricks-native workflows.
- Develop scalable, reproducible AI solutions within Databricks notebooks and job clusters, leveraging Delta Lake, MLflow, and Unity Catalog.
- Engineer Retrieval-Augmented Generation (RAG) workflows using vector search and integrate them with Python-based matching logic (see the sketch after this list).
- Build agent-based automation pipelines (rule-driven + GenAI agents) for anomaly detection, compliance validation, and harmonization logic.
- Implement explainability, audit trails, and governance-first AI workflows aligned with enterprise-grade MDM needs.
- Collaborate with data engineers, BI teams, and product owners to integrate GenAI outputs into downstream systems.
- Contribute to modular system design and documentation for long-term scalability and maintainability.

Qualifications:
- Bachelor's/Master's in Computer Science, Artificial Intelligence, or a related field.
- 5+ years of overall Data Science experience with 2+ years in Generative AI / LLM-based applications.
- Deep experience with the Databricks ecosystem: Delta Lake, MLflow, DBFS, Databricks Jobs & Workflows.
- Strong Python and PySpark skills with the ability to build scalable data pipelines and AI workflows in Databricks.
- Experience with LLMs (e.g., OpenAI, LLaMA, Mistral) and frameworks like LangChain or LlamaIndex.
- Working knowledge of vector databases (e.g., FAISS, Chroma) and prompt engineering for classification/retrieval.
- Exposure to MDM platforms (e.g., Stibo STEP) and familiarity with data harmonization challenges.
- Experience with explainability frameworks (e.g., SHAP, LIME) and AI audit tooling.

Preferred Skills:
- Knowledge of agentic AI architectures and multi-agent orchestration.
- Familiarity with Azure Data Hub and enterprise data ingestion frameworks.
- Understanding of data governance, lineage, and regulatory compliance in AI systems.
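A minimal sketch of the vector-search half of such a matching pipeline, using sentence-transformers and FAISS. The model choice and the product strings are illustrative assumptions, not details of the actual program:

    import faiss
    from sentence_transformers import SentenceTransformer

    catalog = ["Johnnie Walker Black Label 70cl", "Smirnoff Red Vodka 1L"]

    model = SentenceTransformer("all-MiniLM-L6-v2")
    embeddings = model.encode(catalog, normalize_embeddings=True)

    # Inner product on normalized vectors is cosine similarity.
    index = faiss.IndexFlatIP(embeddings.shape[1])
    index.add(embeddings)

    # A noisy query string resolves to its closest catalog entry.
    query = model.encode(["JW Black 700ml"], normalize_embeddings=True)
    scores, ids = index.search(query, 1)
    print(catalog[ids[0][0]], float(scores[0][0]))  # best fuzzy match + similarity

In a full RAG workflow the retrieved candidates would then be passed to an LLM prompt for final match adjudication.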

Posted 2 weeks ago



3.0 - 5.0 years

6 - 10 Lacs

Mumbai

Work from Office

Position: Data Lifecycle Management (DLM) Specialist | Mumbai | WFO
Location: Goregaon, Mumbai (apply if you are from the Western line)
Shift Timing: 9 AM-6 PM
Notice Period: Immediate to 30 days
Experience: 3 to 5 years
Work Mode: Work from Office (WFO)
Interested candidates can apply to saikeertana.r@twsol.com

Role Overview: Seeking a highly motivated and client-centric DLM Specialist with 3-5 years of experience in data management, financial services, or other regulated industries. This role focuses on reviewing applications and ensuring data retention, disposition, and archiving compliance while aligning with privacy regulations and internal policy.

Key Responsibilities:
- Assess data retention, archiving, and disposition requirements across all business divisions
- Conduct regular reviews and stakeholder meetings with business and technology teams
- Manage data-risk identification and mitigation plans related to retention, location, and transfer
- Document concise data management requirements and ensure implementation tracking
- Support the definition of operational and compliance controls
- Compile analysis reports and drive implementation of recommendations
- Engage system owners in problem-solving and decision-making
- Represent DLM in cross-functional meetings to communicate policy standards
- Prepare progress reports and contribute to process improvements

Required Qualifications:
- Bachelor's degree
- 3 to 5 years of experience in information/data management, data storage, or financial services operations
- Strong business analysis skills
- Excellent verbal and written communication skills in English
- High attention to detail with the ability to document complex information clearly
- Demonstrated client-servicing ability and stakeholder management
- Experience in developing business and functional requirements for tech systems

Nice to Have:
- Degree in Information Systems, Business Administration, Archiving, or Law
- Understanding of personal data protection and privacy regulations
- Familiarity with database and cloud technologies and AI trends
- Reporting experience with Power BI / Tableau
- Experience working with high-volume datasets

Posted 2 weeks ago


5.0 - 10.0 years

8 - 9 Lacs

Pune

Work from Office

Req ID: 332236

NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Business Consulting Technical Analyst with ETL and GCP experience using PySpark to join our team in Pune, Maharashtra (IN-MH), India.

Key Responsibilities:
- Data Pipeline Development: Design, implement, and optimize data pipelines on GCP using PySpark for efficient and scalable data processing (see the sketch after this list).
- ETL Workflow Development: Build and maintain ETL workflows for extracting, transforming, and loading data into various GCP services.
- GCP Service Utilization: Leverage GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc for data storage, processing, and analysis.
- Data Transformation: Use PySpark for data manipulation, cleansing, enrichment, and validation.
- Performance Optimization: Ensure the performance and scalability of data processing jobs on GCP.
- Collaboration: Work with data scientists, analysts, and other stakeholders to understand data requirements and translate them into technical solutions.
- Data Quality and Governance: Implement and maintain data quality standards, security measures, and compliance with data governance policies on GCP.
- Troubleshooting and Support: Diagnose and resolve issues related to data pipelines and infrastructure.
- Staying Updated: Keep abreast of the latest GCP services, PySpark features, and best practices in data engineering.

Required Skills:
- GCP Expertise: Strong understanding of GCP services like BigQuery, Cloud Storage, Dataflow, and Dataproc.
- PySpark Proficiency: Demonstrated experience in using PySpark for data processing, transformation, and analysis.
- Python Programming: Solid Python programming skills for data manipulation and scripting.
- Data Modeling and ETL: Experience with data modeling, ETL processes, and data warehousing concepts.
- SQL: Proficiency in SQL for querying and manipulating data in relational databases.
- Big Data Concepts: Understanding of big data principles and distributed computing concepts.
- Communication and Collaboration: Ability to effectively communicate technical solutions and collaborate with cross-functional teams.
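As an illustration of the GCS-to-BigQuery pattern this role describes, a minimal PySpark sketch; the bucket, dataset, schema, and the use of the spark-bigquery connector are assumptions for the example:

    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("gcs-to-bq").getOrCreate()

    # Extract: raw CSVs landed in a (hypothetical) GCS bucket.
    df = spark.read.option("header", True).csv("gs://example-bucket/sales/*.csv")

    # Transform: deduplicate, cast, and validate.
    cleaned = (df.dropDuplicates(["order_id"])
                 .withColumn("amount", F.col("amount").cast("double"))
                 .filter(F.col("amount") > 0))

    # Load: write to BigQuery via the spark-bigquery connector.
    (cleaned.write.format("bigquery")
        .option("table", "example_dataset.sales_clean")
        .option("temporaryGcsBucket", "example-tmp-bucket")
        .mode("overwrite")
        .save())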

Posted 2 weeks ago


3.0 - 6.0 years

5 - 8 Lacs

Mumbai

Work from Office

Key Responsibilities:
- Design, develop, and maintain robust, scalable, and efficient data pipelines to collect, process, and store structured and unstructured data.
- Build and optimize data warehouses, data lakes, and ETL/ELT workflows.
- Integrate data from multiple sources including databases, APIs, and streaming platforms.
- Collaborate with data scientists and analysts to understand data requirements and deliver high-quality datasets.
- Ensure data quality, integrity, and security throughout the data lifecycle.
- Monitor and troubleshoot data pipeline performance and failures.
- Implement data governance and compliance policies.
- Automate data workflows and implement data orchestration tools such as Apache Airflow (see the sketch after this list).
- Optimize storage and query performance in cloud and on-premises environments.
- Keep up to date with emerging data engineering tools, techniques, and best practices.
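A minimal Airflow (2.x) sketch of the orchestration bullet above; the DAG id, schedule, and task bodies are placeholders:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():   print("pull from source systems")
    def transform(): print("clean and conform")
    def load():      print("write to the warehouse")

    with DAG(dag_id="example_etl", start_date=datetime(2024, 1, 1),
             schedule="@daily", catchup=False) as dag:
        extract_t = PythonOperator(task_id="extract", python_callable=extract)
        transform_t = PythonOperator(task_id="transform", python_callable=transform)
        load_t = PythonOperator(task_id="load", python_callable=load)
        extract_t >> transform_t >> load_t  # linear dependency chain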

Posted 2 weeks ago


13.0 - 16.0 years

32 - 40 Lacs

Bengaluru

Work from Office

Key Responsibilities:
- Facilitate the integration of diverse data types and sources to provide a comprehensive view of patient health and treatment outcomes.
- Provide coaching and peer review to ensure that the team's work reflects the industry's best practices for data curation activities, including data privacy and anonymization standards.
- Ensure all datasets meet analysis-ready and privacy requirements by performing necessary data curation activities (e.g., pre-process, contextualize, and/or anonymize).
- Ensure that datasets are processed to meet the conditions in the approved data re-use request (e.g., remove subjects from countries that do not allow re-use); see the sketch after this list.
- Write clean, readable code. Ensure that deliverables are appropriately quality controlled, documented, and, when required, can be handed over to the R&D Tech team for production pipeline implementation.
- Transform raw healthcare data into products that can be used to catalyze the work of the wider RWDMA and Biostatistics teams and be leveraged by our diverse group of stakeholders to generate insights.
- Ensure data quality, integrity, and security across various data sources.
- Support data-driven decision-making processes that enhance patient outcomes and operational efficiencies.

Education Requirements: Advanced degree (Master's or Ph.D.) in Life Sciences, Epidemiology, Biostatistics, Public Health, Computer Science, Mathematics, Statistics, or a related field, with applicable experience.

Job-Related Experience:
- Experience in data engineering and curation, with the majority of experience on real-world data in the healthcare or pharmaceutical industry.
- Proven ability to handle and process large datasets efficiently, ensuring data privacy.
- Proficiency in handling structured, semi-structured, and unstructured data while ensuring data privacy.
- Understanding of data governance principles and practices with a focus on data privacy.
- Innovative, solution-oriented mindset and willingness to challenge the status quo.
- Fluent in written and spoken English; able to articulate complex concepts to diverse audiences.
- Experience working in a global matrix environment and managing stakeholders effectively.
- Experience in complex batch processing, Azure Data Factory, Databricks, Airflow, Delta Lake, PySpark, Pandas, and other Python dataframe libraries, including how to apply them to meet industry standards and data privacy requirements.
- Proven ability to collaborate with cross-functional teams; strong communication skills to present curated data.
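As a small illustration of the anonymization and re-use-request steps above, a pandas sketch; the column names, excluded-country list, and salt are invented for the example:

    import hashlib
    import pandas as pd

    EXCLUDED_COUNTRIES = {"DE", "FR"}   # hypothetical no-re-use jurisdictions
    SALT = "rotate-per-project"         # keep out of source control in practice

    def curate(df: pd.DataFrame) -> pd.DataFrame:
        # Enforce the re-use request: drop subjects from excluded countries.
        df = df[~df["country"].isin(EXCLUDED_COUNTRIES)].copy()
        # Pseudonymize the identifier with a salted one-way hash.
        df["subject_id"] = df["subject_id"].map(
            lambda s: hashlib.sha256((SALT + str(s)).encode()).hexdigest()[:16])
        return df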

Posted 2 weeks ago


6.0 - 11.0 years

35 - 40 Lacs

Bengaluru

Work from Office

Build Your Career at Informatica. We seek innovative thinkers who believe in the power of data to drive meaningful change. At Informatica, we welcome adventurous, work-from-anywhere minds eager to tackle the world's most complex challenges. Our employees are empowered to push their bold ideas forward, and we are united by a shared passion for using data to do the extraordinary for each other and the world.

Senior Solution Architect - Presales (Remote). We're looking for a senior solution architect with experience in presales, data integration, and MDM to join our team remotely. You will report to the Director, Technical Sales.

Technology You'll Use: Presales, Data Integration, and MDM

Your Role Responsibilities - Here's What You'll Do:
- Basic knowledge of the top 3 cloud ecosystems and the top 2 data-related technologies
- Basic knowledge of cloud computing security aspects
- Basic certification on at least 1 cloud ecosystem and 1 data-related cloud technology at the level defined by the business area of focus
- Skills on at least one INFA-related software platform/technology, storytelling, and experience establishing communication and engagement with prospects specific to use cases
- Ability to engage and create relationships with influencers, coaches, decision makers, and partners
- Basic technical knowledge of hybrid deployment of software solutions, data warehousing, database, and/or business intelligence software concepts and products

What We'd Like to See:
- Manage customer engagements without support; share best practices, content, and tips and tricks within the primary area of responsibility
- Stay current on certification of services required for the area of responsibility
- Perform all activities leading up to the delivery of a customer demo with some assistance, including discovery, technical qualification/fit, customer presentations, standard demos, and related customer-facing communication
- Assist on RFP responses and/or POCs
- Partner with the CSM team on nurture activities, including technical advisory, workshops, etc.
- Provide customer feedback on product gaps using Vivun
- Ability to support demos at marketing events without support

Role Essentials:
- 6+ years of relevant experience in data integration, master data management, or data governance
- 8+ years of presales/technical sales, industry, or consulting experience
- BA/BS or equivalent educational background is preferred

Perks & Benefits: Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and a 401k plan or international pension/retirement plans); flexible time-off policy and hybrid working practices; equity opportunities and an employee stock purchase program (ESPP); comprehensive mental health and Employee Assistance Program (EAP) benefit.

Our DATA values are our north star, and we are passionate about building and delivering solutions that accelerate data innovations. At Informatica, our employees are our greatest competitive advantage. So, if your experience aligns but doesn't exactly match every qualification, apply anyway. You may be exactly who we need to fuel our future with innovative ideas and a thriving culture. Informatica (NYSE: INFA), a leader in enterprise AI-powered cloud data management, brings data and AI to life by empowering businesses to realize the transformative power of their most critical assets. We pioneered the Informatica Intelligent Data Management Cloud that manages data across any multi-cloud, hybrid system, democratizing data to advance business strategies. Customers in approximately 100 countries and more than 80 of the Fortune 100 rely on Informatica. www.informatica.com. Connect with LinkedIn, X, and Facebook. Informatica. Where data and AI come to life.

Posted 2 weeks ago


10.0 - 15.0 years

20 - 27 Lacs

Bengaluru

Work from Office

Build Your Career at Informatica. We're looking for a diverse group of collaborators who believe data has the power to improve society. Adventurous minds who value solving some of the world's most challenging problems. Here, employees are encouraged to push their boldest ideas forward, united by a passion to create a world where data improves the quality of life for people and businesses everywhere.

Principal Advisory Services Consultant. Informatica is looking for a Principal Consultant - Advisory Services with practitioner experience leading large-scale data management and analytics projects. This is a remote position reporting to a Senior Director, Data Strategy & Governance. You have experience implementing data governance programs and defining vision and data strategy with peers and senior leadership to gain support for the strategy and the overall value of Informatica products and solutions. You will join our Professional Services team and provide pre- and post-sale business-oriented strategic consulting services, typically onsite at the customer's location. Responsibilities include providing clients with overall data strategy development and alignment, implementation guidance, program design, business use case identification, program road mapping, and business outcome definition.

Essential Duties & Responsibilities:
- Analyze complex customer environments comprised of Informatica and non-Informatica products.
- Organize large-scale programs and coordinate/lead multiple delivery teams.
- Apply innovative design solutions by keeping current on new technology trends and changing industry standards and patterns.
- Travel to customer sites, typically exceeding 50% and potentially exceeding 75% for extended periods, as applicable to the customer engagement.

Knowledge & Skills:
- Expert-level experience, using professional concepts and company objectives to resolve complex issues in creative and effective ways.
- Works on complex issues where analysis of situations or data requires an in-depth evaluation of variable factors; exercises judgment in methods, techniques, and evaluation criteria for obtaining results.
- Extensively leverages business acumen and subject matter expertise to provide expert-level advice and guidance to clients.
- Thorough understanding of Informatica business priorities, strategy, and direction.
- Works across the organization and maintains/builds strong working relationships based on experience and past interactions.
- Significant experience leading the delivery of complex enterprise data management projects/initiatives.
- Competent in navigating, using, and demonstrating functionality in Informatica's business-facing applications.
- Published industry white papers, best practices, field guides, and external communications.
- Strong written communication skills, with competency in developing professional-looking presentation materials and customer deliverables.
- Developed ability to communicate to executive-level audiences in both interpersonal and presentation formats.

Education/Experience: BA/BS or equivalent educational background is preferred; minimum 10+ years of relevant professional experience.

Perks & Benefits: Comprehensive health, vision, and wellness benefits (paid parental leave, adoption benefits, life insurance, disability insurance, and a 401k plan or international pension/retirement plans); flexible time-off policy and hybrid working practices; equity opportunities and an employee stock purchase program (ESPP); comprehensive mental health and Employee Assistance Program (EAP) benefit.

Our DATA values are our north star, and we are passionate about building and delivering solutions that accelerate data innovations. At Informatica, our employees are our greatest competitive advantage. So, if your experience aligns but doesn't exactly match every qualification, apply anyway. You may be exactly who we need to fuel our future with innovative ideas and a thriving culture. Informatica (NYSE: INFA), an Enterprise Cloud Data Management leader, brings data and AI to life by empowering businesses to realize the transformative power of their most critical assets. We pioneered the Informatica Intelligent Data Management Cloud that manages data across any multi-cloud, hybrid system, democratizing data to advance business strategies. Customers in over 100 countries and 85 of the Fortune 100 rely on Informatica. www.informatica.com. Connect with LinkedIn, Twitter, and Facebook. Informatica. Where data and AI come to life.

Posted 2 weeks ago


6.0 - 7.0 years

13 - 15 Lacs

Bengaluru

Work from Office

Job Title: Software Developer
Location: TechM Blr ITC06 07
Years of Experience: 2-5 years

Job Summary: We are seeking a skilled Software Developer with a strong background in SAP Archiving to join our dynamic team. The ideal candidate will have 2-5 years of experience in software development, with a focus on SAP solutions. You will be responsible for designing, developing, and implementing software applications that meet our business needs while ensuring data integrity and compliance through effective archiving strategies.

Responsibilities:
- Design, develop, and maintain software applications in accordance with business requirements.
- Implement and manage SAP Archiving solutions to optimize data storage and retrieval processes.
- Collaborate with cross-functional teams to gather requirements and translate them into technical specifications.
- Conduct code reviews and ensure adherence to best practices in software development.
- Perform testing and debugging of applications to ensure high-quality deliverables.
- Provide technical support and troubleshooting for existing applications.
- Stay updated with the latest industry trends and technologies related to SAP and software development.

Mandatory Skills: Strong knowledge and experience in SAP Archiving.

Posted 2 weeks ago


5.0 - 10.0 years

10 - 11 Lacs

Pune

Work from Office

Job Overview: We are seeking a skilled Data Solution Architect to design solutions and lead their implementation on GCP. The ideal candidate will possess extensive experience in data architecture, solution design, and data management practices.

Responsibilities:
- Architect and design end-to-end data solutions on the cloud platform, focusing on data warehousing and big data platforms.
- Collaborate with clients, developers, and architecture teams to understand requirements and translate them into effective data solutions.
- Develop high-level and detailed data architecture and design documentation.
- Implement data management and data governance strategies, ensuring compliance with industry standards.
- Architect both batch and real-time data solutions, leveraging cloud-native services and technologies.
- Design and manage data pipeline processes for historic data migration and data integration.
- Collaborate with business analysts to understand domain data requirements and incorporate them into the design deliverables.
- Drive innovation in data analytics by leveraging cutting-edge technologies and methodologies.
- Communicate complex ideas and concepts effectively, verbally and in writing.
- Stay updated on the latest advancements in data analytics, data architecture, and data management techniques.

Requirements:
- Minimum of 5 years of experience in a Data Architect role, supporting warehouse and cloud data platforms/environments.
- Experience with common GCP services such as BigQuery, Dataflow, GCS, service accounts, and Cloud Functions.
- Extremely strong BigQuery design and development skills (see the sketch after this list).
- Extensive knowledge and implementation experience in data management, governance, and security frameworks.
- Proven experience in creating high-level and detailed data architecture and design documentation.
- Strong aptitude for business analysis to understand domain data requirements.
- Proficiency in data modelling using any modelling tool for conceptual, logical, and physical models is preferred.
- Hands-on experience architecting end-to-end data solutions for both batch and real-time designs.
- Ability to collaborate effectively with clients, developers, and architecture teams to implement enterprise-level data solutions.
- Familiarity with Data Fabric and Data Mesh architecture is a plus.
- Excellent verbal and written communication skills.
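For illustration of the BigQuery design work listed above, a minimal sketch using the official Python client; the project, dataset, and DDL are assumptions for the example:

    from google.cloud import bigquery

    client = bigquery.Client(project="example-project")

    # Partitioning and clustering are the usual first levers for BigQuery
    # cost and performance design.
    client.query("""
        CREATE TABLE IF NOT EXISTS analytics.events (
            event_ts TIMESTAMP,
            user_id  STRING,
            payload  JSON
        )
        PARTITION BY DATE(event_ts)
        CLUSTER BY user_id
    """).result()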

Posted 2 weeks ago


8.0 - 10.0 years

15 - 19 Lacs

Mumbai

Work from Office

Job Description: Are you ready to make it happen at Mondelēz International? Join our mission to lead the future of snacking. Make it uniquely yours.

The Senior Manager, Finance Data Governance is a critical role responsible for leading and executing the finance master data governance strategy. This role will drive the implementation of data governance policies, standards, and processes to ensure data quality, integrity, and security. The Senior Manager will collaborate with business stakeholders, IT teams, and data owners to establish a data-driven culture and enable effective use of data for business decision-making.

How you will contribute:
- Strategy and Leadership: Contribute to the development and execution of the overall data governance strategy, aligning with business objectives and regulatory requirements. Promote data governance awareness and adoption throughout the organization.
- Policy and Standards: Develop and maintain data governance policies, standards, and procedures, ensuring alignment with industry best practices and regulatory guidelines. Define data quality metrics and monitor data quality performance. Establish data ownership and stewardship responsibilities.
- Implementation and Execution: Lead the implementation of data governance tools and technologies. Work with business units to identify and prioritize data governance initiatives. Ensure data lineage is documented and maintained.
- Collaboration and Communication: Partner with business stakeholders to understand data needs and requirements. Collaborate with IT teams to ensure data governance requirements are integrated into system development and maintenance processes. Communicate data governance policies and procedures to the organization. Facilitate data governance council meetings and working groups.
- Data Quality Management: Establish data quality rules and monitor data quality metrics. Identify and resolve data quality issues. Implement data quality improvement initiatives.
- Compliance and Security: Ensure data governance policies and procedures comply with relevant regulations, such as GDPR, CCPA, and other data privacy laws. Implement data security measures to protect sensitive data. Monitor and audit data governance activities to ensure compliance.

What you will bring: A desire to drive your future and accelerate your career, plus the following experience and knowledge.

Qualifications:
- Education: Bachelor's degree in a relevant field (e.g., Finance, Business Administration, Data & Analytics); Master's degree preferred.
- Experience: Minimum of 8-10 years of experience in data governance, data management, or related fields. Proven experience in leading and implementing data governance initiatives. Strong understanding of data governance principles, practices, and technologies. Experience with data quality management tools and techniques.
- Skills: Excellent leadership and communication skills. Strong analytical and problem-solving skills. Ability to work effectively with cross-functional teams. Proficiency in data governance tools and technologies (e.g., Collibra, Informatica, Alation). Knowledge of data warehousing and business intelligence concepts. Strong project management skills.

Key Competencies:
- Strategic Thinking: Ability to develop and execute a data governance strategy aligned with business objectives.
- Communication: Ability to communicate complex data governance concepts to both technical and non-technical audiences.
- Collaboration: Ability to work effectively with cross-functional teams.
- Problem Solving: Ability to identify and resolve data governance issues.
- Technical Proficiency: Strong understanding of data governance tools and technologies.
- Results Orientation: Ability to drive data governance initiatives to achieve measurable results.

More about this role: Travel requirements: occasional. Work schedule: flexible. Relocation support available: no relocation support available.

Business Unit Summary: We value our talented employees, and whenever possible strive to help one of our associates grow professionally before recruiting new talent to our open positions. If you think the open position you see is right for you, we encourage you to apply! Our people make all the difference in our success. IF YOU REQUIRE SUPPORT TO COMPLETE YOUR APPLICATION OR DURING THE INTERVIEW PROCESS, PLEASE CONTACT THE RECRUITER.

Job Type: Regular. Project and Program Management. Business Capability.

Posted 2 weeks ago


7.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Experience: 7-10 years

Job Description: We are looking for an experienced SAP Master Data Management (MDM) Consultant with expertise in ECC, S/4HANA migration, rollouts, and data management. The ideal candidate will lead and execute MDM strategies, manage data migration, and drive continuous improvements.

Key Responsibilities:
- Own and manage Master Data Management (MDM) activities for SAP projects, including de-duplication of masters.
- Lead data migration and cutovers in SAP S/4HANA projects (greenfield, migration, or rollouts).
- Establish and implement MDM best practices and data management capabilities.
- Define data management principles, policies, and lifecycle strategies.
- Monitor data quality with consistent metrics and reporting.
- Work with MDM stakeholders to drive data governance and compliance.
- Track and manage MDM objects, ensuring timely delivery.
- Conduct training sessions for teams on ECC & S/4HANA MDM.
- Participate in daily stand-ups, issue tracking, and dashboard updates.
- Identify risks and process improvements for MDM.

Required Skills & Qualifications:
- Minimum 7-10 years of experience in SAP MDM.
- Strong knowledge of ECC, SAP S/4HANA, data migration, and rollouts.
- Experience in data governance, lifecycle management, and compliance.
- Familiarity with JIRA Kanban boards, ticketing tools, and dashboards.
- Strong problem-solving and communication skills.
- Ability to work with the wider team, especially ABAP, middleware, and functional consultants.
- Knowledge of Excel is a must; ABAP knowledge is preferable.
- SAP training or certifications are an asset.
- Team player with strong communication skills and a collaborative spirit; able to coach, support, train, and develop junior consultants.
- Customer-oriented, result-driven, and focused on delivering quality.

Posted 2 weeks ago


2.0 - 7.0 years

7 - 12 Lacs

Chennai

Work from Office

Role Purpose: Execute the process and drive the team's performance on the key metrics of the process.

Job Details:
- Country/Region: India
- Employment Type: Onsite
- Work Type: Contract
- State: Tamil Nadu
- City: Chennai

Requirements:
- Onsite at Abu Dhabi; contract for 2 years.
- Shift: Abu Dhabi general shift timings.
- Candidate should be able to travel onsite ASAP, or within a maximum of 40 days, to start working in this role.

Job Description (minimum overall work experience: 10 years):
- Financial Systems Support (L1/L2): Provide financial-system support to all end users across ADD functions. Coordinate with the Finance HQ IT support team to resolve reported issues through RITM/Incident/Idea tickets. Endorse and approve financial roles access and authorization tickets.
- Month End Closing (MEC) Support: Provide support during transactional data processing, preparing uploads, data validation, and reconciliation; identify incorrect master data assignments and suggest corrective actions. Coordinate closely with financial users on data reconciliation and validation during month-end closing activities. Coordinate closely with HUB and BI teams to update SAP data so that segmented finance reports align with SAP data. Validate BI financial reports and dashboards.
- Master data governance and policy & procedure compliance.

Posted 2 weeks ago


4.0 - 6.0 years

8 - 12 Lacs

Bengaluru

Work from Office

[{"Salary":null , "Remote_Job":false , "Posting_Title":"Senior Databricks Engineer / Tech Lead" , "Is_Locked":false , "City":"Bangalore" , "Industry":"Technology" , "Job_Description":" About the Role: As part of our Innovation Team , we are seeking a C ertified Senior Databricks Engineer / Tech Lead with 7\u20138 years of hands-on experience in building scalable data platforms. This role will focus on designing, building, and operationalizing data solutions on the Databricks platform to accelerate advanced analytics and AI use cases. Key Responsibilities: Architect, develop, productionize and maintain end to end solutions in Databricks Implement and optimize ETL/ELT processes for structured and semi-structured data Leverage Delta Lake for ACID transactions, data versioning, and time-travel features Drive adoption of the Lakehouse architecture to unify data warehousing and AI/ML workloads Implement CI/CD pipelines using Databricks Repos , Asset Bundles , and integration with DevOps tools Configure and enforce Unity Catalog for secure, governed access to data assets Design and implement data quality and validation frameworks to ensure trusted data Lead performance tuning and optimization efforts for Spark jobs and queries Integrate with external systems such as Kafka , Event Hub , and REST APIs for real-time and batch processing Collaborate with data scientists and business stakeholders to build feature-rich datasets and reusable assets Troubleshoot and debug complex data workflows in development and production environments Guide junior engineers and contribute to best practices in data engineering and platform usage Ensure platform security, access controls , and compliance with enterprise data governance standards. Required Skills: Expertise in Apache Spark and Databricks platform Experience with Databricks Lakehouse architecture Delta Lake concepts Proficient in PySpark, SQL, and Delta Lake Strong knowledge of Data Engineering concepts Experience with data ingestion, ETL/ELT pipelines Familiarity with Unity Catalog and data governance Hands-on with Databricks Notebooks and Jobs CI/CD automation with Databricks Repos and DevOps, Asset Bundles Databricks Asset Bundle implementation knowledge Strong understanding of performance tuning in Spark Data quality and validation framework implementation Experience in handling structured, semi-structured data Proficient in debugging and troubleshooting Collaboration with data scientists and analysts Good understanding of security and access control Experience with Mosaic AI or Databricks ML capabilities Exposure to streaming pipelines using Structured Streaming Familiarity with data observability and lineage tools.

Posted 2 weeks ago
