2211 Data Governance Jobs - Page 42

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 7.0 years

10 - 14 Lacs

Mumbai

Work from Office

Summary:
We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
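For candidates gauging fit with this stack: a minimal sketch of the ontology-and-SPARQL workflow the posting describes, using Python's rdflib (the library choice and the `ex:` namespace are assumptions for illustration; production work would align classes with BFO/CCO IRIs rather than invent its own).

```python
from rdflib import Graph, Namespace, Literal, RDF, RDFS

# Hypothetical namespace for illustration only.
EX = Namespace("http://example.com/ontology#")

g = Graph()
g.bind("ex", EX)

# Tiny ontology fragment: a class hierarchy plus one instance and relation.
g.add((EX.Supplier, RDFS.subClassOf, EX.Organization))
g.add((EX.acme, RDF.type, EX.Supplier))
g.add((EX.acme, RDFS.label, Literal("ACME Corp")))
g.add((EX.acme, EX.supplies, EX.widgetLine))

# SPARQL query over the resulting knowledge graph.
results = g.query("""
    PREFIX ex: <http://example.com/ontology#>
    SELECT ?s ?label WHERE {
        ?s a ex:Supplier ;
           rdfs:label ?label .
    }
""")
for row in results:
    print(row.s, row.label)
```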

Posted 3 weeks ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Bengaluru

Remote

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.
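As an illustration of the "complex joins, window functions" bullet: a sketch that runs a window-function query through the Snowflake Python connector (connection parameters and table names are placeholders, not details from the posting).

```python
import snowflake.connector  # pip install snowflake-connector-python

# Placeholder credentials; real deployments read these from a secrets manager.
conn = snowflake.connector.connect(
    account="my_account",
    user="my_user",
    password="my_password",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="MARTS",
)

# Rank each customer's orders by recency with a window function.
sql = """
    SELECT customer_id,
           order_id,
           order_date,
           ROW_NUMBER() OVER (
               PARTITION BY customer_id ORDER BY order_date DESC
           ) AS recency_rank
    FROM orders
"""
cur = conn.cursor()
try:
    cur.execute(sql)
    for customer_id, order_id, order_date, rank in cur.fetchmany(10):
        print(customer_id, order_id, order_date, rank)
finally:
    cur.close()
    conn.close()
```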

Posted 3 weeks ago

Apply

8.0 - 12.0 years

12 - 16 Lacs

Hyderabad

Work from Office

At Storable, we're on a mission to power the future of storage. Our innovative platform helps businesses manage, track, and grow their self-storage operations, and we're looking for a Data Manager to join our data-driven team. Storable is committed to leveraging cutting-edge technologies to improve the efficiency, accessibility, and insights derived from data, empowering our team to make smarter decisions and foster impactful growth.

As a Data Manager, you will play a pivotal role in overseeing and shaping our data operations, ensuring that our data is organized, accessible, and effectively managed across the organization. You will lead a talented team, work closely with cross-functional teams, and drive the development of strategies to enhance data quality, availability, and security.

Key Responsibilities:
- Lead Data Management Strategy: Define and execute the data management vision, strategy, and best practices, ensuring alignment with Storable's business goals and objectives.
- Oversee Data Pipelines: Design, implement, and maintain scalable data pipelines using industry-standard tools to efficiently process and manage large-scale datasets.
- Ensure Data Quality & Governance: Implement data governance policies and frameworks to ensure data accuracy, consistency, and compliance across the organization.
- Manage Cross-Functional Collaboration: Partner with engineering, product, and business teams to make data accessible and actionable and ensure it drives informed decision-making.
- Optimize Data Infrastructure: Leverage modern data tools and platforms (e.g., AWS, Apache Airflow, Apache Iceberg) to create an efficient, reliable, and scalable data infrastructure.
- Monitor & Improve Performance: Proactively monitor data processes and workflows, troubleshoot issues, and optimize performance to ensure high reliability and data integrity.
- Mentorship & Leadership: Lead and develop a team of data engineers and analysts, fostering a collaborative environment where innovation and continuous improvement are valued.

Qualifications:
- Proven Expertise in Data Management: Significant experience in managing data infrastructure, data governance, and optimizing data pipelines at scale.
- Technical Proficiency: Strong hands-on experience with data tools and platforms such as Apache Airflow, Apache Iceberg, and AWS services (S3, Lambda, Redshift, Glue).
- Data Pipeline Mastery: Familiarity with designing, implementing, and optimizing data pipelines and workflows in Python or other languages for data processing.
- Experience with Data Governance: Solid understanding of data privacy, quality control, and governance best practices.
- Leadership Skills: Ability to lead and mentor teams, influence stakeholders, and drive data initiatives across the organization.
- Analytical Mindset: Strong problem-solving abilities and a data-driven approach to improving business operations.
- Excellent Communication: Ability to communicate complex data concepts to both technical and non-technical stakeholders effectively.

Bonus Points:
- Experience with visualization tools (e.g., Looker, Tableau) and reporting frameworks to provide actionable insights.

Why Storable?
- Cutting-Edge Technology: Work with the latest tools and technologies to solve complex data challenges.
- Impactful Work: Join a dynamic and growing company where your work directly contributes to shaping the future of the storage industry.
- Collaborative Culture: Be part of a forward-thinking, inclusive environment where innovation and teamwork are at the core of everything we do.
- Career Growth: We believe in continuous learning and provide ample opportunities for personal and professional development.
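To make the "oversee data pipelines" bullet concrete: a minimal Apache Airflow DAG of the kind such a team typically runs (the DAG id and tasks are hypothetical; `schedule` assumes Airflow 2.4+, older versions use `schedule_interval`).

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    """Pull raw facility metrics from a source system (stub)."""


def load():
    """Load transformed metrics into the warehouse (stub)."""


with DAG(
    dag_id="facility_metrics_daily",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # extract must finish before load starts
```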

Posted 3 weeks ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Hyderabad

Work from Office

What You Will Do:
As a Data Governance Architect at Kanerika, you will play a pivotal role in shaping and executing the enterprise data governance strategy. Your responsibilities include:

1. Strategy, Framework, and Governance Operating Model
- Develop and maintain enterprise-wide data governance strategies, standards, and policies.
- Align governance practices with business goals like regulatory compliance and analytics readiness.
- Define roles and responsibilities within the governance operating model.
- Drive governance maturity assessments and lead change management initiatives.

2. Stakeholder Alignment & Organizational Enablement
- Collaborate across IT, legal, business, and compliance teams to align governance priorities.
- Define stewardship models and create enablement, training, and communication programs.
- Conduct onboarding sessions and workshops to promote governance awareness.

3. Architecture Design for Data Governance Platforms
- Design scalable and modular data governance architecture.
- Evaluate tools like Microsoft Purview, Collibra, Alation, BigID, and Informatica.
- Ensure integration with metadata, privacy, quality, and policy systems.

4. Microsoft Purview Solution Architecture
- Lead end-to-end implementation and management of Microsoft Purview.
- Configure RBAC, collections, metadata scanning, business glossary, and classification rules.
- Implement sensitivity labels, insider risk controls, retention, data map, and audit dashboards.

5. Metadata, Lineage & Glossary Management
- Architect metadata repositories and ingestion workflows.
- Ensure end-to-end lineage (ADF → Synapse → Power BI).
- Define governance over the business glossary and approval workflows.

6. Data Classification, Access & Policy Management
- Define and enforce rules for data classification, access, retention, and sharing.
- Align with GDPR, HIPAA, CCPA, and SOX regulations.
- Use Microsoft Purview and MIP for policy enforcement automation.

7. Data Quality Governance
- Define KPIs, validation rules, and remediation workflows for enterprise data quality.
- Design scalable quality frameworks integrated into data pipelines.

8. Compliance, Risk, and Audit Oversight
- Identify risks and define standards for compliance reporting and audits.
- Configure usage analytics, alerts, and dashboards for policy enforcement.

9. Automation & Integration
- Automate governance processes using PowerShell, Azure Functions, Logic Apps, and REST APIs.
- Integrate governance tools with Azure Monitor, Synapse Link, Power BI, and third-party platforms.

Requirements:
- 5+ years in data governance and management.
- Expertise in Microsoft Purview, Informatica, and related platforms.
- Experience leading end-to-end governance initiatives.
- Strong understanding of metadata, lineage, policy management, and compliance regulations.
- Hands-on skills in Azure Data Factory, REST APIs, PowerShell, and governance architecture.
- Familiarity with Agile methodologies and stakeholder communication.

Benefits:

1. Culture:
- Open Door Policy: Encourages open communication and accessibility to management.
- Open Office Floor Plan: Fosters a collaborative and interactive work environment.
- Flexible Working Hours: Allows employees to have flexibility in their work schedules.
- Employee Referral Bonus: Rewards employees for referring qualified candidates.
- Appraisal Process Twice a Year: Provides regular performance evaluations and feedback.

2. Inclusivity and Diversity:
- Hiring practices that promote diversity: Ensures a diverse and inclusive workforce.
- Mandatory POSH training: Promotes a safe and respectful work environment.

3. Health Insurance and Wellness Benefits:
- GMC and Term Insurance: Offers medical coverage and financial protection.
- Health Insurance: Provides coverage for medical expenses.
- Disability Insurance: Offers financial support in case of disability.

4. Child Care & Parental Leave Benefits:
- Company-sponsored family events: Creates opportunities for employees and their families to bond.
- Generous Parental Leave: Allows parents to take time off after the birth or adoption of a child.
- Family Medical Leave: Offers leave for employees to take care of family members' medical needs.

5. Perks and Time-Off Benefits:
- Company-sponsored outings: Organizes recreational activities for employees.
- Gratuity: Provides a monetary benefit as a token of appreciation.
- Provident Fund: Helps employees save for retirement.
- Generous PTO: Offers more than the industry standard for paid time off.
- Paid sick days: Allows employees to take paid time off when they are unwell.
- Paid holidays: Gives employees paid time off for designated holidays.
- Bereavement Leave: Provides time off for employees to grieve the loss of a loved one.

6. Professional Development Benefits:
- L&D with FLEX, an enterprise learning repository: Provides access to a learning repository for professional development.
- Mentorship Program: Offers guidance and support from experienced professionals.
- Job Training: Provides training to enhance job-related skills.
- Professional Certification Reimbursements: Assists employees in obtaining professional certifications.
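For a feel of the automation item (section 9): a sketch that authenticates to a Microsoft Purview account and lists type definitions through its Atlas-compatible REST API. The account name is a placeholder, and the endpoint shape is our reading of the public API docs, not something stated in the posting.

```python
import requests
from azure.identity import DefaultAzureCredential  # pip install azure-identity

ACCOUNT = "contoso-purview"  # placeholder Purview account name

# Token scoped to Purview; DefaultAzureCredential picks up CLI/env credentials.
token = DefaultAzureCredential().get_token("https://purview.azure.net/.default")

resp = requests.get(
    f"https://{ACCOUNT}.purview.azure.com/catalog/api/atlas/v2/types/typedefs",
    headers={"Authorization": f"Bearer {token.token}"},
    timeout=30,
)
resp.raise_for_status()
print(len(resp.json().get("entityDefs", [])), "entity type definitions")
```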

Posted 3 weeks ago

Apply

6.0 - 8.0 years

12 - 16 Lacs

Pune

Work from Office

I am sharing the job description (JD) for the Data Architect role. We are looking for someone who can join as soon as possible; the key points are below. The ideal candidate should have 6 to 8 years of experience, and I am flexible on this for a strong profile.

Job Description:

Key Responsibilities:
The ideal profile has a strong foundation in data concepts, design, and strategy, with the ability to work across diverse technologies in a tool-agnostic manner.

Transactional Database Architecture:
- Design and implement high-performance, reliable, and scalable transactional database architectures.
- Collaborate with cross-functional teams to understand transactional data requirements and create solutions that ensure data consistency, integrity, and availability.
- Optimize database designs and recommend best practices and technology stacks.
- Oversee the management of the entire transactional database estate, including modernization and de-duplication initiatives.

Data Lake Architecture:
- Design and implement data lakes that consolidate data from disparate sources into a unified, scalable storage solution.
- Architect and deploy cloud-based or on-premises data lake infrastructure.
- Ensure self-service capabilities across the data engineering space for the business.
- Work closely with Data Engineers, Product Owners, and Business teams.

Data Integration & Governance:
- Understand ingestion and orchestration strategies.
- Implement data sharing and data exchange, and assess data sensitivity and criticality to recommend appropriate designs.
- Basic understanding of data governance practices.

Innovation:
- Evaluate and implement new technologies, tools, and frameworks to improve data accessibility, performance, and scalability.
- Stay up to date with industry trends and best practices to continuously innovate and enhance the data architecture strategy.

Location: Pune
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
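To ground the transactional-architecture bullets: a minimal sketch of a normalized design with integrity constraints, shown in SQLite for portability (table names are illustrative, not from the posting).

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("PRAGMA foreign_keys = ON")  # enforce referential integrity

conn.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        email       TEXT NOT NULL UNIQUE
    );

    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER NOT NULL REFERENCES customers(customer_id),
        placed_at   TEXT NOT NULL,
        amount      NUMERIC NOT NULL CHECK (amount >= 0)
    );

    -- Keep the common "orders by customer" lookup fast.
    CREATE INDEX idx_orders_customer ON orders(customer_id);
""")

conn.execute("INSERT INTO customers (email) VALUES ('a@example.com')")
conn.execute(
    "INSERT INTO orders (customer_id, placed_at, amount) VALUES (1, '2024-01-01', 99.5)"
)
print(conn.execute("SELECT COUNT(*) FROM orders").fetchone()[0])
```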

Posted 3 weeks ago

Apply

3.0 - 6.0 years

6 - 10 Lacs

Hyderabad

Work from Office

Job Title: Data Governance Associate
Location: Hyderabad
Experience: 4-6 Years

Key Responsibilities:

1. Data Quality and Master Data Management:
- Assist in the development and implementation of data quality frameworks and master data management processes.
- Monitor and report on data quality metrics, identifying areas for improvement and ensuring compliance with data governance standards.

2. Data Object Dictionary:
- Support the creation and maintenance of a comprehensive data object dictionary to ensure that data assets are well documented and easily accessible.
- Collaborate with cross-functional teams to standardize terminology and enhance understanding of data objects across the organization.

3. Data Governance Support:
- Assist in the execution of data governance initiatives, including data stewardship and data lifecycle management.
- Participate in data governance meetings and contribute to the development of policies and procedures that promote data integrity and compliance.

4. Collaboration and Communication:
- Work closely with data owners, data stewards, and other stakeholders to ensure alignment on data governance practices.
- Communicate effectively with technical and non-technical teams to promote data governance awareness and best practices.

Qualifications:

Education:
- Bachelor's degree in Computer Science, Information Management, or a related field.

Experience:
- 3 to 4 years of experience in data governance, data quality, and master data management.
- Familiarity with data object dictionaries and data documentation practices.
- Experience in monitoring and improving data quality metrics.

Technical Skills:
- Basic proficiency in SQL for querying and extracting data from databases.
- Familiarity with data governance tools and platforms is a plus.
- Understanding of data integration techniques and tools.

Soft Skills:
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication skills, with the ability to convey complex data concepts clearly.
- Ability to work collaboratively in a team environment and manage multiple tasks effectively.

Preferred Qualifications:
- Experience in supporting data governance initiatives and projects.
- Knowledge of data quality principles and practices.
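A sketch of the kind of data-quality monitoring the first responsibility implies, using pandas (the dataframe, metric names, and thresholds are illustrative):

```python
import pandas as pd

# Toy dataset standing in for a master-data extract.
df = pd.DataFrame({
    "customer_id": [1, 2, 2, 4],
    "email": ["a@x.com", None, "b@x.com", "b@x.com"],
})

metrics = {
    "null_rate_email": df["email"].isna().mean(),
    "duplicate_rate_customer_id": df["customer_id"].duplicated().mean(),
}

# Thresholds a governance team might agree on; values here are made up.
thresholds = {"null_rate_email": 0.05, "duplicate_rate_customer_id": 0.0}

for name, limit in thresholds.items():
    status = "PASS" if metrics[name] <= limit else "FAIL"
    print(f"{name}: {metrics[name]:.2%} (limit {limit:.0%}) -> {status}")
```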

Posted 3 weeks ago

Apply

10.0 - 14.0 years

45 - 55 Lacs

Bengaluru

Work from Office

As a Senior Engineering Manager - Myntra Data Platform, you will oversee the technical aspects of the data platform, driving innovation and ensuring efficient data management processes. Your role will have a significant impact on the organization's data strategy and overall business objectives.

Roles and Responsibilities:
- Lead and mentor a team of engineers to deliver high-quality data solutions.
- Develop and execute strategies for data platform scalability and performance optimization.
- Collaborate with cross-functional teams to align data platform initiatives with business goals.
- Define and implement best practices for data governance, security, and compliance.
- Drive continuous improvement through innovation and technological advancement.
- Monitor and analyze data platform metrics to identify areas for enhancement.
- Ensure seamless integration of new data sources and technologies into the platform.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10-14 years of experience in engineering roles with a focus on data management and analysis.
- Proven experience in leading high-performing engineering teams.
- Strong proficiency in data architecture, ETL processes, and database technologies.
- Excellent communication and collaboration skills to work effectively with stakeholders.
- Relevant certifications in data management or related fields are a plus.

Who are we?
Myntra is India's leading fashion and lifestyle platform, where technology meets creativity. As pioneers in fashion e-commerce, we've always believed in disrupting the ordinary. We thrive on a shared passion for fashion, a drive to innovate to lead, and an environment that empowers each one of us to pave our own way. We're bold in our thinking, agile in our execution, and collaborative in spirit. Here, we create MAGIC by inspiring vibrant and joyous self-expression and expanding fashion possibilities for India, while staying true to what we believe in. We believe in taking bold bets and changing the fashion landscape of India. We are a company that is constantly evolving into newer and better forms, and we look for people who are ready to evolve with us. From our humble beginnings as a customization company in 2007 to being technology and fashion pioneers today, Myntra is going places, and we want you to take part in this journey with us. Working at Myntra is challenging but fun: we are a young and dynamic team, firm believers in meritocracy, believe in equal opportunity, encourage intellectual curiosity, and empower our teams with the right tools, space, and opportunities.

Posted 3 weeks ago

Apply

9.0 - 14.0 years

0 - 0 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role & responsibilities:
- Design and develop conceptual, logical, and physical data models for enterprise and application-level databases.
- Translate business requirements into well-structured data models that support analytics, reporting, and operational systems.
- Define and maintain data standards, naming conventions, and metadata for consistency across systems.
- Collaborate with data architects, engineers, and analysts to implement models into databases and data warehouses/lakes.
- Analyze existing data systems and provide recommendations for optimization, refactoring, and improvements.
- Create entity relationship diagrams (ERDs) and data flow diagrams to document data structures and relationships.
- Support data governance initiatives including data lineage, quality, and cataloging.
- Review and validate data models with business and technical stakeholders.
- Provide guidance on normalization, denormalization, and performance tuning of database designs.
- Ensure models comply with organizational data policies, security, and regulatory requirements.

In short: we are looking for a Data Modeler / Architect to design conceptual, logical, and physical data models, translating business needs into scalable models for analytics and operational systems. Must be strong in normalization, denormalization, ERDs, and data governance practices; experience in star/snowflake schemas and medallion architecture is preferred. The role requires close collaboration with architects, engineers, and analysts.

Keywords: data modelling, normalization, denormalization, star and snowflake schemas, medallion architecture, ERD, logical data model, physical data model, conceptual data model.
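Since the posting highlights medallion architecture feeding a star schema, here is a compact PySpark sketch of that flow (paths, columns, and table names are assumptions for illustration):

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("medallion-sketch").getOrCreate()

# Bronze: raw ingest, as landed.
bronze = spark.read.json("/lake/bronze/orders/")

# Silver: cleansed and conformed.
silver = (bronze
          .dropDuplicates(["order_id"])
          .withColumn("order_date", F.to_date("order_ts")))

# Gold: star-schema shape -- a fact table plus a date dimension.
fact_orders = silver.select("order_id", "customer_id", "order_date", "amount")
dim_date = (silver.select("order_date").distinct()
                  .withColumn("year", F.year("order_date")))

fact_orders.write.mode("overwrite").parquet("/lake/gold/fact_orders/")
dim_date.write.mode("overwrite").parquet("/lake/gold/dim_date/")
```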

Posted 3 weeks ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Chennai

Work from Office

Role Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for an enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality.
- Provide training and support for end users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.
- Participate in regular team meetings to discuss project progress and share insights.
- Assist in the development of training materials for internal stakeholders.

Qualifications:
- Minimum 8 years of experience in Power BI and Figma.
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team in a remote setting.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.
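One way to picture the UAT bullet: reconciling a figure shown on a dashboard card against the source system. A toy sketch (all numbers and table names invented):

```python
import sqlite3

import pandas as pd

# Stand-in source system.
conn = sqlite3.connect(":memory:")
pd.DataFrame({"region": ["N", "S"], "revenue": [120.0, 80.0]}).to_sql(
    "sales", conn, index=False
)

source_total = pd.read_sql("SELECT SUM(revenue) AS t FROM sales", conn)["t"][0]
dashboard_total = 200.0  # value read off the Power BI card under test

# UAT check: the report must reconcile with the source within tolerance.
assert abs(source_total - dashboard_total) < 0.01, "Report does not reconcile"
print("UAT check passed:", source_total)
```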

Posted 3 weeks ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Kolkata

Work from Office

Summary:
We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.

Posted 4 weeks ago

Apply

7.0 - 10.0 years

5 - 8 Lacs

Chennai

Work from Office

Employment Type: Contract (Remote).

Job Summary:
We are looking for a highly skilled Data Engineer / Data Modeler with strong experience in Snowflake, DBT, and GCP to support our data infrastructure and modeling initiatives. The ideal candidate should possess excellent SQL skills, hands-on experience with Erwin Data Modeler, and a strong background in modern data architectures and data modeling techniques.

Key Responsibilities:
- Design and implement scalable data models using Snowflake and Erwin Data Modeler.
- Create, maintain, and enhance data pipelines using DBT and GCP (BigQuery, Cloud Storage, Dataflow).
- Perform reverse engineering on existing systems (e.g., Sailfish/DDMS) using DBeaver or similar tools to understand and rebuild data models.
- Develop efficient SQL queries and stored procedures for data transformation, quality, and validation.
- Collaborate with business analysts and stakeholders to gather data requirements and convert them into logical and physical models.
- Ensure performance tuning, security, and optimization of the Snowflake data warehouse.
- Document metadata, data lineage, and the business logic behind data structures and flows.
- Participate in code reviews, enforce coding standards, and promote best practices for data modeling and governance.

Must-Have Skills:
- Snowflake architecture, schema design, and data warehouse experience.
- DBT (Data Build Tool) for data transformation and pipeline development.
- Strong expertise in SQL (query optimization, complex joins, window functions, etc.).
- Hands-on experience with Erwin Data Modeler (logical and physical modeling).
- Experience with GCP (BigQuery, Cloud Composer, Cloud Storage).
- Experience in reverse engineering legacy systems such as Sailfish or DDMS using DBeaver.

Good to Have:
- Experience with CI/CD tools and DevOps for data environments.
- Familiarity with data governance, security, and privacy practices.
- Exposure to Agile methodologies and distributed teams.
- Knowledge of Python for data engineering tasks and orchestration scripts.

Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and stakeholder management.
- Self-driven, with the ability to work independently in a remote setup.

Posted 4 weeks ago

Apply

5.0 - 7.0 years

10 - 14 Lacs

Chennai

Work from Office

Job Title: Sr. Data Engineer, Ontology & Knowledge Graph Specialist
Department: Platform Engineering

Summary:
We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. This role will be pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will leverage industry best practices, including the Basic Formal Ontology (BFO) and Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform that drive data-driven decision-making and innovation within the company.

Responsibilities:

Ontology Development:
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies to model various business entities, relationships, and processes.

Data Modeling:
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation:
- Design and build knowledge graphs based on ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Utilize knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance:
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.

Collaboration and Communication:
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications:

Education:
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience:
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills:
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.

Posted 4 weeks ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Role Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for an enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality.
- Provide training and support for end users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.
- Participate in regular team meetings to discuss project progress and share insights.
- Assist in the development of training materials for internal stakeholders.

Qualifications:
- Minimum 8 years of experience in Power BI and Figma.
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team in a remote setting.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.

Posted 4 weeks ago

Apply

8.0 - 12.0 years

10 - 14 Lacs

Kolkata

Work from Office

Role Responsibilities:
- Develop and design comprehensive Power BI reports and dashboards.
- Collaborate with stakeholders to understand reporting needs and translate them into functional requirements.
- Create visually appealing interfaces using Figma for an enhanced user experience.
- Utilize SQL for data extraction and manipulation to support reporting requirements.
- Implement DAX measures to ensure accurate data calculations.
- Conduct data analysis to derive actionable insights and facilitate decision-making.
- Perform user acceptance testing (UAT) to validate report performance and functionality.
- Provide training and support for end users on dashboards and reporting tools.
- Monitor and enhance the performance of existing reports on an ongoing basis.
- Work closely with cross-functional teams to align project objectives with business goals.
- Maintain comprehensive documentation for all reporting activities and processes.
- Stay updated on industry trends and best practices related to data visualization and analytics.
- Ensure compliance with data governance and security standards.
- Participate in regular team meetings to discuss project progress and share insights.
- Assist in the development of training materials for internal stakeholders.

Qualifications:
- Minimum 8 years of experience in Power BI and Figma.
- Strong proficiency in SQL and database management.
- Extensive knowledge of data visualization best practices.
- Expertise in DAX for creating advanced calculations.
- Proven experience in designing user interfaces with Figma.
- Excellent analytical and problem-solving skills.
- Ability to communicate complex data insights to non-technical stakeholders.
- Strong attention to detail and commitment to quality.
- Experience with business analytics and reporting tools.
- Familiarity with data governance and compliance regulations.
- Ability to work independently and as part of a team in a remote setting.
- Strong time management skills and ability to prioritize tasks.
- Ability to adapt to fast-paced working environments.
- Strong interpersonal skills and stakeholder engagement capability.
- Relevant certifications in Power BI or data analytics are a plus.

Posted 4 weeks ago

Apply

7.0 - 9.0 years

7 - 12 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

About KPI Partners:
KPI Partners is a leading provider of enterprise software solutions, enabling organizations to harness the power of data through innovation and advanced analytics. Our team is dedicated to delivering high-quality services and products that transform the way businesses operate.

We are seeking a seasoned Senior Report Developer – Tableau to drive and manage a strategic migration project from Tableau to Power BI. This role requires strong hands-on experience with Tableau dashboards, a deep understanding of Power BI, and the technical leadership to manage end-to-end migration with minimal disruption to business users. You will serve as the bridge between business stakeholders, BI developers, and project managers, ensuring a seamless transition while maintaining data integrity, performance, and user experience.

Key Responsibilities:
- Lead the assessment, planning, and execution of the Tableau-to-Power BI migration.
- Analyze existing Tableau dashboards, data models, and data sources to define migration scope and approach.
- Translate Tableau visualizations and logic (calculated fields, filters, LOD expressions, etc.) into Power BI equivalents.
- Oversee data model optimization, dashboard redesign, and user experience improvements during migration.
- Collaborate with business stakeholders to validate migrated reports and ensure alignment with reporting needs.
- Manage a team of BI developers and provide technical direction and code reviews.
- Ensure proper version control, documentation, and change management throughout the migration.
- Establish performance benchmarks and implement best practices for Power BI deployment.
- Conduct knowledge transfer and training sessions for end users transitioning to Power BI.

Required Qualifications:
- 7+ years of experience in Business Intelligence and Analytics.
- 5+ years of hands-on experience with Tableau development (dashboards, data blending, LODs, parameters).
- 2+ years of experience with Power BI, including DAX, Power Query, and data modeling.
- Proven experience in BI migration projects (preferably Tableau to Power BI).
- Strong SQL skills and experience working with relational databases (SQL Server, Oracle, etc.).
- Solid understanding of data governance, security, and access controls.
- Excellent communication, leadership, and stakeholder management skills.

Why Join KPI Partners?
- Opportunity to work with a talented and passionate team in a fast-paced environment.
- Competitive salary and benefits package.
- Continuous learning and professional development opportunities.
- A collaborative and inclusive workplace culture that values diversity and innovation.

KPI Partners is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. Join us at KPI Partners and help us unlock the power of data for our clients!
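Migration projects like this usually start with an inventory of what exists on the Tableau side. A sketch using the tableauserverclient library (server URL, site, and credentials are placeholders):

```python
import tableauserverclient as TSC  # pip install tableauserverclient

auth = TSC.TableauAuth("migration_user", "password", site_id="analytics")
server = TSC.Server("https://tableau.example.com", use_server_version=True)

# List workbooks in scope for the Tableau-to-Power BI migration.
with server.auth.sign_in(auth):
    workbooks, _ = server.workbooks.get()
    for wb in workbooks:
        print(wb.project_name, "/", wb.name)
```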

Posted 4 weeks ago

Apply

12.0 - 15.0 years

16 - 20 Lacs

Chennai

Work from Office

Group Company: MINDSPRINT DIGITAL (INDIA) PRIVATE LIMITED
Designation: Data Solution Architect

Job Description:
- Design, architect, and implement scalable data solutions on Google Cloud Platform (GCP) to meet the strategic data needs of the organization.
- Lead the integration of diverse data sources into a unified data platform, ensuring seamless data flow and accessibility across the organization.
- Develop and enforce robust data governance, security, and compliance frameworks tailored to GCP's architecture.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to translate business requirements into technical data solutions.
- Optimize data storage, processing, and analytics solutions using GCP services such as BigQuery, Dataflow, and Cloud Storage.
- Drive the adoption of best practices in data architecture and cloud computing to enhance the performance, reliability, and scalability of data solutions.
- Conduct regular reviews and audits of the data architecture to ensure alignment with evolving business goals and technology advancements.
- Stay informed about emerging GCP technologies and industry trends to continuously improve data solutions and drive innovation.

Profile Description:
- Experience: 12-15 years of experience in data architecture, with extensive expertise in Google Cloud Platform (GCP).
- Skills: Deep understanding of GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM. Proficiency in data modeling, ETL processes, and data warehousing.
- Qualifications: Master's degree in Computer Science, Data Engineering, or a related field.
- Competencies: Strong leadership abilities, with a proven track record of managing large-scale data projects. Ability to balance technical and business needs in designing data solutions.
- Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification preferred.
- Knowledge: Extensive knowledge of data governance, security best practices, and compliance in cloud environments. Familiarity with big data technologies like Apache Hadoop and Spark.
- Soft Skills: Excellent communication skills to work effectively with both technical teams and business stakeholders. Ability to lead and mentor a team of data engineers and architects.
- Tools: Experience with version control (Git), CI/CD pipelines, and automation tools. Proficient in SQL, Python, and data visualization tools like Looker or Power BI.
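For the BigQuery optimization item, a minimal sketch with the official client library (project, dataset, and table names are placeholders):

```python
from google.cloud import bigquery  # pip install google-cloud-bigquery

client = bigquery.Client(project="my-gcp-project")

# Daily event counts; aggregating in SQL keeps scanned bytes low.
query = """
    SELECT event_date, COUNT(*) AS events
    FROM `my-gcp-project.analytics.events`
    GROUP BY event_date
    ORDER BY event_date DESC
    LIMIT 7
"""
for row in client.query(query).result():
    print(row.event_date, row.events)
```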

Posted 4 weeks ago

Apply

12.0 - 15.0 years

15 - 19 Lacs

Chennai

Work from Office

Group Company: MINDSPRINT DIGITAL (INDIA) PRIVATE LIMITED
Designation: Data Solution Architect

Job Description:
- Design, architect, and implement scalable data solutions on Google Cloud Platform (GCP) to meet the strategic data needs of the organization.
- Lead the integration of diverse data sources into a unified data platform, ensuring seamless data flow and accessibility across the organization.
- Develop and enforce robust data governance, security, and compliance frameworks tailored to GCP's architecture.
- Collaborate with cross-functional teams, including data engineers, data scientists, and business stakeholders, to translate business requirements into technical data solutions.
- Optimize data storage, processing, and analytics solutions using GCP services such as BigQuery, Dataflow, and Cloud Storage.
- Drive the adoption of best practices in data architecture and cloud computing to enhance the performance, reliability, and scalability of data solutions.
- Conduct regular reviews and audits of the data architecture to ensure alignment with evolving business goals and technology advancements.
- Stay informed about emerging GCP technologies and industry trends to continuously improve data solutions and drive innovation.

Profile Description:
- Experience: 12-15 years of experience in data architecture, with extensive expertise in Google Cloud Platform (GCP).
- Skills: Deep understanding of GCP services including BigQuery, Dataflow, Pub/Sub, Cloud Storage, and IAM. Proficiency in data modeling, ETL processes, and data warehousing.
- Qualifications: Master's degree in Computer Science, Data Engineering, or a related field.
- Competencies: Strong leadership abilities, with a proven track record of managing large-scale data projects. Ability to balance technical and business needs in designing data solutions.
- Certifications: Google Cloud Professional Data Engineer or Professional Cloud Architect certification preferred.
- Knowledge: Extensive knowledge of data governance, security best practices, and compliance in cloud environments. Familiarity with big data technologies like Apache Hadoop and Spark.
- Soft Skills: Excellent communication skills to work effectively with both technical teams and business stakeholders. Ability to lead and mentor a team of data engineers and architects.
- Tools: Experience with version control (Git), CI/CD pipelines, and automation tools. Proficient in SQL, Python, and data visualization tools like Looker or Power BI.

Posted 4 weeks ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Duration: 6 Months
Notice Period: within 15 days, or immediate joiner
Experience: 3-6 Years

About The Role:
As a Data Engineer for the Data Science team, you will play a pivotal role in enriching and maintaining the organization's central repository of datasets. This repository serves as the backbone for advanced data analytics and machine learning applications, enabling actionable insights from financial and market data. You will work closely with cross-functional teams to design and implement robust ETL pipelines that automate data updates and ensure accessibility across the organization. This is a critical role requiring technical expertise in building scalable data pipelines, ensuring data quality, and supporting data analytics and reporting infrastructure for business growth.

Note: Must be ready for a face-to-face interview in Bangalore (last round). Should be working with Azure as the cloud technology.

Key Responsibilities:

ETL Development:
- Design, develop, and maintain efficient ETL processes for handling multi-scale datasets.
- Implement and optimize data transformation and validation processes to ensure data accuracy and consistency.
- Collaborate with cross-functional teams to gather data requirements and translate business logic into ETL workflows.

Data Pipeline Architecture:
- Architect, build, and maintain scalable and high-performance data pipelines to enable seamless data flow.
- Evaluate and implement modern technologies to enhance the efficiency and reliability of data pipelines.
- Build pipelines for extracting data via web scraping to source sector-specific datasets on an ad hoc basis.

Data Modeling:
- Design and implement data models to support analytics and reporting needs across teams.
- Optimize database structures to enhance performance and scalability.

Data Quality and Governance:
- Develop and implement data quality checks and governance processes to ensure data integrity.
- Collaborate with stakeholders to define and enforce data quality standards across the organization.

Documentation and Communication:
- Maintain detailed documentation of ETL processes, data models, and other key workflows.
- Effectively communicate complex technical concepts to non-technical stakeholders and business teams.

Collaboration:
- Work closely with the Quant team and developers to design and optimize data pipelines.
- Collaborate with external stakeholders to understand business requirements and translate them into technical solutions.

Basic Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- Proven experience as a Data Engineer with expertise in ETL techniques.
- 3-6 years of strong programming experience in languages such as Python, Java, or Scala.
- Hands-on experience in web scraping to extract and transform data from publicly available web sources.
- Proficiency with cloud-based data platforms such as AWS, Azure, or GCP.
- Strong knowledge of SQL and experience with relational and non-relational databases.
- Familiarity with big data technologies like Hadoop, Spark, and Kafka.
- Experience with data modeling tools and techniques.
- Deep understanding of data warehousing concepts.
- Excellent problem-solving, analytical, and communication skills.

Preferred Qualifications:
- Master's degree in Computer Science or Data Science.
- Knowledge of data streaming and real-time processing frameworks.
- Familiarity with data governance and security best practices.
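The ad hoc web-scraping pipelines mentioned above might look like this end to end: extract a public HTML table and land it in the central repository (the URL, selectors, and output path are placeholders).

```python
import pandas as pd
import requests
from bs4 import BeautifulSoup  # pip install beautifulsoup4

resp = requests.get("https://example.com/sector-stats", timeout=30)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
rows = []
for tr in soup.select("table tr")[1:]:  # skip the header row
    cells = [td.get_text(strip=True) for td in tr.select("td")]
    if len(cells) == 2:
        rows.append({"sector": cells[0], "value": cells[1]})

df = pd.DataFrame(rows)
df.to_parquet("sector_stats.parquet")  # land in the central dataset repository
```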

Posted 4 weeks ago

Apply

4.0 - 6.0 years

6 - 11 Lacs

Noida

Work from Office

Responsibilities:
- Collaborate with the sales team to understand customer challenges and business objectives and propose solutions, POCs, etc.
- Develop and deliver impactful technical presentations and demos showcasing the capabilities of GCP Data and AI and GenAI solutions.
- Conduct technical proof-of-concepts (POCs) to validate the feasibility and value proposition of GCP solutions.
- Collaborate with technical specialists and solution architects from the COE team to design and configure tailored cloud solutions.
- Manage and qualify sales opportunities, working closely with the sales team to progress deals through the sales funnel.
- Stay up to date on the latest GCP offerings, trends, and best practices.

Experience:
- Design and implement a comprehensive strategy for migrating and modernizing existing relational on-premise databases to scalable and cost-effective solutions on Google Cloud Platform (GCP).
- Design and architect solutions for DWH modernization, with experience building data pipelines in GCP.
- Strong experience in BI reporting tools (Looker, Power BI, and Tableau).
- In-depth knowledge of Google Cloud Platform (GCP) services, particularly Cloud SQL, Postgres, AlloyDB, BigQuery, Looker, Vertex AI, and Gemini (GenAI).
- Strong knowledge and experience in providing solutions to process massive datasets in real time and in batch using cloud-native/open-source orchestration techniques.
- Build and maintain data pipelines using Cloud Dataflow to orchestrate real-time and batch data processing for streaming and historical data.
- Strong knowledge and experience in best practices for data governance, security, and compliance.
- Excellent communication and presentation skills, with the ability to tailor technical information to customer needs.
- Strong analytical and problem-solving skills.
- Ability to work independently and as part of a team.
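The Cloud Dataflow bullet refers to Apache Beam pipelines; a minimal Beam sketch, run locally here on the default DirectRunner with made-up data (on GCP the same code would target the DataflowRunner):

```python
import apache_beam as beam  # pip install apache-beam

with beam.Pipeline() as p:
    (p
     | "Read" >> beam.Create(["2024-01-01,120", "2024-01-02,80"])
     | "Parse" >> beam.Map(lambda line: line.split(","))
     | "ToKV" >> beam.Map(lambda f: (f[0], int(f[1])))
     | "SumPerDay" >> beam.CombinePerKey(sum)
     | "Print" >> beam.Map(print))
```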

Posted 4 weeks ago

Apply

3.0 - 8.0 years

9 - 18 Lacs

Bengaluru

Work from Office

* 3+ years of experience in data governance projects.
* Import technical metadata from different resources into the EDC environment.
* Ensure population of data lineage between tables and fields.
* Create custom lineage.
* Should have knowledge of the various APIs required to extract details from EDC.
* Should be able to analyse IICS mappings and create lineage mappings accordingly.
* Should have knowledge of SQL and be able to analyse view definitions.
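On the API point: a hedged sketch of querying EDC's catalog over REST. The endpoint path and parameters below are our assumption based on EDC's object catalog API, not details from the posting; host and credentials are placeholders.

```python
import requests

# Assumed EDC object-catalog endpoint; verify against your EDC version's docs.
EDC_URL = "https://edc.example.com:9085/access/2/catalog/data/objects"

resp = requests.get(
    EDC_URL,
    params={"q": "core.name:customer*", "offset": 0, "pageSize": 20},
    auth=("edc_user", "edc_password"),  # placeholder credentials
    timeout=30,
)
resp.raise_for_status()
for item in resp.json().get("items", []):
    print(item.get("id"))
```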

Posted 4 weeks ago

Apply

3.0 - 6.0 years

27 - 42 Lacs

Chennai

Work from Office

1. Job Title: GeoSpatial Sr. Analyst MAP
2. Job Summary: The GeoSpatial Sr. Analyst MAP will play a crucial role in analyzing and interpreting geospatial data to support strategic decision-making. With a focus on NetworX technologies, the analyst will ensure the effective use of geospatial data in various projects. The role requires a blend of technical expertise and analytical skills to drive impactful outcomes in a hybrid work model.
3. Experience: 3-6 years
4. Required Technical Skills: NetworX - Pricer, NetworX - Facets Pricer, NetworX - Modeler, NetworX
5. Technology: Custom Service
6. Shift: Day
7. Responsibilities:
- Analyze geospatial data using advanced NetworX tools to support strategic business decisions.
- Collaborate with cross-functional teams to integrate geospatial insights into project planning and execution.
- Develop and maintain geospatial databases to ensure data accuracy and accessibility.
- Provide detailed reports and visualizations to communicate geospatial findings to stakeholders.
- Utilize NetworX - Pricer and NetworX - Facets Pricer to optimize pricing strategies and models.
- Implement NetworX - Modeler to simulate and predict geospatial trends and patterns.
- Ensure compliance with data governance and security protocols in all geospatial analyses.
- Conduct regular audits of geospatial data to maintain high-quality standards.
- Support the development of geospatial strategies that align with organizational goals.
- Train team members on the effective use of geospatial tools and technologies.
- Monitor industry trends to keep the organization at the forefront of geospatial innovation.
- Facilitate workshops and presentations to share geospatial insights with internal and external audiences.
- Contribute to the continuous improvement of geospatial processes and methodologies.

Qualifications:
- Possess a strong background in geospatial analysis with experience in NetworX technologies.
- Demonstrate proficiency in NetworX - Pricer, NetworX - Facets Pricer, and NetworX - Modeler.
- Exhibit excellent analytical and problem-solving skills.
- Have a minimum of 3 years of experience in a similar role.
- Show ability to work effectively in a hybrid work environment.
- Display strong communication skills to convey complex geospatial concepts.
- Be detail-oriented with a focus on data accuracy and integrity.

8. Job Location: Primary Location: INTNCHNA16 (ITIND COG KITS Campus (CKC) SDB2&3 SEZ); Alternate Location: NA
9. Job Type: Business Associate [60CW00]
10. Demand Requires Travel?: No
11. Certifications Required: N/A

Posted 4 weeks ago

Apply

12.0 - 17.0 years

5 - 9 Lacs

Mumbai

Work from Office

Project Role: Business Function Implement Practitioner
Project Role Description: Support the implementation of activities for a specific business function to improve performance for that function end to end. Activities include analyzing and designing/re-designing business processes and/or defining parts of an organization.
Must-have skills: SAP Master Data Governance (MDG) Tool
Good-to-have skills: NA
Minimum 12 year(s) of experience is required.
Educational Qualification: MBA

Summary:
As a Business Function Implement Practitioner, you will support the implementation of activities for a specific business function to improve performance end to end. You will be involved in analyzing and designing/re-designing business processes and defining parts of an organization in Mumbai.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Lead the implementation of the SAP Master Data Governance (MDG) Tool.
- Provide expertise in optimizing business processes.
- Contribute to the strategic direction of the organization.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in the SAP Master Data Governance (MDG) Tool.
- Strong understanding of data governance principles.
- Experience in implementing SAP MDG solutions.
- Knowledge of SAP ERP systems.
- Familiarity with data modeling and data management best practices.

Additional Information:
- The candidate should have a minimum of 12 years of experience with the SAP Master Data Governance (MDG) Tool.
- This position is based at our Mumbai office.
- An MBA degree is required.

Posted 4 weeks ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Ab Initio
Good-to-have skills: NA
Minimum 5 year(s) of experience is required.
Educational Qualification: 15 years of full-time education

Summary:
As an Application Developer, you will be involved in designing, building, and configuring applications to meet business process and application requirements. Your typical day will revolve around creating innovative solutions to address various business needs and ensuring seamless application functionality.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Lead and mentor junior professionals.
- Conduct regular knowledge-sharing sessions within the team.
- Stay updated on the latest industry trends and technologies.

Professional & Technical Skills:
- Must-Have Skills: Proficiency in Ab Initio.
- Strong understanding of ETL processes.
- Experience in data warehousing concepts.
- Hands-on experience with data integration tools.
- Knowledge of data quality and data governance principles.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Ab Initio.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 4 weeks ago

Apply

5.0 - 7.0 years

11 - 15 Lacs

Coimbatore

Work from Office

Mandatory Skills: Apache Spark, Hive, Hadoop, Scala, Databricks

The Role:
- Designing and building optimized data pipelines using cutting-edge technologies in a cloud environment to drive analytical insights.
- Constructing infrastructure for efficient ETL processes from various sources and storage systems.
- Leading the implementation of algorithms and prototypes to transform raw data into useful information.
- Architecting, designing, and maintaining database pipeline architectures, ensuring readiness for AI/ML transformations.
- Creating innovative data validation methods and data analysis tools.
- Ensuring compliance with data governance and security policies.
- Interpreting data trends and patterns to establish operational alerts.
- Developing analytical tools, programs, and reporting mechanisms.
- Conducting complex data analysis and presenting results effectively.
- Preparing data for prescriptive and predictive modeling.
- Continuously exploring opportunities to enhance data quality and reliability.
- Applying strong programming and problem-solving skills to develop scalable solutions.

Requirements:
- Experience in Big Data technologies (Hadoop, Spark, NiFi, Impala).
- 5+ years of hands-on experience designing, building, deploying, testing, maintaining, monitoring, and owning scalable, resilient, and distributed data pipelines.
- High proficiency in Scala/Java and Spark for applied large-scale data processing.
- Expertise with big data technologies, including Spark, Data Lake, and Hive.
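A compact PySpark rendering of the pipeline-plus-validation duties above (the posting leans Scala, but the shape is the same; paths, rules, and table names are illustrative):

```python
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("etl-sketch")
         .enableHiveSupport()
         .getOrCreate())

raw = spark.read.option("header", True).csv("/data/incoming/transactions/")

# Data-validation step: quarantine rows that fail a basic type check.
valid = raw.filter(F.col("amount").cast("double").isNotNull())
invalid = raw.subtract(valid)
invalid.write.mode("append").parquet("/data/quarantine/transactions/")

# Publish the clean slice to a Hive table for downstream analytics.
(valid.withColumn("ingest_date", F.current_date())
      .write.mode("overwrite")
      .saveAsTable("analytics.transactions"))
```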

Posted 4 weeks ago

Apply