5.0 - 7.0 years
6 - 10 Lacs
Pune
Work from Office
Job Title : Sr. Data Engineer - Ontology & Knowledge Graph Specialist
Department : Platform Engineering

Summary :
We are seeking a highly skilled Data Engineer with expertise in ontology development and knowledge graph implementation. The role is pivotal in shaping our data infrastructure and ensuring the accurate representation and integration of complex data sets. You will apply industry best practices, including the Basic Formal Ontology (BFO) and the Common Core Ontologies (CCO), to design, develop, and maintain ontologies, semantic and syntactic data models, and knowledge graphs on the Databricks Data Intelligence Platform, driving data-driven decision-making and innovation within the company.

Responsibilities :

Ontology Development :
- Design and implement ontologies based on BFO and CCO principles, ensuring alignment with business requirements and industry standards.
- Collaborate with domain experts to capture and formalize domain knowledge into ontological structures.
- Develop and maintain comprehensive ontologies modeling business entities, relationships, and processes.

Data Modeling :
- Design and implement semantic and syntactic data models that adhere to ontological principles.
- Create data models that are scalable, flexible, and adaptable to changing business needs.
- Integrate data models with existing data infrastructure and applications.

Knowledge Graph Implementation :
- Design and build knowledge graphs based on the ontologies and data models.
- Develop algorithms and tools for knowledge graph population, enrichment, and maintenance.
- Use knowledge graphs to enable advanced analytics, search, and recommendation systems.

Data Quality and Governance :
- Ensure the quality, accuracy, and consistency of ontologies, data models, and knowledge graphs.
- Define and implement data governance processes and standards for ontology development and maintenance.
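The knowledge graph work described above reduces to populating and querying a store of subject-predicate-object triples. As a minimal, dependency-free sketch of that idea (all entity and property names such as ex:Supplier and ex:locatedIn are illustrative placeholders, not taken from this posting):

```python
# Minimal knowledge-graph sketch: the graph is a set of
# (subject, predicate, object) triples, queried by pattern matching.
# All names (ex:Supplier, ex:locatedIn, ...) are hypothetical examples.

def match(triples, s=None, p=None, o=None):
    """Return triples matching a pattern; None acts as a wildcard."""
    return [
        (ts, tp, to) for (ts, tp, to) in triples
        if (s is None or ts == s)
        and (p is None or tp == p)
        and (o is None or to == o)
    ]

triples = {
    ("ex:Acme", "rdf:type", "ex:Supplier"),
    ("ex:Acme", "ex:locatedIn", "ex:Pune"),
    ("ex:Globex", "rdf:type", "ex:Supplier"),
    ("ex:Globex", "ex:locatedIn", "ex:Mumbai"),
}

# Which entities are suppliers?
suppliers = sorted(s for (s, _, _) in match(triples, p="rdf:type", o="ex:Supplier"))

# Where is ex:Acme located?
locations = [o for (_, _, o) in match(triples, s="ex:Acme", p="ex:locatedIn")]
```

In production this pattern is what a triple store (e.g., GraphDB, Stardog) and SPARQL provide at scale; the sketch only makes the underlying data model concrete.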
Collaboration and Communication :
- Work closely with data scientists, software engineers, and business stakeholders to understand their data requirements and provide tailored solutions.
- Communicate complex technical concepts clearly and effectively to diverse audiences.

Qualifications :

Education :
- Bachelor's or Master's degree in Computer Science, Data Science, or a related field.

Experience :
- 5+ years of experience in data engineering or a related role.
- Proven experience in ontology development using BFO and CCO or similar ontological frameworks.
- Strong knowledge of semantic web technologies, including RDF, OWL, SPARQL, and SHACL.
- Proficiency in Python, SQL, and other programming languages used for data engineering.
- Experience with graph databases (e.g., TigerGraph, JanusGraph) and triple stores (e.g., GraphDB, Stardog) is a plus.

Desired Skills :
- Familiarity with machine learning and natural language processing techniques.
- Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
- Experience with Databricks technologies, including Spark, Delta Lake, Iceberg, Unity Catalog, UniForm, and Photon.
- Strong problem-solving and analytical skills.
- Excellent communication and interpersonal skills.
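The SHACL knowledge listed above maps onto constraint validation: checking that instances of a class carry the properties the ontology requires. A hedged, dependency-free sketch of that idea (shape and property names are invented for illustration; real work would use SHACL shapes against an RDF store):

```python
# SHACL-style constraint sketch: every instance of a target class must
# carry certain required properties. Names are illustrative placeholders.

def validate(triples, target_class, required_props):
    """Return (subject, missing_property) pairs for instances of
    target_class that lack a required property."""
    instances = {s for (s, p, o) in triples
                 if p == "rdf:type" and o == target_class}
    violations = []
    for subject in sorted(instances):
        present = {p for (s, p, _) in triples if s == subject}
        for prop in required_props:
            if prop not in present:
                violations.append((subject, prop))
    return violations

triples = {
    ("ex:Acme", "rdf:type", "ex:Supplier"),
    ("ex:Acme", "ex:locatedIn", "ex:Pune"),
    ("ex:Globex", "rdf:type", "ex:Supplier"),
    # ex:Globex has no ex:locatedIn triple, so it violates the shape.
}

report = validate(triples, "ex:Supplier", ["ex:locatedIn"])
```

A validation report like this is the core of the data-quality-and-governance loop: run the shapes, surface violations, and feed fixes back into the graph.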
Posted 7 hours ago
5.0 - 7.0 years
6 - 10 Lacs
Surat
Work from Office
Posted 7 hours ago
5.0 - 7.0 years
6 - 10 Lacs
Chennai
Work from Office
Posted 7 hours ago
5.0 - 7.0 years
10 - 14 Lacs
Kanpur
Work from Office
Posted 12 hours ago
5.0 - 7.0 years
10 - 14 Lacs
Lucknow
Work from Office
Posted 13 hours ago
5.0 - 7.0 years
10 - 14 Lacs
Kolkata
Work from Office
Posted 1 day ago
5.0 - 7.0 years
10 - 14 Lacs
Ahmedabad
Work from Office
Posted 1 day ago
5.0 - 7.0 years
6 - 10 Lacs
Bengaluru
Work from Office
Posted 1 day ago
5.0 - 7.0 years
10 - 14 Lacs
Hyderabad
Work from Office
Posted 2 days ago
5.0 - 7.0 years
10 - 14 Lacs
Gurugram
Work from Office
Posted 2 days ago
5.0 - 7.0 years
10 - 14 Lacs
Mumbai
Work from Office
Posted 2 days ago
3.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Join Us! At Google Operations Center we help Google users and customers solve problems and achieve their goals, all while enjoying a culture focused on improving continuously and being better together. We work hard, we play hard, and we want you to join us! As part of the GUP Analytics team, we ensure our users have access to aligned, accurate, and useful business metrics and dimensions that support their operations. We deliver on our mission and collaborate with our partners to achieve the following: single-source-of-truth data that is robust, high-quality, and trusted; informative visualizations supporting reporting and understanding; automated insights and analyses that enhance decision-making at all levels; and reporting infrastructure that is stable and scalable. In understanding the user need, we partner directly with Analytics Managers to create and maintain dashboards, resolve bugs, and work on feature requests. Where there is a gap between business requirements and available products, we prioritize the need, identify solutions, and deliver the resulting product initiatives. 
Position Responsibilities As the BI SME, you'll be responsible for overseeing and reviewing the development of the data infrastructure that our BI Analysts are creating, ensuring that our GUP teams have the most accurate data they need to make crucial business decisions. You'll act as a subject matter expert and quality auditor for a team of BI Analysts and partner with the GUP Analytics Managers to create data and visualization solutions. In this role, you'll have the opportunity to design and lead the development of innovative data solutions while troubleshooting challenging problems using Google's large-scale production data infrastructure. You'll act as a technical liaison between business and analyst teams, supervise the design of new data products, and optimize how they will perform. Minimum Requirements Overall 4 years of experience, with a minimum of 3 years in designing, testing, optimizing, and troubleshooting ETL solutions. 3 years of experience writing software in one or more programming languages (e.g., Java, C++, Python). 
3 years of experience working with advanced SQL, OLAP, and reporting tools to analyze and organize large data sets. Ability to handle multiple complex projects in a fast-paced environment, with strong communication and leadership skills. Write and review technical documents, including design, development, and revision documents. Preferred Qualifications Having as many of these specific qualifications is a plus, but transferable skills/experiences may be equally valuable: Experience designing data warehouses, especially for business performance management Experience in large-scale distributed data processing, including familiarity with NoSQL databases Proficiency in all aspects of the software development cycle, with experience identifying and capitalizing on technical improvements and optimization opportunities Experience with Unix/GNU Linux systems Benefits We support you with competitive wages and comprehensive health care including medical, dental, and vision coverage. We support your family with gender-neutral baby bonding leave, 14-week birth-parent maternity leave, and generous life, accident, and disability insurance minimums. Employees who work onsite can enjoy free meals and snacks, and fun onsite experiences. At the Google Operations Center, we don't just accept differences - we celebrate them, we support them, and we thrive on them for the benefit of our employees, our products, and our community. We are committed to equal employment opportunities regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity or Veteran status. We also consider qualified applicants regardless of criminal histories, consistent with legal requirements. If you have a disability or special need that requires accommodation, please let us know. 
Information collected and processed as part of your GOC jobs profile, and any job applications you choose to submit, is subject to GOC's Applicant and Candidate Privacy Policy. Thanks for your interest in this opportunity! Our recruitment team will contact you if your profile is a good fit for the role. If you don't hear from us within 2-3 weeks, please consider your application unsuccessful at this time. We value your patience throughout this time. For any questions, feel free to reach out to us at [HIDDEN TEXT].
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
pune, maharashtra
On-site
The opportunity: We are looking for a technical powerhouse - a hands-on Data Science and AI leader who can build, scale, and deploy machine learning systems that drive significant impact. As the key person in charge, you will be responsible for developing and executing the entire AI/ML strategy, from research to practical application, while playing a pivotal role in shaping a top-tier data organization. What you'll own: AI Leadership with a Global Lens: You will be tasked with establishing and leading a high-performing, multicultural team of data scientists, ML engineers, and analysts. Your role will involve setting a visionary path, fostering collaboration, and integrating diverse perspectives to stimulate innovation. Production-Grade AI at Scale: Your responsibilities will include deploying models that not only function effectively but also revolutionize processes such as fraud detection, credit scoring, and personalized finance on a large scale in various markets. Data Infrastructure for the Future: You will architect scalable, real-time systems that drive AI applications across different regions, languages, and regulatory landscapes. Fintech at Its Core: Your contributions will extend beyond technical aspects to directly impact financial inclusion, risk assessment, and overall growth in one of the world's most dynamic regions. Ethical AI for Real-World Impact: Ensuring fairness, transparency, and compliance in every model will be crucial, as trust forms the bedrock of the financial sector. Who you are: A Technical Leader Who Thrives in Diversity: You possess experience in building and leading teams with a diverse cultural mix, combining technical proficiency with emotional intelligence. A Hands-On AI/ML Expert: Your expertise lies in training, deploying, and scaling models, while also empowering your team members to reach their full potential. 
A Fintech or High-Stakes AI Veteran: Previous involvement in fraud detection, risk assessment, or financial analytics will be advantageous. A Communicator & Collaborator: You excel at bridging communication gaps between technical teams, executives, and regulators, effectively conveying complex AI concepts across various languages and cultures. Why This Role? Lead a Truly Regional Team: Collaborate with top talent from across Asia to develop AI solutions that cater to the needs of millions of diverse users. Zero Bureaucracy, Maximum Impact: Join a fast-paced environment where results matter; if you can demonstrate effectiveness, you will see your projects come to fruition swiftly. Your Legacy in Fintech: This opportunity is not just a job but a chance to shape your career significantly by leaving a lasting mark on the fintech industry.
Posted 1 week ago
7.0 - 11.0 years
0 Lacs
chennai, tamil nadu
On-site
If you are looking for a career at a dynamic company with a people-first mindset and a deep culture of growth and autonomy, ACV is the right place for you! With competitive compensation packages and learning and development opportunities, ACV has what you need to advance to the next level in your career. We will continue to raise the bar every day by investing in our people and technology to help our customers succeed. We hire people who share our passion, bring innovative ideas to the table, and enjoy a collaborative atmosphere. ACV is a technology company that has revolutionized how dealers buy and sell cars online. We are transforming the automotive industry. ACV Auctions Inc. (ACV) has applied innovation through user-designed, data-driven applications and solutions. We are building the most trusted and efficient digital marketplace with data solutions for sourcing, selling, and managing used vehicles with transparency and comprehensive insights that were once unimaginable. We are disruptors of the industry and we want you to join us on our journey. Our network of brands includes ACV Auctions, ACV Transportation, ClearCar, MAX Digital, and ACV Capital within its Marketplace Products, as well as True360 and Data Services. ACV Auctions in Chennai, India is looking for talented individuals to join our team. As we expand our platform, we're offering a wide range of exciting opportunities across various roles in corporate, operations, and product and technology. Our global product and technology organization spans product management, engineering, data science, machine learning, DevOps, and program leadership. What unites us is a deep sense of customer centricity, calm persistence in solving hard problems, and a shared passion for innovation. If you're looking to grow, lead, and contribute to something larger than yourself, we'd love to have you on this journey. Let's build something extraordinary together. Join us in shaping the future of automotive! 
At ACV we focus on the Health, Physical, Financial, Social, and Emotional Wellness of our Teammates, and to support this we offer industry-leading benefits and wellness programs. We are seeking a highly motivated and experienced Data Infrastructure Manager to lead and oversee the design, development, and maintenance of our company's data infrastructure. The Data Infrastructure Manager will be responsible for ensuring the scalability, reliability, security, and performance of our data platforms, supporting the needs of our growing data science, analytics, and engineering teams. This role requires a strong technical background, proven leadership abilities, and an understanding of modern data architectures. **What You Will Do** **Strategy & Architecture** Define and execute the data infrastructure strategy, aligning it with the overall business goals and technology roadmap. Evaluate and recommend new technologies and architectures to improve performance, scalability, and efficiency. Translate product requirements into engineering work for your team, reaching out to others for assistance and providing work estimates/timelines. Work with product partners to ensure that work is scoped, prioritized, and assigned appropriately to integrate solutions into ACV to meet business objectives and schedules. Guide and participate in architecture discussions and system designs. **Team Leadership** Manage and mentor a global team of Data Infrastructure Engineers. Provide technical guidance, performance feedback, and career development opportunities. Grow your team's knowledge of their domain and of the technical expertise required to support the ACV Auctions business. Evaluate, hire, and onboard engineers based on organizational need, technical skill set, and cultural fit. Use your strong foundation as a technical leader to reliably deliver on complex projects while keeping the bar high. 
**Platform Management** Oversee the day-to-day operations and maintenance of our data platforms, including Kafka, data warehouses, data lakes, ETL pipelines, and related infrastructure. **Scalability & Performance** Proactively identify and address performance bottlenecks and scalability challenges. Implement solutions to optimize data processing and storage. **Security & Compliance** Ensure data security and compliance with relevant regulations (e.g., GDPR, CCPA). Implement and maintain security best practices across the data infrastructure. **Automation & Monitoring** Drive automation of infrastructure provisioning, deployment, and monitoring. Establish robust monitoring and alerting systems to ensure proactive issue resolution. **Collaboration** Partner closely with Data Science, Analytics, Engineering, Data Engineering, and Security teams to understand their needs and provide solutions that enable their success. **Budget Management** Manage the data infrastructure budget effectively, ensuring efficient allocation of resources. **Vendor Management** Evaluate and manage relationships with data infrastructure vendors. Perform additional duties as assigned. **What You Will Need** Bachelor's degree in Computer Science, Information Technology, or a related field (or equivalent work experience) Ability to read, write, speak, and understand English 7+ years of experience in data infrastructure engineering, data architecture, or related roles. 3+ years of experience in an engineering leadership or management role Proven track record of building and managing scalable and reliable data platforms. 
Experience architecting, developing, and delivering software products Deep experience with product lifecycle management and improving software products through metrics and experimentation Deep understanding and use of Agile practices Prior experience working with large, multi-faceted data sets Prior experience working in a cloud-native environment such as AWS or Google Cloud Preferred hands-on expertise with Python, RDBMS systems, REST APIs, and Jira/Confluence Experience with cloud-based data platforms (e.g., AWS, GCP, Azure). Excellent communication, interpersonal, and leadership skills. Ability to think strategically and translate business requirements into technical solutions. **Nice to Have** Experience with Infrastructure-as-Code tools (e.g., Terraform, CloudFormation) Experience with containerization technologies (e.g., Docker, Kubernetes) **Our Values** Trust & Transparency | People First | Positive Experiences | Calm Persistence | Never Settling **Data Processing Consent** When you apply to a job on this site, the personal data contained in your application will be collected by ACV Auctions Inc. and/or one of its subsidiaries ("ACV Auctions"). By clicking "apply", you hereby provide your consent to ACV Auctions and/or its authorized agents to collect and process your personal data for the purpose of your recruitment at ACV Auctions and processing your job application. ACV Auctions may use services provided by a third-party service provider to help manage its recruitment and hiring process. For more information about how your personal data will be processed by ACV Auctions and any rights you may have, please review ACV Auctions' candidate privacy notice here. If you have any questions about our privacy practices, please contact datasubjectrights@acvauctions.com.
Posted 2 weeks ago
4.0 - 8.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Line of Service Advisory Industry/Sector Not Applicable Specialism Data, Analytics & AI Management Level Senior Associate Job Description & Summary At PwC, our people in data and analytics engineering focus on leveraging advanced technologies and techniques to design and develop robust data solutions for clients. They play a crucial role in transforming raw data into actionable insights, enabling informed decision-making and driving business growth. In data engineering at PwC, you will focus on designing and building data infrastructure and systems to enable efficient data processing and analysis. You will be responsible for developing and implementing data pipelines, data integration, and data transformation solutions. Why PwC At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes, and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us. At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm's growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations. 
Responsibilities Technical Experience Design and implement responsive web and mobile user interfaces using HTML, CSS, JavaScript, and other front-end technologies. Collaborate with product managers and designers to translate concepts into wireframes, prototypes, and final designs. Conduct user research and usability testing to gather feedback and refine designs. Optimize applications for maximum speed and scalability. Ensure designs are consistent with company branding and style guidelines. Stay up-to-date with industry trends and best practices in UI/UX design and development. Work with developers to integrate UI components with back-end functionality. Develop and maintain design documentation and style guides. Mandatory Skill Sets Adobe XD, Sketch, Figma, or similar Preferred Skill Sets Adobe XD, Sketch, Figma, or similar Years Of Experience Required 4-8 Years Education Qualification BE, B.Tech, MCA, M.Tech Education (if blank, degree and/or field of study not specified) Degrees/Field of Study required: Master of Engineering, Bachelor of Engineering Degrees/Field Of Study Preferred Certifications (if blank, certifications not specified) Required Skills UI/UX Design Optional Skills Accepting Feedback, Active Listening, Agile Scalability, Amazon Web Services (AWS), Analytical Thinking, Apache Airflow, Apache Hadoop, Azure Data Factory, Communication, Creativity, Data Anonymization, Data Architecture, Database Administration, Database Management System (DBMS), Database Optimization, Database Security Best Practices, Databricks Unified Data Analytics Platform, Data Engineering, Data Engineering Platforms, Data Infrastructure, Data Integration, Data Lake, Data Modeling, Data Pipeline + 27 more Desired Languages (If blank, desired languages not specified) Travel Requirements Not Specified Available for Work Visa Sponsorship No Government Clearance Required No Job Posting End Date
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
hyderabad, telangana
On-site
Data engineers play a crucial role in constructing dependable and scalable data infrastructure to empower organizations in extracting valuable insights, facilitating data-informed decision-making, and maximizing the potential of their data resources. As a data engineer, you will lead and supervise a team of data engineers, establish and implement data engineering strategies, and oversee the efficient deployment of data solutions. Your responsibilities will include offering technical guidance, fostering innovation, and working closely with stakeholders to provide top-notch, scalable, and dependable data infrastructure and solutions.
Posted 3 weeks ago
6.0 - 10.0 years
0 Lacs
indore, madhya pradesh
On-site
The Modern Data Company is seeking a seasoned UX Product Manager who is passionate about creating intuitive and elegant experiences for technically sophisticated users. As a part of the Platform team at DataOS, you will have a pivotal role in redefining the user experience of the GUI applications that shape how data developers interact with the platform. Data platforms are inherently complex, and our mission is to simplify this complexity without compromising power or flexibility. We are looking for a candidate who can reduce cognitive load, drive delightful user interactions, and deliver exceptional UX across our suite of enterprise tools. Responsibilities include owning the end-to-end UX strategy and product roadmap for platform-facing GUI applications, collaborating with engineering, design, and other product teams to create seamless and scalable user experiences, deeply understanding data developers' workflows and goals to make impactful product decisions, conducting usability testing, gathering feedback, and iterating rapidly, and ensuring that UI/UX patterns align with enterprise-grade expectations around performance, security, and accessibility. We are looking for candidates with at least 6 years of product management experience in B2B software, preferably focused on platforms or developer tools, a proven track record of launching and evolving GUI applications for enterprise-grade products, proficiency in Figma and comfort leading design explorations hands-on, strong empathy for technical users and the ability to reduce friction in complex workflows, excellent communication and storytelling skills to rally teams around a vision and drive clarity amidst ambiguity, and, as a bonus, experience working with data infrastructure, developer platforms, or system-level products. At Modern, our value system is based on HEAT: Humility, Empathy, Accountability, and Transparency. 
We appreciate individuals who are curious, love problem-solving, and possess a big-picture perspective. We believe in providing attractive compensation, benefits, and ESOPs to our employees, ensuring that they create significant value for themselves while working towards a common goal. If you are looking to do your best work and embrace a culture that competes for talent, Modern is the place for you.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
Rockwell Automation is a global technology leader dedicated to assisting the world's manufacturers in enhancing productivity, sustainability, and agility. With over 28,000 employees striving to make a positive impact daily, we take pride in our exceptional team. Our customers, remarkable companies contributing to essential sectors such as healthcare, agriculture, and environmental conservation, inspire our team of energetic problem solvers to drive meaningful change worldwide. If you are a maker, forward thinker, or problem solver seeking a platform to showcase your skills, we invite you to consider joining us! As an AI Architect at Rockwell Automation based in Pune, Noida, or Bangalore, India, you will play a crucial role in designing, developing, and implementing enterprise-grade artificial intelligence solutions. Reporting to the Director of Enterprise Architecture, you will operate from our Pune office in a hybrid capacity. Collaborating with diverse teams including data scientists, engineers, and business leaders, you will ensure that AI systems are scalable, secure, and aligned with organizational objectives. **Your Responsibilities:** - **Architect and implement AI solutions**: Design and supervise end-to-end AI architectures that seamlessly integrate with existing IT and data infrastructure, focusing on scalability, performance, and maintainability. - **Lead cross-functional delivery teams**: Provide guidance to data scientists, engineers, and business stakeholders across the AI solution lifecycle, from ideation and prototyping to production deployment and monitoring. - **Evaluate and recommend technologies**: Assess AI/ML platforms, frameworks, and tools to ensure alignment with business requirements and technical feasibility. - **Establish best practices**: Define standards for model development, testing, deployment, and lifecycle management, ensuring compliance with ethical AI principles and data privacy regulations. 
- **Mentor and evangelize**: Offer technical leadership and mentorship to junior architects and data professionals, while advocating for AI adoption and architectural vision within the organization. **The Essentials You Will Have:** - A Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field. - 5+ years of experience in designing and implementing AI architectures in production environments. - Proficiency in AI/ML frameworks and tools such as TensorFlow, PyTorch, Keras, and cloud-based AI services. - Strong programming skills in Python, R, Java, or similar languages. - Deep understanding of data structures, algorithms, and software engineering best practices. - Experience leading complex AI projects with cross-functional teams and delivering solutions that drive business outcomes. - Familiarity with ethical AI practices and data protection regulations. **The Preferred You Might Also Have:** - Experience deploying AI solutions on cloud platforms in hybrid or multi-cloud environments. - Knowledge of MLOps tools and practices for continuous integration, deployment, and monitoring of AI models. - Strong problem-solving abilities and the capacity to translate business requirements into scalable technical solutions. - Experience in mentoring and developing talent within AI or data science teams. **What We Offer:** Our benefits package includes comprehensive mindfulness programs, volunteer paid time off, employee assistance programs, personalized wellbeing programs, on-demand professional development courses, and other local benefits. At Rockwell Automation, we are committed to fostering a diverse, inclusive, and authentic workplace. If you are enthusiastic about this role but do not meet every qualification in the job description, we encourage you to apply as you might be the right fit for this position or other roles. 
Rockwell Automation's hybrid policy requires employees to work at a Rockwell location at least on Mondays, Tuesdays, and Thursdays unless they have a business obligation elsewhere.
Posted 3 weeks ago
2.0 - 6.0 years
0 Lacs
pune, maharashtra
On-site
Aera Technology is revolutionizing enterprise decision-making. Our AI-driven platform, Aera Decision Cloud, integrates seamlessly with existing systems to digitize, augment, and automate critical business decisions in real time. Aera helps global enterprises transform decision-making, delivering millions of recommendations that have resulted in significant revenue gains and cost savings for some of the world's best-known brands. We are looking for a Product Manager - Data to lead the evolution of our core Decision Intelligence capabilities. You will redefine how organizations harness data and AI to drive smarter, faster, and more sustainable decision-making. This is an exciting opportunity to be at the forefront of enterprise AI innovation, collaborating with a dynamic team in a fast-paced, startup-like environment. This role will be based in our Pune office. Responsibilities As a Product Manager, you will own the strategy, development, and execution of key platform components required for building a Decision Data Model which enables enterprises to build powerful AI-driven workflows. Lead product strategy & execution: Define and drive priorities, roadmap, and development efforts to maximize business value. Understand market needs: Research target users, use cases, and feedback to refine features and address customer pain points. Analyze competitive landscape: Stay ahead of industry trends and competitors to inform product differentiation. Define product requirements: Work closely with designers and engineers to develop user-centric, scalable solutions. Collaborate cross-functionally: Partner with Customer Success, Engineering, and Executive teams to align on vision and priorities. Drive user adoption: Act as the go-to expert, ensuring internal teams are equipped with the knowledge and resources to enable customers. About You You are passionate - you are your product's biggest advocate, and its biggest critic. 
You will ceaselessly pursue excellence and do whatever it takes to deliver a product that users love and that delivers value. You are pragmatic - you know when to focus on nuanced details, and when to bring a more strategic perspective to the table. You love to learn - you continually gather new information, ideas, and feedback, and you seek to understand the root of an issue in order to identify an optimal solution. You are a master at communication and collaboration - not only can you communicate a compelling vision or a complex concept, but you also know how to motivate a team to collaborate around a problem and work toward a common goal. Experience At least 2 years of B2B SaaS PM experience (mandatory). Experience in data infrastructure, AI/ML platforms, or enterprise data products. Knowledge of data modeling, SQL, and ETL/ELT processes. Knowledge of data quality, metadata management, data lineage, and observability is a plus. Bachelor's degree in Engineering/Computer Science or a related technical discipline. If you share our passion for building a sustainable, intelligent, and efficient world, you're in the right place. Established in 2017 and headquartered in Mountain View, California, we're a series D start-up, with teams in Mountain View, San Francisco (California), Bucharest and Cluj-Napoca (Romania), Paris (France), Munich (Germany), London (UK), Pune (India), and Sydney (Australia). So join us, and let's build this! Benefits Summary At Aera Technology, we strive to support our Aeranauts and their loved ones through different stages of life with a variety of attractive benefits and great perks. In addition to offering a competitive salary and company stock options, we have other great benefits available. You'll find comprehensive medical, Group Medical Insurance, Term Insurance, Accidental Insurance, paid time off, Maternity leave, and much more. 
We offer unlimited access to online professional courses for both professional and personal development, coupled with people manager development programs. We believe in a flexible working environment to allow our Aeranauts to perform at their best, ensuring a healthy work-life balance. When you're working from the office, you'll also have access to a fully-stocked kitchen with a selection of snacks and beverages.
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
As a Business Intelligence (BI) Developer, your primary responsibility will be to develop a BI framework and an implementation roadmap for deploying BI solutions across functions to meet organizational requirements. You will collaborate with functional and business stakeholders to gather and understand their requirements, and design scalable reports, visualizations, and interactive dashboards that provide actionable insights and support data-driven decision-making. You will also handle ad-hoc requests for data analysis and visualization, helping stakeholders identify patterns and generate meaningful insights.
In addition, you will develop data standards and data archiving procedures, perform data analysis and profiling using SQL to identify data quality issues, and recommend solutions to stakeholders. You will identify opportunities for automation, quality improvement, streamlining, and standardization in data gathering, reporting, and insights generation, and assess system performance to recommend hardware, software, and data management/storage improvements.
Furthermore, you will collaborate with third-party vendors to ensure a smooth handover-takeover of existing BI systems or Manufacturing Execution Systems (MES), act as the super user for MES, and provide first-level support for internal user queries. You will create and optimize data models, data connections, and transformations to ensure accurate and efficient data analysis, visualization, and reporting, and develop Excel-based tools and utilities to support data gathering and problem-solving, using VBA to automate and error-proof them where possible.
Working with cross-functional teams, you will define Key Performance Indicators (KPIs), set performance baselines, and ensure the availability of real-time insights through live dashboards and regular reports. You will design, develop, and modify data infrastructure to accelerate data analysis and reporting, and lead Artificial Intelligence (AI) and Machine Learning (ML) implementation projects to deliver AI-powered insights. You will also develop and maintain standards of operation for handling and archiving data, oversee the integration of new technologies and initiatives into data standards and structures, and participate in evaluating the design, selection, and implementation of database changes, aligning them with business requirements and design documents and ensuring data/information security across global teams and third parties.
Your Profile:
- Qualification: STEM graduate with a degree in Computer Science or Engineering; certification in BI/Analytics is desirable.
- Proficiency in Microsoft Excel, VBA, automation, SQL, Power BI, Tableau, and SAP Analytics Cloud (preferred but not mandatory), plus data modeling, statistical analysis, data analysis, and data visualization, with a good understanding of advanced analytics (AI/ML desirable).
- 5-8 years of experience in data analytics with demonstrated expertise in Power BI.
- Experience in SAP Analytics Cloud is desirable.
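The SQL data profiling mentioned in this role (surfacing data quality issues before they reach reports) can be illustrated with a minimal sketch; the table, column names, and sample rows below are hypothetical, and sqlite3 stands in for whatever warehouse the team actually uses:

```python
import sqlite3

# Hypothetical "orders" table, used purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "acme", 120.0), (2, None, 75.5), (2, "acme", 75.5), (3, "globex", None)],
)

# Profile the table: row count, NULLs per column, and duplicate ids --
# the kind of checks a profiling pass would report to stakeholders.
profile = conn.execute(
    """
    SELECT COUNT(*)                                          AS total_rows,
           SUM(CASE WHEN customer IS NULL THEN 1 ELSE 0 END) AS null_customer,
           SUM(CASE WHEN amount   IS NULL THEN 1 ELSE 0 END) AS null_amount,
           COUNT(*) - COUNT(DISTINCT id)                     AS duplicate_ids
    FROM orders
    """
).fetchone()
print(profile)  # (4, 1, 1, 1)
```

The same CASE/COUNT pattern ports directly to most SQL dialects, which is what makes it a reusable profiling script rather than a one-off query.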
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
karnataka
On-site
The Network Specialist (Tier 3 support) plays a critical role in supporting the technical activities and projects associated with the Telstra IP product range. Your primary responsibility is to ensure that the Telstra network delivers a superior network experience for our consumer and business customers. Drawing on deep telecommunications network expertise and experience, you will provide specialist analysis, design, development, and deployment in network-specific technology domains, including developing platform-specific technology roadmaps and making technology investment recommendations. The role contributes significantly to business strategy by effectively and efficiently supporting a diverse set of technical projects with specific outcomes via programs of work in Technology delivery, and by providing expert input and recommendations on future-looking technical projects.
Key areas of focus for this role include:
- Core IP routers and capacities supporting the complete suite of international IP network services
- Internet edge routers providing IPT and GID services to key OTT and eyeball customers
- Private IP service provider edge routers serving Telstra internal and external customers
- The overlay GMNS and SD-WAN product range providing services to enterprise customers
- Support and operational systems
Your responsibilities will involve working closely with operations teams, supporting the deployment of configurations and new designs/solutions to network elements, developing and testing solutions in physical and virtual lab environments, supporting deployment activities for new technologies and hardware, and collaborating with product teams to support customer product offerings such as SD-WAN and GMNS. You will also provide engineering sales support, respond to RFQs and tenders for services of a complex or novel nature, attend customer meetings as a technical expert, identify, analyze, and specify new and emerging technologies, technology roadmaps, and product/service opportunities, and maintain a high level of capability in network engineering. Additionally, you will support the deployment and management of internal networks, communicate complex technical concepts to a less technically skilled audience, handle complex network issues, and deliver customer-focused service support under high pressure.
Qualifications/Experience:
- University graduate in Telecommunications or a related field
- Industry certification is a requirement, with a preference for candidates holding Cisco CCIE and Juniper JNCIE certifications
- In-depth knowledge of Juniper and Cisco IP networking and equipment
- Excellent analytical and writing skills, with attention to detail
- Ability to remain calm under pressure and experience working in an operational capacity
Essential Technical Skills: BGP, BGP-LS, data infrastructure, DevOps practices, DHCP, DNS, Ethernet, ETSI NFV-MANO (frameworks), EVPN, Internet protocol suite (TCP/IP), IP networking, IP routing, IPv4, IPv6, MPLS, NETCONF/RESTCONF, NFV, OpenFlow, OpenStack, orchestration, OSI reference model, network telemetry, CLA technologies, RPKI, management protocols (SNMP, NetFlow), SD-WAN
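IP routing, one of the skills this role lists, forwards each packet to the next hop of the most specific matching prefix. A minimal longest-prefix-match sketch using Python's standard ipaddress module, with an entirely made-up routing table:

```python
import ipaddress

# Hypothetical routing table: prefix -> next hop (illustrative only).
routes = {
    ipaddress.ip_network("10.0.0.0/8"): "core-1",
    ipaddress.ip_network("10.1.0.0/16"): "edge-7",
    ipaddress.ip_network("0.0.0.0/0"): "default-gw",
}

def next_hop(dst: str) -> str:
    """Return the next hop for the most specific prefix containing dst."""
    addr = ipaddress.ip_address(dst)
    matches = [net for net in routes if addr in net]
    best = max(matches, key=lambda net: net.prefixlen)  # longest prefix wins
    return routes[best]

print(next_hop("10.1.2.3"))   # edge-7 (the /16 beats the /8 and the /0)
print(next_hop("192.0.2.1"))  # default-gw
```

Production routers implement this lookup in hardware over hundreds of thousands of prefixes; the tie-breaking rule shown here is the same.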
Posted 3 weeks ago
5.0 - 9.0 years
0 Lacs
chennai, tamil nadu
On-site
The Team Lead - Business Analyst will join the Merchandising Analytics Team in Dania Beach, FL. Reporting directly to the Sr. Manager of Enterprise Data and BI, you will play a crucial role in delivering scalable data infrastructure, supporting data governance initiatives, and enhancing the self-service merchandising reporting environment. Collaboration with analysts, data engineers, and the corporate IT team will be key to anticipating data infrastructure requirements and proactively developing solutions that drive data-centric insights. You will be expected to handle and analyze data sets comprising millions of records using SQL, Excel, and Tableau, create metrics and dashboards spanning customer, marketing, ecommerce, and financial metrics, and use data visualization techniques to present intricate topics clearly and concisely.
Key Responsibilities:
- Collaborate with analytics, data engineering, and BI teams to develop and quality-assure new data tables and sources across various platforms
- Work closely with business leaders and end-users to enhance the Tableau experience and ensure actionable and insightful KPIs
- Assist the Sr. Manager of Enterprise BI in establishing and upholding stringent data governance standards within Tableau and underlying data sources
- Develop and maintain reusable SQL scripts to serve diverse business cases and reporting demands
- Create informative, user-friendly Tableau dashboards using multiple data sources, parameters, and measures
- Conduct intricate analyses of customer behavior, marketing mix, and funnel performance using extensive data sets from multiple systems
- Collaborate with other analytics teams to design and implement scalable, self-service reporting for cross-functional stakeholders
- Partner with stakeholders to define and sequence product release roadmaps using tools such as JIRA, Confluence, and other workflow management and documentation platforms
Posted 4 weeks ago
3.0 - 8.0 years
5 - 9 Lacs
Bengaluru
Work from Office
Job Description: We are looking for a Big Data Engineer who will work on collecting, storing, processing, and analyzing huge data sets. The primary focus will be on choosing optimal solutions for these purposes, then implementing, maintaining, and monitoring them. You will also be responsible for integrating them with the architecture used across the company.
Responsibilities:
- Selecting and integrating any Big Data tools and frameworks required to provide the requested capabilities
- Implementing ETL processes
- Monitoring performance and advising on any necessary infrastructure changes
- Defining data retention policies
Skills and Qualifications:
- Proficient understanding of distributed computing principles
- Management of a Hadoop cluster, with all included services, and the ability to resolve any ongoing issues with operating it
- Proficiency with Hadoop v2, MapReduce, and HDFS
- Experience building stream-processing systems using solutions such as Storm or Spark Streaming
- Good knowledge of Big Data querying tools such as Pig, Hive, and Impala
- Experience with Spark
- Experience integrating data from multiple data sources
- Experience with NoSQL databases such as HBase, Cassandra, and MongoDB
- Knowledge of various ETL techniques and frameworks, such as Flume
- Experience with messaging systems such as Kafka or RabbitMQ
- Experience with Big Data ML toolkits such as Mahout, Spark MLlib, or H2O
- Good understanding of Lambda Architecture, along with its advantages and drawbacks
- Experience with Cloudera, MapR, or Hortonworks distributions
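The MapReduce model named in the requirements can be sketched in plain Python, with no cluster needed, to show the three phases a framework like Hadoop distributes across nodes; the documents and word-count task are illustrative:

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc: str):
    # Map: emit a (word, 1) pair for every token in one input split.
    return [(word.lower(), 1) for word in doc.split()]

def shuffle(pairs):
    # Shuffle: group all emitted values by key, as the framework
    # would when routing pairs between mapper and reducer nodes.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values into a single result.
    return {word: sum(counts) for word, counts in groups.items()}

docs = ["big data big ideas", "data pipelines move data"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(d) for d in docs)))
print(counts["data"])  # 3
```

Because map and reduce are pure per-key functions, the same logic scales out: each phase can run in parallel on disjoint slices of the data.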
Posted 1 month ago
4.0 - 9.0 years
4 - 10 Lacs
Bengaluru, Karnataka, India
On-site
Role Responsibilities:
- Lead and manage complex, cross-functional technical programs, ensuring timely execution and stakeholder alignment
- Collaborate with Engineering, Product, and AI Data Infra teams to optimize workflows and align operational priorities
- Own resource allocation, process improvements, and tooling launches, managing escalations and critical incidents
- Drive program execution, measure KPIs, mitigate risks, and contribute to budgeting and forecasting for AI Data operations
Job Requirements:
- Bachelor's degree in Computer Science, Engineering, or equivalent practical experience
- 5 years of experience in program management of technical cross-functional projects
- 3 years of experience managing large-scale programs, including vendor-managed resources
- Strong ability to influence, communicate across teams, and operate in a fast-paced environment
Posted 1 month ago
4.0 - 9.0 years
4 - 10 Lacs
Hyderabad, Telangana, India
On-site
Role Responsibilities:
- Lead and manage complex, cross-functional technical programs, ensuring timely execution and stakeholder alignment
- Collaborate with Engineering, Product, and AI Data Infra teams to optimize workflows and align operational priorities
- Own resource allocation, process improvements, and tooling launches, managing escalations and critical incidents
- Drive program execution, measure KPIs, mitigate risks, and contribute to budgeting and forecasting for AI Data operations
Job Requirements:
- Bachelor's degree in Computer Science, Engineering, or equivalent practical experience
- 5 years of experience in program management of technical cross-functional projects
- 3 years of experience managing large-scale programs, including vendor-managed resources
- Strong ability to influence, communicate across teams, and operate in a fast-paced environment
Posted 1 month ago