
366 Neo4j Jobs - Page 14

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

6.0 - 12.0 years

0 Lacs

Greater Kolkata Area

On-site


Job Description
We are looking for a highly skilled GCP Technical Lead with 6 to 12 years of experience to join our dynamic team. In this role, you will be responsible for designing and implementing scalable, secure, and highly available cloud infrastructure solutions on Google Cloud Platform (GCP). You will lead the architecture and development of cloud-native applications and ensure that infrastructure and applications are optimized for performance, security, and scalability. Your expertise will play a key role in the design and execution of workload migrations, CI/CD pipelines, and infrastructure automation.

Responsibilities:
- Cloud Architecture and Design: Lead the design and implementation of scalable, secure, and highly available cloud infrastructure solutions on GCP using services like Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, and Cloud Load Balancing.
- Cloud-Native Application Design: Develop architecture designs and guidelines for the development, deployment, and lifecycle management of cloud-native applications, ensuring optimization for security, performance, and scalability with services such as App Engine, Cloud Functions, and Cloud Run.
- API Management: Implement secure API interfaces and granular access control using IAM, RBAC, and API Gateway for workloads running on GCP.
- Workload Migration: Lead the migration of on-premises workloads to GCP, ensuring minimal downtime, data integrity, and smooth transitions.
- CI/CD: Design and implement CI/CD pipelines using Cloud Build, Cloud Source Repositories, and Artifact Registry to automate development and deployment processes.
- Infrastructure as Code (IaC): Automate cloud infrastructure provisioning and management using Terraform.
- Collaboration: Work closely with cross-functional teams to define requirements, design solutions, and ensure successful project delivery, utilizing tools like Google Workspace and Jira.
- Monitoring and Optimization: Continuously monitor cloud environments to ensure optimal performance, availability, and security, and perform regular audits and tuning.
- Documentation: Prepare and maintain comprehensive documentation for cloud infrastructure, configurations, and procedures using Google Docs.

Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 6-12 years of relevant experience in cloud engineering and architecture.
- Google Cloud Professional Cloud Architect certification.
- Experience with Kubernetes.
- Familiarity with DevOps methodologies.
- Strong problem-solving and analytical skills.
- Excellent communication skills.

Required Skills: Google Cloud Platform (GCP) services, Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, Identity and Access Management (IAM), Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, Google Cloud Monitoring, Logging and Error Reporting, Python, Terraform, Google Cloud Firestore, GraphQL, MongoDB, Cassandra, Neo4j, ETL (Extract, Transform, Load) paradigms, Google Cloud Dataflow, Apache Beam, BigQuery, Service Mesh, Content Delivery Network (CDN), Stackdriver, Google Cloud Trace. (A minimal sketch of programmatic GCP access from Python follows this listing.) (ref:hirist.tech)
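Since the required skills above pair Python with GCP services such as Compute Engine, here is a minimal, hedged sketch of programmatic GCP access (the project and zone identifiers are placeholders, not details from this posting; assumes the google-cloud-compute client library is installed):

```python
# Minimal sketch: enumerate Compute Engine instances in one zone.
# Assumes `pip install google-cloud-compute` and application-default
# credentials (e.g. via `gcloud auth application-default login`).
from google.cloud import compute_v1

def list_instances(project_id: str, zone: str) -> None:
    client = compute_v1.InstancesClient()
    for instance in client.list(project=project_id, zone=zone):
        print(instance.name, instance.status)

if __name__ == "__main__":
    # Placeholder identifiers, not from the job posting.
    list_instances("my-gcp-project", "asia-south1-a")
```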

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site


Job Summary
We are seeking a highly skilled Data Engineer with expertise in leveraging Data Lake architecture and the Azure cloud platform to develop, deploy, and optimise data-driven solutions. You will play a pivotal role in transforming raw data into actionable insights, supporting strategic decision-making across the organisation.

Responsibilities
- Design and implement scalable data science solutions using Azure Data Lake, Azure Databricks, Azure Data Factory, and related Azure services (a minimal PySpark sketch follows this listing).
- Develop, train, and deploy machine learning models to address business challenges.
- Collaborate with data engineering teams to optimise data pipelines and ensure seamless data integration within Azure cloud infrastructure.
- Conduct exploratory data analysis (EDA) to identify trends, patterns, and insights.
- Build predictive and prescriptive models to support decision-making processes.
- Develop the end-to-end machine learning lifecycle using CRISP-DM, covering data collection, cleansing, visualization, preprocessing, model development, model validation, and model retraining.
- Build and implement Retrieval-Augmented Generation (RAG) systems that enhance the accuracy and relevance of model outputs by integrating retrieval mechanisms with generative models.
- Ensure data security, compliance, and governance within the Azure cloud ecosystem.
- Monitor and optimise model performance and scalability in production environments.
- Prepare clear and concise documentation for developed models and workflows.

Skills Required
- Good experience using PySpark, Python, MLOps (optional), MLflow (optional), Azure Data Lake Storage, and Unity Catalog.
- Worked with data from various RDBMSs such as MySQL, SQL Server, and Postgres; NoSQL databases such as MongoDB, Cassandra, and Redis; and graph databases such as Neo4j and Grakn.
- Proven experience as a Data Engineer with a strong focus on the Azure cloud platform and Data Lake architecture.
- Proficiency in Python and PySpark; hands-on experience with Azure services such as Azure Data Lake, Azure Synapse Analytics, Azure Machine Learning, Azure Databricks, and Azure Functions.
- Strong knowledge of SQL and experience in querying large datasets from Data Lakes.
- Familiarity with data engineering tools and frameworks for data ingestion and transformation in Azure.
- Experience with version control systems (e.g., Git) and CI/CD pipelines for machine learning projects.
- Excellent problem-solving skills and the ability to work collaboratively in a team environment.
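The PySpark-on-Azure-Data-Lake combination above commonly looks like the following minimal, hedged sketch (the storage account, container, and column names are placeholders, not details from the posting):

```python
# Minimal PySpark sketch: read raw data from Azure Data Lake Storage,
# apply a simple cleansing transformation, and write the result as Delta.
# Assumes a Databricks/Spark environment already authenticated to ADLS.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("adls-etl-sketch").getOrCreate()

# Placeholder abfss:// paths; replace account/container with real values.
raw = spark.read.parquet(
    "abfss://raw@examplestorage.dfs.core.windows.net/events/"
)

cleaned = (
    raw.dropna(subset=["event_id"])            # basic cleansing
       .withColumn("event_date", F.to_date("event_ts"))
)

cleaned.write.format("delta").mode("overwrite").save(
    "abfss://curated@examplestorage.dfs.core.windows.net/events_clean/"
)
```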

Posted 1 month ago

Apply

1.0 - 3.0 years

11 - 15 Lacs

Mumbai

Work from Office


Overview
The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications.

Responsibilities
As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats (a minimal data-quality sketch follows this listing). We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark.

Qualifications
- Core Java, Spring Boot, Apache Spark, Spring Batch, Python.
- Exposure to SQL databases like Oracle, MySQL, or Microsoft SQL Server is a must.
- Experience, knowledge, or certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have.
- Exposure to NoSQL databases like Neo4j or document databases is also good to have.

What we offer you
- Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
- Flexible working arrangements, advanced technology, and collaborative workspaces.
- A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
- A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
- A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro, and tailored learning opportunities for ongoing skills development.
- Multi-directional career paths that offer professional growth and development through new challenges, internal mobility, and expanded roles.
- An environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
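Quality checks of the kind described in the responsibilities above can be expressed compactly in Spark. A minimal, hedged sketch in PySpark (the dataset path and column names are illustrative, not MSCI's):

```python
# Minimal PySpark sketch of a row-level quality check:
# flag records with missing identifiers or out-of-range prices.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-sketch").getOrCreate()

df = spark.read.parquet("/data/vendor_feed/")  # placeholder path

checks = (
    df.withColumn("missing_id", F.col("security_id").isNull())
      .withColumn("bad_price", (F.col("price") <= 0) | F.col("price").isNull())
)

failed = checks.filter(F.col("missing_id") | F.col("bad_price"))
print(f"{failed.count()} of {df.count()} rows failed quality checks")
```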

Posted 1 month ago

Apply

2.0 - 3.0 years

3 - 7 Lacs

Bengaluru

Work from Office


- Solid experience in NodeJS, TypeScript, React, Neo4j, and Firestore (GCP).
- In-depth knowledge of software design and development practices.
- Design and develop scalable systems using advanced concepts in NodeJS, TypeScript, JavaScript, and React.
- Good understanding of deploying to and working with GKE.
- Ability to design for scale and performance; peer code reviews.
- Architecture/platform development, API design, and data modelling at scale.
- Excellent working experience in Express, Knex, and serverless Google Cloud Functions.
- Solid experience in JavaScript frameworks (Angular/React.js), Redux, JavaScript, jQuery, CSS, HTML5, ES5, ES6 & ES7, in-memory databases (Redis/Hazelcast), and build tools (webpack).
- Good error and exception handling skills.
- Ability to work with Git repositories and remote code hosting services like GitHub and GitLab.
- Ability to deliver amazing results with minimal guidance and supervision.
- Passionate (especially about web development!), highly motivated, and fun to work with.
Skills: React, ES6/ES5, HTML & CSS, JavaScript & TypeScript, API, Node

Posted 1 month ago

Apply

10.0 - 20.0 years

25 - 35 Lacs

Pune

Remote


Job Description: Technical Delivery Manager, Saama Technologies

Responsibilities:
- Oversee the end-to-end development and delivery of the Graph RAG system.
- Manage project timelines, ensuring timely delivery and adherence to milestones.
- Establish and maintain strong communication with client technical leads, providing regular updates and addressing technical concerns.
- Offer technical leadership and expertise in graph databases (e.g., Neo4j) and LLM-based applications (a minimal Graph RAG sketch follows this listing).
- Collaborate with the team on architectural decisions, ensuring solutions are scalable, robust, and aligned with client requirements.
- Mitigate technical risks and address challenges proactively.

Qualifications:
- Proven experience in technical project management and delivery, ideally within the AI/ML or data science domain.
- Strong understanding of graph databases and LLM-based systems.
- Experience with cloud-based development and deployment (AWS, GCP, or Azure).
- Excellent communication and interpersonal skills, with the ability to bridge the gap between technical and non-technical stakeholders.
- Ability to work independently and lead a team in a fast-paced environment.
- Experience with Agile methodologies.

Required Skills:
- Knowledge of graph databases (Neo4j)
- Experience with LLM-based systems
- Proficiency in LangChain
- API development and cloud deployment expertise
- Experience managing engineering teams and Agile methodologies

Desired Skills:
- Familiarity with LangChain and API development.
- Knowledge of MLOps and CI/CD practices.
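For context on the system this role delivers, a Graph RAG retrieval step typically pairs a Cypher query against Neo4j with an LLM prompt. A minimal, hedged sketch (the connection details, Entity label, and example entity are placeholders, not Saama's implementation; the LLM call itself is omitted):

```python
# Minimal Graph RAG sketch: fetch related facts from Neo4j, then build
# an LLM prompt from them. Assumes `pip install neo4j`.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def fetch_context(entity_name: str) -> list[str]:
    query = (
        "MATCH (e:Entity {name: $name})-[r]->(n) "
        "RETURN e.name AS subject, type(r) AS rel, n.name AS object LIMIT 25"
    )
    with driver.session() as session:
        records = session.run(query, name=entity_name)
        return [f"{rec['subject']} {rec['rel']} {rec['object']}" for rec in records]

def build_prompt(question: str) -> str:
    facts = "\n".join(fetch_context("Aspirin"))  # placeholder entity
    return f"Answer using only these graph facts:\n{facts}\n\nQuestion: {question}"

# The prompt would then be passed to an LLM client (e.g. via LangChain);
# that call is omitted to keep the sketch self-contained.
print(build_prompt("What does Aspirin interact with?"))
```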

Posted 1 month ago

Apply

7.0 - 9.0 years

25 - 40 Lacs

Pune

Work from Office


Our world is transforming, and PTC is leading the way. Our software brings the physical and digital worlds together, enabling companies to improve operations, create better products, and empower people in all aspects of their business. Our people make all the difference in our success. Today, we are a global team of nearly 7,000, and our main objective is to create opportunities for our team members to explore, learn, and grow – all while seeing their ideas come to life and celebrating the differences that make us who we are and the work we do possible.

PTC is a dynamic and innovative company dedicated to creating innovative products that transform industries and improve lives. We are looking for a talented Product Architect who will be able to lead the conceptualization and development of groundbreaking products and leverage the power of cutting-edge AI technologies to drive enhanced productivity and innovation.

Job Description:

Responsibilities:
- Design and implement scalable, secure, and high-performing Java applications.
- Focus on designing, building, and maintaining complex, large-scale systems with intrinsic multi-tenant SaaS characteristics.
- Define architectural standards, best practices, and technical roadmaps.
- Lead the integration of modern technologies, frameworks, and cloud solutions.
- Collaborate with DevOps, product teams, and UI/UX designers to ensure cohesive product development.
- Conduct code reviews, mentor developers, and enforce best coding practices.
- Stay up-to-date with the latest design patterns, technological trends, and industry best practices.
- Ensure scalability, performance, and security of product designs.
- Conduct feasibility studies and risk assessments.

Requirements:
- Proven experience as a Software Solution Architect or similar role.
- Strong expertise in vector and graph databases (e.g., Pinecone, Chroma DB, Neo4j, ArangoDB, Elasticsearch); a minimal vector-search sketch follows this listing.
- Extensive experience with content repositories and content management systems.
- Familiarity with SaaS and microservices implementation models.
- Proficiency in programming languages such as Java, Python, or C#.
- Excellent problem-solving skills and ability to think strategically.
- Strong technical, analytical, communication, interpersonal, and presentation skills.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Experience with cloud platforms (e.g., AWS, Azure).
- Knowledge of containerization technologies (e.g., Docker, Kubernetes).
- Experience with artificial intelligence (AI) and machine learning (ML) technologies.

Benefits:
- Competitive salary and benefits package.
- Opportunities for professional growth and development.
- Collaborative and inclusive work environment.
- Flexible working hours and hybrid work options.

Life at PTC is about more than working with today's most cutting-edge technologies to transform the physical world. It's about showing up as you are and working alongside some of today's most talented industry leaders to transform the world around you. If you share our passion for problem-solving through innovation, you'll likely become just as passionate about the PTC experience as we are. Are you ready to explore your next career move with us? We respect the privacy rights of individuals and are committed to handling Personal Information responsibly and in accordance with all applicable privacy and data protection laws. Review our Privacy Policy here.
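For the vector-database requirement above, here is a minimal, hedged sketch using Chroma (the collection name and documents are invented for illustration; this is not PTC's stack):

```python
# Minimal Chroma sketch: index a few documents and run a similarity query.
# Assumes `pip install chromadb`; uses the default embedded client and
# Chroma's built-in default embedding function.
import chromadb

client = chromadb.Client()
collection = client.create_collection("product-docs")  # placeholder name

collection.add(
    ids=["d1", "d2", "d3"],
    documents=[
        "PLM systems manage product lifecycles end to end.",
        "Vector databases index embeddings for similarity search.",
        "Graph databases such as Neo4j model connected data.",
    ],
)

results = collection.query(query_texts=["How do I search embeddings?"], n_results=2)
print(results["documents"])
```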

Posted 1 month ago

Apply

0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


About the Company: Our client is a prominent Indian multinational corporation specializing in information technology (IT), consulting, and business process services. Headquartered in Bengaluru, it has gross revenue of ₹222.1 billion, a global workforce of 234,054, and a NASDAQ listing; it operates in over 60 countries and serves clients across various industries, including financial services, healthcare, manufacturing, retail, and telecommunications. The company consolidated its cloud, data, analytics, AI, and related businesses under the tech services business line. It has major delivery centers in India, including Chennai, Pune, Hyderabad, Bengaluru, Kochi, Kolkata, and Noida.

Job Title: Java Backend Developer (Contract to Hire)
Experience: 6 to 10 Years
Job Location: Bengaluru / Chennai
Interview Mode: Face-to-Face
Interview Date: 24th May 2025
Venue: [Will be shared shortly]
Notice Period: Immediate joiners

Standard Job Requirements
- 6+ years of experience in application development using Java and advanced technology tools
- Strong understanding of fundamental architecture and design principles, object-orientation principles, and coding standards
- Ability to design and build smart, scalable, and resilient solutions to tight deadlines, at both high and low level
- Strong analytical and problem-solving skills
- Strong verbal and written communication skills
- Good knowledge of DevOps and CI/CD
- Understanding of source control, versioning, branching, etc.
- Experienced in Agile methodology and Waterfall models
- Strong experience in application delivery, including production support
- Very good presentation and documentation skills
- Ability to learn and adapt to new technologies and frameworks
- Awareness of release management
- Strong team player who can collaborate effectively with relevant stakeholders
- Ability to recommend future technology capabilities and architecture design considering business objectives, technology strategy, trends, and regulatory requirements

Technical Competence: Must Have
- Strong programming and hands-on skills in Java 8 or above (preferably Java 17)
- Good hands-on experience with Java Collections and Streams
- Good hands-on experience with data structures and algorithms
- Good experience in developing vulnerability-free Spring Framework applications
- Good knowledge of Spring DI/Blueprints, Spring Boot, etc.
- Good knowledge of design patterns and principles
- Good knowledge of ORM frameworks like Hibernate, JPA, etc.
- Good knowledge of API building (Web Services, SOAP/REST)
- Good knowledge of unit testing and code coverage using JUnit/Mockito
- Good knowledge of code quality tools like SonarQube, security scans, etc.
- Good knowledge of containerized platforms like Kubernetes, OpenShift, EKS (AWS)
- Good knowledge of Enterprise Application Integration patterns (synchronous, asynchronous)
- Good knowledge of multi-threading and multi-processing implementations
- Experience in RDBMS (Oracle, PostgreSQL, MySQL) and knowledge of SQL queries
- Ability to work in a fast-paced, dynamic environment, adapting agile methodologies
- Ability to work with minimal guidance and/or high-level design input
- Knowledge of microservices-based development and implementation
- Knowledge of CI/CD patterns with related tools like Azure DevOps, Git, Bitbucket, etc.
- Knowledge of JSON libraries like Jackson/GSON
- Knowledge of basic Unix commands
- Good documentation and presentation skills; able to articulate ideas, designs, and suggestions
- Mentoring fellow team members and conducting code reviews

Good to Have
- Hands-on skills in J2EE specifications like JAX-RS, JAX-WS
- Experience working with and supporting OLTP and OLAP systems
- Good knowledge of Spring Batch and Spring Security
- Good knowledge of the Linux operating system (preferably RHEL)
- Good knowledge of NoSQL offerings (Cassandra, MongoDB, graph databases, etc.)
- Knowledge of testing methodologies such as performance, smoke, stress, and endurance testing
- Knowledge of Python and Groovy
- Knowledge of middleware technologies like Kafka, Solace, etc.
- Knowledge of DSE DataStax or Neo4j
- Knowledge of cloud environments (AWS, Azure, etc.)
- Knowledge of IMDGs (Hazelcast, Ignite)
- Knowledge of rule engines like Drools, OpenL Tablets, Easy Rules, etc.
- Experience presenting solutions to architecture forums and following the principles and standards in implementation

Domain: Good to Have
- Experience in application development for Client Due Diligence (CDD), Onboarding, FATCA & CRS, AML, KYC, and Screening
- Good knowledge of cloud-native application development and cloud computing services

Training, Qualifications and Certifications
Training, qualifications, and certifications in some of the functional and/or technical domains mentioned above will be an added advantage.

Posted 1 month ago

Apply

0 years

0 Lacs

Kolkata metropolitan area, West Bengal, India

On-site


We are seeking an experienced AI Solution Architect to lead the design and implementation of AI-driven, cloud-native applications. The ideal candidate will possess deep expertise in Generative AI, Agentic AI, cloud platforms (AWS, Azure, GCP), and modern data engineering practices. This role involves collaborating with cross-functional teams to deliver scalable, secure, and intelligent solutions in a fast-paced, innovation-driven environment.

Key Responsibilities:
- Design and architect AI/ML solutions, including Generative AI, Retrieval-Augmented Generation (RAG), and fine-tuning of Large Language Models (LLMs), using frameworks like LangChain, LangGraph, and Hugging Face.
- Implement cloud migration strategies for moving monolithic systems to microservices/serverless architectures on AWS, Azure, and GCP.
- Lead development of document automation systems leveraging models such as BART and LayoutLM and Agentic AI workflows.
- Architect and optimize data lakes, ETL pipelines, and analytics dashboards using Databricks, PySpark, Kibana, and MLOps tools.
- Build centralized search engines using ElasticSearch, Solr, and Neo4j for intelligent content discovery and sentiment analysis (a minimal search sketch follows this listing).
- Ensure application and ML pipeline security with tools like SonarQube, WebInspect, and container security tools.
- Collaborate with InfoSec and DevOps teams to maintain CI/CD pipelines, perform vulnerability analysis, and ensure compliance.
- Guide modernization initiatives across application stacks and coordinate BCDR-compliant infrastructure for mission-critical services.
- Provide technical leadership and mentoring to engineering teams during all phases of the SDLC.

Hands-on experience with:
- Generative AI, LLMs, Prompt Engineering, LangChain, AutoGen, Vertex AI, AWS Bedrock
- Python, Java (Spring Boot, Spring AI), PyTorch
- Vector and graph databases: ElasticSearch, Solr, Neo4j
- Cloud platforms: AWS, Azure, GCP (CAF, serverless, containerization)
- DevSecOps: SonarQube, OWASP, OAuth2, container security
- Strong background in application modernization, cloud-native architecture, and MLOps orchestration
- Familiarity with front-end technologies: HTML, JavaScript, React, jQuery

Certifications
Any certification in AI/ML from a reputed institute.

Required Skills & Qualifications
- Bachelor's degree in Computer Science, Engineering, or Mathematics
- 10+ years of total experience, with extensive tenure as a Solution Architect in AI and cloud-driven transformations
- Advanced knowledge of leading architecture solutions in the industry area
- Strong interpersonal and collaboration skills
- Ability to demonstrate technical concepts to non-technical audiences
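For the centralized-search responsibility above, a minimal, hedged sketch with the official Elasticsearch Python client (the index name, fields, and local URL are placeholders):

```python
# Minimal Elasticsearch sketch: index a document and run a match query.
# Assumes `pip install elasticsearch` and a reachable local cluster.
from elasticsearch import Elasticsearch

es = Elasticsearch("http://localhost:9200")

es.index(index="knowledge-articles", id="1", document={
    "title": "Agentic AI workflows",
    "body": "Retrieval-augmented generation combines search with LLMs.",
})
es.indices.refresh(index="knowledge-articles")  # make the doc searchable now

resp = es.search(index="knowledge-articles",
                 query={"match": {"body": "retrieval generation"}})
for hit in resp["hits"]["hits"]:
    print(hit["_score"], hit["_source"]["title"])
```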

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote


Role Description
This is a contract remote role for a Senior Graph Data Engineer. The Senior Graph Data Engineer will be responsible for designing, developing, and maintaining graph database solutions, creating data models, and implementing ETL processes (a minimal graph-ETL sketch follows this listing). The role includes data warehousing tasks and data analytics to support decision-making processes. Collaborating with cross-functional teams, the Senior Graph Data Engineer will ensure the integrity and performance of graph databases.

Qualifications
- Proficiency in data engineering and data modeling
- Experience with Extract, Transform, Load (ETL) processes
- Knowledge of data warehousing
- Strong data analytics skills
- Excellent problem-solving and critical-thinking skills
- Strong communication and teamwork abilities
- Bachelor's degree in Computer Science, Engineering, or a related field
- Experience with graph databases like Neo4j is a plus
- Exposure to AI and Machine Learning techniques is beneficial
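An ETL load into a graph database of the kind this role describes typically upserts nodes and relationships with Cypher MERGE. A minimal, hedged sketch (the labels, CSV layout, and connection details are invented for illustration):

```python
# Minimal graph ETL sketch: load (person, company) rows from a CSV into
# Neo4j, upserting nodes and an EMPLOYED_BY relationship.
# Assumes `pip install neo4j`.
import csv
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

UPSERT = """
MERGE (p:Person {name: $person})
MERGE (c:Company {name: $company})
MERGE (p)-[:EMPLOYED_BY]->(c)
"""

def load(path: str) -> None:
    with open(path, newline="") as f, driver.session() as session:
        for row in csv.DictReader(f):  # expects 'person' and 'company' columns
            session.run(UPSERT, person=row["person"], company=row["company"])

load("employments.csv")  # placeholder file name
```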

Posted 1 month ago

Apply

0 years

0 Lacs

Sadar, Uttar Pradesh, India

On-site


Role Overview:
We are seeking a motivated Junior AI Testing Engineer to join our team. In this role, you will support the testing of AI models and pipelines, with a special focus on data ingestion into knowledge graphs and knowledge graph administration. You will collaborate with data scientists, engineers, and product teams to ensure the quality, reliability, and performance of our AI-driven solutions.

Key Responsibilities:
- AI Model & Pipeline Testing: Design and execute test cases for AI models and data pipelines, ensuring accuracy, stability, and fairness.
- Knowledge Graph Ingestion: Support the development and testing of Python scripts for data extraction, transformation, and loading (ETL) into enterprise knowledge graphs (a minimal test sketch follows this listing).
- Knowledge Graph Administration: Assist in maintaining, monitoring, and troubleshooting knowledge graph environments (e.g., Neo4j, RDF stores), including user access and data integrity.
- Test Automation: Develop and maintain basic automation scripts (preferably in Python) to streamline testing processes for AI functionalities.
- Data Quality Assurance: Evaluate and validate the quality of input and output data for AI models, reporting and documenting issues as needed.
- Bug Reporting & Documentation: Identify, document, and communicate bugs or issues discovered during testing; maintain clear testing documentation and reports.
- Collaboration: Work closely with knowledge graph engineers, data scientists, and product managers to understand requirements and deliver robust solutions.

Requirements:
- Education: Bachelor's degree in Computer Science, Information Technology, or a related field.
- Experience: Ideally, experience in software/AI testing, data engineering, or a similar technical role.
- Technical Skills: Proficiency in Python (must have); experience with test case design, execution, and bug reporting; exposure to knowledge graph technologies (e.g., Neo4j, RDF, SPARQL) and data ingestion/ETL processes.
- Analytical & Problem-Solving Skills: Strong attention to detail, with the ability to analyze data and systems and troubleshoot issues.
- Communication: Clear verbal and written communication skills for documentation and collaboration.

Preferred Qualifications:
- Experience with graph query languages (e.g., Cypher, SPARQL)
- Exposure to cloud platforms (AWS, Azure, GCP) and CI/CD workflows
- Familiarity with data quality and governance practices
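Ingestion tests like those described above often reduce to assertions over Cypher query results after a load. A minimal, hedged pytest sketch (the Document label, source property, and connection details are placeholders):

```python
# Minimal pytest sketch: verify that an ingestion run produced the
# expected nodes and left no orphaned records.
# Assumes `pip install neo4j pytest`.
import pytest
from neo4j import GraphDatabase

@pytest.fixture(scope="module")
def session():
    driver = GraphDatabase.driver("bolt://localhost:7687",
                                  auth=("neo4j", "password"))
    with driver.session() as s:
        yield s
    driver.close()

def test_documents_were_ingested(session):
    count = session.run("MATCH (d:Document) RETURN count(d) AS n").single()["n"]
    assert count > 0, "ingestion produced no Document nodes"

def test_no_documents_without_source(session):
    orphans = session.run(
        "MATCH (d:Document) WHERE d.source IS NULL RETURN count(d) AS n"
    ).single()["n"]
    assert orphans == 0, f"{orphans} Document nodes missing 'source'"
```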

Posted 1 month ago

Apply

2 - 5 years

2 - 5 Lacs

Bengaluru

Work from Office


Databricks Engineer (Full-time)
Department: Digital, Data and Cloud

Company Description
Version 1 has celebrated over 26 years in Technology Services and continues to be trusted by global brands to deliver solutions that drive customer success. Version 1 has several strategic technology partners including Microsoft, AWS, Oracle, Red Hat, OutSystems and Snowflake. We're also an award-winning employer, reflecting how employees are at the heart of Version 1. We've been awarded: Innovation Partner of the Year Winner 2023 Oracle EMEA Partner Awards, Global Microsoft Modernising Applications Partner of the Year Award 2023, AWS Collaboration Partner of the Year - EMEA 2023, and Best Workplaces for Women by Great Place To Work in UK and Ireland 2023. As a consultancy and service provider, Version 1 is a digital-first environment and we do things differently. We're focused on our core values; using these we've seen significant growth across our practices, and our Digital, Data and Cloud team is preparing for the next phase of expansion. This creates new opportunities for driven and skilled individuals to join one of the fastest-growing consultancies globally.

About The Role
This is an exciting opportunity for an experienced developer of large-scale data solutions. You will join a team delivering a transformative cloud-hosted data platform for a key Version 1 customer. The ideal candidate will have a proven track record as a senior, self-starting data engineer in implementing data ingestion and transformation pipelines for large-scale organisations. We are seeking someone with deep technical skills in a variety of technologies, specifically Spark performance tuning/optimisation and Databricks, to play an important role in developing and delivering early proofs of concept and production implementations. You will ideally have experience in building solutions using a variety of open-source tools and Microsoft Azure services, and a proven track record of delivering high-quality work to tight deadlines.

Your main responsibilities will be:
- Designing and implementing highly performant, metadata-driven data ingestion and transformation pipelines from multiple sources using Databricks and Spark
- Streaming and batch processes in Databricks
- Spark performance tuning/optimisation (a minimal tuning sketch follows this listing)
- Providing technical guidance for complex geospatial problems and Spark dataframes
- Developing scalable and re-usable frameworks for ingestion and transformation of large data sets
- Data quality system and process design and implementation
- Integrating the end-to-end data pipeline to take data from source systems to target data repositories, ensuring the quality and consistency of data is maintained at all times
- Working with other members of the project team to support delivery of additional project components (reporting tools, API interfaces, search)
- Evaluating the performance and applicability of multiple tools against customer requirements
- Working within an Agile delivery / DevOps methodology to deliver proof of concept and production implementation in iterative sprints

Qualifications
- 6 to 8 years of experience required
- Direct experience of building data pipelines using Azure Data Factory and Databricks
- Building data integration with Python
- Databricks Engineer certification
- Microsoft Azure Data Engineer certification
- Hands-on experience designing and delivering solutions using the Azure Data Analytics platform
- Experience building data warehouse solutions using ETL/ELT tools like Informatica and Talend
- Comprehensive understanding of data management best practices, including demonstrated experience with data profiling, sourcing, and cleansing routines utilizing typical data quality functions involving standardization, transformation, rationalization, linking, and matching

Nice to have
- Experience working in a DevOps environment with tools such as Microsoft Visual Studio Team Services, Chef, Puppet or Terraform
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience with Azure Event Hub, IoT Hub, Apache Kafka, or NiFi for use with streaming / event-based data

Additional Information
At Version 1, we believe in providing our employees with a comprehensive benefits package that prioritises their well-being, professional growth, and financial stability. One of our standout advantages is the ability to work with a hybrid schedule along with business travel, allowing our employees to strike a balance between work and life. We also offer a range of tech-related benefits, including an innovative Tech Scheme to help keep our team members up-to-date with the latest technology. We prioritise the health and safety of our employees, providing private medical and life insurance coverage, as well as free eye tests and contributions towards glasses. Our team members can also stay ahead of the curve with incentivized certifications and accreditations, including AWS, Microsoft, Oracle, and Red Hat. Our employee-designed Profit Share scheme divides a portion of our company's profits each quarter amongst employees. We are dedicated to helping our employees reach their full potential, offering Pathways Career Development Quarterly, a programme designed to support professional growth.
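Spark performance tuning of the kind this role calls for often starts with shuffle-partition sizing and broadcast joins. A minimal, hedged PySpark sketch (the table paths and partition count are illustrative, not the customer's configuration):

```python
# Minimal Spark tuning sketch: control shuffle partitions and broadcast
# a small dimension table to avoid an expensive shuffle join.
from pyspark.sql import SparkSession
from pyspark.sql.functions import broadcast

spark = (
    SparkSession.builder.appName("tuning-sketch")
    .config("spark.sql.shuffle.partitions", "200")  # size to the cluster
    .getOrCreate()
)

facts = spark.read.format("delta").load("/mnt/lake/facts")       # large table
dims = spark.read.format("delta").load("/mnt/lake/dim_region")   # small table

# Broadcasting the small side keeps the large table from shuffling.
joined = facts.join(broadcast(dims), on="region_id")

joined.write.format("delta").mode("overwrite").save("/mnt/lake/facts_enriched")
```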

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote


As an AI Product Manager in the GenAI Core team, you will help shape the future of our Generative AI platform, with a strong focus on GenAI and agentic capabilities. This isn't a traditional AI/ML development role. Instead, it's about transforming ideas into impactful GenAI product features, particularly in the space of autonomous agents, multi-agent collaboration, and LLM-driven automation.

Duties And Responsibilities
- Translate stakeholder needs into scalable, implementable features in the GenAI platform.
- Collaborate across teams to build next-gen GenAI experiences.
- Continuously refine and optimize the product based on feedback and trends such as agentic and multi-agent capabilities.
- Create and maintain the product feature backlog and user guides.
- Refine backlog features into implementation epics, user stories, and artefacts.
- Provide prompt and effective support and guidance to developers.
- Ensure the quality of implementation and deliverables.
- Collaborate with the engineering team to address and resolve product issues.
- Stay abreast of the latest industry trends and technologies.
- Assume overall responsibility for product management in respective projects.
- Demonstrate a strong desire for continuous learning and the ability to quickly adapt to and implement new technologies.

Qualification, Experience, Technical and Functional Skills
- Bachelor's degree in Computer Science, Information Technology, or a related field with 5+ years of working experience.
- 3-6 years in product management (preferably in AI/tech platforms).
- Experience in GenAI product development and agent-based design.
- Strong understanding of LLMs, prompt engineering, and agent orchestration.
- Proficiency in implementing Generative AI-based applications using different large language models.
- Understanding of and experience with various generative AI models on cloud platforms such as Azure/AWS, including Retrieval-Augmented Generation, prompt engineering, and agentic frameworks.
- Proficiency in cloud platforms like Azure or AWS; familiarity with deploying and managing AI applications, handling cloud storage, compute instances, and cloud security.
- Familiarity with Angular or similar JavaScript frameworks for building user interfaces, along with a solid understanding of HTML, CSS, and JavaScript.
- Familiarity with database technologies like Postgres, SQL, NoSQL, MongoDB, and vector databases such as pgvector and Neo4j.
- Understanding of security principles as they apply to AI applications, especially in a cloud environment.
- Experience with collaboration and versioning tools such as JIRA, Confluence, and GitHub.
- Excellent problem-solving skills and the ability to debug code effectively.
- Ability to quickly learn and apply new technologies.

72622 | Customer Services & Claims | Professional | Allianz Technology | Full-Time | Permanent

What we offer
- A hybrid work model which recognizes the value of striking a balance between in-person collaboration and remote working, including up to 25 days per year working from abroad.
- We believe in rewarding performance, and our compensation and benefits package includes a company bonus scheme, pension, employee shares program, and multiple employee discounts (details vary by location).
- From career development and digital learning programs to international career mobility, we offer lifelong learning for our employees worldwide and an environment where innovation, delivery, and empowerment are fostered.
- Flexible working, health and wellbeing offers (including healthcare and parental leave benefits) support to balance family and career and help our people return from career breaks with experience that nothing else can teach.

About Allianz Technology
Allianz Technology is the global IT service provider for Allianz and delivers IT solutions that drive the digitalization of the Group. With more than 13,000 employees located in 22 countries around the globe, Allianz Technology works together with other Allianz entities in pioneering the digitalization of the financial services industry. We oversee the full digitalization spectrum – from one of the industry's largest IT infrastructure projects that includes data centers, networking and security, to application platforms that span from workplace services to digital interaction. In short, we deliver full-scale, end-to-end IT solutions for Allianz in the digital age.

D&I statement
Allianz Technology is proud to be an equal opportunity employer encouraging diversity in the working environment. We are interested in your strengths and experience. We welcome all applications from all people regardless of gender identity and/or expression, sexual orientation, race or ethnicity, age, nationality, religion, disability, or philosophy of life.

Posted 1 month ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

Remote


We are seeking a skilled, motivated, and quick-learning Full Stack Developer to join our team working on cutting-edge GenAI development. The successful candidate will be responsible for developing innovative applications and solutions across the stack, including frontend and backend. While the solutions will often utilize Retrieval-Augmented Generation (RAG) and agentic frameworks, the role will not be limited to this and will involve various AI technologies.

Duties And Responsibilities
- Develop and maintain web applications using Angular, NDBX frameworks, and other modern technologies.
- Design and implement databases in Postgres DB; apply and implement ingestion and retrieval pipelines using pgvector and Neo4j, ensuring efficient and secure data practices (a minimal pgvector sketch follows this listing).
- Use different generative AI models and frameworks such as LangChain, Haystack, and LlamaIndex for chunking, embeddings, chat completions, integration with different data sources, etc.
- Apply agentic frameworks and techniques like LangGraph, AutoGen, and CrewAI, and tool-use techniques like MCP (Model Context Protocol).
- Use Azure and AWS cloud platforms in implementations to stay aligned with the company's AI guideline requirements.
- Follow OpenAPI standards and an API-first approach to develop APIs for communication between different software components.
- Collaborate with team members to integrate various GenAI capabilities into the applications, including but not limited to RAG.
- Write clean, maintainable, and efficient code that adheres to company standards.
- Conduct testing to identify and fix bugs or vulnerabilities.
- Use collaboration and versioning tools such as GitHub for effective teamwork and code management.
- Stay updated with emerging technologies and apply them to operations and activities.
- Show a strong desire for continuous learning and the ability to quickly adapt to and implement new technologies.

Qualification, Experience, Technical and Functional Skills
- Bachelor's degree in Computer Science, Information Technology, or a related field with 6+ years of working experience.
- Proven experience as a Full Stack Developer or in a similar role designing, developing, and deploying end-to-end applications.
- Knowledge of multiple front-end languages and libraries (e.g., HTML/CSS, JavaScript, XML, jQuery).
- Experience with Angular and NDBX frameworks.
- Good experience with database technology such as Postgres DB and vector databases.
- Experience developing APIs following the OpenAPI standards.
- Understanding of and experience with various generative AI models on cloud platforms such as Azure/AWS, including Retrieval-Augmented Generation, prompt engineering, agentic RAG, agentic frameworks, Model Context Protocol, etc.
- Experience with collaboration and versioning tools such as GitHub.
- Experience with Docker images and containers to package an application with all the parts it needs, such as libraries and other dependencies, and ship it all out as one package.

72617 | IT & Tech Engineering | Professional | Allianz Technology | Full-Time | Permanent

What we offer
- A hybrid work model which recognizes the value of striking a balance between in-person collaboration and remote working, including up to 25 days per year working from abroad.
- We believe in rewarding performance, and our compensation and benefits package includes a company bonus scheme, pension, employee shares program, and multiple employee discounts (details vary by location).
- From career development and digital learning programs to international career mobility, we offer lifelong learning for our employees worldwide and an environment where innovation, delivery, and empowerment are fostered.
- Flexible working, health and wellbeing offers (including healthcare and parental leave benefits) support to balance family and career and help our people return from career breaks with experience that nothing else can teach.

About Allianz Technology
Allianz Technology is the global IT service provider for Allianz and delivers IT solutions that drive the digitalization of the Group. With more than 13,000 employees located in 22 countries around the globe, Allianz Technology works together with other Allianz entities in pioneering the digitalization of the financial services industry. We oversee the full digitalization spectrum – from one of the industry's largest IT infrastructure projects that includes data centers, networking and security, to application platforms that span from workplace services to digital interaction. In short, we deliver full-scale, end-to-end IT solutions for Allianz in the digital age.

D&I statement
Allianz Technology is proud to be an equal opportunity employer encouraging diversity in the working environment. We are interested in your strengths and experience. We welcome all applications from all people regardless of gender identity and/or expression, sexual orientation, race or ethnicity, age, nationality, religion, disability, or philosophy of life.
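The pgvector ingestion-and-retrieval pipeline mentioned above stores embeddings in a Postgres vector column and retrieves nearest neighbours with a distance operator. A minimal, hedged sketch (the table name, 3-dimensional embeddings, the embed helper, and the DSN are placeholders; a real pipeline would call a proper embedding model):

```python
# Minimal pgvector sketch: create a table, insert an embedding, and run a
# cosine-distance nearest-neighbour query. Assumes `pip install psycopg2-binary`
# and a Postgres instance with the pgvector extension available.
import psycopg2

def embed(text: str) -> list[float]:
    # Placeholder: a real pipeline would call an embedding model here.
    return [float(len(text) % 7)] * 3

conn = psycopg2.connect("dbname=genai user=postgres")  # placeholder DSN
cur = conn.cursor()

cur.execute("CREATE EXTENSION IF NOT EXISTS vector")
cur.execute("""CREATE TABLE IF NOT EXISTS chunks (
    id serial PRIMARY KEY, content text, embedding vector(3))""")

cur.execute(
    "INSERT INTO chunks (content, embedding) VALUES (%s, %s)",
    ("Hello pgvector", str(embed("Hello pgvector"))),
)

# `<=>` is pgvector's cosine-distance operator.
cur.execute(
    "SELECT content FROM chunks ORDER BY embedding <=> %s LIMIT 3",
    (str(embed("greeting")),),
)
print(cur.fetchall())
conn.commit()
```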

Posted 1 month ago

Apply

3 years

0 Lacs

Pune, Maharashtra, India

On-site


Experience: 6 - 14 Years
Work Mode: Hybrid
Location: Bangalore, Chennai, Kolkata, Pune and Gurgaon
Primary Skills: Python, PySpark, Azure Data Factory, Snowflake, Snowpipe, SnowSQL, Snowsight, Snowpark, ETL and SQL. SnowPro certification is a plus; architect experience is mandatory.

Primary Roles And Responsibilities
- Developing Modern Data Warehouse solutions using Snowflake, Databricks and ADF (a minimal Snowflake connector sketch follows this listing)
- Providing solutions that are forward-thinking in the data engineering and analytics space
- Collaborating with DW/BI leads to understand new ETL pipeline development requirements
- Triaging issues to find gaps in existing pipelines and fixing them
- Working with the business to understand reporting-layer needs and developing a data model to fulfill them
- Helping junior team members resolve issues and technical challenges
- Driving technical discussions with client architects and team members
- Orchestrating the data pipelines in the scheduler via Airflow

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience
- Must have 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects
- Expertise in Snowflake security, Snowflake SQL, and designing and implementing other Snowflake objects
- Hands-on experience with Snowflake utilities: SnowSQL, Snowpipe, Snowsight, and Snowflake connectors
- Deep understanding of Star and Snowflake dimensional modeling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture
- Hands-on experience in SQL and Spark (PySpark)
- Experience in building ETL / data warehouse transformation processes
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, troubleshooting and query optimization
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Skills: python, snowpro, sql, azure data factory, azure, snowpipe, neo4j, data engineering, snowflake, terraform, nosql, circleci, git, data management, unix shell scripting, pl/sql, data warehouse, databricks, pipelines, cassandra, snowsql, rdbms, pyspark, adf, snowsight, etl, snowpark, mongodb
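Hands-on Snowflake work like the above usually goes through the Snowflake Python connector or Snowpark. A minimal, hedged sketch using the connector (the account identifier, credentials, and query are placeholders):

```python
# Minimal Snowflake sketch: connect and run a simple aggregate query.
# Assumes `pip install snowflake-connector-python` and valid credentials.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",   # placeholder account identifier
    user="ETL_USER",
    password="***",
    warehouse="ANALYTICS_WH",
    database="SALES_DB",
    schema="PUBLIC",
)

cur = conn.cursor()
try:
    cur.execute("SELECT region, COUNT(*) FROM orders GROUP BY region")
    for region, n in cur.fetchall():
        print(region, n)
finally:
    cur.close()
    conn.close()
```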

Posted 1 month ago

Apply

8 - 12 years

0 Lacs

Pune, Maharashtra, India

On-site


Location: Chennai, Kolkata, Gurgaon, Bangalore and Pune
Experience: 8 - 12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipelines, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architect Designing.

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Developing Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Providing solutions that are forward-thinking in the data engineering and analytics space
- Collaborating with DW/BI leads to understand new ETL pipeline development requirements
- Triaging issues to find gaps in existing pipelines and fixing them
- Working with the business to understand reporting-layer needs and developing a data model to fulfill them
- Helping junior team members resolve issues and technical challenges
- Driving technical discussions with client architects and team members
- Orchestrating the data pipelines in the scheduler via Airflow (a minimal DAG sketch follows this listing)

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience
- Must have 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Must have experience with the AWS/Azure stack
- Experience with both batch and streaming ETL (e.g., Kinesis) is desirable
- Experience in building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Skills: python, sql, pyspark, etl, data engineering, data pipelines, data warehousing, azure, azure databricks, azure data factory, azure synapse, azure functions, airflow, apache kafka, hadoop, nosql, terraform, circleci, git, unix shell scripting, pl/sql, rdbms, architect designing
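The Airflow orchestration responsibility above is expressed as a DAG definition in Python. A minimal, hedged sketch (the DAG id, schedule, and task bodies are placeholders, not this employer's pipeline; assumes Airflow 2.4+):

```python
# Minimal Airflow sketch: a daily two-step DAG (extract -> transform).
# Assumes Airflow 2.4+ (`pip install apache-airflow`).
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pulling source data")       # placeholder for a real extract step

def transform():
    print("running transformations")   # placeholder for a Spark/Databricks call

with DAG(
    dag_id="daily_warehouse_load",     # placeholder name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t1 = PythonOperator(task_id="extract", python_callable=extract)
    t2 = PythonOperator(task_id="transform", python_callable=transform)
    t1 >> t2
```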

Posted 1 month ago

Apply

8 - 12 years

0 Lacs

Gurugram, Haryana, India

On-site


Location: Chennai, Kolkata, Gurgaon, Bangalore and Pune
Experience: 8 - 12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipelines, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architect Designing.

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Developing Modern Data Warehouse solutions using Databricks and the AWS/Azure stack
- Providing solutions that are forward-thinking in the data engineering and analytics space
- Collaborating with DW/BI leads to understand new ETL pipeline development requirements
- Triaging issues to find gaps in existing pipelines and fixing them
- Working with the business to understand reporting-layer needs and developing a data model to fulfill them
- Helping junior team members resolve issues and technical challenges
- Driving technical discussions with client architects and team members
- Orchestrating the data pipelines in the scheduler via Airflow

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience
- Must have 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects
- Deep understanding of Star and Snowflake dimensional modelling
- Strong knowledge of Data Management principles
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake Architecture
- Hands-on experience in SQL, Python, and Spark (PySpark)
- Must have experience with the AWS/Azure stack
- Experience with both batch and streaming ETL (e.g., Kinesis) is desirable
- Experience in building ETL / data warehouse transformation processes
- Experience with Apache Kafka for streaming / event-based data
- Experience with other open-source big data products, including Hadoop (incl. Hive, Pig, Impala)
- Experience with open-source non-relational / NoSQL data repositories (incl. MongoDB, Cassandra, Neo4j)
- Experience working with structured and unstructured data, including imaging and geospatial data
- Experience working in a DevOps environment with tools such as Terraform, CircleCI, and Git
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting
- Databricks Certified Data Engineer Associate/Professional certification (desirable)
- Comfortable working in a dynamic, fast-paced, innovative environment with several ongoing concurrent projects
- Experience working in Agile methodology
- Strong verbal and written communication skills
- Strong analytical and problem-solving skills with high attention to detail

Skills: python, sql, pyspark, etl, data engineering, data pipelines, data warehousing, azure, azure databricks, azure data factory, azure synapse, azure functions, airflow, apache kafka, hadoop, nosql, terraform, circleci, git, unix shell scripting, pl/sql, rdbms, architect designing

Posted 1 month ago

Apply

7 years

0 Lacs

Gurugram, Haryana, India

On-site


Company Description
👋🏼 We're Nagarro. We are a Digital Product Engineering company that is scaling in a big way! We build products, services, and experiences that inspire, excite, and delight. We work at scale across all devices and digital mediums, and our people exist everywhere in the world (18000+ experts across 36 countries, to be exact). Our work culture is dynamic and non-hierarchical. We are looking for great new colleagues. That's where you come in!

Job Description

Requirements:
- Experience: 7+ years
- Extensive experience with the Azure cloud platform
- Good experience in maintaining cost-efficient, scalable cloud environments for the organization, following best practices for monitoring and cloud governance
- Experience with CI tools like Jenkins and building end-to-end CI/CD pipelines for projects
- Experience with build tools like Maven, Ant, or Gradle
- Rich experience with container frameworks like Docker, Kubernetes, or cloud-native container services
- Good experience in Infrastructure as Code (IaC) using tools like Terraform
- Good experience with at least one of the following CM tools: Ansible, Chef, SaltStack, Puppet
- Good experience with monitoring tools like Prometheus & Grafana, Nagios, DataDog, or Zabbix, and logging tools like Splunk or Logstash
- Good experience in scripting and automation using languages like Bash/Shell, Python, PowerShell, Groovy, and Perl
- Ability to configure and manage data sources like MySQL, Mongo, Elasticsearch, Redis, Cassandra, Hadoop, PostgreSQL, Neo4j, etc.
- Good experience managing version control tools like Git, SVN, or Bitbucket
- Good problem-solving ability and strong written and verbal communication skills

Responsibilities:
- Understanding the client's business use cases and technical requirements and converting them into a technical design that elegantly meets the requirements.
- Mapping decisions with requirements and translating the same to developers.
- Identifying different solutions and narrowing down the best option that meets the client's requirements.
- Defining guidelines and benchmarks for NFR considerations during project implementation.
- Writing and reviewing design documents explaining the overall architecture, framework, and high-level design of the application for the developers.
- Reviewing architecture and design on various aspects like extensibility, scalability, security, design patterns, user experience, NFRs, etc., and ensuring that all relevant best practices are followed.
- Developing and designing the overall solution for defined functional and non-functional requirements, and defining the technologies, patterns, and frameworks to materialize it.
- Understanding and relating technology integration scenarios and applying these learnings in projects.
- Resolving issues raised during code reviews through exhaustive, systematic analysis of the root cause, and justifying the decisions taken.
- Carrying out POCs to make sure that suggested designs/technologies meet the requirements.

Qualifications
Bachelor's or master's degree in Computer Science, Information Technology, or a related field.

Posted 1 month ago

Apply

0 years

0 Lacs

Mumbai Metropolitan Region

On-site


Position Overview

ABOUT APOLLO
Apollo is a high-growth, global alternative asset manager. In our asset management business, we seek to provide our clients excess return at every point along the risk-reward spectrum from investment grade to private equity, with a focus on three investing strategies: yield, hybrid, and equity. For more than three decades, our investing expertise across our fully integrated platform has served the financial return needs of our clients and provided businesses with innovative capital solutions for growth. Through Athene, our retirement services business, we specialize in helping clients achieve financial security by providing a suite of retirement savings products and acting as a solutions provider to institutions. Our patient, creative, and knowledgeable approach to investing aligns our clients, the businesses we invest in, our employees, and the communities we impact, to expand opportunity and achieve positive outcomes.

OUR PURPOSE AND CORE VALUES
Our clients rely on our investment acumen to help secure their future. We must never lose our focus and determination to be the best investors and most trusted partners on their behalf. We strive to be:
The leading provider of retirement income solutions to institutions, companies, and individuals.
The leading provider of capital solutions to companies. Our breadth and scale enable us to deliver capital for even the largest projects, and our small-firm mindset ensures we will be a thoughtful and dedicated partner to these organizations. We are committed to helping them build stronger businesses.
A leading contributor to addressing some of the biggest issues facing the world today, such as the energy transition, accelerating the adoption of new technologies, and social impact, where innovative approaches to investing can make a positive difference.

We are building a unique firm of extraordinary colleagues who:
Outperform expectations.
Challenge convention.
Champion opportunity.
Lead responsibly.
Drive collaboration.
As One Apollo team, we believe that doing great work and having fun go hand in hand, and we are proud of what we can achieve together.

Our Benefits
Apollo relies on its people to keep it a leader in alternative investment management, and the firm's benefit programs are crafted to offer meaningful coverage for both you and your family. Please reach out to your Human Capital Business Partner for more detailed information on specific benefits.

Position Overview
At Apollo, we are a global team of alternative investment managers passionate about delivering uncommon value to our investors and shareholders. With over 30 years of proven expertise across Private Equity, Credit, and Real Assets in various regions and industries, we are known for our integrated businesses, our strong investment performance, our value-oriented philosophy, and our people. We seek a Senior Engineer/Full Stack Developer to innovate, manage, direct, architect, design, and implement solutions focused on our trade operations and controller functions across Private Equity, Credit, and Real Assets. The ideal candidate is a well-rounded, hands-on engineer passionate about delivering quality software on the Java stack. Our Senior Engineer will work closely with key stakeholders in our Middle Office and Controllers teams and in the Credit and Opportunistic Technology teams to successfully deliver business requirements, projects, and programs.
The candidate will have proven skills in independently managing the full software development lifecycle, working with end-users, business analysts, and project managers to define and refine the problem statement, and delivering quality solutions on time. They will have the aptitude to quickly learn and embrace emerging technologies and proven methodologies to innovate and improve the correctness, quality, and timeliness of the solutions delivered by the team.

Primary Responsibilities
Contribute to the development of elegant solutions for systems that result in simple, extensible, maintainable, high-quality code.
Participate in design discussions, hands-on technical development, code reviews, quality assurance, observability, and product support.
Use technical knowledge of patterns and code to identify risks and prevent software defects.
Foster a culture of collaboration, disciplined software engineering practices, and a mindset to leave things better than you found them.
Optimize team processes to improve productivity and responsiveness to feedback and changing priorities.
Build strong relationships with key stakeholders, collaborate, and communicate effectively to reach successful outcomes.
Be passionate about delivering high-impact and breakthrough value to stakeholders.
Desire to learn the domain and deliver enterprise solutions at a higher velocity.
Contribute to deliverables from the early stages of requirement gathering through development, testing, UAT, deployment, and post-production.
Lead in the planning, execution, and delivery of the team's commitments.

Qualifications & Experience
Master's or bachelor's degree in Computer Science or another STEM field.
Experience with software development in the Alternative Asset Management or Investment Banking domain.
5+ years of software development experience in at least one of the following OO languages: Java, C++, or C#.
3+ years of Web 2.0 UI/UX development experience in at least one of the following frameworks using JavaScript/TypeScript: ExtJS, ReactJS, AngularJS, or Vue.
Hands-on development expertise in Java, Spring Boot, REST, Messaging, JPA, and SQL for the last 2+ years.
Hands-on development expertise in building applications using RESTful and microservices architecture.
Expertise in developing applications using TDD/BDD/ATDD, with hands-on experience with at least one of JUnit, Spring Test, TestNG, or Cucumber.
A strong understanding of SOLID principles, Design Patterns, and Enterprise Integration Patterns.
A strong understanding of relational databases, SQL, ER modeling, and ORM technologies.
A strong understanding of BPM and its application.
Hands-on experience with various CI/CD practices and tools such as Jenkins, Azure DevOps, TeamCity, etcetera.
Exceptional problem-solving and debugging skills.
Awareness of emerging application development methodologies, design patterns, and technologies.
Ability to quickly learn new and emerging technologies and adopt solutions from within the company or the open-source community.

Experience with the below will be a plus:
Buy-side operational and fund accounting processes.
Business processes and workflows using modern BPM/Low Code/No Code platforms (jBPM, Bonitasoft, Appian, Logic Apps, Unqork, etcetera…).
OpenAPI, GraphQL, gRPC, ESB, SOAP, WCF, Kafka, and Node.
Serverless architecture.
Microsoft Azure.
Designing and implementing microservices on AKS.
Azure DevOps.
Sencha platform.
NoSQL databases (MongoDB, Cosmos DB, Neo4j).
Python software development.
Functional programming paradigm.

Apollo provides equal employment opportunities regardless of age, disability, gender reassignment, marital or civil partner status, pregnancy or maternity, race, color, nationality, ethnic or national origin, religion or belief, veteran status, gender/sex or sexual orientation, or any other criterion or circumstance protected by applicable law, ordinance, or regulation. The above criteria are intended to be used as a guide only; candidates who do not meet all the above criteria may still be considered if they are deemed to have relevant experience/equivalent levels of skill or knowledge to fulfil the requirements of the role. Any job offer will be conditional upon and subject to satisfactory reference and background screening checks, all necessary corporate and regulatory approvals or certifications as required from time to time, and entering into definitive contractual documentation satisfactory to Apollo.

Posted 1 month ago

Apply

5 - 9 years

7 - 11 Lacs

Kochi, Coimbatore, Thiruvananthapuram

Work from Office


Job Title: Senior Data Engineer (Graph DB Specialist), Global Song
Management Level: 9 (Specialist)
Location: Kochi, Coimbatore
Must-have skills: Data Modeling Techniques and Methodologies
Good-to-have skills: Proficiency in Python and PySpark programming

Job Summary: We are seeking a highly skilled Data Engineer with expertise in graph databases to join our dynamic team. The ideal candidate will have a strong background in data engineering, graph querying languages, and data modeling, with a keen interest in leveraging cutting-edge technologies like vector databases and LLMs to drive functional objectives.

Your responsibilities will include:
Design, implement, and maintain ETL pipelines to prepare data for graph-based structures.
Develop and optimize graph database solutions using querying languages such as Cypher, SPARQL, or GQL; Neo4j experience is preferred (a minimal Cypher sketch follows this listing).
Build and maintain ontologies and knowledge graphs, ensuring efficient and scalable data modeling.
Integrate vector databases and implement similarity search techniques, with a focus on Retrieval-Augmented Generation (RAG) methodologies and GraphRAG.
Collaborate with data scientists and engineers to operationalize machine learning models and integrate them with graph databases.
Work with Large Language Models (LLMs) to achieve functional and business objectives.
Ensure data quality, integrity, and security while delivering robust and scalable solutions.
Communicate effectively with stakeholders to understand business requirements and deliver solutions that meet objectives.

Roles & Responsibilities:
Experience: At least 5 years of hands-on experience in data engineering, including 2 years working with graph databases.
Programming and Querying: Advanced knowledge of the Cypher, SPARQL, or GQL querying languages.
ETL Processes: Expertise in designing and optimizing ETL processes for graph structures.
Data Modeling: Strong skills in creating ontologies and knowledge graphs, and presenting data for GraphRAG-based solutions.
Vector Databases: Understanding of similarity search techniques and RAG implementations.
LLMs: Experience working with Large Language Models for functional objectives.
Communication: Excellent verbal and written communication skills.
Cloud Platforms: Experience with Azure analytics platforms, including Function Apps, Logic Apps, and Azure Data Lake Storage (ADLS).
Graph Analytics: Familiarity with graph algorithms and analytics.
Agile Methodology: Hands-on experience working in Agile teams and processes.
Machine Learning: Understanding of machine learning models and their implementation.

Qualifications
Experience: Minimum 5-10 years of experience is required.
Educational Qualification: Any graduation / BE / B.Tech
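As a hedged illustration of the Cypher work above, a minimal sketch using the official neo4j Python driver; the connection details and the Customer/PURCHASED/Product data model are hypothetical, not a client schema:

from neo4j import GraphDatabase

# Hypothetical local instance and credentials.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

CYPHER = """
MATCH (c:Customer)-[:PURCHASED]->(p:Product {name: $product})
RETURN c.name AS customer
ORDER BY customer
"""

with driver.session() as session:
    # Parameterised queries avoid string concatenation and enable plan caching.
    for record in session.run(CYPHER, product="Widget"):
        print(record["customer"])

driver.close()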

Posted 1 month ago

Apply

8 - 12 years

17 - 22 Lacs

Mumbai, Hyderabad

Work from Office


Principal Data Scientist - NAV02CM
Company: Worley
Primary Location: IND-MM-Navi Mumbai
Other Locations: IND-KR-Bangalore, IND-MM-Mumbai, IND-MM-Pune, IND-TN-Chennai, IND-GJ-Vadodara, IND-AP-Hyderabad, IND-WB-Kolkata
Job: Digital Platforms & Data Science
Schedule: Full-time
Employment Type: Employee
Job Level: Experienced
Job Posting: May 8, 2025
Unposting Date: Jun 7, 2025
Reporting Manager Title: Head of Data Intelligence

Building on our past. Ready for the future.
Worley is a global professional services company of energy, chemicals and resources experts. We partner with customers to deliver projects and create value over the life of their assets. We're bridging two worlds, moving towards more sustainable energy sources, while helping to provide the energy, chemicals and resources needed now.

The Role
As a Data Science Lead with Worley, you will work closely with our existing team to deliver projects for our clients while continuing to develop your skills and experience.
Conceptualise, build, and manage an AI/ML platform (with more focus on unstructured data) by evaluating and selecting best-in-industry AI/ML tools and frameworks.
Drive and take ownership of developing cognitive solutions for internal stakeholders and external customers.
Conduct research in areas like Explainable AI, image segmentation, 3D object detection, and statistical methods.
Evaluate not only algorithms and models but also the tools and technologies available in the market to maximize organizational spend.
Utilize existing frameworks, standards, and patterns to create the architectural foundation and services necessary for AI/ML applications that scale from multi-user to enterprise class.
Analyse marketplace trends (economic, social, cultural, and technological) to identify opportunities and create value propositions.
Offer a global perspective in stakeholder discussions and when shaping solutions/recommendations.

IT Skills & Experience
Thorough understanding of the complete AI/ML project life cycle, to establish processes and provide guidance and expert support to the team.
Expert knowledge of emerging technologies in Deep Learning and Reinforcement Learning.
Knowledge of MLOps processes for efficient management of AI/ML projects.
Must have led project execution with other data scientists/engineers on large and complex data sets.
Understanding of machine learning algorithms such as k-NN, GBM, Neural Networks, Naive Bayes, SVM, and Decision Forests (see the sketch after this listing).
Experience with AI/ML components like JupyterHub, Zeppelin Notebook, Azure ML Studio, Spark MLlib, TensorFlow, Keras, PyTorch, and scikit-learn.
Strong knowledge of deep learning with special focus on CNN/R-CNN/LSTM/Encoder/Transformer architectures.
Hands-on experience with large networks like Inception-ResNets and ResNeXt-50.
Demonstrated capability using RNNs for text and speech data, and generative models.
Working knowledge of NoSQL (GraphX/Neo4j), document, columnar, and in-memory database models.
Working knowledge of ETL tools and techniques, such as Talend, SAP BI Platform/SSIS, or MapReduce.
Experience in building KPI/storytelling dashboards on visualization tools like Tableau/Zoomdata.

People Skills
Professional and open communication with all internal and external interfaces. Ability to communicate clearly and concisely, and a flexible mindset to handle a quickly changing culture. Strong analytical skills.

Industry Specific Experience
10-18 years of experience in AI/ML project execution and AI/ML research.

Education Qualifications, Accreditation, Training
Master's or Doctorate degree in Computer Science/Engineering, Information Technology, or Artificial Intelligence.

Moving forward together
We're committed to building a diverse, inclusive and respectful workplace where everyone feels they belong, can bring themselves, and are heard. We provide equal employment opportunities to all qualified applicants and employees without regard to age, race, creed, color, religion, sex, national origin, ancestry, disability status, veteran status, sexual orientation, gender identity or expression, genetic information, marital status, citizenship status or any other basis as protected by law.
We want our people to be energized and empowered to drive sustainable impact. So, our focus is on a values-inspired culture that unlocks brilliance through belonging, connection and innovation. And we're not just talking about it; we're doing it. We're reskilling our people, leveraging transferable skills, and supporting the transition of our workforce to become experts in today's low carbon energy infrastructure and technology.
Whatever your ambition, there's a path for you here. And there's no barrier to your potential career success. Join us to broaden your horizons, explore diverse opportunities, and be part of delivering sustainable change.
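As a hedged illustration of the classical algorithms named above (k-NN and a gradient-boosted model), a minimal scikit-learn comparison on a toy dataset; this is purely illustrative and unrelated to Worley's data:

from sklearn.datasets import load_iris
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

for model in (KNeighborsClassifier(n_neighbors=5),
              GradientBoostingClassifier(random_state=0)):
    # 5-fold cross-validation gives a quick, honest accuracy estimate.
    scores = cross_val_score(model, X, y, cv=5)
    print(type(model).__name__, round(scores.mean(), 3))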

Posted 1 month ago

Apply

3 - 5 years

6 - 10 Lacs

Gurugram

Work from Office


Position Summary: A Data Engineer designs and maintains scalable data pipelines and storage systems, with a focus on integrating and processing knowledge graph data for semantic insights. They enable efficient data flow, ensure data quality, and support analytics and machine learning by leveraging advanced graph-based technologies.

How You'll Make an Impact (responsibilities of role)
Build and optimize ETL/ELT pipelines for knowledge graphs and other data sources.
Design and manage graph databases (e.g., Neo4j, Amazon Neptune, ArangoDB).
Develop semantic data models using RDF, OWL, and SPARQL (a minimal sketch follows this listing).
Integrate structured, semi-structured, and unstructured data into knowledge graphs.
Ensure data quality, security, and compliance with governance standards.
Collaborate with data scientists and architects to support graph-based analytics.

What You Bring (required qualifications and skills)
Bachelor's/Master's in Computer Science, Data Science, or related fields.
Experience: 3+ years of experience in data engineering, with knowledge graph expertise.
Proficiency in Python, SQL, and graph query languages (SPARQL, Cypher).
Experience with graph databases and frameworks (Neo4j, GraphQL, RDF).
Knowledge of cloud platforms (AWS, Azure).
Strong problem-solving and data modeling skills.
Excellent communication skills, with the ability to convey complex concepts to non-technical stakeholders.
The ability to work collaboratively in a dynamic team environment across the globe.
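As a hedged illustration of the RDF/SPARQL modeling above, a minimal rdflib sketch; the example.org namespace and Product class are hypothetical stand-ins for a real ontology:

from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/kg/")  # hypothetical namespace

g = Graph()
g.add((EX.widget1, RDF.type, EX.Product))
g.add((EX.widget1, EX.hasName, Literal("Widget")))

QUERY = """
PREFIX ex: <http://example.org/kg/>
SELECT ?name WHERE {
    ?p a ex:Product ;
       ex:hasName ?name .
}
"""

# Each result row is a tuple of the selected variables.
for (name,) in g.query(QUERY):
    print(name)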

Posted 1 month ago

Apply

2 - 3 years

4 - 8 Lacs

Bengaluru

Work from Office


At least 2 to 3 years of experience in NodeJS, TypeScript, and React is required.
Proven experience in building, deploying, maintaining, and scaling APIs and microservices.

Job Responsibilities
Solid experience in NodeJS, TypeScript, React, Neo4j, and Firestore (GCP).
In-depth knowledge of software design and development practices.
Design and develop scalable systems using advanced concepts in NodeJS, TypeScript, JavaScript, and React.
Good understanding of deploying to and working with GKE.
Ability to design for scale and performance, and to conduct peer code reviews.
Architecture/platform development, APIs, and data modelling at scale.
Excellent working experience in Express, Knex, and serverless Google Cloud Functions.
Solid experience in JavaScript frameworks (Angular/React.js), Redux, JavaScript, jQuery, CSS, HTML5, ES5, ES6 & ES7, in-memory databases (Redis/Hazelcast), and build tools (Webpack).
Good error- and exception-handling skills.
Ability to work with Git repositories and remote code hosting services like GitHub and GitLab.
Ability to deliver amazing results with minimal guidance and supervision.
Passionate (especially about web development!), highly motivated, and fun to work with.

Keywords: Full Stack, React, ES6, Node & Express.js, REST API

Posted 1 month ago

Apply

3 - 6 years

7 - 11 Lacs

Hyderabad

Work from Office


Sr Semantic Engineer – Research Data and Analytics

What you will do
Let's do this. Let's change the world. In this vital role you will join Research's Semantic Graph Team, which is seeking a dedicated and skilled Semantic Data Engineer to build and optimize knowledge graph-based software and data resources. This role primarily focuses on working with technologies such as RDF, SPARQL, and Python. In addition, the position involves semantic data integration and cloud-based data engineering. The ideal candidate should have experience in the pharmaceutical or biotech industry, deep technical skills, proficiency with big data technologies, and demonstrated experience in semantic modeling. A deep understanding of data architecture and ETL processes is also essential for this role.
In this role, you will be responsible for constructing semantic data pipelines, integrating both relational and graph-based data sources, ensuring seamless data interoperability, and leveraging cloud platforms to scale data solutions effectively.

Roles & Responsibilities:
Develop and maintain semantic data pipelines using Python, RDF, SPARQL, and linked data technologies (a minimal relational-to-RDF sketch follows this listing).
Develop and maintain semantic data models for biopharma scientific data.
Integrate relational databases (SQL, PostgreSQL, MySQL, Oracle, etc.) with semantic frameworks.
Ensure interoperability across federated data sources, linking relational and graph-based data.
Implement and optimize CI/CD pipelines using GitLab and AWS.
Leverage cloud services (AWS Lambda, S3, Databricks, etc.) to support scalable knowledge graph solutions.
Collaborate with global multi-functional teams, including research scientists, Data Architects, Business SMEs, Software Engineers, and Data Scientists, to understand data requirements, design solutions, and develop end-to-end data pipelines to meet fast-paced business needs across geographic regions.
Collaborate with data scientists, engineers, and domain experts to improve research data accessibility.
Adhere to standard processes for coding, testing, and designing reusable code/components.
Explore new tools and technologies to improve ETL platform performance.
Participate in sprint planning meetings and provide estimations on technical implementation.
Maintain comprehensive documentation of processes, systems, and solutions.
Harmonize research data to appropriate taxonomies, ontologies, and controlled vocabularies for context and reference knowledge.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications and Experience:
Doctorate degree, OR
Master's degree with 4-6 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field, OR
Bachelor's degree with 6-8 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field, OR
Diploma with 10-12 years of experience in Computer Science, IT, Computational Chemistry, Computational Biology/Bioinformatics, or a related field.

Preferred Qualifications and Experience:
6+ years of experience in designing and supporting biopharma scientific research data analytics (software platforms).

Functional Skills:
Must-Have Skills:
Advanced Semantic and Relational Data Skills: Proficiency in Python, RDF, SPARQL, graph databases (e.g., AllegroGraph), SQL, relational databases, ETL pipelines, big data technologies (e.g., Databricks), semantic data standards (OWL, W3C, FAIR principles), ontology development, and semantic modeling practices.
Cloud and Automation Expertise: Good experience using cloud platforms (preferably AWS) for data engineering, along with Python for automation, data federation techniques, and model-driven architecture for scalable solutions.
Technical Problem-Solving: Excellent problem-solving skills with hands-on experience in test automation frameworks (pytest), scripting tasks, and handling large, complex datasets.

Good-to-Have Skills:
Experience in biotech/drug discovery data engineering.
Experience applying knowledge graphs, taxonomy, and ontology concepts in the life sciences and chemistry domains.
Experience with graph databases (AllegroGraph, Neo4j, GraphDB, Amazon Neptune).
Familiarity with Cypher, GraphQL, or other graph query languages.
Experience with big data tools (e.g., Databricks).
Experience in biomedical or life sciences research data management.

Soft Skills:
Excellent critical-thinking and problem-solving skills.
Good communication and collaboration skills.
Demonstrated awareness of how to function in a team setting.
Demonstrated presentation skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
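As a hedged illustration of the relational-to-semantic integration described above, a minimal sketch that lifts rows from a relational table into RDF triples; the table, columns, and namespace are hypothetical, and a production pipeline would map to a curated biopharma ontology:

import sqlite3

from rdflib import Graph, Literal, Namespace, RDF

BIO = Namespace("http://example.org/biopharma/")  # hypothetical namespace

# Stand-in relational source; a real pipeline would read PostgreSQL/Oracle.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE compound (id INTEGER, name TEXT)")
conn.execute("INSERT INTO compound VALUES (1, 'aspirin')")

g = Graph()
for cid, name in conn.execute("SELECT id, name FROM compound"):
    subject = BIO[f"compound/{cid}"]
    g.add((subject, RDF.type, BIO.Compound))
    g.add((subject, BIO.hasName, Literal(name)))

print(g.serialize(format="turtle"))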

Posted 1 month ago

Apply

4 - 6 years

11 - 15 Lacs

Hyderabad

Work from Office


Data Engineering Manager

What you will do
Let's do this. Let's change the world. In this vital role you will lead a team of data engineers to build, optimize, and maintain scalable data architectures, data pipelines, and operational frameworks that support real-time analytics, AI-driven insights, and enterprise-wide data solutions. As a strategic leader, the ideal candidate will drive best practices in data engineering, cloud technologies, and Agile development, ensuring robust governance, data quality, and efficiency. The role requires technical expertise, team leadership, and a deep understanding of cloud data solutions to optimize data-driven decision-making.
Lead and mentor a team of data engineers, fostering a culture of innovation, collaboration, and continuous learning for solving complex problems of the R&D division.
Oversee the development of data extraction, validation, and transformation techniques, ensuring ingested data is of high quality and compatible with downstream systems.
Guide the team in writing and validating high-quality code for data ingestion, processing, and transformation, ensuring resiliency and fault tolerance (a minimal PySpark sketch follows this listing).
Drive the development of data tools and frameworks for running and accessing data efficiently across the organization.
Oversee the implementation of performance monitoring protocols across data pipelines, ensuring real-time visibility, alerts, and automated recovery mechanisms.
Coach engineers in building dashboards and aggregations to monitor pipeline health and detect inefficiencies, ensuring optimal performance and cost-effectiveness.
Lead the implementation of self-healing solutions, reducing failure points and improving pipeline stability and efficiency across multiple product features.
Oversee data governance strategies, ensuring compliance with security policies, regulations, and data accessibility best practices.
Guide engineers in data modeling, metadata management, and access control, ensuring structured data handling across various business use cases.
Collaborate with business leaders, product owners, and cross-functional teams to ensure alignment of data architecture with product requirements and business objectives.
Prepare team members for key partner discussions by helping assess data costs, access requirements, dependencies, and availability for business scenarios.
Drive Agile and Scaled Agile (SAFe) methodologies, handling sprint backlogs, prioritization, and iterative improvements to enhance team velocity and project delivery.
Stay up to date with emerging data technologies, industry trends, and best practices, ensuring the organization uses the latest innovations in data engineering and architecture.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients. We are seeking a seasoned Engineering Manager (Data Engineering) to drive the development and implementation of our data strategy, with deep expertise in the R&D domain of Biotech or Pharma.

Basic Qualifications:
Doctorate degree, OR Master's degree and 4 to 6 years of experience in Computer Science, IT or a related field, OR Bachelor's degree and 6 to 8 years of experience in Computer Science, IT or a related field, OR Diploma and 10 to 12 years of experience in Computer Science, IT or a related field.
Experience leading a team of data engineers in the R&D domain of biotech/pharma companies.
Experience architecting and building data and analytics solutions that extract, transform, and load data from multiple source systems.
Data engineering experience in R&D for the biotechnology or pharma industry.
Demonstrated hands-on experience with cloud platforms (AWS) and the ability to architect cost-effective and scalable data solutions.
Proficiency in Python, PySpark, and SQL.
Experience with dimensional data modeling.
Experience working with Apache Spark and Apache Airflow.
Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.
Experience with AWS, GCP, or Azure cloud services.
Understanding of the end-to-end project/product life cycle.
Well versed in full stack development, DataOps automation, logging frameworks, and pipeline orchestration tools.
Strong analytical and problem-solving skills to address complex data challenges.
Effective communication and interpersonal skills to collaborate with cross-functional teams.

Preferred Qualifications:
AWS Certified Data Engineer preferred.
Databricks certification preferred.
Scaled Agile SAFe certification preferred.
Project Management certifications preferred.
Data Engineering Management experience in Biotech/Pharma is a plus.
Experience using graph databases such as Stardog, MarkLogic, Neo4j, or AllegroGraph.

Soft Skills:
Excellent analytical and troubleshooting skills.
Strong verbal and written communication skills.
Ability to work effectively with global, virtual teams.
High degree of initiative and self-motivation.
Ability to handle multiple priorities successfully.
Team-oriented, with a focus on achieving team goals.
Strong presentation and public speaking skills.

What you can expect of us
As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards. Apply now and make a lasting impact with the Amgen team. careers.amgen.com
As an organization dedicated to improving the quality of life for people around the world, Amgen fosters an inclusive environment of diverse, ethical, committed and highly accomplished people who respect each other and live the Amgen values to continue advancing science to serve patients. Together, we compete in the fight against serious disease. Amgen is an Equal Opportunity employer and will consider all qualified applicants for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability status, or any other basis protected by applicable law. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
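As a hedged illustration of the ingestion-and-validation work described above, a minimal PySpark sketch; the assay columns and pass threshold are hypothetical stand-ins, not Amgen's rules:

from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("rd_assay_etl").getOrCreate()

raw = spark.createDataFrame(
    [("assay-1", 0.92), ("assay-2", None), ("assay-3", 0.41)],
    ["assay_id", "score"],
)

clean = (
    raw.dropna(subset=["score"])                     # basic data-quality gate
       .withColumn("passed", F.col("score") >= 0.5)  # hypothetical threshold
)
clean.show()
spark.stop()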

Posted 1 month ago

Apply

8 - 10 years

18 - 20 Lacs

Pune

Work from Office


Technical Skill Set:

1. Front-End Technologies:
- Strong experience with HTML5, CSS3, and JavaScript.
- Proficiency in front-end frameworks such as React, Angular, or Vue.js.
- Knowledge of responsive design and cross-browser compatibility.
- Familiarity with front-end build tools (Webpack, Gulp, etc.).

2. Back-End Technologies:
- Proficient in one or more back-end programming languages such as Node.js, Python, or Java.
- Experience with server-side frameworks (Express.js, Django, Spring, GraphQL, etc.).
- Strong knowledge of RESTful API and GraphQL design and development (a minimal REST sketch follows this listing).
- Strong experience in Azure Cloud web services.
- Experience in Kubernetes development and deployment.

3. Databases:
- Proficiency in relational databases (SQL Server, PostgreSQL, etc.).
- Knowledge of NoSQL databases (MongoDB, Neo4j, Cosmos DB, Redis, etc.).
- Strong SQL skills and ability to write optimized queries.

4. Version Control:
- Experience with Git for version control, including branching, merging, and pull requests.
- Familiarity with Git workflows such as GitFlow or trunk-based development.

5. Deployment & DevOps:
- Experience with CI/CD tools such as Jenkins, GitLab CI, or CircleCI.
- Familiarity with containerization technologies like Docker and container orchestration platforms like Kubernetes.
- Knowledge of cloud platforms (AWS, Azure, GCP) for hosting and deploying applications.

6. Testing & Debugging:
- Knowledge of testing frameworks and tools like Jest, Mocha, or Jasmine.
- Experience with test-driven development (TDD) and writing unit and integration tests.
- Familiarity with debugging tools and strategies.

7. Agile Methodology:
- Experience working in Agile development environments, participating in Scrum ceremonies (stand-ups, sprint planning, etc.).
- Familiarity with project management tools like Jira, Trello, or Asana.

8. Additional Skills:
- Strong problem-solving skills and ability to think critically.
- Good understanding of web security best practices (e.g., OWASP Top 10).
- Ability to work in a collaborative, team-oriented environment.
- Strong communication skills and ability to articulate technical concepts to non-technical stakeholders.

Preferred Qualifications:
- 8-10 years of hands-on experience as a full-stack developer.
- Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience).
- Familiarity with additional technologies or frameworks like React, Vue.js, Svelte, etc.
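As a hedged illustration of the RESTful API skill above, a minimal sketch in one of the back-end languages the listing names (Python, via FastAPI); the /items route and Item model are hypothetical:

from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Item(BaseModel):
    name: str
    price: float

ITEMS: dict[int, Item] = {}  # in-memory store; a real service would use a database

@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item) -> Item:
    ITEMS[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    return ITEMS[item_id]

Run locally with: uvicorn main:app --reload (assuming the file is saved as main.py).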

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies