Home
Jobs

361 Neo4j Jobs - Page 13

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

0.0 - 3.0 years

2 - 5 Lacs

Bengaluru

Work from Office

Source: Naukri

Key Responsibilities:
- Deliver engaging, interactive training sessions (24 hours total) based on structured modules.
- Teach integration of monitoring, logging, and observability tools with machine learning.
- Guide learners in real-time anomaly detection, incident management, root cause analysis, and predictive scaling.
- Support learners in deploying tools like Prometheus, Grafana, OpenTelemetry, Neo4j, Falco, and KEDA.
- Conduct hands-on labs using LangChain, Ollama, Prophet, and other AI/ML frameworks (a minimal anomaly-detection sketch follows this listing).
- Help participants set up smart workflows for alert classification and routing using open-source stacks.
- Prepare learners to handle security, threat detection, and runtime anomaly classification using LLMs.
- Provide post-training support and mentorship when necessary.
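The labs above pair observability data with forecasting models such as Prophet. As a rough illustration of that anomaly-detection step (not taken from the posting; the file name and column names are assumptions), a metric series can be flagged against Prophet's forecast interval like this:

```python
# Sketch: flag anomalies in a metric series with Prophet, as in the labs
# described above. Assumes a CSV with columns "ds" (timestamp) and
# "y" (metric value); the file name is hypothetical.
import pandas as pd
from prophet import Prophet

df = pd.read_csv("cpu_usage.csv")  # e.g., an export of a Prometheus series
df["ds"] = pd.to_datetime(df["ds"])

model = Prophet(interval_width=0.99)  # wide band: only clear outliers flagged
model.fit(df)

forecast = model.predict(df[["ds"]])
merged = df.merge(forecast[["ds", "yhat_lower", "yhat_upper"]], on="ds")

# Points outside the 99% forecast interval are treated as anomalies.
anomalies = merged[(merged["y"] < merged["yhat_lower"]) |
                   (merged["y"] > merged["yhat_upper"])]
print(anomalies[["ds", "y"]])
```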

Posted 1 month ago

Apply

8.0 years

0 Lacs

Delhi, India

On-site

Source: LinkedIn

About Neo4j
Neo4j is the leader in Graph Database & Analytics, helping organizations uncover hidden patterns and relationships across billions of data connections deeply, easily, and quickly. Customers use Neo4j to gain a deeper understanding of their business and reveal new ways of solving their most pressing problems. Over 84% of Fortune 100 companies use Neo4j, along with a vibrant community of 250,000+ developers, data scientists, and architects across the globe.

At Neo4j, we're proud to build the technology that powers breakthrough solutions for our customers. These solutions have helped NASA get to Mars two years earlier, helped the ICIJ break the Panama Papers, and are helping Transport for London cut congestion by 10% and save $750M a year. Some of our other notable customers include Intuit, Lockheed Martin, Novartis, UBS, and Walmart.

Neo4j experienced rapid growth this year as organizations looking to deploy generative AI (GenAI) recognized graph databases as essential for improving its accuracy, transparency, and explainability. Growth was further fueled by enterprise demand for Neo4j's cloud offering and partnerships with leading cloud hyperscalers and ecosystem leaders. Learn more at neo4j.com and follow us on LinkedIn.

Our Vision
At Neo4j, we have always strived to help the world make sense of data. As business, society, and knowledge become increasingly connected, our technology promotes innovation by helping organizations find and understand data relationships. We created, drive, and lead the graph database category, and we're disrupting how organizations leverage their data to innovate and stay competitive.

The Role
- Develop and execute a territory plan based on target agencies and applicable use cases, building a pipeline of opportunities in the target market that helps you achieve quarterly and annual sales metrics.
- Develop expert knowledge of Neo4j solutions and their applicability in the target market, covering Government and Enterprise accounts.
- Develop and present to customers a strong understanding of the benefits and advantages of graph technology.
- Execute sales cycles that employ Strategic Selling strategies and tactics.
- Build and present proposals for solutions that involve Neo4j products and services.
- Work with Pre-Sales Engineering resources to scope and deliver on customer needs.
- "Land & Expand": grow the existing account base with a strategic, customer-first methodology.
- Provide guidance, direction, and support to your assigned SDR in their efforts to support your pipeline development.
- Ensure the execution of strategies for assigned key accounts to increase revenue potential and growth.
- Collaborate with Field Marketing on programs that increase awareness within the existing customer base, resulting in revenue growth.
- Maintain the Neo4j Salesforce.com CRM system with accurate pipeline information, in accordance with Neo4j forecasting guidelines.

Ideally, You Should Have
- 8-10 years of consistent success meeting or exceeding sales objectives selling technical solutions and software products into Government and Enterprise accounts.
- Demonstrable experience executing complex enterprise sales strategies and tactics.
- Experience with the commercial open-source business model, selling subscriptions for on-premise and/or hybrid on-prem/cloud deployments.
- Previous experience thriving in a smaller, high-growth software company, where you leveraged dedicated SDR, Field Marketing, and Pre-Sales Engineering resources to help build the business.
- Strong conviction about how and where graph solutions fit into the enterprise marketplace.
- Attention to detail, ensuring accurate entry and management of lead data in the Salesforce.com CRM system.
- Proficiency with standard corporate productivity tools (e.g., Google Docs, MS Office, Salesforce.com, web conferencing).
- A team-player mindset with the highest level of integrity.

Posted 1 month ago

Apply

4.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Role Description
Job Title: Lead I - Software Engineering
Hiring Location: Mumbai/Chennai/Gurgaon

Job Summary
We are seeking a Lead I in Software Engineering with 4 to 7 years of experience in software development or software architecture. The ideal candidate will possess a strong background in Angular and Java, with the ability to lead a team and drive technical projects. A Bachelor's degree in Engineering or Computer Science, or equivalent experience, is required.

Responsibilities
- Interact with technical personnel and team members to finalize requirements.
- Write and review detailed specifications for the development of system components of moderate complexity.
- Collaborate with QA and development team members to translate product requirements into software designs.
- Implement development processes and coding best practices, and conduct code reviews.
- Operate in various development environments (Agile, Waterfall) while collaborating with key stakeholders.
- Resolve technical issues as necessary.
- Perform all other duties as assigned.

Must-Have Skills
- Strong proficiency in Angular 1.x (70% Angular and 30% Java, or 50% Angular and 50% Java).
- Java/J2EE; familiarity with the Singleton and MVC design patterns.
- Strong proficiency in SQL and/or MySQL, including optimization techniques (at least MySQL).
- Experience with tools such as Eclipse, Git, Postman, Jira, and Confluence.
- Knowledge of test-driven development.
- Solid understanding of object-oriented programming.

Good-to-Have Skills
- Expertise in Spring Boot, microservices, and API development.
- Familiarity with OAuth 2.0 patterns (experience with at least two patterns).
- Knowledge of graph databases (e.g., Neo4j, Apache TinkerPop, Gremlin).
- Experience with Kafka messaging.
- Familiarity with Docker, Kubernetes, and cloud development.
- Experience with CI/CD tools like Jenkins and GitHub Actions.
- Knowledge of industry-wide technology trends and best practices.

Experience Range
4 to 7 years of relevant experience in software development or software architecture.

Education
Bachelor's degree in Engineering, Computer Science, or equivalent experience.

Additional Information
- Strong communication skills, both oral and written.
- Ability to interface competently with internal and external technology resources.
- Advanced knowledge of software development methodologies (Agile, etc.).
- Experience setting up and maintaining distributed applications in Unix/Linux environments.
- Ability to complete complex bug fixes and support production issues.

Skills: Angular 1.x, Java 11+, SQL

Posted 1 month ago

Apply

3.0 years

0 Lacs

Pune, Maharashtra, India

Remote

Source: LinkedIn

Our team members are at the heart of everything we do. At Cencora, we are united in our responsibility to create healthier futures, and every person here is essential to us being able to deliver on that purpose. If you want to make a difference at the center of health, come join our innovative company and help us improve the lives of people and animals everywhere. Apply today!

Job Details

Primary Duties & Responsibilities
- Works with cross-functional stakeholders to finalize desired technical specifications and application design.
- Codes, tests, debugs, and documents complex programs, and enhances existing programs to ensure that data processing production systems continue to meet user requirements.
- Develops and maintains application designs, program specification documents, and proprietary web applications.
- Contributes effectively as a member of the team; takes ownership of individual assignments and projects with moderate oversight.
- Manages and updates the issue-tracking system when gaps in code and documentation are discovered.
- May design and develop software for external clients.
- Works with the project lead and internal stakeholders to formulate the product and sprint backlog.
- Develops detailed system design specifications to serve as a guide for system/program development.
- Identifies and resolves system operating problems to provide continuous business operations.
- Interacts with user management regarding project status and user requirements to promote an environment with improved productivity and satisfaction.
- Provides technical leadership and training to Software Engineers I.
- Assists in scheduling, determining staffing requirements, and estimating costs to project completion to meet user requirements.
- Develops new control applications from a set of specifications and tests new and modified control applications.
- Provides remote support for field personnel as they install and troubleshoot new applications.
- Provides on-site support for some scheduled installations and upgrades, and end-user support, primarily concerning application issues.
- Creates documentation for configurations and for how to implement and test the applications.

Skills And Experience
- Full-stack developer proficient in React, Vue.js, .NET, and ASP.NET, including building and supporting APIs (.NET/C#).
- Experience in UX design and development, including best practices such as WCAG.
- Must demonstrate proficiency in secure coding practices.
- Proficiency in SQL is highly desired.
- Familiarity with Sitecore is a plus.
- Must be comfortable working with a global team of IT members, business stakeholders, contractors, and vendor partners.
- Prior experience delivering software solutions for health, transportation, or other regulated industries is a plus.

Experience & Educational Requirements
Bachelor's degree in Computer Science, Information Technology, or any other related discipline, or equivalent related experience. 3+ years of directly related or relevant experience, preferably in software design and development.

Preferred Certifications
- Android Development Certification
- Microsoft ASP.NET Certification
- Microsoft Certified Engineer
- Application/Infrastructure/Enterprise Architect training and certification (e.g., TOGAF)
- Certified Scrum Master
- SAFe Agile Certification
- DevOps certifications such as AWS Certified DevOps Engineer

Skills & Knowledge
Behavioral Skills: Critical Thinking, Detail Oriented, Interpersonal Communication, Learning Agility, Problem Solving, Time Management
Technical Skills: API Design, Cloud Computing Methodologies, Integration Testing & Validation, Programming/Coding, Database Management, Software Development Life Cycle (SDLC), Technical Documentation, Web Application Infrastructure, Web Development Frameworks
Tools Knowledge: Cloud computing tools (AWS, Azure, Google Cloud); container management and orchestration tools; big data frameworks (Hadoop); Java frameworks (JDBC, Spring, ORM solutions, JPA, JEE, JMS, Gradle) and object-oriented design; Microsoft Office Suite; NoSQL database platforms (MongoDB, BigTable, Redis, RavenDB, Cassandra, HBase, Neo4j, CouchDB); programming languages (JavaScript, HTML/CSS, Python, SQL); operating systems and servers (Windows, Linux, Citrix, IBM, Oracle, SQL)

What Cencora Offers
Benefit offerings outside the US may vary by country and will be aligned to local market practice. The eligibility and effective date may differ for some benefits and for team members covered under collective bargaining agreements.

Full time

Affiliated Companies: AmerisourceBergen Services Corporation

Equal Employment Opportunity
Cencora is committed to providing equal employment opportunity without regard to race, color, religion, sex, sexual orientation, gender identity, genetic information, national origin, age, disability, veteran status or membership in any other class protected by federal, state or local law. The company's continued success depends on the full and effective utilization of qualified individuals. Therefore, harassment is prohibited and all matters related to recruiting, training, compensation, benefits, promotions and transfers comply with equal opportunity principles and are non-discriminatory. Cencora is committed to providing reasonable accommodations to individuals with disabilities during the employment process, consistent with legal requirements. If you wish to request an accommodation while seeking employment, please call 888.692.2272 or email hrsc@cencora.com. We will make accommodation determinations on a request-by-request basis. Messages and emails regarding anything other than accommodation requests will not be returned.

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Description

ABOUT CLOUDBEES
CloudBees provides the leading software delivery platform for enterprises, enabling them to continuously innovate, compete, and win in a world powered by the digital experience. Designed for the world's largest organizations with the most complex requirements, CloudBees enables software development organizations to deliver scalable, compliant, governed, and secure software from the code a developer writes to the people who use it. The platform connects with other best-of-breed tools, improves the developer experience, and enables organizations to bring digital innovation to life continuously, adapt quickly, and unlock business outcomes that create market leaders and disruptors. CloudBees was founded in 2010 and is backed by Goldman Sachs, Morgan Stanley, Bridgepoint Credit, HSBC, Golub Capital, Delta-v Capital, Matrix Partners, and Lightspeed Venture Partners. Visit www.cloudbees.com and follow us on Twitter, LinkedIn, and Facebook.

WHAT YOU'LL DO
These are some of the tasks you'll be engaged on:
- Design, develop, and maintain automated test scripts using Playwright with TypeScript/JavaScript, as well as Selenium with Java, to ensure comprehensive test coverage across applications.
- Enhance the existing Playwright framework by implementing modular test design and optimizing performance, while also utilizing Cucumber for Behavior-Driven Development (BDD) scenarios.
- Execute functional, regression, integration, performance, and security testing of web applications, APIs, and microservices.
- Collaborate in an Agile environment, participating in daily stand-ups, sprint planning, and retrospectives to ensure alignment on testing strategies and workflows.
- Troubleshoot and analyze test failures and defects using debugging tools and techniques, including logging and tracing within Playwright, Selenium, Postman, Grafana, etc.
- Document and report test results, defects, and issues using Jira and Confluence, ensuring clarity and traceability for all test activities.
- Implement page object models and reusable test components in both Playwright and Selenium to promote code reusability and maintainability (a minimal page-object sketch follows this listing).
- Integrate automated tests into CI/CD pipelines using Jenkins and GitHub Actions, ensuring seamless deployment and testing processes.
- Collaborate on Git for version control, managing branches and pull requests to maintain code quality and facilitate teamwork.
- Mentor and coach junior QA engineers on best practices for test automation, Playwright and Selenium usage, and CI/CD workflows.
- Research and evaluate new tools and technologies to enhance testing processes and coverage.

WHAT DO YOU NEED TO SHINE IN THIS ROLE?
- Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
- At least 5 years of experience in software testing, with at least 3 years in test automation.
- Ability to write functional tests, test plans, and test strategies.
- Ability to configure test environments and test data using automation tools.
- Experience creating an automated regression/CI test suite using Cucumber with Playwright (preferred) or Selenium, and REST APIs.
- Proficiency in one or more programming languages: Java, JavaScript, or TypeScript.
- Experience testing web applications, APIs, and microservices using tools and frameworks such as Selenium, Cucumber, etc.
- Experience testing with SAST/DAST tools (preferred).
- Experience working with cloud platforms such as AWS, Azure, GCP, etc.
- Experience working with CI/CD tools such as Jenkins, GitLab, GitHub, etc.
- Experience writing queries and working with databases such as MySQL, MongoDB, Neo4j, Cassandra, etc.
- Experience working with tools such as Postman, JMeter, Grafana, etc.
- Exposure to security standards and compliance.
- Experience working with Agile methodologies such as Scrum, Kanban, etc.
- Ability to work independently and as part of a team.
- Ability to learn new technologies and tools quickly and adapt to changing requirements.
- A highly analytical mindset and a logical approach to finding solutions and performing root cause analysis.
- Ability to prioritize between critical-path and non-critical-path items.
- Excellent communication skills, with the ability to convey test results to stakeholders in terms of the system's functional behavior and its impact.

What You'll Get
- Highly competitive compensation, benefits, and vacation package
- The chance to work for one of the fastest-growing companies, with some of the most talented people in the industry
- Team outings
- A fun, hardworking, and casual environment
- Endless growth opportunities

We have a culture of movers and shakers and are leading the way for everyone else, with a vision to transform the industry. We are authentic in who we are. We believe in our abilities and strengths to change the world for the better. Being inclusive and working together is at the heart of everything we do. We are naturally curious. We ask the right questions, challenge what can be done differently, and come up with intelligent solutions to the problems we find. If that's you, get ready to bee impactful and join the hive.

Scam Notice
Please be aware that individuals and organizations may attempt to scam job seekers by offering fraudulent employment opportunities in the name of CloudBees. These scams may involve fake job postings, unsolicited emails, or messages claiming to be from our recruiters or hiring managers. Please note that CloudBees will never ask for any personal account information, such as cell phone, credit card, or bank account numbers, during the recruitment process. Additionally, CloudBees will never send you a check for any equipment prior to employment. All communication from our recruiters and hiring managers will come from official company email addresses (@cloudbees.com) or from Paylocity, and will never ask for any payment, fee, or purchase to be made by the job seeker. If you are contacted by anyone claiming to represent CloudBees and you are unsure of their authenticity, please do not provide any personal or financial information, and contact us immediately at tahelp@cloudbees.com. We take these matters very seriously and will work to ensure that any fraudulent activity is reported and dealt with appropriately. If you believe you have been scammed in the US, please report it to the Federal Trade Commission at https://reportfraud.ftc.gov/#/. In Europe, please contact the European Anti-Fraud Office at https://anti-fraud.ec.europa.eu/olaf-and-you/report-fraud_en

Signs of a Recruitment Scam
- Ensure there are no other domains before or after @cloudbees.com, for example "name.dr.cloudbees.com".
- Check any documents for poor spelling and grammar; this is often a sign that fraudsters are at work.
- A generic email address such as @yahoo or @hotmail is given as a point of contact.
- You are asked for money, an "administration fee", "security fee", or an "accreditation fee".
- You are asked for cell phone account information.
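The page-object bullet above is the heart of maintainable UI automation. The posting targets Playwright with TypeScript/JavaScript; as a minimal sketch of the same pattern using Playwright's Python bindings (the URL, selectors, and post-login route are assumptions):

```python
# Sketch: a page-object-model test via Playwright's Python bindings.
# (The posting targets TypeScript/JavaScript; the pattern is identical.)
from playwright.sync_api import Page, sync_playwright

class LoginPage:
    """Encapsulates selectors and actions for a (hypothetical) login page."""

    def __init__(self, page: Page):
        self.page = page

    def open(self) -> None:
        self.page.goto("https://example.com/login")  # hypothetical URL

    def login(self, user: str, password: str) -> None:
        self.page.fill("#username", user)
        self.page.fill("#password", password)
        self.page.click("button[type=submit]")

def test_login() -> None:
    with sync_playwright() as p:
        browser = p.chromium.launch()
        page = browser.new_page()
        login = LoginPage(page)
        login.open()
        login.login("demo", "demo")
        page.wait_for_url("**/dashboard")  # assumed post-login route
        browser.close()

if __name__ == "__main__":
    test_login()
```

Keeping selectors inside the page object means a UI change touches one class, not every test that logs in.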

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Description

ABOUT CLOUDBEES
CloudBees provides the leading software delivery platform for enterprises, enabling them to continuously innovate, compete, and win in a world powered by the digital experience. Designed for the world's largest organizations with the most complex requirements, CloudBees enables software development organizations to deliver scalable, compliant, governed, and secure software from the code a developer writes to the people who use it. The platform connects with other best-of-breed tools, improves the developer experience, and enables organizations to bring digital innovation to life continuously, adapt quickly, and unlock business outcomes that create market leaders and disruptors. CloudBees was founded in 2010 and is backed by Goldman Sachs, Morgan Stanley, Bridgepoint Credit, HSBC, Golub Capital, Delta-v Capital, Matrix Partners, and Lightspeed Venture Partners. Visit www.cloudbees.com and follow us on Twitter, LinkedIn, and Facebook.

WHAT YOU'LL DO
These are some of the tasks you'll be engaged on:
- Conceptualize product features for the compliance capability that will enable organizations to streamline their software development and delivery processes by providing the 'Sec' element in DevSecOps. This includes creating features such as tools, plugins, and integrations that enhance the capabilities of the CloudBees product suite.
- Work with the product manager to understand business objectives, align the product vision and strategy with those objectives, and align the engineering team with the product vision and strategy.
- Work with product owners across capabilities to align with strategic and tactical product roadmap objectives across the board.
- Understand business risk management and regulatory and security compliance frameworks such as SOC 2, NIST, SOX, CIS, PCI DSS, and others.
- Have a customer-centric focus and act as a customer advocate.
- Own and drive the team's product backlog:
  - Create: build the backlog from customer needs and market research.
  - Own: be accountable for driving the prioritization and delivery of the backlog.
  - Manage: keep the backlog up to date, reflecting input from stakeholders.
  - Prioritize: ensure the team is always focused on the top-priority items in the backlog.
  - Drive: set focused and achievable goals for each sprint.
  - Monitor: continuously track the progress of the product through each stage of development.
- Feed back the acceptability of product backlog features to the development team.
- Determine and approve that the final deliverable meets stakeholder expectations.
- Make the work visible, transparent, and clear to all.
- Inform and involve internal stakeholders regarding priority changes, risks, and progress.
- Work with research and design to create best-in-class customer experiences.
- Collaborate with engineering to validate technical feasibility and effort estimates.
- Work with the team to refine and improve the development process.
- Drive the sprint review to celebrate achievements.

You would previously have worked with exposure to:
- Agile methodology
- Jira, Confluence, Git, and other SDLC tooling
- System analyst or business analyst roles on projects in AWS, GCP, Azure, and others
- Cloud and container technologies

WHAT DO YOU NEED TO SHINE IN THIS ROLE?
- Bachelor's or master's degree in computer science or a related technical field.
- 5+ years of experience working with Scrum and Agile software development methodologies.
- Working knowledge of the software development life cycle.
- Working knowledge of and/or previous experience in security compliance and cyber security.
- Exposure to vulnerability triage and remediation.
- Experience coordinating work across multiple teams.
- Ability to empathize with end users on the challenges they face and to understand user-product interaction.
- Excellent communication skills, with the ability to convey results to stakeholders in terms of the system's functional behavior and its impact.
- Experience writing queries and working with databases such as MySQL, MongoDB, Neo4j, Cassandra, etc.
- Experience working with tools such as Postman, JMeter, Grafana, etc.
- Experience working with Agile methodologies such as Scrum, Kanban, etc.
- Ability to work independently and as part of a team.

What You'll Get
- Highly competitive compensation, benefits, and vacation package
- The chance to work for one of the fastest-growing companies, with some of the most talented people in the industry
- Team outings
- A fun, hardworking, and casual environment
- Endless growth opportunities

We have a culture of movers and shakers and are leading the way for everyone else, with a vision to transform the industry. We are authentic in who we are. We believe in our abilities and strengths to change the world for the better. Being inclusive and working together is at the heart of everything we do. We are naturally curious. We ask the right questions, challenge what can be done differently, and come up with intelligent solutions to the problems we find. If that's you, get ready to bee impactful and join the hive.

Scam Notice
Please be aware that individuals and organizations may attempt to scam job seekers by offering fraudulent employment opportunities in the name of CloudBees. These scams may involve fake job postings, unsolicited emails, or messages claiming to be from our recruiters or hiring managers. Please note that CloudBees will never ask for any personal account information, such as cell phone, credit card, or bank account numbers, during the recruitment process. Additionally, CloudBees will never send you a check for any equipment prior to employment. All communication from our recruiters and hiring managers will come from official company email addresses (@cloudbees.com) or from Paylocity, and will never ask for any payment, fee, or purchase to be made by the job seeker. If you are contacted by anyone claiming to represent CloudBees and you are unsure of their authenticity, please do not provide any personal or financial information, and contact us immediately at tahelp@cloudbees.com. We take these matters very seriously and will work to ensure that any fraudulent activity is reported and dealt with appropriately. If you believe you have been scammed in the US, please report it to the Federal Trade Commission at https://reportfraud.ftc.gov/#/. In Europe, please contact the European Anti-Fraud Office at https://anti-fraud.ec.europa.eu/olaf-and-you/report-fraud_en

Signs of a Recruitment Scam
- Ensure there are no other domains before or after @cloudbees.com, for example "name.dr.cloudbees.com".
- Check any documents for poor spelling and grammar; this is often a sign that fraudsters are at work.
- A generic email address such as @yahoo or @hotmail is given as a point of contact.
- You are asked for money, an "administration fee", "security fee", or an "accreditation fee".
- You are asked for cell phone account information.

Posted 1 month ago

Apply

6.0 years

0 Lacs

India

On-site

Source: LinkedIn

Experience: 6+ years

Preferred Qualifications:
- Bachelor's degree in Computer Science, Information Systems, or a related field.
- 6-12 years of relevant experience in cloud engineering and architecture.
- Google Cloud Professional Cloud Architect certification.
- Experience with Kubernetes.
- Familiarity with DevOps methodologies.
- Strong problem-solving and analytical skills.
- Excellent communication skills.

Required Skills:
- GCP services: Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, Identity and Access Management (IAM), Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, Google Cloud Monitoring, Logging and Error Reporting
- Languages and tooling: Python, Terraform
- Data stores: Google Cloud Firestore, MongoDB, Cassandra, Neo4j
- Data and APIs: GraphQL, ETL (Extract, Transform, Load) paradigms, Google Cloud Dataflow, Apache Beam, BigQuery
- Networking and observability: Service Mesh, Content Delivery Network (CDN), Stackdriver, Google Cloud Trace
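Among the services listed, Cloud Pub/Sub is a common integration point between pipeline stages. A minimal publishing sketch with the official Python client follows; the project and topic IDs are assumptions:

```python
# Sketch: publishing a message to Google Cloud Pub/Sub with the official
# Python client. Project and topic IDs are hypothetical; credentials come
# from the environment (e.g., GOOGLE_APPLICATION_CREDENTIALS).
from google.cloud import pubsub_v1

publisher = pubsub_v1.PublisherClient()
topic_path = publisher.topic_path("my-gcp-project", "pipeline-events")

# publish() is asynchronous and returns a future; result() blocks until
# the server acknowledges and returns the message ID.
future = publisher.publish(topic_path, data=b"etl-run-started", origin="etl-job")
print("Published message ID:", future.result())
```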

Posted 1 month ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

We Are Looking For
- 2+ years of expertise in software development with one or more general-purpose programming languages (e.g., Python, Java, C/C++, Go). Experience in Python and Django is recommended.
- Deep understanding of how to build an application with optimized RESTful APIs.
- Knowledge of a web framework like Django (or similar) with an ORM, or of multi-tier, multi-DB, data-heavy web application development, will help your profile stand out.
- Knowledge of GenAI tools and technologies is a plus.
- Sound knowledge of SQL queries and databases like PostgreSQL (must) or MySQL. Working knowledge of NoSQL DBs (Elasticsearch, Mongo, Redis, etc.) is a plus.
- Knowledge of a graph DB like Neo4j or AWS Neptune adds extra credit to your profile.
- Knowledge of queue-based messaging frameworks like Celery, RQ, Kafka, etc., and an understanding of distributed systems, will be advantageous (see the sketch after this listing).
- Understands a programming language's limitations well enough to exploit its behavior to the fullest potential.
- Understanding of accessibility and security compliance.
- Ability to communicate complex technical concepts to both technical and non-technical audiences with ease.
- Diversity in skills like version control tools, CI/CD, cloud basics, good debugging skills, and test-driven development will help your profile stand out.

Skills: Python, Java, and SQL
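For the queue-based messaging item above, a minimal Celery task with a Redis broker looks like this; the broker URL and task body are illustrative assumptions, not part of the posting:

```python
# Sketch: a queue-based background task with Celery and a Redis broker.
from celery import Celery

app = Celery("tasks", broker="redis://localhost:6379/0")

@app.task
def send_welcome_email(user_id: int) -> str:
    # In a real Django app this would render and send the actual email.
    return f"welcome email queued for user {user_id}"

# From application code, enqueue without blocking the request:
#   send_welcome_email.delay(42)
# A separate worker processes it:
#   celery -A tasks worker --loglevel=info
```

Offloading slow work like this keeps web request latency low, which is the usual reason these frameworks appear in Django stacks.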

Posted 1 month ago

Apply

3.0 - 20.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Source: LinkedIn

Experience: 3-20 Years
Location: Bangalore, Chennai

- Experience building Java applications with Spring Boot.
- Able to design, develop, test, and deploy high-quality, reusable, and maintainable code.
- Develop and maintain unit tests.
- Experience with RESTful APIs and microservices architecture.
- Excellent analytical and problem-solving skills to troubleshoot and debug application issues.
- Experience with an IDE (e.g., IntelliJ), version control systems (e.g., Git), build tools (e.g., Gradle), and unit testing frameworks.
- Knowledge of design patterns and principles (e.g., SOLID).
- Ability to work independently and as part of a team.
- Excellent communication, collaboration, and problem-solving skills.

Must-have skills:
- Databases: Oracle, MySQL, MongoDB, Neo4j; experience writing optimal queries (a parameterized-Cypher sketch follows this listing).

Good-to-have skills:
- Solid understanding of software development methodologies (e.g., Agile, Scrum).
- Experience with CI/CD pipelines and good knowledge of DevOps practices.
- Experience with open-source libraries and software (e.g., Apache Camel, Kafka, Redis, EFK).
- Experience with containerization technologies (e.g., Docker, Kubernetes).

Disclaimer: EdgeVerve Systems does not engage with external manpower agencies or charge any fees from candidates for recruitment. If you encounter such scams, please report them immediately.
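On writing optimal Neo4j queries: parameterized Cypher lets the server cache and reuse query plans instead of re-planning each string. The role is Java-centric, but the pattern is identical across drivers; this sketch uses the official Python driver, with connection details and the Person/Company graph model as assumptions:

```python
# Sketch: parameterized Cypher with the official Neo4j Python driver
# (the same pattern applies in the Java driver).
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def find_coworkers(tx, name: str) -> list[str]:
    # Parameters (not string concatenation) let the server cache the plan
    # and protect against Cypher injection.
    result = tx.run(
        "MATCH (p:Person {name: $name})-[:WORKS_AT]->(:Company)"
        "<-[:WORKS_AT]-(coworker:Person) "
        "RETURN coworker.name AS name",
        name=name,
    )
    return [record["name"] for record in result]

with driver.session() as session:
    print(session.execute_read(find_coworkers, "Alice"))
driver.close()
```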

Posted 1 month ago

Apply

3.0 - 5.0 years

1 - 5 Lacs

Bengaluru

Work from Office

Source: Naukri

Role Purpose
The purpose of this role is to design, test, and maintain software programs for operating systems or applications to be deployed at a client site, ensuring they meet 100% of quality assurance parameters.

- Machine Learning & Deep Learning: strong understanding of LLM architectures, transformers, and fine-tuning techniques.
- MLOps & DevOps: experience with CI/CD pipelines, model deployment, and monitoring.
- Vector Databases: knowledge of storing and retrieving embeddings efficiently.
- Prompt Engineering: ability to craft effective prompts for optimal model responses.
- Retrieval-Augmented Generation (RAG): implementing techniques to enhance LLM outputs with external knowledge (a minimal retrieval sketch follows this listing).
- Cloud Platforms: familiarity with AWS, Azure, or GCP for scalable deployments.
- Containerization & Orchestration: using Docker and Kubernetes for model deployment.
- Observability & Monitoring: tracking model performance, latency, and drift.
- Security & Ethics: ensuring responsible AI practices and data privacy.
- Programming Skills: strong proficiency in Python, SQL, and API development.
- Knowledge of Open-Source LLMs: familiarity with models like LLaMA, Falcon, and Mistral.
- Fine-Tuning & Optimization: experience with LoRA, quantization, and efficient training techniques.
- LLM Frameworks: hands-on experience with Hugging Face, LangChain, or OpenAI APIs.
- Data Engineering: understanding of ETL pipelines and data preprocessing.
- Microservices Architecture: ability to design scalable AI-powered applications.
- Explainability & Interpretability: techniques for understanding and debugging LLM outputs.
- Graph Databases: knowledge of Neo4j or similar technologies for complex data relationships.
- Collaboration & Communication: ability to work with cross-functional teams and explain technical concepts clearly.

Deliverables
1. Continuous integration, deployment & monitoring of software: 100% error-free onboarding and implementation, throughput %, adherence to the schedule/release plan.
2. Quality & CSAT: on-time delivery, software management, troubleshooting queries, customer experience, completion of assigned certifications for skill upgradation.
3. MIS & reporting: 100% on-time MIS and report generation.

Mandatory Skills: LLM Ops
Experience: 3-5 Years
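As a rough sketch of the RAG retrieval step named above (not from the posting): documents are embedded once, and at query time the nearest passages are fetched and prepended to the LLM prompt. Here embed() is a hypothetical stand-in for a real embedding model, and the documents are demo data:

```python
# Sketch: the retrieval step of a retrieval-augmented generation pipeline.
import numpy as np

def embed(text: str) -> np.ndarray:
    # Placeholder: replace with a real embedding call returning a unit vector.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(384)
    return v / np.linalg.norm(v)

documents = [
    "Reset a password via the admin console.",
    "Rotate API keys every 90 days.",
    "Escalate P1 incidents to the on-call engineer.",
]
doc_vectors = np.stack([embed(d) for d in documents])

def retrieve(query: str, k: int = 2) -> list[str]:
    scores = doc_vectors @ embed(query)  # cosine similarity (unit vectors)
    top = np.argsort(scores)[::-1][:k]
    return [documents[i] for i in top]

# The retrieved passages are prepended to the LLM prompt as grounding context.
print(retrieve("How do I handle a P1 incident?"))
```

In production the brute-force matrix product is replaced by a vector database, which is exactly the "Vector Databases" skill the listing calls out.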

Posted 1 month ago

Apply

8.0 years

0 Lacs

Delhi, India

On-site

Source: LinkedIn

About Neo4j
Neo4j is the leader in Graph Database & Analytics, helping organizations uncover hidden patterns and relationships across billions of data connections deeply, easily, and quickly. Customers use Neo4j to gain a deeper understanding of their business and reveal new ways of solving their most pressing problems. Over 84% of Fortune 100 companies use Neo4j, along with a vibrant community of 250,000+ developers, data scientists, and architects across the globe.

At Neo4j, we're proud to build the technology that powers breakthrough solutions for our customers. These solutions have helped NASA get to Mars two years earlier, helped the ICIJ break the Panama Papers, and are helping Transport for London cut congestion by 10% and save $750M a year. Some of our other notable customers include Intuit, Lockheed Martin, Novartis, UBS, and Walmart.

Neo4j experienced rapid growth this year as organizations looking to deploy generative AI (GenAI) recognized graph databases as essential for improving its accuracy, transparency, and explainability. Growth was further fueled by enterprise demand for Neo4j's cloud offering and partnerships with leading cloud hyperscalers and ecosystem leaders. Learn more at neo4j.com and follow us on LinkedIn.

Our Vision
At Neo4j, we have always strived to help the world make sense of data. As business, society, and knowledge become increasingly connected, our technology promotes innovation by helping organizations find and understand data relationships. We created, drive, and lead the graph database category, and we're disrupting how organizations leverage their data to innovate and stay competitive.

The Opportunity
Neo4j is looking for a Director of Revenue Business Systems to build our India Systems Center of Excellence (CoE). Partnering with the global team, you will be responsible for overseeing a group of Analysts, Admins, Developers, and others to deliver solutions for the business. You will sit at the intersection of all departments, and at the intersection of business strategy and technology.

Primary Responsibilities
- Provide subject matter expertise for all aspects of the business applications you manage.
- Analyze and report key metrics to demonstrate the business value of investments in these applications.
- Collaborate with stakeholders to establish and manage SLAs, ensuring customer expectations are met through multi-tier support and uptime monitoring.
- Lead and mentor teams by fostering a collaborative, stable, and cohesive work environment.
- Monitor industry and market trends to guide strategic decision-making and proactively address challenges.
- Develop and communicate a clear application roadmap that aligns with business functions and adapts to emerging requirements.
- Establish and enforce standards, methods, and procedures for inspecting, testing, and evaluating the precision and reliability of applications.
- Forge long-term strategic partnerships with departmental leaders by understanding their challenges and opportunities, aligning team strategy with company-wide objectives.
- Be accountable for overall team performance and the associated performance management.

Requirements For Success
- B.S. in Computer Science, Information Systems, or a related field.
- 8-10+ years of IT experience, including demonstrated success in progressively broadening architecture and technology leadership.
- Expertise in managing application development (delivery) at scale.
- A good understanding of software development life cycle methodologies, including both Agile and Scrum.
- Experience leading and coordinating cross-functional initiatives, conducting interviews, and performing analyses to create business cases for projects.
- Experience using Atlassian products (Confluence, Jira) for project management; strong project management skills.
- Experience in budget planning for projects.
- Experience using Salesforce Core, CPQ, Billing, and NetSuite.
- Strong communication, presentation, and public speaking skills, with the ability to interact effectively with co-workers and people who have strong opinions.
- Strong leadership and collaboration skills, with the ability to lead large projects autonomously.
- A high level of initiative and integrity, with empathy for the needs of individuals across the organization.
- Strong problem-solving and critical thinking skills.
- A self-starter who is comfortable with ambiguity, asks questions, is resourceful in resolving issues, and is adept at shifting priorities.
- Strong customer service skills with a proven service mentality.
- Strong at documenting business processes and communicating system requirements.

Posted 1 month ago

Apply

7.0 - 10.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Title: DevOps Lead
Experience: 7-10 Years
Location: Bengaluru

- Overall 7-10 years of experience in IT.
- In-depth knowledge of GCP services and resources to design, deploy, and manage cloud infrastructure efficiently. Certification is a big plus.
- Proficiency in Java, Shell, or Python scripting.
- Develop, maintain, and optimize Infrastructure as Code scripts and templates using tools like Terraform and Ansible, ensuring resource automation and consistency.
- Strong expertise in Kubernetes using Helm, HAProxy, and containerization technologies.
- Manage and fine-tune databases, including Neo4j, MySQL, PostgreSQL, and Redis cache clusters, to ensure performance and data integrity.
- Skill in managing and optimizing Apache Kafka and RabbitMQ to facilitate efficient data processing and communication.
- Design and maintain Virtual Private Cloud (VPC) network architecture for secure and efficient data transmission.
- Implement and maintain monitoring tools such as Prometheus, Zipkin, Loki, and Grafana (a minimal instrumentation sketch follows this listing).
- Utilize Helm charts and Kubernetes (K8s) manifests for containerized application management.
- Proficient with Git, Jenkins, and ArgoCD to set up and enhance CI/CD pipelines.
- Utilize Google Artifact Registry and Google Container Registry for artifact and container image management.
- Familiarity with CI/CD practices, version control and branching, and DevOps methodologies.
- Strong understanding of cloud network design, security, and best practices.
- Strong Linux and network debugging skills.

Primary Skills: Kubernetes (GKE clusters), Grafana, Prometheus, Terraform and Ansible (good working knowledge), DevOps

Why Join Us:
- Opportunity to work in a fast-paced and innovative environment.
- Collaborative team culture with continuous learning and growth opportunities.
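For the Prometheus monitoring item above, instrumenting a service usually means exposing an HTTP metrics endpoint for the Prometheus server to scrape. A minimal sketch with the official Python client; the port, metric name, and reading are assumptions:

```python
# Sketch: exposing a custom metric for Prometheus to scrape, using the
# official prometheus_client library.
import random
import time
from prometheus_client import Gauge, start_http_server

queue_depth = Gauge("worker_queue_depth", "Jobs currently waiting in the queue")

if __name__ == "__main__":
    start_http_server(8000)  # metrics served at http://localhost:8000/metrics
    while True:
        queue_depth.set(random.randint(0, 50))  # stand-in for a real reading
        time.sleep(5)
```

A scrape job pointed at port 8000 then makes the metric available to Grafana dashboards and alert rules.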

Posted 1 month ago

Apply

8.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Source: LinkedIn

Location: Chennai, Kolkata, Gurgaon, Bangalore, and Pune
Experience: 8-12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipeline, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with client architects and team members.
- Orchestrate the data pipelines via the Airflow scheduler (a minimal DAG sketch follows this listing).

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience.
- 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects.
- Deep understanding of Star and Snowflake dimensional modelling.
- Strong knowledge of Data Management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python, and Spark (PySpark).
- Experience with the AWS/Azure stack.
- Desirable: ETL with batch and streaming (Kinesis).
- Experience building ETL/data warehouse transformation processes.
- Experience with Apache Kafka for streaming/event-based data.
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala).
- Experience with open-source non-relational/NoSQL data repositories (MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Skills: Azure Databricks, SQL, data warehouse, Azure Data Factory, PySpark, Azure Synapse, Airflow, Python, data pipelines, data engineering, ETL, architecture design, Azure
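For the Airflow orchestration responsibility above, a pipeline is declared as a DAG of tasks. A minimal sketch (Airflow 2.x; `schedule` requires 2.4+, older versions use `schedule_interval`; the DAG ID, schedule, and task body are assumptions):

```python
# Sketch: a minimal Airflow DAG for a daily ETL step.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_and_load() -> None:
    # Placeholder for a PySpark/Databricks job submission or a load step.
    print("running daily extract-and-load")

with DAG(
    dag_id="daily_sales_etl",  # hypothetical pipeline name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="extract_and_load", python_callable=extract_and_load)
```

Dropped into the Airflow DAGs folder, this runs once per day; real pipelines chain several such tasks with dependency operators.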

Posted 1 month ago

Apply

18.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Source: LinkedIn

Name of company: The Smart Fellowship by Workverse

Join our mission: We are building an automation-proof workforce for Bharat. We are rooting for Team Humans by training graduates to think, solve, and communicate beyond what AI can do. We want smart fellows to thrive alongside AI and remain in control, instead of being replaced by it.

What we do: Formerly known as X Billion Skills Lab (since 2017), The Smart Fellowship is a hybrid workplace simulation where learners master in-demand workplace skills and GenAI skills through role play. In our immersive, narrative-based experience, learners "work" in imaginary companies and solve 50+ generalist workplace scenarios to build a strong intellectual foundation for rapid growth in the real world of work. To date we have worked with 50,000+ learners, and the fellowship is even a credit program at one of India's top private universities.

The best part about this role:
- Direct exposure to customer relationship building, HR strategy, and operations in a fast-growing startup
- The opportunity to work closely with leadership and see your ideas in action
- Contributing to a future-ready, human-first workforce in the age of AI

Location: Khar West, Mumbai (work from office)

P.S. We're looking for someone who genuinely cares about the work we're doing and sees themselves growing with us. If it's the right fit on both sides, we'd love to offer a long-term commitment with fast-tracked career growth.

Meet the founder: Samyak Chakrabarty has been featured by Forbes as one of Asia's most influential young entrepreneurs and has founded several social impact initiatives that have won national and international recognition. He has over 18 years of entrepreneurial experience and is on a mission to empower humans to outsmart artificial intelligence at work. To date his work has positively impacted 1,00,000+ youth across the nation. For more information, please visit his LinkedIn profile.

Your role: As an AI/ML Architect at Workverse, you'll play a key role in shaping intelligent agents that assist in enterprise decision-making, automate complex data science workflows, and integrate seamlessly into our simulation. These agents will shape Neuroda, the world's first AI soft-skills coach and workplace mentor, leveraging reasoning, tool use, and memory, while staying aligned with our focus on enhancing the soft-skills learning experience within our simulation environment. We're seeking a strong engineering generalist with deep experience in LLMs, agent frameworks, and production-scale systems. You've likely prototyped or shipped agent-based systems, pushed the boundaries of what LLMs can do, and are looking for a meaningful opportunity to build the future of a human-first workforce in the age of AI.

Responsibilities:
- Lead the design and development of Enterprise AI Agent and Data Science Agent systems that combine reasoning, tool orchestration, and memory (a minimal agent-loop sketch follows this listing).
- Collaborate with product, research, and infrastructure teams to create scalable agent architectures tailored for enterprise users.
- Build agent capabilities for tasks like automated analysis, reporting, data wrangling, and domain-specific workflows across business verticals.
- Integrate real-time knowledge, enterprise APIs, RAG pipelines, and proprietary tools into agentic workflows.
- Work closely with the alignment and explainability teams to ensure agents remain safe, auditable, and transparent in their reasoning and output.
- Continuously evaluate and incorporate advances in GenAI (e.g., controllability, multi-modal models, memory layers) into the agent stack.

Requirements:
- Demonstrated experience building with LLMs and agentic frameworks (e.g., LangChain, LangFlow, Semantic Kernel, CrewAI, Haystack, ReAct, AutoGPT, etc.).
- Experience productionizing AI/LLM workflows and integrating them into real-world applications or systems.
- 2+ years of software engineering experience, ideally with some time in early-stage startups or AI-first environments.
- Strong Python skills and a solid understanding of full-stack backend architecture: APIs, cloud infrastructure (AWS), and relational, non-relational, and graph databases (SQL, NoSQL, ArangoDB, Neo4j).
- (Bonus) Experience working on agent toolchains for data science, MLOps, game data science, and game analytics.

Think you're the one? Apply: Double-check that you are comfortable with the work-from-office requirement, then share your CV with tanvi@workverse.in, along with a brief note about why you think this role was made for you!
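The agent systems described above typically run a reason-act loop: the model either calls a tool or answers, and each observation feeds back into its context (the memory). A deliberately minimal sketch of that loop; llm() is a hypothetical stand-in for any chat-model call (OpenAI, Ollama, etc.), and the tool and prompt format are assumptions:

```python
# Sketch: the core reason-act loop behind tool-using agents.
def llm(prompt: str) -> str:
    # Placeholder: call a real model here. This stub answers once it has
    # seen a tool observation, mimicking a model that uses its context.
    if "Observation:" in prompt:
        return "ANSWER: revenue in 2023 was $1.2M (demo data)"
    return "CALL lookup_revenue(2023)"

TOOLS = {
    "lookup_revenue": lambda year: f"revenue in {year}: $1.2M (demo data)",
}

def run_agent(question: str, max_steps: int = 3) -> str:
    memory = [f"Question: {question}"]
    for _ in range(max_steps):
        reply = llm("\n".join(memory) + "\nRespond with CALL tool(arg) or ANSWER ...")
        if reply.startswith("ANSWER"):
            return reply
        # Crude parse of "CALL name(arg)" -- frameworks like LangChain or
        # CrewAI handle tool dispatch robustly; this only shows the shape.
        name, arg = reply[5:].rstrip(")").split("(")
        memory.append(f"Observation: {TOOLS[name](arg)}")
    return "no answer within step budget"

print(run_agent("What was 2023 revenue?"))
```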

Posted 1 month ago

Apply

8.0 years

0 Lacs

Delhi, India

On-site

Source: LinkedIn

About Neo4j
Neo4j is the leader in Graph Database & Analytics, helping organizations uncover hidden patterns and relationships across billions of data connections deeply, easily, and quickly. Customers use Neo4j to gain a deeper understanding of their business and reveal new ways of solving their most pressing problems. Over 84% of Fortune 100 companies use Neo4j, along with a vibrant community of 250,000+ developers, data scientists, and architects across the globe.

At Neo4j, we're proud to build the technology that powers breakthrough solutions for our customers. These solutions have helped NASA get to Mars two years earlier, helped the ICIJ break the Panama Papers, and are helping Transport for London cut congestion by 10% and save $750M a year. Some of our other notable customers include Intuit, Lockheed Martin, Novartis, UBS, and Walmart.

Neo4j experienced rapid growth this year as organizations looking to deploy generative AI (GenAI) recognized graph databases as essential for improving its accuracy, transparency, and explainability. Growth was further fueled by enterprise demand for Neo4j's cloud offering and partnerships with leading cloud hyperscalers and ecosystem leaders. Learn more at neo4j.com and follow us on LinkedIn.

Our Vision
At Neo4j, we have always strived to help the world make sense of data. As business, society, and knowledge become increasingly connected, our technology promotes innovation by helping organizations find and understand data relationships. We created, drive, and lead the graph database category, and we're disrupting how organizations leverage their data to innovate and stay competitive.

The Role
- Develop and execute a territory plan based on target agencies and applicable use cases, building a pipeline of opportunities in the target market that helps you achieve quarterly and annual sales metrics.
- Develop expert knowledge of Neo4j solutions and their applicability in the target market, covering BFSI (Banking, Financial Services & Insurance) and Enterprise accounts.
- Develop and present to customers a strong understanding of the benefits and advantages of graph technology.
- Execute sales cycles that employ Strategic Selling strategies and tactics.
- Build and present proposals for solutions that involve Neo4j products and services.
- Work with Pre-Sales Engineering resources to scope and deliver on customer needs.
- "Land & Expand": grow the existing account base with a strategic, customer-first methodology.
- Provide guidance, direction, and support to your assigned SDR in their efforts to support your pipeline development.
- Ensure the execution of strategies for assigned key accounts to increase revenue potential and growth.
- Collaborate with Field Marketing on programs that increase awareness within the existing customer base, resulting in revenue growth.
- Maintain the Neo4j Salesforce.com CRM system with accurate pipeline information, in accordance with Neo4j forecasting guidelines.

Ideally, You Should Have
- 8-10 years of consistent success meeting or exceeding sales objectives selling technical solutions and software products into BFSI (Banking, Financial Services & Insurance) and Enterprise accounts.
- Demonstrable experience executing complex enterprise sales strategies and tactics.
- Experience with the commercial open-source business model, selling subscriptions for on-premise and/or hybrid on-prem/cloud deployments.
- Previous experience thriving in a smaller, high-growth software company, where you leveraged dedicated SDR, Field Marketing, and Pre-Sales Engineering resources to help build the business.
- Strong conviction about how and where graph solutions fit into the enterprise marketplace.
- Attention to detail, ensuring accurate entry and management of lead data in the Salesforce.com CRM system.
- Proficiency with standard corporate productivity tools (e.g., Google Docs, MS Office, Salesforce.com, web conferencing).
- A team-player mindset with the highest level of integrity.

Posted 1 month ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Source: LinkedIn

Location: Chennai, Kolkata, Gurgaon, Bangalore, and Pune
Experience: 8-12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipeline, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with client architects and team members.
- Orchestrate the data pipelines via the Airflow scheduler.

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience.
- 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects.
- Deep understanding of Star and Snowflake dimensional modelling.
- Strong knowledge of Data Management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python, and Spark (PySpark).
- Experience with the AWS/Azure stack.
- Desirable: ETL with batch and streaming (Kinesis).
- Experience building ETL/data warehouse transformation processes.
- Experience with Apache Kafka for streaming/event-based data.
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala).
- Experience with open-source non-relational/NoSQL data repositories (MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Skills: Azure Databricks, SQL, data warehouse, Azure Data Factory, PySpark, Azure Synapse, Airflow, Python, data pipelines, data engineering, ETL, architecture design, Azure

Posted 1 month ago

Apply

8.0 years

0 Lacs

Greater Kolkata Area

On-site

Source: LinkedIn

Location: Chennai, Kolkata, Gurgaon, Bangalore, and Pune
Experience: 8-12 Years
Work Mode: Hybrid
Mandatory Skills: Python, PySpark, SQL, ETL, Data Pipeline, Azure Databricks, Azure Data Factory, Azure Synapse, Airflow, and Architecture Design

Overview
We are seeking a skilled and motivated Data Engineer with experience in Python, SQL, Azure, and cloud-based technologies to join our dynamic team. The ideal candidate will have a solid background in building and optimizing data pipelines, working with cloud platforms, and leveraging modern data engineering tools like Airflow, PySpark, and Azure Data Engineering. If you are passionate about data and looking for an opportunity to work on cutting-edge technologies, this role is for you!

Primary Roles And Responsibilities
- Develop Modern Data Warehouse solutions using Databricks and the AWS/Azure stack.
- Provide forward-thinking solutions in the data engineering and analytics space.
- Collaborate with DW/BI leads to understand new ETL pipeline development requirements.
- Triage issues to find gaps in existing pipelines and fix them.
- Work with the business to understand reporting-layer needs and develop data models to fulfill them.
- Help junior team members resolve issues and technical challenges.
- Drive technical discussions with client architects and team members.
- Orchestrate the data pipelines via the Airflow scheduler.

Skills And Qualifications
- Bachelor's and/or master's degree in computer science or equivalent experience.
- 6+ years of total IT experience and 3+ years of experience in data warehouse/ETL projects.
- Deep understanding of Star and Snowflake dimensional modelling.
- Strong knowledge of Data Management principles.
- Good understanding of the Databricks Data & AI platform and Databricks Delta Lake architecture.
- Hands-on experience in SQL, Python, and Spark (PySpark).
- Experience with the AWS/Azure stack.
- Desirable: ETL with batch and streaming (Kinesis).
- Experience building ETL/data warehouse transformation processes.
- Experience with Apache Kafka for streaming/event-based data.
- Experience with other open-source big data products, including Hadoop (Hive, Pig, Impala).
- Experience with open-source non-relational/NoSQL data repositories (MongoDB, Cassandra, Neo4j).
- Experience working with structured and unstructured data, including imaging and geospatial data.
- Experience in a DevOps environment with tools such as Terraform, CircleCI, and Git.
- Proficiency in RDBMS, complex SQL, PL/SQL, Unix shell scripting, performance tuning, and troubleshooting.
- Databricks Certified Data Engineer Associate/Professional certification (desirable).
- Comfortable working in a dynamic, fast-paced, innovative environment with several concurrent projects.
- Experience working in Agile methodology.
- Strong verbal and written communication skills.
- Strong analytical and problem-solving skills with high attention to detail.

Skills: Azure Databricks, SQL, data warehouse, Azure Data Factory, PySpark, Azure Synapse, Airflow, Python, data pipelines, data engineering, ETL, architecture design, Azure

Posted 1 month ago

Apply

3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Linkedin logo

TCS has a vacancy for the skill Java API in Chennai, Mumbai, Pune, and Hyderabad.
Experience required: 3 to 9 yrs
Mode of Interview: virtual
JD
· Good programming & analytical skills
· Thorough knowledge of core Java, Spring Core & microservices
· Good understanding of CI/CD, Docker & Kubernetes
· Authorization & authentication processes
· Neo4j experience will be a plus
· Java, API, Microservices (minimum of 3 years)
· Any relational DB (Oracle, DB2, etc.)
· Spring Boot
· Experience in deployment on OpenShift
· Troubleshooting incidents reported on OpenShift
· Should be able to manage the container platform ecosystem: installation, upgrade, patching, and monitoring
· Must have knowledge of OpenShift capacity and availability management

Posted 1 month ago

Apply

0 years

0 Lacs

India

On-site

Linkedin logo

About Us: Our mission at micro1 is to match the most talented people in the world with their dream jobs. If you are looking to be at the forefront of AI innovation and work with some of the fastest-growing companies in Silicon Valley, we invite you to apply for a role. By joining the micro1 community, your resume will become visible to top industry leaders, unlocking access to the best career opportunities on the market.

Role Overview: For one of our top customers, we are looking for a skilled AI Engineer. You will design, develop, and deploy machine learning models and systems that drive our products and enhance user experiences. You will work closely with cross-functional teams to implement cutting-edge AI solutions, including recommendation engines and large language models.

Key Responsibilities:
Design and implement robust machine learning models and algorithms, focusing on recommendation systems.
Conduct data analysis to identify trends, insights, and opportunities for model improvement.
Collaborate with data scientists and software engineers to build and integrate end-to-end machine learning systems.
Optimize and fine-tune models for performance and scalability, ensuring seamless deployment.
Work with large datasets using SQL and Postgres to support model training and evaluation.
Implement and refine prompt engineering techniques for large language models (LLMs).
Stay current with advancements in AI/ML technologies, particularly core ML algorithms such as clustering and community detection.
Monitor model performance, conduct regular evaluations, and retrain models as needed.
Document processes, model performance metrics, and technical specifications.

Required Skills and Qualifications:
Bachelor's or Master's degree in Computer Science, Data Science, or a related field.
Strong expertise in Python and experience with machine learning libraries (e.g., TensorFlow, PyTorch, Scikit-learn).
Proven experience with SQL and Postgres for data manipulation and analysis.
Demonstrated experience building and deploying recommendation engines (see the sketch below).
Solid understanding of core machine learning algorithms, including clustering and community detection.
Prior experience in building end-to-end machine learning systems.
Familiarity with prompt engineering and working with large language models (LLMs).
Experience working with near-real-time recommendation systems.
Hands-on experience with any graph database, such as Neo4j or Neptune.
Experience with the Flask or FastAPI frameworks.
Experience with SQL: writing, modifying, and understanding existing queries and optimizing DB connections.
Experience with AWS services such as ECS, EC2, S3, and CloudWatch.

Preferred Qualifications:
Experience with graph DBs (specifically Neo4j and the Cypher query language).
Knowledge of large-scale data handling and optimization techniques.
Experience improving models with RLHF.
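As a hedged illustration of the recommendation-engine and graph-database skills above, here is a minimal collaborative-filtering-style sketch using the official Neo4j Python driver. The connection details and the User/Item/RATED schema are hypothetical, not the customer's actual data model.

```python
# Minimal graph-backed recommendation sketch; schema is hypothetical.
from neo4j import GraphDatabase

URI, AUTH = "bolt://localhost:7687", ("neo4j", "password")  # placeholders

# Recommend items that similar users rated but this user has not.
CYPHER = """
MATCH (u:User {id: $user_id})-[:RATED]->(:Item)<-[:RATED]-(peer:User)
MATCH (peer)-[:RATED]->(rec:Item)
WHERE NOT (u)-[:RATED]->(rec)
RETURN rec.id AS item, count(*) AS score
ORDER BY score DESC
LIMIT $k
"""


def recommend(user_id: str, k: int = 10) -> list[tuple[str, int]]:
    """Return up to k (item, score) pairs for the given user."""
    with GraphDatabase.driver(URI, auth=AUTH) as driver:
        with driver.session() as session:
            result = session.run(CYPHER, user_id=user_id, k=k)
            return [(r["item"], r["score"]) for r in result]
```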

Posted 1 month ago

Apply

6.0 - 12.0 years

0 Lacs

Greater Kolkata Area

On-site

Linkedin logo

Job Description
We are looking for a highly skilled GCP Technical Lead with 6 to 12 years of experience to join our dynamic team. In this role, you will be responsible for designing and implementing scalable, secure, and highly available cloud infrastructure solutions on Google Cloud Platform (GCP). You will lead the architecture and development of cloud-native applications and ensure that infrastructure and applications are optimized for performance, security, and scalability. Your expertise will play a key role in the design and execution of workload migrations, CI/CD pipelines, and infrastructure automation.

Responsibilities:
Cloud Architecture and Design: Lead the design and implementation of scalable, secure, and highly available cloud infrastructure solutions on GCP using services like Compute Engine, Kubernetes Engine, Cloud Storage, Cloud SQL, and Cloud Load Balancing.
Cloud-Native Application Design: Develop architecture designs and guidelines for the development, deployment, and lifecycle management of cloud-native applications, ensuring optimization for security, performance, and scalability with services such as App Engine, Cloud Functions, and Cloud Run.
API Management: Implement secure API interfaces and granular access control using IAM, RBAC, and API Gateway for workloads running on GCP.
Workload Migration: Lead the migration of on-premises workloads to GCP, ensuring minimal downtime, data integrity, and smooth transitions.
CI/CD: Design and implement CI/CD pipelines using Cloud Build, Cloud Source Repositories, and Artifact Registry to automate development and deployment processes.
Infrastructure as Code (IaC): Automate cloud infrastructure provisioning and management using Terraform.
Collaboration: Collaborate closely with cross-functional teams to define requirements, design solutions, and ensure successful project delivery, utilizing tools like Google Workspace and Jira.
Monitoring and Optimization: Continuously monitor cloud environments to ensure optimal performance, availability, and security, and perform regular audits and tuning.
Documentation: Prepare and maintain comprehensive documentation for cloud infrastructure, configurations, and procedures using Google Docs.

Qualifications:
Bachelor's degree in Computer Science, Information Systems, or a related field.
6-12 years of relevant experience in cloud engineering and architecture.
Google Cloud Professional Cloud Architect certification.
Experience with Kubernetes.
Familiarity with DevOps methodologies.
Strong problem-solving and analytical skills.
Excellent communication skills.

Required Skills
Google Cloud Platform (GCP) services: Compute Engine, Google Kubernetes Engine (GKE), Cloud Storage, Cloud SQL, Cloud Load Balancing, Identity and Access Management (IAM), Google Workflows, Google Cloud Pub/Sub, App Engine, Cloud Functions, Cloud Run, API Gateway, Cloud Build, Cloud Source Repositories, Artifact Registry, Google Cloud Monitoring, Logging and Error Reporting; Python, Terraform, Google Cloud Firestore, GraphQL, MongoDB, Cassandra, Neo4j; ETL (Extract, Transform, Load) paradigms, Google Cloud Dataflow, Apache Beam, BigQuery (see the sketch below); Service Mesh, Content Delivery Network (CDN), Stackdriver, Google Cloud Trace (ref:hirist.tech)
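As a hedged illustration of the Dataflow/Apache Beam ETL skills listed above, here is a minimal Beam pipeline in Python. The bucket paths and event fields are hypothetical; the same code would run on Google Cloud Dataflow with the appropriate --runner=DataflowRunner, project, and region flags.

```python
# Minimal Beam ETL sketch: count events per user; paths are hypothetical.
import json

import apache_beam as beam
from apache_beam.options.pipeline_options import PipelineOptions

options = PipelineOptions()  # add Dataflow flags here to run on GCP

with beam.Pipeline(options=options) as p:
    (
        p
        | "ReadEvents" >> beam.io.ReadFromText("gs://example-bucket/events.jsonl")
        | "Parse" >> beam.Map(json.loads)
        | "KeyByUser" >> beam.Map(lambda e: (e["user_id"], 1))
        | "CountPerUser" >> beam.CombinePerKey(sum)
        | "Format" >> beam.MapTuple(lambda user, n: f"{user},{n}")
        | "Write" >> beam.io.WriteToText("gs://example-bucket/user_counts")
    )
```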

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Linkedin logo

Job Summary
We are seeking a highly skilled Data Engineer with expertise in leveraging Data Lake architecture and the Azure cloud platform to develop, deploy, and optimise data-driven solutions. You will play a pivotal role in transforming raw data into actionable insights, supporting strategic decision-making across the organisation.

Responsibilities
Design and implement scalable data science solutions using Azure Data Lake, Azure Databricks, Azure Data Factory, and related Azure services.
Develop, train, and deploy machine learning models to address business challenges.
Collaborate with data engineering teams to optimise data pipelines and ensure seamless data integration within Azure cloud infrastructure.
Conduct exploratory data analysis (EDA) to identify trends, patterns, and insights.
Build predictive and prescriptive models to support decision-making processes.
Develop the end-to-end machine learning lifecycle following CRISP-DM, covering data collection, cleansing, visualization, preprocessing, model development, model validation, and model retraining (see the MLflow sketch below).
Build and implement RAG systems that enhance the accuracy and relevance of model outputs by integrating retrieval mechanisms with generative models.
Ensure data security, compliance, and governance within the Azure cloud ecosystem.
Monitor and optimise model performance and scalability in production environments.
Prepare clear and concise documentation for developed models and workflows.

Skills Required
Good experience using PySpark, Python, MLOps (optional), MLflow (optional), Azure Data Lake Storage, and Unity Catalog.
Worked with data from various RDBMSs (MySQL, SQL Server, Postgres), NoSQL databases (MongoDB, Cassandra, Redis), and graph DBs (Neo4j, Grakn).
Proven experience as a Data Engineer with a strong focus on the Azure cloud platform and Data Lake architecture.
Proficiency in Python and PySpark; hands-on experience with Azure services such as Azure Data Lake, Azure Synapse Analytics, Azure Machine Learning, Azure Databricks, and Azure Functions.
Strong knowledge of SQL and experience querying large datasets from Data Lakes.
Familiarity with data engineering tools and frameworks for data ingestion and transformation in Azure.
Experience with version control systems (e.g., Git) and CI/CD pipelines for machine learning projects.
Excellent problem-solving skills and the ability to work collaboratively in a team environment.
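To make the model-lifecycle expectation above concrete, here is a minimal, hedged sketch of tracking one training run with MLflow. The stand-in dataset, model choice, run name, and hyperparameters are illustrative only.

```python
# Minimal MLflow tracking sketch; data and parameters are stand-ins.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1_000, random_state=42)  # stand-in data
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf_baseline"):           # hypothetical run name
    params = {"n_estimators": 200, "max_depth": 8}
    model = RandomForestClassifier(**params, random_state=42).fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))

    mlflow.log_params(params)                             # record hyperparameters
    mlflow.log_metric("accuracy", acc)                    # record validation metric
    mlflow.sklearn.log_model(model, "model")              # version the artifact
```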

Posted 1 month ago

Apply

1.0 - 3.0 years

11 - 15 Lacs

Mumbai

Work from Office

Naukri logo

Overview
The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications.

Responsibilities
As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats (a representative matching check is sketched at the end of this listing). We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark.

Qualifications
Core Java, Spring Boot, Apache Spark, Spring Batch, Python.
Exposure to SQL databases such as Oracle, MySQL, and Microsoft SQL Server is a must.
Experience, knowledge, or certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have.
Exposure to NoSQL databases such as Neo4j or a document database is also good to have.

What we offer you
Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing.
Flexible working arrangements, advanced technology, and collaborative workspaces.
A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results.
A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
A Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development.
Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles.
We actively nurture an environment that builds a sense of inclusion, belonging and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women's Leadership Forum.

At MSCI we are passionate about what we do, and we are inspired by our purpose: to power better investment decisions. You'll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer.
It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law. MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
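As a hedged illustration of the cross-source matching and quality checks described in this listing, here is a minimal PySpark sketch. The vendor feed paths and the "isin" join key are hypothetical, not MSCI's actual pipeline.

```python
# Minimal cross-vendor data quality check; paths and columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("vendor_qc").getOrCreate()

feed_a = spark.read.parquet("/data/vendor_a/securities")  # hypothetical path
feed_b = spark.read.parquet("/data/vendor_b/securities")  # hypothetical path

# Securities present in feed A but absent from feed B fail the match.
unmatched = feed_a.join(feed_b, on="isin", how="left_anti")

print(f"{unmatched.count()} records in feed A have no match in feed B")
unmatched.select("isin").show(20, truncate=False)
```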

Posted 1 month ago

Apply

2.0 - 3.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Naukri logo

Solid experience in NodeJS, TypeScript, React, Neo4j, and Firestore (GCP).
In-depth knowledge of software design & development practices.
Design and develop scalable systems using advanced concepts in NodeJS, TypeScript, JavaScript, and React.
Should have a good understanding of deploying to and working with GKE.
Ability to design for scale and performance; participate in peer code reviews.
Architecture/platform development, API design, and data modelling at scale.
Excellent working experience in Express, Knex, and serverless Google Cloud Functions.
Solid experience in JavaScript frameworks (Angular/React.js), Redux, JavaScript, jQuery, CSS, HTML5, ES5, ES6 & ES7, in-memory databases (Redis/Hazelcast), and build tools (webpack).
Good error and exception handling skills.
Ability to work with Git repositories and remote code hosting services like GitHub and GitLab.
Ability to deliver amazing results with minimal guidance and supervision.
Passionate (especially about web development!), highly motivated, and fun to work with.
Skills: React, ES6/ES5, HTML & CSS, JavaScript & TypeScript, API, Node

Posted 1 month ago

Apply

10.0 - 20.0 years

25 - 35 Lacs

Pune

Remote

Naukri logo

Job Description: Technical Delivery Manager, Saama Technologies

Responsibilities:
Oversee the end-to-end development and delivery of the Graph RAG system (a minimal sketch of the pattern follows this listing).
Manage project timelines, ensuring timely delivery and adherence to milestones.
Establish and maintain strong communication with client technical leads, providing regular updates and addressing technical concerns.
Offer technical leadership and expertise in graph databases (e.g., Neo4j) and LLM-based applications.
Collaborate with the team on architectural decisions, ensuring solutions are scalable, robust, and aligned with client requirements.
Mitigate technical risks and address challenges proactively.

Qualifications:
Proven experience in technical project management and delivery, ideally within the AI/ML or data science domain.
Strong understanding of graph databases and LLM-based systems.
Experience with cloud-based development and deployment (AWS, GCP, or Azure).
Excellent communication and interpersonal skills, with the ability to bridge the gap between technical and non-technical stakeholders.
Ability to work independently and lead a team in a fast-paced environment.
Experience with Agile methodologies.

Required Skills:
Knowledge of graph databases (Neo4j)
Experience with LLM-based systems
Proficiency in LangChain
API development and cloud deployment expertise
Experience managing engineering teams and Agile methodologies

Desired Skills:
Familiarity with LangChain and API development.
Knowledge of MLOps and CI/CD practices.
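As a hedged illustration of the Graph RAG pattern this role oversees, here is a minimal sketch that retrieves facts from Neo4j and grounds an LLM prompt in them. The graph schema, connection details, and the complete() stub are hypothetical stand-ins (in practice this could be an OpenAI, Vertex AI, or LangChain call), not Saama's actual design.

```python
# Minimal Graph RAG sketch; schema and LLM call are hypothetical stand-ins.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

# Pull neighbourhood facts for an entity to serve as grounding context.
CONTEXT_QUERY = """
MATCH (e:Entity {name: $entity})-[r]-(n:Entity)
RETURN e.name AS subject, type(r) AS relation, n.name AS object
LIMIT $k
"""


def complete(prompt: str) -> str:
    # Stand-in for any LLM completion call (OpenAI, Vertex AI, LangChain, ...).
    raise NotImplementedError


def graph_rag_answer(question: str, entity: str, k: int = 20) -> str:
    """Answer a question using only facts retrieved from the graph."""
    with driver.session() as session:
        rows = session.run(CONTEXT_QUERY, entity=entity, k=k)
        facts = "\n".join(
            f"{r['subject']} {r['relation']} {r['object']}" for r in rows
        )
    prompt = (
        "Answer using only these graph facts:\n"
        f"{facts}\n\nQuestion: {question}"
    )
    return complete(prompt)
```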

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies