0 years
0 Lacs
Delhi, India
Remote
About Us
Astra is a cybersecurity SaaS company that makes otherwise chaotic pentests a breeze with its one-of-a-kind AI-led offensive pentest platform. Astra's continuous vulnerability scanner emulates hacker behavior to scan applications for 13,000+ security tests. CTOs and CISOs love Astra because it helps them achieve continuous security at scale, fix vulnerabilities in record time, and seamlessly transition from DevOps to DevSecOps with Astra's powerful CI/CD integrations. Astra is loved by 800+ companies across 70+ countries. In 2024, Astra uncovered 2.5 million+ vulnerabilities for its customers, saving them $110M+ in potential losses due to security vulnerabilities. We've been awarded by the President of France, Mr. François Hollande, at the La French Tech program and by the Prime Minister of India, Shri Narendra Modi, at the Global Conference on Cyber Security. Loom, MamaEarth, Muthoot Finance, Canara Robeco, Dream 11, and OLX Autos are a few of Astra's customers.

Job Description
This is a remote position.

Role Overview
As Astra Security's first AI Engineer, you will play a pivotal role in introducing and embedding AI into our security products. You will be responsible for designing, developing, and deploying AI applications leveraging both open-source models (Llama, Mistral, DeepSeek, etc.) and proprietary services (OpenAI, Anthropic). Your work will directly impact how AI is used to enhance threat detection, automate security processes, and improve intelligence gathering. This is an opportunity to not only build future AI models but also define Astra Security's AI strategy, laying the foundation for future AI-driven security solutions.

Key Responsibilities
Lead the AI integration efforts within Astra Security, shaping the company's AI roadmap.
Develop and optimize Retrieval-Augmented Generation (RAG) pipelines with multi-tenant capabilities.
Build and enhance RAG applications using LangChain, LangGraph, and vector databases (e.g., Milvus, Pinecone, pgvector).
Implement efficient document chunking, retrieval, and ranking strategies.
Optimize LLM interactions using embeddings, prompt engineering, and memory mechanisms.
Work with graph databases (Neo4j or similar) for structuring and querying knowledge bases.
Design multi-agent workflows using orchestration platforms like LangGraph or other emerging agent frameworks for AI-driven decision-making and reasoning.
Integrate vector search, APIs, and external knowledge sources into agent workflows.
Exposure to an end-to-end AI ecosystem like Hugging Face to accelerate AI development (while initial work won't involve extensive model training, the candidate should be ready for fine-tuning, domain adaptation, and LLM deployment when needed).
Design and develop AI applications using LLMs (Llama, Mistral, OpenAI, Anthropic, etc.).
Build APIs and microservices to integrate AI models into backend architectures.
Collaborate with the product and engineering teams to integrate AI into Astra Security's core offerings.
Stay up to date with the latest advancements in AI and security, ensuring Astra remains at the cutting edge.

What We Are Looking For
Exceptional Python skills for AI/ML development.
Hands-on experience with LLMs and AI frameworks (LangChain, Transformers, RAG-based applications).
Strong understanding of retrieval-augmented generation (RAG) and knowledge graphs.
Experience with AI orchestration tools (LangChain, LangGraph).
Familiarity with graph databases (Neo4j or similar).
Experience with Ollama for efficient AI model deployment for production workloads is a plus.
Experience deploying AI models using Docker.
Hands-on experience with Ollama setup and loading DeepSeek/Llama models.
Strong problem-solving skills and a self-starter mindset—you will be building AI at Astra from the ground up.
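The chunk, retrieve, and rank loop described above can be sketched without any framework. The following is a minimal, dependency-free illustration: in a real pipeline this logic would typically sit behind LangChain and a vector database such as pgvector, and use learned embeddings rather than the bag-of-words vectors used here.

```python
import math
from collections import Counter

def chunk_text(text, size=200, overlap=50):
    """Split text into overlapping character windows (size must exceed overlap),
    a common baseline chunking strategy for RAG pipelines."""
    step = size - overlap
    return [text[start:start + size]
            for start in range(0, max(len(text) - overlap, 1), step)]

def bow_vector(text):
    """Toy stand-in for an embedding: a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse term-frequency vectors."""
    dot = sum(a[t] * b.get(t, 0) for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, chunks, k=2):
    """Rank chunks by similarity to the query and return the top k."""
    qv = bow_vector(query)
    ranked = sorted(chunks, key=lambda c: cosine(qv, bow_vector(c)), reverse=True)
    return ranked[:k]
```

The retrieved chunks would then be stuffed into the LLM prompt; swapping `bow_vector` for a real embedding model and `retrieve` for a vector-store query is the production version of the same flow.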
Nice To Have
Experience with AI deployment frameworks (e.g., BentoML, FastAPI, Flask, AWS).
Background in cybersecurity or security-focused AI applications.

What We Offer
Software Engineering Mindset: This role requires a strong software engineering mindset to build AI solutions from 0 to 1 and scale them based on business needs. The candidate should be comfortable designing, developing, testing, and deploying production-ready AI systems while ensuring maintainability, performance, and scalability.

Why Join Astra Security?
Own and drive the AI strategy at Astra Security from day one.
Fully remote, agile working environment.
Good engineering culture with full ownership of the design, development, and release lifecycle.
A wholesome opportunity where you get to build things from scratch, improve them, and ship code to production in hours, not weeks.
Holistic understanding of the SaaS and enterprise security business.
Annual trips to beaches or mountains (the last one was at Wayanad).
Open and supportive culture.
Health insurance & other benefits for you and your spouse. Maternity benefits included.
Posted 1 week ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Job Description
The Technical Care Specialist provides advanced technical and product support within Care service delivery. This is a global position in a 24x7 support environment, with responsibility for working on complex troubleshooting cases, especially at the solution/system level. The role also acts as a primary interface to R&D for escalation of customer problems and their follow-up until resolution.

How You Will Contribute And What You Will Learn
Helps experts perform troubleshooting methods such as system-level tracing, debug, and protocol flow analysis.
Identifies, reproduces, and characterizes defects, and collaborates promptly with R&D teams on fixes.
Interacts with customers on complex cases, providing workarounds, etc.
Ensures SLAs are met for escalated cases.
Contributes to Root Cause Analysis (RCA) and report creation.
Creates knowledge articles (author, reviewer).
Develops competencies on products and solutions.

Key Skills And Experience
You have:
Bachelor's degree or equivalent experience required.
Typically requires a minimum of 5-8 years of experience with demonstrated passion and achievement for technology in the areas described below, and for customer satisfaction.
Strong fundamentals, including networking, OS concepts, virtualization, security, etc.
Worked on microservices and containers (Kubernetes/Docker); Certified Kubernetes Administrator (CKA) is an advantage.
Worked in areas of telecommunication networking (Core/Access) and know the 5G architecture.
Worked on networking and cybersecurity (TCP/IP v4/v6, firewalls/iptables, and cybersecurity-related solutions such as endpoint security).

It would be nice if you also had:
Azure or any public cloud certifications.
Knowledge of IaaS (OpenStack; Red Hat OpenShift is an advantage), virtualisation (vSphere/vCloud), and Software Defined Networking (SDN, Nuage, OpenFlow).
System administration of Linux/Windows environments.
Experience with RDBMSs like Oracle, MariaDB, and Postgres, and with Neo4j.
About Us
Come create the technology that helps the world act together. Nokia is committed to innovation and technology leadership across mobile, fixed and cloud networks. Your career here will have a positive impact on people's lives and will help us build the capabilities needed for a more productive, sustainable, and inclusive world. We challenge ourselves to create an inclusive way of working where we are open to new ideas, empowered to take risks and fearless to bring our authentic selves to work.

What we offer
Nokia offers continuous learning opportunities, well-being programs to support you mentally and physically, opportunities to join and get supported by employee resource groups, mentoring programs and highly diverse teams with an inclusive culture where people thrive and are empowered.

Nokia is committed to inclusion and is an equal opportunity employer. Nokia has received the following recognitions for its commitment to inclusion & equality:
One of the World's Most Ethical Companies by Ethisphere
Gender-Equality Index by Bloomberg
Workplace Pride Global Benchmark

At Nokia, we act inclusively and respect the uniqueness of people. Nokia's employment decisions are made regardless of race, color, national or ethnic origin, religion, gender, sexual orientation, gender identity or expression, age, marital status, disability, protected veteran status or other characteristics protected by law. We are committed to a culture of inclusion built upon our core value of respect. Join us and be part of a company where you will feel included and empowered to succeed.

About The Team
As Nokia's growth engine, we create value for communication service providers and enterprise customers by leading the transition to cloud-native software and as-a-service delivery models. Our inclusive team of dreamers, doers and disruptors push the limits from impossible to possible.
Posted 1 week ago
6.0 - 8.0 years
6 - 9 Lacs
Bengaluru
Work from Office
Job Code: JC 72927. Experience: 6-8 years. Band: B3. Location: Bangalore, Chennai & Hyderabad.

Below is the JD:
6+ years of hands-on experience in software development or data science.
Support the company's commitment to protect the integrity and confidentiality of systems and data.
Experience building end-to-end analytics platforms using streaming, graph, and big data platforms.
Experience with graph-based data workflows and working with graph analytics.
Extensive hands-on experience in designing, developing, and maintaining software frameworks using Kafka, Spark, Neo4j, and TigerGraph DB.
Hands-on experience with Java, Scala, or Python.
Design, build, and deploy streaming and batch data pipelines capable of processing and storing large datasets quickly and reliably using Kafka, Spark, and YARN.
Experience managing and leading small development teams in an Agile environment.
Drive and maintain a culture of quality, innovation, and experimentation.
Collaborate with product teams, data analysts, and data scientists to design and build data-forward solutions.
Provide the prescriptive point-solution architectures and guide the descriptive architectures within assigned modules.
Own technical decisions for the solution, and guide application developers in the creation of architectural decisions and artifacts.
Manage day-to-day technology architectural decisions for a limited number of assigned modules, including deciding on the best path to achieve requirements and schedules.
Own the quality of modules being delivered; ensure proper testing and validation processes are followed.
Ensure the point-solution architectures are in line with the enterprise strategies and principles.
Review technical designs to ensure that they are consistent with defined architecture principles, standards, and best practices.
Accountable for the availability, stability, scalability, security, and recoverability enabled by the designs.
Ability to clearly communicate with the team & stakeholders.
Key skills: Neo4j, TigerGraph DB.
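As a toy illustration of the graph-analytics workflows this posting describes, the sketch below finds connected components over an in-memory adjacency list. In practice this kind of query would run inside Neo4j or TigerGraph over data streamed in via Kafka; the pure-Python version is only meant to show the shape of the computation.

```python
from collections import defaultdict, deque

def connected_components(edges):
    """Group the nodes of an undirected graph into connected components
    using breadth-first search over an adjacency list."""
    graph = defaultdict(set)
    for a, b in edges:
        graph[a].add(b)
        graph[b].add(a)

    seen, components = set(), []
    for node in graph:
        if node in seen:
            continue
        queue, comp = deque([node]), set()
        while queue:
            n = queue.popleft()
            if n in comp:
                continue
            comp.add(n)
            queue.extend(graph[n] - comp)  # only enqueue unvisited neighbors
        seen |= comp
        components.append(comp)
    return components
```

In Cypher the equivalent idea is a path-reachability query; the point of either formulation is that entities connected by any chain of relationships end up in the same group.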
Posted 1 week ago
6.0 - 11.0 years
9 - 10 Lacs
Bengaluru
Work from Office
Required Skills
Technology | Monitoring and Observability Implementation | Building and deploying observability tools and solutions
Technology | Custom Automation Development | Developing automated workflows for alerting, healing, etc.
Technology | API Integration (ITSM, Monitoring Tools, Notification Mechanisms) | Integrating with ITSM, notification tools, and monitoring platforms
Technology | Gap Identification and Collaboration with Ops | Works with operational teams to find and fix observability gaps
Technology | Programming - Python and JavaScript | Core development skills for integration, logic handling, and scripting
Technology | Database Technologies - Neo4j, MongoDB | Supports data-driven observability (graph relationships, log stores)

Education Qualification: Engineer - B.E / B.Tech / MCA
Certification Mandatory / Desirable: Technology | DevOps Foundation (DOFD by DevOps Institute), CompTIA Server+, Red Hat Certified System Administrator (RHCSA), ITIL 4 Foundation

Responsibilities:
Implements observability solutions.
Develops integrations with monitoring tools, ITSM platforms, and notification mechanisms.
Develops custom automations.
Responsible for the accuracy and completeness of the observability environment.
Works with operational teams to identify and address gaps in observability.
Must be very proficient in Python and JavaScript, API integration, and multiple database technologies, including Neo4j and MongoDB.
Must have experience with stream-processing pipelines and with multiple monitoring tools.
Ideally should have worked on observability tools.
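A minimal sketch of the alert-evaluation step behind the automated alerting workflows mentioned above. The rule schema and severity labels here are illustrative assumptions; in a real deployment the returned payloads would be pushed to an ITSM or notification API rather than returned to the caller.

```python
def evaluate_alerts(metrics, rules):
    """Return an alert payload for every metric that breaches its rule.

    metrics: {"cpu": 95, ...} current readings from a monitoring source
    rules:   {"cpu": {"threshold": 90, "severity": "critical"}, ...}
             (illustrative schema, not any specific tool's format)
    """
    alerts = []
    for name, value in metrics.items():
        rule = rules.get(name)
        if rule and value > rule["threshold"]:
            alerts.append({
                "metric": name,
                "value": value,
                "severity": rule["severity"],
                "summary": f"{name}={value} exceeds {rule['threshold']}",
            })
    return alerts
```

A self-healing workflow would branch on `severity` here: open a ticket via the ITSM API for critical alerts, trigger a remediation script for known-fixable ones.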
Posted 1 week ago
6.0 - 8.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures—and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do
BCG is looking for a Global IT Software Engineer Senior Manager to contribute to the development, deployment, and optimization of cutting-edge Generative AI (GenAI) tools and IT solutions. In this role, you will work closely with cross-functional teams, bringing technical expertise and hands-on problem-solving to ensure the successful delivery of innovative and scalable software solutions that support BCG's business objectives.
Leading the implementation and optimization of GenAI applications and IT tools to enhance productivity and operational efficiency.
Collaborate with Product Owners, Tribe Leaders, and other stakeholders to align technology solutions with business requirements.
Administer and configure AI-powered SaaS tools, ensuring secure deployment and smooth integration across the organization.
Identify opportunities for enhancements to enterprise AI tools, focusing on improving efficiency and user satisfaction.
Support proof-of-concept (POC) projects to explore and validate innovative technologies and solutions.
Continuously assess and optimize software architecture, focusing on scalability, reliability, and alignment with emerging trends.
Document designs, development processes, and best practices to promote knowledge sharing and operational efficiency.
Stay updated on emerging technologies such as LLMs, APIs, and cloud-based solutions, applying these innovations to drive impactful outcomes.

What You'll Bring
A bachelor's degree in Computer Science, Engineering, or a related field. Advanced degrees are a plus.
6-8 years of professional experience in software development or IT operations, with increasing responsibility.
Proven experience in implementing AI-driven applications and SaaS solutions.
Strong technical proficiency in both frontend and backend development (e.g., React, Python, Java, TypeScript).
Experience with cloud technologies and infrastructure as code (e.g., AWS, Kubernetes, Terraform).
Familiarity with software design patterns, architecture trade-offs, and integration best practices.
Knowledge of DevOps practices, CI/CD pipelines, and automated testing frameworks.

Experience And Skills (Nice To Have)
Previous experience building a user-facing GenAI/LLM software application.
Previous experience with vectors and embeddings (pgvector, chromadb).
Knowledge of LLM RAG/Agent core concepts and fundamentals.
Experience with Helm, Neo4j, GraphQL for efficient data querying for APIs, and CI/CD tools like Jenkins for automating deployments.
Other AWS Managed Services (RDS, Batch, Lambda, Fargate, Step Functions, SQS/SNS, etc.).
FastAPI and NextJS experience (if we're still using the latter).
Websockets, Server-Sent Events, Pub/Sub (RabbitMQ, Kafka, etc.).

Who You'll Work With
Squad members of a specific squad, led by a Product Owner.
Tribe Leaders, Product Owners, and other Chapter Leads to align resources and priorities.
Agile Coaches and Scrum Masters to embed Agile practices and principles into daily operations.
Cross-functional IT teams to ensure alignment with BCG's overall IT strategy and architecture.

Additional info
YOU'RE GOOD AT
Driving the adoption and optimization of SaaS tools and AI-driven applications to meet organizational needs.
Solving technical challenges and developing scalable, innovative solutions.
Applying Change Management disciplines to ensure successful technology rollouts.
Proactively identifying and implementing automation capabilities to reduce manual effort and errors.
Collaborating effectively with diverse stakeholders, including technical teams and business leaders.
Adapting to fast-paced environments and evolving priorities with high energy and autonomy.
Leveraging expertise in GenAI, SaaS integrations, cloud technologies, and security to deliver impactful solutions.

Boston Consulting Group is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity / expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
Posted 1 week ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
EXL (NASDAQ: EXLS) is a $7 billion publicly listed company and a rapidly expanding global, digital, data-led AI transformation solutions provider with double-digit growth. EXL's Digital division spearheads the development and implementation of Generative AI (GenAI) business solutions for our clients in Banking & Finance, Insurance, and Healthcare. As a global leader in analytics, digital transformation, and AI innovation, EXL is committed to helping clients unlock the potential of generative AI to drive growth, efficiency, and innovation.

Job Summary
We are seeking a highly skilled AI/ML Engineer - Generative AI to design, develop, and deploy production-grade AI systems and agentic applications. The ideal candidate will have a strong background in Python 3.11+, deep learning, large language models, and distributed systems, with experience building performant, clean, and scalable services.

Key Responsibilities
Build and maintain high-performance REST/WebSocket APIs using FastAPI (Pydantic v2).
Implement and optimize agentic AI systems using LangGraph, AutoGen, and LangChain.
Architect real-time event-driven microservices using Apache Kafka 4.0 and KRaft.
Design clean, testable services using SOLID principles, Python async, and type hints.
Integrate vector databases like Pinecone and Weaviate for embedding storage and retrieval.
Implement graph databases like Neo4j for knowledge-graph-based use cases.
Manage experiment tracking and model lifecycle using MLflow 3.0 or Weights & Biases.
Build and deploy containers using Docker, GitHub Actions, and Kubernetes (nice to have).
Maintain CI/CD pipelines and infrastructure as code with Git and, optionally, Terraform.
Stay current with trends in GenAI, deep learning, and orchestration frameworks.

Minimum Qualifications
Bachelor's degree in Computer Science, Data Science, or a related field.
5+ years of experience in AI/ML engineering with a focus on LLMs and NLP.
2–3 years of hands-on experience with GenAI and LLMs (e.g., GPT, Claude, LLaMA 3).
Proficiency in Python 3.11+ (async, typing, OOP, SOLID principles).
Experience with FastAPI, Pydantic v2, PyTorch 2.x, and Hugging Face Transformers.
Working knowledge of agentic frameworks like LangChain, LangGraph, or AutoGen.
Experience building REST/WebSocket APIs and microservices with Kafka streams.
Proficient in SQL, Pandas, and NumPy for data manipulation.

Preferred Qualifications
Master's or PhD degree in Computer Science, Data Science, or a related field.
Familiarity with graph DBs such as Neo4j for knowledge graphs.
Experience with vector DBs like Pinecone and Weaviate.
Proficiency in MLflow 3.0 or Weights & Biases for experiment tracking.
Experience with CI/CD pipelines, containerization (Docker), orchestration (K8s), and automated deployment workflows.
Exposure to Infrastructure as Code (IaC) using Terraform.
Knowledge of advanced optimization, quantization, and fine-tuning techniques.

Skills and Competencies
Proven ability to architect GenAI solutions and multi-agent systems.
Strong testing skills (unit, integration, performance).
Excellent communication and cross-functional collaboration.
Strong analytical and problem-solving skills.
Leadership and mentoring capability for engineering teams.
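A small sketch of the async, type-hinted service style this posting calls for. The `Completion` shape and the stand-in coroutine are illustrative assumptions; a real service would call an actual LLM client here, and FastAPI endpoints would simply await the same coroutines.

```python
import asyncio
from dataclasses import dataclass

@dataclass
class Completion:
    """Illustrative result type; a real client returns a richer object."""
    model: str
    text: str

async def fake_llm_call(model: str, prompt: str) -> Completion:
    """Stand-in for a network call to an LLM provider."""
    await asyncio.sleep(0)  # yields control, as a real I/O await would
    return Completion(model=model, text=f"echo:{prompt}")

async def fan_out(prompt: str, models: list[str]) -> list[Completion]:
    """Query several models concurrently and gather the results in order."""
    return await asyncio.gather(*(fake_llm_call(m, prompt) for m in models))

if __name__ == "__main__":
    out = asyncio.run(fan_out("ping", ["llama3", "mistral"]))
    print([c.text for c in out])
```

Because `asyncio.gather` preserves argument order, results line up with the `models` list, which keeps downstream code simple and testable.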
Posted 1 week ago
3.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
About ZenDot
ZenDot is a cutting-edge technology company building AI-driven solutions that power the next generation of productivity, intelligence, and automation for businesses. Our focus lies in delivering enterprise-grade tools that combine large language models, real-time data, and deep integrations across knowledge ecosystems. We're building a state-of-the-art internal platform for enterprise semantic search, secure document retrieval, and intelligent knowledge graphs. To lead this mission, we are hiring a Senior AI Engineer to architect and implement a search and knowledge engine inspired by world-class products like Glean, but tailored to our own innovation roadmap.

Key Responsibilities
Lead the end-to-end design and implementation of an enterprise semantic search engine with hybrid retrieval capabilities.
Build robust, scalable data ingestion pipelines to index content from sources like Google Workspace, Slack, Jira, Confluence, GitHub, Notion, and more.
Design and optimize a reranking and LLM augmentation layer to improve the quality and relevance of search results.
Construct an internal knowledge graph mapping users, documents, metadata, and relationships to personalize responses.
Implement permission-aware access filters, ensuring secure and role-based query results across users and teams.
Collaborate on a modular AI orchestration layer, integrating search, chat, summarization, and task triggers.
Maintain model benchmarks, A/B testing frameworks, and feedback loops for continuous learning and improvement.
Work closely with product, security, infra, and frontend teams to deliver high-performance and compliant AI solutions.

Required Skills & Experience
3+ years of experience in AI/ML engineering with deep expertise in information retrieval (IR), NLP, and vector search.
Strong understanding of and hands-on work with BM25 and vector stores (Faiss, Weaviate, Vespa, Elasticsearch).
Proficiency in transformer-based models (BERT, RoBERTa, OpenAI embeddings) and document embedding techniques.
Experience in building hybrid search pipelines (sparse + dense), rerankers, and multi-modal retrieval systems.
Skilled in Python, PyTorch/TensorFlow, and data engineering frameworks (Airflow, Spark, etc.).
Familiar with RBAC systems, OAuth2, and enterprise permissioning logic.
Hands-on with graph data structures or knowledge graph tools like Neo4j, RDF, or custom DAG engines.
Cloud-native architecture experience (AWS/GCP), Kubernetes, and microservices best practices.

Bonus Points For
Building or contributing to open-source IR/NLP/search frameworks (e.g., Haystack, Milvus, LangChain).
Past work with LLM-driven RAG (Retrieval-Augmented Generation) systems.
Familiarity with document-level compliance, access auditing, and SAML/SCIM integrations.
Ability to work in fast-paced, zero-to-one product environments with deep ownership.
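One common way to combine the sparse (BM25) and dense rankings that a hybrid search pipeline produces is reciprocal rank fusion (RRF). A minimal sketch; `k=60` is the constant conventionally used in the RRF literature, and documents are identified by plain strings for illustration.

```python
def reciprocal_rank_fusion(rankings, k=60):
    """Fuse multiple ranked lists of document IDs into one ranking.

    Each document scores sum(1 / (k + rank)) over the lists it appears in,
    with rank starting at 1, so agreement near the top of several lists
    outweighs a single high placement in one list.
    """
    scores = {}
    for ranking in rankings:
        for position, doc in enumerate(ranking):
            scores[doc] = scores.get(doc, 0.0) + 1.0 / (k + position + 1)
    return sorted(scores, key=scores.get, reverse=True)
```

RRF is popular for hybrid search precisely because it needs no score normalization: BM25 scores and cosine similarities live on different scales, but ranks are directly comparable.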
Posted 1 week ago
7.0 years
0 Lacs
India
On-site
About the Role:
We are seeking a passionate and proven Full-Stack Software Engineer I to join our collaborative and fast-paced R&D team. In this role, you will be responsible for implementing software features across our suite of R&D applications. You will work closely with engineers and research scientists to transform business requirements into robust products and features that support data science and research initiatives.

Technologies We Use: Python, FastAPI, SQLAlchemy, Postgres, TypeScript, Next.js, AWS, Terraform

Key Responsibilities:
Design, build, and maintain efficient, reusable, and reliable applications and systems using Python, TypeScript/JavaScript, and AWS.
Collaborate with end-users to understand requirements, develop use cases, and translate them into scalable technical solutions.
Develop creative and scalable engineering solutions for structured and unstructured data integration.
Continuously improve code quality through unit testing, automation, and code reviews.
Contribute to team discussions to improve our technology stack, coding standards, and product development.

Required Qualifications:
7+ years of professional software development experience.
Strong experience with web frameworks such as Next.js and Strapi.
Proficiency with API frameworks, particularly FastAPI.
Solid understanding of relational databases and SQL.
Hands-on experience with CI/CD pipelines.
Proficiency with AWS or other cloud platforms.
Strong grasp of OOP principles and software design best practices.
Ability to work independently with minimal supervision.

Preferred Qualifications:
Experience in Linux/Unix environments.
Exposure to Agile development methodologies.
Experience with building cloud-based data pipelines and ETL processes.
Familiarity with Neo4j or other graph databases.
Knowledge of C# and .NET.
Understanding of DevOps and cloud security best practices.
Experience with Infrastructure as Code (IaC) tools like Terraform.
Self-motivated and eager to learn new technologies.

If you're
excited about solving challenging problems, working with modern technologies, and contributing to impactful research and data initiatives, we would love to hear from you.
Posted 1 week ago
3.0 - 5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Description
Job Summary: Responsible for developing software programs per technical specifications following programming standards and procedures, performing testing, executing program modifications, and responding to problems by diagnosing and correcting errors in logic and coding.

Key Responsibilities
Applies secure coding and UI standards and best practices to develop, enhance, and maintain IT applications and programs.
Assists with efforts to configure, analyze, design, develop, and maintain program code and applications.
Performs unit testing and secure code testing, and resolves issues.
Follows the process for source code management.
Participates in integration, systems, and performance testing and tuning of code.
Participates in peer secure code reviews.
Harvests opportunities for reusability of code, configurations, procedures, and techniques.

Responsibilities
Competencies:
Action oriented - Taking on new opportunities and tough challenges with a sense of urgency, high energy, and enthusiasm.
Balances stakeholders - Anticipating and balancing the needs of multiple stakeholders.
Business insight - Applying knowledge of business and the marketplace to advance the organization's goals.
Drives results - Consistently achieving results, even under tough circumstances.
Plans and aligns - Planning and prioritizing work to meet commitments aligned with organizational goals.
Tech savvy - Anticipating and adopting innovations in business-building digital and technology applications.
Performance Tuning - Conceptualizes, analyzes and solves application, database and hardware problems using industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
Programming - Creates, writes and tests computer code, test scripts, and build scripts using algorithmic analysis and design, industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
Solution Configuration - Configures, creates and tests a solution for commercial off-the-shelf (COTS) applications using industry standards and tools, version control, and build and test automation to meet business, technical, security, governance and compliance requirements.
Solution Functional Fit Analysis - Composes and decomposes a system into its component parts using procedures, tools and work aides for the purpose of studying how well the component parts were designed, purchased and configured to interact holistically to meet business, technical, security, governance and compliance requirements.
Solution Validation Testing - Validates a configuration item change or solution using the Function's defined best practices, including the Systems Development Life Cycle (SDLC) standards, tools and metrics, to ensure that it works as designed and meets customer requirements.
Values differences - Recognizing the value that different perspectives and cultures bring to an organization.

Education, Licenses, Certifications
College, university, or equivalent degree in Computer Science, Information Technology, Business, or a related subject, or relevant equivalent experience required. This position may require licensing for compliance with export controls or sanctions regulations.

Experience
Intermediate level of relevant work experience required: 3-5 years of experience.

Qualifications
Key Responsibilities:
Development & Coding:
Design, develop, and maintain scalable web applications using modern front-end and back-end technologies.
Write clean, efficient, and reusable code for both front-end and back-end components.
Integrate APIs and third-party services into the web applications.
Develop and manage NoSQL database schemas, and optimize queries for performance and scalability.

Collaborative Problem-Solving:
Collaborate with product managers, designers, and other developers to create functional, user-friendly, and visually appealing web applications.
Participate in code reviews to ensure code quality, security, and maintainability.
Troubleshoot, debug, and optimize applications for better performance and user experience.

Technical Leadership & Mentorship:
Provide guidance and support to junior developers and help them grow technically.
Continuously stay updated with new technologies, tools, and best practices to contribute innovative ideas to the team.

Front-End Development:
Build responsive and adaptive user interfaces using modern front-end frameworks and libraries (e.g., React, Angular, Vue.js).
Implement best practices for UI/UX design and ensure the application is mobile-friendly.

Back-End Development:
Develop RESTful APIs, microservices, and server-side logic using backend technologies (e.g., Node.js, Python, Java, TypeScript).
Ensure security, data protection, and compliance with industry standards.

Database & Storage:
Design, implement, and manage relational (SQL) and non-relational (NoSQL) databases such as PostgreSQL, MySQL, Neo4j, CosmosDB, etc.
Perform database optimizations for faster query processing and better performance.

Version Control & Deployment:
Use version control systems (e.g., Git) to manage and document changes to the codebase.
Participate in continuous integration and continuous deployment (CI/CD) processes, ensuring the software is regularly deployed to production.

Testing & Debugging:
Write unit, integration, and end-to-end tests for applications to ensure robustness and reliability.
Conduct thorough testing and debugging to ensure a smooth user experience.

Documentation:
Document technical specifications, API endpoints, and any relevant development processes.
Maintain clear and concise documentation for code, database schemas, and deployment procedures.

Technical Skill Set
Front-End Technologies:
Strong experience with HTML5, CSS3, and JavaScript.
Proficiency in front-end frameworks such as React, Angular, or Vue.js.
Knowledge of responsive design and cross-browser compatibility.
Familiarity with front-end build tools (Webpack, Gulp, etc.).

Back-End Technologies:
Proficient in one or more back-end programming languages such as Node.js, Python, or Java.
Experience with server-side frameworks (Express.js, Django, Spring, GraphQL, etc.).
Strong knowledge of RESTful API and GraphQL design and development.
Strong experience with Azure cloud web services.
Experience in Kubernetes development and deployment.

Databases:
Proficiency in relational databases (SQL Server, PostgreSQL, etc.).
Knowledge of NoSQL databases (MongoDB, Neo4j, CosmosDB, Redis, etc.).
Strong SQL skills and ability to write optimized queries.

Version Control:
Experience with Git for version control, including branching, merging, and pull requests.
Familiarity with Git workflows such as GitFlow or trunk-based development.

Deployment & DevOps:
Experience with CI/CD tools such as Jenkins, GitLab CI, or CircleCI.
Familiarity with containerization technologies like Docker and container orchestration platforms like Kubernetes.
Knowledge of cloud platforms (AWS, Azure, GCP) for hosting and deploying applications.

Testing & Debugging:
Knowledge of testing frameworks and tools like Jest, Mocha, or Jasmine.
Experience with test-driven development (TDD) and writing unit and integration tests.
Familiarity with debugging tools and strategies.

Agile Methodology:
Experience working in Agile development environments, participating in Scrum ceremonies (stand-ups, sprint planning, etc.).
Familiarity with project management tools like Jira, Trello, or Asana.

Additional Skills:
Strong problem-solving skills and ability to think critically.
Good understanding of web security best practices (e.g., OWASP Top 10). Ability to work in a collaborative, team-oriented environment. Strong communication skills and ability to articulate technical concepts to non-technical stakeholders. Preferred Qualifications 3-5 years of hands-on experience as a full-stack developer. Bachelor's degree in Computer Science, Engineering, or related field (or equivalent experience). Familiarity with additional technologies or frameworks like React, Vue.js, Svelte, etc. Job Systems/Information Technology Organization Cummins Inc. Role Category Hybrid Job Type Exempt - Experienced ReqID 2411090 Relocation Package Yes
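The testing expectations in the posting above (unit tests, TDD with Jest/Mocha-style frameworks) can be illustrated with a short pytest-style sketch in Python; `slugify` is a hypothetical helper, not part of any Cummins codebase:

```python
import re

def slugify(title: str) -> str:
    """Hypothetical helper: turn a page title into a URL-safe slug."""
    # Lowercase, collapse runs of non-alphanumerics into a single hyphen,
    # then trim leading/trailing hyphens.
    slug = re.sub(r"[^a-z0-9]+", "-", title.lower())
    return slug.strip("-")

# pytest-style unit tests: each one pins a single behaviour.
def test_spaces_become_hyphens():
    assert slugify("Hello World") == "hello-world"

def test_punctuation_is_collapsed():
    assert slugify("C++ & Rust!") == "c-rust"

def test_leading_trailing_noise_is_trimmed():
    assert slugify("  --Hi--  ") == "hi"
```

Each test documents one behaviour the function must keep, which is the TDD loop in miniature: write the assertion first, then make the implementation satisfy it.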
Posted 2 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
Bengaluru
Work from Office
Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Neo4j, Stardog Good to have skills : Java. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Mentor junior team members to enhance their skills and knowledge in data engineering.- Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Neo4j.- Good To Have Skills: Experience with Java.- Strong understanding of data modeling and graph database concepts.- Experience with data integration tools and ETL processes.- Familiarity with data quality frameworks and best practices.- Proficient in programming languages such as Python or Scala for data manipulation. 
Additional Information:- The candidate should have a minimum of 5 years of experience in Neo4j.- This position is based at our Bengaluru office.- A 15 years full time education is required. Qualification 15 years full time education
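The ETL responsibilities described above (extract, transform and load, plus data quality) can be sketched engine-free in a few lines of Python; the CSV sample, field names, and the quality check are hypothetical:

```python
import csv
import io

# Hypothetical raw feed: messy whitespace, one missing city
RAW = """id,name,city
1, Asha ,Bengaluru
2,Vikram,
3,Meera,Pune
"""

def extract(raw: str):
    """Extract: parse raw CSV text into dict rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows):
    """Transform: type the id, trim whitespace, make nulls explicit."""
    return [{
        "id": int(row["id"]),
        "name": row["name"].strip(),
        "city": row["city"].strip() or None,  # empty string -> explicit null
    } for row in rows]

def load(rows, store):
    """Load: upsert rows into a target keyed by id (a dict stands in here)."""
    for row in rows:
        store[row["id"]] = row
    return store

store = load(transform(extract(RAW)), {})

# Simple data-quality gate: every id loaded, null cities flagged for review
missing_city = [r["id"] for r in store.values() if r["city"] is None]
```

In a real pipeline the load target would be Neo4j or a warehouse and the quality gate a framework check, but the three-stage shape is the same.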
Posted 2 weeks ago
15.0 - 20.0 years
4 - 8 Lacs
Pune
Work from Office
Project Role : Data Engineer Project Role Description : Design, develop and maintain data solutions for data generation, collection, and processing. Create data pipelines, ensure data quality, and implement ETL (extract, transform and load) processes to migrate and deploy data across systems. Must have skills : Neo4j, Stardog Good to have skills : Java. Minimum 5 year(s) of experience is required Educational Qualification : 15 years full time education Summary : As a Data Engineer, you will design, develop, and maintain data solutions that facilitate data generation, collection, and processing. Your typical day will involve creating data pipelines, ensuring data quality, and implementing ETL processes to migrate and deploy data across various systems. You will collaborate with cross-functional teams to understand their data needs and provide effective solutions, ensuring that the data infrastructure is robust and scalable to meet the demands of the organization. Roles & Responsibilities:- Expected to be an SME.- Collaborate and manage the team to perform.- Responsible for team decisions.- Engage with multiple teams and contribute on key decisions.- Provide solutions to problems for their immediate team and across multiple teams.- Mentor junior team members to enhance their skills and knowledge in data engineering.- Continuously evaluate and improve data processes to enhance efficiency and effectiveness. Professional & Technical Skills: - Must To Have Skills: Proficiency in Neo4j.- Good To Have Skills: Experience with Java.- Strong understanding of data modeling and graph database concepts.- Experience with data integration tools and ETL processes.- Familiarity with data quality frameworks and best practices.- Proficient in programming languages such as Python or Scala for data manipulation. 
Additional Information:- The candidate should have a minimum of 5 years of experience in Neo4j.- This position is based at our Pune office.- A 15 years full time education is required. Qualification 15 years full time education
Posted 2 weeks ago
2.0 - 6.0 years
7 - 11 Lacs
Bengaluru
Work from Office
We are looking for: Experience At least 2 to 3 years of experience in NodeJS, TypeScript, React is required Proven experience in building, deploying, maintaining & scaling APIs and microservices Job Responsibilities Solid experience in NodeJS, TypeScript, React, Neo4j and Firestore (GCP). In-depth knowledge of software design & development practices Design and develop scalable systems using advanced concepts in NodeJS, TypeScript, JavaScript, and React. Should have a good understanding of deploying and working with GKE. Ability to design for scale and performance; peer code reviews Architecture/platform development, API, data modelling at scale Excellent working experience in Express, Knex, Serverless GC Functions Solid experience in JavaScript frameworks (Angular / React.JS), Redux, JavaScript, jQuery, CSS, HTML5, ES5, ES6 & ES7, in-memory databases (Redis / Hazelcast), build tools (webpack) Good error and exception handling skills. Ability to work with Git repositories and remote code hosting services like GitHub and GitLab Ability to deliver amazing results with minimal guidance and supervision Passionate (especially about web development!), highly motivated, and fun to work with
Posted 2 weeks ago
3.0 - 5.0 years
22 - 25 Lacs
Hyderabad
Work from Office
We are looking for an experienced Neo4j / Neptune Developer to join our team in Hyderabad. In this role, you will be responsible for designing, implementing, and optimizing graph-based solutions using Neo4j or Amazon Neptune databases. You will collaborate with cross-functional teams to integrate and deploy graph technologies that solve complex business problems. This is a fantastic opportunity for someone who thrives in a dynamic environment and is excited about leveraging graph databases to create innovative solutions. Key Responsibilities: Design, develop, and maintain graph database models using Neo4j or Amazon Neptune. Develop and implement queries in graph query languages such as Cypher (for Neo4j) or SPARQL (for Amazon Neptune) for efficient data retrieval. Optimize graph database performance for large-scale data and high-volume queries. Collaborate with teams to identify business requirements and design graph-based data models. Integrate graph database solutions with existing systems and applications. Troubleshoot and resolve performance issues or bugs within the graph database solutions. Contribute to the continuous improvement of the development process, tools, and techniques. Provide support for data migration and integration of graph technologies with other enterprise systems. Write high-quality, clean, and maintainable code, ensuring best practices are followed. Required Skills and Qualifications: 3-5 years of experience in developing with Neo4j or Amazon Neptune. Strong knowledge of graph database modeling, relationships, and graph theory. Proficiency in the Cypher query language (for Neo4j) and SPARQL (for Amazon Neptune). Hands-on experience with graph analytics and performance tuning. Experience integrating graph databases with other systems and services. Familiarity with NoSQL databases and distributed data architectures. Understanding of cloud-based graph database solutions (e.g., AWS Neptune). Ability to work in an Agile development environment. Strong troubleshooting and problem-solving skills. Excellent 
written and verbal communication skills Technical Skills: Neo4j | Amazon Neptune | Cypher | SPARQL | Graph Database Modeling | NoSQL | Graph Analytics | Python | Java | AWS | Data Migration | ETL | Cloud Solutions | Agile
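As a rough illustration of what a Cypher path query computes, here is a pure-Python breadth-first search over a toy in-memory graph; the node names and the Cypher shown in the docstring are hypothetical, and a real deployment would run Cypher against Neo4j rather than hand-rolling the traversal:

```python
from collections import deque

# Tiny property graph: node -> list of (relationship type, neighbour)
GRAPH = {
    "Alice": [("KNOWS", "Bob")],
    "Bob":   [("KNOWS", "Carol")],
    "Carol": [("WORKS_AT", "Acme")],
}

def shortest_knows_path(start, goal):
    """BFS over KNOWS edges -- roughly what a Cypher pattern like
    MATCH p = shortestPath((a {name:$a})-[:KNOWS*]->(b {name:$b})) RETURN p
    computes declaratively inside the database."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for rel, nxt in GRAPH.get(path[-1], []):
            # Only follow KNOWS edges, mirroring the [:KNOWS*] pattern
            if rel == "KNOWS" and nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None  # no path through KNOWS relationships
```

The point of a graph database is that this traversal (plus indexing and planning) is handled by the engine; the application just states the pattern.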
Posted 2 weeks ago
7.0 years
5 - 7 Lacs
Hyderābād
On-site
You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don’t pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry. As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platforms and data products. Drive significant business impact and help shape the global target state architecture through your capabilities in multiple data architecture domains. Job responsibilities Represents the data architecture team at technical governance bodies and provides feedback on proposed improvements to data architecture governance practices Evaluates new and current technologies using existing data architecture standards and frameworks Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others Drives data architecture decisions that impact data product & platform design, application functionality, and technical operations and processes Serves as a function-wide subject matter expert in one or more areas of focus Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle Influences peers and project decision-makers to consider the use and application of leading-edge technologies Advises junior architects and technologists Required qualifications, capabilities, and skills 7+ years of hands-on practical experience delivering data architecture and system designs, data engineering, testing, and operational stability Advanced knowledge of architecture, applications, and technical processes with 
considerable in-depth knowledge in the data architecture discipline and solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain driven design, etc.) Practical cloud-based data architecture and deployment experience, preferably AWS Practical SQL development experience in cloud-native relational databases, e.g. Snowflake, Athena, Postgres Ability to deliver various types of data models with multiple deployment targets, e.g. conceptual, logical and physical data models deployed as operational vs. analytical data stores Advanced in one or more data engineering disciplines, e.g. streaming, ELT, event processing Ability to tackle design and functionality problems independently with little to no oversight Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future state data architecture Preferred qualifications, capabilities, and skills Financial services experience, card and banking a big plus Practical experience in modern data processing technologies, e.g., Kafka streaming, DBT, Spark, Airflow, etc. Practical experience in data mesh and/or data lake Practical experience in machine learning/AI with Python development a big plus Practical experience in graph and semantic technologies, e.g. RDF, LPG, Neo4j, Gremlin Knowledge of architecture assessment frameworks, e.g. the Architecture Tradeoff Analysis Method (ATAM)
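The "streaming, ELT, event processing" qualification above can be made concrete with a tumbling-window aggregation, a basic building block of stream processing; the events and the window size here are hypothetical, and no particular engine (Kafka Streams, Spark) is assumed:

```python
from collections import defaultdict

# Hypothetical card-transaction events: (epoch_seconds, account, amount)
EVENTS = [
    (0,  "acct-1", 20.0),
    (12, "acct-1", 5.0),
    (61, "acct-1", 7.5),
    (65, "acct-2", 40.0),
]

def tumbling_window_sums(events, window_s=60):
    """Sum amounts per (account, window). A tumbling window assigns each
    event to exactly one fixed-size, non-overlapping time bucket, which
    is what ts // window_s computes."""
    out = defaultdict(float)
    for ts, acct, amount in events:
        out[(acct, ts // window_s)] += amount
    return dict(out)
```

A real streaming engine adds watermarks, late-data handling, and fault-tolerant state, but the windowed group-and-aggregate core is the same.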
Posted 2 weeks ago
7.0 years
0 Lacs
Hyderābād
On-site
JOB DESCRIPTION You strive to be an essential member of a diverse team of visionaries dedicated to making a lasting impact. Don’t pass up this opportunity to collaborate with some of the brightest minds in the field and deliver best-in-class solutions to the industry. As a Senior Lead Data Architect at JPMorgan Chase within Consumer and Community Banking Data Technology, you are an integral part of a team that works to develop high-quality data architecture solutions for various software applications, platforms and data products. Drive significant business impact and help shape the global target state architecture through your capabilities in multiple data architecture domains. Job responsibilities Represents the data architecture team at technical governance bodies and provides feedback on proposed improvements to data architecture governance practices Evaluates new and current technologies using existing data architecture standards and frameworks Regularly provides technical guidance and direction to support the business and its technical teams, contractors, and vendors Designs secure, high-quality, scalable solutions and reviews architecture solutions designed by others Drives data architecture decisions that impact data product & platform design, application functionality, and technical operations and processes Serves as a function-wide subject matter expert in one or more areas of focus Actively contributes to the data engineering community as an advocate of firmwide data frameworks, tools, and practices in the Software Development Life Cycle Influences peers and project decision-makers to consider the use and application of leading-edge technologies Advises junior architects and technologists Required qualifications, capabilities, and skills 7+ years of hands-on practical experience delivering data architecture and system designs, data engineering, testing, and operational stability Advanced knowledge of architecture, applications, and technical 
processes with considerable in-depth knowledge in the data architecture discipline and solutions (e.g., data modeling, native cloud data services, business intelligence, artificial intelligence, machine learning, data domain driven design, etc.) Practical cloud-based data architecture and deployment experience, preferably AWS Practical SQL development experience in cloud-native relational databases, e.g. Snowflake, Athena, Postgres Ability to deliver various types of data models with multiple deployment targets, e.g. conceptual, logical and physical data models deployed as operational vs. analytical data stores Advanced in one or more data engineering disciplines, e.g. streaming, ELT, event processing Ability to tackle design and functionality problems independently with little to no oversight Ability to evaluate current and emerging technologies to select or recommend the best solutions for the future state data architecture Preferred qualifications, capabilities, and skills Financial services experience, card and banking a big plus Practical experience in modern data processing technologies, e.g., Kafka streaming, DBT, Spark, Airflow, etc. Practical experience in data mesh and/or data lake Practical experience in machine learning/AI with Python development a big plus Practical experience in graph and semantic technologies, e.g. RDF, LPG, Neo4j, Gremlin Knowledge of architecture assessment frameworks, e.g. the Architecture Tradeoff Analysis Method (ATAM) ABOUT US JPMorgan Chase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands. Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. 
We recognize that our people are our strength and the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation. ABOUT THE TEAM Our Consumer & Community Banking division serves our Chase customers through a range of financial services, including personal banking, credit cards, mortgages, auto financing, investment advice, small business loans and payment processing. We’re proud to lead the U.S. in credit card sales and deposit growth and have the most-used digital solutions – all while ranking first in customer satisfaction.
Posted 2 weeks ago
3.0 - 5.0 years
12 - 20 Lacs
Pune
Remote
Role & responsibilities Lead the creation, development, and implementation of critical system design changes, enhancements, and software projects. Ensure timely execution of project deliverables. Work with other engineers to ensure the system and product is consistent and aligned through all processes. Improve product quality, performance, and security through substantial process improvements. Follow development standards and promote best practices. Individual contributor as an engineer. Requirements and Qualifications: 3+ years of experience in Python programming. Experience with Neo4j for graph database management and querying. Familiarity with Postgres and Clickhouse for database management and optimization. Experience with cloud platforms including AWS, Azure, and GCP. Understanding of serverless architecture for building and deploying applications. Experience with SaaS (Software as a Service)/product development. Experience with containerization and orchestration technologies (e.g., Docker, Kubernetes). Exceptional problem-solving and analytical skills. Excellent communication and teamwork abilities. Bonus points if you have: Experience with AWS ECS and EKS. Familiarity with any open-source vulnerability/secret scanning tool. Benefits Our Culture: We have an autonomous and empowered work culture encouraging individuals to take ownership and grow quickly. Flat hierarchy with fast decision making and a startup-oriented, get-things-done culture. A strong, fun & positive environment with regular celebrations of our success. We pride ourselves on creating an inclusive, diverse & authentic environment.
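The serverless architecture mentioned in the requirements above typically centres on small request handlers. A minimal sketch using the AWS-Lambda-style `handler(event, context)` convention follows; the route logic and field names are hypothetical:

```python
import json

def handler(event, context=None):
    """Lambda-style entry point: parse the request body, do the work,
    and return an HTTP-shaped response dict (API Gateway proxy format)."""
    try:
        body = json.loads(event.get("body") or "{}")
        name = body.get("name", "world")
        return {
            "statusCode": 200,
            "body": json.dumps({"message": f"hello, {name}"}),
        }
    except json.JSONDecodeError:
        # Malformed input maps to a client error, not a crash
        return {
            "statusCode": 400,
            "body": json.dumps({"error": "invalid JSON"}),
        }
```

Because the handler is a plain function with dict in/dict out, it is trivially unit-testable without deploying anything, which is one of the practical draws of the serverless style.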
Posted 2 weeks ago
9.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Generative AI – Application Developer EY’s GDS Tax Technology team’s mission is to develop, implement and integrate technology solutions that better serve our clients and engagement teams. As a member of EY’s core Tax practice, you’ll develop a deep tax technical knowledge and outstanding database, data analytics and programming skills. Ever-increasing regulations require tax departments to gather, organize and study more data than ever before. Often the data necessary to satisfy these ever-increasing and complex regulations must be collected from a variety of systems and departments throughout an organization. Effectively and efficiently handling the variety and volume of data is often extremely challenging and time consuming for a company. EY's GDS Tax Technology team members work side-by-side with the firm's partners, clients and tax technical subject matter experts to develop and incorporate technology solutions that enhance value-add, improve efficiencies and enable our clients with disruptive and market leading tools supporting Tax. GDS Tax Technology works closely with clients and professionals in the following areas: Federal Business Tax Services, Partnership Compliance, Corporate Compliance, Indirect Tax Services, Human Capital, and Internal Tax Services. GDS Tax Technology provides solution architecture, application development, testing and maintenance support to the global TAX service line both on a pro-active basis and in response to specific requests. EY is currently seeking a Generative AI – Application Developer (.NET) to join our Tax Technology practice in Bangalore & Kolkata India. 
The opportunity We’re looking for Tax Seniors with expertise in Full-stack Application Development using .NET C# for Generative AI applications to join the TTT team in the Tax Service Line. This is a fantastic opportunity to be part of a pioneer firm whilst being instrumental in the growth of a new service offering. Your Key Responsibilities Design, develop, and implement AI agents/plugins/interfaces and APIs, ensuring integration with various systems aligns with the core product/platform development strategy. Estimate and manage technical efforts, including work breakdown structures, risks, and solutions, while adhering to development methodologies and KPIs. Maintain effective communication within the team and with stakeholders, proactively managing expectations and collaborating on problem-solving. Contribute to the refinement of development/engineering methodologies and standards, anticipating potential issues and leading the resolution process. Skills And Attributes For Success Must-Have: Skilled in full-stack application development with .NET C#, REST APIs, React or any other TypeScript-based UI framework, and SQL databases Advanced knowledge of Azure services such as Azure App Service, Azure Functions, Entra ID etc. Containerisation – Docker, Azure Container Apps, Azure Kubernetes Service (AKS) NoSQL databases such as Cosmos DB or MongoDB Working experience with source control such as Git or TFVC CI/CD pipelines, Azure DevOps, GitHub Actions etc. Generative AI application development with Azure OpenAI, Semantic Kernel, and vector databases like Azure AI Search, Postgres, etc. 
Fundamental understanding of various types of Large Language Models (LLMs) Fundamental understanding of Retrieval-Augmented Generation (RAG) techniques Fundamental understanding of classical AI/ML Skilled in advanced prompt engineering Nice-to-Have: Awareness of various AI agent/agentic workflow frameworks and SDKs Graph databases such as Neo4j Experience with M365 Copilot Studio Microsoft Azure AI-900/ AI-102 Certification Behavioural Skills: Excellent learning ability. Strong communication skills. Flexibility to work both independently and as part of a larger team. Strong analytical skills and attention to detail. The ability to adapt your work style to work with both internal and client team members. To qualify for the role, you must have Bachelor’s / master’s degree in software engineering / information technology / BE/ B.TECH An overall 5 – 9 years of experience. Ideally, you’ll also have Thorough knowledge of the Tax or Finance domain. Strong analytical skills and attention to detail. What We Look For A team of people with commercial acumen, technical experience and enthusiasm to learn new things in this fast-moving environment An opportunity to be a part of a market-leading, multi-disciplinary team of 1,400+ professionals, in the only integrated global transaction business worldwide. Opportunities to work with EY TAS practices globally with leading businesses across a range of industries What We Offer EY Global Delivery Services (GDS) is a dynamic and truly global delivery network. We work across six locations – Argentina, China, India, the Philippines, Poland and the UK – and with teams from all EY service lines, geographies and sectors, playing a vital role in the delivery of the EY growth strategy. From accountants to coders to advisory consultants, we offer a wide variety of fulfilling career opportunities that span all business disciplines. In GDS, you will collaborate with EY teams on exciting projects and work with well-known brands from across the globe. 
We’ll introduce you to an ever-expanding ecosystem of people, learning, skills and insights that will stay with you throughout your career. Continuous learning: You’ll develop the mindset and skills to navigate whatever comes next. Success, as defined by you: We’ll provide the tools and flexibility, so you can make a meaningful impact, your way. Transformative leadership: We’ll give you the insights, coaching and confidence to be the leader the world needs. Diverse and inclusive culture: You’ll be embraced for who you are and empowered to use your voice to help others find theirs. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
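The RAG techniques listed in the posting above reduce, at their core, to retrieve-then-prompt. Here is a toy sketch that uses bag-of-words cosine similarity as a stand-in for real embeddings; the documents and scoring are hypothetical, and a production system would use an embedding model plus a vector database such as Azure AI Search:

```python
import math
from collections import Counter

# Hypothetical knowledge base of tax snippets keyed by id
DOCS = {
    "vat": "Indirect tax returns must reconcile VAT collected against invoices.",
    "payroll": "Payroll withholding rates depend on employee residency status.",
    "transfer": "Transfer pricing documentation covers intercompany transactions.",
}

def vector(text):
    """Bag-of-words term counts -- a crude stand-in for an embedding."""
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, k=1):
    """Rank documents by similarity to the query; keep the top k."""
    q = vector(query)
    ranked = sorted(DOCS, key=lambda d: cosine(q, vector(DOCS[d])), reverse=True)
    return ranked[:k]

def build_prompt(query):
    """Ground the LLM: stuff retrieved context ahead of the question."""
    context = " ".join(DOCS[d] for d in retrieve(query))
    return f"Answer using only this context: {context}\nQuestion: {query}"
```

The assembled prompt is what would be sent to the LLM (e.g., via Azure OpenAI); retrieval quality, chunking, and prompt structure are where most of the engineering effort goes.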
Posted 2 weeks ago
2.0 years
3 Lacs
Coimbatore
On-site
Technical Expertise : (minimum 2 years of relevant experience) ● Solid understanding of Generative AI models and Natural Language Processing (NLP) techniques, including Retrieval-Augmented Generation (RAG) systems, text generation, and embedding models. ● Exposure to Agentic AI concepts, multi-agent systems, and agent development using open-source frameworks like LangGraph and LangChain. ● Hands-on experience with modality-specific encoder models (text, image, audio) for multi-modal AI applications. ● Proficient in model fine-tuning and prompt engineering, using both open-source and proprietary LLMs. ● Experience with model quantization, optimization, and conversion techniques (FP32 to INT8, ONNX, TorchScript) for efficient deployment, including edge devices. ● Deep understanding of inference pipelines, batch processing, and real-time AI deployment on both CPU and GPU. ● Strong MLOps knowledge with experience in version control, reproducible pipelines, continuous training, and model monitoring using tools like MLflow, DVC, and Kubeflow. ● Practical experience with scikit-learn, TensorFlow, and PyTorch for experimentation and production-ready AI solutions. ● Familiarity with data preprocessing, standardization, and knowledge graphs (nice to have). ● Strong analytical mindset with a passion for building robust, scalable AI solutions. ● Skilled in Python, writing clean, modular, and efficient code. ● Proficient in RESTful API development using Flask, FastAPI, etc., with integrated AI/ML inference logic. ● Experience with MySQL, MongoDB, and vector databases like FAISS, Pinecone, or Weaviate for semantic search. ● Exposure to Neo4j and graph databases for relationship-driven insights. ● Hands-on with Docker and containerization to build scalable, reproducible, and portable AI services. ● Up-to-date with the latest in GenAI, LLMs, Agentic AI, and deployment strategies. 
● Strong communication and collaboration skills, able to contribute in cross-functional and fast-paced environments. Bonus Skills ● Experience with cloud deployments on AWS, GCP, or Azure, including model deployment and model inferencing. ● Working knowledge of Computer Vision and real-time analytics using OpenCV, YOLO, and similar tools. Job Type: Full-time Pay: From ₹300,000.00 per year Schedule: Day shift Experience: AI Engineer: 1 year (Required) Work Location: In person Expected Start Date: 23/06/2025
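The FP32-to-INT8 quantization named in the expertise list above can be sketched with plain affine (scale plus zero-point) arithmetic; real toolchains (e.g., ONNX Runtime, TensorRT) add calibration and per-channel scales, so treat this as an illustration only:

```python
def quantize(values, bits=8):
    """Affine quantization of a list of floats to the signed int range
    for the given bit width (int8: -128..127). Returns the quantized
    values plus the (scale, zero_point) needed to map back."""
    qmin, qmax = -(2 ** (bits - 1)), 2 ** (bits - 1) - 1
    lo, hi = min(values), max(values)
    scale = (hi - lo) / (qmax - qmin) or 1.0  # guard against constant input
    zero_point = round(qmin - lo / scale)
    q = [max(qmin, min(qmax, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Map int values back to floats; the round trip loses at most ~scale."""
    return [(v - zero_point) * scale for v in q]
```

The memory win is the point: each FP32 weight (4 bytes) becomes one int8 byte, at the cost of bounded rounding error, which is why the technique matters for edge deployment.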
Posted 2 weeks ago
4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Role Summary Pfizer’s purpose is to deliver breakthroughs that change patients’ lives. Research and Development is at the heart of fulfilling Pfizer’s purpose as we work to translate advanced science and technologies into the therapies and vaccines that matter most. Whether you are in the discovery sciences, ensuring drug safety and efficacy or supporting clinical trials, you will apply cutting edge design and process development capabilities to accelerate and bring the best in class medicines to patients around the world. Pfizer is seeking a highly skilled and motivated AI Engineer to join our advanced technology team. The successful candidate will be responsible for developing, implementing, and optimizing artificial intelligence models and algorithms to drive innovation and efficiency in our Data Analytics and Supply Chain solutions. This role demands a collaborative mindset, a passion for cutting-edge technology, and a commitment to improving patient outcomes. Role Responsibilities Lead data modeling and engineering efforts within advanced data platforms teams to achieve digital outcomes. Provides guidance and may lead/co-lead moderately complex projects. Oversee the development and execution of test plans, creation of test scripts, and thorough data validation processes. Lead the architecture, design, and implementation of Cloud Data Lake, Data Warehouse, Data Marts, and Data APIs. Lead the development of complex data products that benefit PGS and ensure reusability across the enterprise. Collaborate effectively with contractors to deliver technical enhancements. Oversee the development of automated systems for building, testing, monitoring, and deploying ETL data pipelines within a continuous integration environment. Collaborate with backend engineering teams to analyze data, enhancing its quality and consistency. Conduct root cause analysis and address production data issues. 
Lead the design, development, and implementation of AI models and algorithms to support sophisticated data analytics and supply chain initiatives. Stay abreast of the latest advancements in AI and machine learning technologies and apply them to Pfizer's projects. Provide technical expertise and guidance to team members and stakeholders on AI-related initiatives. Document and present findings, methodologies, and project outcomes to various stakeholders. Integrate and collaborate with different technical teams across Digital to drive overall implementation and delivery. Ability to work with large and complex datasets, including data cleaning, preprocessing, and feature selection. Basic Qualifications A bachelor's or master’s degree in Computer Science, Artificial Intelligence, Machine Learning, or a related discipline. Over 4 years of experience as a Data Engineer, Data Architect, or in Data Warehousing, Data Modeling, and Data Transformations. Over 2 years of experience in AI, machine learning, and large language model (LLM) development and deployment. Proven track record of successfully implementing AI solutions in a healthcare or pharmaceutical setting is preferred. Strong understanding of data structures, algorithms, and software design principles Programming Languages: Proficiency in Python, SQL, and familiarity with Java or Scala AI and Automation: Knowledge of AI-driven tools for data pipeline automation, such as Apache Airflow or Prefect. Ability to use GenAI or agents to augment data engineering practices Preferred Qualifications Data Warehousing: Experience with data warehousing solutions such as Amazon Redshift, Google BigQuery, or Snowflake. ETL Tools: Knowledge of ETL tools like Apache NiFi, Talend, or Informatica. Big Data Technologies: Familiarity with Hadoop, Spark, and Kafka for big data processing. Cloud Platforms: Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform (GCP). 
Containerization: Understanding of Docker and Kubernetes for containerization and orchestration. Data Integration: Skills in integrating data from various sources, including APIs, databases, and external files. Data Modeling: Understanding of data modeling and database design principles, including graph technologies like Neo4j or Amazon Neptune. Structured Data: Proficiency in handling structured data from relational databases, data warehouses, and spreadsheets. Unstructured Data: Experience with unstructured data sources such as text, images, and log files, and tools like Apache Solr or Elasticsearch. Data Excellence: Familiarity with data excellence concepts, including data governance, data quality management, and data stewardship. Non-standard Work Schedule, Travel Or Environment Requirements Occasional travel required. Work Location Assignment: Hybrid The annual base salary for this position ranges from $96,300.00 to $160,500.00. In addition, this position is eligible for participation in Pfizer’s Global Performance Plan with a bonus target of 12.5% of the base salary and eligibility to participate in our share-based long-term incentive program. We offer comprehensive and generous benefits and programs to help our colleagues lead healthy lives and to support each of life’s moments. Benefits offered include a 401(k) plan with Pfizer Matching Contributions and an additional Pfizer Retirement Savings Contribution, paid vacation, holiday and personal days, paid caregiver/parental and medical leave, and health benefits to include medical, prescription drug, dental and vision coverage. Learn more at Pfizer Candidate Site – U.S. Benefits | (uscandidates.mypfizerbenefits.com). Pfizer compensation structures and benefit packages are aligned based on the location of hire. The United States salary range provided does not apply to Tampa, FL or any location outside of the United States. Relocation assistance may be available based on business needs and/or eligibility.
Sunshine Act Pfizer reports payments and other transfers of value to health care providers as required by federal and state transparency laws and implementing regulations. These laws and regulations require Pfizer to provide government agencies with information such as a health care provider’s name, address and the type of payments or other value received, generally for public disclosure. Subject to further legal review and statutory or regulatory clarification, which Pfizer intends to pursue, reimbursement of recruiting expenses for licensed physicians may constitute a reportable transfer of value under the federal transparency law commonly known as the Sunshine Act. Therefore, if you are a licensed physician who incurs recruiting expenses as a result of interviewing with Pfizer that we pay or reimburse, your name, address and the amount of payments made currently will be reported to the government. If you have questions regarding this matter, please do not hesitate to contact your Talent Acquisition representative. EEO & Employment Eligibility Pfizer is committed to equal opportunity in the terms and conditions of employment for all employees and job applicants without regard to race, color, religion, sex, sexual orientation, age, gender identity or gender expression, national origin, disability or veteran status. Pfizer also complies with all applicable national, state and local laws governing nondiscrimination in employment as well as work authorization and employment eligibility verification requirements of the Immigration and Nationality Act and IRCA. Pfizer is an E-Verify employer. This position requires permanent work authorization in the United States. Information & Business Tech
Posted 2 weeks ago
4.0 - 7.0 years
13 - 17 Lacs
Pune
Work from Office
Overview The Data Technology team at MSCI is responsible for meeting the data requirements across various business areas, including Index, Analytics, and Sustainability. Our team collates data from multiple sources such as vendors (e.g., Bloomberg, Reuters), website acquisitions, and web scraping (e.g., financial news sites, company websites, exchange websites, filings). This data can be in structured or semi-structured formats. We normalize the data, perform quality checks, assign internal identifiers, and release it to downstream applications. Responsibilities As data engineers, we build scalable systems to process data in various formats and volumes, ranging from megabytes to terabytes. Our systems perform quality checks, match data across various sources, and release it in multiple formats. We leverage the latest technologies, sources, and tools to process the data. Some of the exciting technologies we work with include Snowflake, Databricks, and Apache Spark. Qualifications Core Java, Spring Boot, Apache Spark, Spring Batch, Python. Exposure to SQL databases like Oracle, MySQL, or Microsoft SQL Server is a must. Any experience, knowledge, or certification in cloud technology, preferably Microsoft Azure or Google Cloud Platform, is good to have. Exposure to NoSQL databases like Neo4j or document databases is also good to have. What we offer you Transparent compensation schemes and comprehensive employee benefits, tailored to your location, ensuring your financial security, health, and overall wellbeing. Flexible working arrangements, advanced technology, and collaborative workspaces. A culture of high performance and innovation where we experiment with new ideas and take responsibility for achieving results. A global network of talented colleagues, who inspire, support, and share their expertise to innovate and deliver for our clients.
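The normalize / quality-check / identifier-assignment flow described above can be sketched in miniature. Field names, the ISIN-length rule, and the hash-based ID scheme are illustrative assumptions, not MSCI's actual pipeline.

```python
# Illustrative sketch of a normalize -> quality-check -> assign-identifier
# flow for vendor data; all field names and rules are assumptions.
import hashlib

def normalize(record):
    """Normalize a raw vendor record into a canonical shape."""
    return {
        "name": record["name"].strip().upper(),
        "isin": record.get("isin", "").replace(" ", "").upper(),
    }

def passes_quality_checks(record):
    """Toy checks: non-empty name and a 12-character ISIN."""
    return bool(record["name"]) and len(record["isin"]) == 12

def assign_internal_id(record):
    """Derive a stable internal identifier from the normalized ISIN."""
    digest = hashlib.sha1(record["isin"].encode()).hexdigest()[:8]
    return {**record, "internal_id": f"ID-{digest}"}

raw = {"name": "  Acme Corp ", "isin": "us0378331005"}
rec = normalize(raw)
if passes_quality_checks(rec):
    rec = assign_internal_id(rec)
print(rec)
```

Records that fail the checks would, in a real system, be routed to an exception queue rather than released downstream.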
Global Orientation program to kickstart your journey, followed by access to our Learning@MSCI platform, LinkedIn Learning Pro and tailored learning opportunities for ongoing skills development. Multi-directional career paths that offer professional growth and development through new challenges, internal mobility and expanded roles. We actively nurture an environment that builds a sense of inclusion, belonging, and connection, including eight Employee Resource Groups: All Abilities, Asian Support Network, Black Leadership Network, Climate Action Network, Hola! MSCI, Pride & Allies, Women in Tech, and Women’s Leadership Forum. At MSCI we are passionate about what we do, and we are inspired by our purpose – to power better investment decisions. You’ll be part of an industry-leading network of creative, curious, and entrepreneurial pioneers. This is a space where you can challenge yourself, set new standards and perform beyond expectations for yourself, our clients, and our industry. MSCI is a leading provider of critical decision support tools and services for the global investment community. With over 50 years of expertise in research, data, and technology, we power better investment decisions by enabling clients to understand and analyze key drivers of risk and return and confidently build more effective portfolios. We create industry-leading research-enhanced solutions that clients use to gain insight into and improve transparency across the investment process. MSCI Inc. is an equal opportunity employer. It is the policy of the firm to ensure equal employment opportunity without discrimination or harassment on the basis of race, color, religion, creed, age, sex, gender, gender identity, sexual orientation, national origin, citizenship, disability, marital and civil partnership/union status, pregnancy (including unlawful discrimination on the basis of a legally protected parental leave), veteran status, or any other characteristic protected by law.
MSCI is also committed to working with and providing reasonable accommodations to individuals with disabilities. If you are an individual with a disability and would like to request a reasonable accommodation for any part of the application process, please email Disability.Assistance@msci.com and indicate the specifics of the assistance needed. Please note, this e-mail is intended only for individuals who are requesting a reasonable workplace accommodation; it is not intended for other inquiries. To all recruitment agencies MSCI does not accept unsolicited CVs/Resumes. Please do not forward CVs/Resumes to any MSCI employee, location, or website. MSCI is not responsible for any fees related to unsolicited CVs/Resumes. Note on recruitment scams We are aware of recruitment scams where fraudsters impersonating MSCI personnel may try and elicit personal information from job seekers. Read our full note on careers.msci.com
Posted 2 weeks ago
6.0 - 11.0 years
25 - 30 Lacs
Chennai
Work from Office
Internal Data Structures & Modeling Design, maintain, and optimize internal data models and structures within the Flexera environment. Map business asset data to Flexera's normalized software models with precision and accuracy. Ensure accurate data classification, enrichment, and normalization to support software lifecycle tracking. Partner with infrastructure, operations, and IT teams to ingest and reconcile data from various internal systems. Reporting & Analytics Design and maintain reports and dashboards in Flexera or via external BI tools such as Power BI or Tableau. Provide analytical insights on software usage, compliance, licensing, optimization, and risk exposure. Automate recurring reporting processes and ensure timely delivery to business stakeholders. Work closely with business users to gather requirements and translate them into meaningful reports and visualizations. Automated Data Feeds & API Integrations Develop and support automated data feeds using Flexera REST/SOAP APIs. Integrate Flexera with enterprise tools (e.g., CMDB, SCCM, ServiceNow, ERP) to ensure reliable and consistent data flow. Monitor, troubleshoot, and resolve issues related to data extracts and API communication. Implement robust logging, alerting, and exception handling for integration pipelines. Skills Must have A minimum of 6 years working with Flexera or similar software. Flexera Expertise: Strong hands-on experience with Flexera One, FlexNet Manager Suite, or similar tools. Technical Skills: Proficient in REST/SOAP API development and integration. Strong SQL skills and familiarity with data transformation/normalization concepts. Experience using reporting tools like Power BI, Tableau, or Excel for data visualization. Familiarity with enterprise systems such as SCCM, ServiceNow, ERP, CMDBs, etc. Process & Problem Solving: Strong analytical and troubleshooting skills for data inconsistencies and API failures.
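The classification-and-normalization step above can be illustrated with a minimal, hypothetical sketch: mapping raw discovered software titles to canonical models. The catalog, titles, and matching rule are invented examples, not Flexera's actual data model or algorithm.

```python
# Hypothetical sketch of normalizing raw installed-software titles against
# a small catalog of canonical models; names and rules are illustrative only.
import re

CATALOG = {
    ("microsoft", "office"): "Microsoft Office",
    ("adobe", "acrobat"): "Adobe Acrobat",
}

def normalize_title(raw_title):
    """Map a raw discovered title to a canonical software model, if known."""
    tokens = re.findall(r"[a-z]+", raw_title.lower())
    for (vendor, product), canonical in CATALOG.items():
        if vendor in tokens and product in tokens:
            return canonical
    return None  # unmatched titles would be queued for manual review

print(normalize_title("Microsoft(R) Office 365 ProPlus"))
```

Production normalization engines use far richer evidence (install paths, publisher signatures, version strings), but the shape of the problem is the same: raw, inconsistent titles in, canonical models out.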
Understanding of license models, software contracts, and compliance requirements. Nice to have Soft Skills: Excellent communication skills to translate technical data into business insights. Other Languages English: C1 Advanced Seniority Senior Chennai, India Req. VR-114544 Data Science BCM Industry 23/05/2025
Posted 2 weeks ago
6.0 - 11.0 years
25 - 30 Lacs
Gurugram
Work from Office
Internal Data Structures & Modeling Design, maintain, and optimize internal data models and structures within the Flexera environment. Map business asset data to Flexera's normalized software models with precision and accuracy. Ensure accurate data classification, enrichment, and normalization to support software lifecycle tracking. Partner with infrastructure, operations, and IT teams to ingest and reconcile data from various internal systems. Reporting & Analytics Design and maintain reports and dashboards in Flexera or via external BI tools such as Power BI or Tableau. Provide analytical insights on software usage, compliance, licensing, optimization, and risk exposure. Automate recurring reporting processes and ensure timely delivery to business stakeholders. Work closely with business users to gather requirements and translate them into meaningful reports and visualizations. Automated Data Feeds & API Integrations Develop and support automated data feeds using Flexera REST/SOAP APIs. Integrate Flexera with enterprise tools (e.g., CMDB, SCCM, ServiceNow, ERP) to ensure reliable and consistent data flow. Monitor, troubleshoot, and resolve issues related to data extracts and API communication. Implement robust logging, alerting, and exception handling for integration pipelines. Skills Must have A minimum of 6 years working with Flexera or similar software. Flexera Expertise: Strong hands-on experience with Flexera One, FlexNet Manager Suite, or similar tools. Technical Skills: Proficient in REST/SOAP API development and integration. Strong SQL skills and familiarity with data transformation/normalization concepts. Experience using reporting tools like Power BI, Tableau, or Excel for data visualization. Familiarity with enterprise systems such as SCCM, ServiceNow, ERP, CMDBs, etc. Process & Problem Solving: Strong analytical and troubleshooting skills for data inconsistencies and API failures.
Understanding of license models, software contracts, and compliance requirements. Nice to have Soft Skills: Excellent communication skills to translate technical data into business insights. Other Languages English: C1 Advanced Seniority Senior Gurugram, India Req. VR-114544 Data Science BCM Industry 23/05/2025
Posted 2 weeks ago
6.0 - 11.0 years
25 - 30 Lacs
Bengaluru
Work from Office
Internal Data Structures & Modeling Design, maintain, and optimize internal data models and structures within the Flexera environment. Map business asset data to Flexera's normalized software models with precision and accuracy. Ensure accurate data classification, enrichment, and normalization to support software lifecycle tracking. Partner with infrastructure, operations, and IT teams to ingest and reconcile data from various internal systems. Reporting & Analytics Design and maintain reports and dashboards in Flexera or via external BI tools such as Power BI or Tableau. Provide analytical insights on software usage, compliance, licensing, optimization, and risk exposure. Automate recurring reporting processes and ensure timely delivery to business stakeholders. Work closely with business users to gather requirements and translate them into meaningful reports and visualizations. Automated Data Feeds & API Integrations Develop and support automated data feeds using Flexera REST/SOAP APIs. Integrate Flexera with enterprise tools (e.g., CMDB, SCCM, ServiceNow, ERP) to ensure reliable and consistent data flow. Monitor, troubleshoot, and resolve issues related to data extracts and API communication. Implement robust logging, alerting, and exception handling for integration pipelines. Skills Must have A minimum of 6 years working with Flexera or similar software. Flexera Expertise: Strong hands-on experience with Flexera One, FlexNet Manager Suite, or similar tools. Technical Skills: Proficient in REST/SOAP API development and integration. Strong SQL skills and familiarity with data transformation/normalization concepts. Experience using reporting tools like Power BI, Tableau, or Excel for data visualization. Familiarity with enterprise systems such as SCCM, ServiceNow, ERP, CMDBs, etc. Process & Problem Solving: Strong analytical and troubleshooting skills for data inconsistencies and API failures.
Understanding of license models, software contracts, and compliance requirements. Nice to have Soft Skills: Excellent communication skills to translate technical data into business insights. Other Languages English: C1 Advanced Seniority Senior Bengaluru, India Req. VR-114544 Data Science BCM Industry 23/05/2025
Posted 2 weeks ago
3.0 - 5.0 years
13 - 15 Lacs
Gurugram
Work from Office
Graph Data Modeling & Implementation. Design and implement complex graph data models using Cypher and Neo4j best practices. Leverage APOC procedures, custom plugins, and advanced graph algorithms to solve domain-specific problems. Oversee integration of Neo4j with other enterprise systems, microservices, and data platforms. Develop and maintain APIs and services in Java, Python, or JavaScript to interact with the graph database. Mentor junior developers and review code to maintain high-quality standards. Establish guidelines for performance tuning, scalability, security, and disaster recovery in Neo4j environments. Work with data scientists, analysts, and business stakeholders to translate complex requirements into graph-based solutions. Skills Must have 12+ years in software/data engineering, with at least 3-5 years hands-on experience with Neo4j. Lead the technical strategy, architecture, and delivery of Neo4j-based solutions. Design, model, and implement complex graph data structures using Cypher and Neo4j best practices. Guide the integration of Neo4j with other data platforms and microservices. Collaborate with cross-functional teams to understand business needs and translate them into graph-based models. Mentor junior developers and ensure code quality through reviews and best practices. Define and enforce performance tuning, security standards, and disaster recovery strategies for Neo4j. Stay up-to-date with emerging technologies in the graph database and data engineering space. Strong proficiency in Cypher query language, graph modeling, and data visualization tools (e.g., Bloom, Neo4j Browser). Solid background in Java, Python, or JavaScript and experience integrating Neo4j with these languages. Experience with APOC procedures, Neo4j plugins, and query optimization. Familiarity with cloud platforms (AWS) and containerization tools (Docker, Kubernetes). Proven experience leading engineering teams or projects. 
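The graph-modeling duty above, translating tabular records into nodes and relationships, can be sketched with a parameterized Cypher MERGE. In a real deployment the statement would be executed through the official neo4j Python driver; here it is only constructed, and the labels, relationship type, and properties are illustrative assumptions.

```python
# Sketch of graph modeling from tabular rows: build a parameterized Cypher
# MERGE for each (person)-[:WORKS_ON]->(project) record. Labels and
# property names are invented for the example.

CYPHER = (
    "MERGE (p:Person {id: $person_id}) "
    "MERGE (j:Project {id: $project_id}) "
    "MERGE (p)-[:WORKS_ON {since: $since}]->(j)"
)

def to_params(row):
    """Translate one tabular row into parameters for the MERGE statement."""
    return {
        "person_id": row["employee"],
        "project_id": row["project"],
        "since": row["year"],
    }

row = {"employee": "E42", "project": "GRAPH-1", "year": 2023}
print(CYPHER)
print(to_params(row))
```

Using MERGE with parameters (rather than string-interpolated CREATE statements) keeps the load idempotent and lets Neo4j cache the query plan, which is the usual starting point for the query-optimization work the role mentions.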
Excellent problem-solving and communication skills. Nice to have N/A Other Languages English: C1 Advanced Seniority Senior Gurugram, India Req. VR-114556 Data Science BCM Industry 23/05/2025
Posted 2 weeks ago
3.0 - 5.0 years
13 - 15 Lacs
Bengaluru
Work from Office
Graph Data Modeling & Implementation. Design and implement complex graph data models using Cypher and Neo4j best practices. Leverage APOC procedures, custom plugins, and advanced graph algorithms to solve domain-specific problems. Oversee integration of Neo4j with other enterprise systems, microservices, and data platforms. Develop and maintain APIs and services in Java, Python, or JavaScript to interact with the graph database. Mentor junior developers and review code to maintain high-quality standards. Establish guidelines for performance tuning, scalability, security, and disaster recovery in Neo4j environments. Work with data scientists, analysts, and business stakeholders to translate complex requirements into graph-based solutions. Skills Must have 12+ years in software/data engineering, with at least 3-5 years hands-on experience with Neo4j. Lead the technical strategy, architecture, and delivery of Neo4j-based solutions. Design, model, and implement complex graph data structures using Cypher and Neo4j best practices. Guide the integration of Neo4j with other data platforms and microservices. Collaborate with cross-functional teams to understand business needs and translate them into graph-based models. Mentor junior developers and ensure code quality through reviews and best practices. Define and enforce performance tuning, security standards, and disaster recovery strategies for Neo4j. Stay up-to-date with emerging technologies in the graph database and data engineering space. Strong proficiency in Cypher query language, graph modeling, and data visualization tools (e.g., Bloom, Neo4j Browser). Solid background in Java, Python, or JavaScript and experience integrating Neo4j with these languages. Experience with APOC procedures, Neo4j plugins, and query optimization. Familiarity with cloud platforms (AWS) and containerization tools (Docker, Kubernetes). Proven experience leading engineering teams or projects. 
Excellent problem-solving and communication skills. Nice to have N/A Other Languages English: C1 Advanced Seniority Senior Bengaluru, India Req. VR-114556 Data Science BCM Industry 23/05/2025
Posted 2 weeks ago