LLM Engineer with AI and Python

3 - 6 years

10 - 22 Lacs

Posted: 3 weeks ago | Platform: LinkedIn


Work Mode

Hybrid

Job Type

Full Time

Job Description

Generative AI Engineer (Hybrid, India)
A fast-growing provider in the Enterprise Software & Artificial Intelligence services sector, we architect and deliver production-ready large-language-model platforms, data pipelines, and intelligent assistants for global customers. Our cross-functional squads blend deep ML expertise with robust engineering practices to unlock rapid business value while maintaining enterprise-grade security and compliance.

Role & Responsibilities
  • Design, build, and optimise end-to-end LLM solutions covering data ingestion, fine-tuning, evaluation, and real-time inference.
  • Develop Python micro-services that integrate LangChain workflows, vector databases, and tool-calling agents into secure REST and gRPC APIs.
  • Implement retrieval-augmented generation (RAG) pipelines, embedding models, and semantic search to deliver accurate, context-aware responses (see the sketch after this list).
  • Collaborate with data scientists to productionise experiments, automate training schedules, and monitor drift, latency, and cost.
  • Harden deployments through containerisation, CI/CD, IaC, and cloud GPU orchestration on Azure or AWS.
  • Contribute to engineering playbooks, mentor peers, and champion best practices in clean code, testing, and observability.
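
To give candidates a concrete flavour of this work, here is a minimal sketch of a RAG-style service endpoint of the kind described above. It assumes fastapi, faiss-cpu, numpy, and pydantic are installed; the hashing embedder, toy corpus, and commented-out LLM call are placeholders rather than a production design.

```python
# Minimal RAG-style retrieval endpoint -- a sketch, not production code.
# Assumes: pip install fastapi faiss-cpu numpy pydantic
import hashlib

import faiss
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel

DIM = 256  # embedding dimension for the toy hashing embedder below


def embed(text: str) -> np.ndarray:
    """Toy bag-of-words hashing embedder; swap in a real embedding model."""
    vec = np.zeros(DIM, dtype="float32")
    for token in text.lower().split():
        slot = int(hashlib.md5(token.encode()).hexdigest(), 16) % DIM
        vec[slot] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec


# Small in-memory corpus and FAISS index (inner product == cosine on unit vectors).
DOCS = [
    "Refunds are processed within five business days.",
    "Premium support is available 24/7 on enterprise plans.",
    "Customer data is encrypted at rest and in transit.",
]
index = faiss.IndexFlatIP(DIM)
index.add(np.stack([embed(d) for d in DOCS]))

app = FastAPI()


class Query(BaseModel):
    question: str
    top_k: int = 2


@app.post("/ask")
def ask(query: Query) -> dict:
    # Retrieve the most relevant chunks, then assemble a grounded prompt.
    _, ids = index.search(embed(query.question)[None, :], query.top_k)
    context = [DOCS[i] for i in ids[0] if i >= 0]
    prompt = (
        "Answer using only the context below.\n\n"
        + "\n".join(context)
        + f"\n\nQuestion: {query.question}"
    )
    # answer = call_llm(prompt)  # placeholder for the actual model call
    return {"retrieved": context, "prompt": prompt}
```

Run it locally with uvicorn (for example `uvicorn rag_service:app --reload`, assuming the file is saved as rag_service.py); in real projects the same pattern sits behind the secure REST/gRPC layer described above, with the toy index swapped for a managed vector store such as Pinecone or Milvus.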

Skills & Qualifications

Must-Have

  • 3-6 years of Python backend or data engineering experience with strong OO & async patterns.
  • Hands-on experience building LLM or GenAI applications using LangChain/LlamaIndex and vector stores such as FAISS, Pinecone, or Milvus.
  • Proficiency in prompt engineering, tokenisation, and evaluation metrics (BLEU, ROUGE, perplexity); see the example after this list.
  • Experience deploying models via Azure ML, SageMaker, or similar, including GPU optimisation and autoscaling.
  • Solid grasp of MLOps fundamentals: Docker, Git, CI/CD, monitoring, and feature governance.
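
As a rough illustration of the evaluation-metrics requirement above, the snippet below computes a simple ROUGE-1 (unigram-overlap) score in plain Python; production evaluation would rely on a maintained library such as rouge-score with proper tokenisation, and the example strings are invented.

```python
# Toy ROUGE-1 (unigram overlap) scorer -- illustrative only.
from collections import Counter


def rouge1(candidate: str, reference: str) -> dict:
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    overlap = sum((cand & ref).values())  # matched unigram count
    precision = overlap / max(sum(cand.values()), 1)
    recall = overlap / max(sum(ref.values()), 1)
    f1 = 2 * precision * recall / (precision + recall) if overlap else 0.0
    return {"precision": precision, "recall": recall, "f1": f1}


print(rouge1("the model answered the question", "the model answered correctly"))
# -> precision 0.6, recall 0.75, f1 ≈ 0.667
```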

Preferred

  • Knowledge of orchestration frameworks (Kubeflow, Airflow) and streaming tools (Kafka, Kinesis).
  • Exposure to transformer fine-tuning techniques (LoRA, PEFT, quantisation); a minimal LoRA sketch follows this list.
  • Understanding of data privacy standards (SOC 2, GDPR) in AI workloads.
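
For the LoRA/PEFT item above, a typical workflow wraps a pretrained causal LM with low-rank adapters before training. The sketch below assumes the Hugging Face transformers and peft libraries; the checkpoint name, target modules, and hyperparameters are illustrative placeholders, not a recommended recipe.

```python
# Minimal LoRA adapter setup -- a sketch using Hugging Face transformers + peft.
# The checkpoint, target modules, and hyperparameters are placeholders.
from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

base = "facebook/opt-125m"  # small, openly available causal LM used for illustration
model = AutoModelForCausalLM.from_pretrained(base)

lora_cfg = LoraConfig(
    r=8,                                  # adapter rank
    lora_alpha=16,                        # scaling factor
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # attention projections (model-dependent)
    task_type="CAUSAL_LM",
)

model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()  # only the low-rank adapters are trainable
# ...then train with a standard Trainer or custom loop and merge or serve the adapters.
```
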
Benefits & Culture Highlights
  • Hybrid work model with flexible hours and quarterly in-person sprint planning.
  • Annual upskilling stipend covering cloud certifications and research conferences.
  • Collaborative, experimentation-driven culture where engineers influence product strategy.
Join us to turn breakthrough research into real-world impact and shape the next generation of intelligent software.
Skills: Python, Git, Docker, CI/CD, monitoring, OO patterns, async patterns, GenAI, LangChain, LlamaIndex, agent frameworks, prompt engineering, tokenisation, evaluation metrics (BLEU, ROUGE, perplexity), vector databases (FAISS, Pinecone, Milvus), feature governance, cloud, Azure ML, SageMaker
