
23 Streamlit Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

9 - 13 Lacs

Chennai, Bengaluru

Work from Office

We are hiring for one of our leading MNC clients!
Job Role: Prompt Engineer
Experience: 6 to 8 years
Job Location: Chennai / Bangalore only
Work Mode: Work from office
Looking for immediate joiners only.

Job Description: We are seeking an experienced Prompt Engineer with 5+ years of expertise in developing advanced prompts and integrating them into AI systems. This role requires a deep understanding of prompt design, language models, and knowledge structures to enhance AI reasoning and performance.

Must have: prompting for AI-driven code generation, execution of AI and code workflows, and RAG execution.

Key Responsibilities:
• Design and develop structured prompts for multi-step planning, agent dialogue, and tool usage.
• Specialize in few-shot prompting, chain-of-thought (CoT), and planner-executor prompt strategies.
• Structure and manage enterprise knowledge (e.g., taxonomies, graphs, ontologies) to enable robust AI reasoning and memory retention.
• Collaborate with data scientists and AI engineers to deploy prompt systems effectively.

Required Skills & Tools:
• Strong understanding of Large Language Models (LLMs)
• Experience with LangChain, Python, Streamlit
• Knowledge of vector databases like FAISS and Pinecone
• Familiarity with containerization and deployment using Docker

Preferred Qualifications:
• Hands-on experience with AI toolchains and prompt testing frameworks
• Ability to work in cross-functional teams and communicate prompt strategies effectively
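For illustration only, a minimal Python sketch of the kind of few-shot, chain-of-thought prompt assembly this posting refers to; the example questions and the build_cot_prompt helper are invented for the sketch and are not part of the role description.

FEW_SHOT_EXAMPLES = [
    {
        "question": "A warehouse ships 120 units per day. How many units does it ship in 5 days?",
        "reasoning": "120 units per day times 5 days is 600 units.",
        "answer": "600 units",
    },
]

def build_cot_prompt(question: str) -> str:
    """Assemble a few-shot, chain-of-thought style prompt for an LLM call."""
    parts = ["Answer the question. Think step by step before giving the final answer.", ""]
    for ex in FEW_SHOT_EXAMPLES:
        parts.append(f"Q: {ex['question']}")
        parts.append(f"Reasoning: {ex['reasoning']}")
        parts.append(f"A: {ex['answer']}")
        parts.append("")
    parts.append(f"Q: {question}")
    parts.append("Reasoning:")
    return "\n".join(parts)

# The resulting string would be passed to whichever LLM client the project uses.
print(build_cot_prompt("A reviewer checks 40 prompts per hour. How many in a 6-hour shift?"))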

Posted 1 day ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Pune

Remote

Role & responsibilities:
• At least 5 years of experience in data engineering, with a strong background in Azure Databricks, Scala/Python, and Streamlit
• Experience handling unstructured data processing and transformation, with solid programming knowledge
• Hands-on experience building data pipelines using Scala/Python
• Big data technologies such as Apache Spark, Structured Streaming, SQL, and Databricks Delta Lake
• Strong analytical and problem-solving skills, with the ability to troubleshoot Spark applications and resolve data pipeline issues
• Familiarity with version control systems like Git and CI/CD pipelines using Jenkins
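As a rough sketch of the Spark Structured Streaming plus Delta Lake pipeline work this listing describes; the schema, landing path, checkpoint location, and Delta table path are placeholders, not details from the posting.

from pyspark.sql import SparkSession
from pyspark.sql.types import StructType, StructField, StringType, TimestampType

spark = SparkSession.builder.appName("events-to-delta").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("payload", StringType()),
    StructField("event_time", TimestampType()),
])

# Read a stream of JSON files as they land, then append them to a Delta table.
events = (
    spark.readStream
    .schema(schema)
    .json("/mnt/raw/events/")           # hypothetical landing path
)

query = (
    events.writeStream
    .format("delta")
    .option("checkpointLocation", "/mnt/checkpoints/events/")
    .outputMode("append")
    .start("/mnt/delta/events/")        # hypothetical Delta table path
)
query.awaitTermination()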

Posted 6 days ago

Apply

4.0 - 9.0 years

15 - 27 Lacs

Hyderabad

Hybrid

Streamlit Developer

We are looking for a skilled and motivated Streamlit Developer to design and build interactive, data-driven dashboards and internal tools. You will work closely with leads, analysts, and product teams to transform complex datasets and models into clean, intuitive web apps using Python and Streamlit.

Roles & responsibilities:
• Design and develop web-based dashboards and applications using Streamlit
• Integrate data from sources such as APIs (Jira, GraphQL) and databases (SQL Server, Oracle)
• Automation experience using Selenium, pytest, REST Assured, TestNG, and Cucumber/Serenity, leveraging Java or Python as the programming language
• Create end-to-end automation across multiple platforms (desktop/mobile)
• Create automation frameworks for microservices using REST APIs and web services
• Extensive experience in programming languages such as Java, JavaScript, Python, and C#
• Work with deployment automation and orchestration tools such as Jenkins or GitLab
• Deploy, maintain, and version-control code with Git/GitLab, and understand its branching strategy
• Able to use cloud services (AWS, EC2), SonarQube, and Docker

Required Skills and Qualifications:
• Understand application architecture and identify business-critical workflows
• Translate business requirements and data analysis into intuitive user interfaces
• Define the automation roadmap for the technology stack
• Design and build test automation processes; set up and configure a test automation suite
• Advise on and support the implementation of all types of automation tools
• Define coding standards and code walkthrough processes
• Raise risks on time and provide alternate solutions to mitigate them
• Mentor and coach test automation engineers on coding practices
• Provide demos and presentations for automation team members and the leadership team
• Provide training and mentoring to other automation engineers
• Conduct troubleshooting, code debugging, and root cause analysis, and optimize application performance

Preferred Skills:
• Familiarity with SQL and relational databases (e.g., MySQL)
• Strong communication and collaboration skills to work effectively in cross-functional teams
• Able to adapt to a fast-paced environment in a large team setup
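A minimal sketch of the kind of Streamlit dashboard over a SQL source mentioned in this posting; the connection string, table, and column names are assumptions for illustration only.

import pandas as pd
import streamlit as st
from sqlalchemy import create_engine

@st.cache_data(ttl=600)
def load_ticket_counts() -> pd.DataFrame:
    """Fetch aggregated ticket counts; connection string and schema are placeholders."""
    engine = create_engine("mssql+pyodbc://user:pass@server/db?driver=ODBC+Driver+17+for+SQL+Server")
    return pd.read_sql(
        "SELECT project, status, COUNT(*) AS tickets FROM jira_issues GROUP BY project, status",
        engine,
    )

st.title("Ticket Status Dashboard")
df = load_ticket_counts()
project = st.selectbox("Project", sorted(df["project"].unique()))
filtered = df[df["project"] == project]
st.bar_chart(filtered.set_index("status")["tickets"])
st.dataframe(filtered)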

Posted 6 days ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Bengaluru, Karnataka, India

On-site

Required Qualifications:
• 3-5 years of experience working on ML projects, including business requirement gathering, model development, training, deployment at scale, and monitoring model performance for production use cases
• Strong knowledge of Python, NLP, data engineering, LangChain, Langtrace, Langfuse, RAGAS, and AgentOps (optional)
• Should have worked with both proprietary and open-source large language models
• Experience in LLM fine-tuning and creating distilled models from hosted LLMs
• Building data pipelines for model training
• Experience in model performance tuning, RAG, guardrails, prompt engineering, evaluation, and observability
• Experience deploying GenAI applications on cloud and on-premises at scale for production
• Experience creating CI/CD pipelines
• Working knowledge of Kubernetes
• Experience with at least one cloud (AWS / GCP / Azure) to deploy AI services
• Experience creating workable prototypes using agentic AI frameworks like CrewAI, TaskWeaver, AutoGen
• Experience in lightweight UI development using Streamlit or Chainlit (optional)
• Desired: experience with open-source tools for ML development, deployment, observability, and integration
• Background in DevOps and MLOps is a plus
• Experience with collaborative code versioning tools like GitHub/GitLab
• Team player with good communication and presentation skills

EDUCATION: B.E/B.Tech/M.Tech in Computer Science or a related technical degree, or equivalent.
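For illustration, a minimal sketch of the lightweight Streamlit chat UI mentioned in these qualifications; the fake_llm_reply function is a stand-in for whatever model backend a real project would call.

import streamlit as st

def fake_llm_reply(prompt: str) -> str:
    """Placeholder for a real LLM call (hosted, fine-tuned, or RAG-backed)."""
    return f"You asked: {prompt}"

st.title("GenAI Assistant (demo)")

# Keep the running conversation in session state so it survives Streamlit reruns.
if "messages" not in st.session_state:
    st.session_state.messages = []

for msg in st.session_state.messages:
    with st.chat_message(msg["role"]):
        st.write(msg["content"])

if prompt := st.chat_input("Ask something"):
    st.session_state.messages.append({"role": "user", "content": prompt})
    with st.chat_message("user"):
        st.write(prompt)
    reply = fake_llm_reply(prompt)
    st.session_state.messages.append({"role": "assistant", "content": reply})
    with st.chat_message("assistant"):
        st.write(reply)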

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Gurgaon, Haryana, India

On-site

Required Qualifications:
• 3-5 years of experience working on ML projects, including business requirement gathering, model development, training, deployment at scale, and monitoring model performance for production use cases
• Strong knowledge of Python, NLP, data engineering, LangChain, Langtrace, Langfuse, RAGAS, and AgentOps (optional)
• Should have worked with both proprietary and open-source large language models
• Experience in LLM fine-tuning and creating distilled models from hosted LLMs
• Building data pipelines for model training
• Experience in model performance tuning, RAG, guardrails, prompt engineering, evaluation, and observability
• Experience deploying GenAI applications on cloud and on-premises at scale for production
• Experience creating CI/CD pipelines
• Working knowledge of Kubernetes
• Experience with at least one cloud (AWS / GCP / Azure) to deploy AI services
• Experience creating workable prototypes using agentic AI frameworks like CrewAI, TaskWeaver, AutoGen
• Experience in lightweight UI development using Streamlit or Chainlit (optional)
• Desired: experience with open-source tools for ML development, deployment, observability, and integration
• Background in DevOps and MLOps is a plus
• Experience with collaborative code versioning tools like GitHub/GitLab
• Team player with good communication and presentation skills

EDUCATION: B.E/B.Tech/M.Tech in Computer Science or a related technical degree, or equivalent.

Posted 1 week ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Mumbai, Maharashtra, India

On-site

Required Qualifications:
• 3-5 years of experience working on ML projects, including business requirement gathering, model development, training, deployment at scale, and monitoring model performance for production use cases
• Strong knowledge of Python, NLP, data engineering, LangChain, Langtrace, Langfuse, RAGAS, and AgentOps (optional)
• Should have worked with both proprietary and open-source large language models
• Experience in LLM fine-tuning and creating distilled models from hosted LLMs
• Building data pipelines for model training
• Experience in model performance tuning, RAG, guardrails, prompt engineering, evaluation, and observability
• Experience deploying GenAI applications on cloud and on-premises at scale for production
• Experience creating CI/CD pipelines
• Working knowledge of Kubernetes
• Experience with at least one cloud (AWS / GCP / Azure) to deploy AI services
• Experience creating workable prototypes using agentic AI frameworks like CrewAI, TaskWeaver, AutoGen
• Experience in lightweight UI development using Streamlit or Chainlit (optional)
• Desired: experience with open-source tools for ML development, deployment, observability, and integration
• Background in DevOps and MLOps is a plus
• Experience with collaborative code versioning tools like GitHub/GitLab
• Team player with good communication and presentation skills

EDUCATION: B.E/B.Tech/M.Tech in Computer Science or a related technical degree, or equivalent.

Posted 1 week ago

Apply

2.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Business Analyst, Data Science! This role supports Commercial Analytics enablement, which includes understanding business trends and providing data-driven solutions. Within Analytics, which is designed to support brands and go-to-market strategies, Customer Analytics focuses on bringing and scaling analytics solutions that advance customer centricity and transformation. The Analytics member is accountable for providing analytic expertise and delivering business insights to the Commercial organization. We're looking for a technically sound candidate who is interested in working within a customer environment. Core data analytics work experience in Life Sciences/Healthcare/CPG for a minimum of 2+ years is preferred; the second preference is data analytics experience coupled with hands-on technical skills in Python, MS Word, and Excel. Work location: Bangalore.

Responsibilities:
• At least 2 years of professional experience implementing analytics for business support. Must have experience with classification, regression, and time series forecasting using ensemble methods, with a clear understanding of each. Well versed in data exploration and statistical inference from data.
• At least 1 year of experience building front-end applications using Streamlit/Gradio/Flask/FastAPI for generative AI input and output display.
• Using generative AI for tasks such as text generation, sentiment analysis, chatbots, and language translation using RAG and agent-based models.
• Analyzing data and generating insights using generative AI techniques for predictive modelling and data generation.
• Building chatbots over different scalable data sources for generating charts and statements.
• Proficiency in the programming languages Python and SQL.
• Responsible for weekly releases of upgrades and new features to enhance the current algorithm.

Qualifications we seek in you!
Minimum Qualifications / Skills:
• Master's or Bachelor's in Engineering (BE/B.Tech), BCA, MCA, BSc/MSc
• Master's in Science or a related field

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit', paying to apply, or purchasing equipment or training.
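Purely as an illustration of forecasting with ensemble methods, as referenced in the responsibilities above, a small scikit-learn sketch using lagged features on synthetic data; the series and feature choices are invented for the example.

import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic monthly sales series with trend, seasonality, and noise.
rng = np.random.default_rng(0)
t = np.arange(60)
sales = 100 + 2 * t + 10 * np.sin(2 * np.pi * t / 12) + rng.normal(0, 3, 60)

# Build lagged features: predict this month from the previous three months.
df = pd.DataFrame({"sales": sales})
for lag in (1, 2, 3):
    df[f"lag_{lag}"] = df["sales"].shift(lag)
df = df.dropna()

X, y = df[["lag_1", "lag_2", "lag_3"]], df["sales"]
train_X, test_X = X.iloc[:-6], X.iloc[-6:]
train_y, test_y = y.iloc[:-6], y.iloc[-6:]

model = GradientBoostingRegressor(random_state=0).fit(train_X, train_y)
print("Held-out R^2:", round(model.score(test_X, test_y), 3))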

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

New Delhi, Gurugram, Delhi / NCR

Work from Office

About the Role: We are seeking a talented and driven Software Engineer with 5-10 years of hands-on experience to join our growing technology team. The ideal candidate will have a strong foundation in software development and proven expertise in Streamlit, Snowflake, Cortex, and Generative AI (GenAI) applications.

Must haves:
• Strong hands-on experience with Streamlit for building data applications
• Proficiency in Snowflake: SQL scripting, data integration, and performance tuning
• Working knowledge of Cortex (machine learning model deployment or similar)
• Experience with Generative AI tools, frameworks, or APIs (e.g., OpenAI, Hugging Face, LangChain)
• Proficiency in Python and version control systems like Git
• Solid understanding of cloud services (AWS, GCP, or Azure preferred)

To apply: Please send your updated resume to recruitment@cloudethix.com
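One possible shape of the Streamlit-plus-Snowflake work this role describes, sketched with the snowflake-connector-python package; the account details are placeholders, and the SNOWFLAKE.CORTEX.COMPLETE call assumes a Cortex-enabled Snowflake account and an available model name.

import snowflake.connector
import streamlit as st

@st.cache_resource
def get_connection():
    # Placeholder credentials; a real app would load these from secrets management.
    return snowflake.connector.connect(
        account="my_account", user="my_user", password="***",
        warehouse="ANALYTICS_WH", database="DEMO_DB", schema="PUBLIC",
    )

st.title("Ask the Warehouse")
question = st.text_input("Question about recent orders")
if question:
    cur = get_connection().cursor()
    # SNOWFLAKE.CORTEX.COMPLETE is assumed to be available in this account;
    # 'mistral-large' is an example model name, not a requirement from the posting.
    cur.execute(
        "SELECT SNOWFLAKE.CORTEX.COMPLETE('mistral-large', %s)",
        (f"Answer briefly: {question}",),
    )
    st.write(cur.fetchone()[0])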

Posted 1 week ago

Apply

3.0 - 12.0 years

3 - 12 Lacs

Hyderabad, Telangana, India

On-site

What you will do

Let's do this. Let's change the world. In this vital role, you will primarily focus on analyzing scientific requirements from Global Research and translating them into efficient and effective information systems solutions. As a domain expert, the prospective BA will collaborate with cross-functional teams to identify data product enhancement opportunities, perform data analysis, solve issues, and support system implementation and maintenance. The role also involves developing the data product launch and user adoption strategy for Amgen Research Foundational Data Systems. Your expertise in business process analysis and technology will contribute to the successful delivery of IT solutions that drive operational efficiency and meet business objectives.

• Collaborate with geographically dispersed teams, including those in the US, EU, and other international locations
• Partner with and ensure alignment of the Amgen India DTI site leadership, and follow global standards and practices
• Foster a culture of collaboration, innovation, and continuous improvement
• Function as a Scientific Business Analyst, providing domain expertise for Research Data and Analytics within a Scaled Agile Framework (SAFe) product team
• Serve as Agile team scrum master or project manager as needed
• Serve as a liaison between global DTI functional areas and global research scientists, prioritizing their needs and expectations
• Create functional analytics dashboards and fit-for-purpose applications for quantitative research, scientific analysis, and business intelligence (Databricks, Spotfire, Tableau, Dash, Streamlit, RShiny)
• Handle a suite of custom internal platforms, commercial off-the-shelf (COTS) software, and systems integrations
• Translate complex scientific and technological needs into clear, actionable requirements for development teams
• Develop and maintain release deliverables that clearly outline planned features and enhancements, timelines, and milestones
• Identify and handle risks associated with the systems, including technological risks, scientific validation, and user acceptance
• Develop documentation, communication plans, and training plans for end users
• Ensure scientific data operations are scoped into building Research-wide Artificial Intelligence/Machine Learning capabilities
• Ensure operational excellence, cybersecurity, and compliance

What we expect of you

We are all different, yet we all use our unique contributions to serve patients. This role requires expertise in biopharma scientific domains as well as informatics solution delivery. Extensive collaboration with global teams is required to ensure seamless integration and operational excellence. The ideal candidate will have a solid background in the end-to-end software development lifecycle and be a Scaled Agile practitioner, coupled with change management and transformation experience. This role demands the ability to deliver against key organizational strategic initiatives, develop a collaborative environment, and deliver high-quality results in a matrixed organizational structure.

Basic Qualifications/Skills:
• Doctorate degree, OR Master's degree and 4 to 6 years of Life Science / Biotechnology / Pharmacology / Information Systems experience, OR Bachelor's degree and 6 to 8 years of such experience, OR Diploma and 10 to 12 years of such experience
• Excellent problem-solving skills and a passion for solving complex challenges in drug discovery with technology and data
• Superb communication skills and experience creating impactful slide decks with data
• Collaborative spirit and effective communication skills to work seamlessly in a multi-functional team
• Familiarity with data analytics and scientific computing platforms such as Databricks, Dash, Streamlit, RShiny, Spotfire, and Tableau, and related programming languages like SQL, Python, and R

Preferred Qualifications/Skills:
• BS, MS, or PhD in Bioinformatics, Computational Biology, Computational Chemistry, Life Sciences, Computer Science, or Engineering
• 3+ years of experience implementing and supporting biopharma scientific research data analytics
• Demonstrated expertise in a scientific domain area and related technology needs
• Understanding of semantics and FAIR (Findability, Accessibility, Interoperability, and Reuse) data concepts
• Understanding of scientific data strategy, data governance, and data infrastructure
• Experience with cloud (e.g., AWS) and on-premise compute infrastructure
• Familiarity with advanced analytics, AI/ML, and scientific computing infrastructure, such as High Performance Compute (HPC) environments and clusters (e.g., SLURM, Kubernetes)
• Experience with scientific and technical team collaborations, ensuring seamless coordination across teams and driving successful delivery of technical projects
• Ability to deliver features meeting research user demands using Agile methodology
• An ongoing commitment to learning and staying at the forefront of AI/ML advancements

We understand that to successfully sustain and grow as a global enterprise and deliver for patients, we must ensure a diverse and inclusive work environment.

Professional Certifications: SAFe for Teams certification (preferred); SAFe Scrum Master or similar (preferred)

Soft Skills: Strong transformation and change management experience. Exceptional collaboration and communication skills. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented with a focus on achieving team goals. Strong presentation and public speaking skills.

What you can expect of us

As we work to develop treatments that take care of others, we also work to care for your professional and personal growth and well-being. From our competitive benefits to our collaborative culture, we'll support your journey every step of the way. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Posted 2 weeks ago

Apply

4.0 - 8.0 years

12 - 22 Lacs

Jaipur

Remote

Hi folks, I hope you are all doing well! We are hiring for one of the leading IT companies worldwide for a Sr. Data/Python Engineer role, and we are looking for someone with expertise in Python, Pandas/Streamlit, SQL, and PySpark.

Job Description:
Job Title: Sr. Data/Python Engineer
Location: Pan India (Remote)
Job Type: Full Time

Job Summary: We are seeking a skilled and collaborative Sr. Data/Python Engineer with experience developing production Python-based applications (such as Pandas, NumPy, Django, Flask, FastAPI on AWS) to support our data platform initiatives and application development. This role will initially focus on building and optimizing Streamlit application development frameworks and CI/CD pipelines, ensuring code reliability through automated testing with Pytest, and enabling team members to deliver updates via CI/CD pipelines. Once the deployment framework is implemented, the Sr. Engineer will own and drive data transformation pipelines in dbt and implement a data quality framework.

Key Responsibilities:
• Lead application testing and productionalization of applications built on top of Snowflake, including implementation and execution of unit and integration testing. Automated test suites use Pytest and Streamlit App Tests to ensure code quality, data accuracy, and system reliability.
• Develop and integrate CI/CD pipelines (e.g., GitHub Actions, Azure DevOps, or GitLab CI) for consistent deployments across dev, staging, and production environments.
• Develop and test AWS-based pipelines: AWS Glue, Airflow (MWAA), S3.
• Design, develop, and optimize data models and transformation pipelines in Snowflake using SQL and Python.
• Build Streamlit-based applications to enable internal stakeholders to explore and interact with data and models.
• Collaborate with team members and application developers to align requirements and ensure secure, scalable solutions.
• Monitor data pipelines and application performance, optimizing for speed, cost, and user experience.
• Create end-user technical documentation and contribute to knowledge sharing across engineering and analytics teams.
• Work in CST hours and collaborate with onshore and offshore teams.

Required Skills and Experience:
• 4+ years of experience in data engineering or Python-based application development on AWS (Pandas, Flask, Django, FastAPI, Streamlit). Experience building data-intensive applications in Python, as well as data pipelines on AWS, is a must.
• Strong Python and Pandas skills; proficient in SQL and Python for data manipulation and automation tasks.
• Experience developing and productionalizing applications built on Python frameworks such as FastAPI, Django, Flask, or Streamlit.
• Experience with application frameworks such as Streamlit, Angular, or React for rapid data app deployment.
• Solid understanding of software testing principles and experience using Pytest or similar Python frameworks.
• Experience configuring and maintaining CI/CD pipelines for automated testing and deployment.
• Familiarity with version control systems such as GitLab.
• Knowledge of data governance, security best practices, and role-based access control (RBAC) in Snowflake.

Preferred Qualifications:
• Experience with dbt (data build tool) for transformation modeling.
• Knowledge of Snowflake's advanced features (e.g., masking policies, external functions, Snowpark).
• Exposure to cloud platforms (e.g., AWS, Azure, GCP).
• Strong communication and documentation skills.

Interested candidates can share their resume at sweta@talentvidas.com
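A minimal sketch of the automated testing with Pytest and Streamlit's built-in app testing mentioned above; app.py and the expected title text are assumptions for the example, and streamlit.testing.v1.AppTest requires a recent Streamlit release.

# test_app.py -- run with `pytest`
from streamlit.testing.v1 import AppTest

def test_app_runs_without_exceptions():
    at = AppTest.from_file("app.py").run()   # app.py is a hypothetical Streamlit app
    assert len(at.exception) == 0            # no uncaught exception during the scripted run

def test_title_is_rendered():
    at = AppTest.from_file("app.py").run()
    # Assumes the app calls st.title("Data Explorer") somewhere.
    assert at.title[0].value == "Data Explorer"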

Posted 2 weeks ago

Apply

3.0 - 6.0 years

10 - 13 Lacs

Bengaluru

Hybrid

Hi all, we are hiring for the role of Generative AI Engineer.
Experience: 3-6 years
Location: Bangalore
Notice Period: Immediate to 15 days

Position Overview: We are looking for a Generative AI Engineer with expertise in Azure OpenAI and hands-on experience with models such as GPT-4o and GPT-o1, as well as open-source LLMs like Llama and Mistral. You will work on GenAI solution development, RAG, fine-tuning, and deploying resources in an Azure environment. Proficiency in prompt engineering, Python, PostgreSQL, FastAPI, Streamlit, Django, and Angular is essential. This role also requires strong skills in AI model orchestration using intent mapping, Semantic Kernel, or function calling, along with proficiency in presentation and public speaking.

Key Responsibilities:
• Apply RAG, fine-tune, and deploy Azure OpenAI models (e.g., GPT-4o, GPT-o1) and other open-source large language models (LLMs).
• Build AI-powered applications using frameworks such as FastAPI, Streamlit, Django, and Angular.
• Design and execute AI workflows using tools like Prompt flow and Semantic Kernel, and implement function calling for complex use cases.
• Conduct prompt engineering to improve model performance for specific business cases.
• Visualize data and create user interaction insights using Power BI.
• Ensure smooth deployment and maintenance of models on Azure cloud infrastructure, including scalability and optimization.
• Prepare and deliver presentations, demos, and technical documentation to internal and external stakeholders.
• Stay updated with advancements in generative AI, NLP, and machine learning to continuously improve models and methodologies.

Required Skills & Qualifications:
• Bachelor's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related field.
• At least 2+ years of hands-on experience working on generative AI projects.
• Strong expertise in Azure OpenAI models (GPT-4o, GPT-3.5, GPT-o1, etc.).
• Proficient in prompt engineering, Python, Streamlit, Django, FastAPI, and Angular.
• Basics of HTML, CSS, JavaScript, TypeScript, and Angular.
• Basic understanding of neural networks, machine learning, and transformer architectures.
• Experience in retrieval-augmented generation (RAG) and fine-tuning large language models.
• Familiarity with AI model orchestration tools such as Semantic Kernel, intent mapping, and function calling techniques.
• Excellent public speaking and presentation skills to convey technical concepts to business stakeholders.
• Azure certified: AZ-900 or AI-900

Preferred Qualifications:
• Master's degree in Artificial Intelligence, Machine Learning, or a related field.
• At least 3+ years of experience working on generative AI, NLP, and machine learning projects.
• Strong understanding of neural networks, machine learning, and transformer architectures.
• Implemented GenAI solutions in production.
• Familiarity with the automotive industry.
• Hands-on experience in RAG, RAFT, and optimized fine-tuning.
• Azure certified: AI-102, DP-100, AZ-204, or DP-203

If you are interested, drop your resume at mojesh.p@acesoftlabs.com or call 9701971793.
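A minimal sketch of calling an Azure OpenAI chat deployment from Python with the openai SDK (v1.x); the endpoint, API version, and deployment name are placeholders, not details from the posting.

import os
from openai import AzureOpenAI

# Placeholder configuration; a real project would load these from environment/secrets.
client = AzureOpenAI(
    azure_endpoint=os.environ.get("AZURE_OPENAI_ENDPOINT", "https://my-resource.openai.azure.com"),
    api_key=os.environ.get("AZURE_OPENAI_API_KEY", "..."),
    api_version="2024-02-15-preview",
)

response = client.chat.completions.create(
    model="gpt-4o",  # the Azure *deployment name*; assumed here to match the model name
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize retrieval-augmented generation in two sentences."},
    ],
)
print(response.choices[0].message.content)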

Posted 3 weeks ago

Apply

6.0 - 11.0 years

30 - 35 Lacs

Coimbatore, Bengaluru, Delhi / NCR

Hybrid

Collaborate with business stakeholders to gather and validate requirements. Create and manage Jira tickets. Support sprint planning and backlog grooming. Create clear, structured requirements documentation and user stories.

Required Candidate profile: Experience in analytics, business intelligence, or data warehouse projects (Snowflake, Power BI, Streamlit). Working knowledge of Jira. Knowledge of Alternative Asset Management or Investment Banking.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

We are seeking a highly skilled Full Stack Developer with 3 to 5 years of experience to join our dynamic team. The ideal candidate will have a passion for developing innovative, scalable, and efficient web applications and backend systems. You will work on cutting-edge projects involving AI-driven solutions, API development, and seamless user interfaces while leveraging frameworks like React.js, Python, and FastAPI/Flask.

Key Responsibilities

Frontend Development:
• Design and develop responsive, user-friendly interfaces using React.js
• Implement reusable UI components with modern design principles
• Optimize front-end performance for maximum speed and scalability
• Hands-on experience working with React.js and Streamlit
• Knowledge of responsive design principles and CSS frameworks like Tailwind CSS or Material-UI
• Proficiency in HTML5, CSS3, and JavaScript (ES6+)
• Familiarity with testing frameworks (e.g., Jest, Cypress, Enzyme) is a plus

Backend Development:
• Strong experience with Python-based frameworks: FastAPI and/or Flask
• Proficiency in designing RESTful APIs and microservices architecture
• Implement robust authentication and authorization mechanisms
• Optimize backend processes for scalability and efficiency
• Ensure the backend integrates seamlessly with the front end and other services
• Database expertise: SQL (Snowflake/PostgreSQL) or NoSQL (MongoDB)
• Experience with API documentation tools like Swagger/OpenAPI

Testing & Deployment:
• Write unit, integration, and end-to-end tests to ensure code reliability
• Automate build, test, and deployment pipelines using CI/CD tools

Code Quality & Standards:
• Ensure code adheres to best practices for readability, maintainability, and performance

Required Skills and Qualifications:
• Proficiency in Git for version control and collaboration
• Should have worked on the Azure cloud platform
• Soft skills: excellent problem-solving and analytical skills; strong communication and collaboration abilities

Preferred Qualifications:
• Knowledge of WebSockets for real-time application features
• Familiarity with Agile/Scrum methodologies
• Experience with performance optimization tools and techniques

Educational Requirements: Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.

Flexible to work from the office 3 days a week, from 12:30 pm to 9:30 pm.
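A minimal FastAPI sketch of the RESTful API work described above; the Item model, routes, and in-memory store are invented for illustration.

from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="Demo API")

class Item(BaseModel):
    name: str
    price: float

# In-memory store standing in for the SQL/NoSQL database the posting mentions.
items: dict[int, Item] = {}

@app.post("/items/{item_id}")
def create_item(item_id: int, item: Item) -> Item:
    items[item_id] = item
    return item

@app.get("/items/{item_id}")
def read_item(item_id: int) -> Item:
    if item_id not in items:
        raise HTTPException(status_code=404, detail="Item not found")
    return items[item_id]

# Run locally with: uvicorn main:app --reload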

Posted 3 weeks ago

Apply

4.0 - 8.0 years

4 - 8 Lacs

Chennai, Tamil Nadu, India

On-site

Primary Responsibilities:
• Design and develop AI-driven web applications using Streamlit and LangChain
• Implement multi-agent workflows with LangGraph
• Integrate Claude 3 (via AWS Bedrock) into intelligent systems for document and image processing
• Work with FAISS for vector search and similarity matching
• Develop document integration solutions for PDF, DOCX, XLSX, PPTX, and image-based formats
• Implement OCR and summarization features using EasyOCR, PyMuPDF, and AI models
• Create features such as spell-check, chatbot accuracy tracking, and automatic re-training pipelines
• Build secure apps with SSO authentication, transcript downloads, and reference link generation
• Integrate external platforms like Confluence, SharePoint, ServiceNow, Veeva Vault, Outlook, G.Net/G.Share, and JIRA
• Collaborate on architecture, performance optimization, and deployment

Required Skills:
• Strong expertise in Streamlit, LangChain, LangGraph, and Claude 3 (AWS Bedrock)
• Hands-on experience with boto3, FAISS, EasyOCR, and PyMuPDF
• Advanced skills in document parsing and image/video-to-text summarization
• Proficient in modular architecture design and real-time AI response systems
• Experience in enterprise integration with tools like ServiceNow, Confluence, Outlook, and JIRA
• Familiarity with chatbot monitoring and retraining strategies

Secondary Skills:
• Working knowledge of PostgreSQL, JSON, and file I/O with Python libraries like os, io, time, datetime, and typing
• Experience with dataclasses and numpy for efficient data handling and numerical processing
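A small sketch of the FAISS vector search and similarity matching this posting mentions; the vectors here are random placeholders rather than real document embeddings.

import faiss
import numpy as np

dim = 384                       # e.g., the size of a sentence-embedding vector (assumed)
rng = np.random.default_rng(42)

# Stand-in "document embeddings"; a real app would produce these with an embedding model.
doc_vectors = rng.random((1000, dim), dtype=np.float32)

index = faiss.IndexFlatL2(dim)  # exact L2 search; IVF/HNSW indexes scale better
index.add(doc_vectors)

query = rng.random((1, dim), dtype=np.float32)
distances, ids = index.search(query, 5)  # top-5 nearest neighbours
print("Nearest document ids:", ids[0], "distances:", distances[0])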

Posted 4 weeks ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Skills: Machine Learning, Deep Learning, and AI Research; Generative AI (GANs, VAEs, Diffusion Models); document detail extraction and feature engineering; prompt engineering and chain-of-thought reasoning; Python programming and AI agent design.
• Developed and deployed an AI-powered generative tool for extracting structured details from unstructured documents.
• Implemented chain-of-thought reasoning techniques to craft optimized prompts for generative AI models.
• Designed and deployed AI agents to automate complex workflows and improve efficiency.
• Experience with neural networks, generative models, and advanced AI techniques: Python, GANs, VAEs, diffusion models, Streamlit.

Posted 4 weeks ago

Apply

7.0 - 9.0 years

12 - 16 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Responsibilities:
• Design, develop, and implement NLP projects to solve complex business problems
• Well versed in Artificial Intelligence, Machine Learning, and Deep Learning algorithms and techniques
• Evaluate and select appropriate machine learning models for tasks, and build and train working versions of those models using Python and other open-source technologies
• Proven experience in developing and deploying NER models
• Experience with transfer learning and pre-trained language models (e.g., BERT, GPT)
• Proficiency in programming languages such as Python, with experience in NLP libraries (e.g., spaCy, NLTK, Stanford NLP, Hugging Face Transformers)
• Proficiency in programming languages such as Python and R, and frameworks like TensorFlow, PyTorch, Keras, scikit-learn, Caffe, CNTK
• Work across client teams to develop and implement LLM solutions
• Develop prompts that instruct LLMs to generate relevant and accurate responses; RAG architecture; LLM fine-tuning
• Expertise in EDA and data engineering, including data curation, cleaning, and preprocessing
• Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions
• Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels
• Track record of driving innovation and staying updated with the latest AI research and advancements
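As an illustration of the NER work mentioned above, a short spaCy sketch; it assumes the en_core_web_sm model has already been downloaded (python -m spacy download en_core_web_sm), and the sample sentence is invented.

import spacy

# Assumes the small English model has been downloaded beforehand.
nlp = spacy.load("en_core_web_sm")

text = "Acme Corp hired Priya Sharma in Bengaluru to lead its NLP team in 2024."
doc = nlp(text)

# Print each recognized entity with its label (PERSON, ORG, GPE, DATE, ...).
for ent in doc.ents:
    print(ent.text, "->", ent.label_)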

Posted 4 weeks ago

Apply

6.0 - 8.0 years

0 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Job Description:

Who are we? Infosys (NYSE: INFY) is a global leader in consulting, technology, and outsourcing solutions with annual revenues of over 15.64 B as of 2022. We enable clients in more than 50 countries to stay a step ahead of emerging business trends and outperform the competition. Infosys was recognized as one of the 2022 World's Most Ethical Companies for the second consecutive year by Ethisphere.

What are we looking for? Lead Machine Learning Engineer. We are looking for smart, self-driven, high-energy people with top-notch communication skills, intellectual curiosity, and a passion for technology in the machine learning space. Our analysts have a blend of in-depth domain expertise in one or more areas (Retail, CPG, Logistics), strong business and technical acumen, and excellent soft skills.

What do we require? You will work with clients to understand the issues they face, diagnose problems, design solutions, and facilitate solution deployment on Azure ML. One can be an individual contributor or lead small teams depending on the project. You will be pivotal to understanding the requirement, problem definition, and discovery of the overall solution. You will also have the opportunity to shape value-adding consulting solutions for clients by connecting various functions of cloud components. Industry knowledge: knows the basics of machine learning, is aware of cloud services (Azure services), has a deep understanding of coding practices, knows how to guide teams in debugging issues, can connect the dots to arrive at a solution, and is very good at presenting ideas, thoughts, and solutions.

Key Responsibilities:
• Technical knowledge: expertise in cloud technologies, specifically MS Azure and its services, with hands-on coding
• Expertise in object-oriented Python programming, with 6-8 years of experience
• DevOps working knowledge with implementation experience on at least 1 or 2 projects
• Hands-on MS Azure cloud knowledge
• Understand and take requirements on operationalization of ML models from data scientists
• Help the team with ML pipelines from creation to execution
• List Azure services required for deployment; Azure Databricks and Azure DevOps setup
• Assist the team with coding standards (flake8, etc.) and guide the team in debugging pipeline failures
• Engage with business stakeholders with status updates on development progress and issue fixes
• Drive automation, technology, and process improvement for the deployed projects
• Set up standards related to coding, pipelines, and documentation
• Adhere to KPIs/SLAs for pipeline run and execution
• Research new topics, services, and enhancements in cloud technologies

Domain / Technical / Tools Knowledge:
• Object-oriented programming, coding standards, architecture and design patterns, config management, package management, logging, and documentation
• Experience in test-driven development and Pytest frameworks, Git version control, and REST APIs
• Python programming with OOP concepts; SQL, XML, YAML, Bash, JSON; Pydantic models, class-based frameworks, dependency injection; FastAPI, Flask, Streamlit
• Azure API Management, API gateways, Traffic Manager, load balancers, Nginx, Uvicorn, Gunicorn
• Azure ML best practices in environment management and runtime configurations; Azure ML/Databricks clusters and alerts
• Experience designing and implementing ML systems and pipelines, and MLOps practices and tools such as MLflow, Kubernetes, etc.
• Exposure to event-driven orchestration and online model deployment
• Contribute towards establishing best practices in MLOps systems development
• Proficiency with data analysis tools (e.g., SQL, R, Python)
• High-level understanding of database concepts, reporting, and data science concepts
• Hands-on experience working with client IT and business teams to gather business requirements and convert them into requirements for the development team
• Experience in managing client relationships and developing business cases for opportunities
• Azure AZ-900 certification, with Azure Architect certification preferred

Technical Requirements: Primary skills: Technology->OpenSystem->Python. Responsible for successful delivery of MLOps solutions and services in client consulting environments. Define key business problems to be solved, formulate high-level solution approaches, identify data to solve those problems, develop and analyze, draw conclusions, and present to the client. Assist clients with operationalization metrics to track the performance of ML models. Agile trained to manage team effort and track it through JIRA. High-impact communication: assesses the target audience's needs, prepares and practices a logical flow, answers audience questions appropriately, and sticks to the timeline.

Additional Responsibilities: Good knowledge of software configuration management systems. Strong business acumen, strategy, and cross-industry thought leadership. Awareness of the latest technologies and industry trends.

Education and Experience: Overall 6 to 8 years of experience in data-driven software engineering, with 3-5 years of experience designing, building, and deploying enterprise AI or ML applications, and at least 2 years of experience implementing full-lifecycle ML automation using MLOps (scalable development to deployment of complex data science workflows). Bachelor's or Master's degree in Computer Science, Engineering, or equivalent. Domain experience in Retail, CPG, Logistics, etc. Azure certified: DP-100, AZ-900/AI-900. Logical thinking and problem-solving skills along with an ability to collaborate. Knowledge of two or three industry domains. Understanding of the financial processes for various types of projects and the various pricing models available. Client interfacing skills. Knowledge of SDLC and agile methodologies. Project and team management.

Preferred Skills: Technology->OpenSystem->Python
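A minimal sketch of the MLOps experiment tracking referenced above, using MLflow; the dataset, hyperparameters, and metric are invented for the example.

import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

with mlflow.start_run(run_name="rf-baseline"):
    params = {"n_estimators": 200, "max_depth": 5}
    model = RandomForestRegressor(**params, random_state=0).fit(X_train, y_train)

    # Log hyperparameters, a held-out metric, and the model so runs are comparable later.
    mlflow.log_params(params)
    mlflow.log_metric("r2_test", model.score(X_test, y_test))
    mlflow.sklearn.log_model(model, "model")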

Posted 4 weeks ago

Apply

5.0 - 10.0 years

1 - 2 Lacs

Remote, , India

On-site

Develop a scalable data collection, storage, and distribution platform to house data from vendors, research providers, exchanges, prime brokers (PBs), and web scraping. Make data available to systematic and fundamental PMs and to enterprise functions: Ops, Risk, Trading, and Compliance. Develop internal data products and analytics.

Responsibilities:
• Web scraping using scripts/APIs/tools
• Help build and maintain a greenfield data platform running on Snowflake and AWS
• Understand the existing pipelines and enhance them for new requirements
• Onboard new data providers
• Data migration projects

Skills:
• SQL (senior level)
• Python (senior level)
• Streamlit expertise
• Linux; containerization (Docker, Kubernetes)
• Good communication skills
• AWS; DevOps skills (K8s, Docker, Jenkins)

Nice to have: market data projects / capital markets experience; Snowflake is a big plus; Airflow.
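A small sketch of the script-based web scraping listed in the responsibilities; the URL and table markup are placeholders, not a real vendor page.

import requests
from bs4 import BeautifulSoup

URL = "https://example.com/prices"          # placeholder data-vendor page
resp = requests.get(URL, headers={"User-Agent": "data-platform-bot/0.1"}, timeout=30)
resp.raise_for_status()

soup = BeautifulSoup(resp.text, "html.parser")
rows = []
for row in soup.select("table.prices tr")[1:]:   # hypothetical table markup, skip header row
    cells = [td.get_text(strip=True) for td in row.find_all("td")]
    if cells:
        rows.append({"symbol": cells[0], "price": cells[1]})

print(f"Scraped {len(rows)} rows")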

Posted 4 weeks ago

Apply

6.0 - 11.0 years

30 - 35 Lacs

Hyderabad, Pune, Delhi / NCR

Hybrid

Collaborate with business stakeholders to gather and validate requirements. Create and manage Jira tickets. Support sprint planning and backlog grooming. Create clear, structured requirements documentation and user stories.

Required Candidate profile: Experience in analytics, business intelligence, or data warehouse projects (Snowflake, Power BI, Streamlit). Working knowledge of Jira. Knowledge of Alternative Asset Management or Investment Banking.

Posted 1 month ago

Apply

6.0 - 10.0 years

30 - 35 Lacs

Bengaluru, Delhi / NCR, Mumbai (All Areas)

Hybrid

Create documentation and user stories. Work with engineering teams to review upcoming and backlog Jira tickets. Provide guidance on design decisions in areas including Credit and technology, including Snowflake and Streamlit. Develop reporting in Power BI.

Required Candidate profile: 5+ years of experience as a Business Analyst, especially in Alternative Assets, Credit, CLO, Real Estate, etc. Experience creating complex dashboards in Power BI. Exposure to Snowflake and Streamlit.

Posted 1 month ago

Apply

6.0 - 10.0 years

30 - 35 Lacs

Hyderabad, Pune, Delhi / NCR

Hybrid

Create documentation and user stories. Work with engineering teams to review upcoming and backlog Jira tickets. Provide guidance on design decisions in areas including Credit and technology, including Snowflake and Streamlit. Develop reporting in Power BI.

Required Candidate profile: 5+ years of experience as a Business Analyst, especially in Alternative Assets, Credit, CLO, Real Estate, etc. Experience creating complex dashboards in Power BI. Exposure to Snowflake and Streamlit.

Posted 1 month ago

Apply

3 - 8 years

3 - 8 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Hiring a GenAI Python Developer with 3 to 8 years of experience. Mandatory Skills: Python, LangChain techniques, LLM architectures, GPT, BERT, Streamlit, RAG mechanism. Education: B.Tech/BE, BCA/MCA, B.Sc/M.Sc.

Posted 1 month ago

Apply

3 - 8 years

5 - 12 Lacs

Pune, Ahmedabad, Bengaluru

Work from Office

Hiring a GenAI Python Developer with 3 to 8 years of experience. Mandatory Skills: Python, LangChain techniques, LLM architectures, GPT, BERT, Streamlit, RAG mechanism. Education: B.Tech/BE, BCA/MCA, B.Sc/M.Sc.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies