Jobs
Interviews

6123 Retrieval Jobs - Page 19

Set up a job alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

0 years

0 Lacs

Mysore, Karnataka, India

On-site

About Pandita AI Solutions
Pandita AI Solutions is a rapidly growing AI startup headquartered in Palo Alto, CA, with a thriving engineering division in Mysore, India. We are pioneers in developing generative AI systems, foundation models, and multimodal applications that solve real-world problems across industries like healthcare, scientific computing, and document intelligence. We are now looking to automate our internal operations using cutting-edge agentic AI and LLM-driven tools. This effort will be led by the founding team in close collaboration with a dedicated Generative AI Engineer, who will help implement intelligent automation and later scale it for external clients and products.

Key Responsibilities
- Work closely with the leadership team to map internal processes that can benefit from AI-based automation (e.g., document handling, research synthesis, client onboarding, code generation, workflow orchestration).
- Build internal tools using LLMs and agent frameworks (LangChain, OpenAI Functions, AutoGen, CrewAI, etc.) for semi-autonomous task execution.
- Prototype, test, and refine agentic applications that integrate with internal systems (e.g., Notion, Slack, GitHub, Airtable, email, databases).
- As the automation matures internally, support the transition to external automation offerings for clients across domains.
- Maintain a portfolio of your work on GitHub, and contribute reusable components to the broader engineering team.

Minimum Qualifications
- Master's degree in Computer Science, AI, Data Science, or a related technical field.
- Solid programming skills in Python and experience building backend services, APIs, or automation tools.
- Demonstrated ability to build and maintain agent-based apps using tools like LangChain, CrewAI, LlamaIndex, or custom orchestration frameworks.
- A strong GitHub portfolio of previous automation projects or agentic AI applications (personal or professional).
- Familiarity with REST APIs, cloud services (AWS/GCP), and tools like Docker, FastAPI, or Streamlit.

Preferred Qualifications
- Experience integrating LLMs with external APIs and multi-step workflows (e.g., form parsing + summarization + ticket generation).
- Familiarity with tools like Zapier, Make, Airflow, or custom workflow engines.
- Understanding of retrieval-augmented generation (RAG), embeddings, and prompt engineering best practices.
- Publications in international conferences such as IEEE, ICML, NIPS.
- Entrepreneurial mindset with a willingness to explore ambiguous spaces and iterate rapidly.

What We Offer
- A unique opportunity to work directly with founders and AI researchers on defining the next era of automation.
- A fast-paced, early-stage startup culture with a focus on delivery, ownership, and learning.
- ESOPs, competitive compensation, and access to high-end compute infrastructure.
- Long-term growth opportunity to lead automation efforts both internally and across external enterprise deployments.
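For candidates curious what "semi-autonomous task execution" looks like in practice, here is a minimal, self-contained sketch of the tool-routing pattern that frameworks like LangChain or CrewAI build on. The planner is a mock (a real system would use LLM function calling), and the tool names and document are invented for illustration:

```python
# Minimal sketch of the tool-routing pattern behind agent frameworks:
# a planner (mocked here; normally an LLM) picks a registered tool,
# and the runtime executes it against the payload.

TOOLS = {}

def tool(name):
    """Register a callable as a named tool the planner can invoke."""
    def register(fn):
        TOOLS[name] = fn
        return fn
    return register

@tool("summarize")
def summarize(text: str) -> str:
    # Naive stand-in for an LLM summary: return the first sentence.
    return text.split(".")[0] + "."

@tool("word_count")
def word_count(text: str) -> str:
    return str(len(text.split()))

def mock_planner(request: str) -> str:
    # Stand-in for an LLM function-calling step: map request to a tool name.
    return "summarize" if "summar" in request else "word_count"

def run_agent(request: str, payload: str) -> str:
    return TOOLS[mock_planner(request)](payload)

doc = "Quarterly revenue grew 12%. Costs were flat. Margins improved."
print(run_agent("please summarize this", doc))  # prints: Quarterly revenue grew 12%.
```

In a real deployment the planner call would be an LLM with a tool schema, and tools would wrap Notion, Slack, or GitHub APIs; the registry-plus-dispatch structure stays the same.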

Posted 5 days ago

Apply

1.0 - 3.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Position: MERN Stack Developer
Location: Onsite - Mumbai
Employment Type: Full-Time
Experience Required: 1 to 3 years

About Nurdd
Nurdd is an AI-powered marketing attribution platform built for the modern influencer economy. We help brands understand what actually drives performance in creator-led campaigns through advanced data tracking, attribution models, and real-time insights. Our systems enable brands to make ROI-driven decisions while eliminating inefficiencies in influencer marketing. We're building scalable, intelligent infrastructure that powers the future of brand-creator collaboration, and we're looking for engineers who can bring that vision to life.

Role Overview
We are hiring a MERN Stack Developer to join our engineering team. You will be responsible for developing and maintaining backend services and full-stack features that support our attribution engine. The ideal candidate will be confident in both frontend and backend development using MongoDB, Express.js, React.js, and Node.js, with experience in building scalable APIs and integrating AI-based services.

Responsibilities
- Design, develop, and maintain web applications using the MERN stack
- Build RESTful and GraphQL APIs to power frontend and mobile apps
- Work closely with the AI/ML team to integrate model outputs and analytics
- Optimize data storage and retrieval using MongoDB for large-scale attribution data
- Implement secure, scalable, and modular backend architecture on AWS or similar platforms
- Collaborate with frontend developers to deliver seamless end-to-end product experiences
- Write clean, efficient, and well-documented code
- Participate in code reviews and continuously improve development workflows

Requirements
- 1 to 3 years of hands-on experience with the full MERN stack
- Strong proficiency in Node.js, Express.js, React.js, and MongoDB
- Experience with building and consuming REST/GraphQL APIs
- Familiarity with cloud platforms (preferably AWS) for deployment and scaling
- Working knowledge of Git and collaborative development workflows
- Ability to translate business logic into scalable backend services
- Strong debugging and problem-solving skills

Preferred Skills
- Experience with Firebase Authentication, AWS Lambda, or EC2
- Exposure to AI/ML-driven applications or analytics dashboards
- Understanding of authentication, authorization, and data privacy best practices
- Familiarity with Docker, CI/CD pipelines, or serverless architecture

Why Work With Us
- Be part of a product team solving attribution and performance measurement at scale
- Work in a fast-moving, technically challenging environment with high ownership
- Join an early-stage team building the next-gen stack for the creator economy

Posted 5 days ago

Apply

0 years

0 Lacs

Nashik, Maharashtra, India

On-site

I. Accounting & Financial Management

Bookkeeping & Record Keeping:
- Maintain accurate and up-to-date financial records, including ledgers, journals, and subsidiary books, making sure all transactions are recorded correctly and on time.
- Organize and maintain proper financial documentation, both physical and digital, for easy retrieval and audit purposes.

Accounts Payable (AP):
- Process vendor invoices, ensuring proper approvals and accurate coding to relevant accounts.
- Prepare and process vendor payments, making sure they're disbursed on time and reconciling vendor statements.

Accounts Receivable (AR):
- Generate and issue invoices to clients for services or goods.
- Monitor outstanding receivables, follow up with clients for timely collections, and accurately process customer payments.

Bank & Cash Management:
- Perform daily, weekly, or monthly bank reconciliations, ensuring accuracy between bank statements and internal records.
- Manage petty cash, handle cash transactions, and ensure proper reconciliation.

Statutory Compliance:
- Assist with the preparation and filing of various tax returns, including GST (Goods and Services Tax), TDS (Tax Deducted at Source), and Income Tax, ensuring adherence to Indian tax laws and deadlines.
- Help prepare documentation for audits (internal and external) and liaise with auditors as needed.

Financial Reporting Assistance:
- Support the preparation of basic financial statements, such as Profit & Loss statements and Balance Sheets, under the guidance of senior management or consultants.
- Assist during month-end and year-end closing processes.

II. Administrative & Office Management

Office Operations:
- Manage general office administration, ensuring a smooth workflow and an organized work environment.
- Oversee office supplies inventory, placing orders and making sure everything is available when needed.
- Handle incoming and outgoing mail, couriers, and general correspondence.

Reception & Communication:
- Greet visitors, answer and direct phone calls, and manage general email inquiries professionally.
- Provide administrative support to management and other departments as needed.

HR Support (Basic):
- Maintain basic employee records (attendance, leaves, personal information).
- Assist with basic payroll data collection (e.g., attendance data for processing).
- Support the onboarding of new employees by preparing necessary paperwork and setting up workspaces.

Vendor Management:
- Liaise with various service providers (e.g., internet, utilities, cleaning services) to ensure smooth operations.

Document Management:
- Maintain an organized filing system for all administrative documents, contracts, and records.

This job is provided by Shine.com

Posted 5 days ago

Apply

5.0 - 7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Title: Python Developer
Experience Level: 5-7 Years
Location: Hyderabad

Job Description
We are seeking an experienced Lead Python Developer with a proven track record of building scalable and secure applications, specifically in the travel and tourism industry. The ideal candidate should possess in-depth knowledge of Python, modern development frameworks, and expertise in integrating third-party travel APIs. This role demands a leader who can foster innovation while adhering to industry standards for security, scalability, and performance.

Roles and Responsibilities
- Application Development: Architect and develop robust, high-performance applications using Python frameworks such as Django, Flask, and FastAPI.
- API Integration: Design and implement seamless integration with third-party APIs, including GDS, CRS, OTA, and airline-specific APIs, to enable real-time data retrieval for booking, pricing, and availability.
- Data Management: Develop and optimize complex data pipelines to manage structured and unstructured data, utilizing ETL processes, data lakes, and distributed storage solutions.
- Microservices Architecture: Build modular applications using microservices principles to ensure scalability, independent deployment, and high availability.
- Performance Optimization: Enhance application performance through efficient resource management, load balancing, and faster query handling to deliver an exceptional user experience.
- Security and Compliance: Implement secure coding practices, manage data encryption, and ensure compliance with industry standards such as PCI DSS and GDPR.
- Automation and Deployment: Leverage CI/CD pipelines, containerization, and orchestration tools to automate testing, deployment, and monitoring processes.
- Collaboration: Work closely with front-end developers, product managers, and stakeholders to deliver high-quality, user-centric solutions aligned with business goals.

Requirements
Education:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Technical Expertise:
- At least 4 years of hands-on experience with Python frameworks like Django, Flask, and FastAPI.
- Proficiency in RESTful APIs, GraphQL, and asynchronous programming.
- Strong knowledge of SQL/NoSQL databases (PostgreSQL, MongoDB) and big data tools (e.g., Spark, Kafka).
- Experience with cloud platforms (AWS, Azure, Google Cloud), containerization (Docker, Kubernetes), and CI/CD tools (e.g., Jenkins, GitLab CI).
- Familiarity with testing tools such as PyTest, Selenium, and SonarQube.
- Expertise in travel APIs, booking flows, and payment gateway integrations.
Soft Skills:
- Excellent problem-solving and analytical abilities.
- Strong communication, presentation, and teamwork skills.
- A proactive attitude with a willingness to take ownership and perform under pressure.
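As an illustration of the real-time pricing/availability integration this role describes, here is a hedged sketch of fanning out concurrent requests to several providers with asyncio and keeping the cheapest quote. The provider names, delays, and prices are invented stand-ins; a real integration would call GDS/OTA HTTP endpoints:

```python
# Fan-out pattern for aggregating third-party travel APIs: query several
# mock providers concurrently and keep the cheapest quote.
import asyncio

async def fetch_quote(provider: str, delay: float, price: float) -> dict:
    # Stand-in for an HTTP call to a provider's pricing/availability endpoint.
    await asyncio.sleep(delay)
    return {"provider": provider, "price": price}

async def best_quote() -> dict:
    # gather() runs all provider calls concurrently and preserves order.
    quotes = await asyncio.gather(
        fetch_quote("gds_a", 0.01, 412.0),
        fetch_quote("ota_b", 0.02, 389.5),
        fetch_quote("air_c", 0.01, 405.0),
    )
    return min(quotes, key=lambda q: q["price"])

print(asyncio.run(best_quote()))  # -> {'provider': 'ota_b', 'price': 389.5}
```

A production version would add per-provider timeouts, retries, and fallbacks so one slow supplier cannot stall the whole search.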

Posted 5 days ago

Apply

0.0 - 4.0 years

0 Lacs

Bengaluru, Karnataka

On-site

Location: Bangalore, Karnataka, 560048 | Category: Engineering | Job Type: Full time | Job Id: 1191041

AI & Analytics Workload Specialist

This role has been designed as 'Onsite' with an expectation that you will primarily work from an HPE office.

Who We Are:
Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. Our culture thrives on finding new and better ways to accelerate what's next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.

Job Description:
In HPE Hybrid Cloud, we lead the innovation agenda and technology roadmap for all of HPE. This includes managing the design, development, and product portfolio of our next-generation cloud platform, HPE GreenLake. Working with customers, we help them reimagine their information technology needs to deliver a simple, consumable solution that helps them drive their business results. Join us to redefine what's next for you.

The HPE Worldwide Hybrid Cloud Acceleration Team is seeking a technically skilled and innovative Solutions Engineer to join our global AI and Analytics team. This intermediate-level role is ideal for individuals who want to build real-world AI solutions to address customer problems, contribute to enterprise-scale validation projects, and drive field enablement with impactful assets. You will work closely with Solution Product Managers, workload Subject Matter Experts, and field teams to deliver innovative AI/Analytics workload solutions that position HPE as a leader in data-driven transformation.

What you'll do:

Design & Validation
- Apply technical expertise to design and validate AI/Analytics workload solutions.
- Contribute technical assets such as demos, white papers, videos, blogs, labs, and internal enablement materials.
- Collaborate with stakeholders to define the scope and align solutions with business needs.
- Leverage internal infrastructure and AI-powered tools to accelerate development and validation.

Field Enablement & Asset Creation
- Create compelling enablement content for the field, partners, and customers.
- Support solution adoption through TekTalks, webinars, Slack forums, and internal events.
- Integrate assets with field tools and help measure asset utilization.

Career Development
- Invest in technical and personal growth through internal training, certifications, mentorships, and participation in HPE's Technical Career Path (TCP).
- Contribute to team goals, mentor peers, and participate in a collaborative engineering environment.

What you need to bring:
- 2-4 years of hands-on experience in building or validating AI and Analytics solutions, with a focus on real-world enterprise use cases.
- Proven experience developing or deploying AI-powered applications such as Retrieval-Augmented Generation (RAG) systems, conversational AI/chatbot solutions, and ML model pipelines for analytics or inference.
- Strong proficiency in Python and familiarity with common AI/ML frameworks (e.g., LangChain, Hugging Face, PyTorch, TensorFlow, OpenAI APIs).
- Hands-on experience with data manipulation, embedding/vector databases (e.g., FAISS, Chroma, Weaviate), and prompt engineering.
- Experience with virtualization platforms (e.g., VMware, KVM) and containers (e.g., Docker, Kubernetes) is a plus.
- Familiarity with deploying AI workloads in cloud environments (e.g., Azure, AWS, or GCP), particularly using GPU-accelerated instances, is a plus.
- Strong written and verbal communication skills, with the ability to explain complex technical ideas clearly.
- Bachelor's degree in Computer Science, Data Science, Engineering, or a related technical field.

Preferred Qualifications:
- Understanding of enterprise IT systems, storage, and hybrid cloud architecture.
- Prior experience building AI/ML solution reference architectures or benchmarks.
- Published technical content (e.g., blogs, demos, white papers).

Additional Skills: Cloud Architectures, Cross Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full Stack Development, Security-First Mindset, Solutions Design, Testing & Automation, User Experience (UX)

What We Can Offer You:

Health & Wellbeing
We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing.

Personal & Professional Development
We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have, whether you want to become a knowledge expert in your field or apply your skills to another division.

Unconditional Inclusion
We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.

Let's Stay Connected:
Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE. #india #hybridcloud

Job: Engineering
Job Level: TCP_02

HPE is an Equal Employment Opportunity / Veterans / Disabled / LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together. Hewlett Packard Enterprise is EEO Protected Veteran / Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
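To make the embedding/vector-database requirement concrete, here is a toy sketch of the retrieval step in a RAG system: rank documents by cosine similarity to a query vector. The 3-d vectors and document titles are invented; a real workload would use model-generated embeddings stored in FAISS, Chroma, or Weaviate:

```python
# Toy embedding retrieval: rank documents by cosine similarity to a query.
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical 3-d embeddings standing in for model output.
docs = {
    "gpu sizing guide": [0.9, 0.1, 0.0],
    "storage layout":   [0.1, 0.8, 0.2],
    "chatbot design":   [0.2, 0.1, 0.9],
}

def retrieve(query_vec, k=1):
    # A vector database does this ranking with approximate-nearest-neighbor
    # indexes instead of a full scan.
    ranked = sorted(docs, key=lambda d: cosine(query_vec, docs[d]), reverse=True)
    return ranked[:k]

print(retrieve([0.85, 0.2, 0.1]))  # -> ['gpu sizing guide']
```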

Posted 5 days ago

Apply

21.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

What we do?
At ClearTrail, work is more than just a job. Our calling is to develop solutions that empower those dedicated to keeping their people, places and communities safe. For over 21 years, law enforcement & federal agencies across the globe have trusted ClearTrail as their committed partner in safeguarding nations & enriching lives. We are envisioning the future of intelligence gathering by developing artificial intelligence and machine learning based lawful interception & communication analytics solutions that solve the world's most challenging problems.

What are we looking for?
We are looking for an ML Engineer (2-4 years experience) with ML and LLM skill sets, based in Indore, M.P.

Roles and Responsibilities
- Develop end-to-end machine learning pipelines, including model development, refinement, and implementation for a variety of analytics problems.
- Communicate results to diverse technical and non-technical audiences.
- Provide LLM expertise to solve problems using state-of-the-art language models and off-the-shelf LLM services such as OpenAI models, along with knowledge of retrieval-augmented generation and related techniques to improve the performance and capabilities of LLMs.
- Research and innovation: stay up to date with the latest advancements in the field of AI.
- Strong problem-solving and code-debugging skills.
- Hands-on practical experience with large language and generative AI models, both proprietary and open source, including transformers and GPT models (preferred).

Skills
Mandatory hands-on experience with:
- Libraries: Python, Scikit-Learn, PyTorch, LangChain, Transformers.
- Techniques: exploratory data analysis, machine learning and neural networks; model building, hyperparameter tuning, model performance metrics, and model deployment.
- Practical knowledge of LLMs and LLM fine-tuning.
Good to have experience with:
- Deep Learning
- MLOps
- SQL

Qualifications
- Bachelor's degree in computer science & engineering.
- 2-4 years of proven experience as a Machine Learning Engineer or in a similar role with an LLM skill set.
- Sound theoretical and practical knowledge of machine learning algorithms, and hands-on experience with LLM applications.
(ref:hirist.tech)
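A minimal illustration of the hyperparameter-tuning skill listed above: grid-search a decision threshold for a toy one-feature classifier and keep the setting with the best accuracy. The data is invented; real work would use scikit-learn's GridSearchCV with cross-validation:

```python
# Minimal hyperparameter tuning: try each candidate threshold, score it,
# and keep the best. (score, label) pairs are an invented toy dataset.

data = [(0.2, 0), (0.4, 0), (0.45, 0), (0.6, 1), (0.7, 1), (0.9, 1)]

def accuracy(threshold):
    """Fraction of examples where (score >= threshold) matches the label."""
    correct = sum((score >= threshold) == bool(label) for score, label in data)
    return correct / len(data)

def grid_search(thresholds):
    # Exhaustively evaluate every candidate; return the best-scoring one.
    return max(thresholds, key=accuracy)

best = grid_search([0.3, 0.5, 0.8])
print(best, accuracy(best))  # -> 0.5 1.0
```

The same evaluate-all-candidates loop generalizes to any hyperparameter (tree depth, learning rate), with a held-out validation set replacing the training data for honest scoring.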

Posted 5 days ago

Apply

0 years

0 Lacs

Nagpur, Maharashtra, India

On-site

Role Summary
We're looking for a data-science leader who can turn raw data into clear insight and innovative AI products. You'll own projects end-to-end, from framing business questions and exploring datasets to deploying and monitoring models in the cloud. Along the way you'll introduce generative-AI ideas such as chat assistants and retrieval-augmented search, steer a small team of data professionals, and work closely with product, engineering, and business stakeholders to deliver measurable results.

Responsibilities:
- Design and build predictive & forecasting models that drive measurable impact.
- Plan and run A/B experiments to validate ideas and guide product decisions.
- Develop and maintain data pipelines that ensure clean, trusted, and timely datasets.
- Lead generative-AI initiatives (e.g., LLM-powered chat, RAG search, custom embeddings).
- Package, deploy, and monitor models using modern MLOps practices in the public cloud.
- Establish monitoring & alerting for accuracy, latency, drift, and cost.
- Mentor and coach the team, conducting code reviews and sharing best practices.
- Translate complex findings into clear, action-oriented stories for non-technical audiences.
- Ensure data governance and privacy across all projects, meeting internal and industry standards.
- Continuously evaluate new tools & methods, running quick PoCs to keep solutions current.

Skills & Experience:
- Solid foundation in statistics, experiment design, and end-to-end ML workflows.
- Strong Python and SQL; proven record of moving models from notebook to production.
- Hands-on cloud experience (AWS, Azure, or GCP) with container-based deployment and CI/CD.
- Practical exposure to generative-AI projects: prompt engineering, fine-tuning, or retrieval-augmented pipelines.
(ref:hirist.tech)
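The A/B-experiment responsibility above can be made concrete with a stdlib-only sketch: a two-proportion z-test comparing conversion rates between control and variant. The counts are invented; |z| > 1.96 corresponds to p < 0.05 two-sided:

```python
# Two-proportion z-test for an A/B experiment on conversion rates.
import math

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """z-statistic for H0: both groups share one conversion rate."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # rate under H0
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Invented experiment: control converts 120/1000, variant 150/1000.
z = two_proportion_z(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(round(z, 2))  # roughly at the 5% significance boundary
```

In practice the sample size would be fixed up front via a power calculation, and the test run only once the experiment reaches that size, to avoid peeking bias.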

Posted 5 days ago

Apply

5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We are a technology-led healthcare solutions provider, driven by our purpose to enable healthcare organizations to be future-ready. We offer accelerated, global growth opportunities for talent that is bold, industrious and nimble. With Indegene, you gain a unique career experience that celebrates entrepreneurship and is guided by passion, innovation, collaboration and empathy. To explore exciting opportunities at the convergence of healthcare and technology, check out www.careers.indegene.com

What if we told you that you can move to an exciting role in an entrepreneurial organization without the usual risks associated with it? We understand that you are looking for growth and variety in your career at this point and we would love for you to join us in our journey and grow with us. At Indegene, our roles come with the excitement you require at this stage of your career with the reliability you seek. We hire the best and trust them from day 1 to deliver global impact, handle teams and be responsible for the outcomes while our leaders support and mentor you. We are a profitable, rapidly growing global organization and are scouting for the best talent for this phase of growth. With us, you are at the intersection of two of the most exciting industries: healthcare and technology. We offer global opportunities with fast-track careers while you work with a team that is fueled by purpose. The combination of these will lead to a truly differentiated experience for you. If this excites you, then apply below.

Role: Associate Manager - AI Engineer

Description:
We are seeking a Senior AI Engineer to lead the design, development, and deployment of advanced AI and Generative AI (GenAI) solutions that deliver significant business value. The ideal candidate will have extensive hands-on experience with AI/ML frameworks, cloud platforms, and MLOps, coupled with a deep understanding of GenAI technologies.

Key Responsibilities
- Design, develop, and optimize AI/ML models for practical applications, ensuring high performance, scalability, and reliability.
- Innovate using advanced GenAI technologies (e.g., LLMs, RAG, OpenAI APIs) to address complex business challenges.
- Implement and manage end-to-end MLOps pipelines, including CI/CD, model monitoring, retraining workflows, and versioning.
- Architect and deploy scalable, cost-effective AI/ML solutions on cloud platforms (AWS, Azure).
- Develop and maintain robust APIs for seamless AI/ML integration into enterprise systems.
- Collaborate with cross-functional stakeholders to align AI solutions with business objectives.
- Mentor and guide junior engineers, fostering innovation and adherence to best practices in AI/ML development and deployment.
- Develop and train Generative AI models.
- Perform data analysis and prepare data for AI model training.
- Integrate AI models with Snowflake, AWS, and other systems.

Must Have
- 5 years of experience in AI/ML development and deployment, including leadership roles.
- Good knowledge of machine learning and Generative AI, especially content generation using AWS Bedrock and OpenAI-based models.
- Proficiency with GenAI tools and technologies such as LLMs, Retrieval-Augmented Generation (RAG), OpenAI APIs, LangChain, Streamlit, and vector databases.
- Strong experience in building scalable (Gen)AI applications on AWS.
- Strong background and understanding of vector databases.
- Experience in building (Gen)AI solutions on Snowflake is a plus.
- Good knowledge of Python for data science, as well as Streamlit for rapid deployment of prototypes.
- Good knowledge of Git.
- Experience working in an agile and international environment.
- Experience in the setup and usage of CI/CD pipelines, as well as writing software in a test-driven fashion.
- Good documentation and coaching practice.

EQUAL OPPORTUNITY
Indegene is proud to be an Equal Employment Employer and is committed to a culture of inclusion and diversity. We do not discriminate on the basis of race, religion, sex, colour, age, national origin, pregnancy, sexual orientation, physical ability, or any other characteristics. All employment decisions, from hiring to separation, will be based on business requirements and the candidate's merit and qualifications. We are an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, colour, religion, sex, national origin, gender identity, sexual orientation, disability status, protected veteran status, or any other characteristics.

Locations: Bangalore, KA, IN
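To illustrate the RAG work this role mentions, here is a minimal sketch of the prompt-assembly step: stitch retrieved passages into a grounded prompt before the LLM call. The passages and template are invented, and the actual model call (OpenAI, Bedrock) is deliberately omitted:

```python
# RAG prompt assembly: number the retrieved passages and instruct the
# model to answer only from them, citing passage numbers.

def build_rag_prompt(question: str, passages: list) -> str:
    context = "\n".join(f"[{i + 1}] {p}" for i, p in enumerate(passages))
    return (
        "Answer using only the numbered context below. Cite passage numbers.\n"
        f"Context:\n{context}\n"
        f"Question: {question}"
    )

# Invented passages standing in for a vector-database retrieval result.
prompt = build_rag_prompt(
    "What is the dosing interval?",
    ["Drug X is dosed every 12 hours.", "Drug X is contraindicated in children."],
)
print(prompt)
```

Keeping assembly as a small pure function like this makes it easy to unit-test in the test-driven style the posting asks for, independent of any live LLM endpoint.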

Posted 5 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Code: BTL-2507106
Job Title: AI/ML Intern
Location: Hyderabad
Mode: Work from the office

Company Overview
BeamX TechLabs Pvt Ltd. is a leading technology company specializing in software development and IT solutions. With a focus on innovation and cutting-edge technologies, we strive to provide exceptional solutions to our clients across various industries. Join our dynamic team and contribute to the development of groundbreaking software applications.

Position Summary
We are seeking a highly motivated and talented AI/ML Intern to join our growing team in Hyderabad. As an intern, you will gain hands-on experience in developing and implementing AI-powered solutions, working alongside experienced engineers and researchers. This is an excellent opportunity to learn and contribute to real-world projects while expanding your knowledge of the latest AI/ML technologies.

Key Responsibilities
- Assist in the development and fine-tuning of Large Language Models (LLMs) using frameworks like LangChain, the OpenAI Agents SDK, or similar.
- Contribute to the implementation of Retrieval-Augmented Generation (RAG) architectures for building intelligent applications.
- Experiment with and evaluate various vector databases, such as FAISS and OpenSearch, for efficient LLM data management.
- Develop and test LLM-based prototypes and demonstrate your solutions through code samples and documentation.
- Collaborate with team members to brainstorm innovative AI solutions and contribute to project discussions.
- Stay up-to-date with the latest advancements in AI/ML research and best practices.

Required Skills and Qualifications
- Strong knowledge or foundational experience in Python programming.
- Basic hands-on experience with LLMs: ability to build, fine-tune, or implement LLMs in simple projects or prototypes.
- Familiarity with agentic AI frameworks: awareness of frameworks such as LangChain, the OpenAI Agents SDK, or similar tools.
- Understanding of RAG architectures: basic knowledge of RAG concepts with some experience in simple implementations.
- Experience with vector databases: basic familiarity with vector databases like FAISS or OpenSearch, and understanding of their use in AI/ML applications.

Preferred Qualifications
- Portfolio of LLM-based projects: a GitHub repository or other coding samples showcasing your LLM development experience.
- Familiarity with cloud computing platforms like AWS, Azure, or GCP.
- Experience with machine learning algorithms and techniques.

Education Requirements
- Bachelor's or Master's degree in Computer Science, Data Science, Artificial Intelligence, or a related field.

Compensation and Benefits
- Competitive stipend based on experience and performance.
- Opportunity to learn from experienced AI/ML professionals.
- Exposure to cutting-edge technologies and real-world projects.
- Opportunities for conversion to full-time positions based on performance.
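For interns new to RAG, here is a toy sketch of the chunking step that precedes indexing in a vector database: split text into overlapping word windows so retrieval returns passages small enough for an LLM context. The window and overlap sizes are arbitrary choices for illustration:

```python
# Document chunking for RAG indexing: overlapping word windows so that
# sentences near chunk boundaries appear in at least two chunks.

def chunk(text: str, size: int = 8, overlap: int = 3):
    words = text.split()
    step = size - overlap  # how far each window advances
    return [
        " ".join(words[i:i + size])
        for i in range(0, max(len(words) - overlap, 1), step)
    ]

doc = ("Vector databases such as FAISS store embeddings so that similar "
       "passages can be found quickly at query time.")
for c in chunk(doc):
    print(c)
```

Each chunk would then be embedded and written to the vector store; the overlap trades some index size for better recall on boundary-straddling facts.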

Posted 5 days ago

Apply

1.0 - 2.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Job Description
Alimentation Couche-Tard Inc. (ACT) is a global Fortune 200 company and a leader in the convenience store and fuel space, with over 16,700 stores across 31 countries and territories. The Circle K India Data & Analytics team is an integral part of ACT's Global Data & Analytics Team, and the Associate ML Ops Analyst will be a key player on this team, helping grow analytics globally at ACT. The hired candidate will partner with multiple departments, including Global Marketing, Merchandising, Global Technology, and Business Units.

About The Role
The incumbent will be responsible for implementing Azure data services to deliver scalable and sustainable solutions, and for building model deployment and monitoring pipelines to meet business needs.

Roles & Responsibilities

Development and Integration
- Collaborate with data scientists to deploy ML models into production environments.
- Implement and maintain CI/CD pipelines for machine learning workflows.
- Use version control tools (e.g., Git) and ML lifecycle management tools (e.g., MLflow) for model tracking, versioning, and management.
- Design, build, and optimize application containerization and orchestration with Docker and Kubernetes on cloud platforms like AWS or Azure.

Automation & Monitoring
- Automate pipelines using Apache Spark and ETL tools such as Informatica PowerCenter, Informatica BDM or DEI, StreamSets, and Apache Airflow.
- Implement model monitoring and alerting systems to track model performance, accuracy, and data drift in production environments.

Collaboration and Communication
- Work closely with data scientists to ensure that models are production-ready.
- Collaborate with Data Engineering and Tech teams to ensure infrastructure is optimized for scaling ML applications.

Optimization and Scaling
- Optimize ML pipelines for performance and cost-effectiveness.

Operational Excellence
- Help the Data teams leverage best practices to implement enterprise-level solutions.
- Follow industry coding standards and the programming life cycle to ensure standard practices across the project.
- Help define common coding standards and model-monitoring performance best practices.
- Continuously evaluate the latest packages and frameworks in the ML ecosystem.
- Build automated model deployment and data engineering pipelines from plain Python/PySpark code.

Stakeholder Engagement
- Collaborate with Data Scientists, Data Engineers, and cloud platform and application engineers to create and implement cloud policies and governance for the ML model life cycle.

Job Requirements

Education & Relevant Experience
- Bachelor's degree required, preferably with a quantitative focus (Statistics, Business Analytics, Data Science, Math, Economics, etc.).
- Master's degree preferred (MBA/MS Computer Science/M.Tech Computer Science, etc.).
- 1-2 years of relevant working experience in MLOps.

Behavioural Skills
- Delivery excellence
- Business disposition
- Social intelligence
- Innovation and agility

Knowledge
- Core computer science concepts such as common data structures and algorithms, and OOP.
- Programming languages (R, Python, PySpark, etc.).
- Big data technologies & frameworks (AWS, Azure, GCP, Hadoop, Spark, etc.).
- Enterprise reporting systems, relational (MySQL, Microsoft SQL Server, etc.) and non-relational (MongoDB, DynamoDB) database management systems, and data engineering tools.
- Exposure to ETL tools and version control.
- Experience in building and maintaining CI/CD pipelines for ML models.
- Understanding of machine learning, information retrieval, or recommendation systems.
- Familiarity with DevOps tools (Docker, Kubernetes, Jenkins, GitLab).
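One concrete form of the data-drift monitoring this role describes is the Population Stability Index (PSI) between a training-time feature distribution and a production sample; a common rule of thumb flags PSI > 0.2 as significant drift. The bucket proportions below are invented for illustration:

```python
# Population Stability Index: compare bucketed feature distributions
# between training ("expected") and production ("actual").
import math

def psi(expected, actual):
    """Both inputs are bucket proportions that each sum to 1."""
    return sum(
        (a - e) * math.log(a / e)
        for e, a in zip(expected, actual)
    )

train_dist = [0.25, 0.25, 0.25, 0.25]   # uniform at training time
prod_dist  = [0.10, 0.20, 0.30, 0.40]   # production has shifted upward
print(round(psi(train_dist, prod_dist), 3))  # above the 0.2 alert threshold
```

A monitoring job would compute this per feature on a schedule and page the team when any PSI crosses the threshold, triggering the retraining workflow.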

Posted 5 days ago

Apply

14.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Description
This is a unique opportunity to apply your skills and contribute to impactful global business initiatives. As an Applied AI ML Lead - Data Scientist - Vice President at JPMorgan Chase within the Commercial & Investment Bank's Global Banking team, you’ll leverage your technical expertise and leadership abilities to support AI innovation. You should have deep knowledge of AI/ML and effective leadership to inspire the team, align cross-functional stakeholders, engage senior leadership, and drive business results.

Job Responsibilities
Lead a local AI/ML team with accountability and engagement in a global organization.
Mentor and guide team members, fostering an inclusive culture with a growth mindset.
Collaborate on setting the technical vision and executing strategic roadmaps to drive AI innovation.
Deliver AI/ML projects through our ML development life cycle using Agile methodology.
Help transform business requirements into AI/ML specifications, define milestones, and ensure timely delivery.
Work with product and business teams to define goals and roadmaps, maintaining alignment with cross-functional stakeholders.
Exercise sound technical judgment, anticipate bottlenecks, escalate effectively, and balance business needs against technical constraints.
Design experiments, establish mathematical intuitions, implement algorithms, execute test cases, validate results, and productionize highly performant, scalable, trustworthy, and often explainable solutions.
Mentor junior Machine Learning associates in delivering successful projects and building successful careers in the firm.
Participate in and contribute back to firmwide Machine Learning communities through patents, publications, and speaking engagements.
Evaluate and design effective processes and systems to facilitate communication, improve execution, and ensure accountability.
Required Qualifications, Capabilities, And Skills
14+ years (BS), 8+ years (MS), or 5+ years (PhD) of relevant experience in Computer Science, Data Science, Information Systems, Statistics, Mathematics, or equivalent.
Track record of managing AI/ML or software development teams.
Experience as a hands-on practitioner developing production AI/ML solutions.
Deep knowledge and experience in machine learning and artificial intelligence.
Ability to set teams up for success in speed and quality, and to design effective metrics and hypotheses.
Expert in at least one of the following areas: Natural Language Processing, Knowledge Graphs, Computer Vision, Speech Recognition, Reinforcement Learning, Ranking and Recommendation, or Time Series Analysis.
Deep knowledge of data structures, algorithms, machine learning, data mining, information retrieval, and statistics.
Demonstrated expertise in machine learning frameworks: TensorFlow, PyTorch, PyG, Keras, MXNet, scikit-learn.
Strong programming knowledge of Python and Spark; strong grasp of vector operations using NumPy and SciPy; strong grasp of distributed computation using multithreading, multiple GPUs, Dask, Ray, Polars, etc.
Familiarity with AWS cloud services such as EMR, SageMaker, etc.
Strong people management and team-building skills.
Ability to coach and grow talent, foster a healthy engineering culture, and attract and retain talent.
Ability to build a diverse, inclusive, and high-performing team.
Ability to inspire collaboration among teams composed of both technical and non-technical members.
Effective communication, solid negotiation skills, and strong leadership.

About Us
JPMorgan Chase, one of the oldest financial institutions, offers innovative financial solutions to millions of consumers, small businesses and many of the world’s most prominent corporate, institutional and government clients under the J.P. Morgan and Chase brands.
Our history spans over 200 years and today we are a leader in investment banking, consumer and small business banking, commercial banking, financial transaction processing and asset management. We recognize that our people are our strength and that the diverse talents they bring to our global workforce are directly linked to our success. We are an equal opportunity employer and place a high value on diversity and inclusion at our company. We do not discriminate on the basis of any protected attribute, including race, religion, color, national origin, gender, sexual orientation, gender identity, gender expression, age, marital or veteran status, pregnancy or disability, or any other basis protected under applicable law. We also make reasonable accommodations for applicants’ and employees’ religious practices and beliefs, as well as mental health or physical disability needs. Visit our FAQs for more information about requesting an accommodation.

About The Team
J.P. Morgan’s Commercial & Investment Bank is a global leader across banking, markets, securities services and payments. Corporations, governments and institutions throughout the world entrust us with their business in more than 100 countries. The Commercial & Investment Bank provides strategic advice, raises capital, manages risk and extends liquidity in markets around the world.

Posted 6 days ago

Apply

5.0 years

0 Lacs

India

Remote

AI/ML Lead
Experience: 8+ Yrs
Relevant Exp: 5+ Years
Location: Remote
Duration: 6 months & extendable
Work Timings: 10 am to 7 pm IST
Industry/Domain: Medical
Budget: up to 1.1L per month

Job Overview: AI/ML Lead
We are seeking a dynamic and technically strong AI Lead with 6–8 years of industry experience, including a minimum of 5 years in AI/ML and Conversational AI technologies, with a specific focus on Microsoft’s AI ecosystem. The ideal candidate will lead the design, development, and delivery of intelligent solutions using Azure OpenAI, Copilot Studio, Microsoft Bot Framework, and AI Foundry. The individual will act as a hands-on technical lead, collaborating closely with product teams, architects, and business stakeholders to build impactful AI-powered copilots, chatbots, and enterprise automation solutions.

Mandatory Skills Required:
• 5+ years in AI/ML or Conversational AI
• Azure ML / Cognitive Services / AI Foundry
• Experience with LLMs and NLP (GPT, BERT)
• Integration with Microsoft Graph, REST APIs
• Prompt engineering, fine-tuning
• RAG architecture, embeddings, vector DBs
• Azure OpenAI Services & APIs
• M365 Copilot APIs / plugin development
• Microsoft Copilot Studio
• Python/Node.js coding & orchestration
• Microsoft Bot Framework (SDK, Composer)
• Good to have: CI/CD, Azure DevOps, containerization (Docker/K8s)

Must Have Skills:
• 6–8 years of overall experience, including 5+ years in AI/ML or Conversational AI
• Deep hands-on knowledge of:
 • Azure OpenAI services and APIs
 • Copilot Studio for building Microsoft 365-integrated assistants
 • Microsoft Bot Framework SDK/Composer for chatbot development
 • Prompt engineering for LLM optimization
• Strong Python or Node.js development skills (for AI orchestration and integration)
• Experience with enterprise system integration using APIs (Microsoft Graph, REST, JSON, OAuth)
• Familiarity with Azure ML, Azure Cognitive Services, and Azure DevOps
• Ability to design RAG-based architectures, manage embeddings, and leverage vector databases (e.g., Azure AI Search)
• Strong understanding of natural language processing (NLP) and foundational models (GPT, BERT)
• Excellent communication, leadership, and stakeholder engagement capabilities

Good-to-Have Skills:
• Experience with Semantic Kernel or LangChain.
• Working knowledge of AI Foundry for orchestrating AI pipelines.
• Familiarity with Copilot extensibility and Teams App Studio.
• Exposure to M365 Copilot APIs and custom plugin creation.
• Knowledge of Responsible AI, data security, and compliance principles.
• Familiarity with containerized deployment (Docker, Kubernetes).
• Experience building dashboards and analytics (Kibana, Grafana) to visualize bot usage and performance.
• Basic understanding of Power Platform (Power Automate, Power Apps) and its integration with AI.

Key Responsibilities:
• Lead end-to-end technical implementation of AI-driven projects using Microsoft AI tools: Azure OpenAI, Copilot Studio, and Bot Framework.
• Design and develop intelligent copilots, multi-turn chatbots, and custom GPT solutions integrated within enterprise tools such as Microsoft Teams, SharePoint, and Dynamics 365.
• Translate business requirements into technical architecture and AI flows using OpenAI APIs, prompt engineering, and integration with enterprise systems.
• Leverage AI Foundry to manage the AI lifecycle, including model selection, deployment, monitoring, and optimization.
• Architect AI/ML solutions that use Retrieval-Augmented Generation (RAG), semantic search, and contextual memory frameworks (LangChain, Semantic Kernel, etc.).
• Collaborate with product owners and business analysts to identify high-value use cases and define solution roadmaps.
• Develop and execute POCs and MVPs with hands-on coding, configuration, and orchestration of LLMs and chatbot pipelines.
• Integrate with enterprise data sources via APIs, GraphQL, and Microsoft Graph to create holistic user experiences.
• Mentor junior developers and work with DevOps teams to ensure stable deployment, CI/CD, and performance monitoring.
• Create documentation and reusable components/templates for repeated use across the organization.
• Stay current on Microsoft’s AI advancements and recommend tools, features, or practices that improve time-to-value and performance.
• Design, build, and maintain automated unit and integration tests.
• Support healthy system operations and ensure high levels of availability are achieved. Be part of the on-call rotation during business hours.

NOTE: Copilot, Azure, and Microsoft Bot Framework experience is a must, along with the other mandatory skills mentioned in the JD. Make sure every skill is also reflected in the resume (Overall Summary, Technical Skillsets Table, Projects).
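The RAG architecture called for above (embeddings plus a vector search over stored documents) can be sketched minimally in plain Python. The toy bag-of-words `embed` function stands in for a real embedding model such as an Azure OpenAI embeddings deployment, and the documents are invented:

```python
import math
from collections import Counter

def embed(text):
    # Toy bag-of-words vector standing in for a real embedding model
    # (e.g. an Azure OpenAI embeddings deployment).
    return Counter(text.lower().split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=2):
    # Retrieval half of RAG: rank stored snippets by similarity to the query.
    # The generation half would pass the winners to the LLM as context.
    q = embed(query)
    return sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)[:k]

docs = [
    "Reset your password from the account settings page",
    "Invoices are emailed on the first of each month",
    "Password rules require at least twelve characters",
]
assert retrieve("how do I reset my password", docs)[0] == docs[0]
```

A production version would replace the in-memory list with a vector database such as Azure AI Search, as the posting notes.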

Posted 6 days ago

Apply

7.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling)
Competency: Oracle ERP Analytics
We are seeking an experienced Business Intelligence Developer with 7+ years of experience and expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and data modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts on suitable product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth at EY.

Responsibilities:
Collaborate with stakeholders to understand data requirements and translate business needs into data models.
Design and implement effective data models to support business intelligence activities.
Develop and maintain ETL processes to ensure data accuracy and availability.
Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI.
Work with stakeholders to gather requirements and translate business needs into technical specifications.
Optimize data retrieval and develop dashboard visualizations for performance efficiency.
Ensure data integrity and compliance with data governance and security policies.
Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure.
Conduct data analysis to identify trends, patterns, and insights that can inform business strategies.
Provide training and support to end-users on BI tools and dashboards.
Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing.
Stay up to date with the latest BI technologies and best practices to drive continuous improvement.

Qualifications:
Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field.
Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools.
Strong experience with ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions.
Proficiency in data modelling techniques and best practices.
Solid understanding of SQL and experience with relational databases.
Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud).
Excellent analytical, problem-solving, and project management skills.
Ability to communicate complex data concepts to non-technical stakeholders.
Detail-oriented with a strong focus on accuracy and quality.
Well-developed business acumen, analytical skills, and a strong problem-solving attitude, with the ability to visualize scenarios, possible outcomes, and operating constraints.
Strong consulting skills with proven experience in client and stakeholder management and collaboration abilities.
Good communication skills, both written and oral, the ability to make impactful presentations, and expertise in using Excel and PPTs.
Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
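The ETL responsibilities described in this posting (extract from a source, transform for accuracy, load into a relational target) can be illustrated with a minimal sketch. The inline CSV, table name, and field names are invented for illustration, and in-memory SQLite stands in for a real warehouse:

```python
import csv
import io
import sqlite3

# Extract: parse raw CSV (an inline string stands in for a real source file).
raw = "order_id,amount\n1,250.0\n2,\n3,99.5\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop rows with missing amounts and cast fields to proper types.
clean = [
    {"order_id": int(r["order_id"]), "amount": float(r["amount"])}
    for r in rows
    if r["amount"]
]

# Load: write into a relational target (in-memory SQLite for the sketch).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
db.executemany("INSERT INTO orders VALUES (:order_id, :amount)", clean)
total = db.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
assert total == 349.5  # row 2 was rejected for its missing amount
```

Tools like SSIS or Informatica package these same extract, transform, and load stages with scheduling and lineage on top.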

Posted 6 days ago

Apply

7.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description – Business Intelligence Developer (OAC, PowerBI, ETL, Data Modelling)
Competency: Oracle ERP Analytics
We are seeking an experienced Business Intelligence Developer with 7+ years of experience and expertise in Oracle Analytics Cloud (OAC), PowerBI, ETL tools, and data modelling to join our dynamic team. The successful candidate will be responsible for developing and maintaining scalable data models, creating insightful analytics dashboards, and managing ETL workflows to support data-driven decision-making across the organization. They will work closely with customers, data architects, software developers, and business analysts on suitable product development. The candidate will be a highly skilled individual and will be accountable for their career development and growth at EY.

Responsibilities:
Collaborate with stakeholders to understand data requirements and translate business needs into data models.
Design and implement effective data models to support business intelligence activities.
Develop and maintain ETL processes to ensure data accuracy and availability.
Create interactive dashboards and reports using Oracle Analytics Cloud (OAC) and PowerBI.
Work with stakeholders to gather requirements and translate business needs into technical specifications.
Optimize data retrieval and develop dashboard visualizations for performance efficiency.
Ensure data integrity and compliance with data governance and security policies.
Collaborate with IT and data teams to integrate BI solutions into the existing data infrastructure.
Conduct data analysis to identify trends, patterns, and insights that can inform business strategies.
Provide training and support to end-users on BI tools and dashboards.
Document all processes, models, and activities to maintain transparency and facilitate knowledge sharing.
Stay up to date with the latest BI technologies and best practices to drive continuous improvement.

Qualifications:
Bachelor’s degree in Computer Science, Information Systems, Business Analytics, or a related field.
Proven experience with Oracle Analytics Cloud (OAC), PowerBI, and other BI tools.
Strong experience with ETL processes (SSIS, Informatica, Dell Boomi, etc.) and data warehousing solutions.
Proficiency in data modelling techniques and best practices.
Solid understanding of SQL and experience with relational databases.
Familiarity with cloud platforms and services (e.g., AWS, Azure, Google Cloud).
Excellent analytical, problem-solving, and project management skills.
Ability to communicate complex data concepts to non-technical stakeholders.
Detail-oriented with a strong focus on accuracy and quality.
Well-developed business acumen, analytical skills, and a strong problem-solving attitude, with the ability to visualize scenarios, possible outcomes, and operating constraints.
Strong consulting skills with proven experience in client and stakeholder management and collaboration abilities.
Good communication skills, both written and oral, the ability to make impactful presentations, and expertise in using Excel and PPTs.
Good to have: knowledge of data security and controls to address customers’ data privacy needs in line with regional regulations such as GDPR, CCPA, etc.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets.
Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.

Posted 6 days ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Overview
As an SDE / ML Engineer, you will be responsible for designing, developing, and maintaining scalable pipelines and infrastructure for machine learning models, ensuring high performance, reliability, and efficient deployment. You will work closely with a team of senior engineers to optimize systems, implement robust engineering practices, and contribute to the evolution of our ML and AI infrastructure through scalable software solutions. This position is for immediate joinees only.

Key Responsibilities
ML Engineering
Translate working models into scalable ML pipeline solutions.
Maintain, deploy, and optimize machine learning pipelines for production environments.
Participate in the operational lifecycle of ML systems, including feature engineering support, performance benchmarking, monitoring, and seamless deployment.
Tune pipelines and infrastructure for efficiency, scalability, and low-latency operation.
Collaborate with team members to integrate ML models into high-throughput data pipelines and distributed workflows, focusing on code quality, system reliability, and maintainability.

Data Engineering
Build and maintain robust, scalable, and high-performance modelling pipelines that handle large-scale data processing.
Clean, aggregate, and preprocess data from diverse sources, with an emphasis on efficient, parallelizable code.
Optimize data retrieval, storage, and processing for speed and resource efficiency in distributed systems.
Collaborate with the team on the development and maintenance of large-scale database systems, ensuring fault tolerance and high availability.

Qualifications
Bachelor’s degree with 3-5 years of experience, or Master’s degree with 2-4 years of experience, in Computer Science, Software Engineering, or a related field.
Excellent understanding of data structures, algorithms, and software engineering principles, with a proven track record of writing clean, efficient, and maintainable code.
Strong command of programming languages, especially Python and Java.
Experience with distributed systems and orchestration frameworks, such as Kafka for streaming data and Kubernetes for container management.
Knowledge of SQL and hands-on experience with large-scale databases and distributed systems (e.g., Bigtable, Spark, Hive).
Strong problem-solving skills, the ability to optimize for performance and scale, and a quick-learning aptitude.
Excellent communication and teamwork skills.

About Us
Minivet.ai is a small AI products and services firm providing on-prem AI solutions tailored to clients in India and across the world. We take pride in our innovation and in our ability to empower businesses to address their unique challenges. We collaborate closely with each client, ensuring that every AI solution is a pivotal business advantage to them. This role is a full-time position and is for people living in Bengaluru, India.

Application Process
Please send your resume as a PDF to jobs@minivet.ai with the subject “SDEJUL25” and the following fields in the message: “Name”, “Highest Qualification”, “Year of completion”, “Phone Number”, “Earliest Joining Date”, and “Reason” why you think you are a good fit. If you have less than 2 years of experience, please DO NOT apply; we will be opening up a different position. Applications will be scanned by software, so applications not meeting the format requirements will not be considered.
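The streaming data-pipeline work this posting describes (consume events, clean and preprocess them, aggregate for downstream use) can be sketched with plain Python generators. The in-memory list stands in for a real Kafka consumer, and the event records are invented:

```python
def read_events(source):
    """Simulated consumer loop; in production this would poll a Kafka topic."""
    for event in source:
        yield event

def clean(events):
    """Drop malformed records and normalise fields."""
    for e in events:
        if "user" in e and isinstance(e.get("ms"), (int, float)):
            yield {"user": e["user"].strip().lower(), "ms": float(e["ms"])}

def aggregate(events):
    """Per-user latency totals, the kind of rollup a downstream table needs."""
    totals = {}
    for e in events:
        totals[e["user"]] = totals.get(e["user"], 0.0) + e["ms"]
    return totals

stream = [
    {"user": " Alice ", "ms": 120},
    {"user": "bob", "ms": 80},
    {"user": "alice", "ms": 30},
    {"ms": 99},  # malformed record: no user field
]
result = aggregate(clean(read_events(stream)))
assert result == {"alice": 150.0, "bob": 80.0}
```

Because each stage is a generator, records flow through one at a time, the same shape a high-throughput pipeline takes when the source is an unbounded stream.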

Posted 6 days ago

Apply

2.0 - 5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Work Schedule: Standard (Mon-Fri)
Environmental Conditions: Office
Job Title: Technical Consultant

When you join us at Thermo Fisher Scientific, you’ll be part of a hard-working team that shares your passion for exploration and discovery. Thermo Fisher recognizes that digital enablement has the power to change the way our customers work — providing them with unmatched capabilities for digital science execution, commerce, and services, and driving efficiency to ultimately power science.

How will you make an impact?
Thermo Fisher is seeking a new colleague who is passionate about customer experience, is technologically savvy, and has a passion for the digitization of science. The vision of Digital Science Solutions is to make it easy for scientific customers to digitize their research, development, and manufacturing lab operations. The Technical Consultant will be responsible for providing guidance, support, and implementation services related to Digital Science Solutions products such as LIMS, ELN, LES, SDMS, and other enterprise applications. You will work closely with clients to understand their laboratory processes and requirements, design solutions, perform system implementation tasks, and provide ongoing support and training to users. Your experience in configuration, data management, laboratory workflows, and regulatory compliance will contribute to the successful implementation and utilization of Digital Science solutions in various laboratory environments.

Essential Duties and Responsibilities:
Strategic:
Act as a trusted advisor to our customers, global account managers, services leaders, product managers, and delivery teams.
Maintain and assume accountability for a culture of high customer service.
Position Thermo Fisher Digital Science as a leader in life and laboratory science digitalization through successful delivery.
Work with the broader organization to achieve business objectives and expand multi-functional client engagements.
Effectively share knowledge to help build a world-class digital solutions consulting and implementation team, working closely with global services leaders.
Provide feedback to product management and engineering to rapidly advance our product capabilities to meet customer needs and expectations.
Contribute to third-party technology, product, and solution evaluations in the context of our portfolio.

Operational:
Collaborate with clients, business analysts, project managers, and solution architects to understand their laboratory information management needs, workflows, and regulatory compliance requirements.
Develop and document comprehensive solutions based on the gathered requirements, including system configuration, customizations, and integration with other laboratory systems.
Participate in the implementation of enterprise laboratory solutions, ensuring that they are configured and customized correctly to meet client specifications and industry standard methodologies.
Define data management strategies, including data mapping, migration, and validation, to ensure accurate and reliable data entry, storage, and retrieval within the solution.
Assess laboratory workflows and find opportunities for process improvement and automation.
Develop and execute test plans to ensure the system meets functional and performance requirements.
Conduct user training sessions and provide ongoing support to laboratory staff, addressing questions, resolving issues, and ensuring effective system utilization.
Prepare detailed user documentation, including system requirements, design specifications, user manuals, and standard operating procedures (SOPs).
Work closely with multi-functional teams, including software developers, quality assurance analysts, and laboratory personnel, fostering effective collaboration and communication.
Remain current with relevant industry regulations and guidelines (e.g., FDA, ISO) and ensure that the implemented solutions align with these standards.
Contribute to improving processes, ensuring compliance, and driving improvements.
Provide recommendations on planning, resource allocation, management, tracking, and reporting across all aspects of customer engagements.
Participate in team and customer meetings, delivering engaging, informative presentations to both internal and external audiences.
Travel, as needed, for internal and customer meetings.

Culture:
In line with the 4I values of Integrity, Intensity, Innovation, and Involvement that form the foundation of the Thermo Fisher culture and ways of working, this role will bring intensity, innovation, and a high degree of involvement to designing, proposing, and delivering Digital Science platform solutions.

Business Partnership:
Work collaboratively with Digital Science and broader Thermo Fisher colleagues to create and sustain a culture of delivering excellent customer experience, embracing continuous learning, leading with digital innovation, thinking analytically, and managing complexity.

Knowledge, Skills, And Abilities
Knowledge of enterprise laboratory software platforms, such as LIMS, ELN, LES, SDMS, CDS, or similar systems.
Understanding of laboratory processes, data management principles, and laboratory workflows in various domains (e.g., pharmaceutical, biotechnology, manufacturing).
Familiarity with regulatory requirements and compliance standards relevant to laboratory operations (e.g., FDA 21 CFR Part 11, ISO 17025, GLP, GMP).
Experience with relational databases: Oracle, SQL Server, Postgres.
Knowledge of cloud services and infrastructure is highly desirable.
Excellent problem-solving skills and the ability to analyze complex business requirements and translate them into solutions.
Demonstrated experience delivering in a matrixed, global environment, across internal and external resources.
Understanding of IT processes, SDLC methodologies, and Quality Management Systems, and knowledge of the regulatory landscape, with a preference for experience in delivering and supporting validated systems.
Superb communication and interpersonal skills, integrity, and credibility.
Results focused, with attention to detail and a concern for quality.
Planning, prioritizing, reporting, problem-solving, and analytical capabilities.
Collaborative; initiates and facilitates communication and relevant information sharing, and works with different functions to achieve the best outcomes.
Ability to exercise judgment and discretion concerning critical, confidential, and proprietary information.
Flexibility in work schedule to accommodate communications with the global team.
Able to innovate, bring ideas forward, and raise issues and risks in a positive way.

Minimum Education And Experience Requirements
Bachelor’s or master’s degree in IT, IS, Engineering, Life Sciences, or equivalent.
At least 2-5 years of relevant experience in life sciences technical and business consulting with medium and large customers in the life and laboratory sciences industry, specifically developing and deploying solutions catering to one or more areas of discovery, research, development, or manufacturing.

Thermo Fisher Scientific Inc. is the world leader in serving science, with annual revenue of approximately $40 billion. Our Mission is to enable our customers to make the world healthier, cleaner and safer. Whether our customers are accelerating life sciences research, solving complex analytical challenges, increasing efficiency in their laboratories, improving patient health through diagnostics or the development and manufacture of life-changing therapies, we are here to support them.
Our global team of more than 100,000 colleagues delivers an unrivaled combination of innovative technologies, purchasing convenience and pharmaceutical services through our industry-leading brands, including Thermo Scientific, Applied Biosystems, Invitrogen, Fisher Scientific, Unity Lab Services, Patheon and PPD.

Thermo Fisher Scientific is an EEO/Affirmative Action Employer and does not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability or any other legally protected status.

Apply today at http://jobs.thermofisher.com

Posted 6 days ago

Apply

Understanding of IT processes, SDLC methodologies, Quality Management Systems, and knowledge of the regulatory landscape, with preference for experience in delivering and supporting validated systems. Superb communication and interpersonal skills, integrity, and credibility. Results focused, with attention to detail and a concern for quality. Planning, prioritizing, reporting, problem-solving and analytical capabilities. Collaborative; initiates and facilitates communications and relevant information sharing, and works with different functions to achieve the best outcomes. Ability to exercise judgment and discretion concerning critical, confidential, and proprietary information. Flexibility in work schedule to accommodate communications with the global team. Able to innovate and bring ideas forward and advance issues and risks in a positive way.

Minimum Education and Experience Requirements
Bachelor’s or Master’s degree in IT, IS, Engineering, Life Sciences, or equivalent. 2-5 years of relevant experience in life sciences technical and business consulting with medium and large customers in the life and laboratory sciences industry, specifically developing and deploying solutions catering to one or more areas of discovery, research, development, or manufacturing.

Thermo Fisher Scientific Inc. is the world leader in serving science, with annual revenue of approximately $40 billion. Our Mission is to enable our customers to make the world healthier, cleaner and safer. Whether our customers are accelerating life sciences research, solving complex analytical challenges, increasing efficiency in their laboratories, improving patient health through diagnostics or the development and manufacture of life-changing therapies, we are here to support them.
Our global team of more than 100,000 colleagues delivers an unrivaled combination of innovative technologies, purchasing convenience and pharmaceutical services through our industry-leading brands, including Thermo Scientific, Applied Biosystems, Invitrogen, Fisher Scientific, Unity Lab Services, Patheon and PPD. Thermo Fisher Scientific is an EEO/Affirmative Action Employer and does not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, disability or any other legally protected status. Apply today http://jobs.thermofisher.com
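The data mapping, migration, and validation work described above typically ends with a reconciliation check between source and target systems. The sketch below is a minimal, hypothetical illustration using Python's built-in sqlite3; the table names (source_samples, target_samples) and data are invented, and a real LIMS migration would run far richer checks against Oracle, SQL Server, or Postgres.

```python
import sqlite3

# Toy "source" and "target" tables standing in for a LIMS data migration.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE source_samples (sample_id TEXT PRIMARY KEY, result REAL);
    CREATE TABLE target_samples (sample_id TEXT PRIMARY KEY, result REAL);
    INSERT INTO source_samples VALUES ('S-001', 7.4), ('S-002', 5.1), ('S-003', 9.9);
    INSERT INTO target_samples VALUES ('S-001', 7.4), ('S-002', 5.1), ('S-003', 9.9);
""")

def reconcile(conn):
    """Compare row counts and key coverage between source and target tables."""
    (src,) = conn.execute("SELECT COUNT(*) FROM source_samples").fetchone()
    (tgt,) = conn.execute("SELECT COUNT(*) FROM target_samples").fetchone()
    # Keys present on one side only, checked in each direction.
    missing = conn.execute(
        "SELECT sample_id FROM source_samples "
        "EXCEPT SELECT sample_id FROM target_samples").fetchall()
    extra = conn.execute(
        "SELECT sample_id FROM target_samples "
        "EXCEPT SELECT sample_id FROM source_samples").fetchall()
    return {"source_rows": src, "target_rows": tgt,
            "missing_in_target": [r[0] for r in missing],
            "unexpected_in_target": [r[0] for r in extra]}

print(reconcile(conn))
```

In practice the same pattern extends to per-column checksums and value-level diffs, but count plus key coverage already catches most migration defects.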

Posted 6 days ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Siemens Digital Industries Software is a leading provider of solutions for the design, simulation, and manufacture of products across many different industries. Formula 1 cars, skyscrapers, ships, space exploration vehicles, and many of the objects we see in our daily lives are being conceived and manufactured using our Product Lifecycle Management (PLM) software. We are seeking AI Backend Engineers to play a pivotal role in building our Agentic Workflow Service and Retrieval-Augmented Generation (RAG) Service. In this hybrid role, you'll leverage your expertise in both backend development and machine learning to create robust, scalable AI-powered systems using Kubernetes on AWS, Amazon Bedrock models, the AWS Strands Framework, and LangChain/LangGraph.

Responsibilities:
Design and implement core backend services and APIs for the agentic framework and RAG systems
Build LLM-based applications using Amazon Bedrock models
Build RAG systems with advanced retrieval mechanisms and vector database integration
Implement agentic workflows using technologies such as the AWS Strands Framework and LangChain/LangGraph
Design and develop microservices that efficiently integrate AI capabilities
Create scalable data processing pipelines for training data and document ingestion
Optimize model performance, inference latency, and overall system efficiency
Implement evaluation metrics and monitoring for AI components
Write clean, maintainable, and well-tested code with comprehensive documentation
Collaborate with cross-functional team members including DevOps, product, and frontend engineers
Stay current with the latest advancements in LLMs and AI agent architectures

Minimum Experience Requirements
6+ years of total software engineering experience
Backend development experience with strong Python programming skills
Experience in ML/AI engineering, particularly with LLMs and generative AI applications
Experience with microservices architecture, API design, and asynchronous programming
Demonstrated experience building RAG systems and working with vector databases
Experience with LangChain/LangGraph or similar LLM orchestration frameworks
Strong knowledge of AWS services, particularly Bedrock, Lambda, and container services
Experience with containerization technologies and Kubernetes
Understanding of ML model deployment, serving, and monitoring in production environments
Knowledge of prompt engineering and LLM fine-tuning techniques
Excellent problem-solving abilities and system design skills
Strong communication skills and ability to explain complex technical concepts
Experience with Kubernetes and AWS serverless
Experience working with databases (SQL, NoSQL) and data structures
Ability to learn new technologies quickly

Preferred Qualifications:
AWS certifications - Associate Architect / Developer / Data Engineer / AI Track
Familiarity with streaming architectures and real-time data processing
Experience with ML experiment tracking and model versioning
Understanding of ML/AI ethics and responsible AI development
Experience with the AWS Strands Framework
Knowledge of semantic search and embedding models
Contributions to open-source ML/AI projects

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

We are Siemens
A collection of over 377,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and creativity and help us shape tomorrow!
We offer a comprehensive reward package which includes a competitive basic salary, bonus scheme, generous holiday allowance, pension, and private healthcare. Siemens Software. ‘Transform the everyday' , #SWSaaS
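The retrieval half of a RAG service like the one described above can be sketched in a few lines. This is a toy stand-in, not Siemens code: the bag-of-words counter plays the role of an embedding model (in production, e.g. a model served through Amazon Bedrock), and a vector database would replace the in-memory list.

```python
import math
from collections import Counter

def embed(text):
    """Toy embedding: a bag-of-words count vector. A real service would call
    an embedding model and get a dense vector instead."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[t] * b[t] for t in a if t in b)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, documents, k=2):
    """Return the k documents most similar to the query -- the 'R' in RAG."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Kubernetes manages containerized workloads",
    "LangGraph orchestrates multi-step agent workflows",
    "Vector databases store embeddings for similarity search",
]
print(retrieve("how do agents orchestrate workflows", docs, k=1))
```

The retrieved passages are then stuffed into the LLM prompt as grounding context; vector databases exist precisely to make this nearest-neighbor step fast at scale.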

Posted 6 days ago

Apply

4.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Siemens Digital Industries Software is a leading provider of solutions for the design, simulation, and manufacture of products across many different industries. Formula 1 cars, skyscrapers, ships, space exploration vehicles, and many of the objects we see in our daily lives are being conceived and manufactured using our Product Lifecycle Management (PLM) software. We are seeking Backend Engineers to play a pivotal role in building our Data & AI services: the Agentic Workflow Service and the Retrieval-Augmented Generation (RAG) Service. In this hybrid role, you'll leverage your expertise in backend development together with AI knowledge and skills to create robust, scalable Data & AI services using Kubernetes on AWS and Amazon Bedrock models.

Expertise and understanding in:
Backend development with strong Java programming skills, along with basic Python programming knowledge
Designing and developing microservices with Java Spring Boot that efficiently integrate AI capabilities
Microservices architecture, API design, and asynchronous programming
Working with databases (SQL, NoSQL) and data structures
AWS services, particularly Bedrock, Lambda, and container services
Containerization technologies, Kubernetes, and AWS serverless
RAG systems with advanced retrieval mechanisms and vector database integration
Agentic workflows using technologies such as the AWS Strands Framework and LangChain/LangGraph
Building scalable data processing pipelines for training data and document ingestion
Writing clean, maintainable, and well-tested code with comprehensive documentation
Collaborating with cross-functional team members including DevOps, product, and frontend engineers
Staying current with the latest advancements in data, LLMs, and AI agent architectures

Minimum Experience Requirements
4+ years of total software engineering experience
Understanding of building RAG systems and working with vector databases
ML/AI engineering experience, particularly with LLMs and generative AI applications
Awareness of LangChain/LangGraph or similar LLM orchestration frameworks
Understanding of ML model deployment, serving, and monitoring in production environments
Knowledge of prompt engineering
Excellent problem-solving abilities and system design skills
Strong communication skills and ability to explain complex technical concepts
Ability to learn new technologies quickly

Preferred Qualifications:
AWS certifications - Associate Architect / Developer / Data Engineer / AI Track
Familiarity with streaming architectures and real-time data processing
Experience developing, delivering, and operating microservices on AWS
Understanding of ML/AI ethics and responsible AI development
Knowledge of semantic search and embedding models

We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status.

We are Siemens
A collection of over 377,000 minds building the future, one day at a time in over 200 countries. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and creativity and help us shape tomorrow! We offer a comprehensive reward package which includes a competitive basic salary, bonus scheme, generous holiday allowance, pension, and private healthcare. Siemens Software. ‘Transform the everyday’, #SWSaaS
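An agentic workflow of the kind these postings describe boils down to a loop: a model chooses a tool, the service executes it, and the result feeds the next decision. The sketch below is purely illustrative; fake_llm and the tool names are invented stand-ins for a Bedrock model call and real integrations, and frameworks like LangGraph or the AWS Strands Framework manage this loop, its state, and error handling for you.

```python
# Tools the agent may call; names and behavior are illustrative only.
TOOLS = {
    "lookup_order": lambda arg: f"order {arg} shipped",
    "reverse": lambda arg: arg[::-1],
}

def fake_llm(task, history):
    """Stand-in for an LLM deciding the next step. Returns either
    ('final', answer) or a (tool_name, argument) action to execute."""
    if "order" in task and not history:
        return ("lookup_order", "A-17")   # hypothetical order id
    if history:
        return ("final", history[-1])     # answer with the last tool result
    return ("final", "no tool needed")

def run_agent(task, max_steps=3):
    """Minimal agent loop: decide, act, observe, repeat (bounded)."""
    history = []
    for _ in range(max_steps):
        action, arg = fake_llm(task, history)
        if action == "final":
            return arg
        history.append(TOOLS[action](arg))  # execute the chosen tool
    return history[-1]

print(run_agent("what is the status of my order?"))
```

The max_steps bound is the important design choice: real agent services cap iterations (and tool budgets) so a confused model cannot loop forever.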

Posted 6 days ago

Apply

3.0 years

2 - 5 Lacs

Cochin

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Title: Senior Developer
Overall Years of Experience: 3 to 5 years
Relevant Years of Experience: 3+

Technical Lead: The Artificial Intelligence (AI) & Machine Learning (ML) Developer is responsible for designing and implementing solutions based on AI and Machine Learning.

Position Summary
3+ years of experience in a similar profile with a strong service delivery background. Lead and guide a team of junior developers. Design technical specifications for AI, Machine Learning, Deep Learning, NLP, NLU, and NLG projects and implement them. Contribute to products or tools built on Artificial Intelligence technologies and paradigms that can enable high-value offerings. Building AI solutions involves the use and creation of AI and ML techniques including but not limited to deep learning, computer vision, natural language processing, search, information retrieval, information extraction, probabilistic graphical models, and machine learning. Plan and implement version control of source code. Define and implement best practices for software development. Excellent computer skills - proficient in Excel, PowerPoint, Word, and Outlook. Excellent interpersonal skills and a collaborative management style. Ability to analyse and suggest solutions. Strong command of verbal and written English.

Roles and Responsibilities
Essential: Create technical designs for AI, Machine Learning, Deep Learning, NLP, NLU, and NLG projects and implement them in production.
Solid understanding and experience of deep learning architectures and algorithms. Experience solving problems in industry using deep learning methods such as recurrent neural networks (RNN, LSTM), convolutional neural nets, auto-encoders, etc. Should have experience of 2-3 production implementations of machine learning projects. Knowledge of open-source libraries such as Keras, TensorFlow, PyTorch. Work with business analysts/consultants and other necessary teams to create a strong solution. Should have in-depth understanding and experience of Data Science and Machine Learning projects using Python, R, etc. Skills in Java/C are a plus. Should develop solutions using Python in AI/ML projects. Should be able to train and build a team of technical developers. Desired: experience as a lead in designing and developing applications/tools using Microsoft technologies - ASP.Net, C#, HTML5, MVC. Desired: knowledge of cloud solutions such as Azure or AWS. Desired: knowledge of container technology such as Docker. Should be able to work with multicultural, global teams and work virtually. Should be able to build strong relationships with project stakeholders.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
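The deep learning methods listed above all rest on gradient descent. As a self-contained refresher, the toy below recovers the slope of y = 2x with a single weight and a hand-derived mean-squared-error gradient; libraries like TensorFlow or PyTorch automate exactly this differentiation and update step.

```python
# Data for y = 2x; we recover the slope with plain gradient descent.
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]

w = 0.0    # single weight, no bias
lr = 0.02  # learning rate
for _ in range(200):
    # dL/dw for mean squared error L = mean((w*x - y)^2)
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad

print(round(w, 3))  # → 2.0
```

With mean(x²) = 7.5 the update is w ← 0.7·w + 0.6, which converges geometrically to the fixed point w = 2; deep networks run the same scheme over millions of weights with automatically computed gradients.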

Posted 6 days ago

Apply

5.0 - 7.0 years

0 Lacs

Thiruvananthapuram

On-site

5 - 7 Years | 2 Openings | Trivandrum

Role description
Role Proficiency: This role requires proficiency in data pipeline development, including coding and testing data pipelines for ingesting, wrangling, transforming, and joining data from various sources. Must be skilled in ETL tools such as Informatica, Glue, Databricks, and DataProc, with coding expertise in Python, PySpark, and SQL. Works independently and has a deep understanding of data warehousing solutions including Snowflake, BigQuery, Lakehouse, and Delta Lake. Capable of calculating costs and understanding performance issues related to data solutions.

Outcomes:
Act creatively to develop pipelines and applications by selecting appropriate technical options, optimizing application development, maintenance, and performance using design patterns and reusing proven solutions. Interpret requirements to create optimal architecture and design, developing solutions in accordance with specifications. Document and communicate milestones/stages for end-to-end delivery. Code adhering to best coding standards; debug and test solutions to deliver best-in-class quality. Perform performance tuning of code and align it with the appropriate infrastructure to optimize efficiency. Validate results with user representatives, integrating the overall solution seamlessly. Develop and manage data storage solutions including relational databases, NoSQL databases, and data lakes. Stay updated on the latest trends and best practices in data engineering, cloud technologies, and big data tools. Influence and improve customer satisfaction through effective data solutions.
Measures of Outcomes:
Adherence to engineering processes and standards. Adherence to schedule/timelines. Adherence to SLAs where applicable. # of defects post delivery. # of non-compliance issues. Reduction of recurrence of known defects. Quick turnaround of production bugs. Completion of applicable technical/domain certifications. Completion of all mandatory training requirements. Efficiency improvements in data pipelines (e.g. reduced resource consumption, faster run times). Average time to detect, respond to, and resolve pipeline failures or data issues. Number of data security incidents or compliance breaches.

Outputs Expected:
Code Development: Develop data processing code independently, ensuring it meets performance and scalability requirements. Define coding standards, templates, and checklists. Review code for team members and peers.
Documentation: Create and review templates, checklists, guidelines, and standards for design processes and development. Create and review deliverable documents including design documents, architecture documents, infrastructure costing, business requirements, source-target mappings, test cases, and results.
Configuration: Define and govern the configuration management plan. Ensure compliance within the team.
Testing: Review and create unit test cases, scenarios, and execution plans. Review the test plan and test strategy developed by the testing team. Provide clarifications and support to the testing team as needed.
Domain Relevance: Advise data engineers on the design and development of features and components, demonstrating a deeper understanding of business needs. Learn about customer domains to identify opportunities for value addition. Complete relevant domain certifications to enhance expertise.
Project Management: Manage the delivery of modules effectively.
Defect Management: Perform root cause analysis (RCA) and mitigation of defects. Identify defect trends and take proactive measures to improve quality.
Estimation: Create and provide input for effort and size estimation for projects.
Knowledge Management: Consume and contribute to project-related documents, SharePoint libraries, and client universities. Review reusable documents created by the team.
Release Management: Execute and monitor the release process to ensure smooth transitions.
Design Contribution: Contribute to the creation of high-level design (HLD), low-level design (LLD), and system architecture for applications, business components, and data models.
Customer Interface: Clarify requirements and provide guidance to the development team. Present design options to customers and conduct product demonstrations.
Team Management: Set FAST goals and provide constructive feedback. Understand team members' aspirations and provide guidance and opportunities for growth. Ensure team engagement in projects and initiatives.
Certifications: Obtain relevant domain and technology certifications to stay competitive and informed.

Skill Examples:
Proficiency in SQL, Python, or other programming languages used for data manipulation. Experience with ETL tools such as Apache Airflow, Talend, Informatica, AWS Glue, Dataproc, and Azure ADF. Hands-on experience with cloud platforms like AWS, Azure, or Google Cloud, particularly with data-related services (e.g. AWS Glue, BigQuery). Conduct tests on data pipelines and evaluate results against data quality and performance specifications. Experience in performance tuning of data processes. Expertise in designing and optimizing data warehouses for cost efficiency. Ability to apply and optimize data models for efficient storage, retrieval, and processing of large datasets. Capacity to clearly explain and communicate design and development aspects to customers. Ability to estimate time and resource requirements for developing and debugging features or components.
Knowledge Examples:
Knowledge of various ETL services offered by cloud providers, including Apache PySpark, AWS Glue, GCP DataProc/DataFlow, and Azure ADF and ADLF. Proficiency in SQL for analytics, including windowing functions. Understanding of data schemas and models relevant to various business contexts. Familiarity with domain-related data and its implications. Expertise in data warehousing optimization techniques. Knowledge of data security concepts and best practices. Familiarity with design patterns and frameworks in data engineering.

Additional Comments:
Data Engineering Role Summary: Skilled Data Engineer with strong Python programming skills and experience in building scalable data pipelines across cloud environments. The candidate should have a good understanding of ML pipelines and basic exposure to GenAI solutioning. This role will support large-scale AI/ML and GenAI initiatives by ensuring high-quality, contextual, and real-time data availability.

Key Responsibilities:
• Design, build, and maintain robust, scalable ETL/ELT data pipelines in AWS/Azure environments.
• Develop and optimize data workflows using PySpark, SQL, and Airflow.
• Work closely with AI/ML teams to support training pipelines and GenAI solution deployments.
• Integrate data with vector databases like ChromaDB or Pinecone for RAG-based pipelines.
• Collaborate with solution architects and GenAI leads to ensure reliable, real-time data availability for agentic AI and automation solutions.
• Support data quality, validation, and profiling processes.
Key Skills & Technology Areas:
• Programming & Data Processing: Python (4–6 years), PySpark, Pandas, NumPy
• Data Engineering & Pipelines: Apache Airflow, AWS Glue, Azure Data Factory, Databricks
• Cloud Platforms: AWS (S3, Lambda, Glue), Azure (ADF, Synapse), GCP (optional)
• Databases: SQL/NoSQL, Postgres, DynamoDB, vector databases (ChromaDB, Pinecone) - preferred
• ML/GenAI Exposure (basic): Hands-on with Pandas, scikit-learn; knowledge of RAG pipelines and GenAI concepts
• Data Modeling: Star/Snowflake schema, data normalization, dimensional modeling
• Version Control & CI/CD: Git, Jenkins, or similar tools for pipeline deployment

Other Requirements:
• Strong problem-solving and analytical skills
• Flexible to work on fast-paced and cross-functional priorities
• Experience collaborating with AI/ML or GenAI teams is a plus
• Good communication and a collaborative, team-first mindset
• Experience in Telecom, E-Commerce, or Enterprise IT Operations is a plus

Skills: ETL, Big Data, PySpark, SQL

About UST
UST is a global digital transformation solutions provider. For more than 20 years, UST has worked side by side with the world’s best companies to make a real impact through transformation. Powered by technology, inspired by people and led by purpose, UST partners with their clients from design to operation. With deep domain expertise and a future-proof philosophy, UST embeds innovation and agility into their clients’ organizations. With over 30,000 employees in 30 countries, UST builds for boundless impact, touching billions of lives in the process.
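The SQL analytics skills this role calls out, windowing functions in particular, show up constantly in pipeline QA queries. Here is a minimal sketch using Python's built-in sqlite3 (window functions require SQLite 3.25+, which recent Python builds bundle); the events table and its columns are invented for illustration.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE events (user_id TEXT, ts INTEGER, amount REAL);
    INSERT INTO events VALUES
        ('u1', 1, 10.0), ('u1', 2, 20.0), ('u1', 3, 5.0),
        ('u2', 1, 7.0),  ('u2', 2, 3.0);
""")

# Running total per user -- a typical windowing pattern in analytics queries.
rows = conn.execute("""
    SELECT user_id, ts, amount,
           SUM(amount) OVER (PARTITION BY user_id ORDER BY ts) AS running_total
    FROM events
    ORDER BY user_id, ts
""").fetchall()
for r in rows:
    print(r)
```

The same PARTITION BY / ORDER BY pattern carries over unchanged to PySpark's `Window` API and to warehouse SQL dialects like BigQuery and Snowflake.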

Posted 6 days ago

Apply

9.0 - 13.0 years

2 - 5 Lacs

Cochin

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

EY Consulting – AI Enabled Automation – GenAI/Agentic – Manager
We are looking to hire people with strong AI-enabled automation skills who are interested in applying AI in the process automation space: Azure, AI, ML, Deep Learning, NLP, GenAI, Large Language Models (LLMs), RAG, Vector DB, Graph DB, Python.

Responsibilities:
Development and implementation of AI-enabled automation solutions, ensuring alignment with business objectives. Design and deploy Proof of Concepts (POCs) and Points of View (POVs) across various industry verticals, demonstrating the potential of AI-enabled automation applications. Ensure seamless integration of optimized solutions into the overall product or system. Collaborate with cross-functional teams to understand requirements, integrate solutions into cloud environments (Azure, GCP, AWS, etc.), and ensure alignment with business goals and user needs. Educate the team on best practices and keep updated on the latest tech advancements to bring innovative solutions to the project.

Technical Skills Requirements
9 to 13 years of relevant professional experience. Proficiency in Python and frameworks like PyTorch, TensorFlow, Hugging Face Transformers.
Strong foundation in ML algorithms, feature engineering, and model evaluation (must). Strong foundation in Deep Learning, Neural Networks, RNNs, CNNs, LSTMs, Transformers (BERT, GPT), and NLP (must). Experience in GenAI technologies: LLMs (GPT, Claude, LLaMA), prompting, fine-tuning. Experience with LangChain, LlamaIndex, LangGraph, AutoGen, or CrewAI (agentic frameworks). Knowledge of retrieval-augmented generation (RAG). Knowledge of Knowledge Graph RAG. Experience with multi-agent orchestration, memory, and tool integrations. Experience implementing MLOps practices and tools (CI/CD for ML, containerization, orchestration, model versioning, and reproducibility) (good to have). Experience with cloud platforms (AWS, Azure, GCP) for scalable ML model deployment. Good understanding of data pipelines, APIs, and distributed systems. Build observability into AI systems: latency, drift, performance metrics. Strong written and verbal communication, presentation, client service and technical writing skills in English for both technical and business audiences. Strong analytical, problem-solving and critical thinking skills. Ability to work under tight timelines for multiple project deliveries.

What we offer:
At EY GDS, we support you in achieving your unique potential both personally and professionally. We give you stretching and rewarding experiences that keep you motivated, working in an atmosphere of integrity and teaming with some of the world's most successful companies. And while we encourage you to take personal responsibility for your career, we support you in your professional development in every way we can. You enjoy the flexibility to devote time to what matters to you, in your business and personal lives. At EY you can be who you are and express your point of view, energy and enthusiasm, wherever you are in the world. It's how you make a difference.
EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
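Retrieval-augmented generation, one of the core skills this role lists, ultimately reduces to grounding a prompt in retrieved context. The sketch below shows only the prompt-assembly step with an invented template; retrieval itself, the LLM call, and citation verification are separate concerns left out here.

```python
def build_rag_prompt(question, retrieved_chunks):
    """Assemble a grounded prompt from retrieved context -- the generation
    side of RAG. The template wording is illustrative; production prompts
    are tuned per model and use case."""
    context = "\n".join(f"[{i + 1}] {c}" for i, c in enumerate(retrieved_chunks))
    return (
        "Answer using ONLY the numbered context below. "
        "Cite sources as [n]. If the answer is not in the context, say so.\n\n"
        f"Context:\n{context}\n\nQuestion: {question}\nAnswer:"
    )

chunks = [
    "Invoices are approved by the finance bot.",
    "Approved invoices are paid within 30 days.",
]
prompt = build_rag_prompt("Who approves invoices?", chunks)
print(prompt)
```

Numbering the chunks is what makes the model's citations checkable afterwards, which is also where observability metrics like groundedness attach.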

Posted 6 days ago

Apply

4.0 years

2 - 5 Lacs

Cochin

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

Job Description: Senior AI Engineer (Tech Lead)
Role Overview: We are seeking highly skilled and experienced Senior AI Engineers with a minimum of 4 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. In this role, you will play a key part in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role.

Responsibilities:
Contribute to the design and implementation of state-of-the-art AI solutions. Lead a team of 4-6 developers. Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs and agentic frameworks, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
- Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment.
- Use vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
- Implement similarity search algorithms and techniques to enable efficient, accurate retrieval of relevant information from generative AI outputs.
- Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly.
- Research and evaluate advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency.
- Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases.
- Ensure compliance with data privacy, security, and ethical considerations in AI applications.
- Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum 4 years of experience in Python, Data Science, Machine Learning, OCR, and document intelligence.
- Experience leading a team of 4-6 developers.
- Demonstrated ability to conceptualize technical solutions, apply accurate estimation techniques, and effectively engage with customer stakeholders.
- In-depth knowledge of machine learning, deep learning, and generative AI techniques.
- Proficiency in programming languages such as Python or R, and frameworks like TensorFlow or PyTorch.
- Strong understanding of NLP techniques and models such as BERT, GPT, or other Transformer architectures.
- Familiarity with computer vision techniques for image recognition, object detection, or image generation.
- Strong knowledge of Python frameworks such as Django, Flask, or FastAPI.
- Experience with RESTful API design and development.
- Experience with cloud platforms such as Azure, AWS, or GCP, and deploying AI solutions in a cloud environment.
- Expertise in data engineering, including data curation, cleaning, and preprocessing.
- Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems.
- Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models.
- Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
- Understanding of data privacy, security, and ethical considerations in AI applications.
- Track record of driving innovation and staying updated with the latest AI research and advancements.

Good to Have Skills:
- Understanding of agentic AI concepts and frameworks.
- Proficiency in designing or interacting with agent-based AI architectures.
- Ability to apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
- Experience with optimization tools and techniques, including MIP (Mixed Integer Programming).
- Experience driving DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models.
- Experience implementing CI/CD pipelines for streamlined model deployment and scaling.
- Experience with tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
- Experience applying infrastructure as code (IaC) principles with tools like Terraform or CloudFormation.
- Experience implementing monitoring and logging tools to ensure AI model performance and reliability.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
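The similarity-search retrieval this posting describes can be illustrated with a minimal sketch: rank stored embeddings by cosine similarity to a query vector. The document IDs and toy 3-dimensional vectors below are invented for the example; a production system would use a vector database such as Redis with real model embeddings.

```python
import math

def cosine_similarity(a, b):
    # dot(a, b) / (|a| * |b|); assumes both vectors are non-zero.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def top_k(query, corpus, k=2):
    # Score every (doc_id, embedding) pair and return the k best matches.
    scored = [(doc_id, cosine_similarity(query, emb)) for doc_id, emb in corpus]
    return sorted(scored, key=lambda s: s[1], reverse=True)[:k]

# Toy corpus: invoice-like documents cluster along the first dimension.
corpus = [
    ("invoice_q1", [0.9, 0.1, 0.0]),
    ("contract_a", [0.2, 0.8, 0.1]),
    ("invoice_q2", [0.85, 0.15, 0.05]),
]
print(top_k([1.0, 0.0, 0.0], corpus, k=2))
```

The same brute-force loop is what a vector database replaces with an approximate nearest-neighbour index once the corpus grows beyond memory-friendly sizes.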

Posted 6 days ago

Apply

2.0 years

2 - 5 Lacs

Cochin

On-site

At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

AI Engineer

Role Overview: We are seeking a highly skilled and experienced AI Engineer with a minimum of 2 years of experience in Data Science and Machine Learning, preferably with exposure to NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. You will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate has a deep understanding of AI technologies and experience designing and implementing cutting-edge AI models and systems; expertise in data engineering, DevOps, and MLOps practices will also be valuable.

Your technical responsibilities:
- Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI.
- Design, develop, and maintain efficient, reusable, and reliable Python code.
- Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications to enterprise challenges.
- Apply generative AI techniques, such as LLMs and agentic frameworks, to develop innovative solutions for enterprise industry use cases.
- Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
- Use vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
- Implement similarity search algorithms and techniques to enable efficient, accurate retrieval of relevant information from generative AI outputs.
- Ensure compliance with data privacy, security, and ethical considerations in AI applications.
- Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.
- Write unit tests and conduct code reviews to ensure high-quality, bug-free software.
- Troubleshoot and debug applications to optimize performance and fix issues.
- Work with databases (SQL, NoSQL) and integrate third-party APIs.

Requirements:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Minimum 2 years of experience in Python, Data Science, Machine Learning, OCR, and document intelligence.
- In-depth knowledge of machine learning, deep learning, and generative AI techniques.
- Proficiency in programming languages such as Python or R, and frameworks like TensorFlow or PyTorch.
- Strong understanding of NLP techniques and models such as BERT, GPT, or other Transformer architectures.
- Familiarity with computer vision techniques for image recognition, object detection, or image generation.
- Strong knowledge of Python frameworks such as Django, Flask, or FastAPI.
- Experience with RESTful API design and development.
- Experience with cloud platforms such as Azure, AWS, or GCP, and deploying AI solutions in a cloud environment.
- Expertise in data engineering, including data curation, cleaning, and preprocessing.
- Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
- Understanding of data privacy, security, and ethical considerations in AI applications.
Good to Have Skills:
- Understanding of agentic AI concepts and frameworks.
- Proficiency in designing or interacting with agent-based AI architectures.
- Ability to apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
- Experience with optimization tools and techniques, including MIP (Mixed Integer Programming).
- Experience implementing CI/CD pipelines for streamlined model deployment and scaling.
- Experience with tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
- Experience applying infrastructure as code (IaC) principles with tools like Terraform or CloudFormation.
- Experience implementing monitoring and logging tools to ensure AI model performance and reliability.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
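This posting pairs "preprocess large-scale datasets" with "write unit tests"; a minimal sketch of that combination is shown below. The function and test names are invented for illustration, and the cleaning rules (collapse whitespace and control characters, lowercase) are one plausible choice, not a prescribed pipeline.

```python
import re

def clean_text(raw):
    # Replace runs of control characters and whitespace with single
    # spaces, trim the ends, and normalize case for downstream indexing.
    text = re.sub(r"[\x00-\x1f]+", " ", raw)
    text = re.sub(r"\s+", " ", text).strip()
    return text.lower()

def test_clean_text():
    # Unit test in the style the posting asks for (pytest-compatible).
    assert clean_text("  Hello\tWorld \n") == "hello world"
    assert clean_text("") == ""
    assert clean_text("A\x00B") == "a b"

test_clean_text()
print("all tests passed")
```

Keeping preprocessing as small pure functions like this makes each rule independently testable, which matters once the same cleaner runs over millions of documents.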

Posted 6 days ago

Apply

0 years

1 - 4 Lacs

Thrissur

On-site

Job Summary: We are seeking a detail-oriented and organized individual to manage pest control documentation, report creation, and photo record-keeping. This role is essential to maintaining accurate, compliant monthly and quarterly records of pest control activities and supporting operational and audit needs.

Key Responsibilities:
- Maintain detailed records of pest control inspections, treatments, and monthly service reports.
- Collect and organize photographic evidence of pest control activities for documentation and compliance.
- Clearly document pesticide usage.
- Create and submit clear, timely reports to management, clients, and regulatory authorities.
- Coordinate with pest control technicians to gather accurate data and images from the field.
- Ensure documentation (including photos) complies with company and legal standards.
- Assist during audits or inspections by preparing and presenting relevant records.
- Organize both digital and physical files for efficient storage and retrieval, including software uploads.

Qualifications:
- Science graduate preferred, with good communication and time-management skills.
- Prior experience in documentation, reporting, or administrative roles (pest control industry experience preferred).
- Capacity to adhere to deadlines.
- Familiarity with handling image files for documentation.
- Strong attention to detail and ability to maintain organized records.
- Proficiency in MS Office (Word, Excel) and basic data entry or reporting software.
- Ability to work independently and collaborate with field teams.

Job Type: Full-time
Pay: ₹8,342.60 - ₹41,017.70 per month
Work Location: In person

Posted 6 days ago

Apply