0 years
0 Lacs
India
Remote
Machine Learning Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
Application Deadline: 28th May 2025

About WebBoost Solutions by UM
WebBoost Solutions by UM provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science.

Role Overview
As a Machine Learning Intern, you'll work on real-world projects, gaining practical experience in machine learning and data analysis.

Responsibilities
✅ Design, test, and optimize machine learning models.
✅ Analyze and preprocess datasets.
✅ Develop algorithms and predictive models for various applications.
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn.
✅ Document findings and create reports to present insights.

Requirements
🎓 Enrolled in or graduate of a relevant program (AI, ML, Data Science, Computer Science, or related field).
📊 Knowledge of machine learning concepts and algorithms.
🐍 Proficiency in Python or R (preferred).
🤝 Strong analytical and teamwork skills.

Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
✔ Practical machine learning experience.
✔ Internship Certificate & Letter of Recommendation.
✔ Build your portfolio with real-world projects.

How to Apply
📩 Submit your application by 28th May 2025 with the subject: "Machine Learning Intern Application".

Equal Opportunity
WebBoost Solutions by UM is an equal opportunity employer, welcoming candidates from all backgrounds.
Posted 1 month ago
4.0 years
0 Lacs
Jaipur, Rajasthan, India
On-site
Job Title: Machine Learning Engineer
Location: Malaviya Nagar, Jaipur (On-site)
Experience Required: 2 – 4 Years
Industry: Blockchain Technology
Employment Type: Full-Time

About the Company:
Our client is an innovative tech company specializing in cutting-edge blockchain solutions, working on decentralized applications, smart contracts, and fintech platforms. They're now expanding into AI/ML-driven blockchain analytics, fraud detection, and predictive systems and are looking for a skilled Machine Learning Engineer to join their growing team.

Key Responsibilities:
Design, develop, and deploy ML models to enhance blockchain data analysis, fraud detection, or smart contract optimization.
Work with blockchain developers and data engineers to integrate ML solutions into decentralized systems.
Preprocess large datasets from blockchain networks and external APIs.
Conduct exploratory data analysis to derive meaningful insights and trends.
Build and maintain scalable ML pipelines and model deployment workflows.
Optimize models for performance, scalability, and accuracy in production environments.
Research and evaluate new technologies at the intersection of AI/ML and blockchain.

Required Skills:
Solid understanding of core machine learning algorithms (supervised, unsupervised, NLP, etc.).
Hands-on experience with Python and ML libraries like TensorFlow, PyTorch, Scikit-learn, etc.
Strong knowledge of data preprocessing, feature engineering, and model evaluation techniques.
Experience with REST APIs and data collection from APIs or databases.
Good understanding of blockchain fundamentals and how decentralized systems work.
Familiarity with blockchain analytics tools or platforms is a plus.

Good to Have:
Exposure to smart contracts and Ethereum/Solidity.
Experience with graph-based ML (e.g., using blockchain transaction graphs).
Knowledge of tools like Docker, Kubernetes, or cloud services (AWS/GCP/Azure).

What We Offer:
Opportunity to work on real-world blockchain + AI innovations.
A collaborative team with a passion for decentralization and disruptive technologies.
Competitive salary package and career growth in a fast-growing domain.

To Apply:
Send your updated resume to ridhamstaffing@gmail.com with the subject line: "ML Engineer – Blockchain | Jaipur"
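For candidates unfamiliar with the "graph-based ML" item above, that kind of work typically starts with features derived from a transaction graph. A minimal sketch with networkx follows; the addresses, transfer values, and feature choices are invented for illustration and are not specific to this employer.

```python
# Hedged illustration: simple per-address graph features from blockchain transfers.
# All data here is made up.
import networkx as nx

transfers = [  # (from_address, to_address, value) — placeholder records
    ("0xA1", "0xB2", 4.0),
    ("0xA1", "0xC3", 1.5),
    ("0xB2", "0xC3", 0.7),
]

g = nx.DiGraph()
for src, dst, value in transfers:
    g.add_edge(src, dst, value=value)

# Per-address features that could later feed a fraud-detection classifier.
features = {
    node: {
        "out_degree": g.out_degree(node),
        "in_degree": g.in_degree(node),
        "total_sent": sum(d["value"] for _, _, d in g.out_edges(node, data=True)),
    }
    for node in g.nodes
}
print(features["0xA1"])
```

Features like these could then be joined with transaction-level attributes and passed to a conventional supervised model.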
Posted 1 month ago
0 years
0 Lacs
India
Remote
⚠️ Applications without a GitHub or Portfolio link in the resume will be automatically rejected. Please include it to be considered.

At NilAi, we’re building an AI-powered platform that helps hospitals (starting with the NHS) optimize energy and water consumption, reduce carbon emissions, and meet Net Zero goals—without any new hardware. We're looking for a passionate AI Intern to join our mission-driven team and help us shape the future of sustainable healthcare.

🌍 Company: NilAI
📍 Location: India (Remote)
💼 Position: AI Intern
💰 Stipend: ₹5,000/month

Responsibilities
Clean, preprocess, and analyze large datasets related to hospital energy usage, water consumption, and operational workflows.
Develop and implement machine learning models (e.g., regression, time-series forecasting, anomaly detection) using Scikit-learn and TensorFlow/PyTorch to predict and optimize energy consumption.
Explore the application of LLMs (Large Language Models) for automating reports or extracting insights from unstructured data (e.g., maintenance logs, audit reports).
Create interactive dashboards and visualizations using Power BI or Tableau to communicate findings to stakeholders.
Integrate open-source APIs (e.g., OpenAI API) for enhancing data processing or generating sustainability recommendations.
Assist in deploying lightweight models or prototypes using Flask or Streamlit for internal testing.
Collaborate with the team to refine AI-driven recommendations for reducing carbon emissions and improving resource efficiency.
Take ownership of complex challenges, demonstrating a commitment to continuous learning and delivering innovative, scalable solutions.

Required Skills & Qualifications
- Pursuing or recently completed a degree in Data Science, Computer Science, Engineering, Statistics, or a related field.
- Proficiency in Python and experience with data science libraries (e.g., Pandas, NumPy, Scikit-learn).
- Familiarity with machine learning frameworks (TensorFlow/PyTorch) and model deployment.
- Experience with data visualization tools (Power BI, Tableau) and storytelling with data.
- Basic understanding of LLMs and API integrations (e.g., OpenAI, Hugging Face).
- Exposure to time-series forecasting (e.g., Prophet, ARIMA) or anomaly detection techniques.
- Experience with ETL pipelines (e.g., Apache Airflow, Alteryx, or custom Python scripts) and data warehousing concepts.
- Knowledge of SQL for data querying and manipulation.
- Ability to work with messy, real-world datasets and strong problem-solving skills.
- Passion for sustainability, healthcare innovation, or energy efficiency is a plus!

Nice-to-Have Skills
Experience with cloud platforms (AWS, GCP) or big data tools.

What You’ll Gain
Hands-on experience with AI for sustainability in a high-impact startup.
Mentorship from experienced data scientists and exposure to real-world energy challenges.
Opportunity to contribute to a product that directly reduces carbon emissions and saves costs for hospitals.
Flexible work environment and potential for future full-time roles.

Please Note: Kindly attach your CV with portfolios for review.

Let’s build something that matters. 🌍
#AIforGood #ClimateTech #HealthcareInnovation #NilAi
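For context on the time-series forecasting mentioned above, the core loop can be sketched in a few lines with statsmodels' ARIMA. The file name, column names, and model order below are hypothetical placeholders, not NilAi's actual data or configuration.

```python
# Illustrative energy-consumption forecast with ARIMA. Inputs are hypothetical.
import pandas as pd
from statsmodels.tsa.arima.model import ARIMA

df = pd.read_csv("hospital_energy.csv", parse_dates=["timestamp"], index_col="timestamp")
hourly_kwh = df["energy_kwh"].resample("H").sum()    # aggregate to hourly consumption

model = ARIMA(hourly_kwh, order=(2, 1, 2)).fit()     # order chosen for illustration only
forecast = model.forecast(steps=24)                  # forecast the next 24 hours

# Comparing later actuals against this forecast is one simple anomaly heuristic.
print(forecast.head())
```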
Posted 1 month ago
2.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

AI Engineer

Role Overview:
We are seeking highly skilled and experienced AI Engineers with a minimum of 2 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. In this role, you will play a key part in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role.

Your technical responsibilities:
Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI.
Design, develop, and maintain efficient, reusable, and reliable Python code.
Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges.
Utilize generative AI techniques, such as LLMs and agentic frameworks, to develop innovative solutions for enterprise industry use cases.
Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs.
Ensure compliance with data privacy, security, and ethical considerations in AI applications.
Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.
Write unit tests and conduct code reviews to ensure high-quality, bug-free software.
Troubleshoot and debug applications to optimize performance and fix issues.
Work with databases (SQL, NoSQL) and integrate third-party APIs.

Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Minimum 2 years of experience in Python, Data Science, Machine Learning, OCR, and document intelligence.
In-depth knowledge of machine learning, deep learning, and generative AI techniques.
Proficiency in programming languages such as Python and R, and frameworks like TensorFlow or PyTorch.
Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models.
Familiarity with computer vision techniques for image recognition, object detection, or image generation.
Strong knowledge of Python frameworks such as Django, Flask, or FastAPI.
Experience with RESTful API design and development.
Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment.
Expertise in data engineering, including data curation, cleaning, and preprocessing.
Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
Understanding of data privacy, security, and ethical considerations in AI applications.

Good to Have Skills:
Understanding of agentic AI concepts and frameworks.
Proficiency in designing or interacting with agent-based AI architectures.
Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
Utilize optimization tools and techniques, including MIP (Mixed Integer Programming).
Implement CI/CD pipelines for streamlined model deployment and scaling processes.
Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation.
Implement monitoring and logging tools to ensure AI model performance and reliability.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
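For context, the similarity-search responsibility above generally reduces to embedding text and ranking by vector similarity. A small sketch with a public sentence-transformers checkpoint follows; the model name and example texts are placeholders rather than EY's actual stack.

```python
# Hedged sketch: embed a tiny document set and retrieve the closest match for a query.
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # public checkpoint, used for illustration

docs = ["invoice processing policy", "travel reimbursement rules", "data retention schedule"]
doc_vecs = encoder.encode(docs, normalize_embeddings=True)

query_vec = encoder.encode(["how long do we keep customer data?"], normalize_embeddings=True)
scores = doc_vecs @ query_vec.T            # cosine similarity (vectors are unit-normalized)
print(docs[int(np.argmax(scores))])        # expected: "data retention schedule"
```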
Posted 1 month ago
2.0 years
0 Lacs
Kanayannur, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

AI Engineer

Role Overview:
We are seeking highly skilled and experienced AI Engineers with a minimum of 2 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. In this role, you will play a key part in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role.

Your technical responsibilities:
Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI.
Design, develop, and maintain efficient, reusable, and reliable Python code.
Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges.
Utilize generative AI techniques, such as LLMs and agentic frameworks, to develop innovative solutions for enterprise industry use cases.
Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs.
Ensure compliance with data privacy, security, and ethical considerations in AI applications.
Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.
Write unit tests and conduct code reviews to ensure high-quality, bug-free software.
Troubleshoot and debug applications to optimize performance and fix issues.
Work with databases (SQL, NoSQL) and integrate third-party APIs.

Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Minimum 2 years of experience in Python, Data Science, Machine Learning, OCR, and document intelligence.
In-depth knowledge of machine learning, deep learning, and generative AI techniques.
Proficiency in programming languages such as Python and R, and frameworks like TensorFlow or PyTorch.
Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models.
Familiarity with computer vision techniques for image recognition, object detection, or image generation.
Strong knowledge of Python frameworks such as Django, Flask, or FastAPI.
Experience with RESTful API design and development.
Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment.
Expertise in data engineering, including data curation, cleaning, and preprocessing.
Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
Understanding of data privacy, security, and ethical considerations in AI applications.

Good to Have Skills:
Understanding of agentic AI concepts and frameworks.
Proficiency in designing or interacting with agent-based AI architectures.
Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
Utilize optimization tools and techniques, including MIP (Mixed Integer Programming).
Implement CI/CD pipelines for streamlined model deployment and scaling processes.
Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation.
Implement monitoring and logging tools to ensure AI model performance and reliability.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
2.0 years
0 Lacs
Trivandrum, Kerala, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.

AI Engineer

Role Overview:
We are seeking highly skilled and experienced AI Engineers with a minimum of 2 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. In this role, you will play a key part in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role.

Your technical responsibilities:
Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI.
Design, develop, and maintain efficient, reusable, and reliable Python code.
Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges.
Utilize generative AI techniques, such as LLMs and agentic frameworks, to develop innovative solutions for enterprise industry use cases.
Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs.
Ensure compliance with data privacy, security, and ethical considerations in AI applications.
Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.
Write unit tests and conduct code reviews to ensure high-quality, bug-free software.
Troubleshoot and debug applications to optimize performance and fix issues.
Work with databases (SQL, NoSQL) and integrate third-party APIs.

Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Minimum 2 years of experience in Python, Data Science, Machine Learning, OCR, and document intelligence.
In-depth knowledge of machine learning, deep learning, and generative AI techniques.
Proficiency in programming languages such as Python and R, and frameworks like TensorFlow or PyTorch.
Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models.
Familiarity with computer vision techniques for image recognition, object detection, or image generation.
Strong knowledge of Python frameworks such as Django, Flask, or FastAPI.
Experience with RESTful API design and development.
Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment.
Expertise in data engineering, including data curation, cleaning, and preprocessing.
Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
Understanding of data privacy, security, and ethical considerations in AI applications.

Good to Have Skills:
Understanding of agentic AI concepts and frameworks.
Proficiency in designing or interacting with agent-based AI architectures.
Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
Utilize optimization tools and techniques, including MIP (Mixed Integer Programming).
Implement CI/CD pipelines for streamlined model deployment and scaling processes.
Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation.
Implement monitoring and logging tools to ensure AI model performance and reliability.

EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
3.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Experience: 3 - 7 Years
Shift timing: 1.00 pm to 10.00 pm
Domain: Banking and BFSI
Work mode: Hybrid
Notice Period: Immediate to 30 days

Job Summary:
We are seeking a skilled Credit Risk Modeller to develop, validate, and maintain credit risk models that assess the creditworthiness of individuals and organizations. The role involves analyzing financial data, creating predictive models, and supporting the credit decision-making process to minimize potential losses and optimize risk-adjusted returns.

Key Responsibilities:
Develop and implement credit risk models (e.g., Probability of Default (PD), Loss Given Default (LGD), Exposure at Default (EAD)) for retail and/or corporate portfolios.
Conduct statistical analysis and predictive modeling using techniques such as logistic regression, decision trees, machine learning algorithms, and other quantitative methods.
Collaborate with data teams to collect, clean, and preprocess data from multiple sources.
Perform back-testing and validation of existing credit risk models to ensure accuracy and compliance with regulatory standards (e.g., Basel II/III).
Prepare detailed documentation of modeling assumptions, methodology, and results.
Provide insights and recommendations to credit risk managers and business stakeholders to improve risk management strategies.
Stay up to date with industry best practices, regulatory requirements, and emerging trends in credit risk analytics.
Participate in internal and external audits related to credit risk models.
Support stress testing and scenario analysis for credit portfolios.

Qualifications:
3+ years of strong experience in credit risk modeling, preferably in banking or financial services.
Bachelor's or Master's degree in Finance, Economics, Statistics, Mathematics, Data Science, or a related quantitative discipline.
Proficiency in statistical and modeling tools such as SAS, R, Python, SQL, or equivalent.
Good understanding of credit risk concepts and regulatory frameworks (Basel Accords, IFRS 9).
Strong analytical skills with attention to detail and problem-solving ability.
Excellent communication skills for explaining complex technical information to non-technical stakeholders.
Experience with big data tools and machine learning techniques is a plus.
Familiarity with credit risk software platforms is advantageous.
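For orientation, a probability-of-default (PD) model of the kind listed above is often prototyped as a logistic regression. The sketch below assumes a hypothetical portfolio file, feature names, and default flag; LGD and EAD components would be modelled separately with their own targets.

```python
# Illustrative PD model with logistic regression. All inputs are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

loans = pd.read_csv("loan_book.csv")  # hypothetical portfolio snapshot
features = ["debt_to_income", "utilisation", "months_on_book", "delinquency_count"]
X, y = loans[features], loans["defaulted_12m"]

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0
)

pd_model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
pd_scores = pd_model.predict_proba(X_test)[:, 1]          # estimated PD per account
print("Gini:", 2 * roc_auc_score(y_test, pd_scores) - 1)  # common discrimination metric
```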
Posted 1 month ago
0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Location: Delhi (for projects across India)

About Varahe Analytics:
Varahe Analytics is one of India's premier integrated political consulting firms, specialising in building data-driven 360-degree election management. We help our clients with strategic advice and implementation, combining data-backed insights and in-depth ground intelligence into a holistic electoral campaign. We are passionate about our democracy and the politics that shape our world. We draw on some of the sharpest minds from distinguished institutions and diverse professional backgrounds to help us achieve our goal of building electoral strategies that spark conversations, effect change, and help shape electoral and legislative ecosystems in our country.

About the Team:
As part of the Data Analytics team, you will have the opportunity to contribute to impactful research and insights that drive strategic decisions. Your role will involve analyzing datasets, building dashboards, and generating visual reports using tools like Power BI. You will work closely with cross-functional teams to uncover trends, support data-driven strategies, and provide actionable intelligence. This internship offers a unique chance to be part of high-impact analytical work that informs key decisions and contributes to shaping outcomes at scale.

What Would This Role Entail?
Report Making and Visualization: Develop, design, and maintain interactive and insightful reports and dashboards using Power BI. Transform raw data into meaningful visualizations that provide actionable insights. Ensure that reports are user-friendly, visually appealing, and accessible to a diverse audience.
Data Analysis: Analyze and interpret complex data sets to identify trends, patterns, and key insights. Collaborate with stakeholders to understand their data requirements and deliver customized reporting solutions.
Data Management: Extract, clean, and preprocess data from various sources to ensure data integrity and accuracy. Maintain and update existing reports and dashboards to reflect new data and evolving business needs.

Necessary Qualifications/Skills:
Currently pursuing a Bachelor's or Master's degree in Economics, Data Science, Engineering, or a related field.
Strong analytical and problem-solving skills with a keen attention to detail.
Excellent communication skills to effectively convey data insights to non-technical stakeholders.
Proficient in Power BI and Excel, with introductory-level knowledge of Pandas.
Ability to integrate Excel with Power BI for enhanced data analysis and reporting.

Good to Have Skills:
Proficiency in creating interactive and dynamic reports and dashboards using Power BI.
Enthusiasm for learning and applying data analysis techniques.

How to Apply
If you're a fresh professional looking for a high-impact challenge and interested in joining a team of like-minded and motivated individuals who think strategically, act decisively, and get things done, drop us an email at internship@varaheanalytics.com
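To make the data-cleaning side of the role concrete, here is a small illustrative Pandas snippet that tidies an Excel extract and writes a CSV a Power BI dashboard could ingest; the file and column names are invented.

```python
# Illustrative Pandas-to-Power-BI handoff. File and column names are placeholders.
import pandas as pd

raw = pd.read_excel("field_survey.xlsx")                  # hypothetical raw extract
raw.columns = [c.strip().lower().replace(" ", "_") for c in raw.columns]

clean = (
    raw.dropna(subset=["district", "respondents"])        # drop incomplete rows
       .assign(respondents=lambda d: d["respondents"].astype(int))
)

summary = clean.groupby("district", as_index=False)["respondents"].sum()
summary.to_csv("district_summary.csv", index=False)       # load this file into Power BI
```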
Posted 1 month ago
0 years
0 Lacs
India
On-site
Apply now: https://forms.office.com/r/RFESZssevc

Key Responsibilities:
- Data Analysis and Preprocessing: Analyze and preprocess diverse datasets relevant to the mortgage industry, ensuring data quality and relevance for model training.
- Model Development and Fine-Tuning: Research and implement state-of-the-art NLP models, focusing on pre-training as well as instruction tuning of pre-trained LLMs for mortgage-specific applications. Utilize techniques like RLHF to improve model alignment with human preferences and enhance decision-making capabilities.
- Algorithm Implementation: Develop and optimize machine learning algorithms to enhance model performance, accuracy, and efficiency.
- Collaboration: Work with domain experts to incorporate industry knowledge into model development, ensuring outputs are relevant and actionable.
- Experimentation: Conduct experiments to validate model hypotheses, analyze results, and iterate on model improvements.
- Documentation: Maintain comprehensive documentation of methodologies, experiments, and results to support transparency and reproducibility.
- Ethics and Bias Mitigation: Ensure responsible AI practices are followed by identifying potential biases in data and models and implementing strategies to mitigate them.

Required Skills:
- Technical Expertise: Strong background in machine learning, deep learning, and NLP. Proficiency in Python and experience with ML frameworks such as TensorFlow or PyTorch.
- NLP Knowledge: Experience with NLP frameworks and libraries (e.g., Hugging Face Transformers) for developing language models.
- Data Handling: Proficiency in handling large datasets, feature engineering, and statistical analysis.
- Problem Solving: Strong analytical skills with the ability to solve complex problems using data-driven approaches.
- Communication: Excellent communication skills to effectively collaborate with technical teams and non-technical stakeholders.

Preferred Qualifications:
- Educational Background: Master's or Ph.D. in Data Science, Computer Science, Statistics, or a related field.
- Cloud Computing: Familiarity with cloud platforms (e.g., AWS, Azure) for scalable computing solutions.
- Ethics Awareness: Understanding of ethical considerations in AI development, including bias detection and mitigation.

⚠️ Disclaimer: Firstsource follows a fair, transparent, and merit-based hiring process. We never ask for money at any stage. Beware of fraudulent offers and always verify through our official channels or @firstsource.com email addresses.
Posted 1 month ago
0 years
0 Lacs
Mumbai Metropolitan Region
On-site
About Firstsource
Firstsource Solutions Limited, an RP-Sanjiv Goenka Group company (NSE: FSL, BSE: 532809, Reuters: FISO.BO, Bloomberg: FSOL:IN), is a specialized global business process services partner, providing transformational solutions and services spanning the customer lifecycle across Healthcare, Banking and Financial Services, Communications, Media and Technology, Retail, and other diverse industries. With an established presence in the US, the UK, India, Mexico, Australia, South Africa, and the Philippines, we make it happen for our clients, solving their biggest challenges with hyper-focused, domain-centered teams and cutting-edge tech, data, and analytics. Our real-world practitioners work collaboratively to deliver future-focused outcomes.

Job Summary
We are seeking a skilled AI/ML professional to develop and fine-tune NLP models tailored to the mortgage industry. The role involves end-to-end data analysis, model training (including instruction tuning and RLHF), and algorithm optimization. The ideal candidate will collaborate with domain experts, conduct rigorous experimentation, and uphold ethical AI practices to deliver accurate, relevant, and bias-mitigated solutions.

Key Roles & Responsibilities:
Data Analysis and Preprocessing: Analyze and preprocess diverse datasets relevant to the mortgage industry, ensuring data quality and relevance for model training.
Model Development and Fine-Tuning: Research and implement state-of-the-art NLP models, focusing on pre-training as well as instruction tuning of pre-trained LLMs for mortgage-specific applications. Utilize techniques like RLHF to improve model alignment with human preferences and enhance decision-making capabilities.
Algorithm Implementation: Develop and optimize machine learning algorithms to enhance model performance, accuracy, and efficiency.
Collaboration: Work with domain experts to incorporate industry knowledge into model development, ensuring outputs are relevant and actionable.
Experimentation: Conduct experiments to validate model hypotheses, analyze results, and iterate on model improvements.
Documentation: Maintain comprehensive documentation of methodologies, experiments, and results to support transparency and reproducibility.
Ethics and Bias Mitigation: Ensure responsible AI practices are followed by identifying potential biases in data and models and implementing strategies to mitigate them.

Required Skills and Qualifications
Technical Expertise: Strong background in machine learning, deep learning, and NLP. Proficiency in Python and experience with ML frameworks such as TensorFlow or PyTorch.
NLP Knowledge: Experience with NLP frameworks and libraries (e.g., Hugging Face Transformers) for developing language models.
Data Handling: Proficiency in handling large datasets, feature engineering, and statistical analysis.
Problem Solving: Strong analytical skills with the ability to solve complex problems using data-driven approaches.
Communication: Excellent communication skills to effectively collaborate with technical teams and non-technical stakeholders.
Educational Background: Master's or Ph.D. in Data Science, Computer Science, Statistics, or a related field.
Cloud Computing: Familiarity with cloud platforms (e.g., AWS, Azure) for scalable computing solutions.
Ethics Awareness: Understanding of ethical considerations in AI development, including bias detection and mitigation.

⚠️ Disclaimer: Firstsource follows a fair, transparent, and merit-based hiring process. We never ask for money at any stage. Beware of fraudulent offers and always verify through our official channels or @firstsource.com email addresses.
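As a deliberately generic illustration of applying a pre-trained NLP model to mortgage-style text, the sketch below uses a public Hugging Face zero-shot classification checkpoint; the model, document, and label set are placeholders and do not reflect Firstsource's internal models or data.

```python
# Hedged sketch: route a mortgage-servicing note to a category with a public
# zero-shot model. Model choice, text, and labels are illustrative placeholders.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="facebook/bart-large-mnli",   # widely used public checkpoint
)

doc = "Borrower requests an escrow analysis after the latest property tax increase."
labels = ["escrow", "payoff request", "loss mitigation", "general enquiry"]

result = classifier(doc, candidate_labels=labels)
print(result["labels"][0], round(result["scores"][0], 3))  # top label and its score
```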
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Machine Learning Intern (Paid)
Company: Coreline solutions
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
Application Deadline: 27th May 2025

About Coreline solutions
Coreline solutions provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science.

Role Overview
As a Machine Learning Intern, you'll work on real-world projects, gaining practical experience in machine learning and data analysis.

Responsibilities
✅ Design, test, and optimize machine learning models.
✅ Analyze and preprocess datasets.
✅ Develop algorithms and predictive models for various applications.
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn.
✅ Document findings and create reports to present insights.

Requirements
🎓 Enrolled in or graduate of a relevant program (AI, ML, Data Science, Computer Science, or related field).
📊 Knowledge of machine learning concepts and algorithms.
🐍 Proficiency in Python or R (preferred).
🤝 Strong analytical and teamwork skills.

Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
✔ Practical machine learning experience.
✔ Internship Certificate & Letter of Recommendation.
✔ Build your portfolio with real-world projects.

How to Apply
📩 Submit your application by 27th May 2025 with the subject: "Machine Learning Intern Application".

Equal Opportunity
Coreline solutions is an equal opportunity employer, welcoming candidates from all backgrounds.
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Data Science Intern (Paid)
Company: Coreline solutions
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About Coreline solutions
Coreline solutions provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.

Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.

Requirements
🎓 Enrolled in or graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python or R for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.

Stipend & Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.

How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 27th May 2025

Equal Opportunity
Coreline solutions is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
Posted 1 month ago
2.0 years
0 Lacs
India
Remote
Job Title: AI Full stack Developer – GenAI & NLP
Location: Pune, India (Hybrid)
Work Mode: Remote
Experience Required: 2+ Years (Relevant AI/ML with GenAI & NLP)
Salary: Up to ₹15 LPA (CTC)
Employment Type: Full-time
Department: AI Research & Development

Role Overview
We are looking for a passionate AI Developer with strong hands-on experience in Generative AI and Natural Language Processing (NLP) to help build intelligent and scalable solutions. In this role, you will design and deploy advanced AI models for tasks such as language generation, summarization, chatbot development, document analysis, and more. You'll work with cutting-edge LLMs (Large Language Models) and contribute to impactful AI initiatives.

Key Responsibilities
Design, fine-tune, and deploy NLP and GenAI models using LLMs like GPT, BERT, LLaMA, or similar.
Build applications for tasks like text generation, question-answering, summarization, sentiment analysis, and semantic search.
Integrate language models into production systems using RESTful APIs or cloud services.
Evaluate and optimize models for accuracy, latency, and cost.
Collaborate with product and engineering teams to implement intelligent user-facing features.
Preprocess and annotate text data, create custom datasets, and manage model pipelines.
Stay updated on the latest advancements in generative AI, transformer models, and NLP frameworks.

Required Skills & Qualifications
Bachelor's or Master's degree in Computer Science, AI/ML, or a related field.
Minimum 2 years of experience in full-stack development and AI/ML development, with recent work in NLP or Generative AI.
Hands-on experience with models such as GPT, T5, BERT, or similar transformer-based architectures.
Proficient in Python and libraries such as Hugging Face Transformers, spaCy, NLTK, or OpenAI APIs.
Hands-on experience with frontend/backend technologies for software development.
Experience with deploying models using Flask, FastAPI, or similar frameworks.
Strong understanding of NLP tasks, embeddings, vector databases (e.g., FAISS, Pinecone), and prompt engineering.
Familiarity with MLOps tools and cloud platforms (AWS, Azure, or GCP).

Preferred Qualifications
Experience with LangChain, RAG (Retrieval-Augmented Generation), or custom LLM fine-tuning.
Knowledge of model compression, quantization, or inference optimization.
Exposure to ethical AI, model interpretability, and data privacy practices.

What We Offer
Competitive salary package up to ₹15 LPA.
Remote work flexibility with hybrid team collaboration in Pune.
Opportunity to work on real-world generative AI and NLP applications.
Access to resources for continuous learning and certification support.
Inclusive, fast-paced, and innovative work culture.

Skills: nltk, computer vision, inference optimization, model interpretability, gpt, bert, mlops, artificial intelligence, next.js, tensorflow, ai development, machine learning, generative ai, ml, openai, node.js, kubernetes, large language models (llms), openai apis, natural language processing, machine learning (ml), fastapi, natural language processing (nlp), java, azure, nlp tasks, model compression, embeddings, vector databases, aws, typescript, r, hugging face transformers, google cloud, hugging face, llama, ai tools, mlops tools, rag architectures, langchain, spacy, docker, retrieval-augmented generation (rag), pytorch, gcp, cloud, large language models, react.js, deep learning, python, ai technologies, flask, ci/cd, data privacy, django, quantization, javascript, ethical ai, nlp
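For orientation, the retrieval step behind RAG and semantic search can be sketched as: embed a corpus, index it in FAISS, and look up the nearest neighbours for a query. The model name and corpus below are placeholders, not this employer's system.

```python
# Hedged sketch of a FAISS-backed semantic search over a toy corpus.
import faiss
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # public checkpoint, illustrative choice
corpus = [
    "Refund requests are processed within five business days.",
    "The API rate limit is 100 requests per minute.",
    "Support is available 24x7 via chat.",
]

vectors = encoder.encode(corpus, normalize_embeddings=True).astype("float32")
index = faiss.IndexFlatIP(vectors.shape[1])   # inner product == cosine on unit vectors
index.add(vectors)

query = encoder.encode(["how fast are refunds?"], normalize_embeddings=True).astype("float32")
scores, ids = index.search(query, 1)          # top-1 neighbour
print(corpus[int(ids[0][0])], float(scores[0][0]))
```

In a RAG pipeline, the retrieved passages would then be inserted into the prompt sent to the LLM.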
Posted 1 month ago
8.0 - 11.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description: Lead Data Scientist – Healthcare Domain Specialist
Location: Bangalore
Company: RT Global Infosolutions Pvt Ltd (www.rgisol.com)
Employment Type: Full-Time
Industry: Healthcare/AI/Analytics
Domain Expertise: Predictive Analytics, Healthcare Data

As a key leader in our data science team, you will define strategy, lead projects, and collaborate with healthcare professionals, engineers, and product teams to deploy scalable AI solutions.

Key Responsibilities
• AI Model Development: Design, develop, and optimize predictive models for elderly fall risk assessment using advanced machine learning (ML) and deep learning techniques.
• Data Analysis: Work with healthcare-specific data (e.g., patient records, sensor data, clinical data) to uncover patterns and actionable insights.
• Domain Expertise Application: Leverage healthcare domain knowledge to ensure accuracy, reliability, and ethical use of models in predicting fall risks.
• Collaborate with Experts: Collaborate with clinicians, healthcare providers, and cross-functional teams to align AI solutions with clinical workflows and patient care strategies.
• Data Engineering: Develop robust ETL pipelines to preprocess and integrate healthcare data from multiple sources, ensuring data quality and compliance.
• Evaluation & Optimization: Continuously evaluate model performance and refine algorithms to achieve high accuracy and generalizability.
• Compliance & Ethics: Ensure compliance with healthcare data regulations such as HIPAA and GDPR, and implement best practices for data privacy and security.
• Research & Innovation: Stay updated with the latest research in healthcare AI, predictive analytics, and elderly care solutions, integrating new techniques as applicable.
• Team Management: Guide all team members in technical and domain-specific problem-solving, manage day-to-day task deliverables, evaluate individual performance, and coach.
• Stakeholder Management: Present insights, models, and business impact assessments to senior leadership and healthcare stakeholders.

Required Skills & Qualifications
• Education: Master's or PhD in Data Science, Computer Science, Statistics, Bioinformatics, or a related field. A strong academic background in healthcare is preferred.
• Experience:
  o 8 - 11 years of experience in data science, with at least 2 years in the healthcare domain.
  o Prior experience in leading AI projects in healthcare startups, hospitals, or MedTech companies.
  o Ability to work in cross-functional teams.
  o Ability to publish papers and research findings related to healthcare data science.
• Technical Expertise:
  o Proficiency in Python, R, or other programming languages used for ML and data analysis.
  o Hands-on experience with ML/DL frameworks (e.g., TensorFlow, PyTorch, Scikit-learn).
  o Experience with time-series data, wearable/sensor data, or IoT data integration is a plus.
  o Strong knowledge of statistics, probability, and feature engineering.
  o Familiarity with cloud platforms (AWS, Azure, GCP) and tools for scalable ML pipelines.
• Healthcare Domain Knowledge:
  o Understanding of geriatric healthcare challenges, fall risks, and predictive care strategies.
  o Familiarity with Electronic Health Records (EHR), wearable devices, and sensor data.
  o Knowledge of healthcare data compliance (e.g., HIPAA, GDPR).
• Soft Skills:
  o Strong analytical and problem-solving abilities.
  o Excellent communication skills to present findings to non-technical stakeholders.
  o A collaborative mindset to work with interdisciplinary teams.

Preferred Qualifications
• Knowledge of biomechanics or human movement analysis.
• Experience with explainable AI (XAI) and interpretable ML models.

What We Offer
• Opportunity to work on cutting-edge healthcare AI solutions that make a meaningful impact on elderly lives.
• Competitive salary and benefits package.
• Flexible work environment, with options for hybrid work.
• Opportunities for professional growth and leadership.
• Collaborative and inclusive culture that values innovation and teamwork.
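Purely as an illustration of the fall-risk modelling described above, here is a short scikit-learn sketch on hypothetical gait/sensor features; the data file, feature names, and target are invented, and no clinical validity is implied.

```python
# Hedged sketch: fall-risk classifier on made-up wearable/sensor features.
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

data = pd.read_csv("gait_features.csv")  # hypothetical aggregated accelerometer statistics
features = ["gait_speed", "stride_variability", "sit_to_stand_time", "age"]
X, y = data[features], data["fell_within_6_months"]

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, stratify=y, random_state=7)
clf = GradientBoostingClassifier().fit(X_tr, y_tr)

risk = clf.predict_proba(X_te)[:, 1]                # estimated fall-risk score per person
print("AUC:", round(roc_auc_score(y_te, risk), 3))  # discrimination on the held-out set
```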
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Razorpay was founded by Shashank Kumar and Harshil Mathur in 2014. Razorpay is building a new-age digital banking hub (Neobank) for businesses in India, with the mission to enable frictionless banking and payments experiences for businesses of all shapes and sizes. What started as a B2B payments company now processes billions of dollars of payments for lakhs of businesses across India.

We are a full-stack financial services organisation, committed to helping Indian businesses with comprehensive and innovative payment and business banking solutions built over robust technology to address the entire length and breadth of the payment and banking journey for any business. Over the past year, we've disbursed loans worth millions of dollars to thousands of businesses. In parallel, Razorpay is reimagining how businesses manage money by simplifying business banking (via Razorpay X) and enabling capital availability for businesses (via Razorpay Capital).

The Role
The Senior Analytics Specialist will work with the central analytics team at Razorpay. This will give you an opportunity to work in a fast-paced environment aimed at creating very high impact, and to work with a diverse team of smart and hardworking professionals from various backgrounds. Some of the responsibilities include working with large, complex data sets, developing a strong business and product understanding, and being closely involved in the product life cycle.

Roles And Responsibilities
You will work with large, complex data sets to solve open-ended, high-impact business problems using data mining, experimentation, statistical analysis and related techniques, and machine learning as needed.
You would have/develop a strong understanding of the business and product, and conduct analysis to derive insights, develop hypotheses, and validate them with sound, rigorous methodologies, or formulate the problems for modeling with ML.
You would apply excellent problem-solving skills and independently scope, deconstruct, and formulate solutions from first principles that bring an outside-in and state-of-the-art view.
You would be closely involved with the product life cycle, working on ideation, reviewing Product Requirement Documents, defining success criteria, instrumenting for product features, impact assessment, and identifying and recommending improvements to further enhance product features.
You would expedite root cause analyses/insight generation against a given recurring use case through automation/self-serve platforms.
You will develop compelling stories with business insights, focusing on strategic goals of the organization.
You will work with Business, Product, and Data Engineering teams for continuous improvement of data accuracy through feedback and scoping on instrumentation quality and completeness.
Set high standards in project management; own scope and timelines for the team.

Mandatory Qualifications
Bachelor's/Master's degree in Engineering, Economics, Finance, Mathematics, Statistics, Business Administration, or a related quantitative field.
3+ years of high-quality hands-on experience in analytics and data science.
Hands-on experience in SQL, Python, and Tableau.
Define the business and product metrics to be evaluated, work with engineering on data instrumentation, and create and automate self-serve dashboards to present to relevant stakeholders leveraging tools such as Tableau.
Ability to structure and analyze data leveraging techniques like EDA, cohort analysis, and funnel analysis, transform them into understandable and actionable recommendations, and then communicate them effectively across the organization.
Hands-on experience working with large-scale structured, semi-structured, and unstructured data, and various approaches to preprocess/cleanse data and reduce dimensionality.
Work experience in consumer-tech organizations would be a plus.
A clear understanding of the qualitative and quantitative aspects of the product/strategic initiative, leveraged to identify and act upon existing gaps and opportunities.
Hands-on experience with A/B testing, significance testing, supervised and unsupervised ML, web analytics, and statistical learning.

Razorpay believes in and follows an equal employment opportunity policy that doesn't discriminate on gender, religion, sexual orientation, colour, nationality, age, etc. We welcome interests and applications from all groups and communities across the globe.

Follow us on LinkedIn & Twitter
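As a small illustration of the A/B and significance testing mentioned in the qualifications, here is a two-proportion z-test with statsmodels; the conversion counts below are invented.

```python
# Hedged sketch: did the variant convert better than control? Numbers are made up.
from statsmodels.stats.proportion import proportions_ztest

conversions = [420, 465]    # successes observed in control and variant
exposures = [10000, 10000]  # users shown each experience

z_stat, p_value = proportions_ztest(count=conversions, nobs=exposures)
print(f"z = {z_stat:.2f}, p = {p_value:.4f}")  # p below the chosen alpha suggests a real lift
```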
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Work Level: Individual
Core: Communication Skills, Problem Solving, Execution
Leadership: Decisive, Team Alignment, Working Independently
Role:
Industry Type: IT Services & Consulting
Function: Data Analyst
Key Skills: Data Analytics, Data Analysis, Python, R, MySQL, Cloud, AWS, Big Data, Big Data Platforms, Business Intelligence (BI), Tableau, Data Science, Statistical Modeling
Education: Graduate

Note: This is a requirement for one of the Workassist Hiring Partners.

Responsibilities:
This is a Remote Position.
Collect, clean, and preprocess data from various sources.
Perform exploratory data analysis (EDA) to identify trends and patterns.
Develop dashboards and reports using tools like Excel, Power BI, or Tableau.
Use SQL to query and manipulate large datasets.
Assist in building predictive models and performing statistical analyses.
Present insights and recommendations based on data findings.
Collaborate with cross-functional teams to support data-driven decision-making.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of 10,000+ recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers.

For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2
(Note: There are many more opportunities apart from this on the portal. Depending on your skills, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
Posted 1 month ago
0 years
0 Lacs
India
Remote
Machine Learning Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
Application Deadline: 27th May 2025

About WebBoost Solutions by UM
WebBoost Solutions by UM provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science.

Role Overview
As a Machine Learning Intern, you'll work on real-world projects, gaining practical experience in machine learning and data analysis.

Responsibilities
✅ Design, test, and optimize machine learning models.
✅ Analyze and preprocess datasets.
✅ Develop algorithms and predictive models for various applications.
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn.
✅ Document findings and create reports to present insights.

Requirements
🎓 Enrolled in or graduate of a relevant program (AI, ML, Data Science, Computer Science, or related field).
📊 Knowledge of machine learning concepts and algorithms.
🐍 Proficiency in Python or R (preferred).
🤝 Strong analytical and teamwork skills.

Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
✔ Practical machine learning experience.
✔ Internship Certificate & Letter of Recommendation.
✔ Build your portfolio with real-world projects.

How to Apply
📩 Submit your application by 27th May 2025 with the subject: "Machine Learning Intern Application".

Equal Opportunity
WebBoost Solutions by UM is an equal opportunity employer, welcoming candidates from all backgrounds.
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Machine Learning Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with Certificate of Internship
Application Deadline: 27th May 2025

About Unified Mentor
Unified Mentor provides students and graduates with hands-on learning opportunities and career growth in Machine Learning and Data Science.

Role Overview
As a Machine Learning Intern, you will work on real-world projects, enhancing your practical skills in data analysis and model development.

Responsibilities
✅ Design, test, and optimize machine learning models
✅ Analyze and preprocess datasets
✅ Develop algorithms and predictive models
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn
✅ Document findings and create reports

Requirements
🎓 Enrolled in or a graduate of a relevant program (Computer Science, AI, Data Science, or related field)
🧠 Knowledge of machine learning concepts and algorithms
💻 Proficiency in Python or R (preferred)
🤝 Strong analytical and teamwork skills

Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
✔ Hands-on machine learning experience
✔ Internship Certificate & Letter of Recommendation
✔ Real-world project contributions for your portfolio

Equal Opportunity
Unified Mentor is an equal-opportunity employer, welcoming candidates from all backgrounds.
Posted 1 month ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Application Deadline: 27th May 2025
Opportunity: Full-time role based on performance + Internship Certificate

About Unified Mentor
Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers.

Responsibilities
Collect, preprocess, and analyze large datasets
Develop predictive models and machine learning algorithms
Perform exploratory data analysis (EDA) to extract insights
Create data visualizations and dashboards for effective communication
Collaborate with cross-functional teams to deliver data-driven solutions

Requirements
Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field
Proficiency in Python or R for data analysis and modeling
Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred)
Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib
Strong analytical and problem-solving skills
Excellent communication and teamwork abilities

Stipend & Benefits
Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
Hands-on experience in data science projects
Certificate of Internship & Letter of Recommendation
Opportunity to build a strong portfolio of data science models and applications
Potential for full-time employment based on performance

How to Apply
Submit your resume and a cover letter with the subject line "Data Science Intern Application."

Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.
Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.
Requirements
🎓 Enrolled in or a graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python or R for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.
Stipend & Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.
How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 27th May 2025
Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
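To make the "data visualizations and dashboards" responsibility concrete, here is a small, hedged sketch that summarises a synthetic sales table in two Matplotlib panels. The data and metrics are placeholders rather than anything specific to this internship.

```python
# Two-panel summary figure over synthetic data: a trend line and a category breakdown.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

rng = np.random.default_rng(1)
months = pd.date_range("2024-01-01", periods=12, freq="MS")
sales = pd.Series(rng.normal(100, 15, 12).cumsum() + 500, index=months)
by_region = pd.Series(rng.integers(50, 200, 4), index=["North", "South", "East", "West"])

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(10, 4))
sales.plot(ax=ax1, marker="o", title="Monthly sales trend")
by_region.plot.bar(ax=ax2, title="Sales by region")
ax1.set_ylabel("Units")
fig.tight_layout()
plt.show()
```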
Posted 1 month ago
0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
We are urgently looking for a highly skilled Python Developer with hands-on experience in Machine Learning and Computer Vision to join our dynamic team. In this role, you will be responsible for designing, developing, and deploying intelligent vision systems and ML-based solutions to solve real-world problems.
🔴 This is an immediate requirement – applications will be reviewed on a rolling basis. If you're ready to make an impact, apply now!
Key Responsibilities:
Develop and maintain Python-based ML/CV pipelines and applications.
Design and implement computer vision algorithms for tasks like object detection, tracking, image classification, OCR, etc.
Train, evaluate, and optimize machine learning models using frameworks like TensorFlow, PyTorch, or scikit-learn.
Preprocess and annotate image and video datasets for training and testing models.
Deploy models into production environments using tools like Docker, FastAPI, or Flask.
Collaborate in a cross-functional team to integrate ML/CV solutions into products.
Stay updated with the latest advancements in AI, ML, and CV research and practices.
Required Skills & Qualifications:
Strong proficiency in Python and related libraries (NumPy, OpenCV, Pandas, etc.).
Solid understanding of Machine Learning concepts and experience with ML frameworks (e.g., TensorFlow, PyTorch, scikit-learn).
Practical experience in Computer Vision, including techniques such as image segmentation, feature extraction, object detection, and recognition.
Experience with REST APIs, version control (Git), and containerization (Docker).
Familiarity with cloud platforms (AWS, GCP, Azure) is a plus.
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
What We Offer:
Competitive salary and performance-based incentives.
Flexible work hours.
Opportunity to work on cutting-edge AI and vision projects.
A collaborative and growth-oriented team environment.
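As a taste of the classical end of the computer vision work listed above, the snippet below localises high-contrast regions with OpenCV contours and draws bounding boxes. The image path is a placeholder, and a production detector in this role would more likely be a trained TensorFlow or PyTorch model; this is only a hedged illustration of a preprocess-then-detect flow.

```python
# Classical contour-based localisation with OpenCV; "input.jpg" is a placeholder path.
import cv2

image = cv2.imread("input.jpg")
if image is None:
    raise FileNotFoundError("input.jpg not found")

gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)
blurred = cv2.GaussianBlur(gray, (5, 5), 0)
_, mask = cv2.threshold(blurred, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
for c in contours:
    if cv2.contourArea(c) > 500:          # ignore small noise regions
        x, y, w, h = cv2.boundingRect(c)
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

cv2.imwrite("output.jpg", image)          # annotated copy with bounding boxes
```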
Posted 1 month ago
0 years
0 Lacs
India
Remote
CryptoChakra is a leading cryptocurrency analytics and education platform committed to decoding the complexities of digital asset markets for traders, investors, and institutions. By merging advanced machine learning frameworks, real-time blockchain intelligence, and immersive educational resources, we empower users to navigate market volatility with precision. Our platform leverages Python, TensorFlow, and AWS-powered infrastructure to deliver AI-driven price forecasts, risk assessment tools, and interactive tutorials that transform raw data into actionable strategies. As a remote-first innovator, we prioritize transparency, scalability, and inclusivity to redefine accessibility in decentralized finance.
Position: Data Science Intern
Remote | Full-Time Internship | Compensation: Paid/Unpaid based on suitability
Role Summary
Join CryptoChakra’s data science team to refine predictive models, analyze blockchain trends, and contribute to tools used by thousands globally. This role offers hands-on experience in machine learning, sentiment analysis, and DeFi analytics, with mentorship from industry experts.
Key Responsibilities
Predictive Modeling: Develop and optimize ML algorithms (LSTM, Random Forest) for cryptocurrency price forecasting using historical and real-time blockchain data.
Sentiment Analysis: Scrape and analyze social media (Twitter, Reddit) and news data to gauge market sentiment with NLP techniques.
Blockchain Analytics: Decode on-chain metrics (wallet activity, gas fees) from explorers like Etherscan to identify market trends.
Data Pipelines: Clean, preprocess, and structure datasets from exchanges (Binance, CoinGecko) for model training.
Collaboration: Partner with engineers to deploy models into production and with educators to create data-backed tutorials.
Qualifications
Technical Skills
Proficiency in Python/R for data manipulation (Pandas, NumPy) and machine learning (Scikit-learn, TensorFlow).
Strong grasp of statistics (hypothesis testing, regression) and SQL/NoSQL databases.
Familiarity with data visualization tools (Tableau, Plotly) and cloud platforms (AWS, GCP).
Professional Competencies
Analytical rigor to derive insights from unstructured datasets.
Ability to communicate technical findings to non-technical stakeholders.
Self-driven with adaptability to remote collaboration tools (Slack, Zoom).
Preferred (Not Required)
Academic projects involving time-series forecasting, clustering, or NLP.
Exposure to blockchain fundamentals, DeFi protocols, or crypto APIs.
Pursuing or holding a degree in Data Science, Computer Science, or related fields.
What We Offer
Skill Development: Master tools like PyTorch, Spark, and blockchain analytics platforms.
Portfolio Impact: Contribute to models powering CryptoChakra’s predictions, used by 1M+ users.
Flexibility: Remote-first culture with mentorship tailored to your learning pace.
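Since the responsibilities name Random Forests for price forecasting, below is a hedged sketch of lag-feature forecasting on a synthetic price series. Real work in this role would presumably swap the synthetic series for exchange or on-chain data; the lag count and model settings are arbitrary choices for illustration.

```python
# Lag-feature forecasting with a Random Forest on a synthetic, BTC-like price series.
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(7)
prices = pd.Series(30_000 + rng.normal(0, 200, 600).cumsum())  # stand-in price series

# Predict the next price from the previous five observations.
frame = pd.DataFrame({f"lag_{i}": prices.shift(i) for i in range(1, 6)})
frame["target"] = prices
frame = frame.dropna()

split = int(len(frame) * 0.8)             # chronological split, no shuffling
train, test = frame.iloc[:split], frame.iloc[split:]

model = RandomForestRegressor(n_estimators=200, random_state=0)
model.fit(train.drop(columns="target"), train["target"])

preds = model.predict(test.drop(columns="target"))
print("MAE:", mean_absolute_error(test["target"], preds))
```

The chronological (unshuffled) split matters for time series: shuffling would let the model peek at future prices during training.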
Posted 1 month ago
0 years
0 Lacs
India
Remote
Machine Learning Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
Application Deadline: 26th May 2025
About WebBoost Solutions by UM
WebBoost Solutions by UM provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science.
Role Overview
As a Machine Learning Intern, you’ll work on real-world projects, gaining practical experience in machine learning and data analysis.
Responsibilities
✅ Design, test, and optimize machine learning models.
✅ Analyze and preprocess datasets.
✅ Develop algorithms and predictive models for various applications.
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn.
✅ Document findings and create reports to present insights.
Requirements
🎓 Enrolled in or a graduate of a relevant program (AI, ML, Data Science, Computer Science, or a related field).
📊 Knowledge of machine learning concepts and algorithms.
🐍 Proficiency in Python or R (preferred).
🤝 Strong analytical and teamwork skills.
Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
✔ Practical machine learning experience.
✔ Internship Certificate & Letter of Recommendation.
✔ Build your portfolio with real-world projects.
How to Apply
📩 Submit your application by 26th May 2025 with the subject line "Machine Learning Intern Application".
Equal Opportunity
WebBoost Solutions by UM is an equal opportunity employer, welcoming candidates from all backgrounds.
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Data Science Intern
Job Type: Internship (3 to 6 Months)
Location: Remote / Pune, India
Stipend: Unpaid (with opportunity for a full-time offer upon completion)
Work Mode: Remote (with optional in-office collaboration)
About Coreline Solutions
Coreline Solutions is an innovation-led IT services and consulting company helping organizations leverage the power of data and technology. We specialize in custom software development, digital transformation, and building intelligent data solutions. With a culture rooted in learning and growth, we offer opportunities that challenge, empower, and elevate your career.
🌐 Website: www.corelinesolutions.site
📧 Email: hr@corelinesolutions.site
📍 Address: 2nd Floor, TechHub Plaza, Pune, India
About the Role
We are looking for a highly motivated Data Science Intern to join our team. This is an exciting opportunity for students or recent graduates who are eager to apply theoretical knowledge to real-world datasets and gain hands-on experience in data science projects. You’ll work closely with our data science and engineering teams on impactful initiatives involving predictive modeling, data wrangling, and algorithm development.
Key Responsibilities
Assist in designing and building machine learning models.
Collect, clean, and preprocess structured and unstructured datasets.
Perform exploratory data analysis (EDA) to identify patterns and insights.
Support data science projects by implementing algorithms and validating results.
Work on statistical modeling, feature engineering, and model evaluation.
Contribute to the development of automation tools and pipelines.
Qualifications
Pursuing or recently completed a degree in Data Science, Computer Science, Statistics, Engineering, or a related field.
Strong foundation in Python and relevant libraries (NumPy, pandas, scikit-learn, Matplotlib, etc.).
Understanding of statistics, linear regression, classification, clustering, and model evaluation.
Experience with Jupyter Notebooks, Git, and collaborative coding.
Basic knowledge of SQL and database systems.
Strong problem-solving and analytical thinking skills.
Preferred (Nice to Have):
Exposure to deep learning frameworks like TensorFlow or PyTorch.
Knowledge of cloud platforms (AWS, Google Cloud, or Azure).
Experience with real-world datasets or open-source projects.
Understanding of business problem framing and solution deployment.
What You’ll Gain
Exposure to real-time data science problems and solutions.
Mentorship and feedback from experienced data scientists and engineers.
Access to in-house training materials and tools.
Internship Certificate on successful completion.
Letter of Recommendation for exceptional performance.
Strong chance of full-time placement based on performance.
Equal Opportunity Statement
Coreline Solutions is an equal opportunity employer. We are committed to fostering an inclusive workplace where diversity is valued and discrimination of any kind is not tolerated.
Application Instructions
Send your resume and a short cover letter to hr@corelinesolutions.site with the subject line: “Application for Data Science Intern – [Your Full Name]”
💼 Stay connected and follow our LinkedIn page to keep up with more openings and updates from Coreline Solutions.
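For candidates wondering what "collect, clean, and preprocess" looks like in practice, here is a minimal sketch using pandas and scikit-learn. The toy table and column names are assumptions invented for the example, not anything tied to Coreline's projects.

```python
# Hedged preprocessing sketch: impute missing values, scale numeric columns,
# and one-hot encode categoricals. The toy data is made up for illustration.
import numpy as np
import pandas as pd
from sklearn.compose import ColumnTransformer
from sklearn.impute import SimpleImputer
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import OneHotEncoder, StandardScaler

df = pd.DataFrame({
    "age": [25, 31, np.nan, 40, 29],
    "monthly_spend": [120.0, 80.5, 95.0, np.nan, 60.0],
    "plan": ["basic", "pro", "pro", np.nan, "basic"],
    "region": ["north", "south", "south", "east", "north"],
}).drop_duplicates()

numeric = ["age", "monthly_spend"]
categorical = ["plan", "region"]

preprocess = ColumnTransformer([
    ("num", Pipeline([("impute", SimpleImputer(strategy="median")),
                      ("scale", StandardScaler())]), numeric),
    ("cat", Pipeline([("impute", SimpleImputer(strategy="most_frequent")),
                      ("encode", OneHotEncoder(handle_unknown="ignore"))]), categorical),
])

features = preprocess.fit_transform(df)
print("Feature matrix shape:", features.shape)
```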
Posted 1 month ago