0 years
0 Lacs
India
Remote
Data Science Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Application Deadline: 26th May 2025
Opportunity: Full-time role based on performance + Internship Certificate

About Unified Mentor
Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers.

Responsibilities
Collect, preprocess, and analyze large datasets
Develop predictive models and machine learning algorithms
Perform exploratory data analysis (EDA) to extract insights
Create data visualizations and dashboards for effective communication
Collaborate with cross-functional teams to deliver data-driven solutions

Requirements
Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field
Proficiency in Python or R for data analysis and modeling
Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred)
Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib
Strong analytical and problem-solving skills
Excellent communication and teamwork abilities

Stipend & Benefits
Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
Hands-on experience in data science projects
Certificate of Internship & Letter of Recommendation
Opportunity to build a strong portfolio of data science models and applications
Potential for full-time employment based on performance

How to Apply
Submit your resume and a cover letter with the subject line "Data Science Intern Application."

Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.
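To make the preprocessing and EDA responsibilities above concrete, here is a minimal Python sketch of the kind of task an intern in this role might run. It is illustrative only: the file name `sales.csv` and its columns (`order_date`, `revenue`) are hypothetical placeholders, not anything specified by the posting.

```python
import pandas as pd

# Load a hypothetical dataset; the file and column names are placeholders.
df = pd.read_csv("sales.csv", parse_dates=["order_date"])

# Exploratory checks: shape, dtypes, missing values, summary statistics.
print(df.shape)
print(df.dtypes)
print(df.isna().sum())
print(df.describe(include="all"))

# Simple preprocessing: drop duplicate rows and fill missing numeric values
# with the column median before any modeling.
df = df.drop_duplicates()
numeric_cols = df.select_dtypes(include="number").columns
df[numeric_cols] = df[numeric_cols].fillna(df[numeric_cols].median())

# One quick aggregate to surface a trend: revenue per month.
monthly_revenue = df.set_index("order_date")["revenue"].resample("MS").sum()
print(monthly_revenue.head())
```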
Posted 1 month ago
0 years
0 Lacs
India
Remote
Machine Learning Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
Application Deadline: 26th May 2025

About WebBoost Solutions by UM
WebBoost Solutions by UM provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science.

Role Overview
As a Machine Learning Intern, you’ll work on real-world projects, gaining practical experience in machine learning and data analysis.

Responsibilities
✅ Design, test, and optimize machine learning models.
✅ Analyze and preprocess datasets.
✅ Develop algorithms and predictive models for various applications.
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn.
✅ Document findings and create reports to present insights.

Requirements
🎓 Enrolled in or a graduate of a relevant program (AI, ML, Data Science, Computer Science, or a related field).
📊 Knowledge of machine learning concepts and algorithms.
🐍 Proficiency in Python or R (preferred).
🤝 Strong analytical and teamwork skills.

Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
✔ Practical machine learning experience.
✔ Internship Certificate & Letter of Recommendation.
✔ Build your portfolio with real-world projects.

How to Apply
📩 Submit your application by 26th May 2025 with the subject: "Machine Learning Intern Application".

Equal Opportunity
WebBoost Solutions by UM is an equal opportunity employer, welcoming candidates from all backgrounds.
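As a concrete illustration of the "design, test, and optimize machine learning models" responsibility, the following is a minimal scikit-learn sketch on synthetic data. The dataset and hyperparameter grid are assumptions made purely for illustration, not anything prescribed by the role.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GridSearchCV, train_test_split

# Synthetic data stands in for a real project dataset.
X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# "Design, test, and optimize": a small grid search over model hyperparameters.
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 10]},
    cv=5,
)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("test accuracy:", accuracy_score(y_test, search.best_estimator_.predict(X_test)))
```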
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.

Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.

Requirements
🎓 Enrolled in or a graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python or R for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.

Stipend & Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.

How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 26th May 2025

Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
Posted 1 month ago
0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Work Level: Individual
Core: Communication Skills, Problem Solving, Execution
Leadership: Decisive, Team Alignment, Working Independently
Industry Type: IT Services & Consulting
Function: Data Analyst
Key Skills: MySQL, Python, Big Data, Data Science, Data Analytics, Data Analysis, Cloud, AWS, Business Intelligence (BI), Statistical Modeling, R, Big Data Platforms, Tableau
Education: Graduate

Note: This is a requirement for one of Workassist's Hiring Partners.

Responsibilities (this is a remote position):
Collect, clean, and preprocess data from various sources.
Perform exploratory data analysis (EDA) to identify trends and patterns.
Develop dashboards and reports using tools like Excel, Power BI, or Tableau.
Use SQL to query and manipulate large datasets.
Assist in building predictive models and performing statistical analyses.
Present insights and recommendations based on data findings.
Collaborate with cross-functional teams to support data-driven decision-making.

Company Description
Workassist is an online recruitment and employment solution platform based in Lucknow, India. We provide relevant profiles to employers and connect job seekers with the best opportunities across various industries. With a network of over 10,000 recruiters, we help employers recruit talented individuals from sectors such as Banking & Finance, Consulting, Sales & Marketing, HR, IT, Operations, and Legal. We have adapted to the new normal and strive to provide a seamless job search experience for job seekers worldwide. Our goal is to enhance the job-seeking experience by leveraging technology and matching job seekers with the right employers. For a seamless job search experience, visit our website: https://bit.ly/3QBfBU2 (Note: There are many more opportunities apart from this on the portal. Depending on your skills, you can apply for them as well.)

So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
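The responsibilities above lean heavily on SQL for querying and aggregating large datasets. The sketch below shows the general shape of such a query, using Python's built-in sqlite3 module with an in-memory table so it runs anywhere; the table, columns, and values are invented for illustration and do not reflect any hiring partner's actual schema.

```python
import sqlite3

# In-memory SQLite database stands in for the production data source;
# the table and column names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, region TEXT, amount REAL, order_date TEXT);
    INSERT INTO orders VALUES
        (1, 'North', 120.0, '2025-01-05'),
        (2, 'South',  80.0, '2025-01-07'),
        (3, 'North', 200.0, '2025-02-02');
""")

# The kind of aggregation a data analyst would run before building a dashboard:
# monthly revenue and order counts per region.
query = """
    SELECT region,
           strftime('%Y-%m', order_date) AS month,
           SUM(amount) AS revenue,
           COUNT(*) AS orders
    FROM orders
    GROUP BY region, month
    ORDER BY month, region;
"""
for row in conn.execute(query):
    print(row)
conn.close()
```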
Posted 1 month ago
0.0 - 31.0 years
0 - 0 Lacs
Bardez, Goa Region
Remote
DATA ANALYST

A Data Analyst is responsible for collecting, processing, and analyzing data to provide valuable insights and support decision-making within an organization. Their role is crucial in helping businesses make informed choices and optimize their operations. Here are the key responsibilities of a Data Analyst:

Data Collection and Gathering: Collect data from various sources, including databases, spreadsheets, and external data providers. Ensure data is accurate, complete, and properly documented.
Data Cleaning and Preprocessing: Clean, transform, and preprocess data to remove inconsistencies, missing values, and errors. Format data for analysis.
Data Analysis and Exploration: Analyze data using statistical methods, data visualization, and exploratory data analysis (EDA). Identify trends, patterns, and correlations within the data.
Data Modelling and Hypothesis Testing: Build predictive models and perform hypothesis testing to answer specific business questions. Use statistical and machine learning techniques to make data-driven predictions and recommendations.
Reporting and Visualization: Create clear and informative reports, dashboards, and data visualizations to communicate insights to stakeholders. Use tools like Tableau, Power BI, or data visualization libraries in Python or R.
KPI Tracking: Develop and track key performance indicators (KPIs) to measure the success of business initiatives. Monitor KPI trends and provide regular reports.
Data Integration: Integrate data from multiple sources to create a unified dataset for analysis. Ensure data compatibility and consistency.
Data Security and Compliance: Implement data security measures to protect sensitive information. Ensure compliance with data privacy regulations (e.g., GDPR, HIPAA).
Data Interpretation: Interpret data findings in the context of business goals and objectives. Translate technical insights into actionable recommendations.
Data Governance: Establish data governance policies and procedures to maintain data quality and integrity. Document data sources and definitions.
A/B Testing and Experimentation: Design and conduct A/B tests to evaluate the impact of changes in marketing campaigns, product features, or website design. Analyze and interpret the results of experiments.
Collaboration with Stakeholders: Work closely with cross-functional teams, including marketing, product development, finance, and management, to address data-related challenges and opportunities.
Continuous Learning: Stay up-to-date with industry trends, data analysis techniques, and new tools. Seek opportunities for professional development.
Data-Driven Decision Support: Provide insights and recommendations to support strategic decision-making. Help identify areas for improvement and growth.
Documentation: Document data analysis processes, methodologies, and findings for future reference. Ensure knowledge sharing within the organization.
Troubleshooting and Problem-Solving: Investigate data-related issues and anomalies. Propose solutions and improvements.
Project Management: Manage data analysis projects, including setting objectives, timelines, and deliverables. Coordinate with team members to ensure project success.

Data Analysts are critical for transforming raw data into actionable insights that can drive business decisions and improvements.
They should have strong analytical skills, proficiency in data analysis tools and programming languages (e.g., SQL, Python, R), and effective communication skills to convey complex findings to non-technical stakeholders.
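Since the role calls out A/B testing and hypothesis testing, here is a minimal Python sketch of a significance check on two conversion rates. The visitor and conversion counts are made-up numbers, and the chi-squared test on a 2x2 table is just one standard way to run a two-proportion comparison.

```python
import numpy as np
from scipy import stats

# Hypothetical A/B test: conversion counts for a control and a variant page.
conversions = np.array([120, 145])    # converted users per group
visitors = np.array([2400, 2380])     # total users per group

# Chi-squared test on the 2x2 contingency table (converted vs. not converted
# for each group) checks whether the difference in conversion rate is significant.
table = np.array([conversions, visitors - conversions])
chi2, p_value, dof, expected = stats.chi2_contingency(table)

rates = conversions / visitors
print(f"control rate={rates[0]:.3%}, variant rate={rates[1]:.3%}, p-value={p_value:.4f}")
if p_value < 0.05:
    print("Difference is statistically significant at the 5% level.")
else:
    print("No significant difference detected.")
```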
Posted 1 month ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 4 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, Agentic Framework to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 4 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. Familiarity with computer vision techniques for image recognition, object detection, or image generation. 
Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Deep knowledge of classical AI/ML (regression, classification, time series, clustering). Drive DevOps and MLOps practices, covering CI/CD and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
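The role above mentions similarity search over generative-AI outputs stored in vector databases such as Redis. The sketch below shows only the core idea, cosine-similarity retrieval over embedding vectors, using in-memory NumPy arrays with random toy embeddings; it is not Redis-specific, and none of the dimensions or counts come from the posting.

```python
import numpy as np

# Toy embeddings stand in for vectors that would normally live in a vector store.
rng = np.random.default_rng(0)
doc_vectors = rng.normal(size=(1000, 384))   # 1000 documents, 384-dim embeddings
query_vector = rng.normal(size=384)

# Normalize so that a dot product equals cosine similarity.
doc_norm = doc_vectors / np.linalg.norm(doc_vectors, axis=1, keepdims=True)
query_norm = query_vector / np.linalg.norm(query_vector)

# Score every document against the query and keep the 5 closest matches.
scores = doc_norm @ query_norm
top_k = np.argsort(scores)[::-1][:5]
print("top matches:", top_k, "scores:", np.round(scores[top_k], 3))
```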
Posted 1 month ago
3.0 - 6.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Who We Are
The next step of your career starts here, where you can bring your own unique mix of skills and perspectives to a fast-growing team. Metyis is a global and forward-thinking firm operating across a wide range of industries, developing and delivering AI & Data, Digital Commerce, Marketing & Design solutions and Advisory services. At Metyis, our long-term partnership model brings long-lasting impact and growth to our business partners and clients through extensive execution capabilities. With our team, you can experience a collaborative environment with highly skilled multidisciplinary experts, where everyone has room to build bigger and bolder ideas. Being part of Metyis means you can speak your mind and be creative with your knowledge. Imagine the things you can achieve with a team that encourages you to be the best version of yourself. We are Metyis. Partners for Impact.

What We Offer
Interact with C-level at our clients on a regular basis to drive their business towards impactful change.
Lead your team in creating new business solutions.
Seize opportunities at the client and at Metyis in our entrepreneurial environment.
Become part of a fast-growing, international, and diverse team.

What You Will Do
Execute data science projects from start to end.
Understand client business problems, define analytical approaches, and develop actionable solutions.
Engage directly with stakeholders to gather requirements, present findings, and guide data-driven decisions.
Preprocess and analyze structured and unstructured data using statistical approaches.
Build and deploy predictive models, forecasting solutions, and recommendation systems.
Collaborate with engineering, product, and business teams to translate insights into outcomes.
Communicate results clearly through presentations and storytelling.

What You’ll Bring
Graduate degree or higher with courses in programming, econometrics / data science.
Experience: 3-6 years of professional work experience in the advanced analytics domain, using statistical modeling and deep learning for business problem solutions.
Well-developed logical reasoning, critical thinking, and problem-solving abilities.
Excellent presentation skills and storytelling capabilities.
Self-driven with a collaborative attitude and a passion for delivering business value through data.
Strong hands-on experience in Python/R and SQL.
Good understanding of and experience with cloud platforms such as Azure, AWS, or GCP.
Experience with data visualization tools in Python such as Seaborn and Plotly.
Good understanding of Git concepts.
Good experience with data manipulation tools in Python such as Pandas and NumPy.
Must have worked with scikit-learn, NLTK, spaCy, and transformers.
Strong foundation in machine learning algorithms, predictive modeling, and statistical analysis.
Good understanding of deep learning concepts, especially in NLP and Computer Vision applications.
Proficiency in time-series forecasting and business analytics for functions like marketing, sales, operations, and CRM.
Exposure to tools like MLflow, model deployment, API integration, and CI/CD pipelines.
Good to have: Generative AI experience with text and image data; familiarity with LLM frameworks such as LangChain and hubs like Hugging Face; exposure to vector databases (e.g., FAISS, Pinecone, Weaviate) for semantic search or retrieval-augmented generation (RAG).

In a changing world, diversity and inclusion are core values for team well-being and performance.
At Metyis, we want to welcome and retain all talents, regardless of gender, age, origin or sexual orientation, and irrespective of whether or not they are living with a disability, as each of them has their own experience and identity.
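Given the emphasis above on time-series forecasting and scikit-learn, here is a bare-bones lag-feature regression baseline on a synthetic monthly series. Everything about the data (36 months, a linear trend plus noise) is invented purely to make the example runnable and is not drawn from any Metyis project.

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression

# Synthetic monthly series stands in for real client data: trend plus noise.
idx = pd.date_range("2021-01-01", periods=36, freq="MS")
rng = np.random.default_rng(1)
sales = pd.Series(100 + 2 * np.arange(36) + rng.normal(0, 5, 36), index=idx, name="y")

# Lag features turn forecasting into an ordinary regression problem.
df = sales.to_frame()
for lag in (1, 2, 3):
    df[f"lag_{lag}"] = df["y"].shift(lag)
df = df.dropna()

model = LinearRegression().fit(df[["lag_1", "lag_2", "lag_3"]], df["y"])

# One-step-ahead forecast: the newest observations become next month's lags.
last = df.iloc[-1]
next_x = pd.DataFrame([[last["y"], last["lag_1"], last["lag_2"]]],
                      columns=["lag_1", "lag_2", "lag_3"])
print(f"forecast for next month: {model.predict(next_x)[0]:.1f}")
```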
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Machine Learning Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with Certificate of Internship
Application Deadline: 25th May 2025

About Unified Mentor
Unified Mentor provides students and graduates with hands-on learning opportunities and career growth in Machine Learning and Data Science.

Role Overview
As a Machine Learning Intern, you will work on real-world projects, enhancing your practical skills in data analysis and model development.

Responsibilities
✅ Design, test, and optimize machine learning models
✅ Analyze and preprocess datasets
✅ Develop algorithms and predictive models
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn
✅ Document findings and create reports

Requirements
🎓 Enrolled in or a graduate of a relevant program (Computer Science, AI, Data Science, or related field)
🧠 Knowledge of machine learning concepts and algorithms
💻 Proficiency in Python or R (preferred)
🤝 Strong analytical and teamwork skills

Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
✔ Hands-on machine learning experience
✔ Internship Certificate & Letter of Recommendation
✔ Real-world project contributions for your portfolio

Equal Opportunity
Unified Mentor is an equal-opportunity employer, welcoming candidates from all backgrounds.
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship

About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.

Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.

Requirements
🎓 Enrolled in or a graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python or R for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.

Stipend & Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.

How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 25th May 2025

Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
Posted 1 month ago
3.0 - 6.0 years
0 Lacs
Gurgaon, Haryana, India
On-site
Who We Are
The next step of your career starts here, where you can bring your own unique mix of skills and perspectives to a fast-growing team. Metyis is a global and forward-thinking firm operating across a wide range of industries, developing and delivering AI & Data, Digital Commerce, Marketing & Design solutions and Advisory services. At Metyis, our long-term partnership model brings long-lasting impact and growth to our business partners and clients through extensive execution capabilities. With our team, you can experience a collaborative environment with highly skilled multidisciplinary experts, where everyone has room to build bigger and bolder ideas. Being part of Metyis means you can speak your mind and be creative with your knowledge. Imagine the things you can achieve with a team that encourages you to be the best version of yourself. We are Metyis. Partners for Impact.

What We Offer
Interact with C-level at our clients on a regular basis to drive their business towards impactful change.
Lead your team in creating new business solutions.
Seize opportunities at the client and at Metyis in our entrepreneurial environment.
Become part of a fast-growing, international, and diverse team.

What You Will Do
Execute data science projects from start to end.
Understand client business problems, define analytical approaches, and develop actionable solutions.
Engage directly with stakeholders to gather requirements, present findings, and guide data-driven decisions.
Preprocess and analyze structured and unstructured data using statistical approaches.
Build and deploy predictive models, forecasting solutions, and recommendation systems.
Collaborate with engineering, product, and business teams to translate insights into outcomes.
Communicate results clearly through presentations and storytelling.

What You’ll Bring
Graduate degree or higher with courses in programming, econometrics / data science.
Experience: 3-6 years of professional work experience in the advanced analytics domain, using statistical modeling and deep learning for business problem solutions.
Well-developed logical reasoning, critical thinking, and problem-solving abilities.
Excellent presentation skills and storytelling capabilities.
Self-driven with a collaborative attitude and a passion for delivering business value through data.
Strong hands-on experience in Python/R and SQL.
Good understanding of and experience with cloud platforms such as Azure, AWS, or GCP.
Experience with data visualization tools in Python such as Seaborn and Plotly.
Good understanding of Git concepts.
Good experience with data manipulation tools in Python such as Pandas and NumPy.
Must have worked with scikit-learn, NLTK, spaCy, and transformers.
Strong foundation in machine learning algorithms, predictive modeling, and statistical analysis.
Good understanding of deep learning concepts, especially in NLP and Computer Vision applications.
Proficiency in time-series forecasting and business analytics for functions like marketing, sales, operations, and CRM.
Exposure to tools like MLflow, model deployment, API integration, and CI/CD pipelines.
Good to have: Generative AI experience with text and image data; familiarity with LLM frameworks such as LangChain and hubs like Hugging Face; exposure to vector databases (e.g., FAISS, Pinecone, Weaviate) for semantic search or retrieval-augmented generation (RAG).

In a changing world, diversity and inclusion are core values for team well-being and performance.
At Metyis, we want to welcome and retain all talents, regardless of gender, age, origin or sexual orientation, and irrespective of whether or not they are living with a disability, as each of them has their own experience and identity.
Posted 1 month ago
2.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Title: L3 SDE (AI/ML)
Location: Arjan Garh, MG Road (Delhi)
Job Type: Full-Time, On-site
Salary: ₹30,000 - ₹70,000
No. of Openings: 10

***IMMEDIATE JOINERS REQUIRED***

About Us
Timble is a forward-thinking organization dedicated to leveraging cutting-edge technology to solve real-world problems. Our mission is to drive innovation and create impactful solutions through artificial intelligence and machine learning.

Key Responsibilities:
• Model Development: Design, develop, and deploy machine learning models and AI solutions to address complex business problems.
• Data Analysis: Work with large datasets to clean, preprocess, and analyze data, ensuring high data quality and relevance.
• Algorithm Optimization: Implement and refine algorithms to improve the performance and accuracy of AI models.
• Collaboration: Collaborate with cross-functional teams, including data scientists, software engineers, and product managers, to integrate AI solutions into existing systems and workflows.
• Research: Stay up-to-date with the latest advancements in AI and machine learning, and apply new techniques and methodologies to enhance our AI capabilities.
• Performance Monitoring: Monitor the performance of AI systems, troubleshoot issues, and make necessary adjustments to maintain optimal functionality.
• Documentation: Create and maintain comprehensive documentation for AI models, including design specifications, training processes, and deployment strategies.

Required Qualifications and Experience:
• Education: B.Tech, M.Tech, MCA, or a Bachelor's or Master's degree in Artificial Intelligence, Computer Science, Data Analytics, or a related field.
• Experience: 2 years of AI/ML development, with a proven track record of deploying AI solutions in a production environment.

**IMMEDIATE JOINERS REQUIRED**

Technical Skills Requirements:
Proficiency in Python, machine learning, and deep learning.
Experience with AI/ML frameworks and libraries (e.g., TensorFlow, PyTorch, Scikit-learn, Keras).
Strong knowledge of machine learning algorithms, neural networks, and natural language processing (NLP).
Experience in integrating AI-based solutions with web frameworks like Django and Flask.
Problem-Solving: Excellent analytical and problem-solving skills, with the ability to work independently and as part of a team.
Communication: Strong verbal and written communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.
Project Management: Ability to manage multiple projects simultaneously and meet deadlines in a fast-paced environment.

How to Apply:
Drop your resume at hr@timbletech.com with your current notice period, current CTC, and expected CTC.
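The posting asks for experience integrating AI solutions with web frameworks like Django or Flask. Below is a minimal, generic Flask sketch of serving a trained model behind a JSON endpoint; the model path `model.joblib` and the flat numeric feature payload are assumptions made for illustration, not a description of Timble's actual stack.

```python
import joblib
import numpy as np
from flask import Flask, jsonify, request

app = Flask(__name__)

# Load a previously trained scikit-learn model; the path is a placeholder.
model = joblib.load("model.joblib")

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body like {"features": [1.2, 3.4, 5.6]}.
    payload = request.get_json(force=True)
    features = np.array(payload["features"], dtype=float).reshape(1, -1)
    prediction = model.predict(features)
    return jsonify({"prediction": prediction.tolist()})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```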
Posted 1 month ago
0 years
0 Lacs
India
Remote
Machine Learning Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
Application Deadline: 25th May 2025

About WebBoost Solutions by UM
WebBoost Solutions by UM provides students and graduates with hands-on learning and career growth opportunities in machine learning and data science.

Role Overview
As a Machine Learning Intern, you’ll work on real-world projects, gaining practical experience in machine learning and data analysis.

Responsibilities
✅ Design, test, and optimize machine learning models.
✅ Analyze and preprocess datasets.
✅ Develop algorithms and predictive models for various applications.
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn.
✅ Document findings and create reports to present insights.

Requirements
🎓 Enrolled in or a graduate of a relevant program (AI, ML, Data Science, Computer Science, or a related field).
📊 Knowledge of machine learning concepts and algorithms.
🐍 Proficiency in Python or R (preferred).
🤝 Strong analytical and teamwork skills.

Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
✔ Practical machine learning experience.
✔ Internship Certificate & Letter of Recommendation.
✔ Build your portfolio with real-world projects.

How to Apply
📩 Submit your application by 25th May 2025 with the subject: "Machine Learning Intern Application".

Equal Opportunity
WebBoost Solutions by UM is an equal opportunity employer, welcoming candidates from all backgrounds.
Posted 1 month ago
0 years
0 Lacs
India
Remote
Data Science Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Application Deadline: 25th May 2025
Opportunity: Full-time role based on performance + Internship Certificate

About Unified Mentor
Unified Mentor provides aspiring professionals with hands-on experience in data science through industry-relevant projects, helping them build successful careers.

Responsibilities
Collect, preprocess, and analyze large datasets
Develop predictive models and machine learning algorithms
Perform exploratory data analysis (EDA) to extract insights
Create data visualizations and dashboards for effective communication
Collaborate with cross-functional teams to deliver data-driven solutions

Requirements
Enrolled in or a graduate of Data Science, Computer Science, Statistics, or a related field
Proficiency in Python or R for data analysis and modeling
Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred)
Familiarity with data visualization tools like Tableau, Power BI, or Matplotlib
Strong analytical and problem-solving skills
Excellent communication and teamwork abilities

Stipend & Benefits
Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
Hands-on experience in data science projects
Certificate of Internship & Letter of Recommendation
Opportunity to build a strong portfolio of data science models and applications
Potential for full-time employment based on performance

How to Apply
Submit your resume and a cover letter with the subject line "Data Science Intern Application."

Equal Opportunity: Unified Mentor welcomes applicants from all backgrounds.
Posted 1 month ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company Description
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population.

Job Description
Responsibilities:
Develop and maintain Power BI dashboards and reports.
Utilize Power Query for data transformation and manipulation.
Write and optimize SQL queries for data extraction and analysis.
Perform statistical analysis using R.
Develop and implement data analysis scripts in Python.
Collaborate with cross-functional teams to understand data needs and deliver solutions.
Present findings and recommendations to stakeholders.
This role requires a strong analytical mindset, proficiency in BI tools, and the ability to translate complex data into actionable insights.
Produce comprehensive Discover reports for our clients worldwide, which includes gathering and analysing data, ensuring accuracy and relevance, and presenting findings in a clear and actionable format.
Collaborate closely with various teams to understand client needs and deliver Discover reports that drive business decisions.
Collaborate with cross-functional teams to understand business requirements and translate them into specifications.
Ensure data accuracy, integrity, and consistency across all BI solutions.
Stay updated with the latest BI technologies and industry trends to continuously improve BI processes.

Key Duties:
Collect, clean, and preprocess data from various sources.
Design and implement data models to support reporting and analytics.
Execute, monitor, and continuously improve assigned production tasks, including maintenance and data quality checks.
Identify trends, patterns, and anomalies in data sets.
Provide technical support and training to team members on data tools and techniques.
Stay updated with the latest industry trends and best practices in data analysis.
Demonstrate proficiency in VBA (for creating custom Excel dashboards for real-time reporting) and Power BI, as these skills are highly advantageous.
Understand the regular execution process with thorough attention to detail and identify opportunities for automation and improvement.
Maintain the high quality of setups across various markets reported by NielsenIQ and analyse any potential client data concerns.
Engage frequently in cross-departmental collaboration as a crucial link in the chain of NielsenIQ activities.
Execute production tasks to ensure data accuracy and trend analysis within scheduled deadlines.
Investigate data inquiries and challenges in collaboration with local, regional, and offshore teams.
Prepare accurate tracking KPIs to monitor and improve quality performance promptly.

Qualifications
Minimum 3+ years of industry experience.
Advanced proficiency in Power BI, Python, and SQL.
Intermediate to advanced proficiency in Tableau.
Strong SQL and Python skills for data querying and manipulation.
Experience with R for statistical analysis.
Excellent analytical and problem-solving skills.
Ability to communicate complex data insights effectively.
Strong attention to detail and organizational skills.
Show interest in the Market Research domain and possess knowledge in collating, cleansing, analysing, interpreting, and visualizing large volumes of data.
Demonstrate good communication skills.
Be enthusiastic about learning and growing within the function.
Be capable of learning upstream and downstream processes to ensure efficiency and quality delivery in the current role.
Be flexible with shift timings, including night shifts.

Additional Information
Our Benefits
Flexible working environment
Volunteer time off
LinkedIn Learning
Employee-Assistance-Program (EAP)

About NIQ
NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com

Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook

Our commitment to Diversity, Equity, and Inclusion
NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action-Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
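To ground the KPI-tracking and data-quality duties listed above, here is a small pandas sketch that flags missing values and computes a period-over-period growth KPI. The market codes, periods, and sales figures are fabricated placeholders rather than any NielsenIQ data.

```python
import pandas as pd

# Hypothetical retail extract; the columns are placeholders for the real feed.
df = pd.DataFrame({
    "market": ["IN", "IN", "MY", "MY", "MY"],
    "period": ["2025-03", "2025-04", "2025-03", "2025-04", "2025-04"],
    "sales":  [1200.0, 1350.0, 640.0, None, 705.0],
})

# Basic data-quality checks of the kind run before a report goes out.
assert df["market"].notna().all(), "missing market codes"
print("rows with missing sales:", df["sales"].isna().sum())

# A simple KPI: period-over-period sales growth per market.
kpi = (
    df.dropna(subset=["sales"])
      .groupby(["market", "period"], as_index=False)["sales"].sum()
      .sort_values(["market", "period"])
)
kpi["growth_pct"] = kpi.groupby("market")["sales"].pct_change() * 100
print(kpi)
```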
Posted 1 month ago
0 years
0 Lacs
India
Remote
CryptoChakra is a leading cryptocurrency analytics and education platform committed to demystifying digital asset markets for traders, investors, and enthusiasts worldwide. By integrating cutting-edge AI-driven predictions, blockchain analytics, and immersive learning modules, we empower users to navigate market volatility with confidence. Our platform combines advanced tools like Python, TensorFlow, and AWS to deliver actionable insights, risk assessments, and educational content that bridge the gap between complex data and strategic decision-making. As a remote-first innovator, we champion accessibility in decentralized finance, fostering a future where crypto literacy is universal.

Position: Fresher Data Scientist Intern
Remote | Full-Time Internship | Compensation: Paid/Unpaid based on suitability

Role Summary
Join CryptoChakra’s data science team to gain hands-on experience in transforming raw blockchain data into impactful insights. This role is tailored for recent graduates or students eager to apply foundational skills in machine learning, statistical analysis, and data storytelling to real-world crypto challenges.

Key Responsibilities
Data Processing: Clean and preprocess blockchain datasets from sources like Etherscan or CoinGecko using Python/R.
Predictive Modeling: Assist in building and testing ML models for price forecasting or DeFi trend analysis.
Insight Generation: Create visualizations (Tableau, Matplotlib) to simplify complex trends for educational content.
Collaboration: Work with engineers and educators to refine analytics tools and tutorials.
Documentation: Maintain clear records of methodologies and findings for team reviews.

Who We’re Looking For
Technical Skills
Foundational knowledge of Python/R for data manipulation (Pandas, NumPy).
Basic understanding of statistics (regression, hypothesis testing).
Familiarity with data visualization tools (Tableau, Power BI) or libraries (Seaborn).
Curiosity about blockchain technology, DeFi, or crypto markets.
Soft Skills
Eagerness to learn and adapt in a fast-paced remote environment.
Strong problem-solving mindset and attention to detail.
Ability to communicate technical concepts clearly.
Preferred (Not Required)
Academic projects involving data analysis or machine learning.
Exposure to SQL, AWS, or big data tools.
Pursuing a degree in Data Science, Computer Science, Statistics, or related fields.

What We Offer
Mentorship: Guidance from experienced data scientists and blockchain experts.
Skill Development: Training in real-world tools like TensorFlow and Tableau.
Portfolio Projects: Contribute to live projects featured on CryptoChakra’s platform.
Flexibility: Remote work with adaptable hours for students.
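As an illustration of the price-data preprocessing this internship describes, the sketch below cleans a daily price series and derives two simple features often used for forecasting. The file `btc_daily.csv` and its `date`/`close` columns are hypothetical stand-ins; in practice the data would come from a source such as CoinGecko or Etherscan, as the posting notes.

```python
import pandas as pd

# Hypothetical export of daily closing prices; file and columns are placeholders.
prices = pd.read_csv("btc_daily.csv", parse_dates=["date"]).set_index("date")["close"]

# Basic cleaning: sort by date, drop duplicated timestamps, forward-fill gaps.
prices = prices.sort_index()
prices = prices[~prices.index.duplicated(keep="last")].ffill()

# Two simple indicators usable as model features:
# a 7-day moving average and daily returns.
features = pd.DataFrame({
    "close": prices,
    "ma_7": prices.rolling(7).mean(),
    "return_1d": prices.pct_change(),
}).dropna()

print(features.tail())
```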
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description : EY GDS – Data and Analytics (D And A) – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 3 - 7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. 
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
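The role above repeatedly references pre-trained models accessed through libraries such as Hugging Face Transformers. As a minimal, generic sketch (not EY's actual tooling), the snippet below runs a default pre-trained sentiment pipeline over two invented example sentences.

```python
from transformers import pipeline

# Load a general-purpose sentiment pipeline; Transformers downloads a default
# pre-trained checkpoint the first time this runs.
classifier = pipeline("sentiment-analysis")

samples = [
    "The quarterly close went smoothly and ahead of schedule.",
    "The data pipeline failed again and blocked the whole reporting run.",
]
for text, result in zip(samples, classifier(samples)):
    print(f"{result['label']:>8}  {result['score']:.3f}  {text}")
```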
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all. Job Description : EY GDS – Data and Analytics (D And A) – Senior – Senior Data Scientist Role Overview: We are seeking a highly skilled and experienced Senior Data Scientist with a minimum of 3 - 7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, Optimization techniques, and AI solution Architecture. In this role, you will play a key role in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role. Responsibilities: Your technical responsibilities: Contribute to the design and implementation of state-of-the-art AI solutions. Assist in the development and implementation of AI models and systems, leveraging techniques such as Language Models (LLMs) and generative AI. Collaborate with stakeholders to identify business opportunities and define AI project goals. Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges. Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases. Integrate with relevant APIs and libraries, such as Azure Open AI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities. Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment. Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs. Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs. Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly. Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency. Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases. Ensure compliance with data privacy, security, and ethical considerations in AI applications. Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications. Requirements: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus. Minimum 3-7 years of experience in Data Science and Machine Learning. In-depth knowledge of machine learning, deep learning, and generative AI techniques. Proficiency in programming languages such as Python, R, and frameworks like TensorFlow or PyTorch. Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models. 
Familiarity with computer vision techniques for image recognition, object detection, or image generation. Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment. Expertise in data engineering, including data curation, cleaning, and preprocessing. Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems. Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models. Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions. Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels. Understanding of data privacy, security, and ethical considerations in AI applications. Track record of driving innovation and staying updated with the latest AI research and advancements. Good to Have Skills: Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems. Utilize optimization tools and techniques, including MIP (Mixed Integer Programming). Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models. Implement CI/CD pipelines for streamlined model deployment and scaling processes. Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines. Apply infrastructure as code (IaC) principles, employing tools like Terraform or CloudFormation. Implement monitoring and logging tools to ensure AI model performance and reliability. Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment. Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models. EY | Building a better working world EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Job Description: EY GDS – Data and Analytics (D And A) – Senior – Senior Data Scientist
Role Overview:
We are seeking a highly skilled and experienced Senior Data Scientist with 3–7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. In this role, you will play a key part in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role.
Responsibilities (your technical responsibilities):
Contribute to the design and implementation of state-of-the-art AI solutions.
Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI.
Collaborate with stakeholders to identify business opportunities and define AI project goals.
Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges.
Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases.
Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment.
Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs.
Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly.
Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency.
Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases.
Ensure compliance with data privacy, security, and ethical considerations in AI applications.
Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus.
Minimum 3–7 years of experience in Data Science and Machine Learning.
In-depth knowledge of machine learning, deep learning, and generative AI techniques.
Proficiency in programming languages such as Python or R, and frameworks like TensorFlow or PyTorch.
Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models.
Familiarity with computer vision techniques for image recognition, object detection, or image generation.
Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment.
Expertise in data engineering, including data curation, cleaning, and preprocessing.
Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems.
Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models.
Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
Understanding of data privacy, security, and ethical considerations in AI applications.
Track record of driving innovation and staying updated with the latest AI research and advancements.
Good to Have Skills:
Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
Utilize optimization tools and techniques, including MIP (Mixed Integer Programming).
Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models.
Implement CI/CD pipelines for streamlined model deployment and scaling processes.
Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Apply infrastructure-as-code (IaC) principles, employing tools like Terraform or CloudFormation.
Implement monitoring and logging tools to ensure AI model performance and reliability.
Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment.
Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
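For readers unfamiliar with the similarity-search work this posting mentions, the sketch below shows one common pattern: encode documents with a pre-trained Hugging Face sentence-transformer and rank them by cosine similarity. It is illustrative only — the model name, sample texts, and in-memory index are assumptions, not EY's implementation; a production system would typically persist the vectors in a vector database such as Redis.

```python
# Minimal, in-memory similarity search sketch (assumes sentence-transformers is installed
# and the "all-MiniLM-L6-v2" checkpoint can be downloaded; both are assumptions).
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")

documents = [
    "Invoice payment terms are net 30 days.",
    "The warranty covers manufacturing defects for two years.",
    "Employees accrue 1.5 vacation days per month.",
]

# Encode once; normalization makes the dot product equal to cosine similarity.
doc_vectors = model.encode(documents, normalize_embeddings=True)

def top_k(query: str, k: int = 2):
    """Return the k documents most similar to the query."""
    q = model.encode([query], normalize_embeddings=True)[0]
    scores = doc_vectors @ q                 # cosine similarities
    best = np.argsort(scores)[::-1][:k]      # indices of the highest scores
    return [(documents[i], float(scores[i])) for i in best]

if __name__ == "__main__":
    for text, score in top_k("How long is the warranty?"):
        print(f"{score:.3f}  {text}")
```

In practice the same ranking step sits inside a RAG pipeline, with the retrieved passages passed to an LLM rather than printed.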
Posted 1 month ago
3.0 - 7.0 years
0 Lacs
Pune, Maharashtra, India
On-site
At EY, you’ll have the chance to build a career as unique as you are, with the global scale, support, inclusive culture and technology to become the best version of you. And we’re counting on your unique voice and perspective to help EY become even better, too. Join us and build an exceptional experience for yourself, and a better working world for all.
Job Description: EY GDS – Data and Analytics (D And A) – Senior – Senior Data Scientist
Role Overview:
We are seeking a highly skilled and experienced Senior Data Scientist with 3–7 years of experience in Data Science and Machine Learning, preferably with experience in NLP, Generative AI, LLMs, MLOps, optimization techniques, and AI solution architecture. In this role, you will play a key part in the development and implementation of AI solutions, leveraging your technical expertise. The ideal candidate should have a deep understanding of AI technologies and experience in designing and implementing cutting-edge AI models and systems. Additionally, expertise in data engineering, DevOps, and MLOps practices will be valuable in this role.
Responsibilities (your technical responsibilities):
Contribute to the design and implementation of state-of-the-art AI solutions.
Assist in the development and implementation of AI models and systems, leveraging techniques such as Large Language Models (LLMs) and generative AI.
Collaborate with stakeholders to identify business opportunities and define AI project goals.
Stay updated with the latest advancements in generative AI techniques, such as LLMs, and evaluate their potential applications in solving enterprise challenges.
Utilize generative AI techniques, such as LLMs, to develop innovative solutions for enterprise industry use cases.
Integrate with relevant APIs and libraries, such as Azure OpenAI GPT models and Hugging Face Transformers, to leverage pre-trained models and enhance generative AI capabilities.
Implement and optimize end-to-end pipelines for generative AI projects, ensuring seamless data processing and model deployment.
Utilize vector databases, such as Redis, and NoSQL databases to efficiently handle large-scale generative AI datasets and outputs.
Implement similarity search algorithms and techniques to enable efficient and accurate retrieval of relevant information from generative AI outputs.
Collaborate with domain experts, stakeholders, and clients to understand specific business requirements and tailor generative AI solutions accordingly.
Conduct research and evaluation of advanced AI techniques, including transfer learning, domain adaptation, and model compression, to enhance performance and efficiency.
Establish evaluation metrics and methodologies to assess the quality, coherence, and relevance of generative AI outputs for enterprise industry use cases.
Ensure compliance with data privacy, security, and ethical considerations in AI applications.
Leverage data engineering skills to curate, clean, and preprocess large-scale datasets for generative AI applications.
Requirements:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field. A Ph.D. is a plus.
Minimum 3–7 years of experience in Data Science and Machine Learning.
In-depth knowledge of machine learning, deep learning, and generative AI techniques.
Proficiency in programming languages such as Python or R, and frameworks like TensorFlow or PyTorch.
Strong understanding of NLP techniques and frameworks such as BERT, GPT, or Transformer models.
Familiarity with computer vision techniques for image recognition, object detection, or image generation.
Experience with cloud platforms such as Azure, AWS, or GCP and deploying AI solutions in a cloud environment.
Expertise in data engineering, including data curation, cleaning, and preprocessing.
Knowledge of trusted AI practices, ensuring fairness, transparency, and accountability in AI models and systems.
Strong collaboration with software engineering and operations teams to ensure seamless integration and deployment of AI models.
Excellent problem-solving and analytical skills, with the ability to translate business requirements into technical solutions.
Strong communication and interpersonal skills, with the ability to collaborate effectively with stakeholders at various levels.
Understanding of data privacy, security, and ethical considerations in AI applications.
Track record of driving innovation and staying updated with the latest AI research and advancements.
Good to Have Skills:
Apply trusted AI practices to ensure fairness, transparency, and accountability in AI models and systems.
Utilize optimization tools and techniques, including MIP (Mixed Integer Programming).
Drive DevOps and MLOps practices, covering continuous integration, deployment, and monitoring of AI models.
Implement CI/CD pipelines for streamlined model deployment and scaling processes.
Utilize tools such as Docker, Kubernetes, and Git to build and manage AI pipelines.
Apply infrastructure-as-code (IaC) principles, employing tools like Terraform or CloudFormation.
Implement monitoring and logging tools to ensure AI model performance and reliability.
Collaborate seamlessly with software engineering and operations teams for efficient AI model integration and deployment.
Familiarity with DevOps and MLOps practices, including continuous integration, deployment, and monitoring of AI models.
EY | Building a better working world
EY exists to build a better working world, helping to create long-term value for clients, people and society and build trust in the capital markets. Enabled by data and technology, diverse EY teams in over 150 countries provide trust through assurance and help clients grow, transform and operate. Working across assurance, consulting, law, strategy, tax and transactions, EY teams ask better questions to find new answers for the complex issues facing our world today.
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Machine Learning Intern (Paid)
Company: Unified Mentor
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with Certificate of Internship
Application Deadline: 24th May 2025
About Unified Mentor
Unified Mentor provides students and graduates with hands-on learning opportunities and career growth in Machine Learning and Data Science.
Role Overview
As a Machine Learning Intern, you will work on real-world projects, enhancing your practical skills in data analysis and model development.
Responsibilities
✅ Design, test, and optimize machine learning models
✅ Analyze and preprocess datasets
✅ Develop algorithms and predictive models
✅ Use tools like TensorFlow, PyTorch, and Scikit-learn
✅ Document findings and create reports
Requirements
🎓 Enrolled in or a graduate of a relevant program (Computer Science, AI, Data Science, or related field)
🧠 Knowledge of machine learning concepts and algorithms
💻 Proficiency in Python or R (preferred)
🤝 Strong analytical and teamwork skills
Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based) (Paid)
✔ Hands-on machine learning experience
✔ Internship Certificate & Letter of Recommendation
✔ Real-world project contributions for your portfolio
Equal Opportunity
Unified Mentor is an equal-opportunity employer, welcoming candidates from all backgrounds.
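As context for the "design, test, and optimize machine learning models" responsibility above, here is a minimal scikit-learn sketch of that loop on a bundled toy dataset. It is illustrative only, not part of the posting; the dataset, model, and hyper-parameter grid are assumptions.

```python
# Train, tune, and evaluate a classifier with cross-validated grid search.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import GridSearchCV, train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

# "Optimize" here means a small cross-validated hyper-parameter search.
search = GridSearchCV(
    RandomForestClassifier(random_state=42),
    param_grid={"n_estimators": [100, 300], "max_depth": [None, 8]},
    cv=5,
)
search.fit(X_train, y_train)

print("best params:", search.best_params_)
print("test accuracy:", accuracy_score(y_test, search.best_estimator_.predict(X_test)))
```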
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Data Science Intern
Job Type: Internship (3 to 6 Months)
Location: Remote / Pune, India
Stipend: Unpaid (with opportunity for a full-time offer upon completion)
Work Mode: Remote (with optional in-office collaboration)
About Coreline Solutions
Coreline Solutions is an innovation-led IT services and consulting company helping organizations leverage the power of data and technology. We specialize in custom software development, digital transformation, and building intelligent data solutions. With a culture rooted in learning and growth, we offer opportunities that challenge, empower, and elevate your career.
🌐 Website: www.corelinesolutions.site
📧 Email: hr@corelinesolutions.site
📍 Address: 2nd Floor, TechHub Plaza, Pune, India
About the Role
We are looking for a highly motivated Data Science Intern to join our team. This is an exciting opportunity for students or recent graduates who are eager to apply theoretical knowledge to real-world datasets and gain hands-on experience in data science projects. You’ll work closely with our data science and engineering teams on impactful initiatives involving predictive modeling, data wrangling, and algorithm development.
Key Responsibilities
Assist in designing and building machine learning models.
Collect, clean, and preprocess structured and unstructured datasets.
Perform exploratory data analysis (EDA) to identify patterns and insights.
Support data science projects by implementing algorithms and validating results.
Work on statistical modeling, feature engineering, and model evaluation.
Contribute to the development of automation tools and pipelines.
Qualifications
Pursuing or recently completed a degree in Data Science, Computer Science, Statistics, Engineering, or a related field.
Strong foundation in Python and relevant libraries (NumPy, pandas, scikit-learn, Matplotlib, etc.).
Understanding of statistics, linear regression, classification, clustering, and model evaluation.
Experience with Jupyter Notebooks, Git, and collaborative coding.
Basic knowledge of SQL and database systems.
Strong problem-solving and analytical thinking skills.
Preferred (Nice to Have):
Exposure to deep learning frameworks like TensorFlow or PyTorch.
Knowledge of cloud platforms (AWS, Google Cloud, or Azure).
Experience with real-world datasets or open-source projects.
Understanding of business problem framing and solution deployment.
What You’ll Gain
Exposure to real-time data science problems and solutions.
Mentorship and feedback from experienced data scientists and engineers.
Access to in-house training materials and tools.
Internship Certificate on successful completion.
Letter of Recommendation for exceptional performance.
Strong chance of a full-time placement based on performance.
Equal Opportunity Statement
Coreline Solutions is an equal opportunity employer. We are committed to fostering an inclusive workplace where diversity is valued and discrimination of any kind is not tolerated.
Application Instructions
Send your resume and a short cover letter to hr@corelinesolutions.site with the subject line: “Application for Data Science Intern – [Your Full Name]”
💼 Stay connected and follow our LinkedIn page to keep up with more openings and updates from Coreline Solutions.
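To illustrate the cleaning, EDA, and feature-engineering responsibilities listed above, here is a small pandas sketch on a made-up table. The column names and imputation choices are hypothetical, not Coreline's schema or process.

```python
# Tiny preprocessing/EDA example: impute missing values, summarize, engineer a feature.
import pandas as pd

df = pd.DataFrame({
    "age": [25, 32, None, 41, 38],
    "city": ["Pune", "Mumbai", "Pune", None, "Delhi"],
    "spend": [1200, 3400, 2900, 800, None],
})

# Basic cleaning: median for numeric gaps, mode for the categorical gap.
df["age"] = df["age"].fillna(df["age"].median())
df["spend"] = df["spend"].fillna(df["spend"].median())
df["city"] = df["city"].fillna(df["city"].mode()[0])

# Quick EDA: summary statistics, then a simple engineered feature.
print(df.describe(include="all"))
df["spend_per_year_of_age"] = df["spend"] / df["age"]

# One-hot encode the categorical column so the frame is model-ready.
model_ready = pd.get_dummies(df, columns=["city"], drop_first=True)
print(model_ready.head())
```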
Posted 1 month ago
0 years
0 Lacs
India
Remote
Job Title: Data Science Intern (Paid)
Company: WebBoost Solutions by UM
Location: Remote
Duration: 3 months
Opportunity: Full-time based on performance, with a Certificate of Internship
About WebBoost Solutions by UM
WebBoost Solutions by UM provides aspiring professionals with hands-on experience in data science, offering real-world projects to develop and refine their analytical and machine learning skills for a successful career.
Responsibilities
✅ Collect, preprocess, and analyze large datasets.
✅ Develop predictive models and machine learning algorithms.
✅ Perform exploratory data analysis (EDA) to extract meaningful insights.
✅ Create data visualizations and dashboards for effective communication of findings.
✅ Collaborate with cross-functional teams to deliver data-driven solutions.
Requirements
🎓 Enrolled in or graduate of a program in Data Science, Computer Science, Statistics, or a related field.
🐍 Proficiency in Python or R for data analysis and modeling.
🧠 Knowledge of machine learning libraries such as scikit-learn, TensorFlow, or PyTorch (preferred).
📊 Familiarity with data visualization tools (Tableau, Power BI, or Matplotlib).
🧐 Strong analytical and problem-solving skills.
🗣 Excellent communication and teamwork abilities.
Stipend & Benefits
💰 Stipend: ₹7,500 - ₹15,000 (Performance-Based).
✔ Hands-on experience in data science projects.
✔ Certificate of Internship & Letter of Recommendation.
✔ Opportunity to build a strong portfolio of data science models and applications.
✔ Potential for full-time employment based on performance.
How to Apply
📩 Submit your resume and a cover letter with the subject line "Data Science Intern Application."
📅 Deadline: 24th May 2025
Equal Opportunity
WebBoost Solutions by UM is committed to fostering an inclusive and diverse environment and encourages applications from all backgrounds.
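As a pointer for the visualization/dashboard responsibility above, the sketch below renders a simple Matplotlib chart from synthetic numbers. It is illustrative only; the data and file name are made up, and a real dashboard tool would embed the figure rather than save it to disk.

```python
# Render a basic line chart off-screen and save it as an image.
import matplotlib
matplotlib.use("Agg")  # off-screen backend so the script also runs without a display
import matplotlib.pyplot as plt

months = ["Jan", "Feb", "Mar", "Apr", "May", "Jun"]
signups = [120, 150, 170, 160, 210, 240]  # synthetic values

fig, ax = plt.subplots(figsize=(6, 3))
ax.plot(months, signups, marker="o")
ax.set_title("Monthly sign-ups (synthetic data)")
ax.set_xlabel("Month")
ax.set_ylabel("Sign-ups")
fig.tight_layout()
fig.savefig("signups.png")
```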
Posted 1 month ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
What we're looking for:
At least 5 years of experience in designing and building AI applications for customers and deploying them into production.
Software engineering experience in building secure, scalable, and performant applications for customers.
Experience with document extraction using AI, conversational AI, vision AI, NLP, or generative AI.
Design, develop, and operationalize existing ML models by fine-tuning and personalizing them.
Evaluate machine learning models and perform necessary tuning.
Develop prompts that instruct LLMs to generate relevant and accurate responses.
Collaborate with data scientists and engineers to analyze and preprocess datasets for prompt development, including data cleaning, transformation, and augmentation.
Conduct thorough analysis to evaluate LLM responses and iteratively modify prompts to improve LLM performance.
Hands-on customer experience with RAG solutions or fine-tuning of LLM models.
Build and deploy scalable machine learning pipelines on GCP or an equivalent cloud platform involving data warehouses, machine learning platforms, dashboards, or CRM tools.
Experience working with the end-to-end steps involving, but not limited to, data cleaning, exploratory data analysis, dealing with outliers, handling imbalances, analyzing data distributions (univariate, bivariate, multivariate), transforming numerical and categorical data into features, feature selection, model selection, model training, and deployment.
Proven experience building and deploying machine learning models in production environments for real-life applications.
Good understanding of natural language processing, computer vision, or other deep learning techniques.
Expertise in Python, NumPy, pandas, and various ML libraries (e.g., XGBoost, TensorFlow, PyTorch, scikit-learn, LangChain).
Familiarity with Google Cloud or any other cloud platform and its machine learning services.
Excellent communication, collaboration, and problem-solving skills.
Good to Have
Google Cloud Certified Professional Machine Learning Engineer or TensorFlow Developer certifications, or equivalent.
Experience working with one or more public cloud platforms – namely GCP, AWS, or Azure.
Experience with AutoML and vision techniques.
Master’s degree in statistics, machine learning, or related fields.
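For readers unfamiliar with the retrieval-augmented generation (RAG) and prompt-development work this role describes, here is a minimal skeleton of the pattern. It is a hedged illustration, not any employer's stack: retrieval uses TF-IDF from scikit-learn, the knowledge-base texts are invented, and the final LLM call is deliberately left as a printed prompt because the provider and endpoint would depend on the project.

```python
# Retrieve the most relevant passages, then fold them into a grounded prompt.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

knowledge_base = [
    "Refunds are processed within 5 business days of approval.",
    "Premium support is available 24/7 for enterprise customers.",
    "Invoices can be downloaded from the billing section of the portal.",
]

vectorizer = TfidfVectorizer()
kb_matrix = vectorizer.fit_transform(knowledge_base)

def build_prompt(question: str, k: int = 2) -> str:
    """Retrieve the k most relevant passages and build an instruction prompt."""
    scores = cosine_similarity(vectorizer.transform([question]), kb_matrix)[0]
    context = [knowledge_base[i] for i in scores.argsort()[::-1][:k]]
    return (
        "Answer the question using only the context below.\n\n"
        "Context:\n- " + "\n- ".join(context) + f"\n\nQuestion: {question}\nAnswer:"
    )

prompt = build_prompt("How fast are refunds processed?")
print(prompt)  # in practice this string would be sent to the chosen LLM endpoint
```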
Posted 1 month ago
3.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
About the Business
Manupatra provides legal, regulatory, and analytics solutions that help customers increase their productivity, improve decision-making, achieve better outcomes, and advance the rule of law. As a digital pioneer, the company was the first to bring legal and business information online in India.
About Our Team
Manupatra, serving customers in more than 20 countries, is a leading provider of information-based analytics and decision tools for professional and business customers. Our company has been a long-time leader in deploying advanced technologies to the legal market to improve productivity and transform the overall business and practice of law.
About the Role
For the Senior AI Engineer / Data Engineer role, we are looking for a skilled LLM application developer to join our team. You will be responsible for implementing large language model (LLM) based applications, working with proprietary and open-source models as well as popular frameworks to ensure seamless integration and deployment.
Responsibilities
Build standalone applications that interact with LLM models.
Build RAG-based applications.
Understand vector databases such as Solr for LLM use cases.
Preprocess and manage data for training and deployment.
Collaborate with cross-functional teams to define, design, and ship new features.
Write clean, maintainable, and efficient code.
Document development processes, code, and APIs.
Design, prototype, implement, deploy, and maintain features for NLP or AI-related projects.
Requirements
B.Tech in Computer Science, Data Science, or a related field, or equivalent experience.
Proven experience in building customer-facing ML-based APIs.
Experience in developing applications that scale to handle TBs of data.
Strong knowledge of API integration (RESTful, GraphQL).
Experience with data preprocessing, SQL and NoSQL databases, and vector stores (e.g., Postgres, MySQL, Solr, Elasticsearch/OpenSearch, etc.).
Familiarity with deployment tools (Docker, Kubernetes).
Experience with DevOps tools like Jenkins, Terraform, and CloudFormation templates is highly preferred.
Excellent problem-solving and communication skills.
Experience with Spark/Hadoop, EMR, or any other big data technology would be a plus.
Certifications in machine learning, data science, or cloud computing.
Portfolio showcasing past projects or contributions to open-source projects.
At least 3+ years of software engineering experience as a team member or team mentor in a mid-to-large technical company.
Experience working with Python and, optionally, at least one other framework or programming language such as Flask, Django, FastAPI, Golang, Java, SQL, etc.
Experience successfully implementing development processes, coding best practices, and code reviews; familiarity with CI/CD, DevOps, Redis, Docker, K8s, Azure.
Good sense of software architecture design, application scaling, performance, and security.
Solid verbal and written communication skills.
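As a sketch of the "standalone application that interacts with an LLM" responsibility above, here is a minimal FastAPI service with the model call stubbed out. It is illustrative only: the route name, payload shape, and `generate()` placeholder are assumptions, and wiring in a real proprietary or open-source model is the actual work of the role.

```python
# Minimal REST wrapper around a (stubbed) LLM call.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="LLM demo service")

class Query(BaseModel):
    question: str

def generate(prompt: str) -> str:
    # Placeholder for a real model call (hosted API or locally served model).
    return f"[stubbed answer for: {prompt}]"

@app.post("/ask")
def ask(query: Query) -> dict:
    """Accept a question over REST and return the (stubbed) model answer."""
    return {"answer": generate(query.question)}

# Run locally with:  uvicorn app:app --reload   (assuming this file is saved as app.py)
```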
Posted 1 month ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Micoworks is a company with a clear mission: to Empower every brand for the better future. This ambitious goal sets the stage for their vision and core values.
Who we are
By 2030, Micoworks aims to be the Asia No.1 Brand Empowerment Company. This mid-term goal outlines their dedication to becoming the leading force in empowering brands across Asia. To achieve their mission and vision, Micoworks identifies four key values that guide their work:
WOW THE CUSTOMER
SMART SPEED
OPEN MIND
ALL FOR ONE
Micoworks' mission, vision, and values paint a picture of a company dedicated to empowering brands, working with agility and open-mindedness, and prioritising customer success.
Job Summary
The Senior Data Scientist will work on data-driven initiatives to solve complex business challenges, leveraging advanced analytics, machine learning, and statistical modeling. This role requires expertise in translating data insights into actionable strategies and collaborating with cross-functional teams. Ideal candidates will have a strong background in analytics or tech-driven industries.
Key Responsibilities
Develop and deploy predictive models (e.g., customer lifetime value, media mix modeling, time-series forecasting) using Python/R, TensorFlow, or PyTorch.
Clean, preprocess, and validate large datasets (structured/unstructured) from multiple sources.
Partner with stakeholders (e.g., marketing, finance) to design data-driven solutions (e.g., A/B testing).
Ensure adherence to data privacy and ethical AI practices.
Research and implement cutting-edge techniques (e.g., NLP, deep learning) to enhance business strategies.
Required Qualifications
Education: Master’s/PhD in Statistics, Computer Science, Econometrics, or related quantitative fields.
Experience: 5+ years in data science, with expertise in:
Programming: Python/R, SQL, Spark, and libraries (pandas, scikit-learn).
Statistical methods: decision trees, regression, deep learning, and experimental design.
Cloud platforms: Azure, Databricks, or AWS.
Soft Skills: Strong storytelling, stakeholder management, and problem-solving.
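To make the A/B-testing responsibility above concrete, here is a small SciPy sketch comparing two conversion rates with a chi-square test of independence. The visitor and conversion counts, and the 5% significance threshold, are invented for illustration and are not Micoworks figures.

```python
# Compare two variants' conversion counts with a chi-square test of independence.
from scipy.stats import chi2_contingency

# Variant A: 480 conversions out of 10,000 visitors; variant B: 560 out of 10,000.
table = [
    [480, 10_000 - 480],
    [560, 10_000 - 560],
]

chi2, p_value, dof, _ = chi2_contingency(table)
print(f"chi2={chi2:.2f}, p={p_value:.4f}")

# Assumed decision rule: treat the difference as real only if it is significant at 5%.
if p_value < 0.05:
    print("Statistically significant difference between variants.")
else:
    print("No significant difference detected at the 5% level.")
```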
Posted 1 month ago