
1530 Matplotlib Jobs - Page 27

Set up a job alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

3.0 years

5 - 7 Lacs

Gurgaon

Remote

We are seeking a dynamic RPA & Data Automation Developer with 3+ years of hands-on experience in building automated workflows, data pipelines, and API-based integrations. The role demands strong analytical skills, advanced scripting capabilities in Python, experience with RPA tools like Power Automate, and solid SQL knowledge for backend automation.

Design, develop, and maintain RPA solutions using Python, Selenium, and Power Automate. Automate business processes using scripts and bots that interact with Excel, browsers, databases, and APIs. Work extensively with Python libraries including Pandas, NumPy, Matplotlib, re (regex), smtp, and FastAPI. Create and consume RESTful APIs for data services and automation endpoints. Perform complex data analysis and transformation using Pandas and SQL queries. Write and maintain SQL components such as stored procedures, views, and functions, and perform schema design and query optimization. Automate data flows across platforms including Excel, emails, and databases using VBA macros and Power Automate flows. Implement exception handling, logging, and monitoring mechanisms for all automation processes.

Job Types: Full-time, Permanent
Pay: ₹500,000.00 - ₹700,000.00 per year
Benefits: Health insurance, Paid sick time, Paid time off, Provident Fund, Work from home
Location Type: In-person
Schedule: Day shift
Work Location: In person
Speak with the employer: +91 9810508252
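Purely as an illustration of the Pandas / Matplotlib / email automation stack this posting lists, here is a minimal reporting-bot sketch; the data, addresses, and SMTP host are invented placeholders, and the actual send is left commented out.

```python
# Minimal sketch of the kind of reporting automation described above:
# summarise tabular data with pandas, render a chart with Matplotlib,
# and assemble an email report. Addresses and the SMTP host are placeholders.
from email.message import EmailMessage

import pandas as pd
import matplotlib
matplotlib.use("Agg")  # headless backend for scheduled/bot execution
import matplotlib.pyplot as plt

# Example data standing in for an Excel or database extract.
df = pd.DataFrame({
    "region": ["North", "South", "North", "West"],
    "invoices": [120, 95, 130, 60],
})
summary = df.groupby("region", as_index=False)["invoices"].sum()

# Plot and save the chart that will be attached to the report.
ax = summary.plot.bar(x="region", y="invoices", legend=False)
ax.set_ylabel("Invoices processed")
plt.tight_layout()
plt.savefig("invoice_summary.png")

msg = EmailMessage()
msg["Subject"] = "Daily invoice automation report"
msg["From"] = "bot@example.com"      # placeholder address
msg["To"] = "ops-team@example.com"   # placeholder address
msg.set_content(summary.to_string(index=False))
with open("invoice_summary.png", "rb") as f:
    msg.add_attachment(f.read(), maintype="image", subtype="png",
                       filename="invoice_summary.png")

# With real credentials, the message would be sent roughly like this:
# import smtplib
# with smtplib.SMTP("smtp.example.com", 587) as s:
#     s.starttls()
#     s.login("user", "password")
#     s.send_message(msg)
print("Report assembled with", len(summary), "summary rows")
```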

Posted 1 month ago

Apply

0 years

6 - 12 Lacs

Mangalore

Remote

Key Responsibilities

Curriculum & Training Design: Analyze client-specific product/domain requirements to define learning objectives, outcomes, and success metrics. Develop modular, role-based training materials (slide decks, hands-on labs, code samples, quizzes) covering:
- Databases: Relational (PostgreSQL, MS SQL Server), NoSQL (MongoDB, Cassandra)
- Programming & Frameworks: Python (including ML pipelines), Java, GoLang, C#/.NET Core, Blazor
- Front-end: ReactJS (components, state management, hooks)
- Big Data & Messaging: HDFS (core commands, shell scripting), Apache Kafka, RabbitMQ, MQTT
- Data Analytics: ETL concepts, Pandas-based data wrangling, Matplotlib visualizations
Continuously iterate on training content based on learner feedback, technology updates, and best practices.

Hands-On Development & Domain Engagement: Collaborate with client SMEs, product owners, and engineering teams to understand architecture, codebase standards, and deployment pipelines. Build proof-of-concepts, reference implementations, and sample "mini-projects" that align with the training curriculum. Troubleshoot and optimize sample code (both back-end and front-end) to illustrate best practices in performance, scalability, and maintainability.

Instruction & Mentoring: Lead instructor-led sessions (in-person or remote) covering theory (20–30%), code walkthroughs (30%), and hands-on labs (40–50%). Provide one-on-one mentoring: review trainee code, offer constructive feedback, troubleshoot roadblocks, and ensure skill acquisition. Design assessment mechanisms (quizzes, coding assignments, capstone projects) to evaluate learner readiness and provide remediation as needed.

Onboarding & Trainee Progress Tracking: Create and maintain a structured onboarding roadmap for college graduates, including prerequisites, recommended readings, and milestone checklists. Monitor trainee progress through weekly checkpoints, code reviews, and performance metrics, adjusting training pace or content accordingly.

Knowledge Management & Continuous Improvement: Maintain and update a centralized repository (internal wiki or LMS) of training artifacts, best practices, troubleshooting guides, and FAQs. Host regular "Knowledge-Sharing" sessions or brown-bag workshops to showcase trending technologies, emerging frameworks, and industry insights. Solicit and analyse post-training feedback to refine content, delivery style, and instructional tools.

Collaboration & Stakeholder Communication: Work closely with HR/Talent Acquisition to align training agendas with hiring timelines and candidate profiles. Coordinate with senior architects, DevOps, QA, and UX/UI teams to ensure consistent messaging about coding standards, development processes, and testing methodologies. Provide periodic status updates and training metrics (enrolment, completion rates, assessment scores) to project sponsors or leadership.

Job Type: Full-time
Pay: ₹50,000.00 - ₹100,000.00 per month
Work Location: In person
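As a small illustration of the "Pandas-based data wrangling, Matplotlib visualizations" module named in the curriculum above, here is the kind of minimal hands-on lab exercise a trainer might prepare; the dataset and column names are invented.

```python
# Toy "Pandas data wrangling + Matplotlib visualization" lab exercise.
# The dataset and column names are invented for illustration.
import pandas as pd
import matplotlib.pyplot as plt

orders = pd.DataFrame({
    "order_id": [1, 2, 3, 4, 5, 6],
    "city": ["Mangalore", "Mangalore", "Udupi", "Udupi", "Mangalore", "Udupi"],
    "amount": [250.0, 400.0, None, 150.0, 320.0, 275.0],
})

# Typical wrangling steps: handle missing values, then aggregate.
orders["amount"] = orders["amount"].fillna(orders["amount"].median())
per_city = orders.groupby("city")["amount"].agg(["count", "sum", "mean"])
print(per_city)

# Simple visualization of total order value per city.
per_city["sum"].plot.bar(title="Order value by city")
plt.ylabel("Total amount")
plt.tight_layout()
plt.show()
```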

Posted 1 month ago

Apply

10.0 years

26 - 30 Lacs

Chennai

On-site

We are looking for an Associate Division Manager for one of our major clients. This role involves designing and building AI/ML products at scale to improve customer understanding and sentiment analysis, recommend customer requirements and optimal inputs, and improve process efficiency. The role collaborates with product owners and business owners.

Key Responsibilities: Lead a team of junior and experienced data scientists. Lead and participate in end-to-end ML project deployments that require feasibility analysis, design, development, validation, and application of state-of-the-art data science solutions. Push the state of the art in the application of data mining, visualization, predictive modelling, statistics, trend analysis, and other data analysis techniques to solve complex business problems, including lead classification, recommender systems, product life-cycle modelling, design optimization, and product cost and weight optimization.

Functional Responsibilities: Leverage and enhance applications utilizing NLP, LLM, OCR, image-based models, and deep learning neural networks for use cases including text mining, speech, and object recognition. Identify future development needs, advance new and emerging ML and AI technology, and set the strategy for the data science team. Cultivate a product-centric, results-driven data science organization. Write production-ready code and deploy real-time ML models; expose ML outputs through APIs. Partner with data/ML engineers and vendor partners on input data pipeline development and ML model automation. Provide leadership to establish world-class ML lifecycle management processes.

Qualification: MTech / BE / BTech / MSc in CS.

Experience: Over 10 years of applied machine learning experience across machine learning, statistical modelling, predictive modelling, text mining, natural language processing (NLP), LLM, OCR, image-based models, and deep learning. Expert Python programmer: SQL, C#, extremely proficient with the SciPy stack (e.g. NumPy, pandas, scikit-learn, Matplotlib). Proficiency with open-source deep learning platforms such as TensorFlow, Keras, and PyTorch. Knowledge of the big data ecosystem (Apache Spark, Hadoop, Hive, EMR, MapReduce). Proficient in cloud technologies and services (Azure Databricks, ADF, Databricks MLflow).

Functional Competencies: A demonstrated ability to mentor junior data scientists and proven experience in collaborative work environments with external customers. Proficient in communicating technical findings to non-technical stakeholders. Holds routine peer code reviews of ML work done by the team. Experience in leading and/or collaborating with small to mid-sized teams. Experienced in building scalable, highly available distributed systems in production. Experienced in ML lifecycle management and MLOps tools and frameworks.

Job Type: FTE
Location: Chennai
Job Type: Contractual / Temporary
Pay: ₹2,633,123.63 - ₹3,063,602.96 per year
Schedule: Monday to Friday
Education: Bachelor's (Preferred)
Work Location: In person
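As one concrete illustration of the "lead classification" problem named in this posting, a minimal scikit-learn sketch follows; the example lead notes and labels are invented.

```python
# Tiny lead-classification sketch with the SciPy stack named in the posting.
# The example texts and labels are invented placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

notes = [
    "requested pricing for 500 units, decision this quarter",
    "student asking general questions for a college project",
    "wants a demo next week, budget already approved",
    "unsubscribed from newsletter, not interested",
]
labels = [1, 0, 1, 0]  # 1 = qualified lead, 0 = not qualified

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(notes, labels)

new_note = ["asked for a quote and an implementation timeline"]
print("qualified probability:", model.predict_proba(new_note)[0][1])
```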

Posted 1 month ago

Apply

7.0 years

0 Lacs

Coimbatore

On-site

The Opportunity: Works independently under close supervision, provide analysis, insight, and recommendations by synthesizing information for various product categories and geographies by using competitive analyses, analytical modeling, and leveraging knowledge of the product portfolio and customer programs. Manage pricing process for the sales team including assessing, approving, and loading all pricing requests. Review, research, and analyze pricing to make recommendations on price enhancements, SKU rationalization and margin opportunities. Process quotation requests for stock standard items and special order (customer specific) items. Job Summary: We are seeking a skilled and detail-oriented Data Analyst with minimum 7 years of experience in data analysis, data handling, and reporting. The ideal candidate should have strong proficiency in Python, SQL, Excel and Power BI with a solid understanding of data manipulation, cleaning, and visualization techniques. You will work closely with cross-functional teams to derive actionable insights from complex datasets and support data-driven decision-making. Key Responsibilities: Analyze large datasets to identify trends, patterns, and insights. Develop and maintain Python scripts for data processing and automation. Removing corrupted data and fixing coding errors and related problems Developing and maintaining databases, data systems - reorganizing data in a readable format Demonstrate strong proficiency in Microsoft Excel, including advanced functions, pivot tables, Power Query, data cleaning, and charting for reporting and analysis. Write SQL queries to extract, transform, and load data from various sources. Clean, validate, and organize raw data for analysis and reporting. Create dashboards and visualizations to communicate findings effectively. Collaborate with business stakeholders to understand data requirements and deliver solutions. Ensure data integrity and consistency across systems and reports. Preparing final analysis reports for the stakeholders to understand the data-analysis steps, enabling them to take important decisions based on various facts and trends. Required Skills & Qualifications: Proven working experience in data analysis - Minimum of 7 Yrs Bachelor’s degree in computer science, Statistics, Mathematics, or a related field. Proficiency in Python (Pandas, NumPy, Matplotlib, etc.). Good Knowledge in MS Excel (Knowledge on SQL will be an added advantage) Experience with data visualization tools (e.g., Power BI, Qlik). Solid understanding of data handling, data quality, and data governance principles. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Disclaimer: The above statements are intended to describe the general nature and level of work being performed by employees assigned to this classification. They are not intended to be construed as an exhaustive list of all responsibilities, duties and skills required of employees assigned to this position. Avantor is proud to be an equal opportunity employer. Why Avantor? Dare to go further in your career. Join our global team of 14,000+ associates whose passion for discovery and determination to overcome challenges relentlessly advances life-changing science. The work we do changes people's lives for the better. It brings new patient treatments and therapies to market, giving a cancer survivor the chance to walk his daughter down the aisle. 
It enables medical devices that help a little boy hear his mom's voice for the first time. Outcomes such as these create unlimited opportunities for you to contribute your talents, learn new skills and grow your career at Avantor. We are committed to helping you on this journey through our diverse, equitable and inclusive culture which includes learning experiences to support your career growth and success. At Avantor, dare to go further and see how the impact of your contributions set science in motion to create a better world. Apply today! EEO Statement: We are an Equal Employment/Affirmative Action employer and VEVRAA Federal Contractor. We do not discriminate in hiring on the basis of sex, gender identity, sexual orientation, race, color, religious creed, national origin, physical or mental disability, protected Veteran status, or any other characteristic protected by federal, state/province, or local law. If you need a reasonable accommodation for any part of the employment process, please contact us by email at recruiting@avantorsciences.com and let us know the nature of your request and your contact information. Requests for accommodation will be considered on a case-by-case basis. Please note that only inquiries concerning a request for reasonable accommodation will be responded to from this email address. 3rd party non-solicitation policy: By submitting candidates without having been formally assigned on and contracted for a specific job requisition by Avantor, or by failing to comply with the Avantor recruitment process, you forfeit any fee on the submitted candidates, regardless of your usual terms and conditions. Avantor works with a preferred supplier list and will take the initiative to engage with recruitment agencies based on its needs and will not be accepting any form of solicitation
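A minimal sketch of the data-cleaning and reporting steps this role describes (removing corrupted rows, validating values, de-duplicating, and pivoting into a report); the sample records are invented.

```python
# Toy data-cleaning pass: drop corrupted rows, fix types, de-duplicate,
# and pivot into a report-ready table. All values are invented examples.
import pandas as pd

raw = pd.DataFrame({
    "order_id": [101, 102, 102, 103, 104],
    "region":   ["North", "South", "South", None, "West"],
    "amount":   ["250", "oops", "400", "150", "320"],  # mixed/corrupted text
})

clean = raw.copy()
clean["amount"] = pd.to_numeric(clean["amount"], errors="coerce")  # flag bad values
clean = clean.dropna(subset=["region", "amount"])                  # remove corrupted rows
clean = clean.drop_duplicates(subset="order_id", keep="first")     # de-duplicate

report = clean.pivot_table(index="region", values="amount",
                           aggfunc=["count", "sum"])
print(report)
```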

Posted 1 month ago

Apply

0 years

1 - 1 Lacs

Thanjāvūr

Remote

Job Title: Data Analyst Intern Location: [City / Remote / Hybrid] Internship Type: Full-time Duration: 3 Months About the Role: We are looking for a detail-oriented and enthusiastic Data Analyst Intern to join our team. You will assist in collecting, analyzing, and interpreting data to help the team make informed decisions. This internship will give you hands-on experience in real-world data projects and exposure to data tools and technologies. Key Responsibilities: Work with the team to collect, clean, and validate data from various sources. Assist in data visualization and report generation using tools like Excel, Power BI, or Tableau. Conduct basic exploratory data analysis (EDA) and derive insights. Support in building dashboards and visual reports. Create documentation and reports based on data trends. Collaborate with different departments to understand data requirements. Present findings in a clear and structured format to stakeholders. Required Skills: Basic understanding of statistics and data analysis. Proficiency in Microsoft Excel or Google Sheets. Familiarity with SQL and Python (Pandas, NumPy) is a plus. Exposure to any data visualization tools (Tableau, Power BI, Matplotlib, etc.) Strong analytical and problem-solving skills. Good communication and teamwork abilities. Eligibility: Pursuing or recently completed a degree in Computer Science, Data Science, Statistics, Mathematics, or related fields. Available for a minimum of [2-3] months. Eagerness to learn and grow in the data domain. Job Types: Full-time, Permanent Pay: ₹10,000.00 - ₹12,000.00 per month Schedule: Day shift Work Location: In person

Posted 1 month ago

Apply

8.0 - 11.0 years

35 - 37 Lacs

Kolkata, Ahmedabad, Bengaluru

Work from Office

Dear Candidate,

We are hiring a Flask Developer to create lightweight and high-performance web services using Python.

Key Responsibilities: Develop web APIs using Flask and deploy them on cloud or containers. Use SQLAlchemy or MongoEngine for data access. Write modular blueprints and configure middleware. Perform request validation and error handling. Work on REST/GraphQL integration with frontend teams.

Required Skills & Qualifications: Expertise in Flask, Python, and Jinja2. Familiar with Gunicorn, Docker, and PostgreSQL. Understanding of JWT, OAuth, and API security. Bonus: experience in FastAPI or Flask-SocketIO.

Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills.

Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you.

Kandi Srinivasa
Delivery Manager
Integra Technologies
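For context on the "modular blueprints, request validation, and error handling" mentioned above, here is a minimal Flask sketch; the orders resource and route names are illustrative assumptions, and an in-memory dict stands in for SQLAlchemy or MongoEngine.

```python
# Minimal Flask blueprint sketch: one modular resource with basic request
# validation and JSON error handling. Resource names are illustrative.
from flask import Blueprint, Flask, jsonify, request

orders_bp = Blueprint("orders", __name__, url_prefix="/orders")

_ORDERS = {}  # in-memory stand-in for SQLAlchemy/MongoEngine storage


@orders_bp.route("", methods=["POST"])
def create_order():
    payload = request.get_json(silent=True) or {}
    # Basic request validation before touching the data layer.
    if "item" not in payload or "qty" not in payload:
        return jsonify(error="item and qty are required"), 400
    order_id = len(_ORDERS) + 1
    _ORDERS[order_id] = {"item": payload["item"], "qty": payload["qty"]}
    return jsonify(id=order_id), 201


@orders_bp.route("/<int:order_id>", methods=["GET"])
def get_order(order_id):
    order = _ORDERS.get(order_id)
    if order is None:
        return jsonify(error="order not found"), 404
    return jsonify(order)


def create_app():
    app = Flask(__name__)
    app.register_blueprint(orders_bp)

    @app.errorhandler(404)
    def not_found(exc):  # keep error responses in JSON as well
        return jsonify(error="resource not found"), 404

    return app


if __name__ == "__main__":
    create_app().run(debug=True)
```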

Posted 1 month ago

Apply

3.0 years

0 Lacs

New Delhi, Delhi, India

Remote

Location: Remote (with full-day availability) Company: Baoiam – India’s Fastest Growing EdTech for Practical Skill-Based Learning Type: Contractual | 12 Months | Individual Contributor Role 🧠 About the Role: Baoiam is looking for a passionate and experienced Data Science Expert to join us as a full-time Trainer & Mentor for our newly launched 6-Month Data Science Career Program . You will be responsible for teaching, mentoring, evaluating, and preparing students for real-world careers in Data Science. This is a contract-based role with complete teaching responsibility for multiple student batches. ✅ Key Responsibilities: Deliver live, interactive classes (online) as per the structured 6-month curriculum Teach Python, ML, Statistics, Deep Learning, NLP, and project execution Evaluate assignments, capstone projects, and performance Conduct weekly doubt sessions, assessments, and mock interviews Provide 1:1 mentorship and learning feedback to students Ensure student engagement and high satisfaction levels Collaborate with the program team for quality improvement and curriculum updates 🎓 Requirements: Bachelor's or Master’s degree in Computer Science, Data Science, or a related field Minimum 3+ years of experience in Data Science / Machine Learning Prior experience teaching or mentoring (bootcamps, online courses, or edtech preferred) Proficient in Python, NumPy, Pandas, Scikit-learn, TensorFlow/Keras, SQL, Matplotlib, Seaborn Strong command of statistics, model evaluation techniques, and real-world use cases Comfortable teaching diverse learners in live sessions Strong communication and presentation skills 💻 Mandatory Setup: Personal Laptop/Desktop with high processing capacity High-speed internet connection (at least 50 Mbps) Professional teaching environment (quiet, well-lit setup) Must be available full-day (Monday to Saturday) to handle multiple sessions and mentoring needs 📦 What We Offer: Monthly retainer (negotiable based on experience) Performance bonuses & long-term engagement opportunities Exposure to national-level learners & fast-growing EdTech environment Opportunity to co-build India’s most accessible data career program Recognition across Baoiam platforms as a lead mentor 🚀 How to Apply: If you're passionate about teaching and shaping careers in data science, we'd love to hear from you. 📩 Send your CV, LinkedIn profile, and teaching sample (if available) to: hr @baoiam.com Or apply through LinkedIn.

Posted 1 month ago

Apply

7.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site

The Opportunity Works independently under close supervision, provide analysis, insight, and recommendations by synthesizing information for various product categories and geographies by using competitive analyses, analytical modeling, and leveraging knowledge of the product portfolio and customer programs. Manage pricing process for the sales team including assessing, approving, and loading all pricing requests. Review, research, and analyze pricing to make recommendations on price enhancements, SKU rationalization and margin opportunities. Process quotation requests for stock standard items and special order (customer specific) items. Job Summary We are seeking a skilled and detail-oriented Data Analyst with minimum 7 years of experience in data analysis, data handling, and reporting. The ideal candidate should have strong proficiency in Python, SQL, Excel and Power BI with a solid understanding of data manipulation, cleaning, and visualization techniques. You will work closely with cross-functional teams to derive actionable insights from complex datasets and support data-driven decision-making. Key Responsibilities Analyze large datasets to identify trends, patterns, and insights. Develop and maintain Python scripts for data processing and automation. Removing corrupted data and fixing coding errors and related problems Developing and maintaining databases, data systems - reorganizing data in a readable format Demonstrate strong proficiency in Microsoft Excel, including advanced functions, pivot tables, Power Query, data cleaning, and charting for reporting and analysis. Write SQL queries to extract, transform, and load data from various sources. Clean, validate, and organize raw data for analysis and reporting. Create dashboards and visualizations to communicate findings effectively. Collaborate with business stakeholders to understand data requirements and deliver solutions. Ensure data integrity and consistency across systems and reports. Preparing final analysis reports for the stakeholders to understand the data-analysis steps, enabling them to take important decisions based on various facts and trends. Required Skills & Qualifications Proven working experience in data analysis - Minimum of 7 Yrs Bachelor’s degree in computer science, Statistics, Mathematics, or a related field. Proficiency in Python (Pandas, NumPy, Matplotlib, etc.). Good Knowledge in MS Excel (Knowledge on SQL will be an added advantage) Experience with data visualization tools (e.g., Power BI, Qlik). Solid understanding of data handling, data quality, and data governance principles. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills. Disclaimer The above statements are intended to describe the general nature and level of work being performed by employees assigned to this classification. They are not intended to be construed as an exhaustive list of all responsibilities, duties and skills required of employees assigned to this position. Avantor is proud to be an equal opportunity employer. Why Avantor? Dare to go further in your career. Join our global team of 14,000+ associates whose passion for discovery and determination to overcome challenges relentlessly advances life-changing science. The work we do changes people's lives for the better. It brings new patient treatments and therapies to market, giving a cancer survivor the chance to walk his daughter down the aisle. It enables medical devices that help a little boy hear his mom's voice for the first time. 
Outcomes such as these create unlimited opportunities for you to contribute your talents, learn new skills and grow your career at Avantor. We are committed to helping you on this journey through our diverse, equitable and inclusive culture which includes learning experiences to support your career growth and success. At Avantor, dare to go further and see how the impact of your contributions set science in motion to create a better world. Apply today! EEO Statement We are an Equal Employment/Affirmative Action employer and VEVRAA Federal Contractor. We do not discriminate in hiring on the basis of sex, gender identity, sexual orientation, race, color, religious creed, national origin, physical or mental disability, protected Veteran status, or any other characteristic protected by federal, state/province, or local law. If you need a reasonable accommodation for any part of the employment process, please contact us by email at recruiting@avantorsciences.com and let us know the nature of your request and your contact information. Requests for accommodation will be considered on a case-by-case basis. Please note that only inquiries concerning a request for reasonable accommodation will be responded to from this email address. 3rd Party Non-solicitation Policy By submitting candidates without having been formally assigned on and contracted for a specific job requisition by Avantor, or by failing to comply with the Avantor recruitment process, you forfeit any fee on the submitted candidates, regardless of your usual terms and conditions. Avantor works with a preferred supplier list and will take the initiative to engage with recruitment agencies based on its needs and will not be accepting any form of solicitation

Posted 1 month ago

Apply

3.0 - 7.0 years

14 - 18 Lacs

Kolkata

Work from Office

Scale an existing RAG code base for a production-grade AI application. Proficiency in prompt engineering, LLMs, and Retrieval Augmented Generation. Programming languages like Python or Java. Experience with vector databases. Experience using LLMs in software applications, including prompting, calling, and processing outputs. Experience with AI frameworks such as LangChain. Troubleshooting skills and creativity in finding new ways to leverage LLMs. Experience with Azure.

Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations. Help in showcasing the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/POCs.

Documentation and Knowledge Sharing: Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best practice guides. Contribute to internal knowledge-sharing initiatives and mentor new team members.

Industry Trends and Innovation: Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation. Experience in Python and PySpark will be an added advantage.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, Keras, or Hugging Face. Understanding of the usage of libraries such as scikit-learn, Pandas, Matplotlib, etc. Familiarity with cloud platforms (e.g. Kubernetes, AWS, Azure, GCP) and related services is a plus. Experience and working knowledge in COBOL and Java would be preferred. Experience in code generation, code matching, and code translation. Prepare effort estimates, WBS, staffing plan, RACI, RAID, etc. Excellent interpersonal and communication skills. Engage with stakeholders for analysis and implementation. Commitment to continuous learning and staying updated with advancements in the field of AI. Demonstrate a growth mindset to understand clients' business processes and challenges.

Preferred technical and professional experience: Education: Bachelor's, Master's, or Ph.D. degree in Computer Science, Artificial Intelligence, Data Science, or a related field. Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions. Ability to communicate results to technical and non-technical audiences.
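A minimal retrieval-augmented-generation sketch of the pattern this posting describes, using TF-IDF retrieval so it stays framework-agnostic; the documents, question, and the commented-out generation step are invented placeholders rather than the client's actual stack.

```python
# Toy RAG loop: retrieve the most relevant snippet, then build a grounded
# prompt for an LLM. Documents and the question are invented examples.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Invoices are approved by the finance team within two business days.",
    "The COBOL batch job reconciles ledger entries every night at 2 AM.",
    "Refund requests above 10,000 INR require director sign-off.",
]
question = "When does the ledger reconciliation batch run?"

# Retrieval step: rank documents by cosine similarity to the question.
vectorizer = TfidfVectorizer()
doc_matrix = vectorizer.fit_transform(documents)
query_vec = vectorizer.transform([question])
scores = cosine_similarity(query_vec, doc_matrix)[0]
top_context = documents[scores.argmax()]

# Augmentation step: assemble the grounded prompt passed to the model.
prompt = (
    "Answer using only the context below.\n"
    f"Context: {top_context}\n"
    f"Question: {question}\nAnswer:"
)
print(prompt)

# Generation step (placeholder): the prompt would be sent to an LLM here,
# through whatever hosted endpoint or framework the project actually uses.
```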

Posted 1 month ago

Apply

3.0 - 7.0 years

14 - 18 Lacs

Bengaluru

Work from Office

As an Associate Data Scientist at IBM, you will work to solve business problems using leading-edge and open-source tools such as Python, R, and TensorFlow, combined with IBM tools and our AI application suites. You will prepare, analyze, and understand data to deliver insight, predict emerging trends, and provide recommendations to stakeholders.

In your role, you may be responsible for: Implementing and validating predictive and prescriptive models and creating and maintaining statistical models with a focus on big data, incorporating machine learning techniques in your projects. Writing programs to cleanse and integrate data in an efficient and reusable manner. Working in an Agile, collaborative environment, partnering with other scientists, engineers, consultants, and database administrators of all backgrounds and disciplines to bring analytical rigor and statistical methods to the challenges of predicting behaviors. Communicating with internal and external clients to understand and define business needs and appropriate modelling techniques to provide analytical solutions. Evaluating modelling results and communicating the results to technical and non-technical audiences.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise: Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations. Help in showcasing the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/POCs. Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best practice guides.

Preferred technical and professional experience: Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, Keras, or Hugging Face. Understanding of the usage of libraries such as scikit-learn, Pandas, Matplotlib, etc. Familiarity with cloud platforms. Experience and working knowledge in COBOL and Java would be preferred. Experience in Python and PySpark will be an added advantage.
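Since the role centres on implementing and validating predictive models, here is a minimal validation sketch using scikit-learn's bundled breast-cancer dataset; the pipeline is a generic example, not IBM's methodology.

```python
# Minimal predictive-model validation sketch: fit a classifier and report
# cross-validated accuracy, the kind of check run before promoting a model.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)

model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))

# 5-fold cross-validation gives a more honest estimate than a single split.
scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
print(f"accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")
```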

Posted 1 month ago

Apply

3.0 - 8.0 years

15 - 30 Lacs

Hyderabad

Work from Office

We Advantum Health Pvt. Ltd - US Healthcare MNC looking for Data Scientist. We Advantum Health Private Limited is a leading RCM and Medical Coding company, operating since 2013. Our Head Office is located in Hyderabad, with branch operations in Chennai and Noida. We are proud to be a Great Place to Work certified organization and a recipient of the Telangana Best Employer Award. Our office spans 35,000 sq. ft. in Cyber Gateway, Hitech City, Hyderabad Job Title: Data Scientist Location: Hitech City, Hyderabad, India Work from office Ph: 9177078628, 7382307530, 9059683624 Address: Advantum Health Private Limited, Cyber gateway, Block C, 4th floor Hitech City, Hyderabad. Location: https://www.google.com/maps/place/Advantum+Health+India/@17.4469674,78.3747158,289m/data=!3m2!1e3!5s0x3bcb93e01f1bbe71:0x694a7f60f2062a1!4m6!3m5!1s0x3bcb930059ea66d1:0x5f2dcd85862cf8be!8m2!3d17.4467126!4d78.3767566!16s%2Fg%2F11whflplxg?entry=ttu&g_ep=EgoyMDI1MDMxNi4wIKXMDSoASAFQAw%3D%3D Job Summary: We are seeking a highly skilled and motivated Data Scientist to analyze complex datasets, build predictive models, and generate actionable insights that support data-driven decision-making. The ideal candidate will collaborate with cross-functional teams to solve business challenges using statistical analysis, machine learning, and advanced analytics. Key Responsibilities: Collect, process, and analyze large datasets from multiple sources to uncover trends and patterns. Develop and implement predictive models and machine learning algorithms to solve business problems. Communicate insights and recommendations to stakeholders through reports, visualizations, and presentations. Collaborate with product, technology, and business teams to integrate data-driven solutions into operations. Continuously improve models and processes based on new data, business requirements, and feedback. Conduct A/B testing and experimental design to validate hypotheses. Stay updated with the latest developments in data science, AI, and analytics tools. Required Skills and Qualifications: Bachelor's or Masters degree in Computer Science, Statistics, Mathematics, Data Science, or a related field. Should have 3 plus years of relevant experience Proven experience as a Data Scientist or in a similar analytical role. Proficiency in programming languages such as Python, R, or SQL. Experience with machine learning libraries and frameworks (e.g., scikit-learn, TensorFlow, PyTorch). Strong knowledge of statistical analysis, data mining, and predictive modeling techniques. Familiarity with data visualization tools (e.g., Tableau, Power BI, Matplotlib, Seaborn). Excellent problem-solving and communication skills. Ability to work with large, complex datasets and translate findings into actionable business insights. Follow us on LinkedIn, Facebook, Instagram, Youtube and Threads for all updates: Advantum Health Linkedin Page: https://www.linkedin.com/showcase/advantum-health-india/ Advantum Health Facebook Page: https://www.facebook.com/profile.php?id=61564435551477 Advantum Health Instagram Page: https://www.instagram.com/reel/DCXISlIO2os/?igsh=dHd3czVtc3Fyb2hk Advantum Health India Youtube link: https://youtube.com/@advantumhealthindia-rcmandcodi?si=265M1T2IF0gF-oF1 Advantum Health Threads link: https://www.threads.net/@advantum.health.india HR Dept, Advantum Health Pvt Ltd Cybergateway, Block C, Hitech City, Hyderabad Ph: 9177078628, 7382307530, 9059683624
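The responsibilities above mention A/B testing and experimental design; a minimal sketch of such a comparison is shown below, computing a two-proportion z-test with SciPy (the conversion counts are invented).

```python
# Toy A/B test: compare conversion rates of two variants with a two-sample
# proportions z-test built from first principles. Counts are invented.
from math import sqrt
from scipy.stats import norm

conversions_a, visitors_a = 120, 2400   # control
conversions_b, visitors_b = 156, 2380   # treatment

p_a = conversions_a / visitors_a
p_b = conversions_b / visitors_b
p_pool = (conversions_a + conversions_b) / (visitors_a + visitors_b)

se = sqrt(p_pool * (1 - p_pool) * (1 / visitors_a + 1 / visitors_b))
z = (p_b - p_a) / se
p_value = 2 * (1 - norm.cdf(abs(z)))  # two-sided test

print(f"control={p_a:.3%}, treatment={p_b:.3%}, z={z:.2f}, p={p_value:.4f}")
```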

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Matillion is The Data Productivity Cloud. We are on a mission to power the data productivity of our customers and the world, by helping teams get data business ready, faster. Our technology allows customers to load, transform, sync and orchestrate their data. We are looking for passionate, high-integrity individuals to help us scale up our growing business. Together, we can make a dent in the universe bigger than ourselves. With offices in the UK, US and Spain, we are now thrilled to announce the opening of our new office in Hyderabad, India. This marks an exciting milestone in our global expansion, and we are now looking for talented professionals to join us as part of our founding team. Matillion is a fast-paced, hyper-scale software development company. You will be based in India but working with colleagues globally, specifically across the US, the UK and Hyderabad.

The Enterprise Data team is responsible for producing Matillion's reporting metrics and KPIs. We work closely with Finance colleagues, the product team and Go To Market to interpret the data that we have and provide actionable insight across the business. The purpose of this role is to: increase the value of strategic information from the data warehouse, Salesforce, Thoughtspot, and DPC Hub; develop models to help us understand customer behaviour, specifically onboarding, product usage and churn; and use our rich data assets to streamline operational processes.

What will you be doing? Run structured experiments to evaluate and improve LLM performance across generative and task-oriented functions. Improve our AI evaluation frameworks. Investigate ways generative AI can be used to improve data quality. Build some more traditional data science predictive models to forecast customer consumption and churn, and/or anomaly detection for failing data pipelines. Keep current on the latest research and propose proof-of-concept projects to explore how it can assist us. Educate other team members to raise the team's understanding of theoretical concepts and the latest developments.

What are we looking for? Technical/Role Specific - Core Skills: MSc, PhD, or equivalent experience in ML, NLP, or a related field. Strong understanding of LLM internals: transformer architecture, tokenization, embeddings, sampling strategies. Python fluency, especially for data science and experimentation (NumPy, Pandas, Matplotlib, Jupyter). Experience with LLM tools (e.g. Hugging Face, LangChain, OpenAI API). Familiarity with prompt engineering and structured evaluation of generative outputs.

Technical/Role Specific - Preferable Skills: Any experience of reinforcement learning techniques, even if on a small scale. Experience of model evaluation; fine tuning, model distillation, instruction tuning or transfer learning; agentic systems (tool use / agentic frameworks); implementing guardrails; RAG architecture design and vector search. Understanding of model failure modes, fallback strategies, and error recovery. LLM performance optimization tradeoffs (latency, cost, accuracy). Uncertainty estimation and confidence scoring in generative systems. Privacy and compliance considerations in AI for SaaS.

Personal Capabilities: Enthusiasm to learn. Able to coach and mentor those around you to increase their knowledge. Comfort working across teams. Ability to translate requirements between data scientists (research focus) and software engineers (product focus). Clear communication of challenges, timelines, and possible solutions to stakeholders. Adaptability to rapid changes in a dynamic tech startup environment. Enthusiasm for learning new AI/ML Ops tools, libraries, and techniques. Proactive at diagnosing problems to understand a true root cause. Willingness to experiment and to look for ways to optimise existing systems. Willingness to pivot quickly in a rapidly evolving generative AI landscape.

Matillion has fostered a culture that is collaborative, fast-paced, ambitious, and transparent, and an environment where people genuinely care about their colleagues and communities. Our 6 core values guide how we work together and with our customers and partners. We operate a truly flexible and hybrid working culture that promotes work-life balance, and are proud to be able to offer the following benefits:
- Company Equity
- 27 days paid time off
- 12 days of Company Holiday
- 5 days paid volunteering leave
- Group Mediclaim (GMC)
- Enhanced parental leave policies
- MacBook Pro
- Access to various tools to aid your career development

More about Matillion: Thousands of enterprises including Cisco, DocuSign, Slack, and TUI trust Matillion technology to load, transform, sync, and orchestrate their data for a wide range of use cases from insights and operational analytics, to data science, machine learning, and AI. With over $300M raised from top Silicon Valley investors, we are on a mission to power the data productivity of our customers and the world. We are passionate about doing things in a smart, considerate way. We're honoured to be named a great place to work for several years running by multiple industry research firms. We are dual headquartered in Manchester, UK and Denver, Colorado. We are keen to hear from prospective Matillioners, so even if you don't feel you match all the criteria please apply and a member of our Talent Acquisition team will be in touch. Alternatively, if you are interested in Matillion but don't see a suitable role, please email talent@matillion.com. Matillion is an equal opportunity employer. We celebrate diversity and we are committed to creating an inclusive environment for all of our team. Matillion prohibits discrimination and harassment of any type. Matillion does not discriminate on the basis of race, colour, religion, age, sex, national origin, disability status, genetics, sexual orientation, gender identity or expression, or any other characteristic protected by law.
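As one way to read the "structured evaluation of generative outputs" requirement, here is a minimal scoring-harness sketch (exact match plus token-level F1 against reference answers); the questions, references, and predictions are invented, and a real harness would call the model rather than use canned strings.

```python
# Tiny evaluation harness for generative outputs: exact match plus a
# token-overlap F1 score against reference answers. Examples are invented.
from collections import Counter

def token_f1(prediction: str, reference: str) -> float:
    pred_tokens = prediction.lower().split()
    ref_tokens = reference.lower().split()
    overlap = sum((Counter(pred_tokens) & Counter(ref_tokens)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred_tokens)
    recall = overlap / len(ref_tokens)
    return 2 * precision * recall / (precision + recall)

eval_set = [
    {"question": "Which component loads data?",
     "reference": "the data loader component",
     "prediction": "the data loader"},
    {"question": "What does ELT stand for?",
     "reference": "extract load transform",
     "prediction": "extract load transform"},
]

exact = sum(r["prediction"].strip().lower() == r["reference"].strip().lower()
            for r in eval_set) / len(eval_set)
f1 = sum(token_f1(r["prediction"], r["reference"]) for r in eval_set) / len(eval_set)
print(f"exact match: {exact:.2f}, mean token F1: {f1:.2f}")
```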

Posted 1 month ago

Apply

10.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We are looking for a hands-on coding Director/Technical Leader to join our AI/ML consulting practice. This role will report directly to the VP – Digital Transformation & Strategy and will extensively collaborate with and oversee a team of Data Scientists and Data Engineers to deliver ground-breaking solutions for our customers. We are looking for builders to support our efforts in Enterprise, SMB, and start-up communities. In this role, you would drive the roadmap and strategic business objectives that enable Idexcel to successfully solve our customers' most challenging AI and data problems, ensure seamless alignment with the latest industry advancements, and meet our growth goals.

Job Description: BS or master's degree in computer science, engineering, or a related technical, math, or scientific field. 10+ years of engineering experience in a medium or larger software company, including 6+ years managing high-performance teams. 12+ years of software development experience: scripting languages (Python, R), database languages (SQL, PL/SQL, PG-PL/SQL), version control (GitHub, Bitbucket, AWS CodeCommit), data structures, algorithms. 8+ years of machine learning experience: ML frameworks, ML algorithms (understanding of classification, regression, clustering, embedding, NLP, and computer vision), experience with training models, hyperparameter tuning, distributed model training, hosting and deployment of models, ML pipelines (able to whiteboard common components of ML pipelines). At least 8 years' experience in building large-scale machine learning and AI solutions. 5+ years of architecture experience: data pipelines, distributed computing engines. 8+ years of data visualization experience: Python/R frameworks such as Matplotlib, Seaborn, Plotly, ggplot2; JavaScript frameworks such as D3. Collaborate with executive leadership to conceptualize, strategize, and develop new products centered around AI/ML. Exceptional business acumen, communication, and presentation skills, and experience working directly with our senior customer stakeholders. Serve as a trusted advisor to our clients, understanding their unique challenges, goals, and opportunities related to data and AI. Be responsible for identifying hiring needs, managing P&L, and the overall capability team's performance. Have a deep focus on identifying new and extending existing business opportunities with the most strategic prospective and current clients. Deep understanding of data management, data governance, cloud and AI technologies, and their application in a business context. Strong project management skills with the ability to manage multiple projects simultaneously and deliver high-quality results.

Posted 1 month ago

Apply

0 years

12 - 24 Lacs

Bengaluru, Karnataka, India

On-site

About The Company (Industry & Sector) An advanced-technology scale-up at the crossroads of Quantum Computing, Artificial Intelligence and Semiconductor Engineering . The hardware division designs full-stack enterprise quantum computers—spanning superconducting processors, cryogenic control electronics and RF instrumentation—to unlock breakthroughs across life-sciences, finance, transportation and space. Role & Responsibilities Design and execute quantum-device experiments—from cryogenic fixture design to automated data acquisition—for superconducting-qubit characterisation. Develop and refine protocols to measure coherence times, gate fidelities, and perform quantum-state / process tomography, feeding results back into device design. Maintain, troubleshoot and optimise cryogenic measurement stacks and microwave-RF chains to guarantee low-noise, high-throughput data collection. Implement data pipelines in Python / MATLAB that process raw traces into actionable metrics and dashboards for cross-functional teams. Collaborate with quantum-processor, control-electronics and theory groups to correlate empirical results with simulations and accelerate design-of-experiments cycles. Document methodologies, publish findings and help shape the roadmap for next-generation, fault-tolerant quantum processors. Skills & Qualifications Must-Have MSc / MTech / PhD in Physics, Electrical Engineering, Materials Science or related field with quantum focus. Hands-on experience designing cryogenic or microwave testbeds and performing quantum measurements on superconducting qubits or similar platforms. Proven ability to measure and analyse device parameters (T₁/T₂, gate fidelity, tomography). Solid understanding of circuit QED and error-correction concepts relevant to superconducting hardware. Proficiency in Python (NumPy/Pandas/Matplotlib) or MATLAB for data analysis and instrument control. Strong problem-solving, communication and teamwork skills; comfortable in fast-paced R&D settings. Preferred Track record of peer-reviewed publications or conference presentations in quantum technology. Experience writing DoE-driven analysis reports that steer experimentation plans. Familiarity with cold-atom or spin-qubit platforms, autonomous calibration routines, or GPU-accelerated simulators. Knowledge of error-mitigation / bosonic-code techniques and their experimental implementation. Exposure to clean-room fabrication workflows and materials studies for superconducting devices. Contributions to open-source quantum-measurement tooling or instrument-control libraries. Skills: hamiltonian engineering,coherence times measurement,quantum-state tomography,ldpc codes,surface codes,gate fidelity measurement,python-based quantum platforms,circuit qed,automated data acquisition,numerical tool-chains,fault-tolerant architectures,superconducting-qubit error-correction schemes,computational modelling of quantum circuits,data processing in matlab,quantum device characterization,problem-solving,experimental protocols,matlab,error-mitigation techniques,quantum-software stacks,cryogenic fixture design,collaboration,quantum computing,artificial intelligence,data processing in python,quantum error-correction codes,quantum-process tomography,teamwork,python,error-correction concepts,quantum-state & process tomography,communication,qubit-control schemes,semiconductor,peer-reviewed publications,dynamical decoupling,numerical methods
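To make the "coherence times measurement" responsibility concrete, here is a minimal sketch that fits a T1 relaxation curve to simulated decay data with SciPy; the delays, noise level, and true T1 are invented for illustration.

```python
# Toy T1 extraction: fit an exponential decay to simulated qubit relaxation
# data. Delays, amplitudes, and noise level are invented placeholders.
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(0)

def decay(t, amplitude, t1, offset):
    return amplitude * np.exp(-t / t1) + offset

# Simulated measurement: excited-state population vs. delay (microseconds).
delays_us = np.linspace(0, 200, 41)
true_t1_us = 55.0
populations = decay(delays_us, 0.95, true_t1_us, 0.02)
populations += rng.normal(scale=0.02, size=delays_us.size)  # readout noise

# Fit and report the extracted coherence time with its uncertainty.
popt, pcov = curve_fit(decay, delays_us, populations, p0=(1.0, 50.0, 0.0))
t1_fit = popt[1]
t1_err = np.sqrt(np.diag(pcov))[1]
print(f"fitted T1 = {t1_fit:.1f} +/- {t1_err:.1f} us (true {true_t1_us} us)")
```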

Posted 1 month ago

Apply

0 years

12 - 24 Lacs

Bengaluru, Karnataka, India

On-site

About The Company (Industry & Sector) An advanced-technology scale-up at the crossroads of Quantum Computing, Artificial Intelligence and Semiconductor Engineering . The hardware division designs full-stack enterprise quantum computers—spanning superconducting processors, cryogenic control electronics and RF instrumentation—to unlock breakthroughs across life-sciences, finance, transportation and space. Role & Responsibilities Design and execute quantum-device experiments—from cryogenic fixture design to automated data acquisition—for superconducting-qubit characterisation. Develop and refine protocols to measure coherence times, gate fidelities, and perform quantum-state / process tomography, feeding results back into device design. Maintain, troubleshoot and optimise cryogenic measurement stacks and microwave-RF chains to guarantee low-noise, high-throughput data collection. Implement data pipelines in Python / MATLAB that process raw traces into actionable metrics and dashboards for cross-functional teams. Collaborate with quantum-processor, control-electronics and theory groups to correlate empirical results with simulations and accelerate design-of-experiments cycles. Document methodologies, publish findings and help shape the roadmap for next-generation, fault-tolerant quantum processors. Skills & Qualifications Must-Have MSc / MTech / PhD in Physics, Electrical Engineering, Materials Science or related field with quantum focus. Hands-on experience designing cryogenic or microwave testbeds and performing quantum measurements on superconducting qubits or similar platforms. Proven ability to measure and analyse device parameters (T₁/T₂, gate fidelity, tomography). Solid understanding of circuit QED and error-correction concepts relevant to superconducting hardware. Proficiency in Python (NumPy/Pandas/Matplotlib) or MATLAB for data analysis and instrument control. Strong problem-solving, communication and teamwork skills; comfortable in fast-paced R&D settings. Preferred Track record of peer-reviewed publications or conference presentations in quantum technology. Experience writing DoE-driven analysis reports that steer experimentation plans. Familiarity with cold-atom or spin-qubit platforms, autonomous calibration routines, or GPU-accelerated simulators. Knowledge of error-mitigation / bosonic-code techniques and their experimental implementation. Exposure to clean-room fabrication workflows and materials studies for superconducting devices. Contributions to open-source quantum-measurement tooling or instrument-control libraries. Skills: hamiltonian engineering,coherence times measurement,quantum-state tomography,ldpc codes,surface codes,gate fidelity measurement,python-based quantum platforms,circuit qed,automated data acquisition,numerical tool-chains,fault-tolerant architectures,superconducting-qubit error-correction schemes,computational modelling of quantum circuits,data processing in matlab,quantum device characterization,problem-solving,experimental protocols,matlab,error-mitigation techniques,quantum-software stacks,cryogenic fixture design,collaboration,quantum computing,artificial intelligence,data processing in python,quantum error-correction codes,quantum-process tomography,teamwork,python,error-correction concepts,quantum-state & process tomography,communication,qubit-control schemes,semiconductor,peer-reviewed publications,dynamical decoupling,numerical methods

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

Data Science Internship – Evoastra Ventures Pvt. Ltd. Location: Remote / Hybrid (Hyderabad HQ) Duration: upto 6 Months About Evoastra Ventures Evoastra is a next-generation research and analytics firm delivering high-impact insights across data science, market intelligence, and business strategy. We work with startups, enterprises, and academia to unlock value from data and empower the next generation of talent through real-time projects, mentorship, and innovation-driven learning. Role: Data Scientist Intern As a Data Science Intern at Evoastra, you’ll work on real-world projects involving data cleaning, analysis, predictive modeling, and data-driven storytelling. You’ll gain hands-on experience under expert mentorship and build a strong project portfolio that stands out. Key Responsibilities Assist in data collection, cleaning, and preprocessing from multiple sources Perform Exploratory Data Analysis (EDA) and visualize findings Work with statistical models and machine learning algorithms for predictive analytics Participate in live projects involving real datasets and business problems Collaborate with data scientists, analysts, and project leads on assigned tasks Present insights and outcomes through dashboards or reports Learn to deploy models using beginner-friendly tools (based on internship level) What You Will Learn End-to-end data science project lifecycle Hands-on with tools like Python, Pandas, NumPy, Scikit-learn, Matplotlib, Seaborn, Power BI, or Excel Basics of ML algorithms like linear regression, decision trees, clustering, etc. How to work with real-time datasets Basics of model evaluation, feature selection, and deployment strategies Communicating data insights like a professional Eligibility Criteria Open to students and recent graduates from any background (STEM preferred) Basic understanding of Python and statistics is a plus (not mandatory) Passion for data, analytics, and solving real-world problems Willingness to learn and complete project-based tasks on time What You’ll Get ✅ Certificate of Completion (recognized globally) ✅ Project Completion Letter with tools, techniques, and outcomes ✅ Letter of Recommendation based on performance ✅ 1-on-1 Mentorship and support from our experts ✅ Access to Exclusive Discord Community ✅ Stipend eligibility for long-term or top-performing interns ✅ Profile-building guidance (LinkedIn/Resume reviews) Important Note This is a training + internship program . As an authorized provider, you get mentorship, certifications, documentation, and live project hosting. Many of our partner colleges and industries also sponsor this program for their students. How to Apply  Fill the internship form : https://short.evoastra.com/US3AY

Posted 1 month ago

Apply

6.0 years

0 Lacs

Gurugram, Haryana, India

On-site

About Us: Website: https://www.cognitioanalytics.com/ Cognitio Analytics, founded in 2013, aims to be the preferred provider of AI/ML-driven productivity solutions for large enterprises. The company has received awards for its Smart Operations and Total Rewards Analytics solutions and is dedicated to innovation, R&D, and creating sustained value for clients. Cognitio Analytics has been recognized as a "Great Place to Work" for its commitment to fostering an innovative work environment and employee satisfaction. Our solutions include Total Rewards Analytics, powered by Cognitio's Total Rewards Data Factory; these solutions help our clients achieve better outcomes and higher ROI on investments in all kinds of Total Rewards programs. Our smart operations solutions drive productivity in complex operations, such as claims processing and commercial underwriting. These solutions, based on proprietary AI capabilities, advanced process and task mining, and a deep understanding of operations, drive effective digital transformation for our clients. Ideal qualifications, skills and experiences we are looking for are: - We are actively seeking a talented and results-driven Data Scientist to join our team and take on a leadership role in driving business outcomes through the power of data analytics and insights. - Your contributions will be instrumental in making data-informed decisions, identifying growth opportunities, and propelling our organization to new levels of success. - Doctorate/Master's/Bachelor's degree in Data Science, Statistics, Computer Science, Mathematics, Economics, Commerce or a related field. - Minimum of 6 years of experience working as a Data Scientist or in a similar analytical role, with experience leading data science projects and teams. Experience in the healthcare domain with exposure to clinical operations, financial, risk rating, fraud, digital, sales and marketing, and wellness, e-commerce or the ed-tech industry is a plus. - Proven ability to lead and mentor a team of data scientists, fostering an innovative environment. Strong decision-making and problem-solving skills to guide strategic initiatives. - Expertise in programming languages such as Python and R, and proficiency with data manipulation, analysis, and visualization libraries (e.g., pandas, NumPy, Matplotlib, seaborn). Very strong Python, and exceptional with pandas, NumPy, and advanced Python (pytest, classes, inheritance, docstrings). - Deep understanding of machine learning algorithms, model evaluation, and feature engineering. Experience with frameworks like scikit-learn, TensorFlow, or PyTorch. Experience of leading a team and handling projects with end-to-end ownership is a must. Deep understanding of ML and deep learning is a must. Basic NLP experience is highly valuable. PySpark experience is highly valuable. Competitive coding experience (LeetCode) is highly valuable. - Strong expertise in statistical modelling techniques such as regression, clustering, time series analysis, and hypothesis testing. - Experience in building and deploying machine learning models in a cloud environment: Microsoft Azure preferred (Databricks, Synapse, Data Factory, etc.). - Basic MLOps experience with FastAPI, experience with Docker, and AI governance are highly valuable. - Ability to understand business objectives, market dynamics, and strategic priorities. Demonstrated experience translating data insights into tangible business outcomes and driving data-informed decision-making.
- Excellent verbal and written communication skills - Proven experience leading data science projects, managing timelines, and delivering results within deadlines. - Strong collaboration skills with the ability to work effectively in cross-functional teams, build relationships, and foster a culture of knowledge sharing and continuous learning. "Cognitio Analytics is an equal-opportunity employer. We are committed to a work environment that celebrates diversity. We do not discriminate against any individual based on race, color, sex, national origin, age, religion, marital status, sexual orientation, gender identity, gender expression, military or veteran status, disability, or any factors protected by applicable law. All Cognitio employees are expected to understand and adhere to all Cognitio Security and Privacy related policies in order to protect Cognitio data and our client’s data. Our salary ranges are based on paying competitively for our size and industry and are one part of the total compensation package that also includes a bonus plan, equity, benefits, and other opportunities at Cognitio. Individual pay decisions are based on a number of factors, including qualifications for the role, experience level, and skillset."
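The qualifications above mention basic MLOps experience with FastAPI; a minimal model-serving sketch follows, with a stub in place of a trained estimator since the real model and feature names are unknown.

```python
# Minimal FastAPI model-serving sketch. The "model" is a stub standing in
# for a trained estimator; the feature names are illustrative assumptions.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="churn-scoring-service")


class Features(BaseModel):
    tenure_months: float
    monthly_spend: float


def predict_stub(features: Features) -> float:
    # Placeholder for model.predict_proba(...) on a trained estimator.
    return 0.8 if features.tenure_months < 6 else 0.2


@app.post("/score")
def score(features: Features):
    return {"churn_probability": predict_stub(features)}

# Run locally with:  uvicorn app:app --reload   (assuming this file is app.py)
```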

Posted 1 month ago

Apply

0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

• Develop strategies/solutions to solve problems in logical yet creative ways, leveraging state-of-the-art machine learning, deep learning and Gen AI techniques. • Technically lead a team of data scientists to produce project deliverables on time and with high quality. • Identify and address client needs in different domains by analyzing large and complex data sets, processing, cleansing, and verifying the integrity of data, and performing exploratory data analysis (EDA) using state-of-the-art methods. • Select features, build and optimize classifiers/regressors, etc. using machine learning and deep learning techniques. • Enhance data collection procedures to include information that is relevant for building analytical systems, and ensure data quality and accuracy. • Perform ad-hoc analysis and present results in a clear manner to both technical and non-technical stakeholders. • Create custom reports and presentations with strong data visualization and storytelling skills to effectively communicate analytical conclusions to senior officials in a company and other stakeholders. • Expertise in data mining, EDA, feature selection, model building, and optimization using machine learning and deep learning techniques. • Strong programming skills in Python. • Excellent communication and interpersonal skills, with the ability to present complex analytical concepts to both technical and non-technical stakeholders. Primary Skills: - Excellent understanding and hands-on experience of data science and machine learning techniques and algorithms for supervised and unsupervised problems, NLP, computer vision and Gen AI. Good applied statistics skills, such as distributions, statistical inference and testing, etc. - Excellent understanding and hands-on experience of building deep learning models for text and image analytics (such as ANNs, CNNs, LSTMs, transfer learning, encoder-decoder architectures, etc.). - Proficient in coding in common data science languages and tools such as R and Python. - Experience with common data science toolkits, such as NumPy, Pandas, Matplotlib, statsmodels, scikit-learn, SciPy, NLTK, spaCy, OpenCV, etc. - Experience with common data science frameworks such as TensorFlow, Keras, PyTorch, XGBoost, etc. - Exposure or knowledge in cloud (Azure/AWS). - Experience with deployment of models in production.
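As a small illustration of the deep-learning model building listed under Primary Skills, here is a minimal PyTorch training loop on synthetic data; the architecture and dataset are placeholders, not a production model.

```python
# Minimal PyTorch classifier on synthetic data: one hidden layer,
# binary cross-entropy loss, a short training loop. Purely illustrative.
import torch
from torch import nn

torch.manual_seed(0)

# Synthetic dataset: 2 features, label = 1 when their sum is positive.
X = torch.randn(512, 2)
y = (X.sum(dim=1) > 0).float().unsqueeze(1)

model = nn.Sequential(nn.Linear(2, 16), nn.ReLU(), nn.Linear(16, 1))
loss_fn = nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

for epoch in range(100):
    optimizer.zero_grad()
    logits = model(X)
    loss = loss_fn(logits, y)
    loss.backward()
    optimizer.step()

with torch.no_grad():
    accuracy = ((torch.sigmoid(model(X)) > 0.5).float() == y).float().mean()
print(f"final loss {loss.item():.3f}, train accuracy {accuracy.item():.2%}")
```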

Posted 1 month ago

Apply

3.0 - 7.0 years

0 Lacs

Hyderābād

On-site

Job requisition ID: 81110
Date: Jul 3, 2025
Location: Hyderabad
Designation: Deputy Manager
Entity: Deloitte Touche Tohmatsu India LLP

Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team
Innovation, transformation and leadership occur in many ways. At Deloitte, our ability to help solve clients' most complex issues is distinct. We deliver strategy and implementation, from a business and technology view, to help you lead in the markets where you compete. Learn more about our Tax Practice.

What impact will you make?
Every day, your work will make an impact that matters, while you thrive in a dynamic culture of inclusion, collaboration, and high performance. As the undisputed leader in professional services, Deloitte is where you'll find unrivalled opportunities to succeed and realize your full potential.

Work you'll do
Deloitte has institutionalized a new AI and Analytics capability for Tax Technology Consulting. This group is part of the Deloitte South Asia Tax & Legal function and focuses on embedding AI in everything we do, for our clients and for ourselves, across all businesses of Deloitte. You will be engaged in internal projects to disrupt the way we operate and will focus on building assets and solutions for our clients, including the latest technologies and methods around predictive models, prescriptive analytics, generative AI, etc.

We are looking for a highly skilled data scientist to join our dynamic team. The ideal candidate will have a solid background in artificial intelligence and machine learning, with hands-on experience in frameworks such as TensorFlow, PyTorch, and scikit-learn. The candidate should possess a deep understanding of data structures, algorithms, and distributed computing. Additionally, experience in deploying machine learning models in production environments, working with various database systems, and familiarity with version control, containerization, and cloud platforms are essential for success in this role. Candidates with strong storyboarding skills and a penchant for converting AI-driven mathematical insights into stories will be given preference.

Responsibilities:
- Collaborate with cross-functional teams to translate business requirements into concrete implementations of models, algorithms, and technologies.
- Execute the product road map and plan the programs and initiatives defined by the product owners.
- Independently solve complex business problems with minimal supervision, escalating more complex issues to the appropriate next level.
- Develop and maintain software programs, algorithms, dashboards, information tools, and queries to clean, model, integrate, and evaluate data sets.
- Build and optimize pipelines for data intake, validation, mining, modelling, and visualization by applying best practices to the engineering of large data sets.
- Develop and implement advanced machine learning algorithms and models for various applications.
- Apply the latest advances in deep learning, machine learning, and natural language processing to improve the performance of legacy models.
- Customize the latest available large language models to develop generative AI solutions for business problems across multiple functional areas.
- Apply an A/B testing framework and test model quality.
- Take models to production using cloud technologies.
- Provide findings and analysis to support informed business decisions.
- Stay updated with the latest developments in AI/ML technologies and contribute to the continuous improvement of our systems.

Requirements:
- Minimum of 3-7 years of relevant work experience.
- Master's degree in a related field (Statistics, Mathematics, or Computer Science) or MBA in Data Science/AI/Analytics.
- Experience with database systems such as MySQL, PostgreSQL, or MongoDB.
- Experience in collecting and manipulating structured and unstructured data from multiple data systems (on-premises, cloud-based data sources, APIs, etc.).
- Familiarity with version control systems, preferably Git.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Solid understanding of data structures, algorithms, and distributed computing.
- Excellent knowledge of Jupyter Notebooks for experimentation and prototyping.
- Strong programming skills in Python.
- In-depth understanding of machine learning, deep learning, and natural language processing (NLP) algorithms.
- Experience with popular machine learning frameworks such as TensorFlow, PyTorch, or scikit-learn.
- Knowledge of containerization tools such as Docker.
- Experience in deploying machine learning models in production environments.
- Excellent problem-solving and communication skills.
- Proficient in data visualization tools such as Tableau or Matplotlib, or dashboarding packages like Flask and Streamlit.
- Good working knowledge of MS PowerPoint and storyboarding skills to translate mathematical results into business insights.

Your role as a leader
At Deloitte India, we believe in the importance of leadership at all levels. We expect our people to embrace and live our purpose by challenging themselves to identify issues that are most important for our clients, our people, and society, and to make an impact that matters. In addition to living our purpose, Executives across our organization:
- Build their own understanding of our purpose and values; explore opportunities for impact.
- Demonstrate strong commitment to personal learning and development; act as brand ambassadors to help attract top talent.
- Understand expectations and demonstrate personal accountability for keeping performance on track.
- Actively focus on developing effective communication and relationship-building skills.
- Understand how their daily work contributes to the priorities of the team and the business.

How you'll grow
At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to help build world-class skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Center.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose
Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.

Recruiter tips
We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you're applying to. Check out recruiting tips from Deloitte professionals.

Deloitte refers to one or more of Deloitte Touche Tohmatsu Limited, a UK private company limited by guarantee ("DTTL"), its network of member firms, and their related entities. DTTL and each of its member firms are legally separate and independent entities. DTTL (also referred to as "Deloitte Global") does not provide services to clients. Please see www.deloitte.com/about for a more detailed description of DTTL and its member firms. This communication is for internal distribution and use only among personnel of Deloitte Touche Tohmatsu Limited, its member firms, and their related entities (collectively, the "Deloitte network"). None of the Deloitte network shall be responsible for any loss whatsoever sustained by any person who relies on this communication. © 2025. For information, contact Deloitte Touche Tohmatsu Limited.
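
As a hedged illustration of the "apply an A/B testing framework and test model quality" responsibility above, the sketch below compares two model variants with a two-proportion z-test from statsmodels. The success counts are made-up placeholders, not real results.

```python
# Minimal sketch of an A/B test comparing two model variants on held-out
# traffic. Counts below are illustrative placeholders.
from statsmodels.stats.proportion import proportions_ztest

# Successes (e.g. accepted recommendations / correct predictions) and
# sample sizes for the champion (A) and challenger (B) models.
successes = [412, 455]
samples = [1000, 1000]

z_stat, p_value = proportions_ztest(count=successes, nobs=samples)
print(f"z = {z_stat:.3f}, p = {p_value:.4f}")

if p_value < 0.05:
    print("Difference is statistically significant; consider promoting B.")
else:
    print("No significant difference detected; keep the champion model.")
```

In practice the success metric, traffic split, and significance threshold would be agreed with stakeholders before the experiment starts.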

Posted 1 month ago

Apply

2.0 - 4.0 years

0 Lacs

Hyderābād

On-site

Job requisition ID: 81109
Date: Jul 3, 2025
Location: Hyderabad
Designation: Senior Executive
Entity: Deloitte Touche Tohmatsu India LLP

Your potential, unleashed.
India's impact on the global economy has increased at an exponential rate, and Deloitte presents an opportunity to unleash and realise your potential amongst cutting-edge leaders and organisations shaping the future of the region, and indeed, the world beyond. At Deloitte, you can bring your whole self to work, every day. Combine that with our drive to propel with purpose and you have the perfect playground to collaborate, innovate, grow, and make an impact that matters.

The Team
Deloitte delivers deep knowledge of tax and statutory requirements as well as a breadth of experience applying them in practice worldwide. Practical tax advice combined with our consistent tax compliance framework instils confidence that a consistent approach is followed across jurisdictions. We help simplify tax management and oversight while providing global visibility for making informed strategic decisions, all with the ease of working with a global provider. Learn more about our Tax Practice.

Your work profile
Deloitte has institutionalized a new AI and Analytics capability for Tax Technology Consulting. This group is part of the Deloitte South Asia Tax & Legal function and focuses on embedding AI in everything we do, for our clients and for ourselves, across all businesses of Deloitte. You will be engaged in internal projects to disrupt the way we operate and will focus on building assets and solutions for our clients, including the latest technologies and methods around predictive models, prescriptive analytics, generative AI, etc.

We are looking for a highly skilled data scientist to join our dynamic team. The ideal candidate will have a solid background in artificial intelligence and machine learning, with hands-on experience in frameworks such as TensorFlow, PyTorch, and scikit-learn. The candidate should possess a deep understanding of data structures, algorithms, and distributed computing. Additionally, experience in deploying machine learning models in production environments, working with various database systems, and familiarity with version control, containerization, and cloud platforms are essential for success in this role. Candidates with strong storyboarding skills and a penchant for converting AI-driven mathematical insights into stories will be given preference.

Responsibilities:
- Collaborate with cross-functional teams to translate business requirements into concrete implementations of models, algorithms, and technologies.
- Execute the product road map and plan the programs and initiatives defined by the product owners.
- Independently solve complex business problems with minimal supervision, escalating more complex issues to the appropriate next level.
- Develop and maintain software programs, algorithms, dashboards, information tools, and queries to clean, model, integrate, and evaluate data sets.
- Build and optimize pipelines for data intake, validation, mining, modelling, and visualization by applying best practices to the engineering of large data sets.
- Develop and implement advanced machine learning algorithms and models for various applications.
- Apply the latest advances in deep learning, machine learning, and natural language processing to improve the performance of legacy models.
- Customize the latest available large language models to develop generative AI solutions for business problems across multiple functional areas.
- Apply an A/B testing framework and test model quality.
- Take models to production using cloud technologies.
- Provide findings and analysis to support informed business decisions.
- Stay updated with the latest developments in AI/ML technologies and contribute to the continuous improvement of our systems.

Requirements:
- Minimum of 2-4 years of relevant work experience.
- Master's degree in a related field (Statistics, Mathematics, or Computer Science) or MBA in Data Science/AI/Analytics.
- Experience with database systems such as MySQL, PostgreSQL, or MongoDB.
- Experience in collecting and manipulating structured and unstructured data from multiple data systems (on-premises, cloud-based data sources, APIs, etc.).
- Familiarity with version control systems, preferably Git.
- Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
- Solid understanding of data structures, algorithms, and distributed computing.
- Excellent knowledge of Jupyter Notebooks for experimentation and prototyping.
- Strong programming skills in Python.
- In-depth understanding of machine learning, deep learning, and natural language processing (NLP) algorithms.
- Experience with popular machine learning frameworks such as TensorFlow, PyTorch, or scikit-learn.
- Knowledge of containerization tools such as Docker.
- Experience in deploying machine learning models in production environments.
- Excellent problem-solving and communication skills.
- Proficient in data visualization tools such as Tableau or Matplotlib, or dashboarding packages like Flask and Streamlit.
- Good working knowledge of MS PowerPoint and storyboarding skills to translate mathematical results into business insights.

How you'll grow
At Deloitte, our professional development plan focuses on helping people at every level of their career to identify and use their strengths to do their best work every day. From entry-level employees to senior leaders, we believe there's always room to learn. We offer opportunities to help build world-class skills in addition to hands-on experience in the global, fast-changing business world. From on-the-job learning experiences to formal development programs at Deloitte University, our professionals have a variety of opportunities to continue to grow throughout their career. Explore Deloitte University, The Leadership Center.

Benefits
At Deloitte, we know that great people make a great organization. We value our people and offer employees a broad range of benefits. Learn more about what working at Deloitte can mean for you.

Our purpose
Deloitte is led by a purpose: to make an impact that matters. Every day, Deloitte people are making a real impact in the places they live and work. We pride ourselves on doing not only what is good for clients, but also what is good for our people and the communities in which we live and work, always striving to be an organization that is held up as a role model of quality, integrity, and positive change. Learn more about Deloitte's impact on the world.

Recruiter tips
We want job seekers exploring opportunities at Deloitte to feel prepared and confident. To help you with your interview, we suggest that you do your research: know some background about the organization and the business area you're applying to. Check out recruiting tips from Deloitte professionals.

Deloitte refers to one or more of Deloitte Touche Tohmatsu Limited, a UK private company limited by guarantee ("DTTL"), its network of member firms, and their related entities. DTTL and each of its member firms are legally separate and independent entities. DTTL (also referred to as "Deloitte Global") does not provide services to clients. Please see www.deloitte.com/about for a more detailed description of DTTL and its member firms. This communication is for internal distribution and use only among personnel of Deloitte Touche Tohmatsu Limited, its member firms, and their related entities (collectively, the "Deloitte network"). None of the Deloitte network shall be responsible for any loss whatsoever sustained by any person who relies on this communication. © 2025. For information, contact Deloitte Touche Tohmatsu Limited.
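
Since this role lists Streamlit and Matplotlib for communicating model results, here is a minimal, hedged dashboard sketch. The metric values and model names are placeholders, and the file name in the run command is an assumption.

```python
# Minimal Streamlit sketch for sharing model results with non-technical
# stakeholders. Run with: streamlit run app.py
import matplotlib.pyplot as plt
import pandas as pd
import streamlit as st

st.title("Model performance overview")

# Placeholder evaluation metrics; in practice these would come from the
# model registry or an evaluation pipeline.
metrics = pd.DataFrame(
    {"model": ["baseline", "fine-tuned"], "f1": [0.71, 0.83], "auc": [0.78, 0.90]}
)

st.dataframe(metrics)

chosen = st.selectbox("Metric to plot", ["f1", "auc"])

fig, ax = plt.subplots()
ax.bar(metrics["model"], metrics[chosen])
ax.set_ylabel(chosen)
ax.set_ylim(0, 1)
st.pyplot(fig)
```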

Posted 1 month ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description
AI engineer with Python experience developing applications powered by LLMs and integrating with data warehouses such as GCP BigQuery and other standard data sources.

Responsibilities
- Design, develop, and maintain core functionalities and backend services using Python, focusing on AI and LLM integration.
- Integrate Large Language Models (LLMs) such as OpenAI GPT, Llama, or others into applications to create intelligent, AI-powered features.
- Explore and apply LLM capabilities, including summarization, classification, RAG (Retrieval-Augmented Generation), prompt engineering, and prompt pipelines.
- Develop and implement efficient data processing pipelines for structured and unstructured data, ensuring data quality for AI models.
- Collaborate with cross-functional teams (e.g., product managers, data scientists, DevOps) to define, design, and ship new AI features and integrate LLMs effectively.
- Write clean, maintainable, well-tested, and well-documented Python code, adhering to best practices and coding standards.
- Ensure the reliability, performance, scalability, and security of AI/LLM-based applications, identifying and correcting bottlenecks.
- Conduct technical analysis of tasks, participate actively in scrum meetings, and deliver the value committed for sprints.
- Stay up to date with the latest advancements in generative AI, LLM architectures, machine learning, and related technologies, sharing insights with the team.
- Participate in code reviews, contribute to technical improvements, and assist in troubleshooting and debugging issues.

Qualifications
Required Skills and Qualifications:
- 3+ years of proven experience in Python software development (full stack or backend), with a strong emphasis on backend development.
- Strong knowledge of Python data structures and algorithms.
- Proficiency with Python and relevant libraries such as Pandas, NumPy, SciPy, scikit-learn, PyTorch, TensorFlow, and Matplotlib.
- Solid understanding of machine learning concepts and algorithms.
- Experience with REST APIs and building scalable backend services.
- Familiarity with database technologies (e.g., PostgreSQL, MongoDB, SQL/NoSQL).
- Familiarity or experience with cloud technologies (AWS, GCP, Azure, etc.).
- Experience with version control systems, particularly Git.
- Strong problem-solving skills, analytical abilities, and attention to detail.
- Excellent communication and collaboration skills, with the ability to explain complex technical concepts clearly.

Preferred Skills and Qualifications (Nice to Have):
- Hands-on experience with Large Language Models (LLMs) using RAG and their application in real-world scenarios.
- Familiarity with data quality and data governance concepts.
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related technical field.
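
As a rough illustration of the RAG-over-BigQuery pattern this role describes, the sketch below pulls context rows from a hypothetical `doc_chunks` table and feeds them to an LLM. The project, dataset, table, and model names are placeholder assumptions, credentials are assumed to be configured in the environment, and the OpenAI call follows the openai>=1.0 SDK; a production system would use vector or semantic search rather than the naive keyword match shown here.

```python
# Hedged sketch of a minimal Retrieval-Augmented Generation (RAG) flow
# against BigQuery. Table/model names are placeholders.
from google.cloud import bigquery
from openai import OpenAI

bq = bigquery.Client()   # uses application-default credentials
llm = OpenAI()           # reads OPENAI_API_KEY from the environment

def retrieve_context(question: str, limit: int = 5) -> str:
    # Hypothetical table of pre-chunked documents; naive keyword retrieval
    # stands in for a proper vector / semantic search.
    sql = f"""
        SELECT chunk_text
        FROM `my_project.knowledge_base.doc_chunks`
        WHERE LOWER(chunk_text) LIKE @pattern
        LIMIT {int(limit)}
    """
    job = bq.query(
        sql,
        job_config=bigquery.QueryJobConfig(
            query_parameters=[
                bigquery.ScalarQueryParameter(
                    "pattern", "STRING", f"%{question.lower()}%"
                )
            ]
        ),
    )
    return "\n".join(row.chunk_text for row in job.result())

def answer(question: str) -> str:
    context = retrieve_context(question)
    response = llm.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": "Answer using only the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(answer("What is our refund policy?"))
```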

Posted 1 month ago

Apply

2.0 years

0 Lacs

Mohali

Remote

Job Title: AI Trainer (Project-Based / Freelance)
Location: On-Site
Job Type: Project-Based / Freelance
Experience Required: 2+ years in AI/ML training or practical AI development

Job Summary:
We are seeking a skilled AI Trainer on a project basis to conduct hands-on training sessions for interns and new employees. The ideal candidate should have strong expertise in AI tools and the ability to explain AI/ML concepts clearly while providing practical training.

Key Responsibilities:
- Deliver project-based training programs on AI/ML fundamentals and tools.
- Train interns and freshers on essential AI tools and frameworks, including:
  - Python and libraries like NumPy, Pandas, Scikit-learn, TensorFlow, Keras, PyTorch
  - Jupyter Notebook / Google Colab for practical coding
  - OpenAI tools (e.g., ChatGPT, API usage)
  - NLP libraries such as spaCy, NLTK
  - Data visualization tools like Matplotlib, Seaborn
  - Version control using Git/GitHub
- Prepare training materials, hands-on assignments, and evaluations.
- Provide feedback and support to trainees during the training period.
- Update training content based on the latest AI developments.
- Work remotely with flexible hours, delivering sessions according to project schedules.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, AI, or a related field.
- Minimum 2 years' experience in AI/ML development and/or training.
- Proven ability to train or mentor beginners in AI tools and technologies.
- Excellent communication and presentation skills.
- Self-motivated and able to manage training projects independently.

Preferred:
- Experience with cloud AI platforms (AWS, GCP, Azure).
- Knowledge of Generative AI tools.
- Relevant certifications in AI/ML training.

Job Types: Contractual / Temporary, Freelance
Contract length: 3 months
Pay: From ₹2,000.00 per month
Schedule: Day shift, Morning shift
Supplemental Pay: Commission pay
Language: English (Preferred)
Work Location: In person
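
For context, here is a minimal example of the kind of hands-on Pandas/Matplotlib/Seaborn exercise such a training programme might include. The dataset is synthetic and generated in the script, not real.

```python
# A small hands-on lab: synthetic data, Pandas wrangling and a Seaborn plot.
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import seaborn as sns

rng = np.random.default_rng(seed=0)
df = pd.DataFrame({"study_hours": rng.uniform(0, 10, 200)})
# Score depends on hours plus noise; trainees can fit a model to recover it.
df["score"] = 35 + 5 * df["study_hours"] + rng.normal(0, 8, 200)

print(df.describe())  # basic Pandas summary statistics

sns.regplot(data=df, x="study_hours", y="score", scatter_kws={"alpha": 0.4})
plt.title("Study hours vs. exam score (synthetic data)")
plt.tight_layout()
plt.show()
```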

Posted 1 month ago

Apply

5.0 - 10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Job Requirements
Role/Job Title: Senior Data Scientist
Function/Department: Data & Analytics

Job Purpose
In this specialized role, you will leverage your expertise in machine learning and statistics to derive valuable insights from data. Your role will include developing predictive models, interpreting data, and working closely with our ML engineers to ensure the effective deployment and functioning of these models.

Key / Primary Responsibilities
- Lead cross-functional teams in the design, development, and deployment of Generative AI solutions, with a strong focus on Large Language Models (LLMs).
- Architect, train, and fine-tune state-of-the-art LLMs (e.g., GPT, BERT, T5) for various business applications, ensuring alignment with project goals.
- Deploy and scale LLM-based solutions, integrating them seamlessly into production environments and optimizing for performance and efficiency.
- Develop and maintain machine learning workflows and pipelines for training, evaluating, and deploying Generative AI models, using Python or R, and leveraging libraries like Hugging Face Transformers, TensorFlow, and PyTorch.
- Collaborate with product, data, and engineering teams to define and refine use cases for LLM applications such as conversational agents, content generation, and semantic search.
- Design and implement fine-tuning strategies to adapt pre-trained models to domain-specific tasks, ensuring high relevance and accuracy.
- Evaluate and optimize LLM performance, including handling challenges such as prompt engineering, inference time, and model bias.
- Manage and process large, unstructured datasets using SQL and NoSQL databases, ensuring smooth integration with AI models.
- Build and deploy AI-driven APIs and services, providing scalable access to LLM-based solutions.
- Use data visualization tools (e.g., Matplotlib, Seaborn, Tableau) to communicate AI model performance, insights, and results to non-technical stakeholders.

Secondary Responsibilities
- Contribute to data analysis projects, with a strong emphasis on text analytics, natural language understanding, and Generative AI applications.
- Build, validate, and deploy predictive models specifically tailored to text data, including models for text generation, classification, and entity recognition.
- Handle large, unstructured text datasets, performing essential preprocessing and data cleaning steps, such as tokenization, lemmatization, and noise removal, for machine learning and NLP tasks.
- Work with cutting-edge text data processing techniques, ensuring high-quality input for training and fine-tuning Large Language Models (LLMs).
- Collaborate with cross-functional teams to develop and deploy scalable AI-powered solutions that process and analyze textual data at scale.

Key Success Metrics
- Ensure timely deliverables.
- Spot training-infrastructure fixes.
- Lead technical aspects of the projects.
- Error-free deliverables.

Education Qualification
Graduation: Bachelor of Science (B.Sc) / Bachelor of Technology (B.Tech) / Bachelor of Computer Applications (BCA)
Post-Graduation: Master of Science (M.Sc) / Master of Technology (M.Tech) / Master of Computer Applications (MCA)
Experience: 5-10 years of relevant experience
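
A hedged sketch of the text-preprocessing steps named in the secondary responsibilities (tokenization, lemmatization, noise removal), using spaCy. It assumes the small English model has been installed with `python -m spacy download en_core_web_sm`; the example sentence is a placeholder.

```python
# Minimal text-cleaning sketch: tokenise, lemmatise, and drop noise tokens.
import spacy

nlp = spacy.load("en_core_web_sm")  # assumes the model is installed

def clean_text(text: str) -> list[str]:
    doc = nlp(text.lower())
    return [
        token.lemma_              # lemmatised form of the token
        for token in doc
        if not token.is_stop      # drop stop words
        and not token.is_punct    # drop punctuation
        and token.is_alpha        # keep alphabetic tokens only (drops URLs, digits)
    ]

print(clean_text("Our customers were complaining about delayed refunds, see https://example.com"))
```

Output tokens like this typically feed feature extraction or fine-tuning data preparation; for LLM fine-tuning itself, the model's own tokenizer would be used instead.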

Posted 1 month ago

Apply

3.0 years

0 Lacs

Chennai

On-site

AI engineer with Python experience developing applications powered by LLMs and integrating with data warehouses such as GCP BigQuery and other standard data sources.

Required Skills and Qualifications:
- 3+ years of proven experience in Python software development (full stack or backend), with a strong emphasis on backend development.
- Strong knowledge of Python data structures and algorithms.
- Proficiency with Python and relevant libraries such as Pandas, NumPy, SciPy, scikit-learn, PyTorch, TensorFlow, and Matplotlib.
- Solid understanding of machine learning concepts and algorithms.
- Experience with REST APIs and building scalable backend services.
- Familiarity with database technologies (e.g., PostgreSQL, MongoDB, SQL/NoSQL).
- Familiarity or experience with cloud technologies (AWS, GCP, Azure, etc.).
- Experience with version control systems, particularly Git.
- Strong problem-solving skills, analytical abilities, and attention to detail.
- Excellent communication and collaboration skills, with the ability to explain complex technical concepts clearly.

Preferred Skills and Qualifications (Nice to Have):
- Hands-on experience with Large Language Models (LLMs) using RAG and their application in real-world scenarios.
- Familiarity with data quality and data governance concepts.
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, or a related technical field.

Responsibilities:
- Design, develop, and maintain core functionalities and backend services using Python, focusing on AI and LLM integration.
- Integrate Large Language Models (LLMs) such as OpenAI GPT, Llama, or others into applications to create intelligent, AI-powered features.
- Explore and apply LLM capabilities, including summarization, classification, RAG (Retrieval-Augmented Generation), prompt engineering, and prompt pipelines.
- Develop and implement efficient data processing pipelines for structured and unstructured data, ensuring data quality for AI models.
- Collaborate with cross-functional teams (e.g., product managers, data scientists, DevOps) to define, design, and ship new AI features and integrate LLMs effectively.
- Write clean, maintainable, well-tested, and well-documented Python code, adhering to best practices and coding standards.
- Ensure the reliability, performance, scalability, and security of AI/LLM-based applications, identifying and correcting bottlenecks.
- Conduct technical analysis of tasks, participate actively in scrum meetings, and deliver the value committed for sprints.
- Stay up to date with the latest advancements in generative AI, LLM architectures, machine learning, and related technologies, sharing insights with the team.
- Participate in code reviews, contribute to technical improvements, and assist in troubleshooting and debugging issues.
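
Where the posting asks for scalable REST backends around Python models, a minimal, hedged FastAPI sketch like the one below illustrates the pattern. The endpoint name and the stand-in "model" are assumptions, not part of the role description; any scikit-learn or LLM-backed callable could sit behind the same interface.

```python
# Minimal sketch: expose a prediction callable behind a REST endpoint.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="Inference service")

class PredictRequest(BaseModel):
    text: str

class PredictResponse(BaseModel):
    label: str
    score: float

def dummy_model(text: str) -> tuple[str, float]:
    # Placeholder logic standing in for a real model or LLM call.
    positive = sum(w in text.lower() for w in ("good", "great", "excellent"))
    return ("positive", 0.9) if positive else ("neutral", 0.5)

@app.post("/predict", response_model=PredictResponse)
def predict(req: PredictRequest) -> PredictResponse:
    label, score = dummy_model(req.text)
    return PredictResponse(label=label, score=score)

# Run locally with: uvicorn app:app --reload
```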

Posted 1 month ago

Apply

7.0 - 12.0 years

0 Lacs

Chennai

On-site

Job Summary:
We are looking for a skilled Python Developer with 7 to 12 years of experience to design, develop, and maintain high-quality back-end systems and applications. The ideal candidate will have expertise in Python and related frameworks, with a focus on building scalable, secure, and efficient software solutions. This role requires a strong problem-solving mindset, collaboration with cross-functional teams, and a commitment to delivering innovative solutions that meet business objectives.

Responsibilities

Application and Back-End Development:
- Design, implement, and maintain back-end systems and APIs using Python frameworks such as Django, Flask, or FastAPI, focusing on scalability, security, and efficiency.
- Build and integrate scalable RESTful APIs, ensuring seamless interaction between front-end systems and back-end services.
- Write modular, reusable, and testable code following Python's PEP 8 coding standards and industry best practices.
- Develop and optimize robust database schemas for relational and non-relational databases (e.g., PostgreSQL, MySQL, MongoDB), ensuring efficient data storage and retrieval.
- Leverage cloud platforms like AWS, Azure, or Google Cloud for deploying scalable back-end solutions.
- Implement caching mechanisms using tools like Redis or Memcached to optimize performance and reduce latency.

AI/ML Development:
- Build, train, and deploy machine learning (ML) models for real-world applications, such as predictive analytics, anomaly detection, natural language processing (NLP), recommendation systems, and computer vision.
- Work with popular machine learning and AI libraries/frameworks, including TensorFlow, PyTorch, Keras, and scikit-learn, to design custom models tailored to business needs.
- Process, clean, and analyze large datasets using Python tools such as Pandas, NumPy, and PySpark to enable efficient data preparation and feature engineering.
- Develop and maintain pipelines for data preprocessing, model training, validation, and deployment using tools like MLflow, Apache Airflow, or Kubeflow.
- Deploy AI/ML models into production environments and expose them as RESTful or GraphQL APIs for integration with other services.
- Optimize machine learning models to reduce computational costs and ensure smooth operation in production systems.
- Collaborate with data scientists and analysts to validate models, assess their performance, and ensure their alignment with business objectives.
- Implement model monitoring and lifecycle management to maintain accuracy over time, addressing data drift and retraining models as necessary.
- Experiment with cutting-edge AI techniques such as deep learning, reinforcement learning, and generative models to identify innovative solutions for complex challenges.
- Ensure ethical AI practices, including transparency, bias mitigation, and fairness in deployed models.

Performance Optimization and Debugging:
- Identify and resolve performance bottlenecks in applications and APIs to enhance efficiency.
- Use profiling tools to debug and optimize code for memory and speed improvements.
- Implement caching mechanisms to reduce latency and improve application responsiveness.

Testing, Deployment, and Maintenance:
- Write and maintain unit tests, integration tests, and end-to-end tests using Pytest, Unittest, or Nose.
- Collaborate on setting up CI/CD pipelines to automate testing, building, and deployment processes.
- Deploy and manage applications in production environments with a focus on security, monitoring, and reliability.
- Monitor and troubleshoot live systems, ensuring uptime and responsiveness.

Collaboration and Teamwork:
- Work closely with front-end developers, designers, and product managers to implement new features and resolve issues.
- Participate in Agile ceremonies, including sprint planning, stand-ups, and retrospectives, to ensure smooth project delivery.
- Provide mentorship and technical guidance to junior developers, promoting best practices and continuous improvement.

Required Skills and Qualifications

Technical Expertise:
- Strong proficiency in Python and its core libraries, with hands-on experience in frameworks such as Django, Flask, or FastAPI.
- Solid understanding of RESTful API development, integration, and optimization.
- Experience working with relational and non-relational databases (e.g., PostgreSQL, MySQL, MongoDB).
- Familiarity with containerization tools like Docker and orchestration platforms like Kubernetes.
- Expertise in using Git for version control and collaborating in distributed teams.
- Knowledge of CI/CD pipelines and tools like Jenkins, GitHub Actions, or CircleCI.
- Strong understanding of software development principles, including OOP, design patterns, and MVC architecture.

Preferred Skills:
- Experience with asynchronous programming using libraries like asyncio, Celery, or RabbitMQ.
- Knowledge of data visualization tools (e.g., Matplotlib, Seaborn, Plotly) for generating insights.
- Exposure to machine learning frameworks (e.g., TensorFlow, PyTorch, scikit-learn) is a plus.
- Familiarity with big data frameworks like Apache Spark or Hadoop.
- Experience with serverless architecture using AWS Lambda, Azure Functions, or Google Cloud Run.

Soft Skills:
- Strong problem-solving abilities with a keen eye for detail and quality.
- Excellent communication skills to effectively collaborate with cross-functional teams.
- Adaptability to changing project requirements and emerging technologies.
- Self-motivated with a passion for continuous learning and innovation.

Education:
Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field.

Job Category: Software Division
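
One way to read the caching responsibility above: a minimal, hedged cache-aside sketch using Redis with a TTL. The host, key scheme, and the "expensive" function are placeholder assumptions; a real service would wire this into a Flask/FastAPI endpoint and handle cache invalidation explicitly.

```python
# Minimal cache-aside pattern: look up Redis first, compute on a miss,
# and store the result with a time-to-live.
import json
import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)

def expensive_report(customer_id: int) -> dict:
    # Stand-in for a slow database aggregation or model inference call.
    return {"customer_id": customer_id, "lifetime_value": 1234.56}

def cached_report(customer_id: int, ttl_seconds: int = 300) -> dict:
    key = f"report:{customer_id}"
    hit = cache.get(key)
    if hit is not None:
        return json.loads(hit)  # cache hit: skip the slow path
    result = expensive_report(customer_id)
    cache.setex(key, ttl_seconds, json.dumps(result))  # cache with TTL
    return result

print(cached_report(42))
```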

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
