
102482 Python Jobs - Page 38

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

About the Role: Build and optimize RESTful APIs and internal service layers (e.g., for AI agents, data processing).

Responsibilities: Collaborate with frontend, ML, and infrastructure teams to ship end-to-end features. Integrate third-party APIs (e.g., CRM tools, messaging apps, databases). Own deployments, testing, and monitoring for the services you build. Contribute to architectural decisions and continuously improve system performance and reliability.

Qualifications: 4–6 years of experience in backend development using Python 3.x.

Required Skills: Strong expertise in FastAPI, Flask, or Django. Solid understanding of REST API design, asynchronous programming, and task queues (Celery, Redis). Experience working with both SQL (PostgreSQL) and NoSQL (MongoDB) databases. Familiarity with Docker, Git, and CI/CD pipelines. Comfortable working with cloud environments like AWS or GCP. Strong debugging, documentation, and testing practices. Good communication and ability to collaborate in a fast-paced environment.
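Purely as an illustration of the stack this listing names (FastAPI with async endpoints and a Celery/Redis task queue), here is a minimal sketch; the route, task, and broker URL are hypothetical and not taken from the posting.

```python
# Minimal sketch: async FastAPI endpoint that offloads slow work to a Celery task.
# All names (module, route, broker URL) are illustrative assumptions.
from celery import Celery
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()
celery_app = Celery("worker", broker="redis://localhost:6379/0")

@celery_app.task
def enrich_record(record_id: int) -> None:
    # Placeholder for a long-running job, e.g. calling a CRM or messaging API.
    ...

class EnrichRequest(BaseModel):
    record_id: int

@app.post("/records/enrich")
async def enrich(req: EnrichRequest):
    # Queue the heavy work in Celery and return immediately with the task id.
    task = enrich_record.delay(req.record_id)
    return {"task_id": task.id, "status": "queued"}
```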

Posted 20 hours ago

Apply

8.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

At Franklin Templeton, we’re driving our industry forward by developing new and innovative ways to help our clients achieve their investment goals. Our dynamic and diversified firm spans asset management, wealth management, and fintech, offering many ways to help investors make progress toward their goals. Our talented teams working around the globe bring expertise that’s both broad and unique. From our welcoming, inclusive, and flexible culture to our global and diverse business, we offer opportunities not only to help you reach your potential but also to contribute to our clients’ achievements. Come join us in delivering better outcomes for our clients around the world! What is the Lead Software Engineer in the FTT AI & Digital Transformation group responsible for? The Lead Software Engineer in the FTT AI & Digital Transformation group is responsible for designing, developing, and implementing cutting-edge generative AI tools-based products. These products are tailored for internal use, primarily benefiting sales and distribution teams as well as Operations teams. The engineer will collaborate closely with cross-functional teams to understand user needs, translate these requirements into technical specifications, and ensure that the solutions developed are scalable, efficient, and user-friendly. Moreover, the engineer will be at the forefront of integrating advanced AI methodologies into practical applications, driving innovation and enhancing operational workflows. What are the ongoing responsibilities of the Lead Software Engineer? Development and Implementation: Design, code, test, and deploy AI-based tools and applications using contemporary technologies. Ensure robust, scalable, and maintainable code. Collaboration with Stakeholders: Work closely with sales, distribution, and operations teams to gather requirements, understand challenges, and provide tailored solutions. Act as a bridge between technical and non-technical teams. AI Integration: Integrate generative AI algorithms and models into software products to enhance functionality and user experience. Stay updated with the latest advancements in AI and machine learning. System Maintenance: Maintain and improve existing software systems by identifying and fixing bugs, optimizing performance, and implementing new features based on user feedback. Documentation: Create and maintain comprehensive documentation for all development activities, ensuring that future modifications and maintenance can be conducted efficiently. Quality Assurance: Conduct regular testing and code reviews to ensure high standards of quality and reliability in all software products. Training and Support: Provide training and technical support to end-users, ensuring they can effectively utilize the developed tools. Gather feedback to continuously improve the products. Innovation and Research: Conduct research and feasibility analysis for new technologies and approaches to keep the company's AI tools at the cutting edge. What ideal qualifications, skills & experience would help someone to be successful? Bachelor’s degree in computer science, Software Engineering, or a related field. A master’s degree is preferred. Courses or certifications in AI, machine learning, or data science are highly desirable. Work Experience 8-10 years of experience in software development, with a focus on building AI-based applications. Proficiency in full stack development, or specialization in frontend technologies (React) or backend frameworks (Python, Django). 
Experience working with cross-functional teams and integrating AI solutions into business processes.

Job Level: Individual Contributor
Work Shift Timings: 2:00 PM - 11:00 PM IST

Experience our welcoming culture and reach your professional and personal potential! Our culture is shaped by our diverse global workforce and strongly held core values. Regardless of your interests, lifestyle, or background, there’s a place for you at Franklin Templeton. We provide employees with the tools, resources, and learning opportunities to help them excel in their career and personal life. Hear more from our employees.

By joining us, you will become part of a culture that focuses on employee well-being and provides multidimensional support for a positive and healthy lifestyle. We understand that benefits are at the core of employee well-being and may vary depending on individual needs. Whether you need support for maintaining your physical and mental health, saving for life’s adventures, taking care of your family members, or making a positive impact in your community, we aim to have them covered.

Highlights of our benefits include: Professional development growth opportunities through in-house classes and over 150 web-based training courses. An educational assistance program to financially help employees seeking continuing education. Medical, life, and personal accident insurance benefits for employees; medical insurance also covers employees’ dependents (spouses, children, and dependent parents). Life insurance for protection of employees’ families. Personal accident insurance for protection of employees and their families. Personal loan assistance. Employee Stock Investment Plan (ESIP). 12 weeks of paternity leave. Onsite fitness center, recreation center, and cafeteria. Transport facility. Child day care facility for women employees. Cricket grounds and gymnasium. Library. Health center with doctor availability. HDFC ATM on the campus. Learn more about the wide range of benefits we offer at Franklin Templeton.

Franklin Templeton is an Equal Opportunity Employer. We are committed to providing equal employment opportunities to all applicants and existing employees, and we evaluate qualified applicants without regard to ancestry, age, color, disability, genetic information, gender, gender identity, or gender expression, marital status, medical condition, military or veteran status, national origin, race, religion, sex, sexual orientation, and any other basis protected by federal, state, or local law, ordinance, or regulation. Franklin Templeton is committed to fostering a diverse and inclusive environment. If you believe that you need an accommodation or adjustment to search for or apply for one of our positions, please send an email to accommodations@franklintempleton.com. In your email, please include the accommodation or adjustment you are requesting, the job title, and the job number you are applying for. It may take up to three business days to receive a response to your request. Please note that only accommodation requests will receive a response.

Posted 20 hours ago

Apply

0.0 - 1.0 years

4 - 6 Lacs

Hinjewadi, Pune, Maharashtra

Remote

AI/ML Developer
Location: Remote / Hybrid / Onsite (Pune)
Experience: 1–3+ years (flexible for strong candidates)
Note: Only candidates with a notice period of 30 days or less will be considered. Final face-to-face interview in Pune is mandatory. Preference for candidates currently in Pune or willing to relocate. Immediate joiners preferred.

About the Role: We are looking for a talented and innovative AI/ML Developer to join our growing technology team. In this role, you will design, develop, and deploy machine learning models that solve real-world problems and create measurable business impact. Working alongside software engineers, data scientists, and product managers, you'll contribute to building intelligent systems that power the next generation of AI solutions.

Key Responsibilities: Design and implement machine learning algorithms and AI models for business applications. Handle large, complex datasets: data cleaning, preprocessing, feature engineering, and EDA. Train, validate, and optimize models using TensorFlow, PyTorch, or Scikit-learn. Deploy models using Docker, Kubernetes, MLflow, or cloud platforms (AWS, GCP, Azure). Monitor and retrain deployed models to ensure performance and reliability. Collaborate with engineering and product teams to integrate ML models into production. Stay updated with emerging trends, tools, and best practices in AI/ML.

Required Skills & Qualifications: Bachelor’s or Master’s degree in Computer Science, Engineering, Mathematics, or a related field. Strong proficiency in Python, with hands-on experience using Scikit-learn, TensorFlow, PyTorch, and Hugging Face Transformers. Experience with Natural Language Processing (NLP), including NER, text classification, QA, RAG, and chatbot development. Exposure to deep learning for image processing (e.g., YOLO, CNNs, Stable Diffusion). Solid understanding of ML fundamentals: regression, classification, clustering, unsupervised learning. Experience building APIs using FastAPI or Flask. Familiarity with Docker, Kubernetes, and modern deployment tools. Comfortable using Git and version control best practices. Experience with at least one cloud provider: AWS, GCP, or Azure.

Preferred Qualifications: Projects in Computer Vision, NLP, or Reinforcement Learning. Hands-on with cloud ML services like AWS SageMaker, Google Vertex AI, or Azure ML. Open-source contributions or published research. Understanding of CI/CD for ML, model lifecycle management, and production workflows.

Why Join Us? Work on impactful AI/ML products solving real-world problems. Collaborate with passionate engineers, researchers, and innovators. Access cutting-edge tools, open-source models, and cloud infrastructure. Grow continuously through hands-on projects, learning, and innovation.

Job Type: Full-time. Pay: ₹480,000.00 - ₹600,000.00 per year. Benefits: Health insurance, Provident Fund, Work from home. Location Type: In-person. Schedule: Day shift, Monday to Friday. Ability to commute/relocate: Hinjewadi, Pune, Maharashtra: Reliably commute or planning to relocate before starting work (Preferred).

Application Question(s): What is your current CTC? What is your expected CTC? What is your legal notice period? What is your current location? How many years of experience do you have? We are hiring candidates who must be available for an on-site interview at our Pune office.
Education: Bachelor's (Preferred) Experience: Python: 1 year (Preferred) AI: 1 year (Preferred) Machine learning: 1 year (Preferred) Deep learning: 1 year (Preferred) Location: Hinjewadi, Pune, Maharashtra (Preferred) Work Location: In person
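As a rough illustration of the baseline workflow this posting describes (train, validate, and optimize models with Scikit-learn), here is a minimal sketch; the toy dataset and model choice are assumptions made only for demonstration.

```python
# Minimal Scikit-learn sketch: train and validate a classifier on a toy dataset.
# Dataset and hyperparameters are illustrative assumptions, not from the posting.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score, f1_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42, stratify=y
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

preds = model.predict(X_test)
print("accuracy:", accuracy_score(y_test, preds))
print("f1:", f1_score(y_test, preds))
```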

Posted 20 hours ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Pune

Remote

Design, develop, and optimize data pipelines and ETL/ELT workflows using Microsoft Fabric, Azure Data Factory, and Azure Synapse Analytics. Implement Lakehouse and Warehouse architectures within Microsoft Fabric, supporting medallion (bronze-silver-gold) data layers. Collaborate with business and analytics teams to build scalable and reliable data models (star/snowflake) using Azure SQL, Power BI, and DAX. Utilize Azure Analysis Services, Power BI Semantic Models, and Microsoft Fabric Dataflows for analytics delivery. Very good hands-on experience with Python for data transformation and processing. Apply CI/CD best practices and manage code through Git version control. Ensure data security, lineage, and quality using data governance best practices and Microsoft Purview (if applicable). Troubleshoot and improve performance of existing data pipelines and models. Participate in code reviews, testing, and deployment activities. Communicate effectively with stakeholders across geographies and time zones. Required Skills: Strong knowledge of Azure Synapse Analytics, Azure Data Factory, Azure SQL, and Azure Analysis Services. Proficiency in Power BI and DAX for data visualization and analytics modeling. Strong Python skills for scripting and data manipulation. Experience in dimensional modeling, star/snowflake schemas, and Kimball methodologies. Familiarity with CI/CD pipelines, DevOps, and Git-based versioning. Understanding of data governance, data cataloging, and quality management practices. Excellent verbal and written communication skills.

Posted 20 hours ago

Apply

0 years

0 Lacs

Bhiwandi, Maharashtra, India

On-site

0-1 | VISL-Bhiwandi | Full-Time | INR 100000 - 350000 (Annual)
Job Title: Management Trainee – Business Analyst (AI & Machine Learning Focus)
Location: Mumbai [Hybrid]
Department: Strategy / Business Intelligence / Analytics
Reports To: Assistant Head – Digital Technology

Job Summary: We are seeking a dynamic and driven Management Trainee with a background in Business Analytics and foundational knowledge of Artificial Intelligence (AI) and Machine Learning (ML). This role will have exposure across business functions, with a focus on data-driven decision-making, strategy development, and digital transformation initiatives. The ideal candidate will support cross-functional teams to solve business problems, generate insights, and contribute to the design and implementation of intelligent solutions.

Key Responsibilities: Collaborate with various business units to collect and analyze data related to performance, customer behavior, and operational efficiency. Translate business requirements into analytical models and present actionable insights to stakeholders. Assist in designing, testing, and evaluating business processes, models, and systems enhanced with AI/ML applications. Work closely with data scientists and IT teams to ensure proper data governance, integrity, and usability for analytics. Develop dashboards and visualizations to communicate findings effectively using tools like Power BI, Tableau, or similar platforms. Participate in identifying and scoping AI/ML initiatives that can automate or optimize business processes. Stay up to date with trends in business analytics, AI, and ML to continuously improve internal practices. Support business case development and ROI analysis for new digital projects or enhancements.

Qualifications & Skills: Bachelor’s or Master’s degree in Business Administration, Management, Business Analytics, or related fields. Specialization or coursework in Business Analysis, Data Analytics, or AI/ML concepts. Strong analytical and problem-solving skills with a good grasp of statistical techniques. Basic understanding of machine learning models and AI concepts (e.g., supervised/unsupervised learning, predictive modeling). Familiarity with data analysis tools and languages (Excel, SQL, Python, R is a plus). Excellent verbal and written communication skills. Ability to work in a fast-paced, collaborative environment.

Preferred Skills: Internships or academic projects involving data analysis, process improvement, or AI/ML implementation. Exposure to business intelligence tools such as Power BI, Tableau, or Looker. Understanding of CRM, ERP, or other enterprise systems.

Growth Path: This role is designed as a launchpad for future leadership in analytics, product management, or strategy roles within the company. Top performers may transition into Business Analyst or AI Project Coordinator roles after completing the trainee program.

Posted 20 hours ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Who We Are Zinnia is the leading technology platform for accelerating life and annuities growth. With innovative enterprise solutions and data insights, Zinnia simplifies the experience of buying, selling, and administering insurance products. All of which enables more people to protect their financial futures. Our success is driven by a commitment to three core values: be bold, team up, deliver value – and that we do. Zinnia has over $180 billion in assets under administration, serves 100+ carrier clients, 2500 distributors and partners, and over 2 million policyholders. Who You Are As a Technical Lead specializing in job scheduling and automation, you bring extensive expertise in managing software support operations and ensuring seamless cycle management. You are adept at leveraging tools like Broadcom Automic to streamline workflows and optimize processes. With a strong background in Python, API integration, and database management, you excel in resolving complex technical challenges and driving efficiency enhancements. Your commitment to providing round-the-clock support underscores your dedication to customer satisfaction and operational excellence. What You’ll Do Lead a team of software support engineers in providing technical assistance for job scheduling tools and cycle management. Spearhead troubleshooting efforts to swiftly resolve software issues reported by customers, ensuring minimal disruption to operations. Collaborate closely with the development team to address intricate technical problems and implement robust solutions. Drive the configuration and optimization of job scheduling workflows, utilizing your expertise in Broadcom Automic scripting and automation. Champion the integration of job scheduling tools with external systems and APIs, enhancing interoperability and functionality. Conduct comprehensive system performance analyses and devise strategies for continual improvement. Document and disseminate solutions to technical and non-technical stakeholders, fostering transparency and knowledge sharing within the organization. What You’ll Need Experience with Broadcom Automic scripting and other automation and scheduling tools. Experience with ETL/ELT processes. Knowledge of Informatica or SSIS for data integration and transformation. Familiarity with data warehousing concepts and practices. Understanding of data quality and data governance principles. Experience in cloud-based environments and technologies. WHAT’S IN IT FOR YOU? We’re looking for the best and brightest innovators in the industry to join our team. At Zinnia, you collaborate with smart, creative professionals dedicated to delivering cutting-edge technologies, deeper data insights, and enhanced services to transform how insurance is done. Visit our website at www.zinnia.com for more information. Apply by completing the online application on the careers section of our website. We are an Equal Opportunity employer committed to a diverse workforce. We do not discriminate based on race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability.

Posted 20 hours ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

The D. E. Shaw group is a global investment and technology development firm with more than $65 billion in investment capital as of December 1, 2024, and offices in North America, Europe, and Asia. Since our founding in 1988, our firm has earned an international reputation for successful investing based on innovation, careful risk management, and the quality and depth of our staff. We have a significant presence in the world's capital markets, investing in a wide range of companies and financial instruments in both developed and developing economies. We are looking for an experienced engineer to join our Comply Tech team at our firm’s office in Hyderabad, Gurugram, or Bengaluru. The Comply Tech (or Compliance Tech) team creates software to meet the firm's various compliance needs. They develop critical applications that interact in real-time with the firm's trading systems and other processes to ensure compliance. Additionally, the team efficiently analyzes large volumes of trading data to generate reports. Compliance is a critical business requirement for the firm, and the diverse business structure adds complexity. The team works closely with the Compliance department to develop applications that satisfy these needs with a strong focus on accuracy and reliability.

WHAT YOU'LL DO DAY-TO-DAY: In this role, you will execute individual projects and deliverables while collaborating with business groups. You will also be responsible for projects that will include performing R&D to evaluate the appropriate technologies to use, resolving application and data issues in production within timelines, and driving complete project lifecycles, from collecting and analyzing requirements to project delivery/deployment to clients. Additionally, you will provide direction and guidance to junior members of the team and participate actively in code and design reviews.

WHO WE’RE LOOKING FOR: Basic qualifications: A bachelor’s or master’s degree in computer science or a related field with an exceptional foundation in algorithms, data structures, and object-oriented programming. At least 5 years of programming experience in Java/Python, in addition to some or all of the following: React, Redux, RESTful Web Services, Perl, messaging middleware, and databases (SQL Server). Excellent problem-solving and analytical skills and a passion for technology. Exceptional reasoning ability and good communication skills. An inclination toward or prior experience in project management. Preferred qualifications: Prior exposure to project management. Experience in Spring and deployment architecture.

Interested candidates can apply through our website: https://www.deshawindia.com/recruit/jobs/Adv/Link/LdCompTechJun25 We encourage candidates with relevant experience looking to restart their careers after a break to apply for this position. Learn about Recommence, our gender-neutral return-to-work initiative. The Firm offers excellent benefits, a casual, collegial working environment, and an attractive compensation package. For further information about our recruitment process, including how applicant data will be processed, please visit https://www.deshawindia.com/careers. Members of the D. E. Shaw group do not discriminate in employment matters on the basis of sex, race, colour, caste, creed, religion, pregnancy, national origin, age, military service eligibility, veteran status, sexual orientation, marital status, disability, or any other protected class.

Posted 20 hours ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Hiring a Python Code Reviewer for a 6-month remote contractual position. The ideal candidate should have 4-8 years of experience in Python development, QA, or code review, with a strong grasp of Python syntax, edge cases, debugging, and testing. The role involves reviewing annotator evaluations of AI-generated Python code to ensure quality, functional accuracy, and alignment with prompt instructions. Experience with Docker, code execution tools, and structured QA workflows is mandatory. Strong written communication skills and adherence to quality assurance guidelines (Project Atlas) are required. Familiarity with LLM evaluation, RLHF pipelines, or annotation platforms is a plus. Location: Mumbai, Delhi / NCR, Bengaluru, Kolkata, Chennai, Hyderabad, Ahmedabad, Pune, Remote.

Posted 20 hours ago

Apply

7.0 - 12.0 years

25 - 37 Lacs

Bengaluru

Hybrid

Job Title: Sr. DevOps SRE
Location State: Karnataka
Location City: Bangalore
Experience Required: 8+ Year(s)
Shift: IST shift with some overlap with US shift
Work Mode: Hybrid / Remote
Position Type: Contract
Company Name: VARITE INDIA PRIVATE LIMITED

About The Client: An American multinational digital communications technology conglomerate corporation headquartered in San Jose, California. The Client develops, manufactures, and sells networking hardware, software, telecommunications equipment, and other high-technology services and products. The Client specializes in specific tech markets, such as the Internet of Things (IoT), domain security, videoconferencing, and energy management. It is one of the largest technology companies in the world, ranking 82nd on the Fortune 100 with over $51 billion in revenue and nearly 83,300 employees.

Essential Job Functions: Develop Ansible playbooks for configuring the Client's devices. Design, configure, and maintain Grafana dashboards for real-time monitoring and visualization of infrastructure, application, and business metrics. Develop and optimize alerting rules to proactively detect and resolve issues. Create custom Splunk queries, dashboards, and reports for incident detection and troubleshooting. Build, deploy, and manage containers using Docker. Create, manage, and troubleshoot Kubernetes manifests (Deployments, Services, ConfigMaps, etc.). Develop, maintain, and optimize CI/CD pipelines for automated build, test, and deployment processes (using tools like Jenkins, GitLab CI, GitHub Actions, etc.). Implement best practices for infrastructure as code, automated testing, and continuous integration/delivery.

Qualifications: Experience Required: 8+ years. Relevant Experience: Strong DevOps/SRE with an automation focus.

How to Apply: Interested candidates are invited to submit their resume using the apply online button on this job post.

Equal Opportunity Employer: VARITE is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees. We do not discriminate on the basis of race, color, religion, sex, sexual orientation, gender identity or expression, national origin, age, marital status, veteran status, or disability status.

Unlock Rewards: Refer Candidates and Earn. If you're not available or interested in this opportunity, please pass this along to anyone in your network who might be a good fit and interested in our open positions. VARITE offers a Candidate Referral program, where you'll receive a one-time referral bonus based on the following scale if the referred candidate completes a three-month assignment with VARITE: 0-2 years of experience - INR 5,000; 2-6 years - INR 7,500; 6+ years - INR 10,000.

About VARITE: VARITE is a global staffing and IT consulting company providing technical consulting and team augmentation services to Fortune 500 companies in the USA, UK, Canada, and India. VARITE is currently a primary and direct vendor to the leading corporations in the verticals of Networking, Cloud Infrastructure, Hardware and Software, Digital Marketing and Media Solutions, Clinical Diagnostics, Utilities, Gaming and Entertainment, and Financial Services.

Posted 20 hours ago

Apply

5.0 - 9.0 years

8 - 12 Lacs

Hyderabad

Work from Office

Responsibilities: * Develop ML models using Python, TensorFlow & PyTorch. * Optimize model performance on the AWS cloud platform. * Implement NLP techniques with BERT & Scikit-Learn.
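The posting names BERT and Scikit-Learn for NLP work; as one hedged illustration, the sketch below runs a pretrained BERT-family classifier through the Hugging Face Transformers pipeline (the library choice and model name are assumptions, not something the listing specifies).

```python
# Minimal sketch: text classification with a pretrained BERT-family model.
# Using the Hugging Face Transformers pipeline is an illustrative assumption.
from transformers import pipeline

# Downloads a small pretrained sentiment model on first run.
classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

reviews = [
    "The onboarding flow was smooth and fast.",
    "The app keeps crashing when I upload documents.",
]
for review, result in zip(reviews, classifier(reviews)):
    print(f"{result['label']} ({result['score']:.2f}): {review}")
```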

Posted 20 hours ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Description

Job Objective: To contribute with strong problem-solving skills and process orientation on projects independently, to help the team with required trainings, and to mentor new joiners.

Designation: Senior Business Analyst
Job Location: Bangalore
Type of employment: Permanent

Roles & Responsibilities: Provide insights through data analysis and visualization for small to large datasets. Ability to translate a business question into an analytical problem and develop business rules, process flow, and methodology for analysis. Ability to summarize the analysis using basic statistical methods. Work with the team to execute ad-hoc/regular reporting projects.

Requirements: 2+ years of professional experience is required. Must be well versed with MS Excel, Word, and PowerPoint. Technically: Must have experience working with a database query language (SQL). Good to have: Python, VBA, any visualization tool (Tableau, Qlik, Power BI), etc. Hands-on experience in data analytics and data wrangling. Good to have: Patient-level data analytics (RWD Data Lake, Optum/DRG Claims, IQVIA APLD, IQVIA LAAD). Experience in analyzing IQVIA/IMS data (patient insights, MIDAS). Marketing analytics (market assessment, forecasting, competitive intelligence), sales analytics (sizing, structuring, segmentation), etc.

Additional Skills: Ability to work independently across teams. Passion for solving challenging analytical problems. Ability to assess a problem quickly, qualitatively, and quantitatively. Ability to work productively with team members and identify and resolve tough issues in a collaborative manner. Should have good communication skills.

Qualifications: B.E. Graduates

Posted 20 hours ago

Apply

4.0 - 6.0 years

18 - 25 Lacs

Hyderabad

Work from Office

Job Summary: We are looking for a highly skilled and experienced AI/ML Developer-Lead with 4-5 years of hands-on relevant experience to join our technology team. You will be responsible for designing, developing, and optimizing machine learning models that drive intelligent business solutions. The role involves close collaboration with cross-functional teams to deploy scalable AI systems and stay abreast of evolving trends in artificial intelligence and machine learning.

Key Responsibilities:
1. Develop and Implement AI/ML Models: Design, build, and implement AI/ML models tailored to solve specific business challenges, including but not limited to natural language processing (NLP), image recognition, recommendation systems, and predictive analytics.
2. Model Optimisation and Evaluation: Continuously improve existing models for performance, accuracy, and scalability.
3. Data Preprocessing and Feature Engineering: Collect, clean, and preprocess structured and unstructured data from various sources. Engineer relevant features to improve model performance and interpretability.
4. Collaboration and Communication: Collaborate closely with data scientists, backend engineers, product managers, and stakeholders to align model development with business goals. Communicate technical insights clearly to both technical and non-technical stakeholders.
5. Model Deployment and Monitoring: Deploy models to production using MLOps practices and tools (e.g., MLflow, Docker, Kubernetes). Monitor live model performance, diagnose issues, and implement improvements as needed.
6. Staying Current with AI/ML Advancements: Stay informed of current research, tools, and trends in AI and machine learning. Evaluate and recommend emerging technologies to maintain innovation within the team.
7. Code Reviews and Best Practices: Participate in code reviews to ensure code quality, scalability, and adherence to best practices. Promote knowledge sharing and mentoring within the development team.

Required Skills and Qualifications: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field. 4-5 years of experience in machine learning, artificial intelligence, or applied data science roles. Strong programming skills in Python (preferred) and/or R. Proficiency in ML libraries and frameworks, including scikit-learn, XGBoost, LightGBM, TensorFlow or Keras, and PyTorch. Skilled in data preprocessing and feature engineering using pandas, numpy, and sklearn.preprocessing. Practical experience in deploying ML models into production environments using REST APIs and containers. Familiarity with version control systems (e.g., Git) and containerization tools (e.g., Docker). Experience working with cloud platforms such as AWS, Google Cloud Platform (GCP), or Azure. Understanding of software development methodologies, especially Agile/Scrum. Strong analytical thinking, debugging, and problem-solving skills in real-world AI/ML applications.
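To make the MLOps point above concrete, here is a minimal, hypothetical sketch of experiment tracking with MLflow, one of the tools the posting names; the experiment name, parameters, and model are illustrative assumptions.

```python
# Minimal sketch: tracking a training run with MLflow so it can later be
# compared, registered, and deployed. All names here are illustrative assumptions.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

mlflow.set_experiment("demo-classifier")
with mlflow.start_run():
    model = LogisticRegression(max_iter=500)
    model.fit(X_train, y_train)

    mlflow.log_param("max_iter", 500)
    mlflow.log_metric("test_accuracy", model.score(X_test, y_test))
    # Persist the fitted model so it can later be served or registered.
    mlflow.sklearn.log_model(model, artifact_path="model")
```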

Posted 20 hours ago

Apply

6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Position Summary: The AI&E portfolio is an integrated set of offerings that addresses our clients’ heart-of-the-business issues. This portfolio combines our functional and technical capabilities to help clients transform, modernize, and run their existing technology platforms across industries. As our clients navigate dynamic and disruptive markets, these solutions are designed to help them drive product and service innovation, improve financial performance, accelerate speed to market, and operate their platforms to innovate continuously.

Role: Machine Vision Developer
Level: Specialist Senior

As a Specialist Senior at Deloitte Consulting, you will be responsible for individually delivering high quality work products within due timelines in an agile framework. On a need basis, consultants will mentor and/or direct junior team members and liaise with onsite/offshore teams to understand the functional requirements.

Responsibilities: The work you will do includes: Develop, test, and deploy advanced Computer Vision algorithms for industrial apps, ensuring real-time processing and high accuracy. Work with data scientists to preprocess and annotate datasets, and with software engineers to integrate vision solutions into OT systems. Continuously monitor, troubleshoot, and optimize vision systems for performance and efficiency. Update and retrain models to adapt to new data and changing conditions. Good interpersonal and communication skills.

Qualifications

Skills / Project Experience: Hands-on experience in programming languages such as Python and C++, with GPU programming and parallel processing using CUDA or OpenCL.

Must Have: Good interpersonal and communication skills. Flexibility to adapt and apply innovation to varied business domains and apply technical solutioning and learnings to use cases across business domains and industries. Knowledge and experience working with Microsoft Office tools.

Good to Have: Problem-Solving: Strong analytical and troubleshooting skills to address client-specific challenges. Adaptability: Ability to quickly adapt to changing client requirements and emerging technologies. Project Leadership: Demonstrated leadership in managing client projects, ensuring timely delivery and client satisfaction. Business Acumen: Understanding of business processes and the ability to align technical solutions with client business goals.

Education: B.E./B.Tech/M.C.A./M.Sc (CS) degree or equivalent from an accredited university.

Prior Experience: 6-10 years of experience, including: Proven experience in developing and deploying computer vision solutions in industrial or manufacturing settings. Hands-on leadership or significant contributions in end-to-end project execution, from data acquisition and preprocessing to model deployment and integration with OT systems. Track record of working with cross-functional teams including data scientists, control engineers, and software developers. Experience fine-tuning state-of-the-art Vision Transformer models and demonstrating measurable impact over traditional CNN-based methods. Exposure to real-time systems, model optimization techniques, and deploying vision solutions on edge devices or embedded systems.

Location: Bengaluru/Hyderabad/Gurugram

The team: Deloitte Consulting LLP’s Technology Consulting practice is dedicated to helping our clients build tomorrow by solving today’s complex business problems involving strategy, procurement, design, delivery, and assurance of technology solutions.
Our service areas include analytics and information management, delivery, cyber risk services, and technical strategy and architecture, as well as the spectrum of digital strategy, design, and development services. The Core Business Operations practice optimizes clients’ business operations and helps them take advantage of new technologies. It drives product and service innovation, improves financial performance, accelerates speed to market, and operates client platforms to innovate continuously. Learn more about our Technology Consulting practice on www.deloitte.com. #HC&IE

Our purpose: Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture: Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development: At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits to help you thrive: At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips: From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 302207

Posted 20 hours ago

Apply

5.0 - 10.0 years

15 - 27 Lacs

Chennai

Work from Office

Job Summary: We are seeking a highly skilled and experienced Machine Learning Engineer to join our data science team. You will be responsible for designing, developing, and deploying machine learning models and algorithms that power intelligent systems. This role requires a deep understanding of statistical modelling, data engineering, and software development best practices. Key Responsibilities: Design and implement scalable machine learning models and pipelines for real-world applications. Collaborate with cross-functional teams including data engineers, product managers, and software developers. Conduct exploratory data analysis, feature engineering, and model selection. Evaluate models using appropriate metrics and validate performance with A/B testing. Deploy ML models into production using cloud platforms such as AWS, GCP, or Azure. Continuously monitor model performance and retrain as needed. Document methodologies, experiments, and project outcomes. Mentor junior ML engineers and contribute to team development and knowledge sharing. Required Skills & Qualifications: Strong programming skills in Python (preferred), R, or Java. Solid understanding of machine learning algorithms (supervised, unsupervised, reinforcement learning). Experience with ML frameworks like Scikit-learn, TensorFlow, PyTorch, XGBoost, or LightGBM. Proficiency in data processing using Pandas, NumPy, and SQL. Familiarity with cloud services (AWS Sagemaker, GCP Vertex AI, Azure ML). Experience in deploying models using REST APIs, Docker, or CI/CD pipelines. Strong problem-solving, critical thinking, and analytical skills. Excellent communication and team collaboration abilities. Preferred Qualifications: Experience with deep learning for computer vision or NLP. Familiarity with big data technologies (e.g., Spark, Hadoop). Background in statistics, optimization, or operations research. Contribution to open-source ML projects or published research papers.

Posted 20 hours ago

Apply

4.0 - 9.0 years

10 - 18 Lacs

Chennai

Hybrid

Role & responsibilities: Bachelor's degree. 2+ years in GCP services - BigQuery, Dataflow, Dataproc, Dataplex, Data Fusion, Terraform, Tekton, Cloud SQL, Redis Memorystore, Airflow, Cloud Storage. 2+ years in data transfer utilities. 2+ years in Git or any other version control tool. 2+ years in Confluent Kafka. 1+ years of experience in API development. 2+ years in an Agile framework. 4+ years of strong experience in Python and PySpark development. 4+ years of shell scripting to develop ad-hoc jobs for data importing/exporting.

Key skills: Google Cloud Platform (BigQuery, Dataflow, Dataproc, Data Fusion, Terraform, Tekton, Cloud SQL, Airflow, Postgres), PySpark, Python, API development.
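As a loose illustration of the Python/PySpark work this role lists, here is a minimal sketch of a batch transformation; the file paths, column names, and aggregation are assumptions for demonstration only.

```python
# Minimal PySpark sketch: read raw data, apply a transformation, write it back out.
# File paths, column names, and the aggregation are illustrative assumptions.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-rollup").getOrCreate()

orders = spark.read.csv("/data/raw/orders.csv", header=True, inferSchema=True)

daily_totals = (
    orders
    .withColumn("order_date", F.to_date("order_ts"))
    .groupBy("order_date", "region")
    .agg(F.sum("amount").alias("total_amount"), F.count("*").alias("order_count"))
)

# Partitioned Parquet output, ready for a downstream warehouse load (e.g., BigQuery).
daily_totals.write.mode("overwrite").partitionBy("order_date").parquet("/data/curated/daily_totals")

spark.stop()
```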

Posted 20 hours ago

Apply

1.0 - 5.0 years

2 - 4 Lacs

Bengaluru

Work from Office

Roles & Responsibilities: - Manage day-to-day operational tasks for the BBD campaign. - Coordinate with internal teams to ensure smooth process execution. - Track and update order status in the system. - Handle operational escalations and support resolution. - Stakeholder management. - Vendor management.

Skills Required: - Basic knowledge of MS Excel (VLOOKUP, Pivot Table). - Good communication skills. - Time management and multitasking ability. - Problem-solving mindset.

NOTE: This role is for a 3-month contract. Only interested candidates should walk in on 15th & 16th July 2025 at the address given below. Interested candidates can also call or WhatsApp an updated CV to 9148651089.

Posted 20 hours ago

Apply

5.0 years

0 Lacs

Trivandrum, Kerala, India

On-site

Budget: 15-20 LPA
Notice Period: Immediate
Location: Technopark, Trivandrum
Experience: 5+ years

We are seeking a highly skilled and experienced Senior AI Engineer with a minimum of 5 years of hands-on experience in designing, developing, and deploying robust Artificial Intelligence and Machine Learning solutions. The ideal candidate will be a strong problem-solver, adept at translating complex business challenges into scalable AI models, and capable of working across the entire AI/ML lifecycle, from data acquisition to model deployment and monitoring. This role requires a deep understanding of various AI techniques, strong programming skills, and a passion for staying updated with the latest advancements in the field.

Key Responsibilities

AI/ML Solution Design & Development: ○ Lead the design and development of innovative and scalable AI/ML models and algorithms to address specific business needs and optimize processes. ○ Apply various machine learning techniques including supervised, unsupervised, and reinforcement learning, deep learning, natural language processing (NLP), and computer vision. ○ Collaborate with data scientists, product managers, and other stakeholders to understand business requirements and translate them into technical specifications.

Data Management & Preprocessing: ○ Oversee the collection, preprocessing, cleaning, and transformation of large and complex datasets to prepare them for AI model training. ○ Implement efficient data pipelines and ensure data quality and integrity. ○ Perform exploratory data analysis to uncover insights and inform model development.

Model Training, Evaluation & Optimization: ○ Train, fine-tune, and evaluate AI/ML models for optimal accuracy, performance, and generalization. ○ Select the most suitable models and algorithms for specific tasks and optimize hyperparameters. ○ Conduct rigorous testing and debugging of AI systems to ensure reliability and desired outcomes.

Deployment & MLOps: ○ Lead the productionization of AI/ML models, ensuring seamless integration with existing systems and applications. ○ Implement MLOps best practices for model versioning, deployment, monitoring, and retraining. ○ Develop and maintain APIs for AI model integration.

Research & Innovation: ○ Continuously research and evaluate the latest advancements in AI/ML research, tools, and technologies. ○ Propose and implement innovative solutions to complex problems. ○ Contribute to the strategic direction of AI initiatives within the company.

Collaboration & Mentorship: ○ Work collaboratively with cross-functional teams (e.g., software development, data science, product teams). ○ Clearly articulate complex AI concepts to both technical and non-technical audiences. ○ Mentor junior AI engineers and contribute to a culture of continuous learning.

Required Skills And Qualifications: Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Machine Learning, Data Science, or a related quantitative field. Minimum of 5 years of professional experience as an AI Engineer, Machine Learning Engineer, or a similar role. Expert-level proficiency in at least one major programming language used for AI development, such as Python (preferred), Java, or C++. Extensive experience with popular AI/ML frameworks and libraries, such as TensorFlow, PyTorch, Keras, Scikit-learn, Hugging Face Transformers. Strong understanding of core machine learning concepts, algorithms, and statistical modeling (e.g., regression, classification, clustering, dimensionality reduction).
Solid knowledge of deep learning architectures (e.g., CNNs, RNNs, Transformers) and their applications. Experience with data manipulation and analysis libraries (e.g., Pandas, NumPy). Familiarity with database systems (SQL, NoSQL) and big data technologies (e.g., Apache Spark, Hadoop) for managing large datasets. Experience with cloud platforms (e.g., AWS, Azure, Google Cloud) and their AI/ML services for scalable deployment. Understanding of software development best practices, including version control (Git), testing, and code review. Excellent problem-solving skills, analytical thinking, and a data-driven approach. Strong communication and interpersonal skills, with the ability to explain technical concepts clearly. Ability to work independently and as part of a collaborative team in a fast-paced environment. Immediate Joinee preferred

Posted 20 hours ago

Apply

5.0 years

15 - 20 Lacs

Thiruvananthapuram Taluk, India

Remote

Are you passionate about building AI systems that create real-world impact? We are hiring a Senior AI Engineer with 5+ years of experience to design, develop, and deploy cutting-edge AI/ML solutions. 📍 Location: [Trivandrum / Kochi / Remote – customize based on your need] 💼 Experience: 5+ years 💰 Salary: ₹15–20 LPA 🚀 Immediate Joiners Preferred 🔧 What You’ll Do Design and implement ML/DL models for real business problems Build data pipelines and perform preprocessing for large datasets Use advanced techniques like NLP, computer vision, reinforcement learning Deploy AI models using MLOps best practices Collaborate with data scientists, developers & product teams Stay ahead of the curve with the latest research and tools ✅ What We’re Looking For 5+ years of hands-on AI/ML development experience Strong in Python, with experience in TensorFlow, PyTorch, Scikit-learn, Hugging Face Knowledge of NLP, CV, DL architectures (CNNs, RNNs, Transformers) Experience with cloud platforms (AWS/GCP/Azure) and AI services Solid grasp of MLOps, model versioning, deployment, monitoring Strong problem-solving, communication, and mentoring skills 💻 Tech Stack You’ll Work With Languages: Python, SQL Libraries: TensorFlow, PyTorch, Keras, Transformers, Scikit-learn Tools: Git, Docker, Kubernetes, MLflow, Airflow Platforms: AWS, GCP, Azure, Vertex AI, SageMaker Skills: cloud platforms (aws, gcp, azure),docker,computer vision,git,pytorch,airflow,hugging face,nlp,ml,ai,deep learning,kubernetes,mlflow,mlops,tensorflow,scikit-learn,python,machine learning

Posted 20 hours ago

Apply

3.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

We are looking for a skilled and passionate Python Developer with experience in Django, FastAPI, and hands-on work with LangChain and Large Language Models (LLMs). This role is ideal for someone who enjoys building intelligent systems, integrating with modern AI tools, and delivering scalable backend solutions. Key Responsibilities: Design, develop, and maintain backend applications using Python, Django, and FastAPI Integrate and build applications using LangChain, LLMs, or similar generative AI frameworks Collaborate with AI/ML and product teams to implement intelligent features Optimize APIs for performance, scalability, and security Work closely with frontend developers, DevOps, and QA to ensure seamless product delivery Stay updated with the latest advancements in generative AI and LLM tooling Required Skills: Minimum 3+ years of hands-on experience in Python development Strong expertise in Django and FastAPI frameworks Practical experience working with LangChain, LLMs, or other large language model APIs (e.g., OpenAI, Cohere, Anthropic) Understanding of RESTful API design and microservices architecture Experience with version control (Git) and CI/CD tools Good problem-solving skills and ability to work in an agile environment Preferred Skills (Good to Have): Knowledge of vector databases (e.g., FAISS, Pinecone, Weaviate) Exposure to prompt engineering and NLP techniques Familiarity with containerization tools like Docker Experience with cloud platforms like AWS, GCP, or Azure 

Posted 20 hours ago

Apply

0.0 years

0 - 0 Lacs

Indore, Madhya Pradesh

On-site

Role Summary : We are looking for a dynamic and well-connected professional who can proactively coordinate with companies, HRs, and hiring managers to generate placement opportunities for our students in the Data Analytics, Data Science, Business Analytics, Web Development and related fields . The candidate will be responsible for identifying job openings, building strong corporate relations, scheduling interviews, and ensuring the best possible outcomes for student placements. Key Responsibilities : Corporate Networking & Outreach Research and identify potential employers and job openings in the Data Analytics industry (startups, MNCs, analytics consultancies, etc.) Establish and maintain relationships with HRs, hiring managers, and decision-makers in relevant companies Represent IOTA Academy and promote the capabilities of trained students Job Sourcing & Opportunity Generation Search for and source suitable job opportunities through platforms like LinkedIn, Naukri, Indeed, and company websites Collaborate with companies to understand job roles, skill requirements, and expectations Pitch our trained candidates for job roles and internships Interview & Selection Coordination Coordinate between students and recruiters for scheduling interviews, tests, and assessments Provide necessary student profiles, resumes, and training records to recruiters Follow up with HRs to get interview feedback and results Negotiation & Offer Management Negotiate for better compensation, roles, and growth opportunities on behalf of students Ensure that offer letters and joining dates are properly communicated and documented Database & Reporting Maintain a database of hiring partners, recruiters, and alumni placed in companies Track placement performance, outreach efforts, and student outcomes Submit weekly and monthly reports to management Requirements : Proven experience in placements , corporate relations , business development , or HR/recruitment Excellent communication and interpersonal skills Strong LinkedIn and corporate network (especially in analytics domain is a plus) Good understanding of the Data Analytics industry , job roles, and hiring trends Self-motivated and target-driven Strong negotiation skills and confidence in client-facing communication Proficiency in Excel, Google Sheets, and CRM tools Bachelor's or Master’s degree in Business, HR, IT, or a related field preferred Desirable Qualities : Prior experience working with an educational institute or training academy Familiarity with analytics tools like Excel, Power BI, SQL, Python, etc. (not mandatory, but a plus) Ability to empathize with students and guide them toward relevant roles A persuasive personality and an entrepreneurial mindset Job Type: Full-time Pay: ₹20,000.00 - ₹30,000.00 per month Benefits: Cell phone reimbursement Health insurance Leave encashment Paid sick time Paid time off Ability to commute/relocate: Indore, Madhya Pradesh: Reliably commute or planning to relocate before starting work (Required) Education: Bachelor's (Preferred) Willingness to travel: 25% (Preferred) Work Location: In person

Posted 21 hours ago

Apply

3.0 years

0 Lacs

Indore, Madhya Pradesh, India

On-site

We at Techdome are hiring a sharp, analytical QA Engineer who thrives in a fast-paced environment and can take ownership of load and API testing with confidence. At Techdome, we work as Tech Doctors — solving digital problems with precision, care, and innovation across diverse industries like fintech, aviation, health, and enterprise solutions. Role Overview: You’ll be responsible for conducting performance testing using modern load-testing tools and ensuring robust API testing using Postman, with scripting support in Python and JavaScript. Responsibilities: Design and execute load tests using Locust or K6 Conduct deep-dive API testing using Postman (including mock servers) Build and maintain automated test scripts Collaborate with development and DevOps teams Log, track, and report test results and performance metrics Requirements: 2–3 years of QA experience with API & load testing Experience in Postman (advanced usage) Hands-on with Locust or K6 Scripting knowledge in Python and JavaScript (Java is a plus) Familiarity with basic automation testing workflows Brownie Points: Experience in performance benchmarking and analysis Exposure to CI/CD pipelines Familiarity with JMeter, Gatling, or other performance tools Comfort with Docker or cloud-based test environments ISTQB or relevant certifications
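Since the role centers on load testing with Locust or K6, here is a minimal hedged Locust sketch; the host, endpoints, and payload are hypothetical and not taken from the posting.

```python
# Minimal Locust sketch: simulated users hitting two API endpoints.
# The host, endpoints, and payload are illustrative assumptions.
from locust import HttpUser, task, between

class ApiUser(HttpUser):
    host = "https://api.example.com"
    wait_time = between(1, 3)  # seconds between tasks per simulated user

    @task(3)
    def list_items(self):
        self.client.get("/items")

    @task(1)
    def create_item(self):
        self.client.post("/items", json={"name": "load-test-item"})

# Run with:  locust -f locustfile.py --users 100 --spawn-rate 10
```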

Posted 21 hours ago

Apply

5.0 - 10.0 years

14 - 22 Lacs

Chennai

Work from Office

Roles and Responsibilities 1. MLOps Strategy & Implementation Design and implement scalable MLOps pipelines for the end-to-end lifecycle of machine learning models (from data ingestion to model deployment and monitoring). Automate model training, testing, validation, and deployment using CI/CD practices. Collaborate with data scientists to productize ML models. 2. Infrastructure Management Build and maintain cloud-native infrastructure (e.g., AWS/GCP/Azure) for training, deploying, and monitoring ML models. Optimize compute and storage resources for ML workloads. Containerize ML applications using Docker and orchestrate them with Kubernetes. 3. Model Monitoring & Governance Set up monitoring for ML model performance (drift detection, accuracy drop, latency). Ensure compliance with ML governance policies, versioning, and auditing. 4. Collaboration & Communication Work with cross-functional teams (Data Engineering, DevOps, and Product) to ensure smooth ML model deployment and maintenance. Provide mentorship and technical guidance to junior engineers. 5. Automation & Optimization Automate feature extraction, model retraining, and deployment processes. Improve latency, throughput, and efficiency of deployed models in production. Technical Skills / Tech Stack 1. Programming Languages Python (primary for ML/AI and scripting) Bash/Shell Go or Java (optional but valuable for performance-critical components) 2. ML Frameworks & Libraries TensorFlow , PyTorch , Scikit-learn MLflow , Kubeflow , or SageMaker ONNX (for model conversion) 3. Data & Pipeline Tools Apache Airflow , Luigi Kafka , Apache Beam , Spark (for streaming/batch data) Pandas , Dask , NumPy 4. DevOps & MLOps Tools Docker , Kubernetes , Helm Terraform , Pulumi (for infrastructure as code) Jenkins , GitHub Actions , Argo Workflows MLflow , DVC , Tecton , Feast 5. Cloud Platforms AWS : S3, EKS, SageMaker, Lambda, CloudWatch GCP : GKE, AI Platform, BigQuery, Dataflow Azure : Azure ML, AKS, Blob Storage 6. Monitoring & Logging Prometheus , Grafana ELK Stack , Datadog , Cloud-native monitoring tools 7. CI/CD & Versioning Git , GitOps , CI/CD pipelines for model and data versioning Preferred Experience 5+ years in AI/ML engineering roles. Experience building MLOps pipelines in production. Familiarity with regulatory and ethical considerations in ML (e.g., fairness, bias detection, explainability). Strong debugging and performance tuning skills in distributed environments.
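As one small illustration of the pipeline automation described above, here is a hedged Airflow sketch of a daily retrain-then-evaluate DAG; the DAG id, schedule, and task bodies are assumptions, not the company's actual pipeline.

```python
# Minimal Airflow sketch: a daily DAG that retrains and then evaluates a model.
# DAG id, schedule, and task functions are illustrative assumptions only.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def retrain_model():
    print("pulling fresh features and retraining the model...")

def evaluate_model():
    print("scoring the candidate model against the holdout set...")

with DAG(
    dag_id="model_retraining",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    retrain = PythonOperator(task_id="retrain", python_callable=retrain_model)
    evaluate = PythonOperator(task_id="evaluate", python_callable=evaluate_model)

    retrain >> evaluate  # evaluate runs only after retraining succeeds
```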

Posted 21 hours ago

Apply

3.0 - 8.0 years

6 - 16 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Work you'll do
Deloitte has institutionalized an AI, Gen AI and Analytics group within its Tax & Legal business. This group is part of the Deloitte South Asia cluster and focuses on embedding AI in everything we do for our Tax & Legal business. You will be engaged in internal and external client projects to disrupt the way we operate, and focus on building assets and solutions for our clients using the latest technologies and methods around predictive models, prescriptive analytics, generative AI, etc.

We are looking for a highly skilled data scientist to join our dynamic team. The ideal candidate will have a solid background in artificial intelligence and machine learning, with hands-on experience in frameworks such as TensorFlow, PyTorch, scikit-learn, etc. The candidate should possess a deep understanding of data structures, algorithms, and distributed computing. Additionally, experience in deploying machine learning models in production environments, working with various database systems, and familiarity with version control, containerization, and cloud platforms are essential for success in this role. Candidates with strong storyboarding skills and a penchant for converting AI-driven mathematical insights into stories will be given preference.

Responsibilities:
Collaborate with cross-functional teams to translate business requirements into actual implementation of models, algorithms, and technologies.
Execute the product road map and planning of the programs and initiatives as defined by the product owners.
Independently solve complex business problems with minimal supervision, while escalating more complex issues to the appropriate next level.
Develop and maintain software programs, algorithms, dashboards, information tools, and queries to clean, model, integrate, and evaluate data sets.
Build and optimize pipelines for data intake, validation, and mining as well as modelling and visualization by applying best practices to the engineering of large data sets.
Develop and implement advanced machine learning algorithms and models for various applications.
Apply the latest advances in deep learning, machine learning, and natural language processing to improve the performance of legacy models.
Customize the latest available large language models to develop generative AI solutions for multiple business problems across multiple functional areas.
Apply an A/B testing framework and test model quality.
Take models to production using cloud technologies.
Provide findings and analysis to support informed business decisions.
Stay updated with the latest developments in AI/ML technologies and contribute to the continuous improvement of our systems.

Requirements:
Minimum of 2-8 years of relevant work experience.
Master's degree in a related field (Statistics, Mathematics, or Computer Science) or MBA in Data Science/AI/Analytics.
Experience with database systems such as MySQL, PostgreSQL, or MongoDB.
Experience in collecting and manipulating structured and unstructured data from multiple data systems (on-premises, cloud-based data sources, APIs, etc.).
Familiarity with version control systems, preferably Git.
Familiarity with cloud platforms such as AWS, Azure, or Google Cloud.
Solid understanding of data structures, algorithms, and distributed computing.
Excellent knowledge of Jupyter Notebooks for experimentation and prototyping.
Strong programming skills in Python.
In-depth understanding of machine learning, deep learning, and natural language processing (NLP) algorithms.
Experience with popular machine learning frameworks such as TensorFlow, PyTorch, or scikit-learn.
Knowledge of containerization tools such as Docker.
Experience in deploying machine learning models in production environments.
Excellent problem-solving and communication skills.
Proficient in using data visualization tools such as Tableau or Matplotlib, or dashboarding packages like Flask or Streamlit.
Good working knowledge of MS PowerPoint and storyboarding skills to translate mathematical results into business insights.
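For a sense of the "test model quality" expectation in this role, here is a minimal, generic scikit-learn evaluation sketch. The synthetic data and the choice of classifier are arbitrary illustrations, not Deloitte's actual workflow.

```python
# Minimal model-quality check: cross-validated ROC-AUC plus a held-out classification report.
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import classification_report
from sklearn.model_selection import cross_val_score, train_test_split

X, y = make_classification(n_samples=2_000, n_features=30, weights=[0.8, 0.2], random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

model = GradientBoostingClassifier(random_state=0)

# Cross-validation gives a variance estimate of model quality before the final hold-out check.
cv_scores = cross_val_score(model, X_train, y_train, cv=5, scoring="roc_auc")
print(f"CV ROC-AUC: {cv_scores.mean():.3f} +/- {cv_scores.std():.3f}")

model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```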

Posted 21 hours ago

Apply

155.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

India Foods Business is a full-service manufacturing and marketing unit comprising over 500 employees spread across multiple locations in India. Our strong suite of products includes the Pillsbury, Betty Crocker, Haagen-Dazs and Nature Valley brands in both direct-to-consumer and B2B channels across Retail, Food Service and Bakeries & Exports. We combine the capabilities of a global enterprise with the entrepreneurial spirit and cultural awareness you would expect of a smaller local company.

Position Title: Sr. Analyst – SC Adv Analytics, OU
Function/Group: Logistics
Location: Mumbai
Shift Timing: 1:30 PM to 10:30 PM
Role Reports to: Assistant Manager
Remote/Hybrid/in-Office: Hybrid

About General Mills
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Häagen-Dazs, we've been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team who upholds a vision of relentless innovation while being a force for good. For more details check out http://www.generalmills.com

General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization, delivering business value, service excellence and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS) and Human Resources Shared Services (HRSS). For more details check out https://www.generalmills.co.in

We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.

Job Overview

Function Overview
The GIC Supply Chain team manages end-to-end operations, encompassing planning, sourcing, manufacturing, logistics, and analytics. They strategically plan to meet market demands, optimize sourcing, ensure efficient production, and oversee the seamless movement of goods from production to delivery. The team employs advanced analytics throughout these processes, fostering adaptability and operational excellence. This collaborative approach ensures a well-coordinated supply chain that aligns with both organizational goals and dynamic market conditions.

Purpose of the role
The Supply Chain Analytics group is responsible for delivering the solutions and developing the capabilities that enable better decision-making in the supply chain. This team works closely with all supply chain functions (i.e., plan, source, make, deliver and customer service teams), business units and other cross-functional partners to develop solutions that drive business value. This team also drives step-change innovation and improvements in business practices by delivering actionable insights through advanced analytics and supply chain expertise.
The role involves building, maintaining, and executing optimization and simulation models to help identify, analyze, and implement opportunities in the areas of manufacturing network design and optimization and supply network planning and realignment, incorporating supply chain costs, manufacturing complexity, inventory optimization and capacity. The Sr Analyst supports the Supply Planning teams in identifying opportunities for the manufacturing & distribution network and provides least-cost sourcing and distribution options, and works on strategic projects and new capability development and enhancement initiatives.

Key Accountabilities

15% - Requirement Gathering & Data Collection
Understand the project charter: business context, scope, outcome & success criteria.
Collaborate with the Global & India Supply Chain partners to gather the required information and conduct data validations & analysis.
Proactively analyze the current set of data based on facts and root cause analysis.
Perform data massaging (outlier detection, handling missing data, etc.) and share key findings with the team lead.

40% - Model Building & Scenario Evaluation
Solve business problems in the areas of supply chain by developing different modeling approaches/techniques using descriptive/prescriptive/predictive analytics and recommend the best approach.
Develop a detailed solution design/architecture or re-engineer an existing solution design.
Build/refresh the models periodically and provide recommendations to business teams.
Run & analyze what-if scenarios by leveraging advanced tools and provide insights to the business team.

20% - Model Result Analysis & Presentation
Perform detailed and quantitative analysis of the model output and clearly articulate findings and recommendations.
Synthesize large data sets/model results into usable insights and business recommendations.
Build presentations and communicate the results effectively with a broad audience of clients/stakeholders/cross-functional teams.
Provide analytics support during the implementation and transition phase.
15% - Other Responsibilities
Improving:
Participate in brainstorming sessions on new capabilities development/ideation sessions.
Work on enhancements in current capabilities/process improvements.
Support new technologies by participating in testing sessions & training sessions.
Sustaining:
Develop and design a governance framework.
Ensure strong and clear process/training/project documentation and controls are in force.
Create assumption documents for client meetings/methodology documents.
Participate in weekly directional meetings, team meetings & townhalls.

10% - Self-Development
Upskill through internal & external training/courses in behavioral, functional, analytical & technology areas.
Participate in innovation challenges, Hackathon competitions, and knowledge-sharing sessions.
Participate in training sessions at the team/departmental level.
Participate in internal case studies to acquire business acumen.

Specific Job Experience Or Skills Needed
End-to-end supply chain knowledge.
Critical thinking with strong analytical and problem-solving skills.
Ability to formulate mathematical models & techniques, with technical know-how in one or more of the following:
Optimization – Linear/Integer/MILP programming
Statistics – distributions, hypothesis testing, measures of central tendency, regression, risk analysis
Prediction – machine learning, various types of regression, classification and clustering techniques, time series, tree-based prediction models
Simulation – discrete event simulation/Monte Carlo
Understand business problems and convert them into analytical problem statements.
Experience with model building in any of these tools: Coupa Supply Chain Guru, R/Python, VBA/Macros, SQL, solvers like CPLEX/Gurobi, simulation tools like @Risk, MATLAB, machine learning techniques/models.
Effective & strong communication skills.
Storytelling with effective presentation.
Stakeholder management.
Innovative mindset & learning mindset.
Nice to have: experience in front-end application development – REST APIs, Dash, R Shiny.

Competencies/Behaviors Required For Job
Deliver outstanding results – Agile and self-driven individual who learns/adapts to the analytics ways of working. Highly accountable for completing deliverables in a timely and effective manner. Proactively communicates about any roadblock and recommends ideas and input to help the team achieve greater results.
Interpersonal Effectiveness – Relates well with stakeholders, colleagues & team members. Maintains a positive, supportive & appreciative attitude. Actively listens to others & demonstrates an understanding of their point of view. Clearly articulates views in written & verbal discussions.
Problem Solving / Analytical Skills – Ability to collect and analyze data quickly and efficiently. Can identify issues and provide ideas/solutions for resolution.
Leads Innovation – Ability and confidence to identify and recommend creative solutions. Identifies both opportunities and needs for change. Adapts quickly and responds effectively to change. Identifies issues or problems and provides a respective solution. Explores and shares innovative best practices with others.

Minimum Qualifications
Minimum Degree Requirements: Bachelors
Preferred Degree Requirements: Masters (M.Tech/MBA/MSc)
Preferred Major Area of Study: Supply Chain / Operations Research / Industrial Engineering / Statistics / Mathematics / Mechanical Engineering / Computer Engineering / Electronics Engineering / Instrumentation Engineering / Production Engineering
Required Professional Certifications:
Preferred Professional Certifications: APICS, CSCP, CPIM Analytics, Six Sigma, SC Macro Masters or similar accreditation
Preferred Institutes: IITs/NITs/Tier 1 or 2 MBA colleges/reputed universities

Preferred Qualifications
Master's degree
3 years of related experience
Major Area of Study in Industrial Engineering or Supply Chain
Professional Certifications: CSCMP, APICS

Company Overview
We exist to make food the world loves. But we do more than that. Our company is a place that prioritizes being a force for good, a place to expand learning, explore new perspectives and reimagine new possibilities, every day. We look for people who want to bring their best – bold thinkers with big hearts who challenge one another and grow together. Because becoming the undisputed leader in food means surrounding ourselves with people who are hungry for what's next.
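To give a flavor of the least-cost sourcing models this role builds, here is a minimal linear-programming sketch in Python using the open-source PuLP library (PuLP is not named in the posting; it stands in here for the commercial solvers like CPLEX or Gurobi it mentions). The plants, markets, costs, and capacities are entirely made-up illustration data.

```python
# Minimal least-cost sourcing sketch: ship from plants to markets at minimum freight cost (illustrative data).
from pulp import LpMinimize, LpProblem, LpVariable, lpSum, value

plants = {"PlantA": 60, "PlantB": 80}          # supply capacity (units)
markets = {"North": 50, "South": 70}           # demand (units)
cost = {                                       # cost per unit shipped
    ("PlantA", "North"): 4, ("PlantA", "South"): 6,
    ("PlantB", "North"): 5, ("PlantB", "South"): 3,
}

model = LpProblem("least_cost_sourcing", LpMinimize)
ship = {(p, m): LpVariable(f"ship_{p}_{m}", lowBound=0) for p in plants for m in markets}

model += lpSum(cost[p, m] * ship[p, m] for p in plants for m in markets)   # objective: total cost
for p, cap in plants.items():
    model += lpSum(ship[p, m] for m in markets) <= cap                     # plant capacity
for m, dem in markets.items():
    model += lpSum(ship[p, m] for p in plants) >= dem                      # market demand

model.solve()
print("Total cost:", value(model.objective))
for (p, m), var in ship.items():
    print(p, "->", m, var.varValue)
```

A real network-design model would add many more nodes, fixed costs, and capacity or complexity constraints, but the structure (decision variables, cost objective, supply and demand constraints) is the same.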

Posted 21 hours ago

Apply

155.0 years

0 Lacs

Mumbai, Maharashtra, India

Remote

Position Title: Data Scientist II
Function/Group: R&D/Packaging
Location: Mumbai
Shift Timing: Regular
Role Reports to: Sr. Manager, Global Knowledge Solutions
Remote/Hybrid/in-Office: Hybrid

About General Mills
We make food the world loves: 100 brands. In 100 countries. Across six continents. With iconic brands like Cheerios, Pillsbury, Betty Crocker, Nature Valley, and Haagen-Dazs, we have been serving up food the world loves for 155 years (and counting). Each of our brands has a unique story to tell. How we make our food is as important as the food we make. Our values are baked into our legacy and continue to accelerate us into the future as an innovative force for good. General Mills was founded in 1866 when Cadwallader Washburn boldly bought the largest flour mill west of the Mississippi. That pioneering spirit lives on today through our leadership team who upholds a vision of relentless innovation while being a force for good. For more details check out http://www.generalmills.com

General Mills India Center (GIC) is our global capability center in Mumbai that works as an extension of our global organization, delivering business value, service excellence and growth, while standing for good for our planet and people. With our team of 1800+ professionals, we deliver superior value across the areas of Supply Chain (SC), Digital & Technology (D&T), Innovation, Technology & Quality (ITQ), Consumer and Market Intelligence (CMI), Sales Strategy & Intelligence (SSI), Global Shared Services (GSS), Finance Shared Services (FSS) and Human Resources Shared Services (HRSS). For more details check out https://www.generalmills.co.in

We advocate for advancing equity and inclusion to create more equitable workplaces and a better tomorrow.

Job Overview

Function Overview
In partnership with our cross-functional partners, ITQ innovates and develops products that meet the ever-changing needs of our consumers and enables long-term business growth. We identify and develop technologies that shape and protect our businesses today and into the future. ITQ operates across three organizations: Global Applications, Capabilities COEs, and Shared Services & Operations. For more details about General Mills please visit this Link.

Purpose of the role
The Global Knowledge Services (GKS) organization catalyzes the creation, transfer, and application of knowledge to ensure ITQ succeeds at its mission of driving internal and external innovation, developing differentiated technology, and engendering trust through food safety and quality. The scientists in the Statistics and Analytics Program Area will collaborate with US and India GKS team members to deliver high-value statistical work that advances ITQ initiatives in consumer product research, health and nutrition science, research and development, and quality improvement. The Data Scientist II in this program area will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure to support advanced analytics, data science, and business intelligence across our organization, leveraging GCP services. This role requires close collaboration with statisticians, data scientists, and BI developers to ensure timely, reliable, and quality data delivery that drives insights and decision-making.

Key Accountabilities

70% of Time – Excellent Technical Work
Design, develop, and optimize data pipelines and ETL/ELT workflows using GCP services (BigQuery, Dataflow, Pub/Sub, Cloud Functions, etc.).
Build and maintain data architecture that supports structured and unstructured data from multiple sources.
Work closely with statisticians and data scientists to provision clean, transformed datasets for advanced modeling and analytics.
Enable self-service BI through efficient data modeling and provisioning in tools like Looker, Power BI, or Tableau.
Implement data quality checks, monitoring, and documentation to ensure high data reliability and accuracy.
Collaborate with DevOps/Cloud teams to ensure data infrastructure is secure, scalable, and cost-effective.
Support and optimize workflows for data exploration, experimentation, and productization of models.
Participate in data governance efforts, including metadata management, data cataloging, and access controls.

15% of Time – Client Consultation and Business Partnering
Work effectively with clients to identify client needs and success criteria, and translate them into clear project objectives, timelines, and plans.
Be responsive and timely in sharing project updates, responding to client queries, and delivering on project commitments.
Clearly communicate analysis, insights, and conclusions to clients using written reports and real-time meetings.

10% of Time – Innovation, Continuous Improvement (CI), and Personal Development
Learn and apply a CI mindset to work, seeking opportunities for improvements in efficiency and client value.
Identify new resources, develop new methods, and seek external inspiration to drive innovations in our work processes.
Continually build skills and knowledge in the fields of statistics and the relevant sciences.

5% of Time – Administration
Participate in all required training (Safety, HR, Finance, CI, other) and actively take part in GKS and ITQ meetings, events, and activities.
Complete other administrative tasks as required.

Minimum Qualifications
Minimum Degree Requirements: Masters from an accredited university
Minimum 6 years of related experience required

Specific Job Experience Or Skills Needed
6+ years of experience in data engineering roles, including strong hands-on GCP experience.
Proficiency in GCP services like BigQuery, Cloud Storage, Cloud Composer (Airflow), Dataflow, Pub/Sub.
Strong SQL skills and experience working with large-scale data warehouses.
Solid programming skills in Python and/or Java/Scala.
Experience with data modeling, schema design, and performance tuning.
Familiarity with CI/CD, Git, and infrastructure-as-code principles (Terraform preferred).
Strong communication and collaboration skills across cross-functional teams.

For Global Knowledge Services
Ability to effectively work cross-functionally with internal/global team members.
High self-motivation, with the ability to work both independently and in teams.
Excels at driving projects to completion, with attention to detail.
Ability to exercise judgment in handling confidential and proprietary information.
Ability to effectively prioritize, multi-task, and execute tasks according to a plan.
Able to work on multiple priorities and projects simultaneously.
Demonstrated creative problem-solving abilities, attention to detail, and ability to "think outside the box."

Preferred Qualifications
Preferred Major Area of Study: Master's degree in Computer Science, Engineering, Data Science, or a related field
Preferred Professional Certifications: GCP
Preferred 6 years of related experience

Company Overview
We exist to make food the world loves. But we do more than that. Our company is a place that prioritizes being a force for good, a place to expand learning, explore new perspectives and reimagine new possibilities, every day. We look for people who want to bring their best – bold thinkers with big hearts who challenge one another and grow together. Because becoming the undisputed leader in food means surrounding ourselves with people who are hungry for what's next.
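As a rough sketch of the "data quality checks" accountability above, here is a minimal Python example using the google-cloud-bigquery client. The project, table, and column names are placeholders and the checks are arbitrary illustrations, not General Mills' actual pipeline.

```python
# Minimal BigQuery data-quality check: fail fast if today's load is empty or has null keys (placeholder names).
from google.cloud import bigquery

PROJECT = "my-gcp-project"          # placeholder project id
TABLE = "analytics.sales_daily"     # placeholder dataset.table

client = bigquery.Client(project=PROJECT)

query = f"""
SELECT
  COUNT(*)                  AS row_count,
  COUNTIF(order_id IS NULL) AS null_keys
FROM `{PROJECT}.{TABLE}`
WHERE load_date = CURRENT_DATE()
"""

row = list(client.query(query).result())[0]

# Simple assertions a pipeline task (e.g., in Cloud Composer) could run before downstream models consume the data.
if row.row_count == 0:
    raise ValueError("Data quality check failed: no rows loaded for today")
if row.null_keys > 0:
    raise ValueError(f"Data quality check failed: {row.null_keys} rows with null order_id")
print(f"Data quality OK: {row.row_count} rows, 0 null keys")
```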

Posted 21 hours ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
