
20419 ML Jobs - Page 49

JobPe aggregates listings so they are easy to find, but applications are submitted directly on the original job portal.

4.0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Job Role: Data Scientist Senior / Lead
Relevant Experience: 4 years
Location: Gurgaon
Notice Period: Immediate or serving notice period
Primary Skills: Data Science, ML/DL, NLP/Computer Vision, GenAI, LLM; Azure Databricks (must)

Job Brief
We are looking for a Senior/Lead Data Scientist to plan projects and build analytics models. You should have strong problem-solving ability and a knack for statistical analysis. If you're also able to align our data products with our business goals, we'd like to meet you. Your ultimate goal will be to help improve our products and business decisions by making the most of our data.

Responsibilities
Data mining and collection procedures
Ensure data quality and integrity
Interpret and analyze data problems
Conceive, plan and prioritize data projects
Build analytic systems and predictive models
Test performance of data-driven products
Visualize data and create reports
Experiment with new models and techniques
Align data projects with organizational goals
Team management

Qualifications: Requirements and Skills
Proven experience as a Data Scientist or similar role
Solid understanding of machine learning
Experience in developing and testing Chatbots/Bots
Hands-on experience in NLP
Hands-on experience in Gen AI, LLM
End-to-end project deliveries
Knowledge of data management and visualization techniques
A knack for statistical analysis and predictive modeling
Good knowledge of R, Python, and MATLAB
Experience with SQL and NoSQL databases
Strong team management, organizational, and leadership skills
Excellent communication skills
Experience in Azure Databricks (must)
A business mindset
Degree in Computer Science, Data Science, Mathematics or similar field

Benefits You'll Get
Unlimited opportunities to learn on our multiple training platforms
Certification reimbursement
Flexibility
Opportunity to work on multiple technologies
Medical coverage & life insurance
Company events and outings
Tech Thursdays and Fun Fridays
5-day work week
Work-fun environment
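The posting above centers on predictive modeling and NLP in Python. As a minimal illustrative sketch only (not part of the posting), the snippet below shows a text-classification baseline of the kind such a role might prototype with scikit-learn; the sample texts and labels are hypothetical placeholders.

```python
# Toy sentiment-style classifier: TF-IDF features feeding logistic regression.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

texts = [
    "great product, works as expected",
    "excellent support and fast delivery",
    "terrible experience, very slow response",
    "the app keeps crashing, not happy",
    "love the new dashboard features",
    "refund took weeks, poor service",
]
labels = [1, 1, 0, 0, 1, 0]  # 1 = positive, 0 = negative (hypothetical)

clf = Pipeline([
    ("tfidf", TfidfVectorizer()),        # turn raw text into sparse TF-IDF vectors
    ("model", LogisticRegression()),     # simple, interpretable baseline classifier
])
clf.fit(texts, labels)

print(clf.predict(["the service was excellent"]))  # expected: [1]
```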

Posted 4 days ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

The Applications Development Intermediate Programmer Analyst is an intermediate level position responsible for participation in the establishment and implementation of new or revised application systems and programs in coordination with the Technology team. The overall objective of this role is to contribute to applications systems analysis and programming activities.

Responsibilities:
Utilize knowledge of applications development procedures and concepts, and basic knowledge of other technical areas, to identify and define necessary system enhancements, including using script tools and analyzing/interpreting code
Consult with users, clients, and other technology groups on issues; recommend programming solutions; install and support customer exposure systems
Apply fundamental knowledge of programming languages for design specifications
Analyze applications to identify vulnerabilities and security issues, as well as conduct testing and debugging
Serve as advisor or coach to new or lower-level analysts
Identify problems, analyze information, and make evaluative judgements to recommend and implement solutions
Resolve issues by identifying and selecting solutions through the application of acquired technical experience, guided by precedents
Has the ability to operate with a limited level of direct supervision; can exercise independence of judgement and autonomy
Acts as SME to senior stakeholders and/or other team members
Appropriately assess risk when business decisions are made, demonstrating particular consideration for the firm's reputation and safeguarding Citigroup, its clients and assets, by driving compliance with applicable laws, rules and regulations, adhering to Policy, applying sound ethical judgment regarding personal behavior, conduct and business practices, and escalating, managing and reporting control issues with transparency

Key Responsibilities:
Design and implement ETL pipelines using PySpark and Big Data tools on platforms like Hadoop, Hive, HDFS etc.
Write scalable Python code for machine learning preprocessing tasks and work with libraries such as pandas, scikit-learn etc.
Develop data pipelines to support model training, evaluation and inference.

Skills:
Proficiency in Python programming with experience in PySpark for large-scale data processing
Hands-on experience with Big Data technologies: Hadoop, Hive, HDFS etc.
Exposure to machine learning workflows, model lifecycle and data preparation
Experience with ML libraries: scikit-learn, XGBoost, TensorFlow, PyTorch etc.
Exposure to cloud platforms (AWS/GCP) for data and AI workloads

Qualifications:
4-8 years of relevant experience in the Financial Service industry
Intermediate level experience in an Applications Development role
Consistently demonstrates clear and concise written and verbal communication
Demonstrated problem-solving and decision-making skills
Ability to work under pressure and manage deadlines or unexpected changes in expectations or requirements

Education:
Bachelor's degree/University degree or equivalent experience

This job description provides a high-level review of the types of work performed. Other job-related duties may be assigned as required.
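The key responsibilities above describe PySpark ETL pipelines feeding model training. The following is an illustrative sketch only (not from the posting) of one such step: reading raw records, cleaning them, and writing a curated table. File paths and column names are hypothetical placeholders.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Hypothetical raw landing file with header row.
raw = spark.read.option("header", True).csv("/data/raw/transactions.csv")

curated = (
    raw.dropna(subset=["account_id", "amount"])                   # drop incomplete records
       .withColumn("amount", F.col("amount").cast("double"))      # enforce numeric type
       .withColumn("txn_date", F.to_date("txn_date", "yyyy-MM-dd"))
       .filter(F.col("amount") > 0)                               # keep valid transactions
)

# Write the curated data partitioned by date for downstream model training.
curated.write.mode("overwrite").partitionBy("txn_date").parquet("/data/curated/transactions")
```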
Job Family Group: Technology
Job Family: Applications Development
Time Type: Full time
Most Relevant Skills: Please see the requirements listed above.
Other Relevant Skills: For complementary skills, please see above and/or contact the recruiter.

Citi is an equal opportunity employer, and qualified candidates will receive consideration without regard to their race, color, religion, sex, sexual orientation, gender identity, national origin, disability, status as a protected veteran, or any other characteristic protected by law. If you are a person with a disability and need a reasonable accommodation to use our search tools and/or apply for a career opportunity, review Accessibility at Citi. View Citi's EEO Policy Statement and the Know Your Rights poster.

Posted 4 days ago

Apply

8.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

At ZoomInfo, we encourage creativity, value innovation, demand teamwork, expect accountability and cherish results. We value your take-charge, take-initiative, get-stuff-done attitude and will help you unlock your growth potential. One great choice can change everything. Thrive with us at ZoomInfo. If you are collaborative, take initiative, and get stuff done, we want to talk to you! We have high aspirations for the company and are looking for the right people to help fulfill the dream. We strive to continually improve every aspect of the company and use cutting-edge technologies and processes to delight our customers and rapidly increase revenue.

About The Role:
The Enterprise Apps AI Center of Excellence (CoE) is a newly established team that operates with the agility of a startup to generate substantial value for Finance through Artificial Intelligence. Going beyond simple efficiency improvements, we focus on building innovative AI solutions that unlock new capabilities and insights for the Enterprise Applications organization. Our mission is to transform business processes, enhance strategic decision-making, and drive significant value using advanced AI applications. The team will be responsible for identifying, analyzing, developing, and deploying high-impact use cases. The ultimate goal is to empower enterprise business applications users with cutting-edge AI, enabling more strategic and effective operations.

As the Senior Engineer, you will lead our team in conceptualizing, developing, and deploying impactful AI/ML solutions that revolutionize finance processes. You will be accountable for the entire AI solution lifecycle, from ideation and requirements gathering to development, implementation, and ongoing support. This role involves leading and technically supporting a geographically diverse team located in the US and India. You will identify and POC efficiency opportunities for software development, software optimization / tech debt reduction, QA, UAT, unit testing, and regression testing.

What You'll do:
Hands-on experience with developing, debugging, training, evaluating, deploying at scale, optimizing and fine-tuning state-of-the-art AI models, especially in NLP and Generative AI, with Python, PyTorch, TensorFlow, Keras, ML algorithms (supervised, unsupervised, reinforcement learning), and Gen AI frameworks (GPT-4, Llama, Claude, Gemini, Hugging Face, LangChain)
Deep expertise in Neural Networks, Deep Learning, Conversational AI such as ChatBots, IVA, AgentAssist, and NLP (LLMs, Transformers, RNNs, Semantic Search), Prompt Engineering
Hands-on experience with AIOps, MLOps, and DataOps in building production-grade AI/ML pipelines with Kubeflow, MLflow, and AutoML on cloud platforms (AWS, Azure, GCP)
Hands-on experience with AI-powered search (vector DBs, semantic search) and microservices development - Java, Python, Spring Boot and NoSQL
Comprehensive experience with AI/ML advancements such as building Multimodal AI, advanced development with AutoML, Agentic AI, AI integration, AI cybersecurity and AI-powered automation
Hands-on experience with data modelling and data processing with NLP techniques in Data Science to extract insights
Hands-on experience with data pipelines, data warehousing, and data governance for building and deploying ML models
Responsible for proactive research on advancements and new industry trends in the AI/ML space; lead AI/ML and Data initiatives to identify business opportunities and critical AI use cases and deliver end-to-end AI/ML solutions in finance business applications. Should be able to articulate the potential ROI and benefits of proposed AI solutions.
Collaborate with technology, product, and data teams to seamlessly integrate validated AI solutions into existing finance workflows, clearly communicating technical details.
Provide technical leadership and mentorship to the engineering team, fostering a culture of innovation and continuous learning.
Utilize advanced statistical analysis and deep learning / machine learning algorithms to address finance-related challenges and improve internal processes and strategies.

What you bring:
Minimum of 8 years of hands-on experience in efficiently driving data science and/or AI/ML use cases, with a primary focus in the domain of finance and accounting.
Strong understanding of the software development lifecycle (SDLC) and standard methodologies for coding, design, testing, and deployment.
Bachelor's or Master's degree in Computer Science, Software Engineering, Statistics, Applied Mathematics, or an equivalent quantitative field.
Excellent technical, problem-solving, and communication skills.
Ability to analyze and resolve difficult technical issues quickly in a high-pressure environment.
Experience working with and presenting proposals to executives in a clear and compelling way.
Strong proficiency in modern programming languages (e.g., Java, Python) and frameworks (e.g., React, Node.js) to build AI/ML solutions in finance applications.
Exposure to integration platforms such as Boomi.
Strong Excel skills (required). Experience with SQL, Tableau, or other BI tools for extracting and visualizing data is a plus.

About us:
ZoomInfo (NASDAQ: GTM) is the Go-To-Market Intelligence Platform that empowers businesses to grow faster with AI-ready insights, trusted data, and advanced automation. Its solutions provide more than 35,000 companies worldwide with a complete view of their customers, making every seller their best seller.

ZoomInfo may use a software-based assessment as part of the recruitment process. More information about this tool, including the results of the most recent bias audit, is available here. ZoomInfo is proud to be an equal opportunity employer, hiring based on qualifications, merit, and business needs, and does not discriminate based on protected status. We welcome all applicants and are committed to providing equal employment opportunities regardless of sex, race, age, color, national origin, sexual orientation, gender identity, marital status, disability status, religion, protected military or veteran status, medical condition, or any other characteristic protected by applicable law. We also consider qualified candidates with criminal histories in accordance with legal requirements.

For Massachusetts Applicants: It is unlawful in Massachusetts to require or administer a lie detector test as a condition of employment or continued employment. An employer who violates this law shall be subject to criminal penalties and civil liability. ZoomInfo does not administer lie detector tests to applicants in any location.
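The role above calls out semantic search with LLM-style embeddings. As an illustrative sketch only (not from the posting), the snippet below ranks a few documents against a query using sentence embeddings and cosine similarity; the model name and documents are hypothetical placeholders, and a production system would typically sit behind a vector database.

```python
import numpy as np
from sentence_transformers import SentenceTransformer

model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed publicly available embedding model

docs = [
    "Invoice INV-1042 was approved by the finance team.",
    "Quarterly revenue forecast updated for the APAC region.",
    "Employee expense policy revised in March.",
]
doc_vecs = model.encode(docs, normalize_embeddings=True)

query = "Which invoice did finance approve?"
query_vec = model.encode([query], normalize_embeddings=True)[0]

# With normalized embeddings, cosine similarity reduces to a dot product.
scores = doc_vecs @ query_vec
best = int(np.argmax(scores))
print(f"Best match ({scores[best]:.2f}): {docs[best]}")
```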

Posted 4 days ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Description
At Boeing, we innovate and collaborate to make the world a better place. We're committed to fostering an environment for every teammate that's welcoming, respectful and inclusive, with great opportunity for professional growth. Find your future with us.

Overview
As a leading global aerospace company, Boeing develops, manufactures and services commercial airplanes, defense products and space systems for customers in more than 150 countries. As a top U.S. exporter, the company leverages the talents of a global supplier base to advance economic opportunity, sustainability and community impact. Boeing's team is committed to innovating for the future, leading with sustainability, and cultivating a culture based on the company's core values of safety, quality and integrity.

Technology for today and tomorrow
The Boeing India Engineering & Technology Center (BIETC) is a 5500+ engineering workforce that contributes to global aerospace growth. Our engineers deliver cutting-edge R&D, innovation, and high-quality engineering work in global markets, and leverage new-age technologies such as AI/ML, IIoT, Cloud, Model-Based Engineering, and Additive Manufacturing, shaping the future of aerospace.

People-driven culture
At Boeing, we believe creativity and innovation thrive when every employee is trusted, empowered, and has the flexibility to choose, grow, learn, and explore. We offer variable arrangements depending upon business and customer needs, and professional pursuits that offer greater flexibility in the way our people work. We also believe that collaboration, frequent team engagements, and face-to-face meetings bring together different perspectives and thoughts, enabling every voice to be heard and every perspective to be respected. No matter where or how our teammates work, we are committed to positively shaping people's careers and being thoughtful about employee wellbeing. With us, you can create and contribute to what matters most in your career, community, country, and world. Join us in powering the progress of global aerospace.

Position Overview
The Boeing Company is seeking an Associate Illustrated Parts Catalog Author to join the IPC/IPD teams within the support data engineering division, based in Chennai, India. Your aspirations extend beyond our planet. You possess innovation and creativity, constantly pushing boundaries. You excel in collaborative environments while also demonstrating the capability to handle tasks independently. The position offers an opportunity to analyze complex engineering documents and revise part applicability modifications in the Illustrated Parts Catalog.

Position Responsibilities:
Conduct regular data analysis tasks and prepare IPC/IPD data.
Conduct in-depth analysis of engineering drawings, service bulletins, and modifications to interpret data for inclusion in the IPC.
Analyze BOM (Bill of Materials) and engineering drawings to create installation and assembly breakdowns of parts using authoring tools.
Update the Illustrated Parts Catalog to incorporate parts interchangeability and changes from pre/post configuration service bulletins.
Follow IPC procedures, publication standards and government/customer specifications for authoring.
Conduct analysis of Next Higher Assemblies and build indenture relationships to support impact analysis.
Communicate ISO processes to company, customer, ISO auditors and representatives.
Create mark-ups to serve as inputs for illustrations.
Perform quality assurance checks across multiple areas.
Recognize non-conformities in product and make recommendations for corrections and preventive actions.
Compare product to incoming source data for the purpose of verifying technical accuracy.
Support customer inquiries by researching issues and drafting responses.
Carry out basic formatting and cataloging of sub-assemblies and components in accordance with established procedures and specifications.
Communicate effectively with global partners and coordinate with team members within the group.
Recognize opportunities for process improvement.
Provide mentorship and guidance to other team members.

Basic Qualifications (Required Skills/Experience):
Bachelor's degree in Engineering, a Diploma, or Aircraft Maintenance Engineering (AME) is required as a basic qualification.
5+ years of experience in aerospace technical publications or related work experience in other similar domains.
Over 3 years of experience in developing, updating, and evaluating IPC/IPD in accordance with ATA100/iSpec2200/S1000D standards.
Experience in utilizing engineering drawings, service bulletins, specifications, and other engineering resources to research, analyze, and interpret information for inclusion in publications.

Preferred Qualifications (Desired Skills/Experience):
Bachelor's degree in Engineering/Diploma/Aircraft Maintenance Engineering (AME) or equivalent will be considered along with 4-8 years of experience.

Typical Education & Experience:
Education and experience generally obtained through advanced studies (such as a bachelor's degree in engineering or its equivalent) combined with over 5 years of relevant work experience.

Applications for this position will be accepted until Jul. 12, 2025.
Export Control Requirements: This is not an Export Control position.
Education: Bachelor's Degree or Equivalent Required
Relocation: This position offers relocation based on candidate eligibility.
Visa Sponsorship: Employer will not sponsor applicants for employment visa status.
Shift: Not a Shift Worker (India)
Contingent Upon Program Award: The position is contingent upon program award.

Equal Opportunity Employer:
We are an equal opportunity employer. We do not accept unlawful discrimination in our recruitment or employment practices on any grounds including but not limited to: race, color, ethnicity, religion, national origin, gender, sexual orientation, gender identity, age, physical or mental disability, genetic factors, military and veteran status, or other characteristics covered by applicable law. We have teams in more than 65 countries, and each person plays a role in helping us become one of the world's most innovative, diverse and inclusive companies. We are proud members of the Valuable 500 and welcome applications from candidates with disabilities. Applicants are encouraged to share with our recruitment team any accommodations required during the recruitment process. Accommodations may include but are not limited to: conducting interviews in accessible locations that accommodate mobility needs, encouraging candidates to bring and use any existing assistive technology such as screen readers, and offering flexible interview formats such as virtual or phone interviews.

Posted 4 days ago

Apply

5.0 years

0 Lacs

Greater Kolkata Area

Remote

About CommentSold
CommentSold is the North American leader in live selling technology (ranked by G2), having enabled over 7,000 small to mid-sized retailers with live-selling tools, generating over 166 million items sold with $4.5B+ in lifetime GMV. CommentSold's technology continues to provide businesses and creators of all sizes with best-in-class solutions for delivering effective live video commerce experiences across all of their sales channels simultaneously. With the acquisition of Popshoplive, a community-driven livestream shopping marketplace app at the intersection of social, e-commerce and entertainment, CommentSold also entered direct-to-consumer commerce. In 2022, CommentSold debuted Videeo, its lightweight video commerce plugin, giving any retailer or brand the ability to embed live and video commerce experiences into their existing e-commerce stack within a few days. Since Q4 2023, a new line of AI products (AI Clip Hero, Model:Me, AI Sub Hero, … ) has been developed at CommentSold to multiply the effects of the live selling shops and to speed up their time-to-market with new products.

About The Role
The ML & Data Engineer in our AI & Data Team will serve as a senior data engineering expert and cross-departmental liaison in our company, responsible for further developing the existing platform of both structured data in the Data Warehouse and vast unstructured data in the Data Lake. Tasks include setting up new data pipelines, data crawlers and transformations, monitoring the performance and cost-effectiveness of existing data jobs, as well as data integration and docking of the machine learning processes into the existing data landscape of the company. The ML & Data Engineer will represent the Data team in multi-disciplinary projects and is the go-to person for business teams to consult on onboarding new data sources as well as data availability for products and customer enablement. This person is a member of the Data team and reports to our EVP of AI & Data. This is a fully remote role (based in India), with the need to work (at least partially) in EST or CST time zones in the US.

Main Responsibilities
As part of the AI & Data team, build a company-wide data platform. Drive data democracy and literacy within the company.
Develop and maintain the central Data Warehouse and its staging layers, oversee and adapt the ingestion and ETL jobs, and enable seamless flow of structured data.
Scan the landscape of both internal and external data sources, and propose extensions and updates of the data platform. Document the data dictionary and ETL processes.
Own and upgrade the company's Data Lake in the cloud. Integrate event tracking and data off-loading into the Data Lake, including text, images and video files. Ensure integrations with API gateways and downstream consuming services.
Design and manage the API integrations and automated data robots for external data ingestion. Design internal API microservices to support and enable data exchange among products, systems and external third-party applications.
Dock the machine learning and computer vision models into data pipelines, and design the data flow for those AI services.
Work closely with Engineering teams and Data team members to steer or support projects aimed at data tools and data product creation.
Interact with many stakeholders, incl. department leads and senior executives, to translate their business needs into extensions or adaptations of our internal data troves.
Skills, Qualifications & Education
Bachelor's degree in Computer Science, Machine Learning or Artificial Intelligence
At least 5 years of work experience in Business Intelligence, Data Analytics, Controlling, or similar analytical roles
Demonstrated aptitude for working with data on both a structured and unstructured basis; does not shy away from troubleshooting failed ETL processes or API integrations
Robust Python literacy (esp. for data handling) is a must; skills with Spark or TypeScript are a plus
Strong expertise in SQL (will be tested during the recruitment process)
Well-versed in AWS cloud services, especially the different data handling services from the AWS suite; understands their mutual interdependencies and can navigate linking and tracking individual data handling components
Hands-on experience with implementing and gearing APIs for data transfers
Working knowledge of machine learning, NLP and computer vision algorithms and solutions; experience with deep learning, generative AI and data crawling automation is a plus
Outstanding data structure blueprinting skills
Strong ability to translate ideas between technical and non-technical audiences
Solid business acumen; experience working with non-technical stakeholders or e-commerce experience (from past roles) is a strong plus
Keen to work in a collaborative team environment, eager for give-and-take exchange with other team members and regular knowledge sharing
Curious and hungry for insights and root causes; strives for visible business impact of own tasks and processes; eager to test and learn new approaches
Flexible, self-motivated, organized and structured; able to work on multiple projects simultaneously, sometimes against tight deadlines
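The responsibilities above include API-based ingestion into a cloud Data Lake on AWS. The snippet below is an illustrative sketch only (not from the posting): pulling records from an external API, flattening them with pandas, and landing them in S3 as Parquet. The endpoint, bucket, and field names are hypothetical placeholders; credentials and a Parquet engine (pyarrow) are assumed to be available.

```python
import boto3
import pandas as pd
import requests

# Hypothetical source endpoint returning a JSON list of records.
resp = requests.get("https://api.example.com/v1/orders", timeout=30)
resp.raise_for_status()
records = resp.json()

# Flatten nested JSON into a tabular frame and stamp the ingestion time.
df = pd.json_normalize(records)
df["ingested_at"] = pd.Timestamp.now(tz="UTC")

local_path = "/tmp/orders.parquet"
df.to_parquet(local_path, index=False)

# Land the file in the raw zone of the data lake (hypothetical bucket/key).
s3 = boto3.client("s3")
s3.upload_file(local_path, "example-data-lake", "raw/orders/orders.parquet")
```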

Posted 4 days ago

Apply

8.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Position Overview:
ShyftLabs is seeking an experienced Databricks Architect to lead the design, development, and optimization of big data solutions using the Databricks Unified Analytics Platform. This role requires deep expertise in Apache Spark, SQL, Python, and cloud platforms (AWS/Azure/GCP). The ideal candidate will collaborate with cross-functional teams to architect scalable, high-performance data platforms and drive data-driven innovation.

ShyftLabs is a growing data product company that was founded in early 2020 and works primarily with Fortune 500 companies. We deliver digital solutions built to accelerate business growth across various industries by focusing on creating value through innovation.

Job Responsibilities
Architect, design, and optimize big data and AI/ML solutions on the Databricks platform
Develop and implement highly scalable ETL pipelines for processing large datasets
Lead the adoption of Apache Spark for distributed data processing and real-time analytics
Define and enforce data governance, security policies, and compliance standards
Optimize data lakehouse architectures for performance, scalability, and cost-efficiency
Collaborate with data scientists, analysts, and engineers to enable AI/ML-driven insights
Oversee and troubleshoot Databricks clusters, jobs, and performance bottlenecks
Automate data workflows using CI/CD pipelines and infrastructure-as-code practices
Ensure data integrity, quality, and reliability across all data processes

Basic Qualifications:
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
8+ years of hands-on experience in data engineering, with 5+ years in Databricks architecture and Apache Spark
Proficiency in SQL, Python, or Scala for data processing and analytics
Extensive experience with cloud platforms (AWS, Azure, or GCP) for data engineering
Strong knowledge of ETL frameworks, data lakes, and Delta Lake architecture
Hands-on experience with CI/CD tools and DevOps best practices
Familiarity with data security, compliance, and governance best practices
Strong problem-solving and analytical skills in a fast-paced environment

Preferred Qualifications:
Databricks certifications (e.g., Databricks Certified Data Engineer, Spark Developer)
Hands-on experience with MLflow, Feature Store, or Databricks SQL
Exposure to Kubernetes, Docker, and Terraform
Experience with streaming data architectures (Kafka, Kinesis, etc.)
Strong understanding of business intelligence and reporting tools (Power BI, Tableau, Looker)
Prior experience working with retail, e-commerce, or ad-tech data platforms

We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
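The posting above emphasizes Delta Lake and lakehouse ETL on Databricks. As an illustrative sketch only (not from the posting), the snippet below writes a small aggregated Delta table and reads it back; it assumes a Databricks or Delta-enabled Spark environment, and the paths and columns are hypothetical placeholders.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("delta-example").getOrCreate()

# Hypothetical raw landing zone of JSON event files.
events = spark.read.json("/mnt/raw/events/")

daily = (
    events.withColumn("event_date", F.to_date("event_ts"))
          .groupBy("event_date", "event_type")
          .count()
)

# Delta adds ACID writes and time travel on top of Parquet files.
daily.write.format("delta").mode("overwrite").save("/mnt/curated/daily_event_counts")

curated = spark.read.format("delta").load("/mnt/curated/daily_event_counts")
curated.show(5)
```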

Posted 4 days ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job title: Data Engineer
Location: Hyderabad

About Sanofi
We are an innovative global healthcare company, driven by one purpose: we chase the miracles of science to improve people's lives. Our team, across some 100 countries, is dedicated to transforming the practice of medicine by working to turn the impossible into the possible. We provide potentially life-changing treatment options and life-saving vaccine protection to millions of people globally, while putting sustainability and social responsibility at the center of our ambitions.

Sanofi has recently embarked on a vast and ambitious digital transformation program. A cornerstone of this roadmap is the acceleration of its data transformation and of the adoption of artificial intelligence (AI) and machine learning (ML) solutions that will accelerate Manufacturing & Supply performance and help bring drugs and vaccines to patients faster, to improve health and save lives.

Who You Are:
You are a dynamic Data Engineer interested in challenging the status quo to design and develop globally scalable solutions that are needed by Sanofi's advanced analytics, AI and ML initiatives for the betterment of our global patients and customers. You are a valued influencer and leader who has contributed to making key datasets available to data scientists, analysts, and consumers throughout the enterprise to meet vital business needs. You have a keen eye for improvement opportunities while continuing to fully comply with all data quality, security, and governance standards.

Our vision for digital, data analytics and AI
Join us on our journey in enabling Sanofi's digital transformation through becoming an AI-first organization. This means:
AI Factory - Versatile Teams Operating in Cross Functional Pods: Utilizing digital and data resources to develop AI products, bringing data management, AI and product development skills to products, programs and projects to create an agile, fulfilling and meaningful work environment.
Leading Edge Tech Stack: Experience building products that will be deployed globally on a leading-edge tech stack.
World Class Mentorship and Training: Working with renowned leaders and academics in machine learning to further develop your skillsets.

Job Highlights
Propose and establish technical designs to meet business and technical requirements
Develop and maintain data engineering solutions based on requirements and design specifications using appropriate tools and technologies
Create data pipelines / ETL pipelines and optimize performance
Test and validate developed solutions to ensure they meet requirements
Create design and development documentation based on standards for knowledge transfer, training, and maintenance
Work with business and product teams to understand requirements, and translate them into technical needs
Adhere to and promote best practices and standards for code management, automated testing, and deployments
Leverage existing or create new standard data pipelines within Sanofi to bring value through business use cases
Develop automated tests for CI/CD pipelines
Gather/organize large and complex data assets, and perform relevant analysis
Conduct peer reviews for quality, consistency, and rigor for production-level solutions
Actively contribute to the Data Engineering community and define leading practices and frameworks
Communicate results and findings in a clear, structured manner to stakeholders
Remain up to date on the company's standards, industry practices and emerging technologies

Key Functional Requirements & Qualifications
Experience working with cross-functional teams to solve complex data architecture and engineering problems
Demonstrated ability to learn new data and software engineering technologies in a short amount of time
Good understanding of agile/scrum development processes and concepts
Able to work in a fast-paced, constantly evolving environment and manage multiple priorities
Strong technical analysis and problem-solving skills related to data and technology solutions
Excellent written, verbal, and interpersonal skills with the ability to communicate ideas, concepts and solutions to peers and leaders
Pragmatic and capable of solving complex issues, with technical intuition and attention to detail
Service-oriented, flexible, and approachable team player
Fluent in English (other languages a plus)

Key Technical Requirements & Qualifications
Bachelor's Degree or equivalent in Computer Science, Engineering, or a relevant field
4 to 5+ years of experience in data engineering, integration, data warehousing, business intelligence, business analytics, or a comparable role with relevant technologies and tools, such as Spark/Scala, Informatica/IICS/dbt
Understanding of data structures and algorithms
Working knowledge of scripting languages (Python, Shell scripting)
Experience in cloud-based data platforms (Snowflake is a plus)
Experience with job scheduling and orchestration (Airflow is a plus)
Good knowledge of SQL and relational database technologies/concepts
Experience working with data models and query tuning

Nice To Haves
Experience working in the life sciences/pharmaceutical industry is a plus
Familiarity with data ingestion through batch, near real-time, and streaming environments
Familiarity with data warehouse concepts and architectures (data mesh a plus)
Familiarity with source code management tools (GitHub a plus)

Why choose us?
Bring the miracles of science to life alongside a supportive, future-focused team. Discover endless opportunities to grow your talent and drive your career, whether it's through a promotion or lateral move, at home or internationally.
Enjoy a thoughtful, well-crafted rewards package that recognizes your contribution and amplifies your impact. Take good care of yourself and your family, with a wide range of health and wellbeing benefits including high-quality healthcare, prevention and wellness programs and at least 14 weeks' gender-neutral parental leave. Opportunity to work in an international environment, collaborating with diverse business teams and vendors, working in a dynamic team, and fully empowered to propose and implement innovative ideas.

Pursue Progress. Discover Extraordinary.
Progress doesn't happen without people – people from different backgrounds, in different locations, doing different roles, all united by one thing: a desire to make miracles happen. You can be one of those people. Chasing change, embracing new ideas and exploring all the opportunities we have to offer. Let's pursue progress. And let's discover extraordinary together. At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, or gender identity. Watch our ALL IN video and check out our Diversity, Equity and Inclusion actions at sanofi.com!

Join Sanofi and step into a new era of science - where your growth can be just as transformative as the work we do. We invest in you to reach further, think faster, and do what's never been done before. You'll help push boundaries, challenge convention, and build smarter solutions that reach the communities we serve. Ready to chase the miracles of science and improve people's lives? Let's Pursue Progress and Discover Extraordinary – together.

At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, protected veteran status or other characteristics protected by law.
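The technical requirements above mention job scheduling and orchestration with Airflow. The following is a minimal illustrative sketch (not from the posting) of an Airflow DAG wiring an extract, transform, and load sequence; the task logic and schedule are hypothetical placeholders, and exact DAG arguments vary slightly between Airflow versions.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    print("pulling source data")         # placeholder for a real extract step


def transform():
    print("cleaning and joining data")   # placeholder for a real transform step


def load():
    print("loading into the warehouse")  # placeholder for a real load step


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",   # use schedule_interval on older Airflow 2.x releases
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_transform = PythonOperator(task_id="transform", python_callable=transform)
    t_load = PythonOperator(task_id="load", python_callable=load)

    t_extract >> t_transform >> t_load
```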

Posted 4 days ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

Remote

About the Role:
We are hiring AI/ML Engineers to join a growing remote-first product and research team working on various AI/ML-driven applications across domains like:
• Real Estate Intelligence
• Fintech
• Predictive Analytics
• Generative AI
• AI Assistants & Agents
• Data Automation Tools
This is a hands-on engineering role best suited for freshers or early-stage professionals looking to gain deep experience with practical ML model development, LLM integrations, and production deployment.

🛠 Responsibilities:
• Assist in training and tuning ML models using scikit-learn, TensorFlow, or PyTorch
• Work with structured and unstructured datasets using Pandas, NumPy, SQL, and APIs
• Build and test AI pipelines: preprocessing → modeling → evaluation → deployment
• Integrate AI models into microservices (FastAPI/Flask)
• Use LLM APIs (OpenAI, Anthropic, Gemini, Mistral) for building AI assistants and tools
• Implement vector search and semantic search using Pinecone or ChromaDB
• Write clean, reusable, and well-documented code in Python

✅ Requirements:
• Degree in Computer Science, Data Science, Engineering, or equivalent practical skills
• Solid understanding of ML concepts: regression, classification, clustering, etc.
• Experience with Python and Jupyter notebooks; Pandas, NumPy, Matplotlib; and at least one ML library: scikit-learn, TensorFlow, or PyTorch
• Good grasp of APIs and working with JSON/REST endpoints
• Strong problem-solving ability and attention to detail
• Basic version control with Git

🌟 Nice-to-Haves:
• Exposure to OpenAI, LangChain, Hugging Face, or LlamaIndex
• Familiarity with Pinecone, ChromaDB, or vector databases
• Understanding of cloud deployment (Google Cloud, AWS, or Firebase)
• Participation in hackathons, Kaggle competitions, or personal ML projects
• Interest in domain-specific AI (real estate, finance, e-commerce, etc.)

🧠 You'll Learn & Work With:
• AI prompt engineering & chaining logic
• LLM-driven workflows using LangChain / RAG pipelines
• End-to-end ML lifecycle (train → deploy → monitor)
• Generative AI and AI copilots
• FastAPI for microservice integration
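One responsibility above is integrating AI models into FastAPI microservices. As a minimal illustrative sketch (not from the posting), the snippet below wraps a toy scikit-learn model in a FastAPI endpoint; the training data is a placeholder, and a real service would load a persisted model artifact instead of training at import time.

```python
import numpy as np
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.linear_model import LogisticRegression

# Toy model: predicts a class from two numeric features.
X = np.array([[0.1, 0.2], [0.9, 0.8], [0.2, 0.1], [0.8, 0.9]])
y = np.array([0, 1, 0, 1])
model = LogisticRegression().fit(X, y)

app = FastAPI(title="example-model-service")


class Features(BaseModel):
    f1: float
    f2: float


@app.post("/predict")
def predict(features: Features):
    proba = model.predict_proba([[features.f1, features.f2]])[0, 1]
    return {"positive_probability": round(float(proba), 4)}

# Run locally (assumed entry point): uvicorn app:app --reload
```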

Posted 4 days ago

Apply

10.0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site

Software Architect - Cloud

Qualification: B.Tech/B.E/MSc/MCA
Experience: 10 years

Responsibilities
Architect and implement the AI-driven Cloud/SaaS offering
Research and design new frameworks and functional and non-functional features for various products, meeting high quality standards
Ensure products delivered are designed for the required scale, resiliency and efficiency
Motivate Lead and Senior developers by assisting them to contribute beyond their levels for their professional and technical growth
Contribute to academic outreach programs and other company branding activities

Requirements of the role
Designed and delivered one or more widely used enterprise-class SaaS application(s); preference will be given to candidates who have domain knowledge of marketing technologies
Knowledge of cloud computing infrastructure; AWS certification will be an advantage
Hands-on experience in designing and managing scalable distributed systems
Awareness of AI/ML technologies and their adoption in enterprise applications
Hands-on experience with big data technologies (Hadoop, MapReduce, Spark, Hive, HBase)
Hands-on experience in working with in-memory databases and caching systems
Hands-on experience with ETL (Extract-Transform-Load) tools
Hands-on experience in containerization solutions such as Kubernetes
Experience with large-scale RDBMS deployments and SQL optimization
Hands-on experience in building and managing Agile and Scrum development processes
Hands-on development experience in Java and Spring (Core, JPA, Boot, Cloud) technologies
Hands-on experience in Git
Aligned with and experienced in DevOps

Posted 4 days ago

Apply

0 years

0 Lacs

Thiruvananthapuram, Kerala, India

On-site

Senior Software Engineer - R&D

Qualification and Experience
B.Tech / M.Tech / M.E / MS / M.Sc in Computer Science or a related discipline (Applied Mathematics, Statistics, Electrical and/or Computer Engineering), or MCA
Demonstrated commitment towards mastering AI/machine learning through own initiatives (side projects, books, MOOC courses etc.) would be a strong plus

Responsibilities
Implement and/or productize AI/machine learning algorithms at scale, utilizing distributed computing techniques, research findings, AI best practices and state-of-the-art frameworks/libraries
Set up and manage infrastructure, tools and frameworks for data management and transformation to facilitate AI R&D
Package AI/ML algorithms to construct reusable AI recipes/components and/or create APIs for consuming packaged AI models
Create examples and prototypes demonstrating consumption of packaged AI/machine learning algorithms
Follow best practices to modularize, validate and package source code, and follow proper source control management guidelines
Conduct code reviews and mentor junior team members
Work closely with AI researchers to productize innovations

Requirements of the role
The candidate should be strong in the fundamentals of computer science, especially in algorithm analysis and design, and should be proficient in Python programming. The candidate should have experience in working with and maintaining Linux-based systems, and should be hands-on in some/all of the following areas:
Applying AI/machine learning, natural language processing and information retrieval algorithms on large datasets
Creating and/or consuming AI/machine learning algorithms using tools/frameworks/libraries such as Jupyter/Zeppelin, scikit-learn, NumPy, SciPy, matplotlib, pandas, TensorFlow, Keras, Apache Spark etc.
ETL/data cleansing and enrichment using Hadoop/Spark/other Big Data frameworks
Gathering and processing raw data at scale using web scraping, API calls, crawling public repositories etc.
Experience in working with SCM tools such as Git/GitHub/Bitbucket; exposure to Extreme Programming (XP), DevOps and/or Agile methodology
Experience in conducting design and/or code reviews

Job Code: SSE R&D_TVM
Location: Trivandrum
For more information, please mail to: recruitment@flytxt.com
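One responsibility above is packaging AI/ML algorithms into reusable components that other services can consume. The snippet below is an illustrative sketch only (not from the posting): persisting a trained scikit-learn model with joblib and reloading it for inference; the dataset and file name are placeholders.

```python
import joblib
from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

model = RandomForestClassifier(n_estimators=100, random_state=42).fit(X_train, y_train)
print(f"holdout accuracy: {model.score(X_test, y_test):.3f}")

# Persist the fitted model so it can be shipped as a reusable component.
joblib.dump(model, "iris_rf.joblib")

# Elsewhere (e.g., inside an API service), reload the artifact and serve predictions.
restored = joblib.load("iris_rf.joblib")
print(restored.predict(X_test[:3]))
```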

Posted 4 days ago

Apply

10.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Role: Quantum Developer
Required Technical Skill Set: Quantum Computing, Optimization, AI/ML
Desired Experience Range: 10+ years
Location of Requirement: PAN India

Desired Competencies (Technical/Behavioural Competency)

Must-Have
Strong understanding of quantum mechanics and quantum computing principles
Proficiency in Python and numerical libraries
Experience with quantum frameworks/SDKs (such as Qiskit, PennyLane etc.)
Extensive experience with classical algorithms and optimization techniques
Hands-on experience in modelling and solving optimization problems using various solvers
Experience with implementing algorithms on NISQ-era devices and quantum annealers
Strong foundation in linear algebra (advanced) and applied mathematics

Good-to-Have
Familiarity with cloud-based quantum services such as AWS Braket
Knowledge in adjacent areas like machine learning, HPC and cryptography
TensorFlow and PyTorch (for quantum machine learning applications)

Responsibility of / Expectations from the Role
Design and implement quantum algorithms for specific use cases (across different problem types such as cryptography, optimization, AI/ML and simulation) across different manufacturing domains
Develop hybrid quantum-classical solutions using various quantum frameworks/SDKs like Qiskit, PennyLane and Braket
Deploy and test the solutions on different quantum simulators and hardware from providers such as D-Wave, IBM and IonQ
Design mathematical models for optimization problems (MILP, LP, NLP etc.)
Work with the cross-functional team to understand problem requirements and model constraints, objectives and variables
Implement the model(s) using solvers like CPLEX, Gurobi etc.
Benchmark the results from quantum solutions against classical methods
Explore new tools/frameworks/products launched in the quantum computing space
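The role above involves building circuits with Qiskit and testing them on simulators. As a minimal illustrative sketch (not from the posting), the snippet below prepares and simulates a two-qubit Bell state; it assumes Qiskit with the Aer simulator installed, and import paths differ slightly between Qiskit versions.

```python
from qiskit import QuantumCircuit, transpile
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2)
qc.h(0)          # put qubit 0 into superposition
qc.cx(0, 1)      # entangle qubit 1 with qubit 0
qc.measure_all()

sim = AerSimulator()
result = sim.run(transpile(qc, sim), shots=1000).result()
print(result.get_counts())   # expected: roughly equal counts of '00' and '11'
```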

Posted 4 days ago

Apply

6.0 - 10.0 years

20 - 30 Lacs

Hyderabad

Hybrid

Job Description
We are seeking a Senior Software Engineer to lead and guide our engineering team in building and enhancing our internal AI Gateway platform. This platform empowers internal organizations to create and deploy AI use cases, integrating with Azure OpenAI, AWS Bedrock, Databricks, and other cloud platforms. You will drive architecture, design, and implementation of scalable data/model management pipelines, agentic AI, RAGs, tracing, and MCP server components.

Essential to have hands-on development experience in Python, JavaScript, TypeScript, and Node.js.
Essential to have hands-on experience in building scalable GenAI applications leveraging LLMs (e.g. GPT, Claude, Anthropic) using various techniques (e.g. RAG, agentic AI etc.).
Preferred experience in developing GenAI solutions using frameworks such as LangChain and LlamaIndex.
Preferred experience in building AI pipelines for interaction via chatbots, user interfaces, and autonomous agents.
Must have excellent communication skills to understand complex AI concepts and work with architects to build the solution, as well as the ability to explain it to non-technical stakeholders.
Preferred experience in deploying solutions on AWS with best practices.
Good to have built capabilities in GenAI backed by evaluation (using any existing evaluation framework or one custom-built for the specific use case). Contributions to AI research, open-source models or AI hackathons are a plus.

Responsibilities:
Lead the design and development of the AI Gateway platform and its integrations (Azure OpenAI, AWS Bedrock, Databricks, etc.).
Architect and implement scalable, secure, and maintainable data/model management pipelines.
Guide the team in building agentic AI, RAG (Retrieval-Augmented Generation), tracing, and MCP server solutions.
Mentor and upskill team members, conduct code reviews, and enforce best practices.
Collaborate with product, DevOps, and data science teams to deliver robust solutions.
Drive automation and CI/CD for model and data pipeline deployments.
Ensure platform reliability, observability, and security.

Requirements:
7+ years of software engineering experience, with at least 2 years in a technical leadership role.
Strong Python (FastAPI, asyncio), cloud (Azure, AWS), and data engineering skills.
Experience with LLMs, agentic AI, RAGs, and orchestration frameworks.
Hands-on with cloud ML services (Azure OpenAI, AWS Bedrock, Databricks).
Familiarity with CI/CD, Docker, Kubernetes, and infrastructure-as-code.
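The platform described above routes AI requests to multiple providers (Azure OpenAI, AWS Bedrock, etc.). The snippet below is an illustrative sketch only, not the company's actual gateway: it shows the core routing idea of a common interface over interchangeable providers. The provider classes are stubs; real implementations would call the providers' SDKs.

```python
from abc import ABC, abstractmethod


class LLMProvider(ABC):
    """Common interface so application code is independent of the backend."""

    @abstractmethod
    def generate(self, prompt: str) -> str: ...


class StubAzureOpenAI(LLMProvider):
    def generate(self, prompt: str) -> str:
        return f"[azure-openai stub] response to: {prompt}"


class StubBedrock(LLMProvider):
    def generate(self, prompt: str) -> str:
        return f"[bedrock stub] response to: {prompt}"


PROVIDERS: dict[str, LLMProvider] = {
    "azure-openai": StubAzureOpenAI(),
    "bedrock": StubBedrock(),
}


def gateway_generate(provider: str, prompt: str) -> str:
    """Route a generation request to the configured provider."""
    if provider not in PROVIDERS:
        raise ValueError(f"unknown provider: {provider}")
    return PROVIDERS[provider].generate(prompt)


if __name__ == "__main__":
    print(gateway_generate("bedrock", "Summarize last quarter's spend."))
```

Keeping providers behind one interface is what lets a gateway add tracing, quotas, or evaluation hooks in a single place rather than in every consuming application.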

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

Gurugram, Haryana, India

On-site

Summary

Position Summary

AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms.
Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions.
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements.

Job Title: Senior Generative AI Developer/Team Lead

Job Summary:
We are looking for a Generative AI Team Lead with hands-on experience to design, develop and deploy AI and Generative AI models that generate high-quality content, such as text, images, chatbots, etc. The ideal candidate will have expertise in deep learning, natural language processing, and computer vision.

Key Responsibilities:
Lead and deliver large-scale AI/Gen AI projects across multiple industries and domains.
Liaise with on-site and client teams to understand various business problem statements and project requirements.
Lead a team of Data Engineers, ML/AI Engineers, Prompt Engineers, and other Data & AI professionals to deliver projects from inception to implementation.
Brainstorm, build and improve AI/Gen AI models developed by the team, and identify scope for model improvements and best practices.
Assist and participate in pre-sales, client pursuits and proposals.
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members.

Qualifications:
6-10 years of relevant experience in Generative AI, Deep Learning, or NLP
Bachelor's or master's degree in a quantitative field
Led a 3-5 member team on multiple end-to-end AI/GenAI projects
Excellent communication and client/stakeholder management skills
Must have strong hands-on experience with programming languages like Python, CUDA and SQL, and frameworks such as TensorFlow, PyTorch and Keras
Hands-on experience with top LLM models like OpenAI GPT-3.5/4, Google Gemini, AWS Bedrock, LLaMA 3.0, and Mistral, along with RAG and Agentic workflows
Well versed with GANs and Transformer architecture, familiar with Diffusion models, and up to date with new research/progress in the field of Gen AI
Should follow research papers, and comprehend and innovate/present the best approaches/solutions related to Generative AI components
Knowledge of hyperscaler offerings (NVIDIA, AWS, Azure, GCP, Oracle) and Gen AI tools (Copilot, Vertex AI)
Knowledge of Vector DBs, Neo4j/relevant Graph DBs
Familiar with Docker containerization, Git, etc.
AI/Cloud certification from a premier institute is preferred.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are.
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 303629
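The role above works hands-on with Transformer-based generative models via Hugging Face. The snippet below is an illustrative sketch only (not from the posting): loading a small open text-generation model with the transformers pipeline. "gpt2" is used purely because it is small and public; production work would use the larger LLMs named in the posting behind proper evaluation and guardrails.

```python
from transformers import pipeline

# Load a small, publicly available generation model for demonstration purposes.
generator = pipeline("text-generation", model="gpt2")

prompt = "Generative AI can help enterprises by"
outputs = generator(prompt, max_new_tokens=40, num_return_sequences=1)
print(outputs[0]["generated_text"])
```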

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Summary

Position Summary

AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.

AI & Data will work with our clients to:
Implement large-scale data ecosystems including data management, governance and the integration of structured and unstructured data to generate insights leveraging cloud-based platforms.
Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions.
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements.

Job Title: Senior Generative AI Developer/Team Lead

Job Summary:
We are looking for a Generative AI Team Lead with hands-on experience to design, develop and deploy AI and Generative AI models that generate high-quality content, such as text, images, chatbots, etc. The ideal candidate will have expertise in deep learning, natural language processing, and computer vision.

Key Responsibilities:
Lead and deliver large-scale AI/Gen AI projects across multiple industries and domains.
Liaise with on-site and client teams to understand various business problem statements and project requirements.
Lead a team of Data Engineers, ML/AI Engineers, Prompt Engineers, and other Data & AI professionals to deliver projects from inception to implementation.
Brainstorm, build and improve AI/Gen AI models developed by the team, and identify scope for model improvements and best practices.
Assist and participate in pre-sales, client pursuits and proposals.
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members.

Qualifications:
6-10 years of relevant experience in Generative AI, Deep Learning, or NLP
Bachelor's or master's degree in a quantitative field
Led a 3-5 member team on multiple end-to-end AI/GenAI projects
Excellent communication and client/stakeholder management skills
Must have strong hands-on experience with programming languages like Python, CUDA and SQL, and frameworks such as TensorFlow, PyTorch and Keras
Hands-on experience with top LLM models like OpenAI GPT-3.5/4, Google Gemini, AWS Bedrock, LLaMA 3.0, and Mistral, along with RAG and Agentic workflows
Well versed with GANs and Transformer architecture, familiar with Diffusion models, and up to date with new research/progress in the field of Gen AI
Should follow research papers, and comprehend and innovate/present the best approaches/solutions related to Generative AI components
Knowledge of hyperscaler offerings (NVIDIA, AWS, Azure, GCP, Oracle) and Gen AI tools (Copilot, Vertex AI)
Knowledge of Vector DBs, Neo4j/relevant Graph DBs
Familiar with Docker containerization, Git, etc.
AI/Cloud certification from a premier institute is preferred.

Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are.
Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.

Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.

Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.

Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.

Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.

Requisition code: 303629

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

Kolkata, West Bengal, India

On-site

Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms.
Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions.
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements.
Job Title: Senior Generative AI Developer / Team Lead
Job Summary: We are looking for a Generative AI Team Lead with hands-on experience designing, developing and deploying AI and Generative AI models that generate high-quality content such as text and images and power conversational experiences such as chatbots. The ideal candidate will have expertise in deep learning, natural language processing, and computer vision.
Key Responsibilities:
Lead and deliver large-scale AI/Gen AI projects across multiple industries and domains.
Liaise with on-site and client teams to understand business problem statements and project requirements.
Lead a team of Data Engineers, ML/AI Engineers, Prompt Engineers, and other Data & AI professionals to deliver projects from inception to implementation.
Brainstorm, build and improve AI/Gen AI models developed by the team, and identify scope for model improvements and best practices.
Assist and participate in pre-sales, client pursuits and proposals.
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members.
Qualifications:
6-10 years of relevant experience in Generative AI, Deep Learning, or NLP.
Bachelor's or master's degree in a quantitative field.
Has led a 3-5-member team on multiple end-to-end AI/GenAI projects.
Excellent communication and client/stakeholder management skills.
Strong hands-on experience with programming languages such as Python, CUDA and SQL, and frameworks such as TensorFlow, PyTorch and Keras.
Hands-on experience with leading LLMs such as OpenAI GPT-3.5/4, Google Gemini, AWS Bedrock, LLaMA 3.0, and Mistral, along with RAG and agentic workflows.
Well-versed in GAN and Transformer architectures, familiar with diffusion models, and up to date with new research and progress in the field of Gen AI.
Able to follow research papers and to comprehend, innovate on and present the best approaches and solutions for Generative AI components.
Knowledge of hyperscaler offerings (NVIDIA, AWS, Azure, GCP, Oracle) and Gen AI tools (Copilot, Vertex AI).
Knowledge of vector databases and Neo4j or other relevant graph databases.
Familiar with Docker containerization, Git, etc.
AI/Cloud certification from a premier institute is preferred.
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 303629

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms.
Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions.
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements.
Job Title: Senior Generative AI Developer / Team Lead
Job Summary: We are looking for a Generative AI Team Lead with hands-on experience designing, developing and deploying AI and Generative AI models that generate high-quality content such as text and images and power conversational experiences such as chatbots. The ideal candidate will have expertise in deep learning, natural language processing, and computer vision.
Key Responsibilities:
Lead and deliver large-scale AI/Gen AI projects across multiple industries and domains.
Liaise with on-site and client teams to understand business problem statements and project requirements.
Lead a team of Data Engineers, ML/AI Engineers, Prompt Engineers, and other Data & AI professionals to deliver projects from inception to implementation.
Brainstorm, build and improve AI/Gen AI models developed by the team, and identify scope for model improvements and best practices.
Assist and participate in pre-sales, client pursuits and proposals.
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members.
Qualifications:
6-10 years of relevant experience in Generative AI, Deep Learning, or NLP.
Bachelor's or master's degree in a quantitative field.
Has led a 3-5-member team on multiple end-to-end AI/GenAI projects.
Excellent communication and client/stakeholder management skills.
Strong hands-on experience with programming languages such as Python, CUDA and SQL, and frameworks such as TensorFlow, PyTorch and Keras.
Hands-on experience with leading LLMs such as OpenAI GPT-3.5/4, Google Gemini, AWS Bedrock, LLaMA 3.0, and Mistral, along with RAG and agentic workflows.
Well-versed in GAN and Transformer architectures, familiar with diffusion models, and up to date with new research and progress in the field of Gen AI.
Able to follow research papers and to comprehend, innovate on and present the best approaches and solutions for Generative AI components.
Knowledge of hyperscaler offerings (NVIDIA, AWS, Azure, GCP, Oracle) and Gen AI tools (Copilot, Vertex AI).
Knowledge of vector databases and Neo4j or other relevant graph databases.
Familiar with Docker containerization, Git, etc.
AI/Cloud certification from a premier institute is preferred.
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 303629

Posted 4 days ago

Apply

10.0 - 20.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Dear Aspirant,
Greetings from TCS!
TCS presents an excellent opportunity for a Product Architect [Quantum Computing, Optimization, AI/ML].
Experience: 10 - 20 years
Job Location: Chennai / Bangalore / Hyderabad / Mumbai / Pune / Kolkata / Delhi / Noida / Gurgaon
Quantum Computing, Optimization, AI/ML
Technical Expertise: Strong understanding of software development concepts and coding skills.
System Design & Modeling: Ability to design and model complex software systems.
Communication & Collaboration: Excellent communication and collaboration skills for working with stakeholders.
Problem-Solving & Critical Thinking: Strong problem-solving and critical thinking skills for identifying and resolving issues.
Project Management & Leadership: Ability to manage projects and guide teams effectively.
Good to have:
Certifications such as Certified Software Architect or similar.
Experience in Agile methodologies and DevOps practices.
Regards,
Sai Lokesh S

Posted 4 days ago

Apply

5.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Position Overview: ShyftLabs is seeking a skilled Databricks Engineer to support the design, development, and optimization of big data solutions using the Databricks Unified Analytics Platform. This role requires strong expertise in Apache Spark, SQL, Python, and cloud platforms (AWS/Azure/GCP). The ideal candidate will collaborate with cross-functional teams to drive data-driven insights and ensure scalable, high-performance data architectures.
ShyftLabs is a growing data product company that was founded in early 2020 and works primarily with Fortune 500 companies. We deliver digital solutions built to help accelerate the growth of businesses in various industries by focusing on creating value through innovation.
Job Responsibilities
Design, implement, and optimize big data pipelines in Databricks
Develop scalable ETL workflows to process large datasets
Leverage Apache Spark for distributed data processing and real-time analytics
Implement data governance, security policies, and compliance standards
Optimize data lakehouse architectures for performance and cost-efficiency
Collaborate with data scientists, analysts, and engineers to enable advanced AI/ML workflows
Monitor and troubleshoot Databricks clusters, jobs, and performance bottlenecks
Automate workflows using CI/CD pipelines and infrastructure-as-code practices
Ensure data integrity, quality, and reliability in all pipelines
Basic Qualifications
Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field
5+ years of hands-on experience with Databricks and Apache Spark
Proficiency in SQL, Python, or Scala for data processing and analysis
Experience with cloud platforms (AWS, Azure, or GCP) for data engineering
Strong knowledge of ETL frameworks, data lakes, and Delta Lake architecture
Experience with CI/CD tools and DevOps best practices
Familiarity with data security, compliance, and governance best practices
Strong problem-solving and analytical skills with an ability to work in a fast-paced environment
Preferred Qualifications
Databricks certifications (e.g., Databricks Certified Data Engineer, Spark Developer)
Hands-on experience with MLflow, Feature Store, or Databricks SQL
Exposure to Kubernetes, Docker, and Terraform
Experience with streaming data architectures (Kafka, Kinesis, etc.)
Strong understanding of business intelligence and reporting tools (Power BI, Tableau, Looker)
Prior experience working with retail, e-commerce, or ad-tech data platforms
We are proud to offer a competitive salary alongside a strong insurance package. We pride ourselves on the growth of our employees, offering extensive learning and development resources.
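For context, a minimal sketch of the kind of Databricks/PySpark ETL pipeline described above. It assumes a Delta-enabled Spark environment (as on the Databricks runtime); the storage paths and column names are hypothetical placeholders, not part of the posting.

from pyspark.sql import SparkSession, functions as F

# On Databricks a SparkSession is already provided as `spark`;
# getOrCreate() keeps the sketch runnable elsewhere too.
spark = SparkSession.builder.appName("orders_etl").getOrCreate()

# Hypothetical raw input: JSON order events landed in cloud storage.
raw = spark.read.json("/mnt/raw/orders/")

# Basic cleansing and enrichment.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Write to a Delta table partitioned by date for downstream analytics.
(
    clean.write.format("delta")
         .mode("append")
         .partitionBy("order_date")
         .save("/mnt/curated/orders")
)

A production job would typically add schema enforcement, data-quality checks, and scheduling via Databricks Jobs or a CI/CD pipeline, as the responsibilities above suggest.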

Posted 4 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

About Marvell
Marvell's semiconductor solutions are the essential building blocks of the data infrastructure that connects our world. Across enterprise, cloud and AI, automotive, and carrier architectures, our innovative technology is enabling new possibilities. At Marvell, you can affect the arc of individual lives, lift the trajectory of entire industries, and fuel the transformative potential of tomorrow. For those looking to make their mark on purposeful and enduring innovation, above and beyond fleeting trends, Marvell is a place to thrive, learn, and lead.
Your Team, Your Impact
Marvell Technology is a global leader in the semiconductor industry, specializing in the design and development of high-performance semiconductor solutions that enable the seamless movement of data across various platforms. Marvell's innovative technology powers the world's leading products in storage, networking, and connectivity.
We are seeking a motivated People Analytics Intern to join our team. In this role, you'll work with people data to solve organizational challenges by building solutions using AI/ML models, including large language models (LLMs). You will primarily focus on automating text analysis, building predictive models and identifying patterns to help improve business outcomes.
Internship Duration: 3 months
What You Can Expect
Collaborate with HR leaders and the People Analytics team to translate business challenges into technical solutions.
Develop and deploy AI/ML solutions to analyze and interpret workforce data, providing predictive and prescriptive insights.
Fine-tune pre-trained LLMs to align with organizational context and specific business objectives.
Build interactive dashboards for non-technical stakeholders to explore text-based insights.
Stay up to date on the latest developments in LLM architectures and NLP; document findings and present results to the team.
What We're Looking For
Currently pursuing a degree in Data Science, Artificial Intelligence, Computer Science or a related field.
Proficiency in Python, R, or similar languages for data analysis and model development.
Familiarity with NLP tools and libraries (e.g., NLTK, spaCy).
Experience with large language models (LLMs) such as GPT, LLaMA, or similar transformer-based architectures.
Understanding of natural language processing tasks such as text classification, sentiment analysis, and named entity recognition.
Strong analytical, problem-solving and communication skills, with the ability to present complex technical information clearly.
Ability to work independently and collaboratively as part of a team.
Preferred Qualifications:
Experience working with HR data and an understanding of people metrics.
Knowledge of cloud platforms (e.g., AWS, Azure) for model deployment.
Familiarity with visualization tools (e.g., Power BI, Tableau).
Prior research experience or internships in AI/ML.
Additional Compensation and Benefit Elements
With competitive compensation and great benefits, you will enjoy our workstyle within an environment of shared collaboration, transparency, and inclusivity. We're dedicated to giving our people the tools and resources they need to succeed in doing work that matters, and to grow and develop with us. For additional information on what it's like to work at Marvell, visit our Careers page.
All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability or protected veteran status.
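As an illustration of the text-analysis work described above, here is a minimal sketch of a text-classification baseline with scikit-learn. The example comments and labels are invented placeholders purely to make the sketch runnable; they are not Marvell data, and a real project might move from this baseline to fine-tuned transformers or LLMs.

from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical employee comments and sentiment labels (1 = positive, 0 = negative).
comments = [
    "Great onboarding experience and supportive manager",
    "Too many meetings, hard to focus on project work",
    "Learning opportunities here are excellent",
    "Unclear career path and limited feedback",
]
labels = [1, 0, 1, 0]

# TF-IDF features feeding a simple linear classifier -- a common first baseline
# for sentiment analysis or topic tagging before heavier models are tried.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(comments, labels)

print(model.predict(["Supportive team but too many status meetings"]))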

Posted 4 days ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Job Summary: We are looking for a passionate and skilled Automation Engineer with 3 to 5 years of experience in test automation using Python, pytest, and Selenium, with a solid understanding of RESTful API testing and CI/CD pipelines (Jenkins). Experience or interest in automating NMS (Network Management Systems) is a plus.
Key Responsibilities:
Design, develop, and maintain robust automation frameworks using Python and pytest
Create, execute, and maintain automated test scripts for web-based applications using Selenium with Python
Develop and maintain automated test cases for RESTful APIs using tools like requests or httpx
Integrate automated tests into CI/CD pipelines using Jenkins and contribute to continuous testing practices
Analyze test results, identify root causes of failures, and work closely with developers to resolve issues
Required Skills & Qualifications:
3 to 5 years of hands-on experience in test automation
Strong programming skills in Python
Experience with pytest or other test frameworks
Proficiency in using Selenium with Python for UI automation
Hands-on experience testing RESTful APIs
Good understanding of CI/CD practices, especially using Jenkins
Familiarity with version control tools like Git
Strong analytical and problem-solving skills
Excellent communication and documentation skills
Nice to Have:
Experience automating NMS (Network Management Systems) applications
Exposure to Docker or virtual test environments
Exposure to AI/ML concepts or interest in AI-driven test automation
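A minimal sketch of the kind of pytest-based checks described above: one REST API test using requests and one UI test using Selenium. The URLs, element locators, and expected values are hypothetical placeholders, and the Selenium test assumes a local Chrome/chromedriver setup (or a Jenkins agent with one installed).

import pytest
import requests
from selenium import webdriver
from selenium.webdriver.common.by import By

BASE_URL = "https://api.example.com"  # hypothetical service under test


def test_health_endpoint_returns_ok():
    # Simple REST API check: status code plus a field in the JSON body.
    response = requests.get(f"{BASE_URL}/health", timeout=10)
    assert response.status_code == 200
    assert response.json().get("status") == "ok"


@pytest.fixture
def browser():
    # Headless Chrome so the test can run on a CI agent without a display.
    options = webdriver.ChromeOptions()
    options.add_argument("--headless=new")
    driver = webdriver.Chrome(options=options)
    yield driver
    driver.quit()


def test_login_page_shows_username_field(browser):
    # Hypothetical web UI check with Selenium.
    browser.get("https://app.example.com/login")
    assert "Login" in browser.title
    assert browser.find_element(By.ID, "username").is_displayed()

In a Jenkins pipeline these tests would typically run via a simple "pytest -q" step, with the JUnit XML report published for trend tracking.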

Posted 4 days ago

Apply

2.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Ways of Working - Employees will work from the office in hybrid mode (Bangalore).
About Swiggy
Swiggy is India's leading on-demand delivery platform with a tech-first approach to logistics and a solution-first approach to consumer demands. With a presence in 500+ cities across India, partnerships with hundreds of thousands of restaurants, an employee base of over 5000, and a 2 lakh+ strong independent fleet of Delivery Executives, we deliver unparalleled convenience driven by continuous innovation. Built on the back of robust ML technology and fuelled by terabytes of data processed every day, Swiggy offers a fast, seamless and reliable delivery experience for millions of customers across India. From starting out as a hyperlocal food delivery service in 2014 to becoming India's leading on-demand convenience platform today, our capabilities result not only in lightning-fast delivery for customers, but also in a productive and fulfilling experience for our employees.
About This Role
This position will be a key part of the Growth Marketing vertical, responsible for driving disproportionate growth by shipping impactful solutions in performance marketing, owned media, and other high-growth domains. The role involves end-to-end ownership of product discovery, solutioning, PRD creation, GTM strategies and root cause analysis (RCA). The position will work cross-functionally with tech, marketing, design, and analytics teams to conceptualize, execute, and scale products that fuel business growth.
Responsibilities
Product Discovery & Ideation: Identify opportunities for growth through performance marketing and owned media solutions. Conduct market research, user feedback, and competitive analysis to define high-impact problem statements.
Solution Design & PRD Development: Develop comprehensive Product Requirement Documents (PRDs) for solutions that align with growth objectives. Collaborate with tech teams to define product architecture and features.
Cross-functional Collaboration: Partner with marketing, design, analytics, and engineering teams to bring solutions from concept to launch. Act as the bridge between business and technical stakeholders, ensuring alignment across teams.
Experimentation & Problem Solving: Design, execute, and analyze experiments to validate product hypotheses. Conduct RCA wherever applicable.
GTM Strategy & Execution: Own the GTM for new products, ensuring timely execution and alignment with business goals. Develop frameworks for post-launch performance tracking and optimization.
Performance Marketing Innovation: Build tools and products to improve targeting, personalization, and media efficiency. Explore and implement new channels, algorithms, and automation to scale performance marketing efforts.
Leadership & Planning: Create and present quarterly, half-yearly and annual growth roadmaps to leadership. Influence and drive alignment across teams, including product, business, analytics, bizfin, brand, and design.
Required Skill-set
Proven ability to manage complex, cross-functional projects with measurable impact.
Strong analytical skills with a focus on leveraging data for decision-making.
Experience in crafting and driving PRDs, experimentation frameworks, and GTM strategies.
Familiarity with performance marketing channels (Google, Facebook, affiliates, programmatic, etc.) and tools (Snowflake, AppsFlyer, CleverTap, etc.).
Strong understanding of owned media, including CRM, push notifications, in-app messaging, and personalization.
Exceptional communication and stakeholder management skills, with a proven ability to influence without authority.
Self-driven with a strong bias for action and comfort with ambiguity.
Preferred Experience
2+ years in product management, growth marketing, or a similar role in B2C tech/e-commerce.
Experience collaborating with engineering, design, marketing, and analytics teams.
Prior work in performance marketing, owned media, or growth-focused roles is a strong plus.
This role is ideal for a self-starter passionate about solving growth challenges through innovative product thinking and collaborative execution. As a MarTech Manager, you will play a critical role in shaping the future of high-growth initiatives, driving user acquisition, engagement, and retention through impactful product solutions.
We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, disability status, or any other characteristic protected by the law.

Posted 4 days ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Company Description
Experian is a global data and technology company, powering opportunities for people and businesses around the world. We help to redefine lending practices, uncover and prevent fraud, simplify healthcare, and create marketing solutions, all using our unique combination of data, analytics and software. We also assist millions of people to realise their financial goals and help them save time and money.
We operate across a range of markets, from financial services to healthcare, automotive, agribusiness, insurance, and many more industry segments. We invest in people and new advanced technologies to unlock the power of data. As a FTSE 100 Index company listed on the London Stock Exchange (EXPN), we have a team of 22,500 people across 32 countries. Our corporate headquarters are in Dublin, Ireland. Learn more at experianplc.com
Job Description
Ready to shape the future? We are a global technology-driven organization committed to innovation, data intelligence, and scalable digital transformation. Our mission is to empower businesses and communities through cutting-edge cloud and AI solutions. We believe in fostering a culture of collaboration, continuous learning, and impactful change.
As a Solution Architect, you will design and govern modern, cloud-native, AI-enabled solutions. Collaborating with global teams, you'll review solution designs, identify architectural risks, and ensure alignment with enterprise standards, shaping the future technology landscape.
Key Responsibilities
You will be based in Hyderabad and report to the Head of Architecture.
Lead architectural reviews and provide feedback on designs and non-functional requirements.
Ensure architectural alignment and promote reuse and quality practices across global teams.
Contribute to the UKI Batch Strategy and advocate for the UKI Batch Platform.
Design scalable and secure AWS-based solutions.
Guide teams in applying the AWS Well-Architected Framework and best practices.
Maintain architectural documentation (blueprints, patterns, specs).
Lead migration to the UKI Batch Platform and decommission legacy systems.
Collaborate with product teams to gather requirements and shape the platform backlog.
Help define and evolve the UKI Batch Platform roadmap.
Stay updated on the latest technologies and best practices.
Mentor engineering teams and promote architectural excellence.
Provide leadership in integrating AI and Large Language Models (LLMs).
Qualifications
Minimum 5 years of experience as a Solution Architect in enterprise solution design.
Bachelor's degree in Computer Science, IT, or a related field.
Strong hands-on expertise with core AWS services (compute, storage, databases), container orchestration, serverless, data streaming, ETL, monitoring, performance tuning, and infrastructure as code.
Experience with cloud-native architectures, including OLTP, event-driven, and streaming workloads.
Proficient in microservices, containers, serverless, and cloud architecture patterns.
Experience with AI/ML frameworks and Large Language Models (LLMs).
Familiarity with architectural frameworks and the AWS Well-Architected Framework.
Knowledge of Agile methodologies and CI/CD practices.
Proficient in C#, Java, and Python.
Additional Information
Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people-first approach is award-winning: World's Best Workplaces™ 2024 (Fortune Global Top 25), Great Place To Work™ in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site and Glassdoor to understand why.
Experian is proud to be an Equal Opportunity and Affirmative Action employer. Innovation is a critical part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, color, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity.
Benefits
Experian cares for employees' work-life balance, health, safety and wellbeing. In support of this endeavor, we offer best-in-class family well-being benefits, enhanced medical benefits and paid time off.
Experian Careers - Creating a better tomorrow together
Find out what it's like to work for Experian by clicking here
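By way of illustration of the serverless, event-driven AWS patterns mentioned in this posting, here is a minimal sketch of an AWS Lambda handler that reacts to S3 object-created events using boto3. The bucket layout and downstream use are hypothetical assumptions, and error handling is kept deliberately small.

import json
import boto3

s3 = boto3.client("s3")


def handler(event, context):
    """Triggered by S3 object-created events; summarizes each new object."""
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]

        # Read object metadata rather than the full body to keep the function cheap.
        head = s3.head_object(Bucket=bucket, Key=key)
        results.append({"bucket": bucket, "key": key, "size": head["ContentLength"]})

    # In a real pipeline this summary might be pushed to a queue, a batch
    # platform, or a monitoring topic; here it is simply returned.
    return {"statusCode": 200, "body": json.dumps(results)}

In practice the function, its IAM role, and the S3 event notification would be declared in infrastructure-as-code (for example CloudFormation or Terraform), in line with the qualifications listed above.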

Posted 4 days ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Ways of Working - Full-time office role in hybrid mode (Bangalore).
About Swiggy
Swiggy is India's leading on-demand delivery platform with a tech-first approach to logistics and a solution-first approach to consumer demands. With a presence in 500+ cities across India, partnerships with hundreds of thousands of restaurants, an employee base of over 5000, and a 2 lakh+ strong independent fleet of Delivery Executives, we deliver unparalleled convenience driven by continuous innovation. Built on the back of robust ML technology and fuelled by terabytes of data processed every day, Swiggy offers a fast, seamless and reliable delivery experience for millions of customers across India. From starting out as a hyperlocal food delivery service in 2014 to becoming India's leading on-demand convenience platform today, our capabilities result not only in lightning-fast delivery for customers, but also in a productive and fulfilling experience for our employees.
Roles and Responsibilities
Draft, review and negotiate a variety of commercial agreements, non-disclosure agreements, supply agreements, master service agreements, statements of work, IPR-related documents and other legal documents.
Focus on service agreements, licensing agreements, vendor contracts, advertising, endorsement, marketing agreements, sponsorship agreements, NDAs, etc.
Support new business initiatives and project work with project teams to ensure legal evaluation and timely compliance with all conditions precedent and other contractual obligations.
Research applicable regulatory laws and prepare in-house preliminary opinions.
Assist in reviewing print, social media and other media advertisements and marketing communications to ensure legal compliance.
Provide guidance and assistance on drafting and reviewing policies and terms and conditions relating to offers, business and our services.
Desired Skills
Transactional drafting, negotiation and advisory experience on different commercial transactions, gained at a leading law firm and/or in-house at a multinational corporation.
Well skilled in contract analysis, with working knowledge of the fundamental legal provisions of commercial contracts.
Excellent attention to detail, with the ability to analyze and assess business processes, spot issues and propose/implement solutions.
Ability to function autonomously yet communicate laterally and upwardly with ease.
Strong legal and business judgment.
Excellent written and oral communication and interpersonal skills to effectively communicate and coordinate complex issues and projects with diverse levels of management and employees.
Ready and willing to take up new projects, work independently with minimal supervision and take responsibility.
Able to prioritize and manage workload effectively, recognizing quick turn-around requirements.
We are an equal opportunity employer and all qualified applicants will receive consideration for employment without regard to race, color, religion, sex, disability status, or any other characteristic protected by the law.

Posted 4 days ago

Apply

6.0 - 10.0 years

0 Lacs

Mumbai, Maharashtra, India

On-site

Summary
Position Summary
AI & Data
In this age of disruption, organizations need to navigate the future with confidence, embracing decision-making with clear, data-driven choices that deliver enterprise value in a dynamic business environment. The AI & Data team leverages the power of data, analytics, robotics, science and cognitive technologies to uncover hidden relationships from vast troves of data, generate insights, and inform decision-making. The offering portfolio helps clients transform their business by architecting organizational intelligence programs and differentiated strategies to win in their chosen markets.
AI & Data will work with our clients to:
Implement large-scale data ecosystems, including data management, governance and the integration of structured and unstructured data, to generate insights leveraging cloud-based platforms.
Leverage automation, cognitive and science-based techniques to manage data, predict scenarios and prescribe actions.
Drive operational efficiency by maintaining their data ecosystems, sourcing analytics expertise and providing As-a-Service offerings for continuous insights and improvements.
Job Title: Senior Generative AI Developer / Team Lead
Job Summary: We are looking for a Generative AI Team Lead with hands-on experience designing, developing and deploying AI and Generative AI models that generate high-quality content such as text and images and power conversational experiences such as chatbots. The ideal candidate will have expertise in deep learning, natural language processing, and computer vision.
Key Responsibilities:
Lead and deliver large-scale AI/Gen AI projects across multiple industries and domains.
Liaise with on-site and client teams to understand business problem statements and project requirements.
Lead a team of Data Engineers, ML/AI Engineers, Prompt Engineers, and other Data & AI professionals to deliver projects from inception to implementation.
Brainstorm, build and improve AI/Gen AI models developed by the team, and identify scope for model improvements and best practices.
Assist and participate in pre-sales, client pursuits and proposals.
Drive a human-led culture of Inclusion & Diversity by caring deeply for all team members.
Qualifications:
6-10 years of relevant experience in Generative AI, Deep Learning, or NLP.
Bachelor's or master's degree in a quantitative field.
Has led a 3-5-member team on multiple end-to-end AI/GenAI projects.
Excellent communication and client/stakeholder management skills.
Strong hands-on experience with programming languages such as Python, CUDA and SQL, and frameworks such as TensorFlow, PyTorch and Keras.
Hands-on experience with leading LLMs such as OpenAI GPT-3.5/4, Google Gemini, AWS Bedrock, LLaMA 3.0, and Mistral, along with RAG and agentic workflows.
Well-versed in GAN and Transformer architectures, familiar with diffusion models, and up to date with new research and progress in the field of Gen AI.
Able to follow research papers and to comprehend, innovate on and present the best approaches and solutions for Generative AI components.
Knowledge of hyperscaler offerings (NVIDIA, AWS, Azure, GCP, Oracle) and Gen AI tools (Copilot, Vertex AI).
Knowledge of vector databases and Neo4j or other relevant graph databases.
Familiar with Docker containerization, Git, etc.
AI/Cloud certification from a premier institute is preferred.
Our purpose
Deloitte's purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities.
Our people and culture
Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work.
Professional development
At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India.
Benefits To Help You Thrive
At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially, and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you.
Recruiting tips
From developing a standout resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters.
Requisition code: 303629

Posted 4 days ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site

🚀 We're Hiring! MLOps Engineer - Pune, India | Asmadiya Technologies Pvt. Ltd.
Are you passionate about deploying machine learning models at scale? Want to be part of a fast-growing tech company building intelligent digital solutions? Asmadiya Technologies is looking for an experienced MLOps Engineer (3-6 years) to join our AI/ML team in Pune.
What You'll Do
Build and manage scalable ML pipelines (training → deployment → monitoring)
Automate model deployment using CI/CD tools (Jenkins, GitHub Actions, etc.)
Work with Docker, Kubernetes, and AWS (EKS, S3, SageMaker)
Ensure ML model governance, performance tracking, and system reliability
Collaborate closely with data scientists, DevOps, and product teams
What You Bring
3-6 years of hands-on experience in software engineering, DevOps, or MLOps
Strong Python + ML stack (scikit-learn, TensorFlow/PyTorch)
Deep knowledge of cloud platforms (preferably AWS)
Experience with ML monitoring, model registries, and orchestration tools
Bonus: knowledge of LLMs, MLflow, Kubeflow, or Airflow
Why Join Us?
Work on cutting-edge AI/ML projects that make real-world impact
A collaborative, innovative, and fast-paced work culture
Growth opportunities and a flexible work environment
Ready to take your MLOps career to the next level? Apply now at careers@asmadiya.com or DM us to learn more!
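To make the training → deployment → monitoring loop above concrete, here is a minimal sketch of experiment tracking and model logging with MLflow and scikit-learn. The dataset is synthetic so the sketch is self-contained, and the tracking server / model registry configuration is assumed to exist outside this snippet.

import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic data for illustration; a real pipeline would load versioned
# training data from storage such as S3 or a feature store.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):
    params = {"n_estimators": 200, "max_depth": 8}
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)

    accuracy = accuracy_score(y_test, model.predict(X_test))

    # Log parameters, metrics, and the model artifact so CI/CD jobs and
    # monitoring dashboards can pick them up later.
    mlflow.log_params(params)
    mlflow.log_metric("accuracy", accuracy)
    mlflow.sklearn.log_model(model, artifact_path="model")

A CI/CD pipeline (Jenkins or GitHub Actions, as listed above) would typically run this as a job, compare the logged metric against the currently deployed model, and promote the artifact through the registry if it improves.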

Posted 4 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies