Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
5.0 - 8.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Job Description Responsibilities : Identify relevant data sources and combine them to make the data useful. Automate the data collection processes. Pre-process structured and unstructured data, leveraging NLP techniques for text data. Handle large amounts of information to create the input for analytical models, incorporating Gen AI for advanced data processing and generation. Build predictive models, machine learning, and deep learning algorithms; innovate with Gen AI applications in model development. Build network graphs, apply NLP techniques for text analysis, and design forecasting models while building data pipelines for end-to-end solutions. Propose solutions and strategies to address business challenges, integrating Gen AI and NLP in practical applications. Collaborate with product development teams and communicate with Senior Leadership teams. Participate in problem-solving sessions, leveraging NLP and Gen AI for innovative solutions. Requirements Bachelor's degree in a highly quantitative field (e.g., Computer Science, Engineering, Physics, Math, Operations Research, etc.) or equivalent experience. Extensive machine learning and algorithmic background with deep expertise in Gen AI and Natural Language Processing (NLP) techniques, along with a strong understanding of supervised and unsupervised learning methods, reinforcement learning, deep learning, Bayesian inference, and network graph analysis. Advanced knowledge of NLP methods, including text generation, sentiment analysis, named entity recognition, and language modelling. Strong math skills, including proficiency in statistics, linear algebra, and probability, with the ability to apply these concepts in Gen AI and NLP solutions. Proven problem-solving aptitude with the ability to apply NLP and Gen AI tools to real-world business challenges. Excellent communication skills with the ability to translate complex technical information, especially related to Gen AI and NLP, into clear insights for non-technical stakeholders. Fluency in at least one data science/analytics programming language (e.g., Python, R, Julia), with expertise in NLP and Gen AI libraries like TensorFlow, PyTorch, Hugging Face, or OpenAI tools. Start-up experience is a plus, with ideally 5-8 years of advanced analytics experience in startups or marquee companies, particularly in roles leveraging Gen AI and NLP for product or business innovations. (ref:hirist.tech)
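As a brief, hedged illustration of two of the NLP tasks named in the requirements (sentiment analysis and named entity recognition), the snippet below assumes the Hugging Face transformers library; the default pipeline models and the example sentence are placeholders, not part of the role description.

```python
# Minimal sketch: sentiment analysis and named-entity recognition with Hugging Face
# pipelines. The default models and the example sentence are illustrative only.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")
ner = pipeline("ner", aggregation_strategy="simple")

text = "The onboarding was smooth, but invoicing from Acme Corp was badly delayed."

print(sentiment(text))  # e.g. [{'label': 'NEGATIVE', 'score': ...}]
print(ner(text))        # e.g. [{'entity_group': 'ORG', 'word': 'Acme Corp', ...}]
```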
Posted 20 hours ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
AI/ML Engineer – Core Algorithm and Model Expert 1. Role Objective: The engineer will be responsible for designing, developing, and optimizing advanced AI/ML models for computer vision, generative AI, audio processing, predictive analytics, and NLP applications. Must possess deep expertise in algorithm development and model deployment as production-ready products for naval applications. Also responsible for ensuring models are modular, reusable, and deployable in resource-constrained environments. 2. Key Responsibilities: 2.1. Design and train models using Naval-specific data and deliver them in the form of end products. 2.2. Fine-tune open-source LLMs (e.g., LLaMA, Qwen, Mistral, Whisper, Wav2Vec, Conformer models) for Navy-specific tasks. 2.3. Preprocess, label, and augment datasets. 2.4. Implement quantization, pruning, and compression for deployment-ready AI applications. 2.5. The engineer will be responsible for the development, training, fine-tuning, and optimization of Large Language Models (LLMs) and translation models for mission-critical AI applications of the Indian Navy. The candidate must possess a strong foundation in transformer-based architectures (e.g., BERT, GPT, LLaMA, mT5, NLLB) and hands-on experience with pretraining and fine-tuning methodologies such as Supervised Fine-Tuning (SFT), Instruction Tuning, Reinforcement Learning from Human Feedback (RLHF), and Parameter-Efficient Fine-Tuning (LoRA, QLoRA, Adapters). 2.6. Proficiency in building multilingual and domain-specific translation systems using techniques like backtranslation, domain adaptation, and knowledge distillation is essential. 2.7. The engineer should demonstrate practical expertise with libraries such as Hugging Face Transformers, PEFT, Fairseq, and OpenNMT. Knowledge of model compression, quantization, and deployment on GPU-enabled servers is highly desirable. Familiarity with MLOps, version control using Git, and cross-team integration practices is expected to ensure seamless interoperability with other AI modules. 2.8. Collaborate with the Backend Engineer for integration via standard formats (ONNX, TorchScript). 2.9. Generate reusable inference modules that can be plugged into microservices or edge devices. 2.10. Maintain reproducible pipelines (e.g., with MLflow, DVC, Weights & Biases). 3. Educational Qualifications Essential Requirements: 3.1. B.Tech/M.Tech in Computer Science, AI/ML, Data Science, Statistics, or a related field with an exceptional academic record. 3.2. Minimum 75% marks or 8.0 CGPA in relevant engineering disciplines. Desired Specialized Certifications: 3.3. Professional ML certifications from Google, AWS, Microsoft, or NVIDIA. 3.4. Deep Learning Specialization. 3.5. Computer Vision or NLP specialization certificates. 3.6. TensorFlow/PyTorch Professional Certification. 4. Core Skills & Tools: 4.1. Languages: Python (must), C++/Rust. 4.2. Frameworks: PyTorch, TensorFlow, Hugging Face Transformers. 4.3. ML Concepts: Transfer learning, RAG, XAI (SHAP/LIME), reinforcement learning, LLM fine-tuning, SFT, RLHF, LoRA, QLoRA, and PEFT. 4.4. Optimized Inference: ONNX Runtime, TensorRT, TorchScript. 4.5. Data Tooling: Pandas, NumPy, Scikit-learn, OpenCV. 4.6. Security Awareness: Data sanitization, adversarial robustness, model watermarking. 5. Core AI/ML Competencies: 5.1. Deep Learning Architectures: CNNs, RNNs, LSTMs, GRUs, Transformers, GANs, VAEs, Diffusion Models. 5.2.
Computer Vision: Object detection (YOLO, R-CNN), semantic segmentation, image classification, optical character recognition, facial recognition, anomaly detection. 5.3. Natural Language Processing: BERT, GPT models, sentiment analysis, named entity recognition, machine translation, text summarization, chatbot development. 5.4. Generative AI: Large Language Models (LLMs), prompt engineering, fine-tuning, Quantization, RAG systems, multimodal AI, stable diffusion models. 5.5. Advanced Algorithms: Reinforcement learning, federated learning, transfer learning, few-shot learning, meta-learning 6. Programming & Frameworks: 6.1. Languages: Python (expert level), R, Julia, C++ for performance optimization. 6.2. ML Frameworks: TensorFlow, PyTorch, JAX, Hugging Face Transformers, OpenCV, NLTK, spaCy. 6.3. Scientific Computing: NumPy, SciPy, Pandas, Matplotlib, Seaborn, Plotly 6.4. Distributed Training: Horovod, DeepSpeed, FairScale, PyTorch Lightning 7. Model Development & Optimization: 7.1. Hyperparameter tuning using Optuna, Ray Tune, or Weights & Biases etc. 7.2. Model compression techniques (quantization, pruning, distillation). 7.3. ONNX model conversion and optimization. 8. Generative AI & NLP Applications: 8.1. Intelligence report analysis and summarization. 8.2. Multilingual radio communication translation. 8.3. Voice command systems for naval equipment. 8.4. Automated documentation and report generation. 8.5. Synthetic data generation for training simulations. 8.6. Scenario generation for naval training exercises. 8.7. Maritime intelligence synthesis and briefing generation. 9. Experience Requirements 9.1. Hands-on experience with at least 2 major AI domains. 9.2. Experience deploying models in production environments. 9.3. Contribution to open-source AI projects. 9.4. Led development of multiple end-to-end AI products. 9.5. Experience scaling AI solutions for large user bases. 9.6. Track record of optimizing models for real-time applications. 9.7. Experience mentoring technical teams 10. Product Development Skills 10.1. End-to-end ML pipeline development (data ingestion to model serving). 10.2. User feedback integration for model improvement. 10.3. Cross-platform model deployment (cloud, edge, mobile) 10.4. API design for ML model integration 11. Cross-Compatibility Requirements: 11.1. Define model interfaces (input/output schema) for frontend/backend use. 11.2. Build CLI and REST-compatible inference tools. 11.3. Maintain shared code libraries (Git) that backend/frontend teams can directly call. 11.4. Joint debugging and model-in-the-loop testing with UI and backend teams
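The LoRA/PEFT fine-tuning workflow referenced in items 2.2, 2.5, and 4.3 can be sketched roughly as follows, assuming the Hugging Face transformers and peft libraries; the base checkpoint, adapter rank, and target modules are placeholder choices, not prescribed values.

```python
# Minimal sketch: attaching LoRA adapters to an open-source causal LM with Hugging
# Face PEFT. Checkpoint name and hyperparameters are placeholders, not requirements.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

base_model = "meta-llama/Llama-2-7b-hf"  # placeholder checkpoint
tokenizer = AutoTokenizer.from_pretrained(base_model)
model = AutoModelForCausalLM.from_pretrained(base_model)

lora_cfg = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                   # low-rank adapter dimension
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections to adapt
)
model = get_peft_model(model, lora_cfg)
model.print_trainable_parameters()         # only adapter weights are trainable

# Training would then run with transformers.Trainer (or a custom loop) on the
# domain-specific dataset, followed by quantization and ONNX/TorchScript export.
```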
Posted 1 day ago
0.0 years
0 Lacs
Chennai, Tamil Nadu
On-site
Job Information Date Opened 07/22/2025 Job Type Permanent RSD NO 11146 Industry IT Services Min Experience 6 Max Experience 8 City Chennai State/Province Tamil Nadu Country India Zip/Postal Code 600018 Job Description Job Title - BlackRock Aladdin Specialist Location - Chennai, Bangalore, Hyderabad Job Type - Full-time - Work from office Position Overview - As a BlackRock Aladdin Specialist at Indium, you will play a crucial role in supporting & enhancing the Aladdin platform for our clients in the financial sector. You will collaborate with cross-functional teams to implement and customize Aladdin solutions, ensuring seamless integration and optimal performance. Responsibilities – Understands the Life of a Trade. Knowledge of the BlackRock Aladdin solution. Exposure to Portfolio Management, Trade Execution, Data Control and Operations, and Portfolio Administration on the Aladdin platform. Experience of working in an Asset Manager or similar organization with strong knowledge of order and execution management systems and electronic trading platforms is mandatory. Technical Experience: 1: Experience with the BlackRock Aladdin platform 2: Proven ability to use complex analytical, interpretive and problem-solving skills and techniques, to synthesize and present complex information to stakeholders of various levels 3: Demonstrated ability to manipulate, integrate and visualize complex data sets; understanding of global Asset Management and large asset owners 4: Good communication skills 5: Support experience. Aladdin Platform Customization - Configure & customize the Aladdin platform to meet the specific needs and requirements of clients. Knowledge of the Julia platform, as well as Python, R, and MATLAB, is desirable. Implementation and Integration - Lead the implementation of Aladdin solutions for clients, ensuring smooth integration with existing systems. Stakeholder Management - Work closely with clients, internal teams, and other stakeholders to understand business requirements and deliver tailored Aladdin solutions. Training & Documentation - Maintain the knowledge base and documents detailing Standard Operating Procedures (SOPs), support issues, resolutions, and best practices for future reference. Qualifications - Bachelor's degree in Computer Science, Information Technology, or a related field. Proven experience working with BlackRock Aladdin, with a strong understanding of its features and capabilities. Technical proficiency in scripting languages such as Python / R / MATLAB and familiarity with APIs. Excellent problem-solving skills and a proactive approach to troubleshooting. Strong communication skills, with the ability to collaborate effectively with both technical and non-technical stakeholders. Relevant certifications in Aladdin or related technologies would be a plus. Experience with Amazon Web Services (AWS) – including data management, compute services, and cloud-based deployment. At Indium, diversity, equity, and inclusion (DEI) are the cornerstones of our values. We champion DEI through a dedicated council, expert sessions, and tailored training programs, ensuring an inclusive workplace for all. Our initiatives, including the WE@IN women empowerment program and our DEI calendar, foster a culture of respect and belonging. Recognized with the Human Capital Award, we are committed to creating an environment where every individual thrives. Join us in building a workplace that values diversity and drives innovation.
Posted 1 day ago
3.0 - 7.0 years
0 Lacs
Karnataka
On-site
At QpiAI, we are at the forefront of the discovery of optimal AI and Quantum systems in various industries such as Life sciences, Healthcare, Transportation, Finance, Industrial, and Space technologies. Our focus lies in building full-stack Enterprise Quantum Computers. As a part of the QpiAI Quantum hardware team, your role will involve the design and characterization of Quantum Processors, Cryogenic Quantum Control Circuits, RF Control Hardware, and QpiAI ASGP. We are currently seeking a talented engineer with a strong mathematical background and sharp algorithmic skills to join us in building a high-performance optimization infrastructure. This infrastructure will cater to real-world decision-making problems in domains like logistics, manufacturing, and emerging technologies. We need someone who excels at translating abstract optimization problems into efficient, production-ready solvers. Key Responsibilities - Design and implement fast, scalable solvers for complex optimization problems in both discrete and continuous domains. - Develop constraint modeling frameworks and metaheuristic algorithms based on solid mathematical principles. - Evaluate solution quality, convergence behavior, and performance benchmarks across various instances and datasets. - Collaborate with system engineers to seamlessly integrate your solver modules into larger optimization stacks and real-time decision systems. - Explore and incorporate techniques from mathematical programming, stochastic methods, and quantum-inspired approaches. What We're Looking For - Strong mathematical foundation including expertise in linear algebra, combinatorics, graph theory, numerical methods, convex and discrete optimization. - Proficiency in algorithmic and systems thinking, with the ability to write efficient, optimized code and address performance bottlenecks. - Exceptional programming skills in Python and C++ (or Rust/Julia); familiarity with low-level optimizations and profiler-driven development is advantageous. - Experience in algorithm design for constraint systems, heuristics, and metaheuristics. - Hands-on coding experience with a notable presence on platforms like LeetCode, Codeforces, or HackerRank. - Product-oriented mindset with the capability to design modular, reusable, and scalable solvers suitable for production systems. - Exposure to quantum computing or hybrid quantum-classical paradigms is a bonus. Good to Have - Familiarity with model encoding techniques and constraint representations. - Experience in benchmarking on large-scale combinatorial datasets. - Participation in mathematical modeling competitions such as INMO, COMAP, or Kaggle competitions involving optimization. Why Join Us By joining our team, you will collaborate closely with professionals who view optimization not just as a mathematical puzzle but as an engineering challenge. If you are passionate about developing fast solvers, exploring boundaries in hybrid or hardware-accelerated optimization, and tackling problems at scale, we would be thrilled to have you onboard.
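As a hedged sketch of the metaheuristic style of solver described above, the snippet below runs simulated annealing with a 2-swap neighborhood on a toy tour-cost instance; the random cost matrix, cooling schedule, and iteration budget are illustrative assumptions, not a production configuration.

```python
# Minimal sketch: simulated annealing on a toy permutation (tour-cost) problem.
# Cost matrix, cooling schedule, and iteration budget are illustrative only.
import math
import random

random.seed(0)
n = 20
dist = [[abs(i - j) + random.random() for j in range(n)] for i in range(n)]

def tour_cost(perm):
    return sum(dist[perm[i]][perm[(i + 1) % n]] for i in range(n))

perm = list(range(n))
best_cost = tour_cost(perm)
temp = 10.0
for _ in range(20000):
    i, j = random.sample(range(n), 2)
    cand = perm[:]
    cand[i], cand[j] = cand[j], cand[i]               # 2-swap neighborhood move
    delta = tour_cost(cand) - tour_cost(perm)
    if delta < 0 or random.random() < math.exp(-delta / temp):
        perm = cand                                   # accept improving or uphill move
        best_cost = min(best_cost, tour_cost(perm))
    temp *= 0.9995                                    # geometric cooling

print("best tour cost found:", round(best_cost, 2))
```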
Posted 1 day ago
5.0 years
0 Lacs
Gurugram, Haryana, India
On-site
Achieving our goals starts with supporting yours. Grow your career, access top-tier health and wellness benefits, build lasting connections with your team and our customers, and travel the world using our extensive route network. Come join us to create what’s next. Let’s define tomorrow, together. Description At United, we have some of the best aircraft in the world. Our Technical Operations team is full of aircraft maintenance technicians, engineers, planners, ground equipment and facilities professionals, and supply chain teams that help make sure they’re well taken care of and ready to get our customers to their desired destinations. If you’re ready to work on our planes, join our Tech Ops experts and help keep our fleet in tip-top shape. Job Overview And Responsibilities The Manager - Inventory Management manages provisioning and inventory planning of expendable and rotable parts to support the maintenance operation of United Airlines fleets. This includes planning of inventory investment and optimal inventory levels to adequately support maintenance needs. Continuously monitors the performance of the inventory plans. Accountable for identifying automation and integration opportunities to improve efficiency and support organizational goals. Develop strategic and tactical inventory plans for rotable and expendable parts to support the expected changes in schedule and maintenance operations. Manage the team in making daily strategic adjustments to station inventory levels for rotable and expendable parts to support the maintenance operations. This includes, but is not limited to, analyzing and establishing system service levels that support operations at the least total cost, adjusting allocations/ROPs to support service levels with expected flying and maintenance changes, being accountable for the life cycle of inventory – inception through final disposition – and being accountable for performance against plan. Identify supply chain planning automation opportunities in collaboration with other teams in the organization. Develop project plans and ensure timely project completions. Mentor the team to develop in their roles and skillsets. This position is offered on local terms and conditions. Expatriate assignments and sponsorship for employment visas, even on a time-limited visa status, will not be awarded. This position is for United Airlines Business Services Pvt. Ltd - a wholly owned subsidiary of United Airlines Inc.
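As a hedged illustration of the ROP/service-level adjustments mentioned above, the snippet below applies the textbook safety-stock formula under a normal-demand assumption; the demand figures, lead time, and service level are placeholders, not United's actual planning parameters.

```python
# Minimal sketch: reorder point (ROP) from a target service level, assuming
# normally distributed daily demand. All numbers are illustrative placeholders.
from math import sqrt
from statistics import NormalDist

mean_daily_demand = 4.0    # units per day (placeholder)
std_daily_demand = 1.5     # units per day (placeholder)
lead_time_days = 9         # replenishment lead time (placeholder)
service_level = 0.95       # target probability of no stock-out during lead time

z = NormalDist().inv_cdf(service_level)
safety_stock = z * std_daily_demand * sqrt(lead_time_days)
reorder_point = mean_daily_demand * lead_time_days + safety_stock

print(f"safety stock ~ {safety_stock:.1f} units, ROP ~ {reorder_point:.1f} units")
```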
Qualifications What’s needed to succeed (Minimum Qualifications): BS degree in a quantitative field such as engineering, supply chain management or related field Understanding of supply chain operations, planning and management Exhibits an analytical approach to problem solving with attention to detail and a sense of urgency 5+ years of relevant experience in an analytics or process improvement-oriented role in supply chain, aviation, manufacturing or related industry Understanding of supply chain functions including sourcing/procurement, vendor management, inventory planning, warehousing and logistics Familiarity with Line and Base maintenance planning and operations Knowledge of relational database models and SQL Familiarity with Supply Chain models Strong written and verbal communication skills Experience with automation and digital technology deployments Ability to identify automation and supply chain optimization opportunities in existing processes to improve efficiency Must be legally authorized to work in India for any employer without sponsorship Must be fluent in English (written and spoken) Successful completion of interview required to meet job qualification Reliable, punctual attendance is an essential function of the position What will help you propel from the pack (Preferred Qualifications): MS degree in a quantitative field such as math, statistics and/or MBA Lean Six Sigma Green/Black Belt certification Project management experience Experience with continuous improvement methodologies Experience using statistical and mathematical methods Experience in inventory management Experience in a related field involving automation of complex business processes Experience leading direct reports Familiarity with ERP capabilities At least two years of airline experience At least two years of supply chain experience Strong background in one or more supply chain functions Familiarity with Maintenance, Repair and Overhaul (MRO) functions Familiarity with the airline industry Background in agile project management methodologies Strong understanding of Supply Chain Planning tools Familiarity with Line and Base Maintenance Planning Familiarity with MRO planning Strong project management skills Financial and budget/expense management Ability to establish and monitor policies and guidelines Programming experience in a scripting language such as Python, R, and/or Julia GGN00002041
Posted 1 week ago
5.0 years
6 - 9 Lacs
Hyderābād
On-site
GenAI Engineer – CL4 Role Overview : As a GenAI Engineer, you will actively engage in your engineering craft, taking a hands-on approach to multiple high-visibility projects. Your expertise will be pivotal in delivering solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive GenAI & engineering craftsmanship across multiple programming languages and modern frameworks, consistently demonstrating your strong track record in delivering high-quality, outcome-focused solutions. The ideal candidate will be a dependable team player, collaborating with cross-functional teams to design, develop, and deploy advanced software solutions. Key Responsibilities : Outcome-Driven Accountability: Embrace and drive a culture of accountability for customer and business outcomes. Develop engineering solutions that solve complex problems with valuable outcomes, ensuring high-quality, lean designs and implementations. Technical Leadership and Advocacy: Serve as the technical advocate for products, ensuring code integrity, feasibility, and alignment with business and customer goals. Lead requirement analysis, component design, development, unit testing, integrations, and support. Engineering Craftsmanship: Maintain accountability for the integrity of code design, implementation, quality, data, and ongoing maintenance and operations. Stay hands-on, self-driven, and continuously learn new approaches, languages, and frameworks with significant focus on infusing AI/ML/GenAI where possible/appropriate. Create technical specifications and write high-quality, supportable, scalable code ensuring all quality KPIs are met or exceeded. Demonstrate collaborative skills to work effectively with diverse teams. Customer-Centric Engineering: Develop lean engineering solutions through rapid, inexpensive experimentation to solve customer needs. Engage with customers and product teams before, during, and after delivery to ensure the right solution is delivered at the right time. Incremental and Iterative Delivery: Adopt a mindset that favors action and evidence over extensive planning. Utilize a learning-forward approach to navigate complexity and uncertainty, delivering lean, supportable, and maintainable solutions. Cross-Functional Collaboration and Integration: Work collaboratively with empowered, cross-functional teams including product management, experience, and delivery. Integrate diverse perspectives to make well-informed decisions that balance feasibility, viability, usability, and value. Foster a collaborative environment that enhances team synergy and innovation. Advanced Technical Proficiency: Possess deep expertise in modern software engineering practices and principles, including AI/ML/GenAI, Agile methodologies, and DevSecOps, to deliver daily product deployments using full automation from code check-in to production with all quality checks through the SDLC lifecycle. Strive to be a role model, leveraging these techniques to optimize solutioning and product delivery. Demonstrate understanding of full-lifecycle product development, focusing on continuous improvement and learning. Domain Expertise: Quickly acquire domain-specific knowledge relevant to the business or product. Translate business/user needs, architectures, and UX/UI designs into technical specifications and code. Be a valuable, flexible, and dedicated team member, supportive of teammates, and focused on quality and tech debt payoff.
Effective Communication and Influence: Exhibit exceptional communication skills, capable of articulating complex technical concepts clearly and compellingly. Inspire and influence teammates and product teams through well-structured arguments and trade-offs supported by evidence. Create coherent narratives that align technical solutions with business objectives. Engagement and Collaborative Co-Creation: Engage and collaborate with product engineering teams at all organizational levels, including customers as needed. Build and maintain constructive relationships, fostering a culture of co-creation and shared momentum towards achieving product goals. Align diverse perspectives and drive consensus to create feasible solutions. The team : US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value/outcomes that leverages a progressive and responsive talent structure. As Deloitte’s primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte’s success. It is the engine that drives Deloitte, serving many of the world’s largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence. Key Qualifications : § A bachelor’s degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required. Experience is the most relevant factor. § Strong software engineering foundation with deep understanding of OOPs, data-structure, algorithms, code instrumentations, beautiful coding practices etc. § 5+ years of experience with AI/ML, with last 2 years focused on GenAI as well as technologies like OpenAI, Claude, Gemini, LangChain, Agents, Vector databases, and approaches like Prompt Engineering, fine-tuning, etc. § Proven experience in: Python, R, TensorFlow, PyTorch, Keras, Julia, ML libraries, NLP, etc. § Proven experience with big data technologies, Angular, React, NodeJS, Python, C#, .NET Core, Java, Golang, SQL/NoSQL. § Proven experience with cloud-native engineering, using FaaS/PaaS/micro-services on cloud hyper-scalers like Azure, AWS, and GCP. § Strong understanding of methodologies & tools like, XP, Lean, SAFe, DevSecOps, SRE, ADO, GitHub, SonarQube, etc. § Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care. How You will Grow: At Deloitte, our professional development plans focus on helping people at every level of their career to identify and use their strengths to do their best work every day and excel in everything they do. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. 
Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits to help you thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 303508
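As an illustrative sketch of the RAG and vector-database skills listed in the qualifications above, the snippet below shows only the retrieval step, assuming the sentence-transformers package; the embedding model and documents are placeholders, and a production system would typically use a managed vector database rather than an in-memory array.

```python
# Minimal sketch: the retrieval step of a RAG workflow using sentence-transformers
# embeddings and cosine similarity. Model name and documents are placeholders.
import numpy as np
from sentence_transformers import SentenceTransformer

docs = [
    "Expense reports must be filed within 30 days of travel.",
    "The VPN client is required for remote access to internal tools.",
    "Quarterly performance reviews open in the first week of each quarter.",
]
model = SentenceTransformer("all-MiniLM-L6-v2")
doc_vecs = model.encode(docs, normalize_embeddings=True)

query = "How long do I have to submit my travel expenses?"
q_vec = model.encode([query], normalize_embeddings=True)[0]

scores = doc_vecs @ q_vec                  # cosine similarity on unit vectors
context = docs[int(np.argmax(scores))]
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)                              # this prompt is then sent to the chosen LLM
```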
Posted 1 week ago
5.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary Position Summary GenAI Engineer – CL4 Role Overview : As a GenAI Engineer, you will actively engage in your engineering craft, taking a hands-on approach to multiple high-visibility projects. Your expertise will be pivotal in delivering solutions that delight customers and users, while also driving tangible value for Deloitte's business investments. You will leverage your extensive GenAI & engineering craftsmanship across multiple programming languages and modern frameworks, consistently demonstrating your strong track record in delivering high-quality, outcome-focused solutions. The ideal candidate will be a dependable team player, collaborating with cross-functional teams to design, develop, and deploy advanced software solutions. Key Responsibilities : Outcome-Driven Accountability: Embrace and drive a culture of accountability for customer and business outcomes. Develop engineering solutions that solve complex problems with valuable outcomes, ensuring high-quality, lean designs and implementations. Technical Leadership and Advocacy: Serve as the technical advocate for products, ensuring code integrity, feasibility, and alignment with business and customer goals. Lead requirement analysis, component design, development, unit testing, integrations, and support. Engineering Craftsmanship: Maintain accountability for the integrity of code design, implementation, quality, data, and ongoing maintenance and operations. Stay hands-on, self-driven, and continuously learn new approaches, languages, and frameworks with significant focus on infusing AI/ML/GenAI where possible/appropriate. Create technical specifications and write high-quality, supportable, scalable code ensuring all quality KPIs are met or exceeded. Demonstrate collaborative skills to work effectively with diverse teams. Customer-Centric Engineering: Develop lean engineering solutions through rapid, inexpensive experimentation to solve customer needs. Engage with customers and product teams before, during, and after delivery to ensure the right solution is delivered at the right time. Incremental and Iterative Delivery: Adopt a mindset that favors action and evidence over extensive planning. Utilize a learning-forward approach to navigate complexity and uncertainty, delivering lean, supportable, and maintainable solutions. Cross-Functional Collaboration and Integration: Work collaboratively with empowered, cross-functional teams including product management, experience, and delivery. Integrate diverse perspectives to make well-informed decisions that balance feasibility, viability, usability, and value. Foster a collaborative environment that enhances team synergy and innovation. Advanced Technical Proficiency: Possess deep expertise in modern software engineering practices and principles, including AI/ML/GenAI, Agile methodologies, and DevSecOps, to deliver daily product deployments using full automation from code check-in to production with all quality checks through the SDLC lifecycle. Strive to be a role model, leveraging these techniques to optimize solutioning and product delivery. Demonstrate understanding of full-lifecycle product development, focusing on continuous improvement and learning. Domain Expertise: Quickly acquire domain-specific knowledge relevant to the business or product. Translate business/user needs, architectures, and UX/UI designs into technical specifications and code. Be a valuable, flexible, and dedicated team member, supportive of teammates, and focused on quality and tech debt payoff.
Effective Communication and Influence: Exhibit exceptional communication skills, capable of articulating complex technical concepts clearly and compellingly. Inspire and influence teammates and product teams through well-structured arguments and trade-offs supported by evidence. Create coherent narratives that align technical solutions with business objectives. Engagement and Collaborative Co-Creation: Engage and collaborate with product engineering teams at all organizational levels, including customers as needed. Build and maintain constructive relationships, fostering a culture of co-creation and shared momentum towards achieving product goals. Align diverse perspectives and drive consensus to create feasible solutions. The team : US Deloitte Technology Product Engineering has modernized software and product delivery, creating a scalable, cost-effective model that focuses on value/outcomes that leverages a progressive and responsive talent structure. As Deloitte’s primary internal development team, Product Engineering delivers innovative digital solutions to businesses, service lines, and internal operations with proven bottom-line results and outcomes. It helps power Deloitte’s success. It is the engine that drives Deloitte, serving many of the world’s largest, most respected companies. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence. Key Qualifications : A bachelor’s degree in computer science, software engineering, or a related discipline. An advanced degree (e.g., MS) is preferred but not required. Experience is the most relevant factor. Strong software engineering foundation with deep understanding of OOPs, data-structure, algorithms, code instrumentations, beautiful coding practices etc. 5+ years of experience with AI/ML, with last 2 years focused on GenAI as well as technologies like OpenAI, Claude, Gemini, LangChain, Agents, Vector databases, and approaches like Prompt Engineering, fine-tuning, etc. Proven experience in: Python, R, TensorFlow, PyTorch, Keras, Julia, ML libraries, NLP, etc. Proven experience with big data technologies, Angular, React, NodeJS, Python, C#, .NET Core, Java, Golang, SQL/NoSQL. Proven experience with cloud-native engineering, using FaaS/PaaS/micro-services on cloud hyper-scalers like Azure, AWS, and GCP. Strong understanding of methodologies & tools like, XP, Lean, SAFe, DevSecOps, SRE, ADO, GitHub, SonarQube, etc. Excellent interpersonal and organizational skills, with the ability to handle diverse situations, complex projects, and changing priorities, behaving with passion, empathy, and care. How You will Grow: At Deloitte, our professional development plans focus on helping people at every level of their career to identify and use their strengths to do their best work every day and excel in everything they do. Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. 
Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 303508
Posted 1 week ago
0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job Overview We are seeking a Data Scientist with a strong foundation in machine learning and a passion for the travel industry. You will work with cross-functional teams to analyze customer behavior, forecast travel demand, optimize pricing models, and deploy AI-driven solutions to improve user experience and drive business growth. Key Responsibilities Engage in all stages of the project lifecycle, including data collection, labeling, and preprocessing, to ensure high-quality datasets for model training. Utilize advanced machine learning frameworks and pipelines for efficient model development, training execution, and deployment. Implement MLflow for tracking experiments, managing datasets, and facilitating model versioning to streamline collaboration. Oversee model deployment on cloud platforms, ensuring scalable and robust performance in real-world travel applications. Analyze large volumes of structured and unstructured travel data to identify trends, patterns, and actionable insights. Develop, test, and deploy predictive models and machine learning algorithms for fare prediction, demand forecasting, and customer segmentation. Create dashboards and reports to communicate insights effectively to stakeholders across the business. Collaborate with Engineering, Product, Marketing, and Finance teams to support strategic data initiatives. Build and maintain data pipelines for data ingestion, transformation, and modeling. Conduct statistical analysis, A/B testing, and hypothesis testing to guide product decisions. Automate processes and contribute to scalable, production-ready data science tools. Technical Skills Machine Learning Frameworks: PyTorch, TensorFlow, JAX, Keras, Keras-Core, Scikit-learn, Distributed Model Training Programming & Development: Python, PySpark, Julia, MATLAB, Git, GitLab, Docker, MLOps, CI/CD Pipelines Cloud & Deployment: AWS SageMaker, MLflow, Production Scaling Data Science & Analytics: Statistical Analysis, Predictive Modeling, Feature Engineering, Data Preprocessing, Pandas, NumPy, PySpark Computer Vision: CNN, RNN, OpenCV, Kornia, Object Detection, Image Processing, Video Analytics Visualization Tools: Looker, Tableau, Power BI, Matplotlib, Seaborn Databases & Querying: SQL, Snowflake, Databricks Big Data & MLOps: Spark, Hadoop, Kubernetes, Model Monitoring Nice To Have Experience with deep learning, LLMs, NLP (Transformers), or recommendation systems in travel use cases. Knowledge of GDS APIs (Amadeus, Sabre), flight search optimization, and pricing models. Strong system design (HLD/LLD) and architecture experience for production-scale ML workflows. Skills: data preprocessing, docker, feature engineering, sql, python, predictive modeling, statistical analysis, keras, data science, spark, data scientist, aws sagemaker, machine learning, mlflow, tensorflow, pytorch
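A minimal sketch of the MLflow experiment-tracking responsibility described above, assuming MLflow and scikit-learn are installed; the experiment name, model choice, and synthetic data are placeholders.

```python
# Minimal sketch: logging one training run to MLflow. Experiment name, model,
# and synthetic data are placeholders for illustration only.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

mlflow.set_experiment("fare-prediction-prototype")   # placeholder experiment name
with mlflow.start_run():
    model = LogisticRegression(max_iter=500, C=0.5)
    model.fit(X_tr, y_tr)
    acc = accuracy_score(y_te, model.predict(X_te))

    mlflow.log_param("C", 0.5)
    mlflow.log_metric("accuracy", acc)
    mlflow.sklearn.log_model(model, "model")         # versioned artifact for deployment
```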
Posted 1 week ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Prismforce Prismforce is a Vertical SaaS company revolutionizing the Talent Supply Chain for global Technology, R&D/Engineering, and IT Services companies. Our AI-powered product suite enhances business performance by enabling operational flexibility, accelerating decision-making, and boosting profitability. Our mission is to become the leading industry cloud/SaaS platform for tech services and talent organizations worldwide. Job Description Role: Data Scientist Reporting to: Lead AI/ML Location: Mumbai/Bangalore/Pune/Kolkata Job Brief We are looking for Data Scientists to build data products that will be the core of a SaaS company disrupting the skills market. You will help create and evolve the analytical culture of the organization, experiment with existing analytical techniques, improvise on existing algorithms, innovate new algorithms to disrupt the industry, and play a critical role in solving the problems at hand. Responsibilities Identify valuable data sources and automate collection processes. Undertake preprocessing of structured and unstructured data. Analyze large amounts of information to discover trends and patterns. Build predictive models and machine-learning algorithms. Combine models through ensemble modeling. Present information using data visualization techniques. Propose solutions and strategies to business challenges. Collaborate with engineering and product development teams. Requirements Bachelor's degree in a highly quantitative field (e.g., Computer Science, Engineering, Physics, Math, Operations Research, etc.) or equivalent experience. Extensive machine learning and algorithmic background with deep expertise in Gen AI and Natural Language Processing (NLP) techniques, along with a strong understanding of supervised and unsupervised learning methods, reinforcement learning, deep learning, Bayesian inference, and network graph analysis. Advanced knowledge of NLP methods, including text generation, sentiment analysis, named entity recognition, and language modelling. Strong math skills, including proficiency in statistics, linear algebra, and probability, with the ability to apply these concepts in Gen AI and NLP solutions. Proven problem-solving aptitude with the ability to apply NLP and Gen AI tools to real-world business challenges. Excellent communication skills with the ability to translate complex technical information, especially related to Gen AI and NLP, into clear insights for non-technical stakeholders. Fluency in at least one data science/analytics programming language (e.g., Python, R, Julia), with expertise in NLP and Gen AI libraries like TensorFlow, PyTorch, Hugging Face, or OpenAI tools. Start-up experience is a plus, with ideally 5-8 years of advanced analytics experience in startups or marquee companies, particularly in roles leveraging Gen AI and NLP for product or business innovations. Required Skills Machine Learning, Deep Learning, Algorithms, Computer Science, Engineering, Operations Research, Math Skills, Communication Skills, SaaS Product, IT Services, Artificial Intelligence, ERP, Product Management, Automation, Analytical Models, Predictive Models, NLP, Forecasting Models, Product Development, Leadership, Problem Solving, Unsupervised Learning, Reinforcement Learning, Natural Language Processing, Algebra, Data Science, Programming Language, Python, Julia.
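As a hedged illustration of the "combine models through ensemble modeling" responsibility above, the snippet below builds a soft-voting ensemble with scikit-learn on synthetic data; the estimators and dataset are placeholders, not production models.

```python
# Minimal sketch: combining models through ensemble modeling (soft voting).
# Estimators and synthetic data are placeholders for illustration only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, VotingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.naive_bayes import GaussianNB

X, y = make_classification(n_samples=1500, n_features=25, random_state=42)

ensemble = VotingClassifier(
    estimators=[
        ("lr", LogisticRegression(max_iter=500)),
        ("rf", RandomForestClassifier(n_estimators=200, random_state=42)),
        ("nb", GaussianNB()),
    ],
    voting="soft",   # average predicted probabilities across the base models
)
print("CV accuracy:", round(cross_val_score(ensemble, X, y, cv=5).mean(), 3))
```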
What Makes Us Unique First-Mover Advantage: We are the only Vertical SaaS product company addressing Talent Supply Chain challenges in the IT services industry. Innovative Product Suite: Our solutions offer forward-thinking features that outshine traditional ERP systems. Strategic Expertise: Guided by an advisory board of ex-CXOs from top global IT firms, providing unmatched industry insights. Experienced Leadership: Our founding team brings deep expertise from leading firms like McKinsey, Deloitte, Amazon, Infosys, TCS, and Uber. Diverse and Growing Team: We have grown to 160+ employees across India, with hubs in Mumbai, Pune, Bangalore, and Kolkata. Strong Financial Backing: Series A-funded by Sequoia, with global IT companies using our product as a core solution. Why Join Prismforce Competitive Compensation: We offer an attractive salary and benefits package that rewards your contributions. Innovative Projects: Work on pioneering projects with cutting-edge technologies transforming the Talent Supply Chain. Collaborative Environment: Thrive in a dynamic, inclusive culture that values teamwork and innovation. Growth Opportunities: Continuous learning and development are core to our philosophy, helping you advance your career. Flexible Work: Enjoy flexible work arrangements that balance your work-life needs. By joining Prismforce, you'll become part of a rapidly expanding, innovative company that's reshaping the future of tech services and talent management. Perks & Benefits Work with the best in the industry: Work with a high-pedigree leadership team that will challenge you, build on your strengths and invest in your personal development Insurance Coverage - Group Mediclaim cover for self, spouse, kids, and parents & Group Term Life Insurance Policy for self. Flexible Policies Retiral Benefits Hybrid Work Model Self-driven career progression tool
Posted 2 weeks ago
0 years
0 Lacs
Burdwan, West Bengal, India
On-site
University: Delft University of Technology (TU Delft) Country: Netherlands Deadline: 2025-07-31 Fields: Civil Engineering, Computational Science, Mechanical Engineering, Applied Mathematics, Environmental Engineering The Department of Civil Engineering and Geosciences at Delft University of Technology (TU Delft) invites applications for a PhD position focused on the development of scalable solvers for multiscale, multiphysics, and multifidelity simulations in offshore floating solar farms. The successful candidate will join the CMOE group and contribute to the DigiOcean4Solar project, advancing scientific computing in the field of renewable energy. Key Responsibilities – Develop and analyze multiscale models for wind-wave-structure interaction in offshore floating solar farms. – Design and implement scalable solvers for large-scale high-performance computing applications. – Assess farm-scale effects in floating solar installations. Requirements – Strong background in scientific computing, numerical partial differential equations, and finite element methods. – Proficiency in programming languages such as Julia, Python, or C++, and experience with HPC libraries. – Experience or interest in multiphysics or multiscale simulation methods. – Demonstrated enthusiasm for renewable energy and/or offshore engineering applications. Benefits – Opportunity to advance scientific computing for emerging renewable energy technologies. – Collaborative research environment with international research institutions. – Membership in the CMOE research group at TU Delft. Application Procedure Interested candidates are invited to apply online at https://lnkd.in/eVhv6aHG. The position is based in Delft, Netherlands. Application deadline: 2025-07-31
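As an illustrative sketch of the solver workflow described above (not the project's actual FEM/HPC stack), the snippet below assembles a 1D Poisson problem with finite differences and solves it with a Krylov method from SciPy; the grid size and right-hand side are arbitrary.

```python
# Minimal sketch: assemble -u'' = f on (0,1) with u(0)=u(1)=0 using finite
# differences and solve with conjugate gradients. Purely illustrative; the PhD
# project targets finite element discretizations and HPC-scale solvers.
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import cg

n = 200                                   # interior grid points (arbitrary)
h = 1.0 / (n + 1)
x = np.linspace(h, 1 - h, n)
f = np.sin(np.pi * x)                     # right-hand side

# Tridiagonal matrix from the second-order central difference stencil
A = diags([-1.0, 2.0, -1.0], offsets=[-1, 0, 1], shape=(n, n), format="csr") / h**2

u, info = cg(A, f)                        # conjugate gradients (A is SPD)
exact = np.sin(np.pi * x) / np.pi**2      # analytic solution of -u'' = sin(pi x)
print("CG converged:", info == 0, "| max error:", np.max(np.abs(u - exact)))
```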
Posted 2 weeks ago
10.0 - 15.0 years
30 - 35 Lacs
Hyderabad
Work from Office
Define, Design, and Build an optimal data pipeline architecture to collect data from a variety of sources, cleanse, and organize data in SQL & NoSQL destinations (ELT & ETL Processes). Define and Build business use case-specific data models that can be consumed by Data Scientists and Data Analysts to conduct discovery and drive business insights and patterns. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies. Build and deploy analytical models and tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs. Define, Design, and Build Executive dashboards and reports catalogs to serve decision-making and insight generation needs. Provide inputs to help keep data separated and secure across data centers - on-prem and private and public cloud environments. Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader. Work with data and analytics experts to strive for greater functionality in our data systems. Implement scheduled data load processes and maintain and manage the data pipelines. Troubleshoot, investigate, and fix failed data pipelines and prepare RCA. Experience with a mix of the following Data Engineering Technologies: Python, Spark, Snowflake, Databricks, Hadoop (CDH), Hive, Sqoop, Oozie; SQL - Postgres, MySQL, MS SQL Server; Azure - ADF, Synapse Analytics, SQL Server, ADLS G2; AWS - Redshift, EMR cluster, S3. Experience with a mix of the following Data Analytics and Visualization toolsets: SQL, PowerBI, Tableau, Looker, Python, R; Python libraries -- Pandas, Scikit-learn, Seaborn, Matplotlib, TF, Stat-Models, PySpark, Spark-SQL; R, SAS, Julia, SPSS; Azure - Synapse Analytics, Azure ML studio, Azure Auto ML
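A minimal sketch of one extract-transform-load step of the kind described above, assuming PySpark; the S3 paths and column names are placeholders, and scheduling/monitoring would sit outside this snippet in the orchestration layer.

```python
# Minimal sketch: one ETL step with PySpark. Paths and column names are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-orders-load").getOrCreate()

raw = (spark.read
       .option("header", True)
       .option("inferSchema", True)
       .csv("s3://example-bucket/raw/orders/2024-01-01/"))      # placeholder source

cleaned = (raw
           .dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .withColumn("amount", F.col("amount").cast("double"))
           .filter(F.col("amount") > 0))

(cleaned.write
 .mode("overwrite")
 .partitionBy("order_date")
 .parquet("s3://example-bucket/curated/orders/"))               # placeholder destination
```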
Posted 3 weeks ago
1.0 - 2.0 years
3 - 6 Lacs
Hyderabad
Work from Office
Define, Design, and Build an optimal data pipeline architecture to collect data from a variety of sources, cleanse, and organize data in SQL & NoSQL destinations (ELT & ETL Processes). Define and Build business use case-specific data models that can be consumed by Data Scientists and Data Analysts to conduct discovery and drive business insights and patterns. Identify, design, and implement internal process improvements: automating manual processes, optimizing data delivery, re-designing infrastructure for greater scalability, etc. Build the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using SQL and AWS big data technologies. Build and deploy analytical models and tools that utilize the data pipeline to provide actionable insights into customer acquisition, operational efficiency, and other key business performance metrics. Work with stakeholders including the Executive, Product, Data, and Design teams to assist with data-related technical issues and support their data infrastructure needs. Define, Design, and Build Executive dashboards and reports catalogs to serve decision-making and insight generation needs. Provide inputs to help keep data separated and secure across data centers - on-prem and private and public cloud environments. Create data tools for analytics and data science team members that assist them in building and optimizing our product into an innovative industry leader. Work with data and analytics experts to strive for greater functionality in our data systems. Implement scheduled data load process and maintain and manage the data pipelines. Troubleshoot, investigate, and fix failed data pipelines and prepare RCA. Experience with a mix of the following Data Engineering Technologies Python, Spark, Snowflake, Databricks, Hadoop (CDH), Hive, Sqoop, oozie SQL - Postgres, MySQL, MS SQL Server Azure - ADF, Synapse Analytics, SQL Server, ADLS G2 AWS - Redshift, EMR cluster, S3 Experience with a mix of the following Data Analytics and Visualization toolsets SQL, PowerBI, Tableau, Looker, Python, R Python libraries -- Pandas, Scikit-learn, Seaborn, Matplotlib, TF, Stat-Models, PySpark, Spark-SQL, R, SAS, Julia, SPSS, Azure - Synapse Analytics, Azure ML studio, Azure Auto ML
Posted 3 weeks ago
10.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Description: Principal Data Scientist (Reinforcement Learning and Game AI) Location : Chennai Department : Data Science Reports To : CTO Please share your profile to elangovan@algosoftware.io Mission: We are seeking a Principal Data Scientist with deep expertise in Reinforcement Learning (RL) and Neural Networks to lead the evolution of our AI core for gaming applications, specifically in imperfect information games like poker. This role offers the opportunity to work on groundbreaking innovations, collaborate with industry leaders, and gain visibility as a domain expert through publications and conferences. Key Responsibilities : Lead the development and evolution of our AI core, currently built in C++, towards the next generation of algorithms and architectures. This includes the opportunity to gain visibility as an industry expert through publishing papers, presenting at conferences, and similar activities. Design, implement, and optimize advanced RL algorithms tailored for imperfect information games. Research and integrate state-of-the-art techniques in RL, neural networks, and game theory to enhance AI performance and scalability. Collaborate with cross-functional teams to identify challenges and innovate AI-driven solutions. Evaluate and fine-tune AI models for decision-making, strategy optimization, and real-time applications in gaming. Profile, optimize, and troubleshoot the AI core for high-performance execution across different computing architectures (e.g., CPU, GPU, or custom accelerators). Document methodologies, experiments, and results to ensure transparency and reproducibility. Required Skills and Qualifications : Bachelor's, Master's, or Ph.D. in Computer Science, Mathematics, Physics, Artificial Intelligence, or a related field. 10+ years of experience in AI/ML, with a strong focus on Reinforcement Learning and Neural Networks. Proficiency in programming languages commonly used in AI, such as C++, Python, Julia, or others relevant to your expertise. In-depth understanding of game theory, especially concepts like Nash equilibrium and strategies in imperfect information games. Expertise in RL frameworks and tools like OpenSpiel, RLlib, or similar libraries tailored for game AI. Strong knowledge of RL algorithms and experience with neural network architectures. Familiarity with parallel computing, performance profiling, and optimization techniques. Excellent problem-solving skills, with the ability to work independently and in a team. Preferred Skills : Experience with multi-agent RL or hierarchical RL in gaming contexts. Background in poker AI or similar imperfect information games. Familiarity with deep learning frameworks such as TensorFlow, PyTorch, or Scikit-learn. Knowledge of distributed systems, parallel programming, or cloud-based AI deployments. Published research or contributions to open-source AI projects in RL or game AI. What We Offer : The opportunity to work on cutting-edge AI applications in gaming and lead innovations in imperfect information games. Support for publishing research papers, presenting at conferences, and gaining visibility as a recognized domain expert in the field. A collaborative and intellectually stimulating work environment with a team passionate about pushing the boundaries of AI. Flexibility and resources to experiment with state-of-the-art techniques and contribute to open-source projects.
Soft Skills : Effective communication for collaborating with cross-functional teams and presenting complex ideas clearly. Passion for innovation and driving the next generation of AI in gaming. Compensation & Benefits: Competitive salary commensurate with experience. Performance-based incentives tied to achieving key financial and strategic goals. Opportunities for career growth within a fast-scaling tech group.
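As a hedged illustration of the game-theory background the role asks for (Nash equilibrium in imperfect-information settings), the sketch below runs regret matching in self-play on rock-paper-scissors; it stands in for the regret-minimization family (e.g., CFR) and is in no way the production poker engine.

```python
# Minimal sketch: regret matching in self-play on rock-paper-scissors. The
# time-averaged strategies approach the Nash mix (1/3, 1/3, 1/3). Illustrative
# of the regret-minimization family (e.g., CFR) used in imperfect-info games.
import numpy as np

ACTIONS = 3  # rock, paper, scissors
payoff = np.array([[0., -1., 1.],
                   [1., 0., -1.],
                   [-1., 1., 0.]])        # row player's payoff; column player gets the negative

def strategy(regret):
    pos = np.maximum(regret, 0.0)
    return pos / pos.sum() if pos.sum() > 0 else np.full(ACTIONS, 1 / ACTIONS)

# Asymmetric initial regrets just to avoid starting exactly at the fixed point
regret = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
strat_sum = [np.zeros(ACTIONS), np.zeros(ACTIONS)]

for _ in range(50_000):
    s0, s1 = strategy(regret[0]), strategy(regret[1])
    strat_sum[0] += s0
    strat_sum[1] += s1
    value = s0 @ payoff @ s1              # expected payoff to the row player
    regret[0] += payoff @ s1 - value      # regret of each pure action vs. the current mix
    regret[1] += -(s0 @ payoff) + value   # same for the column player (zero-sum)

print("average strategies:",
      np.round(strat_sum[0] / strat_sum[0].sum(), 3),
      np.round(strat_sum[1] / strat_sum[1].sum(), 3))
```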
Posted 3 weeks ago
6.0 years
0 Lacs
Chennai, Tamil Nadu, India
Remote
About Company Papigen is a fast-growing global technology services company, delivering innovative digital solutions through deep industry experience and cutting-edge expertise. We specialize in technology transformation, enterprise modernization, and dynamic areas like Cloud, Big Data, Java, React, DevOps, and more. Our client-centric approach combines consulting, engineering, and data science to help businesses evolve and scale efficiently. We support flexible engagement models including Time & Material, Staff Augmentation, and SoW-based Managed Services. About The Role We’re hiring a Senior Machine Learning Developer to join our data modernization and AI engineering team. In this role, you will design, develop, and deploy ML models and intelligent systems that drive next-gen financial insights. You will work with structured and unstructured data on a secure cloud-based infrastructure leveraging Azure ML , Python , and modern NLP/AI frameworks . Key Responsibilities Build, train, and implement ML/NLP models for classification, clustering, and text analysis Work with large-scale financial data and develop intelligent automation solutions Clean, normalize, and validate structured and unstructured datasets Build and consume REST APIs for ML services Integrate Azure AI services including Cognitive Services, OpenAI, and Form Recognizer Collaborate with cross-functional teams including DevOps, data engineers, and PMs Follow best practices in model versioning, testing, and deployment Document ML workflows and models for reproducibility and compliance Required Skills & Experience 6+ years of experience in software/data engineering, with 3+ years in ML/AI Strong Python expertise with libraries like Scikit-learn, Pandas, NumPy, etc. Experience with ML frameworks such as TensorFlow or PyTorch Experience building AI models for NLP tasks like classification, summarization, or entity extraction Familiarity with Azure AI/ML services (Azure ML Studio, Cognitive Services, AKS, Key Vault) Experience handling both structured and unstructured datasets REST API and Python library development Excellent communication and documentation skills Nice To Have Exposure to finance sector datasets or financial document automation Experience with Azure OpenAI, Azure Language Studio, or Chatbot frameworks Familiarity with statistical programming languages like R or Julia Microsoft Data or Azure AI/ML certifications Benefits And Perks Opportunity to work with leading global clients Flexible work arrangements with remote options Exposure to modern technology stacks and tools Supportive and collaborative team environment Continuous learning and career development opportunities Skills: machine learning,natural language processing (nlp),numpy,azure ml,data normalization,azure cognitive services,financial data,azure key vault,azure ml studio,rest apis,unstructured data,api development,openai,data cleansing,rest api,python,scikit-learn,aks,tensorflow,nlp,pandas,data engineering,azure ai,pytorch,statistical programming,text classification
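As an illustration of the kind of work described above (an NLP classification model exposed through a REST API), here is a hedged sketch using scikit-learn and FastAPI. The training documents, labels, and route name are placeholders, not Papigen's actual stack, and Azure-specific services are omitted:

```python
# Minimal sketch: a scikit-learn text classifier wrapped in a FastAPI endpoint.
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny illustrative training set; a real system would load curated financial documents.
docs = ["quarterly revenue grew strongly", "invoice overdue, payment not received",
        "new product launch announced", "customer disputes a charge"]
labels = ["finance", "collections", "marketing", "collections"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression(max_iter=1000))
model.fit(docs, labels)

app = FastAPI()

class Document(BaseModel):
    text: str

@app.post("/classify")
def classify(doc: Document):
    # Return the predicted label and per-class probabilities.
    probs = model.predict_proba([doc.text])[0]
    return {
        "label": model.predict([doc.text])[0],
        "scores": dict(zip(model.classes_.tolist(), probs.round(3).tolist())),
    }

# Run locally with: uvicorn app:app --reload
```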
Posted 3 weeks ago
8.0 - 11.0 years
45 - 50 Lacs
Noida, Kolkata, Chennai
Work from Office
Dear Candidate, We are hiring a Julia Developer to build computational and scientific applications requiring speed and mathematical accuracy. Ideal for domains like finance, engineering, or AI research. Key Responsibilities: Develop applications and models using the Julia programming language. Optimize for performance, parallelism, and numerical accuracy. Integrate with Python or C++ libraries where needed. Collaborate with data scientists and engineers on simulations and modeling. Maintain well-documented and reusable codebases. Required Skills & Qualifications: Proficient in Julia, with knowledge of multiple dispatch and the type system Experience in numerical computing or scientific research Familiarity with Plots.jl, Flux.jl, or DataFrames.jl Understanding of Python, R, or MATLAB is a plus Soft Skills: Strong troubleshooting and problem-solving skills. Ability to work independently and in a team. Excellent communication and documentation skills. Note: If interested, please share your updated resume and preferred time for a discussion. If shortlisted, our HR team will contact you. Kandi Srinivasa Delivery Manager Integra Technologies
Posted 4 weeks ago
7.0 - 11.0 years
4 - 7 Lacs
Gurugram
Work from Office
Skill required: Delivery - Digital Marketing Analytics Designation: I&F Decision Sci Practitioner Specialist Qualifications: Any Graduation Years of Experience: 7 to 11 years About Accenture Combining unmatched experience and specialized skills across more than 40 industries, we offer Strategy and Consulting, Technology and Operations services, and Accenture Song, all powered by the world's largest network of Advanced Technology and Intelligent Operations centers. Our 699,000 people deliver on the promise of technology and human ingenuity every day, serving clients in more than 120 countries. Visit us at www.accenture.com What would you do? Data & AI: In digital marketing analytics, you will be involved in processes and technologies that enable marketers to effectively evaluate the success and value of their digital marketing initiatives, identify trends and patterns over time, and make data-driven decisions. What are we looking for? Data Science: Proficiency in Data Modeling, Experimental Design/Analysis, Marketing and/or Business Analytics. Knowledge of Data Science and Machine Learning concepts and algorithms. Experience with programming languages and Data Science scripting (in Python / R / Scala / Julia) Experience with Cloud Technologies Experience in implementing ethical practices into data science. Negotiation skills Problem-solving skills Adaptable and flexible Results orientation Prioritization of workload Roles and Responsibilities: Utilize advanced statistical and machine learning techniques to address complex business problems Possess the ability to conduct exploratory data analysis and present findings in ways that are meaningful to stakeholders Develop, refine, and implement models that enhance decision-making processes. Perform comprehensive preprocessing of both structured and unstructured data to ensure its quality and relevance Undertake data collection, analysis, and interpretation to extract meaningful insights. Analyze vast datasets to identify trends, patterns, and correlations that contribute to informed decision-making Build, refine, and deploy predictive models and machine-learning algorithms that align with business objectives Collaborate closely with engineering and marketing teams to understand their requirements and integrate data-driven solutions into their workflows Conduct ad-hoc analyses to provide timely and relevant insights, along with actionable recommendations to address pressing business questions Develop and implement machine learning solutions, including audience segmentation, marketing engagement level evaluation, and prioritization of sales engagement with accounts Manage the end-to-end lifecycle of machine learning models, including training, testing, deployment, and continuous improvement Effectively communicate complex findings and insights to both technical and non-technical stakeholders Document methodologies, data sources, and models to ensure transparency and reproducibility Qualification Any Graduation
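To make the audience-segmentation responsibility above concrete, here is a small illustrative sketch using scikit-learn; the engagement features, sample values, and cluster count are assumptions, not Accenture's actual pipeline:

```python
# Illustrative audience segmentation with k-means on engagement features.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

campaign = pd.DataFrame({
    "sessions":    [3, 40, 5, 55, 2, 60, 8, 45],
    "email_opens": [1, 12, 0, 15, 2, 18, 1, 10],
    "purchases":   [0,  4, 0,  6, 0,  7, 1,  5],
})

# Standardize features so no single metric dominates the distance calculation.
scaled = StandardScaler().fit_transform(campaign)
kmeans = KMeans(n_clusters=2, n_init=10, random_state=42).fit(scaled)
campaign["segment"] = kmeans.labels_

# Profile each segment to guide targeting and engagement-level scoring.
print(campaign.groupby("segment").mean().round(1))
```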
Posted 1 month ago
0 years
0 Lacs
India
Remote
Company Description At JaDel Digital, we specialise in delivering end-to-end digital solutions that empower businesses to thrive in today’s technology-driven world. With expertise in web development, data science, machine learning, artificial intelligence, and Salesforce solutions, we transform visions into impactful results. Our mission is to simplify complex challenges through innovative, scalable, and optimized solutions that drive business growth. We offer a collaborative approach, focusing on measurable outcomes that align with our clients' business goals. Role Description We’re seeking a skilled Julia Developer to work 3–4 hours per day remotely. The focus will be on refining existing Julia code, improving structure, performance, and maintainability. You’ll collaborate with our data science and development teams to ensure clean, efficient, and well-documented implementations. Responsibilities Review and refactor existing Julia code Improve performance, readability, and modularity Debug issues and ensure code stability Maintain proper documentation and test coverage Work closely with the team to align improvements with project goals Requirements Proficiency in Julia programming Strong experience with code optimization and clean coding practices Familiarity with scientific computing, numerical methods, or data workflows Comfortable working independently in a part-time, remote setup Basic knowledge of Git and collaborative development workflows Preferred Skills Experience with Flux.jl, DataFrames.jl, or optimization libraries Familiarity with PyJulia or integrating Julia with Python Background in AI/ML or numerical simulation projects What We Offer Flexible, part-time remote engagement Compensation aligned with market standards Opportunity to work on real-world AI/data science projects Collaborative and tech-focused work environment Job Details Location: Remote (India preferred) Type: Contractual | Part-Time (3–4 hours/day) Compensation: As per market standards
Posted 1 month ago
12.0 years
0 Lacs
Gurugram, Haryana, India
Remote
Pharma experience is a must Job Title: GenAI (Director / Sr. Director) Job Location: Noida / Gurgaon / Bangalore Job Overview: We are looking for a GenAI Leader with 12+ years of experience and a strong background in pharma/life sciences to join our team. As a key leader, you will drive innovation in the pharma commercial and clinical space by designing cutting-edge AI solutions, leading high-performing teams, and delivering impactful results. If you have a proven track record in Generative AI, hands-on experience with models like GPT, DALL-E, and Stable Diffusion, and a passion for mentoring and thought leadership, we want to hear from you! Must have skills & competencies Enhance the go-to-market strategy by designing new and relevant solution frameworks to accelerate our clients’ journeys for impacting patient outcomes. Pitch for these opportunities and craft winning proposals to grow the Data Science Practice. Build and lead a team of data scientists and analysts, fostering a collaborative and innovative environment. Oversee the design and delivery of the models, ensuring projects are completed on time and meet business objectives. Engage in consultative selling with clients to grow/deliver business. Develop and operationalize scalable processes to deliver on large & complex client engagements. Proven experience in building and productizing GenAI apps at an enterprise level. Ensure profitable delivery and great customer experience – design the end-to-end solution, put together the right team and help them deliver as per established processes. Build an A team – hire the required skill sets and nurture them in a supportive environment to develop strong delivery leaders for the Data Science Practice. Train and mentor staff and establish best practices and ways of working to enhance data science capabilities at Axtria. Operationalize an ecosystem for continuous learning & development. Write white papers, collaborate with academia and participate in relevant speaker opportunities to continuously upgrade learning & establish Axtria’s thought leadership in this space. Research, develop, evaluate and optimize newly emerging algorithms and technologies for relevant use cases in the pharma commercial & clinical space. Extensive hands-on experience with Python, R, or Julia, focusing on data science and generative AI frameworks. Expertise in working with generative models such as GPT, DALL-E, Stable Diffusion, Codex, and MidJourney for various applications. Proficiency in fine-tuning and deploying generative models using libraries like Hugging Face Transformers, Diffusers, or PyTorch Lightning. Strong understanding of generative techniques, including GANs, VAEs, diffusion models, and autoregressive models. Experience in prompt engineering, zero-shot, and few-shot learning for optimizing generative AI outputs across different use cases. Expertise in managing generative AI data pipelines, including preprocessing large-scale multimodal datasets for text, image, or code generation. Experience in leveraging APIs and SDKs for generative AI services from OpenAI, Anthropic, Cohere, and Google AI. Expertise in defining generative AI strategies, aligning them with business objectives, and implementing solutions for scalable deployment. Knowledge of real-world applications of generative AI, such as text generation, code generation, image generation, chatbots, and content creation. Awareness of ethical considerations in generative AI, including bias mitigation, data privacy, and safe deployment practices.
Good to have skills & competencies Proven ability to collaborate with cross-functional teams, including product managers, data scientists, and DevOps engineers, to deliver end-to-end generative AI solutions. Proven experience in leading teams focused on generative AI projects, mentoring ML engineers, and driving innovation in AI applications. Possessing robust analytical skills to address and model intricate business needs is highly advantageous, especially for those with a background in life sciences or pharmaceuticals. Eligibility Criteria Masters/PhD in CSE/IT from Tier 1 institute Minimum 12+ years of relevant experience in building software applications in data and analytics field. We will provide– (Employee Value Proposition) Offer an inclusive environment that encourages diverse perspectives and ideas Delivering challenging and unique opportunities to contribute to the success of a transforming organization Opportunity to work on technical challenges that may impact on geographies Vast opportunities for self-development: online Axtria Institute, knowledge sharing opportunities globally, learning opportunities through external certifications Sponsored Tech Talks & Hackathons Possibility of relocating to any Axtria office for short and long-term projects Benefit package: -Health benefits -Retirement benefits -Paid time off -Flexible Benefits -Hybrid /FT Office/Remote Axtria is an equal-opportunity employer that values diversity and inclusiveness in the workplace. Who we are Axtria 14 years journey Axtria, Great Place to Work Life at Axtria Axtria Diversity
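As a concrete illustration of the parameter-efficient fine-tuning mentioned in the skills above, here is a hedged sketch using Hugging Face Transformers and PEFT (LoRA). The base model name, target modules, and hyperparameters are placeholder assumptions, not a prescribed Axtria configuration:

```python
# Sketch of parameter-efficient fine-tuning (LoRA) with Hugging Face PEFT.
from transformers import AutoModelForCausalLM, AutoTokenizer
from peft import LoraConfig, get_peft_model, TaskType

base_model_name = "meta-llama/Llama-3.2-1B"  # assumed placeholder; any causal LM works
tokenizer = AutoTokenizer.from_pretrained(base_model_name)
model = AutoModelForCausalLM.from_pretrained(base_model_name)

lora_config = LoraConfig(
    task_type=TaskType.CAUSAL_LM,
    r=8,                                   # low-rank adapter dimension
    lora_alpha=16,                         # scaling factor for adapter updates
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],   # attention projections to adapt (model-dependent)
)

model = get_peft_model(model, lora_config)
model.print_trainable_parameters()  # typically well under 1% of weights are trainable

# From here, training would proceed with transformers.Trainer (or a similar loop)
# on a domain-specific instruction dataset, e.g. pharma commercial/clinical text.
```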
Posted 1 month ago
10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Summary Position Summary DT-US Product Engineering - Data Scientist Manager We are seeking an exceptional Data Scientist who combines deep expertise in AI/ML with a strong focus on data quality and advanced analytics. This role requires a proven track record in developing production-grade machine learning solutions, implementing robust data quality frameworks, and leveraging cutting-edge analytical tools to drive business transformation through data-driven insights . Work you will do The Data Scientist will be responsible for developing and implementing end-to-end AI/ML solutions while ensuring data quality excellence across all stages of the data lifecycle. This role requires extensive experience in modern data science platforms, AI frameworks, and analytical tools, with a focus on scalable and production-ready implementations. Project Leadership and Management: Lead complex data science initiatives utilizing Databricks, Dataiku, and modern AI/ML frameworks for end-to-end solution development Establish and maintain data quality frameworks and metrics across all stages of model development Design and implement data validation pipelines and quality control mechanisms for both structured and unstructured data Strategic Development: Develop and deploy advanced machine learning models, including deep learning and generative AI solutions Design and implement automated data quality monitoring systems and anomaly detection frameworks Create and maintain MLOps pipelines for model deployment, monitoring, and maintenance Team Mentoring and Development: Lead and mentor a team of data scientists and analysts, fostering a culture of technical excellence and continuous learning Develop and implement training programs to enhance team capabilities in emerging technologies and methodologies Establish performance metrics and career development pathways for team members Drive knowledge sharing initiatives and best practices across the organization Provide technical guidance and code reviews to ensure high-quality deliverables Data Quality and Governance: Establish data quality standards and best practices for data collection, preprocessing, and feature engineering Implement data validation frameworks and quality checks throughout the ML pipeline Design and maintain data documentation systems and metadata management processes Lead initiatives for data quality improvement and standardization across projects Technical Implementation: Design, develop and deploy end-to-end AI/ML solutions using modern frameworks including TensorFlow, PyTorch, scikit-learn, XGBoost for machine learning, BERT and GPT for NLP, and OpenCV for computer vision applications Architect and implement robust data processing pipelines leveraging enterprise platforms like Databricks, Apache Spark, Pandas for data transformation, Dataiku and Apache Airflow for ETL/ELT processes, and DVC for data version control Establish and maintain production-grade MLOps practices including model deployment, monitoring, A/B testing, and continuous integration/deployment pipelines Technical Expertise Requirements: Must Have: Enterprise AI/ML Platforms: Demonstrate mastery of Databricks for large-scale processing, with proven ability to architect solutions at scale Programming & Analysis: Advanced Python (NumPy, Pandas, scikit-learn), SQL, PySpark with production-level expertise Machine Learning: Deep expertise in TensorFlow or PyTorch, and scikit-learn with proven implementation experience Big Data Technologies: Advanced knowledge of Apache Spark, Databricks, 
and distributed computing architectures Cloud Platforms: Strong experience with at least one major cloud platform (AWS/Azure/GCP) and their ML services (SageMaker/Azure ML/Vertex AI) Data Processing & Analytics: Extensive experience with enterprise-grade data processing tools and ETL pipelines MLOps & Infrastructure: Proven experience in model deployment, monitoring, and maintaining production ML systems Data Quality: Experience implementing comprehensive data quality frameworks and validation systems Version Control & Collaboration: Strong proficiency with Git, JIRA, and collaborative development practices Database Systems: Expert-level knowledge of both SQL and NoSQL databases for large-scale data management Visualization Tools: Tableau, Power BI, Plotly, Seaborn Large Language Models: Experience with GPT, BERT, LLaMA, and fine-tuning methodologies Good to Have: Additional Programming: R, Julia Additional Big Data: Hadoop, Hive, Apache Kafka Multi-Cloud: Experience across AWS, Azure, and GCP platforms Advanced Analytics: Dataiku, H2O.ai Additional MLOps: MLflow, Kubeflow, DVC (Data Version Control) Data Quality & Validation: Great Expectations, Deequ, Apache Griffin Business Intelligence: SAP HANA, SAP Business Objects, SAP BW Specialized Databases: Cassandra, MongoDB, Neo4j Container Orchestration: Kubernetes, Docker Additional Collaboration Tools: Confluence, BitBucket Education: Advanced degree in a quantitative discipline (Statistics, Math, Computer Science, Engineering) or relevant experience. Qualifications: 10-13 years of experience with data mining, statistical modeling tools and underlying algorithms. 5+ years of experience with data analysis software for large-scale analysis of structured and unstructured data. Proven track record of leading and delivering large-scale machine learning projects, including production model deployment, data quality framework implementation, and experience with very large datasets to create data-driven insights through predictive and prescriptive analytic models. Extensive knowledge of supervised and unsupervised analytic modeling techniques such as linear and logistic regression, support vector machines, decision trees / random forests, Naïve Bayes, neural networks, association rules, text mining, and k-nearest neighbors, among other clustering models. Extensive experience with deep learning frameworks, automated ML platforms, data processing tools (Databricks Delta Lake, Apache Spark), analytics platforms (Tableau, Power BI), and major cloud providers (AWS, Azure, GCP) Experience architecting and implementing enterprise-grade solutions using cloud-native ML services while ensuring cost optimization and performance efficiency Strong track record of team leadership, stakeholder management, and driving technical excellence across multiple concurrent projects Expert-level proficiency in Python, R, and SQL, with deep understanding of statistical analysis, hypothesis testing, feature engineering, model evaluation, and validation techniques in production environments Demonstrated leadership experience in implementing MLOps practices, including model monitoring, A/B testing frameworks, and maintaining production ML systems at scale. Working knowledge of supervised and unsupervised learning techniques, such as Regression/Generalized Linear Models, decision tree analysis, boosting and bagging, Principal Components Analysis, and clustering methods.
Strong oral and written communication skills, including presentation skills The Team Information Technology Services (ITS) helps power Deloitte’s success. ITS drives Deloitte, which serves many of the world’s largest, most respected organizations. We develop and deploy cutting-edge internal and go-to-market solutions that help Deloitte operate effectively and lead in the market. Our reputation is built on a tradition of delivering with excellence. The ~3,000 professionals in ITS deliver services including: Security, risk & compliance Technology support Infrastructure Applications Relationship management Strategy Deployment PMO Financials Communications Product Engineering (PxE) The Product Engineering (PxE) team is the internal software and applications development team responsible for delivering leading-edge technologies to Deloitte professionals. Their broad portfolio includes web and mobile productivity tools that empower our people to log expenses, enter timesheets, book travel and more, anywhere, anytime. PxE enables our client service professionals through a comprehensive suite of applications across the business lines. In addition to application delivery, PxE offers full-scale design services, a robust mobile portfolio, cutting-edge analytics, and innovative custom development. Work Location: Hyderabad Our purpose Deloitte’s purpose is to make an impact that matters for our people, clients, and communities. At Deloitte, purpose is synonymous with how we work every day. It defines who we are. Our purpose comes through in our work with clients that enables impact and value in their organizations, as well as through our own investments, commitments, and actions across areas that help drive positive outcomes for our communities. Our people and culture Our inclusive culture empowers our people to be who they are, contribute their unique perspectives, and make a difference individually and collectively. It enables us to leverage different ideas and perspectives, and bring more creativity and innovation to help solve our clients' most complex challenges. This makes Deloitte one of the most rewarding places to work. Professional development At Deloitte, professionals have the opportunity to work with some of the best and discover what works best for them. Here, we prioritize professional growth, offering diverse learning and networking opportunities to help accelerate careers and enhance leadership skills. Our state-of-the-art DU: The Leadership Center in India, located in Hyderabad, represents a tangible symbol of our commitment to the holistic growth and development of our people. Explore DU: The Leadership Center in India. Benefits To Help You Thrive At Deloitte, we know that great people make a great organization. Our comprehensive rewards program helps us deliver a distinctly Deloitte experience that empowers our professionals to thrive mentally, physically, and financially—and live their purpose. To support our professionals and their loved ones, we offer a broad range of benefits. Eligibility requirements may be based on role, tenure, type of employment and/or other criteria. Learn more about what working at Deloitte can mean for you. Recruiting tips From developing a stand-out resume to putting your best foot forward in the interview, we want you to feel prepared and confident as you explore opportunities at Deloitte. Check out recruiting tips from Deloitte recruiters. Requisition code: 303069
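To illustrate the kind of automated data-quality checks this role describes, here is a minimal pandas sketch; the column names, rules, and thresholds are hypothetical placeholders, not Deloitte's actual framework:

```python
# Minimal data-quality validation sketch: completeness, uniqueness, and range checks.
import pandas as pd

def validate(df: pd.DataFrame) -> dict:
    """Run basic quality checks on a transactions table and return a report."""
    report = {
        "row_count": len(df),
        "null_rate": df.isna().mean().round(3).to_dict(),
        "duplicate_ids": int(df["transaction_id"].duplicated().sum()),
        "negative_amounts": int((df["amount"] < 0).sum()),
        "stale_records": int((pd.Timestamp.now() - df["updated_at"]).dt.days.gt(30).sum()),
    }
    report["passed"] = (
        report["duplicate_ids"] == 0
        and report["negative_amounts"] == 0
        and max(report["null_rate"].values()) < 0.05  # illustrative 5% null threshold
    )
    return report

df = pd.DataFrame({
    "transaction_id": [1, 2, 3, 3],
    "amount": [120.0, -5.0, 80.0, 80.0],
    "updated_at": pd.to_datetime(["2024-01-02", "2024-01-03", "2023-06-01", "2023-06-01"]),
})
print(validate(df))  # flags the duplicate ID, the negative amount, and stale rows
```

In a production pipeline, checks like these would typically run as a scheduled validation step (e.g., in Airflow or Databricks) and feed monitoring dashboards rather than a single print statement.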
Posted 1 month ago
2.0 years
10 - 12 Lacs
Gurgaon
On-site
Job Overview We are looking for a dynamic and innovative Full Stack Data Scientist with 2+ years of experience who excels in end-to-end data science solutions. The ideal candidate is a tech-savvy professional passionate about leveraging data to solve complex problems, develop predictive models, and drive business impact in the MarTech domain. Key Responsibilities 1. Data Engineering & Preprocessing Collect, clean, and preprocess structured and unstructured data from various sources. Perform advanced feature engineering, outlier detection, and data transformation. Collaborate with data engineers to ensure seamless data pipeline development. 2. Machine Learning Model Development Design, train, and validate machine learning models (supervised, unsupervised, deep learning). Optimize models for business KPIs such as accuracy, recall, and precision. Innovate with advanced algorithms tailored to marketing technologies. 3. Full Stack Development Build production-grade APIs for model deployment using frameworks like Flask, FastAPI, or Django. Develop scalable and modular code for data processing and ML integration. 4. Deployment & Operationalization Deploy models on cloud platforms (AWS, Azure, or GCP) using tools like Docker and Kubernetes. Implement continuous monitoring, logging, and retraining strategies for deployed models. 5. Insight Visualization & Communication Create visually compelling dashboards and reports using Tableau, Power BI, or similar tools. Present insights and actionable recommendations to stakeholders effectively. 6. Collaboration & Teamwork Work closely with marketing analysts, product managers, and engineering teams to solve business challenges. Foster a collaborative environment that encourages innovation and shared learning. 7. Continuous Learning & Innovation Stay updated on the latest trends in AI/ML, especially in marketing automation and analytics. Identify new opportunities for leveraging data science in MarTech solutions. Qualifications Educational Background Bachelor’s or Master’s degree in Computer Science, Data Science, Statistics, Mathematics, or a related field. Technical Skills Programming Languages: Python (must-have), R, or Julia; familiarity with Java or C++ is a plus. ML Frameworks: TensorFlow, PyTorch, Scikit-learn, or XGBoost. Big Data Tools: Spark, Hadoop, or Kafka. Cloud Platforms: AWS, Azure, or GCP for model deployment and data pipelines. Databases: Expertise in SQL and NoSQL (e.g., MongoDB, Cassandra). Visualization: Mastery of Tableau, Power BI, Plotly, or D3.js. Version Control: Proficiency with Git for collaborative coding. Experience 2+ years of hands-on experience in data science, machine learning, and software engineering. Proven expertise in deploying machine learning models in production environments. Experience in handling large datasets and implementing big data technologies. Soft Skills Strong problem-solving and analytical thinking. Excellent communication and storytelling skills for technical and non-technical audiences. Ability to work collaboratively in diverse and cross-functional teams. Preferred Qualifications Experience with Natural Language Processing (NLP) and Computer Vision (CV). Familiarity with CI/CD pipelines and DevOps for ML workflows. Exposure to Agile project management methodologies. Why Join Us? Opportunity to work on innovative projects with cutting-edge technologies. Collaborative and inclusive work environment that values creativity and growth. 
If you're passionate about turning data into actionable insights and driving impactful business decisions, we’d love to hear from you! Job Types: Full-time, Permanent Pay: ₹1,000,000.00 - ₹1,200,000.00 per year Benefits: Flexible schedule Health insurance Life insurance Paid sick time Paid time off Provident Fund Schedule: Day shift Fixed shift Monday to Friday Experience: Data science: 2 years (Required) Location: Gurugram, Haryana (Preferred) Work Location: In person
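As a sketch of the KPI-driven model optimization mentioned under Key Responsibilities (tuning for precision and recall), the following uses scikit-learn on synthetic data to pick a decision threshold; the 0.60 precision floor is an arbitrary illustrative choice, not a business rule:

```python
# Evaluate a classifier against precision/recall KPIs and tune the decision threshold.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import precision_score, recall_score
from sklearn.model_selection import train_test_split

# Synthetic, imbalanced data standing in for a churn/conversion problem.
X, y = make_classification(n_samples=2000, n_features=20, weights=[0.9, 0.1], random_state=7)
X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=7)

model = GradientBoostingClassifier(random_state=7).fit(X_train, y_train)
probs = model.predict_proba(X_test)[:, 1]

results = []
for threshold in np.linspace(0.1, 0.9, 17):
    preds = (probs >= threshold).astype(int)
    results.append((round(float(threshold), 2),
                    round(precision_score(y_test, preds, zero_division=0), 2),
                    round(recall_score(y_test, preds), 2)))

# Choose the threshold that meets the (assumed) precision floor with the best recall.
eligible = [r for r in results if r[1] >= 0.60]
print(max(eligible, key=lambda r: r[2]) if eligible else "no threshold meets the floor")
```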
Posted 1 month ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About The Company JuliaHub's mission is to develop products that bring Julia's superpowers to its customers. JuliaHub's flagship product is JuliaHub, a secure, cloud-based, software-as-a-service platform for developing Julia programs, deploying them, and scaling to thousands of nodes. JuliaHub was founded in 2015 by the creators of the Julia programming language for artificial intelligence, machine learning, analytics, data science, modeling, and simulation. About The Role We are seeking a creative and detail-oriented Web Developer to join our team and play a key role in supporting the marketing team. You will be responsible for bringing marketing content, campaigns, and digital experiences to life on our website. This role requires a strong blend of technical expertise and an eye for design, ensuring a seamless user experience that aligns with brand and business goals. Key Responsibilities Develop and maintain web pages, landing pages, and content sections based on marketing needs. Collaborate with the marketing team to implement content updates, design changes, and campaign-specific web features. Optimize website performance, SEO, and accessibility. Ensure the website is responsive, user-friendly, and aligned with the latest web standards. Troubleshoot and resolve website issues or bugs quickly. Key Requirements Bachelor's degree in Computer Science, Web Development, or a related field. 3+ years of experience in web development, preferably in a marketing-focused environment. Proficiency in HTML, CSS, and JavaScript. Hands-on experience with content management systems (CMS), with a strong preference for HubSpot CMS. Proficient in deploying, managing, and maintaining websites using a CMS, ideally HubSpot. Strong technical skills in resolving SEO errors flagged by tools such as Semrush (e.g., broken links, missing tags, crawl issues, site speed optimization). Ability to implement structured data, canonical tags, XML sitemaps, and other technical SEO elements. Working knowledge of web analytics tools such as Google Analytics 4 (GA4) and Google Tag Manager (GTM), with the ability to implement and troubleshoot tracking scripts. Understanding of UI/UX principles and responsive design. Strong attention to detail, time management, and communication skills. Ability to work collaboratively with cross-functional teams. Inside Our Culture We're more than just a startup; we're a hub of innovation where cutting-edge technology meets a high-growth mindset. As part of our team, you'll be working on challenging, impactful projects that push the boundaries of what's possible. Unlimited Learning & Development: Dive into hands-on projects with the latest technology stacks, collaborating with a global team of technical experts. We offer continuous learning opportunities, including workshops, mentorship, and access to courses and certifications to fuel your professional growth. Innovate & Build at Scale: We are tackling some of the most complex technical challenges in the industry. As a part of our highly skilled, diverse team, you'll have the autonomy to experiment, innovate, and help shape the future of technology. Ownership & Impact: At our fast-growing startup, you'll have the opportunity to take ownership of products and features that directly impact the business and our customers. Your contributions will be recognized and rewarded. Uncapped Sick Leaves (TRUST Policy): With our TRUST policy, you can take unlimited sick leaves year-round, no questions asked.
We prioritize your well-being and trust you to manage your time responsibly. Top-Tier Compensation & Equity: Along with a competitive salary, we offer equity options, giving you the chance to share in the success of the company as we grow. Health & Wellness Support: We provide company-sponsored health and wellness benefits, along with retirement plans to secure your future. Join us and be a part of a technical powerhouse where growth, innovation, and impact are at the core of everything we do! (ref:hirist.tech)
Posted 1 month ago
7.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Position Summary... What you'll do... About Team: This is the team which builds reusable technologies that aid in acquiring customers, onboarding and empowering merchants, besides ensuring a seamless experience for both these stakeholders. We also optimize tariffs and assortment, adhering to the Walmart philosophy - Everyday Low Cost. In addition to ushering in affordability, we also create personalized experiences for customers the omnichannel way, across all channels - in-store, on the mobile app and websites. Marketplace is the gateway to domestic and international Third-Party sellers; we enable them to manage their end-to-end onboarding, catalog management, order fulfilment, and return & refund management. Our team is responsible for design, development, and operations of large-scale distributed systems by leveraging cutting-edge technologies in web/mobile, cloud, big data & AI/ML. We interact with multiple teams across the company to provide scalable, robust technical solutions. What you'll do: As a Data Scientist for Walmart, you'll have the opportunity to Drive data-derived insights across the wide range of retail divisions by developing advanced statistical models, machine learning algorithms and computational algorithms based on business initiatives Direct the gathering of data, assessing data validity and synthesizing data into large analytics datasets to support project goals Utilize big data analytics and advanced data science techniques to identify trends, patterns, and discrepancies in data. Determine additional data needed to support insights Build and train statistical models and machine learning algorithms for replication for future projects Communicate recommendations to business partners and influence future plans based on insights What you'll bring: Very good knowledge of the foundations of machine learning and statistics Hands-on experience in building and maintaining Gen AI-powered solutions in production Experience in analyzing complex problems and translating them into data science algorithms Experience in machine learning (supervised and unsupervised) and deep learning. Hands-on experience in Computer Vision and NLP. Experience with big data analytics - identifying trends, patterns, and outliers in large volumes of data Strong experience in Python with excellent knowledge of Data Structures Strong experience with big data platforms - Hadoop (Hive, Pig, Map Reduce, HQL, Scala, Spark) Hands-on experience with Git Experience with SQL and relational databases, data warehouses Qualifications Bachelor's with > 7 years of experience / Master's degree with > 5 years of experience. Educational qualifications should preferably be in Computer Science/Mathematics/Statistics or a related area. Experience should be relevant to the role. Good to have: Experience in the e-commerce domain. Experience in R and Julia Demonstrated success in data science platforms like Kaggle. About Walmart Global Tech Imagine working in an environment where one line of code can make life easier for hundreds of millions of people. That's what we do at Walmart Global Tech. We're a team of software engineers, data scientists, cybersecurity experts and service professionals within the world's leading retailer who make an epic impact and are at the forefront of the next retail disruption. People are why we innovate, and people power our innovations. We are people-led and tech-empowered. We train our team in the skillsets of the future and bring in experts like you to help us grow.
We have roles for those chasing their first opportunity as well as those looking for the opportunity that will define their career. Here, you can kickstart a great career in tech, gain new skills and experience for virtually every industry, or leverage your expertise to innovate at scale, impact millions and reimagine the future of retail. Flexible, hybrid work We use a hybrid way of working, with primary in-office presence coupled with an optimal mix of virtual presence. We use our campuses to collaborate and be together in person, as business needs require and for development and networking opportunities. This approach helps us make quicker decisions, remove location barriers across our global team, and be more flexible in our personal lives. Benefits Beyond our great compensation package, you can receive incentive awards for your performance. Other great perks include a host of best-in-class benefits: maternity and parental leave, PTO, health benefits, and much more. Belonging We aim to create a culture where every associate feels valued for who they are, rooted in respect for the individual. Our goal is to foster a sense of belonging, to create opportunities for all our associates, customers and suppliers, and to be a Walmart for everyone. At Walmart, our vision is 'everyone included.' By fostering a workplace culture where everyone is, and feels, included, everyone wins. Our associates and customers reflect the makeup of all 19 countries where we operate. By making Walmart a welcoming place where all people feel like they belong, we're able to engage associates, strengthen our business, improve our ability to serve customers, and support the communities where we operate. Equal Opportunity Employer Walmart, Inc., is an Equal Opportunities Employer By Choice. We believe we are best equipped to help our associates, customers and the communities we serve live better when we really know them. That means understanding, respecting and valuing unique styles, experiences, identities, ideas and opinions while being welcoming of all people. Minimum Qualifications... Outlined below are the required minimum qualifications for this position. If none are listed, there are no minimum qualifications. Minimum Qualifications: Option 1 - Bachelor's degree in Statistics, Economics, Analytics, Mathematics, Computer Science, Information Technology, or related field and 3 years' experience in an analytics-related field. Option 2 - Master's degree in Statistics, Economics, Analytics, Mathematics, Computer Science, Information Technology, or related field and 1 year's experience in an analytics-related field. Option 3 - 5 years' experience in an analytics or related field. Preferred Qualifications... Outlined below are the optional preferred qualifications for this position. If none are listed, there are no preferred qualifications. Primary Location... 4, 5, 6, 7 Floor, Building 10, SEZ, Cessna Business Park, Kadubeesanahalli Village, Varthur Hobli, India R-2146190
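To ground the big-data analytics skills listed above, here is a small illustrative PySpark aggregation for spotting category-level trends and volatility; the schema, sample values, and threshold are assumptions, not Walmart data, and running it requires a local Spark/Java runtime:

```python
# Illustrative PySpark aggregation over category-level sales.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("category-trends").getOrCreate()

orders = spark.createDataFrame(
    [("electronics", "2024-01", 120.0), ("electronics", "2024-02", 180.0),
     ("grocery",     "2024-01",  45.0), ("grocery",     "2024-02",  47.0),
     ("apparel",     "2024-01",  60.0), ("apparel",     "2024-02",  20.0)],
    ["category", "month", "gmv_lakhs"],
)

# Average and dispersion per category; flag categories with unusually volatile GMV.
summary = (orders.groupBy("category")
           .agg(F.avg("gmv_lakhs").alias("avg_gmv"),
                F.stddev("gmv_lakhs").alias("gmv_stddev"))
           .withColumn("volatile", F.col("gmv_stddev") > F.col("avg_gmv") * 0.3))

summary.show()
spark.stop()
```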
Posted 1 month ago
0 years
0 Lacs
Durgapur, West Bengal, India
Remote
Space Mechanics Coding Expert (Volunteer Position) | Talisha Aerospace Location: Remote | Commitment: Flexible | Compensation: Unpaid (Founding Volunteer Role) Company: Talisha Aerospace – Building the Future of Reusable Rocket Technology Are you passionate about space exploration and advanced propulsion systems? Do you dream of contributing to breakthrough innovations in orbital mechanics and reusable rockets? Talisha Aerospace invites you to become a part of a bold new journey — not just to space, but to redefine how we get there. 🌌 About Us We are Talisha Aerospace, a newly formed aerospace startup with an ambitious mission: to design and develop reusable rocket technology that transforms access to space. Though we are in our early stages, our focus is razor-sharp, our passion unshakable, and our roadmap firmly aligned with solving real engineering challenges in orbital dynamics, propulsion systems, and reusability. This is a volunteer role — ideal for professionals, researchers, or enthusiasts who want to leave a legacy in the next era of spaceflight. 🔭 Role Overview We are looking for a Space Mechanics Coding Expert to join our core technical team. This is a rare opportunity to apply your knowledge of orbital mechanics, astrodynamics, and numerical simulation in a mission that could change the course of aerospace history. You'll work on the algorithms, models, and simulation environments that govern spacecraft behavior — from launch to landing and beyond. 🛰 Responsibilities Develop and implement algorithms for orbital dynamics, trajectory optimization, and re-entry simulation Write robust and scalable code (e.g., in Python, MATLAB, C++, or Julia) Build simulation tools for rocket staging, attitude control, and fuel optimization Collaborate with propulsion, avionics, and structural engineers Integrate space mechanics calculations into broader system design tools Contribute to research and documentation as part of the core team 🧠 Ideal Background Strong foundation in orbital mechanics, astrodynamics, and space systems engineering Proficiency in scientific programming (Python, C++, MATLAB, or similar) Familiarity with numerical methods, Kalman filtering, and simulation environments Experience with aerospace toolkits (e.g., GMAT, STK, Orekit) is a bonus Excellent problem-solving skills and a collaborative mindset Prior experience in academic or research projects in aerospace or related fields 💡 Why Join Talisha Aerospace? Be part of a pioneering mission that aims to make spaceflight more sustainable and accessible Work alongside passionate innovators and engineers who believe in open collaboration Help shape the technical foundation of a future-defining aerospace company Gain experience and visibility that could become part of your legacy in the space industry 🌍 Important Note This is currently an unpaid volunteer role. We are transparent about this — as a bootstrapped startup, we are not funded (yet). But we're not looking for employees right now; we're looking for founding visionaries who believe in what we’re building and are ready to help lay the groundwork. Your contributions could be instrumental in attracting funding, accelerating development, and establishing Talisha Aerospace as a force in the new space economy. Ready to launch with us? Send us your resume, portfolio, or just a short note on why you want to be part of Talisha Aerospace. 🌠 Let's build rockets that come back — and a future that goes forward.
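For candidates wondering what a starting point for the orbital-dynamics tooling might look like, here is a minimal two-body propagation sketch in Python with SciPy; it is a generic textbook example under ideal Keplerian assumptions, not Talisha Aerospace's actual code:

```python
# Minimal two-body orbit propagation with SciPy's adaptive ODE integrator.
import numpy as np
from scipy.integrate import solve_ivp

MU_EARTH = 3.986004418e14  # Earth's gravitational parameter, m^3/s^2

def two_body(t, state):
    """State = [x, y, z, vx, vy, vz]; returns its time derivative under point-mass gravity."""
    r = state[:3]
    a = -MU_EARTH * r / np.linalg.norm(r) ** 3
    return np.concatenate((state[3:], a))

# Circular low Earth orbit at roughly 500 km altitude.
r0 = 6_878_000.0                  # orbital radius, m
v0 = np.sqrt(MU_EARTH / r0)       # circular orbital speed, about 7.6 km/s
state0 = [r0, 0.0, 0.0, 0.0, v0, 0.0]
period = 2 * np.pi * np.sqrt(r0**3 / MU_EARTH)

sol = solve_ivp(two_body, (0.0, period), state0, rtol=1e-9, atol=1e-9)
final_radius = np.linalg.norm(sol.y[:3, -1])
print(f"orbital period ~ {period/60:.1f} min, radius drift ~ {abs(final_radius - r0):.2f} m")
```

A real mission toolchain would add perturbations (J2, drag, third bodies), attitude dynamics, and validated integrators, which is exactly the kind of work the responsibilities above describe.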
Posted 1 month ago
12.0 - 17.0 years
30 - 35 Lacs
Noida
Work from Office
Minimum 12+ years of relevant experience in building software applications in data and analytics field Enhance the go-to-market strategy by designing new and relevant solution frameworks to accelerate our clients’ journeys for impacting patient outcomes. Pitch for these opportunities and craft winning proposals to grow the Data Science Practice. Build and lead a team of data scientists and analysts, fostering a collaborative and innovative environment. Oversee the design and delivery of the models, ensuring projects are completed on time and meet business objectives. Engaging in consultative selling with clients to grow/deliver business. Develop and operationalize scalable processes to deliver on large & complex client engagements. Extensive hands-on experience with Python, R, or Julia, focusing on data science and generative AI frameworks. Expertise in working with generative models such as GPT, DALL-E, Stable Diffusion, Codex, and MidJourney for various applications. Proficiency in fine-tuning and deploying generative models using libraries like Hugging Face Transformers, Diffusers, or PyTorch Lightning. Strong understanding of generative techniques, including GANs, VAEs, diffusion models, and autoregressive models. Experience in prompt engineering, zero-shot, and few-shot learning for optimizing generative AI outputs across different use cases. Expertise in managing generative AI data pipelines, including preprocessing large-scale multimodal datasets for text, image, or code generation.
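As an illustration of the prompt-engineering and few-shot learning skill mentioned above, here is a hedged sketch using the OpenAI Python SDK; the model name, example texts, and severity labels are purely illustrative assumptions, not a production pharmacovigilance configuration:

```python
# Few-shot prompting sketch: in-context examples steer the model toward a fixed label set.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

few_shot = [
    {"role": "system", "content": "Classify the adverse-event severity as MILD, MODERATE, or SEVERE."},
    {"role": "user", "content": "Patient reported a transient mild headache."},
    {"role": "assistant", "content": "MILD"},
    {"role": "user", "content": "Hospitalization required after anaphylactic reaction."},
    {"role": "assistant", "content": "SEVERE"},
    {"role": "user", "content": "Persistent nausea interfering with daily activity."},
]

response = client.chat.completions.create(model="gpt-4o-mini", messages=few_shot, temperature=0)
print(response.choices[0].message.content)  # expected output: MODERATE
```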
Posted 1 month ago
4.0 - 6.0 years
2 - 15 Lacs
Remote, , India
Remote
Job Description REMOTE OPPORTUNITY! Client: Technology GCC for Global Biotech Company 2+ years of relevant experience in instrument pipeline development 2+ years of experience writing and executing Test Scripts to challenge business and functional requirements 2+ years of experience working in a collaborative team environment including both IT and business personnel Minor experience in performing business analysis User Support Application Support Software Design and Development Python, LabVIEW, C++, Julia Public instrument schemas and models Debugging Source Code Scripting Configuration Management Waterfall and Agile Methodology Research and Development / Technical Development Lab Support Integration and System Testing Knowledge and Understanding of 21 CFR Part 11 and Data Integrity Strong Interpersonal, Verbal and Written Communication Exceptional Analytical, Problem-Solving, and Troubleshooting Abilities
Posted 1 month ago