Get alerts for new jobs matching your selected skills, preferred locations, and experience range. Manage Job Alerts
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Execute customer projects with high quality, cost, and deliverables within agreed timelines, with close follow-ups with internal CFTs and customers.
• Calibration work packages, vehicle calibration, and documenting results in development reports
• Vehicle functions: vehicle speed acquisition, vehicle drivability, low & high idle governing, start system, expeditions (summer, winter & altitude), gear detection, top speed limitation, FAS, smoke limitation
• Special functions: ZFC, FBC, dew point detection, FMO, P4 modelling
• Dataset and project management; ARCs, PRCs, SSDs & CSSDs checks, with follow-ups with internal CFTs
• Label responsibility list
• INCA Flow, iCDM, vCDM, E-Handbook and Uniplot usage
• PSR-C for dataset containerization
• OHW-specific functions (CEV & genset calibration)
• Preparation of first firing activities
• Executing necessary preliminary tests on the labcar
• Preparing the necessary test setup for calibration activities on the engine/vehicle
• Executing necessary measurements on engines & vehicles based on functional experts' requirements
• Executing calibration activities on the dyno and in real driving conditions
• Participating in customer discussions and updating the status regularly
• Creating calibration development reports
• Calibration review with calibration functional experts
• Field data analysis and report creation
• Frequent updates on calibration activity status to the internal team and customers
• Calibration data and status updates in iCDM & vCDM
• Handling of test vehicles as per the predefined procedure
• Participation in vehicle expeditions at extreme ambient conditions
• Support for calibration efficiency improvements and innovation
• Calibration automation and tools development
• Domestic travel to customer sites for calibration activities
Posted 1 day ago
7.0 years
0 Lacs
India
Remote
Job Title: Tech Lead – Software Engineer (LLM Evaluation & Dataset Creation)
Location: Remote
Employment Type: Contractor Assignment (No medical/paid leave)
Start Date: Immediate (Max Notice Period: 1 week)
Contract Duration: 3 Months
Work Commitment:
• First Priority: 40 hrs/week with PST overlap (No dual employment allowed)
• Second Priority: 30 hrs/week with PST overlap
• Third Priority: 20 hrs/week with PST overlap (Part-time/dual employment allowed)

About the Project:
We are hiring for our client, who is building high-quality LLM evaluation and training datasets to enable large language models (LLMs) to solve realistic software engineering (SWE) problems. The project involves synthetic data generation with human-in-the-loop validation, using public GitHub repositories to construct diverse and verifiable SWE tasks. The goal is to create datasets that span a variety of programming languages, complexity levels, and task types, pushing the boundaries of how LLMs interact with real-world code.

About the Role:
We are seeking experienced software engineers (Tech Lead level) who are well-versed in open-source codebases and capable of contributing to this innovative project. This is a hands-on engineering role combining real-world development with AI research, giving you a front-row seat to the evolution of AI-assisted coding.

Why Join This Project?
This opportunity places you at the forefront of how AI models interact with real-world software, blending practical engineering with experimental LLM evaluation. You’ll work on impactful use cases that shape the future of AI-assisted software development.
Day-to-Day Responsibilities:
• Analyze and triage GitHub issues from trending open-source repositories
• Set up and configure codebases with Docker, development environment automation, and dependency resolution
• Evaluate unit test coverage and quality
• Run and modify real-world codebases to assess LLM performance in bug-fixing scenarios
• Collaborate with AI researchers to identify and curate challenging coding tasks
• Contribute to repository selection, issue evaluation, and test scenario generation
• Option to lead and guide a team of junior engineers

Must-Have Skills & Experience:
• Minimum 5–7 years of professional software engineering experience
• At least 3 years of Java development experience
• Strong experience with Git, Docker, and local development environments
• Ability to navigate, modify, and test complex open-source codebases
• Excellent problem-solving and debugging skills
• Comfortable working independently and collaborating in research-driven environments

Nice to Have:
• Previous involvement in LLM research or evaluation projects
• Contributions to open-source projects
• Experience in building or testing developer tools, code analyzers, or automation frameworks
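The workflow this posting describes (pairing a real GitHub issue with a reproducible environment and a mechanically checkable test) can be sketched roughly as follows. This is an illustrative assumption, not the client's actual schema: the field names, the `is_verifiable` helper, and the example repo/test identifiers are all invented.

```python
from dataclasses import dataclass, field

@dataclass
class SWETask:
    """One verifiable SWE task: a real issue plus the tests that decide success."""
    repo: str                # e.g. a public GitHub repo slug (hypothetical here)
    issue_id: int            # issue number the candidate fix should resolve
    base_commit: str         # commit the model starts from
    fail_to_pass: list = field(default_factory=list)  # tests that must flip fail -> pass
    pass_to_pass: list = field(default_factory=list)  # regression tests that must keep passing

def is_verifiable(task: SWETask) -> bool:
    """A task is only useful for evaluation if success is mechanically checkable."""
    return bool(task.repo and task.base_commit and task.fail_to_pass)

task = SWETask("example-org/example-repo", 1234, "abc123",
               fail_to_pass=["tests/test_parser.py::test_edge_case"])
print(is_verifiable(task))  # → True
```

The point of the `fail_to_pass` / `pass_to_pass` split is that an LLM's patch can be graded automatically: run the suite before and after, with no human judgment in the loop.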
Posted 1 day ago
6.0 years
0 Lacs
Pune, Maharashtra, India
On-site
What You’ll Work On

1. Deep Learning & Computer Vision
• Train models for image classification (binary/multi-class) using CNNs, EfficientNet, or custom backbones
• Implement object detection using YOLOv5, Faster R-CNN, SSD; tune NMS and anchor boxes for medical contexts
• Work with semantic segmentation models (UNet, DeepLabV3+) for region-level diagnostics (e.g., cell, lesion, or nucleus boundaries)
• Apply instance segmentation (e.g., Mask R-CNN) for microscopy image cell separation
• Use super-resolution and denoising networks (SRCNN, Real-ESRGAN) to enhance low-quality inputs
• Develop temporal comparison pipelines for changes across image sequences (e.g., disease progression)
• Leverage data augmentation libraries (Albumentations, imgaug) for low-data domains

2. Vision-Language Models (VLMs)
• Fine-tune CLIP, BLIP, LLaVA, GPT-4V to generate explanations, labels, or descriptions from images
• Build image captioning models (Show-Attend-Tell, Transformer-based) using paired datasets
• Train or use VQA pipelines for image-question-answer triples
• Align text and image embeddings with contrastive loss (InfoNCE), cosine similarity, or projection heads
• Design prompt-based pipelines for zero-shot visual understanding
• Evaluate using metrics like BLEU, CIDEr, SPICE, Recall@K

3. Model Training, Evaluation & Interpretation
• Use PyTorch (core), with support from HuggingFace, torchvision, timm, Lightning
• Track model performance with TensorBoard, Weights & Biases, MLflow
• Implement cross-validation, early stopping, LR schedulers, warm restarts
• Visualize model internals using GradCAM, SHAP, attention rollout
• Evaluation metrics:
  • Classification: Accuracy, ROC-AUC, F1
  • Segmentation: IoU, Dice coefficient
  • Detection: mAP
  • Captioning/VQA: BLEU, METEOR

4. Optimization & Deployment
• Convert models to ONNX, TorchScript, or TFLite for portable inference
• Apply quantization-aware training, post-training quantization, and pruning
• Optimize for low-power inference using TensorRT or OpenVINO
• Build multi-threaded or asynchronous pipelines for batched inference

5. Edge & Real-Time Systems
• Deploy models on Jetson Nano/Xavier, Coral TPU
• Handle real-time camera inputs using OpenCV, GStreamer and apply streaming inference
• Handle multiple camera/image feeds for simultaneous diagnostics

6. Regulatory-Ready AI Development
• Maintain model lineage, performance logs, and validation trails for 21 CFR Part 11 and ISO 13485 readiness
• Contribute to validation reports, IQ/OQ/PQ, and reproducibility documentation
• Write SOPs and datasheets to support clinical validation of AI components

7. DevOps, CI/CD & MLOps
Use Azure Boards + DevOps Pipelines (YAML) to:
• Track sprints
• Assign tasks
• Maintain epics & user stories
• Trigger auto-validation pipelines (lint, unit tests, inference validation) on code push
• Integrate MLflow or custom logs for model lifecycle tracking
• Use GitHub Actions for cross-platform model validation across environments

8. Bonus Skills (Preferred but Not Mandatory)
• Experience with microscopy or pathology data (TIFF, NDPI, DICOM formats)
• Knowledge of OCR + CV hybrid pipelines for slide/dataset annotation
• Experience with Streamlit, Gradio, or Flask for AI UX prototyping
• Understanding of active learning or semi-supervised learning in low-label settings
• Exposure to research publishing, IP filing, or open-source contributions

9. Required Background
• 4–6 years in applied deep learning (post academia)
• Strong foundation in: Python + PyTorch; CV workflows (classification, detection, segmentation); Transformer architectures & attention; VLMs or multimodal learning
• Bachelor’s or Master’s degree in CS, AI, EE, Biomedical Engg, or related field
How to Apply
Send the following to info@sciverse.co.in
Subject: Application – AI Research Engineer (4–8 Yrs, CV + VLM)
Include:
• Your updated CV
• GitHub / Portfolio
• A short write-up on a model or pipeline you built and why you’re proud of it
Or apply directly via LinkedIn — but email applications get faster visibility.
Let’s build AI that sees, understands, and impacts lives.
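For a concrete flavor of the segmentation metrics this posting names (IoU and the Dice coefficient), here is a minimal pure-Python sketch over flat binary masks. It is only illustrative: a real pipeline would use tensor operations over 2D masks, and the mask values below are made up.

```python
def iou(pred, target):
    """Intersection over Union for two binary masks (flat lists of 0/1)."""
    inter = sum(p & t for p, t in zip(pred, target))
    union = sum(p | t for p, t in zip(pred, target))
    return inter / union if union else 1.0  # two empty masks count as a perfect match

def dice(pred, target):
    """Dice coefficient: 2|A∩B| / (|A|+|B|); related to IoU by D = 2*IoU/(1+IoU)."""
    inter = sum(p & t for p, t in zip(pred, target))
    total = sum(pred) + sum(target)
    return 2 * inter / total if total else 1.0

pred   = [1, 1, 0, 0, 1, 0]
target = [1, 0, 0, 0, 1, 1]
print(round(iou(pred, target), 3))   # intersection=2, union=4 → 0.5
print(round(dice(pred, target), 3))  # 2*2 / (3+3) → 0.667
```

Note that Dice always scores at least as high as IoU on the same masks, which is why papers report them separately even though each determines the other.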
Posted 1 day ago
5.0 years
0 Lacs
Chandigarh, India
On-site
At Adeptiv.AI, we're building the most advanced AI Governance Platform for enterprises. Our flagship Real-Time Evaluation module empowers businesses to test, evaluate, and trust their AI systems. We're now expanding this module to support ML models, explainability frameworks, and diverse AI use cases, and we're looking for a Senior AI Evaluation Engineer (subject matter expert) to lead this transformation. This role is ideal for someone who lives and breathes AI/ML evaluation, loves digging deep into models, and can bridge the gap between theory and production-grade software.

Key Responsibilities:
• Design and lead the implementation of evaluation frameworks for ML and Gen AI systems
• Define and guide the evaluation of AI/ML metrics such as Accuracy, Precision, Recall, AUC, BLEU, ROUGE, METEOR, etc.
• Develop strategies for model robustness, bias detection, and fairness evaluations
• Implement tools like SHAP, LIME, Captum, DeepChecks, Foolbox, Evidently AI, Alibi Detect, etc.
• Define pipelines for automated test case execution, continuous evaluation, and report generation
• Guide and mentor full-stack and backend engineers in integrating AI/ML testing logic into production-ready services
• Establish standards for test dataset generation, edge-case simulation, and benchmarking
• Validate the correctness of evaluations across supported AI use cases
• Stay ahead of the curve on emerging research in AI evaluations and bring insights into the product

Must-Have Skills & Experience:
• 5+ years in AI/ML focused on the evaluation and testing of AI/ML systems
• Deep expertise in traditional ML evaluation, Computer Vision, and Generative AI metrics
• Strong familiarity with explainability tools (SHAP, LIME, Integrated Gradients, etc.)
• Experience evaluating models in one or more domains: NLP, Computer Vision, Tabular Data, Reinforcement Learning
• Hands-on experience with libraries like scikit-learn, Hugging Face transformers, OpenAI, LangChain, TorchMetrics, Evidently, etc.
• Experience working in collaboration with engineering teams to productize evaluation pipelines
• Strong Python development and scripting capabilities
• Solid understanding of AI reliability, robustness, fairness, and auditability

Good to Have:
• Experience with LLM evaluation, hallucination detection, and prompt scoring
• Prior contributions to AI testing or monitoring tools or open-source projects
• Understanding of MLOps/LLMOps workflows
• Familiarity with CI/CD of model evaluations in production
• Awareness of AI compliance and audit frameworks (like the EU AI Act, NIST AI RMF)

What You'll Bring
• A rigorous scientific mindset, but with a builder's attitude
• A passion to make AI trustworthy for enterprises
• Strong communication skills to work cross-functionally with product & engineering
• High ownership to shape a strategic product module from scratch

Why Join Us?
• Be part of a cutting-edge product solving real challenges in AI Governance
• Work directly with the founding team and make a massive impact in enterprises
• Opportunity to influence the future of AI evaluation and reliability
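To make the classification metrics named above concrete (Precision, Recall, and their harmonic mean F1), here is a minimal pure-Python sketch. In practice an evaluation engineer would reach for scikit-learn or TorchMetrics rather than hand-roll this; the label lists below are invented.

```python
def precision_recall_f1(y_true, y_pred, positive=1):
    """Compute precision, recall, and F1 for one class from paired label lists."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p == positive)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t != positive and p == positive)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == positive and p != positive)
    precision = tp / (tp + fp) if tp + fp else 0.0   # of predicted positives, how many were right
    recall = tp / (tp + fn) if tp + fn else 0.0      # of actual positives, how many were found
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return precision, recall, f1

y_true = [1, 0, 1, 1, 0, 1]
y_pred = [1, 1, 1, 0, 0, 1]
p, r, f = precision_recall_f1(y_true, y_pred)
print(p, r, f)  # → 0.75 0.75 0.75  (3 TP, 1 FP, 1 FN)
```

F1 is deliberately the harmonic rather than arithmetic mean: a model with perfect recall but near-zero precision (or vice versa) gets a near-zero F1, which is usually the behavior an evaluation framework wants to penalize.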
Posted 1 day ago
0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
Dear Future Colleague,

Thank you for your interest in working with us. We welcome you to explore our firm and together evaluate if a shared journey awaits us. At its core, XQtiv is a global talent advisory firm. We are dedicated to solving complex talent problems for our clients, from hiring top-notch talent to evaluating current and incumbent leaders. But the advent of AI is changing our methods and ways of working in a significant manner. We are investing to build a TALENT INTELLIGENCE PLATFORM that will significantly impact how we go to market and execute leadership searches.

As an Associate - Talent Intelligence at XQtiv, you will play a crucial role in this transition. While being part of the Search team, where you will execute global search mandates, you will also build high-quality candidate datasets to be deployed on the platform. Once you have matured as a search consultant, you will pivot to deploying your expertise and depth in platform adoption on the dataset you have built and taking it to market. This is an on-site role in our office in Powai, Mumbai.

Specifically, we seek expertise in:
• Strategic Research and Analysis
• Project Management
• Talent Assessment and Evaluation
• Relationship Building
• Support for Business Development

Table Stakes:
• Microsoft Office, particularly PowerPoint, Excel, Power BI, etc.
• Adept in research and insight generation/synthesis
• Understanding of business fundamentals as well as complex business contexts
• Professional maturity to interact with senior executives
• Comfort with ambiguity and changing priorities

What you can expect:
• A rewarding career and a platform to leverage your skills for achieving significant outcomes
• Remuneration with incentives
• An educative and empowering atmosphere
• A welcoming team invested in your growth!

We look forward to speaking with you to understand how we can positively impact the careers of our candidates and shape the future of our clients.
Posted 1 day ago
0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Job Description
GalaxEye is a Space-Tech startup at the forefront of innovation: we are pioneering the world's first multi-sensor Earth observation satellite and aiming to build a constellation of indigenous micro-satellites with advanced data fusion capabilities. Our upcoming satellite mission, Drishti, slated for launch in 2025, will set a global benchmark as the “World's First Multi-Sensor Satellite”, integrating both SAR (Synthetic Aperture Radar) and MSI (Multi-Spectral Imaging) sensors on a single platform. As a member of the Image Processing team, you will research, design, implement, optimize, and deploy image processing algorithms to evolve our AI ecosystem continuously. The usual day-to-day tasks include developing the optical processing pipeline for various platforms, implementing new image processing algorithms, calibrating the sensors, fine-tuning them on our data, reviewing performance metrics, and integrating and optimising the algorithms to run efficiently on our platform while continuously iterating and improving based on real-world performance.

Responsibilities
• Select, test, implement, and optimize image processing and image analysis using both classical and deep learning methods to extract relevant signals from imaging data
• Design and implement algorithms for image processing, including filtering, alignment, segmentation, and feature extraction, efficiently across multi-terabyte datasets
• Research and fine-tune the latest image processing algorithms on our dataset to run efficiently while optimising based on real-time performance
• Write robust, well-documented, and well-tested code libraries that adhere to community standards and best practices
• Develop CI/CD pipeline patterns and best practices
• Collaborate with a team of data scientists, DevOps, data engineers, and ML engineers

Requirements
• Strong foundation in mathematical statistics
• Logical thinking and the ability to comprehend key facts
• Background in basic image processing or basic signal processing is preferred
• Experience with data science tools including Python scripting, CUDA, NumPy, SciPy, matplotlib, scikit-learn, bash scripting, and the Linux environment
• Proficiency in Python and C++ programming is a must
• Preferred: experience in remote sensing or related fields
• Ability to collaborate effectively across the team
• A detail-oriented team member who can consistently meet deadlines

Benefits
• Be a part of India’s leading space-tech satellite technology company
• Work closely with new-age leaders and gain hands-on experience in strategy and operations
• Thrive in a collaborative and growth-oriented startup culture
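The filtering step in the responsibilities above can be illustrated with a tiny 2D box (mean) filter in pure Python. This is a toy sketch only: a production optical pipeline would use vectorized NumPy/SciPy (or CUDA) convolutions, and the "image" below is an invented grid of numbers.

```python
def box_filter3(img):
    """3x3 mean (box) filter over a 2D grid, leaving the 1-pixel border unchanged.

    A minimal stand-in for the smoothing/denoising stage of an image
    processing pipeline; each interior pixel becomes the average of its
    3x3 neighborhood.
    """
    h, w = len(img), len(img[0])
    out = [row[:] for row in img]  # copy so border pixels pass through untouched
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            out[y][x] = sum(img[y + dy][x + dx]
                            for dy in (-1, 0, 1) for dx in (-1, 0, 1)) / 9
    return out

noisy = [
    [0, 0, 0, 0],
    [0, 9, 0, 0],
    [0, 0, 0, 0],
]
print(box_filter3(noisy)[1][1])  # → 1.0 (the lone 9 is averaged over its 3x3 window)
```

The same nested-loop structure generalizes to any convolution kernel, which is why "filtering" and "feature extraction" in such pipelines are often the same operation with different weights.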
Posted 1 day ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Job Description
Principal Software Development Engineer – Ticketing Platform Engineering

At Oracle Cloud Infrastructure (OCI), we build the future of the cloud for enterprises as a diverse team of fellow creators and inventors. We act with the speed and attitude of a start-up, with the scale and customer focus of the leading enterprise software company in the world. Values are OCI’s foundation and how we deliver excellence. We strive for equity, inclusion, and respect for all. We are committed to the greater good in our products and our actions. We are constantly learning and taking opportunities to grow our careers and ourselves. We challenge each other to stretch beyond our past to build our future. You are the builder here. You will be part of a team of experienced, motivated, and diverse people and given the autonomy and support to do your best work. It is a dynamic and flexible workplace where you’ll belong and be encouraged.

About Oracle Cloud Infrastructure (OCI)
The next 5-10 years will see the software industry move from on-premise software solutions towards software delivery via cloud-based services. Oracle’s number one strength is the size and strength of our software portfolio, which most of the business world heavily uses. Our vision is to bring our customers to the Cloud. From our app server and development tools to our database and people management software, Oracle already has the core infrastructure that all of this is built on. The future is about providing higher-level services that companies can consume on demand. Oracle is now recognized as a significant hyper-scale cloud provider offering hybrid capabilities, bridging the gap between on-premise infrastructure and the Cloud to manage workloads in a secure, efficient, and cost-effective way.

Who are we looking for?
The Oracle Ticketing Platform Engineering group is looking for a Principal Software Development Engineer with expertise and passion for building and growing services within a large-scale global cloud environment. Our mission is to provide critical tools and experiences to OCI operations and service teams across Oracle. This role will be responsible for designing and implementing large, complex, and deeply integrated tooling services for Oracle Cloud Infrastructure. OCI is central to multiple businesses within Oracle, and cross-organizational collaboration is common. We are looking for a deeply technical engineer who can thrive in this environment and provide designs and technical standards to implement large, complex, multi-system integrations and platforms that will modernize how we do business as an enterprise. We are looking for a candidate with a proven track record and the ability to lead execution while the product suite is growing and evolving. An innovator mentality is a must; the ability to roll up your sleeves and create designs, drive standards of implementation, create and streamline complex integration patterns, and write code is expected. You are the technical champion for our organization, and you will be responsible for creating our technical strategy for the ever-evolving product suite. You are a technology leader. You will influence the decisions of senior business leaders through effective verbal and written communication, logical reasoning, and the presentation of key timelines and service evolution. You will mentor and grow the technical talent within the organization, and you will ensure the technical roadmap is adhered to as we integrate our service across OCI.
Responsibilities Include:
• Work with senior architects and product management to define requirements
• Design and implement new features
• Define and guide engineering processes and procedures
• Review code written by your peers to ensure correctness and fit with the team's overall design principles
• Work with the team to operate services that host massive amounts of data
• Solid technically: build and improve component design/code for efficiency, performance, simplicity, scale, and resiliency
• Acumen for test coverage, observability, availability, durability
• Cloud infra operations: LSE mitigation experience, CAPA, observability improvements, tooling
• Collaborate with the OCI engineering communities

Minimum qualifications:
• 10+ years of experience architecting, designing, and implementing enterprise server-side Java applications
• Proven success in building high-volume, low-latency cloud services using the latest cloud-native capabilities, Java/J2EE technologies, and open-source frameworks
• Experience in building secure applications using J2EE, role-based authorization, and Single Sign-On
• 5+ years of experience working with leading cloud platforms such as OCI, AWS, and Azure, and managing services in production
• Experience with infrastructure-as-code tools like Terraform and Ansible
• An exceptional communicator who can write, present, and effectively adjust messages to meet individual audiences and organizations
• Solid organization, communication, and interpersonal skills
• Experience in mentoring and growing developers, with many members located remotely and others that support success through a matrixed environment
• Strong understanding of database systems

EDUCATION
A Computer Science degree is preferred. An advanced degree is a plus.
Career Level - IC4

About Us
As a world leader in cloud solutions, Oracle uses tomorrow’s technology to tackle today’s challenges.
We’ve partnered with industry leaders in almost every sector—and continue to thrive after 40+ years of change by operating with integrity. We know that true innovation starts when everyone is empowered to contribute. That’s why we’re committed to growing an inclusive workforce that promotes opportunities for all. Oracle careers open the door to global opportunities where work-life balance flourishes. We offer competitive benefits based on parity and consistency and support our people with flexible medical, life insurance, and retirement options. We also encourage employees to give back to their communities through our volunteer programs. We’re committed to including people with disabilities at all stages of the employment process. If you require accessibility assistance or accommodation for a disability at any point, let us know by emailing accommodation-request_mb@oracle.com or by calling +1 888 404 2494 in the United States. Oracle is an Equal Employment Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, sexual orientation, gender identity, disability and protected veterans’ status, or any other characteristic protected by law. Oracle will consider for employment qualified applicants with arrest and conviction records pursuant to applicable law.
Posted 1 day ago
3.0 years
5 - 8 Lacs
Gurgaon
On-site
DESCRIPTION
Alexa Shopping Operations strives to become the most reliable source for dataset generation and annotations. We work in collaboration with Shopping feature teams to enhance customer experience (CX) quality across shopping features, devices, and locales. Our primary focus lies in handling annotations for training, measuring, and improving Artificial Intelligence (AI) and Large Language Models (LLMs), enabling Amazon to deliver a superior shopping experience to customers worldwide. Our mission is to empower Amazon's LLMs through Reinforcement Learning from Human Feedback (RLHF) across various categories at high speed. We aspire to provide an end-to-end data solution for the LLM lifecycle, leveraging cutting-edge technology alongside our operational excellence. By joining us, you will play a pivotal role in shaping the future of the shopping experience for customers worldwide.

Key job responsibilities
The candidate actively seeks to understand Amazon’s core business values and initiatives, and translates those into everyday practices.
Some of the key result areas include, but are not limited to:
• Managing process and operational escalations
• Driving appropriate data-oriented analysis, adoption of technology solutions, and process improvement projects to achieve operational and business goals
• Managing stakeholder communication across multiple lines of business on operational milestones, process changes, and escalations
• Communicating and taking the lead role in identifying gaps in process areas, and working with all stakeholders to resolve them
• Being an SME for the process and a referral point for peers and junior team members
• Driving business/operational metrics through quantitative decision making and adoption of different tools and resources
• Meeting deadlines in a fast-paced work environment driven by complex software systems and processes
• Performing deep dives into the process and coming up with process improvement solutions
• Collaborating effectively with other teams, subject matter experts (SMEs), and Language Engineers (LaEs) to support launches of new processes and services

BASIC QUALIFICATIONS
• A Bachelor’s degree and relevant work experience of 3+ years
• Excellent level of English and Spanish, C1 level or above
• Demonstrated ability to analyze and interpret complex SOPs
• Excellent problem-solving skills with a proactive approach to identifying and implementing process improvements
• Strong communication and interpersonal skills to effectively guide and mentor associates
• Ability to work collaboratively with cross-functional teams
• Thorough understanding of multiple SOPs and adherence to established processes
• Ability to identify areas for process improvement and SOP enhancement, and to develop actionable plans for implementation
• Ability to lead and participate in process improvement initiatives
• Comfortable working in a fast-paced, highly collaborative, dynamic work environment
• Willingness to support several projects at one time, and to accept re-prioritization as necessary
• Adaptive to change and able to work in a fast-paced environment

PREFERRED QUALIFICATIONS
• Experience with Artificial Intelligence interaction, such as prompt generation

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Job details
IND, HR, Gurugram
Editorial, Writing, & Content Management
Posted 2 days ago
1.0 years
2 - 3 Lacs
Chennai
On-site
DESCRIPTION
Alexa Shopping Operations strives to become the most reliable source for dataset generation and annotations. We work in collaboration with Shopping feature teams to enhance customer experience (CX) quality across shopping features, devices, and locales. Our primary focus lies in handling annotations for training, measuring, and improving Artificial Intelligence (AI) and Large Language Models (LLMs), enabling Amazon to deliver a superior shopping experience to customers worldwide. Our mission is to empower Amazon's LLMs through Reinforcement Learning from Human Feedback (RLHF) across various categories at high speed. We aspire to provide an end-to-end data solution for the LLM lifecycle, leveraging the latest technology alongside our operational excellence. By joining us, you will play a pivotal role in shaping the future of the shopping experience for customers worldwide.

Key job responsibilities
• Process annotation & data analysis tasks with high efficiency and quality in a fast-paced environment
• Provide floor support to the Operations Manager in running day-to-day operations, working closely with Data Associates
• Handle work prioritization and deliver based on business needs
• Track and report ops metrics and ensure delivery on all KPIs and SLAs
• Work closely with your team members and managers to drive process efficiencies and explore opportunities for automation
• Strive to enhance the productivity and effectiveness of the data generation and annotation processes
• The tasks will be primarily repetitive in nature and will require the individual to make judgment-based decisions keeping in mind the guidelines provided in the SOP

BASIC QUALIFICATIONS
• Graduate or equivalent (up to 1 year of experience)
• Demonstrated proficiency in the French language: verbal, writing, reading, and comprehension
• Required language level: B2.2/BA/Advanced Diploma
• Good English language proficiency: verbal, writing, reading, and comprehension
• Strong analytical and communication skills
• Passion for delivering a positive customer experience and maintaining composure in difficult situations
• Ability to effectively and efficiently complete challenging goals or assignments within defined SLAs

PREFERRED QUALIFICATIONS
• Basic level of Excel knowledge
• Familiarity with online retail (e-commerce industry)
• Previous experience as an AI trainer; knowledge of AI and NLP
• Experience with Artificial Intelligence interaction, such as prompt generation and OpenAI tools
• Experience in content or editorial writing

Our inclusive culture empowers Amazonians to deliver the best results for our customers. If you have a disability and need a workplace accommodation or adjustment during the application and hiring process, including support for the interview or onboarding process, please visit https://amazon.jobs/content/en/how-we-hire/accommodations for more information. If the country/region you’re applying in isn’t listed, please contact your Recruiting Partner.

Job details
IND, TN, Chennai
IND, HR, Gurugram
Editorial, Writing, & Content Management
Posted 2 days ago
0 years
0 Lacs
Bengaluru East, Karnataka, India
On-site
Primary skills: Technology->Artificial Intelligence->Computer Vision, Technology->Big Data->Natural Language Processing (NLP), Technology->Machine Learning->Python
Data collection and preparation: Collect and prepare data for training and evaluating LLMs. This may involve cleaning and processing text data, or creating synthetic data.
Model development: Design and implement LLM models. This may involve choosing the right architecture, training the model, and tuning the hyperparameters.
Model evaluation: Evaluate the performance of LLM models. This may involve measuring the accuracy of the model on a held-out dataset, or assessing the quality of the generated text.
Model deployment: Deploy LLM models to production. This may involve packaging the model, creating a REST API, and deploying the model to a cloud computing platform.
Responsible AI: Should have proficient knowledge of Responsible AI and data privacy principles to ensure ethical data handling, transparency, and accountability in all stages of AI development. Must demonstrate a commitment to upholding privacy standards, mitigating bias, and fostering trust within data-driven initiatives.
Experience working with ML toolkits such as R, NumPy, and MATLAB
Experience in data mining, statistical analysis, and data visualization
Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase their business profitability
Good knowledge of software configuration management systems
Awareness of the latest technologies and industry trends
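The held-out evaluation step described above can be sketched in plain Python: shuffle labeled examples, split off a held-out set, and score the model only on examples it never saw during training. The `toy_model`, the sample data, and the function names below are hypothetical illustrations, not part of the posting; in practice the model under test would be the trained LLM and the metric might be far richer than accuracy.

```python
import random

def train_test_split(examples, test_frac=0.2, seed=42):
    """Shuffle labeled (input, label) pairs and split off a held-out set."""
    rng = random.Random(seed)  # fixed seed keeps the split reproducible
    shuffled = list(examples)
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * (1 - test_frac))
    return shuffled[:cut], shuffled[cut:]

def held_out_accuracy(model, held_out):
    """Fraction of held-out examples the model labels correctly."""
    correct = sum(1 for text, label in held_out if model(text) == label)
    return correct / len(held_out)

# Toy stand-in model: classify a prompt as a question or a statement.
def toy_model(text):
    return "question" if text.strip().endswith("?") else "statement"

examples = [("what is rlhf?", "question"), ("the sky is blue.", "statement")] * 10
train_set, held_out = train_test_split(examples)
print(held_out_accuracy(toy_model, held_out))
```

The key discipline the posting alludes to is that the held-out set is never used for training or tuning; otherwise the reported accuracy overstates real performance.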
Posted 2 days ago
3.0 years
0 Lacs
Coimbatore, Tamil Nadu, India
Remote
Job Title: Python Backend Developer - FastAPI, PostgreSQL, Pattern Recognition
Location: Remote / Hybrid
Type: Full-time
Experience: 3 to 7 Years
Compensation: USDT only | Based on skill and performance
About Us: We are a cutting-edge fintech startup building an AI-powered trading intelligence platform that integrates technical analysis, machine learning, and real-time data processing. Our systems analyze financial markets using custom algorithms to detect patterns, backtest strategies, and deploy automated insights. We're seeking a skilled Python Backend Developer experienced in FastAPI, PostgreSQL, pattern recognition, and financial data workflows.
Key Responsibilities
Implement detection systems for chart patterns, candlestick patterns, and technical indicators (e.g., RSI, MACD, EMA)
Build and scale high-performance REST APIs using FastAPI for real-time analytics and model communication
Develop semi-automated pipelines to label financial datasets for supervised/unsupervised ML models
Implement and maintain backtesting engines for trading strategies using Python and custom simulation logic
Design and maintain efficient PostgreSQL schemas for storing candle data, trades, indicators, and model metadata
(Optional) Contribute to frontend integration using Next.js/React for analytics dashboards and visualizations
Key Requirements
Python (3-7 years): Strong programming fundamentals, algorithmic thinking, and deep Python ecosystem knowledge
FastAPI: Proven experience building scalable REST APIs
PostgreSQL: Schema design, indexing, complex queries, and performance optimization
Pattern Recognition: Experience in chart/candlestick detection, TA-Lib, rule-based or ML-based identification systems
Technical Indicators: Familiarity with RSI, Bollinger Bands, Moving Averages, and other common indicators
Backtesting Frameworks: Hands-on experience with custom backtesting engines or libraries like Backtrader, PyAlgoTrade
Data Handling: Proficiency in NumPy, Pandas, and dataset
preprocessing/labeling techniques
Version Control: Git/GitHub - comfortable with collaborative workflows
Bonus Skills
Experience in building dashboards with Next.js / React
Familiarity with Docker, Celery, Redis, Plotly, or TradingView Charting Library
Previous work with financial datasets or real-world trading systems
Exposure to AI/ML model training, SHAP/LIME explainability tools, or reinforcement learning strategies
Ideal Candidate
Passionate about financial markets and algorithmic trading systems
Thrives in fast-paced, iterative development environments
Strong debugging, data validation, and model accuracy improvement skills
Collaborative mindset - able to work closely with quants, frontend developers, and ML engineers
What You'll Get
Opportunity to work on next-gen fintech systems with real trading applications
Exposure to advanced AI/ML models and live market environments
Competitive salary + performance-based bonuses
Flexible working hours in a remote-first team
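As a concrete illustration of the indicator work this posting describes, here is a minimal RSI with Wilder's smoothing in dependency-free Python. The function name and the 14-period default are illustrative assumptions; production code in this stack would more likely lean on Pandas/NumPy or TA-Lib, as listed in the requirements.

```python
def rsi(closes, period=14):
    """Relative Strength Index (Wilder's smoothing) of the last bar.

    `closes` must hold at least period + 1 prices. By convention,
    an all-gain window maps to 100 and an all-loss window to 0.
    """
    gains, losses = [], []
    for prev, cur in zip(closes, closes[1:]):
        change = cur - prev
        gains.append(max(change, 0.0))
        losses.append(max(-change, 0.0))
    # Seed the averages with a simple mean over the first `period` bars,
    # then apply Wilder's recursive smoothing for the remaining bars.
    avg_gain = sum(gains[:period]) / period
    avg_loss = sum(losses[:period]) / period
    for gain, loss in zip(gains[period:], losses[period:]):
        avg_gain = (avg_gain * (period - 1) + gain) / period
        avg_loss = (avg_loss * (period - 1) + loss) / period
    if avg_loss == 0.0:
        return 100.0
    return 100.0 - 100.0 / (1.0 + avg_gain / avg_loss)
```

A strictly rising series yields 100, a strictly falling one 0, and the value always stays within [0, 100]; these invariants make handy unit tests for a detection pipeline.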
Posted 2 days ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Biostate.AI - Frontend Engineer
Location: Remote or Onsite (Bengaluru, India)
Job Type: Full-time
Experience Level: Mid-Senior
About The Role
Key Responsibilities
Build and maintain front-end applications using React, Angular, and Next.js.
Implement responsive, user-friendly interfaces with minimal design hand-holding.
Translate product requirements and user needs into interactive features.
Write clean, reusable, and scalable code following best practices.
Optimize performance and ensure cross-browser compatibility and mobile responsiveness.
Work collaboratively with product managers, engineers, and occasionally designers.
Stay current with modern front-end development trends and tools.
Required Qualifications
3+ years of front-end development experience in production environments.
Proficient in JavaScript (ES6+), TypeScript, HTML5, and CSS3/SASS.
Experienced in frontend development with expertise in routing, dependency injection (DI), component-based architecture, state management, form handling, and module-based application design.
Strong working knowledge of: React.js, including hooks and component patterns; Angular (v10+), especially its modular and service-oriented architecture; Next.js, including server-side rendering and static site generation.
Demonstrated ability to design and implement interfaces independently with attention to usability and visual polish.
Familiarity with RESTful and/or GraphQL APIs.
Strong debugging and troubleshooting skills.
Preferred Qualifications
Knowledge of accessibility standards (WCAG) and front-end security practices.
Background in startup environments or product-first teams.
About Biostate AI, Inc.
Biostate AI is building generative models for predicting the evolution of living organisms' health following drug dosing, with an emphasis on safety and toxicity.
By innovating new methodologies for collecting RNAseq and DNA methylation data at a fraction of the traditional cost, Biostate has curated a vast, proprietary, multi-species, multi-drug, multi-omics dataset. The company stands out for its patented tokenization and Universal Gene Embedding methods for transfer learning, enabling its generative AI models to forecast human and animal health states with unprecedented 1-day temporal resolution. Biostate AI is co-founded by David Zhang (former tenured professor at Rice University, former founder/CEO of NuProbe) and Ashwin Gopinath (former MIT professor, former founder/CTO of Palamedrix). Our individual investors include Dario Amodei (CEO of Anthropic), Joris Poort (CEO of Rescale), Michael Schnall-Levin (CTO of 10X Genomics), and Emily Leproust (CEO of Twist Biosciences), all leaders in fields relevant to our work. Institutional investors in Biostate include Accel Ventures, Matter Venture Partners, Vision Plus Capital, and Caltech Fund. Our Culture At Biostate AI, our core values are Honesty, Discipline, Efficiency, Initiative, and Meritocracy. Team members are encouraged and expected to take ownership of their projects, evaluate and decide effort allocation among multiple projects, continuously learn in an inter-disciplinary manner, provide direct feedback to management, continually challenge conventional assumptions and innovate from first principles, and contribute to the company's growth and success. We embrace a “move fast and break things” attitude, encouraging our team members to take calculated risks, experiment with new ideas, and learn from failures. We value the ability to move quickly, iterate on ideas, and push boundaries. Join Biostate.ai in our mission to revolutionize the interface of artificial intelligence and biological research.
Posted 2 days ago
4.0 years
0 Lacs
Ahmedabad, Gujarat, India
On-site
Line of Service Advisory
Industry/Sector Not Applicable
Specialism Data, Analytics & AI
Management Level Senior Associate
Job Description & Summary At PwC, our people in data and analytics focus on leveraging data to drive insights and make informed business decisions. They utilise advanced analytics techniques to help clients optimise their operations and achieve their strategic goals. In business intelligence at PwC, you will focus on leveraging data and analytics to provide strategic insights and drive informed decision-making for clients. You will develop and implement innovative solutions to optimise business performance and enhance competitive advantage.
Why PwC
At PwC, you will be part of a vibrant community of solvers that leads with trust and creates distinctive outcomes for our clients and communities. This purpose-led and values-driven work, powered by technology in an environment that drives innovation, will enable you to make a tangible impact in the real world. We reward your contributions, support your wellbeing, and offer inclusive benefits, flexibility programmes and mentorship that will help you thrive in work and life. Together, we grow, learn, care, collaborate, and create a future of infinite experiences for each other. Learn more about us.
At PwC, we believe in providing equal employment opportunities, without any discrimination on the grounds of gender, ethnic background, age, disability, marital status, sexual orientation, pregnancy, gender identity or expression, religion or other beliefs, perceived differences and status protected by law. We strive to create an environment where each one of our people can bring their true selves and contribute to their personal growth and the firm’s growth. To enable this, we have zero tolerance for any discrimination and harassment based on the above considerations.
Responsibilities:
Hands-on experience with Power BI dashboard development; willing to work as an individual contributor.
Clear understanding of data warehousing concepts.
Work closely with the data engineering team on data extraction and data transformation processes to create datasets.
Good experience with the different categories of DAX functions: time intelligence, filter, date, logical, text, and number/statistical functions.
Good experience with visual-level, page-level, report-level, and drill-through filters for filtering data in a report.
Experience implementing Row Level Security (RLS) in Power BI.
Work with the On-Premises Data Gateway to refresh and schedule refreshes of datasets.
Strong data transformation skills in Power Query Editor, with familiarity with the M language.
Data modelling knowledge, including joins across multiple tables and creating bridge tables.
Knowledge of Power BI Desktop features such as bookmarks, selections, sync slicers, and edit interactions.
Knowledge of Power BI Service features such as creating imports, scheduling extract refreshes, and managing subscriptions.
Publishing and maintenance of apps in Power BI, plus knowledge of configuring row-level and dashboard-level security in Power BI Service.
Experience creating and publishing reports in both web and mobile layouts.
Able to perform unit testing such as functionality testing and data validation.
Report performance optimization and troubleshooting.
Clear understanding of UI and UX design.
Hands-on working experience writing SQL queries.
Very good communication skills; must be able to discuss requirements effectively with business owners.
Mandatory skill sets: Power BI, DAX
Preferred skill sets: Power BI, DAX
Years of experience required: 4-8 Years
Educational Qualification: BE, B.Tech, MCA, M.Tech
Education (if blank, degree and/or field of study not specified)
Degrees/Field of Study required: Master of Business Administration, Bachelor of Technology, Bachelor of Engineering
Degrees/Field of Study preferred:
Certifications (if blank, certifications not specified)
Required Skills DAX Language, Power BI
Optional Skills Accepting Feedback, Active Listening, Analytical Thinking, Business Case Development, Business Data Analytics, Business Intelligence and Reporting Tools (BIRT), Business Intelligence Development Studio, Communication, Competitive Advantage, Continuous Process Improvement, Creativity, Data Analysis and Interpretation, Data Architecture, Database Management System (DBMS), Data Collection, Data Pipeline, Data Quality, Data Science, Data Visualization, Embracing Change, Emotional Regulation, Empathy, Inclusion, Industry Trend Analysis {+ 16 more}
Desired Languages (If blank, desired languages not specified)
Travel Requirements
Available for Work Visa Sponsorship?
Government Clearance Required?
Job Posting End Date
Posted 2 days ago
4.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Job title: Senior Analyst - Market Access Analytics (Pricing Analytics Team)
Hiring Manager: Manager/Team Lead/Group Lead
Location: Hyderabad
% of travel expected: Travel required as per business need, if any
Job type: Permanent and Full time
About The Job
Our Team: Sanofi Global Hub (SGH) is an internal Sanofi resource organization based in India, set up to centralize processes and activities to support Specialty Care, Vaccines, General Medicines, CHC, CMO, and R&D, Data & Digital functions. SGH strives to be a strategic and functional partner for tactical deliveries to Medical, HEVA, and Commercial organizations in Sanofi, globally.
Main Responsibilities
We are seeking a highly skilled and experienced individual to join our team as a senior analyst on the Pricing Analytics Team. This pivotal role will be responsible for spearheading all development activities related to pricing reporting solutions and market access analytics. The overall purpose and main responsibilities are listed below.
At Sanofi we are leveraging analytics and technology on behalf of patients around the world. We are seeking those who have a passion for using data, analytics, and insights to drive decision making that will allow us to tackle some of the world’s greatest health threats. Within our commercial Insights, Analytics, and Data organization we are transforming to better power decision-making across our end-to-end commercialization process, from business development to late lifecycle management. Deliverables support planning and decision making across multiple functional areas such as finance, manufacturing, product development and commercial. In addition to ensuring high-quality deliverables, our team drives synergies across the franchise, fosters innovation and best practices, and creates solutions to bring speed, scale and shareability to our planning processes.
As we endeavour, we are seeking a dynamic talent for the role of “Senior Analyst”. We are looking for a team member to support our analytics team based out of the US. Robust analytics is a priority for our businesses, as the product potential has major implications for a wide range of disciplines. It is essential to have someone who understands and aspires to implement innovative analytics techniques to drive our insights generation.
People: Maintain effective relationships with the end stakeholders within the allocated GBU and tasks, with an end objective to develop reports and analyses as per requirements. Collaborate with global stakeholders on project planning, setting up timelines, and maintaining the budget.
Performance indicators: Feedback from end stakeholders on overall satisfaction.
Performance: Ability to translate a business question into an analytical requirement and work on it to develop reports/decks with minimum supervision. Experience working on patient analytics reports and datasets such as LAAD, APLD, and IQVIA sales data. Collaborates with Digital to enhance data access across various sources, and develops tools and processes to constantly improve quality and productivity. Will assist in managing business rules, definitions, and KPIs for reporting and insight. Will ensure on-time and accurate delivery of all analytics and reporting requirements by collaborating with relevant stakeholders. Will ensure reports, decks, and metrics are maintained as per requirements. Proactively identifies analytical requirements.
Building advanced tools, automation, and/or improved processes for analytical and other needs
Performance indicators: Adherence to timeline and quality targets
Process: Secure adherence to compliance procedures and internal/operational risk controls in accordance with all applicable standards. Use the latest tools/technologies/methodologies and partner with internal teams to continuously improve data quality and availability by building business processes that support global standardization. Ability to work cross-functionally, gather requirements, analyse data, and generate insights and reports that can be used by the GBU.
Performance indicators: Feedback from stakeholders on satisfaction with deliverables
Stakeholder: Work closely with global teams and/or external vendors to ensure effective end-to-end project delivery of the designated publication/medical education deliverables. Work collaboratively with the stakeholder teams to prioritize work and deliver on time-sensitive requests.
Performance indicators: Feedback from stakeholders on satisfaction with deliverables
About You
Experience: 4+ years of relevant work experience with a solid understanding of principles, standards, and best practices in insight generation and storytelling from data analysis. In-depth knowledge of IQVIA, APLD, LAAD, Speciality Pharma and Distributor, Claims data, etc.
Soft skills: Strong learning agility; ability to manage ambiguous environments and to adapt to changing needs of the business; good interpersonal and communication skills; strong presentation skills a must; team player who is curious, dynamic, result-oriented, and can work collaboratively; ability to think strategically in an ambiguous environment; ability to operate effectively in an international matrix environment, with the ability to work across time zones; demonstrated leadership and management in driving innovation and automation leveraging advanced statistical and analytical techniques.
Technical skills:
Expert in relational database technologies and concepts
Proficient with pharmaceutical syndicated data sources (e.g. APLD, LAAD, Speciality Pharmacy and Distributor data)
Capable of prioritizing and handling multiple projects simultaneously
Excellent planning, design, project management and documentation skills
Excellent management of customer expectations, listening, and multi-tasking skills
Ability to take initiative, follow through, and meet deadlines as necessary while maintaining quality
Proficiency in programming languages such as SQL, Python, and R
Strong experience using analytical platforms (e.g., Snowflake)
Experience using analytical tools like Power BI and Tableau
Expert knowledge of Excel and PowerPoint, and proficiency in VBA
An aptitude for problem solving and strategic thinking
Ability to synthesize complex information into clear and actionable insights
Proven ability to work effectively across all levels of stakeholders and diverse functions
Solid understanding of pharmaceutical contracting entities and landscape (e.g.
Payers, GPOs, Buy and Bill)
Demonstrated leadership and management in driving innovation and automation leveraging advanced statistical and analytical techniques
Education: Bachelor’s or Master’s degree in areas such as Information Science/Operations/Management/Statistics/Decision Sciences/Engineering/Life Sciences/Business Analytics or a related field (e.g., PhD / MBA / Masters)
Languages: Excellent knowledge of English and strong communication skills, written and spoken
Other Requirement: This role is a sole contributor focused on development, delivery and communication of insights.
Pursue Progress. Discover Extraordinary. Join Sanofi and step into a new era of science - where your growth can be just as transformative as the work we do. We invest in you to reach further, think faster, and do what’s never been done before. You’ll help push boundaries, challenge convention, and build smarter solutions that reach the communities we serve. Ready to chase the miracles of science and improve people’s lives? Let’s Pursue Progress and Discover Extraordinary - together.
At Sanofi, we provide equal opportunities to all regardless of race, color, ancestry, religion, sex, national origin, sexual orientation, age, citizenship, marital status, disability, gender identity, protected veteran status or other characteristics protected by law.
Posted 2 days ago
5.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Ciklum is looking for a Senior Data Scientist to join our team full-time in India.
We are a custom product engineering company that supports both multinational organizations and scaling startups to solve their most complex business challenges. With a global team of over 4,000 highly skilled developers, consultants, analysts and product owners, we engineer technology that redefines industries and shapes the way people live.
About the role: As a Senior Data Scientist, become a part of a cross-functional development team working for a healthcare technology company that provides platforms and solutions to improve the management and access of cost-effective pharmacy benefits. Our technology helps enterprise and partnership clients simplify their businesses and helps consumers save on prescriptions. Our client is a leader in SaaS technology for healthcare. They offer innovative solutions with integrated intelligence on a single enterprise platform that connects the pharmacy ecosystem. With their expertise and modern, modular platform, our partners use real-time data to transform their business performance and optimize their innovative models in the marketplace.
Responsibilities:
Development of prototype solutions, mathematical models, algorithms, machine learning techniques, and robust analytics to support analytic insights and visualization of complex data sets
Perform exploratory data analysis, navigating a dataset and drawing broad conclusions from initial appraisals
Provide optimization recommendations that drive KPIs established by product, marketing, operations, PR teams, and others
Interact with engineering teams and ensure that solutions meet customer requirements in terms of functionality, performance, availability, scalability, and reliability
Work directly with business analysts and data engineers to understand and support their use cases
Work with stakeholders throughout the organization to identify opportunities for leveraging company data to drive business solutions
Drive innovation by exploring new experimentation methods and statistical techniques that could sharpen or speed up our product decision-making processes
Cross-train other team members on technologies being developed, while also continuously learning new technologies from other team members
Contribute to the unit’s activities and community building, participate in conferences, and champion best practices
Requirements: We know that sometimes, you can’t tick every box. We would still love to hear from you if you think you’re a good fit!
5+ years of development of Data Science solutions with a proven track record of leveraging analytics to drive significant business impact
Bachelor's/Master's degree in Mathematics, Statistics, Computer Science, Operations Research, Econometrics or a related field
Proven ability to relate and solve business problems through machine learning and statistics
4+ years of experience applying various machine learning techniques: regression, classification, clustering, dimensionality reduction, time series prediction, outlier detection, and/or recommendation systems
Understanding of the advantages and drawbacks of machine learning algorithms as well as their usage constraints, including performance
4+ years of experience in Python development of machine learning solutions and statistical analysis: libraries such as Pandas, SciPy, scikit-learn, XGBoost, LightGBM, statsmodels, and imbalanced-learn; ML frameworks such as TensorFlow and PyTorch; data wrangling and visualization (e.g., Pandas, NumPy, Matplotlib, Seaborn)
Experience in working with large-scale datasets, including time series and healthcare data
Experience with NLP, deep learning and GenAI
Experience diving into data to uncover hidden patterns and conducting error analysis
2+ years of experience in data visualization: Power BI, Tableau, and/or Python libraries like Matplotlib and Seaborn
Experience with SQL for data processing, data manipulation, sampling, and reporting
3+ years of experience creating/maintaining OOP Machine Learning solutions
Understanding of the CRISP-ML(Q) / TDSP concepts
1+ year of experience with MLOps: integration of reliable Machine Learning Pipelines in Production, Docker, containerization, orchestration
2+ years of experience with clouds (AWS, Azure, GCP) and cloud AI and ML services (e.g.
Amazon SageMaker, Azure ML)
Excellent time and project management skills, with the ability to manage detailed work and communicate project status effectively to all levels
Desirable:
Probability Theory & Statistics knowledge and intuition, as well as understanding of the mathematics behind Machine Learning
1+ year of experience in Deep Learning solution development with TensorFlow or PyTorch libraries
Data Science / Machine Learning certifications, or research experience with published papers
Experience with Kubernetes
Experience with Databricks, Snowflake platforms
1+ year of Big Data experience, i.e. Hadoop / Spark
Experience with NoSQL and/or columnar/graph databases
What's in it for you?
Care: your mental and physical health is our priority. We ensure comprehensive company-paid medical insurance, as well as financial and legal consultation
Tailored education path: boost your skills and knowledge with our regular internal events (meetups, conferences, workshops), Udemy license, language courses and company-paid certifications
Growth environment: share your experience and level up your expertise with a community of skilled professionals, locally and globally
Opportunities: we value our specialists and always find the best options for them. Our Resourcing Team helps change a project if needed to help you grow, excel professionally, and fulfil your potential
Global impact: work on large-scale projects that redefine industries with international and fast-growing clients
Welcoming environment: feel empowered with a friendly team, open-door policy, informal atmosphere within the company and regular team-building events
About us: India is a strategic growth market for Ciklum. Be a part of a big story created right now. Let’s grow our delivery center in India together! Boost your skills and knowledge: create and innovate with like-minded professionals - all of that within a global company with a local spirit and start-up soul.
Supported by Recognize Partners and expanding globally, we will engineer the experiences of tomorrow! Be bold, not bored! Interested already? We would love to get to know you! Submit your application. We can’t wait to see you at Ciklum.
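Of the machine learning techniques listed in the requirements above, outlier detection is the simplest to sketch. Here is a z-score filter in dependency-free Python; the function name and the 3-sigma threshold are illustrative choices for this sketch, not part of the posting or the client's actual stack.

```python
from statistics import mean, stdev

def zscore_outliers(values, threshold=3.0):
    """Return the points lying more than `threshold` standard
    deviations from the sample mean."""
    mu = mean(values)
    sigma = stdev(values)  # sample standard deviation (divides by n - 1)
    if sigma == 0.0:
        return []          # a constant series has no outliers
    return [x for x in values if abs(x - mu) / sigma > threshold]

# A single spike in an otherwise flat series is flagged:
print(zscore_outliers([10.0] * 50 + [100.0]))   # -> [100.0]
```

The mean and standard deviation are themselves pulled toward extreme points, so for heavy-tailed data (common in healthcare claims) robust variants based on the median and MAD are usually preferred.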
Posted 2 days ago
3.0 - 5.0 years
0 Lacs
Thane, Maharashtra, India
On-site
Job Overview Develop statistical methods sections of protocols and review case report forms (CRFs). Prepare analysis plans and write specifications for analysis files, tables, and figures. Communicate with clients regarding study protocol or statistical analysis issues as they arise. Communicate with study team members regarding study execution as it relates to timelines, data quality, and interpretation of results. Interpret analyses and write statistical sections of study reports. Accountable for controlling costs and maximizing revenue recognition. Provide training, guidance and mentorship to lower level and new staff. Essential Functions Production of High-Quality Deliverables: Completes and reviews more complex assigned tasks with a focus on accuracy. Conducts all appropriate validation requirements, according to Standard Operating Procedures (SOPs), for each task undertaken. Checks programming logs for cleanliness and correct processing of data. Inputs into Data Issues log and follows issues to appropriate resolution. Leadership: Perform statistical team lead role on single studies. Through this, works closely with the Project Team Lead and supervisor to deliver on time, with high quality and within budget. Build and maintain effective customer relationships, driving statistical discussions, providing support and/or guidance for statistical activities. Demonstrates and promotes efficient communication. If in lead role, runs meetings, documenting where necessary and following up on actions. Actively participates in internal project team meetings, provides timely progress updates. As a lead, will have input on estimate at completion (EAC) reporting. Data Management: Assist in reviewing or advising data management staff on database design, validation checks and critical data. Handles data issue resolutions. If in lead biostatistical role, handles lock and unblinding process with appropriate supervision. 
Statistical Analysis Plan (SAP) and Shells: Authors or performs quality control review (QC) of SAPs and shells. Make best use of resources and expertise within the organization (e.g. Libraries, templates and consultants for complex statistical methods). May author or QC complex SAPs, under supervision if needed. Datasets: Writes and maintains programming specifications. Programs assigned datasets to industry standards. Handles dataset derivations and assignment. Tables, Listings and Figures (TLFs): Writes programming specifications for statistical analyses outputs. Programs TLFs, maximizes programming efficiency with use of tools, where applicable. Checks resulting output for format and content, and questions specification as needed. Ensures consistency across items produced. Timelines: Plans and documents timelines, forecasts resource needs, suggests work may be out of scope. Financials: Shares accountability (with resource managers) for the financial success of assigned studies. Accountable for controlling costs and maximizing revenue recognition. Responsible for sharing budget expectations with the team. Raises concerns to manager if new work or rework appears to be out of scope. Understands 'scope of work' and has an awareness of contract and budget assumptions. Knowledge Sharing: Helps train staff regarding operational items. Mentors junior staff. Supports colleagues and provides motivation as needed. Risk Management: Identifies risks to project delivery and/or quality and spends time to proactively avoid as well as proposes solutions to mitigate risks. Where possible, anticipate risks to minimize need for study level escalations. Other Clinical Data Interchange Standards Consortium (CDISC) requirements: Leadership: Under supervision within Compound. May perform statistical team lead role on studies within a compound. 
In addition to the leadership responsibilities above, also prioritizes and takes a proactive approach to gain efficiencies in work across protocols.
Study Start-up: Assist with protocol development, sample size calculation, and protocol and case report form (CRF) review.
Protocol: Authors or performs quality control (QC) review of the statistical section of a protocol, making best possible use of resources and expertise within the organization (e.g. libraries, templates and consultants for complex statistical methods).
Proposals: May review and comment on proposals/budgets at a study level. May contribute to requests for proposals (RFPs). May be expected to present at bid defenses.
Clinical Study Report (CSR): Reviews or drafts the CSR or statistical report.
Customer: On occasion, may serve as primary point of contact for the customer. May also consult on operational topics with clients.
Lock and Unblinding Process: Handles the database lock and unblinding process. May participate on the biostatistics randomization team (drafts randomization specifications and/or performs quality control (QC) review of randomization schedules). May serve as unblinded lead statistician.
Other Responsibilities: As defined on an ad-hoc basis by managers. May assist with cross-functional collaboration.
Qualifications
Bachelor's Degree in Biostatistics or a related field and 3-5 years of relevant experience required; or Master's Degree in Biostatistics or a related field and 3-5 years of relevant experience required; or Ph.D. in Biostatistics or a related field and 6 years of relevant experience required.
Typically requires 7 years of prior relevant experience, or an equivalent combination of education, training and experience. Requires advanced knowledge of the job area, and broad knowledge of other related job areas, typically obtained through advanced education combined with experience. Excellent written and oral communication skills including grammatical/technical writing skills. Excellent attention and accuracy with details.
In-depth knowledge of applicable clinical research regulatory requirements, i.e. Good Clinical Practice (GCP) and International Conference on Harmonization (ICH) guidelines.
Familiarity with moderately complex statistical methods that apply to clinical trials.
Strong individual initiative and strong organizing skills.
Strong working knowledge of the SAS computing package; familiarity with other relevant statistical computing packages such as nQuery.
Strong commitment to quality.
Ability to effectively manage multiple tasks and projects.
Ability to lead and coordinate small teams.
Ability to solve moderately complex problems.
Ability to establish and maintain effective working relationships with coworkers, managers, and clients.
Working knowledge of relevant data standards, such as Clinical Data Interchange Standards Consortium (CDISC) ADaM.
IQVIA is a leading global provider of clinical research services, commercial insights and healthcare intelligence to the life sciences and healthcare industries. We create intelligent connections to accelerate the development and commercialization of innovative medical treatments to help improve patient outcomes and population health worldwide. Learn more at https://jobs.iqvia.com
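The sample size calculation referenced under Study Start-up reduces, in the simplest two-group setting, to the standard normal-approximation formula n = 2((z_{1-α/2} + z_{1-β})σ/Δ)² per group. A minimal sketch in Python (the function name and defaults are illustrative, not IQVIA tooling; in practice a package like nQuery handles the many more complex designs):

```python
import math
from statistics import NormalDist

def per_group_n(delta, sigma, alpha=0.05, power=0.80):
    """Per-group sample size to detect a mean difference `delta` between
    two groups with common SD `sigma`, using the normal approximation."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided test
    z_beta = NormalDist().inv_cdf(power)
    return math.ceil(2 * ((z_alpha + z_beta) * sigma / delta) ** 2)

print(per_group_n(delta=5, sigma=10))  # 63 per group at 80% power, alpha 0.05
```

Smaller detectable differences or higher power drive the required n up, which is why this calculation feeds directly into protocol review.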
Posted 2 days ago
4.0 - 5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
The CoinDCX Journey: Building Tomorrow, Today
At CoinDCX, we believe ‘CHANGE STARTS TOGETHER’. You are the driving force that will help us make Web3 accessible to all. In the last six years, we have skyrocketed from being India’s first crypto unicorn to carrying a community of over 125 million with us. To continue maximising the adoption and acceleration of Web3, we are now focused on developing cutting-edge products, addressing accessibility and security challenges, and bridging the gap between people and Web3 technologies. While we go ahead and keep dominating the Web3 world, we would like to HODL you on our team! Join our team of passionate innovators who are breaking barriers and building the future of Web3. Together, we will make the complex simple, the inaccessible accessible, and the impossible possible. Boost your innovation to an ALL TIME HIGH with us!
Inside CoinDCX’s Business Analytics Team
Our Business Analytics team is an awesome group of collaborators who love to solve first-of-their-kind problems with a lot of autonomy, creativity and fun. As a team, they fulfil the needs, wants and desires of our patrons by finding the key levers that enable them to use our platforms better. You help them do more. At CoinDCX you will not only be the skill of the future but will also get to work with and learn from the best while building the future of Web3. Coin your trust in us as we create magic together!
This particular role is for the DeFi arm of CoinDCX, i.e. @Okto.
About Okto
Okto is a key player in the Web3 orchestration space, focusing on chain abstraction and powering the Okto Wallet and Lite SDK used by over 1 million users. The company is at the forefront of innovation in the blockchain industry, with a strong focus on user experience and technology advancement.
You need to be a HODLer of these
4-5 years of experience as an Analyst or in a similar role
Outstanding analytical and problem-solving skills
Excellent SQL skills and experience in at least one scripting language (Python or R) for data manipulation and automation
Solid understanding of statistical methods and experiment design
Hands-on experience with product analytics and BI tools such as Mixpanel, Amplitude, Google Analytics, Looker, and Tableau
You will be mining through these tasks
This is a full-time hybrid role for a Lead Business Analyst at Okto. The role involves overseeing business analysis activities, communicating with stakeholders, analyzing and improving business processes, and gathering and documenting business requirements.
Providing analytical recommendations to influence our product, growth, marketing, and strategy
Designing and evaluating product and growth experiments backed by solid hypotheses
Finding insights that influence decisions (product/features), spanning from early data explorations about user behaviour to multivariate experiments and optimisations
Providing user insights through data: cohort analysis, user segmentation, funnels, and behavioural analyses in partnership with growth, product, engineering, and UX
You will lead a team as well as contribute as an IC on some projects, and are hence expected to have strong technical skills, strong communication skills, experience in business process improvement, and mentorship and project management experience. You will also work on blockchain analytics, which will open up a very different dataset for you. Previous Web3 experience is not a must, but openness to learning and interest in the sector will play a pivotal role.
Are you the one? Our missing block
You take ownership and have a thirst for excellence with an impact-driven and result-oriented mindset.
You grow while helping others grow with you.
You thrive on change, have attention to detail, and a passion for quality.
You love exploring new ideas to build something useful and are always curious to learn.
Perks That Empower You
Our benefits are designed to make a lasting impact on your life, giving you the freedom to create a work-life balance that truly suits you.
Design Your Own Benefit: Tailor your perk package to fit your unique needs. Whether you’re eyeing a new gadget or welcoming a furry friend into your life, our flexible benefits ensure that you can prioritize what matters most to you.
Unlimited Wellness Leaves: We believe in the power of well-being. Take the time you need to recharge, knowing that your health is our priority. With unlimited wellness leaves, you can return refreshed, ready to build and grow.
Mental Wellness Support: Your mental health is as important as your professional growth. Benefit from access to health experts, free counseling sessions, monthly wellness workshops, and regular team outings, all designed to help you stay balanced and connected.
Bi-Weekly Learning Sessions: These sessions are more than just updates. They are opportunities to fuel your growth: stay ahead with the latest industry knowledge, sharpen your skills, and accelerate your career in an ever-evolving landscape.
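The cohort and funnel work described in the responsibilities can be illustrated with a small, dependency-free sketch. The funnel step names and event-tuple layout below are assumptions for illustration, not Okto's actual schema:

```python
from collections import defaultdict

FUNNEL = ["signup", "kyc_complete", "first_deposit", "first_trade"]  # hypothetical steps

def funnel_counts(events, steps=FUNNEL):
    """events: iterable of (user_id, event_name, timestamp).
    A user is counted at step k only after completing steps 0..k-1 in order,
    which is what distinguishes a funnel from raw event counts."""
    progress = defaultdict(int)  # user -> index of the next step they need
    for user, name, _ts in sorted(events, key=lambda e: e[2]):
        idx = progress[user]
        if idx < len(steps) and name == steps[idx]:
            progress[user] = idx + 1
    counts = [0] * len(steps)
    for reached in progress.values():
        for k in range(reached):
            counts[k] += 1
    return dict(zip(steps, counts))
```

A user who deposits without completing KYC, for example, is counted at "signup" but not at the later steps; step-to-step ratios then give the drop-off a growth team would act on.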
Posted 2 days ago
6.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
Position: Machine Vision & Imaging AI Engineer
Location: T-Hub, Hyderabad
Company: Meridian Data Labs
Type: Full-time | On-site
Experience: 2-6 years (flexible for the right candidate)
About Us
Meridian Data Labs is a deep-tech startup working at the intersection of computer vision, AI/ML, and edge computing. We design and build cutting-edge imaging systems and AI solutions for defense, aerospace, and industrial clients. Our work spans robotics, video analytics, inspection systems, and 3D reconstruction.
Role Overview
We are seeking a hands-on engineer with strong expertise in machine vision and AI/ML for image and video processing. You will help develop high-performance vision systems that combine precise hardware with intelligent software for real-world applications.
Key Responsibilities
Machine Vision
Design and integrate camera systems (industrial, stereo, depth, thermal, etc.)
Select and configure lenses and lighting setups (structured, backlight, coaxial, etc.)
Interface with frame grabbers and image acquisition libraries (e.g., GenICam, Spinnaker)
Calibrate vision systems (intrinsic/extrinsic, multi-camera setups)
Work closely with hardware and robotics teams on vision system integration
AI/ML for Imaging and Video
Develop deep learning models for object detection, segmentation, and classification
Build pipelines for image/video preprocessing, augmentation, and real-time inference
Work on 3D reconstruction from stereo, multi-view, or structured light data
Implement and optimize models for deployment on edge devices (NVIDIA Jetson, etc.)
Collaborate on dataset creation, annotation, and model validation
Required Skills & Qualifications
B.E./B.Tech or M.S./M.Tech in Computer Science, ECE, Robotics, or related fields
Strong knowledge of computer vision and imaging fundamentals
Experience with OpenCV, PyTorch/TensorFlow, and image processing libraries
Experience with industrial cameras (e.g., Basler, FLIR), lighting systems, and lenses
Understanding of camera interfaces (USB3, GigE, MIPI, etc.)
Hands-on experience in AI model training, deployment, and tuning for real-world data
Bonus (Good to Have)
Exposure to robotics and automation systems
Experience in real-time video analytics or embedded systems
Experience with 3D data (point clouds, photogrammetry, stereo vision)
Familiarity with frameworks like ROS, NVIDIA DeepStream, and Open3D
What We Offer
Opportunity to work on high-impact, mission-critical systems
Cross-disciplinary learning in hardware-software integration
Fast-paced startup environment with real ownership
Work with cutting-edge tech in the defense and aerospace domains
To Apply: Send your resume and portfolio (if available) to somesh@meridiandatalabs.com
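Intrinsic calibration, mentioned in the responsibilities, estimates exactly the parameters used by the pinhole projection below. In practice this is done with tools such as OpenCV's `calibrateCamera` over checkerboard images; the underlying model, and the reprojection error that calibration minimizes, fit in a few lines (all numeric values here are illustrative):

```python
import math

def project(point, fx, fy, cx, cy):
    """Pinhole projection of a camera-frame 3D point to pixel coordinates:
    u = fx*X/Z + cx, v = fy*Y/Z + cy (distortion omitted for clarity)."""
    X, Y, Z = point
    return (fx * X / Z + cx, fy * Y / Z + cy)

def mean_reprojection_error(pts3d, pts2d, fx, fy, cx, cy):
    """Mean pixel distance between observed points and reprojected points.
    Calibration searches for the intrinsics that minimize this quantity."""
    errs = []
    for p, (u_obs, v_obs) in zip(pts3d, pts2d):
        u, v = project(p, fx, fy, cx, cy)
        errs.append(math.hypot(u - u_obs, v - v_obs))
    return sum(errs) / len(errs)
```

A sub-pixel mean reprojection error over the calibration board corners is the usual sanity check that a camera setup was calibrated well.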
Posted 2 days ago
8.0 years
0 Lacs
India
Remote
Work Type: Contractor | Permanent Remote
Compensation: USD 15–25/hour
Hours: 20 / 30 / 40 hours per week (PST overlap required)
Experience Required: 3–8 years
Contract Duration: 3 months
Notice Period: Immediate to max 1 week
Note: This is a remote, contract-based position. Payment is hourly, with no paid leave or employee benefits. Contractors are responsible for their own taxes and statutory compliance.
Job Overview
We are hiring Senior JavaScript Engineers to support LLM (Large Language Model) evaluation and training dataset development using real-world GitHub repositories. You’ll work with researchers to assess LLM capabilities in solving real coding problems, perform environment setup, evaluate software quality, and validate bug-fix performance.
Key Responsibilities
Analyze and triage GitHub issues in trending open-source projects
Set up and configure repositories (including Dockerization and tooling)
Evaluate test coverage and software quality
Modify, run, and validate codebases for bug-fix test cases
Collaborate with researchers to identify complex or challenging repositories
(Optional) Mentor or lead junior engineers on project tasks
Must-Have Skills
3-4+ years of hands-on JavaScript development experience
Strong understanding of real-world, large-scale codebases
Experience with Git, Docker, and CI/CD pipelines
Comfortable running, debugging, and modifying real code locally
Exposure to GitHub workflows and open-source contributions
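Validating a bug-fix test case in this kind of dataset work typically follows a fail-to-pass convention: the target tests must fail on the unpatched repository and pass after the fix, with no previously passing tests regressing. A dependency-free sketch of that check (the dict shapes and status strings are assumptions, not the client's actual harness):

```python
def validates_fix(base_results, patched_results, target_tests):
    """base_results / patched_results: dict of test_name -> 'pass' | 'fail',
    collected by running the suite before and after applying the patch.
    A bug-fix task is usable when every target test goes fail -> pass
    and no test that passed on the base commit now fails."""
    fail_to_pass = all(
        base_results.get(t) == "fail" and patched_results.get(t) == "pass"
        for t in target_tests
    )
    no_regressions = all(
        patched_results.get(t) == "pass"
        for t, result in base_results.items() if result == "pass"
    )
    return fail_to_pass and no_regressions
```

In a real pipeline the two result dicts would come from running the project's test runner (e.g. inside the Dockerized environment) and parsing its report.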
Posted 3 days ago
4.0 years
0 Lacs
India
Remote
Location: India (Remote)
Employment Type: Full-time
Experience Level: Mid to Senior (4-5+ years)
Date of Joining: Immediate joiners required
About UltraSafeAI
UltraSafeAI is a US-based technology company at the forefront of developing secure, reliable, and explainable AI systems. We specialize in proprietary AI technologies including advanced LLMs, CNNs, VLMs, intelligent agents, computer vision systems, and cutting-edge ML algorithms. Our focus is on B2B AI adoption, providing end-to-end integration using our proprietary technology stack to automate entire business processes. We create intelligent solutions that prioritize safety, transparency, and human alignment across industries including healthcare, finance, legal, and enterprise services. Our mission is to enable seamless AI adoption while maintaining the highest standards of safety and ethics.
Position Overview
We're seeking an experienced AI Research Engineer specializing in training, fine-tuning, and optimizing large language models (LLMs), vision language models (VLMs), and convolutional neural networks (CNNs). The ideal candidate will have deep expertise in training and fine-tuning foundation models using advanced techniques and frameworks, with particular emphasis on reinforcement learning approaches for alignment.
As an AI Research Engineer at UltraSafeAI, you'll work on developing and enhancing our proprietary models, creating domain-specific adaptations, and optimizing inference performance. You'll collaborate with our engineering team to build the core AI capabilities that power our enterprise solutions and set new standards for AI performance and safety in business applications.
Key Responsibilities
● Train and fine-tune large language models (LLMs) and vision language models (VLMs) using state-of-the-art techniques
● Implement and improve reinforcement learning methods for model alignment, including DPO, PPO, and GRPO
● Develop and optimize CNNs for specialized computer vision tasks in enterprise contexts
● Work with distributed training frameworks for efficient model development
● Optimize models for inference using libraries like vLLM, SGLang, and Triton Inference Server
● Implement techniques for reducing model size while maintaining performance (quantization, distillation, pruning)
● Create domain-specific adaptations of foundation models for vertical industry applications
● Design and execute experiments to evaluate model performance and alignment
● Develop benchmarks and metrics to measure improvements in model capabilities
● Collaborate with the data science team on dataset curation and preparation for training
● Document model architectures, training procedures, and experimental results
● Stay current with the latest research in model training and alignment techniques
Required Qualifications
● 4-5+ years of professional experience in AI/ML engineering with a focus on model training
● Strong expertise in training and fine-tuning large language models
● Experience with major deep learning frameworks (PyTorch, TensorFlow, JAX)
● Hands-on experience with model training libraries and frameworks (Transformers, NeMo, Megatron-LM, TRL)
● Practical implementation of reinforcement learning techniques for model alignment (DPO, PPO, GRPO)
● Experience optimizing models for efficient inference
● Strong understanding of distributed training techniques for large models
● Background in computer vision model development (CNNs, vision transformers)
● Excellent programming skills in Python and related data science tools
● Experience with cloud infrastructure for ML workloads (AWS, GCP, or Azure)
● Strong problem-solving skills and a scientific mindset
● Excellent documentation and communication abilities
Highly Desirable
● Experience with multimodal models combining text, vision, and other modalities
● Knowledge of model quantization techniques (QLoRA, GPTQ, AWQ)
● Experience with vLLM, SGLang, or Triton Inference Server for optimized inference
● Background in prompt engineering and instruction tuning
● Familiarity with RLHF (Reinforcement Learning from Human Feedback)
● Experience with model evaluation and red-teaming
● Knowledge of AI safety and alignment research
● Experience with domain-specific model adaptation for industries like healthcare, finance, or legal
● Background in research with publications or contributions to open-source ML projects
● Experience with MLOps tools and practices for model lifecycle management
Why Join UltraSafeAI?
● Create Proprietary AI Models: Develop the core AI technologies that power our enterprise solutions
● 100% Remote Work: Work remotely with our US-based company with flexible hours
● B2B Impact: Help shape AI models that solve real business problems across industries
● Cutting-Edge Research: Work on the frontier of AI model development and alignment
● Continuous Learning: Regular knowledge-sharing sessions and an education stipend
● Collaborative Team: Work with talented researchers and engineers focused on innovation
● Work-Life Balance: Flexible PTO policy and respect for personal time
● Career Growth: Clear paths for advancement in a rapidly growing company
● Industry-Leading Infrastructure: Access to high-performance computing resources for model training
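The DPO technique named in the responsibilities reduces, per preference pair, to a logistic loss on the reward margin between chosen and rejected completions. Libraries such as TRL implement it at scale; the core per-pair formula is compact enough to show in plain Python (variable names are illustrative):

```python
import math

def dpo_loss(pi_chosen, pi_rejected, ref_chosen, ref_rejected, beta=0.1):
    """Per-pair DPO loss: -log sigmoid(beta * margin), where the margin is
    the policy-vs-reference log-probability advantage of the chosen
    completion over the rejected one. Inputs are sequence log-probs."""
    margin = beta * ((pi_chosen - ref_chosen) - (pi_rejected - ref_rejected))
    return -math.log(1.0 / (1.0 + math.exp(-margin)))
```

At initialization, when the policy equals the reference model, the margin is zero and the loss is log 2; it falls as the policy comes to prefer the chosen completion more strongly than the reference does, which is the behavior the alignment step is optimizing for.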
Posted 3 days ago
3.0 years
0 Lacs
Serilingampalli, Telangana, India
On-site
The Statistical Programmer II provides technical expertise for the conduct of clinical trials and works with minimal supervision to support various programming activities related to the analysis and reporting of clinical study data. In addition, the Statistical Programmer II may fill the Statistical Programming Lead role (or part of that role) on small, non-complex projects. This role supports the generation of real-world evidence (RWE) by programming and analyzing large-scale observational datasets. The ideal candidate will have strong SAS programming skills, familiarity with R, and experience working with healthcare claims, electronic health records (EHR), or registry data.
Key Accountabilities
Project Management: Assist in the coordination of project start-up activities, creation of global programs, tracking spreadsheets, and other required documentation.
Statistical Programming for Assigned Projects
Deliver best-value, high-quality service. Check your own work in an ongoing way to ensure first-time quality. Use efficient programming techniques to produce derived datasets (e.g. SDTM, ADaM), tables, figures, and data listings of any complexity, and QC low- to medium-complexity derived datasets, tables, figures, and data listings. Assist in the production/QC of derived dataset specifications and other process-supporting documents and submission documentation.
Training
Maintain and expand local and international regulatory knowledge within the clinical industry. Develop knowledge of SAS and of processes/procedures within other Parexel functional areas. Provide relevant training and mentorship to staff and project teams as appropriate.
General
Develop, validate, and maintain SAS and R programs to support RWD analyses, including prevalence, treatment patterns, cost/utilization, and time-to-event studies
Execute programming tasks using client standard macros and environments within UNIX and AWS-based platforms
Perform double programming and quality control (QC) checks in alignment with internal SOPs and KIMS system workflows
Collaborate with statisticians, data scientists, and cross-functional teams to define specifications and deliverables
Document programming processes and outputs in accordance with regulatory and internal audit requirements
Contribute to the development and maintenance of internal R packages, Shiny apps, and Quarto documentation to support programming workflows
Participate in onboarding and mentoring of new programmers, including training on client-specific tools and data environments
Skills
Excellent analytical skills
Proficiency in SAS; working knowledge of R is highly desirable
3+ years of experience in statistical programming, preferably in a pharmaceutical or healthcare setting
Knowledge and understanding of the programming and reporting process
Knowledge of SOPs/guidelines, ICH-GCP, and other applicable local and international regulations such as 21 CFR Part 11
Familiarity with real-world data sources such as Optum, MarketScan, Flatiron, CPRD, or similar
Experience with Snowflake, UNIX/Linux environments, and version control tools (e.g., Git)
Strong understanding of data privacy, regulatory compliance, and audit-readiness in RWD contexts
Ability to learn new systems and function in an evolving technical environment
Ability to manage competing priorities and flexibility to change
Attention to detail
Ability to work successfully as part of a global team
Ability to work effectively in a quality-focused environment
Effective time management in order to meet daily metrics and team objectives
Commitment to consistently high-quality work
Business/operational skills including customer focus, commitment to quality management, and problem solving
Knowledge and Experience
Competent in written and oral English; good communication skills
Experience with OMOP/OHDSI standards and tools
Exposure to project management tools like Monday.com
Ability to work independently and manage multiple priorities in a fast-paced environment
Education
Educated to degree level in a relevant discipline and/or equivalent work experience; Bachelor's or Master's degree in Statistics, Computer Science, or Epidemiology preferred.
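The time-to-event studies mentioned above usually start from the Kaplan-Meier product-limit estimator. The listing's tools are SAS and R (e.g. PROC LIFETEST, survival::survfit), but the estimator itself is compact enough to sketch in plain Python for illustration:

```python
def kaplan_meier(times, events):
    """Product-limit survival estimate.
    times: follow-up times; events: 1 = event observed, 0 = censored.
    Returns [(t, S(t))] at each time where at least one event occurred."""
    data = sorted(zip(times, events))
    at_risk = len(data)
    surv, curve = 1.0, []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = leaving = 0
        while i < len(data) and data[i][0] == t:  # group ties at time t
            deaths += data[i][1]
            leaving += 1
            i += 1
        if deaths:
            surv *= 1 - deaths / at_risk  # multiply in this step's survival
            curve.append((t, surv))
        at_risk -= leaving  # censored and event subjects both leave the risk set
    return curve
```

Censored subjects reduce the risk set without dropping the curve, which is exactly why naive "percent still alive" summaries understate survival in RWD with incomplete follow-up.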
Posted 3 days ago
3.0 years
6 Lacs
India
On-site
Job Description
Designation: QA Engineer
Experience: 6 months to 3 years
Location: Indore (M.P.), work from office
Role: Full-time
Responsibilities:
Review and analyze system specifications
Execute test cases (manual or automated) and analyze results
Evaluate product code against specifications
Create logs to document testing phases and defects
Report bugs and errors to development teams
Help troubleshoot issues
Conduct post-release/post-implementation testing
Work with cross-functional teams to ensure quality throughout the software development lifecycle
Create large dummy datasets for performance testing
Gather performance testing requirements
Conduct system performance testing to ensure system reliability, capacity, and scalability
Work with the testing team to develop performance test plans and cases
Requirements:
Excellent communication skills
Proven experience as a Quality Assurance Tester or in a similar role
Experience in project management and QA methodology
Good experience with automation testing tools
Familiarity with Agile frameworks and regression testing is a plus
Attention to detail
Analytical mind and problem-solving aptitude
Ability to write documentation for automated processes, including test plans, test procedures, and test cases
Job Types: Full-time, Permanent
Pay: Up to ₹55,179.30 per month
Work Location: In person
Speak with the employer: +91 8319592630
Posted 3 days ago
0.0 years
0 - 0 Lacs
Rajkot, Gujarat
On-site
Base Criteria
Candidate should have a minimum of 3 years of relevant experience in SSRS/Power BI Report Builder (Paginated Reports)
Candidate should be able to explain work done in the past
Candidate should be good in communication and work approach
Candidate should have a good solution-oriented approach
Skills
Proficient knowledge of SSRS/Power BI Report Builder (Paginated Reports)
Good understanding of advanced SQL Server
Good experience in dealing with complex SQL queries
Candidate should be good at writing optimized DB queries
Should know the various ways of improving query performance
Good understanding and experience of code versioning tools such as Git
Proficient in Linux OS; candidate should be able to work independently
Candidate should have experience working with large-scale databases
Candidate should have experience with datasets and data gateways in Power BI
Candidate should know how to publish, share, and restrict access to reports on the Power BI service
Power Pages & Power Apps (optional)
Job Type: Full-time
Pay: ₹30,000.00 - ₹60,000.00 per month
Schedule: Day shift
Supplemental Pay: Yearly bonus
Ability to commute/relocate: Rajkot, Gujarat: Reliably commute or plan to relocate before starting work (Required)
Work Location: In person
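One of the "ways of improving query performance" this listing probes for is indexing the columns used in filters, which can be verified directly from the query plan. SQLite is used here purely as a self-contained stand-in for SQL Server (where the analogous check is the execution plan); the schema and names are illustrative:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INT, total REAL)")
con.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                [(i % 500, i * 1.5) for i in range(5000)])

def plan(sql):
    """Concatenate the 'detail' column of EXPLAIN QUERY PLAN output."""
    return " ".join(row[3] for row in con.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT SUM(total) FROM orders WHERE customer_id = 42"
before = plan(query)  # without an index: a full scan of the table
con.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
after = plan(query)   # the planner now searches via the index
```

The same habit (read the plan before and after each change) applies to the complex report queries behind paginated SSRS/Power BI reports, where an unindexed filter or join column is a common cause of slow rendering.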
Posted 3 days ago
6.0 years
0 Lacs
Gurugram, Haryana, India
On-site
About The Role
Grade Level (for internal use): 10
The Team: The Private Markets Data Stewardship team delivers research, content, and analytics about private capital markets for Private Equity & Venture Capital and alternative asset class firms, their portfolio investments, and funds. Our data is used to support deal sourcing, fundraising, data exploration and valuation analysis by institutional investors, money managers and investment banking professionals. We provide expertise in datasets, assist with client requests, and help Product and Commercial teams with their engagements with current and prospective clients. A key responsibility of our team is to support the development of new solutions for our dataset across various data delivery channels (feeds, platforms, APIs, etc.), covering both self-sourced data and data purchased from third-party vendors.
The Impact: As a senior member of the team, you will collaborate with and bring together various stakeholders to drive data strategy initiatives through the SDLC, ensuring technical solutions are sound, and execute business strategies with a client-centric mindset. The successful candidate will work within a cross-functional team aligned with Data Stewards, Operations Managers, Product and its support functions, and Agile Technology teams. The role requires excellent communication, project management, data analytics, conflict resolution, critical thinking, and stakeholder management skills to ensure quality and timely delivery of projects.
What's in it for you: Private capital markets data is in high demand, and our clients value our data-driven approach. This is a great opportunity to further develop your business acumen by venturing into the mammoth world of S&P data and gaining deep experience and expertise in data exploration and data analytics.
This role offers exposure to a group that is driven by principles, and challenging roles that provide multiple paths for growth across different Private Markets data projects.
Responsibilities
Manage the end-to-end delivery of Private Markets data projects (requirements gathering and consultation, research & analysis, work sizing, developing the vision & roadmap, and collaboration with Technology and Product teams on implementation)
Develop business logic to integrate and transform various Private Capital Markets data facets from multiple sources into our internal data structures, and perform User Acceptance Testing (UAT) for the same
Work with Agile Technology teams to complete the design of various projects and processes/tools while partnering with other Content teams on shared structures
Collaborate with Product teams on identifying key client needs and converting them into smaller business projects for successful implementation in backend structures
Convert raw, complex data into easy-to-understand information through data visualization, and present it to stakeholders
Analyze and uncover inconsistencies in large amounts of data supplied by third-party providers by utilizing SQL, and propose solutions to address them in S&P Global products
Support BAU and client requests across all areas of the dataset
Essential Qualifications
What We're Looking For:
6+ years of work experience including data strategy, management & governance, preferably in a financial market data-intensive environment
Sound knowledge of backend infrastructure and the SDLC (tables/pipelines/loaders, etc.)
Strong command of writing and developing SQL queries (joins, exists/not exists, group by, having, cast, etc.)
Good understanding of S&P products such as Capital IQ Pro, Capital IQ, the Excel Plug-In, feeds, etc.
Proactive attitude toward problem identification/resolution and a track record of successful delivery of complex projects, particularly in content-related domains
Ability to work collaboratively across segments and cultures
Comfortable working in a dynamic, fast-paced environment while handling multiple tasks
Excellent time management skills; ability to meet strict deadlines
Effective and structured communication/presentation skills
Willing to work evening shifts and flexible hours
Preferred Qualifications
Familiarity with Private Capital Markets data
Project Management/Agile/Data Management related certification
Working knowledge of visualization tools (Tableau, Power BI, etc.)
Experience in data mining, analysis, AI/ML/automation
Basic understanding of Python
What's In It For You?
Our Purpose
Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology – the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence®, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People
We're more than 35,000 strong worldwide, so we're able to understand nuances while having a broad perspective. Our team is driven by curiosity and a shared belief that Essential Intelligence can help build a more prosperous future for us all. From finding new ways to measure sustainability to analyzing energy transition across the supply chain to building workflow solutions that make it easy to tap into insight and apply it. We are changing the way people see things and empowering them to make an impact on the world we live in.
We're committed to a more equitable future and to helping our customers find new, sustainable ways of doing business. We're constantly seeking new solutions that have progress in mind. Join us and help create the critical insights that truly make a difference.
Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits
We take care of you, so you can take care of business. We care about our people. That's why we provide everything you (and your career) need to thrive at S&P Global. Our benefits include:
Health & Wellness: Health care coverage designed for the mind and body.
Flexible Downtime: Generous time off helps keep you energized for your time on.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit: https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global
At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions.
Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
Recruitment Fraud Alert
If you receive an email from a spglobalind.com domain or any other regionally based domain, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, “pre-employment training” or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.
Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment.
If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.
US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
20 - Professional (EEO-2 Job Categories-United States of America), DTMGOP202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority – Ratings - (Strategic Workforce Planning)
Job ID: 312872
Posted On: 2025-07-17
Location: Ahmedabad, Gujarat, India
The dataset job market in India is booming with opportunities for talented individuals who are skilled in working with data. From data analysts to data scientists, there is a wide range of roles available for job seekers interested in this field. In this article, we will explore the dataset job market in India and provide valuable insights for those looking to kickstart or advance their career in this domain.
India's major tech hubs are known for their thriving technology industries and are hotspots for dataset job opportunities.
The average salary range for dataset professionals in India varies by experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
A typical career path in the dataset field may look like this:
- Junior Data Analyst
- Data Analyst
- Senior Data Analyst
- Data Scientist
- Data Architect
- Chief Data Officer
In addition to dataset expertise, professionals in this field are often expected to have skills in:
- Advanced Excel
- SQL
- Data visualization tools (e.g., Tableau, Power BI)
- Machine learning
- Python/R programming
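As a toy illustration of the SQL and Python skills named above, the sketch below computes an average salary per experience level with Python's built-in sqlite3 module. The table name, levels, and figures are hypothetical, loosely echoing the salary ranges quoted earlier, and are not taken from any real dataset.

```python
import sqlite3

# Hypothetical data: a tiny in-memory salary table (lpa = lakhs per annum).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE salaries (level TEXT, lpa REAL)")
conn.executemany(
    "INSERT INTO salaries VALUES (?, ?)",
    [("entry", 5.0), ("entry", 4.5), ("mid", 10.0), ("mid", 11.0), ("senior", 18.0)],
)

# Average salary by experience level, the kind of grouped aggregation
# a data analyst runs daily.
rows = conn.execute(
    "SELECT level, AVG(lpa) FROM salaries GROUP BY level ORDER BY level"
).fetchall()
for level, avg_lpa in rows:
    print(f"{level}: {avg_lpa:.2f} LPA")
```

In interviews, the same grouped-aggregation question is often asked both as raw SQL and as its pandas equivalent, so it is worth being able to write it both ways.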
As you navigate the dataset job market in India, remember to hone your skills, stay updated with industry trends, and prepare well for interviews. With determination and dedication, you can land your dream job in this exciting field. Good luck!