0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Position Title: AI/ML Engineer
Company: Cyfuture India Pvt. Ltd.
Industry: IT Services and IT Consulting
Location: Sector 81, NSEZ, Noida (5 Days Work From Office)
Website: www.cyfuture.com

About Cyfuture
Cyfuture is a trusted name in IT services and cloud infrastructure, offering state-of-the-art data center solutions and managed services across platforms like AWS, Azure, and VMWare. We are expanding rapidly in system integration and managed services, building strong alliances with global OEMs like VMWare, AWS, Azure, HP, Dell, Lenovo, and Palo Alto.

Position Overview
We are hiring an experienced AI/ML Engineer to lead and shape our AI/ML initiatives. The ideal candidate will have hands-on experience in machine learning and artificial intelligence, with strong leadership capabilities and a passion for delivering production-ready solutions. This role involves end-to-end ownership of AI/ML projects, from strategy development to deployment and optimization of large-scale systems.

Key Responsibilities
Lead and mentor a high-performing AI/ML team.
Design and execute AI/ML strategies aligned with business goals.
Collaborate with product and engineering teams to identify impactful AI opportunities.
Build, train, fine-tune, and deploy ML models in production environments.
Manage operations of LLMs and other AI models using modern cloud and MLOps tools.
Implement scalable and automated ML pipelines (e.g., with Kubeflow or MLRun).
Handle containerization and orchestration using Docker and Kubernetes.
Optimize GPU/TPU resources for training and inference tasks.
Develop efficient RAG pipelines with low latency and high retrieval accuracy (see the sketch after this listing).
Automate CI/CD workflows for continuous integration and delivery of ML systems.

Key Skills & Expertise
1. Cloud Computing & Deployment
Proficiency in AWS, Google Cloud, or Azure for scalable model deployment.
Familiarity with cloud-native services like AWS SageMaker, Google Vertex AI, or Azure ML.
Expertise in Docker and Kubernetes for containerized deployments.
Experience with Infrastructure as Code (IaC) using tools like Terraform or CloudFormation.
2. Machine Learning & Deep Learning
Strong command of frameworks: TensorFlow, PyTorch, Scikit-learn, XGBoost.
Experience with MLOps tools for integration, monitoring, and automation.
Expertise in pre-trained models, transfer learning, and designing custom architectures.
3. Programming & Software Engineering
Strong skills in Python (NumPy, Pandas, Matplotlib, SciPy) for ML development.
Backend/API development with FastAPI, Flask, or Django.
Database handling with SQL and NoSQL (PostgreSQL, MongoDB, BigQuery).
Familiarity with CI/CD pipelines (GitHub Actions, Jenkins).
4. Scalable AI Systems
Proven ability to build AI-driven applications at scale.
Handle large datasets, high-throughput requests, and real-time inference.
Knowledge of distributed computing: Apache Spark, Dask, Ray.
5. Model Monitoring & Optimization
Hands-on with model compression, quantization, and pruning.
A/B testing and performance tracking in production.
Knowledge of model retraining pipelines for continuous learning.
6. Resource Optimization
Efficient use of compute resources: GPUs, TPUs, CPUs.
Experience with serverless architectures to reduce cost.
Auto-scaling and load balancing for high-traffic systems.
7. Problem-Solving & Collaboration
Translate complex ML models into user-friendly applications.
Work effectively with data scientists, engineers, and product teams.
Write clear technical documentation and architecture reports.
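As a rough illustration of the RAG pipeline work this listing mentions, the sketch below wires a sentence-embedding retriever to a stubbed answer step. The model name, the toy documents, and the generate_answer() stub are assumptions for illustration, not Cyfuture's actual stack.

```python
# Minimal retrieval-augmented generation (RAG) sketch: embed documents once,
# retrieve the closest ones per query, and hand them to a (stubbed) LLM step.
import numpy as np
from sentence_transformers import SentenceTransformer

encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed example model

documents = [
    "Cyfuture operates data centers and managed cloud services in Noida.",
    "Managed services cover AWS, Azure and VMware workloads.",
]
doc_vecs = encoder.encode(documents, normalize_embeddings=True)

def retrieve(query: str, k: int = 2):
    """Return the k documents most similar to the query (cosine similarity)."""
    q = encoder.encode([query], normalize_embeddings=True)[0]
    scores = doc_vecs @ q                      # cosine, since vectors are normalized
    return [documents[i] for i in np.argsort(scores)[::-1][:k]]

def generate_answer(query: str) -> str:
    context = "\n".join(retrieve(query))
    # In production this prompt would go to an LLM endpoint; stubbed here.
    return f"Context:\n{context}\n\nQuestion: {query}"

print(generate_answer("Which clouds do the managed services cover?"))
```

A production pipeline of the kind the posting describes would add a vector database, caching, and latency monitoring around this basic shape.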
Posted 2 weeks ago
0 years
0 Lacs
India
On-site
Role: AI Engineer

Join AiDP: Revolutionizing Document Automation through AI
At AiDP, we're transforming complex document workflows into seamless experiences with powerful AI-driven automation. We're on a mission to redefine efficiency, accuracy, and collaboration in finance, insurance, and compliance. To continue pushing boundaries, we're looking for exceptional talent.

Your Mission:
Develop, deploy, and optimize cutting-edge machine learning models for accurate extraction and structuring of data from complex documents (an illustrative sketch follows this listing).
Design and implement scalable NLP pipelines to handle vast quantities of unstructured and structured data.
Continuously refine models through experimentation and data-driven analysis to maximize accuracy and efficiency.
Collaborate closely with product and engineering teams to deliver impactful, real-world solutions.

We're looking for:
Proven expertise in NLP, machine learning, and deep learning with solid knowledge of frameworks such as PyTorch, TensorFlow, Hugging Face, or scikit-learn.
Strong proficiency in Python and experience with data processing tools (Pandas, NumPy, Dask).
Experience deploying models to production using containerization technologies (Docker, Kubernetes) and cloud platforms (AWS, Azure, GCP).
Familiarity with version control systems (Git) and continuous integration/continuous deployment (CI/CD) pipelines.
Background in computer science, including understanding of algorithms, data structures, and software engineering best practices.
Strong analytical thinking, problem-solving skills, and a passion for tackling challenging issues in document automation and compliance workflows.
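A hedged sketch of one building block behind the document-extraction mission above: a pre-trained named-entity-recognition model pulling structured fields out of free text. The model name and sample sentence are illustrative assumptions, not AiDP's pipeline.

```python
# Extracting candidate fields (organizations, people, dates) from raw document
# text with a public NER model from the Hugging Face hub.
from transformers import pipeline

ner = pipeline("ner", model="dslim/bert-base-NER", aggregation_strategy="simple")

text = "Invoice issued by Acme Insurance Ltd on 12 March 2024 for John Smith."
for entity in ner(text):
    # Each entity carries a grouped label, the matched span, and a confidence score.
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```

A real extraction pipeline would map these raw entities onto a document schema and validate them before they reach downstream compliance workflows.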
Posted 2 weeks ago
8.0 years
0 Lacs
Vadodara, Gujarat, India
On-site
At Rearc, we're committed to empowering engineers to build awesome products and experiences. Success as a business hinges on our people's ability to think freely, challenge the status quo, and speak up about alternative problem-solving approaches. If you're an engineer driven by the desire to solve problems and make a difference, you're in the right place! Our approach is simple — empower engineers with the best tools possible to make an impact within their industry. We're on the lookout for engineers who thrive on ownership and freedom, possessing not just technical prowess, but also exceptional leadership skills. Our ideal candidates are hands-on leaders who don't just talk the talk but also walk the walk, designing and building solutions that push the boundaries of cloud computing. As a Senior Data Engineer at Rearc, you will be at the forefront of driving technical excellence within our data engineering team. Your expertise in data architecture, cloud-native solutions, and modern data processing frameworks will be essential in designing workflows that are optimized for efficiency, scalability, and reliability. You'll leverage tools like Databricks, PySpark, and Delta Lake to deliver cutting-edge data solutions that align with business objectives. Collaborating with cross-functional teams, you will design and implement scalable architectures while adhering to best practices in data management and governance . Building strong relationships with both technical teams and stakeholders will be crucial as you lead data-driven initiatives and ensure their seamless execution. What You Bring 8+ years of experience in data engineering, showcasing expertise in diverse architectures, technology stacks, and use cases. Strong expertise in designing and implementing data warehouse and data lake architectures, particularly in AWS environments. Extensive experience with Python for data engineering tasks, including familiarity with libraries and frameworks commonly used in Python-based data engineering workflows. Proven experience with data pipeline orchestration using platforms such as Airflow, Databricks, DBT or AWS Glue. Hands-on experience with data analysis tools and libraries like Pyspark, NumPy, Pandas, or Dask. Proficiency with Spark and Databricks is highly desirable. Experience with SQL and NoSQL databases, including PostgreSQL, Amazon Redshift, Delta Lake, Iceberg and DynamoDB. In-depth knowledge of data architecture principles and best practices, especially in cloud environments. Proven experience with AWS services, including expertise in using AWS CLI, SDK, and Infrastructure as Code (IaC) tools such as Terraform, CloudFormation, or AWS CDK. Exceptional communication skills, capable of clearly articulating complex technical concepts to both technical and non-technical stakeholders. Demonstrated ability to quickly adapt to new tasks and roles in a dynamic environment. What You'll Do Strategic Data Engineering Leadership: Provide strategic vision and technical leadership in data engineering, guiding the development and execution of advanced data strategies that align with business objectives. Architect Data Solutions: Design and architect complex data pipelines and scalable architectures, leveraging advanced tools and frameworks (e.g., Apache Kafka, Kubernetes) to ensure optimal performance and reliability. Drive Innovation: Lead the exploration and adoption of new technologies and methodologies in data engineering, driving innovation and continuous improvement across data processes. 
Technical Expertise: Apply deep expertise in ETL processes, data modelling, and data warehousing to optimize data workflows and ensure data integrity and quality.
Collaboration and Mentorship: Collaborate closely with cross-functional teams to understand requirements and deliver impactful data solutions. Mentor and coach junior team members, fostering their growth and development in data engineering practices.
Thought Leadership: Contribute to thought leadership in the data engineering domain through technical articles, conference presentations, and participation in industry forums.

Some More About Us
Founded in 2016, we pride ourselves on fostering an environment where creativity flourishes, bureaucracy is non-existent, and individuals are encouraged to challenge the status quo. We're not just a company; we're a community of problem-solvers dedicated to improving the lives of fellow software engineers. Our commitment is simple - finding the right fit for our team and cultivating a desire to make things better. If you're a cloud professional intrigued by our problem space and eager to make a difference, you've come to the right place. Join us, and let's solve problems together!
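For context on the Databricks/PySpark/Delta Lake stack named in this posting, here is a minimal batch-transformation sketch. The bucket paths, column names, and aggregation are placeholders, and a Spark session with the Delta Lake connector available is assumed.

```python
# Read raw JSON events, deduplicate, aggregate daily metrics, and write a Delta table.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders_daily").getOrCreate()

orders = (
    spark.read.json("s3://raw-bucket/orders/2024-06-01/")      # placeholder path
    .withColumn("order_ts", F.to_timestamp("order_ts"))
    .dropDuplicates(["order_id"])
)

daily = orders.groupBy(F.to_date("order_ts").alias("order_date")).agg(
    F.count("*").alias("orders"),
    F.sum("amount").alias("revenue"),
)

# Delta format gives ACID writes and time travel on the curated layer.
daily.write.format("delta").mode("overwrite").save("s3://curated-bucket/orders_daily/")
```

In a Databricks/Airflow setup of the kind the posting describes, a job like this would be one orchestrated task among ingestion, validation, and publishing steps.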
Posted 2 weeks ago
6.0 - 9.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description
The Risk division is responsible for credit, market and operational risk, model risk, independent liquidity risk, and insurance throughout the firm.

RISK BUSINESS
The Risk Business identifies, monitors, evaluates, and manages the firm's financial and non-financial risks in support of the firm's Risk Appetite Statement and the firm's strategic plan. Operating in a fast paced and dynamic environment and utilizing the best in class risk tools and frameworks, Risk teams are analytically curious, have an aptitude to challenge, and an unwavering commitment to excellence.

Overview
To ensure uncompromising accuracy and timeliness in the delivery of the risk metrics, our platform is continuously growing and evolving. Risk Engineering combines the principles of Computer Science, Mathematics and Finance to produce large scale, computationally intensive calculations of the risk Goldman Sachs faces with each transaction we engage in. As an Engineer in the Risk Engineering organization, you will have the opportunity to impact one or more aspects of risk management. You will work with a team of talented engineers to drive the build & adoption of common tools, platforms, and applications. The team builds solutions that are offered as a software product or as a hosted service. We are a dynamic team of talented developers and architects who partner with business areas and other technology teams to deliver high profile projects using a raft of technologies that are fit for purpose (Java, Cloud computing, HDFS, Spark, S3, ReactJS, Sybase IQ among many others). A glimpse of the interesting problems that we engineer solutions for include acquiring high quality data, storing it, performing risk computations in a limited amount of time using distributed computing, and making data available to enable actionable risk insights through analytical and response user interfaces.

What We Look For
Senior Developer in large projects across a global team of developers and risk managers.
Performance tune applications to improve memory and CPU utilization.
Perform statistical analyses to identify trends and exceptions related to Market Risk metrics.
Build internal and external reporting for the output of risk metric calculation using data extraction tools, such as SQL, and data visualization tools, such as Tableau.
Utilize web development technologies to facilitate application development for front end UI used for risk management actions.
Develop software for calculations using databases like Snowflake, Sybase IQ and distributed HDFS systems.
Interact with business users for resolving issues with applications.
Design and support batch processes using scheduling infrastructure for calculation and distributing data to other systems.
Oversee junior technical team members in all aspects of Software Development Life Cycle (SDLC) including design, code review and production migrations.

Skills And Experience
Bachelor's degree in Computer Science, Mathematics, Electrical Engineering or related technical discipline.
6-9 years' experience working in a risk technology team at another bank or financial institution; experience in market risk technology is a plus.
Experience with one or more major relational / object databases.
Experience in software development, including a clear understanding of data structures, algorithms, software design and core programming concepts.
Comfortable multi-tasking, managing multiple stakeholders and working as part of a team.
Comfortable working with multiple languages.
Technologies: Scala, Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant).
Experience in working with process scheduling platforms like Apache Airflow.
Should be ready to work with GS proprietary technology like Slang/SECDB.
An understanding of compute resources and the ability to interpret performance metrics (e.g., CPU, memory, threads, file handles).
Knowledge and experience in distributed computing – parallel computation on a single machine (e.g., Dask) and distributed processing on public cloud.
Knowledge of SDLC and experience in working through the entire life cycle of a project from start to end.

About Goldman Sachs
At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. Founded in 1869, we are a leading global investment banking, securities and investment management firm. Headquartered in New York, we maintain offices around the world. We believe who you are makes you better at what you do. We're committed to fostering and advancing diversity and inclusion in our own workplace and beyond by ensuring every individual within our firm has a number of opportunities to grow professionally and personally, from our training and development opportunities and firmwide networks to benefits, wellness and personal finance offerings and mindfulness programs. Learn more about our culture, benefits, and people at GS.com/careers. We're committed to finding reasonable accommodations for candidates with special needs or disabilities during our recruiting process. Learn more: https://www.goldmansachs.com/careers/footer/disability-statement.html
© The Goldman Sachs Group, Inc., 2023. All rights reserved. Goldman Sachs is an equal employment/affirmative action employer Female/Minority/Disability/Veteran/Sexual Orientation/Gender Identity
Posted 2 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Sanas is revolutionizing the way we communicate with the world's first real-time algorithm, designed to modulate accents, eliminate background noises, and magnify speech clarity. Pioneered by seasoned startup founders with a proven track record of creating and steering multiple unicorn companies, our groundbreaking GDP-shifting technology sets a gold standard. Sanas is a 200-strong team, established in 2020. In this short span, we've successfully secured over $100 million in funding. Our innovations have been supported by the industry's leading investors, including Insight Partners, Google Ventures, Quadrille Capital, General Catalyst, Quiet Capital, and other influential investors. Our reputation is further solidified by collaborations with numerous Fortune 100 companies. With Sanas, you're not just adopting a product; you're investing in the future of communication.

We're looking for a sharp, hands-on Data Engineer to help us build and scale the data infrastructure that powers cutting-edge audio and speech AI products. You'll be responsible for designing robust pipelines, managing high-volume audio data, and enabling machine learning teams to access the right data, fast. As one of the first dedicated data engineers on the team, you'll play a foundational role in shaping how we handle data end-to-end, from ingestion to training-ready features. You'll work closely with ML engineers, research scientists, and product teams to ensure data is clean, accessible, and structured for experimentation and production.

Key Responsibilities:
Build scalable, fault-tolerant pipelines for ingesting, processing, and transforming large volumes of audio and metadata.
Design and maintain ETL workflows for training and evaluating ML models, using tools like Airflow or custom pipelines.
Collaborate with ML research scientists to make raw and derived audio features (e.g., spectrograms, MFCCs) efficiently available for training and inference (see the feature-extraction sketch after this listing).
Manage and organize datasets, including labeling workflows, versioning, annotation pipelines, and compliance with privacy policies.
Implement data quality, observability, and validation checks across critical data pipelines.
Help optimize data storage and compute strategies for large-scale training.

Qualifications:
2–5 years of experience as a Data Engineer, Software Engineer, or similar role with a focus on data infrastructure.
Proficient in Python, SQL, and working with distributed data processing tools (e.g., Spark, Dask, Beam).
Experience with cloud data infrastructure (AWS/GCP), object storage (e.g., S3), and data orchestration tools.
Familiarity with audio data and its unique challenges (large file sizes, time-series features, metadata handling) is a strong plus.
Comfortable working in a fast-paced, iterative startup environment where systems are constantly evolving.
Strong communication skills and a collaborative mindset; you'll be working cross-functionally with ML, infra, and product teams.

Nice to Have:
Experience with data for speech models like ASR, TTS, or speaker verification.
Knowledge of real-time data processing (e.g., Kafka, WebSockets, or low-latency APIs).
Background in MLOps, feature engineering, or supporting model lifecycle workflows.
Experience with labeling tools, audio annotation platforms, or human-in-the-loop systems.

Joining us means contributing to the world's first real-time speech understanding platform revolutionizing Contact Centers and Enterprises alike. Our technology empowers agents, transforms customer experiences, and drives measurable growth.
But this is just the beginning. You'll be part of a team exploring the vast potential of an increasingly sonic future.
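A small illustrative sketch of the audio-feature derivation mentioned in the responsibilities (log-mel spectrograms and MFCCs), assuming librosa and a placeholder WAV file; it is not Sanas's actual pipeline.

```python
# Derive training-ready audio features from a waveform and cache them to disk.
import librosa
import numpy as np

y, sr = librosa.load("call_snippet.wav", sr=16000)            # mono waveform, placeholder file
mel = librosa.feature.melspectrogram(y=y, sr=sr, n_mels=80)
log_mel = librosa.power_to_db(mel)                            # log-mel spectrogram
mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)            # MFCC coefficients

np.savez("call_snippet_features.npz", log_mel=log_mel, mfcc=mfcc)
print(log_mel.shape, mfcc.shape)                              # (80, T) and (13, T) frames
```

At pipeline scale, a step like this would run per audio shard inside an orchestrated workflow, with the resulting feature arrays versioned alongside their source recordings.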
Posted 2 weeks ago
5.0 - 10.0 years
0 Lacs
Mumbai, Maharashtra, India
On-site
We are seeking a highly skilled and motivated Lead DS/ML engineer to join our team. The role is critical to the development of a cutting-edge reporting platform designed to measure and optimize online marketing campaigns. We are seeking a highly skilled Data Scientist / ML Engineer with a strong foundation in data engineering (ELT, data pipelines) and advanced machine learning to develop and deploy sophisticated models. The role focuses on building scalable data pipelines, developing ML models, and deploying solutions in production to support a cutting-edge reporting, insights, and recommendations platform for measuring and optimizing online marketing campaigns. The ideal candidate should be comfortable working across data engineering, ML model lifecycle, and cloud-native technologies. Job Description: Key Responsibilities: Data Engineering & Pipeline Development Design, build, and maintain scalable ELT pipelines for ingesting, transforming, and processing large-scale marketing campaign data. Ensure high data quality, integrity, and governance using orchestration tools like Apache Airflow, Google Cloud Composer, or Prefect. Optimize data storage, retrieval, and processing using BigQuery, Dataflow, and Spark for both batch and real-time workloads. Implement data modeling and feature engineering for ML use cases. Machine Learning Model Development & Validation Develop and validate predictive and prescriptive ML models to enhance marketing campaign measurement and optimization. Experiment with different algorithms (regression, classification, clustering, reinforcement learning) to drive insights and recommendations. Leverage NLP, time-series forecasting, and causal inference models to improve campaign attribution and performance analysis. Optimize models for scalability, efficiency, and interpretability. MLOps & Model Deployment Deploy and monitor ML models in production using tools such as Vertex AI, MLflow, Kubeflow, or TensorFlow Serving. Implement CI/CD pipelines for ML models, ensuring seamless updates and retraining. Develop real-time inference solutions and integrate ML models into BI dashboards and reporting platforms. Cloud & Infrastructure Optimization Design cloud-native data processing solutions on Google Cloud Platform (GCP), leveraging services such as BigQuery, Cloud Storage, Cloud Functions, Pub/Sub, and Dataflow. Work on containerized deployment (Docker, Kubernetes) for scalable model inference. Implement cost-efficient, serverless data solutions where applicable. Business Impact & Cross-functional Collaboration Work closely with data analysts, marketing teams, and software engineers to align ML and data solutions with business objectives. Translate complex model insights into actionable business recommendations. Present findings and performance metrics to both technical and non-technical stakeholders. Qualifications & Skills: Educational Qualifications: Bachelor’s or Master’s degree in Computer Science, Data Science, Machine Learning, Artificial Intelligence, Statistics, or a related field. Certifications in Google Cloud (Professional Data Engineer, ML Engineer) is a plus. Must-Have Skills: Experience: 5-10 years with the mentioned skillset & relevant hands-on experience Data Engineering: Experience with ETL/ELT pipelines, data ingestion, transformation, and orchestration (Airflow, Dataflow, Composer). ML Model Development: Strong grasp of statistical modeling, supervised/unsupervised learning, time-series forecasting, and NLP. 
Programming: Proficiency in Python (Pandas, NumPy, Scikit-learn, TensorFlow/PyTorch) and SQL for large-scale data processing.
Cloud & Infrastructure: Expertise in GCP (BigQuery, Vertex AI, Dataflow, Pub/Sub, Cloud Storage) or equivalent cloud platforms.
MLOps & Deployment: Hands-on experience with CI/CD pipelines, model monitoring, and version control (MLflow, Kubeflow, Vertex AI, or similar tools).
Data Warehousing & Real-time Processing: Strong knowledge of modern data platforms for batch and streaming data processing.

Nice-to-Have Skills:
Experience with Graph ML, reinforcement learning, or causal inference modeling.
Working knowledge of BI tools (Looker, Tableau, Power BI) for integrating ML insights into dashboards.
Familiarity with marketing analytics, attribution modeling, and A/B testing methodologies.
Experience with distributed computing frameworks (Spark, Dask, Ray).

Location: Bengaluru
Brand: Merkle
Time Type: Full time
Contract Type: Permanent
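As a loose sketch of the Airflow-style ELT orchestration this role describes, the DAG below chains a stubbed extraction task into a stubbed BigQuery load. The task bodies, schedule, and DAG id are assumptions for illustration only.

```python
# Two-step daily ELT DAG: extract raw campaign events, then load curated tables.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_campaign_data(**_):
    print("pull raw campaign events into cloud storage")   # stub

def load_to_bigquery(**_):
    print("load curated tables into BigQuery")              # stub

with DAG(
    dag_id="marketing_elt",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_campaign_data)
    load = PythonOperator(task_id="load", python_callable=load_to_bigquery)
    extract >> load   # load runs only after extraction succeeds
```

In practice each stub would be replaced by Dataflow, BigQuery, or Spark jobs, with data-quality checks and model-retraining tasks hung off the same dependency graph.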
Posted 3 weeks ago
4.0 years
0 Lacs
Noida, Uttar Pradesh, India
Remote
Who We Are
Ontic makes software that corporate and government security professionals use to proactively manage threats, mitigate risks, and make businesses stronger. Built by security and software professionals, the Ontic Platform connects and unifies critical data, business processes, and collaborators in one place, consolidating security intelligence and operations. We call this Connected Intelligence. Ontic serves corporate security teams across key functions, including intelligence, investigations, GSOC, executive protection, and security operations. As Ontic employees, we put our mission first and value the trust bestowed upon us by our clients to help keep their people safe. We approach our clients and each other with empathy while focusing on the execution of our strategy. And we have fun doing it.

Key Responsibilities
Design, develop, and optimize machine learning models for various business applications.
Build and maintain scalable AI feature pipelines for efficient data processing and model training.
Develop robust data ingestion, transformation, and storage solutions for big data.
Implement and optimize ML workflows, ensuring scalability and efficiency.
Monitor and maintain deployed models, ensuring performance, reliability, and retraining when necessary.

Qualifications And Experience
Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field.
4+ years of experience in machine learning, deep learning, or data science roles.
Proficiency in Python and ML frameworks/tools such as PyTorch and Langchain.
Experience with data processing frameworks like Spark, Dask, Airflow and Dagster.
Hands-on experience with cloud platforms (AWS, GCP, Azure) and ML services.
Experience with MLOps tools like MLflow and Kubeflow.
Familiarity with containerisation and orchestration tools like Docker and Kubernetes.
Excellent problem-solving skills and ability to work in a fast-paced environment.
Strong communication and collaboration skills.

Ontic Benefits & Perks
Competitive Salary
Medical Benefits
Internet Reimbursement
Home Office Stipend
Continued Education Stipend
Festive & Achievement Celebrations
Dynamic Office Environment

Ontic Benefits & Perks
Competitive Salary
Medical, Vision & Dental Benefits
401k
Stock Options
HSA Contribution
Learning Stipend
Flexible PTO Policy
Quarterly company ME (mental escape) days
Generous Parental Leave policy
Home Office Stipend
Mobile Phone Reimbursement
Home Internet Reimbursement for Remote Employees
Anniversary & Milestone Celebrations

Ontic is an equal opportunity employer. We are committed to a work environment that celebrates diversity. We do not discriminate against any individual based on race, color, sex, national origin, age, religion, marital status, sexual orientation, gender identity, gender expression, military or veteran status, disability, or any factors protected by applicable law.

All Ontic employees are expected to understand and adhere to all Ontic Security and Privacy related policies in order to protect Ontic data and our clients' data.
Posted 3 weeks ago
3.0 - 5.0 years
10 - 15 Lacs
Pune
Work from Office
Job Description: Sr. Software Engineer (Python)
Company: Karini AI
Location: Pune (Wakad)
Experience Required: 3 - 5 years
Compensation: Not Disclosed

Role Overview:
We are seeking a skilled Sr. Software Engineer with advanced Python skills, a passion for product development, and knowledge of Machine Learning and/or Generative AI. You will collaborate with a talented team of engineers and AI Engineers to design and develop a high-quality Generative AI platform on AWS.

Key Responsibilities:
Design and develop backend applications and APIs using Python.
Work on product development, building robust, scalable, and maintainable solutions.
Integrate Generative AI models into production environments to solve real-world problems.
Collaborate with cross-functional teams, including data scientists, product managers, and designers, to understand requirements and deliver solutions.
Optimize application performance and ensure scalability across cloud environments.
Write clean, maintainable, and efficient code while adhering to best practices.

Requirements:
3-5 years of hands-on experience in product development.
Demonstrable understanding of advanced Python concepts for building scalable systems.
Demonstrable experience working with a FastAPI server in a production environment (see the minimal sketch after this listing).
Familiarity with unit testing, version control and CI/CD.
Good understanding of Machine Learning concepts and frameworks (e.g., TensorFlow, PyTorch).
Experience with integrating and deploying ML models into applications is a plus.
Knowledge of database systems (SQL/NoSQL) and RESTful API development.
Exposure to containerization (Docker) and cloud platforms (AWS).
Strong problem-solving skills and attention to detail.

Preferred Qualifications:
Bachelor of Engineering in Computer Science, Information Technology, or any other engineering discipline; M.Tech, M.E. & B.E-Computer Science preferred.
Hands-on experience in product-focused organizations.
Experience working with data pipelines or data engineering tasks.
Knowledge of CI/CD pipelines and DevOps practices.
Familiarity with version control tools like Git.
Interest or experience in Generative AI or NLP applications.

What We Offer:
Top-tier compensation package, aligned with industry benchmarks.
Comprehensive employee benefits including Provident Fund (PF) and medical insurance.
Experience working with an ex-AWS founding team at one of the fastest-growing companies.
Work on innovative AI-driven products that solve complex problems.
Collaborate with a talented and passionate team in a dynamic environment.
Opportunities for professional growth and skill enhancement in Generative AI.
A supportive, inclusive, and flexible work culture that values creativity and ownership.
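A minimal FastAPI sketch in the spirit of the backend/API work listed above; the /summarize route, its request model, and the stubbed handler are illustrative assumptions rather than Karini's API.

```python
# Tiny FastAPI service exposing one POST endpoint with a validated request body.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class SummarizeRequest(BaseModel):
    text: str
    max_words: int = 50

@app.post("/summarize")
def summarize(req: SummarizeRequest) -> dict:
    # A real handler would call a Generative AI model; here we just truncate.
    return {"summary": " ".join(req.text.split()[: req.max_words])}

# Run locally with: uvicorn app:app --reload
```

In production the same app would typically sit behind a container image and a CI/CD pipeline, with the model call, auth, and observability layered in.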
Posted 3 weeks ago
5.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
As a Senior Data Scientist, you will drive data science initiatives from conception to deployment, crafting advanced ML models and providing mentorship to junior colleagues. Collaborate seamlessly across teams to integrate data-driven solutions and maintain data governance compliance. Stay abreast of industry trends, contribute to thought leadership, and innovate solutions for intricate business problems.

Responsibilities
Lead and manage data science projects from conception to deployment, ensuring alignment with business objectives and deadlines.
Develop and deploy AI and statistical algorithms to extract insights and drive actionable recommendations from complex datasets.
Provide guidance and mentorship to junior data scientists on advanced analytics techniques, coding best practices, and model interpretation.
Design rigorous testing frameworks to evaluate model performance, validate results, and iterate on models to improve accuracy and reliability.
Stay updated with the latest advancements in data science methodologies, tools, and technologies, and contribute to the team's knowledge base through sharing insights, attending conferences, and conducting research.
Establish and maintain data governance policies, ensuring data integrity, security, and compliance with regulations.

Qualifications
5+ years of prior analytics and data science experience in driving projects involving AI and Advanced Analytics.
3+ years of experience in Deep Learning frameworks, NLP/Text Analytics, SVM, LSTM, Transformers, and neural networks (see the small classical-ML sketch after this listing).
In-depth understanding and hands-on experience in working with Large Language Models, along with exposure to fine-tuning open-source models for a variety of use cases.
Strong exposure to prompt engineering, knowledge of vector databases, the LangChain framework, and data embeddings.
Strong problem-solving skills and the ability to iterate and experiment to optimize AI model behavior.
Proficiency in the Python programming language for data analysis, machine learning, and web development.
Hands-on experience with machine learning libraries such as NumPy, SciPy, and scikit-learn.
Experience with distributed computing frameworks such as PySpark and Dask.
Excellent problem-solving skills and attention to detail.
Ability to communicate effectively with diverse clients/stakeholders.

Education Background
Bachelor's in Computer Science, Statistics, Mathematics, or a related field. Tier I/II candidates preferred.
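A small classical-ML sketch touching the scikit-learn and SVM items above: a TF-IDF plus linear-SVM text classifier. The toy documents and labels are invented purely for illustration.

```python
# Text classification pipeline: TF-IDF features feeding a linear support vector machine.
from sklearn.pipeline import Pipeline
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC

docs = [
    "refund not processed",
    "great support experience",
    "card declined twice",
    "loved the onboarding",
]
labels = ["complaint", "praise", "complaint", "praise"]

clf = Pipeline([
    ("tfidf", TfidfVectorizer(ngram_range=(1, 2))),  # unigrams + bigrams
    ("svm", LinearSVC()),
])
clf.fit(docs, labels)

print(clf.predict(["payment failed again"]))          # expected: ['complaint']
```

In a real project the same pipeline object would be cross-validated, versioned, and served behind the testing framework the responsibilities describe.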
Posted 3 weeks ago
0 years
1 Lacs
India
On-site
We are looking for a passionate Python IoT developer to join our team at Magneto Dynamics.

About Magneto Dynamics
From a humble beginning in 1989, we have come a long way in our quest for innovation. For us, Quality has been a way of life, inbuilt in our system. Our business profile expanded with our quest for niche and innovative applications, and we ventured out to successfully develop intricate sub-assemblies and parts of flow meters for our OEM customers in the US and Europe. These include complete assemblies of components which are unique to the industry. Our parts range from complex assemblies like Clutch Calibrators and Pickup sensors to high-precision Turbine Rotors and other critical flow meter related parts. We now operate on a mixed model of design, manufacturing, assembly and outsourcing. From Aluminum and Zinc casting to Stainless Steel machining, we are involved in the entire gamut of servicing our customers' needs. Our specialization involves work related to casting, molding, machining and precision fabrication for high-precision and high-end applications. We have a special purpose Wire EDM machine and CNC machines for making precision parts. We also develop customized testing facilities and fixtures meeting customers' needs. We are geared up to bring out precision parts utilizing the most modern technologies, with the use of Solid Modelling, CNC turn mills, and VMCs. Our infrastructure is well established with lean manufacturing concepts. As an ISO 9001 certified company, we are focused on establishing a good Quality system. Our policy is to minimize waste in the supply chain, leading to tangible benefits for our customers. A strong management culture is inbuilt in our system, with a clear objective to delight customers. With a built-up space of 7000 sqft on 12000 sqft of our own land, and a supportive base of vendors and other business associates, our ecosystem is well prepared for all expansion plans.

Job Profile
You will be responsible for developing and implementing high-quality software solutions, creating complex applications using cutting-edge programming features and frameworks, and collaborating with other teams in the firm to define, design and ship new features. As an active part of our company, you will brainstorm and chalk out solutions to suit our requirements and meet our business goals. You will also be working on data engineering problems and building data pipelines. You will get ample opportunities to work on challenging and innovative projects, using the latest technologies and tools. If you enjoy working in a fast-paced and collaborative environment, we encourage you to apply for this exciting role. We offer industry-standard compensation packages, relocation assistance, and professional growth and development opportunities.

Objectives of this role
· Develop, test and maintain high-quality software using the Python and Embedded C programming languages.
· Participate in the entire software development lifecycle, building, testing and delivering high-quality solutions.
· Collaborate with cross-functional teams to identify and solve complex problems.
· Write clean and reusable code that can be easily maintained and scaled.

Your tasks
· Create large-scale IoT / embedded applications.
· Participate in code reviews, ensure code quality and identify areas for improvement to implement practical solutions.
· Debug code when required and troubleshoot any Python-related queries.
· Keep up to date with emerging trends and technologies in Python development.
Required skills and qualifications
· Bachelor's degree in Computer Science, Software Engineering or a related field (freshers welcome).
· Strong programming fundamentals.
· In-depth understanding of the Python software development stacks, ecosystems, frameworks and tools such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, scikit-learn and PyTorch.
· Familiarity with front-end development using HTML, CSS, and JavaScript.
· Familiarity with database technologies such as SQL and MySQL.
· Excellent problem-solving ability with solid communication and collaboration skills.

Interview Date: 09-06-2025
Interview Time: 10am onwards
Contact Person: Antony Peter - 9962048534
Interview Venue: Talent Pursuits, Magneto Dynamics, No 7, 8, 9 Venkateswar Ngr Main Road, Perungudi, Chennai, Tamil Nadu 600096
Google Map Link: https://goo.gl/maps/TSStgJfA7B7c7DMj7

Job Types: Full-time, Permanent
Pay: Up to ₹150,000.00 per year
Benefits: Provident Fund
Schedule: Day shift
Work Location: In person
Posted 3 weeks ago
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
As a Senior Data Scientist, you will drive data science initiatives from conception to deployment, crafting advanced ML models and providing mentorship to junior colleagues. Collaborate seamlessly across teams to integrate data-driven solutions and maintain data governance compliance. Stay abreast of industry trends, contribute to thought leadership, and innovate solutions for intricate business problems.

Responsibilities
Lead and manage data science projects from conception to deployment, ensuring alignment with business objectives and deadlines.
Develop and deploy AI and statistical algorithms to extract insights and drive actionable recommendations from complex datasets.
Provide guidance and mentorship to junior data scientists on advanced analytics techniques, coding best practices, and model interpretation.
Design rigorous testing frameworks to evaluate model performance, validate results, and iterate on models to improve accuracy and reliability.
Stay updated with the latest advancements in data science methodologies, tools, and technologies, and contribute to the team's knowledge base through sharing insights, attending conferences, and conducting research.
Establish and maintain data governance policies, ensuring data integrity, security, and compliance with regulations.

Qualifications
5+ years of prior analytics and data science experience in driving projects involving AI and Advanced Analytics.
3+ years of experience in Deep Learning frameworks, NLP/Text Analytics, SVM, LSTM, Transformers, and neural networks.
In-depth understanding and hands-on experience in working with Large Language Models, along with exposure to fine-tuning open-source models for a variety of use cases.
Strong exposure to prompt engineering, knowledge of vector databases, the LangChain framework, and data embeddings.
Strong problem-solving skills and the ability to iterate and experiment to optimize AI model behavior.
Proficiency in the Python programming language for data analysis, machine learning, and web development.
Hands-on experience with machine learning libraries such as NumPy, SciPy, and scikit-learn.
Experience with distributed computing frameworks such as PySpark and Dask.
Excellent problem-solving skills and attention to detail.
Ability to communicate effectively with diverse clients/stakeholders.

Education Background
Bachelor's in Computer Science, Statistics, Mathematics, or a related field. Tier I/II candidates preferred.
Posted 3 weeks ago
0.0 years
0 Lacs
Perungudi, Chennai, Tamil Nadu
On-site
We are looking for a passionate Python IoT developer to join our team at Magneto Dynamics.

About Magneto Dynamics
From a humble beginning in 1989, we have come a long way in our quest for innovation. For us, Quality has been a way of life, inbuilt in our system. Our business profile expanded with our quest for niche and innovative applications, and we ventured out to successfully develop intricate sub-assemblies and parts of flow meters for our OEM customers in the US and Europe. These include complete assemblies of components which are unique to the industry. Our parts range from complex assemblies like Clutch Calibrators and Pickup sensors to high-precision Turbine Rotors and other critical flow meter related parts. We now operate on a mixed model of design, manufacturing, assembly and outsourcing. From Aluminum and Zinc casting to Stainless Steel machining, we are involved in the entire gamut of servicing our customers' needs. Our specialization involves work related to casting, molding, machining and precision fabrication for high-precision and high-end applications. We have a special purpose Wire EDM machine and CNC machines for making precision parts. We also develop customized testing facilities and fixtures meeting customers' needs. We are geared up to bring out precision parts utilizing the most modern technologies, with the use of Solid Modelling, CNC turn mills, and VMCs. Our infrastructure is well established with lean manufacturing concepts. As an ISO 9001 certified company, we are focused on establishing a good Quality system. Our policy is to minimize waste in the supply chain, leading to tangible benefits for our customers. A strong management culture is inbuilt in our system, with a clear objective to delight customers. With a built-up space of 7000 sqft on 12000 sqft of our own land, and a supportive base of vendors and other business associates, our ecosystem is well prepared for all expansion plans.

Job Profile
You will be responsible for developing and implementing high-quality software solutions, creating complex applications using cutting-edge programming features and frameworks, and collaborating with other teams in the firm to define, design and ship new features. As an active part of our company, you will brainstorm and chalk out solutions to suit our requirements and meet our business goals. You will also be working on data engineering problems and building data pipelines. You will get ample opportunities to work on challenging and innovative projects, using the latest technologies and tools. If you enjoy working in a fast-paced and collaborative environment, we encourage you to apply for this exciting role. We offer industry-standard compensation packages, relocation assistance, and professional growth and development opportunities.

Objectives of this role
· Develop, test and maintain high-quality software using the Python and Embedded C programming languages.
· Participate in the entire software development lifecycle, building, testing and delivering high-quality solutions.
· Collaborate with cross-functional teams to identify and solve complex problems.
· Write clean and reusable code that can be easily maintained and scaled.

Your tasks
· Create large-scale IoT / embedded applications.
· Participate in code reviews, ensure code quality and identify areas for improvement to implement practical solutions.
· Debug code when required and troubleshoot any Python-related queries.
· Keep up to date with emerging trends and technologies in Python development.
Required skills and qualifications
· Bachelor's degree in Computer Science, Software Engineering or a related field (freshers welcome).
· Strong programming fundamentals.
· In-depth understanding of the Python software development stacks, ecosystems, frameworks and tools such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, scikit-learn and PyTorch.
· Familiarity with front-end development using HTML, CSS, and JavaScript.
· Familiarity with database technologies such as SQL and MySQL.
· Excellent problem-solving ability with solid communication and collaboration skills.

Interview Date: 09-06-2025
Interview Time: 10am onwards
Contact Person: Antony Peter - 9962048534
Interview Venue: Talent Pursuits, Magneto Dynamics, No 7, 8, 9 Venkateswar Ngr Main Road, Perungudi, Chennai, Tamil Nadu 600096
Google Map Link: https://goo.gl/maps/TSStgJfA7B7c7DMj7

Job Types: Full-time, Permanent
Pay: Up to ₹150,000.00 per year
Benefits: Provident Fund
Schedule: Day shift
Work Location: In person
Posted 3 weeks ago
0.0 years
0 Lacs
Hyderabad / Secunderabad, Telangana, Telangana, India
On-site
Ready to build the future with AI?
At Genpact, we don't just keep up with technology - we set the pace. AI and digital innovation are redefining industries, and we're leading the charge. Genpact's industry-first accelerator is an example of how we're scaling advanced technology solutions to help global enterprises work smarter, grow faster, and transform at scale. From large-scale models onward, our breakthrough solutions tackle companies' most complex challenges. If you thrive in a fast-moving, innovation-driven environment, love building and deploying cutting-edge AI solutions, and want to push the boundaries of what's possible, this is your moment.

Genpact (NYSE: G) is an advanced technology services and solutions company that delivers lasting value for leading enterprises globally. Through our deep business knowledge, operational excellence, and cutting-edge solutions, we help companies across industries get ahead and stay ahead. Powered by curiosity, courage, and innovation, our teams implement data, technology, and AI to create tomorrow, today.

Inviting applications for the role of Consultant - PySpark/Python Data Engineer!

We are looking for a passionate Python developer to join our team at Genpact. You will be responsible for developing and implementing high-quality software solutions for data transformation and analytics using cutting-edge programming features and frameworks, and collaborating with other teams in the firm to define, design and ship new features. As an active part of our company, you will brainstorm and chalk out solutions to suit our requirements and meet our business goals. You will also be working on data engineering problems and building data pipelines. You will get ample opportunities to work on challenging and innovative projects, using the latest technologies and tools. If you enjoy working in a fast-paced and collaborative environment, we encourage you to apply for this exciting role. We offer industry-standard compensation packages, relocation assistance, and professional growth and development opportunities.

Responsibilities
Develop, test and maintain high-quality solutions using the PySpark/Python programming languages.
Participate in the entire software development lifecycle, building, testing and delivering high-quality data pipelines.
Collaborate with cross-functional teams to identify and solve complex problems.
Write clean and reusable code that can be easily maintained and scaled.
Keep up to date with emerging trends and technologies in Python development.

Qualifications we seek in you!
Minimum qualifications
… years of experience as a Python Developer with a strong portfolio of projects.
Bachelor's degree in Computer Science, Software Engineering or a related field.
Experience in developing pipelines on cloud platforms such as AWS or Azure using AWS Glue or ADF.
In-depth understanding of the Python software development stacks, ecosystems, frameworks and tools such as NumPy, SciPy, Pandas, Dask, spaCy, NLTK, Great Expectations, Splink and PyTorch.
Experience with data platforms such as Databricks/Snowflake.
Experience with front-end development using HTML or Python.
Familiarity with database technologies such as SQL and NoSQL.
Excellent problem-solving ability with solid communication and collaboration skills.

Preferred skills and qualifications
Experience with popular Python frameworks such as Django, Flask, FastAPI or Pyramid.
Knowledge of GenAI concepts and LLMs.
Contributions to open-source Python projects or active involvement in the Python community.

Why join Genpact
Lead AI-first transformation - Build and scale AI solutions that redefine industries.
Make an impact - Drive change for global enterprises and solve business challenges that matter.
Accelerate your career - Gain hands-on experience, world-class training, mentorship, and AI certifications to advance your skills.
Grow with the best - Learn from top engineers, data scientists, and AI experts in a dynamic, fast-moving workplace.
Committed to ethical AI - Work in an environment where governance, transparency, and security are at the core of everything we build.
Thrive in a values-driven culture - Our courage, curiosity, and incisiveness, built on a foundation of integrity and inclusion, allow your ideas to fuel progress.

Come join the 140,000+ coders, tech shapers, and growth makers at Genpact and take your career in the only direction that matters: up. Let's build tomorrow together.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values respect and integrity, customer focus, and innovation.

Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 3 weeks ago
0 years
0 Lacs
Madhya Pradesh, India
On-site
Job Overview:
We are looking for an AI/ML Developer to join our team of researchers, data scientists, and developers. You will work on cutting-edge AI solutions across industries such as commerce, agriculture, insurance, financial markets, and procurement. Your role involves developing and optimizing machine learning and generative AI models to solve real-world challenges.

Key Responsibilities:
• Develop and optimize ML, NLP, Deep Learning, and Generative AI models.
• Research and implement state-of-the-art algorithms for supervised and unsupervised learning.
• Work with large-scale datasets in distributed environments.
• Understand business processes to select and apply the best ML approaches.
• Ensure scalability and performance of ML solutions.
• Collaborate with cross-functional teams, including product owners, designers, and developers.
• Solve complex data integration and deployment challenges.
• Communicate results effectively using data visualization.
• Work in global teams across different time zones.

Required Skills & Experience:
• Strong experience in Machine Learning, Deep Learning, NLP, and Generative AI.
• Hands-on expertise in frameworks like TensorFlow, PyTorch, or Hugging Face Transformers.
• Experience with LLMs (Large Language Models), model fine-tuning, and prompt engineering.
• Proficiency in Python, R, or Scala for ML development.
• Knowledge of cloud-based ML platforms (AWS, Azure, GCP).
• Experience with big data processing (Spark, Hadoop, or Dask).
• Ability to scale ML models from prototypes to production.
• Strong analytical and problem-solving skills.

If you're passionate about pushing the boundaries of ML and GenAI, we'd love to hear from you!
Posted 3 weeks ago
10.0 years
0 Lacs
Pune, Maharashtra, India
Remote
Entity: Finance
Job Family Group: Subsurface Group

Job Description:
We are a global energy business involved in every aspect of the energy system. We are working towards delivering light, heat, and mobility to millions of people every day. We are one of the very few companies equipped to solve some of the big complex challenges that matter for the future. We have a real contribution to make to the world's ambition of a low-carbon future. Join us and be part of what we can accomplish together. You can participate in our new ambition to become a net zero company by 2050 or sooner and help the world get to net zero. Would you like to discover how our diverse, hardworking people are owning the way in making energy cleaner and better – and how you can play your part in our world-class team? Join our Finance Team and advance your career as Data Scientist Manager!

ROLE SYNOPSIS:
The role of Staff data scientist is a senior-level position within a data-science team, responsible for leading and contributing to sophisticated data analysis, modeling, and machine learning projects. This role plays a pivotal part in extracting actionable insights, driving strategic decision-making, and enhancing business processes through data-driven solutions.

KEY ACCOUNTABILITIES:
Advanced Data Analysis and Modeling: Feature selection and dimensionality reduction; model evaluation and validation; data modeling techniques.
Domain knowledge: Expertise in the domain in which the data scientist operates is critical for asking relevant questions and creating significant solutions.
Business Insight: Understanding of business operations, market dynamics, and financial implications to prioritize data science projects that align with the FDO's goals.
Out-of-core computing: Use libraries that support out-of-core computing, such as Dask in Python. These libraries can process data that doesn't fit into memory by reading it in smaller portions from disk (see the short Dask sketch after this listing).
Machine Learning: Innovation and strategy - technical direction on models and techniques in the team; advanced machine learning skills for complex models.
Partnership and Communication: Effectively communicate with non-technical customers; build domain process understanding with SMEs; communicate findings through visualization, working with visualization and reporting tools.
Continuous Learning: Stay relevant through technical data science with domain understanding.
Data Ethics and Privacy: Anonymization and pseudonymization; data retention policies.
Database Management: Knowledge of working with databases and querying data using SQL or NoSQL databases is valuable for accessing and extracting data.
Project Management: Ability to lead data science projects effectively, including prioritizing, planning, and delivering results within set timelines.
Statistical Analysis and Mathematics: A solid grasp of statistical methods and mathematical concepts is needed for data analysis, modeling, and drawing significant insights from data.

EXPERIENCE AND JOB REQUIREMENTS:
Overall: DS&T - Data & Analysis [Data Science Team] plays a crucial role in driving data-informed decision-making and generating actionable insights to support the company's goals.
This team is responsible for processing, analyzing, and interpreting large and sophisticated datasets from multiple sources to provide valuable insights and recommendations across various domains.

An experienced professional (typically 10+ years) with a master's degree or equivalent experience in a quantitative or qualitative field such as Computer Science, Statistics, Mathematics, Physics, Engineering, or a related data field is often required.

Skills: Leadership role in data analysis; programming proficiency in Python, SQL, and Azure Databricks; statistics and mathematics; leadership qualities to steer the team; strategic direction and technical expertise.
Soft skills: Active listening; translating business problems into data questions; communication and partnership.
Data Sources: SAP, Concur, Salesforce, Workday, Excel files.
Able to prepare analytical reports, presentations and/or visualization dashboards to communicate findings, KPIs and insights to both technical and non-technical partners.
Stay up to date with industry trends, standard methodologies and new technologies in data analytics, machine learning, and data science techniques.

DESIRABLE CRITERIA:
Certified in SQL, machine learning, Azure Databricks.
Extensive experience (typically 10+ years).
Effective team manager, managing a team of 6-8 individuals and leading senior data scientists and/or data engineers.
Tools and Libraries: Pandas, PySpark, NumPy, and SciPy; NLP; fuzzy matching logic.
Experience with SAP systems and data structures, including SAP ECC, SAP S/4HANA, or SAP BW.

Travel Requirement: Up to 10% travel should be expected with this role.
Relocation Assistance: This role is eligible for relocation within country.
Remote Type: This position is a hybrid of office/remote working.

Legal Disclaimer:
We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, socioeconomic status, neurodiversity/neurocognitive functioning, veteran status or disability status. Individuals with an accessibility need may request an adjustment/accommodation related to bp's recruiting process (e.g., accessing the job application, completing required assessments, participating in telephone screenings or interviews, etc.). If you would like to request an adjustment/accommodation related to the recruitment process, please contact us. If you are selected for a position and depending upon your role, your employment may be contingent upon adherence to local policy. This may include pre-placement drug screening, medical review of physical fitness for the role, and background checks.
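A short sketch of the out-of-core pattern described in the accountabilities: Dask reads CSV partitions lazily, so the aggregation can run over data larger than memory. The S3 path and column names are placeholders.

```python
# Aggregate expense data that does not fit in memory, partition by partition.
import dask.dataframe as dd

# Each matching CSV becomes one or more lazy partitions; nothing is loaded yet.
expenses = dd.read_csv("s3://finance-raw/concur/*.csv", dtype={"amount": "float64"})

by_cost_center = expenses.groupby("cost_center")["amount"].sum()

# compute() triggers the partition-wise work and returns a regular pandas Series.
print(by_cost_center.compute().nlargest(10))
```

The same code shape scales from a laptop to a Dask cluster, which is what makes the out-of-core approach attractive for the SAP/Concur-sized datasets this role mentions.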
Posted 3 weeks ago
0 years
0 Lacs
Navi Mumbai, Maharashtra, India
Remote
As an expectation, a fitting candidate must have/be: Ability to analyze business problems and cut through data challenges. Ability to churn the raw corpus and develop data/ML models that provide business analytics (not just EDA), machine-learning-based document processing, and information retrieval. Quick to develop POCs and transform them into high-scale, production-ready code. Experience in extracting data from complex unstructured documents using NLP-based technologies. Good to have: Document analysis using image processing/computer vision and geometric deep learning. Technology Stack: Python as the primary programming language. Conceptual understanding of classic ML/DL algorithms like Regression, Support Vectors, Decision Trees, Clustering, Random Forest, CART, Ensembles, Neural Networks, CNN, RNN, LSTM, etc. Programming: Must Have: Hands-on with data structures using lists, tuples, dictionaries, collections, iterators, Pandas, NumPy, and object-oriented programming. Good to have: Design patterns/system design, Cython. ML libraries: Must Have: Scikit-learn, XGBoost, imblearn, SciPy, Gensim. Good to have: Matplotlib/Plotly, LIME/SHAP. Data extraction and handling: Must Have: Dask/Modin, BeautifulSoup/Scrapy, multiprocessing. Good to have: Data augmentation, PySpark, Accelerate. NLP/Text analytics: Must Have: Bag of words, text ranking algorithms, Word2vec, language models, entity recognition, CRF/HMM, topic modelling, sequence-to-sequence. Good to have: Machine comprehension, translation, Elasticsearch. Deep learning: Must Have: TensorFlow/PyTorch, neural nets, sequential models, CNN, LSTM/GRU/RNN, attention, Transformers, residual networks. Good to have: Knowledge of optimization, distributed training/computing, language models. Software peripherals: Must Have: REST services, SQL/NoSQL, UNIX, code versioning. Good to have: Docker containers, data versioning. Research: Must Have: Well versed with the latest trends in the ML and DL area; zeal to research and implement cutting-edge areas of AI to solve complex problems. Good to have: Contributions to research papers/patents in ML and DL published online. Morningstar is an equal opportunity employer. Morningstar’s hybrid work environment gives you the opportunity to work remotely and collaborate in-person each week. We’ve found that we’re at our best when we’re purposely together on a regular basis, at least three days each week. A range of other benefits are also available to enhance flexibility as needs change. No matter where you are, you’ll have tools and resources to engage meaningfully with your global colleagues. I10_MstarIndiaPvtLtd Morningstar India Private Ltd. (Delhi) Legal Entity
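As a rough illustration of the bag-of-words / text-ranking part of the stack named above, here is a minimal, hedged sketch that ranks documents against a query with TF-IDF and cosine similarity using scikit-learn. The sample documents and query are invented for illustration and are not data from the posting.

```python
# Minimal TF-IDF document-ranking sketch (illustrative only; the documents
# and query below are made-up examples).
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

documents = [
    "Quarterly revenue grew due to strong fund inflows.",
    "The prospectus lists management fees and expense ratios.",
    "Board meeting minutes discuss the dividend policy.",
]
query = "fund fees and expenses"

vectorizer = TfidfVectorizer(stop_words="english")
doc_matrix = vectorizer.fit_transform(documents)           # bag-of-words with TF-IDF weights
query_vec = vectorizer.transform([query])

scores = cosine_similarity(query_vec, doc_matrix).ravel()  # one relevance score per document
ranked = sorted(zip(scores, documents), reverse=True)

for score, doc in ranked:
    print(f"{score:.3f}  {doc}")
```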
Posted 3 weeks ago
0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Senior Platform Engineer We’re hiring a Senior Platform Engineer – ML Infrastructure in the AI/ML infrastructure and deep-tech industry. We’re seeking an experienced engineer to join our core infrastructure team. This role will be critical in designing and scaling the foundational systems that power AI products. If you're passionate about building robust, efficient, and innovative ML infrastructure, we’d love to hear from you. What you’ll do Design, build and operate scalable ML & data infrastructure across on‑prem and cloud (AWS/Azure/GCP). Stand up and automate multi‑node Kubernetes + GPU clusters; keep them healthy and cost‑efficient. Create golden‑path CI/CD & MLOps pipelines (Kubeflow/Flyte/Ray) for training, serving, RAG and agentic workflows. Partner with ML engineers to debug thorny CUDA/K8s issues before they hit prod. Champion IaC (Terraform/Pulumi) and config‑as‑code (Ansible) standards. Mentor developers on platform best‑practices and drive a platform‑first mindset. What makes you a great fit 5+ yrs DevOps/SRE/Platform engineering; 2+ yrs with ML infra at scale. Deep hands‑on with Docker, Kubernetes, Helm and kube‑native tooling. Comfort with distributed GPU scheduling, CUDA drivers and networking. Strong Terraform/Pulumi, Ansible, Bash/Python skills. Experience operating data lakes, high‑availability databases and object stores. Familiarity with ML orchestration (Kubeflow, Flyte, Prefect) and model registries. Working knowledge of RAG, LLM fine‑tuning or agentic frameworks is a big plus. Nice to have Experience with Ray, Spark, or Dask. Security & RBAC design chops. OSS contributions in the cloud‑native / MLOps space. Skills: Data, ML, Computing
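One small, hedged illustration of the GPU-scheduling side of this role: building a Kubernetes pod manifest that requests a GPU through the nvidia.com/gpu extended resource. The pod name, image tag, and command are placeholders; in practice the manifest would be submitted via kubectl or the Kubernetes API rather than just printed.

```python
# Sketch of a Kubernetes pod manifest requesting one GPU (illustrative;
# "train-job", the image tag, and the command are placeholders).
import json

pod_manifest = {
    "apiVersion": "v1",
    "kind": "Pod",
    "metadata": {"name": "train-job"},
    "spec": {
        "restartPolicy": "Never",
        "containers": [
            {
                "name": "trainer",
                "image": "registry.example.com/ml/train:latest",  # placeholder image
                "command": ["python", "train.py"],
                "resources": {
                    # The NVIDIA device plugin exposes GPUs as an extended
                    # resource, so the scheduler only places this pod on a GPU node.
                    "limits": {"nvidia.com/gpu": 1},
                },
            }
        ],
    },
}

# Print as JSON; a real pipeline would apply this with kubectl or the API client.
print(json.dumps(pod_manifest, indent=2))
```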
Posted 3 weeks ago
0 years
0 Lacs
Belgaum, Karnataka, India
On-site
Senior Platform Engineer We’re hiring a Senior Platform Engineer – ML Infrastructure in the AI/ML infrastructure and deep-tech industry. We’re seeking an experienced engineer to join our core infrastructure team. This role will be critical in designing and scaling the foundational systems that power AI products. If you're passionate about building robust, efficient, and innovative ML infrastructure, we’d love to hear from you. What you’ll do Design, build and operate scalable ML & data infrastructure across on‑prem and cloud (AWS/Azure/GCP). Stand up and automate multi‑node Kubernetes + GPU clusters; keep them healthy and cost‑efficient. Create golden‑path CI/CD & MLOps pipelines (Kubeflow/Flyte/Ray) for training, serving, RAG and agentic workflows. Partner with ML engineers to debug thorny CUDA/K8s issues before they hit prod. Champion IaC (Terraform/Pulumi) and config‑as‑code (Ansible) standards. Mentor developers on platform best‑practices and drive a platform‑first mindset. What makes you a great fit 5+ yrs DevOps/SRE/Platform engineering; 2+ yrs with ML infra at scale. Deep hands‑on with Docker, Kubernetes, Helm and kube‑native tooling. Comfort with distributed GPU scheduling, CUDA drivers and networking. Strong Terraform/Pulumi, Ansible, Bash/Python skills. Experience operating data lakes, high‑availability databases and object stores. Familiarity with ML orchestration (Kubeflow, Flyte, Prefect) and model registries. Working knowledge of RAG, LLM fine‑tuning or agentic frameworks is a big plus. Nice to have Experience with Ray, Spark, or Dask. Security & RBAC design chops. OSS contributions in the cloud‑native / MLOps space. Skills: Data, ML, Computing
Posted 3 weeks ago
5.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Position Overview Job Title: Senior Python Engineer – (Credit Risk Technology) Corporate Title: Assistant Vice President Location: Pune, India Role Description We are looking for an experienced Senior Python Engineer to lead the development of robust, scalable, and high-performance software solutions. The ideal candidate will have extensive experience in Python programming, system design, and mentoring junior developers. You will play a key role in designing and implementing complex systems, ensuring code quality, and driving technical excellence within the team. What We’ll Offer You As part of our flexible scheme, here are just some of the benefits that you’ll enjoy Best in class leave policy Gender neutral parental leaves 100% reimbursement under childcare assistance benefit (gender neutral) Sponsorship for industry-relevant certifications and education Employee Assistance Program for you and your family members Comprehensive Hospitalization Insurance for you and your dependents Accident and Term Life Insurance Complimentary health screening for 35 yrs. and above Your Key Responsibilities Develop, test, and maintain Python-based applications and services. Write clean, efficient, and reusable code following best practices. Collaborate with product managers, designers, and other engineers to implement new features. Optimize application performance and scalability. Debug and resolve software defects and issues. Integrate third-party APIs and libraries as needed. Participate in code reviews to ensure code quality and consistency. Stay updated with the latest trends and advancements in Python and related technologies. Your Skills And Experience 5+ years of professional experience in Python development. Strong expertise in Python frameworks (e.g., Django, Flask, FastAPI). Experience with system design, architecture, and scalable application development. Proficiency in working with relational and non-relational databases (e.g., PostgreSQL, MySQL, MongoDB). Solid understanding of RESTful APIs, microservices, and distributed systems. Experience with cloud platforms (e.g., AWS, Azure, GCP) and containerization tools (e.g., Docker, Kubernetes). Familiarity with CI/CD pipelines and DevOps practices. Strong problem-solving skills and attention to detail. Excellent communication and leadership abilities. Preferred Qualifications: Experience with big data processing tools (e.g., Apache Spark, Dask). Knowledge of asynchronous programming and event-driven architectures. Familiarity with machine learning frameworks (e.g., TensorFlow, PyTorch) is a plus. Bachelor's or Master’s degree in Computer Science, Engineering, or a related field. How We’ll Support You Training and development to help you excel in your career Coaching and support from experts in your team A culture of continuous learning to aid progression A range of flexible benefits that you can tailor to suit your needs About Us And Our Teams Please visit our company website for further information: https://www.db.com/company/company.htm We strive for a culture in which we are empowered to excel together every day. This includes acting responsibly, thinking commercially, taking initiative and working collaboratively. Together we share and celebrate the successes of our people. Together we are Deutsche Bank Group. We welcome applications from all people and promote a positive, fair and inclusive work environment.
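Since the role leans on Python web frameworks and RESTful services, here is a minimal, hedged FastAPI sketch of the kind of service it describes. The /positions resource, its fields, and the in-memory store are invented for the example and are not part of any existing system.

```python
# Minimal FastAPI service sketch (illustrative; the /positions resource and
# its fields are made up for the example).
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI()

class Position(BaseModel):
    id: int
    counterparty: str
    notional: float

_store = {}  # stand-in for a real database

@app.post("/positions")
def create_position(position: Position):
    """Store a position and echo it back."""
    _store[position.id] = position
    return position

@app.get("/positions/{position_id}")
def read_position(position_id: int):
    """Return a stored position or a 404 if it does not exist."""
    if position_id not in _store:
        raise HTTPException(status_code=404, detail="position not found")
    return _store[position_id]

# Run locally with: uvicorn app:app --reload  (assuming this file is app.py)
```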
Posted 3 weeks ago
0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Company Description NielsenIQ is a consumer intelligence company that delivers the Full View™, the world’s most complete and clear understanding of consumer buying behavior that reveals new pathways to growth. Since 1923, NIQ has moved measurement forward for industries and economies across the globe. We are putting the brightest and most dedicated minds together to accelerate progress. Our diversity brings out the best in each other so we can leave a lasting legacy on the work that we do and the people that we do it with. NielsenIQ offers a range of products and services that leverage Machine Learning and Artificial Intelligence to provide insights into consumer behavior and market trends. This position opens the opportunity to apply the latest state of the art in AI/ML and data science to global and key strategic projects. Job Description We are looking for a Senior Research Scientist with a data-centric mindset to join our applied research and innovation team. The ideal candidate will have a strong background in machine learning, deep learning, operationalization of AI/ML and process automation. You will be responsible for analyzing data, researching the most appropriate techniques, and the development, testing, support and delivery of proof of concepts to resolve real-world and large-scale challenging problems. Job Responsibilities Develop and apply machine learning innovations with minimal technical supervision. Understand the requirements from stakeholders and be able to communicate results and conclusions in a way that is accurate, clear and winsome. Perform feasibility studies and analyse data to determine the most appropriate solution. Work on many different data challenges, always ensuring a combination of simplicity, scalability, reproducibility and maintainability within the ML solutions and source code. Both data and software must be developed and maintained with high-quality standards and minimal defects. Collaborate with other technical fellows on the integration and deployment of ML solutions. Work as a member of a team, encouraging team building and motivation, and cultivating effective team relations. Qualifications Essential Requirements Bachelor's degree in Computer Science or an equivalent numerate discipline Demonstrated senior experience in Machine Learning, Deep Learning & other AI fields Experience working with large datasets, production-grade code & operationalization of ML solutions EDA analysis & practical hands-on experience with datasets, ML models (PyTorch or TensorFlow) & evaluations Able to understand scientific papers & develop the idea into executable code Analytical mindset, problem solving & logical thinking capabilities Proactive attitude, constructive, intellectual curiosity & persistence to find answers to questions A high level of interpersonal & communication skills in English & strong ability to meet deadlines Python, PyTorch, Git, pandas, Dask, Polars, scikit-learn, Hugging Face, Docker, Databricks Desired Skills Master's degree and/or specialization courses in AI/ML.
PhD in science is an added value Experience in MLOps (MLflow, Prefect) & deployment of AI/ML solutions to the cloud (Azure preferred) Understanding & practice of LLMs & Generative AI (prompt engineering, RAG) Experience with Robotic Process Automation, Time Series Forecasting & Predictive modeling A practical grasp of databases (SQL, Elasticsearch, Pinecone, Faiss) Previous experience in retail, consumer, ecommerce, business, FMCG products (NielsenIQ portfolio) Additional Information With @NielsenIQ, we’re now an even more diverse team of 40,000 people – each with their own stories Our increasingly diverse workforce empowers us to better reflect the diversity of the markets we measure. Our Benefits Flexible working environment Volunteer time off LinkedIn Learning Employee-Assistance-Program (EAP) About NIQ NIQ is the world’s leading consumer intelligence company, delivering the most complete understanding of consumer buying behavior and revealing new pathways to growth. In 2023, NIQ combined with GfK, bringing together the two industry leaders with unparalleled global reach. With a holistic retail read and the most comprehensive consumer insights—delivered with advanced analytics through state-of-the-art platforms—NIQ delivers the Full View™. NIQ is an Advent International portfolio company with operations in 100+ markets, covering more than 90% of the world’s population. For more information, visit NIQ.com Want to keep up with our latest updates? Follow us on: LinkedIn | Instagram | Twitter | Facebook Our commitment to Diversity, Equity, and Inclusion NIQ is committed to reflecting the diversity of the clients, communities, and markets we measure within our own workforce. We exist to count everyone and are on a mission to systematically embed inclusion and diversity into all aspects of our workforce, measurement, and products. We enthusiastically invite candidates who share that mission to join us. We are proud to be an Equal Opportunity/Affirmative Action Employer, making decisions without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability status, age, marital status, protected veteran status or any other protected class. Our global non-discrimination policy covers these protected classes in every market in which we do business worldwide. Learn more about how we are driving diversity and inclusion in everything we do by visiting the NIQ News Center: https://nielseniq.com/global/en/news-center/diversity-inclusion
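As a small, hedged illustration of the hands-on PyTorch modelling and evaluation the listing asks for, the sketch below fits a tiny regression model on synthetic data. The architecture, data, and hyperparameters are invented for the example.

```python
# Tiny PyTorch training/evaluation sketch on synthetic data (illustrative only;
# the model, data, and hyperparameters are assumptions).
import torch
from torch import nn

torch.manual_seed(0)
X = torch.randn(256, 4)
y = X @ torch.tensor([1.5, -2.0, 0.5, 3.0]) + 0.1 * torch.randn(256)

model = nn.Sequential(nn.Linear(4, 16), nn.ReLU(), nn.Linear(16, 1))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.MSELoss()

for epoch in range(200):              # simple full-batch training loop
    optimizer.zero_grad()
    pred = model(X).squeeze(-1)
    loss = loss_fn(pred, y)
    loss.backward()
    optimizer.step()

with torch.no_grad():                 # quick evaluation pass
    mse = loss_fn(model(X).squeeze(-1), y).item()
print(f"final MSE: {mse:.4f}")
```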
Posted 3 weeks ago
1.0 - 2.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Job Description The Risk Business identifies, monitors, evaluates, and manages the firm’s financial and non-financial risks in support of the firm’s Risk Appetite Statement and the firm’s strategic plan. Operating in a fast-paced and dynamic environment and utilizing the best in class risk tools and frameworks, Risk teams are analytically curious, have an aptitude to challenge, and an unwavering commitment to excellence. Overview To ensure uncompromising accuracy and timeliness in the delivery of the risk metrics, our platform is continuously growing and evolving. Market Risk Engineering combines the principles of Computer Science, Mathematics and Finance to produce large scale, computationally intensive calculations of risk Goldman Sachs faces with each transaction we engage in. Market Risk Engineering has an opportunity for an Associate level Software Engineer to work across a broad range of applications and an extremely diverse set of technologies to keep the suite operating at peak efficiency. As an Engineer in the Risk Engineering organization, you will have the opportunity to impact one or more aspects of risk management. You will work with a team of talented engineers to drive the build & adoption of common tools, platforms, and applications. The team builds solutions that are offered as a software product or as a hosted service. We are a dynamic team of talented developers and architects who partner with business areas and other technology teams to deliver high profile projects using a raft of technologies that are fit for purpose (Java, Cloud computing, HDFS, Spark, S3, ReactJS, Sybase IQ among many others). A glimpse of the interesting problems that we engineer solutions for includes acquiring high quality data, storing it, performing risk computations in a limited amount of time using distributed computing, and making data available to enable actionable risk insights through analytical and response user interfaces. What We Look For Senior Developer in large projects across a global team of developers and risk managers Performance tune applications to improve memory and CPU utilization. Perform statistical analyses to identify trends and exceptions related to Market Risk metrics. Build internal and external reporting for the output of risk metric calculation using data extraction tools, such as SQL, and data visualization tools, such as Tableau. Utilize web development technologies to facilitate application development for front-end UI used for risk management actions Develop software for calculations using databases like Snowflake, Sybase IQ and distributed HDFS systems. Interact with business users for resolving issues with applications. Design and support batch processes using scheduling infrastructure for calculation and distributing data to other systems. Oversee junior technical team members in all aspects of Software Development Life Cycle (SDLC) including design, code review and production migrations. Skills And Experience Bachelor’s degree in Computer Science, Mathematics, Electrical Engineering or related technical discipline 1-2 years’ experience working in a risk technology team at another bank or financial institution. Experience in market risk technology is a plus. Experience with one or more major relational / object databases.
Experience in software development, including a clear understanding of data structures, algorithms, software design and core programming concepts Comfortable multi-tasking, managing multiple stakeholders and working as part of a team Comfortable working with multiple languages Technologies: Scala, Java, Python, Spark, Linux and shell scripting, TDD (JUnit), build tools (Maven/Gradle/Ant) Experience in working with process scheduling platforms like Apache Airflow. Should be ready to work in GS proprietary technologies like Slang/SECDB An understanding of compute resources and the ability to interpret performance metrics (e.g., CPU, memory, threads, file handles). Knowledge and experience in distributed computing – parallel computation on a single machine using libraries like Dask, and distributed processing on public cloud. Knowledge of SDLC and experience working through the entire life cycle of a project from start to end About Goldman Sachs At Goldman Sachs, we commit our people, capital and ideas to help our clients, shareholders and the communities we serve to grow. Founded in 1869, we are a leading global investment banking, securities and investment management firm. Headquartered in New York, we maintain offices around the world. We believe who you are makes you better at what you do. We're committed to fostering and advancing diversity and inclusion in our own workplace and beyond by ensuring every individual within our firm has a number of opportunities to grow professionally and personally, from our training and development opportunities and firmwide networks to benefits, wellness and personal finance offerings and mindfulness programs. Learn more about our culture, benefits, and people at GS.com/careers. We’re committed to finding reasonable accommodations for candidates with special needs or disabilities during our recruiting process. Learn more: https://www.goldmansachs.com/careers/footer/disability-statement.html © The Goldman Sachs Group, Inc., 2023. All rights reserved. Goldman Sachs is an equal opportunity employer and does not discriminate on the basis of race, color, religion, sex, national origin, age, veterans status, disability, or any other characteristic protected by applicable law.
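The listing mentions batch processes scheduled with platforms like Apache Airflow; below is a minimal, hedged DAG sketch targeting Airflow 2.x. The dag_id, the daily schedule, and the placeholder task logic are assumptions for illustration, not an actual risk pipeline.

```python
# Minimal Airflow 2.x DAG sketch for a daily batch job (illustrative; dag_id,
# schedule, and task bodies are assumptions).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_positions():
    print("extracting positions...")    # placeholder for a SQL extract step

def compute_risk_metrics():
    print("computing risk metrics...")  # placeholder for the risk calculation

with DAG(
    dag_id="daily_risk_batch",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_positions)
    compute = PythonOperator(task_id="compute", python_callable=compute_risk_metrics)
    extract >> compute  # run the extract before the risk computation
```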
Posted 3 weeks ago
3.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
About Us Jar is India’s leading Daily Saving app that helps people build strong saving habits—one small step at a time. Our goal is to make saving simple, rewarding, and truly life-changing . Founded in 2021 by Misbah Ashraf and Nishchay AG , Jar is a Bengaluru-based startup with one simple belief: saving a little every day in 24K Digital Gold can truly transform your future. Today, 20 million+ Indians trust Jar as their saving partner. With flexible saving options— Daily, Weekly, Monthly, and Instant Saving —we have made it easy for everyone to save in their own way and withdraw anytime. We are one of the leaders in UPI autopay transactions, crossing more than 1 million transactions per day. In 2023, we expanded our vision with Nek , our jewelry brand crafted to bring together luxury and affordability, it has since surpassed ₹100 crore in revenue. We have a big dream of bringing “ Har Ghar Sona”. Small, consistent savings are just the start. We’re here to walk alongside our users, helping Indians secure their financial future every step of the way. Backed by Tiger Global Management, Arkam Ventures, and WEH Ventures, among others, we have raised 50 million+ in funding. In January 2025 , we hit a huge milestone of becoming profitable . Now, we’re charging ahead, focused on sustainable growth and scaling impact. And this is just the beginning! What will be your responsibilities? Data Analysis & Insights Perform deep dives on large, structured, and unstructured datasets to identify trends, irregularities, and actionable insights that drive product development and business decisions. Provide actionable insights Develop and maintain dashboards and reports for key stakeholders. Continuously monitor transactions to ensure accuracy, completeness, and integrity. Ensure data accuracy and consistency across multiple sources. Identify discrepancies, gaps, and failures in transactions and escalate for immediate resolution. Analyze historical sales data and market trends to develop accurate demand forecasts. Strategic Decision-Making Collaborate with product, engineering, and business teams to solve problems using data. Lead analysis to improve key retention and renewal metrics such as churn rate, renewal rate, transaction success rate, and GMV growth Support A/B testing and experiments to optimize product and feature performance. Analyze transaction failures, payment declines, and retry success rates to optimize the auto debit payment funnel. Data Management & Modeling Design and optimize data models that support real-time transaction monitoring, churn prediction, and cohort analysis for subscription-based customers. Partner with data engineers to ensure data accuracy, to improve data collection, completeness, and accessibility. Build and optimize reports that track business performance over time. Automate recurring reports and processes to improve efficiency. Leadership & Mentorship Guide and mentor junior analysts in best practices, technical skills, and storytelling through data. Lead projects end-to-end, ensuring clarity, timeliness, and high-quality outcomes. Metrics & Reporting & Ownerships Own reporting of subscription transactions, payment success rates, churn trends, and GMV impact, ensuring timely insights to Business & Product teams. Automate and optimize reporting to provide timely insights to leadership. Supporting other teams in setting up success metrics for particular products/features What’s required from you? 
Technical Skills Strong proficiency in Python & MongoDB for data analysis, including Pandas, NumPy, Scikit-learn, Matplotlib/Seaborn, Dask, statsmodels, re (regular expressions for text cleaning), TextBlob (sentiment analysis & text processing) & automations. Object-oriented programming is a plus. SQL: Expertise in writing complex SQL (Postgres) queries for data extraction, transformation, and reporting. Process, clean, and wrangle large datasets using Python to ensure data quality and readiness for analysis. Strong understanding of Excel/Google Sheets for quick ad-hoc analysis. Experience working with large, structured/unstructured datasets. Able to develop KPIs related to retention, acquisition, A/B experiments. Visualization Tools: Data exploration and visualization with tools like Amplitude, CleverTap, MS Excel, Tableau, Power BI, etc. Soft Skills High sense of ownership, accountability, and proactive problem-solving mindset. Strong problem-solving, critical thinking, and business acumen. Excellent communication skills with the ability to translate complex findings into clear insights for non-technical stakeholders. Experience 3+ years of experience in data analysis, preferably in fintech or startups. Proven experience leading high-impact projects independently. A desire to work in a fast-paced environment. What makes us different? We’re not just building a product—we’re shaping the future of savings in India. We seek people who bring passion, energy, and fresh ideas to help us make that happen. Experience matters, but we are a potential-first organisation. We move fast, learn from our mistakes, and take bold risks to solve problems that haven’t been attempted before. If you’re excited about working in an environment where people inspire and truly support each other, you’ve found the right fit. What do we stand for? The Five Values That We Live By Passion: At Jar, we strive to create an environment where people love what they do, are motivated and equipped to do their best work. Equality: We bring diverse skills, ideas, and experiences to the table, supporting and challenging each other across teams to create something bigger than ourselves. Growth: When our people grow, Jar grows. We create opportunities for learning, development, and meaningful impact. Accountability: The core of our work ethic is taking ownership of our work, showing initiative, and having the freedom to ask questions. Consistency: We believe in doing the right things consistently. Big change doesn’t happen overnight—it’s built one step at a time. Join us and let’s build something amazing together! What employee benefits do we have? Glad you asked! Among other things, we have Medical Insurance for employees and their families ESOPs allocation Pluxee meal card Swish club card for exclusive employee discounts Advance salary plans Relocation assistance L&D programmes Skills: Python, Tableau, SQL, NumPy, data wrangling, data exploration, scikit-learn, Power BI, mentoring, CleverTap, report writing, analytical skills, NLP, Pandas, MongoDB, Excel, Matplotlib, data analysis, business sense, Amplitude, MS Excel, Google Sheets, data visualization, querying, ETL, data analysis packages
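A small, hedged sketch of the text-cleaning and sentiment pieces named above (regular expressions plus TextBlob). The sample reviews are invented for illustration and are not Jar data.

```python
# Text cleaning with re + sentiment scoring with TextBlob (illustrative;
# the sample reviews are made up).
import re
from textblob import TextBlob

reviews = [
    "Loved the app!!! Saving gold daily is so easy :)",
    "Payment failed twice... support was slow to respond.",
]

def clean(text):
    """Lowercase, strip non-letters, and collapse whitespace."""
    text = text.lower()
    text = re.sub(r"[^a-z\s]", " ", text)   # keep letters and spaces only
    return re.sub(r"\s+", " ", text).strip()

for review in reviews:
    cleaned = clean(review)
    polarity = TextBlob(cleaned).sentiment.polarity  # -1 (negative) to +1 (positive)
    print(f"{polarity:+.2f}  {cleaned}")
```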
Posted 3 weeks ago
0 years
2 - 4 Lacs
Hyderābād
On-site
Location: Hyderabad, IN Employment type: Employee Place of work: Office Offshore/Onshore: Onshore TechnipFMC is committed to driving real change in the energy industry. Our ambition is to build a sustainable future through relentless innovation and global collaboration – and we want you to be part of it. You’ll be joining a culture that values curiosity, expertise, and ideas as well as diversity, inclusion, and authenticity. Bring your unique energy to our team of more than 20,000 people worldwide, and discover a rewarding, fulfilling, and varied career that you can take anywhere you want to go. Job Purpose Seeking a skilled Python Developer to join our team and help us develop applications and tooling to streamline in-house engineering design processes with a continuous concern for quality, targets, and customer satisfaction. Job Description 1. Write clean and maintainable Python code using PEP guidelines 2. Build and maintain software packages for scientific computing 3. Build and maintain command line interfaces (CLIs) 4. Build and maintain web applications and dashboards 5. Design and implement data analysis pipelines 6. Create and maintain database schemas and queries 7. Optimise code performance and scalability 8. Develop and maintain automated tests to validate software 9. Contribute and adhere to team software development practices, e.g., Agile product management, source code version control, continuous integration/deployment (CI/CD) 10. Build and maintain machine learning models (appreciated, but not a prerequisite) Technical Stack 1. Languages: Python, SQL 2. Core libraries: SciPy, Pandas, NumPy 3. Web frameworks: Streamlit, Dash, Flask 4. Visualisation: Matplotlib, Seaborn, Plotly 5. Automated testing: pytest 6. CLI development: Click, argparse 7. Source code version control: Git 8. Agile product management: Azure DevOps, GitHub 9. CI/CD: Azure Pipelines, GitHub Actions, Docker 10. Database systems: PostgreSQL, Snowflake, SQLite, HDF5 11. Performance: Numba, Dask 12. Machine Learning: Scikit-learn, TensorFlow, PyTorch (Desired) You are meant for this job if: • Bachelor's degree in computer science or software engineering • Master's degree is a plus • Strong technical basis in engineering • Presentation skills • Good organizational and problem-solving skills • Service/Customer oriented • Ability to work in a team-oriented environment • Good command of English Skills Spring Boot Data Modelling CI/CD Internet of Things (IoT) Jira/Confluence React/Angular SAFe Scrum Kanban Collaboration SQL Bash/Shell/PowerShell AWS S3 AWS Lambda Cypress/Playwright Material Design Empirical Thinking Agility GitHub HTML/CSS JavaScript/TypeScript GraphQL Continuous Learning Cybersecurity Computer Programming Java/Kotlin Test Driven Development Being a global leader in the energy industry requires an inclusive and diverse environment. TechnipFMC promotes diversity, equity, and inclusion by ensuring equal opportunities to all ages, races, ethnicities, religions, sexual orientations, gender expressions, disabilities, or all other pluralities. We celebrate who you are and what you bring. Every voice matters and we encourage you to add to our culture. TechnipFMC respects the rights and dignity of those it works with and promotes adherence to internationally recognized human rights principles for those in its value chain. Date posted: Jun 2, 2025 Requisition number: 13580
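As a short, hedged illustration of the CLI-building part of this stack, here is a minimal Click command. The option names and the simple calculation are placeholders, not TechnipFMC tooling.

```python
# Minimal Click CLI sketch (illustrative; option names and the calculation
# are placeholders).
import click

@click.command()
@click.option("--length", type=float, required=True, help="Pipe length in metres.")
@click.option("--segments", type=int, default=10, show_default=True,
              help="Number of segments to split the pipe into.")
def mesh(length, segments):
    """Print the segment size for a simple 1D mesh."""
    click.echo(f"segment size: {length / segments:.3f} m")

if __name__ == "__main__":
    mesh()  # e.g. python mesh.py --length 120 --segments 24
```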
Posted 3 weeks ago
3.0 years
0 Lacs
Pune/Pimpri-Chinchwad Area
On-site
Job Description Vice President, Data Management & Quantitative Analysis I At BNY, our culture empowers you to grow and succeed. As a leading global financial services company at the center of the world’s financial system, we touch nearly 20% of the world’s investible assets. Every day around the globe, our 50,000+ employees bring the power of their perspective to the table to create solutions with our clients that benefit businesses, communities and people everywhere. We continue to be a leader in the industry, awarded as a top home for innovators and for creating an inclusive workplace. Through our unique ideas and talents, together we help make money work for the world. This is what it’s all about. We’re seeking a future team member for the role of Vice President I to join our Data Management & Quantitative Analysis team. This role is located in Pune, MH or Chennai, TN (Hybrid). In this role, you’ll make an impact in the following ways: BNY Data Analytics Reporting and Transformation (“DART”) has grown rapidly and today it represents a highly motivated and engaged team of skilled professionals with expertise in financial industry practices, reporting, analytics, and regulation. The team works closely with various groups across BNY to support the firm’s Capital Adequacy, Counterparty Credit as well as Enterprise Risk modelling and data analytics; alongside support for the annual Comprehensive Capital Analysis and Review (CCAR) Stress Test. The Counterparty Credit Risk Data Analytics Team within DART designs and develops data-driven solutions aimed at strengthening the control framework around our risk metrics and reporting. For the Counterparty Credit Risk Data Analytics Team, we are looking for a Counterparty Risk Analytics Developer to support our Counterparty Credit Risk control framework. Develop analytical tools using SQL & Python to drive business insights Utilize outlier detection methodologies to identify data anomalies in the financial risk space, ensuring proactive risk management Analyze business requirements and translate them into practical solutions, developing data-driven controls to mitigate potential risks Plan and execute projects from concept to final implementation, demonstrating strong project management skills Present solutions to senior stakeholders, effectively communicating technical concepts and results Collaborate with internal and external auditors and regulators to ensure compliance with prescribed standards, maintaining the highest level of integrity and transparency. To be successful in this role, we’re seeking the following: A Bachelor's degree in Engineering, Computer Science, Data Science, or a related discipline (Master's degree preferred) At least 3 years of experience in a similar role or in Python development/data analytics Strong proficiency in Python (including data analytics, data visualization libraries) and SQL, basic knowledge of HTML and Flask. Ability to partner with technology and other stakeholders to ensure effective functional requirements, design, construction, and testing Knowledge of financial risk concepts and financial markets is strongly preferred Familiarity with outlier detection techniques (including Autoencoder method, random forest, etc.), clustering (k-means, etc.), and time series analysis (ARIMA, EWMA, GARCH, etc.) is a plus.
Practical experience working with Python (Pandas, NumPy, Matplotlib, Plotly, Dash, Scikit-learn, TensorFlow, Torch, Dask, CUDA) Intermediate SQL skills (including querying data, joins, table creation, and basic performance optimization techniques) Knowledge of financial risk concepts and financial markets Knowledge of outlier detection techniques, clustering, and time series analysis Strong project management skills
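A brief, hedged sketch of the outlier-detection side of this role, using scikit-learn's IsolationForest on synthetic exposure data. The data, the injected anomalies, and the contamination rate are assumptions for illustration.

```python
# Outlier-detection sketch with IsolationForest (illustrative; the synthetic
# "exposure" data and the contamination rate are assumptions).
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=100.0, scale=5.0, size=(500, 1))   # typical exposures
spikes = np.array([[250.0], [310.0], [5.0]])               # injected anomalies
exposures = np.vstack([normal, spikes])

model = IsolationForest(contamination=0.01, random_state=0)
labels = model.fit_predict(exposures)   # -1 marks an outlier, 1 an inlier

outliers = exposures[labels == -1].ravel()
print("flagged exposures:", np.round(outliers, 1))
```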
Posted 3 weeks ago