
1598 Matplotlib Jobs - Page 37

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

0.0 - 5.0 years

5 - 20 Lacs

Gurgaon

On-site

Assistant Manager - EXL/AM/1349734
Services | Gurgaon | Posted On: 30 May 2025 | End Date: 14 Jul 2025 | Required Experience: 0 - 5 Years

Basic Section
- Number of Positions: 1
- Band: B1 (Assistant Manager)
- Cost Code: D003152
- Campus/Non Campus: Non Campus
- Employment Type: Permanent
- Requisition Type: Backfill
- Max CTC: 500000.0000 - 2000000.0000
- Complexity Level: Not Applicable
- Work Type: Hybrid – Working Partly From Home And Partly From Office

Organisational Details
- Group: Analytics | Sub Group: Banking & Financial Services
- Organization: Services | LOB: Services | SBU: Analytics
- Country: India | City: Gurgaon | Center: Gurgaon-SEZ BPO Solutions

Skills: Python, SQL
Minimum Qualification: B.Tech/B.E.
Certification: No data available

Job Description
We are seeking a skilled Data Engineer to join our dynamic team. The ideal candidate will have expertise in Python, SQL, Tableau, and PySpark, with additional exposure to SAS, banking domain knowledge, and version control tools like Git and Bitbucket. The candidate will be responsible for developing and optimizing data pipelines, ensuring efficient data processing, and supporting business intelligence initiatives.
Key Responsibilities:
- Design, build, and maintain data pipelines using Python and PySpark
- Develop and optimize SQL queries for data extraction and transformation
- Create interactive dashboards and visualizations using Tableau
- Implement data models to support analytics and business needs
- Collaborate with cross-functional teams to understand data requirements
- Ensure data integrity, security, and governance across platforms
- Use version control tools such as Git and Bitbucket for code management
- Leverage SAS and banking domain knowledge to improve data insights

Required Skills:
- Strong proficiency in Python and PySpark for data processing
- Advanced SQL skills for data manipulation and querying
- Experience with Tableau for data visualization and reporting
- Familiarity with database systems and data warehousing concepts

Preferred Skills:
- Knowledge of SAS and its applications in data analysis
- Experience working in the banking domain
- Understanding of version control systems, specifically Git and Bitbucket
- Knowledge of pandas, NumPy, statsmodels, scikit-learn, Matplotlib, PySpark, SASPy

Qualifications:
- Bachelor's/Master's degree in Computer Science, Data Science, or a related field
- Excellent problem-solving and analytical skills
- Ability to work collaboratively in a fast-paced environment

Workflow Type: L&S-DA-Consulting
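The pipeline work described here follows the extract-transform-load pattern. A minimal, library-free sketch of that pattern (table and column names are hypothetical; a real pipeline would use PySpark or an orchestration framework):

```python
import sqlite3

# Hypothetical example: load raw transaction rows, aggregate, and store a summary.
raw_rows = [
    ("acct-1", "2025-05-01", 120.0),
    ("acct-1", "2025-05-02", -30.0),
    ("acct-2", "2025-05-01", 55.5),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txns (account TEXT, day TEXT, amount REAL)")
conn.executemany("INSERT INTO txns VALUES (?, ?, ?)", raw_rows)

# Transformation step: aggregate per account, much as a PySpark job would with groupBy().
summary = conn.execute(
    "SELECT account, SUM(amount) AS total FROM txns GROUP BY account ORDER BY account"
).fetchall()

print(summary)  # [('acct-1', 90.0), ('acct-2', 55.5)]
```

The same extract/transform/load steps scale up directly: swap the in-memory SQLite connection for a warehouse and the Python list for a distributed DataFrame.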

Posted 1 month ago

Apply

0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Roles & Responsibilities:
• Be responsible for the development of conceptual, logical, and physical data models
• Work with application/solution teams to implement data strategies, build data flows, and develop/execute logical and physical data models
• Implement and maintain data analysis scripts using SQL and Python
• Develop and support reports and dashboards using Google Plx/Data Studio/Looker
• Monitor performance and implement necessary infrastructure optimizations
• Demonstrate ability and willingness to learn quickly and complete large volumes of work with high quality
• Demonstrate excellent collaboration, interpersonal communication, and written skills, with the ability to work in a team environment

Minimum Qualifications:
• Hands-on experience with design, development, and support of data pipelines
• Strong SQL programming skills (joins, subqueries, queries with analytical functions, stored procedures, functions, etc.)
• Hands-on experience using statistical methods for data analysis
• Experience with data platform and visualization technologies such as Google PLX dashboards, Data Studio, Tableau, Pandas, Qlik Sense, Splunk, Humio, Grafana
• Experience in web development: HTML, CSS, jQuery, Bootstrap
• Experience with machine learning packages such as Scikit-learn, NumPy, SciPy, Pandas, NLTK, BeautifulSoup, Matplotlib, Statsmodels
• Strong design and development skills with meticulous attention to detail
• Familiarity with Agile software development practices and working in an agile environment
• Strong analytical, troubleshooting, and organizational skills
• Ability to analyse and troubleshoot complex issues, and proficiency in multitasking
• Ability to navigate ambiguity
• BS degree in Computer Science, Math, Statistics, or equivalent academic credentials
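The "queries with analytical functions" requirement refers to SQL window functions, which compute aggregates over a partition without collapsing rows. A small illustrative sketch (table and data are made up), runnable against SQLite:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("north", "2025-01", 100.0), ("north", "2025-02", 150.0),
     ("south", "2025-01", 80.0), ("south", "2025-02", 60.0)],
)

# Window (analytical) function: running total per region, one output row per input row.
rows = conn.execute(
    """
    SELECT region, month, revenue,
           SUM(revenue) OVER (PARTITION BY region ORDER BY month) AS running_total
    FROM sales
    ORDER BY region, month
    """
).fetchall()
print(rows)
```

Unlike a plain `GROUP BY`, the `OVER (PARTITION BY ...)` clause keeps every detail row while attaching the cumulative aggregate alongside it.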

Posted 1 month ago

Apply

3.0 years

1 - 6 Lacs

India

On-site

About Us:
Analytics Circle is a leading institute dedicated to empowering individuals with in-demand data analytics skills. We are passionate about bridging the industry-academia gap through practical, hands-on training in the most sought-after tools and technologies in data analytics.

Job Description:
We are looking for a highly skilled and passionate Data Analyst Trainer to join our growing team. The ideal candidate should have real-world industry experience and a strong command of Advanced Excel, Power BI, Tableau, SQL, and Python. As a trainer, you will be responsible for delivering engaging and insightful sessions to our learners, preparing them for careers in data analytics.

Key Responsibilities:
- Deliver interactive and practical training sessions on Advanced Excel, Power BI, Tableau, SQL, and Python for Data Analysis
- Design and update course materials, case studies, and hands-on projects based on current industry trends
- Evaluate student progress through assignments, projects, and assessments
- Provide one-on-one mentorship and support to learners when needed
- Assist in curriculum development and continuous improvement of training content
- Stay updated with the latest developments in data analytics tools and technologies

Requirements:
- Bachelor's or Master's degree in Computer Science, Statistics, Data Science, or a related field
- Minimum 3+ years of experience in the data analytics domain
- Proven training or teaching experience is preferred
- Proficiency in: Excel (pivot tables, lookups, macros, dashboards); Power BI (DAX, Power Query, data modeling); Tableau (data visualization, dashboard building); SQL (queries, joins, data manipulation); Python (Pandas, NumPy, Matplotlib, data analysis workflows)
- Strong communication and presentation skills
- Passion for teaching and mentoring

Nice to Have:
- Industry certifications in relevant tools (e.g., Microsoft, Tableau, Python)
- Experience conducting online training/webinars
Job Types: Full-time, Part-time, Permanent
Pay: ₹10,000.00 - ₹50,000.00 per month
Schedule: Day shift
Supplemental Pay: Commission pay, Performance bonus
Application Question(s): Weekdays availability (Monday to Friday)
Education: Bachelor's (Preferred)
Experience: Teaching: 3 years (Preferred)
Location: Laxmi Nagar, Delhi, Delhi (Required)
Shift availability: Day Shift (Required)
Work Location: In person

Posted 1 month ago

Apply

10.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Growing What Matters Starts With You

As the world’s only major agriscience company completely dedicated to agriculture, we’re building a culture that stays curious, thinks differently, acts boldly, and takes a stand on what’s right for our customers, our co-workers, our partners, and our planet. We know we’ve got big challenges to solve - we hope you'll be part of the solution. Working at Corteva connects you with more than 20,000 colleagues united by a shared vision to grow what matters. We offer career opportunities across more than 140 world-class R&D facilities and in more than 130 countries.

We’re hiring for Reporting & Analytics to join our Finance team! Learn how you can be our voice in the conversation about the future of agriculture.

You Will Be Part of a Growing Team
Finance is a global team tasked with supporting finance processing requests across various regions; our members support the Finance team and the business across those regions. The role will be performed within the frame of Corteva's brand values.

Job Responsibilities
The ideal candidate will combine deep knowledge of finance operations (specifically Payables and SAP FICO) with technical proficiency in Power Platform (Power BI, Power Apps, Power Automate), SQL, Azure, SharePoint, VBA macros, MS Access database management, and Python. This role will be instrumental in driving automation, analytics, and insights to improve financial reporting, compliance, and operational efficiency.
- Provide strategic, analytic, and reporting support to Global Service Centers and Payables across regions.
- MIS reporting for Accounts Payable processes, including vendor payments, ageing analysis, GR/IR and payment forecast reports, and compliance metrics.
- Develop and deploy automated dashboards and reports using Power BI and SQL for internal stakeholders and auditors, bringing clarity to complex AP data.
- Automate finance workflows using Power Automate and Excel VBA/macros: reconciliation, reminders, and reporting. Explore opportunities to automate manual processes.
- Leverage SAP FICO for reporting, audit trails, and transaction analysis.
- Identify, analyze, and interpret trends or patterns in complex data sets.
- Transform data using Python and SQL for reporting.
- Manage data pipelines through Azure Data Services, integrating inputs from SAP, Excel, and cloud databases.
- Use Python for automation: bulk file processing, vendor statement reconciliation, and email/report workflow automation.
- Be competent in analysis and judgment, customer relationship management, BI tools, and the Microsoft suite; sufficient Procure-to-Pay knowledge is expected.
- Partner with Procurement, Supply Chain, IT, and Treasury teams to ensure data consistency and reporting alignment.
- Manage, coach, and develop team members.
- Explore and implement continuous improvement with an owner’s mindset.
- Accountable for managing the supplier payments database for the entire organization and providing strategic, analytic, and reporting support to Global Service Centers and P2P across regions.

Location: Corteva Global Service Center, Hyderabad, India

To Grow What Matters, You Will Need
- Bachelor’s or master’s degree in Finance, Accounting, or a related field.
- 6–10 years of relevant experience in Finance MIS or PTP analytics roles.
- Strong working knowledge of SAP FICO, especially AP-related T-codes and tables.
- Knowledge of ERP systems and statistics, and experience using statistical packages for analyzing large datasets (Excel, SPSS, SAS, etc.), is preferable.

Technical Skills
- Strong knowledge of reporting packages (Business Objects).
- Advanced Excel with hands-on experience in VBA/macros.
- Proficiency in Power BI, Power Automate, and Power Apps.
- Strong SQL scripting and experience working with relational databases.
- Exposure to Microsoft Azure (Data Factory, Synapse, or Logic Apps) is highly desirable.
- Experience in data modeling, cleansing, and performance tuning for large datasets.
- Python for data analysis and automation (e.g., pandas, matplotlib, openpyxl).

Soft Skills
- Strong analytical mindset and attention to detail.
- Effective communication and ability to collaborate with cross-functional teams.
- Proactive problem-solver with a process improvement orientation.
- Ability to manage deadlines and prioritize in a fast-paced environment.

Preferred Skills (Optional but a Plus)
- Microsoft Certified: Power Platform Fundamentals or Data Analyst Associate
- SAP FICO Certification
- Azure Data Fundamentals

Who Are We Looking For?
- Curious, bold thinkers who want to grow their careers and be part of a winning team.
- Market-shaping individuals who want to transform the agriculture industry to meet the world’s growing need for food.
- Collaborators who thrive in a diverse, inclusive work environment.
- Innovators who bring initiative and fresh ideas that drive our business into the future and make us an industry leader.

Growing What Matters Starts With You... What Can We Offer to Help You Grow?
- Opportunity to be part of a global industry leader working to discover solutions to the most pressing agricultural challenges of our time.
- Challenging work assignments that grow your skills, capabilities, and experiences.
- Diverse, inclusive work environment where employees bring their whole selves to work and feel heard, valued, and empowered.
- Dedicated and customized resources to help grow your professional skills, industry expertise, and personal perspectives.
- Opportunity to strengthen your professional network through valuable relationships.
- Support for the health and well-being of every employee through world-class benefits, meaningful work, and competitive salary.
- Performance-driven culture with a strong focus on speed, accountability, and agility.
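The vendor statement reconciliation automation mentioned above typically amounts to matching invoice records from two systems and reporting the differences. A minimal, hypothetical sketch in pure Python (invoice IDs and amounts are made up):

```python
# Hypothetical sketch: reconcile ERP ledger invoices against a vendor statement.
ledger = {"INV-001": 1200.00, "INV-002": 450.50, "INV-003": 99.99}
statement = {"INV-001": 1200.00, "INV-002": 445.50, "INV-004": 310.00}

# Four buckets cover every reconciliation outcome.
matched = {inv for inv in ledger if statement.get(inv) == ledger[inv]}
amount_mismatch = {inv for inv in ledger if inv in statement and statement[inv] != ledger[inv]}
missing_from_statement = set(ledger) - set(statement)
missing_from_ledger = set(statement) - set(ledger)

print(sorted(matched))                 # ['INV-001']
print(sorted(amount_mismatch))         # ['INV-002']
print(sorted(missing_from_statement))  # ['INV-003']
print(sorted(missing_from_ledger))     # ['INV-004']
```

In practice the two dictionaries would be loaded from SAP extracts and vendor files, but the set logic stays the same.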

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Jaipur

On-site

Job Summary
Auriga IT is seeking a proactive, problem-solving Data Analyst with 3–5 years of experience owning end-to-end data pipelines. You’ll partner with stakeholders across engineering, product, marketing, and finance to turn raw data into actionable insights that drive business decisions. You must be fluent in the core libraries, tools, and cloud services listed below.

Your Responsibilities:
- Pipeline Management: Design, build, and maintain ETL/ELT workflows using orchestration frameworks (e.g., Airflow, dbt).
- Exploratory Data Analysis & Visualization: Perform EDA and statistical analysis using Python or R. Prototype and deliver interactive charts and dashboards.
- BI & Reporting: Develop dashboards and scheduled reports to surface KPIs and trends. Configure real-time alerts for data anomalies or thresholds.
- Insights Delivery & Storytelling: Translate complex analyses (A/B tests, forecasting, cohort analysis) into clear recommendations. Present findings to both technical and non-technical audiences.
- Collaboration & Governance: Work cross-functionally to define data requirements, ensure quality, and maintain governance. Mentor junior team members on best practices in code, version control, and documentation.

Key Skills (you must know at least one technology from each category below):
- Data Manipulation & Analysis: Python (pandas, NumPy) or R (tidyverse: dplyr, tidyr)
- Visualization & Dashboarding: Python (matplotlib, seaborn, Plotly) or R (ggplot2, Shiny)
- BI Platforms: commercial or open-source (e.g., Tableau, Power BI, Apache Superset, Grafana)
- ETL/ELT Orchestration: Apache Airflow, dbt, or equivalent
- Cloud Data Services: AWS (Redshift, Athena, QuickSight), GCP (BigQuery, Data Studio), or Azure (Synapse, Data Explorer)
- Databases & Querying: strong SQL skills with an RDBMS (PostgreSQL, MySQL, Snowflake); decent knowledge of NoSQL databases

Additionally:
- Bachelor’s or Master’s in a quantitative field (Statistics, CS, Economics, etc.).
- 3–5 years in a data analyst (or similar) role with end-to-end pipeline ownership.
- Strong problem-solving mindset and excellent communication skills.
- Certification in Power BI or Tableau is a plus.

Desired Skills & Attributes
- Familiarity with version control (Git) and CI/CD for analytics code.
- Exposure to basic machine-learning workflows (scikit-learn, caret).
- Comfortable working in Agile/Scrum environments and collaborating across domains.

About Company
Hi there! We are Auriga IT. We power businesses across the globe through digital experiences, data, and insights. From the apps we design to the platforms we engineer, we're driven by an ambition to create world-class digital solutions and make an impact. Our team has been part of building solutions for the likes of Zomato, Yes Bank, Tata Motors, Amazon, Snapdeal, Ola, Practo, Vodafone, Meesho, Volkswagen, Droom, and many more. We are a group of people who just could not leave our college life behind; the inception of Auriga was based on a desire to keep working together with friends and enjoy an extended college life. Who hasn't dreamt of working with friends for a lifetime? Come join in!
https://www.aurigait.com/
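The A/B-test analysis mentioned in the responsibilities boils down to comparing two samples and judging whether the difference is real. A minimal two-sample sketch using only the standard library (the conversion-time values are made up):

```python
import statistics
from math import sqrt

# Hypothetical page-load times (seconds) from an A/B test.
variant_a = [12.1, 11.8, 12.5, 13.0, 12.2, 11.9]
variant_b = [11.2, 11.5, 10.9, 11.8, 11.1, 11.4]

mean_a, mean_b = statistics.mean(variant_a), statistics.mean(variant_b)

# Welch-style standard error from the two sample variances.
se = sqrt(statistics.variance(variant_a) / len(variant_a)
          + statistics.variance(variant_b) / len(variant_b))
t_stat = (mean_a - mean_b) / se  # large |t| suggests a real difference

print(round(mean_a - mean_b, 3), round(t_stat, 2))
```

A production analysis would add degrees-of-freedom and a p-value (e.g., via scipy), but the core reasoning, effect size over sampling noise, is exactly this ratio.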

Posted 1 month ago

Apply

3.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Job Description

Primary Duties & Responsibilities
- Requirement gathering from the business users
- Analyzing the business requirements and converting them into a Machine Learning/Generative AI problem
- Identifying the needed data features and labels
- Data ingestion from various source databases into a Big Data environment
- Building related models using Machine Learning and Generative AI techniques
- Recommending the most suitable solution for the given business requirement
- Presenting the solution to business users and stakeholders with the best storytelling approaches

Education & Experience
- Bachelor’s/Master’s degree in Computer Science Engineering, or Master’s degree in Statistics/Mathematics
- Minimum 3 years of experience building various Machine Learning and Deep Learning regression and classification models
- Minimum 1 year of experience in the Generative AI tech stack (prompt engineering, ChatGPT, Ollama, Gemini, RAG, etc.)
- Extensive knowledge of and skills in frameworks (TensorFlow, PyTorch, Hugging Face) and libraries (LlamaIndex, LangChain)
- Expertise in Python and visualization modules such as matplotlib and seaborn
- Knowledge of Big Data systems and data ingestion tools/processes/techniques such as StreamSets and Spark
- Desirable skills: REST API development using Flask or FastAPI

Skills
- Strong interpersonal and problem-solving skills
- Strong stakeholder management
- Work effectively with other members of Coherent Corp across the globe

Working Conditions
- Hybrid work structure, i.e. 3 days in office

Culture Commitment
Ensure adherence to the company’s values (ICARE) in all aspects of your position at Coherent Corp.:
- Integrity: Create an Environment of Trust
- Collaboration: Innovate Through the Sharing of Ideas
- Accountability: Own the Process and the Outcome
- Respect: Recognize the Value in Everyone
- Enthusiasm: Find a Sense of Purpose in Work

Coherent Corp. is an equal opportunity/affirmative action employer.
All qualified applicants will receive consideration for employment without regard to sex, gender identity, sexual orientation, race, color, religion, national origin, disability, protected Veteran status, age, or any other characteristic protected by law. Finisar India (a subsidiary of Coherent Corp) is an equal opportunity employer. All qualified applicants will receive consideration for employment without regard to gender identity, sexual orientation, race, color, religion, national origin, disability, or any other characteristic protected by law.

About Us
Coherent is a global leader in lasers, engineered materials, and networking components. We are a vertically integrated manufacturing company that develops innovative products for diversified applications in the industrial, optical communications, military, life sciences, semiconductor equipment, and consumer markets. Coherent provides a comprehensive career development platform within an environment that challenges employees to perform at their best, while rewarding excellence and hard work through a competitive compensation program. It's an exciting opportunity to work for a company that offers stability, longevity, and growth. Come join us!

Note to recruiters and employment agencies: We will not pay for unsolicited resumes from recruiters and employment agencies unless we have a signed agreement and have requested assistance, in writing, for a specific opening.
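The RAG experience asked for in this listing centers on retrieval: ranking stored passages by similarity to a query before handing the best ones to an LLM. A toy bag-of-words retriever illustrates the idea (documents and query are made up; real systems use embedding models and a vector database instead of word counts):

```python
from collections import Counter
from math import sqrt

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[w] * b[w] for w in a)
    norm = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

docs = [
    "payment pipelines move vendor data nightly",
    "matplotlib renders charts for risk reports",
    "llms answer questions over retrieved context",
]
query = "retrieved context for llms"

# Retrieval step of RAG: score every document against the query, keep the best.
vecs = [Counter(d.split()) for d in docs]
qvec = Counter(query.split())
best = max(range(len(docs)), key=lambda i: cosine(vecs[i], qvec))
print(docs[best])  # the third document is the closest match
```

Swapping `Counter` vectors for learned embeddings and the list for a vector index (e.g., via LlamaIndex or LangChain, as the listing names) gives the production version of the same loop.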

Posted 1 month ago

Apply

0 years

0 Lacs

Kochi, Kerala, India

On-site

About the Company
Turbolab Technologies is a revenue-positive, bootstrapped startup, and home to a wide range of products and services related to data analysis and web crawling. Over the past few years, we have expanded into areas such as machine learning, image processing, video analysis, and more. Most of our products cater to enterprise clients, empowering them to harness the power of data to grow their businesses.

Job Description
We are looking for highly motivated and uniquely talented Quality Analysts to join our Data Team: someone with the creativity, technical skills, attention to detail, and enthusiasm to help leverage data by ensuring its quality, enabling enterprises to grow their business. Go ahead and apply if you are excited to work with us at our Kochi office.

Key Responsibilities
- Regularly audit datasets to identify issues and write reports.
- Coordinate with developers to maintain data quality standards.
- Perform routine inspections of automated QA processes and the resulting data.
- Actively participate in improving the data quality workflow and be innovative.

Required Skills & Experience
- Proficiency in Python.
- Familiarity with packages such as NumPy, Pandas, Matplotlib, and/or Seaborn.
- Good understanding of data file formats (CSV, XML, JSON, etc.) and data processing tools (JupyterHub and OpenRefine).
- Ability to use basic querying methods (regex, SQL, and XPath or XQuery).
- Basic understanding of web technologies (HTML, JavaScript, CSS, etc.).
- Experience in extracting, cleaning, and structuring data from unstructured or semi-structured sources.
- Good knowledge of databases and ORM tools.
- Ability to work independently, prioritize tasks, and take initiative.
- Ability to document requirements and specifications.
- Excellent time-management, multitasking, communication, and interpersonal skills.
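A dataset audit of the kind described often starts as a script that validates each row against expected formats and reports the failures. A minimal hypothetical sketch using only `csv` and `re` (the data and rules are invented for illustration):

```python
import csv
import io
import re

# Hypothetical raw export: the second data row has a malformed date, the third an empty price.
raw = """sku,date,price
A100,2025-06-01,19.99
A101,06/02/2025,5.00
A102,2025-06-03,
"""

date_re = re.compile(r"^\d{4}-\d{2}-\d{2}$")  # ISO dates only
issues = []
# start=2 because row 1 of the file is the header.
for lineno, row in enumerate(csv.DictReader(io.StringIO(raw)), start=2):
    if not date_re.match(row["date"]):
        issues.append((lineno, "bad date"))
    if not row["price"]:
        issues.append((lineno, "missing price"))

print(issues)  # [(3, 'bad date'), (4, 'missing price')]
```

The same pattern scales to the Pandas-based checks the listing mentions: each rule becomes a boolean mask, and the report becomes the rows where any mask fails.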

Posted 1 month ago

Apply

0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site

● Minimum of 3+ years of experience in AI-based application development.
● Fine-tune pre-existing models to improve performance and accuracy.
● Experience with TensorFlow, PyTorch, Scikit-learn, or similar ML frameworks, and familiarity with APIs like OpenAI or Vertex AI.
● Experience with NLP tools and libraries (e.g., NLTK, SpaCy, GPT, BERT).
● Implement frameworks like LangChain, Anthropic's Constitutional AI, OpenAI's, and Hugging Face, along with prompt engineering techniques, to build robust and scalable AI applications.
● Evaluate and analyze RAG solutions and utilise best-in-class LLMs to define customer experience solutions (fine-tune Large Language Models (LLMs)).
● Architect and develop advanced generative AI solutions leveraging state-of-the-art language models (LLMs) such as GPT, LLaMA, PaLM, BLOOM, and others.
● Strong understanding of and experience with open-source multimodal LLM models to customize and create solutions.
● Explore and implement cutting-edge techniques like few-shot learning, reinforcement learning, multi-task learning, and transfer learning for AI model training and fine-tuning.
● Proficiency in data preprocessing, feature engineering, and data visualization using tools like Pandas, NumPy, and Matplotlib.
● Optimize model performance through experimentation, hyperparameter tuning, and advanced optimization techniques.
● Proficiency in Python with the ability to get hands-on with coding at a deep level.
● Develop and maintain APIs using Python's FastAPI, Flask, or Django for integrating AI capabilities into various systems.
● Ability to write optimized and high-performing scripts on relational databases (e.g., MySQL, PostgreSQL) or non-relational databases (e.g., MongoDB or Cassandra).
● Enthusiasm for continuous learning and professional development in AI and related technologies.
● Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions.
● Knowledge of cloud services like AWS, Google Cloud, or Azure.
● Proficiency with version control systems, especially Git.
● Familiarity with data pre-processing techniques and pipeline development for AI model training.
● Experience with deploying models using Docker and Kubernetes.
● Experience with AWS Bedrock and SageMaker is a plus.
● Strong problem-solving skills with the ability to translate complex business problems into AI solutions.

Posted 1 month ago

Apply

3.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Title: Python Developer
Experience: 1 – 3 Years
Location: Perungudi, Chennai (Work From Office)
Shift Timings: Regular Shift 10:00 AM – 7:00 PM; Late Shift 1:00 PM – 10:00 PM (shifts alternate every two weeks)

We are hiring a Python Developer to join the Market Risk team of a global organization in Chennai. This role involves developing and maintaining risk analytics tools and automating reporting processes to support commodity risk management.

Key Responsibilities:
- Develop, test, and maintain Python scripts for data analysis and reporting
- Write scalable, clean code using Pandas, NumPy, Matplotlib, and OOP principles
- Collaborate with risk analysts to implement process improvements
- Document workflows and maintain SOPs in Confluence
- Optimize code performance and adapt to evolving business needs

Requirements:
- Strong hands-on experience with Python, Pandas, NumPy, Matplotlib, and OOP
- Good understanding of data structures and algorithms
- Experience with Excel and VBA is an added advantage
- Exposure to financial/market risk environments is preferred
- Excellent problem-solving, communication, and documentation skills

Interested candidates, please share your updated CV to: janani.sivanandam@antal.com
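"Clean code using OOP principles" in a market-risk context often means small classes that wrap a data series and expose named risk metrics. A minimal hypothetical sketch in pure Python (the P&L values and class name are made up for illustration):

```python
class ReturnSeries:
    """Wraps a series of daily P&L values and exposes simple risk metrics."""

    def __init__(self, pnl):
        self.pnl = list(pnl)

    def historical_var(self, confidence=0.95):
        """Historical Value-at-Risk: the loss exceeded on the worst (1 - confidence) of days."""
        losses = sorted(self.pnl)  # worst days first
        index = int((1 - confidence) * len(losses))
        return -losses[index]

    def worst_day(self):
        return min(self.pnl)

series = ReturnSeries([-5.0, 2.0, 1.5, -12.0, 3.0, 0.5, -1.0, 4.0, -2.5, 1.0])
print(series.worst_day())        # -12.0
print(series.historical_var())   # 12.0 (with 10 points, the 95% VaR is the single worst day)
```

Wrapping the series in a class rather than passing raw lists around is what makes such scripts testable and reusable, which is the practical meaning of the OOP requirement here.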

Posted 1 month ago

Apply

15.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Years of Experience: 15+ Years
Location: Hyderabad
Level: Associate Director

Required Qualifications:
- Bachelor's or Master’s degree in Computer Science, Statistics, Mathematics, Data Science, or a related field.
- Proven experience in data science, machine learning, or applied statistics.
- Proficiency in programming languages such as Python, R, or SQL.
- Experience with data visualization tools (e.g., Tableau, Power BI, Matplotlib, Seaborn).
- Expertise in Generative AI engineering/architecture, including LLMs and hyperparameters (Azure/AWS/GCP/open source), embedding models and vector database knowledge, prompting techniques (zero-shot, few-shot, chain of thought), LangChain, and Pydantic.
- Familiarity with machine learning libraries and frameworks (e.g., Scikit-learn, TensorFlow, Keras, PyTorch).
- Experience with cloud platforms like AWS, GCP, or Azure.
- Strong analytical, problem-solving, and communication skills.

For further information, kindly reach us at ruchita.khole@credencehrservices.com

Posted 1 month ago

Apply

0 years

0 Lacs

Cuddapah, Andhra Pradesh, India

On-site

Data Science & Data Analysis Trainer
📍 Location: Kadapa, Andhra Pradesh (Relocation Mandatory)
🕐 Contract | No Fixed Timings | On-site Role

Are you skilled in Python, SQL, Excel, and Power BI, and confident enough to handle students on your own? Then we want you on board!

💼 Role Overview:
As the lead trainer, you’ll be responsible for conducting offline classes for college students in Kadapa. There are no fixed class timings; sessions will be scheduled as per the college or student batches, offering you flexibility in your day.

Topics You’ll Handle:
- Data Science with Python (NumPy, Pandas, Matplotlib, etc.)
- SQL (queries, joins, subqueries)
- Advanced Excel (dashboards, pivot tables, formulas)
- Power BI (data visualization and reports)

Eligibility Criteria:
- Education: Any Bachelor's degree (B.Tech/B.Sc/BCA/B.Com/BBA or equivalent)
- Certification in Data Science, Python, or Power BI is a big plus

Skills Required:
- Strong grip on Python, SQL, Excel, and Power BI
- Ability to handle a batch independently
- Good communication skills in English (Telugu-speaking preferred)

Experience:
Freshers are welcome! If you’ve got the skills and confidence to manage students, you're eligible. Prior teaching experience is optional, not required.

What We Offer:
- No fixed class timings; flexible work hours based on sessions
- Accommodation guidance
- Scope to grow with us as we expand to more colleges
- An opportunity to genuinely impact students’ futures

This is a full-time, offline teaching role based in Kadapa, A.P. Relocation is mandatory; apply only if you’re ready to shift.

Apply Now: Send your resume + any sample work/projects to careers@neeuvnext.in

Posted 1 month ago

Apply

0.0 - 5.0 years

0 Lacs

Coimbatore, Tamil Nadu

On-site

Job Information
Date Opened: 06/24/2025
Job Type: Full time
Industry: IT Services
Work Experience: 1-3 years
City: Coimbatore
State/Province: Tamil Nadu
Country: India
Zip/Postal Code: 641014

Job Description
Role: Data Science, AI & ML Trainer
Company: QBrainX (https://qbrainx.com/)
Program: Kodo IT Program by QBrainX (https://xnovaq.com/kodo-program)
Work Arrangement: Work from Office
Office Location: Tidel Park, Coimbatore

We are seeking a passionate and skilled Data Science, AI & ML Trainer with 2-5 years of experience to join our team. The ideal candidate will deliver engaging, hands-on training sessions, simplifying complex concepts and guiding learners through practical, real-world projects. This role is perfect for an individual enthusiastic about teaching and staying at the forefront of AI/ML advancements.

Key Responsibilities
- Design and deliver high-quality training sessions on Python, Data Analysis, Machine Learning, and AI fundamentals.
- Collaborate with the curriculum development team to create relevant course content, hands-on exercises, and mini-projects.
- Mentor and support learners, addressing technical queries and fostering a collaborative learning environment.
- Stay updated with the latest tools, frameworks, and trends in Data Science, AI, and ML to ensure training content remains current.
- Evaluate learner progress and provide constructive feedback to enhance skill development.
- Contribute to the creation of training materials, including presentations, tutorials, and case studies.

Requirements
- 2-5 years of professional experience in Data Science, AI/ML roles, or training.
- Strong proficiency in Python and key libraries such as Pandas, NumPy, and Scikit-learn.
- Excellent communication and presentation skills, with the ability to explain complex concepts clearly.
- Passion for teaching, mentoring, and empowering learners.
- Strong organizational skills and the ability to manage multiple training sessions effectively.
- Proactive and self-motivated with a commitment to continuous learning.

Good to Have
- Experience with Deep Learning frameworks such as TensorFlow or PyTorch.
- Familiarity with Natural Language Processing (NLP) or cloud platforms (e.g., AWS, Azure, GCP).
- Prior experience conducting workshops, bootcamps, or corporate training sessions.
- Knowledge of data visualization tools like Matplotlib, Seaborn, or Tableau.

Benefits
Join a dynamic team dedicated to shaping the next generation of Data Science and AI professionals. This role offers the opportunity to make a meaningful impact through teaching, while staying connected to cutting-edge developments in AI and ML. If you are excited about empowering learners and have the skills to excel as a Data Science, AI & ML Trainer, we encourage you to apply!

Posted 1 month ago

Apply

3.0 - 6.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

Role Overview
We're looking for a Python-based AI/ML Developer who brings solid hands-on experience in building machine learning models and deploying them into scalable, production-ready APIs using FastAPI or Django. The ideal candidate is both analytical and implementation-savvy, capable of transforming models into live services and integrating them with real-world systems.

Key Responsibilities
- Design, train, and evaluate machine learning models (classification, regression, clustering, etc.)
- Build and deploy scalable REST APIs for model serving using FastAPI or Django
- Collaborate with data scientists, backend developers, and DevOps to integrate models into production systems
- Develop clean, modular, and optimized Python code using best practices
- Perform data preprocessing, feature engineering, and data visualization using Pandas, NumPy, Matplotlib, and Seaborn
- Implement model serialization techniques (Pickle, Joblib, ONNX) and deploy models using containers (Docker)
- Manage API security with JWT and OAuth mechanisms
- Participate in Agile development with code reviews, Git workflows, and CI/CD pipelines

Must-Have Skills
- Python & Development: Proficient in Python 3.x, OOP, and clean-code principles; experience with Git, Docker, debugging, and unit testing
- AI/ML: Good grasp of supervised/unsupervised learning, model evaluation, and data wrangling; hands-on with Scikit-learn, XGBoost, LightGBM
- Web Frameworks: FastAPI (API routes, async programming, Pydantic, JWT); Django (REST Framework, ORM, admin panel, middleware)
- DevOps & Cloud: Experience with containerized deployment using Docker; exposure to cloud platforms (AWS, Azure, or GCP); CI/CD with GitHub Actions, Jenkins, or GitLab CI
- Databases: SQL (PostgreSQL, MySQL); NoSQL (MongoDB, Redis); ORM (Django ORM)

Additional Skills
- Model tracking/versioning tools (MLflow, DVC)
- Knowledge of LLMs, transformers, vector DBs (Pinecone, Faiss)
- Airflow, Prefect, or other workflow automation tools
- Basic frontend skills (HTML, JavaScript, React)

Requirements
Education: B.E./B.Tech or M.E./M.Tech in Computer Science, Data Science, or related fields Experience: 3-6 years of industry experience in ML development and backend API integration Strong communication skills and ability to work with cross-functional teams (ref:hirist.tech)
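The serialization step named in the responsibilities (Pickle/Joblib) boils down to a dump/load round trip. A minimal sketch with a hypothetical stand-in model; a real service would serialize a trained scikit-learn or XGBoost estimator the same way and serve it behind a FastAPI route:

```python
import pickle

# Hypothetical minimal "model" used only for illustration.
class ThresholdClassifier:
    def __init__(self, threshold):
        self.threshold = threshold

    def predict(self, values):
        return [1 if v >= self.threshold else 0 for v in values]

model = ThresholdClassifier(threshold=0.5)

# Serialize to bytes and restore (joblib.dump/load follows the
# same pattern, writing to files instead of a bytes object).
blob = pickle.dumps(model)
restored = pickle.loads(blob)

print(restored.predict([0.2, 0.7, 0.5]))  # [0, 1, 1]
```

In a deployed API, the restored object would be loaded once at startup and called inside the request handler.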

Posted 1 month ago

Apply

5.0 - 10.0 years

5 - 9 Lacs

Hyderabad

Work from Office

Project Role: Analytics Practitioner
Project Role Description: Drive innovation and intellectual property (IP) around specific analytics models and offerings
Must have skills: Python (Programming Language)
Good to have skills: NA
Minimum 7.5 year(s) of experience is required
Educational Qualification: Any degree in computer science

Summary: As an Analytics Practitioner, you will drive innovation and intellectual property around specific analytics models and offerings. A typical day involves collaborating with various teams to enhance analytics capabilities, developing new models, and ensuring the effective implementation of analytics solutions that meet business needs. You will engage in problem-solving activities, leveraging your expertise to provide insights and recommendations that support strategic decision-making.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with organizational goals.

Professional & Technical Skills:
- Must-have skills: Proficiency in Python (Programming Language).
- Strong analytical skills to interpret complex data sets.
- Experience with data manipulation and analysis libraries such as Pandas and NumPy.
- Familiarity with data visualization tools to present findings effectively.
- Ability to develop and implement machine learning models.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Python (Programming Language).
- This position is based in Hyderabad.
- Any degree in computer science is required.

Posted 1 month ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Ahmedabad

Work from Office

We are seeking a skilled Python Developer to join our team. The ideal candidate will be responsible for working with existing APIs or developing new APIs based on our requirements. You should have a strong foundation in Python and experience with RESTful services and cloud infrastructure.

Requirements:
- Strong understanding of Python
- Experience with RESTful services and cloud infrastructure
- Ability to develop microservices/functions
- Familiarity with libraries such as Pandas, NumPy, and Matplotlib
- Basic understanding of SQL and databases
- Ability to write clean, maintainable code
- Experience deploying applications at scale in production environments
- Experience with web scraping using tools like BeautifulSoup, Scrapy, or Selenium
- Knowledge of equities, futures, or options microstructures is a plus
- Experience with data visualization and dashboard building is a plus

Why Join Us?
- Opportunity to work on high-impact real-world projects
- Exposure to cutting-edge technologies and financial datasets
- A collaborative, supportive, and learning-focused team culture
- 5-day work week (Monday to Friday)
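The web-scraping requirement, at its core, is parsing HTML and extracting structured data. A stdlib-only sketch of link extraction (the HTML snippet is made up for illustration); BeautifulSoup or Scrapy, named in the posting, provide far richer selectors for real scraping work:

```python
from html.parser import HTMLParser

# Minimal link extractor using only the standard library.
class LinkExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect the href of every anchor tag encountered.
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

html = '<ul><li><a href="/jobs/1">Job 1</a></li><li><a href="/jobs/2">Job 2</a></li></ul>'
parser = LinkExtractor()
parser.feed(html)
print(parser.links)  # ['/jobs/1', '/jobs/2']
```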

Posted 1 month ago

Apply

3.0 - 6.0 years

9 - 19 Lacs

Hyderabad

Hybrid

We're looking for a Python-based AI/ML Developer who brings solid hands-on experience in building machine learning models and deploying them into scalable, production-ready APIs using FastAPI or Django. The ideal candidate is both analytical and implementation-savvy, capable of transforming models into live services and integrating them with real-world systems.

Key Responsibilities
- Design, train, and evaluate machine learning models (classification, regression, clustering, etc.)
- Build and deploy scalable REST APIs for model serving using FastAPI or Django
- Collaborate with data scientists, backend developers, and DevOps to integrate models into production systems
- Develop clean, modular, and optimized Python code using best practices
- Perform data preprocessing, feature engineering, and data visualization using Pandas, NumPy, Matplotlib, and Seaborn
- Implement model serialization techniques (Pickle, Joblib, ONNX) and deploy models using containers (Docker)
- Manage API security with JWT and OAuth mechanisms
- Participate in Agile development with code reviews, Git workflows, and CI/CD pipelines

Must-Have Skills
- Python & Development: Proficient in Python 3.x, OOP, and clean code principles; experience with Git, Docker, debugging, unit testing
- AI/ML: Good grasp of supervised/unsupervised learning, model evaluation, and data wrangling; hands-on with Scikit-learn, XGBoost, LightGBM
- Web Frameworks: FastAPI (API routes, async programming, Pydantic, JWT); Django (REST Framework, ORM, Admin panel, Middleware)
- DevOps & Cloud: Containerized deployment using Docker; exposure to cloud platforms (AWS, Azure, or GCP); CI/CD with GitHub Actions, Jenkins, or GitLab CI
- Databases: SQL (PostgreSQL, MySQL); NoSQL (MongoDB, Redis); ORM (Django ORM, SQLAlchemy)

Bonus/Nice-to-Have Skills
- Model tracking/versioning tools (MLflow, DVC)
- Knowledge of LLMs, transformers, vector DBs (Pinecone, Faiss)
- Airflow, Prefect, or other workflow automation tools
- Basic frontend skills (HTML, JavaScript, React)

Requirements
- Education: B.E./B.Tech or M.E./M.Tech in Computer Science, Data Science, or related fields
- Experience: 3-6 years of industry experience in ML development and backend API integration
- Strong communication skills and ability to work with cross-functional teams

Posted 1 month ago

Apply

0.0 - 3.0 years

0 - 0 Lacs

Laxmi Nagar, Delhi, Delhi

On-site

About Us: Analytics Circle is a leading institute dedicated to empowering individuals with in-demand data analytics skills. We are passionate about bridging the industry-academia gap through practical, hands-on training in the most sought-after tools and technologies in data analytics.

Job Description: We are looking for a highly skilled and passionate Data Analyst Trainer to join our growing team. The ideal candidate should have real-world industry experience and a strong command of Advanced Excel, Power BI, Tableau, SQL, and Python. As a trainer, you will be responsible for delivering engaging and insightful sessions to our learners, preparing them for careers in data analytics.

Key Responsibilities:
- Deliver interactive and practical training sessions on Advanced Excel, Power BI, Tableau, SQL, and Python for data analysis
- Design and update course materials, case studies, and hands-on projects based on current industry trends
- Evaluate student progress through assignments, projects, and assessments
- Provide one-on-one mentorship and support to learners when needed
- Assist in curriculum development and continuous improvement of training content
- Stay updated with the latest developments in data analytics tools and technologies

Requirements:
- Bachelor's or Master's degree in Computer Science, Statistics, Data Science, or a related field
- Minimum 3 years of experience in the data analytics domain
- Proven training or teaching experience is preferred
- Proficiency in Excel (including pivot tables, lookups, macros, dashboards), Power BI (DAX, Power Query, data modeling), Tableau (data visualization, dashboard building), SQL (queries, joins, data manipulation), and Python (Pandas, NumPy, Matplotlib, data analysis workflows)
- Strong communication and presentation skills
- Passion for teaching and mentoring

Nice to Have:
- Industry certifications in relevant tools (e.g., Microsoft, Tableau, Python)
- Experience conducting online training/webinars
Job Types: Full-time, Part-time, Permanent Pay: ₹10,000.00 - ₹50,000.00 per month Schedule: Day shift Supplemental Pay: Commission pay Performance bonus Application Question(s): Weekdays Availability Monday to Friday Education: Bachelor's (Preferred) Experience: Teaching: 3 years (Preferred) Location: Laxmi Nagar, Delhi, Delhi (Required) Shift availability: Day Shift (Required) Work Location: In person

Posted 1 month ago

Apply

3.0 years

0 Lacs

Noida, Uttar Pradesh, India

On-site

About Us
Dailoqa is an international, AI-native company focused on solving business problems for clients in the Financial Services sector by combining assets with strong technical and functional skills. We help our clients become AI-native organisations by providing a roadmap for unlocking value through Combined Intelligence, a partnership between humans and agents, while integrating new technology into complex and legacy IT architectures.

Key Responsibilities:
- Data Analysis and Modelling: Collect, process, and analyse large datasets to extract actionable insights. Develop and implement statistical and machine learning models to solve complex business problems.
- Algorithm Development: Design and develop algorithms for data mining, predictive modelling, and other data-driven applications. Continuously improve and optimize algorithms for better performance.
- Visualization and Reporting: Create data visualizations and reports to effectively communicate insights and findings to stakeholders. Develop dashboards and interactive tools for data exploration.
- Collaboration: Work closely with cross-functional teams, including data engineers, AI engineers, and product managers, to understand project requirements and deliver high-quality solutions.
- Data Preparation: Perform data cleaning, transformation, and augmentation to ensure data quality and readiness for analysis. Implement ETL processes to streamline data workflows.
- Machine Learning Implementation: Develop and deploy machine learning models to production environments. Monitor model performance and implement necessary updates and improvements.
- Documentation and Reporting: Document the data analysis process, including data sources, methodologies, and results. Prepare and present reports on project progress and findings to stakeholders.
- Ethical Data Practices: Ensure that data analysis and modelling adhere to ethical standards and guidelines, promoting fairness and minimizing bias.
- Continuous Learning and Improvement: Stay updated with the latest advancements in data science and machine learning. Attend conferences, read research papers, and participate in professional development activities.
- Problem-Solving and Innovation: Identify and solve complex problems using innovative data-driven solutions. Propose and implement creative ideas to leverage data across applications and industries.
- Testing and Validation: Conduct rigorous testing and validation of models and algorithms to ensure accuracy, reliability, and scalability. Implement A/B testing and other validation techniques to assess real-world performance of data-driven solutions.
- Feedback Incorporation: Collect and analyse feedback from users and stakeholders to improve data models and applications. Iterate on model development based on user needs and project requirements.

Skills and Qualifications:
- Technical Skills: Proficiency in programming languages such as Python, R, and SQL. Strong knowledge of machine learning frameworks and libraries, including TensorFlow, PyTorch, and Scikit-learn. Experience with data visualization tools such as Tableau, Power BI, and Matplotlib. Familiarity with big data technologies, including Hadoop and Spark. Knowledge of data pre-processing, feature engineering, and ETL processes.
- Mathematics and Statistics: Strong foundation in linear algebra, calculus, and probability & statistics.
- Soft Skills: Excellent problem-solving and critical thinking skills. Strong communication skills for explaining complex technical concepts to non-technical stakeholders. Ability to work effectively in a team and collaborate with cross-functional teams. Commitment to continuous learning. Creativity and innovation in developing data-driven solutions.
- Domain Knowledge: Understanding of the Financial Services industry and its specific challenges. Awareness of ethical considerations in data science and machine learning.

Overall and Relevant Experience: 3+ years overall IT experience, with at least 2+ years of relevant experience in data science with Financial Services organisations. Bachelor's degree in Computer Science, Engineering, Statistics, Mathematics, or a related quantitative field.

Why Join Us:
- Innovative Environment: Work on cutting-edge AI technologies and innovative projects.
- International Exposure: Work with international clients and teams.
- Collaborative Culture: Be part of a passionate and supportive team.
- Professional Growth: Opportunities for continuous learning and development.
- Impactful Work: Contribute to meaningful projects that drive innovation and make a difference.

If you are excited about the prospect of working in a dynamic and forward-thinking company, we would love to hear from you!
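The data-preparation duties described above (cleaning, imputation, transformation) can be sketched with the standard library alone; in practice pandas and scikit-learn would handle this, and the numbers here are illustrative:

```python
import statistics

# Toy preprocessing: impute missing values with the mean of the
# observed values, then z-score standardize the resulting column.
raw = [12.0, None, 15.0, 9.0, None, 18.0]

observed = [v for v in raw if v is not None]
mean = statistics.fmean(observed)  # mean of the non-missing values
imputed = [mean if v is None else v for v in raw]

# Population standard deviation of the imputed column.
stdev = statistics.pstdev(imputed)
standardized = [(v - mean) / stdev for v in imputed]

print(imputed)               # missing slots filled with 13.5
print(sum(standardized))     # standardized values are centered at 0
```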

Posted 1 month ago

Apply

5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Responsibilities:
- Design and implement robust and scalable data architectures, including data warehouses, data lakes, and operational data stores, primarily leveraging Oracle Cloud services.
- Develop and maintain data models (conceptual, logical, and physical) that align with business requirements and ensure data integrity and consistency.
- Define data governance policies and procedures to ensure data quality, security, and compliance.
- Collaborate with data engineers to build and optimize ETL/ELT pipelines for efficient data ingestion, transformation, and loading.
- Develop and execute data migration strategies to Oracle Cloud.
- Utilize strong SQL skills to query, manipulate, and analyze large datasets from various sources.
- Leverage Python and relevant libraries (e.g., Pandas, NumPy) for data cleaning, transformation, and analysis.
- Design and develop interactive and insightful data visualizations using tools such as Tableau, Power BI, Matplotlib, Seaborn, or Plotly to communicate data-driven insights to both technical and non-technical stakeholders.
- Work closely with business analysts and stakeholders to understand their data needs and translate them into effective data models and visualizations.
- Ensure the performance and reliability of data visualization dashboards and reports.
- Stay up to date with the latest trends and technologies in data architecture, cloud computing (especially Oracle Cloud), and data visualization.
- Troubleshoot data-related issues and provide timely resolutions.
- Document data architectures, data flows, and data visualization solutions.
- Participate in the evaluation and selection of new data technologies and tools.

Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Information Systems, or a related field.
- Proven experience (typically 5+ years) as a Data Architect, Data Modeler, or similar role.
- Deep understanding of data warehousing concepts, dimensional modeling (e.g., star schema, snowflake schema), and ETL/ELT processes.
- Extensive experience working with relational databases, particularly Oracle, and proficiency in SQL.
- Hands-on experience with Oracle Cloud data services (e.g., Autonomous Data Warehouse, Object Storage, Data Integration).
- Strong programming skills in Python and experience with data manipulation and analysis libraries (e.g., Pandas, NumPy).
- Demonstrated ability to create compelling and effective data visualizations using industry-standard tools (e.g., Tableau, Power BI, Matplotlib, Seaborn, Plotly).
- Excellent analytical and problem-solving skills, with the ability to interpret complex data and translate it into actionable insights.
- Strong communication and presentation skills, with the ability to communicate technical concepts to non-technical audiences.
- Experience with data governance and data quality principles.
- Familiarity with agile development methodologies.
- Ability to work independently and collaboratively within a team environment.
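The dimensional modeling named in the qualifications can be illustrated with a toy star schema: one fact table joined to its dimension tables. This sketch uses in-memory SQLite with hypothetical table names purely for illustration; an Oracle Autonomous Data Warehouse design follows the same pattern at scale:

```python
import sqlite3

# Build a minimal star schema: two dimensions and one fact table.
con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, year INTEGER);
CREATE TABLE fact_sales  (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    amount     REAL
);
INSERT INTO dim_product VALUES (1, 'Widget'), (2, 'Gadget');
INSERT INTO dim_date VALUES (10, 2024), (11, 2025);
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 11, 150.0), (2, 11, 80.0);
""")

# Typical star-schema query: join the fact to its dimensions and aggregate.
rows = con.execute("""
    SELECT p.name, d.year, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p ON p.product_id = f.product_id
    JOIN dim_date d    ON d.date_id    = f.date_id
    GROUP BY p.name, d.year
    ORDER BY p.name, d.year
""").fetchall()
print(rows)
```

The design choice is the usual one: measures live in the narrow fact table, descriptive attributes in the dimensions, so reports only ever join outward from the fact.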

Posted 1 month ago

Apply

0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

Introduction
In this role, you'll work in one of our IBM Consulting Client Innovation Centers (Delivery Centers), where we deliver deep technical and industry expertise to a wide range of public and private sector clients around the world. Our delivery centers offer our clients locally based skills and technical expertise to drive innovation and adoption of new technology.

Your Role and Responsibilities
Scale an existing RAG code base for a production-grade AI application.

Requirements:
- Proficiency in prompt engineering, LLMs, and Retrieval Augmented Generation (RAG)
- Programming languages such as Python or Java
- Experience with vector databases
- Experience using LLMs in software applications, including prompting, calling, and processing outputs
- Experience with AI frameworks such as LangChain
- Troubleshooting skills and creativity in finding new ways to leverage LLMs
- Experience with Azure

Proof of Concept (POC) Development: Develop POCs to validate and showcase the feasibility and effectiveness of the proposed AI solutions. Collaborate with development teams to implement and iterate on POCs, ensuring alignment with customer requirements and expectations. Help showcase the ability of a Gen AI code assistant to refactor/rewrite and document code from one language to another, particularly COBOL to Java, through rapid prototypes/POCs.

Documentation and Knowledge Sharing: Document solution architectures, design decisions, implementation details, and lessons learned. Create technical documentation, white papers, and best-practice guides. Contribute to internal knowledge-sharing initiatives and mentor new team members.

Industry Trends and Innovation: Stay up to date with the latest trends and advancements in AI, foundation models, and large language models. Evaluate emerging technologies, tools, and frameworks to assess their potential impact on solution design and implementation. Experience in Python and PySpark is an added advantage.

Preferred Education: Master's Degree

Required Technical and Professional Expertise:
- Strong programming skills, with proficiency in Python and experience with AI frameworks such as TensorFlow, PyTorch, Keras, or Hugging Face
- Understanding of libraries such as scikit-learn, Pandas, Matplotlib, etc.
- Familiarity with cloud platforms (e.g., Kubernetes, AWS, Azure, GCP) and related services is a plus
- Experience and working knowledge of COBOL and Java is preferred
- Experience in code generation, code matching, and code translation
- Ability to prepare effort estimates, WBS, staffing plans, RACI, RAID, etc.
- Excellent interpersonal and communication skills; engage with stakeholders for analysis and implementation
- Commitment to continuous learning and staying updated with advancements in the field of AI
- A growth mindset to understand clients' business processes and challenges

Preferred Technical and Professional Experience:
- Education: Bachelor's, Master's, or Ph.D. degree in Computer Science, Artificial Intelligence, Data Science, or a related field
- Ability to communicate with internal and external clients to understand and define business needs, providing analytical solutions
- Ability to communicate results to technical and non-technical audiences
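The retrieval half of a RAG pipeline reduces to ranking documents by embedding similarity and passing the best match into the LLM prompt. A toy sketch with hand-made vectors (a production system would use a real embedding model and a vector database such as those named above):

```python
import math

# Cosine similarity between two equal-length vectors.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical 3-dimensional "embeddings" of indexed documents.
docs = {
    "cobol_notes": [0.9, 0.1, 0.0],
    "java_notes":  [0.1, 0.9, 0.1],
    "misc":        [0.2, 0.2, 0.9],
}
query = [0.85, 0.15, 0.05]  # stands in for an embedded user question

# Retrieval step: pick the document closest to the query vector.
best = max(docs, key=lambda name: cosine(query, docs[name]))
print(best)  # 'cobol_notes' -- the context that would augment the prompt
```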

Posted 1 month ago

Apply

0.0 - 3.0 years

9 - 12 Lacs

Hyderabad

Work from Office

As an AI Engineering Intern , you will work closely with our AI research and development team to build, test, and deploy machine learning models and AI solutions. You will gain hands-on experience with various AI techniques and technologies, helping to develop and improve AI-powered systems. Responsibilities: Assist in the development and optimization of machine learning models and algorithms. Support data preprocessing, cleaning, and analysis for AI-related projects. Collaborate with the AI team to implement and integrate AI solutions into production systems. Contribute to the design and development of AI systems, including NLP, computer vision, or other domains based on project needs. Help in writing clean, scalable, and well-documented code for AI applications. Participate in the testing and validation of AI models, and identify areas for improvement. Stay up-to-date with the latest advancements in AI and machine learning technologies. Qualifications: Currently pursuing a degree in Computer Science, Engineering, Mathematics, or a related field (preferably at the undergraduate or graduate level). Solid understanding of machine learning concepts and algorithms (e.g., supervised learning, unsupervised learning, deep learning, etc.). Familiarity with programming languages such as Python, R, or similar. Experience with machine learning frameworks (e.g., TensorFlow, PyTorch, Scikit-learn) is a plus. Strong problem-solving skills and analytical thinking. Ability to work independently as well as part of a collaborative team. Good communication skills, with the ability to present ideas and technical concepts clearly. Preferred Qualifications: Experience with cloud platforms such as AWS, GCP, or Azure. Familiarity with data wrangling and data visualization tools (e.g., Pandas, Matplotlib, Seaborn). Knowledge of advanced AI topics such as reinforcement learning, generative models, or NLP. Exposure to version control systems (e.g., Git). 
Benefits: Mentorship from experienced AI engineers. Hands-on experience with state-of-the-art AI technologies. Opportunity to contribute to real-world AI projects.

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 9 Lacs

Mumbai

Work from Office

- 3+ years of experience building software solutions using Python
- Strong fundamentals of Python: data layout, generators, decorators, file IO, dynamic programming, algorithms, etc.
- Working knowledge of the Python standard library and libraries such as an ORM library, NumPy, SciPy, Matplotlib, mlab, etc.
- Knowledge of fundamental design principles to build a scalable application
- Knowledge of Python web frameworks
- Working knowledge of core Java is an added plus
- Knowledge of web technologies (HTTP, JS) is an added plus
- A financial background is an added plus
- Technical capabilities in big data analytics are also an added plus

Salary Package: As per industry standard
Preferred Programs: BE or BTech or equivalent degree with a strong Mathematics and Statistics foundation (for example, B.Sc. or M.Sc. in Mathematics & Computer Science)

Posted 1 month ago

Apply

3.0 - 5.0 years

0 Lacs

Bengaluru, Karnataka, India

On-site

We are looking for a Senior AI/Data Scientist with 3-5 years of experience who is passionate about building AI and machine learning solutions for real-world business problems. As part of our AI team, you will design, develop and deploy advanced machine learning models, Generative AI applications and AI-powered decision systems. You will work with structured and unstructured data, develop predictive models, AI-driven insights and business-aware Generative AI agents that enhance productivity and decision-making. Key Responsibilities • Build Gen AI-enabled solutions using online and offline LLMs, SLMs and TLMs tailored to domain-specific problems. • Deploy agentic AI workflows and use cases using frameworks like LangGraph, Crew AI etc. • Apply NLP, predictive modelling and optimization techniques to develop scalable machine learning solutions. • Integrate enterprise knowledge bases using Vector Databases and Retrieval Augmented Generation (RAG). • Apply advanced analytics to address complex challenges in Healthcare, BFSI and Manufacturing domains. • Deliver embedded analytics within business systems to drive real-time operational insights. Required Skills & Experience • 3–5 years of experience in applied Data Science or AI roles. • Experience working in any one of the following domains: BFSI, Healthcare/Health Sciences, Manufacturing or Utilities. • Proficiency in Python, with hands-on experience in libraries such as scikit-learn, TensorFlow • Practical experience with Gen AI (LLMs, RAG, vector databases), NLP and building scalable ML solutions. • Experience with time series forecasting, A/B testing, Bayesian methods and hypothesis testing. • Strong skills in working with structured and unstructured data, including advanced feature engineering. • Familiarity with analytics maturity models and the development of Analytics Centre of Excellence (CoE’s). • Exposure to cloud-based ML platforms like Azure ML, AWS SageMaker or Google Vertex AI. 
• Data visualization using Matplotlib, Seaborn, Plotly; experience with Power BI is a plus.

What We Look for (Values & Behaviours)
• AI-First Thinking: Passion for leveraging AI to solve business problems.
• Data-Driven Mindset: Ability to extract meaningful insights from complex data.
• Collaboration & Agility: Comfortable working in cross-functional teams with a fast-paced mindset.
• Problem-Solving: Think beyond the obvious to unlock AI-driven opportunities.
• Business Impact: Focus on measurable outcomes and real-world adoption of AI.
• Continuous Learning: Stay updated with the latest AI trends, research, and best practices.

Why Join Us?
• Work on cutting-edge AI & GenAI projects.
• Be part of a high-caliber AI team solving complex business challenges.
• Exposure to global enterprises and AI-driven decision-making.
• Competitive compensation and fast-track career growth in AI.
• Get mentored by best-in-class AI leaders who will help shape you into a top AI professional.
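The A/B testing named in the required experience often starts with a two-proportion z-test comparing conversion rates between variants. A small sketch using only the standard library, with made-up conversion counts:

```python
import math
from statistics import NormalDist

# Two-sided two-proportion z-test for an A/B experiment.
def two_proportion_z(conv_a, n_a, conv_b, n_b):
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))      # two-sided p-value
    return z, p_value

# Illustrative numbers: variant B converts 15% vs. 12% for A.
z, p = two_proportion_z(conv_a=120, n_a=1000, conv_b=150, n_b=1000)
print(round(z, 2), round(p, 4))
```

With these toy counts the lift is borderline significant at the usual 5% level, which is exactly the kind of call such a test informs.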

Posted 1 month ago

Apply

2.0 - 3.0 years

0 Lacs

Mumbai Metropolitan Region

On-site

Experience Required 2 to 3 years Job Description About The Company Axis My India is India’s foremost Consumer Data Intelligence Company, which in partnership with Google is building a single-stop People Empowerment Platform, the ‘a’ app, that aims to change people’s awareness, accessibility, and utilization of a slew of services. At Axis, we are dedicated to making a tangible impact on the lives of millions. If you're passionate about creating meaningful changes and aren't afraid to get your hands dirty, we want you on our team! For more insights of the company, kindly visit our website https://www.axismyindia.org Role Overview Axis My India is seeking a skilled Data Analyst & Data Visualizer with at least 2 years of experience to join our team supporting the Axis My India “a” APP and custom projects. In this role, you will be responsible for analyzing, and interpreting data as well as designing impactful visualizations and dashboards. Your work will help drive data-driven decision-making, enhance user engagement, and support the company’s mission to connect and resolve problems for 250 million Indian households through innovative digital solutions. Key Responsibilities Collect, clean, and process data from multiple sources including databases, APIs, and third-party platforms to ensure data accuracy and reliability for the APP data and similarly for other custom projects. Analyze data to identify trends, patterns, and actionable insights that inform product development, market research, and social impact initiatives. Design and develop interactive dashboards and visualizations using tools such as Power BI or similar platforms to communicate findings clearly. Present complex data insights using fusion charts, concise reports and visual formats to technical and non-technical stakeholders. Collaborate closely with cross-functional teams including product, Technology, Operations and research to define analytics and visualization requirements. 
Monitor and report on key performance indicators (KPIs) relevant to the app’s usage, impact, and outreach. Ensure data integrity and governance throughout the data lifecycle. Stay updated on the latest analytics and visualization tools, techniques, and best practices to continuously improve the data experience for Axis My India projects and APP users. Creating the presentations, documents, reports based on requirement Required Skills & Qualifications Bachelor’s degree in Statistics, Mathematics, Computer Science, Data Science, Design, or a related field. Minimum 2 years of professional experience in data analysis and data visualization roles. Proficiency in Python and Power BI. Proficient with basic statistics. Experience with data visualization tools like Power BI, PowerPoint and excel. Strong analytical, problem-solving, and data storytelling skills. Knowledge of data blending, dashboard optimization, and UI/UX principles. Excellent communication and collaboration skills, with the ability to translate complex data into actionable insights. Ability to manage multiple projects and deliver results in a fast-paced environment. Preferred Experience Experience working with app-based data and multi-project analytics environments. Familiarity with analyzing and visualizing multiple survey data. Worked on basic statistics for data analysis Requirements Technical Skills Python: Data manipulation, analysis, and visualization with libraries like Pandas, NumPy, Matplotlib, and Seaborn. Power BI: Industry-standard tools for creating interactive dashboards and visual reports, enabling clear communication of insights to stakeholders. Matplotlib/Seaborn: Python libraries for custom visualizations and advanced charting. Excel: Useful for basic data analysis, quick visualizations, and reporting Proficiency in cleaning, preprocessing, and transforming raw data-handling missing values, outliers, duplicates, and standardizing formats to ensure data accuracy and reliability. 
Must be strong in using statistical concepts for analysis and visual design principles.

Analytical & Business Skills
Critical Thinking & Problem Solving: Strong analytical mindset to interpret complex data, identify trends, and provide actionable insights.
Data Transformation: Ability to transform complex data into compelling narratives that highlight key findings and support data-driven decision-making.

Communication & Collaboration
Effective Communication: Ability to present complex data insights clearly and succinctly to internal teams and stakeholders, including senior executives, through reports, dashboards, and presentations.
Working with Cross-functional Teams: Experience collaborating with cross-functional teams, gathering requirements, and incorporating feedback to refine dashboards and reports.

Benefits
- Competitive salary and benefits package
- Opportunity to make significant contributions to a dynamic company
- Evening snacks are provided by the company to keep you refreshed towards the end of the day
- Walking distance from Chakala metro station, making commuting easy and convenient

At Axis My India, we value discipline and focus. Our team members wear uniforms, adhere to a no-mobile policy during work hours, and work from our office with alternate Saturdays off. If you thrive in a structured environment and are committed to excellence, we encourage you to apply.

Posted 1 month ago

Apply

0 years

0 Lacs

India

Remote

CryptoChakra is a forward-thinking cryptocurrency analytics and education platform committed to making data-driven insights accessible and actionable for users worldwide. As we scale our operations and enhance our offerings, we prioritize transparency, innovation, and user-centric design. Our team is dedicated to transforming complex blockchain data into clear, compelling visual narratives that empower decision-making and foster crypto literacy.

Role Description
Position: Fresher Data Visualisation Analyst – Remote
Employment Type: Internship (paid or unpaid, depending on suitability and project requirements)

Key Responsibilities:
Data Visualisation Design: Create engaging and intuitive charts, graphs, dashboards, and infographics to communicate trends, patterns, and insights from cryptocurrency market data.
Data Interpretation: Collaborate with data analysts and product teams to understand analysis results and translate them into visually compelling stories for diverse audiences.
Tool Proficiency: Use industry-standard visualisation tools such as Tableau, Power BI, or Python libraries (Matplotlib, Seaborn, Plotly) to build interactive dashboards and reports.
Data Quality: Assist in reviewing and validating datasets to ensure accuracy and relevance for visualisation purposes.
Stakeholder Communication: Present visualisations and findings to internal teams and stakeholders, ensuring clarity and actionable insights for both technical and non-technical audiences.
Collaborative Innovation: Work closely with data scientists, engineers, and product managers to refine visualisation requirements and enhance the overall user experience.
Documentation: Maintain documentation of visualisation methodologies, data sources, and design choices for transparency and future reference.

Learning Outcomes:
Hands-on Experience: Gain practical experience with real-world crypto datasets and advanced visualisation tools in a fintech environment.
Skill Development: Learn best practices in data storytelling, dashboard design, and user interface principles.
Mentorship: Receive guidance from experienced data visualisation specialists and blockchain professionals.

Qualifications
Core Requirements:
Analytical Mindset: Strong interest in data analysis and the ability to interpret complex datasets.
Technical Skills: Familiarity with data visualisation tools (Tableau, Power BI, or Python libraries) and basic data manipulation techniques (Excel, SQL).
Design Sensibility: An eye for design, including layout, color theory, and accessibility in visual communication.
Communication: Excellent written and verbal skills to present findings clearly and collaborate with cross-functional teams.
Remote Work Ethic: Self-motivated, organized, and able to work independently in a distributed team.
Academic Background: Pursuing or recently completed a degree in Data Science, Computer Science, Statistics, Graphic Design, or a related field.

Preferred Assets:
Portfolio: A showcase of past visualisation projects (GitHub, Tableau Public, or academic work).
Blockchain/Crypto Interest: Curiosity about cryptocurrency markets or blockchain technology.
UI/UX Basics: Understanding of user experience principles for dashboard usability.

Why Join CryptoChakra?
Impactful Contribution: Help shape how crypto data is understood and acted upon by users worldwide.
Professional Growth: Develop expertise in data visualisation, analytics, and fintech innovation.
Flexible Environment: Enjoy remote work, mentorship, and opportunities to contribute to exciting projects.

Posted 1 month ago



