Home
Jobs

688 Pandas Jobs - Page 24

JobPe aggregates results for easy access, but you apply directly on the original job portal.

2.0 - 6.0 years

4 - 8 Lacs

Chennai

Work from Office

Naukri logo

Key Responsibilities:
- Develop, deploy, and maintain scalable web applications using Python (Flask/Django).
- Design and implement RESTful APIs with strong security and authentication mechanisms.
- Work with MongoDB and other database management systems to store and query data efficiently.
- Support and productize Machine Learning models, including feature engineering, training, tuning, and scoring.
- Understand and apply distributed computing concepts to build high-performance systems.
- Handle web hosting and deployment of applications, ensuring uptime and performance.
- Collaborate with stakeholders to translate business requirements into technical solutions.
- Communicate effectively with both technical and non-technical team members.
- Take ownership of projects, troubleshoot production issues, and implement solutions proactively.

Required Skills & Qualifications:
- 3-5 years of experience in Python development, primarily with Flask (Django experience is a plus).
- Solid knowledge of distributed systems and web architectures.
- Hands-on experience with Machine Learning workflows and model deployment.
- Experience with MongoDB and other database technologies.
- Strong knowledge of RESTful API development and security best practices.
- Excellent problem-solving skills and the ability to work independently.
- Strong communication skills to clearly explain technical concepts to a diverse audience.

Nice to Have:
- Bachelor's or Master's degree in Computer Science, IT, or a related field.
- Experience with data manipulation using Pandas, Spark, and handling large datasets.
- Familiarity with the Django framework in addition to Flask.
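For orientation, a minimal sketch of the kind of Flask scoring endpoint this role describes. The toy training data, feature names, and route are assumptions for illustration, not part of the posting; a real service would load a persisted model artifact instead of fitting one inline.

```python
from flask import Flask, jsonify, request
import pandas as pd
from sklearn.linear_model import LogisticRegression

app = Flask(__name__)

# Stand-in model trained on toy data so the example is self-contained;
# a real service would load a saved artifact (e.g. with joblib).
_train = pd.DataFrame({"age": [25, 40, 35, 50], "income": [30000, 80000, 50000, 90000]})
_model = LogisticRegression().fit(_train, [0, 1, 0, 1])

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)          # e.g. {"age": 42, "income": 55000}
    features = pd.DataFrame([payload])[["age", "income"]]
    score = _model.predict_proba(features)[0, 1]    # probability of the positive class
    return jsonify({"score": float(score)})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)
```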

Posted 1 month ago

Apply

3.0 - 7.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

About Us: Omni's team is passionate about Commerce and Digital Transformation. We've been successfully delivering Commerce solutions for clients across North America, Europe, Asia, and Australia. The team has experience executing and delivering projects in B2B and B2C solutions.

Job Description: We are seeking a high-impact AI/ML Engineer to lead the design, development, and deployment of machine learning and AI solutions across vision, audio, and language modalities. You'll be part of a fast-paced, outcome-oriented AI & Analytics team, working alongside data scientists, engineers, and product leaders to transform business use cases into real-time, scalable AI systems. This role demands strong technical leadership, a product mindset, and hands-on expertise in Computer Vision, Audio Intelligence, and Deep Learning.

Key Responsibilities:
- Architect, develop, and deploy ML models for multimodal problems, including vision (image/video), audio (speech/sound), and NLP tasks.
- Own the complete ML lifecycle: data ingestion, model development, experimentation, evaluation, deployment, and monitoring.
- Leverage transfer learning, foundation models, or self-supervised approaches where suitable.
- Design and implement scalable training pipelines and inference APIs using frameworks like PyTorch or TensorFlow.
- Collaborate with MLOps, data engineering, and DevOps teams to productionize models using Docker, Kubernetes, or serverless infrastructure.
- Continuously monitor model performance and implement retraining workflows to ensure accuracy over time.
- Stay ahead of the curve on cutting-edge AI research (e.g., generative AI, video understanding, audio embeddings) and incorporate innovations into production systems.
- Write clean, well-documented, and reusable code to support agile experimentation and long-term platform sustainability.

Requirements:
- Bachelor's or Master's degree in Computer Science, Artificial Intelligence, Data Science, or a related field.
- 5-8+ years of experience in AI/ML Engineering, with at least 3 years in applied deep learning.

Technical Skills:
- Languages: Expert in Python; good knowledge of R or Java is a plus.
- ML/DL Frameworks: Proficient with PyTorch, TensorFlow, Scikit-learn, ONNX.
- Computer Vision: Image classification, object detection, OCR, segmentation, tracking (YOLO, Detectron2, OpenCV, MediaPipe).
- Audio AI: Speech recognition (ASR), sound classification, audio embedding models (Wav2Vec2, Whisper, etc.).
- Data Engineering: Strong with Pandas, NumPy, SQL, and preprocessing pipelines for structured and unstructured data.
- NLP/LLMs: Working knowledge of Transformers, BERT/LLaMA, and the Hugging Face ecosystem is preferred.
- Cloud & MLOps: Experience with AWS/GCP/Azure, MLflow, SageMaker, Vertex AI, or Azure ML.
- Deployment & Infrastructure: Experience with Docker, Kubernetes, REST APIs, serverless ML inference.
- CI/CD & Version Control: Git, DVC, ML pipelines, Jenkins, Airflow, etc.

Soft Skills & Competencies:
- Strong analytical and systems thinking; able to break down business problems into ML components.
- Excellent communication skills; able to explain models, results, and decisions to non-technical stakeholders.
- Proven ability to work cross-functionally with designers, engineers, product managers, and analysts.
- Demonstrated bias for action, rapid experimentation, and iterative delivery of impact.
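As an illustration of the transfer-learning bullet above, here is a hedged PyTorch sketch (not the employer's codebase): freeze a pretrained torchvision backbone and train only a new classification head. The class count and the random batch standing in for a DataLoader are assumptions; `ResNet18_Weights.DEFAULT` assumes torchvision 0.13 or newer and downloads pretrained weights on first use.

```python
import torch
import torch.nn as nn
from torchvision import models

NUM_CLASSES = 5  # assumed number of target classes

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
for p in model.parameters():          # freeze the pretrained backbone
    p.requires_grad = False
model.fc = nn.Linear(model.fc.in_features, NUM_CLASSES)  # new trainable head

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

# one dummy training step on random data, standing in for a real DataLoader
images = torch.randn(8, 3, 224, 224)
labels = torch.randint(0, NUM_CLASSES, (8,))
loss = criterion(model(images), labels)
loss.backward()
optimizer.step()
```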

Posted 1 month ago

Apply

3.0 - 6.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Naukri logo

Reference: 250008SV

Responsibilities (ML Ops Engineer): You will be responsible for the deployment and maintenance of the group data science platform infrastructure, on which data science pipelines are deployed and scaled. To achieve this, you will collaborate with Data Scientists and Data Engineers from various business lines and the Global Technology Service infrastructure team (GTS).

Roles:
- Implement techniques and processes for supporting the development and scaling of data science pipelines.
- Industrialize inference, retraining, and monitoring of data science pipelines, ensuring their maintainability and compliance.
- Provide platform support to end users; be attentive to the needs and requirements they express.
- Anticipate needs and necessary developments for the platform.
- Work closely with Data Scientists, Data Engineers, and business stakeholders.
- Stay updated and demonstrate a keen interest in the ML Ops domain.

Environment:
- Cloud/on-premise: Azure
- Python, Kubernetes
- Integrated vendor solutions: Dataiku, Snowflake
- DB: PostgreSQL
- Distributed computing: Spark
- Big Data: Hadoop, S3/Scality, MapR
- Data science: Scikit-learn, Transformers, MLflow, Kedro
- DevOps, CI/CD: JFrog, Harbor, GitHub Actions, Jenkins
- Monitoring: Elasticsearch/Kibana, Grafana, Zabbix
- Agile ceremonies: PI planning, Sprint, Sprint Review, Refinement, Retrospectives; ITIL framework

Profile required (Technical Skills):
- Python: FastAPI, SQLAlchemy, NumPy, Pandas, Scikit-learn, Transformers
- Kubernetes, Docker
- Pytest
- CI/CD: Jenkins, Ansible, GitHub Actions, Harbor, Docker

Soft Skills:
- Client Focus: Demonstrate strong listening skills, understanding, and anticipation of user needs.
- Team Spirit: Organize collaboration and workshops to find the best solutions; share expertise with colleagues to find the most suitable solutions.
- Innovation: Propose innovative ideas, solutions, or strategies, and think outside the box; prefer simplicity over complexity.
- Responsibility: Take ownership, keep commitments, and respect deadlines.

Why join us: We are committed to creating a diverse environment and are proud to be an equal opportunity employer. All qualified applicants receive consideration for employment without regard to race, color, religion, gender, gender identity or expression, sexual orientation, national origin, genetics, disability, age, or veteran status.

Business insight: At Société Générale, we are convinced that people are drivers of change, and that the world of tomorrow will be shaped by all their initiatives, from the smallest to the most ambitious. Whether you're joining us for a period of months, years, or your entire career, together we can have a positive impact on the future. Creating, daring, innovating, and taking action are part of our DNA. If you too want to be directly involved, grow in a stimulating and caring environment, feel useful on a daily basis, and develop or strengthen your expertise, you will feel right at home with us!
Still hesitating? You should know that our employees can dedicate several days per year to solidarity actions during their working hours, including sponsoring people struggling with their orientation or professional integration, participating in the financial education of young apprentices, and sharing their skills with charities. There are many ways to get involved. We are committed to supporting the acceleration of our Group's ESG strategy by implementing ESG principles in all our activities and policies. They are translated into our business activity (ESG assessment, reporting, project management or IT activities), our work environment, and our responsible practices for environmental protection.

Diversity and Inclusion: We are an equal opportunities employer and we are proud to make diversity a strength for our company. Societe Generale is committed to recognizing and promoting all talents, regardless of their beliefs, age, disability, parental status, ethnic origin, nationality, gender identity, sexual orientation, membership of a political, religious, trade union or minority organisation, or any other characteristic that could be subject to discrimination.
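A hedged sketch of an inference service using the FastAPI/Pandas/scikit-learn stack listed in the profile above. The model, feature names, and route are assumptions for illustration; a production service would load a trained artifact rather than fitting a stand-in classifier at import time. Assumes Pydantic v2 (for `model_dump`).

```python
from fastapi import FastAPI
from pydantic import BaseModel
import pandas as pd
from sklearn.dummy import DummyClassifier

app = FastAPI(title="ml-inference")

# stand-in model so the example is self-contained
_train = pd.DataFrame({"x1": [0.0, 1.0, 2.0], "x2": [1.0, 0.0, 2.0]})
_model = DummyClassifier(strategy="most_frequent").fit(_train, [0, 1, 1])

class Features(BaseModel):
    x1: float
    x2: float

@app.post("/predict")
def predict(features: Features) -> dict:
    frame = pd.DataFrame([features.model_dump()])   # single-row frame for scoring
    return {"prediction": int(_model.predict(frame)[0])}

# run with: uvicorn service:app --reload   (assuming this file is saved as service.py)
```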

Posted 1 month ago

Apply

6.0 - 11.0 years

16 - 18 Lacs

Bangalore Rural, Bengaluru

Hybrid

Naukri logo

6–8 years of experience as an Automation Engineer in Python, AI/ML (TensorFlow/PyTorch), test automation (Selenium/Cypress), APIs, cloud (AWS/Azure/GCP), NLP, and Git.

Posted 1 month ago

Apply

3.0 - 8.0 years

9 - 18 Lacs

Hyderabad

Hybrid

Naukri logo

Data Engineer with Python development experience
Experience: 3+ Years | Mode: Hybrid (2-3 days/week) | Location: Hyderabad

Key Responsibilities:
- Develop, test, and deploy data processing pipelines using AWS serverless technologies such as AWS Lambda, Step Functions, DynamoDB, and S3.
- Implement ETL processes to transform and process structured and unstructured data efficiently.
- Collaborate with business analysts and other developers to understand requirements and deliver solutions that meet business needs.
- Write clean, maintainable, and well-documented code following best practices.
- Monitor and optimize the performance and cost of serverless applications.
- Ensure high availability and reliability of the pipeline through proper design and error-handling mechanisms.
- Troubleshoot and debug issues in serverless applications and data workflows.
- Stay up to date with emerging technologies in the AWS and serverless ecosystem to recommend improvements.

Required Skills and Experience:
- 3-5 years of hands-on Python development experience, including experience with libraries like boto3, Pandas, or similar tools for data processing.
- Strong knowledge of AWS services, especially Lambda, S3, DynamoDB, Step Functions, SNS, SQS, and API Gateway.
- Experience building data pipelines or workflows to process and transform large datasets.
- Familiarity with serverless architecture and event-driven programming.
- Knowledge of best practices for designing secure and scalable serverless applications.
- Proficiency in version control systems (e.g., Git) and collaboration tools.
- Understanding of CI/CD pipelines and DevOps practices.
- Strong debugging and problem-solving skills.
- Familiarity with database systems, both SQL (e.g., RDS) and NoSQL (e.g., DynamoDB).

Preferred Qualifications:
- AWS certifications (e.g., AWS Certified Developer Associate or AWS Certified Solutions Architect Associate).
- Familiarity with testing frameworks (e.g., pytest) and ensuring test coverage for Python applications.
- Experience with Infrastructure as Code (IaC) tools such as AWS CDK and CloudFormation.
- Knowledge of monitoring and logging tools.

Apply for Position
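A sketch of the serverless transform pattern described above: an AWS Lambda handler that reads a CSV landed in S3, aggregates it with pandas, and writes a summary back. The bucket name, object keys, and column names are placeholders; appropriate IAM permissions, an S3 trigger, and a pandas-capable Lambda layer or container image are assumed.

```python
import io
import boto3
import pandas as pd

s3 = boto3.client("s3")
OUTPUT_BUCKET = "my-curated-bucket"  # hypothetical output bucket

def lambda_handler(event, context):
    # assumes an S3-triggered event; process the first record
    record = event["Records"][0]["s3"]
    bucket, key = record["bucket"]["name"], record["object"]["key"]

    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
    df = pd.read_csv(io.BytesIO(body))

    # assumed columns: customer_id, amount
    summary = df.groupby("customer_id", as_index=False).agg(total=("amount", "sum"))

    buffer = io.StringIO()
    summary.to_csv(buffer, index=False)
    s3.put_object(Bucket=OUTPUT_BUCKET, Key=f"summaries/{key}", Body=buffer.getvalue())
    return {"rows_in": len(df), "rows_out": len(summary)}
```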

Posted 1 month ago

Apply

8.0 - 12.0 years

13 - 20 Lacs

Chennai

Work from Office

Naukri logo

We are looking for an experienced Python ETL Developer to design, develop, and optimize data pipelines. The ideal candidate should have expertise in Python, PySpark, Airflow, and data processing frameworks, along with the ability to work independently and communicate effectively in English.

Roles & Responsibilities:
- Develop and maintain ETL pipelines using Python, NumPy, Pandas, PySpark, and Apache Airflow.
- Work with large-scale data processing and transformation workflows.
- Optimize and enhance ETL performance and scalability.
- Collaborate with data engineers and business teams to ensure efficient data flow.
- Troubleshoot and debug ETL-related issues to ensure data integrity and reliability.

Qualifications & Skills:
- 8+ years of Python experience, with 5+ years dedicated to Python ETL development.
- Proficiency in PySpark, Apache Airflow, NumPy, and Pandas.
- Experience working with SQLAlchemy and FastAPI (added advantage).
- Strong problem-solving skills and the ability to work independently.
- Good English communication skills to collaborate with global teams.

Preferred Qualifications:
- Experience in cloud-based ETL solutions (AWS, GCP, Azure).
- Knowledge of big data technologies like Hadoop, Spark, or Kafka.
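To make the Airflow-plus-pandas pattern concrete, here is a minimal DAG sketch. The file paths, schedule, and columns are illustrative assumptions; the `schedule` argument assumes Airflow 2.4 or later (older 2.x releases use `schedule_interval`).

```python
from datetime import datetime

import pandas as pd
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**_):
    # stand-in extract step: in practice this would pull from a source system
    pd.DataFrame({"id": [1, 2], "value": [10, 20]}).to_csv("/tmp/raw.csv", index=False)

def transform(**_):
    df = pd.read_csv("/tmp/raw.csv")
    df["value_doubled"] = df["value"] * 2
    df.to_csv("/tmp/clean.csv", index=False)

with DAG(
    dag_id="example_pandas_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    extract_task >> transform_task   # run extract before transform
```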

Posted 1 month ago

Apply

5.0 - 10.0 years

20 - 32 Lacs

Hyderabad

Hybrid

Naukri logo

Responsibilities: As a Python Developer, you will be responsible for designing, developing, and maintaining Python-based applications and services. You will collaborate with cross-functional teams to deliver high-quality software solutions that meet the needs of our business divisions. Your role will involve:
- Developing and maintaining Python applications and services.
- Collaborating with product managers, designers, and other developers to create efficient and scalable solutions.
- Writing clean, maintainable, and efficient code.
- Participating in code reviews and providing constructive feedback to peers.
- Troubleshooting and debugging issues across the stack.
- Ensuring the performance, quality, and responsiveness of applications.
- Staying up to date with emerging technologies and industry trends.

Mandatory Skills: Full Stack

Mandatory Skills Description:
- Proven experience of more than 5 years as a Python Developer or in a similar role.
- Proficiency in Python and its frameworks such as Django or Flask.
- Strong knowledge of back-end technologies and RESTful APIs.
- Experience with Pandas, and knowledge of AI/ML libraries like TensorFlow, PyTorch, etc.
- Experience working with Kubernetes/OpenShift, Docker, and cloud-native frameworks.
- Experience with database technologies such as SQL, NoSQL, and ORM frameworks.
- Familiarity with version control systems like Git.
- Familiarity with Linux and Windows operating systems.
- Knowledge of shell scripting (Bash for Linux and PowerShell for Windows) will be advantageous.
- Understanding of Agile methodologies and DevOps practices.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.
- Ability to work independently and as part of a team.
- A degree in Computer Science, Engineering, or a related field is preferred.

Nice-to-Have Skills Description: Experience in Agile Framework

Posted 1 month ago

Apply

1.0 - 3.0 years

4 - 5 Lacs

Ahmedabad, Surat

Work from Office

Naukri logo

About Us: Founded in 2008, Red & White is Gujarat's leading NSDC- and ISO-certified institute, focused on industry-relevant education and global employability.

Role Overview: We're hiring a faculty member to teach AI, Machine Learning, and Data Science. The role includes delivering lectures, guiding projects, mentoring students, and staying updated with tech trends.

Key Responsibilities:
- Deliver high-quality lectures on AI, Machine Learning, and Data Science.
- Design and update course materials, assignments, and projects.
- Guide students on hands-on projects, real-world applications, and research work.
- Provide mentorship and support for student learning and career development.
- Stay updated with the latest trends and advancements in AI/ML and Data Science.
- Conduct assessments, evaluate student progress, and provide feedback.
- Participate in curriculum development and improvements.

Skills & Tools:
- Core Skills: ML, Deep Learning, NLP, Computer Vision, Business Intelligence, AI Model Development, Business Analysis.
- Programming: Python, SQL (must), Pandas, NumPy, Excel.
- ML & AI Tools: Scikit-learn (must), XGBoost, LightGBM, TensorFlow, PyTorch (must), Keras, Hugging Face.
- Data Visualization: Tableau, Power BI (must), Matplotlib, Seaborn, Plotly.
- NLP & CV: Transformers, BERT, GPT, OpenCV, YOLO, Detectron2.
- Advanced AI: Transfer Learning, Generative AI, Business Case Studies.

Education & Experience Requirements:
- Bachelor's/Master's/Ph.D. in Computer Science, AI, Data Science, or a related field.
- Minimum 1+ years of teaching or industry experience in AI/ML and Data Science.
- Hands-on experience with Python, SQL, TensorFlow, PyTorch, and other AI/ML tools.
- Practical exposure to real-world AI applications, model deployment, and business analytics.

For further information, please feel free to contact us at 7862813693 or via email at career@rnwmultimedia.edu.in
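A compact, classroom-style example of the scikit-learn workflow named in the skills list: split, fit in a pipeline, evaluate. The built-in iris dataset is used only to keep the example self-contained; it is illustrative, not part of the institute's curriculum.

```python
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=42)

# scale features, then fit a multinomial logistic regression
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
clf.fit(X_train, y_train)

print("test accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```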

Posted 1 month ago

Apply

8.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Naukri logo

Experience: 8+ years | Location: Bangalore | Notice: Immediate joiners only

Job Description:
Summary: Automate critical aspects of the reserves workflow to improve consistency and reduce repetitive manual creation of tables and plots. Leverage MCBU's type-curve automation and adapt it to reserves. The current type-curve workflow is built through a series of Python scripts that generate standardized documentation from technical work developed by engineers, plus basin-wide statistics that support asset characterization.

Objective: To enhance the efficiency, accuracy, and reliability of the Reserves Advisory Committee (RAC) documentation process by automating data preparation, streamlining workflows, and improving version control, while eliminating the use of multiple data sources by Qualified Reserves Estimators (QREs) and ensuring seamless integration between ValNav, TRACER, and CRSS.

Technical skills needed:
- Robust knowledge of Python programming (key libraries: pandas, openpyxl, pptx, matplotlib, shapefile).
- Solid expertise with Gradio, the main platform used as the user interface, to create input/output visualisations.
- Proficient in VS Code and ADO pipeline management.
- Fluency in constructing data visualisation and analytics.
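A hedged sketch of the "standardized documentation" pattern described above: build a plot with pandas/matplotlib and place it on a PowerPoint slide with python-pptx. The data, file names, and slide layout are assumptions for illustration, not the actual reserves workflow.

```python
import matplotlib
matplotlib.use("Agg")  # headless rendering for automated runs
import matplotlib.pyplot as plt
import pandas as pd
from pptx import Presentation
from pptx.util import Inches

# stand-in for well-level rate data pulled from the real source systems
df = pd.DataFrame({"month": range(1, 13), "rate": [100 / m for m in range(1, 13)]})

fig, ax = plt.subplots(figsize=(6, 4))
df.plot(x="month", y="rate", ax=ax, title="Type curve (illustrative)")
fig.savefig("type_curve.png", dpi=150, bbox_inches="tight")

prs = Presentation()
slide = prs.slides.add_slide(prs.slide_layouts[5])   # "Title Only" layout in the default template
slide.shapes.title.text = "Reserves type curve"
slide.shapes.add_picture("type_curve.png", Inches(1), Inches(1.5), width=Inches(8))
prs.save("reserves_summary.pptx")
```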

Posted 1 month ago

Apply

5.0 - 9.0 years

25 - 40 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Naukri logo

Salary: 25 to 40 LPA | Experience: 4 to 8 years | Location: Noida/Gurugram/Bangalore | Notice: Immediate to 30 days

Roles & responsibilities:
- 5+ years of experience with Python, ML, and banking model development.
- Interact with the client to understand their requirements and communicate/brainstorm solutions.
- Model development: design, build, and implement credit risk models.
- Contribute to how the analytical approach is structured for the specification of analysis.
- Contribute insights from conclusions of analysis that integrate with the initial hypothesis and business objective; independently address complex problems.
- 3+ years of experience with ML/Python (predictive modelling).
- Design, implement, test, deploy, and maintain innovative data and machine learning solutions to accelerate our business.
- Create experiments and prototype implementations of new learning algorithms and prediction techniques.
- Collaborate with product managers and stakeholders to design and implement software solutions for science problems.
- Use machine learning best practices to ensure a high standard of quality for all team deliverables.
- Experience working with unstructured data (text): text cleaning, TF-IDF, text vectorization.
- Hands-on experience with IFRS 9 models and regulations.
- Data analysis: analyze large datasets to identify trends and risk factors, ensuring data quality and integrity.
- Statistical analysis: utilize advanced statistical methods to build robust models, leveraging expertise in R programming.
- Collaboration: work closely with data scientists, business analysts, and other stakeholders to align models with business needs.
- Continuous improvement: stay updated with the latest methodologies and tools in credit risk modeling and R programming.
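A small sketch of the text-cleaning and TF-IDF vectorization step mentioned in the list above. The toy documents, labels, and downstream classifier are invented for illustration and are not the bank's actual credit risk model.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

docs = [
    "missed payment on revolving credit line",
    "account in good standing, payments on time",
    "charge off after repeated delinquency",
    "loan fully repaid ahead of schedule",
]
labels = [1, 0, 1, 0]  # 1 = higher-risk narrative (toy labels)

# lowercase, drop English stop words, and include bigrams
vectorizer = TfidfVectorizer(lowercase=True, stop_words="english", ngram_range=(1, 2))
X = vectorizer.fit_transform(docs)

clf = LogisticRegression().fit(X, labels)
print(clf.predict(vectorizer.transform(["payment missed again last month"])))
```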

Posted 1 month ago

Apply

5.0 - 9.0 years

15 - 25 Lacs

Navi Mumbai, Bengaluru, Mumbai (All Areas)

Work from Office

Naukri logo

Hi candidates, we are hiring!
Designation: Python Developer | Location: Bangalore/Mumbai | Experience: 5+ years | Work Mode: WFO (Bangalore/Mumbai) | Notice Period: Immediate joiner/serving notice period | Company: Talent SKetchers

Required qualifications, capabilities, and skills:
- Formal training or certification in software engineering concepts and 3+ years of applied experience.
- Proficient in coding in Python.
- In-depth understanding of the Python software development stacks, ecosystems, frameworks, and tools.
- Experience with popular Python libraries and frameworks such as NumPy, Pandas, Django, Flask, or Pyramid.
- Hands-on practical experience in system design, application development, testing, and operational stability.
- Experience in developing, debugging, and maintaining code in a large corporate environment with one or more modern programming languages and database querying languages.
- Overall knowledge of the Software Development Life Cycle.
- Solid understanding of agile methodologies such as CI/CD, Application Resiliency, and Security.
- Demonstrated knowledge of software applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile, etc.).

Preferred qualifications, capabilities, and skills:
- Familiarity with modern front-end technologies.
- A working understanding of cloud platforms such as AWS, Google Cloud, or Azure.

Note: Interested candidates, share your resume with anita@talentsketchers.com

Posted 1 month ago

Apply

2 - 5 years

5 - 13 Lacs

Pune

Work from Office

Naukri logo

We are seeking a skilled Python Developer to join our dynamic team in Pune. The ideal candidate will have 2-4 years of professional experience and demonstrate expertise in Python development, frameworks, and associated tools.

Posted 1 month ago

Apply

2 - 5 years

12 - 20 Lacs

Pune

Work from Office

Naukri logo

Looking for a highly motivated, self-driven, experienced AI Data Scientist to build differentiating solutions and implement AI and Generative AI (Gen AI) based solutions for customers' business problems. As an AI Solution Engineer, build AI- and Gen AI-empowered, practical, in-depth solutions for solving customers' business problems. As an AI and Data Solution Engineer, design and implement Cloud Data, AI, and Gen AI based solutions, including LLM-powered systems, for customer programs. Apply statistics, modeling, LLMs, and machine learning to improve the efficiency of systems and relevance algorithms across our business application products.

Requirements:
- Bachelor's or Master's degree in Computer Science / AIML / Data Science.
- 4 to 6 years of overall experience and hands-on experience with the design and implementation of Machine Learning models, Deep Learning models, and Gen AI models (e.g., GPT, LLaMA, Mistral) for solving business problems.
- Proven experience working with Generative AI technologies, including prompt engineering, fine-tuning large language models (LLMs), embeddings, vector databases (e.g., FAISS, Pinecone), and Retrieval-Augmented Generation (RAG) systems.
- Expertise in R, Python (NumPy, Scikit-learn, Pandas), TensorFlow, PyTorch, transformers (e.g., Hugging Face), or MLlib.
- Expertise in cloud-based data and AI solution design and implementation using GCP/AWS/Azure, including the use of their Gen AI services.
- Good experience in building complex and scalable ML and Gen AI solutions and deploying them into production environments.
- Experience with scripting in SQL, extracting large datasets, and designing ETL flows.
- Excellent problem-solving and analytical skills with the ability to translate business requirements into data science and Gen AI solutions.
- Effective communication skills, with the ability to convey complex technical concepts to both technical and non-technical stakeholders.
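To illustrate the vector-database step in a RAG system like the one named above, here is a hedged FAISS retrieval sketch. The embeddings are random NumPy vectors purely so the example runs without an embedding model or API key; a real pipeline would embed document chunks and the query with the same embedding model before indexing.

```python
import faiss
import numpy as np

DIM = 384                       # assumed embedding dimension
chunks = ["refund policy...", "shipping times...", "warranty terms..."]

rng = np.random.default_rng(0)
chunk_vectors = rng.random((len(chunks), DIM), dtype=np.float32)  # stand-in embeddings

index = faiss.IndexFlatL2(DIM)  # exact L2 search; swap for IVF/HNSW at scale
index.add(chunk_vectors)

query_vector = rng.random((1, DIM), dtype=np.float32)
distances, ids = index.search(query_vector, 2)      # top-2 nearest chunks
retrieved = [chunks[i] for i in ids[0]]
print(retrieved)  # context that would be passed to the LLM prompt in a full RAG pipeline
```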

Posted 1 month ago

Apply

5 - 9 years

9 - 14 Lacs

Bengaluru

Work from Office

Naukri logo

Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO) is a leading technology services and consulting company focused on building innovative solutions that address clients' most complex digital transformation needs. Leveraging our holistic portfolio of capabilities in consulting, design, engineering, and operations, we help clients realize their boldest ambitions and build future-ready, sustainable businesses. With over 230,000 employees and business partners across 65 countries, we deliver on the promise of helping our customers, colleagues, and communities thrive in an ever-changing world. For additional information, visit us at www.wipro.com.

About The Role
Role: Python Developer/Lead - P1
Mandatory Skills: Python, NumPy, Pandas, REST API
Preferred Skills: Strong in Data Engineering and Analysis, SQL Server (complex SQL)

Roles & Responsibilities (in detail):
- This is a hands-on role involving design, coding, testing, working with product owners/scrum masters for scrum planning, estimation, and demos, and leading/guiding junior developers as needed.
- Good at writing Python code, with hands-on experience in the pandas and NumPy stack.
- Able to perform data cleanup and summarization using NumPy/pandas (see the sketch after this description).
- SQL knowledge is essential.
- Must have expertise in REST API development.
- Cloud experience is preferred (Azure).
- Uses pertinent data and facts to identify and solve a range of problems within area of expertise; investigates non-standard requests and problems, with some assistance from others.
- Experience level: 5 to 9 years of relevant experience.

Do:
- Oversee and support the process by reviewing daily transactions on performance parameters.
- Review the performance dashboard and the scores for the team.
- Support the team in improving performance parameters by providing technical support and process guidance.
- Record, track, and document all queries received, problem-solving steps taken, and successful and unsuccessful resolutions.
- Ensure standard processes and procedures are followed to resolve all client queries.
- Resolve client queries as per the SLAs defined in the contract.
- Develop understanding of the process/product for the team members to facilitate better client interaction and troubleshooting.
- Document and analyze call logs to spot the most frequent trends and prevent future problems.
- Identify red flags and escalate serious client issues to the team leader in cases of untimely resolution.
- Ensure all product information and disclosures are given to clients before and after the call/email requests; avoid legal challenges by monitoring compliance with service agreements.
- Handle technical escalations through effective diagnosis and troubleshooting of client queries.
- Manage and resolve technical roadblocks/escalations as per SLA and quality requirements; if unable to resolve an issue, escalate it to TA & SES in a timely manner.
- Provide product support and resolution to clients by performing question diagnosis while guiding users through step-by-step solutions.
- Troubleshoot all client queries in a user-friendly, courteous, and professional manner.
- Offer alternative solutions to clients (where appropriate) with the objective of retaining customers' and clients' business.
- Organize ideas and effectively communicate oral messages appropriate to listeners and situations.
- Follow up and make scheduled call-backs to customers to record feedback and ensure compliance with contract SLAs.
- Build people capability to ensure operational excellence and maintain superior customer service levels of the existing account/client.
- Mentor and guide Production Specialists on improving technical knowledge.
- Collate trainings to be conducted as triage to bridge the skill gaps identified through interviews with the Production Specialist.
- Develop and conduct trainings (triages) within products for Production Specialists as per target, and inform the client about the triages being conducted.
- Undertake product trainings to stay current with product features, changes, and updates; enroll in product-specific and any other trainings per client requirements/recommendations.
- Identify and document the most common problems and recommend appropriate resolutions to the team.
- Update job knowledge by participating in self-learning opportunities and maintaining personal networks.

Deliver (Performance Parameter: Measure):
1. Process: No. of cases resolved per day, compliance to process and quality standards, meeting process-level SLAs, Pulse score, customer feedback, NSAT/ESAT.
2. Team Management: Productivity, efficiency, absenteeism.
3. Capability Development: Triages completed, Technical Test performance.

Reinvent your world. We are building a modern Wipro. We are an end-to-end digital transformation partner with the boldest ambitions. To realize them, we need people inspired by reinvention: of yourself, your career, and your skills. We want to see the constant evolution of our business and our industry. It has always been in our DNA; as the world around us changes, so do we. Join a business powered by purpose and a place that empowers you to design your own reinvention. Come to Wipro. Realize your ambitions. Applications from people with disabilities are explicitly welcome.
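An illustrative pandas/NumPy clean-up and summarization of the kind the role calls for. The column names, values, and rules are assumptions, not client data.

```python
import numpy as np
import pandas as pd

raw = pd.DataFrame({
    "region": ["North", "north ", None, "South"],
    "sales":  ["1,200", "950", "n/a", "2,100"],
})

clean = raw.copy()
# normalize the region labels and fill missing values
clean["region"] = clean["region"].str.strip().str.title().fillna("Unknown")
# strip thousands separators, map sentinel strings to NaN, and cast to float
clean["sales"] = (
    clean["sales"].str.replace(",", "", regex=False)
                  .replace("n/a", np.nan)
                  .astype(float)
)

summary = clean.groupby("region")["sales"].agg(["count", "sum", "mean"])
print(summary)
```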

Posted 1 month ago

Apply

- 5 years

6 - 7 Lacs

Kolkata

Work from Office

Naukri logo

Design and develop backend services using Python (Django/Flask/Fast API). Build responsive and user-friendly front-end interfaces using React.js/Angular/Vue.js. Integrate front-end and back-end components into a seamless application.

Posted 1 month ago

Apply

4 - 7 years

0 Lacs

Bengaluru

Hybrid

Naukri logo

Role & responsibilities Preferred candidate profile

Posted 1 month ago

Apply

1 - 2 years

9 - 11 Lacs

Bengaluru

Work from Office

Naukri logo

Key Responsibilities:
- ETL Development & AI Data Pipelines: Design and develop ETL processes to extract, transform, and load data from various sources for ML model training and inference. Create data pipelines that can efficiently handle structured and unstructured data for AI applications. Work with data scientists to implement feature engineering transformations at scale.
- Machine Learning Operations: Assist in developing automated ML workflows for model training, evaluation, and deployment. Implement basic model monitoring and observability solutions. Create scripts to automate data preparation for ML model development.
- Python Development: Write efficient Python scripts leveraging data science and ML libraries (pandas, scikit-learn, etc.). Develop reusable code modules for data processing and basic ML operations. Maintain and optimize existing Python codebases.
- Automation & Orchestration: Build automation frameworks to streamline ML model deployment and monitoring. Implement data validation tests to ensure data quality for ML models (see the sketch after this description). Create workflow orchestration solutions for end-to-end ML pipelines.
- Collaboration: Work closely with data scientists to understand model requirements and implementation needs. Collaborate with cross-functional teams to integrate ML solutions into business applications. Participate in code reviews and knowledge-sharing sessions.

Qualifications:
- Education: Bachelor's degree in Computer Science, Data Science, Information Technology, or a related field. Relevant certifications in data engineering, machine learning, or cloud technologies are a plus.
- Experience: 1-2 years of experience in ETL development, Python programming, and data engineering. Exposure to machine learning workflows and basic ML operations. Experience with ETL tools such as Apache NiFi, Talend, or similar platforms.
- Technical Skills: Proficiency in Python with experience using data processing libraries (pandas, NumPy). Knowledge of ML libraries like scikit-learn, TensorFlow, or PyTorch. Experience with SQL and relational databases. Basic understanding of cloud data services (AWS, Azure, or GCP). Familiarity with version control systems (Git) and CI/CD concepts. Understanding of data quality practices for ML applications.
- Soft Skills: Strong problem-solving abilities and analytical mindset. Good communication skills and ability to work in cross-functional teams. Self-motivated with eagerness to learn new technologies.

Primary Skill Set:
- ETL & Data Pipeline Development (1-2 years): Designing and implementing data pipelines for analytics and ML; experience with ETL tools and data integration patterns; ability to handle various data formats and sources.
- Python Programming & ML Libraries (1-2 years): Proficiency in Python with a focus on data manipulation and processing; experience with pandas, NumPy, and scikit-learn; basic knowledge of ML frameworks (TensorFlow/PyTorch).
- Machine Learning Operations (0-1 year): Basic understanding of ML model deployment workflows; experience with model serving and monitoring; knowledge of feature engineering techniques.
- Data Management for AI Applications (1-2 years): Experience with dataset preparation for machine learning; knowledge of data quality and validation techniques; understanding of data versioning and lineage tracking.
- Automation & Orchestration (1-2 years): Experience building automated workflows for data processing; knowledge of workflow orchestration tools (Airflow, Prefect, etc.); ability to implement testing frameworks for data pipelines.
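A simple sketch of the "data validation tests" idea above, written as plain pandas checks rather than any specific validation framework. The expected schema, rules, and thresholds are assumptions for illustration.

```python
import pandas as pd

EXPECTED_COLUMNS = {"user_id", "signup_date", "monthly_spend"}

def validate(df: pd.DataFrame) -> list[str]:
    """Return a list of human-readable data-quality failures (empty list = pass)."""
    failures = []
    missing = EXPECTED_COLUMNS - set(df.columns)
    if missing:
        failures.append(f"missing columns: {sorted(missing)}")
        return failures                               # stop early; later checks need these columns
    if df["user_id"].duplicated().any():
        failures.append("duplicate user_id values found")
    if df["monthly_spend"].lt(0).any():
        failures.append("negative monthly_spend values found")
    if df["signup_date"].isna().mean() > 0.05:        # tolerate up to 5% missing dates
        failures.append("more than 5% of signup_date values are null")
    return failures

batch = pd.DataFrame({
    "user_id": [1, 2, 2],
    "signup_date": ["2024-01-01", None, "2024-02-01"],
    "monthly_spend": [120.0, -5.0, 80.0],
})
print(validate(batch))
```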

Posted 1 month ago

Apply

3 - 7 years

0 - 3 Lacs

Bangalore Rural, Bengaluru

Work from Office

Naukri logo

Role & responsibilities:
- Extensive hands-on Python/Groovy development.
- Ability to debug and write new scripts using Python, Pandas, NumPy, MATLAB, Ruby.
- Strong OOP experience.
- Expertise in at least one popular Python framework (like Django, Flask, or Pyramid).
- Expertise in front-end technologies like HTML, React, RESTful APIs.
- Exposure to DevOps practices and toolchains: GitLab/Jenkins pipelines for CI/CD.

Skills:
- Programming: Python, Pandas, NumPy, PySpark, object-oriented principles, multi-threading concepts
- DevOps: GitLab pipelines, SonarQube, Coverity, Docker, Kubernetes, Terraform, CI/CD
- Web Technologies: Flask, Django, HTML, React, RESTful APIs
- Database: MySQL, MongoDB, Microsoft SQL Server Management
- Misc: Tableau, PyQt5

Posted 1 month ago

Apply

7 - 10 years

19 - 27 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Naukri logo

Python Developer (NumPy, SciPy, Pandas, Dask, spaCy, NLTK, scikit-learn, and PyTorch)
As part of the team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop proposals by owning parts of the proposal document and by giving inputs on solution design based on your areas of expertise. You will plan the configuration activities, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit the customer's budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers.

Posted 1 month ago

Apply

4 - 6 years

30 - 34 Lacs

Bengaluru

Work from Office

Naukri logo

Overview: Annalect is seeking a hands-on Data QA Manager to lead and elevate data quality assurance practices across our growing suite of software and data products. This is a technical leadership role embedded within our Technology teams, focused on establishing best-in-class data quality processes that enable trusted, scalable, and high-performance data solutions. As a Data QA Manager, you will drive the design, implementation, and continuous improvement of end-to-end data quality frameworks, with a strong focus on automation, validation, and governance. You will work closely with data engineering, product, and analytics teams to ensure data integrity, accuracy, and compliance across complex data pipelines, platforms, and architectures, including Data Mesh and modern cloud-based ecosystems. This role requires deep technical expertise in SQL, Python, data testing frameworks like Great Expectations, data orchestration tools (Airbyte, DbT, Trino, Starburst), and cloud platforms (AWS, Azure, GCP). You will lead a team of Data QA Engineers while remaining actively involved in solution design, tool selection, and hands-on QA execution.

Key Responsibilities:
- Develop and implement a comprehensive data quality strategy aligned with organizational goals and product development initiatives.
- Define and enforce data quality standards, frameworks, and best practices, including data validation, profiling, cleansing, and monitoring processes.
- Establish data quality checks and automated controls to ensure the accuracy, completeness, consistency, and timeliness of data across systems.
- Collaborate with Data Engineering, Product, and other teams to design and implement scalable data quality solutions integrated within data pipelines and platforms.
- Define and track key performance indicators (KPIs) to measure data quality and the effectiveness of QA processes, enabling actionable insights for continuous improvement.
- Generate and communicate regular reports on data quality metrics, issues, and trends to stakeholders, highlighting opportunities for improvement and mitigation plans.
- Maintain comprehensive documentation of data quality processes, procedures, standards, issues, resolutions, and improvements to support organizational knowledge-sharing.
- Provide training and guidance to cross-functional teams on data quality best practices, fostering a strong data quality mindset across the organization.
- Lead, mentor, and develop a team of Data QA Analysts/Engineers, promoting a high-performance, collaborative, and innovative culture.
- Provide thought leadership and subject matter expertise on data quality, influencing technical and business stakeholders toward quality-focused solutions.
- Continuously evaluate and adopt emerging tools, technologies, and methodologies to advance data quality assurance capabilities and automation.
- Stay current with industry trends, innovations, and evolving best practices in data quality, data engineering, and analytics to ensure cutting-edge solutions.

Required Skills:
- 11+ years of hands-on experience in Data Quality Assurance, data test automation, data comparison, and validation across large-scale datasets and platforms.
- Strong proficiency in SQL for complex data querying, data validation, and data quality investigations across relational and distributed databases.
- Deep knowledge of data structures, relational and non-relational databases, stored procedures, packages, functions, and advanced data manipulation techniques.
- Practical experience with leading data quality tools such as Great Expectations, DbT tests, and data profiling and monitoring solutions.
- Experience with data mesh and distributed data architecture principles for enabling decentralized data quality frameworks.
- Hands-on experience with modern query engines and data platforms, including Trino/Presto, Starburst, and Snowflake.
- Experience working with data integration and ETL/ELT tools such as Airbyte, AWS Glue, and DbT for managing and validating data pipelines.
- Strong working knowledge of Python and related data libraries (e.g., Pandas, NumPy, SQLAlchemy) for building data quality tests and automation scripts.

Posted 1 month ago

Apply

10 - 15 years

17 - 20 Lacs

Kolkata

Work from Office

Naukri logo

Choosing Capgemini means choosing a company where you will be empowered to shape your career the way you'd like, where you'll be supported and inspired by a collaborative community of colleagues around the world, and where you'll be able to reimagine what's possible. Join us and help the world's leading organizations unlock the value of technology and build a more sustainable, more inclusive world.

About The Role:
- Understanding of Generative AI (LLMs, diffusion models, transformers, NLP, multimodal AI, etc.).
- Knowledge of Agentic AI principles, including autonomous agents, reinforcement learning, prompt engineering, and AI-driven decision-making.
- Familiarity with AI/ML frameworks such as TensorFlow, PyTorch, LangChain, OpenAI APIs, Hugging Face, etc.
- Understanding of AI infrastructure (cloud platforms like AWS, Azure, GCP) and MLOps best practices.

Primary Skills: Generative AI, NLP, Machine Learning
Secondary Skills: Python, PySpark, R

Posted 1 month ago

Apply

10 - 12 years

32 - 37 Lacs

Mumbai

Work from Office

Naukri logo

S&P Dow Jones Indices

The Role: Senior Lead Development Engineer (Python). S&P Dow Jones Indices, a global leader in providing investable and benchmark indices to the financial markets, is looking for a Senior Lead Development Engineer to join our technology team.

The Team: You will be part of a global technology team comprising Dev, QA, and BA teams, and will be responsible for analysis, design, development, and testing.

Responsibilities and Impact: You will be working on one of the key systems responsible for calculating re-balancing weights and asset selections for S&P indices. Ultimately, the output of this team is used to maintain some of the most recognized and important investable assets globally.
- Design and development of Python applications deployed to AWS cloud services.
- Interface with UI application(s) and RESTful interfaces, and diagnose issues.
- Coding, documentation, testing, debugging, and level 3 support.
- Take ownership of code modules and lead code review processes.
- Work directly with stakeholders and the technical architect to formalize/document requirements, both for supporting the existing application and for new initiatives.
- Perform application and system performance tuning and troubleshoot performance issues.
- Define and refine task definitions, delegate tasks to the team, and conduct code reviews/pull requests.
- Supervise and mentor less experienced team members.

What's in it for you: This is an opportunity to work on a team of highly talented and motivated engineers at a highly respected company. You will work on new development as well as enhancements to existing functionality.

What We're Looking For - Basic Qualifications:
- 10-12 years of IT experience in application development and support.
- Bachelor's degree in Computer Science, Information Systems, or Engineering, or, in lieu, a demonstrated equivalence in work experience.
- Expert in modern Python, 3.10 and later (minimum 5 years of dedicated Python experience).
- Expertise in related Python libraries, including Pandas, NumPy, and Pydantic.
- Experience with developing and troubleshooting distributable Python libraries.
- Backend services development, including distributed libraries and packages in Python.
- Experience with AWS and cloud services, including SQL databases, particularly PostgreSQL.
- Experience with DevOps and CI/CD processes (Jenkins, GitHub Actions, etc.).
- Experience with software testing (unit testing, integration testing, test-driven development).
- Strong work ethic, communication, and thoughtfulness.

Additional Preferred Qualifications:
- Strong mathematics skills and an understanding of financial markets (stocks, funds, indices, etc.).
- Algorithm development or rules engine experience is helpful.
- Demonstrated ability to solve complex, highly detailed business problems through software engineering skills (not just a coder/scripter, but able to work on system-wide problems).
- Basic understanding of creating calculation services that are consumed in a cloud environment over a RESTful API.
- Prior ETL (Extract Transform Load) experience is helpful, but candidates should first be experienced software engineers, and second very strong at analyzing data.
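A hedged sketch pairing Pydantic validation with a pandas weight calculation, reflecting the Pandas/NumPy/Pydantic stack listed above. The constituents and the simple cap-weighting rule are invented for illustration and are not S&P methodology; the validator syntax assumes Pydantic v2.

```python
import pandas as pd
from pydantic import BaseModel, field_validator

class Constituent(BaseModel):
    ticker: str
    market_cap: float

    @field_validator("market_cap")
    @classmethod
    def must_be_positive(cls, value: float) -> float:
        if value <= 0:
            raise ValueError("market_cap must be positive")
        return value

constituents = [
    Constituent(ticker="AAA", market_cap=500.0),
    Constituent(ticker="BBB", market_cap=300.0),
    Constituent(ticker="CCC", market_cap=200.0),
]

df = pd.DataFrame([c.model_dump() for c in constituents])
df["weight"] = df["market_cap"] / df["market_cap"].sum()   # simple cap-weighted scheme
print(df)
```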

Posted 1 month ago

Apply

- 1 years

6 - 10 Lacs

Coimbatore

Remote

Naukri logo

The 2-month online internship program is offered by leading scientists of Gilbert Research Center, one of the most reputed and notable research centres in India.

What you'll learn from this online internship:
- Basics of Mathematics: Linear Algebra, Probability, Differential Equations
- Fundamentals of the Python programming language
- Fundamentals of Statistics, Statistical Power Analysis & Optimization
- Machine Learning algorithms: Linear, Multiple, and Non-Linear Regression, Correlation, Gradient Descent & PCA
- Fundamentals of Neural Networks, Convolution & Deep Learning (CNN - Convolutional Neural Network)
- Image Processing and Medical Image Processing (X-ray modality)
- Industry-relevant project discussion and implementation, fundamental programming assignments, and mentorship sessions with our scientists

You'll gain foundational, industry-relevant knowledge useful for the recruitment process at Gilbert Research Center, for a career shift, and for other job opportunities, along with core knowledge of machine learning and data science from a leading scientist. This program can boost your professional career and promotion prospects. Every candidate will get a Verified Certificate & Letter of Recommendation from a scientist at Gilbert Research Center, and stipends will be provided to all students based on their assignment performance. The internship provides a bridge to job opportunities at Gilbert Research Center and at software or fin-tech companies working in data science and machine learning.

Programme details: This is a fully online, 2-month internship programme, so you do not need to relocate. It is mentored and guided by the leading scientists of Gilbert Research Center: industry-relevant projects are allocated, fundamental programming assignments are set, and mentorship classes are conducted by our scientists. You'll collaborate with our leading scientists and gain exposure to industry readiness and trending topics. Selection is purely merit-based; an interview will be conducted to shortlist applicants.

Posted 1 month ago

Apply

3 - 7 years

1 - 1 Lacs

Hyderabad

Work from Office

Naukri logo

Overview: Annalect India is seeking a Finance Operations Analyst (Python Engineer & Programmer + Data Analytics) with strong technical and analytical skills to help support the business finance teams that continue to deliver strong financial performance. This might be a great fit if you have a strong flair for automation and dashboard preparation and would like to be part of a growing team. You will be working closely with our Global Agency Finance teams.

Shift Timing: Night shift (06:30 PM - 03:30 AM) | Location: Hyderabad only | Mode: Hybrid (3 days working from office per week) | Experience: 4-7 years only (freshers are not considered)

About Annalect India: We are an integral part of Annalect Global and Omnicom Group, the second-largest advertising agency holding company in the world in terms of revenue and the leading global marketing communications company. Our portfolio includes three global advertising agency networks (BBDO, DDB, and TBWA) and three of the world's premium media services (OMD, PHD, and Hearts & Science). Annalect India plays a key role for our group companies and global agencies by providing stellar products and services in the areas of Creative Services, Technology, Marketing Sciences (data & analytics), Business Support Services, Market Research, and Media Services.

Responsibilities: This is an exciting role and would entail you to:
- Programming Languages: Proficient in Python, Java, SQL, JavaScript, and C++; experience with frameworks such as Flask, Django, React, and Node.js is a plus.
- Data Analysis & Visualization: Skilled in using tools like Excel (advanced), Power BI, and pandas for processing and visualizing data.
- Automation & Scripting: Experience with task automation using Python and shell scripting to streamline workflows and reduce manual processes.
- Version Control & Collaboration: Strong command of Git/GitHub for version control.
- Risk Management: Identify, communicate, and manage project risks within and across teams; escalate risks as needed and lead efforts to mitigate them; ability to manage expectations.
- Testing: Proficient in unit, integration, and performance testing across programs, automation, and engineering project life cycles.
- Communication: Ability to communicate with both the team and management regarding the status of current project initiatives; communicate with the project lead regarding project health, status, and risks; ability to work closely with team members on collaborative initiatives, both on individual projects and on team-wide projects.

Qualifications: This may be the right role for you if you have a Bachelor's or postgraduate degree in any stream with 4-5 years of proven programming and software engineering experience, primarily within the finance space. The candidate should have high-level expertise in Python, and relatively high-level expertise in multiple other programming languages, including but not limited to Java, SQL, and R, to design, develop, and optimize scripts/programs that automate manual reporting processes. The ideal candidate will be responsible for developing robust scripts and programs that allow us to fully automate our financial reporting (see the sketch below), and for implementing scalable solutions that align with business objectives and regulatory requirements. The goal is to reduce the team's current time spent producing financial reports, and also to produce automated solutions for all projects that can be automated, including ad-hoc reports.
The candidate must also possess excellent problem-solving skills, the ability to work collaboratively across cross-functional teams, and a commitment to delivering high-quality solutions within tight deadlines.
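An illustrative sketch of the report-automation goal described above: aggregate transactions with pandas and write an Excel workbook. The agency names, column names, and workbook layout are assumptions, and openpyxl is assumed to be installed as the Excel engine.

```python
import pandas as pd

# stand-in for transactions pulled from the finance system
transactions = pd.DataFrame({
    "agency":  ["BBDO", "DDB", "BBDO", "OMD"],
    "month":   ["2024-01", "2024-01", "2024-02", "2024-02"],
    "revenue": [120000.0, 95000.0, 130000.0, 88000.0],
})

pivot = transactions.pivot_table(index="agency", columns="month",
                                 values="revenue", aggfunc="sum", fill_value=0)

with pd.ExcelWriter("monthly_revenue.xlsx", engine="openpyxl") as writer:
    pivot.to_excel(writer, sheet_name="Revenue by agency")
    transactions.to_excel(writer, sheet_name="Raw transactions", index=False)
```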

Posted 1 month ago

Apply

3 - 5 years

10 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

Job Title: AI/ML Engineer and Developer
Location: Bengaluru, Karnataka, India
Job Type: Full-Time

About The Role: We are seeking a passionate and innovative AI/Machine Learning Engineer to join our IT organization as part of a dynamic cross-functional team. In this role, you will develop, train, tune, and optimize machine learning models, deploy them into production, and ensure their scalability. You will also develop applications that integrate AI models into existing products or create new AI-powered solutions to enhance the capabilities and efficiency of our Marketing, Sales, and Communications teams.

Key Responsibilities:

Machine Learning Model Development and Integration:
- Build end-to-end machine learning pipelines, from data ingestion to deployment.
- Design, train, and evaluate ML models, including custom models and AutoML/PyCaret/MLflow solutions.
- Implement CI/CD processes for model deployment and continuously monitor model performance.
- Develop and deploy machine learning models to address specific challenges within the Marketing, Sales, and Communications domains.
- Work with large and complex data sets to build models that enhance decision-making, customer segmentation, predictive analytics, and personalization efforts.
- Seamlessly integrate ML models into existing API-based systems to ensure efficient consumption by business applications.

Performance Monitoring and Optimization:
- Continuously monitor the performance of machine learning models in production environments.
- Refine and optimize models based on performance data and evolving business needs.
- Explore new tools, techniques, and data sources to enhance the capabilities of ML models within the business context.

Collaboration and Stakeholder Engagement:
- Collaborate with the data team to define data requirements for model training and evaluation.
- Work closely with developer teams to provide guidance on API architecture development, ensuring alignment with ML strategy.
- Partner with API developers to design and implement APIs that provide access to machine learning insights.
- Engage with business teams to understand their needs and deliver ML solutions that add tangible value.
- Communicate complex ML concepts and the benefits of ML projects to non-technical stakeholders effectively.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field, with 5-6 years of IT experience, including a minimum of 3-4 years of relevant experience as an ML/AI developer.
- Design, develop, and implement advanced machine learning and statistical models to solve complex business problems.
- Explore and utilize various data mining and machine learning techniques to extract valuable insights and patterns from large datasets.
- Conduct exploratory data analysis, data cleansing, and feature engineering to prepare datasets for analysis.
- Perform statistical analysis, hypothesis testing, and A/B testing to evaluate the effectiveness of models and algorithms.
- Should understand Dockerization, Kubernetes, and REST APIs.
- Proficiency in full-stack development, API integration, and cloud-native development.
- Strong knowledge of machine learning algorithms, statistics, and model optimization.
- Familiarity with LLM/SLM and MLOps tools (e.g., Azure ML, Databricks, Kubeflow, MLflow).
- Expertise in model deployment frameworks (e.g., TensorFlow, PyTorch, PyCaret) and DevOps for AI.
- Proficiency in programming languages such as Python or R.
- Experience with API development and integration (preferably Azure API).
- Understanding of hierarchical multi-agent structures, RAG, and LLM evaluation frameworks.
- Proficiency in visualizing data storylines and conducting in-depth data analysis with frameworks (e.g., Plotly, Dash).

Good to Have:
- Experience in developing ML solutions for marketing, sales, or communications.
- Knowledge of cloud services (AWS, Azure, Google Cloud) and their machine learning offerings.
- Demonstrated ability to work in cross-functional teams and manage projects.
- Familiarity with agile development methodologies.

You'll win us over by:
- 3+ years of experience in the Artificial Intelligence domain with a proven track record of developing and implementing various ML-based solutions.
- B.Tech or Master's degree in Data Science, Computer Science, Statistics, or a related field.
- Strong programming skills in languages such as Python, R, or C++.
- Familiarity with libraries and concepts such as TensorFlow, PyTorch, Keras, scikit-learn, PyCaret, Statsmodels, Pandas, NumPy, SciPy, SQL, HQL, or similar.
- Proficiency in the use of NLP/analytics tools.
- Proactive and willing to learn new technologies.
- Excellent verbal, written, and presentation skills.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies