10.0 years
0 Lacs
Hyderabad, Telangana, India
On-site
About Client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media, and is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Job Title: AIML Developer
Key Skills: Image Analytics & Computer Vision (CV), Machine Learning & Deep Learning
Job Locations: Hyderabad
Experience: 5 – 10 Years
Budget: Based on your experience
Education Qualification: Any graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate – 15 days
Interview Mode: 2 rounds of technical interview, including client round

Job Description:
Key Focus Areas:
· Image Analytics & Computer Vision (CV)
· Machine Learning & Deep Learning
· Predictive Analytics & Optimization
· Generative AI (GenAI) & NLP (as secondary skills)

Primary Responsibilities:
· Lead and contribute to projects centered around image analytics, computer vision, and visual data processing.
· Develop and deploy CV models for tasks such as object detection, image classification, pattern recognition, and anomaly detection.
· Apply deep learning frameworks (e.g., TensorFlow, Keras) to solve complex visual data challenges.
· Integrate multi-sensor data fusion and multivariate analysis for industrial applications.
· Collaborate with cross-functional teams to implement predictive maintenance, fault detection, and process monitoring solutions using visual and sensor data.
Mandatory Skills:
· Strong hands-on experience in Computer Vision and Image Analytics.
· Proficiency in Python and familiarity with AI/ML libraries such as OpenCV, TensorFlow, Keras, scikit-learn, and Matplotlib.
· Solid understanding of machine learning techniques: classification, regression, clustering, anomaly detection, etc.
· Experience with deep learning architectures (CNNs, autoencoders, etc.) for image-based applications.
· Familiarity with Generative AI and LLMs is a plus.

Desirable Skills:
· Knowledge of optimization techniques and simulation modeling.
· Domain experience in Oil & Gas, Desalination, Motors & Pumps, or Industrial Systems.

Educational & Professional Background:
· Bachelor's or Master's degree in Engineering (Mechanical, Electrical, Electronics, Chemical preferred).
· Master's in Industrial/Manufacturing/Production Engineering is a strong plus.
· Demonstrated experience in solving real-world industrial problems using data-driven approaches.

Soft Skills & Attributes:
· Strong analytical and problem-solving skills.
· Ability to work independently and manage multiple projects.
· Excellent communication and stakeholder engagement skills.
· Proven thought leadership and innovation in AI/ML applications.

Interested candidates, please share your CV to jyothi.a@people-prime.com
Posted 1 week ago
0.0 years
0 Lacs
Bengaluru, Karnataka
On-site
Bangalore, Karnataka, India | Job ID 755095

Join our Team

About this opportunity: The Infra Cloud Team is looking for a Technical Authority Expert (JS5) to investigate, diagnose, and troubleshoot issues and to perform MSSQL database administration. Technical proficiency in Oracle database administration as a secondary skill is an added advantage. The candidate must follow standard procedures for proper escalation of unresolved issues to the appropriate internal teams, and talk internal clients through a series of actions via phone, email, or chat until the request is closed.

What you will do:
Proven experience as a Microsoft SQL Server DBA, managing enterprise-level database systems.
Strong knowledge of Microsoft SQL Server, including database design, administration, performance tuning, and troubleshooting.
Experience with database backup and recovery strategies, including disaster recovery planning and implementation.
Proficiency in SQL programming and scripting languages.
Familiarity with database security concepts and best practices.
Understanding of high availability (HA) and failover clustering technologies.
Excellent analytical and problem-solving skills, with the ability to diagnose and resolve complex database issues.
Strong organizational and time management skills, with the ability to handle multiple tasks and priorities simultaneously.
Microsoft SQL Server certification (e.g., MCSA, MCSE) or equivalent industry certifications.
Experience with cloud-based database platforms, such as Microsoft Azure SQL Database.
Familiarity with data warehousing concepts and technologies.
Knowledge of PowerShell or other automation tools for database administration tasks.
Experience with performance monitoring and tuning tools, such as SQL Server Profiler or Extended Events.
Effective communication and interpersonal skills, with the ability to collaborate with cross-functional teams and communicate technical concepts to non-technical stakeholders.
The skills you bring:
Oracle 10g, 11g, 12c, 18c, 19c versions (18c and 19c an added benefit)
Performance tuning (optimizing queries, instance tuning)
Upgrade knowledge of Oracle database upgrades from 12c to 19c and from 11g to 12c
Installation, configuration and upgrading of Oracle server software and related products (clients, 2-node RAC, standalone)
Good understanding of RAC (2 or 3 nodes) and its services
Understanding of Active/Passive and Active/Active setups (ASM and Veritas Cluster)
A good knowledge of physical database design
A good understanding of the underlying operating system (Windows, Unix, Linux, Solaris & AIX)
Key OS commands to check server load (top, topas, sar, vmstat)
Partitioning (partition tables and their types: Range, Hash, etc.) and partition pruning
Indexing (local and global indexes, their benefits and drawbacks)
Worked on heavy databases, 100+ TB (filesystem and ASM)
Adding disks in ASM and adding datafiles, tempfiles, etc. in RAC

Why join Ericsson? At Ericsson, you'll have an outstanding opportunity. The chance to use your skills and imagination to push the boundaries of what's possible. To build never-before-seen solutions to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.

What happens once you apply?
Posted 1 week ago
0.0 years
0 Lacs
Bengaluru, Karnataka
Remote
Location: Bangalore, Karnataka, 560048 | Category: Engineering | Job Type: Full time | Job Id: 1190321

Storage Software Engineer

This role has been designated as 'Remote/Teleworker', which means you will primarily work from home.

Who We Are: Hewlett Packard Enterprise is the global edge-to-cloud company advancing the way people live and work. We help companies connect, protect, analyze, and act on their data and applications wherever they live, from edge to cloud, so they can turn insights into outcomes at the speed required to thrive in today's complex world. Our culture thrives on finding new and better ways to accelerate what's next. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good. If you are looking to stretch and grow your career, our culture will embrace you. Open up opportunities with HPE.

Job Description: In HPE Hybrid Cloud, we lead the innovation agenda and technology roadmap for all of HPE. This includes managing the design, development, and product portfolio of our next-generation cloud platform, GreenLake. Working with customers, we help them reimagine their information technology needs to deliver a simple, consumable solution that helps them drive their business results. Join us to redefine what's next for you.

Job Family Definition: Designs, develops, troubleshoots and debugs software programs for software enhancements and new products. Develops software including operating systems, compilers, routers, networks, utilities, databases and Internet-related tools. Determines hardware compatibility and/or influences hardware design.

Management Level Definition: Contributions include applying developed subject matter expertise to solve common and sometimes complex technical problems and recommending alternatives where necessary. May act as project lead and provide assistance to lower-level professionals.
Exercises independent judgment and consults with others to determine the best method for accomplishing work and achieving objectives.

What you will do:
Analyzes design and determines coding, programming, and integration activities required based on general objectives and knowledge of the overall architecture of the product or solution.
Writes and executes complete testing plans, protocols, and documentation for the assigned portion of the application; identifies, debugs, and creates solutions for issues with code and integration into the application architecture.
Leads a project team of other software systems engineers and internal and outsourced development partners to develop reliable, cost-effective and high-quality solutions for the assigned systems portion or subsystem.
Collaborates and communicates with management, internal, and outsourced development partners regarding software systems design status, project progress, and issue resolution.
Represents the software systems engineering team for all phases of larger and more complex development projects.
Provides guidance and mentoring to less-experienced staff members.

What you will need:
Knowledge and Skills:
Expertise in multiple software systems design tools and languages.
Strong analytical and problem-solving skills.
Designing software systems running on multiple platform types.
Software systems testing methodology, including writing and execution of test plans, debugging, and testing scripts and tools.
Excellent written and verbal communication skills; mastery in English and the local language.
Ability to effectively communicate product architectures and design proposals and negotiate options at management levels.
Must have a very strong system programming background with C/C++/Golang for large enterprise-class software.
Must have proficiency with data structures, algorithms, and multi-threaded programming.
Must have in-depth knowledge of OS internals.
Must be capable of debugging issues in multi-threaded and clustered environments.

Prior experience in one or more of the following areas is a huge plus:
Data-path on large and complex modules
Distributed Systems, Clustering or HA
Memory management, Virtualization or De-duplication
Replication, QoS, Storage Protocols (iSCSI/SCSI, FC, NFS, CIFS)

Additional Skills: Cloud Architectures, Cross Domain Knowledge, Design Thinking, Development Fundamentals, DevOps, Distributed Computing, Microservices Fluency, Full Stack Development, Security-First Mindset, Solutions Design, Testing & Automation, User Experience (UX)

What We Can Offer You:
Health & Wellbeing: We strive to provide our team members and their loved ones with a comprehensive suite of benefits that supports their physical, financial and emotional wellbeing.
Personal & Professional Development: We also invest in your career because the better you are, the better we all are. We have specific programs catered to helping you reach any career goals you have — whether you want to become a knowledge expert in your field or apply your skills to another division.
Unconditional Inclusion: We are unconditionally inclusive in the way we work and celebrate individual uniqueness. We know varied backgrounds are valued and succeed here. We have the flexibility to manage our work and personal needs. We make bold moves, together, and are a force for good.

Let's Stay Connected: Follow @HPECareers on Instagram to see the latest on people, culture and tech at HPE.

Job: Engineering
Job Level: TCP_03

HPE is an Equal Employment Opportunity / Veterans / Disabled / LGBT employer. We do not discriminate on the basis of race, gender, or any other protected category, and all decisions we make are made on the basis of qualifications, merit, and business need. Our goal is to be one global team that is representative of our customers, in an inclusive environment where we can continue to innovate and grow together.
Please click here: Equal Employment Opportunity. Hewlett Packard Enterprise is EEO Protected Veteran/ Individual with Disabilities. HPE will comply with all applicable laws related to employer use of arrest and conviction records, including laws requiring employers to consider for employment qualified applicants with criminal histories.
Posted 1 week ago
1.0 years
0 Lacs
Bengaluru, Karnataka, India
On-site
Dreaming big is in our DNA. It's who we are as a company. It's our culture. It's our heritage. And more than ever, it's our future. A future where we're always looking forward. Always serving up new ways to meet life's moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do?

AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming operations through tech and analytics. Do You Dream Big? We Need You.

Job Description
Job Title: Junior Data Scientist
Location: Bangalore
Reporting to: Senior Manager – Analytics

Purpose of the role
The Global GenAI Team at Anheuser-Busch InBev (AB InBev) is tasked with constructing competitive solutions utilizing GenAI techniques. These solutions aim to extract contextual insights and meaningful information from our enterprise data assets. The derived data-driven insights play a pivotal role in empowering our business users to make well-informed decisions regarding their respective products.

In the role of a Machine Learning Engineer (MLE), you will operate at the intersection of:
LLM-based frameworks, tools, and technologies
Cloud-native technologies and solutions
Microservices-based software architecture and design patterns

As an additional responsibility, you will be involved in the complete development cycle of new product features, encompassing tasks such as the development and deployment of new models integrated into production systems.
Furthermore, you will have the opportunity to critically assess and influence the product engineering, design, architecture, and technology stack across multiple products, extending beyond your immediate focus.

Key tasks & accountabilities
Large Language Models (LLM): Experience with LangChain and LangGraph; proficiency in building agentic patterns like ReAct, ReWOO, and LLMCompiler
Multi-modal Retrieval-Augmented Generation (RAG): Expertise in multi-modal AI systems (text, images, audio, video); designing and optimizing chunking strategies and clustering for large-scale data processing
Streaming & Real-time Processing: Experience in audio/video streaming and real-time data pipelines; low-latency inference and deployment architectures
NL2SQL: Natural-language-driven SQL generation for databases; experience with natural language interfaces to databases and query optimization
API Development: Building scalable APIs with FastAPI for AI model serving
Containerization & Orchestration: Proficient with Docker for containerized AI services; experience with orchestration tools for deploying and managing services
Data Processing & Pipelines: Experience with chunking strategies for efficient document processing; building data pipelines to handle large-scale data for AI model training and inference
AI Frameworks & Tools: Experience with AI/ML frameworks like TensorFlow and PyTorch; proficiency in LangChain, LangGraph, and other LLM-related technologies
Prompt Engineering: Expertise in advanced prompting techniques like Chain-of-Thought (CoT) prompting, LLM-as-judge, and self-reflection prompting; experience with prompt compression and optimization using tools like LLMLingua, AdalFlow, TextGrad, and DSPy; strong understanding of context window management and optimizing prompts for performance and efficiency

Qualifications, Experience, Skills
Level of educational attainment required (1 or more of the following): Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
Previous Work Experience Required
Proven experience of 1+ years in developing and deploying applications utilizing Azure OpenAI and Redis as a vector database.

Technical Skills Required
Solid understanding of language model technologies, including LangChain, the OpenAI Python SDK, LlamaIndex, Ollama, etc.
Proficiency in implementing and optimizing machine learning models for natural language processing.
Experience with observability tools such as MLflow, LangSmith, Langfuse, Weights & Biases, etc.
Strong programming skills in languages such as Python and proficiency in relevant frameworks.
Familiarity with containerization and orchestration tools (e.g., Docker, Kubernetes).

And above all of this, an undying love for beer! We dream big to create a future with more cheer.
Posted 1 week ago
6.0 - 12.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Dear Candidate,

We are organizing a walk-in drive at our Kolkata location on 21-Jun-2025. Please find details below:

Role: Citrix Admin
Exp: 6 to 12 years
Location: Kolkata
Venue: Tata Consultancy Services, Candor Tech Space, DH Block (Newtown), Action Area I, Newtown, Chakpachuria, New Town, West Bengal 700135

JD:
• Monitor and maintain Citrix infrastructure including Citrix Virtual Apps and Desktops (CVAD), Citrix StoreFront, Citrix Director, and Citrix Workspace. Experience with Citrix ADC (NetScaler) and profile management tools (UPM, FSLogix).
• Troubleshoot and resolve issues related to: user login and session performance; application publishing; profile management (e.g., Citrix UPM, FSLogix).
• Perform routine maintenance tasks such as: updating Citrix policies; managing machine catalogs and delivery groups; monitoring server health and resource usage.
• Assist in patching and updating Citrix components.
• Experience with Windows Server infrastructure (2016/2019/2022).
• Troubleshoot issues related to Windows, Active Directory, DNS, DHCP, Group Policy.
• Manage and optimize server performance, patching, and security configurations.
• Automate administrative tasks using PowerShell and other scripting tools.
• Perform system upgrades, migrations, and capacity planning.
• Implement and manage high availability and disaster recovery solutions (e.g., clustering, DFS, backup).
• Maintain documentation for system configurations, procedures, and troubleshooting.
• Participate in audits, compliance checks, and DR testing.
• Mentor L1/L2 support teams and lead technical initiatives.
• Manage and maintain Citrix environments, which includes tasks like configuring and monitoring Citrix systems, ensuring they meet high availability and disaster recovery requirements.
• Troubleshooting and problem-solving skills for diagnosing and resolving complex technical issues within the virtualized environment.
• Collaborate with cross-functional teams to ensure seamless integration of Citrix solutions within the broader IT landscape.
• Maintain documentation for Citrix architecture, configurations, and procedures.
• Participate in disaster recovery planning and testing.
• Mentor L1/L2 Citrix & Windows support staff and provide technical guidance.
• Provide close liaison with project teams to ensure the smooth transition of new applications, systems and initiatives into the production environment.
• Review and recommend options to improve the effectiveness of the global Windows & Citrix infrastructure; research/plan/execute migrations.

Documents to Carry:
1. TCS application form available on iBegin
2. PAN/Aadhaar or any other Government ID proof
3. Updated CV/Resume
4. 2 passport-size photographs

Please reach the venue by 10:00 AM.

Regards,
Gunsheel Sidana
Human Resources Talent Acquisition Group
Tata Consultancy Services
Posted 1 week ago
6.0 - 12.0 years
0 Lacs
Kolkata, West Bengal, India
On-site
Dear Candidate,

We are organizing a walk-in drive at our Kolkata location on 21-Jun-2025. Please find details below:

Role: Windows VMware Admin
Exp: 6 to 12 years
Location: Kolkata
Venue: Tata Consultancy Services, Candor Tech Space, DH Block (Newtown), Action Area I, Newtown, Chakpachuria, New Town, West Bengal 700135

JD:
• Design, deploy, and maintain Windows Server infrastructure (2016/2019/2022).
• Troubleshoot complex issues related to Windows, Active Directory, DNS, DHCP, Group Policy, and file/print services.
• Manage and optimize server performance, patching, and security configurations.
• Automate administrative tasks using PowerShell and other scripting tools.
• Perform system upgrades, migrations, and capacity planning.
• Implement and manage high availability and disaster recovery solutions (e.g., clustering, DFS, backup).
• Collaborate with different teams for integrated support.
• Maintain documentation for system configurations, procedures, and troubleshooting.
• Participate in audits, compliance checks, and DR testing.
• Mentor L1/L2 support teams and lead technical initiatives.
• Design and implement VMware solutions: in-depth understanding of vSphere, vCenter, NSX, and other related products; design and implement VMware solutions tailored to specific business needs, optimizing performance and efficiency.
• Manage and maintain virtualized environments, which includes tasks like configuring and monitoring VMware systems, ensuring they meet high availability and disaster recovery requirements.
• Troubleshooting and problem-solving skills for diagnosing and resolving complex technical issues within the virtualized environment.
• Collaborate with cross-functional teams to ensure seamless integration of VMware solutions within the broader IT landscape.
• Maintain documentation for Windows & VMware architecture, configurations, and procedures.
• Participate in disaster recovery planning and testing.
• Mentor L1/L2 Windows & VMware support staff and provide technical guidance.
• Provide close liaison with project teams to ensure the smooth transition of new applications, systems and initiatives into the production environment.
• Review and recommend options to improve the effectiveness of the global Windows & VMware infrastructure; research/plan/execute migrations.

Documents to Carry:
1. TCS application form available on iBegin
2. PAN/Aadhaar or any other Government ID proof
3. Updated CV/Resume
4. 2 passport-size photographs

Please reach the venue by 10:00 AM.

Regards,
Gunsheel Sidana
Human Resources Talent Acquisition Group
Tata Consultancy Services
Posted 1 week ago
2.0 - 4.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Summary
We are seeking a highly skilled and experienced DBA to join our expanding Information Technology team. In this role, you will help develop and design technology solutions that are scalable, relevant, and critical to our company's success. You will join the team working on our new platform, built using MS SQL Server and MySQL Server. You will participate in all phases of the development lifecycle, implementation, maintenance and support, and must have a solid skill set, a desire to continue to grow as a Database Administrator, and a team-player mentality.

Key Responsibilities
1. Primary responsibility will be the management of production database servers, including security, deployment, maintenance and performance monitoring.
2. Setting up SQL Server replication, mirroring and high availability as required across hybrid environments.
3. Design and implementation of new installations on Azure, AWS and cloud hosting without dedicated DB services.
4. Deploy and maintain on-premise installations of SQL Server on Linux / MySQL.
5. Database security and protection against SQL injection, exploitation of intellectual property, etc.
6. Work with development teams, assisting with data storage and query design/optimization where required.
7. Participate in the design and implementation of essential applications.
8. Demonstrate expertise and add valuable input throughout the development lifecycle.
9. Help design and implement scalable, lasting technology solutions.
10. Review current systems, suggesting updates as required.
11. Gather requirements from internal and external stakeholders.
12. Document procedures to set up and maintain a highly available SQL Server database on Azure cloud, on-premise and hybrid environments.
13. Test and debug new applications and updates.
14. Resolve reported issues and reply to queries in a timely manner.
15. Remain up to date on all current best practices, trends, and industry developments.
16. Identify potential challenges and bottlenecks in order to address them proactively.

Key Competencies/Skillsets
SQL Server management in hybrid environments (on-premise and cloud, preferably Azure, AWS)
MySQL backup, SQL Server backup, replication, clustering and log shipping experience on Linux/Windows
Setting up, management and maintenance of SQL Server/MySQL on Linux
Experience with database usage and management
Experience in implementing Azure Hyperscale databases
Experience in the Financial Services / E-Commerce / Payments industry preferred
Familiarity with multi-tier, object-oriented, secure application design architecture
Experience in cloud environments, preferably Microsoft Azure database service tiers
Experience with PCI DSS a plus
SQL development experience is a plus
Linux experience is a plus
Proficient in using issue tracking tools like Jira, etc.
Proficient in using version control systems like Git, SVN, etc.
Strong understanding of web-based applications and technologies
Sense of ownership and pride in your performance and its impact on the company's success
Critical thinking and problem-solving skills
Excellent communication skills and the ability to communicate with clients via different modes of communication: email, phone, direct messaging, etc.

Preferred Education and Experience
1. Bachelor's degree in computer science or a related field
2. Minimum 2 to 4 years' experience as a SQL Server DBA and in MSSQL, including replication, InnoDB Cluster, upgrading and patching
3. Ubuntu Linux knowledge is preferred
4. MCTS, MCITP, and/or MVP / Azure DBA / MySQL certifications a plus
Posted 1 week ago
20.0 - 22.0 years
20 - 25 Lacs
Noida
Work from Office
Tata Tele Business Services is looking for a Cluster Lead to join our dynamic team and embark on a rewarding career journey.

A Cluster Lead is responsible for overseeing and managing a group or cluster of related teams or departments within an organization. The primary objective is to ensure efficient operations, productivity, and the achievement of strategic goals within the assigned cluster.

Key Responsibilities:
Team Leadership: Lead and manage a team of managers or supervisors responsible for various functions within the cluster. Provide guidance, mentorship, and support to team members, fostering a positive and collaborative work environment.
Strategic Planning: Collaborate with senior management to develop and execute cluster-specific strategies aligned with the organization's overall goals. Monitor and report on progress toward strategic objectives.
Operations Management: Oversee the day-to-day operations and processes of the cluster, ensuring efficiency and effectiveness. Identify opportunities for process improvement and implement solutions.
Resource Allocation: Allocate resources, including personnel, budget, and technology, to meet the cluster's operational requirements.
Performance Monitoring: Establish key performance indicators (KPIs) for the cluster and ensure teams are working towards these targets. Regularly assess performance and take corrective actions as needed.
Communication and Collaboration: Foster communication and collaboration both within the cluster and with other departments or clusters. Ensure the sharing of best practices and knowledge transfer.
Risk Management: Identify potential risks and challenges within the cluster's operations and develop strategies to mitigate them.
Budget Management: Develop and manage the cluster's budget, ensuring cost control and resource optimization.
Posted 1 week ago
4.0 years
0 Lacs
Gurugram, Haryana, India
On-site
What you'll be doing
Designing and delivering the technical elements of our analytics projects
Applying descriptive and predictive analytics techniques including but not limited to: customer segmentation, market basket analysis, topic & sentiment analysis, predictive modelling / machine learning models, and data visualization
Simplifying complex analysis into clear and compelling recommendations that enable the project team to achieve our client's desired outcomes
Coordinating integration, structuring, visualization and data analysis activities
Actively contributing to exchanges and the sharing of best practices related to data
Developing efficient operations processes for high availability of the Global Data platform
Working as part of our global team; our operational, insight and IT teams work together to solve data challenges

What experience you'll need
Minimum 4 years' experience working with data as a data scientist
Experience with cloud computing environments such as Azure, GCP or AWS
Experience working with large datasets and building data models
Advanced knowledge of machine learning techniques, e.g., forecasting, recommender engines, predictive modelling, clustering
Experience with text mining use cases, e.g., topic modelling, sentiment analysis, summarization
Advanced knowledge of Python
Experience working with GenAI and large language models
Experience building, testing and deploying data visualization dashboards (Power BI / Tableau)
Posted 1 week ago
1.0 years
0 Lacs
Gurgaon, Haryana, India
Remote
Locations: Gurgaon | Bengaluru

Who We Are
Boston Consulting Group partners with leaders in business and society to tackle their most important challenges and capture their greatest opportunities. BCG was the pioneer in business strategy when it was founded in 1963. Today, we help clients with total transformation: inspiring complex change, enabling organizations to grow, building competitive advantage, and driving bottom-line impact. To succeed, organizations must blend digital and human capabilities. Our diverse, global teams bring deep industry and functional expertise and a range of perspectives to spark change. BCG delivers solutions through leading-edge management consulting along with technology and design, corporate and digital ventures, and business purpose. We work in a uniquely collaborative model across the firm and throughout all levels of the client organization, generating results that allow our clients to thrive.

What You'll Do
BCG X is BCG's premier data science and advanced analytics capability, with 1000+ data scientists, engineers, and analysts globally. Within BCG X, our X.Delivery teams are hybrid client-facing/consulting-support teams that leverage available tools and code bases to deliver powerful and specialized insights that help our clients tackle their most pressing business problems. We partner with BCG case teams and practices across the full analytics value chain: framing new business challenges and problems, designing and executing innovative analytical solutions, and interpreting findings for our clients. Geo Analytics is a sub-team in X.Delivery with expertise in geospatial analytics and tools. Our 35+ member team is a global resource, working with clients in every BCG region and in every industry area.
It is a core member of a rapidly growing analytics enterprise at BCG: a constellation of teams focused on driving practical results for BCG clients by applying leading-edge technologies. As an Analyst in Geo Analytics, you will be solving a variety of location-based analytical problems for our clients. You will utilize your knowledge of geospatial data and analytics while using a variety of tools such as Geographic Information Systems (GIS), Python, and visualization/analytical platforms (e.g. Tableau, Alteryx). You will leverage a range of techniques such as geospatial clustering, network analysis, location-based optimization and predictive analytics/modeling.
What You'll Bring
· Undergraduate degree in a field linked to business analytics, statistics or geo-statistics, economics, operations research, applied mathematics, computer science, engineering, or a related field is required; an advanced degree is preferred
· 1-3 years of relevant work experience in Geo Analytics, with an advanced degree preferred
· Relevant degree in a geospatial-related field such as Geography, Remote Sensing, Urban Planning, Statistics/Geo-Statistics, Applied Mathematics, or Computer Science
· Previous experience working in a global organization or professional services firm is preferred
· Data management skills (e.g. data modeling, data integrity QA/QC, database administration)
· GIS software such as Esri ArcGIS, QGIS, CARTO
· Data science programming/coding in Python and/or R
· Descriptive statistics and statistical methods (e.g. correlation, regression, clustering) with applications in machine learning
· Exposure to BI/data visualization platforms such as Tableau, Power BI, etc.
· Handling, processing, and deriving insights from remotely sensed data preferred
#BCGXjob
Who You'll Work With
You will work with various teams on client projects. Boston Consulting Group is an Equal Opportunity Employer.
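To illustrate the kind of location-based optimization named above, here is a minimal, dependency-free sketch that assigns a customer to the nearest candidate site using the haversine formula. The site names and coordinates are made up for the example; a real engagement would use GIS tooling rather than hand-rolled code.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in km between two (lat, lon) points."""
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))  # 6371 km = mean Earth radius

def nearest_site(customer, sites):
    """Assign a customer (lat, lon) to the closest candidate site."""
    return min(sites, key=lambda s: haversine_km(*customer, *s["lat_lon"]))

# Hypothetical example: pick the closer of two candidate locations.
sites = [
    {"name": "Gurgaon", "lat_lon": (28.4595, 77.0266)},
    {"name": "Bengaluru", "lat_lon": (12.9716, 77.5946)},
]
print(nearest_site((28.61, 77.21), sites)["name"])  # Delhi-area customer → Gurgaon
```

The same distance function underlies geospatial clustering and network analysis at small scale; production work would use spatial indexes instead of brute-force comparison.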
All qualified applicants will receive consideration for employment without regard to race, color, age, religion, sex, sexual orientation, gender identity/expression, national origin, disability, protected veteran status, or any other characteristic protected under national, provincial, or local law, where applicable, and those with criminal histories will be considered in a manner consistent with applicable state and local laws. BCG is an E-Verify Employer.
8.0 - 12.0 years
0 Lacs
Gurugram, Haryana, India
On-site
AI Engineer
Key Responsibilities
· GenAI Development: Design and develop advanced GenAI models (e.g., LLMs, DALL-E-style image models) and AI agents to automate internal tasks and workflows.
· Exposure to LLMs: Utilize Azure OpenAI APIs; experience with models like GPT-4o, o3, Llama 3.
· Enhance the existing RAG-based application: In-depth understanding of the stages of RAG (chunking, retrieval, etc.).
· Cloud Deployment: Deploy and scale GenAI solutions on Azure Cloud services (e.g., Azure Function App) for optimal performance.
· ML Fundamentals: In-depth understanding of ML models such as linear regression, random forests, and decision trees, as well as clustering and supervised models.
· AI Agent Development: Build AI agents using frameworks like LangChain to streamline internal processes and boost efficiency.
· Data Analytics: Perform advanced data analytics to preprocess datasets, evaluate model performance, and derive actionable insights for GenAI solutions.
· Data Visualization: Create compelling visualizations (e.g., dashboards, charts) to communicate model outputs, performance metrics, and business insights to stakeholders.
· Stakeholder Collaboration: Partner with departments to gather requirements, align on goals, and present technical solutions and insights effectively to non-technical stakeholders.
· Model Optimization: Fine-tune GenAI models for efficiency and accuracy using techniques like prompt engineering, quantization, and RAG (Retrieval-Augmented Generation).
· LLMOps Best Practices: Implement GenAI-specific MLOps, including CI/CD pipelines (Git, Azure DevOps).
· Leadership: Guide cross-functional teams, mentor junior engineers, and drive project execution with strategic vision and ownership.
· Strategic Thinking: Develop innovative GenAI strategies to address business challenges, leveraging data insights to align solutions with organizational goals.
· Self-Driven Execution: Independently lead projects to completion with minimal supervision, proactively resolving challenges and seeking collaboration when needed.
· Continuous Learning: Stay ahead of GenAI, analytics, and visualization advancements, self-learning new techniques to enhance project outcomes.
Required Skills & Experience
· Experience: 8-12 years in AI/ML development, with at least 4 years focused on Generative AI and AI agent frameworks.
· Education: BTech/BE in Computer Science, Engineering, or equivalent (Master’s or PhD in AI/ML is a plus).
· Programming: Expert-level Python proficiency, with deep expertise in GenAI libraries (e.g., LangChain, Hugging Face Transformers, PyTorch, OpenAI SDK) and data analytics libraries (e.g., Pandas, NumPy, scikit-learn).
· Data Analytics: Strong experience in data preprocessing, statistical analysis, and model evaluation to support GenAI development and business insights.
· Data Visualization: Proficiency in visualization tools (e.g., Matplotlib, Seaborn, Plotly, Power BI, or Tableau) to create dashboards and reports for stakeholders.
· Azure Cloud Expertise: Strong experience with Azure Cloud services (e.g., Azure Function App, Azure ML, serverless) for model training and deployment.
· GenAI Methodologies: Deep expertise in LLMs, AI agent frameworks, prompt engineering, and RAG for internal workflow automation.
· Deployment: Proficiency in Docker, Kubernetes, and CI/CD pipelines (e.g., Azure DevOps, GitHub Actions) for production-grade GenAI systems.
· LLMOps: Expertise in GenAI MLOps, including experiment tracking (e.g., Weights & Biases), automated evaluation metrics (e.g., BLEU, ROUGE), and monitoring.
Soft Skills
· Communication: Exceptional verbal and written skills to articulate complex GenAI concepts, analytics, and visualizations to technical and non-technical stakeholders.
· Strategic Thinking: Ability to align AI solutions with business objectives, using data-driven insights to anticipate challenges and propose long-term strategies.
· Problem-Solving: Strong analytical skills with a proactive, self-starter mindset to independently resolve complex issues.
· Collaboration: Collaborative mindset to work effectively across departments and engage colleagues for solutions when needed.
· Speed to outcome.
Preferred Skills
· Experience deploying GenAI models in production environments, preferably on Azure.
· Familiarity with multi-agent systems, reinforcement learning, or distributed training (e.g., DeepSpeed).
· Knowledge of DevOps practices, including Git, CI/CD, and infrastructure-as-code.
· Advanced data analytics techniques (e.g., time-series analysis, A/B testing) for GenAI applications.
· Experience with interactive visualization frameworks (e.g., Dash, Streamlit) for real-time dashboards.
· Contributions to GenAI or data analytics open-source projects, or publications in NLP, generative modeling, or data science.
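The RAG stages this posting calls out (chunking and retrieval) can be sketched in a few lines. This is a toy illustration, not Azure OpenAI code: it chunks text by word count and retrieves by bag-of-words cosine similarity, where a production system would use overlap-aware splitting, an embedding model, and a vector store.

```python
import re
from collections import Counter
from math import sqrt

def chunk(text, size=40):
    """Naive fixed-size chunking by word count (real pipelines use overlap/semantic splits)."""
    words = text.split()
    return [" ".join(words[i:i + size]) for i in range(0, len(words), size)]

def _vec(text):
    """Bag-of-words term counts, standing in for an embedding."""
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    num = sum(a[t] * b[t] for t in a)
    den = sqrt(sum(v * v for v in a.values())) * sqrt(sum(v * v for v in b.values()))
    return num / den if den else 0.0

def retrieve(query, chunks, k=1):
    """Rank chunks by similarity to the query; the top-k become LLM context."""
    q = _vec(query)
    return sorted(chunks, key=lambda c: cosine(q, _vec(c)), reverse=True)[:k]

docs = chunk("Azure Functions deploy serverless code. "
             "Retrieval augmented generation grounds LLM answers in documents.", size=6)
print(retrieve("how does retrieval augmented generation work", docs))
```

Swapping `_vec`/`cosine` for embedding-model calls and `docs` for a vector-store query gives the usual production shape, with the retrieved chunks then injected into the LLM prompt.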
6.0 - 10.0 years
8 - 16 Lacs
Gurugram
Work from Office
We are seeking a skilled Data Analytics professional with a strong foundation in data analytics, experience working with large datasets, and expertise in analyzing operational performance and business metrics.
Desired Skills & Experience
· Any Graduate/Post-Graduate with experience in the data analytics domain.
· 6+ years of experience in data analysis, preferably within a BPM or voice-process environment.
· Proven experience in analyzing operational metrics.
· Proficiency in data analytics tools and platforms such as SQL and Excel.
· Experience with scientific analytics tools like Python, R, or SAS for statistical analysis and model development.
· Competent with data visualization tools such as Tableau or Power BI.
· Knowledge of statistical techniques, including regression analysis, hypothesis testing, clustering, and machine learning.
· Strong analytical thinking and problem-solving skills.
· Excellent organizational skills, attention to detail, and ability to manage multiple tasks simultaneously.
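The regression analysis listed above reduces, in its simplest form, to ordinary least squares, which has a closed-form solution needing no libraries at all. The data below is a made-up operational example (handle time vs. queue length), not anything from the posting.

```python
def fit_line(xs, ys):
    """Ordinary least squares for y = a + b*x (closed form, no libraries)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    return my - b * mx, b  # (intercept, slope)

# Hypothetical ops metric: average handle time (y) against queue length (x).
intercept, slope = fit_line([1, 2, 3, 4], [3, 5, 7, 9])
print(intercept, slope)  # → 1.0 2.0 (the points lie exactly on y = 1 + 2x)
```

In day-to-day work the same fit would come from Excel's trendline, R's `lm`, or Python's statsmodels; the closed form just shows what those tools compute.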
1.0 - 3.0 years
4 - 8 Lacs
Ahmedabad
Work from Office
Armanino is proud to be among the top 20 largest accounting and consulting firms in the nation and one of the Best Places to Work. We have a community of resources that are ready and willing to support your ideas, build your skills and expand your professional network. We want you to integrate all aspects of your life with your career. At Armanino, we know you don't check out of life when you check in at work. That's why we've created a unique work environment where your passions, work, and family & friends can overlap. We want to help you achieve growth by giving you access to a network of smart and supportive people, willing to listen to your ideas. This open position is for Armanino India LLP, which is located in Ahmedabad, Gujarat, India. Armanino India LLP is a fully owned subsidiary of Armanino.
Job Responsibilities
· Collecting and cleaning data from various sources.
· Accurately input data into CRM and other third-party systems.
· Perform regular updates to existing data entries to reflect current information.
· Execute mass data-loading procedures into the CRM and third-party applications while adhering to best practices to avoid data corruption.
· Monitor and troubleshoot any issues that arise during data-loading processes.
· Ensuring data quality, accuracy, and consistency across all data sources.
· Conduct regular audits of data to ensure accuracy, completeness, and consistency.
· Maintain comprehensive documentation of data sources, processes, and data management best practices.
Requirements
· Bachelor's degree in a relevant field.
· 1 to 3 years of proven experience in data entry, data management, or a similar role.
· Strong skills in data cleaning, processing, and validation.
· Proficiency with CRM systems and experience in mass data loading.
· Excellent communication and interpersonal skills, with the ability to work collaboratively with cross-functional teams.
· Ability to work independently and manage multiple tasks simultaneously.
· Excellent organizational skills and attention to detail, with the ability to work with large data sets.
· A willingness to learn and adapt to new tools and technologies as needed.
Compensation and Benefits
· Compensation: Commensurate with industry standards.
· Other Benefits: Provident Fund, Gratuity, Medical Insurance, Group Personal Accident Insurance, and other employment benefits depending on the position.
Armanino provides equal employment opportunities (EEO) to all employees and applicants for employment without regard to race, color, religion, sex, national origin, age, disability or genetics. In addition to federal law requirements, Armanino complies with applicable state and local laws governing nondiscrimination in employment in every location in which the company has facilities. This policy applies to all terms and conditions of employment, including recruiting, hiring, placement, promotion, termination, layoff, recall, transfer, leaves of absence, compensation and training. Armanino expressly prohibits any form of workplace harassment based on race, color, religion, gender, sexual orientation, gender identity or expression, national origin, age, genetic information, disability, or veteran status. Improper interference with the ability of Armanino employees to perform their job duties may result in discipline up to and including discharge.
5.0 years
0 Lacs
Noida, Uttar Pradesh, India
On-site
Our Company
Changing the world through digital experiences is what Adobe’s all about. We give everyone—from emerging artists to global brands—everything they need to design and deliver exceptional digital experiences. We’re passionate about empowering people to create beautiful and powerful images, videos, and apps, and transform how companies interact with customers across every screen. We’re on a mission to hire the very best and are committed to creating exceptional employee experiences where everyone is respected and has access to equal opportunity. We realize that new ideas can come from everywhere in the organization, and we know the next big idea could be yours!
The Opportunity
Use your expertise in data science engineering to drive the next stage of growth at Adobe. The Customer Analytics & GTM team is focused on using the power of data to deliver optimized experiences through personalization. This role will drive data engineering for large-scale data science initiatives across a wide variety of strategic projects. As a member of the Data Science Engineering team, you will have significant responsibility to help build a large-scale cloud-based data and analytics platform with enterprise-wide consumers. This role is inherently multi-functional, and the ideal candidate will work across teams.
The position requires the ability to own initiatives, come up with innovative solutions, try new tools and technologies, and bring an entrepreneurial personality. Come join us for a truly exciting career, great benefits and outstanding work-life balance.
What You Will Do
· Build fault-tolerant, scalable, quality data pipelines using multiple cloud-based tools.
· Develop analytical and personalization capabilities using pioneering technologies, leveraging Adobe tools.
· Build LLM agents to optimize and automate data pipelines following best engineering practices.
· Deliver end-to-end data pipelines to run machine learning models in a production platform.
· Deliver innovative solutions that help the broader organization take significant actions fast and efficiently.
· Contribute to data engineering and data science frameworks, tools, and processes.
· Implement outstanding data operations and standard methodologies to use resources in an optimal way.
· Architect data ingestion, data transformation, data consumption, and data governance frameworks.
· Help build production-grade ML models and integration with operational systems.
This is a high-visibility role for a team on a critical mission to stop software piracy. Significant collaboration with global multi-functional operations teams is required to onboard customers to genuine software. Work in a collaborative environment and contribute to the team's, as well as the organization's, success.
What You Will Need
· Bachelor's degree in computer science or equivalent; Master's degree or equivalent experience is preferred.
· 5-8 years of consistent track record as a data engineer.
· At least 2+ years of demonstrable experience and a proven track record with the mobile data ecosystem is a must: App Store Optimization (ASO), third-party systems like Branch, RevenueCat, Google and Apple APIs, and building data pipelines for in-app purchases, paywall impressions and tracking, app crashes, etc.
· 5+ years of validated ability in distributed data technologies, e.g. Hadoop, Hive, Presto, Spark.
· 3+ years of experience with cloud-based technologies: Databricks, S3, Azure Blob Storage, Notebooks, AWS EMR, Athena, Glue, etc.
· Familiarity and usage of different file formats in batch/streaming processing, i.e. Delta/Parquet/ORC.
· 2+ years' experience with streaming data ingestion and transformation using Kafka, Kinesis, etc.
· Outstanding SQL experience; ability to write optimized SQL across platforms.
· Proven hands-on experience in Python/PySpark/Scala, the ability to manipulate data using Pandas, NumPy, Koalas, etc., and experience using APIs to transfer data.
· Experience working as an architect to design large-scale distributed data platforms.
· Experience with CI/CD tools, e.g. GitHub, Jenkins.
· Working experience with open-source orchestration tools, e.g. Apache Airflow, Azkaban.
· Teammate with excellent communication/teamwork skills for working closely with data scientists and machine learning engineers daily.
· Hands-on work experience with the Elastic Stack (Elasticsearch, Logstash, Kibana) and graph databases (Neo4j, Neptune, etc.) is highly desired.
· Work experience with ML algorithms and frameworks, e.g. Keras, TensorFlow, PyTorch, XGBoost, linear regression, classification, random forests, clustering, MLflow.
Nice to Have
· Showcase your work if you are an open-source contributor; passion for contributing to the open-source community is highly valued.
· Experience with data governance tools (e.g. Collibra) and collaboration tools (e.g. JIRA/Confluence).
· Familiarity with Adobe tools like Adobe Experience Platform, Adobe Analytics, Customer Journey Analytics, and Adobe Journey Optimizer is a plus.
· Experience with LLM models/agentic workflows using Copilot, Claude, Llama, Databricks Genie, etc. is highly preferred.
Adobe is proud to be an Equal Employment Opportunity and affirmative action employer.
We do not discriminate based on gender, race or color, ethnicity or national origin, age, disability, religion, sexual orientation, gender identity or expression, veteran status, or any other applicable characteristics protected by law. Adobe aims to make Adobe.com accessible to any and all users. If you have a disability or special need that requires accommodation to navigate our website or complete the application process, email accommodations@adobe.com or call (408) 536-3015. Adobe values a free and open marketplace for all employees and has policies in place to ensure that we do not enter into illegal agreements with other companies to not recruit or hire each other’s employees.
8.0 years
0 Lacs
Chennai, Tamil Nadu, India
On-site
Job Summary
· Experience in applying machine learning techniques, Natural Language Processing or Computer Vision using TensorFlow, PyTorch.
· Strong analytical and problem-solving skills.
· Solid software engineering skills across multiple languages, including but not limited to Java or Python, C/C++.
· Build and deploy end-to-end ML models and leverage metrics to support predictions, recommendations, search, and growth strategies.
· Deep understanding of ML techniques such as classification, clustering, deep learning, optimization methods, and supervised and unsupervised techniques.
· Proven ability to apply, debug, and develop machine learning models.
· Establish scalable, efficient, automated processes for data analyses, model development, validation and implementation.
· Choose suitable DL algorithms, software and hardware, and suggest integration methods.
· Ensure AI/ML solutions are developed, and validations are performed, in accordance with Responsible AI guidelines and standards.
· Closely monitor model performance and ensure model improvements are made post project delivery.
· Coach and mentor our team as we build scalable machine learning solutions.
· Strong communication skills and an easy-going attitude.
· Oversee development and implementation of assigned programs and guide teammates.
· Carry out testing procedures to ensure systems are running smoothly.
· Ensure that systems satisfy quality standards and procedures.
· Build and manage strong relationships with stakeholders and various teams internally and externally.
· Provide direction and structure to assigned project activities, establishing clear, precise goals, objectives and timeframes; run project governance calls with senior stakeholders.
Strategy
As the Squad Lead of the AI/ML Delivery team, the candidate is expected to lead squad delivery for AI/ML.
Business
Understand the business requirements, execute the ML solutioning, and ensure delivery commitments are met on time and on schedule.
Processes
· Design and delivery of AI/ML use cases
· RAI, security & governance
· Model validation & improvements
· Stakeholder management
People & Talent
· Manage the team in terms of project assignments and deadlines.
· Manage a team dedicated to reviewing models for unstructured and structured data.
· Hire and nurture talent as required.
Risk Management
· Ownership of the delivery, highlighting various risks to the stakeholders in a timely manner.
· Identifying a proper remediation plan for the risks, with a proper risk roadmap.
Governance
Awareness and understanding of the regulatory framework in which the Group operates, and the regulatory requirements and expectations relevant to the role.
Key Responsibilities
Regulatory & Business Conduct
· Display exemplary conduct and live by the Group’s Values and Code of Conduct.
· Take personal responsibility for embedding the highest standards of ethics, including regulatory and business conduct, across Standard Chartered Bank. This includes understanding and ensuring compliance with, in letter and spirit, all applicable laws, regulations, guidelines and the Group Code of Conduct.
· Lead the [country / business unit / function] team to achieve the outcomes set out in the Bank’s Conduct Principles: Fair Outcomes for Clients; Effective Financial Markets; Financial Crime Compliance; The Right Environment.
· Effectively and collaboratively identify, escalate, mitigate and resolve risk, conduct and compliance matters.
Key Stakeholders
· Business stakeholders
· AI/ML Engineering team
· AI/ML Product team
· Product Enablement team
· SCB Infrastructure team
· Interfacing Program team
Skills And Experience
· Use NLP, vision and ML techniques to bring order to unstructured data.
· Experience in extracting signal from noise in large unstructured datasets is a plus.
· Work within the Engineering team to design, code, train, test, deploy and iterate on enterprise-scale machine learning systems.
· Work alongside an excellent, cross-functional team across Engineering, Product and Design to create solutions and try various algorithms to solve the problem.
· Stakeholder management.
Qualifications
· Master's with specialisation in Technology, with certification in AI and ML.
· 8-12 years of relevant hands-on experience in developing and delivering AI solutions.
About Standard Chartered
We're an international bank, nimble enough to act, big enough for impact. For more than 170 years, we've worked to make a positive difference for our clients, communities, and each other. We question the status quo, love a challenge and enjoy finding new opportunities to grow and do better than before. If you're looking for a career with purpose and you want to work for a bank making a difference, we want to hear from you. You can count on us to celebrate your unique talents, and we can't wait to see the talents you can bring us. Our purpose, to drive commerce and prosperity through our unique diversity, together with our brand promise, to be here for good, are achieved by how we each live our valued behaviours. When you work with us, you'll see how we value difference and advocate inclusion.
Together We
· Do the right thing and are assertive, challenge one another, and live with integrity, while putting the client at the heart of what we do
· Never settle, continuously striving to improve and innovate, keeping things simple and learning from doing well, and not so well
· Are better together: we can be ourselves, be inclusive, see more good in others, and work collectively to build for the long term
What We Offer
In line with our Fair Pay Charter, we offer a competitive salary and benefits to support your mental, physical, financial and social wellbeing.
· Core bank funding for retirement savings, medical and life insurance, with flexible and voluntary benefits available in some locations.
· Time off including annual leave, parental/maternity leave (20 weeks), sabbatical (12 months maximum) and volunteering leave (3 days), along with minimum global standards for annual and public holidays, which combine to a minimum of 30 days.
· Flexible working options based around home and office locations, with flexible working patterns.
· Proactive wellbeing support through Unmind, a market-leading digital wellbeing platform, development courses for resilience and other human skills, a global Employee Assistance Programme, sick leave, mental health first-aiders and a range of self-help toolkits.
· A continuous learning culture to support your growth, with opportunities to reskill and upskill and access to physical, virtual and digital learning.
· Being part of an inclusive and values-driven organisation, one that embraces and celebrates our unique diversity across our teams, business functions and geographies, where everyone feels respected and can realise their full potential.
4.0 years
0 Lacs
Bengaluru, Karnataka, India
Remote
Sapiens is on the lookout for a Senior Data Scientist to become a key player in our Bangalore team. If you're a seasoned data science pro ready to take your career to new heights with an established, globally successful company, this role could be the perfect fit.
Location: Bangalore
Working Model: Our flexible work arrangement combines both remote and in-office work, optimizing flexibility and productivity.
This position will be part of Sapiens’ Digital (Data Suite) division; for more information about it, see: https://sapiens.com/solutions/digitalsuite-customer-experience-and-engagement-software-for-insurers/
We are looking for an enthusiastic Data Scientist with a foundational understanding of data science principles and an interest in the insurance industry. The role involves working alongside senior data scientists to analyze data, build models, and generate insights.
What You’ll Do
Data Collection & Preprocessing
· Assist in gathering, cleaning, and preprocessing data from various sources, ensuring data quality and consistency.
· Conduct exploratory data analysis (EDA) to identify trends, patterns, and insights within the data.
Model Development & Evaluation
· Build predictive models to enhance accuracy.
· Test and evaluate basic machine learning models, including linear and logistic regression, decision trees, and clustering algorithms.
Reporting & Visualization
· Create clear and informative data visualizations and reports to communicate insights to business stakeholders.
· Track key metrics and performance indicators related to business objectives.
Cross-functional Collaboration
· Collaborate with business and technical teams to understand project requirements and contribute to the design of data-driven solutions.
· Participate in team meetings, brainstorming sessions, and discussions on data science projects and initiatives.
Research & Learning
· Stay updated on the latest trends and tools in data science, machine learning, and the insurance industry.
· Participate in ongoing training and development programs to enhance technical skills and industry knowledge.
What You Need for This Position
· Education: Bachelor's degree in Data Science, Statistics, Mathematics, Computer Science, or a related field. A Master's degree is a plus.
· Experience: 4+ years of experience in data science or analytics.
· Basic Industry Knowledge: General understanding of insurance concepts (e.g., policies, claims, risk, customer lifecycle) is desirable but not required.
· Insurance Data Familiarity: Some experience working with structured data (e.g., customer demographics, policy details) and an eagerness to learn about the specific types of data used in the insurance industry.
· Predictive Models: Exposure to basic predictive modeling techniques such as regression, classification, forecasting or clustering. Experience with insurance-specific models (e.g., claims prediction, risk assessment) is a plus.
· Data Preparation: Knowledge of data cleaning and preprocessing techniques, with experience in handling datasets to prepare them for analysis.
· Programming Skills: Proficiency in Python for data analysis, with familiarity in using libraries such as Pandas, NumPy, and Scikit-learn.
· Data Manipulation: Basic skills in SQL for querying and extracting data from databases.
· Statistical Knowledge: Understanding of fundamental statistical concepts, including distributions, probability, hypothesis testing, and descriptive statistics.
Required Soft Skills
· Curiosity & Willingness to Learn: Enthusiastic about learning new concepts, techniques, and industry-specific knowledge in the insurance domain.
· Problem-Solving Skills: Ability to approach challenges analytically and think critically about data and project requirements.
· Attention to Detail: Keen eye for detail in data handling and model development to ensure high-quality outcomes.
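The clustering algorithms mentioned above can be illustrated with a bare-bones Lloyd's-algorithm k-means. In practice you would reach for scikit-learn's KMeans; this dependency-free sketch on synthetic 2-D points only shows the assign/update loop at the heart of the method.

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Plain Lloyd's-algorithm k-means on 2-D points (illustrative only)."""
    rng = random.Random(seed)
    centers = rng.sample(points, k)
    for _ in range(iters):
        # Assignment step: each point joins its nearest center's cluster.
        clusters = [[] for _ in range(k)]
        for p in points:
            j = min(range(k),
                    key=lambda j: (p[0] - centers[j][0]) ** 2 + (p[1] - centers[j][1]) ** 2)
            clusters[j].append(p)
        # Update step: each center moves to its cluster's mean.
        centers = [
            (sum(x for x, _ in c) / len(c), sum(y for _, y in c) / len(c)) if c else centers[j]
            for j, c in enumerate(clusters)
        ]
    return centers

# Two well-separated synthetic blobs: centers converge near (1/3, 1/3) and (31/3, 31/3).
pts = [(0, 0), (0, 1), (1, 0), (10, 10), (10, 11), (11, 10)]
print(sorted(kmeans(pts, 2)))
```

Real claims or customer-segmentation data would first need the cleaning and feature scaling steps listed above, since k-means is sensitive to feature scale.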
· Communication Skills: Clear and concise communication skills, with the ability to present findings and insights effectively.
· Team Collaboration: Willingness to work collaboratively within a team, receiving and applying feedback from senior team members.
About Sapiens
Sapiens is a global leader in the insurance industry, delivering its award-winning, cloud-based SaaS insurance platform to over 600 customers in more than 30 countries. Sapiens’ platform offers pre-integrated, low-code capabilities to accelerate customers’ digital transformation. With more than 40 years of industry expertise, Sapiens has a highly professional team of over 5,000 employees globally. For more information, visit us at www.sapiens.com.
Sapiens is an equal opportunity employer. We value diversity and strive to create an inclusive work environment that embraces individuals from diverse backgrounds.
Disclaimer: Sapiens India does not authorise any third parties to release employment offers or conduct recruitment drives via a third party. Hence, beware of inauthentic and fraudulent job offers or recruitment drives from any individuals or websites purporting to represent Sapiens. Further, Sapiens does not charge any fee or other emoluments for any reason (including, without limitation, visa fees) or seek compensation from educational institutions to participate in recruitment events. Accordingly, please check the authenticity of any such offers before acting on them; where acted upon, you do so at your own risk. Sapiens shall neither be responsible for honouring or making good the promises made by fraudulent third parties, nor for any monetary or any other loss incurred by the aggrieved individual or educational institution. In the event that you come across any fraudulent activities in the name of Sapiens, please feel free to report the incident to sharedservices@sapiens.com.
4.0 years
0 Lacs
Andhra Pradesh, India
On-site
A career within Data and Analytics services will provide you with the opportunity to help organisations uncover enterprise insights and drive business results using smarter data analytics. We focus on a collection of organisational technology capabilities, including business intelligence, data management, and data assurance, that help our clients drive innovation, growth, and change within their organisations in order to keep up with the changing nature of customers and technology. We make impactful decisions by mixing mind and machine to leverage data, understand and navigate risk, and help our clients gain a competitive edge.
As part of our Analytics and Insights Consumption team, you’ll analyze data to drive useful insights for clients to address core business issues or to drive strategic outcomes. You'll use visualization, statistical and analytics models, AI/ML techniques, ModelOps and other techniques to develop these insights.
Years of Experience: Candidates with 4+ years of hands-on experience
Required Skills
· Familiarity with the Conversational AI domain, conversational design and implementation, customer experience metrics, and industry-specific challenges.
· Understanding of conversational data (chats, emails and calls) and its preprocessing (including feature engineering if required) to train Conversational AI systems.
· Strong problem-solving and analytical skills to troubleshoot and optimize conversational AI systems.
· Familiarity with NLP/NLG techniques such as part-of-speech tagging, lemmatization, canonicalization, Word2vec, sentiment analysis, topic modeling, and text classification.
· NLP and NLU verticals expertise: Text to Speech (TTS), Speech to Text (STT), SSML modeling, intent analytics, proactive outreach orchestration, omnichannel AI and IVR (incl. testing), intelligent agent assist, Contact Center as a Service (CCaaS), modern data for Conversational AI, and Generative AI.
· Experience building chatbots using bot frameworks such as RASA, LUIS, Dialogflow, Lex, etc., and building NLU model pipelines using feature extraction, entity extraction, intent classification, etc.
· Understanding of and experience with cloud platforms (e.g., AWS, Azure, Google Cloud, Omilia Cloud Platform, Kore.ai, OneReach.ai, NICE, Salesforce, etc.) and their services for building Conversational AI solutions for clients.
· Expertise in Python or PySpark; working knowledge of R and JavaScript frameworks.
· Expertise in visualization tools such as Power BI, Tableau, QlikView, Spotfire, etc.
· Experience with evaluating and improving conversational AI system performance through metrics and user feedback.
· Excellent communication and collaboration skills to work effectively with cross-functional teams and stakeholders.
· Proven track record of successfully delivering conversational AI projects on time.
· Familiarity with Agile development methodologies and version control systems.
· Ability to stay updated with the latest advancements and trends in conversational AI technologies.
· Strong strategic thinking and ability to align conversational AI initiatives with business goals.
· Knowledge of regulatory and compliance requirements related to conversational AI applications.
· Experience in the telecom industry or a similar field.
· Familiarity with customer service operations and CRM systems.
Nice To Have
· Familiarity with data wrangling tools such as Alteryx, Excel and relational storage (SQL).
· ML modeling skills: experience in various statistical techniques such as Regression, Time Series Forecasting, Classification, XGB, Clustering, Neural Networks, Simulation Modelling, etc.
· Experience in survey analytics and organizational functions such as pricing, sales, marketing, operations, customer insights, etc.
· Understanding of NoSQL databases (e.g., MongoDB, Cassandra) for handling unstructured and semi-structured data.
· Good communication and presentation skills.
Posted 1 week ago
3.0 years
0 Lacs
Greater Nashik Area
On-site
Dreaming big is in our DNA. It’s who we are as a company. It’s our culture. It’s our heritage. And more than ever, it’s our future. A future where we’re always looking forward. Always serving up new ways to meet life’s moments. A future where we keep dreaming bigger. We look for people with passion, talent, and curiosity, and provide them with the teammates, resources and opportunities to unleash their full potential. The power we create together – when we combine your strengths with ours – is unstoppable. Are you ready to join a team that dreams as big as you do?
AB InBev GCC was incorporated in 2014 as a strategic partner for Anheuser-Busch InBev. The center leverages the power of data and analytics to drive growth for critical business functions such as operations, finance, people, and technology. The teams are transforming Operations through Tech and Analytics.
Do You Dream Big? We Need You.
Job Title: Data Scientist
Location: Bangalore
Reporting to: Manager - Analytics
Purpose of the role
Contributing to the Data Science efforts of AB InBev’s global non-commercial analytics capability for Procurement Analytics. The candidate will be required to contribute to, and may also need to guide, the DS team staffed on the area, and to assess the efforts required to scale and standardize the use of Data Science across multiple ABI markets.
KEY TASKS AND ACCOUNTABILITIES
· Understand the business problem and translate it into an analytical problem; participate in the solution design process.
· Manage the full AI/ML lifecycle, including data preprocessing, feature engineering, model training, validation, deployment, and monitoring.
· Develop reusable and modular Python code adhering to OOP (Object-Oriented Programming) principles.
· Design, develop, and deploy machine learning models into production environments on Azure.
· Collaborate with data scientists, software engineers, and other stakeholders to meet business needs.
· Ability to communicate findings clearly to both technical and business stakeholders.
Qualifications, Experience, Skills
Level of educational attainment required (1 or more of the following): B.Tech/BE/Masters in CS/IS/AI/ML
Previous Work Experience Required: Minimum 3 years of relevant experience
Technical Skills Required
Must Have
· Strong expertise in Python, including advanced knowledge of OOP concepts.
· Exposure to AI/ML methodologies with previous hands-on experience in ML concepts like forecasting, clustering, regression, classification, optimization, deep learning and NLP using Python.
· Solid understanding of GenAI concepts and experience in Prompt Engineering and RAG.
· Experience with version control tools such as Git.
· Consistently displays an intent for problem solving.
· Strong communication skills (vocal and written).
· Ability to effectively communicate and present information at various levels of an organization.
Good To Have
· Preferred industry exposure in CPG and experience of working in the domain of Procurement Analytics.
· Product building experience would be a plus.
· Familiarity with the Azure tech stack, Databricks, and MLflow in any cloud platform.
· Experience with Airflow for orchestrating and automating workflows.
· Familiarity with MLOps and containerization tools like Docker would be a plus.
Other Skills Required
· Passion for solving problems using data.
· Detail oriented, analytical and inquisitive.
· Ability to learn on the go.
· Ability to work independently and with others.
We dream big to create a future with more cheers.
Posted 1 week ago
0 years
0 Lacs
Gurugram, Haryana, India
On-site
Job Description: We are seeking a Data Scientist with strong expertise in AI/ML, AWS, Big Data, and Python to join our data-driven team. You will design, build, and deploy scalable machine learning models and data pipelines that drive key business decisions.
Key Responsibilities:
· Develop and implement machine learning models for classification, regression, clustering, and forecasting.
· Work with large-scale structured and unstructured data using Big Data tools (e.g., Spark, Hadoop).
· Build data pipelines and deploy models on AWS cloud services (e.g., S3, SageMaker, Lambda, EMR).
· Conduct exploratory data analysis and feature engineering using Python (NumPy, Pandas, Scikit-learn, etc.).
· Collaborate with engineering, product, and analytics teams to integrate models into production systems.
· Continuously improve model accuracy and performance through testing and experimentation.
Required Skills:
· Strong programming skills in Python and experience with ML libraries (e.g., Scikit-learn, TensorFlow, PyTorch).
· Experience with AWS services for data processing and model deployment.
· Familiarity with Big Data technologies like Hadoop, Spark, Hive, or similar.
Posted 1 week ago
7.0 - 10.0 years
0 Lacs
Pune, Maharashtra, India
On-site
Job Description
The ideal candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality, and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. The candidate should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive and go beyond expectations to achieve job results and create new opportunities. The role must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate team vision and clear objectives.
Senior Process Manager Role And Responsibilities
· Understand the business problem and requirements by building domain knowledge, and translate them into a data science problem.
· Conceptualize and design a cutting-edge data science solution to solve the data science problem, applying design-thinking concepts.
· Identify the right algorithms, tech stack, and sample outputs required to efficiently address the end need.
· Prototype and experiment with the solution to successfully demonstrate its value.
· Independently, or with support from the team, execute the conceptualized solution as per plan by following project management guidelines.
· Present the results to internal and client stakeholders in an easy-to-understand manner with great storytelling, storyboarding, insights and visualization.
· Help build overall data science capability for eClerx through support in pilots, pre-sales pitches, product development, and practice development initiatives.
Technical And Functional Skills
· Bachelor’s degree in Computer Science with 7 to 10 years of work experience.
· Must have experience in advanced analytics and data science, including regression, forecasting, SQL, R, Python, decision trees, random forests, SAS, clustering and classification.
· Ability to engage clients to understand business requirements and convert them into technical/modelling problems for solution development.
· Demonstrates strong interpersonal skills and a comfort interacting with clients from the C-suite to marketing managers to technical specialists.
· Demonstrated knowledge of analytical/statistical techniques and their applications; a working knowledge of/experience in R and Python is a plus.
· Demonstrated excellent communication skills, both written and spoken, as well as being able to explain complex technical concepts in plain English.
· Ability to present results of statistical models in business language.
· Domain understanding of at least one, preferably two, verticals amongst Retail, Cable, and Technology (not mandatory).
· Knowledge of data visualization tools (Tableau, QlikView, etc.) is a plus.
· Demonstrates strong analytical and storytelling skills and the ability to find relevant stories from piles of reports.
· Ability to manage specific tasks to completion with minimal direction.
· The ideal candidate has been in a consulting role previously.
· Hands-on expertise in applied statistical techniques including multivariate regression, logistic regression, market-mix models, clustering, classification, survival, churn models, speech analytics, image analytics, etc.
· Ability to collaborate with onsite colleagues in the US & UK.
· Expert in handling large data and its cleansing & preparation for modelling.
· Very high attention to detail and quality.
About Us
At eClerx, we serve some of the largest global companies, including 50 of the Fortune 500, as clients. Our clients call upon us to solve their most complex problems and deliver transformative insights. Across roles and levels, you get the opportunity to build expertise, challenge the status quo, think bolder, and help our clients seize value.
About The Team
eClerx is a global leader in productized services, bringing together people, technology and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.
eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
Posted 1 week ago
5.0 years
15 Lacs
Cochin
On-site
Job Title: Database Lead (DB Lead)
Location: Kochi
Experience: 5+ Years
Compensation: 20–25% hike on current CTC
Employment Type: Full-Time
Roles & Responsibilities:
1. Hands-on experience in writing complex SQL queries, stored procedures, packages, functions, and leveraging SQL analytical functions.
2. Expertise with Microsoft SQL Server tools and services, particularly SSIS (ETL processes).
3. Troubleshoot and support existing Data Warehouse (DW) processes.
4. Perform production-level performance tuning for MS SQL databases.
5. Monitor and report on SQL environment performance and availability metrics; implement best practices for performance optimization.
6. Participate in SQL code reviews with application teams to enforce SQL coding standards.
7. Manage database backup and restore operations, including scheduled Disaster Recovery (DR) tests. Should be well-versed in clustering, replication, and MS SQL restoration techniques.
8. Exhibit strong communication and coordination skills, with the ability to work efficiently under pressure.
Desired Candidate Profile:
· Bachelor’s Degree in Engineering (B.Tech) or Master of Computer Applications (MCA).
· Minimum 5 years of relevant work experience in database development/administration.
· Professional certifications in Database Development or Management are highly preferred.
· Experience working in Agile/Scrum environments. Familiarity with JIRA is a plus.
Job Types: Full-time, Permanent
Pay: From ₹1,500,000.00 per year
Schedule: Day shift
Application Question(s):
· Do you have at least 5 years of hands-on experience with Microsoft SQL Server, including writing complex queries, stored procedures, and using SSIS (ETL processes)?
· Do you have experience with database backup/restoration, clustering, and Disaster Recovery (DR) testing in a production environment?
· Are you willing to work from Kochi and open to joining full-time with a 20–25% hike on your current CTC?
Work Location: In person
Posted 1 week ago
10.0 years
0 Lacs
Hyderābād
On-site
Overview: As Sales Sr. Mgr, own planogram delivery for AMESA Perfect Store and lead a team of POG analysts supporting the AMESA sector (Perfect Store + catman POG services). Ensure that exceptional leadership and operational direction is provided by his/her analyst team to AMESA sales employees across multiple teams and markets. Ensure that his/her Planogram Analysts deliver visually appealing planograms based on store clustering, space definitions and defined flow. Work closely with AMESA sector, BU & category management teams to ensure planograms meet approved parameters. Implement operational practices to ensure accurate and on-time delivery of planograms (i.e. ensuring all planograms meet assortment requirements, visual appeal, innovation opportunities and shelving metrics). Continuously identify opportunities and implement processes to improve service delivery (output quality and timeliness) and develop process efficiency through automation. Lead global stakeholder engagement and build trusted relationships to strengthen total team partnership. Demonstrate strong team and talent management practices, including hiring, staffing, performance management and career development for his/her team.
Responsibilities: Functional responsibilities - Execution (50%) + People Leadership (50%)
Execution responsibilities:
· Be a single point of contact for AMESA Perfect Store processes by mastering PEP process and category knowledge.
· Partner with Category Managers/KAMs to build business context and create effortless partnership to tailor deliverables according to market needs.
· Own accurate and on-time delivery of AMESA Perfect Store POG processes through effective project management, strong learnability and attention to detail.
· Drive continuous improvement through process streamlining/automation.
· Gain in-depth knowledge of PepsiCo business, categories, products and tools, and share new learnings with the AMESA POG team on a continual basis to enhance range and space deliverables for AMESA.
People leadership responsibilities:
· Head the AMESA DX POG team (Perfect Store + catman) and ensure efficient, effective and comprehensive support of the sales employees across multiple teams and markets.
· Work closely with AMESA sector, BU & category management teams to ensure planograms meet approved parameters.
· Implement planogram quality control practices ensuring all planograms meet assortment requirements, visual appeal, innovation opportunities and shelving metrics.
· Lead workload forecasting and effectively drive prioritization conversations to support capacity management.
· Implement operational controls to track and monitor progress and control risks.
· Maintain strong stakeholder engagement to elevate team collaboration, contribution and communication.
· Drive process efficiencies through process streamlining and/or automation.
· Build stronger business context and elevate the team’s capability from execution-focused to end-to-end capability-focused.
· Scale up operations in line with business growth, both within existing scope and in new areas of opportunity.
· Create an inclusive and collaborative environment.
· Partner with global teams to define strategy for end-to-end execution ownership and accountability.
· Responsible for hiring, talent assessment, competency development, performance management, productivity improvement, talent retention, career planning and development.
Qualifications:
· 10+ years of retail/merchandising experience (inclusive of JDA)
· Bachelor’s in commerce/business administration/marketing; Master’s degree is a plus
· Advanced-level skill in Microsoft Office, with demonstrated advanced Excel skills
· Experience with analyzing and reporting data to identify issues, trends, or exceptions to drive
· Advanced knowledge and experience of the space management technology platform JDA (5 years)
· Propensity to learn PepsiCo software systems and ability to provide superior customer service
· Best-in-class time management skills; ability to multitask, set priorities and plan
Posted 1 week ago
10.0 years
7 - 20 Lacs
India
On-site
About MostEdge
At MostEdge, we’re on a mission to accelerate commerce and build sustainable, trusted experiences. Our slogan — Protect Every Penny. Power Every Possibility. — reflects our commitment to operational excellence, data integrity, and real-time intelligence that help retailers run smarter, faster, and stronger. Our systems are mission-critical and designed for 99.99999% uptime, powering millions of transactions and inventory updates daily. We work at the intersection of AI, microservices, and retail commerce — and we win as a team.
Role Overview
We are looking for a Senior Database Administrator (DBA) to own the design, implementation, scaling, and performance of our data infrastructure. You will be responsible for mission-critical OLTP systems spanning MariaDB, MySQL, PostgreSQL, and MongoDB, deployed across AWS, GCP, and containerized Kubernetes clusters. This role plays a key part in ensuring data consistency, security, and speed across billions of rows and real-time operations.
Scope & Accountability
What You Will Own
· Manage and optimize multi-tenant, high-availability databases for real-time inventory, pricing, sales, and vendor data.
· Design and maintain scalable, partitioned database architectures across SQL and NoSQL systems.
· Monitor and tune query performance and ensure fast recovery, replication, and backup practices.
· Partner with developers, analysts, and DevOps teams on schema design, ETL pipelines, and microservices integration.
· Maintain security best practices, audit logging, encryption standards, and data retention compliance.
What Success Looks Like
· 99.99999% uptime maintained across all environments.
· <100ms query response times for large-scale datasets.
· Zero unplanned data loss or corruption incidents.
· Developer teams experience zero bottlenecks from DB-related delays.
Skills & Experience
Must-Have
· 10+ years of experience managing OLTP systems at scale.
· Strong hands-on experience with MySQL, MariaDB, PostgreSQL, and MongoDB.
· Proven expertise in replication, clustering, indexing, and sharding.
· Experience with Kubernetes-based deployments, Kafka queues, and Dockerized apps.
· Familiarity with AWS S3 storage, GCP services, and hybrid-cloud data replication.
· Experience in startup environments with fast-moving agile teams.
· Track record of creating clear documentation and managing tasks via JIRA.
Nice-to-Have
· Experience with AI/ML data pipelines, vector databases, or embedding stores.
· Exposure to infrastructure as code (e.g., Terraform, Helm).
· Familiarity with LangChain, FastAPI, or modern LLM-driven architectures.
How You Reflect Our Values
· Lead with Purpose: You enable smarter, faster systems that empower our retail customers.
· Build Trust: You create safe, accurate, and recoverable environments.
· Own the Outcome: You take responsibility for uptime, audits, and incident resolution.
· Win Together: You collaborate seamlessly across product, ops, and engineering.
· Keep It Simple: You design intuitive schemas, efficient queries, and clear alerts.
Why Join MostEdge?
· Work on high-impact systems powering real-time retail intelligence.
· Collaborate with a passionate, values-driven team across AI, engineering, and operations.
· Build at scale — with autonomy, ownership, and cutting-edge tech.
Job Types: Full-time, Permanent
Pay: ₹727,996.91 - ₹2,032,140.73 per year
Benefits: Health insurance, Life insurance, Paid sick time, Paid time off, Provident Fund
Schedule: Evening shift, Morning shift, US shift
Supplemental Pay: Performance bonus, Yearly bonus
Work Location: In person
Expected Start Date: 31/07/2025
Posted 1 week ago
8.0 years
3 - 8 Lacs
Gurgaon
On-site
Date: Jun 5, 2025
Job Requisition Id: 61535
Location: Gurgaon, IN
YASH Technologies is a leading technology integrator specializing in helping clients reimagine operating models, enhance competitiveness, optimize costs, foster exceptional stakeholder experiences, and drive business transformation. At YASH, we’re a cluster of the brightest stars working with cutting-edge technologies. Our purpose is anchored in a single truth – bringing real positive changes in an increasingly virtual world – and it drives us beyond generational gaps and disruptions of the future.
We are looking to hire Microsoft Fabric Professionals in the following areas:
Position: Data Analytics Lead
Experience: 8+ Years
Responsibilities:
· Build, manage, and foster a high-functioning team of data engineers and data analysts.
· Collaborate with business and technical teams to capture and prioritize platform ingestion requirements.
· Experience of working with the manufacturing industry in building a centralized data platform for self-service reporting.
· Lead the data analytics team members, providing guidance, mentorship, and support to ensure their professional growth and success.
· Responsible for managing customer, partner, and internal data on the cloud and on-premises.
· Evaluate and understand current data technologies and trends, and promote a culture of learning.
· Build an end-to-end data strategy, from collecting requirements from the business to modelling the data and building reports and dashboards.
Required Skills:
· Experience in data engineering and architecture, with a focus on developing scalable cloud solutions in Azure Synapse / Microsoft Fabric / Azure Databricks.
· Accountable for the data group’s activities, including architecting, developing, and maintaining a centralized data platform covering operational data, the data warehouse, the data lake, Data Factory pipelines, and data-related services.
· Experience in designing and building operationally efficient pipelines utilising core Azure components such as Azure Data Factory, Azure Databricks and PySpark.
· Strong understanding of data architecture, data modelling, and ETL processes.
· Proficiency in SQL and PySpark.
· Strong knowledge of building Power BI reports and dashboards.
· Excellent communication skills.
· Strong problem-solving and analytical skills.
Required Technical/Functional Competencies
Domain/Industry Knowledge: Basic knowledge of the customer's business processes and the relevant technology platform or product. Able to prepare process maps, workflows, business cases and simple business models in line with customer requirements with assistance from SMEs, and apply industry standards/practices in implementation with guidance from experienced team members.
Requirement Gathering and Analysis: Working knowledge of requirement management processes and requirement analysis processes, tools & methodologies. Able to analyse the impact of a change request/enhancement/defect fix and identify dependencies or interrelationships among requirements & transition requirements for the engagement.
Product/Technology Knowledge: Working knowledge of technology product/platform standards and specifications. Able to implement code or configure/customize products and provide inputs in design and architecture adhering to industry standards/practices in implementation. Analyze various frameworks/tools, review the code and provide feedback on improvement opportunities.
Architecture Tools and Frameworks: Working knowledge of architecture industry tools & frameworks. Able to identify pros/cons of available tools & frameworks in the market, use them as per customer requirements, and explore new tools/frameworks for implementation.
Architecture Concepts and Principles: Working knowledge of architectural elements, SDLC, methodologies.
Able to provide architectural design/documentation at an application or functional capability level, implement architectural patterns in solutions & engagements, and communicate architecture direction to the business.
Analytics Solution Design: Knowledge of statistical & machine learning techniques like classification, linear regression modelling, clustering & decision trees. Able to identify the cause of errors and their potential solutions.
Tools & Platform Knowledge: Familiar with a wide range of mainstream commercial & open-source data science/analytics software tools, their constraints, advantages, disadvantages, and areas of application.
Required Behavioral Competencies
· Accountability: Takes responsibility for and ensures accuracy of own work, as well as the work and deadlines of the team.
· Collaboration: Shares information within the team, participates in team activities, asks questions to understand other points of view.
· Agility: Demonstrates readiness for change, asking questions and determining how changes could impact own work.
· Customer Focus: Identifies trends and patterns emerging from customer preferences and works towards customizing/refining existing services to exceed customer needs and expectations.
· Communication: Targets communications for the appropriate audience, clearly articulating and presenting his/her position or decision.
· Drives Results: Sets realistic stretch goals for self & others to achieve and exceed defined goals/targets.
· Resolves Conflict: Displays sensitivity in interactions and strives to understand others’ views and concerns.
Certifications: Mandatory
At YASH, you are empowered to create a career that will take you where you want to go while working in an inclusive team environment. We leverage career-oriented skilling models and optimize our collective intelligence, aided with technology, for continuous learning, unlearning, and relearning at a rapid pace and scale.
Our Hyperlearning workplace is grounded upon four principles:
· Flexible work arrangements, free spirit, and emotional positivity
· Agile self-determination, trust, transparency, and open collaboration
· All support needed for the realization of business goals
· Stable employment with a great atmosphere and ethical corporate culture
Posted 1 week ago
3.0 years
1 - 6 Lacs
Noida
On-site
Level AI was founded in 2019 and is a Series C startup headquartered in Mountain View, California. Level AI revolutionizes customer engagement by transforming contact centers into strategic assets. Our AI-native platform leverages advanced technologies such as Large Language Models to extract deep insights from customer interactions. By providing actionable intelligence, Level AI empowers organizations to enhance customer experience and drive growth. Consistently updated with the latest AI innovations, Level AI stands as the most adaptive and forward-thinking solution in the industry. Empowering contact center stakeholders with real-time insights, our tech facilitates data-driven decision-making for contact centers, enhancing service levels and agent performance.
As a vital team member, you will work with cutting-edge technologies and play a high-impact role in shaping the future of AI-driven enterprise applications. You will work directly with people who've worked at Amazon, Facebook, Google, and other leading technology companies. With Level AI, you will get to have fun, learn new things, and grow along with us. Ready to redefine possibilities? Join us!
We'd love to explore more about you if you have:
· Qualification: B.E/B.Tech/M.E/M.Tech/PhD from a tier 1 engineering institute, with relevant work experience at a top technology company in computer science or mathematics-related fields and 3-5 years of experience in machine learning and NLP.
· Knowledge and practical experience in solving NLP problems in areas such as text classification, entity tagging, information retrieval, question-answering, natural language generation, clustering, etc.
· 3+ years of experience working with LLMs in large-scale environments.
· Expert knowledge of machine learning concepts and methods, especially those related to NLP, Generative AI, and working with LLMs.
· Knowledge and hands-on experience with Transformer-based language models like BERT, DeBERTa, Flan-T5, Mistral, Llama, etc.
· Deep familiarity with the internals of at least a few machine learning algorithms and concepts.
· Experience with deep learning frameworks like PyTorch and common machine learning libraries like scikit-learn, NumPy, pandas, NLTK, etc.
· Experience with ML model deployments using REST APIs, Docker, Kubernetes, etc.
· Knowledge of cloud platforms (AWS/Azure/GCP) and their machine learning services is desirable.
· Knowledge of basic data structures and algorithms.
· Knowledge of real-time streaming tools/architectures like Kafka and Pub/Sub is a plus.
Your role at Level AI includes, but is not limited to:
· Big picture: understand customers’ needs, innovate and use cutting-edge deep learning techniques to build data-driven solutions.
· Work on NLP problems across areas such as text classification, entity extraction, summarization, generative AI, and others.
· Collaborate with cross-functional teams to integrate/upgrade AI solutions into the company’s products and services.
· Optimize existing deep learning models for performance, scalability, and efficiency.
· Build, deploy, and own scalable production NLP pipelines.
· Build post-deployment monitoring and continual learning capabilities.
· Propose suitable evaluation metrics and establish benchmarks.
· Keep abreast of SOTA techniques in your area and exchange knowledge with colleagues.
· Show a desire to learn, implement and work with the latest emerging model architectures, training and inference techniques, data curation pipelines, etc.
To learn more visit: https://thelevel.ai/
Funding: https://www.crunchbase.com/organization/level-ai
LinkedIn: https://www.linkedin.com/company/level-ai/
Posted 1 week ago