
1498 Clustering Jobs - Page 4

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a growing internal community and are committed to creating a workplace that looks like the world that we serve.

Pay and Benefits:
- Competitive compensation, including base pay and annual incentive
- Comprehensive health and life insurance and well-being benefits, based on location
- Pension / retirement benefits
- Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being
- DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee)

The Impact you will have in this role:
The Database Administrator will provide database administrative support for all DTCC environments, including Development, QA, client test, and our critical high-availability production environment and DR data centers. The role requires extensive knowledge of all aspects of MSSQL database administration and the ability to support other database platforms, including both Aurora PostgreSQL and Oracle. This DBA will have a high level of impact in the generation of new processes and solutions, while operating under established procedures and processes in a critically important financial services infrastructure environment. The ideal candidate will ensure optimal performance, data security, and reliability of our database infrastructure.

What You'll Do:
- Install, configure, and maintain Oracle server instances
- Implement and manage high-availability solutions, including Always On availability groups and clustering
- Support development, QA, PSE, and production environments using the ServiceNow ticketing system
- Review production performance reports for variances from normal operation
- Optimize SQL queries and indexes for better efficiency
- Analyze queries and recommend tuning strategies
- Maintain database performance by calculating optimum values for database parameters, implementing new releases, completing maintenance requirements, and evaluating operating systems and hardware products
- Execute the database backup and recovery strategy using tools such as SQL Server backup, log shipping, and other technologies
- Provide third-level support for DTCC's critical production environments
- Participate in root cause analysis for database issues
- Prepare users by conducting training, providing information, and resolving problems
- Maintain quality service by establishing and enforcing organizational standards
- Set up and maintain database replication and clustering solutions
- Maintain professional and technical knowledge by attending educational workshops, reviewing professional publications, establishing personal networks, benchmarking innovative practices, and participating in professional societies
- Share responsibility for off-hours support
- Maintain documentation on database configurations and procedures
- Provide leadership and direction for the architecture, design, maintenance, and L1, L2, and L3 support of a 24x7 global infrastructure

Qualifications:
- Bachelor's degree or equivalent experience

Talents Needed for Success:
- Strong Oracle experience with 19c, 21c, and 22c
- A minimum of 4 years of proven relevant experience in Oracle
- Solid experience in Oracle database administration
- Strong knowledge of Python and Angular
- Working knowledge of Oracle's GoldenGate replication technology
- Strong performance tuning and optimization skills in MSSQL, PostgreSQL, and Oracle databases
- Good experience with high-availability and disaster-recovery (HA/DR) options for SQL Server
- Good experience with backup and restore processes
- Proficiency in PowerShell scripting for automation
- Good interpersonal skills and the ability to coordinate with various stakeholders
- Adherence to standard processes for organizational change, incident management, and problem management
- Demonstrated ability to solve complex systems and database environment issues

Actual salary is determined based on the role, location, individual experience, skills, and other considerations. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
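
The backup and scripting duties above pair naturally with a scheduled backup job. Below is a minimal sketch, assuming a Python/pyodbc environment; the server name, database, and backup path are hypothetical placeholders, not DTCC specifics.

```python
import datetime
import pyodbc  # pip install pyodbc

SERVER = "sqlprod01"        # hypothetical server
DATABASE = "AppDB"          # hypothetical database
BACKUP_DIR = r"E:\backups"  # hypothetical backup location

# BACKUP DATABASE cannot run inside a transaction, so enable autocommit.
conn = pyodbc.connect(
    f"DRIVER={{ODBC Driver 17 for SQL Server}};SERVER={SERVER};"
    "Trusted_Connection=yes;", autocommit=True)

stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
backup_file = rf"{BACKUP_DIR}\{DATABASE}_{stamp}.bak"

# COMPRESSION and CHECKSUM are common hardening options for production backups.
conn.cursor().execute(
    f"BACKUP DATABASE [{DATABASE}] TO DISK = N'{backup_file}' "
    "WITH COMPRESSION, CHECKSUM, STATS = 10")
print(f"Backup written to {backup_file}")
```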

Posted 2 days ago

Apply

5.0 years

8 - 45 Lacs

Hyderabad, Telangana, India

On-site


Industry & Sector: Enterprise IT Infrastructure & Cloud Services in India. A fast-growing managed services provider delivers secure, high-availability virtualization platforms for Fortune 500 and digital-native businesses.

About The Opportunity
As a VMware Platform Engineer, you will design, deploy, and operate mission-critical virtualization estates on-site at our client facilities, ensuring performance, security, and scalability across private and hybrid clouds.

Role & Responsibilities
- Engineer and harden vSphere, ESXi, and vSAN clusters to deliver 99.99% uptime.
- Automate build, patching, and configuration tasks using PowerCLI, Ansible, and REST APIs.
- Monitor capacity, performance, and logs via vRealize Operations and generate improvement plans.
- Lead migrations, upgrades, and disaster-recovery drills, documenting runbooks and rollback paths.
- Collaborate with network, storage, and security teams to enforce compliance and zero-trust policies.
- Provide L3 support, root-cause analysis, and mentoring to junior administrators.

Skills & Qualifications
Must-Have
- 5+ years hands-on with VMware vSphere 6.x/7.x in production.
- Expertise in ESXi host deployment, clustering, vMotion, DRS, and HA.
- Strong scripting with PowerCLI or Python for automation.
- Solid grasp of Linux server administration and TCP/IP networking.
- Experience with backup, replication, and DR tooling (Veeam, SRM, etc.).
Preferred
- Exposure to vRealize Suite, NSX-T, or vCloud Director.
- Knowledge of container platforms (Tanzu, Kubernetes) and CI/CD pipelines.
- VMware Certified Professional (VCP-DCV) or higher.

Benefits & Culture
- On-site, enterprise-scale environments offering complex engineering challenges.
- Continuous learning budget for VMware and cloud certifications.
- Collaborative, performance-driven culture with clear growth paths.

Workplace Type: On-Site | Location: India

Skills: automation, REST APIs, VMware, VMware vSphere, Ansible, backup and replication tools (Veeam, SRM), vSAN, Linux server administration, disaster recovery, PowerCLI, TCP/IP networking, scripting, vRealize Operations, VMware vSphere 6.x/7.x, platform engineers (VMware), ESXi
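
The automation bullet above names PowerCLI and Python; as a rough illustration of the Python side, here is a minimal pyVmomi sketch that connects to vCenter and reports each ESXi host's state. The hostname and credentials are hypothetical, and certificate verification is disabled only to keep the example short.

```python
import ssl
from pyVim.connect import SmartConnect, Disconnect  # pip install pyvmomi
from pyVmomi import vim

# Hypothetical vCenter endpoint and credentials.
ctx = ssl._create_unverified_context()  # lab-only: skips cert validation
si = SmartConnect(host="vcenter.example.com", user="automation@vsphere.local",
                  pwd="secret", sslContext=ctx)
try:
    content = si.RetrieveContent()
    # Walk every ESXi host in the inventory.
    view = content.viewManager.CreateContainerView(
        content.rootFolder, [vim.HostSystem], True)
    for host in view.view:
        print(host.name, host.runtime.connectionState, host.runtime.powerState)
    view.Destroy()
finally:
    Disconnect(si)
```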

Posted 2 days ago

Apply

6.0 years

0 Lacs

Kochi, Kerala, India

On-site


▪ Work closely with internal BUs and business partners (clients) to understand their business problems and translate them into data science problems
▪ Design intelligent data science solutions that deliver incremental value to the end stakeholders
▪ Work closely with the data engineering team in identifying relevant data and pre-processing the data for suitable models
▪ Develop the designed solutions into statistical machine learning models and AI models using suitable tools and frameworks
▪ Work closely with the business intelligence team to build BI systems and visualizations that deliver the insights of the underlying data science model in the most intuitive ways possible
▪ Work closely with the application team to deliver AI/ML solutions as modular offerings

Skills/Specification
▪ Master's/Bachelor's in Computer Science, Statistics, or Economics
▪ At least 6 years of experience in the data science field, with a passion for numbers and quantitative problems
▪ Deep understanding of machine learning models and algorithms
▪ Experience analysing complex business problems, translating them into data science problems, and modelling data science solutions for the same
▪ Understanding of and experience in one or more of the following machine learning algorithms: Regression, Time Series, Logistic Regression, Naive Bayes, kNN, SVM, Decision Trees, Random Forest, k-Means Clustering, etc.; NLP, Text Mining; LLMs (GPTs) - OpenAI, Azure OpenAI, AWS Bedrock, Gemini, Llama, DeepSeek, etc. (knowledge of fine-tuning / custom-training GPTs would be an added advantage); deep learning, reinforcement learning algorithms
▪ Understanding of and experience in one or more of the machine learning frameworks: TensorFlow, Caffe, Torch, etc.
▪ Understanding of and experience building machine learning models using various packages in Python
▪ Knowledge of and experience with SQL, relational databases, NoSQL databases, and data warehouse concepts
▪ Understanding of AWS/Azure cloud architecture
▪ Understanding of the deployment architectures of AI/ML models (Flask, Azure Functions, AWS Lambda)
▪ Knowledge of any BI and visualization tools is an add-on (Tableau/Power BI/Qlik/Plotly, etc.)
▪ Adhere to the Information Security Management policies and procedures

Soft Skills Required
▪ Must be a good team player with good communication skills
▪ Must have good presentation skills
▪ Must be a proactive problem solver and a self-driven leader
▪ Manage and nurture a team of data scientists
▪ An affinity for numbers and patterns
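
Since the algorithm list above includes k-Means clustering, here is a minimal scikit-learn sketch of the technique on synthetic data; the feature matrix and cluster count are illustrative assumptions, not part of the listing.

```python
from sklearn.cluster import KMeans
from sklearn.datasets import make_blobs
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for real customer/transaction features.
X, _ = make_blobs(n_samples=500, centers=4, n_features=6, random_state=42)
X = StandardScaler().fit_transform(X)  # k-means is distance-based, so scale first

km = KMeans(n_clusters=4, n_init=10, random_state=42).fit(X)
print("inertia:", km.inertia_)
print("silhouette:", silhouette_score(X, km.labels_))  # in [-1, 1]; higher is better
```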

Posted 2 days ago

Apply

5.0 years

0 Lacs

Pune, Maharashtra, India

On-site


About Arctera
Arctera keeps the world’s IT systems working. We can trust that our credit cards will work at the store, that power will be routed to our homes, and that factories will produce our medications because those companies themselves trust Arctera. Arctera is behind the scenes making sure that many of the biggest organizations in the world – and many of the smallest too – can face down ransomware attacks, natural disasters, and compliance challenges without missing a beat. We do this through the power of data and our flagship products, Insight, InfoScale and Backup Exec. Illuminating data also helps our customers maintain personal privacy, reduce the environmental impact of data storage, and defend against illegal or immoral use of information. It’s a task that continues to get more complex as data volumes surge. Every day, the world produces more data than it ever has before. And global digital transformation – and the arrival of the age of AI – has set the course for a new explosion in data creation. Joining the Arctera team, you’ll be part of a group innovating to harness the opportunity of the latest technologies to protect the world’s critical infrastructure and to keep all our data safe.

This position is with the InfoScale (Data Resiliency) offering of Arctera, a software-defined storage and availability solution that helps organizations manage information resiliency and protection across physical, virtual, and cloud environments. It provides high availability and disaster recovery for mission-critical applications.

Responsibilities
We are looking for candidates who have experience with storage and cloud technology for data resiliency solutions. You should also have an eye for great design and a knack for pushing projects from conception all the way to customers. In this role, you will design and develop data protection solutions using the latest technologies. You will own product quality and the overall customer experience. You will also propose technical solutions to product/service problems while refining, designing, and implementing software components in line with technical requirements. The Sr. Software Engineer will work productively in a highly collaborative agile team, coach junior team members, and actively participate in knowledge sharing, all while communicating across teams in a multinational environment.

Minimum Required Skills
- MS/BS in Computer Science/Computer Engineering or a related field of study with 5+ years of relevant experience
- Full understanding of storage and cloud technologies, emerging standards, and engineering best practices
- Strong communication skills, both oral and written
- Hands-on experience developing enterprise products with any of C/C++/Python/Go and RESTful APIs
- Hands-on experience developing Kubernetes custom controllers/operators and working knowledge of k8s orchestration platforms - OpenShift/EKS/AKS
- Designs, develops, and maintains high-quality code for product components, focusing on implementation
- Solid knowledge of algorithms and design patterns
- Solid knowledge of clustering (HA/DR) concepts and systems programming
- A strong focus on knowledge and application of industry-standard SDLC processes, including design, coding, debugging, and testing practices for large enterprise-grade products, is an absolute must

Desired Skills
- Knowledge of operating systems (Linux/UNIX), object-oriented languages, and Agile processes
- Experience in CNI/CSI plugin/driver development
- Experience with DevOps and tools related to container technology (Prometheus, EFK, Helm, Red Hat Registry, Tiller, etc.)
- Experience in Agile development methodologies, including unit testing and TDD (test-driven development)
- Extra credit for open-source contributions: active participation in CNCF SIGs, upstream contributions to K8s
- Ability to communicate and collaborate among cross-functional teams in a multinational environment
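
For the Kubernetes custom controller/operator requirement above, a controller's core is a watch-and-reconcile loop. Below is a minimal sketch using the official Python client that watches Pod events; a production operator would stream a CustomResourceDefinition instead, and a reachable kubeconfig is assumed.

```python
from kubernetes import client, config, watch  # pip install kubernetes

# Assumes a cluster context configured in ~/.kube/config.
config.load_kube_config()
v1 = client.CoreV1Api()

# Skeleton of a controller's event loop: observe objects, then "reconcile".
w = watch.Watch()
for event in w.stream(v1.list_pod_for_all_namespaces, timeout_seconds=30):
    pod = event["object"]
    print(f"{event['type']:>8} {pod.metadata.namespace}/{pod.metadata.name} "
          f"phase={pod.status.phase}")
    # reconcile(pod) would go here: compare observed vs desired state and act.
```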

Posted 2 days ago

Apply

12.0 years

0 Lacs

Pune, Maharashtra, India

Remote


TCS has been a great pioneer in feeding the fire of young techies like you. We are a global leader in the technology arena, and there’s nothing that can stop us from growing together.

What we are looking for
Role: MQ Admin
Experience Range: 8 – 12 Years
Location: Pune/Bengaluru

Must Have:
1) Administer WebSphere MQ v7.x, 8.x, 9.x.
2) Build non-prod and prod queue managers as per client requirements.
3) Ability to run MQSC commands and perform remote MQ administration.
4) Very good knowledge of distributed queuing and clustering.
5) Good knowledge of IBM MQ utilities such as qload, runmqdlq, and saveqmgr, and tools such as MQ Explorer and RFHUtil.
6) Support clients in a 24x7 model.
7) Hands-on knowledge of Linux and Solaris is expected.
8) Knowledge of handling ITIL components such as IM, PM, CM.
9) Knowledge of SSL certificates is a must.
10) Knowledge of MQ migrations and fix pack installation.
11) Hands-on knowledge of MQ client and server architectures.
12) Ability to support application teams with their testing and deployments.

Good to Have:
1) Good communication skills to talk to users, understand their requirements, and provide solutions.
2) Work experience in MQ administration: defining queue managers and objects, and troubleshooting all MQ issues.
3) Work experience with several MQ tools, such as the IBM iKeyman tool, qload, MO71, RFHUtil, and MQ Explorer.
4) Strong decision-making and problem-solving skills.
5) Knowledge of Unix/Perl scripting will be considered an advantage.
6) Knowledge of networking, firewalls, and Unix-based OSes; good production support experience.
7) Knowledge of PUB/SUB, HA, clustering, OpenShift, and MQ clients on Fabric.
8) Flexibility to work shifts and provide coverage on weekends.
9) Financial domain knowledge.

Essential:
- L2 support activities on IBM MQ
- Implementing client requests and migrating to the latest environments
- Working closely with L3 to implement their ideas and tasks
- Maintaining prod environments with the latest iFixes and fix packs

Minimum Qualification:
- 15 years of full-time education
- Minimum of 50% in 10th, 12th, UG & PG (if applicable)
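
As a rough illustration of scripted MQSC administration (item 3 in the must-have list), the sketch below pipes MQSC commands into runmqsc from Python; the queue manager and queue names are hypothetical.

```python
import subprocess

QMGR = "QM1"  # hypothetical queue manager

def run_mqsc(commands: str) -> str:
    """Pipe MQSC commands into runmqsc, as an MQ admin would at a shell."""
    result = subprocess.run(
        ["runmqsc", QMGR], input=commands,
        capture_output=True, text=True, check=False)
    return result.stdout

# Define a local queue, then inspect queue depth and channel status.
print(run_mqsc("DEFINE QLOCAL('APP.REQUEST') MAXDEPTH(5000) REPLACE\n"))
print(run_mqsc("DISPLAY QLOCAL('APP.REQUEST') CURDEPTH MAXDEPTH\n"))
print(run_mqsc("DISPLAY CHSTATUS(*)\n"))
```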

Posted 2 days ago

Apply

4.0 years

0 Lacs

Hyderabad, Telangana, India

Remote


Are you ready to make an impact at DTCC? Do you want to work on innovative projects, collaborate with a dynamic and supportive team, and receive investment in your professional development? At DTCC, we are at the forefront of innovation in the financial markets. We are committed to helping our employees grow and succeed. We believe that you have the skills and drive to make a real impact. We foster a growing internal community and are committed to creating a workplace that looks like the world that we serve.

Pay and Benefits:
- Competitive compensation, including base pay and annual incentive
- Comprehensive health and life insurance and well-being benefits, based on location
- Pension / retirement benefits
- Paid Time Off and Personal/Family Care, and other leaves of absence when needed to support your physical, financial, and emotional well-being
- DTCC offers a flexible/hybrid model of 3 days onsite and 2 days remote (onsite Tuesdays, Wednesdays, and a third day unique to each team or employee)

The Impact you will have in this role:
The Database Administrator will provide database administrative support for all DTCC environments, including Development, QA, client test, and our critical high-availability production environment and DR data centers. The role requires extensive knowledge of all aspects of MSSQL database administration and the ability to support other database platforms, including both Aurora PostgreSQL and Oracle. This DBA will have a high level of impact in the generation of new processes and solutions, while operating under established procedures and processes in a critically important financial services infrastructure environment. The ideal candidate will ensure optimal performance, data security, and reliability of our database infrastructure.

What You'll Do:
- Install, configure, and maintain SQL Server instances (on-premises and cloud-based)
- Implement and manage high-availability solutions, including Always On availability groups and clustering
- Support development, QA, PSE, and production environments using the ServiceNow ticketing system
- Review production performance reports for variances from normal operation
- Optimize SQL queries and indexes for better efficiency
- Analyze queries and recommend tuning strategies
- Maintain database performance by calculating optimum values for database parameters, implementing new releases, completing maintenance requirements, and evaluating operating systems and hardware products
- Execute the database backup and recovery strategy using tools such as SQL Server backup, log shipping, and other technologies
- Provide third-level support for DTCC's critical production environments
- Participate in root cause analysis for database issues
- Prepare users by conducting training, providing information, and resolving problems
- Maintain quality service by establishing and enforcing organizational standards
- Set up and maintain database replication and clustering solutions
- Maintain professional and technical knowledge by attending educational workshops, reviewing professional publications, establishing personal networks, benchmarking innovative practices, and participating in professional societies
- Share responsibility for off-hours support
- Maintain documentation on database configurations and procedures
- Provide leadership and direction for the architecture, design, maintenance, and L1, L2, and L3 support of a 24x7 global infrastructure

Qualifications:
- Bachelor's degree or equivalent experience

Talents Needed for Success:
- Strong Oracle experience with 19c, 21c, and 22c
- A minimum of 4 years of proven relevant experience in SQL
- Solid understanding of MSSQL Server and Aurora Postgres databases
- Strong knowledge of Python and Angular
- Working knowledge of Oracle's GoldenGate replication technology
- Strong performance tuning and optimization skills in MSSQL, PostgreSQL, and Oracle databases
- Good experience with high-availability and disaster-recovery (HA/DR) options for SQL Server
- Good experience with backup and restore processes
- Proficiency in PowerShell scripting for automation
- Good interpersonal skills and the ability to coordinate with various stakeholders
- Adherence to standard processes for organizational change, incident management, and problem management
- Demonstrated ability to solve complex systems and database environment issues

Actual salary is determined based on the role, location, individual experience, skills, and other considerations. We are an equal opportunity employer and value diversity at our company. We do not discriminate on the basis of race, religion, color, national origin, sex, gender, gender expression, sexual orientation, age, marital status, veteran status, or disability status. We will ensure that individuals with disabilities are provided reasonable accommodation to participate in the job application or interview process, to perform crucial job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
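
Because this posting leans on Always On availability groups, here is a minimal monitoring sketch querying SQL Server's standard HADR catalog views via pyodbc; the listener name and connection details are hypothetical.

```python
import pyodbc  # pip install pyodbc

# Hypothetical AG listener; integrated auth for brevity.
conn = pyodbc.connect(
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=aglistener01;Trusted_Connection=yes;")

# Role and synchronization health per replica, per availability group.
sql = """
SELECT ag.name AS ag_name,
       ar.replica_server_name,
       rs.role_desc,
       rs.synchronization_health_desc
FROM sys.dm_hadr_availability_replica_states AS rs
JOIN sys.availability_replicas AS ar ON rs.replica_id = ar.replica_id
JOIN sys.availability_groups  AS ag ON rs.group_id   = ag.group_id;
"""
for row in conn.cursor().execute(sql):
    print(row.ag_name, row.replica_server_name, row.role_desc,
          row.synchronization_health_desc)
```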

Posted 2 days ago

Apply

0 years

0 Lacs

India

Remote


Data Science Intern (Remote | 3 Months)
Company: INLIGHN TECH
Location: Remote
Duration: 3 Months
Stipend (Top Performers): ₹15,000
Perks: Certificate | Letter of Recommendation | Hands-on Training

About INLIGHN TECH
INLIGHN TECH empowers students and recent graduates through hands-on, project-based internships. Our Data Science Internship is designed to enhance your technical skills while solving real-world data challenges, equipping you for the industry.

Role Overview
As a Data Science Intern, you’ll dive deep into real datasets, apply machine learning techniques, and generate insights that support informed decision-making. This internship provides the perfect launchpad for aspiring data professionals.

Key Responsibilities
- Collect, clean, and preprocess data for analysis
- Apply statistical and machine learning techniques
- Build models for classification, regression, and clustering tasks
- Develop dashboards and visualizations using Python or Power BI
- Present actionable insights to internal stakeholders
- Collaborate with a team of peers on live data projects

Requirements
- Currently pursuing or recently completed a degree in Computer Science, Data Science, Mathematics, or a related field
- Solid understanding of Python and key libraries: Pandas, NumPy, Scikit-learn
- Familiarity with machine learning algorithms and SQL
- Strong analytical and problem-solving abilities
- Eagerness to learn and grow in a fast-paced environment

What You’ll Gain
- Real-world experience with industry-standard tools and datasets
- Internship Completion Certificate
- Letter of Recommendation for outstanding contributors
- Potential for full-time opportunities
- A portfolio of completed data science projects
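
The first responsibility above, collect/clean/preprocess, usually reduces to a few pandas idioms. A minimal sketch on a hypothetical CSV (file and column names are illustrative):

```python
import pandas as pd

# Hypothetical raw export; column names are illustrative.
df = pd.read_csv("signups_raw.csv")

df = df.drop_duplicates()                          # remove exact duplicate rows
df["signup_date"] = pd.to_datetime(df["signup_date"], errors="coerce")
df["age"] = df["age"].fillna(df["age"].median())   # impute missing ages
df = df[df["age"].between(13, 100)]                # drop implausible values
df["plan"] = df["plan"].str.strip().str.lower()    # normalize categories

print(df.describe(include="all"))
df.to_parquet("signups_clean.parquet")             # hand off for modeling
```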

Posted 2 days ago

Apply

0 years

0 Lacs

Bangalore Urban, Karnataka, India

On-site


Cloud Engineering
Experience: 8-15 years
Location: Bangalore
Notice Period: 30 days

Job Profile: Cloud Engineer - Clustering & Pacemaker Expertise

Position Overview:
We are seeking a highly skilled Cloud Engineer with specialized expertise in clustering technologies, particularly Pacemaker, along with strong knowledge of hyperscaler platforms (Azure, AWS, GCP, IBM Cloud). The ideal candidate will have deep experience working with Red Hat Enterprise Linux (RHEL) and SUSE Linux Enterprise Server (SLES) and the ability to design and implement high-availability solutions using Pacemaker clustering across multiple OS environments. This role is integral to ensuring the stability and availability of cloud-hosted applications, particularly for SAP workloads, and includes leading proof-of-concept (PoC) projects, as well as collaborating closely with development teams to prepare requirements and documentation.

Technical Skills:
- In-depth knowledge and hands-on experience with Pacemaker clustering, including resource agents, fencing, and quorum management.
- Strong understanding of cloud platforms (AWS, Azure, GCP, IBM Cloud), with experience implementing HA solutions.
- Experience deploying and managing SAP systems (S/4HANA, NetWeaver, HANA) in a high-availability clustered environment.
- Familiarity with databases such as HANA and DB2 (optional) in a clustered, HA setup.
- A plus (optional): experience with automation tools (e.g., Ansible).

Soft Skills:
- Excellent problem-solving and troubleshooting skills with a focus on complex high-availability configurations.
- Strong ability to collaborate with cross-functional teams, providing technical leadership in clustered and high-availability solution design.
- Fluent communication skills (verbal and written) in business English, with experience preparing and presenting technical documentation and reports.

Preferred Skills:
- Advanced certifications in Pacemaker, Red Hat, SLES, and SAP systems.
- Experience designing and implementing multi-cloud high-availability architectures.

Experience:
- Ideally multiple years of experience in clustering and high-availability solutions, with a deep focus on Pacemaker and associated technologies.
- Hands-on experience in cloud environments (AWS, Azure, GCP, IBM Cloud) with a focus on HA architecture and cloud-native services.
- Extensive experience with Linux-based operating systems (RHEL, SLES), including basic knowledge of system administration and OS tuning in clustered environments.

Key Responsibilities:
1. Clustering & High Availability Expertise:
- Design and implement Pacemaker clustering solutions across multiple operating systems (RHEL, SLES) and platforms to ensure high availability and fault tolerance of critical applications, databases, and services.
- Contribute to clustering architecture decisions, ensuring optimal scalability and reliability of business-critical workloads.
- Troubleshoot and resolve complex issues related to Pacemaker clusters, providing expert guidance on configuration, failover testing, and cluster architecture.
2. Hyperscaler Expertise:
- Utilize extensive knowledge of hyperscaler platforms (Azure, AWS, GCP, IBM Cloud) to design, deploy, and maintain cloud high-availability solutions.
- Implement and manage cloud-based high-availability (HA) solutions in multi-cloud environments, leveraging cloud-native services alongside Pacemaker clustering for mission-critical applications.
3. Proof of Concepts & Architecture Documentation:
- Lead and support proof-of-concept (PoC) exercises to evaluate new clustering architectures and solutions, documenting the setup, configurations, and lessons learned.
- Produce detailed technical documentation covering clustering configurations, architecture diagrams, and deployment processes, ensuring knowledge transfer, reproducibility, and further automation.
4. SAP Solution Basis & Integration:
- Leverage expertise in SAP Solution Basis, specifically SAP S/4HANA, NetWeaver, and HANA databases, to implement and optimize clustering solutions for SAP workloads in the cloud.
- Ensure that SAP systems are properly integrated with Pacemaker clustering to meet high-availability requirements.
5. Additional Database Knowledge (Optional):
- Support DB2 and other database solutions in the context of clustering and high availability to ensure optimal database performance and uptime.
- Collaborate with database teams to implement clustered databases that are tightly integrated with the cloud infrastructure and clustering technologies.
6. Collaboration & Requirement Preparation:
- Work closely with development and business teams to define infrastructure requirements for clustered solutions.
- Participate in detailed requirement analysis sessions with internal and external stakeholders to ensure clustering and HA solutions meet SAP ECS needs and requirements.
7. Communication & Reporting:
- Communicate technical concepts clearly, in fluent business English, to both technical and non-technical stakeholders.
- Prepare and deliver technical presentations, status reports, and documentation related to clustering solutions and cloud architectures.

Qualifications:
- Education: Bachelor’s degree in Computer Science, Information Technology, or a related field. Advanced certifications in cloud platforms (Azure, AWS, GCP) and clustering technologies (Pacemaker) are a plus.
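
To make the Pacemaker vocabulary above concrete (resources, fencing, quorum), here is a minimal sketch that drives pcs from Python. The node names, virtual IP, and Azure fence-agent parameters are hypothetical placeholders, not a production recipe; real fence agents need credentials and platform-specific options.

```python
import subprocess

def pcs(*args: str) -> None:
    """Run a pcs command, echoing it first so the steps are auditable."""
    print("pcs", *args)
    subprocess.run(["pcs", *args], check=True)

# Floating virtual IP managed by the IPaddr2 resource agent.
pcs("resource", "create", "vip", "ocf:heartbeat:IPaddr2",
    "ip=10.0.0.50", "cidr_netmask=24", "op", "monitor", "interval=30s")

# Fencing (STONITH) so a failed node is isolated before failover.
pcs("stonith", "create", "fence_node1", "fence_azure_arm",
    "resourceGroup=rg-ha", "pcmk_host_list=node1,node2")
pcs("property", "set", "stonith-enabled=true")

# Two-node clusters cannot form a majority quorum; adjust the policy accordingly.
pcs("property", "set", "no-quorum-policy=ignore")
```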

Posted 2 days ago

Apply

9.0 years

0 Lacs

Kerala, India

Remote


Position: AI Architect - PERMANENT only
Experience: 9+ years (relevant 8 years is a must)
Budget: Up to ₹40–45 LPA
Notice Period: Immediate to 45 days
Key Skills: Python, Data Science (AI/ML), SQL
Location: TVM/Kochi/remote

Job Purpose
Responsible for consulting with clients to understand their AI/ML and analytics needs and delivering AI/ML applications to them.

Job Description / Duties & Responsibilities
▪ Work closely with internal BUs and business partners (clients) to understand their business problems and translate them into data science problems
▪ Design intelligent data science solutions that deliver incremental value to the end stakeholders
▪ Work closely with the data engineering team in identifying relevant data and pre-processing the data for suitable models
▪ Develop the designed solutions into statistical machine learning models and AI models using suitable tools and frameworks
▪ Work closely with the business intelligence team to build BI systems and visualizations that deliver the insights of the underlying data science model in the most intuitive ways possible
▪ Work closely with the application team to deliver AI/ML solutions as microservices

Job Specification / Skills and Competencies
▪ Master's/Bachelor's in Computer Science, Statistics, or Economics
▪ At least 6 years of experience in the data science field, with a passion for numbers and quantitative problems
▪ Deep understanding of machine learning models and algorithms
▪ Experience analysing complex business problems, translating them into data science problems, and modelling data science solutions for the same
▪ Understanding of and experience in one or more of the following machine learning algorithms: Regression, Time Series
▪ Logistic Regression, Naive Bayes, kNN, SVM, Decision Trees, Random Forest, k-Means Clustering, etc.
▪ NLP, Text Mining, LLMs (GPTs)
▪ Deep learning, reinforcement learning algorithms
▪ Understanding of and experience in one or more of the machine learning frameworks: TensorFlow, Caffe, Torch, etc.
▪ Understanding of and experience building machine learning models using various packages in one or more of the programming languages: Python / R
▪ Knowledge of and experience with SQL, relational databases, NoSQL databases, and data warehouse concepts
▪ Understanding of AWS/Azure cloud architecture
▪ Understanding of the deployment architectures of AI/ML models (Flask, Azure Functions, AWS Lambda)
▪ Knowledge of any BI and visualization tools is an add-on (Tableau/Power BI/Qlik/Plotly, etc.)
▪ Adhere to the Information Security Management policies and procedures

Soft Skills Required
▪ Must be a good team player with good communication skills
▪ Must have good presentation skills
▪ Must be a proactive problem solver and a self-driven leader
▪ Manage and nurture a team of data scientists
▪ An affinity for numbers and patterns
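
The deployment bullet above names Flask as one serving option; here is a minimal, hypothetical sketch of wrapping a pickled scikit-learn model behind a JSON endpoint. The model path and feature payload shape are assumptions.

```python
import pickle
from flask import Flask, jsonify, request  # pip install flask

app = Flask(__name__)
with open("model.pkl", "rb") as f:  # hypothetical pre-trained sklearn model
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    payload = request.get_json(force=True)
    features = payload["features"]        # e.g. [[5.1, 3.5, 1.4, 0.2]]
    preds = model.predict(features).tolist()
    return jsonify({"predictions": preds})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)    # dev server; use gunicorn in prod
```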

Posted 2 days ago

Apply

6.0 years

0 Lacs

Vadodara, Gujarat, India

On-site


Skills
We are looking for a candidate with experience managing and maintaining our organization's database systems, ensuring their optimal performance, security, and reliability. Key responsibilities include database deployment and management, backup and disaster recovery planning, performance tuning, and collaborating with developers to design efficient database structures. Proficiency in SQL and experience with major database management systems like Oracle, SQL Server, or MySQL are expected; knowledge of cloud platforms such as AWS or Azure will be an added advantage.

Job Location: Vadodara
Office Hours: 09:30 am to 7 pm
Experience: 6+ Years

Roles and Responsibilities:
- Design, implement, and maintain database systems.
- Optimize and tune database performance.
- Develop database schemas, tables, and other objects.
- Perform database backups and restores.
- Implement data replication and clustering for high availability.
- Monitor database performance and suggest improvements.
- Implement database security measures, including user roles, permissions, and encryption.
- Ensure compliance with data privacy regulations and standards.
- Perform regular audits and maintain security logs.
- Diagnose and resolve database issues, such as performance degradation or connectivity problems.
- Provide support for database-related queries and troubleshooting.
- Apply patches, updates, and upgrades to database systems.
- Conduct database health checks and routine maintenance to ensure peak performance.
- Coordinate with developers and system administrators on database-related issues.
- Implement and test disaster recovery and backup strategies.
- Ensure minimal downtime during system upgrades and maintenance.
- Work closely with application developers to optimize database-related queries and code.
- Document database structures, procedures, and policies for team members and future reference.

Requirements
Education/Qualification: A bachelor's degree in IT, computer science, or a related field.
- Proven experience as a DBA or in a similar database management role.
- Strong knowledge of database management systems (e.g., SQL Server, Oracle, MySQL, PostgreSQL).
- Experience with performance tuning, database security, and backup strategies.
- Familiarity with cloud databases (e.g., AWS RDS, Azure SQL Database) is a plus.
- Strong SQL and database scripting skills.
- Proficiency in database administration tasks such as installation, backup, recovery, performance tuning, and user management.
- Experience with database monitoring tools and utilities.
- Ability to troubleshoot and resolve database-related issues effectively.
- Knowledge of database replication, clustering, and high availability setups.
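
Replication monitoring, one of the responsibilities above, is easy to script. Here is a minimal PostgreSQL example via psycopg2 that reports standby replay lag; the DSN is a hypothetical placeholder.

```python
import psycopg2  # pip install psycopg2-binary

# Hypothetical standby server DSN.
conn = psycopg2.connect("host=pg-standby01 dbname=postgres user=monitor")
cur = conn.cursor()

# On a standby, replay lag = now() minus the last replayed transaction time.
cur.execute("""
    SELECT pg_is_in_recovery(),
           now() - pg_last_xact_replay_timestamp() AS replay_lag
""")
in_recovery, lag = cur.fetchone()
if in_recovery:
    print(f"standby replay lag: {lag}")
else:
    print("node is a primary; check pg_stat_replication for its standbys")
```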

Posted 2 days ago

Apply

7.5 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: SAP Sales and Distribution (SD)
Good to have skills: NA
Minimum 7.5 Year(s) Of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact for the project
- Manage the team and ensure successful project delivery
- Collaborate with multiple teams to make key decisions
- Provide solutions to problems for the immediate team and across multiple teams

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP Sales and Distribution (SD)
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP Sales and Distribution (SD)
- This position is based in Mumbai
- A 15 years full-time education is required

Posted 2 days ago

Apply

7.5 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


Project Role: Application Lead
Project Role Description: Lead the effort to design, build and configure applications, acting as the primary point of contact.
Must have skills: ServiceNow IT Service Management
Good to have skills: NA
Minimum 7.5 Year(s) Of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for their immediate team and across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact for the project
- Manage the team and ensure successful project delivery
- Collaborate with multiple teams to make key decisions
- Provide solutions to problems for the immediate team and across multiple teams

Professional & Technical Skills:
- Must To Have Skills: Proficiency in ServiceNow IT Service Management
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in ServiceNow IT Service Management
- This position is based at our Bengaluru office
- A 15 years full-time education is required

Posted 2 days ago

Apply

3.0 years

0 Lacs

Coimbatore, Tamil Nadu, India

On-site


Project Role: Service Management Practitioner
Project Role Description: Support the delivery of programs, projects or managed services. Coordinate projects through contract management and shared service coordination. Develop and maintain relationships with key stakeholders and sponsors to ensure high levels of commitment and enable strategic agenda.
Must have skills: Microsoft Power Business Intelligence (BI)
Good to have skills: Microsoft Power Apps
Minimum 3 Year(s) Of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As a Service Management Practitioner, you will support the delivery of programs, projects, or managed services. You will coordinate projects through contract management and shared service coordination, and develop and maintain relationships with key stakeholders and sponsors to ensure high levels of commitment and enable the strategic agenda.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work-related problems.
- Coordinate the delivery of programs, projects, or managed services.
- Develop and maintain relationships with key stakeholders and sponsors.
- Ensure high levels of commitment from stakeholders.
- Enable strategic agenda through effective coordination.
- Provide regular updates and reports on project progress.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Microsoft Power Business Intelligence (BI).
- Good To Have Skills: Experience with Microsoft Power Apps.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Microsoft Power Business Intelligence (BI).
- This position is based at our Chennai office.
- A 15 years full-time education is required.

Posted 2 days ago

Apply

3.0 years

0 Lacs

Kolkata, West Bengal, India

On-site


Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools, Cloud AI services, with proper cloud or on-prem application pipeline with production ready quality. Be able to apply GenAI models as part of the solution. Could also include but not limited to deep learning, neural networks, chatbots, image processing.
Must have skills: Machine Learning Operations
Good to have skills: NA
Minimum 3 Year(s) Of Experience Is Required
Educational Qualification: BE

Summary: As an AI / ML Engineer, you will develop applications and systems that utilize AI to improve performance and efficiency, including deep learning, neural networks, chatbots, and natural language processing.

Roles & Responsibilities:
- Expected to perform independently and become an SME.
- Required active participation/contribution in team discussions.
- Contribute in providing solutions to work related problems.
- Implement machine learning models for various applications.
- Optimize AI algorithms for improved performance.
- Collaborate with cross-functional teams to integrate AI solutions.
- Stay updated with the latest trends in AI and ML technologies.
- Provide technical guidance and mentor junior team members.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Machine Learning Operations.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 3 years of experience in Machine Learning Operations.
- This position is based at our Kolkata office.
- A BE degree is required.
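
The skills list above spans both model training and MLOps; as a small illustration, here is a scikit-learn pipeline trained, evaluated, and serialized as a deployable artifact. The dataset is a bundled toy set and the artifact name is hypothetical.

```python
import joblib
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0, stratify=y)

# Bundling preprocessing with the model keeps train/serve behaviour identical.
pipe = Pipeline([("scale", StandardScaler()),
                 ("clf", LogisticRegression(max_iter=1000))])
pipe.fit(X_tr, y_tr)
print("holdout accuracy:", pipe.score(X_te, y_te))

joblib.dump(pipe, "model-v1.joblib")  # versioned artifact for the serving layer
```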

Posted 2 days ago

Apply

7.5 years

0 Lacs

Gurugram, Haryana, India

On-site


Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools, Cloud AI services, with proper cloud or on-prem application pipeline with production ready quality. Be able to apply GenAI models as part of the solution. Could also include but not limited to deep learning, neural networks, chatbots, image processing.
Must have skills: Large Language Models
Good to have skills: NA
Minimum 7.5 Year(s) Of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As an AI/ML Engineer, you will develop applications and systems utilizing AI tools, Cloud AI services, and GenAI models. Your role involves implementing deep learning, neural networks, chatbots, and image processing in production-ready quality solutions.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the implementation of large language models in AI applications.
- Research and apply cutting-edge AI techniques to enhance system performance.
- Contribute to the development of innovative AI solutions for complex business challenges.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Large Language Models.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Large Language Models.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.
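
For the large language model requirement, a minimal Hugging Face transformers sketch is below; the checkpoint is a small public one chosen purely for illustration, not a production LLM.

```python
from transformers import pipeline  # pip install transformers

# Small public checkpoint for illustration; swap in a production model as needed.
generator = pipeline("text-generation", model="distilgpt2")

out = generator("Clustering in machine learning is",
                max_new_tokens=40, do_sample=False)
print(out[0]["generated_text"])
```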

Posted 2 days ago

Apply

3.0 - 8.0 years

22 - 27 Lacs

Bengaluru

Work from Office


Our vision for the future is based on the idea that transforming financial lives starts by giving our people the freedom to transform their own. We have a flexible work environment and fluid career paths. We not only encourage but celebrate internal mobility. We also recognize the importance of purpose, well-being, and work-life balance. Within Empower and our communities, we work hard to create a welcoming and inclusive environment, and our associates dedicate thousands of hours to volunteering for causes that matter most to them. Chart your own path and grow your career while helping more customers achieve financial freedom. Empower Yourself.

Job Description
The Empower India Data Science team is looking for a Sr. Analyst capable of producing in-depth analysis that suggests strategic and operational changes. The ideal candidate will understand the business problem, collect and study relevant data, and produce meaningful, actionable insights that help solve the business problem at hand.

Role Responsibilities:
- Understand the complex business problem and convert it into an analytical problem.
- Understand the data need; collect, interpret, and clean the data for analysis.
- Produce meaningful insights using analytical/data modelling techniques, and interpret and justify the results within the business context.
- Handle projects end-to-end independently.
- Present the analysis and results to senior stakeholders.
- Generate BI reports and automate them.
- Work with management to prioritize business and information needs.

Educational Qualification:
- Graduate / post-graduate degree in Business Management/Statistics/Economics/Finance

Required Experience:
- 2-6 years of experience in analytics, data science, or a related field.
- Attention to detail, and openness to learning quickly about new technologies and statistical techniques in analytics, such as Machine Learning, AI, SAS, AWS, Python, Tableau, etc.
- Working experience with tools such as Google Analytics / Adobe Analytics.
- Experience handling end-to-end projects independently.

Required skills and competencies:
- Technical expertise in data modelling using statistical techniques and data mining, and in generating and automating BI reports.
- Strong knowledge of and experience with data modelling techniques like Time Series, Regression, Segmentation, Clustering, Market Mix, Machine Learning, and AI.
- Strong analytical skills: understanding the business problem, collecting and analyzing the data, and interpreting the results.
- Coordination with different teams to gather required information.
- Strong written and verbal communication skills; able to present results to senior stakeholders.
- Good understanding of the US retirement industry.
- Self-motivation and excellent accountability and ownership skills.
- Excellent team player.

This job description is not intended to be an exhaustive list of all duties, responsibilities, and qualifications of the job. The employer has the right to revise this job description at any time. You will be evaluated in part based on your performance of the responsibilities and/or tasks listed in this job description. You may be required to perform other duties that are not included in this job description. The job description is not a contract for employment, and either you or the employer may terminate employment at any time, for any reason, as per the terms and conditions of your employment contract. We are an equal opportunity employer with a commitment to diversity. All individuals, regardless of personal characteristics, are encouraged to apply. All qualified applicants will receive consideration for employment without regard to age, race, color, national origin, ancestry, sex, sexual orientation, gender, gender identity, gender expression, marital status, pregnancy, religion, physical or mental disability, military or veteran status, genetic information, or any other status protected by applicable state or local law.
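
Among the modelling techniques listed above, regression is the simplest to sketch. Below is a small statsmodels example on synthetic data, chosen because its summary output is what an analyst would typically walk stakeholders through; the variables are hypothetical.

```python
import numpy as np
import statsmodels.api as sm  # pip install statsmodels

rng = np.random.default_rng(7)
n = 200
spend = rng.uniform(0, 100, n)     # hypothetical marketing spend
tenure = rng.uniform(0, 10, n)     # hypothetical customer tenure (years)
revenue = 5 + 0.8 * spend + 2.5 * tenure + rng.normal(0, 5, n)

# Fit ordinary least squares with an intercept term.
X = sm.add_constant(np.column_stack([spend, tenure]))
model = sm.OLS(revenue, X).fit()
print(model.summary())             # coefficients, R-squared, p-values
```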

Posted 2 days ago

Apply

7.5 years

0 Lacs

Bengaluru, Karnataka, India

On-site


Project Role: AI / ML Engineer
Project Role Description: Develops applications and systems that utilize AI tools, Cloud AI services, with proper cloud or on-prem application pipeline with production ready quality. Be able to apply GenAI models as part of the solution. Could also include but not limited to deep learning, neural networks, chatbots, image processing.
Must have skills: Large Language Models
Good to have skills: NA
Minimum 7.5 Year(s) Of Experience Is Required
Educational Qualification: 15 years full time education

Summary: As an AI/ML Engineer, you will develop applications and systems utilizing AI tools, Cloud AI services, and GenAI models. Your role involves implementing deep learning, neural networks, chatbots, and image processing in production-ready quality solutions.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute on key decisions.
- Provide solutions to problems for their immediate team and across multiple teams.
- Lead the implementation of large language models in AI applications.
- Research and apply cutting-edge AI techniques to enhance system performance.
- Contribute to the development of innovative AI solutions for complex business challenges.

Professional & Technical Skills:
- Must To Have Skills: Proficiency in Large Language Models.
- Strong understanding of statistical analysis and machine learning algorithms.
- Experience with data visualization tools such as Tableau or Power BI.
- Hands-on implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms.
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity.

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in Large Language Models.
- This position is based at our Bengaluru office.
- A 15 years full-time education is required.

Posted 2 days ago

Apply

6.0 years

0 Lacs

Pune, Maharashtra, India

On-site


Who are we?
Amdocs helps those who build the future to make it amazing. With our market-leading portfolio of software products and services, we unlock our customers’ innovative potential, empowering them to provide next-generation communication and media experiences for both the individual end user and enterprise customers. Our employees around the globe are here to accelerate service providers’ migration to the cloud, enable them to differentiate in the 5G era, and digitalize and automate their operations. Listed on the NASDAQ Global Select Market, Amdocs had revenue of $5.00 billion in fiscal 2024. For more information, visit www.amdocs.com

📍 Location: Pune
📈 Experience: 6 to 10 Years

🔍 Must-Have Skills:
💡 Generative AI
🐍 Python
📚 RAG (Retrieval-Augmented Generation)
🧠 LLM / NLP
📊 EDA (Exploratory Data Analysis)
☁️ Cloud Platforms (AWS, Azure, GCP)
🔄 Transformers
🧾 Explainable AI
🌌 Deep Learning

Desired Background
- At least 4 years of relevant experience and a track record in data science: machine learning, deep learning, and statistical data analysis.
- MSc or PhD degree in CS, Mathematics, Bioinformatics, Statistics, Engineering, Physics, or a similar discipline.
- Strong hands-on experience in Python, with a focus on statistical algorithm development and GenAI practices.
- Experience with data science libraries such as sklearn, pandas, numpy, and pytorch/tensorflow.
- Experience with GenAI concepts (RAG, LLM) and agentic development, from conversational to autonomous agents.
- Team player, able to work in collaboration with subject matter experts, with the ability to clearly present and communicate findings.
- Proven ability to build and deliver data solutions in a short time frame.
- Experience with Azure, Docker, and development methodologies - an advantage.
- Proven experience with production and DevOps practices.

Qualifications
- Bachelor's degree or equivalent experience in a quantitative field (Statistics, Mathematics, Computer Science, Engineering, etc.)
- At least 1-2 years of experience in quantitative analytics or data modeling
- Deep understanding of predictive modeling, machine learning, clustering and classification techniques, and algorithms
- Fluency in a programming language (Python, C, C++, Java, SQL)
- Familiarity with Big Data frameworks and visualization tools (Cassandra, Hadoop, Spark, Tableau)
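
Since the listing centers on RAG, here is a deliberately small sketch of the retrieval half using TF-IDF from scikit-learn: retrieve the most relevant passages, then assemble them into a grounded prompt for whatever LLM the stack uses. The documents and the final LLM call are assumptions; production systems typically use embeddings and a vector store instead.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Hypothetical knowledge base; real RAG uses embeddings + a vector DB.
docs = [
    "InfoScale provides high availability for mission-critical applications.",
    "Snowpipe loads files into Snowflake as they arrive in cloud storage.",
    "Pacemaker manages cluster resources, fencing, and quorum on Linux.",
]

vec = TfidfVectorizer().fit(docs)
doc_matrix = vec.transform(docs)

def retrieve(query: str, k: int = 2) -> list[str]:
    sims = cosine_similarity(vec.transform([query]), doc_matrix)[0]
    return [docs[i] for i in sims.argsort()[::-1][:k]]

query = "How does Pacemaker handle node failures?"
context = "\n".join(retrieve(query))
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(prompt)  # this prompt would then be sent to the LLM of choice
```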

Posted 2 days ago

Apply

5.0 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Job Role: Machine Learning Engineer
As a Machine Learning Engineer, you’ll be applying your expertise to help us develop a world-leading capability in this exciting and challenging domain. You will be responsible for contributing to the design, development, deployment, testing, maintenance, and enhancement of ML software solutions.

Primary responsibilities:
1. Applying machine learning, deep learning, and signal processing on large datasets (audio, sensors, images, videos, text) to develop models.
2. Architecting large-scale data analytics / modeling systems.
3. Designing and programming machine learning methods and integrating them into our ML framework / pipeline.
4. Working closely with data scientists/analysts to collaborate on and support the development of ML data pipelines, platforms, and infrastructure.
5. Evaluating and validating analyses with statistical methods, and presenting them in a lucid form to people not familiar with the domain of data science / computer science.
6. Creating microservices and APIs for serving ML models and ML services.
7. Evaluating new machine learning methods and adopting them for our purposes.
8. Feature engineering to add new features that improve model performance.

Required skills:
1. Background and knowledge of recent advances in machine learning, deep learning, natural language processing, and/or image/signal/video processing, with around 5 years of professional work experience on real-world applications.
2. Strong programming background, e.g. Python, PyTorch, MATLAB, C/C++, Java, and knowledge of software engineering concepts (OOP, design patterns).
3. Knowledge of machine learning libraries: TensorFlow, Keras, scikit-learn, PyTorch.
4. Excellent mathematical skills and background, e.g. accuracy, significance tests, visualization, advanced probability concepts.
5. Architecting and implementing end-to-end solutions for accelerating experimentation and model building.
6. Working knowledge of a variety of machine learning techniques (clustering, decision tree learning, artificial neural networks, etc.).
7. Ability to perform both independent and collaborative research.
8. Excellent written and spoken communication skills.

Preferred qualification and experience:
B.E./B.Tech/B.S. candidates with 3+ years of experience in the aforementioned fields will be considered. M.E./M.S./M.Tech/PhD, preferably in fields related to Computer Science, with experience in machine learning, image and signal processing, or statistics, is preferred.
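
Given the PyTorch requirement above, here is a minimal supervised training loop on random tensors; the architecture and data are placeholders meant only to show the structure of a train step.

```python
import torch
from torch import nn

# Toy stand-in for a real dataset: 10 features, binary labels.
X = torch.randn(256, 10)
y = torch.randint(0, 2, (256,))

model = nn.Sequential(nn.Linear(10, 32), nn.ReLU(), nn.Linear(32, 2))
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(5):
    optimizer.zero_grad()          # clear gradients from the previous step
    loss = loss_fn(model(X), y)    # forward pass
    loss.backward()                # backpropagate
    optimizer.step()               # update weights
    print(f"epoch {epoch}: loss={loss.item():.4f}")
```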

Posted 2 days ago

Apply

7.5 years

0 Lacs

Ahmedabad, Gujarat, India

On-site


Project Role: Application Lead
Project Role Description: Lead the effort to design, build, and configure applications, acting as the primary point of contact.
Must have skills: SAP FI CO Finance
Good to have skills: NA
Minimum 7.5 Year(s) of experience is required
Educational Qualification: 15 years full time education

Summary: As an Application Lead, you will lead the effort to design, build, and configure applications, acting as the primary point of contact. You will be responsible for managing the team and ensuring successful project delivery. Your typical day will involve collaborating with multiple teams, making key decisions, and providing solutions to problems for your immediate team and across multiple teams.

Roles & Responsibilities:
- Expected to be an SME
- Collaborate with and manage the team to perform
- Responsible for team decisions
- Engage with multiple teams and contribute to key decisions
- Provide solutions to problems for the immediate team and across multiple teams
- Lead the effort to design, build, and configure applications
- Act as the primary point of contact for the project
- Manage the team and ensure successful project delivery

Professional & Technical Skills:
- Must To Have Skills: Proficiency in SAP FI CO Finance
- Strong understanding of statistical analysis and machine learning algorithms
- Experience with data visualization tools such as Tableau or Power BI
- Hands-on experience implementing various machine learning algorithms such as linear regression, logistic regression, decision trees, and clustering algorithms
- Solid grasp of data munging techniques, including data cleaning, transformation, and normalization to ensure data quality and integrity

Additional Information:
- The candidate should have a minimum of 7.5 years of experience in SAP FI CO Finance
- This position is based at our Hyderabad office
- A 15 years full-time education is required

Posted 2 days ago

Apply

3.0 - 6.0 years

8 - 24 Lacs

Chennai, Tamil Nadu, India

On-site


Data Engineer – Snowflake

About The Opportunity
An award-winning global IT services & analytics consultancy operating in the Enterprise Cloud Data & Digital Transformation sector. We partner with Fortune 500 firms to modernise data platforms, unlock real-time insights, and drive AI/ML innovation. To scale client programmes in India, we seek a skilled Data Engineer specialised in Snowflake to design high-throughput, secure, and cost-efficient data products in an on-site environment.

Role & Responsibilities
- Design and implement end-to-end Snowflake data warehouses, from staging to curated layers, ensuring ELT best practices.
- Build and automate data ingestion pipelines using Snowpipe, Python, and cloud services, delivering near real-time availability.
- Optimise schema design, clustering, and partitioning to reduce query latency and storage costs (a brief sketch follows below).
- Create robust CI/CD workflows for Snowflake objects via Git and orchestration tools like dbt/Airflow.
- Monitor workload performance, triage bottlenecks, and enforce data governance, security, and compliance policies.
- Collaborate with analytics, BI, and product teams to translate business requirements into scalable data models.

Skills & Qualifications
Must-Have
- 3-6 years professional Data Engineering experience with primary focus on Snowflake.
- Expert SQL skills and proficiency in scripting (Python or JavaScript) for ETL/ELT tasks.
- Hands-on with Snowpipe, Tasks, Streams, and performance tuning.
- Experience integrating cloud storage (AWS S3/Azure Blob/GCS) and building event-driven pipelines.
- Solid understanding of dimensional data modelling and data governance (RBAC, masking, encryption).
- Version control, testing, and deployment using Git and CI/CD pipelines.

Preferred
- Exposure to dbt, Airflow, or similar orchestration frameworks.
- Experience with Kafka or real-time messaging platforms.
- Knowledge of BI tools (Tableau, Power BI) and the modern data stack.
- Snowflake certifications (SnowPro Core/Advanced) a plus.
- Familiarity with Infra-as-Code (Terraform, CloudFormation).

Benefits & Culture Highlights
- On-site, high-energy data lab with access to cutting-edge cloud tools and sandbox environments.
- Clear technical career ladder, certification sponsorship, and mentorship from Snowflake SMEs.
- Comprehensive health cover, performance bonuses, and employee wellness programmes.

Skills: sql,elt,dbt,azure blob,masking,streams,data governance,snowflake,aws s3,dimensional data modelling,rbac,git,tasks,performance tuning,gcs,python,encryption,snowpipe,javascript,snowflake data engineer,aws,ci/cd,etl,data modeling
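To make the clustering and ingestion bullets concrete, here is a minimal sketch using the snowflake-connector-python package. The account, credentials, table, and stage names are hypothetical placeholders; the SQL shown (ALTER TABLE ... CLUSTER BY, COPY INTO) follows standard Snowflake syntax.

```python
# Minimal sketch: define a clustering key and bulk-load staged files
# into a Snowflake table. All names and credentials are hypothetical.
import snowflake.connector

conn = snowflake.connector.connect(
    account="my_account",   # hypothetical account identifier
    user="etl_user",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="CURATED",
)

cur = conn.cursor()
try:
    # Co-locate rows by common filter columns to cut scan time on large tables.
    cur.execute("ALTER TABLE orders CLUSTER BY (order_date, region)")

    # Bulk-load new files from an external stage. Snowpipe automates this
    # on file-arrival events; COPY INTO is the manual equivalent.
    cur.execute("""
        COPY INTO orders
        FROM @raw_stage/orders/
        FILE_FORMAT = (TYPE = 'PARQUET')
        MATCH_BY_COLUMN_NAME = CASE_INSENSITIVE
    """)
finally:
    cur.close()
    conn.close()
```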

Posted 2 days ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


About Client: Our client is a global IT services company headquartered in Southborough, Massachusetts, USA. Founded in 1996, with revenue of $1.8B and 35,000+ associates worldwide, it specializes in digital engineering and IT services, helping clients modernize their technology infrastructure, adopt cloud and AI solutions, and accelerate innovation. It partners with major firms in banking, healthcare, telecom, and media, and is known for combining deep industry expertise with agile development practices, enabling scalable and cost-effective digital transformation. The company operates in over 50 locations across more than 25 countries, has delivery centers in Asia, Europe, and North America, and is backed by Baring Private Equity Asia.

Job Title: AI/ML Developer
Key Skills: image analytics, computer vision, and visual data processing; Python; Gen AI
Job Locations: Hyderabad
Experience: 6 – 10 Years
Budget: 16 – 20 LPA
Education Qualification: Any Graduation
Work Mode: Hybrid
Employment Type: Contract
Notice Period: Immediate - 15 Days
Interview Mode: 2 rounds of technical interviews, plus a client round

Job Description:
Key Focus Areas:
· Image Analytics & Computer Vision (CV)
· Machine Learning & Deep Learning
· Predictive Analytics & Optimization
· Generative AI (GenAI) & NLP (as secondary skills)

Primary Responsibilities:
· Lead and contribute to projects centered around image analytics, computer vision, and visual data processing.
· Develop and deploy CV models for tasks such as object detection, image classification, pattern recognition, and anomaly detection (a minimal classification sketch follows below).
· Apply deep learning frameworks (e.g., TensorFlow, Keras) to solve complex visual data challenges.
· Integrate multi-sensor data fusion and multivariate analysis for industrial applications.
· Collaborate with cross-functional teams to implement predictive maintenance, fault detection, and process monitoring solutions using visual and sensor data.

Mandatory Skills:
· Strong hands-on experience in Computer Vision and Image Analytics.
· Proficiency in Python and familiarity with AI/ML libraries such as OpenCV, TensorFlow, Keras, scikit-learn, and Matplotlib.
· Solid understanding of machine learning techniques: classification, regression, clustering, anomaly detection, etc.
· Experience with deep learning architectures (CNNs, autoencoders, etc.) for image-based applications.
· Familiarity with Generative AI and LLMs is a plus.

Desirable Skills:
· Knowledge of optimization techniques and simulation modeling.
· Domain experience in Oil & Gas, Desalination, Motors & Pumps, or Industrial Systems.

Educational & Professional Background:
· Bachelor’s or Master’s degree in Engineering (Mechanical, Electrical, Electronics, or Chemical preferred).
· Master’s in Industrial/Manufacturing/Production Engineering is a strong plus.
· Demonstrated experience in solving real-world industrial problems using data-driven approaches.

Soft Skills & Attributes:
· Strong analytical and problem-solving skills.
· Ability to work independently and manage multiple projects.
· Excellent communication and stakeholder engagement skills.
· Proven thought leadership and innovation in AI/ML applications.

Interested candidates, please share your CV to pnomula@people-prime.com
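As a rough illustration of the image-classification work described above, here is a minimal Keras CNN trained on synthetic data; the input shape, class count, and random arrays are placeholders standing in for a real labeled image dataset.

```python
# Minimal image-classification sketch: a small CNN trained on random
# arrays that stand in for real labeled images (shapes are illustrative).
import numpy as np
import tensorflow as tf

num_classes, img_size = 3, 64
# Synthetic stand-in data: 200 RGB "images" with random labels.
X = np.random.rand(200, img_size, img_size, 3).astype("float32")
y = np.random.randint(0, num_classes, size=200)

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(img_size, img_size, 3)),
    tf.keras.layers.Conv2D(16, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(num_classes, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=2, batch_size=32, verbose=0)
print("train accuracy:", model.evaluate(X, y, verbose=0)[1])
```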

Posted 2 days ago

Apply

3.0 - 6.0 years

8 - 24 Lacs

Bhubaneswar, Odisha, India

On-site


Data Engineer – Snowflake

About The Opportunity
An award-winning global IT services & analytics consultancy operating in the Enterprise Cloud Data & Digital Transformation sector. We partner with Fortune 500 firms to modernise data platforms, unlock real-time insights, and drive AI/ML innovation. To scale client programmes in India, we seek a skilled Data Engineer specialised in Snowflake to design high-throughput, secure, and cost-efficient data products in an on-site environment.

Role & Responsibilities
- Design and implement end-to-end Snowflake data warehouses, from staging to curated layers, ensuring ELT best practices.
- Build and automate data ingestion pipelines using Snowpipe, Python, and cloud services, delivering near real-time availability.
- Optimise schema design, clustering, and partitioning to reduce query latency and storage costs.
- Create robust CI/CD workflows for Snowflake objects via Git and orchestration tools like dbt/Airflow.
- Monitor workload performance, triage bottlenecks, and enforce data governance, security, and compliance policies.
- Collaborate with analytics, BI, and product teams to translate business requirements into scalable data models.

Skills & Qualifications
Must-Have
- 3-6 years professional Data Engineering experience with primary focus on Snowflake.
- Expert SQL skills and proficiency in scripting (Python or JavaScript) for ETL/ELT tasks.
- Hands-on with Snowpipe, Tasks, Streams, and performance tuning.
- Experience integrating cloud storage (AWS S3/Azure Blob/GCS) and building event-driven pipelines.
- Solid understanding of dimensional data modelling and data governance (RBAC, masking, encryption).
- Version control, testing, and deployment using Git and CI/CD pipelines.

Preferred
- Exposure to dbt, Airflow, or similar orchestration frameworks.
- Experience with Kafka or real-time messaging platforms.
- Knowledge of BI tools (Tableau, Power BI) and the modern data stack.
- Snowflake certifications (SnowPro Core/Advanced) a plus.
- Familiarity with Infra-as-Code (Terraform, CloudFormation).

Benefits & Culture Highlights
- On-site, high-energy data lab with access to cutting-edge cloud tools and sandbox environments.
- Clear technical career ladder, certification sponsorship, and mentorship from Snowflake SMEs.
- Comprehensive health cover, performance bonuses, and employee wellness programmes.

Skills: sql,elt,dbt,azure blob,masking,streams,data governance,snowflake,aws s3,dimensional data modelling,rbac,git,tasks,performance tuning,gcs,python,encryption,snowpipe,javascript,snowflake data engineer,aws,ci/cd,etl,data modeling

Posted 2 days ago

Apply

3.0 - 6.0 years

8 - 24 Lacs

Hyderabad, Telangana, India

On-site


Data Engineer – Snowflake

About The Opportunity
An award-winning global IT services & analytics consultancy operating in the Enterprise Cloud Data & Digital Transformation sector. We partner with Fortune 500 firms to modernise data platforms, unlock real-time insights, and drive AI/ML innovation. To scale client programmes in India, we seek a skilled Data Engineer specialised in Snowflake to design high-throughput, secure, and cost-efficient data products in an on-site environment.

Role & Responsibilities
- Design and implement end-to-end Snowflake data warehouses, from staging to curated layers, ensuring ELT best practices.
- Build and automate data ingestion pipelines using Snowpipe, Python, and cloud services, delivering near real-time availability.
- Optimise schema design, clustering, and partitioning to reduce query latency and storage costs.
- Create robust CI/CD workflows for Snowflake objects via Git and orchestration tools like dbt/Airflow.
- Monitor workload performance, triage bottlenecks, and enforce data governance, security, and compliance policies.
- Collaborate with analytics, BI, and product teams to translate business requirements into scalable data models.

Skills & Qualifications
Must-Have
- 3-6 years professional Data Engineering experience with primary focus on Snowflake.
- Expert SQL skills and proficiency in scripting (Python or JavaScript) for ETL/ELT tasks.
- Hands-on with Snowpipe, Tasks, Streams, and performance tuning.
- Experience integrating cloud storage (AWS S3/Azure Blob/GCS) and building event-driven pipelines.
- Solid understanding of dimensional data modelling and data governance (RBAC, masking, encryption).
- Version control, testing, and deployment using Git and CI/CD pipelines.

Preferred
- Exposure to dbt, Airflow, or similar orchestration frameworks.
- Experience with Kafka or real-time messaging platforms.
- Knowledge of BI tools (Tableau, Power BI) and the modern data stack.
- Snowflake certifications (SnowPro Core/Advanced) a plus.
- Familiarity with Infra-as-Code (Terraform, CloudFormation).

Benefits & Culture Highlights
- On-site, high-energy data lab with access to cutting-edge cloud tools and sandbox environments.
- Clear technical career ladder, certification sponsorship, and mentorship from Snowflake SMEs.
- Comprehensive health cover, performance bonuses, and employee wellness programmes.

Skills: sql,elt,dbt,azure blob,masking,streams,data governance,snowflake,aws s3,dimensional data modelling,rbac,git,tasks,performance tuning,gcs,python,encryption,snowpipe,javascript,snowflake data engineer,aws,ci/cd,etl,data modeling

Posted 2 days ago

Apply

3.0 - 6.0 years

8 - 24 Lacs

Pune, Maharashtra, India

On-site


Data Engineer – Snowflake

About The Opportunity
An award-winning global IT services & analytics consultancy operating in the Enterprise Cloud Data & Digital Transformation sector. We partner with Fortune 500 firms to modernise data platforms, unlock real-time insights, and drive AI/ML innovation. To scale client programmes in India, we seek a skilled Data Engineer specialised in Snowflake to design high-throughput, secure, and cost-efficient data products in an on-site environment.

Role & Responsibilities
- Design and implement end-to-end Snowflake data warehouses, from staging to curated layers, ensuring ELT best practices.
- Build and automate data ingestion pipelines using Snowpipe, Python, and cloud services, delivering near real-time availability.
- Optimise schema design, clustering, and partitioning to reduce query latency and storage costs.
- Create robust CI/CD workflows for Snowflake objects via Git and orchestration tools like dbt/Airflow.
- Monitor workload performance, triage bottlenecks, and enforce data governance, security, and compliance policies.
- Collaborate with analytics, BI, and product teams to translate business requirements into scalable data models.

Skills & Qualifications
Must-Have
- 3-6 years professional Data Engineering experience with primary focus on Snowflake.
- Expert SQL skills and proficiency in scripting (Python or JavaScript) for ETL/ELT tasks.
- Hands-on with Snowpipe, Tasks, Streams, and performance tuning.
- Experience integrating cloud storage (AWS S3/Azure Blob/GCS) and building event-driven pipelines.
- Solid understanding of dimensional data modelling and data governance (RBAC, masking, encryption).
- Version control, testing, and deployment using Git and CI/CD pipelines.

Preferred
- Exposure to dbt, Airflow, or similar orchestration frameworks.
- Experience with Kafka or real-time messaging platforms.
- Knowledge of BI tools (Tableau, Power BI) and the modern data stack.
- Snowflake certifications (SnowPro Core/Advanced) a plus.
- Familiarity with Infra-as-Code (Terraform, CloudFormation).

Benefits & Culture Highlights
- On-site, high-energy data lab with access to cutting-edge cloud tools and sandbox environments.
- Clear technical career ladder, certification sponsorship, and mentorship from Snowflake SMEs.
- Comprehensive health cover, performance bonuses, and employee wellness programmes.

Skills: sql,elt,dbt,azure blob,masking,streams,data governance,snowflake,aws s3,dimensional data modelling,rbac,git,tasks,performance tuning,gcs,python,encryption,snowpipe,javascript,snowflake data engineer,aws,ci/cd,etl,data modeling

Posted 2 days ago

Apply