36840 Microservices Jobs - Page 42

Set up a job alert
JobPe aggregates listings for easy access; you apply directly on the original job portal.

3.0 years

30 - 40 Lacs

Thane, Maharashtra, India

Remote

Experience: 3+ years
Salary: INR 3,000,000-4,000,000 per year (based on experience)
Expected notice period: 15 days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity type: Remote
Placement type: Full-time permanent position (payroll and compliance managed by DRIMCO GmbH)
(Note: This is a requirement for one of Uplers' clients, an AI-powered industrial bid automation company.)

Must-have skills: Python programming, machine learning, TensorFlow, PyTorch, scikit-learn, MLflow, Kubeflow, Apache Spark, Kafka, Ray, Dask, Docker, Kubernetes, CI/CD, AWS, Azure or GCP, Prometheus, Grafana, knowledge graphs, LLMs, PLM systems

About the client: We are driving the future of industrial automation and engineering by developing intelligent AI agents tailored for the manufacturing and automotive sectors. As part of our growing team, you will play a key role in building robust, scalable, and intelligent agentic AI products that redefine how complex engineering and requirements workflows are solved. Our highly skilled team of researchers, technologists, entrepreneurs, and developers holds 15 patents and has 20+ publications at scientific venues such as ICML, ICLR, and AAAI. Founded in 2020, we are pioneering collaborative requirement assessment in industry. The combination of the founders' deep industry expertise, an OEM partnership with Siemens, multi-patented AI technology, and VC backing positions us as a thought leader in requirement intelligence.

🔍 Role Description
- Design, build, and optimize ML models for intelligent requirement understanding and automation.
- Develop scalable, production-grade AI pipelines and APIs.
- Own the deployment lifecycle, including model serving, monitoring, and continuous delivery.
- Collaborate with data engineers and product teams to ensure data integrity, performance, and scalability.
- Work on large-scale data processing and real-time pipelines.
- Contribute to DevOps practices such as containerization, CI/CD pipelines, and cloud deployments.
- Analyze and improve the efficiency and scalability of ML systems in production.
- Stay current with the latest AI/ML research and translate innovations into product enhancements.

🧠 What we are looking for
- 3+ years of experience in ML/AI engineering with shipped products.
- Proficiency in Python and its ML ecosystem (e.g., TensorFlow, PyTorch, scikit-learn).
- Strong software engineering practices: version control, testing, documentation.
- Experience with MLOps tools (e.g., MLflow, Kubeflow) and model deployment techniques.
- Familiarity with Docker, Kubernetes, CI/CD, and cloud platforms (AWS, Azure, or GCP).
- Experience working with large datasets, data wrangling, and scalable data pipelines (Apache Spark, Kafka, Ray, Dask, etc.).
- Good understanding of microservices, distributed systems, and model performance optimization.
- Comfort in a fast-paced startup environment; a proactive and curious mindset.

🎯 Bonus points
- Experience with natural language processing, document understanding, or large language models (LLMs).
- Experience with knowledge graph technologies.
- Experience with logging/monitoring tools (e.g., Prometheus, Grafana).
- Knowledge of requirements engineering or PLM systems.

✨ What we offer
- Attractive compensation.
- Impactful AI products solving real industrial challenges.
- A collaborative, agile, and supportive team culture.
- Flexible work hours and location (hybrid/remote).

How to apply
Step 1: Click "Apply" and register or log in on our portal.
Step 2: Complete the screening form and upload an updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview.

About Uplers: Our goal is to make hiring reliable, simple, and fast. We help talent find and apply for relevant contractual onsite opportunities and progress in their careers, and we will support you with any grievances or challenges you may face during the engagement. There are many more opportunities on the portal; depending on the assessments you clear, you can apply for those as well. So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, apply today. We are waiting for you!
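For readers who want a concrete picture of what "production-grade AI pipelines and APIs" can look like in this Python/MLflow stack, here is a minimal, hedged sketch: a scikit-learn classifier logged to MLflow and exposed through a small FastAPI prediction endpoint. It is an illustration only, not the client's actual system; the run name, feature count, and endpoint shape are assumptions made for the example.

```python
# Minimal sketch: track a scikit-learn model with MLflow and serve it via FastAPI.
# All names (run name, feature count, endpoint shape) are illustrative placeholders.
import mlflow
import mlflow.sklearn
from fastapi import FastAPI
from pydantic import BaseModel
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier

# --- training / tracking step (would normally run offline, e.g. in a CI job) ---
X, y = make_classification(n_samples=1_000, n_features=8, random_state=0)
with mlflow.start_run(run_name="requirement-classifier-demo"):
    model = RandomForestClassifier(n_estimators=100, random_state=0)
    model.fit(X, y)
    mlflow.log_param("n_estimators", 100)
    mlflow.log_metric("train_accuracy", model.score(X, y))
    mlflow.sklearn.log_model(model, artifact_path="model")

# --- serving step (would normally run as a long-lived API process) ---
app = FastAPI()

class PredictRequest(BaseModel):
    features: list[float]  # expects 8 values, matching the training schema above

@app.post("/predict")
def predict(req: PredictRequest) -> dict:
    # In a real pipeline the model would be loaded once from the MLflow registry
    # rather than reused from the in-process training run above.
    prediction = model.predict([req.features])[0]
    return {"prediction": int(prediction)}
```

A real deployment along the lines of this posting would additionally load a versioned model from a registry, run inside a container on Kubernetes, and be wired into the CI/CD and monitoring stack the listing describes.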

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Nagpur, Maharashtra, India

Remote

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Kanpur, Uttar Pradesh, India

Remote

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Nashik, Maharashtra, India

Remote

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Kochi, Kerala, India

Remote

Posted 1 day ago

Apply

8.0 years

0 Lacs

India

Remote

About Us
HighLevel is a cloud-based, all-in-one white-label marketing and sales platform that empowers marketing agencies, entrepreneurs, and businesses to elevate their digital presence and drive growth. We are proud to support a global and growing community of over 2 million businesses, from marketing agencies to entrepreneurs to small businesses and beyond. Our platform empowers users across industries to streamline operations, drive growth, and crush their goals. HighLevel processes over 15 billion API hits and handles more than 2.5 billion message events every day. Our platform manages 470 terabytes of data distributed across five databases, operates a network of over 250 microservices, and supports over 1 million domain names.

Our People
With over 1,500 team members across 15+ countries, we operate in a global, remote-first environment. We are building more than software; we are building a global community rooted in creativity, collaboration, and impact. We take pride in cultivating a culture where innovation thrives, ideas are celebrated, and people come first, no matter where they call home.

Our Impact
Every month, our platform powers over 1.5 billion messages, helps generate over 200 million leads, and facilitates over 20 million conversations for the more than 2 million businesses we serve. Behind those numbers are real people growing their companies, connecting with customers, and making their mark, and we get to help make that happen. Learn more about us on our YouTube channel or blog posts.

Who You Are
We are seeking a seasoned and consultative Lead Talent Acquisition Specialist to drive full-cycle, non-technical hiring across global regions, with a primary focus on India and the U.S. This individual contributor role is ideal for someone who thrives in a strategic, high-impact recruiting environment and brings strong ownership to every stage of the hiring process. You will lead recruitment efforts for a wide range of business functions, including Marketing, Sales, Finance, Legal, People, and Operations, while partnering closely with senior stakeholders. Although this role does not involve direct people management, it offers significant influence and visibility and plays a key role in shaping and elevating our global talent acquisition strategy.

What You'll Be Doing
- Independently manage full-cycle recruitment for non-technical roles across business functions in the India and U.S. regions.
- Conduct strategic intake meetings to define hiring needs, set expectations, and advise hiring managers on best practices.
- Design and execute innovative sourcing strategies to attract high-quality passive and active non-technical talent across regions.
- Provide data-driven insights on talent availability, compensation trends, and competitive benchmarking to inform hiring decisions.
- Ensure an exceptional and inclusive candidate experience from first touch to offer stage, maintaining speed and quality at every step.
- Use recruiting metrics and reporting to drive pipeline visibility, improve conversion rates, and optimize time-to-hire and quality-of-hire.
- Collaborate on recruitment process improvements, including interview calibration, structured hiring, and operational efficiency.
- Serve as a thought partner and mentor to junior recruiters, sharing best practices and supporting team-wide learning.

What You'll Bring
- 8+ years of experience in non-technical talent acquisition, preferably in high-growth SaaS or global startup environments.
- Proven ability to lead complex, cross-functional searches independently and with minimal supervision.
- Strong consultative skills, with the ability to influence stakeholders using data, market trends, and strategic insights.
- Experience building and presenting talent maps, pipeline health reports, and other sourcing metrics to senior stakeholders.
- Demonstrated success in global hiring and in navigating different cultural expectations and labor markets (especially India and the U.S.).
- Expertise in candidate engagement, sourcing platforms, and diversity hiring strategies.
- Familiarity with applicant tracking systems (ATS) such as Lever, Greenhouse, or equivalent.
- Exceptional communication, follow-through, and prioritization skills in a fast-paced, distributed work environment.

Equal Employment Opportunity Information
The company is an Equal Opportunity Employer. As an employer subject to affirmative action regulations, we invite you to voluntarily provide the following demographic information. This information is used solely for compliance with government recordkeeping, reporting, and other legal requirements. Providing this information is voluntary, and refusal to do so will not affect your application status. This data will be kept separate from your application and will not be used in the hiring decision.

Posted 1 day ago

Apply

2.0 - 4.0 years

10 - 15 Lacs

Pune

Hybrid

So, what's the role all about?
We are seeking a highly skilled backend software engineer to join GenAI Solutions for CX, our fully integrated AI cloud customer experience platform. In this role you will get exposure to new and exciting technologies and collaborate with professional engineers, architects, and product managers to create NICE's advanced line of AI cloud products.

How will you make an impact?
- Design and implement high-performance microservices using AWS cloud technologies.
- Build scalable backend systems using Python.
- Lead the development of event-driven architectures utilizing Kafka and AWS Firehose.
- Integrate with Athena, DynamoDB, S3, and other AWS services to deliver end-to-end solutions.
- Ensure high-quality deliverables with testable, reusable, and production-ready code.
- Collaborate within an agile team, influencing architecture, design, and technology adoption.

Have you got what it takes?
- 2+ years of backend software development experience.
- Strong expertise in Python/C#.
- Deep knowledge of microservices architecture, RESTful APIs, and cloud-native development.
- Hands-on experience with AWS Lambda, S3, Athena, Kinesis Firehose, and Kafka.
- Strong database skills (SQL and NoSQL), including schema design and performance tuning.
- Experience designing scalable systems and delivering enterprise-grade software.
- Comfortable working with CI/CD pipelines and DevOps practices.
- Passion for clean code, best practices, and continuous improvement.
- Excellent communication and collaboration abilities.
- Fluency in English (written and spoken).

What's in it for you?
Join an ever-growing, market-disrupting global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NICE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NICEr!

Enjoy FLEX! At NICE, we work according to the NICE-FLEX hybrid model, which enables maximum flexibility: two days working from the office and three days of remote work each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7989
Reporting into: Tech Manager
Role type: Individual Contributor
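To make the event-driven bullet above more concrete, here is a minimal, hedged sketch of a Python AWS Lambda handler that consumes records from a Kinesis data stream and writes them to DynamoDB. It is a generic illustration, not NICE's implementation: the table name, record fields, and the choice of a Kinesis-stream trigger (Firehose transformations and Kafka/MSK triggers use different event shapes) are assumptions made for the example.

```python
# Minimal sketch of an event-driven AWS Lambda consumer in Python.
# Assumes the function is wired to a Kinesis data stream as its event source;
# the table name and record fields below are illustrative placeholders.
import base64
import json

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("cx-events")  # placeholder table name


def handler(event, context):
    """Decode each incoming record and persist it to DynamoDB."""
    written = 0
    for record in event.get("Records", []):
        payload = base64.b64decode(record["kinesis"]["data"])
        message = json.loads(payload)

        # Idempotent write keyed on a producer-supplied event id (assumed field).
        table.put_item(
            Item={
                "event_id": message["event_id"],
                "event_type": message.get("event_type", "unknown"),
                "body": json.dumps(message),
            }
        )
        written += 1

    # Returning a summary makes the batch result visible in the function's logs.
    return {"written": written}
```

In practice a handler like this would also need batching, retries or dead-lettering, and structured logging into a monitoring stack, which is where the CI/CD and DevOps requirements in the posting come in.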

Posted 1 day ago

Apply

10.0 years

0 Lacs

Kochi, Kerala, India

On-site

Introduction
IBM Cognos Analytics is a comprehensive business intelligence platform that transforms raw data into actionable insights through advanced reporting, AI-powered analytics, and interactive visualizations. Designed for organizations of all sizes, it offers high-quality, scalable reporting capabilities, enabling users to create and share customized reports efficiently. The platform's intuitive interface allows seamless exploration of data, uncovering hidden trends and supporting informed decision-making without the need for advanced technical skills. With robust governance and security features, IBM Cognos Analytics ensures data integrity and confidentiality, making it a trusted solution for businesses aiming to harness the full potential of their data.

Your Role and Responsibilities
- Work alongside our multidisciplinary team of developers and designers to create the next generation of enterprise software.
- Support the entire application lifecycle (concept, design, development, testing, release, and support).
- Take responsibility for end-to-end product development of a Java/J2EE/C++/GoLang based application, which may include application development based on a microservice architecture.
- Work with developers to implement best practices, introduce new tools, and improve processes.
- Stay up to date with new technology trends.

Preferred Education
Master's degree

Required Technical and Professional Expertise
- 10+ years of software engineering experience implementing enterprise applications.
- Strong skills in Java/J2EE, Spring, microservices, etc.
- Ability to integrate with existing REST services and create new REST services.
- Hands-on experience with SQL and NoSQL databases such as Db2, Oracle, SQL Server, PostgreSQL, MySQL, and MongoDB.
- Hands-on experience with IDEs such as VS Code and Eclipse.
- Hands-on experience creating applications on cloud platforms (Kubernetes, Red Hat OCP).
- Experience building microservices/container-based architectures and solutions.
- Strong oral and written communication.
- Experience in unit testing, debugging, and resolving performance concerns.
- Team Git workflow and version control (Git, GitHub/GitLab/Bitbucket).

Preferred Technical and Professional Experience
- Experience with JavaScript, HTML5, React.js, Carbon JS, and CSS.
- Hands-on experience with C, C++, and GoLang.
- Basic full-stack development skills.
- Knowledge of software design patterns.
- Agile software development methodologies and the SOLID principles of OOP.
- Knowledge of CI/CD, OpenShift, Kubernetes, etc.
- Ability to adapt to and learn new technologies.
- Exposure to the analytics domain is an added advantage.

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Visakhapatnam, Andhra Pradesh, India

Remote

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Greater Bhopal Area

Remote

Posted 1 day ago

Apply

7.0 years

0 Lacs

Hyderabad, Telangana, India

On-site

At Microsoft, our core mission is empowering every person and every organization on the planet to achieve more. Industry Solutions (IS) is a global organization of over 16,000 strategic sellers, industry experts, elite engineers, architects, and consultants who, along with delivery experts, work together to bring Microsoft's mission of empowerment, and cutting-edge technology, to life for the world's most influential customers. We are on the front lines of innovation, working side by side with customers to drive value across the entirety of their digital transformation journey. Our team prides itself on embracing a growth mindset, inspiring excellence, and encouraging everyone to share their unique viewpoints and be their authentic selves.

Responsibilities
- Understand the customer and partner business and IT environment, and demonstrate skills in designing and developing technology in your area(s) of specialization to solve business problems.
- Develop software for the Microsoft platform using programming languages and development platforms including C# (or any modern programming language) and Microsoft SQL Server.
- Participate in the delivery of complex solutions as a technical individual contributor or as a team member under the coaching and guidance of senior team members.
- Apply strong knowledge of the software development life cycle methodology and make technical design, development, and implementation decisions on the use of technology in your area(s) of specialization.
- Contribute to overall release and sprint planning and to the overall application design (HLD).
- Validate the planned velocity of the feature teams and provide user story estimates in refinements to support quality, on-time delivery.
- Drive the team to meet code metrics (StyleCop, FxCop, cyclomatic complexity, class coupling) as defined by the project, and ensure code complies with security and performance standards.
- Track and monitor the feature team's delivery progress; track sprint bug RCAs for quality improvement in subsequent sprints and take action; track quality reports such as build and SonarQube results.
- Conduct user story readiness meetings and DoD reviews once the sprint ends; lead sprint planning and task distribution.
- Track and review the creation of LLDs for relevant user stories.
- Ensure unit test and integration test automation per the acceptance criteria and P1/P2 test cases, and UI automation for validation scenarios and 80% of P1/P2 cases.
- Validate performance test scenarios and execution, and validate environment readiness.
- Ensure functionality is verified to be working in the DEV integration environment against the test cases identified by the test team, including UI verification and validation.
- Understand and analyze issues and use judgment to make decisions; bring strong problem-solving and troubleshooting skills.
- Bring strong communication and leadership skills, and take responsibility for self-development according to a professional development plan.

Qualifications
Required qualifications for consultants include:
- Highly proficient, customer-facing experience involving project envisioning, planning, design, development, and deployment of complex solutions.
- 7+ years of software development; automation-related experience is valued. Scripting languages such as Bash, Python, and PowerShell, or compiled languages such as C, C#, and Go are most relevant, but others are acceptable.
- Awareness of, and the ability to reason about, modern software and systems architectures, including load balancing, queueing, caching, distributed-systems failure modes generally, microservices, and so on.
- Associated troubleshooting skills, including the ability to follow RPC call chains across arbitrary network steps, and a consequent understanding of monitoring in distributed systems.
- Deep understanding of operating-system-level concepts such as processes, memory allocation, and the network stack; understanding of how applications are affected by these, and the ability to debug them.
- Experience working in a team, including coordinating large projects, communicating well, and exercising initiative when presented with problems.
- Strong understanding of customer and partner business and IT environments, with a proven record of delivering successful technical solutions.
- Practical experience running large-scale online systems is always an advantage.
- Demonstrated ability to focus on developing customer business agility and business value while providing deep experience with specific technologies.
- Strong communication, consulting, analytical, and problem-solving skills; a growth mindset; and a flexible work approach with strong proactivity and an inclination toward resolving customer situations.
- Demonstrated skills designing and developing technology in area(s) of specialization to solve business problems and help customers enhance existing applications; core skills must include Microsoft technology, with open-source experience helpful.
- Degree in Computer Science Engineering, MCA, or equivalent with work experience; higher relevant education preferred.
- MCSD/MCSE/MCAD/Azure certification is a plus.

Required Skills
- Strong object-oriented programming and design experience, and an understanding of technical design patterns.
- SQL programming knowledge.
- .NET 4.0/Core (ASP.NET, Web API, PowerShell scripting, MVC, TFS/GitHub).
- Azure: deep technical understanding of PaaS and good experience with Azure services such as Service Fabric, Web Apps, Cosmos DB, Azure Storage, Azure Service Bus, Azure Functions, API Management, and SQL, with Azure application development experience.
- Deep technical and architectural knowledge of at least three of the following areas, as well as a broad understanding across the Microsoft development toolset: cloud technologies (preferably Azure full stack); projects targeting a variety of form factors (e.g., mobile, IoT); development languages (e.g., C#, JavaScript, Java, and front-end skills such as React and Angular); application lifecycle management (e.g., Agile, TFS); solution architecture (e.g., SOA, enterprise architecture).
- Security: identity and access management (IAM), conditional access policies, strong authentication, risk-based access, encryption, backup and disaster recovery, data loss prevention (DLP), secure coding practices, input validation, output encoding, authentication and password management, session management, cryptography, threat and vulnerability identification, and security breach response.
- Techniques: implementing modern authentication providers (such as Entra ID), simulating attacks, writing resilient code, using encryption algorithms, hashing techniques, secret management with Key Vault, and cryptographic protocols.
- Tools: Azure Security Center, Azure Sentinel, Metasploit, Burp Suite, custom scripts.

Good-to-have skills
- Industry vertical knowledge (e.g., retail, banking and finance, telecom).
- Middleware technologies used for integrations.
- Knowledge of Azure DevOps, Docker, Kubernetes, etc.
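Since the required skills above call out hashing techniques and secret management, here is a small, generic sketch of salted password hashing using only Python's standard library. It illustrates the general technique, not Microsoft's prescribed practice; the iteration count and record layout are arbitrary choices made for the example.

```python
# Minimal sketch of salted password hashing with Python's standard library.
# Parameters (iteration count, salt length) are illustrative, not a policy recommendation.
import hashlib
import hmac
import os


def hash_password(password: str, *, iterations: int = 200_000) -> dict:
    """Derive a PBKDF2-HMAC-SHA256 hash with a random per-password salt."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode("utf-8"), salt, iterations)
    return {"salt": salt.hex(), "iterations": iterations, "hash": digest.hex()}


def verify_password(password: str, stored: dict) -> bool:
    """Recompute the derivation and compare in constant time."""
    candidate = hashlib.pbkdf2_hmac(
        "sha256",
        password.encode("utf-8"),
        bytes.fromhex(stored["salt"]),
        stored["iterations"],
    )
    return hmac.compare_digest(candidate, bytes.fromhex(stored["hash"]))


if __name__ == "__main__":
    record = hash_password("example-password")
    assert verify_password("example-password", record)
    assert not verify_password("wrong-password", record)
```

In an Azure setting, derived hashes and salts would typically live in a database, while signing keys and connection strings would be kept in a secret store such as Key Vault.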
Microsoft is an equal opportunity employer. Consistent with applicable law, all qualified applicants will receive consideration for employment without regard to age, ancestry, citizenship, color, family or medical care leave, gender identity or expression, genetic information, immigration status, marital status, medical condition, national origin, physical or mental disability, political affiliation, protected veteran or military status, race, ethnicity, religion, sex (including pregnancy), sexual orientation, or any other characteristic protected by applicable local laws, regulations and ordinances. If you need assistance and/or a reasonable accommodation due to a disability during the application process, read more about requesting accommodations.

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Indore, Madhya Pradesh, India

Remote

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Chandigarh, India

Remote

Experience: 3.00+ years
Salary: INR 3,000,000 - 4,000,000 / year (based on experience)
Expected Notice Period: 15 Days
Shift: (GMT+05:30) Asia/Kolkata (IST)
Opportunity Type: Remote
Placement Type: Full-time permanent position (payroll and compliance managed by DRIMCO GmbH)
(Note: This is a requirement for one of Uplers' clients, an AI-powered Industrial Bid Automation Company.)

What do you need for this opportunity?
Must-have skills: Grafana, Graph, LLM, PLM systems, Prometheus, CI/CD, Dask, Kubeflow, MLflow, GCP, Python programming, PyTorch, Ray, scikit-learn, TensorFlow, Apache Spark, AWS, Azure, Docker, Kafka, Kubernetes, Machine Learning

The AI-powered Industrial Bid Automation Company is looking for:
We are driving the future of industrial automation and engineering by developing intelligent AI agents tailored for the manufacturing and automotive sectors. As part of our growing team, you’ll play a key role in building robust, scalable, and intelligent AI agentic products that redefine how complex engineering and requirements workflows are solved. Our highly skilled team includes researchers, technologists, entrepreneurs, and developers holding 15 patents and 20+ publications at prestigious scientific venues such as ICML, ICLR, and AAAI. Founded in 2020, we are pioneering collaborative requirement assessment in industry. The combination of the founder’s deep industry expertise, an OEM partnership with Siemens, multi-patented AI technologies, and VC backing positions us as the thought leader in the field of requirement intelligence.

🔍 Role Description
- Design, build, and optimize ML models for intelligent requirement understanding and automation.
- Develop scalable, production-grade AI pipelines and APIs.
- Own the deployment lifecycle, including model serving, monitoring, and continuous delivery.
- Collaborate with data engineers and product teams to ensure data integrity, performance, and scalability.
- Work on large-scale data processing and real-time pipelines.
- Contribute to DevOps practices such as containerization, CI/CD pipelines, and cloud deployments.
- Analyze and improve the efficiency and scalability of ML systems in production.
- Stay current with the latest AI/ML research and translate innovations into product enhancements.

🧠 What are we looking for
- 3+ years of experience in ML/AI engineering with shipped products.
- Proficiency in Python and its ML ecosystem (e.g., TensorFlow, PyTorch, scikit-learn).
- Strong software engineering practices: version control, testing, documentation.
- Experience with MLOps tools (e.g., MLflow, Kubeflow) and model deployment techniques (a minimal illustrative sketch follows this listing).
- Familiarity with Docker, Kubernetes, CI/CD, and cloud platforms (AWS, Azure, or GCP).
- Experience working with large datasets, data wrangling, and scalable data pipelines (Apache Spark, Kafka, Ray, Dask, etc.).
- Good understanding of microservices, distributed systems, and model performance optimization.
- Comfortable in a fast-paced startup environment; proactive and curious mindset.

🎯 Bonus Points
- Experience with natural language processing, document understanding, or Large Language Models (LLMs).
- Experience with Knowledge Graph technologies.
- Experience with logging/monitoring tools (e.g., Prometheus, Grafana).
- Knowledge of requirements engineering or PLM systems.

✨ What we offer
- Attractive compensation.
- Work on impactful AI products solving real industrial challenges.
- A collaborative, agile, and supportive team culture.
- Flexible work hours and location (hybrid/remote).

How to apply for this opportunity?
Step 1: Click on Apply and register or log in on our portal.
Step 2: Complete the screening form and upload your updated resume.
Step 3: Increase your chances of getting shortlisted and meet the client for the interview!

About Uplers: Our goal is to make hiring reliable, simple, and fast. Our role will be to help all our talents find and apply for relevant contractual onsite opportunities and progress in their careers. We will support any grievances or challenges you may face during the engagement. (Note: There are many more opportunities on the portal; depending on the assessments you clear, you can apply for them as well.) So, if you are ready for a new challenge, a great work environment, and an opportunity to take your career to the next level, don't hesitate to apply today. We are waiting for you!
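To make the MLOps expectations above concrete, here is a minimal, illustrative Python sketch of experiment tracking and model packaging with MLflow and scikit-learn, the kind of workflow the role describes. The experiment name, hyperparameters, and synthetic data are assumptions for illustration only and are not taken from the company's stack.

```python
# Minimal illustrative sketch: train a model, track the run with MLflow,
# and log the fitted model so it can later be served or registered.
# The experiment name and hyperparameters below are hypothetical.
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Synthetic stand-in for real requirement/classification data.
X, y = make_classification(n_samples=1_000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

mlflow.set_experiment("requirement-classifier-demo")  # hypothetical name

params = {"C": 1.0, "max_iter": 200}
with mlflow.start_run():
    model = LogisticRegression(**params).fit(X_train, y_train)
    accuracy = accuracy_score(y_test, model.predict(X_test))

    mlflow.log_params(params)                 # record hyperparameters
    mlflow.log_metric("accuracy", accuracy)   # record evaluation metric
    mlflow.sklearn.log_model(model, "model")  # package the model as an artifact
    print(f"logged run with accuracy={accuracy:.3f}")
```

Runs logged this way can be inspected with `mlflow ui` and promoted through a model registry before deployment, which is roughly where the Docker/Kubernetes and CI/CD skills in the listing come in.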

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Surat, Gujarat, India

Remote

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Dehradun, Uttarakhand, India

Remote

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Mysore, Karnataka, India

Remote

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Vijayawada, Andhra Pradesh, India

Remote

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Thiruvananthapuram, Kerala, India

Remote

Posted 1 day ago

Apply

3.0 years

30 - 40 Lacs

Patna, Bihar, India

Remote

Posted 1 day ago

Apply

0 years

0 Lacs

Gurgaon, Haryana, India

On-site

Introduction
A career in IBM Consulting is rooted in long-term relationships and close collaboration with clients across the globe. You'll work with visionaries across multiple industries to improve the hybrid cloud and AI journey for the most innovative and valuable companies in the world. Your ability to accelerate impact and make meaningful change for your clients is enabled by our strategic partner ecosystem and our robust technology platforms across the IBM portfolio.

Your Role and Responsibilities
As a Developer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in requirement understanding: gathering requirements to define data definitions, transformation logic, logical and physical data model designs, data flows, and processes. You will take ownership of projects and manage statuses and timelines effectively. Strong communication skills and the ability to work collaboratively in a team environment are essential.

Preferred Education
Master's Degree

Required Technical and Professional Expertise
- Strong proficiency in C#, ASP.NET, and .NET Core 6.0+.
- Hands-on experience with front-end technologies such as HTML5, CSS3, JavaScript, and front-end frameworks (e.g., React, Angular).
- Solid understanding of microservices architecture, design patterns, and SOLID principles.
- Good understanding of RDBMS concepts, with experience in SQL Server.
- Excellent problem-solving skills and the ability to troubleshoot complex issues.

Preferred Technical and Professional Experience
- Exposure to DevOps practices and managing CI/CD pipelines.
- Exposure to Azure services for cloud-based application development.

Posted 1 day ago

Apply

10.0 - 18.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Total Experience: 10 - 18 Years
Locations: PAN India

Desired Skills:
1. Understanding of the client's business objectives, challenges, scope, and expected business outcomes through Cloud and AI enablement
2. Conduct advisory engagements, with technical experience in one or two areas: application/data modernization, industry cloud solutions, M&A integration, cloud optimization, or ecosystem integration
3. Assess the existing application landscape and determine suitability for cloud-led business transformation (migration/modernization)
4. Provide technical guidance for cloud platform selection and migration/modernization
5. Develop hybrid cloud roadmaps and strategies for clients; conduct comprehensive cloud assessments and distributed cloud architecture design; create implementation plans and roadmaps
6. Maintain client stakeholder relationships, identify downstream opportunities, and ensure closure of advisory commitments
7. Conduct cost optimization analyses, identify opportunities to reduce cloud spending, and maximize return on cloud investments
8. Ensure security and regulatory compliance; provide training and education to clients on cloud technologies

Good to Have:
- Cloud certification / advanced cloud certification
- Previous cloud delivery, advisory, and solutioning experience in cloud-led business transformation engagements
- Business understanding of industry processes, mergers and divestitures, the post-cloud-adoption operating model, data structures, data availability assessment, and AI enablement
- Technical proficiency in one or more of the following areas: distributed cloud architecture design, migration and modernization techniques, microservices and serverless architecture, API and integration techniques, DevSecOps, cloud-native services, cloud resource optimization, and FinOps techniques

Posted 2 days ago

Apply

0 years

0 Lacs

Pune, Maharashtra, India

On-site

Posted 2 days ago

Apply

10.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site

Job Summary
We are seeking a skilled Java MFA Developer with 6-10 years of experience to join our dynamic team. This opportunity is perfect for professionals who are passionate about Identity and Access Management (IAM) and looking to deepen their expertise in Java, microservices, Docker, and cloud-native development. In this role, you’ll work hands-on with tools and platforms such as Helm, Kubernetes, Docker, and other containerization technologies. You'll also apply your Java development and DevOps knowledge to design and deliver secure, scalable solutions. We understand that not every candidate will have experience with all the technologies listed. If you’re passionate, committed, and eager to learn, we’re here to support your growth and help you succeed.

Key Responsibilities
· 6-10 years of experience in Java, Spring, Spring Boot, and microservices
· Cloud proficiency (AWS, Azure, or GCP preferred)
· Hands-on experience with Helm, Kubernetes, Docker, and containerization tools
· Agile/Scrum working experience
· Strong troubleshooting and solution-oriented mindset
· Experience with the OpenShift platform
· Exposure to Oracle DB (SQL, stored procedures, capacity planning)
· Familiarity with MFA products (any); a minimal sketch of the TOTP mechanism behind most MFA tokens follows this listing
· Regular updates to technical leads and proactive issue resolution
· Ability to integrate and innovate across components

Additional Skills (Nice to Have)
· Full-stack development exposure
· Maven / Gradle
· GitLab CI/CD, Vault, and test automation integration
· Shell scripting on Linux
· Monitoring tools: Splunk, DX-APM, or similar
· Understanding of networking and system monitoring fundamentals
· Git flow knowledge

Soft Skills
· Strong analytical and problem-solving abilities
· Excellent communication and collaboration skills
· Self-starter, capable of handling multiple priorities independently

Educational Qualifications
· Bachelor's degree in Computer Science, Information Technology, or a related field

Salary & Benefits
· Salary: based on experience and expertise
· Competitive benefits package
· Health insurance, retirement plans, and paid time off
· Support for certifications and continuous professional development

Why Join Us?
· Work on cutting-edge IAM and MFA solutions
· Collaborate in a high-performing, supportive team
· Exposure to impactful projects with enterprise-level clients
· Opportunity to grow and learn new technologies in a fast-paced, innovative environment

Interested candidates can send their CVs to: careers@trevonix.com
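The posting's stack is Java, but the TOTP mechanism (RFC 6238) behind most MFA products is language-agnostic. As a rough, self-contained illustration (not the employer's implementation), it can be sketched with the Python standard library alone; the secret below is a hypothetical demo value.

```python
# Illustrative TOTP (RFC 6238) sketch using only the standard library.
# A conceptual aid, not a substitute for a vetted MFA product.
import base64
import hashlib
import hmac
import struct
import time

def hotp(secret_b32: str, counter: int, digits: int = 6) -> str:
    """HOTP (RFC 4226): HMAC-SHA1 over a counter, dynamically truncated."""
    key = base64.b32decode(secret_b32, casefold=True)
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10**digits).zfill(digits)

def totp(secret_b32: str, step: int = 30, digits: int = 6) -> str:
    """TOTP (RFC 6238): HOTP driven by the current 30-second time window."""
    return hotp(secret_b32, int(time.time()) // step, digits)

def verify(secret_b32: str, submitted: str, window: int = 1) -> bool:
    """Accept codes from adjacent time windows to tolerate clock drift."""
    now = int(time.time()) // 30
    return any(
        hmac.compare_digest(hotp(secret_b32, now + offset), submitted)
        for offset in range(-window, window + 1)
    )

if __name__ == "__main__":
    demo_secret = "JBSWY3DPEHPK3PXP"  # hypothetical base32 seed, demo only
    code = totp(demo_secret)
    print(code, verify(demo_secret, code))  # a freshly generated code verifies
```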

Posted 2 days ago

Apply

8.0 years

0 Lacs

Pune, Maharashtra, India

On-site

Company Description
NewBridge is a fintech platform dedicated to digital transformation in the global loans and deposits market. By automating workflows and transitioning from paper- and voice-based transactions to digital solutions, NewBridge ensures efficiency and transparency in financial transactions. We operate two significant platforms: LoanBook, a leading-edge primary issuance and secondary trading platform for loans, and DepositBook, a digital network connecting global institutional borrowers and lenders for wholesale deposits. Our clients include banks, corporations, pension funds, asset managers, family offices, insurance companies, and private banks.

Role Description
This is a full-time on-site role for a Senior Java Software Engineer located in Pune. The Senior Java Software Engineer will be responsible for developing software solutions, implementing microservices architecture, and programming using Java and the Spring Framework. Daily tasks will include designing, coding, testing, and maintaining software applications, collaborating with cross-functional teams, and contributing to all phases of the development lifecycle.

Key Responsibilities:
• Backend Microservices Development: Design, develop, and maintain robust, scalable backend microservices using Java (Spring Boot) to support our evolving product portfolio (a minimal sketch of this pattern follows this listing).
• System Design and Implementation: Participate in the full software development lifecycle, from requirements analysis and system design to coding, testing, deployment, and ongoing support.
• Technical Leadership: Provide technical guidance and mentorship to junior developers, fostering a culture of continuous learning and knowledge sharing.
• Performance Optimization: Analyze and optimize system performance, ensuring our microservices architecture can handle high traffic and deliver exceptional user experiences.
• Collaboration: Work effectively within a cross-functional team, including product managers, designers, and QA engineers, to deliver high-quality solutions.
• Documentation: Create clear and concise technical documentation to support knowledge transfer and maintainability.

Qualifications:
Essential:
o 8+ years of hands-on software development experience in Java, preferably with Spring Boot.
o Proven expertise in object-oriented programming (OOP) and design principles (SOLID).
o Strong understanding of microservices architecture and best practices.
o Proficiency in AWS cloud services and infrastructure.
o Experience with building and consuming RESTful APIs, utilizing OpenAPI/Swagger for documentation.
o Deep knowledge of JPA, ORM, and SQL/NoSQL databases.
o Experience with version control systems (Git) and CI/CD pipelines.
Desired:
o Knowledge of Kotlin.
o Familiarity with concepts like service mesh, ingress controllers, and API gateways.
o Hands-on experience with Docker, Kubernetes, or other containerization technologies.
o Understanding of infrastructure security best practices for web applications.
o Experience with SonarQube or similar code quality tools.
o Experience in the fintech industry is a plus.
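NewBridge's stack is Java/Spring Boot, but the backend pattern this role describes (a small service exposing typed REST endpoints with auto-generated OpenAPI documentation and a health probe for container orchestration) can be sketched compactly in Python with FastAPI. Everything below, including the service name, endpoints, and pricing logic, is a hypothetical illustration, not NewBridge's API.

```python
# Minimal sketch of the microservice pattern described above: typed request/
# response models, a REST endpoint, OpenAPI docs (served at /docs), and a
# health endpoint suitable as a Kubernetes liveness probe. All names are
# illustrative.
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="deposit-quote-service")  # hypothetical service name

class QuoteRequest(BaseModel):
    notional: float    # deposit amount
    tenor_months: int  # term length

class QuoteResponse(BaseModel):
    notional: float
    tenor_months: int
    indicative_rate: float

@app.get("/health")
def health() -> dict:
    # Liveness/readiness probe target for container orchestration.
    return {"status": "ok"}

@app.post("/quotes", response_model=QuoteResponse)
def create_quote(req: QuoteRequest) -> QuoteResponse:
    # Placeholder pricing rule; a real service would delegate to a pricing engine.
    rate = 0.05 + 0.001 * req.tenor_months
    return QuoteResponse(
        notional=req.notional,
        tenor_months=req.tenor_months,
        indicative_rate=rate,
    )

# Run locally with: uvicorn app:app --reload
```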

Posted 2 days ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies