7.0 - 12.0 years
14 - 24 Lacs
Gurugram
Hybrid
Gen AI + DS + ML Ops

Job Title: Generative AI and Data Science Engineer with MLOps Expertise
Location: Gurgaon, India
Employment Type: Full-time

About the Role:
We are seeking a versatile and highly skilled Generative AI and Data Science Engineer with strong MLOps expertise. This role combines deep technical knowledge in data science and machine learning with a focus on designing and deploying scalable, production-level AI solutions. You will work with cross-functional teams to drive AI/ML projects from research and prototyping through to deployment and maintenance, ensuring model robustness, scalability, and efficiency.

Responsibilities:

Generative AI Development and Data Science:
Design, develop, and fine-tune generative AI models for various applications such as natural language processing, image synthesis, and data augmentation.
Perform exploratory data analysis (EDA) and statistical modeling to identify trends, patterns, and actionable insights.
Collaborate with data engineering and product teams to create data pipelines for model training, testing, and deployment.
Apply data science techniques to optimize model performance and address real-world business challenges.

Machine Learning Operations (MLOps):
Implement MLOps best practices for managing and automating the end-to-end machine learning lifecycle, including model versioning, monitoring, and retraining.
Build, maintain, and optimize CI/CD pipelines for ML models to streamline development and deployment processes.
Ensure scalability, robustness, and security of AI/ML systems in production environments.
Develop tools and frameworks for monitoring model performance and detecting anomalies post-deployment.

Research and Innovation:
Stay current with advancements in generative AI, machine learning, and MLOps technologies and frameworks.
Identify new methodologies, tools, and technologies that could enhance our AI and data science capabilities.
Engage in R&D initiatives and collaborate with team members on innovative projects.

Requirements:

Educational Background: Bachelor's or Master's degree in Computer Science, Data Science, Engineering, or a related field. PhD is a plus.

Technical Skills:
Proficiency in Python and familiarity with machine learning libraries (e.g., TensorFlow, PyTorch, Keras, scikit-learn).
Strong understanding of generative AI models (e.g., GANs, VAEs, transformers) and deep learning techniques.
Experience with MLOps frameworks and tools such as MLflow, Kubeflow, Docker, and CI/CD platforms.
Knowledge of data science techniques for EDA, feature engineering, statistical modeling, and model evaluation.
Familiarity with cloud platforms (e.g., AWS, Google Cloud, Azure) for deploying and scaling AI/ML models.

Soft Skills:
Ability to collaborate effectively across teams and communicate complex technical concepts to non-technical stakeholders.
Strong problem-solving skills and the ability to innovate in a fast-paced environment.

Preferred Qualifications:
Prior experience in designing and deploying large-scale generative AI models.
Proficiency in SQL and data visualization tools (e.g., Tableau, Power BI).
Experience with model interpretability and explainability frameworks.
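The posting asks for MLOps practices such as model versioning and lifecycle tracking with tools like MLflow. Below is a minimal, hedged sketch of how a single training run might be logged and versioned with MLflow; the dataset, model, and parameter names are illustrative assumptions, not the employer's actual stack.

```python
# Hedged sketch: logging a scikit-learn model to MLflow for versioning/monitoring.
# Assumes an MLflow tracking server (local or remote) is already configured.
import mlflow
import mlflow.sklearn
from sklearn.datasets import load_diabetes
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

X, y = load_diabetes(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

with mlflow.start_run(run_name="rf-baseline"):          # each run is tracked and versioned
    params = {"n_estimators": 100, "max_depth": 6}       # illustrative hyperparameters
    model = RandomForestRegressor(**params).fit(X_train, y_train)
    mse = mean_squared_error(y_test, model.predict(X_test))

    mlflow.log_params(params)                             # record hyperparameters
    mlflow.log_metric("mse", mse)                         # record evaluation metric
    mlflow.sklearn.log_model(model, artifact_path="model")  # store the model artifact
```

A run logged this way can later be compared against retrained versions, which is the kind of versioning-and-monitoring loop the responsibilities above describe.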
Posted 1 week ago
7.0 - 12.0 years
5 - 15 Lacs
Hyderabad
Work from Office
Job Title: Tech Lead
Location: Hyderabad, India
Experience: 8-10 years
Employment Type: Full-time

About the Role
We're looking for a passionate and driven Tech Lead to take charge of our technology roadmap, owning end-to-end development, testing, and DevOps. You'll be at the heart of our engineering team, helping translate business goals into robust, scalable, and secure technical solutions. This role offers the unique opportunity to lead from the front, mentor engineers, and help build a product that's redefining how the manufacturing ecosystem connects and grows.

Key Responsibilities
Lead architecture, development, and delivery of core product modules using scalable and modular designs
Set up and enforce code quality, testing standards, CI/CD pipelines, and deployment best practices
Collaborate cross-functionally with product, design, and business teams to align technology with user needs
Drive and evolve the DevOps infrastructure (monitoring, logging, scalability, cloud ops, and automation)
Manage sprint cycles, perform code reviews, and ensure engineering KPIs are met
Mentor junior engineers, foster a culture of innovation, and ensure high standards of engineering excellence
Troubleshoot and resolve production issues quickly with a bias for action
Keep a pulse on new technologies and continuously improve the tech stack

What We're Looking For
5-8 years of hands-on experience in full-stack development (e.g., PHP, Node.js, React, CodeIgniter)
Solid experience with DevOps practices (CI/CD, containerisation, cloud infrastructure; AWS preferred)
Strong grasp of testing methodologies (unit, integration, automation, regression) and tools like Jest, Selenium, etc.
Proficiency with databases (MySQL, PostgreSQL, MariaDB, etc.) and RESTful APIs
Experience working in agile, startup-style teams; comfortable with fast and frequent iterations
Excellent communication and leadership skills; ability to mentor and drive team productivity. Team lead or project management experience is a plus.

Bonus Points If You Have
Experience with CodeIgniter, Laravel, or React.js
Exposure to AI/ML integration or data-intensive applications
Worked in a B2B SaaS environment
Familiarity with microservices architecture and security protocols

Why Join Us?
Opportunity to build a product that's solving real problems in a high-impact industry
High ownership, zero bureaucracy, fast decision-making
Work closely with the founders and leadership team
ESOPs
Posted 1 week ago
5.0 - 10.0 years
10 - 13 Lacs
Pune, Chennai, Mumbai (All Areas)
Hybrid
Hello Candidates, We are Hiring!!

Job Position: Sr. Alfresco Developer
Experience: 5+ Years
Location: Pune, Mumbai, Chennai
Work mode: Hybrid (3 days WFO)

JOB DESCRIPTION
We are seeking a skilled Alfresco Developer to join our team, responsible for designing, developing, and implementing enterprise content management (ECM) solutions using Alfresco. The ideal candidate will have a strong background in Alfresco architecture, content modeling, and integration with enterprise systems, alongside DevOps experience for deployment and scalability.

Key Responsibilities:
• Design and develop custom solutions using Alfresco Content Services, Share, and ADF.
• Develop and manage content models, workflows, and security models (ACLs).
• Create and manage Alfresco module packages for system customization.
• Implement and maintain Records Management and Governance Services features.
• Utilize Alfresco APIs for application integration and extension.
• Integrate Alfresco with external enterprise applications (e.g., ERP, CRM).
• Configure and optimize Solr for content indexing and search performance.
• Support and enhance system security, permissions, and compliance.
• Work with DevOps tools including Docker and containerization for deployment and CI/CD processes.
• Administer and optimize databases including PostgreSQL or MySQL.
• Monitor, troubleshoot, and resolve production issues.

Required Skills & Qualifications:
• Strong experience with Alfresco Content Services and Alfresco Share / ADF.
• Expertise in content modeling, workflow design, and security (ACLs).
• Familiarity with Alfresco Governance Services and Records Management.
• Proficiency with Alfresco APIs (REST, CMIS).
• Solid understanding of Solr indexing and search integration.
• Experience with enterprise application integration.
• Hands-on experience with Docker, containers, and CI/CD pipelines.
• Working knowledge of PostgreSQL or MySQL.
• Excellent problem-solving skills and the ability to work independently or in a team environment.

Preferred Qualifications:
• Alfresco Certified Engineer or Administrator (ACE / ACA) is a plus.
• Experience with Kubernetes or cloud platforms (AWS, Azure) is an advantage.
• Familiarity with Agile development practices.

NOTE: Interested candidates can share their resume at shrutia.talentsketchers@gmail.com
Posted 2 weeks ago
4.0 - 6.0 years
13 - 14 Lacs
Bengaluru
Work from Office
Job Title: Python Automation Engineer
Experience: 4+ Years
Location: Bangalore
Employment Type: Full-Time
Work Mode: Onsite

Job Summary:
We are seeking a skilled and detail-oriented Python Automation Engineer with 4+ years of experience to join our dynamic team in Bangalore. The ideal candidate should have a strong background in scripting and automation frameworks, with the ability to design, develop, and maintain automation solutions that improve the efficiency and reliability of software systems.

Key Responsibilities:
Design and develop scalable test automation frameworks using Python.
Automate functional, regression, integration, and performance test cases.
Integrate automated tests into CI/CD pipelines (e.g., Jenkins, GitLab CI).
Troubleshoot issues and provide detailed root cause analysis.
Collaborate closely with development and QA teams to ensure quality delivery.
Maintain and update existing automation scripts as per evolving application features.
Contribute to continuous improvements in automation and testing processes.

Required Skills:
Strong hands-on experience in Python programming for automation.
Experience with Selenium, Pytest, or similar test automation tools.
Solid understanding of the software testing lifecycle, test case design, and bug tracking.
Familiarity with REST API testing using tools like Postman, Requests, etc.
Experience with version control systems like Git.
Exposure to CI/CD tools such as Jenkins, GitHub Actions, or GitLab CI.
Good understanding of Linux/Unix commands and shell scripting.

Preferred Skills:
Experience with performance testing or load testing tools (e.g., JMeter).
Knowledge of containerization tools like Docker.
Exposure to cloud platforms like AWS, Azure, or GCP.
Familiarity with Agile methodologies and working in a Scrum environment.

Qualifications:
Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
4+ years of relevant experience in Python automation testing.

Why Join Us?
Work with a passionate team on cutting-edge technologies.
Opportunity to grow your automation and DevOps skills.
Flexible work environment and strong career development programs.
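Since the role centres on Pytest-based automation and REST API testing with the Requests library, here is a small hedged sketch of what such an automated API check could look like; the base URL and response fields are placeholders, not the actual system under test.

```python
# Hedged sketch: Pytest-style REST API checks using the requests library.
# BASE_URL and the expected response fields are hypothetical placeholders.
import pytest
import requests

BASE_URL = "https://api.example.com"  # placeholder for the system under test


@pytest.fixture(scope="session")
def session():
    # Shared HTTP session so connection setup and headers are reused across tests.
    s = requests.Session()
    s.headers.update({"Accept": "application/json"})
    return s


def test_health_endpoint_returns_ok(session):
    resp = session.get(f"{BASE_URL}/health", timeout=10)
    assert resp.status_code == 200


def test_user_payload_has_expected_fields(session):
    resp = session.get(f"{BASE_URL}/users/1", timeout=10)
    assert resp.status_code == 200
    body = resp.json()
    # Field names below are assumptions for illustration only.
    assert "id" in body and "email" in body
```

Tests written this way drop straight into a Jenkins or GitLab CI stage with a plain `pytest` invocation, which is the CI/CD integration the responsibilities above call for.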
Posted 2 weeks ago
4.0 - 8.0 years
10 - 20 Lacs
Bengaluru
Hybrid
We are recruiting a broad range of technical disciplines and are looking to hire 'T-shaped' engineers across most profiles. Most importantly, you must have good coding capability to deliver our products.

Expertise with programming languages (e.g., Java, JavaScript (ReactJS preferred), TypeScript), database technologies (SQL and NoSQL; experience with MongoDB is welcome), and release & configuration management tools (e.g., Chef, Puppet)
Confident with Service Oriented and Microservice based architectures (RESTful, NodeJS, Apigee)
Experience with scripting languages (Python, Shell, PowerShell) in cloud environments, with a focus on IaaS and PaaS in Azure
Strong understanding of modern DevOps platform technologies (GitHub Actions), incl. infrastructure-as-code (e.g., Terraform, Ansible) and containers such as Docker and Kubernetes
Experience with code quality and code security tools (e.g., SonarQube, DataDog)
Good knowledge of Behaviour Driven Development and test automation tooling (Jest, Jasmine, JUnit, Playwright)
Experience with the Atlassian suite
Experience with agile methodologies
Experience working in a cloud-native environment (Azure); good knowledge of Azure services and infrastructure configuration & deployment
Desirable: Azure Certified (AZ-204)
Posted 2 weeks ago
5.0 - 10.0 years
12 - 20 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
A strong Tosca Test Automation profile is urgently needed. Minimum 7+ years of experience.

Skill sets required:
Strong testing experience
Experience in integrating Tosca with CI/CD pipelines
Knowledge of other test automation tools
Selenium/Python knowledge
Strong SQL knowledge; ETL/DWH testing
Experience in developing an automation framework from scratch
Ability to drive work collaboratively within cross-functional teams
Agile methodologies
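The profile asks for Selenium/Python knowledge alongside Tosca. A minimal hedged Selenium sketch follows to show the kind of browser-level check such a role might automate; the URL, element locators, and post-login title are placeholders, not a real application.

```python
# Hedged sketch: a basic Selenium 4 browser check in Python.
# The URL, element IDs, and expected title are illustrative placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()  # assumes a local Chrome/ChromeDriver setup
try:
    driver.get("https://example.com/login")                      # placeholder URL
    driver.find_element(By.ID, "username").send_keys("qa_user")  # placeholder locator
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.ID, "submit").click()
    assert "Dashboard" in driver.title  # assumed post-login page title
finally:
    driver.quit()  # always release the browser session
```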
Posted 2 weeks ago
8.0 - 10.0 years
2 - 3 Lacs
Noida
Remote
Urgent Requirement for Software Architect. Experience with AWS services such as Lambda, API Gateway, and S3, and design using Angular (2+), Node.js, TypeScript, JavaScript, Docker, CI/CD pipelines, Ajax, JSON, Bootstrap, and RESTful APIs.
Posted 2 weeks ago
5.0 - 7.0 years
6 - 8 Lacs
Kolkata
Remote
Note: Please don't apply if you do not have at least 3 years of Scrapy experience.

We are seeking a highly experienced Web Scraping Expert specialising in Scrapy-based web scraping and large-scale data extraction. This role is focused on building and optimizing web crawlers, handling anti-scraping measures, and ensuring efficient data pipelines for structured data collection. The ideal candidate will have 5+ years of hands-on experience developing Scrapy-based scraping solutions, implementing advanced evasion techniques, and managing high-volume web data extraction. You will collaborate with a cross-functional team to design, implement, and optimize scalable scraping systems that deliver high-quality, structured data for critical business needs.

Key Responsibilities

Scrapy-based Web Scraping Development
Develop and maintain scalable web crawlers using Scrapy to extract structured data from diverse sources.
Optimize Scrapy spiders for efficiency, reliability, and speed while minimizing detection risks.
Handle dynamic content using middlewares, browser-based scraping (Playwright/Selenium), and API integrations.
Implement proxy rotation, user-agent switching, and CAPTCHA solving techniques to bypass anti-bot measures.

Advanced Anti-Scraping Evasion Techniques
Utilize AI-driven approaches to adapt to bot detection and prevent blocks.
Implement headless browser automation and request-mimicking strategies to mimic human behavior.

Data Processing & Pipeline Management
Extract, clean, and structure large-scale web data into structured formats like JSON, CSV, and databases.
Optimize Scrapy pipelines for high-speed data processing and storage in MongoDB, PostgreSQL, or cloud storage (AWS S3).

Code Quality & Performance Optimization
Write clean, well-structured, and maintainable Python code for scraping solutions.
Implement automated testing for data accuracy and scraper reliability.
Continuously improve crawler efficiency by minimizing IP bans, request delays, and resource consumption.

Required Skills and Experience

Technical Expertise
5+ years of professional experience in Python development with a focus on web scraping.
Proficiency in Scrapy-based scraping.
Strong understanding of HTML, CSS, JavaScript, and browser behavior.
Experience with Docker will be a plus.
Expertise in handling APIs (RESTful and GraphQL) for data extraction.
Proficiency in database systems like MongoDB and PostgreSQL.
Strong knowledge of version control systems like Git and collaboration platforms like GitHub.

Key Attributes
Strong problem-solving and analytical skills, with a focus on efficient solutions for complex scraping challenges.
Excellent communication skills, both written and verbal.
A passion for data and a keen eye for detail.

Why Join Us?
Work on cutting-edge scraping technologies and AI-driven solutions.
Collaborate with a team of talented professionals in a growth-driven environment.
Opportunity to influence the development of data-driven business strategies through advanced scraping techniques.
Competitive compensation and benefits.
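Given the posting's focus on Scrapy spiders that emit structured JSON/CSV records, here is a minimal hedged spider sketch. It targets the public Scrapy demo site quotes.toscrape.com purely for illustration; a production crawler for this role would add the proxy rotation, user-agent switching, and anti-bot handling the description mentions.

```python
# Hedged sketch: a minimal Scrapy spider yielding structured items with pagination.
# The target site and selectors are the standard public demo, not a client system.
import scrapy


class QuotesSpider(scrapy.Spider):
    name = "quotes"
    start_urls = ["https://quotes.toscrape.com/"]

    custom_settings = {
        "DOWNLOAD_DELAY": 1.0,         # polite crawling; tune per target
        "AUTOTHROTTLE_ENABLED": True,  # built-in adaptive throttling
    }

    def parse(self, response):
        # Each quote block becomes one structured item (feeds can export JSON/CSV).
        for quote in response.css("div.quote"):
            yield {
                "text": quote.css("span.text::text").get(),
                "author": quote.css("small.author::text").get(),
            }
        # Follow the "next" pagination link with the same callback.
        next_page = response.css("li.next a::attr(href)").get()
        if next_page:
            yield response.follow(next_page, callback=self.parse)
```

Run with `scrapy crawl quotes -o quotes.json` (assuming a standard Scrapy project layout) to get the structured output the pipeline-management duties above describe.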
Posted 2 weeks ago
7.0 - 11.0 years
15 - 20 Lacs
Bengaluru
Hybrid
The Sr. Application Services Engineer will be responsible for installing and upgrading HealthEdge's award-winning applications in our private cloud, as well as customer-hosted instances. The HealthEdge Application Engineering Team is the team of technical experts at HealthEdge that runs and manages application operations for the Company's clients. Environments are technically sophisticated, and the ideal candidate is looking to learn and work within the agile HealthEdge Deployment Methodology, custom and COTS tools we use to manage the environment, automation technologies, alerting, and others.

Responsibilities:
Effectively understands and can execute tactical assignments with some oversight from senior team members
Implements assignments with no defects or issues; all work is delivered on time and within acceptable quality levels
Responds to issues with a strong sense of urgency; works collaboratively until issues are resolved
Diligent, well organized, and always focused on process improvement; recommends and implements process improvements
Effective communicator and documenter; documents processes and trains team members with ease
Coaches and trains junior team members in a timely manner
Demonstrated expertise in some, but not all, technologies deployed in the HealthRules Cloud
Identifies knowledge gaps and closes them; demonstrates consistent intellectual curiosity and a drive to learn more and self-develop
Performs all job functions consistent with HealthEdge policies and procedures, including those which govern handling PHI and PII
Automates day-to-day manual and repeatable tasks

Requirements:
7+ years of experience in developing, supporting, or deploying applications in complex environments
Bachelor's degree in computer science, MIS or related disciplines, or equivalent work experience
Experience as a deployment or solutions engineer, system/platform engineer, in technical application support, or in a system administration role
Experience with deployment and automation tools (Ansible, Chef, Puppet)
An engineering mindset, focused on designing, implementing, and improving processes and technology
Strong working knowledge and hands-on experience with major J2EE middleware tier software offerings such as WebLogic, WebSphere, Karaf, Apache, and JBoss
Strong Linux skills; to be successful, you must be comfortable on a Linux server and able to easily navigate around a server
Experience with PL/SQL and related relational database technologies
Ability to lead technical projects and guide other resources
Ability and desire to learn new technologies and apply new skills
Excellent teamwork and communication skills, both verbal and written
Ability to play a vital role in high-visibility assignments, and to communicate directly with senior management
Ability to manage multiple tasks, set priorities, and communicate to project teams
Strong troubleshooting skills are a must
Shell scripting skills are a strong plus
Occasional off-hours work (including weekends) required to perform upgrades in production environments
Posted 2 weeks ago
2.0 - 3.0 years
6 - 7 Lacs
Mumbai
Work from Office
Develop and optimize Python-based microservices for OCR and image processing applications.
Design, implement, and maintain scalable solutions using GCP (Google Cloud Platform) and Kubernetes.
Posted 2 weeks ago
4.0 - 9.0 years
0 - 1 Lacs
Hyderabad
Work from Office
Job Title: Software Engineer - Data Engineer
Position: Software Engineer
Experience: 4-9 years
Category: Software Development / Engineering
Shift Timings: 1:00 pm to 10:00 pm
Main location: Hyderabad
Work Type: Work from office
Notice Period: 0-30 Days
Skill: Python, PySpark, Databricks
Employment Type: Full Time

• Bachelor's in Computer Science, Computer Engineering or related field

Required qualifications to be successful in this role

Must have Skills:
• 3+ yrs. development experience with Spark (PySpark), Python and SQL.
• Extensive knowledge building data pipelines.
• Hands-on experience with Databricks development.
• Strong experience developing on Linux OS.
• Experience with scheduling and orchestration (e.g. Databricks Workflows, Airflow, Prefect, Control-M).

Good to have skills:
• Solid understanding of distributed systems, data structures, design principles.
• Agile Development Methodologies (e.g. SAFe, Kanban, Scrum).
• Comfortable communicating with teams via showcases/demos.
• Play a key role in establishing and implementing migration patterns for the Data Lake Modernization project.
• Actively migrate use cases from our on-premises Data Lake to Databricks on GCP.
• Collaborate with Product Management and business partners to understand use case requirements and reporting.
• Adhere to internal development best practices/lifecycle (e.g. Testing, Code Reviews, CI/CD, Documentation).
• Document and showcase feature designs/workflows.
• Participate in team meetings and discussions around product development.
• Stay up to date on the latest industry trends and design patterns.
• 3+ years experience with Git.
• 3+ years experience with CI/CD (e.g. Azure Pipelines).
• Experience with streaming technologies, such as Kafka, Spark.
• Experience building applications on Docker and Kubernetes.
• Cloud experience (e.g. Azure, Google).

Interested candidates can drop their resume at: kalyan.v@talent21.in
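The role is built around PySpark and Databricks pipelines. A small hedged PySpark sketch of the read-transform-write batch pattern such pipelines follow is shown below; the input/output paths and column names are assumptions for illustration, not the employer's actual data model.

```python
# Hedged sketch: a simple PySpark batch pipeline (read -> transform -> write).
# Source/target paths and column names are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-daily-aggregate").getOrCreate()

orders = spark.read.parquet("/mnt/raw/orders")  # assumed raw-layer path

daily_revenue = (
    orders
    .filter(F.col("status") == "COMPLETED")              # assumed column and value
    .withColumn("order_date", F.to_date("created_at"))   # assumed timestamp column
    .groupBy("order_date")
    .agg(
        F.sum("amount").alias("revenue"),
        F.countDistinct("customer_id").alias("customers"),
    )
)

# Write the curated aggregate; on Databricks this step would typically be
# scheduled via Databricks Workflows or Airflow, as the posting mentions.
daily_revenue.write.mode("overwrite").parquet("/mnt/curated/daily_revenue")
```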
Posted 2 weeks ago
7.0 - 10.0 years
22 - 30 Lacs
Hyderabad
Work from Office
Key Responsibilities:
Develop and maintain web applications using .NET Core and Azure technologies.
Implement and manage CI/CD pipelines using Azure DevOps.
Create and maintain comprehensive documentation for developed applications.
Collaborate with front-end developers to integrate user-facing elements with server-side logic.
Design and implement microservices architecture.
Develop and manage Azure Function Apps.
Work with SQL databases, PostgreSQL, and non-relational databases.
Utilize Elasticsearch for search and analytics.
Write and maintain QA unit tests to ensure code quality.
Stay updated with emerging technologies and apply them to improve existing solutions.
Work on global applications to ensure scalability and performance across different regions.

Requirements:
Proven experience with .NET Core and Azure technologies.
Proficiency in front-end development using React, Angular, or JavaScript.
Strong understanding of design patterns and OOP principles.
Excellent problem-solving skills and the ability to handle tasks independently.
Experience with Azure DevOps for CI/CD.
Strong documentation skills and knowledge of microservices architecture and Azure Function Apps.
Knowledge of Elasticsearch.
Experience with SQL databases, PostgreSQL, and non-relational databases.
Proficiency in using GitHub for version control.
Experience in writing QA unit tests.
Posted 2 weeks ago
4.0 - 8.0 years
0 - 1 Lacs
Hyderabad
Work from Office
Job Title: Software Engineer - Data Engineer
Position: Software Engineer
Experience: 4-6 years (profiles with less experience will be rejected)
Category: Software Development / Engineering
Shift Timings: 1:00 pm to 10:00 pm
Main location: Hyderabad
Work Type: Work from office
Notice Period: 0-30 Days
Skill: Python, PySpark, Databricks
Employment Type: Full Time

• Bachelor's in Computer Science, Computer Engineering or related field

Required qualifications to be successful in this role

Must have Skills:
• 3+ yrs. development experience with Spark (PySpark), Python and SQL.
• Extensive knowledge building data pipelines.
• Hands-on experience with Databricks development.
• Strong experience developing on Linux OS.
• Experience with scheduling and orchestration (e.g. Databricks Workflows, Airflow, Prefect, Control-M).

Good to have skills:
• Solid understanding of distributed systems, data structures, design principles.
• Agile Development Methodologies (e.g. SAFe, Kanban, Scrum).
• Comfortable communicating with teams via showcases/demos.
• Play a key role in establishing and implementing migration patterns for the Data Lake Modernization project.
• Actively migrate use cases from our on-premises Data Lake to Databricks on GCP.
• Collaborate with Product Management and business partners to understand use case requirements and reporting.
• Adhere to internal development best practices/lifecycle (e.g. Testing, Code Reviews, CI/CD, Documentation).
• Document and showcase feature designs/workflows.
• Participate in team meetings and discussions around product development.
• Stay up to date on the latest industry trends and design patterns.
• 3+ years experience with Git.
• 3+ years experience with CI/CD (e.g. Azure Pipelines).
• Experience with streaming technologies, such as Kafka, Spark.
• Experience building applications on Docker and Kubernetes.
• Cloud experience (e.g. Azure, Google).

Interested candidates can drop their resume at: tarun.k@talent21.in
Posted 2 weeks ago
12.0 - 16.0 years
6 - 15 Lacs
Pune
Hybrid
- Design, build, and maintain Azure infrastructure using Infrastructure as Code (Terraform and/or Bicep).
- Develop and manage CI/CD pipelines using Azure DevOps or GitHub Actions to automate build, test, and deployment processes.
- Collaborate with architects, developers, and security teams to implement best practices for cloud infrastructure, security, and compliance.
- Manage Azure resources (VMs, Networking, Storage, AKS, App Services, etc.) with automation and IaC.
- Monitor, troubleshoot, and optimize infrastructure for performance, reliability, and cost.
- Implement security controls and policies (Identity, RBAC, Key Vault, firewalls, etc.) in Azure environments.
- Maintain documentation for infrastructure, procedures, and standards.
- Participate in on-call rotation and incident response as needed.

Required Skills & Qualifications
- Hands-on experience with Azure cloud services (IaaS, PaaS, networking, security).
- Strong proficiency with Terraform and/or Bicep for infrastructure automation.
- Experience with Azure DevOps, GitHub Actions, or equivalent CI/CD platforms.
Posted 2 weeks ago
2.0 - 6.0 years
3 - 6 Lacs
Chennai
Remote
Experience: 1.5 to 5 years
Location: Remote
Employment Type: Full-Time

Job Summary:
We are looking for talented Full Stack Python Developers (Junior and Senior levels) who are passionate about building scalable web applications. You will work closely with cross-functional teams to design, develop, and deliver robust enterprise solutions using modern technologies such as Python, ReactJS, AWS, and more.

Responsibilities:
Design, develop, test, deploy, and maintain scalable enterprise web applications.
Build responsive front-end applications using ReactJS.
Develop robust backend services and RESTful APIs using Python (Django / Flask / FastAPI).
Work on microservices architecture and cloud-based platforms such as AWS.
Utilize Docker and Terraform for DevOps activities and infrastructure management.
Participate in code reviews and Agile Scrum practices.
(Senior Role) Architect solutions and ensure adherence to coding standards.
(Senior Role) Mentor junior developers and contribute to technical leadership.

Requirements:

For Junior Developer (2 to 3 years):
2 to 3 years of experience with Python (Django / Flask / FastAPI).
Experience contributing to microservices architecture.
Proficient in ReactJS, JavaScript/jQuery, CSS, HTML5.
Familiarity with Postgres, DynamoDB, SQL queries.
Exposure to AWS, Docker, Terraform is a plus.
Strong problem-solving and collaboration skills.
Eagerness to learn and work in a fast-paced environment.

For Senior Developer (3 to 5 years):
3 to 5 years of hands-on experience with Python (Django / Flask / FastAPI).
Proven experience building and architecting microservices.
Proficient in ReactJS, JavaScript/jQuery, CSS, HTML5.
Strong experience with Postgres, DynamoDB, SQL queries.
Hands-on experience with Terraform, Docker, AWS services.
Familiarity with AWS S3, ElasticSearch is a plus.
Strong problem-solving, leadership, and communication skills.
Ability to mentor junior team members and drive best practices.
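Because the role involves building RESTful backend services with frameworks such as FastAPI, here is a minimal hedged FastAPI sketch of a typed endpoint pair; the Item model and in-memory store are made-up examples standing in for the Postgres/DynamoDB persistence the posting mentions.

```python
# Hedged sketch: a minimal FastAPI microservice with typed REST endpoints.
# The Item model and in-memory store are illustrative placeholders.
from fastapi import FastAPI, HTTPException
from pydantic import BaseModel

app = FastAPI(title="example-service")


class Item(BaseModel):
    id: int
    name: str
    price: float


_ITEMS: dict[int, Item] = {}  # stand-in for a real database layer


@app.post("/items", response_model=Item)
def create_item(item: Item) -> Item:
    _ITEMS[item.id] = item
    return item


@app.get("/items/{item_id}", response_model=Item)
def get_item(item_id: int) -> Item:
    if item_id not in _ITEMS:
        raise HTTPException(status_code=404, detail="Item not found")
    return _ITEMS[item_id]

# Local run (assuming uvicorn is installed): uvicorn main:app --reload
```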
Posted 2 weeks ago
3.0 - 5.0 years
10 - 14 Lacs
Gurugram, Bengaluru, Mumbai (All Areas)
Hybrid
Role & responsibilities:
Design, develop, and maintain ETL workflows using Ab Initio.
Manage and support critical data pipelines and data sets across complex, high-volume environments.
Perform data analysis and troubleshoot issues across Teradata and Oracle data sources.
Collaborate with DevOps for CI/CD pipeline integration using Jenkins, and manage deployments in Unix/Linux environments.
Participate in Agile ceremonies including stand-ups, sprint planning, and roadmap discussions.
Support cloud migration efforts, including potential adoption of Azure, Databricks, and PySpark-based solutions.
Contribute to project documentation, metadata management (LDM, PDM), onboarding guides, and SOPs.

Preferred candidate profile:
3 years of experience in data engineering, with proven expertise in ETL development and maintenance.
Proficiency with Ab Initio tools (GDE, EME, Control Center).
Strong SQL skills, particularly with Oracle or Teradata.
Solid experience with Unix/Linux systems and scripting.
Familiarity with CI/CD pipelines using Jenkins or similar tools.
Strong communication skills and ability to collaborate with cross-functional teams.
Posted 2 weeks ago
1.0 - 3.0 years
1 - 6 Lacs
Jaipur
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose – the relentless pursuit of a world that works better for people – we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Senior Associate - Angular Support

In this role, you will be primarily responsible for supporting and enhancing web applications using the Angular framework. You will work closely with our team to deliver highly performant, scalable, and visually appealing web applications and a consistent user experience across devices.

Responsibilities
Front-end Development: Develop dynamic and responsive web applications using Angular.
UI Implementation: Translate UI/UX designs into high-quality code, ensuring consistent style across the platform.
Component Development: Create reusable components and modules, focusing on maintainability and performance.
API Integration: Work closely with backend developers to consume RESTful APIs and integrate server-side logic into the front-end interface.
Performance Optimization: Optimize applications for speed, scalability, and responsiveness.
Testing and Debugging: Write unit and integration tests using tools like Jasmine or Karma; debug and troubleshoot application issues.
Collaboration: Work in an Agile environment, collaborating with designers, developers, and project managers to deliver products on time.
Shift requirement: Flexible to work in morning/afternoon/evening/night shifts on a rotation basis, as the candidate may have to work with global teams in different time zones.

Qualifications we seek in you!

Minimum Qualifications / Skills
B.E./B.Tech/MCA or equivalent

Preferred Qualifications / Skills
Experience in front-end development with Angular.
Angular Material: Experience with Angular Material or other UI component libraries.
Agile Development: Familiarity with Agile methodologies (Scrum/Kanban).
CI/CD: Knowledge of continuous integration/continuous delivery (CI/CD) pipelines.
Cross-browser Compatibility: Experience ensuring cross-browser compatibility and web standards compliance.
Angular Expertise: Proficiency in Angular, including modules, services, directives, and pipes.
TypeScript: Strong knowledge of TypeScript, including experience with modern JavaScript (ES6+).
HTML/CSS: Solid understanding of HTML5, CSS3, and pre-processors like SASS or LESS.
Responsive Design: Ability to create mobile-first designs that work well on multiple devices and screen sizes.
Version Control: Experience with Git and version control best practices.
RESTful Services: Understanding of how to interact with APIs and handle asynchronous calls.
Reactive Programming: Familiarity with RxJS and reactive programming concepts.
Testing: Experience with Angular testing frameworks such as Jasmine, Karma, or Protractor.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability or any other characteristic protected by applicable laws.
Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please do note that Genpact does not charge fees to process job applications and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
Posted 2 weeks ago
5.0 - 9.0 years
17 - 25 Lacs
Pune, Gurugram, Bengaluru
Hybrid
Desired Skills and Experience
Candidates should have a B.E./B.Tech/MCA/MBA in Finance, Information Systems, Computer Science, or a related field
Strong experience in R programming and package development
Proficiency with GitHub and unit testing frameworks
Strong documentation and communication skills
A background or work experience in biostatistics or a similar discipline (preferred)
Expert knowledge in Survival Analysis (preferred)
Statistical model deployment and end-to-end MLOps is nice to have
Extensive work on cloud infrastructure, preferably Databricks and Azure
Shiny development is nice to have
Can work with customer stakeholders to understand business processes and workflows, and can design solutions to optimize processes via streamlining and automation
DevOps experience and familiarity with the software release process
Familiarity with agile delivery methods
Excellent communication skills, both written and verbal
Extremely strong organizational and analytical skills with strong attention to detail
Strong track record of excellent results delivered to internal and external clients
Able to work independently without the need for close supervision, and also collaboratively as part of cross-team efforts
Experience with delivering projects within an agile environment
Posted 2 weeks ago
5.0 - 8.0 years
0 - 1 Lacs
Thane
Work from Office
As a Senior Python Developer, you will be responsible for designing, developing, and maintaining efficient and reliable Python applications. You will work collaboratively with cross-functional teams to deliver high-quality software solutions, ensure code quality, and provide technical guidance to other team members.

Responsibilities
Design and develop robust, scalable Python applications
Collaborate with cross-functional teams to define and implement software solutions
Mentor and guide junior developers to ensure adherence to coding standards
Participate in code reviews to maintain a high-quality codebase
Identify and resolve performance and scalability issues
Contribute to continuous improvement of development processes and practices

Qualifications
Bachelor's degree in Computer Science, Engineering, or related field
5 to 8 years of professional Python development experience
Proven track record of delivering high-quality software solutions
Strong understanding of software architecture and design principles
Experience with version control systems such as Git
Excellent problem-solving and analytical skills
Strong communication and teamwork abilities
Knowledge of AWS will be an added advantage

Skills
Python, Django, Flask, RESTful APIs, SQL and NoSQL databases, Git, Docker, AWS, Test-driven development (TDD), CI/CD pipelines
Posted 2 weeks ago
5.0 - 10.0 years
20 - 35 Lacs
Bengaluru
Remote
Job Title: Senior Machine Learning Engineer
Work Mode: Remote
Base Location: Bengaluru
Experience: 5+ Years

Strong problem-solving skills and ability to work in a fast-paced, collaborative environment.
Strong programming skills in Python and experience with ML frameworks.
Proficiency in containerization (Docker) and orchestration (Kubernetes) technologies.
Solid understanding of CI/CD principles and tools (e.g., Jenkins, GitLab CI, GitHub Actions).
Knowledge of data engineering concepts and experience building data pipelines.
Strong understanding of computational, storage, and orchestration resources on cloud platforms.
Deploying and managing ML models, especially on GCP services such as Cloud Run, Cloud Functions, and Vertex AI (cloud-platform agnostic otherwise).
Implementing MLOps best practices, including model version tracking, governance, and monitoring for performance degradation and drift.
Creating and using benchmarks, metrics, and monitoring to measure and improve services.
Collaborating with data scientists and engineers to integrate ML workflows from onboarding to decommissioning.
Experience with MLOps tools like Kubeflow, MLflow, and Data Version Control (DVC).
Manage ML models on any of the following: AWS (SageMaker), Azure (Machine Learning), and GCP (Vertex AI).

Tech Stack: AWS, GCP, or Azure experience (GCP preferred); PySpark is a must and Databricks is good to have; ML experience, Docker, and Kubernetes.
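The responsibilities include monitoring deployed models for performance degradation and drift. A small hedged sketch follows, using a two-sample Kolmogorov-Smirnov test to flag input-feature drift; the synthetic data, feature, and significance threshold are assumptions chosen only to illustrate the pattern, not a prescribed monitoring setup.

```python
# Hedged sketch: simple input-drift check comparing a training reference window
# against live data with a two-sample KS test. Data and threshold are illustrative.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
train_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)  # reference distribution
live_feature = rng.normal(loc=0.3, scale=1.0, size=5_000)   # slightly shifted live data

stat, p_value = ks_2samp(train_feature, live_feature)

ALPHA = 0.01  # assumed significance threshold; tune per feature and use case
if p_value < ALPHA:
    print(f"Drift suspected (KS={stat:.3f}, p={p_value:.2e}): raise alert / consider retraining")
else:
    print(f"No significant drift detected (KS={stat:.3f}, p={p_value:.2e})")
```

In practice a check like this would run on a schedule against each monitored feature, with alerts wired into the same CI/CD and monitoring stack the posting describes.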
Posted 2 weeks ago
4.0 - 8.0 years
15 - 18 Lacs
Lucknow
Work from Office
Urgent Hiring for Data Engineers
Job Location: Lucknow (On-Site)
Experience: 4+ yrs (relevant)
Salary range: 15 LPA - 18 LPA
No. of open positions: 10
Immediate joiners only

Job Overview:
We are seeking experienced Data Engineers to join our team. As a Data Engineer, you will be responsible for designing, building, and maintaining large-scale data systems and pipelines. You will work closely with our data science team to prepare data for prescriptive and predictive modeling, ensuring high-quality data outputs.

Key Responsibilities:
- Analyze and organize raw data from various sources
- Build and maintain data systems and pipelines
- Prepare data for prescriptive and predictive modeling
- Combine raw information from different sources to generate valuable insights
- Enhance data quality and reliability

Requirements:
- 4+ years of experience as a Data Engineer or in a similar role
- Technical expertise in data models, data mining, and segmentation techniques
- Experience with cloud data technologies (Azure Data Factory, Databricks)
- Knowledge of CI/CD pipelines and Jenkins
- Strong programming skills in Python
- Hands-on experience with SQL databases
Posted 2 weeks ago
9.0 - 11.0 years
15 - 30 Lacs
Noida, Bengaluru
Work from Office
Job description
Location: Noida/Bangalore
Experience: 9+ years

Position Overview
We are seeking a highly skilled Lead AWS DevOps Engineer with an emphasis on AWS cloud technologies and working knowledge of Azure. This role requires deep expertise in architecting, deploying, and managing cloud infrastructure for a variety of applications, with a strong focus on front-end workloads. Proficiency with critical AWS services, including VPC, Lambda, Elastic Load Balancing, Route 53, CloudFront, EC2, RDS, and S3, is essential. You will lead the design of scalable, secure, and resilient environments, mentor junior engineers, and collaborate across teams.

Key Responsibilities
Architect, implement, and maintain AWS infrastructure, focusing on:
Elastic Load Balancing (ELB): Configure and manage Application Load Balancers (ALB), Network Load Balancers (NLB), and Classic Load Balancers to distribute traffic efficiently, maximize application availability, and prevent server overloads.
Amazon Route 53: Manage DNS routing, domain registration, and health checks; implement advanced routing policies (weighted, latency-based, failover, geolocation) to ensure high availability, low latency, and seamless integration with AWS resources.
CloudFront: Optimize content delivery and caching for front-end applications.
EC2: Provision, monitor, and scale compute resources.
RDS: Manage and optimize relational databases, i.e., RDS Aurora, MySQL, and PostgreSQL.
AWS Lambda: Expertise with AWS Lambda, including Layers and ECR integrations.
VPC: Design secure, scalable network topologies, including subnets, security groups, and peering.
Lead the support of existing applications and the migration, deployment, and scaling of front-end, back-end, and data applications in the cloud.
Develop and maintain infrastructure-as-code (e.g., Terraform, CloudFormation).
Automate CI/CD pipelines and streamline release processes.
Monitor, troubleshoot, and optimize system performance and costs.
Integrate and support Azure services as needed for hybrid or multi-cloud scenarios.
Mentor and support DevOps team members, fostering a culture of continuous improvement.
Stay current with cloud trends and emerging technologies to drive innovation.

Required Skills and Experience
10+ years of hands-on DevOps experience, with at least 3 years in a lead or senior hands-on role.
Deep expertise in AWS, including:
Elastic Load Balancing (ALB, NLB, CLB): advanced traffic routing, health checks, and high-availability configurations.
Route 53: DNS management, custom routing policies, domain registration, health checks, and integration with other AWS services.
CloudFront, EC2, RDS, VPC, bastion hosts.
Hosting front-end applications using EC2 and containers.
Overseeing cost optimization for current applications and infrastructure.
Working knowledge of Azure cloud services and integration/migration strategies.
Strong proficiency with infrastructure-as-code tools (Terraform, AWS CloudFormation).
Experience with CI/CD tools (e.g., Jenkins, GitHub Actions, AWS CodePipeline).
Proficient in scripting languages (Python, Bash, or similar).
Solid understanding of networking, security best practices, and cloud cost optimization.
Experience supporting front-end application deployments and troubleshooting performance issues.
Excellent communication, leadership, and team collaboration skills.

Preferred Qualifications
AWS and/or Azure certifications (e.g., AWS Solutions Architect Professional, DevOps Engineer; any Azure certification is good to have) [Must have].
Experience with containerization (Docker, ECS, EKS, AKS) [Must have].
GitHub and SonarQube integration experience is required.
Familiarity with monitoring and logging tools (CloudWatch, Datadog, Prometheus).
Prior experience in a multi-cloud or hybrid environment [Must have].

Education
Bachelor's degree in computer science, Engineering, or a related field (or equivalent experience).

This role is ideal for a proactive DevOps leader who thrives in dynamic environments and is passionate about leveraging AWS and Azure technologies to deliver robust, scalable, and cost-effective solutions for business applications.
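Because the role leans heavily on Lambda, API Gateway, and S3, here is a minimal hedged Python Lambda handler sketch; the bucket name, environment variable, and payload fields are assumptions for illustration, and the event shape follows the common API Gateway proxy integration format rather than any specific application.

```python
# Hedged sketch: an AWS Lambda handler behind API Gateway that writes to S3.
# Bucket name, env var, and payload fields are illustrative assumptions.
import json
import os

import boto3

s3 = boto3.client("s3")  # created once per container and reused across invocations
BUCKET = os.environ.get("DATA_BUCKET", "example-bucket")  # assumed env var


def lambda_handler(event, context):
    # API Gateway proxy integration delivers the request body as a JSON string.
    body = json.loads(event.get("body") or "{}")
    key = f"submissions/{body.get('id', 'unknown')}.json"

    # Persist the payload to S3 for downstream processing.
    s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(body).encode("utf-8"))

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"stored": key}),
    }
```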
Posted 3 weeks ago
10.0 - 15.0 years
15 - 30 Lacs
Pune
Hybrid
We are looking for a highly skilled Senior Software Engineer with 10+ years of hands-on experience in Java/J2EE, Microservices, Cloud technologies, and DevOps practices. The ideal candidate has a proven track record in modernizing enterprise applications, migrating monolithic systems to microservices, and leading high-impact Proof of Concepts (POCs). You will work closely with cross-functional teams to design and build scalable, resilient, and high-performance applications while mentoring junior engineers and promoting software engineering best practices.

Key Responsibilities:
Design, develop, and maintain enterprise-level applications using Java 11/17, Spring Boot 3.x, and RESTful APIs
Architect and implement microservices-based systems and lead migration from monolithic architectures
Integrate with Kafka for event-driven architecture and messaging
Implement security protocols using Keycloak for authentication and authorization
Deploy and manage applications on Azure Cloud and Red Hat OpenShift
Containerize applications with Docker and orchestrate using Kubernetes
Optimize application performance and ensure scalability and high availability
Collaborate with QA teams for unit, integration, and performance testing using JUnit, Mockito, Cucumber, JMeter, etc.
Participate in CI/CD pipeline setup and enhancement using GitLab CI/CD, Jenkins, UrbanCode, and Bitbucket
Provide technical leadership and mentor team members in best practices and coding standards

Key Skills & Technologies:
Programming & Frameworks: Java 11/17, JavaScript, TypeScript; Spring Boot 3.x, Spring MVC, JPA, Hibernate; Node.js, Angular 7+
Cloud & DevOps: Azure Cloud, Red Hat OpenShift; Docker, Kubernetes; GitLab CI/CD, Jenkins, UrbanCode, Bitbucket
Messaging & Event Streaming: Apache Kafka
Security: Keycloak, WebSocket-based secure communication
Databases: Oracle 9i/11/12g, PostgreSQL, MS SQL Server; MongoDB (NoSQL)
Testing Tools: JUnit 5, Mockito, Cucumber, JMeter, Postman, SoapUI, Fiddler
IDEs & Tools: IntelliJ, Spring Tool Suite (STS), Git, Maven, Gradle
Servers & OS: WebLogic, Apache Tomcat; Windows, Linux

Preferred Qualifications:
Strong understanding of Domain-Driven Design (DDD).
Experience in performance tuning and enterprise-level application scaling.
Proven track record of leading development teams and mentoring engineers.
Ability to work in Agile environments and contribute to continuous improvement.

Why Join Us
Work with cutting-edge technologies and cloud-native solutions.
Opportunity to make a significant impact on high-visibility projects.
Collaborative team culture and professional growth opportunities.
Flexible work arrangements.
Posted 3 weeks ago
5.0 - 10.0 years
6 - 16 Lacs
Hyderabad, Chennai, Bengaluru
Hybrid
Job Summary
We are seeking a highly skilled and experienced DevOps Engineer with a specialization in Azure to join our team. The ideal candidate will be responsible for designing, implementing, and managing DevOps processes that leverage Azure's cloud platform. This role requires a deep understanding of DevOps practices, CI/CD pipelines, and cloud infrastructure, along with hands-on experience in Azure.

Key Responsibilities
Design and develop scalable DevOps processes using Azure services
Collaborate with cross-functional teams to understand infrastructure requirements and translate them into effective DevOps workflows
Implement CI/CD pipelines to ensure code is accurately and efficiently deployed into Azure environments
Optimize and tune Azure environments for performance and cost efficiency
Ensure infrastructure security and compliance with industry standards and regulations
Provide technical guidance and mentorship to junior DevOps engineers and analysts
Stay up to date with the latest trends and best practices in DevOps and Azure

Qualifications
Proven experience as a DevOps Engineer with a focus on Azure
Strong knowledge of DevOps practices, CI/CD pipelines, and cloud infrastructure
Hands-on experience with Azure services such as Azure DevOps, Azure Kubernetes Service (AKS), Azure Functions, and Azure App Services
Proficiency in scripting languages such as PowerShell, Bash, or Python
Excellent problem-solving skills and attention to detail
Strong communication and collaboration skills

Good to have
Experience with other DevOps tools and platforms such as Jenkins, GitLab, or Terraform
Certification in Azure or related technologies

Mandatory Skills: Ansible, ARM, Azure AKS, Azure App Service, Azure DevOps, Docker, Git, Kubernetes, PowerShell, Python, Terraform
Posted 3 weeks ago
9.0 - 14.0 years
18 - 33 Lacs
Bengaluru
Work from Office
OpenShift Administrator (9+ Years)
Location: Bangalore
Company: HCLTech
Experience: 9 to 13 Years
Employment Type: Full-Time | Permanent

About the Role:
HCLTech is looking for a skilled OpenShift Administrator to support enterprise container platform environments. This role requires deep expertise in OpenShift cluster management, automation, and DevOps toolchains.

Key Responsibilities:
Install, configure, and administer Red Hat OpenShift (v4.x) clusters.
Handle day-to-day operations, patching, upgrades, and incident resolution.
Automate cluster operations using Ansible, Terraform, and Helm.
Collaborate with developers and DevOps teams for containerization enablement.
Implement security policies, network configurations, and monitoring solutions.

Required Skills:
9+ years of IT experience with a minimum of 3 years in OpenShift/Kubernetes administration.
Strong understanding of container orchestration and CI/CD pipelines.
Experience with RHEL/Linux administration, DNS, firewalls, and load balancers.
Familiarity with Git, Jenkins, Prometheus, Grafana, and logging tools.

Preferred Certification:
Red Hat Certified Specialist in OpenShift Administration (EX280)

How to Apply:
Send your resume to charu.g@hcltech.com with the subject line: "OpenShift Admin – Bangalore – 9+ Years"
Posted 3 weeks ago