
14 NoSQL Database Jobs

JobPe aggregates results for easy access, but you apply directly on the originating job portal.

5.0 - 7.0 years

7 - 9 Lacs

Bengaluru

Work from Office


Shift: (GMT+05:30) Asia/Kolkata (IST)

What do you need for this opportunity?

Must-have skills: Java, Spring, System Design, Microservices

Infrrd is looking for someone who has:
- 5-8 years of experience developing backends for web-based applications using Java.
- Experience mentoring an offshore team of 4+ members, on projects requiring onsite-offshore coordination with an agile SCRUM delivery model.
- Excellent understanding of Core Java and the Spring framework (Spring, Spring MVC, Spring Data).
- Ability to work hands-on during development and coding, with strong debugging and problem-solving skills.
- Solid command of data structures.
- Experience with any NoSQL database.
- Experience with system design and microservices architecture.
- Working experience with tools such as Jira, any Java IDE, and GitHub.
- Good understanding of Kubernetes and AWS DevOps.

What you will do:
- Provide strong technical anchorship and act as the primary gatekeeper of the Java team.
- Design and architect scalable solutions with high-level and low-level design specifications.
- Conduct code reviews, ensure code quality, implement code automation, and mentor team members on best practices and design principles.
- Ensure deliverables are of the highest quality, functionally and technically, through the scrum process.
- Ensure in-sprint defects are closed with the highest quality, and handle any production defects for a specific duration post-release.
- Work with the PM, architect, and teams to manage the team and the deliverables from technical and functional aspects.
- Contribute individually, and provide team mentoring and guidance where appropriate.

Posted 10 hours ago

Apply

6.0 - 8.0 years

25 - 27 Lacs

Hyderabad, Chennai

Work from Office


- Proficient in Nest.js
- Strong experience with Angular
- Experience with RESTful API development and integration
- MongoDB or any NoSQL database knowledge
- Familiarity with authentication and authorization (JWT, OAuth)
- Good understanding of server-side templating and error handling
- DevOps basics (CI/CD, Docker, etc.) a plus
- Strong debugging and troubleshooting skills
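The listing above asks for familiarity with JWT-based authentication. As a hedged illustration of what a JWT actually is, here is a minimal HS256 sign/verify sketch using only the Python standard library; a real service would use a vetted library (e.g. `@nestjs/jwt` or PyJWT), and the claims and secret below are invented:

```python
import base64, hashlib, hmac, json

def b64url(data: bytes) -> str:
    # JWTs use unpadded base64url encoding
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign_jwt(payload: dict, secret: str) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    signing_input = f"{header}.{body}".encode()
    sig = hmac.new(secret.encode(), signing_input, hashlib.sha256).digest()
    return f"{header}.{body}.{b64url(sig)}"

def verify_jwt(token: str, secret: str):
    header, body, sig = token.split(".")
    signing_input = f"{header}.{body}".encode()
    expected = b64url(hmac.new(secret.encode(), signing_input, hashlib.sha256).digest())
    if not hmac.compare_digest(sig, expected):
        return None  # signature mismatch: reject the token
    padded = body + "=" * (-len(body) % 4)  # restore base64 padding
    return json.loads(base64.urlsafe_b64decode(padded))

token = sign_jwt({"sub": "user-42", "role": "admin"}, "demo-secret")
claims = verify_jwt(token, "demo-secret")
```

The key point the sketch shows is that a JWT is tamper-evident, not encrypted: anyone can decode the payload, but only the holder of the secret can produce a valid signature.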

Posted 4 days ago

Apply

6.0 - 8.0 years

25 - 27 Lacs

Gurugram, Bengaluru

Work from Office


- Proficient in Nest.js
- Strong experience with Angular
- Experience with RESTful API development and integration
- MongoDB or any NoSQL database knowledge
- Familiarity with authentication and authorization (JWT, OAuth)
- Good understanding of server-side templating and error handling
- DevOps basics (CI/CD, Docker, etc.) a plus
- Strong debugging and troubleshooting skills

Posted 4 days ago

Apply

6.0 - 11.0 years

10 - 20 Lacs

Bengaluru

Work from Office


Position Overview:
We are seeking an experienced and skilled Senior Database Developer to join our dynamic team. The ideal candidate will have at least 8 years of hands-on experience in database development, with a strong focus on Neo4j (graph) databases. The role involves working on cutting-edge projects, contributing to data modelling, and ensuring the scalability and efficiency of our database systems.

Responsibilities:
- Design, develop, and maintain databases, with a primary focus on Cypher/graph databases.
- Modify databases according to requests and perform tests.
- Perform advanced query and performance tuning of databases and optimization of database systems.
- Solve database usage issues and malfunctions.
- Analyze all databases, monitor them against design specifications, and prepare associated test strategies.
- Evaluate and engineer efficient backup-recovery processes for various databases.
- Promote uniformity of database-related programming effort by developing methods and procedures for database programming.
- Remain current with the industry by researching available products and methodologies to determine the feasibility of alternative database management systems, communication protocols, middleware, and query tools.
- Liaise with developers to improve applications and establish best practices.
- Ensure the performance, security, and scalability of database systems.
- Develop and optimize PL/SQL queries for efficient data storage and retrieval.
- Implement and maintain data models, ensuring accuracy and alignment with business needs.
- Train, mentor, and motivate junior team members.
- Contribute to the team's performance evaluation.
- Stay updated on emerging database technologies and contribute to continuous improvement initiatives.

Skills Required:
- 6+ years of work experience as a database developer.
- Bachelor's or master's degree in computer science, engineering, or a related field.
- Proficiency in Neo4j (graph) databases is mandatory.
- Strong experience with PL/SQL, data modeling, and database optimization techniques.

Why us?
- Impactful Work: Your contributions will play a pivotal role in ensuring the quality and reliability of our platform.
- Professional Growth: We invest in our employees' growth and development. You will have access to learning resources, books, training programs, and opportunities to enhance your technical skills and expand your knowledge.
- Collaborative Culture: We value teamwork and collaboration. You will work alongside talented professionals from diverse backgrounds, including developers, product managers, and business analysts, to collectively solve challenges and deliver exceptional software.

Benefits:
- Health insurance covered for you and your family.
- Quarterly team outing, team lunch twice a month, and personal and professional learning and development sessions.
- Top performers win a chance at a fully company-sponsored international trip.
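The role above centers on graph databases. The kind of query a graph database like Neo4j answers declaratively in Cypher (e.g. a variable-length pattern such as `MATCH (a:Person)-[:KNOWS*1..2]->(b)`) can be sketched imperatively as a bounded breadth-first traversal; the toy graph and names below are invented, and this is plain Python, not the Neo4j API:

```python
from collections import deque

# Toy social graph as an adjacency list; Neo4j would model these
# as Person nodes connected by :KNOWS relationships.
KNOWS = {
    "alice": ["bob", "carol"],
    "bob": ["dave"],
    "carol": ["dave", "erin"],
    "dave": [],
    "erin": ["frank"],
    "frank": [],
}

def reachable_within(graph, start, max_hops):
    """BFS: every node reachable from `start` in 1..max_hops hops."""
    seen, frontier, result = {start}, deque([(start, 0)]), set()
    while frontier:
        node, depth = frontier.popleft()
        if depth == max_hops:
            continue  # hop budget exhausted on this path
        for neighbor in graph.get(node, []):
            if neighbor not in seen:
                seen.add(neighbor)
                result.add(neighbor)
                frontier.append((neighbor, depth + 1))
    return result

friends_of_friends = reachable_within(KNOWS, "alice", 2)
```

Graph databases exist precisely so that multi-hop traversals like this stay fast and declarative instead of turning into chains of relational joins.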

Posted 5 days ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Hyderabad

Work from Office


What we expect of you

Role Description:
We are looking for an Associate Data Engineer with deep expertise in writing data pipelines to build scalable, high-performance data solutions. The ideal candidate will be responsible for developing, optimizing, and maintaining complex data pipelines, integration frameworks, and metadata-driven architectures that enable seamless access and analytics. This role requires a deep understanding of big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Own development of complex ETL/ELT data pipelines that process large-scale datasets.
- Contribute to the design, development, and implementation of data pipelines, ETL/ELT processes, and data integration solutions.
- Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring.
- Explore and implement new tools and technologies to enhance the ETL platform and pipeline performance.
- Proactively identify and implement opportunities to automate tasks and develop reusable frameworks.
- Understand the biotech/pharma domain and build highly efficient data pipelines to migrate and deploy complex data across systems.
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value.
- Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories.
- Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle.
- Collaborate and communicate effectively with product and cross-functional teams to understand business requirements and translate them into technical solutions.

What we expect of you

Must-Have Skills:
- Experience in data engineering with a focus on Databricks, AWS, Python, SQL, and Scaled Agile methodologies.
- Strong understanding of data processing and transformation with big data frameworks (Databricks, Apache Spark, Delta Lake, and distributed computing concepts).
- Strong, demonstrable understanding of AWS services.
- Ability to quickly learn, adapt, and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery, and DevOps practices.

Good-to-Have Skills:
- Data engineering experience in the biotechnology or pharma industry.
- Exposure to APIs and full-stack development.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications:
- Bachelor's degree and 2 to 5+ years of Computer Science, IT, or related field experience, OR Master's degree and 1 to 4+ years of Computer Science, IT, or related field experience.
- AWS Certified Data Engineer preferred.
- Databricks certificate preferred.
- Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly; organized and detail-oriented.
- Strong presentation and public speaking skills.
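The listing above stresses "rigorous quality checks" in ETL pipelines. As a hedged sketch of what such a quality gate does, here is a pure-Python version; in a real Databricks pipeline this logic would run as Spark transformations or a framework like Great Expectations, and the field names, rules, and rows below are made up:

```python
# Validation rules per field; each returns True for an acceptable value.
RULES = {
    "id": lambda v: isinstance(v, int) and v > 0,
    "email": lambda v: isinstance(v, str) and "@" in v,
    "amount": lambda v: isinstance(v, (int, float)) and v >= 0,
}

def quality_gate(rows):
    """Split rows into (clean, rejected); drop duplicate ids, keeping the first."""
    clean, rejected, seen_ids = [], [], set()
    for row in rows:
        failures = [field for field, ok in RULES.items()
                    if not ok(row.get(field))]
        if failures:
            rejected.append({"row": row, "failed": failures})
        elif row["id"] in seen_ids:
            rejected.append({"row": row, "failed": ["duplicate id"]})
        else:
            seen_ids.add(row["id"])
            clean.append(row)
    return clean, rejected

rows = [
    {"id": 1, "email": "a@x.com", "amount": 10.0},
    {"id": 1, "email": "a@x.com", "amount": 10.0},    # duplicate id
    {"id": 2, "email": "not-an-email", "amount": 5},  # fails email rule
]
clean, rejected = quality_gate(rows)
```

Routing rejects to a side output with the reason attached, rather than silently dropping them, is what makes the "monitoring" half of the requirement possible.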

Posted 1 week ago

Apply

6.0 - 8.0 years

8 - 10 Lacs

Hyderabad

Work from Office


Role Description:
We are looking for a highly motivated, expert Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Design, develop, and maintain complex ETL/ELT data pipelines in Databricks using PySpark, Scala, and SQL to process large-scale datasets.
- Understand the biotech/pharma or related domains and build highly efficient data pipelines to migrate and deploy complex data across systems.
- Design and implement solutions that enable unified data access, governance, and interoperability across hybrid cloud environments.
- Ingest and transform structured and unstructured data from databases (PostgreSQL, MySQL, SQL Server, MongoDB, etc.), APIs, logs, event streams, images, PDFs, and third-party platforms.
- Ensure data integrity, accuracy, and consistency through rigorous quality checks and monitoring.
- Apply expertise in data quality, data validation, and verification frameworks.
- Innovate, explore, and implement new tools and technologies to enhance efficient data processing.
- Proactively identify and implement opportunities to automate tasks and develop reusable frameworks.
- Work in an Agile and Scaled Agile (SAFe) environment, collaborating with cross-functional teams, product owners, and Scrum Masters to deliver incremental value.
- Use JIRA, Confluence, and Agile DevOps tools to manage sprints, backlogs, and user stories.
- Support continuous improvement, test automation, and DevOps practices in the data engineering lifecycle.
- Collaborate and communicate effectively with product and cross-functional teams to understand business requirements and translate them into technical solutions.

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning for big data processing.
- Strong understanding of AWS services.
- Ability to quickly learn, adapt, and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Good-to-Have Skills:
- Data engineering experience in the biotechnology or pharma industry.
- Experience writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications:
- Any degree and 6-8 years of experience.
- AWS Certified Data Engineer preferred.
- Databricks certificate preferred.
- Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly; organized and detail-oriented.
- Strong presentation and public speaking skills.

Posted 1 week ago

Apply

9.0 - 12.0 years

9 - 12 Lacs

Hyderabad / Secunderabad, Telangana, India

On-site


Role Description:
We are looking for a highly motivated, expert Senior Data Engineer who can own the design and development of complex data pipelines, solutions, and frameworks. The ideal candidate will be responsible for designing, developing, and optimizing data pipelines, data integration frameworks, and metadata-driven architectures that enable seamless data access and analytics. This role requires deep expertise in big data processing, distributed computing, data modeling, and governance frameworks to support self-service analytics, AI-driven insights, and enterprise-wide data management.

Roles & Responsibilities:
- Design, develop, and maintain scalable ETL/ELT pipelines to support structured, semi-structured, and unstructured data processing across the Enterprise Data Fabric.
- Implement real-time and batch data processing solutions, integrating data from multiple sources into a unified, governed data fabric architecture.
- Optimize big data processing frameworks using Apache Spark, Hadoop, or similar distributed computing technologies to ensure high availability and cost efficiency.
- Work with metadata management and data lineage tracking tools to enable enterprise-wide data discovery and governance.
- Ensure data security, compliance, and role-based access control (RBAC) across data environments.
- Optimize query performance, indexing strategies, partitioning, and caching for large-scale data sets.
- Develop CI/CD pipelines for automated data pipeline deployments, version control, and monitoring.
- Implement data virtualization techniques to provide seamless access to data across multiple storage systems.
- Collaborate with cross-functional teams, including data architects, business analysts, and DevOps teams, to align data engineering strategies with enterprise goals.
- Stay up to date with emerging data technologies and best practices, ensuring continuous improvement of Enterprise Data Fabric architectures.

Must-Have Skills:
- Hands-on experience with data engineering technologies such as Databricks, PySpark, SparkSQL, Apache Spark, AWS, Python, SQL, and Scaled Agile methodologies.
- Proficiency in workflow orchestration and performance tuning for big data processing.
- Strong understanding of AWS services.
- Experience with Data Fabric, Data Mesh, or similar enterprise-wide data architectures.
- Ability to quickly learn, adapt, and apply new technologies.
- Strong problem-solving and analytical skills.
- Excellent communication and teamwork skills.
- Experience with the Scaled Agile Framework (SAFe), Agile delivery practices, and DevOps practices.

Good-to-Have Skills:
- Deep expertise in the biotech and pharma industries.
- Experience writing APIs to make data available to consumers.
- Experience with SQL/NoSQL databases and vector databases for large language models.
- Experience with data modeling and performance tuning for both OLAP and OLTP databases.
- Experience with software engineering best practices, including but not limited to version control (Git, Subversion, etc.), CI/CD (Jenkins, Maven, etc.), automated unit testing, and DevOps.

Education and Professional Certifications:
- 9 to 12 years of Computer Science, IT, or related field experience.
- AWS Certified Data Engineer preferred.
- Databricks certificate preferred.
- Scaled Agile SAFe certification preferred.

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Ability to learn quickly; organized and detail-oriented.
- Strong presentation and public speaking skills.

Posted 1 week ago

Apply

1.0 - 3.0 years

3 - 5 Lacs

Hyderabad

Work from Office


What you will do
You will play a key role on the Operations Generative AI (GenAI) Product team, delivering cutting-edge, innovative GenAI solutions across various Process Development functions (Drug Substance, Drug Product, Attribute Sciences & Combination Products) in Operations.

Role Description:
As a Full Stack Sr Associate Software Engineer, you will contribute to the development and maintenance of our GenAI web applications across various Process Development functions (Drug Substance, Drug Product, Attribute Sciences & Combination Products) in Operations, working on both front-end and back-end technologies. This role is ideal for recent graduates or early-career professionals looking to gain hands-on experience in software development.

Roles & Responsibilities:
- Develop and maintain front-end applications using modern web frameworks (React, Angular, FastAPI).
- Build and maintain back-end services using languages like Python, Java, or Node.js.
- Collaborate with the design and product teams to understand user needs and translate them into technical requirements.
- Write clean, efficient, and well-tested code.
- Participate in code reviews and provide constructive feedback.
- Maintain system uptime and optimal performance.
- Learn and adapt to new technologies and industry trends like prompt engineering, AI tools, and retrieval-augmented generation (RAG) frameworks.

What we expect of you
We are all different, yet we all use our unique contributions to serve patients. The [vital attribute] professional we seek is a [type of person] with these qualifications.

Basic Qualifications:
- Master's degree and 1 to 3 years of experience in full-stack software engineering or web development, OR Bachelor's degree and 3 to 5 years of experience in full-stack software engineering or web development, OR Diploma and 7 to 9 years of experience in full-stack software engineering or web development.
- Cloud computing certificate preferred.

Functional Skills:
- Experience with API integration, serverless, and microservices architecture.
- Experience with AWS, SQL/NoSQL databases, and vector databases for large language models.
- Experience with popular large language models such as OpenAI's.
- Experience with language-model frameworks like LangChain or LlamaIndex.
- Experience with prompt engineering and model fine-tuning.
- Experience with DevOps CI/CD build and deployment pipelines.
- Experience with design patterns, data structures, and test-driven development.

Preferred Qualifications:
- Professional certifications: AWS, Data Science certifications (preferred).

Soft Skills:
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented, with a focus on achieving team goals.
- Strong presentation and public speaking skills.
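The role above mentions retrieval-augmented generation (RAG). As a hedged, stdlib-only sketch of RAG's retrieval step: score documents against a question and keep the best ones to place in the LLM prompt. A real system would use a vector database and embeddings; the corpus, scoring, and prompt template below are invented stand-ins:

```python
# Tiny invented corpus; a production system would store embeddings
# of real documents in a vector database.
CORPUS = [
    "Drug product stability studies run at 25C and 40C.",
    "Combination products pair a device with a drug substance.",
    "The cafeteria menu changes every Monday.",
]

def retrieve(question, corpus, k=2):
    """Rank documents by word overlap with the question
    (a crude stand-in for cosine similarity over embeddings)."""
    q_words = set(question.lower().split())
    scored = sorted(
        corpus,
        key=lambda doc: len(q_words & set(doc.lower().split())),
        reverse=True,
    )
    return scored[:k]

def build_prompt(question, corpus):
    # Ground the model by restricting it to the retrieved context.
    context = "\n".join(retrieve(question, corpus))
    return f"Answer using only this context:\n{context}\n\nQ: {question}"

prompt = build_prompt("What temperatures are used for drug stability studies?", CORPUS)
```

Only the retrieval half is sketched here; the generation half would pass `prompt` to an LLM API.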

Posted 1 week ago

Apply

8.0 - 12.0 years

18 - 20 Lacs

Pune, Chennai, Coimbatore

Work from Office


Work with Node.js and NoSQL systems to design and maintain scalable APIs and real-time data pipelines. Focus on API development, data integration, and cloud infrastructure, leveraging Apache Kafka and GCP to build robust, event-driven backend systems.

Required candidate profile:
- Experience building APIs using Node.js and TypeScript.
- API integration solutions (REST, GraphQL, webhooks).
- Manage applications using Docker, Kubernetes, and GCP.
- Strong Node.js, TypeScript, and JavaScript (ES2019+).
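The "event-driven backend" pattern this listing describes can be sketched as topic-based publish/subscribe over an append-only log, which is the core idea Apache Kafka provides. This is an in-memory Python illustration, not a Kafka client; real code would use a client library, and the topic and event names below are invented:

```python
from collections import defaultdict

class EventBus:
    """In-memory sketch of Kafka-style topics: producers append to a
    per-topic log, and subscribed handlers consume each event."""
    def __init__(self):
        self.handlers = defaultdict(list)
        self.log = defaultdict(list)  # append-only log per topic

    def subscribe(self, topic, handler):
        self.handlers[topic].append(handler)

    def publish(self, topic, event):
        self.log[topic].append(event)  # the durable log is Kafka's core idea
        for handler in self.handlers[topic]:
            handler(event)

bus = EventBus()
audit = []
bus.subscribe("orders.created", lambda e: audit.append(e["order_id"]))
bus.publish("orders.created", {"order_id": 1, "total": 99.0})
bus.publish("orders.created", {"order_id": 2, "total": 10.0})
```

The design point: producers and consumers never reference each other, only the topic, which is what makes event-driven services independently deployable.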

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Chennai

Hybrid


Duration: 8 months
Work Type: Onsite

Position Description:
Looking for qualified Data Scientists who can develop scalable solutions to complex real-world problems using machine learning, big data, statistics, and optimization. Potential candidates should have hands-on experience applying first-principles methods, machine learning, data mining, and text mining techniques to build analytics prototypes that work on massive datasets. Candidates should have experience manipulating both structured and unstructured data in various formats, sizes, and storage mechanisms; excellent problem-solving skills with an inquisitive mind to challenge existing practices; and exposure to multiple programming languages and analytical tools, with the flexibility to use the requisite tools and languages for the problem at hand.

Skills Required: Machine Learning, GenAI, LLM
Skills Preferred: Python, Google Cloud Platform, BigQuery

Experience Required: 3+ years of hands-on experience using machine learning/text mining tools and techniques such as clustering, classification, decision trees, random forests, support vector machines, deep learning, neural networks, reinforcement learning, and other numerical algorithms.

Experience Preferred: 3+ years of experience in at least one of the following languages: Python, R, MATLAB, SAS. Experience with Google Cloud Platform (GCP), including Vertex AI, BigQuery, DBT, NoSQL databases, and the Hadoop ecosystem.

Education Required: Bachelor's Degree
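Among the techniques the listing above names, clustering is the simplest to show end to end. Here is a hedged, pure-Python sketch of Lloyd's k-means algorithm on one-dimensional data; real work would use scikit-learn or Vertex AI, and the data points and initial centers are invented:

```python
def kmeans_1d(points, centers, iterations=10):
    """Lloyd's algorithm on scalars: assign points to the nearest
    center, then move each center to the mean of its cluster."""
    for _ in range(iterations):
        clusters = [[] for _ in centers]
        for p in points:
            # assignment step: nearest center by absolute distance
            nearest = min(range(len(centers)), key=lambda i: abs(p - centers[i]))
            clusters[nearest].append(p)
        # update step: recompute each center (keep it if its cluster is empty)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return sorted(centers)

data = [1.0, 1.2, 0.8, 9.8, 10.0, 10.2]
centers = kmeans_1d(data, centers=[0.0, 5.0])
```

The two clumps in the data pull the centers toward roughly 1.0 and 10.0; the same assign/update loop generalizes to vectors with a Euclidean distance.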

Posted 1 month ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Tamil Nadu

Work from Office


Duration: 12 months

Position Description:
Serve as a core member of the secure coding product team that enables the design, development, and creation of secure coding practices. Develop application software and RESTful services using GCP and the Spring Framework. Experience building distributed, service-oriented, cloud microservice-based architectures. Use test-driven development and code pairing/mobbing practices. Develop components across all tiers of the application stack. Continuously integrate and deploy developed software. Modify CI/CD pipelines and scripts as necessary to improve continuous integration practices. Consult with the product manager to identify the minimal viable product and decompose features by story slicing. Collaborate with other product teams on integrations, testing, and deployments.

Skills Required: React, JavaScript, Application Support, BigQuery, Application Testing, Application Design, Coding, Angular, Spring, Application Development, Developer, Java, Web Services

Experience Required:
- Experience in cloud services engineering, including Pivotal Cloud Foundry (GCP, JFrog, GitHub, Spring, Angular), RESTful services, and CI/CD pipelines (Tekton or similar).
- Experience with Swagger, logging/tracing, Conformance, Dynatrace, Spring Security, and SonarQube.
- Understanding of Spring Cloud Data, Spring Security, OAuth, and service monitoring in the cloud.
- Experience in application testing, release management, and support activities.
- Experience with various software development life cycle methods such as Agile.

Experience Preferred:
- 4+ years of development experience (purchasing/automotive industry experience a plus), preferably utilizing Java, Spring, Angular, React, Web Services, etc.
- 3 years of experience designing and building technical solutions using Java technologies such as Spring, Spring Boot, Web Services, and microservice architecture.
- Comprehensive understanding of relational databases (Microsoft SQL Server, PostgreSQL), NoSQL databases, and flat-file processing concepts.
- Strong knowledge of design patterns and principles; experience developing web services, REST APIs, and related architectures.
- Exposure to automated testing concepts, tools, and frameworks.
- Excellent communication skills: able to engage in deep technical discussions with customers and peers and become a trusted technical advisor.

Education Required: Bachelor's Degree

Posted 1 month ago

Apply

2.0 - 7.0 years

12 - 17 Lacs

Mumbai, Mumbai Suburban, Thane

Work from Office


Data Analysis, SQL, R, Python, Tableau, Power BI

Posted 1 month ago

Apply

8 - 12 years

22 - 25 Lacs

Navi Mumbai

Work from Office


Key Responsibilities:
- 10 years of proven experience as a software architect with a strong development background.
- Experience in software development and coding across technologies such as C#, .NET, Java/JEE, Spring, JSON, XML, REST, NoSQL databases (MongoDB, Cassandra), distributed systems, RDBMS, ePub3, MVC frameworks, design patterns, JavaScript, jQuery, GWT, and the Node.js ecosystem.
- Experience with RESTful API architecture, architecture patterns, and integration architecture.
- Develop high-level product specifications with attention to system integration and feasibility.
- Use tools and methodologies to create representations of the functions and user interface of the desired product.
- Develop high-level system design diagrams.
- Analyze user requirements and convert them into design documents.
- Communicate all concepts and guidelines successfully to the development team.
- Ensure software meets all requirements of quality, security, modifiability, extensibility, etc.
- After the initial design, oversee the progress of the development team to ensure consistency.
- Good experience in the SDLC and the end-to-end process from requirements gathering through design, development, testing, integration, and deployment.
- Design and update software databases, including but not limited to software applications, web sites, data communication processes, and user interfaces.
- Excellent knowledge of software and application design and architecture.
- Drive improvements to the development process.

Good to have:
- Excellent knowledge of UML and other modeling methods.
- A technical mindset with great attention to detail.
- Outstanding communication and presentation abilities.

Technical Knowledge:
A software architect needs to know the most popular programming languages and frameworks (or those used in your organization). They also have to understand the pros and cons of different software patterns to make the best decisions based on requirements, capabilities, and resources.
- Development of technical specifications and plans.
- Manage the system design and identify risks in a timely manner, applying knowledge, expertise, and experience to find the best solution.
- Strong analytical, problem-solving, and decision-making skills.
- Understand emerging web and mobile development models.
- Experience debugging distributed systems with high data loads.
- Experience with automated testing and CI/CD tools.
- Experience developing software utilizing workflow or ESB software.

Good to have:
- Experience in LAMP (at least one of Linux, Apache, MySQL, PHP/Python/Perl) and/or server-side Java programming with Web 1.0 MVC frameworks, and experience in JEE technologies like Servlets, JSPs, EJBs, and Web Services.
- Knowledge of cloud technologies: the ability to choose an appropriate tool and determine when to use it; cloud-native technologies and their applications.
- Working knowledge of APIs, microservices, cloud, design, architecture, and integration, with hands-on work in at least two.

Documentation and Reporting:
- Maintain comprehensive documentation of architectural designs, decisions, and changes.
- Provide regular updates and reports to senior management on architectural initiatives and progress.

Soft Skills:
- Communicate effectively and professionally in all forms of communication with internal and external customers.
- Possess strong problem-solving and decision-making skills while using good judgment.
- Good team player; mentors junior developers and advises and coaches them.
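The architect role above leans heavily on design patterns. As a hedged one-pattern sketch, here is Strategy in Python, which lets an architect swap an algorithm behind a stable interface without touching its callers; the pricing functions and names are invented examples:

```python
from typing import Callable

# The stable interface: any function mapping an amount to a price.
Pricing = Callable[[float], float]

def flat_discount(amount: float) -> float:
    """One concrete strategy: subtract a fixed amount."""
    return amount - 5.0

def percent_discount(amount: float) -> float:
    """Another strategy: take 10% off."""
    return amount * 0.9

class Checkout:
    """Depends on the Pricing interface, not a concrete algorithm,
    so strategies can be swapped per market or A/B test."""
    def __init__(self, pricing: Pricing):
        self.pricing = pricing

    def total(self, amount: float) -> float:
        return round(self.pricing(amount), 2)

flat_total = Checkout(flat_discount).total(100.0)
percent_total = Checkout(percent_discount).total(100.0)
```

The same inversion (callers depend on an interface, implementations are injected) underlies most of the patterns and the "modifiability, extensibility" requirement the listing cites.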

Posted 1 month ago

Apply

5 - 10 years

0 - 0 Lacs

Hyderabad

Work from Office


Job Description: DevOps Engineer

Qualifications:
- Bachelor's or Master's degree in Computer Science or Computer Engineering.
- 4 to 8 years of experience in DevOps.

Key Skills and Responsibilities:
- Passionate about continuous build, integration, testing, and delivery of systems.
- Strong understanding of distributed systems, APIs, microservices, and cloud computing.
- Experience implementing applications on private and public cloud infrastructure.
- Proficient in container technologies such as Kubernetes, including experience with public clouds like AWS and GCP through migrations, scaling, and day-to-day operations.
- Hands-on experience with AWS services (VPC, EC2, EKS, S3, IAM, etc.) and Elastic Beanstalk.
- Knowledge of source control management (Git, GitHub, GitLab).
- Hands-on experience with Kafka for data streaming and microservices communication.
- Experience managing Jenkins for CI/CD pipelines.
- Familiarity with logging tools and monitoring solutions.
- Experience working with network load balancers (Nginx, NetScaler).
- Proficient with Kong API gateways, Kubernetes, PostgreSQL, NoSQL databases, and Kafka.
- Experience with AWS S3 buckets, including policy management, storage, and backup using S3 and Glacier.
- Ability to respond to production incidents and take on-call responsibilities.
- Experience with multiple cloud providers and designing applications accordingly.
- Skilled in owning and operating mission-critical, large-scale product operations (provisioning, deployment, upgrades, patching, and incidents) on the cloud.
- Strong commitment to ensuring high availability and scalability of production systems.
- Continuously raise the standard of engineering excellence by implementing best DevOps practices.
- Quick learner with a balance between listening and taking charge.

Responsibilities:
- Develop and implement tools to automate and streamline operations.
- Develop and maintain CI/CD pipeline systems for application development teams using Jenkins.
- Prioritize production-related issues alongside operational team members.
- Conduct root cause analysis, resolve issues, and implement long-term fixes.
- Expand the capacity and improve the performance of current operational systems.

Regards,
Mohammed Umar Farooq
HR Recruitment Team, Revest Solutions
9949051730
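One concrete shape the "tools to automate and streamline operations" responsibility can take is a deploy health gate: promote a release only if recent error rates stay under a threshold, otherwise roll back. This is a hedged sketch; the threshold, window, and sample values are invented, and a real pipeline would pull the metrics from a monitoring system:

```python
def deploy_decision(error_rates, threshold=0.02, window=3):
    """Return 'promote' if the last `window` error-rate samples are all
    under `threshold`, 'rollback' if any exceed it, 'wait' if there is
    not yet enough data to decide."""
    recent = error_rates[-window:]
    if len(recent) < window:
        return "wait"  # not enough post-deploy samples yet
    return "promote" if all(r < threshold for r in recent) else "rollback"

# Error rates sampled after a canary deploy: spiky at first, then settling.
samples = [0.10, 0.03, 0.01, 0.012, 0.008]
decision = deploy_decision(samples)
```

Encoding the promote/rollback rule in code, rather than leaving it to on-call judgment, is what turns an incident response step into a repeatable pipeline stage.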

Posted 1 month ago

Apply