16 Job openings at Carnation Infotech
Azure DevOps Engineer

Lucknow, Bengaluru

5 - 8 years

INR 8.0 - 16.0 Lacs P.A.

Hybrid

Full Time

Job Title: DevOps Engineer (Azure)
Location: Bangalore & Lucknow (Hybrid Work Mode)

JD – DevOps Engineer (Azure)

Preferred Skills:
● CI/CD pipelines – Azure Pipelines (candidates with Jenkins, AWS CodePipeline, or GitLab knowledge can also be presented)
● IaC – mainly Terraform knowledge is expected; alternatives such as AWS CloudFormation or Azure ARM templates/Bicep would also work. Knowledge of Ansible, Puppet, or Chef can also be considered, since cross-platform upskilling is easy for such candidates.
● Secrets management – Azure Key Vault
● IAM policies – Azure AD, RBAC
● Log monitoring – Azure Monitor, Log Analytics
● Source control – Azure Repos/GitHub
● Containerization – Docker, Kubernetes (Azure AKS)

Other nice-to-have skills:
● Understanding of networking, firewalls, security groups, and IAM (Identity and Access Management) concepts
● Scripting skills (Bash, PowerShell, Python, or similar)
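For illustration, the secrets-management and scripting skills above often meet in small pipeline utilities. A minimal Python sketch of a pre-deployment secrets check (stdlib only; in a real pipeline the values would come from Azure Key Vault via its SDK rather than environment variables, and the secret names here are invented):

```python
import os

REQUIRED_SECRETS = ["DB_PASSWORD", "API_KEY"]  # hypothetical secret names


def check_required_secrets(env: dict) -> list:
    """Return the names of required secrets missing or empty in `env`.

    Environment variables stand in for Azure Key Vault here only to keep
    the sketch self-contained and runnable.
    """
    return [name for name in REQUIRED_SECRETS if not env.get(name)]


missing = check_required_secrets(os.environ)
if missing:
    print(f"Pipeline pre-check failed; missing secrets: {missing}")
else:
    print("All required secrets present")
```

A check like this typically runs as an early pipeline stage so a deployment fails fast instead of mid-rollout.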

Node.js Backend Developer

Udaipur

5 years

INR 10.0 - 25.0 Lacs P.A.

On-site

Full Time

Required Skills:
● Extensive experience with Node.js and related frameworks such as Express.js.
● Proficiency in source control management systems and continuous integration/deployment environments.
● Strong understanding of agile development methodologies, including Kanban and Scrum.
● Experience with multi-threading, concurrency, and performance optimization in Node.js applications.
● Solid debugging and performance profiling skills.
● Comprehensive knowledge of object-oriented and service-oriented application development techniques.
● Ability to work independently and as part of a team, demonstrating strong initiative and problem-solving skills.
● Excellent communication and interpersonal skills, with a focus on collaboration and team success.

Responsibilities:
● Lead and contribute to multiple development projects, ensuring successful delivery and adherence to quality standards.
● Gather solution requirements, develop detailed technical specifications, and collaborate closely with customers and users.
● Work across various technology domains, applying your expertise to solve complex business challenges.
● Mentor and guide junior developers, fostering a culture of continuous learning and professional growth.
● Participate in the interviewing process to help scale the company's engineering talent.
● Provide technical leadership to the team, offering coaching and mentorship to ensure the delivery of high-quality solutions.
● Establish and enforce best practices for software development within the team.
● Collaborate with cross-functional teams, including software developers, business analysts, and architects, to plan, design, develop, test, and maintain web-based applications.
● Assist in the collection and documentation of user requirements, development of user stories, estimates, and work plans.
● Design, develop, and unit test applications in accordance with established standards.
● Participate in peer reviews of solution designs and related code, ensuring adherence to best practices and coding standards.
● Develop and refine integrations between applications, optimizing performance and scalability.
● Troubleshoot and resolve technical and application issues, providing third-level support to business users as needed.
● Continuously assess opportunities for application and process improvement, documenting and sharing recommendations with the team.
● Stay up to date with the latest industry trends and technologies, researching and evaluating new software products as required.

Job Types: Full-time, Permanent
Pay: ₹1,000,000.00 - ₹2,500,000.00 per year
Location Type: In-person
Schedule: Monday to Friday
Ability to commute/relocate: Udaipur, Rajasthan: Reliably commute or planning to relocate before starting work (Preferred)
Experience: Node.js: 5 years (Preferred)
Work Location: In person

Data Engineer (AWS/Azure) | Full-Time with MNC | Pan India

Noida, Pune, Bengaluru

5 - 10 years

INR 15.0 - 25.0 Lacs P.A.

Hybrid

Full Time

Job Description

Key Responsibilities:

Data Pipeline Development & Optimization:
● Design, develop, and maintain scalable and high-performance data pipelines using PySpark and Databricks.
● Ensure data quality, consistency, and security throughout all pipeline stages.
● Optimize data workflows and pipeline performance, ensuring efficient data processing.

Cloud-Based Data Solutions:
● Architect and implement cloud-native data solutions using AWS services (e.g., S3, Glue, Lambda, Redshift), GCP (DataProc, DataFlow), and Azure (ADF, ADLS).
● Work on ETL processes to transform, load, and process data across cloud platforms.

SQL & Data Modeling:
● Utilize SQL (including windowing functions) to query and analyze large datasets efficiently.
● Work with different data schemas and models relevant to various business contexts (e.g., star/snowflake schemas, normalized and denormalized models).

Data Security & Compliance:
● Implement robust data security measures, ensuring encryption, access control, and compliance with industry standards and regulations.
● Monitor and troubleshoot data pipeline performance and security issues.

Collaboration & Communication:
● Collaborate with cross-functional teams (data scientists, software engineers, and business stakeholders) to design and integrate end-to-end data pipelines.
● Communicate technical concepts clearly and effectively to non-technical stakeholders.

Domain Expertise:
● Understand and work with domain-related data, tailoring solutions to address the specific business needs of the customer.
● Optimize data solutions for the business context, ensuring alignment with customer requirements and goals.

Mentorship & Leadership:
● Provide guidance to junior team members, fostering a collaborative environment and ensuring best practices are followed.
● Drive innovation and promote a culture of continuous learning and improvement within the team.

Required Qualifications:

Experience:
● 6-8 years of total experience in data engineering, with 3+ years of hands-on experience in Databricks, PySpark, and AWS.
● 3+ years of experience in Python and SQL for data engineering tasks.
● Experience working with cloud ETL services such as AWS Glue, GCP DataProc/DataFlow, and Azure ADF and ADLS.

Technical Skills:
● Strong proficiency in PySpark for large-scale data processing and transformation.
● Expertise in SQL, including window functions, for data manipulation and querying.
● Experience with cloud-based ETL tools (AWS Glue, GCP DataFlow, Azure ADF) and understanding of their integration with cloud data platforms.
● Deep understanding of data schemas and models used across various business contexts.
● Familiarity with data warehousing optimization techniques, including partitioning, indexing, and query optimization.
● Knowledge of data security best practices (e.g., encryption, access control, and compliance).

Agile Methodologies:
● Experience working in Agile (Scrum or Kanban) teams for iterative development and delivery.

Communication:
● Excellent verbal and written communication skills, with the ability to explain complex technical concepts to non-technical stakeholders.

Skills: Python, Databricks, PySpark, SQL
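As a concrete illustration of the window-function skill this JD calls out, a small self-contained example using sqlite3 from the Python standard library (the table and values are invented for the sketch; on Databricks the same query would run against Spark SQL):

```python
import sqlite3

# Toy orders table; names and values are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount INTEGER)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?)",
    [("north", 100), ("north", 300), ("south", 200), ("south", 50)],
)

# Rank orders by amount within each region -- a typical windowing query.
rows = conn.execute(
    """
    SELECT region, amount,
           RANK() OVER (PARTITION BY region ORDER BY amount DESC) AS rnk
    FROM orders
    ORDER BY region, rnk
    """
).fetchall()

for row in rows:
    print(row)
```

The `PARTITION BY` clause restarts the ranking per region, which is what distinguishes a window function from a plain `GROUP BY` aggregate.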

Salesforce QA | Full-Time | Lucknow & Bangalore | Hybrid

Lucknow, Bengaluru

4 - 9 years

INR 15.0 - 30.0 Lacs P.A.

Hybrid

Full Time

Job Title: Salesforce QA
Employment Type: Full-Time

Job Summary:
We are seeking two experienced Offshore QA Engineers to join our Salesforce team. These individuals will play a critical role in ensuring the quality and reliability of all changes before they are deployed to our production environment. As QA gatekeepers, they will be responsible for comprehensive testing, documentation, and coordination with development teams to support high-quality, timely releases.

Key Responsibilities:
● Serve as QA gatekeeper for all changes being promoted to the production environment.
● Perform manual and automated testing of Salesforce features, integrations, and customizations.
● Conduct regression, functional, and load/performance testing across environments.
● Collaborate closely with developers to understand the scope of changes and identify potential risk areas.
● Provide QA sign-off before changes are moved to User Acceptance Testing (UAT) and production.
● Participate in Knowledge Transfer (KT) sessions from vendors for new project initiatives.
● Proactively gather requirements and test data to validate business-critical functionalities.
● Maintain comprehensive documentation of test cases, test results, defects, and test cycles.
● Identify opportunities for and implement test automation to optimize the QA process.
● Ensure timely defect tracking, reporting, and resolution in coordination with development teams.

Qualifications:
● 4+ years of experience in Quality Assurance, preferably in Salesforce or CRM environments.
● Strong understanding of Salesforce architecture, customizations, and workflows.
● Experience with QA tools such as Selenium, JIRA, TestRail, or similar platforms.
● Familiarity with test automation frameworks and scripting.
● Proven ability to manage QA activities independently and proactively.
● Excellent communication skills and ability to collaborate across global teams.
● Strong attention to detail and analytical thinking.

Nice to Have:
● Salesforce certifications (e.g., Salesforce Administrator, Platform App Builder, etc.)
● Experience in Agile/Scrum methodologies.
● Load testing tools like JMeter or similar.

JAMF Engineer | Permanent Remote | 6 Months–1 Year Extendable Contract

Lucknow

3 - 8 years

INR 10.0 - 20.0 Lacs P.A.

Remote

Full Time

Job Title: JAMF Engineer

Job Overview:
We are seeking an experienced JAMF Engineer with 3–6 years of hands-on experience managing Apple devices at scale using JAMF. The role involves providing Tier 2 and Tier 3 support, scripting, and collaborating with US- and Europe-based technical teams. Candidates should be well-versed in Bash scripting, familiar with iOS, and possess strong communication and collaboration skills.

Responsibilities:
● Provide Tier 2 and 3 support for JAMF Pro environments and escalated Tier 1 support from the client.
● Collaborate with the technical team to automate solutions and create quick fixes in Self Service.
● Develop custom scripts and solutions to manage Apple devices effectively.
● Meet regularly with the technical team to ensure all support needs are met.
● Assist with installation, monitoring, and configuration of JAMF tools, including JAMF Connect, JAMF Protect, and JAMF Infrastructure Manager.

Qualifications:

Must-Have:
● 3–6 years of experience managing Apple devices with JAMF.
● Strong scripting experience (Bash required; Python or AppleScript a plus).
● Proficiency in JAMF Connect/Protect, macOS lifecycle management, device enrollment (DEP/ABM), VPP, and MDM profiles.
● Experience maintaining secure macOS environments in regulated industries (HIPAA, SOC 2).

Preferred:
● JAMF 400 certification (preferred) or JAMF 300 certification (a plus).
● Familiarity with Zero Trust frameworks and Apple security features.
● Contributions to tools like JAMF Toolkit or open-source Mac Admin utilities.
● Experience with Python and AppleScript scripting.

Technical Skills & Tools:
● JAMF Connect, JAMF Protect, JAMF Infrastructure Manager.
● Bash scripting; Python and AppleScript (preferred).
● Self Service; JAMF Toolkit.

Soft Skills:
● Excellent written and verbal communication skills.
● Ability to collaborate with stakeholders and remote teams.
● Self-starter with a proactive and detail-oriented approach.
● Strong team player with presentation skills.
● Familiarity with G Suite and remote work etiquette.

Additional Information:
● Work requires time overlap until 1 PM EST to coordinate with US- and Europe-based teams.
● Mac administration knowledge is a plus.

Data Engineer with DBT and Airflow | Permanent Remote / WFH | Contract

Lucknow

6 - 11 years

INR 15.0 - 30.0 Lacs P.A.

Remote

Full Time

Job Title: Data Engineer (DBT & Airflow)
Type: Contract (8 hrs/day)
Experience: 6+ years
Location: Remote/WFH
Duration: 3-6 months (possibility of extension)

Job Summary:
We are seeking an experienced Data Engineer with strong expertise in DBT and Apache Airflow to join our team on a contract basis. The ideal candidate will have a proven track record of building scalable data pipelines, transforming raw datasets into analytics-ready models, and orchestrating workflows in a modern data stack. You will play a key role in designing, developing, and maintaining data infrastructure that supports business intelligence, analytics, and machine learning initiatives.

Key Responsibilities:
● Design, build, and maintain robust data pipelines and workflows using Apache Airflow
● Develop and manage modular, testable, and well-documented SQL models using DBT
● Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements
● Implement and monitor data quality checks, alerts, and lineage tracking
● Work with cloud-based data warehouses such as Snowflake, BigQuery, or Redshift
● Optimize ETL/ELT processes for performance and scalability
● Participate in code reviews, documentation, and process improvement initiatives

Required Qualifications:
● 6+ years of professional experience in data engineering or ETL development
● Strong hands-on experience with DBT (Data Build Tool) for data transformation
● Proven experience designing and managing DAGs using Apache Airflow
● Advanced proficiency in SQL and working with cloud data warehouses (Snowflake, BigQuery, Redshift, etc.)
● Solid programming skills in Python
● Experience with data modeling, data warehousing, and performance tuning
● Familiarity with version control systems (e.g., Git) and CI/CD practices
● Strong problem-solving skills and attention to detail
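The Airflow DAG design mentioned above boils down to declaring task dependencies and executing them in topological order. A minimal stdlib sketch of that idea (the task names are hypothetical; a real pipeline would declare an `airflow.DAG` with operators rather than a plain dict):

```python
from graphlib import TopologicalSorter

# Each task maps to the set of tasks it depends on -- mirroring a
# typical extract -> transform (DBT) -> quality-check -> publish DAG.
deps = {
    "extract": set(),
    "transform": {"extract"},
    "data_quality_checks": {"transform"},
    "publish_report": {"data_quality_checks"},
}

# static_order() yields a valid execution order, or raises CycleError
# if the dependencies are not actually a DAG.
order = list(TopologicalSorter(deps).static_order())
print(order)
```

Airflow's scheduler does the same resolution, but additionally runs independent tasks in parallel and retries failures per task.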

Java Developer | Full-Time | Lucknow & Bangalore | Hybrid

Lucknow, Bengaluru

5 - 10 years

INR 15.0 - 30.0 Lacs P.A.

Hybrid

Full Time

Position Name: Java Developer (Java & Microservices)

Job Description Overview:
We are looking for a highly motivated and experienced Backend Developer with a strong background in Java, Spring Boot, and microservices architecture. The ideal candidate is not only technically sound but also demonstrates a strong sense of ownership, attention to detail, and a proactive approach to problem-solving. This role requires someone who understands business requirements deeply and translates them into clean, scalable, and efficient backend solutions.

Key Responsibilities:
● Design, develop, and maintain scalable backend services using Java and Spring Boot.
● Build and optimize RESTful APIs and microservices for high performance and availability.
● Work with relational databases, such as MySQL and PostgreSQL, to model data structures and write efficient queries.
● Containerize applications using Docker and support deployment and troubleshooting across environments.
● Collaborate closely with Product, frontend, QA, and DevOps teams to ensure cohesive and timely delivery.
● Understand business logic, take ownership of feature development end-to-end, and ensure delivery aligns with business goals.
● Conduct thorough testing, debugging, and performance tuning of backend components.
● Contribute to technical discussions, code reviews, and the improvement of best practices.

Must-Have Skills:
● Proficiency in Java, Spring Boot, and microservices-based development.
● Strong experience with MySQL and PostgreSQL databases.
● Hands-on knowledge of Docker and container-based development workflows.
● Solid understanding of RESTful APIs and service integration.
● Strong analytical and problem-solving skills.
● Clear communication skills and the ability to collaborate with cross-functional teams.
● Ownership mindset with the ability to drive solutions independently.
● Attention to detail and high standards for code quality and maintainability.
● Business acumen to understand the ‘why’ behind features and implement logic accordingly.

Good to Have:
● Experience with Microsoft Azure and Azure Kubernetes Service (AKS).
● Exposure to CI/CD pipelines, DevOps practices, and monitoring tools.
● Familiarity with agile methodologies and project tracking tools like JIRA.

GenAI Data Engineer || 6 Months Extendable Contract || Remote

Bengaluru

10 - 20 years

INR 20.0 - 35.0 Lacs P.A.

Remote

Full Time

As the data engineering consultant, you should have the common traits and capabilities listed under Essential Requirements and meet many of the capabilities listed under Desirable Requirements.

Essential Requirements and Skills:
● 10+ years working with customers in the Data Analytics, Big Data, and Data Warehousing field.
● 10+ years working with data modeling tools.
● 5+ years building data pipelines for large customers.
● 2+ years of experience working in the field of Artificial Intelligence that leverages Big Data, in a customer-facing services delivery role.
● 3+ years of experience in Big Data database design.
● A good understanding of LLMs, prompt engineering, fine-tuning, and training.
● Strong knowledge of SQL, NoSQL, and vector databases. Experience with popular enterprise databases such as SQL Server, MySQL, Postgres, and Redis is a must. Additionally, experience with popular vector databases such as PGVector, Milvus, and Elasticsearch is a requirement.
● Experience with major data warehousing providers such as Teradata.
● Experience with data lake tools such as Databricks, Snowflake, and Starburst.
● Proven experience building data pipelines and ETLs for both data transformation and extraction from multiple data sources, plus automation of the deployment and execution of these pipelines.
● Experience with tools such as Apache Spark, Apache Hadoop, Informatica, and similar data processing tools.
● Proficient knowledge of Python and SQL is a must.
● Proven experience building test procedures, ensuring the quality, reliability, performance, and scalability of the data pipelines.
● Ability to develop applications that expose RESTful APIs for data querying and ingestion.
● Experience preparing training data for Large Language Model ingestion and training (e.g., through vector databases).
● Experience integrating with RAG solutions and leveraging related tools such as NVIDIA Guardrails. Ability to define and implement metrics for RAG solutions.
● Understanding of the typical AI tooling ecosystem, including knowledge and experience of Kubernetes, MLOps, LLMOps, and AIOps tools.
● Ability to gain customer trust; ability to plan, organize, and drive customer workshops.
● Good communication skills in English are a must.
● The ability to work in a highly efficient team using an Agile methodology such as Scrum or Kanban.
● Ability to have extended pairing sessions with customers, enabling knowledge transfer in complex domains.
● Ability to influence and interact with confidence and credibility at all levels within the Dell Technologies companies and with our customers, partners, and vendors.
● Experience working on project teams within a defined methodology while adhering to margin, planning, and SOW requirements.
● Ability to be onsite during customer workshops and enablement sessions.

Desirable Requirements and Skills:
● Knowledge of widespread industry AI Studios and AI Workbenches is a plus.
● Experience building and using Information Retrieval (IR) frameworks to support LLM inferencing.
● Working knowledge of Linux is a plus.
● Knowledge of using MinIO is appreciated.
● Experience using Lean and iterative deployment methodologies.
● Working knowledge of cloud technologies is a plus.
● University degree aligned to Data Engineering is a plus.
● Possession of relevant industry certifications, e.g., Databricks Certified Data Engineer, Microsoft certifications, etc.
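The vector-database and RAG items above all rest on nearest-neighbour search over embeddings. A toy stdlib illustration of just the retrieval step (the 3-d "embeddings" are hand-made for the sketch; a real RAG system would use an embedding model and a vector store such as PGVector or Milvus):

```python
import math


def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))


# Hypothetical document embeddings keyed by document name.
docs = {
    "doc_pricing": (0.9, 0.1, 0.0),
    "doc_security": (0.1, 0.9, 0.2),
    "doc_onboarding": (0.0, 0.2, 0.9),
}
query = (0.8, 0.2, 0.1)  # embedding of the user's question

# Retrieve the most similar document to feed into the LLM prompt.
best = max(docs, key=lambda name: cosine(query, docs[name]))
print(best)
```

Production vector databases replace this linear scan with approximate nearest-neighbour indexes so retrieval stays fast at millions of vectors.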

Sr .NET Fullstack Developer | Full-Time with MNC | PAN India | Hybrid

Noida, Pune, Bengaluru

7 - 12 years

INR 17.0 - 32.0 Lacs P.A.

Hybrid

Full Time

Job Title: Sr .NET Fullstack Developer
Experience: 7+ Years
Location: Bangalore, Chennai, Pune, Noida, Kochi, Trivandrum, Gurgaon (3 days WFO & 2 days WFH)
Employment Type: Full-Time

Job Overview:
We are looking for a .NET Fullstack Developer with expertise in Angular to join our growing technology team. The ideal candidate is a self-starter and go-getter with strong communication skills and the ability to engage with multiple stakeholders. You should have a proven track record of delivering scalable and reliable software applications, along with hands-on experience across backend and frontend technologies.

Key Responsibilities:
● Design, develop, and maintain scalable web applications using .NET Core, SQL, and Angular (v8+).
● Build and consume RESTful APIs and ensure integration with front-end interfaces.
● Collaborate with cross-functional teams to gather and understand business requirements.
● Optimize application performance and maintain clean, modular, and reusable code.
● Implement responsive, interactive UIs using Angular, RxJS, and Angular Material.
● Develop and optimize complex SQL queries and ensure efficient database performance.

Must-Have Skills:

Backend – .NET Core:
● Strong proficiency in .NET Core for building robust and scalable applications.
● Deep understanding of object-oriented programming (OOP) and asynchronous programming.
● Familiarity with best practices in architecture and design patterns.

Frontend – Angular:
● Expertise in modern Angular (version 8 or higher).
● Proficient in building responsive UIs using Angular Material and RxJS.

Database – SQL:
● Solid experience with relational databases like Microsoft SQL Server or MySQL.
● Ability to write and optimize complex SQL queries.
● Understanding of database design principles and performance tuning techniques.

Performance Test Engineer | Full-Time with MNC | Kochi, Trivandrum

Kochi, Thiruvananthapuram

5 - 10 years

INR 6.0 - 16.0 Lacs P.A.

Hybrid

Full Time

Job Title: Performance Test Engineer
Experience: 5+ Years
Location: Trivandrum / Cochin (3 days WFO & 2 days WFH)

Job Description:
We are looking for a skilled Performance Test Engineer with over 5 years of experience in performance testing to join our team in Trivandrum or Cochin. The ideal candidate will be responsible for evaluating the responsiveness, stability, scalability, and speed of applications to ensure optimal performance under various conditions.

Key Responsibilities:
● Develop and execute load, stress, scalability, and endurance tests to assess application performance.
● Analyze test results to identify performance bottlenecks such as high resource usage, slow response times, and system failures.
● Provide detailed recommendations for performance improvements.
● Create and maintain automated performance test scripts using tools like JMeter and LoadRunner.
● Track and monitor key performance metrics such as response time, throughput, and CPU and memory usage.
● Collaborate with developers, system admins, and other stakeholders to troubleshoot and resolve performance issues.
● Document and report test results with detailed analysis, highlighting potential risks and optimization strategies.
● Work with DevOps teams to integrate performance testing into CI/CD pipelines for continuous validation.
● Present performance trends and improvement suggestions clearly to stakeholders.

Required Skills:
● Performance Testing
● JMeter
● LoadRunner

Skills: Performance Testing, JMeter, LoadRunner

AVP - New Business Development (India Market)

Lucknow, Bengaluru

15 - 24 years

INR 15.0 - 30.0 Lacs P.A.

Work from Office

Full Time

We're looking for a growth-driven AVP – New Business Development to lead hands-on client acquisition and revenue expansion across India. The ideal candidate will have a strong network in IT/ITES, proven experience in project-based solutions and staff augmentation, and the ability to drive CXO-level conversations.

Key Responsibilities:
● Drive new client acquisition for IT projects and staffing services (contract & full-time).
● Build and manage a strong sales pipeline across sectors like BFSI, Retail, Healthcare, etc.
● Lead proposal development, RFP responses, and pricing discussions.
● Partner with delivery & recruitment teams to ensure client success and repeat business.

Requirements:
● 15+ years in IT sales/business development, with deep exposure to project and staffing services.
● Strong B2B client network and experience closing large enterprise deals in India.
● Excellent communication, negotiation, and relationship-building skills.
● Working knowledge of CRMs and staffing sales cycles.
● MBA or relevant degree preferred.

Candidates can share their resume at shahnawaz.khan@cardinaltsinc.net

Data Engineer || 6 Months Extendable Contract || Carnation Infotech

Bengaluru

6 - 10 years

INR 10.0 - 17.0 Lacs P.A.

Remote

Full Time

Job Summary:
We are looking for a highly skilled Data Engineer with 6+ years of experience to join our team on a contract basis. The ideal candidate will have a strong background in data engineering with deep expertise in DBT (Data Build Tool) and Apache Airflow for building robust data pipelines. You will be responsible for designing, developing, and optimizing data workflows to support analytics and business intelligence initiatives.

Key Responsibilities:
● Design, build, and maintain scalable and reliable data pipelines using DBT and Airflow.
● Collaborate with data analysts, data scientists, and business stakeholders to understand data requirements.
● Optimize data transformation workflows to improve efficiency, quality, and maintainability.
● Implement and maintain data quality checks and validation logic.
● Monitor pipeline performance, troubleshoot failures, and ensure timely data delivery.
● Develop and maintain documentation for data processes, models, and flows.
● Work with cloud data warehouses (e.g., Snowflake, BigQuery, Redshift, etc.) for storage and transformation.
● Support ETL/ELT jobs and integrate data from multiple sources (APIs, databases, flat files).
● Ensure best practices in version control, CI/CD, and automation of data workflows.

Required Skills and Experience:
● 6+ years of hands-on experience in data engineering.
● Strong proficiency with DBT for data modeling and transformation.
● Experience with Apache Airflow for orchestration and workflow scheduling.
● Solid understanding of SQL and relational data modeling principles.
● Experience working with modern cloud data platforms (e.g., Snowflake, BigQuery, Databricks, or Redshift).
● Proficiency in Python or a similar scripting language for data manipulation and automation.
● Familiarity with version control systems like Git and collaborative development workflows.
● Experience with CI/CD tools and automated testing frameworks for data pipelines.
● Excellent problem-solving skills and ability to work independently.
● Strong communication and documentation skills.

Nice to Have:
● Experience with streaming data platforms (e.g., Kafka, Spark Streaming).
● Knowledge of data governance, security, and compliance best practices.
● Experience with dashboarding tools (e.g., Looker, Tableau, Power BI) for data validation.
● Exposure to agile development methodologies.

Contract Terms:
Commitment: Full-time, 8 hours/day
Duration: 6 months, with possible extension
Location: Remote
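The data-quality checks and validation logic this role calls for can be as simple as asserting column constraints before a batch is published. A minimal sketch (stdlib only; DBT would express the same checks declaratively as `not_null` and `unique` schema tests, and the column names here are invented):

```python
def run_quality_checks(rows: list) -> list:
    """Return human-readable failures; an empty list means the batch passes.

    Plain-Python equivalents of DBT's not_null / unique / accepted-range
    schema tests, kept self-contained for the sketch.
    """
    failures = []
    ids = [r.get("order_id") for r in rows]
    if any(i is None for i in ids):
        failures.append("not_null failed on order_id")
    if len(set(ids)) != len(ids):
        failures.append("unique failed on order_id")
    if any(r.get("amount", 0) < 0 for r in rows):
        failures.append("accepted_range failed on amount (negative value)")
    return failures


batch = [
    {"order_id": 1, "amount": 100},
    {"order_id": 2, "amount": 250},
]
print(run_quality_checks(batch))
```

In an Airflow DAG a check like this typically sits between the transform and publish tasks, failing the run (and alerting) instead of silently shipping bad data downstream.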

ServiceNow Developer || 6 Months Extendable Contract || Remote

Bengaluru

5 - 10 years

INR 8.5 - 18.5 Lacs P.A.

Remote

Full Time

The incumbent should have overall experience of 5-8 years in an engineering & development role, with a minimum of 4.5+ years specific to the ServiceNow software suite, in the below technical areas:
● Good understanding of development and implementation of “as a service” and cloud platforms and products. Experience in development of customized SaaS platforms.
● Previous experience implementing and configuring the ServiceNow software suite in one or more of the following modules: Incident Management, Problem Management, Request Management, Change Management, Performance Analytics, Asset Management, Release Management, IT Cost Management, Discovery, Orchestration, ITOM.
● ServiceNow Administrator-level training and/or equivalent hands-on experience.
● In-depth understanding and hands-on application of Service Management business processes. ITIL/ITOM knowledge is a plus.
● Good understanding of the ServiceNow data model.
● High-level understanding of the ServiceNow integration (with other systems/applications) architecture, in particular APIs.
● Experience working with a diverse functional and technical team on multiple concurrent streams/tracks of work.
● Experience functioning as a Developer/Senior Developer in multiple ServiceNow implementations.
● Ability to build & present solutions to management.

Head of Engineering | Full-Time with MNC | Hybrid -Trivandrum

Thiruvananthapuram

18 - 25 years

INR 35.0 - 50.0 Lacs P.A.

Hybrid

Full Time

PFB the Head of Engineering JD and details:

Skills: Delivery manager from a Java or .NET development background; technical architecture, engineering processes, cloud infrastructure, leadership skills, very good communication, strong tech knowledge in .NET/Java, DevOps, agile processes, handling a 150+ team size, setting up an ODC from scratch.
Location: Trivandrum
Experience: 18+ Years
Notice Period: 0 to 15 days

Job Description:
We are looking for a dynamic Engineering Leader to drive technical excellence, mentor engineers, and lead the development of scalable, high-performance solutions. This role requires a strong technical background, leadership skills, and the ability to collaborate across teams to deliver impactful products.

Key Responsibilities:

Technical Leadership:
• Define and drive the technical vision, architecture, and roadmap for engineering teams.
• Ensure best practices in software development, security, scalability, and maintainability.
• Advocate for and implement modern engineering processes, CI/CD, and DevOps culture.

Team Leadership & Mentorship:
• Lead and mentor a team of engineers, fostering a culture of innovation, collaboration, and continuous improvement.
• Set clear goals, provide feedback, and support career growth for engineers.
• Hire, onboard, and retain top engineering talent.

Project & Delivery Management:
• Work closely with project/delivery managers, designers, and stakeholders to define project scope, timelines, and priorities.
• Drive the execution of engineering initiatives with a focus on quality, performance, and reliability.
• Balance technical debt, feature development, and business needs effectively.

Stakeholder Collaboration:
• Be the POC for the client's engineering leadership team.
• Partner with cross-functional teams, including Product, Design, QA, and Business teams, to align engineering efforts with company goals.
• Communicate technical challenges and solutions effectively to both technical and non-technical stakeholders.

Required Qualifications:
• 12+ years of experience in software development, with at least 4+ years in a leadership role.
• Strong expertise in the following tech stack: Java/.NET, JavaScript frameworks, Azure DevOps, Python, Docker/Kubernetes.
• Experience with agile methodologies and modern DevOps practices.
• Proven track record of delivering complex software projects on time and within scope.
• Strong problem-solving, decision-making, and communication skills.

Preferred Qualifications:
• Experience scaling engineering teams in a high-growth environment.
• Deep knowledge of system design, microservices architecture, and cloud infrastructure.
• Prior experience in mentoring and developing engineering talent.
• Familiarity with compliance, security best practices, and regulatory requirements.

JAVA Developer || Carnation Infotech || Night Shift || Lucknow/Remote

Lucknow, Bengaluru

5 - 9 years

INR 7.0 - 14.0 Lacs P.A.

Hybrid

Full Time

Work Timings: 6:30 PM to 3:30 AM

Java/JEE/Jakarta EE: Core Java, Multithreading, Concurrency, Collections, OOP
Microservices: MicroProfile, Open Liberty, RESTful APIs (JAX-RS), JSON-B/JSON-P
Messaging: Apache Kafka (Producers, Consumers, Streams)
Caching: Redis (Cache Management, Data Structures)
Database: JDBC, SQL, Data Source Configuration, Transaction Management
Web Technologies: WebSockets, Servlets, JSP
Frontend Development: JavaScript, JSP
Frameworks: ReactJS, React Native, Bootstrap
Libraries: jQuery
Web Fundamentals: HTML5, CSS3, JSON, XML

DevOps & Cloud:
Containerization/Orchestration: Docker, Kubernetes, OpenShift
CI/CD: QuickBuild or similar

SRE DevOps & Security | Full-Time with MNC | Pan India | Hybrid

Noida, Pune, Bengaluru

5 - 10 years

INR 20.0 - 35.0 Lacs P.A.

Hybrid

Full Time

Experience: 5 to 9 years
Note: Security experience is mandatory for this role; DevSecOps experience is preferred.
Notice Period: Immediate joiners only.

Job Description:
● Any cloud experience (Azure/AWS)
● Terraform, Kubernetes, any CI/CD tool
● Security and code quality tools: Wiz, Snyk, Qualys, Mend, Checkmarx, Dependabot, etc. (experience with any of these tools)
● Secret management: HashiCorp Vault / Akeyless
