Home
Jobs

226 Lambda Jobs - Page 7

JobPe aggregates job listings for easy access, but applications are submitted directly on the original job portal.

4 - 6 years

6 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

We are looking for a Site Reliability Engineer! You'll make a difference by: the SRE L1 Commander is responsible for ensuring the stability, availability, and performance of critical systems and services. As the first line of defense in incident management and monitoring, the role requires real-time response, proactive problem solving, and strong coordination skills to address production issues efficiently.
Responsibilities:
- Monitoring and alerting: proactively monitor system health, performance, and uptime using tools such as Datadog and Prometheus.
- Serve as the primary responder for incidents, troubleshooting and resolving issues quickly to ensure minimal impact on end users.
- Accurately categorize incidents, prioritize them by severity, and escalate to L2/L3 teams when necessary.
- Ensure systems meet Service Level Objectives (SLOs) and maintain uptime as per SLAs.
- Collaborate with DevOps and L2 teams to automate manual incident-response and operational tasks.
- Perform root cause analysis (RCA) of incidents using log aggregators and observability tools to identify patterns and recurring issues.
- Follow predefined runbooks/playbooks to resolve known issues and document fixes for new problems.
You'd describe yourself as an experienced professional with:
- 4 to 6 years of relevant experience in SRE, DevOps, or production support with monitoring tools (e.g., Prometheus, Datadog).
- Working knowledge of Linux/Unix operating systems, basic scripting skills (Python, GitLab actions), and cloud platforms (AWS, Azure, or GCP).
- Familiarity with container orchestration (Kubernetes, Docker, Helm charts) and CI/CD pipelines.
- Exposure to ArgoCD for implementing GitOps workflows and automated deployments for containerized applications.
- Experience in monitoring (Datadog), infrastructure (AWS EC2, Lambda, ECS/EKS, RDS), networking (VPC, Route 53, ELB), and storage (S3, EFS, Glacier).
- Strong troubleshooting and analytical skills to resolve production incidents effectively.
- A basic understanding of networking concepts (DNS, load balancers, firewalls).
- Good communication and interpersonal skills for incident communication and escalation.
Preferred certifications: AWS Certified SysOps Administrator - Associate, AWS Certified Solutions Architect - Associate, or AWS Certified DevOps Engineer - Professional.
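For context on the monitoring-and-alerting duties above, here is a minimal sketch of the kind of automated health check an L1 responder might run; the endpoint URL, latency threshold, and severity labels are illustrative assumptions, not details from the posting.

```python
import time
import urllib.error
import urllib.request

# Illustrative endpoint and threshold; real SLO values would come from the service runbook.
ENDPOINT = "https://example.com/health"
LATENCY_WARN_SECONDS = 1.0


def check_endpoint(url: str = ENDPOINT) -> str:
    """Poll a health endpoint and return a coarse severity label for triage."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=5):
            latency = time.monotonic() - start
    except urllib.error.HTTPError as exc:   # 4xx/5xx responses
        return f"SEV1: HTTP {exc.code} from {url}"
    except urllib.error.URLError as exc:    # DNS failure, timeout, refused connection
        return f"SEV1: endpoint unreachable ({exc.reason})"
    if latency > LATENCY_WARN_SECONDS:
        return f"SEV3: slow response ({latency:.2f}s)"
    return "OK"


if __name__ == "__main__":
    print(check_endpoint())
```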

Posted 1 month ago

Apply

10 - 15 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Experience: 10 years. Work Type: Onsite. Budget: As per market standards.
Primary Skills:
- NodeJS: 6+ years of hands-on backend development
- JavaScript / HTML / CSS: strong frontend development capabilities
- ReactJS / VueJS: working knowledge or project experience preferred
- AWS Serverless Architecture: mandatory (Lambda, API Gateway, S3)
- LLM Integration / AI Development: experience with OpenAI and Anthropic APIs
- Prompt Engineering: context management and token optimization
- SQL / NoSQL Databases: solid experience with relational and non-relational DBs
- End-to-End Deployment: deploy, debug, and manage full-stack apps
- Clean Code: writes clean, maintainable, production-ready code
Secondary Skills:
- Amazon Bedrock: familiarity is a strong plus
- Web Servers: experience with Nginx / Apache configuration
- RAG Patterns / Vector DBs / AI Agents: bonus experience
- Software Engineering Best Practices: strong design and architecture skills
- CI/CD / DevOps Exposure: beneficial for full pipeline integration
Expectations:
- Own frontend and backend development
- Collaborate closely with engineering and client teams
- Build scalable, secure, and intelligent systems
- Influence architecture and tech stack decisions
- Stay up to date with AI trends and serverless best practices
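As a point of reference for the serverless and LLM-integration skills listed above, below is a minimal sketch of an AWS Lambda handler behind API Gateway that forwards a prompt to a Bedrock-hosted model. The model ID, token limit, and route are assumptions, and the listing itself names Node.js, so treat this Python version purely as a sketch of the pattern.

```python
import json
import boto3

# Hypothetical model ID; any Bedrock-hosted model the account has access to would do.
MODEL_ID = "anthropic.claude-3-haiku-20240307-v1:0"
bedrock = boto3.client("bedrock-runtime")


def lambda_handler(event, context):
    """API Gateway (proxy integration) -> Lambda -> Bedrock round trip."""
    payload = json.loads(event.get("body") or "{}")
    prompt = payload.get("prompt", "Say hello.")

    response = bedrock.converse(
        modelId=MODEL_ID,
        messages=[{"role": "user", "content": [{"text": prompt}]}],
        inferenceConfig={"maxTokens": 256},  # keep token usage bounded
    )
    answer = response["output"]["message"]["content"][0]["text"]

    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"answer": answer}),
    }
```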

Posted 1 month ago

Apply

5 - 9 years

15 - 18 Lacs

Mumbai, Pune, Bengaluru

Work from Office

Source: Naukri

Description: Hands-on experience with AWS services including S3, Lambda, Glue, API Gateway, and SQS. Strong skills in data engineering on AWS, with proficiency in Python, PySpark, and SQL. Experience with batch job scheduling and managing data dependencies. Knowledge of data processing tools like Spark and Airflow. Automate repetitive tasks and build reusable frameworks to improve efficiency. Provide Run/DevOps support and manage the ongoing operation of data services. Location: Bangalore, Mumbai, Pune, Chennai, Kolkata, Hyderabad.
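To illustrate the S3/Lambda/SQS combination mentioned above, here is a minimal sketch of an SQS-triggered Lambda that lands each message in S3; the bucket name and key layout are assumptions, not part of the listing.

```python
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "my-data-landing-bucket"  # hypothetical bucket name


def lambda_handler(event, context):
    """Consume an SQS batch and land each message as a JSON object in S3."""
    records = event.get("Records", [])
    for record in records:
        message = json.loads(record["body"])
        key = f"landing/{record['messageId']}.json"
        s3.put_object(
            Bucket=BUCKET,
            Key=key,
            Body=json.dumps(message).encode("utf-8"),
            ContentType="application/json",
        )
    return {"processed": len(records)}
```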

Posted 1 month ago

Apply

5 - 10 years

15 - 27 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Source: Naukri

Role & responsibilities: 4+ years of overall experience with at least 3 years of experience in AWS data engineering.
- Implement ETL processes to extract, transform, and load data from various sources into data lakes or data warehouses.
- Hands-on experience implementing data ingestion, ETL, and data processing using AWS Glue, Spark, and Python.
- Proficiency in AWS services related to data engineering, such as AWS IAM, S3, SQS, SNS, Glue, Lambda, Athena, RDS, and CloudWatch.
- Strong knowledge of SQL (e.g., joins and aggregations) and experience with relational databases.
- Monitor and troubleshoot data pipeline issues to ensure smooth operation.
- CI/CD pipelines: good experience with CI/CD tools and pipelines, particularly Bitbucket, with experience in automating deployment processes.
- Experience delivering AI/ML-related projects, including model development, data preprocessing, and deployment in production environments.
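For reference, here is a minimal sketch of the kind of AWS Glue PySpark job the listing describes; the job arguments, S3 paths, and cleanup logic are placeholders rather than a real pipeline.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job bootstrap; source/target paths arrive as job arguments.
args = getResolvedOptions(sys.argv, ["JOB_NAME", "source_path", "target_path"])
glue_context = GlueContext(SparkContext.getOrCreate())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Extract raw CSV from S3, apply basic cleanup, and load Parquet back to S3.
raw = spark.read.option("header", "true").csv(args["source_path"])
cleaned = raw.dropDuplicates().na.drop(subset=["id"])
cleaned.write.mode("overwrite").parquet(args["target_path"])

job.commit()
```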

Posted 1 month ago

Apply

4 - 9 years

22 - 25 Lacs

Mohali, Chandigarh

Work from Office

Source: Naukri

With a deep focus on TypeScript, they develop both the frontend (React/Next.js) and backend (Node.js/NestJS) of the platform. In addition, they are responsible for the cloud infrastructure, including the setup of CI/CD pipelines, deployment processes, and monitoring via AWS services. They bring a solution-oriented mindset, strong communication skills (English), and a high level of ownership and reliability to the team, contributing across all areas of the product development lifecycle.
- Language: TypeScript (full stack)
- Frontend: React (Next.js) with Tailwind CSS or MUI
- Backend: Node.js with Express or NestJS
- Mobile (optional): React Native (Expo)
- Cloud: AWS (e.g., EC2, ECS, Fargate, Lambda, S3, RDS)
- DevOps: Docker, GitHub Actions (CI/CD), monitoring
- Database: PostgreSQL, Redis, DynamoDB
Soft skills:
- Clear and proactive communication across time zones
- Critical thinking and a solution-oriented mindset
- High reliability and ownership, delivering consistent results
- A collaborative working style that fits seamlessly into our startup culture
- A continuous-learning mentality, keeping up with best practices
They don't just write code: they contribute to product thinking, help us move fast without breaking things, and are a core part of our extended team.

Posted 1 month ago

Apply

10 - 15 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Source: Naukri

Experience: 10 years. Work Type: Onsite. Budget: As per market standards.
Primary Skills:
- NodeJS: 6+ years of hands-on backend development
- JavaScript / HTML / CSS: strong frontend development capabilities
- ReactJS / VueJS: working knowledge or project experience preferred
- AWS Serverless Architecture: mandatory (Lambda, API Gateway, S3)
- LLM Integration / AI Development: experience with OpenAI and Anthropic APIs
- Prompt Engineering: context management and token optimization
- SQL / NoSQL Databases: solid experience with relational and non-relational DBs
- End-to-End Deployment: deploy, debug, and manage full-stack apps
- Clean Code: writes clean, maintainable, production-ready code
Secondary Skills:
- Amazon Bedrock: familiarity is a strong plus
- Web Servers: experience with Nginx / Apache configuration
- RAG Patterns / Vector DBs / AI Agents: bonus experience
- Software Engineering Best Practices: strong design and architecture skills
- CI/CD / DevOps Exposure: beneficial for full pipeline integration
Expectations:
- Own frontend and backend development
- Collaborate closely with engineering and client teams
- Build scalable, secure, and intelligent systems
- Influence architecture and tech stack decisions
- Stay up to date with AI trends and serverless best practices

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Mumbai

Work from Office

Source: Naukri

We seek a seasoned Lead Full-Stack Engineer with 7+ years of expertise in front-end and back-end technologies and strong leadership and soft skills. The ideal candidate should be proficient in Angular, NodeJS, and Python, with additional experience in AWS, GoLang, and AI integration. This role requires availability to work within the EST time zone and will involve leading the development of innovative enterprise platforms while mentoring a team of engineers.
Key Responsibilities:
- Technical Leadership: lead web application design, development, and maintenance using Angular and NodeJS.
- Architect and optimize backend services and APIs using Python and GoLang.
- Guide the integration of AI-driven features and tools into existing platforms.
- Oversee the use of AWS cloud services for application deployment, scaling, and maintenance.
- Monitor and optimize application performance on AWS.
- Stay updated with emerging technologies and implement them in the development process.
Team Collaboration and Soft Skills:
- Mentorship: mentor and guide junior engineers, fostering a collaborative and learning-oriented environment.
- Communication: effectively communicate technical concepts to both technical and non-technical stakeholders.
- Problem-Solving: proactively identify and resolve bottlenecks, ensuring smooth project delivery.
Skills in the spotlight:
- Front-End: mastery of Angular for building dynamic user interfaces.
- Back-End: expertise in NodeJS and Python for developing scalable microservices.
- Cloud: extensive hands-on experience with AWS services (EC2, S3, Lambda, etc.).
- Programming Languages: advanced skills in GoLang for backend services.
- AI: proven experience in integrating AI/ML models into web or mobile applications.
- DevOps: strong knowledge of CI/CD pipelines and version control using Git.
- Agile Methodologies: proficiency in leading projects within Agile/Scrum environments.
Preferred Qualifications:
- Knowledge of additional JavaScript frameworks and libraries.
- Experience with database management systems like MongoDB, PostgreSQL, or MySQL.
- Familiarity with Docker and Kubernetes for containerization.
- Understanding of secure coding practices and data protection.

Posted 1 month ago

Apply

7 - 10 years

8 - 14 Lacs

Surat

Work from Office

Source: Naukri

Role: Lead Full Stack Engineer
We seek a seasoned Lead Full-Stack Engineer with 7+ years of expertise in front-end and back-end technologies and strong leadership and soft skills. The ideal candidate should be proficient in Angular, NodeJS, and Python, with additional experience in AWS, GoLang, and AI integration. This role requires availability to work within the EST time zone and will involve leading the development of innovative enterprise platforms while mentoring a team of engineers.
Key Responsibilities:
- Technical Leadership: lead web application design, development, and maintenance using Angular and NodeJS.
- Architect and optimize backend services and APIs using Python and GoLang.
- Guide the integration of AI-driven features and tools into existing platforms.
- Oversee the use of AWS cloud services for application deployment, scaling, and maintenance.
- Monitor and optimize application performance on AWS.
- Stay updated with emerging technologies and implement them in the development process.
Team Collaboration and Soft Skills:
- Mentorship: mentor and guide junior engineers, fostering a collaborative and learning-oriented environment.
- Communication: effectively communicate technical concepts to both technical and non-technical stakeholders.
- Problem-Solving: proactively identify and resolve bottlenecks, ensuring smooth project delivery.
Skills in the spotlight:
- Front-End: mastery of Angular for building dynamic user interfaces.
- Back-End: expertise in NodeJS and Python for developing scalable microservices.
- Cloud: extensive hands-on experience with AWS services (EC2, S3, Lambda, etc.).
- Programming Languages: advanced skills in GoLang for backend services.
- AI: proven experience in integrating AI/ML models into web or mobile applications.
- DevOps: strong knowledge of CI/CD pipelines and version control using Git.
- Agile Methodologies: proficiency in leading projects within Agile/Scrum environments.
Preferred Qualifications:
- Knowledge of additional JavaScript frameworks and libraries.
- Experience with database management systems like MongoDB, PostgreSQL, or MySQL.
- Familiarity with Docker and Kubernetes for containerization.
- Understanding of secure coding practices and data protection.
Location: India

Posted 1 month ago

Apply

14 - 19 years

45 - 50 Lacs

Chennai

Remote

Source: Naukri

Location: India/Remote
About the Job: We are seeking a seasoned Java Full Stack Enterprise Architect with 14 to 19 years of experience to lead and drive enterprise-level projects. The ideal candidate will have strong expertise in Java, AWS (Amazon Web Services), Kafka, Docker, Kubernetes, and other cutting-edge technologies. This role requires experience in application transformation, modernization, and containerization initiatives. Prior experience in the healthcare industry is highly desirable.
What you will do:
- Architectural Leadership: design and implement scalable, resilient, and secure full-stack solutions using Java and modern frameworks. Provide end-to-end architecture guidance for enterprise transformation and modernization projects. Define best practices for application design, development, deployment, and maintenance in a cloud-native environment.
- Cloud and AWS Solutioning: architect solutions leveraging AWS services (e.g., EC2, S3, Lambda, RDS, DynamoDB). Develop and maintain cloud migration strategies, ensuring high availability and cost optimization. Create detailed documentation, including solution designs and architectural diagrams.
- Containerization & Orchestration: lead the adoption of Docker and Kubernetes to containerize applications. Oversee the orchestration of microservices in distributed systems to ensure scalability and reliability. Define CI/CD pipelines to automate deployment processes.
- Data Streaming & Integration: design and implement event-driven architectures using Kafka. Ensure seamless integration across enterprise systems and data pipelines.
- Transformation & Modernization: drive legacy application modernization to microservices and cloud-native architecture. Assess the current technology stack and recommend improvements to align with business goals.
- Team Collaboration: mentor engineering teams, fostering a culture of innovation and continuous improvement. Collaborate with cross-functional teams, including product managers, developers, and business stakeholders.
Who you are:
- Education: Bachelor's degree in Computer Science, Engineering, or a related field (Master's degree preferred). 14+ years of experience in Java full-stack architecture.
- Core Expertise: Java, Spring Boot, RESTful APIs, and front-end technologies (e.g., Angular, React, or Vue.js).
- Cloud Technologies: strong experience with AWS services, cloud-native application development, and deployment strategies.
- Containerization & Orchestration: proficiency in Docker, Kubernetes, and Helm.
- Data Streaming: advanced knowledge of Kafka, including architecture, implementation, and troubleshooting.
- Modernization: hands-on experience with application transformation and legacy system modernization projects.
- Leadership: proven ability to lead large teams, drive complex projects, and align technical deliverables with business objectives.
- Industry Knowledge: prior experience in the healthcare industry is a plus but not mandatory.
Preferred Qualifications:
- Certifications: AWS Certified Solutions Architect or equivalent.
- Experience with healthcare standards (e.g., HIPAA, HL7, FHIR) is a significant advantage.
- Strong understanding of DevOps practices and tools (e.g., Jenkins, GitHub Actions).
Soft Skills:
- Attention to detail; a dedicated self-starter with excellent people skills; a quick learner and go-getter.
- Effective time and project management; an analytical thinker and a great team player.
- Strong leadership, interpersonal, and problem-solving skills.
- English language proficiency is required to communicate effectively in a professional environment; excellent communication skills are a must.
- Strong problem-solving skills and a creative mindset to bring fresh ideas to the table.
- Confidence and self-assurance in their skills and expertise, enabling them to contribute to team success and engage with colleagues and clients in a positive, assured manner.
- Accountability and responsibility for deliverables and outcomes: demonstrates ownership of tasks, meets deadlines, and ensures high-quality results.
- Strong collaboration skills, working effectively with cross-functional teams, sharing insights, and contributing to shared goals and solutions.
- Continuously explores emerging trends, technologies, and industry best practices to drive innovation and maintain a competitive edge.
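To make the event-driven Kafka requirement above concrete, here is a minimal sketch of a producer publishing a domain event, assuming the kafka-python client; the broker address, topic name, and event shape are placeholders.

```python
import json

from kafka import KafkaProducer  # assumes the kafka-python package

# Broker address and topic name are placeholders.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)


def publish_event(event_type: str, payload: dict) -> None:
    """Publish a domain event; downstream microservices consume it asynchronously."""
    producer.send("patient-events", {"type": event_type, "payload": payload})
    producer.flush()  # block until the broker acknowledges


publish_event("appointment.created", {"appointment_id": "A-123", "status": "scheduled"})
```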

Posted 1 month ago

Apply

4 - 7 years

12 - 16 Lacs

Hyderabad

Work from Office

Source: Naukri

We are looking for a highly skilled Full Stack Developer with hands-on experience in Python, GenAI, and AWS cloud services. The ideal candidate should have proficiency in backend development using NodeJS, ExpressJS, Python Flask/FastAPI, and RESTful API design. On the frontend, strong skills in Angular, ReactJS, TypeScript, etc. are required.
Roles and Responsibilities:
- Design and develop cloud-native applications and services using AWS services such as Lambda, API Gateway, ECS, EKS, DynamoDB, Glue, Redshift, and EMR.
- Implement CI/CD pipelines using AWS CodePipeline, CodeBuild, and CodeDeploy to automate application deployment and updates.
- Collaborate with architects and other engineers to design scalable and secure application architectures on AWS.
- Monitor application performance and implement optimizations to enhance reliability, scalability, and efficiency.
- Implement security best practices for AWS applications, including identity and access management (IAM), encryption, and secure coding practices.
- Design and deploy containerized applications using AWS services such as Amazon ECS (Elastic Container Service), Amazon EKS (Elastic Kubernetes Service), and AWS Fargate.
- Configure and manage container orchestration, scaling, and deployment strategies, while optimizing container performance and resource utilization by tuning settings and configurations.
- Implement and manage application observability tools such as AWS CloudWatch, AWS X-Ray, Prometheus, Grafana, and the ELK Stack (Elasticsearch, Logstash, Kibana).
- Develop and configure monitoring, logging, and alerting systems to provide insights into application performance and health, creating dashboards and reports to visualize application metrics and logs for proactive monitoring and troubleshooting.
- Integrate AWS services with application components and external systems, ensuring smooth and efficient data flow, and diagnose/troubleshoot issues related to application performance, availability, and reliability.
- Create and maintain comprehensive documentation for application design, deployment processes, and configuration.
Job Requirements:
- Proficiency in AWS services such as Lambda, API Gateway, ECS, EKS, DynamoDB, S3, RDS, Glue, Redshift, and EMR.
- Experience in developing and deploying AI solutions with Python and JavaScript.
- Strong background in machine learning, deep learning, and data modeling.
- Good understanding of Agile methodologies and version control systems like Git.
- Familiarity with container orchestration concepts and tools, including Kubernetes and Docker Swarm.
- Understanding of AWS security best practices, including IAM, KMS, and encryption.
- Observability tools: proficiency with AWS CloudWatch, AWS X-Ray, Prometheus, Grafana, and the ELK Stack.
- Monitoring: experience with monitoring and logging tools such as AWS CloudWatch, CloudTrail, or the ELK Stack.
- Collaboration: strong teamwork and communication skills, with the ability to work effectively with cross-functional teams.
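As a small illustration of the observability requirement above, here is a sketch that publishes a custom CloudWatch metric with boto3; the namespace, metric name, and dimension values are assumptions.

```python
import boto3

cloudwatch = boto3.client("cloudwatch")


def emit_latency_metric(endpoint: str, latency_ms: float) -> None:
    """Publish a custom latency metric; namespace and dimensions are illustrative."""
    cloudwatch.put_metric_data(
        Namespace="MyApp/API",
        MetricData=[
            {
                "MetricName": "RequestLatency",
                "Dimensions": [{"Name": "Endpoint", "Value": endpoint}],
                "Value": latency_ms,
                "Unit": "Milliseconds",
            }
        ],
    )


emit_latency_metric("/orders", 137.5)
```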

Posted 1 month ago

Apply

1 - 3 years

12 - 16 Lacs

Kochi

Work from Office

Source: Naukri

We are looking for a highly skilled and experienced professional with 1 to 3 years of experience to join our team as a Staff in the Data & Analytics domain. The ideal candidate will have a strong background in data analytics, excellent communication skills, and the ability to work collaboratively with cross-functional teams.
Roles and Responsibilities:
- Contribute to various technical streams of EI implementation projects.
- Provide product- and design-level technical best practices.
- Interface and communicate with onsite coordinators.
- Complete assigned tasks on time and provide regular status reports to the lead.
- Support business development activities by leading pursuits and developing strong relationships with existing clients.
- Participate in case teams to provide solutions to unstructured problems.
Job Requirements:
- BE/BTech/MCA/MBA with adequate industry experience.
- Strong SQL skills and experience writing stored procedures for data transformation.
- Experience in ETL/data integration using Informatica or any standard tool.
- Good understanding of DWH concepts, SQL, and AWS cloud exposure (S3/EMR/Lambda/Glue).
- Experience using Snowflake as a data integration tool; certification is a plus.
- Performance analysis and performance tuning skills in Snowflake.
- Experience using semi-structured data with Snowflake is preferred.
- Experience with Clone, Time Travel, and other advanced features in Snowflake.
- Familiarity with Agile project delivery.
- Exposure to cloud technologies such as Azure and AWS is preferred.
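For the Snowflake requirement above, here is a brief sketch of a Time Travel check using the Snowflake Python connector; the connection parameters and table name are placeholders, not details from the listing.

```python
import snowflake.connector  # assumes the snowflake-connector-python package

# Connection parameters are placeholders.
conn = snowflake.connector.connect(
    account="my_account",
    user="etl_user",
    password="********",
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    # Time Travel: compare the current row count with the table as of one hour ago.
    cur.execute("SELECT COUNT(*) FROM orders")
    now_count = cur.fetchone()[0]
    cur.execute("SELECT COUNT(*) FROM orders AT(OFFSET => -3600)")
    hour_ago_count = cur.fetchone()[0]
    print(f"Rows added in the last hour: {now_count - hour_ago_count}")
finally:
    conn.close()
```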

Posted 1 month ago

Apply

8 - 10 years

10 - 14 Lacs

Gurugram

Work from Office

Source: Naukri

Practice Overview - Practice: Data and Analytics (DNA) - Analytics Consulting
The Role and Responsibilities: We have open positions ranging from Data Engineer to Lead Data Engineer, providing talented and motivated professionals with excellent career and growth opportunities. We seek individuals with relevant prior experience in quantitatively intense areas to join our team. You'll be working with varied and diverse teams to deliver unique and unprecedented solutions across all industries. In the data engineering track, you will be primarily responsible for developing and monitoring high-performance applications that can rapidly deploy the latest machine learning frameworks and other advanced analytical techniques at scale. This role requires you to be a proactive learner who quickly picks up new technologies whenever required. Most projects involve handling big data, so you will work on related technologies extensively. You will work closely with other team members to support project delivery and ensure client satisfaction.
Your responsibilities will include:
- Working alongside Oliver Wyman consulting teams and partners, engaging directly with clients to understand their business challenges
- Exploring large-scale data and designing, developing, and maintaining data/software pipelines and ETL processes for internal and external stakeholders
- Explaining, refining, and developing the necessary architecture to guide stakeholders through the journey of model building
- Advocating the application of best practices in data engineering, code hygiene, and code reviews
- Leading the development of proprietary data engineering assets, ML algorithms, and analytical tools on varied projects
- Creating and maintaining documentation to support stakeholders and runbooks for operational excellence
- Working with partners and principals to shape proposals that showcase our data engineering and analytics capabilities
- Travelling to clients' locations across the globe, when required, to understand their problems and deliver appropriate solutions in collaboration with them
- Keeping up with emerging state-of-the-art data engineering techniques in your domain
Your Attributes, Experience & Qualifications:
- Bachelor's or master's degree in a computational or quantitative discipline from a top academic program (Computer Science, Informatics, Data Science, or related)
- Exposure to building cloud-ready applications
- Exposure to test-driven development and integration
- Pragmatic and methodical approach to solutions and delivery with a focus on impact
- Independent worker with the ability to manage workload and meet deadlines in a fast-paced environment
- Collaborative team player
- Excellent verbal and written communication skills and command of English
- Willingness to travel
- Respect for confidentiality
Technical Background:
- Prior experience in designing and deploying large-scale technical solutions
- Fluency in modern programming languages (Python is mandatory; R and SAS desired)
- Experience with AWS/Azure/Google Cloud, including familiarity with services such as S3, EC2, Lambda, and Glue
- Strong SQL skills and experience with relational databases such as MySQL, PostgreSQL, or Oracle
- Experience with big data tools like Hadoop, Spark, and Kafka
- Demonstrated knowledge of data structures and algorithms
- Familiarity with version control systems like GitHub or Bitbucket
- Familiarity with modern storage and computational frameworks
- Basic understanding of agile practices such as CI/CD, application resiliency, and security
Valued but not required:
- Compelling side projects or contributions to the open-source community
- Prior experience with machine learning frameworks (e.g., Scikit-Learn, TensorFlow, Keras/Theano, Torch, Caffe, MxNet)
- Familiarity with containerization technologies, such as Docker and Kubernetes
- Experience with UI development using frameworks such as Angular, Vue, or React
- Experience with NoSQL databases such as MongoDB or Cassandra
- Experience presenting at data science conferences and connections within the data science community
- Interest/background in Financial Services in particular, as well as other sectors where Oliver Wyman has a strategic presence

Posted 1 month ago

Apply

6 - 8 years

8 - 12 Lacs

Bengaluru

Work from Office

Source: Naukri

We are looking for a highly motivated individual who is excited about continually learning new things and being part of a fast-paced team delivering cutting-edge solutions that drive new products and features critical for our customers. In this role, you will work across geographical and organizational boundaries with all levels of software engineers, technical management, and business partners. As part of the development team, you will help maintain the quality, reliability, and availability of key systems that provide search and information retrieval for our core products. The ideal candidate will have extensive skills in design, development, delivery, execution, leadership, and mentoring, directing co-workers across geographic boundaries.
About the Role: In this role you will work as a Senior Software Engineer. Primary responsibilities include working with technology peers and business partners to solve business problems and providing support to ensure the availability of our products.
- Develop high-quality code.
- Maintain existing software solutions by fixing bugs and optimizing performance.
- Provide technical support to operations or other development teams by troubleshooting, debugging, and solving critical issues in the production environment in a timely manner to minimize user and revenue impact.
- Work closely with our business partners and stakeholders to identify requirements and the priority of new enhancements and features, and ensure that stakeholder needs are met.
- Improve system reliability by implementing automated testing and deployment strategies.
- Design microservices architecture for complex applications.
- Participate in design discussions with other engineers.
- Continuously improve your knowledge of programming languages and technologies.
- Lead small projects or tasks within a larger project team.
- Guide junior developers on best practices and coding standards.
About you:
- Bachelor's degree in computer science, engineering, information technology, or equivalent experience
- Java, microservices, Spring Boot, and JavaScript with AWS and Python
- 6+ years of experience in Java technologies
- 1+ years of experience in Python
- 3+ years of experience developing Spring Boot based microservices
- 2+ years of experience in frontend technologies such as JavaScript
- 3+ years of background in software development using AWS capabilities (EC2, Lambda, IAM, RDS, CloudFormation)
- Broad experience in enterprise-class system design and development, including use of Java, Oracle, SQL, and messaging technologies
- Experience working across geographical sites
- Experience with CI/CD pipelines and GitHub Actions
- Demonstrated understanding of the Linux operating system
- Excellent interpersonal, verbal, and written communication skills

Posted 1 month ago

Apply

3 - 8 years

5 - 10 Lacs

Bengaluru

Work from Office

Source: Naukri

The Core AI, BI & Data Platforms Team has been established to create, operate, and run the enterprise AI, BI, and data platforms that reduce the time to market for reporting, analytics, and data science teams to run experiments, train models, and generate insights, as well as to evolve and run the CoCounsel application and its shared capability, the CoCounsel AI Assistant. The Enterprise Data Platform aims to provide self-service capabilities for fast and secure ingestion and consumption of data across TR. At Thomson Reuters, we are recruiting a team of motivated cloud professionals to transform how we build, manage, and leverage our data assets. The Data Platform team in Bangalore is seeking an experienced Software Engineer with a passion for engineering cloud-based data platform systems. Join our dynamic team as a Software Engineer and take a pivotal role in shaping the future of our Enterprise Data Platform. You will develop and implement data processing applications and frameworks on cloud-based infrastructure, ensuring the efficiency, scalability, and reliability of our systems.
In this opportunity as a Software Engineer, you will:
- Develop data processing applications and frameworks on cloud-based infrastructure in partnership with data analysts and architects, with guidance from the Lead Software Engineer.
- Innovate with new approaches to meet data management requirements.
- Make recommendations about platform adoption, including technology integrations, application servers, libraries, AWS frameworks, documentation, and usability by stakeholders.
- Contribute to improving the customer experience.
- Participate in code reviews to maintain a high-quality codebase.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Work closely with product owners, designers, and other developers to understand requirements and deliver solutions.
- Effectively communicate and liaise across the data platform and management teams.
- Stay updated on emerging trends and technologies in cloud computing.
About You: You're a fit for the role of Software Engineer if you meet all or most of these criteria:
- Bachelor's degree in Computer Science, Engineering, or a related field
- 3+ years of relevant experience implementing data lakes and data management technologies for large-scale organizations
- Experience building and maintaining data pipelines with excellent run-time characteristics such as low latency, fault tolerance, and high availability
- Proficiency in the Python programming language
- Experience with AWS services and management, including serverless, container, queueing, and monitoring services such as Lambda, ECS, API Gateway, RDS, DynamoDB, Glue, S3, IAM, Step Functions, CloudWatch, SQS, and SNS
- Good knowledge of consuming and building APIs
- Business intelligence tools such as Power BI
- Fluency in querying languages such as SQL
- Solid understanding of software development practices such as version control via Git, CI/CD, and release management
- Agile development cadence
- Good critical thinking, communication, documentation, troubleshooting, and collaborative skills

Posted 1 month ago

Apply

2 - 3 years

4 - 5 Lacs

Noida, Gurugram

Work from Office

Source: Naukri

About the Role: Grade Level (for internal use): 10. The Role: Cloud DevOps Engineer.
The Impact: This role is crucial to the business as it directly contributes to the development and maintenance of cloud-based DevOps solutions on the AWS platform.
What's in it for you:
- Drive Innovation: join a dynamic and forward-thinking organization at the forefront of the automotive industry. Contribute to shaping our cloud infrastructure and drive innovation in cloud-based solutions on the AWS platform.
- Technical Growth: gain valuable experience and enhance your skills by working with a team of talented cloud engineers. Take on challenging projects and collaborate with cross-functional teams to define and implement cloud infrastructure strategies.
- Impactful Solutions: contribute to the development of solutions that directly impact the scalability, reliability, and security of our cloud infrastructure. Play a key role in delivering high-quality products and services to our clients.
We are seeking a highly skilled and driven Cloud DevOps Engineer to join our team. The candidate should have experience developing and deploying native cloud-based solutions and a passion for container-based technologies, immutable infrastructure, and continuous delivery practices in deploying global software.
Responsibilities:
- Deploy scalable, highly available, secure, and fault-tolerant systems on AWS for the development and test lifecycle of AWS cloud-native solutions.
- Configure and manage the AWS environment for use with web applications.
- Engage with development teams to document and implement best-practice (low-maintenance) cloud-native solutions for new products.
- Focus on building Dockerized application components and integrating with AWS ECS.
- Contribute to application design and architecture, especially as it relates to AWS services.
- Manage AWS security groups.
- Collaborate closely with the technical architects by providing input into the overall solution architecture.
- Implement DevOps technologies and processes, e.g., containerization, CI/CD, infrastructure as code, metrics, and monitoring.
- Apply experience of networks, security, load balancers, DNS, and other infrastructure components to cloud (AWS) environments.
- Bring a passion for solving challenging issues, and promote cooperation and commitment within a team to achieve common goals.
What you will need:
- Understanding of networking, infrastructure, and applications from a DevOps perspective
- Infrastructure as code (IaC) using Terraform and CloudFormation
- Deep knowledge of AWS, especially services like ECS/Fargate, ECR, S3/CloudFront, load balancing, Lambda, VPC, Route 53, RDS, CloudWatch, EC2, and AWS Security Center
- Experience managing AWS security groups
- Experience building scalable infrastructure in AWS
- Experience with one or more AWS SDKs and/or the CLI
- Experience in automation, CI/CD pipelines, and DevOps principles
- Experience with Docker containers
- Experience with operational tools and the ability to apply best practices for infrastructure and software deployment
- Software design fundamentals in data structures, algorithm design, and performance analysis
- Experience working in an Agile development environment
- Strong written and verbal communication and presentation skills
Education and Experience:
- Bachelor's degree in Computer Science, Information Systems, Information Technology, or a similar major, or a certified development program
- 2-3 years of experience managing AWS application environments and deployments
- 5+ years of experience working in a development organization
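To ground the infrastructure-as-code requirement above, here is a minimal sketch that creates a CloudFormation stack from an inline template using boto3; the stack name, template, and bucket resource are assumptions for illustration (a Terraform workflow would be the HCL equivalent).

```python
import boto3

# A deliberately tiny template; a real stack definition would come from version control.
TEMPLATE = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  ArtifactBucket:
    Type: AWS::S3::Bucket
    Properties:
      VersioningConfiguration:
        Status: Enabled
"""

cloudformation = boto3.client("cloudformation")


def deploy_stack(stack_name: str = "demo-artifact-bucket") -> str:
    """Create the stack and wait until CloudFormation reports completion."""
    cloudformation.create_stack(StackName=stack_name, TemplateBody=TEMPLATE)
    waiter = cloudformation.get_waiter("stack_create_complete")
    waiter.wait(StackName=stack_name)
    return stack_name


if __name__ == "__main__":
    print(f"Deployed {deploy_stack()}")
```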

Posted 1 month ago

Apply

6 - 11 years

16 - 19 Lacs

Bengaluru

Hybrid

Source: Naukri

Role: Python Developer. Experience: 6+ years. Location: Bangalore. Work Mode: Hybrid.
Detailed JD: 7+ years of experience in Python development; experience with AWS services (mainly Lambda and DynamoDB); experience in API development; experience with SQL and OOP concepts.
Mandatory Skills: Python development, AWS services (mainly Lambda, DynamoDB), API development, OOP.
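For the Lambda/DynamoDB/API combination listed above, here is a minimal sketch of an API Gateway-backed Lambda that reads an item from DynamoDB; the table name, key schema, and route are assumptions.

```python
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("orders")  # hypothetical table with partition key "order_id"


def lambda_handler(event, context):
    """GET /orders/{order_id} style lookup via API Gateway proxy integration."""
    order_id = (event.get("pathParameters") or {}).get("order_id")
    if not order_id:
        return {"statusCode": 400, "body": json.dumps({"error": "order_id required"})}

    result = table.get_item(Key={"order_id": order_id})
    item = result.get("Item")
    if item is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(item, default=str)}
```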

Posted 1 month ago

Apply

2 - 3 years

5 - 6 Lacs

Hyderabad

Work from Office

Source: Naukri

Job Overview: We are seeking an enthusiastic and highly motivated Associate Software Engineer to join our team, working on a modern data platform at massive scale. This position will focus on database management, SQL query optimization, and maintenance tasks using AWS technologies, particularly AWS Aurora RDS, and on contributing to the overall health and performance of our cloud-based data infrastructure. You will be an integral part of our growing team, helping to ensure high availability, scalability, and performance of our databases while working on cutting-edge technologies in a fast-paced, dynamic environment.
Key Responsibilities:
- Database Management & Maintenance: manage, monitor, and optimize AWS Aurora RDS databases. Perform routine maintenance tasks such as backups, patching, and upgrades. Ensure high availability, fault tolerance, and performance of databases in production environments.
- SQL Development & Optimization: write, optimize, and troubleshoot SQL queries for performance and efficiency. Work on database schema design, indexing strategies, and data migration. Perform query tuning and optimization to enhance database performance.
- Database Administration (DBA) Activities: assist in database provisioning, configuration, and monitoring on AWS Aurora RDS. Handle user access management, security, and compliance tasks. Assist in database health monitoring, alerting, and disaster recovery planning.
- AWS Cloud Technologies: leverage AWS services, including Aurora, RDS, S3, Lambda, and others, to support a robust cloud infrastructure. Participate in cloud-based data infrastructure management and scaling. Assist in implementing cost optimization strategies for database operations on AWS.
- Collaboration & Continuous Improvement: work closely with cross-functional teams, including software engineers, data engineers, and operations, to ensure efficient database usage and high-quality code. Contribute to database-related best practices and automation initiatives. Participate in on-call rotations for database support, as needed.
- Modern Data Platform Support: work with large-scale, distributed data systems and support the continuous evolution of our data platform. Support data integration, ETL pipelines, and data processing workflows.
Qualifications:
- Education: Bachelor's degree in Computer Science, Engineering, Information Technology, or a related field (or equivalent experience).
- Technical Skills: proficiency in SQL, with a strong understanding of database design, normalization, and optimization. Experience with AWS cloud services, particularly AWS Aurora RDS. Basic knowledge of database administration tasks (e.g., backups, replication, patching, performance tuning). Familiarity with AWS cloud infrastructure and services such as EC2, S3, Lambda, CloudWatch, IAM, and VPC. Experience with modern data platforms, distributed systems, and high-volume databases is a plus.
- Experience: 2-3 years of experience working with relational databases (preferably AWS Aurora RDS, MySQL, or PostgreSQL). Familiarity with database monitoring tools and techniques. Hands-on experience with version control tools (e.g., Git) and CI/CD pipelines is a plus.
- Problem Solving & Analytical Skills: strong troubleshooting skills, especially related to databases and performance bottlenecks. Ability to analyze complex issues and come up with solutions in a timely manner.
- Soft Skills: strong written and verbal communication skills. Ability to work independently and as part of a team. Strong attention to detail and a commitment to high-quality work.
Nice to Have: experience with NoSQL databases (e.g., Elasticsearch, MongoDB) or big data technologies (e.g., Apache Kafka, Hadoop, Spark). Familiarity with infrastructure-as-code (IaC) tools like Terraform or CloudFormation. Experience with containerization and Kubernetes for deploying database services.
Why Join Us?
- Innovative Projects: you'll work on cutting-edge technology that powers large-scale data platforms and cloud-based services.
- Career Growth: we provide ample opportunities for skill development, mentorship, and career advancement.
- Collaborative Environment: join a supportive team that fosters knowledge sharing, learning, and growth.
- Impact: your work will directly contribute to the scalability and performance of our data infrastructure.
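As a small illustration of the query-tuning work described above, here is a sketch that runs EXPLAIN against a MySQL-compatible Aurora cluster; it assumes the PyMySQL driver, and the host, schema, and query are placeholders.

```python
import pymysql  # assumes a MySQL-compatible Aurora cluster and the PyMySQL driver

# Connection details are placeholders.
conn = pymysql.connect(
    host="my-aurora-cluster.cluster-xxxx.us-east-1.rds.amazonaws.com",
    user="app_user",
    password="********",
    database="appdb",
)

QUERY = "SELECT id, status FROM orders WHERE customer_id = %s ORDER BY created_at DESC LIMIT 10"

try:
    with conn.cursor() as cur:
        # EXPLAIN shows whether the query uses an index on (customer_id, created_at)
        # or falls back to a full table scan.
        cur.execute("EXPLAIN " + QUERY, (42,))
        for row in cur.fetchall():
            print(row)
finally:
    conn.close()
```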

Posted 1 month ago

Apply

7 - 12 years

15 - 30 Lacs

Chennai, Bengaluru

Hybrid

Source: Naukri

Roles and Responsibilities:
- Study the existing technology landscape and understand current application workloads.
- Understand and document technical requirements from clients.
- Write high-quality Terraform code to set up the environment.
- Define the migration strategy to move applications to the cloud.
- Develop architecture blueprints and detailed documentation.
- Create the bill of materials, including required cloud services (such as EC2, S3, etc.) and tools.
- Design the overall Virtual Private Cloud (VPC) environment, including server instances, storage instances, subnets, availability zones, etc.
- Design the AWS network architecture, including VPN connectivity between regions and subnets.
- Design the HA/DR strategies.
- Set up processes, services, and tools around the cloud.
- Build the environment and execute the migration plan, leveraging appropriate AWS services.
- Maintain and manage microservice deployments on Kubernetes using Helm charts.
- Validate that the environment meets all security and compliance controls.
Preferred candidate profile

Posted 1 month ago

Apply

7 - 10 years

20 - 35 Lacs

Pune

Hybrid

Source: Naukri

8+ years of experience in software development with a focus on AWS solutions and architecture. Experience architecting applications using EKS. AWS certifications: AWS Certified Solutions Architect. Design, develop, and implement microservices-based AWS solutions using Java.

Posted 1 month ago

Apply

6 - 11 years

10 - 20 Lacs

Mumbai, Hyderabad, Bengaluru

Hybrid

Source: Naukri

Hi, we are hiring a Java Developer for one of the leading MNCs for the Hyderabad, Bangalore, and Mumbai locations.
Experience: 6-12 years. Location: Bangalore, Hyderabad, Chennai. CTC: as per company norms.
Please find the job description below:
Mandatory Skills: Java, Spring Boot, Microservices, AWS. Kubernetes is good to have.
Description: Expertise in development using Core Java, J2EE, Spring Boot, Microservices, and Web Services. SOA experience with SOAP as well as RESTful services using JSON formats, and messaging with Kafka. Working proficiency in enterprise development toolsets like Jenkins, Git/Bitbucket, Sonar, Black Duck, Splunk, Apigee, etc. Experience in AWS cloud monitoring tools like Datadog, CloudWatch, and Lambda is needed. Experience with XACML authorization policies. Experience in NoSQL and SQL databases such as Cassandra, Aurora, and Oracle. Good understanding of React JS, the Photon framework, design, and Kubernetes. Working with Git/Bitbucket, Maven, Gradle, and Jenkins to build and deploy code to production environments.
Primary Location: IN-KA-Bangalore. Schedule: Full Time. Shift: Experienced. Employee Status: Individual Contributor. Job Type: Full-time.
Kindly fill in the details below to proceed with your profile: Total Experience; Relevant experience in Java; Experience in Multithreading; Experience in Microservices; Experience in Spring Boot; Experience in Kafka; Experience in AWS; Experience in Kubernetes; Current Designation; Current Organization; Current Location; Current CTC + Variable; Any Offer in Hand; Expected CTC + Variable; Notice Period / LWD; Reason for Relocation to Bangalore.
If interested, kindly share your resume at nupur.tyagi@mounttalent.com

Posted 1 month ago

Apply

7 - 9 years

14 - 24 Lacs

Chennai

Work from Office

Source: Naukri

Experience Range: 4-8 years in Data Quality Engineering
Job Summary: As a Senior Data Quality Engineer, you will play a key role in ensuring the reliability and accuracy of our data platform and projects. Your primary responsibility will be developing and leading the product testing strategy while leveraging your technical expertise in AWS and big data technologies. You will also guide the team in implementing shift-left testing using Behavior-Driven Development (BDD) methodologies integrated with AWS CodeBuild CI/CD. Your contributions will ensure the successful execution of testing across multiple data platforms and projects.
Key Responsibilities:
- Develop the product testing strategy: collaborate with stakeholders to define and implement the product testing strategy. Identify key platform and project responsibilities, ensuring a comprehensive and effective testing approach.
- Lead testing strategy implementation: take charge of implementing the testing strategy across data platforms and projects, ensuring thorough coverage and timely completion of tasks.
- BDD and AWS integration: utilize Behavior-Driven Development (BDD) methodologies to drive shift-left testing, and integrate AWS services such as AWS Glue, Lambda, Airflow jobs, Athena, QuickSight, Amazon Redshift, DynamoDB, Parquet, and Spark to improve test effectiveness.
- Test execution and reporting: design, execute, and document test cases while providing comprehensive reporting on testing results. Collaborate with the team to identify the appropriate data for testing and manage test environments.
- Collaboration with developers: work closely with application developers and technical support to analyze and resolve identified issues in a timely manner.
- Automation solutions: create and maintain automated test cases, enhancing the test automation process to improve testing efficiency.
Must-Have Skills:
- Big data platform expertise: at least 2 years of experience as a technical test lead working on a big data platform, preferably with direct experience in AWS.
- Strong programming skills: proficiency in object-oriented programming, particularly with Python, and the ability to use programming skills to enhance test automation and tooling.
- BDD and AWS integration: experience with Behavior-Driven Development (BDD) practices and AWS technologies, including AWS Glue, Lambda, Airflow, Athena, QuickSight, Amazon Redshift, DynamoDB, Parquet, and Spark.
- Testing frameworks and tools: familiarity with testing frameworks such as PyTest and PyTest-BDD, and CI/CD tools like AWS CodeBuild and Harness.
- Communication skills: exceptional communication skills with the ability to convey complex technical concepts to both technical and non-technical stakeholders.
Good-to-Have Skills:
- Automation engineering: expertise in creating automated testing solutions to improve testing efficiency.
- Test management experience: knowledge of test management processes, including test case design, execution, and defect tracking.
- Agile methodologies: experience working in Agile environments, with familiarity using Agile tools such as Jira to track stories, bugs, and progress.
Minimum Requirements: Bachelor's degree in Computer Science or a related field, or HS/GED with 8 years of experience in Data Quality Engineering. At least 4 years of experience in big data platforms and test engineering, with a strong focus on AWS and Python.
Skills: Test Automation, Python, Data Engineering
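To illustrate the PyTest-BDD shift-left approach named above, here is a minimal sketch of a BDD-style data quality check; the feature file, step wording, and the in-memory "load" step are assumptions standing in for a real pipeline run.

```python
# tests/test_row_counts.py -- assumes pytest-bdd and a features/row_counts.feature file
# containing, for example:
#
#   Feature: Row count reconciliation
#     Scenario: Target table matches source extract
#       Given a source extract with 100 rows
#       When the pipeline loads the target table
#       Then the target table contains 100 rows

from pytest_bdd import given, parsers, scenario, then, when


@scenario("features/row_counts.feature", "Target table matches source extract")
def test_row_count_reconciliation():
    pass


@given(parsers.parse("a source extract with {count:d} rows"), target_fixture="source_rows")
def source_rows(count):
    return [{"id": i} for i in range(count)]


@when("the pipeline loads the target table", target_fixture="target_rows")
def load_target(source_rows):
    # Stand-in for the real load step (e.g., a Glue job followed by an Athena query).
    return list(source_rows)


@then(parsers.parse("the target table contains {count:d} rows"))
def check_count(target_rows, count):
    assert len(target_rows) == count
```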

Posted 1 month ago

Apply

7 - 9 years

14 - 24 Lacs

Chennai

Work from Office

Source: Naukri

Job Summary: As a Senior Data Quality Engineer, you will play a crucial role in ensuring the reliability and accuracy of our data platform and projects. Your primary responsibilities will involve developing and leading the product testing strategy, leveraging your technical expertise in AWS and big data technologies. You will also work closely with the team to implement shift-left testing using Behavior-Driven Development (BDD) methodologies integrated with AWS CodeBuild CI/CD.
Key Responsibilities:
- Develop the product testing strategy: collaborate with stakeholders to define and design the product testing strategy, identifying key platform and project responsibilities.
- Lead testing strategy implementation: take charge of implementing the testing strategy, ensuring its successful execution across the data platform and projects. Oversee and coordinate testing tasks to ensure thorough coverage and timely completion.
- BDD and AWS integration: guide the team in utilizing Behavior-Driven Development (BDD) practices for shift-left testing. Leverage AWS services (e.g., AWS Glue, Lambda, Airflow, Athena, QuickSight, Redshift, DynamoDB, Parquet, Spark) to enhance testing effectiveness.
- Test case management: work with the team to identify and prepare data for testing, create and maintain automated test cases, execute test cases, and document results.
- Problem resolution: assist developers and technical support staff in resolving identified issues in a timely manner.
- Automation engineering solutions: create test automation solutions that improve the efficiency and coverage of testing efforts.
Must-Have Skills:
- Big data platform expertise: at least 2 years of experience as a technical test lead working on a big data platform, preferably with direct experience in AWS.
- AI/ML familiarity: experience with AI/ML concepts and practical experience working on AI/ML-driven initiatives.
- Synthetic test data creation: knowledge of synthetic data tooling, test data generation, and best practices.
- Offshore team leadership: proven ability to lead and collaborate with offshore teams, managing projects with limited real data access.
- Programming expertise: strong proficiency in object-oriented programming, particularly with Python.
- Testing tools/frameworks: familiarity with tools like PyTest, PyTest-BDD, AWS CodeBuild, and Harness.
- Excellent communication: ability to communicate effectively with both technical and non-technical stakeholders, explaining complex technical concepts in simple terms.
Good-to-Have Skills:
- Experience with AWS services: familiarity with AWS DL/DW components like AWS Glue, Lambda, Airflow jobs, Athena, QuickSight, Amazon Redshift, DynamoDB, Parquet, and Spark.
- Test automation experience: practical experience implementing test automation frameworks for complex data platforms and systems.
- Shift-left testing knowledge: experience implementing shift-left testing strategies, particularly using Behavior-Driven Development (BDD) methodologies.
- Project management: ability to manage multiple testing projects simultaneously while ensuring the accuracy and quality of deliverables.
Minimum Requirements: Bachelor's degree in Computer Science and 4 years of relevant experience, or High School/GED with 8 years of relevant experience. Relevant experience includes big data platform testing, test strategy leadership, automation, and working with AWS services and AI/ML concepts.
Skills: Test Automation, Python, Data Engineering

Posted 1 month ago

Apply

12 - 16 years

40 - 45 Lacs

Ahmedabad

Work from Office

Source: Naukri

We are seeking an experienced AWS Architect to join our dynamic team at Tech Mahindra. The AWS Architect will be responsible for designing, implementing, and managing cloud solutions on the AWS platform. The ideal candidate will have a strong background in AWS services, cloud architecture, and enterprise-level implementations.
Key Responsibilities:
- Design and implement scalable, highly available, and fault-tolerant systems on AWS.
- Develop and manage cloud architecture and strategy, including cost management and optimization.
- Collaborate with clients and internal teams to understand business requirements and translate them into technical solutions.
- Provide architectural guidance and best practices for cloud deployment and operations.
- Lead the development of cloud solutions, including design, testing, and deployment.
- Monitor and manage cloud infrastructure to ensure optimal performance, security, and compliance.
- Troubleshoot and resolve issues related to cloud architecture and deployment.
- Stay updated with the latest AWS services and industry trends to ensure that Tech Mahindra's solutions remain cutting-edge and competitive.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field; advanced degrees or certifications are a plus.
- AWS Certified Solutions Architect (Associate or Professional) or equivalent certification.
- Proven experience designing and implementing AWS cloud solutions.
- Strong knowledge of AWS services including EC2, S3, RDS, Lambda, VPC, CloudFormation, and IAM.
- Experience with cloud security best practices and compliance requirements.
- Proficiency in scripting languages such as Python, Bash, or PowerShell.
- Familiarity with DevOps practices and tools like Terraform, Jenkins, and Docker.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Strong communication skills with the ability to interact effectively with clients and stakeholders.
Preferred Skills:
- Experience with multi-cloud environments (e.g., AWS, Azure, Google Cloud).
- Knowledge of container orchestration platforms such as Kubernetes.
- Familiarity with Agile methodologies and project management tools.
- Experience migrating on-premises applications to the cloud.

Posted 1 month ago

Apply

12 - 16 years

35 - 40 Lacs

Bengaluru

Work from Office

Source: Naukri

As an AWS Data Engineer at the organization, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytical resources, enabling efficient data processing and analytics.
Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data file formats such as Iceberg, Delta, or Hudi.
- Experience provisioning AWS data analytical resources with Terraform is desirable.
Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.
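To make the "automate ETL workflows with AWS Step Functions and Lambda" responsibility above concrete, here is a minimal sketch that registers a two-step state machine with boto3; the Glue job name, Lambda function name, and IAM role ARN are placeholders.

```python
import json
import boto3

# State machine definition (Amazon States Language); job and function names are placeholders.
DEFINITION = {
    "StartAt": "RunGlueEtl",
    "States": {
        "RunGlueEtl": {
            "Type": "Task",
            "Resource": "arn:aws:states:::glue:startJobRun.sync",
            "Parameters": {"JobName": "daily-orders-etl"},
            "Next": "PublishSummary",
        },
        "PublishSummary": {
            "Type": "Task",
            "Resource": "arn:aws:states:::lambda:invoke",
            "Parameters": {"FunctionName": "etl-summary-notifier"},
            "End": True,
        },
    },
}

sfn = boto3.client("stepfunctions")
response = sfn.create_state_machine(
    name="daily-orders-etl-workflow",
    definition=json.dumps(DEFINITION),
    roleArn="arn:aws:iam::123456789012:role/etl-workflow-role",  # placeholder role
)
print(response["stateMachineArn"])
```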

Posted 1 month ago

Apply

12 - 16 years

35 - 40 Lacs

Chennai

Work from Office

Source: Naukri

As an AWS Data Engineer at the organization, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytical resources, enabling efficient data processing and analytics.
Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data file formats such as Iceberg, Delta, or Hudi.
- Experience provisioning AWS data analytical resources with Terraform is desirable.
Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.

Posted 1 month ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies