
316 AWS Lambda Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the source job portal.

5.0 - 10.0 years

22 - 37 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Naukri logo

We are looking for an "AWS Data Engineer (with GCP, BigQuery)" with a minimum of 5 years of experience. Contact: Yashra (95001 81847). Required candidate profile: Athena, Step Functions, Spark/PySpark, ETL fundamentals, SQL (basic and advanced), Glue, Python, Lambda, data warehousing, EBS/EFS, AWS EC2, Lake Formation, Aurora, S3, and modern data platform fundamentals.

Posted 18 hours ago

Apply

5.0 - 15.0 years

0 - 28 Lacs

Bengaluru

Work from Office


Key Skills: Python, PySpark, AWS Glue, Redshift, and Spark Streaming.

Job Description: 6+ years of experience in data engineering, specifically in cloud environments like AWS. Proficiency in PySpark for distributed data processing and transformation. Solid experience with AWS Glue for ETL jobs and managing data workflows. Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration. Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2. Candidates should be screened for broad knowledge of Python, PySpark, Glue jobs, Lambda, Step Functions, and SQL.

Client expectations (data engineering requirement):
1. Process these events and save the data in the Trusted and Refined bucket schemas.
2. Bring six tables of historical data into the Raw bucket; populate the historical data in the Trusted and Refined bucket schemas.
3. Publish Raw, Trusted, and Refined bucket data from #2 and #3 to the corresponding buckets in the CCB data lake, and develop an analytics pipeline to publish data to Snowflake.
4. Integrate TDQ/BDQ in the Glue pipeline.
5. Develop observability dashboards for these jobs.
6. Implement reliability measures wherever needed to prevent data loss.
7. Configure data archival policies and periodic cleanup.
8. Perform end-to-end testing of the implementation.
9. Implement all of the above in production.
10. Reconcile data across the SORs, the Auth data lake, and the CCB data lake.
11. Success criteria: all 50 Kafka events are ingested into the CCB data lake and the existing 16 Tableau dashboards are populated using this data.
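As a rough sketch of the raw/trusted/refined zone flow described in steps 1-3, the routing can be expressed as pure logic; the bucket zones, key layout, and required fields below are illustrative assumptions, not the client's actual schema, and in a real Glue or Lambda job the returned writes would be applied with boto3 `put_object` calls:

```python
import json

def route_event(event: dict) -> dict:
    """Decide which zone keys a Kafka event should be written to.

    The raw zone always receives the unmodified payload; the trusted zone
    receives it only when required fields are present; the refined zone
    keeps a simple projection of analytics-relevant fields.
    """
    raw_key = f"raw/{event['topic']}/{event['offset']}.json"
    writes = {"raw": (raw_key, json.dumps(event))}

    payload = event.get("value", {})
    required = {"account_id", "event_type", "timestamp"}  # hypothetical schema
    if required.issubset(payload):
        offset = event["offset"]
        writes["trusted"] = (
            f"trusted/{payload['event_type']}/{offset}.json",
            json.dumps(payload),
        )
        refined = {k: payload[k] for k in ("account_id", "event_type", "timestamp")}
        writes["refined"] = (
            f"refined/{payload['event_type']}/{offset}.json",
            json.dumps(refined),
        )
    return writes
```

Keeping the routing separate from the S3 I/O makes the zone logic unit-testable without AWS credentials.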

Posted 18 hours ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad

Hybrid


Job Title: Lead Data Engineer

Job Summary: The Lead Data Engineer will provide technical expertise in the analysis, design, development, rollout and maintenance of data integration initiatives. This role contributes to implementation methodologies and best practices, and works on project teams to analyse, design, develop and deploy business intelligence / data integration solutions supporting a variety of customer needs. The position oversees a team of Data Integration Consultants at various levels, ensuring their success on projects, goals, trainings and initiatives through mentoring and coaching. The role provides technical expertise in needs identification, data modelling, data movement and transformation mapping (source to target), automation and testing strategies, translating business needs into technical solutions that adhere to established data guidelines from a business-unit or project perspective while leveraging best-fit technologies (e.g. cloud, Hadoop, NoSQL) to address business and environmental challenges. Works with stakeholders to identify and define self-service analytic solutions, dashboards, actionable enterprise business intelligence reports and BI best practices. Responsible for repeatable, lean and maintainable enterprise BI design across organisations, and partners effectively with the client team. We expect leadership not only in the conventional sense but also within the team: candidates should demonstrate innovation, critical thinking, optimism, communication, time management, collaboration, problem-solving, acting independently, knowledge sharing and approachability.

Responsibilities:
- Design, develop, test and deploy data integration processes (batch or real-time) using tools such as Microsoft SSIS, Azure Data Factory, Databricks, Matillion, Airflow and Sqoop.
- Create functional and technical documentation, e.g. ETL architecture documentation, unit testing plans and results, data integration specifications and data testing plans.
- Take a consultative approach with business users, asking questions to understand the business need and deriving the data flow and the conceptual, logical and physical data models from it.
- Perform data analysis to validate data models and confirm they meet business needs.
- May serve as project or DI lead, overseeing multiple consultants from various competencies.
- Stay current with emerging and changing technologies to recommend and implement beneficial technologies and approaches for data integration.
- Ensure proper execution/creation of methodology, training, templates, resource plans and engagement review processes.
- Coach team members to ensure understanding on projects and tasks, providing effective feedback (critical and positive) and promoting growth opportunities where appropriate.
- Coordinate and consult with the project manager, client business and technical staff, and project developers on data architecture best practices and anything else data-related at the project or business-unit level.
- Architect, design, develop and set direction for enterprise self-service analytic solutions, business intelligence reports, visualisations and best-practice standards. Toolsets include, but are not limited to, SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau and Qlik.
- Work with the report team to identify, design and implement a reporting user experience that is consistent and intuitive across environments and report methods, defines security, and meets usability and scalability best practices.

Required Qualifications:
- 10 years of industry implementation experience with data integration tools such as the AWS services Redshift, Athena, Lambda, Glue and S3.
- 5-8 years of management experience required; 5-8 years of consulting experience preferred.
- Minimum of 5 years of data architecture, data modelling or similar experience.
- Bachelor's degree or equivalent experience; Master's degree preferred.
- Strong data warehousing, OLTP systems, data integration and SDLC background.
- Strong orchestration experience, including cloud-native or third-party ETL data-load orchestration (e.g. Data Factory, HDInsight, Data Pipeline, Cloud Composer or similar).
- Understanding of and experience with the major data architecture philosophies (Dimensional, ODS, Data Vault, etc.).
- Understanding of modern data warehouse capabilities and technologies such as real-time, cloud and Big Data.
- Understanding of on-premises and cloud infrastructure architectures (e.g. Azure, AWS, GCP).
- Strong experience with Agile processes (Scrum cadences, roles, deliverables) and working experience in Azure DevOps, JIRA or similar, with CI/CD experience on one or more code management platforms.
- Strong Databricks experience, including creating notebooks in PySpark.
- Experience with major data modelling tools (e.g. ERwin, ER/Studio, PowerDesigner).
- Experience with major database platforms (e.g. SQL Server, Oracle, Azure Data Lake, Hadoop, Azure Synapse/SQL Data Warehouse, Snowflake, Redshift).
- 3-5 years of development experience in decision support / business intelligence environments using tools such as SQL Server Analysis and Reporting Services, Microsoft Power BI, Tableau and Looker.

Preferred Skills & Experience:
- Knowledge of and working experience with data integration processes such as data warehousing and EAI.
- Experience providing estimates for data integration projects, including testing, documentation and implementation.
- Ability to analyse business requirements as they relate to data movement and transformation processes, and to research, evaluate and recommend alternative solutions.
- Ability to provide technical direction to other team members, including contractors and employees.
- Ability to contribute to conceptual data modelling sessions that accurately define business processes independently of data structures, and then combine the two.
- Proven experience leading team members, directly or indirectly, to complete high-quality major deliverables with superior results.
- Demonstrated ability to serve as a trusted advisor who builds influence with client management beyond simply EDM.
- Can create documentation and presentations that "stand on their own".
- Can advise sales on the evaluation of data integration efforts for new or existing client work, and can contribute to internal/external data integration proofs of concept.
- Demonstrates the ability to create new and innovative solutions to problems not previously encountered.
- Ability to work independently as well as collaborate effectively across teams.
- Must excel in a fast-paced, agile environment where critical thinking and strong problem-solving skills are required for success.
- Strong team-building, interpersonal, analytical, and problem identification and resolution skills; experience working with multi-level business communities.
- Can effectively utilise SQL and/or the available BI tool to validate and elaborate business rules.
- Demonstrates an understanding of EDM architectures and applies this knowledge in collaborating with the team to design effective solutions to business problems.
- Effectively influences, and at times oversees, business and data analysis activities to ensure sufficient understanding and quality of data.
- Demonstrates a complete understanding of, and utilises, DSC methodology documents to efficiently complete assigned roles and tasks.
- Deals effectively with all team members and builds strong working relationships and rapport with them.
- Understands and leverages a multi-layer semantic model to ensure scalability, durability and supportability of the analytic solution.
- Understands modern data warehouse concepts (real-time, cloud, Big Data) and how to enable such capabilities from a reporting and analytics standpoint.

Posted 21 hours ago

Apply

4.0 - 9.0 years

10 - 13 Lacs

Bengaluru

Remote


Join our team to build scalable full-stack solutions using .NET Core, React, and AWS. You'll own backend APIs, integrate React frontend apps, and manage cloud infrastructure. Required candidate profile: 4+ years in full-stack development with .NET, React, and AWS. Must have built secure APIs, deployed services on AWS, and collaborated across teams. Strong code quality and system design skills required.

Posted 1 day ago

Apply

5.0 - 10.0 years

18 - 30 Lacs

Bengaluru

Work from Office


JD for Full Stack Developer

About V Group: V Group Inc. is an IT solutions company based in New Jersey, with multiple offshore sites in India (Pune and Bhopal). With offerings ranging from IT infrastructure to product development, V Group Inc. provides compliant services in numerous industry sectors while maintaining structure, stability, and core values. Ranked by Inc. 5000 in 2020 as one of the fastest-growing IT companies in the USA. Current business groups include e-commerce, digital, professional services, IT projects, and products. Join our team of innovative, technical and business-savvy people with a passion for creating solutions! Visit us at: https://www.vgroupinc.com (corporate website) || https://www.webstorevgroup.net (e-commerce store) || https://www.vgroupdigital.com/portfolio (digital products).

Job Details:
Position Title: Full Stack Developer
Job Location: Bengaluru
Experience: 7+ years

Job responsibilities:
- 7+ years of hands-on experience taking requirements and designing, architecting, and implementing scalable and robust solutions; proven success in building frameworks and products.
- 4+ years of experience working with modern languages like TypeScript, Java, Node.js, and Python.
- 4+ years of experience with agile methodologies and best practices.
- Excellent communication and presentation skills.
- 3+ years of experience with cloud technologies (AWS is a plus).
- 3+ years of experience with CI/CD, DevOps practices, and tools.
- Familiarity with serverless technologies (such as AWS Lambda, API Gateway) is advantageous.

Perks & Benefits: health and accident insurance; paid leave and sick leave; education sponsorship / certification reimbursement; free training platforms; internet reimbursement; gym membership; EPF and gratuity; US-based clients; onsite opportunities in the US.

Posted 2 days ago

Apply

12.0 - 15.0 years

16 - 18 Lacs

Bengaluru

Hybrid


iSource Services is hiring for one of its clients for an AWS role requiring AWS experience specifically (not Azure or GCP), 12-15 years of experience, and hands-on expertise in design and implementation. You will design and develop data solutions, and design and implement efficient data processing pipelines using AWS services such as AWS Glue, AWS Lambda, Amazon S3, and Amazon Redshift. Candidates should possess exceptional communication skills to engage effectively with US clients. The ideal candidate must be hands-on, with significant practical experience. Availability to work overlapping US hours is essential. The contract duration is 6 months.

Posted 3 days ago

Apply

6.0 - 9.0 years

25 - 30 Lacs

Gurugram

Work from Office


Experience: 6 to 9 years. Notice period: immediate to 15 days. Location: Gurugram only. Work from office 4 days a week. Working shift: 1 PM to 10 PM. Band: 4A.

Posted 3 days ago

Apply

5.0 - 8.0 years

12 - 15 Lacs

Bengaluru

Hybrid


Backend skills required: Python, JavaScript/TypeScript, OpenSearch, ArangoDB, AWS Lambda, AWS AppSync, GraphQL, DynamoDB, AWS CloudFormation, AWS Command Line Interface (CLI), REST APIs, Docker, AWS CodePipeline, AWS CloudWatch.

Posted 3 days ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Bengaluru

Work from Office


Role Overview: We are seeking a skilled Backend Developer with expertise in TypeScript and AWS to design and implement scalable, event-driven microservices. The ideal candidate will have a strong background in serverless architectures and backend development.

Key Responsibilities:
- Backend development: develop and maintain server-side applications using TypeScript and Node.js.
- API design: create and manage RESTful APIs adhering to OpenAPI specifications.
- Serverless architecture: implement serverless solutions using AWS Lambda, API Gateway, and DynamoDB.
- Event-driven systems: design and build event-driven architectures using AWS SQS and SNS.
- Microservices: develop microservices that are scalable and maintainable.
- Collaboration: work closely with frontend developers and other stakeholders to integrate APIs and ensure seamless functionality.
- Code quality: write clean, maintainable code and conduct code reviews.
- Continuous improvement: stay up to date with the latest industry trends and technologies to continuously improve backend systems.

Required Skills & Qualifications:
- Experience: 7-10 years in backend development with a focus on TypeScript and Node.js.
- AWS expertise: proficiency in AWS services such as Lambda, API Gateway, DynamoDB, SQS, and SNS.
- API development: experience designing and implementing RESTful APIs.
- Event-driven architecture: familiarity with building event-driven systems using AWS services.
- Microservices: experience developing microservices architectures.
- Version control: proficiency with Git.
- CI/CD: experience with continuous integration and continuous deployment pipelines.
- Collaboration: strong communication skills and the ability to work in a team environment.

Preferred Skills:
- Infrastructure as code: experience with tools like Terraform or AWS CloudFormation.
- Containerization: familiarity with Docker and container orchestration tools.
- Monitoring & logging: experience with monitoring and logging tools to ensure system reliability.
- Agile methodologies: experience working in Agile development environments.
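As a rough illustration of the Lambda-plus-SQS pattern this role describes (shown in Python for brevity; the structure is the same in TypeScript/Node.js), a batch handler can report per-message failures so AWS retries only the failed records. The `orderId` field and the `process` body are hypothetical placeholders:

```python
import json

def handler(event, context):
    """Process an SQS batch, returning per-message failures in the
    ReportBatchItemFailures response shape so only failed records retry."""
    failures = []
    for record in event.get("Records", []):
        try:
            message = json.loads(record["body"])
            process(message)
        except Exception:
            # Record the messageId; AWS will redeliver only these messages.
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}

def process(message: dict) -> None:
    """Placeholder business logic; rejects malformed input."""
    if "orderId" not in message:
        raise ValueError("missing orderId")
```

Note that partial-batch reporting only takes effect when the event source mapping is configured with `ReportBatchItemFailures`; otherwise a raised exception fails the whole batch.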

Posted 3 days ago

Apply

6.0 - 10.0 years

5 - 5 Lacs

Thiruvananthapuram

Work from Office


Job Title: Senior Java Backend Developer
Location: Trivandrum
Experience: 6+ years

Job Summary: We are seeking a Senior Java Backend Developer to join our team in Trivandrum. The ideal candidate will bring strong backend development expertise, particularly in Java and Kotlin, to build scalable, high-performance backend systems for our client's e-commerce platform. You will work in a dynamic Agile environment, collaborating with cross-functional teams to deliver secure, efficient, and reliable solutions.

Key Responsibilities:
- Design, develop, and maintain backend services using Java, Kotlin, and Spring Boot
- Build and optimize event-driven systems using Kafka, MQ, and AWS Lambda
- Manage and maintain databases including PostgreSQL, Oracle, and DynamoDB
- Integrate systems and services using ATG 11.3, IBM App Connect, and IBM Integration Bus (IIB)
- Develop and maintain CI/CD pipelines using Jenkins and GitLab CI/CD
- Ensure backend services meet standards for performance, scalability, and reliability
- Participate in code reviews and automated testing, and create relevant technical documentation

Mandatory Skills:
- Java (8+): advanced development experience
- Kotlin: minimum 4 years of hands-on experience
- Spring Boot: proficiency with microservices architecture
- Kafka and MQ: strong knowledge of messaging systems
- PostgreSQL and DynamoDB: in-depth experience with relational and NoSQL databases
- AWS Lambda: understanding of serverless architecture and implementation
- CI/CD tools: experience with Jenkins and GitLab CI/CD
- ATG 11.3: hands-on experience is essential
- E-commerce platforms: proven background working with e-commerce systems
- Excellent communication skills in English (written and verbal)

Good to Have:
- Experience with Oracle database
- Familiarity with IBM App Connect and IBM Integration Bus (IIB)
- Understanding of Agile/Scrum methodology
- Experience working in distributed or multinational teams
- Strong analytical and problem-solving abilities

Required Skills: Java, ATG, Microservices

Posted 3 days ago

Apply

20.0 - 24.0 years

30 - 45 Lacs

Bengaluru

Work from Office


Responsibilities:
- Lead and manage the cloud engineering team to design, develop, and deploy scalable cloud solutions.
- Collaborate with various teams to understand business needs and translate them into technical requirements.
- Develop and implement cloud strategies that align with business goals and drive revenue growth.
- Provide technical leadership and mentorship to engineering teams, ensuring best practices in cloud architecture and deployment.
- Engage with clients to present cloud solutions, address technical concerns, and demonstrate the value of our offerings.
- Stay updated with the latest cloud technologies and trends to ensure our solutions remain competitive.
- Work closely with product management to define and prioritize features and enhancements.
- Oversee the integration of cloud solutions with existing systems and infrastructure.
- Ensure compliance with security standards and best practices in cloud deployments.
- Proven experience in pre-sales, solutioning, estimation, and stakeholder engagement.
- Excellent communication and presentation skills, with the ability to convey complex technical concepts to non-technical stakeholders.
- Experience with DevOps practices and tools, including CI/CD pipelines, automation, and infrastructure as code.
- Strong problem-solving skills and the ability to think strategically about technology and business needs.
- Ability to work in a fast-paced, dynamic environment and manage multiple priorities.
- Demonstrated success in a leadership role, managing and coaching engineering teams.

Qualifications:
- Bachelor's or Master's degree in computer science, engineering, or a related field.
- Proven experience in cloud engineering and architecture, with a strong understanding of AWS cloud platforms.
- AWS Certified Solutions Architect.

Technical Skills Required:
- Proficiency in AWS cloud platforms and their services.
- Experience with infrastructure-as-code tools (e.g., Terraform, CloudFormation).
- Knowledge of containerization technologies (e.g., Docker, Kubernetes).
- Familiarity with serverless architectures and services (e.g., AWS Lambda, Azure Functions).
- Understanding of microservices architecture and design patterns.
- Experience with monitoring and logging tools (e.g., Prometheus, Grafana, ELK stack).
- Knowledge of networking concepts and technologies (e.g., VPC, VPN, DNS, load balancers).
- Proficiency in scripting and programming languages (e.g., Python, Java, Go).
- Experience with database technologies (e.g., SQL, NoSQL, PostgreSQL, MongoDB).
- Understanding of security best practices and compliance standards (e.g., IAM, encryption, GDPR).

Posted 3 days ago

Apply

5.0 - 10.0 years

16 - 25 Lacs

Chennai

Work from Office


Key Responsibilities: Proficiency in Core Java and AWS services such as EC2, S3, RDS, Lambda, CloudFormation, etc. Experience with Spring Boot and other Java frameworks. Strong knowledge of SQL and NoSQL databases. Familiarity with containerization technologies like Docker and Kubernetes. Experience with CI/CD tools and processes. Excellent problem-solving skills and the ability to work independently and as part of a team. Strong communication skills and the ability to articulate technical concepts to non-technical stakeholders. Qualifications: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. 5-7 years of experience in Java development, with a strong understanding of object-oriented programming and design patterns.

Posted 3 days ago

Apply

8.0 - 12.0 years

35 - 45 Lacs

Bengaluru

Hybrid


Job Description

Job Overview: We are seeking a seasoned Lead Platform Engineer with a strong background in platform development and a proven track record of leading technology design and teams. The ideal candidate will have at least 8 years of overall experience, with a minimum of 6 years in relevant roles. This position entails owning module design and spearheading the implementation process alongside a team of talented platform engineers. At MontyCloud, you'll have the opportunity to work on cutting-edge technology, shaping the future of our cloud management platform with your expertise. If you're passionate about building scalable, efficient, and innovative cloud solutions, we'd love to have you on our team.

Key Responsibilities:
• Lead the design and architecture of robust, scalable platform modules, ensuring alignment with business objectives and technical standards.
• Drive the implementation of platform solutions, collaborating closely with platform engineers and cross-functional teams to achieve project milestones.
• Mentor and guide a team of platform engineers, fostering an environment of growth and continuous improvement.
• Stay abreast of emerging technologies and industry trends, incorporating them into the platform to enhance functionality and user experience.
• Ensure the reliability and security of the platform through comprehensive testing and adherence to best practices.
• Collaborate with senior leadership to set the technical strategy and goals for the platform engineering team.

Key Requirements:
• Minimum of 9 years of experience in software or platform engineering, with at least 6 years in roles directly relevant to platform development and team leadership.
• Expertise in Python programming, with a solid foundation in writing clean, efficient, and scalable code.
• Proven experience in serverless application development, designing and implementing microservices, and working within event-driven architectures.
• Demonstrated experience in building and shipping high-quality SaaS platforms/applications on AWS, showcasing a portfolio of successful deployments.
• Comprehensive understanding of cloud computing concepts, AWS architectural best practices, and familiarity with a range of AWS services, including but not limited to Lambda, RDS, DynamoDB, and API Gateway.
• Exceptional problem-solving skills, with a proven ability to optimize complex systems for efficiency and scalability.
• Excellent communication skills, with a track record of effective collaboration with team members and successful engagement with stakeholders across various levels.
• Previous experience leading technology design and engineering teams, with a focus on mentoring, guiding, and driving the team towards project milestones and technical excellence.

Good to Have:
• AWS Certified Solutions Architect, AWS Certified Developer, or other relevant cloud development certifications.
• Experience with the AWS Boto3 SDK for Python.
• Exposure to other cloud platforms such as Azure or GCP.
• Knowledge of containerization and orchestration technologies, such as Docker and Kubernetes.
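As a minimal sketch of the serverless pattern this listing describes (a Python Lambda behind API Gateway with REST proxy integration), the handler below routes on method and path and returns a proxy-style response. The `/health` route is a made-up example, not MontyCloud's actual API:

```python
import json

def handler(event, context):
    """Minimal API Gateway (REST API, Lambda proxy integration) handler.

    The proxy integration delivers `httpMethod` and `path` in the event
    and expects {statusCode, headers, body} back.
    """
    method = event.get("httpMethod")
    path = event.get("path", "/")
    if method == "GET" and path == "/health":
        status, body = 200, {"status": "ok"}
    else:
        status, body = 404, {"error": "not found"}
    return {
        "statusCode": status,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),  # body must be a string, not a dict
    }
```

Because the handler is a plain function of the event dict, it can be unit tested locally without deploying to AWS.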

Posted 4 days ago

Apply

8.0 - 13.0 years

12 - 22 Lacs

Pune

Hybrid


- Experience developing applications using Python with AWS Glue (ETL), Lambda, and Step Functions, and with AWS services including EKS, S3, EMR, RDS data stores, CloudFront, and API Gateway
- Experience with AWS services such as Amazon Elastic Compute Cloud (EC2), Glue, Amazon S3, EKS, and Lambda
Required candidate profile:
- 10+ years of experience in software development and technical leadership, preferably with strong financial knowledge in building complex trading applications
- 5+ years of people management experience

Posted 4 days ago

Apply

4.0 - 9.0 years

12 - 16 Lacs

Kochi

Work from Office


As Data Engineer, you will develop, maintain, evaluate and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Experienced in building data pipelines to ingest, process, and transform data from files, streams and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and Big Data technologies for various use cases built on the platform.
- Experience in developing streaming pipelines.
- Experience working with Hadoop / AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data / cloud technologies such as Apache Spark, Kafka, and cloud computing.

Required education: Bachelor's degree. Preferred education: Master's degree.

Required technical and professional expertise:
- Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on AWS.
- Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, and DynamoDB.
- Good to excellent SQL skills.
- Exposure to streaming solutions and message brokers like Kafka.

Preferred technical and professional experience: certification in AWS, and Databricks- or Cloudera-certified Spark developers.

Posted 4 days ago

Apply

4.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office


As a Senior Cloud Platform Back-End Engineer with a strong background in AWS tools and services, you will join the Data & AI Solutions - Engineering team in our Healthcare R&D business. Your expertise will enhance the development and continuous improvement of a critical AWS-cloud-based analytics platform, supporting our R&D efforts in drug discovery. This role involves implementing the technical roadmap and maintaining existing functionality. You will adapt to evolving technologies, manage infrastructure and security, design and implement new features, and oversee seamless deployment of updates. Additionally, you will implement strategies for data archival and optimize data lifecycle processes for efficient storage management in compliance with regulations. Join a multicultural team working with agile methodologies and high autonomy. The role requires office presence at our Bangalore location.

Who You Are:
- University degree in Computer Science, Engineering, or a related field
- Proficiency in Python, especially the boto3 library for interacting with AWS services programmatically, and in infrastructure as code with the AWS CDK and AWS Lambda
- Experience with API development and management: designing, developing, and managing APIs using AWS API Gateway and other relevant API frameworks
- Strong understanding of AWS security best practices, IAM policies, encryption, auditing, and regulatory compliance (e.g. GDPR)
- Experience with application performance monitoring and tracing solutions such as AWS CloudWatch, X-Ray, and OpenTelemetry
- Proficiency in navigating and utilizing various AWS tools and services
- System design skills in a cloud environment
- Experience with SQL and data integration into Snowflake
- Familiarity with Microsoft Entra ID for identity and access management
- Willingness to work in a multinational environment and cross-functional teams distributed between the US, Europe (mostly Germany), and India
- Sense of accountability and ownership; fast learner
- Fluency in English and excellent communication skills
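The data-archival responsibility mentioned above is typically handled with S3 lifecycle rules. The sketch below builds one rule as a plain dict (the bucket name, prefix, and day counts are illustrative assumptions) and shows, guarded behind `__main__`, how it would be applied with boto3's `put_bucket_lifecycle_configuration`:

```python
def archival_rule(prefix: str, glacier_after_days: int, expire_after_days: int) -> dict:
    """Build one S3 lifecycle rule: transition objects under `prefix`
    to Glacier after N days, then expire them after M days."""
    return {
        "ID": f"archive-{prefix.strip('/')}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [{"Days": glacier_after_days, "StorageClass": "GLACIER"}],
        "Expiration": {"Days": expire_after_days},
    }

if __name__ == "__main__":
    # Applying the rule requires AWS credentials; bucket name is hypothetical.
    import boto3

    s3 = boto3.client("s3")
    s3.put_bucket_lifecycle_configuration(
        Bucket="analytics-platform-archive",
        LifecycleConfiguration={"Rules": [archival_rule("raw/", 90, 365)]},
    )
```

Keeping the rule construction in a pure function lets the retention policy be reviewed and tested without touching AWS.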

Posted 4 days ago

Apply

6.0 - 11.0 years

8 - 18 Lacs

Hyderabad, Pune, Bengaluru

Work from Office


Hi, we have an excellent opportunity with a top MNC company for a permanent position.

Skills: Java, Microservices, JavaScript, Spring Boot; experience with NoSQL databases (MongoDB)
Experience: 5+ years
Locations: Bangalore, Hyderabad, Pune, Kerala

- 5+ years of experience developing backend, API applications/software
- Expert working experience in Java, Microservices, JavaScript, and Spring Boot
- Experience with NoSQL databases: MongoDB
- Strong understanding of the entire Software Development Life Cycle (SDLC) and Agile (Scrum)
- Experience with web services (consuming or creating) with REST, MQTT, and WebSockets
- Good experience with microservices architecture, the pub-sub model, cloud architecture, QA automation, CI/CD pipelines, application security and load testing, and third-party integration and management (Docker, Kubernetes)
- Experience managing cloud infrastructure (resources and services) in AWS, Azure, and/or GCP
- Strong knowledge of SOA, object-oriented programming, design patterns, and multi-threaded application development
- Experience in reporting and analytics, queuing, and real-time streaming systems
- Experience developing, maintaining, and innovating large-scale web or mobile applications
- BE / B.Tech / MCA / M.Tech in computer programming, computer science, or a related field

If you are interested, kindly revert with an updated resume. Thanks for applying.
Regards, Jamuna, 9916995347, s.jamuna@randstaddigital.com

Posted 4 days ago

Apply

7.0 - 12.0 years

25 - 35 Lacs

Hyderabad

Hybrid


We are currently hiring for a Senior Software Engineer role with strong experience in Node.js, AWS services, and JavaScript/TypeScript. This is a high-priority position, and we are looking for candidates who can join immediately or on a very short notice period.

Senior Software Engineer - Required Experience: 7+ years

Technical skills in most of the following areas:
- Node.js, ReactJS and Redux, Saga, GraphQL, REST APIs, HTML, CSS, CSS3, JavaScript, TypeScript, and the Serverless.com framework (or knowledge of/experience with AWS SAM, Lambda, S3, CloudWatch, and DynamoDB/Neptune DB); knowledge of or experience with Cassandra and MySQL databases
- 6-10 years of experience in software web development using Node.js and TypeScript/JavaScript
- Good understanding of the software development life cycle: requirements gathering, requirements analysis, execution, and defect tracking
- Experience building REST and GraphQL APIs and web services using Node.js, Express.js/Apollo, AWS Lambda, and API Gateway
- Extensive knowledge of SOA principles, design patterns, and application and integration architectures
- Experience in Agile methodology with tools like JIRA, Git, GitLab, SVN, and Bitbucket as an active scrum member
- Strong in Object-Oriented Analysis & Design (OOAD)
- Strong experience with PaaS and IaaS cloud computing
- Experience developing secure, high-performance web APIs that others rely on
- Good hands-on experience with serverless frameworks, the AWS JS SDK, and AWS services such as Lambda, SNS, SES, SQS, SSM, S3, EC2, IAM, CloudWatch, Kinesis, and CloudFormation
- Solid understanding of SQL/NoSQL databases such as MySQL, DynamoDB, Neptune DB, and AWS Timestream
- Experience with TDD, i.e. writing unit test cases, and coding standards

Posted 4 days ago

Apply

3.0 - 8.0 years

25 - 40 Lacs

Hyderabad

Work from Office


1. Automation of Processes: Automate trading system deployments, configuration, and monitoring to minimize manual errors and ensure rapid, consistent updates across environments. Develop scripts and tools to automate repetitive tasks, such as environment provisioning, software deployments, and database updates, using tools like Ansible, Jenkins, or Terraform.
2. High-Frequency Trading (HFT) System Optimization: Optimize CI/CD pipelines for ultra-low-latency and high-throughput trading systems to support continuous delivery of trading algorithms and infrastructure updates. Ensure that deployment and testing processes do not impact the performance of trading operations.
3. Infrastructure Management and Scalability: Manage cloud and on-premises infrastructures tailored for trading environments, focusing on low latency, high availability, and failover strategies. Use Infrastructure as Code (IaC) to provision scalable and secure environments that can handle the fluctuating loads typical in trading operations.
4. Monitoring and Real-Time Logging: Implement monitoring tools to track system performance, trade execution times, and infrastructure health in real time. Set up sophisticated logging mechanisms for trade data, errors, and performance metrics, ensuring traceability and quick troubleshooting during incidents.
5. Security and Compliance: Integrate security best practices into the DevOps pipeline, including real-time security scans, vulnerability assessments, and access control tailored for financial data protection. Ensure that all systems comply with financial regulations such as GDPR, MiFID II, and SEC rules, including managing audit logs and data retention policies.
6. Disaster Recovery and High Availability: Design and maintain disaster recovery solutions to ensure continuity of trading operations during outages or data breaches. Implement redundancy and failover strategies to maintain trading platform uptime, minimizing the risk of costly downtime.
7. Performance Optimization for Trading Systems: Fine-tune infrastructure and CI/CD pipelines to reduce deployment times and latency, crucial for real-time trading environments. Work on system performance to support the rapid execution of trades, data feeds, and order-matching systems.
8. Incident Management and Troubleshooting: Rapidly respond to incidents affecting trading operations, performing root-cause analysis and implementing corrective measures to prevent recurrence. Ensure detailed incident reporting and documentation to support regulatory requirements.
9. Configuration Management: Maintain configuration consistency across multiple environments (dev, test, prod) using tools like Puppet, Chef, or SaltStack. Ensure configurations meet the stringent security and performance standards required for trading platforms.
10. Collaboration with Development and Trading Teams: Work closely with developers, quants, and traders to ensure smooth deployment of new trading algorithms and updates to trading platforms. Facilitate communication between development, trading desks, and compliance teams to ensure that changes are in line with business requirements and regulations.
11. Risk Management: Implement risk management controls within the DevOps pipeline to minimize the impact of potential system failures on trading operations. Work with risk and compliance teams to ensure that deployment and infrastructure changes do not expose trading systems to unnecessary risks.
12. Cloud Services and Cost Optimization: Deploy, manage, and scale trading applications on cloud platforms like AWS, Azure, or Google Cloud, with a focus on minimizing costs without compromising performance. Utilize cloud-native services such as AWS Lambda or Azure Functions for event-driven processes in trading workflows.
13. Version Control and Code Management: Manage the versioning of trading algorithms and platform updates using Git or similar tools, ensuring traceability and quick rollback capabilities if issues arise. Establish rigorous code review processes to ensure that changes align with performance and security standards specific to trading systems.

Posted 4 days ago

Apply

2.0 - 5.0 years

1 - 7 Lacs

Kolkata

Work from Office


Responsibilities:
* Learn new technologies quickly and implement them as needed.
* Develop applications in Golang and NestJS/Node.js (TypeScript/JavaScript).
* Ability to develop React applications is preferred.

Posted 4 days ago

Apply

6.0 - 11.0 years

15 - 25 Lacs

Hyderabad

Work from Office


6–9 years of hands-on experience in the MEAN/MERN stacks; team-handling experience is a must. DevOps: CI/CD pipelines (Jenkins/GitLab), Docker, Kubernetes. AWS: Lambda, SQS, S3, EC2, CloudFormation. Experience upgrading Angular, Node.js, and MongoDB in production environments.

Posted 4 days ago

Apply

4.0 - 6.0 years

72 - 96 Lacs

Ahmedabad

Work from Office


Responsibilities: * Lead technology strategy & roadmap * Ensure scalability, security & reliability * Collaborate with cross-functional teams on system design * Oversee tech team's delivery & optimization

Posted 5 days ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Pune

Hybrid


Job Duties and Responsibilities:
We are looking for a self-starter to join our Data Engineering team. You will work in a fast-paced environment where you will get an opportunity to build and contribute to the full lifecycle development and maintenance of the data engineering platform. With the Data Engineering team you will get an opportunity to:
- Design and implement data engineering solutions that are scalable, reliable, and secure in the cloud environment
- Understand and translate business needs into data engineering solutions
- Build large-scale data pipelines that can handle big data sets using distributed data processing techniques, supporting the efforts of the data science and data application teams
- Partner with cross-functional stakeholders, including product managers, architects, data quality engineers, and application and quantitative-science end users, to deliver engineering solutions
- Contribute to defining data governance across the data platform

Basic Requirements:
- A minimum of a BS degree in computer science, software engineering, or a related scientific discipline
- 3+ years of work experience in building scalable and robust data engineering solutions
- Strong understanding of object-oriented programming and proficiency in Python (TDD) and PySpark to build scalable algorithms
- 3+ years of experience in distributed computing and big data processing using the Apache Spark framework, including Spark optimization techniques
- 2+ years of experience with Databricks, Delta tables, Unity Catalog, Delta Sharing, Delta Live Tables (DLT), and incremental data processing
- Experience with Delta Lake and Unity Catalog
- Advanced SQL coding and query optimization experience, including the ability to write analytical and nested queries
- 3+ years of experience building scalable ETL/ELT data pipelines on Databricks and AWS (EMR)
- 2+ years of experience orchestrating data pipelines using Apache Airflow/MWAA
- Understanding and experience of AWS services including ADX, EC2, and S3
- 3+ years of experience with data modeling techniques for structured/unstructured datasets
- Experience with relational/columnar databases (Redshift, RDS) and interactive querying services (Athena/Redshift Spectrum)
- Passion for healthcare and improving patient outcomes
- Analytical thinking with strong problem-solving skills
- Staying on top of emerging technologies and a willingness to learn

Bonus Experience (optional):
- Experience with an Agile environment
- Experience operating in a CI/CD environment
- Experience building HTTP/REST APIs using popular frameworks
- Healthcare experience

Posted 5 days ago

Apply

10.0 - 15.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Hybrid

Naukri logo

Role & responsibilities

Key Responsibilities:
- Design, develop, and maintain backend services using Python and AWS serverless technologies.
- Implement event-driven architectures to ensure efficient and scalable solutions.
- Utilize Terraform for infrastructure as code to manage and provision AWS resources.
- Configure and manage AWS networking components to ensure secure and reliable communication between services.
- Develop and maintain serverless applications using AWS Lambda functions, DynamoDB, and other AWS services.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, maintainable, and efficient code while following best practices.
- Troubleshoot and resolve issues in a timely manner.
- Stay up to date with the latest industry trends and technologies to ensure our solutions remain cutting-edge.

Required Qualifications:
- 9 to 15 years of experience in backend development with a strong focus on Python.
- Proven experience with AWS serverless technologies, including Lambda, DynamoDB, and other related services.
- Strong understanding of event-driven architecture and its implementation.
- Hands-on experience with Terraform for infrastructure as code.
- In-depth knowledge of AWS networking components and best practices.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Preferred Qualifications:
- AWS certifications (e.g., AWS Certified Developer, AWS Certified Solutions Architect).
- Experience with other programming languages and frameworks.
- Familiarity with CI/CD pipelines and DevOps practices.
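The event-driven pattern this role centers on typically means Lambda consuming batches of records from a queue or stream. A minimal sketch of such a consumer, assuming the standard SQS-to-Lambda record shape and the partial-batch-failure response format; the "orderId" business rule is purely illustrative:

```python
import json

def handle_sqs_event(event, context=None):
    """Process a batch of SQS records delivered to a Lambda function.

    Returns the IDs of messages that failed, in the partial-batch-failure
    shape ("batchItemFailures") so only failed messages are retried.
    """
    failures = []
    for record in event.get("Records", []):
        try:
            payload = json.loads(record["body"])
            # Illustrative business rule: every message must carry an orderId.
            if "orderId" not in payload:
                raise ValueError("missing orderId")
        except (ValueError, json.JSONDecodeError):
            failures.append({"itemIdentifier": record["messageId"]})
    return {"batchItemFailures": failures}
```

Returning per-message failures (rather than raising) keeps one bad message from forcing redelivery of the whole batch, which matters for the scalability this listing emphasizes.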

Posted 5 days ago

Apply

5.0 - 10.0 years

12 - 15 Lacs

Mohali

Work from Office

Naukri logo

Role Overview: The Enhancements & Automation Lead is responsible for driving AI-driven automation, optimizing workflows, and integrating intelligent solutions to improve the efficiency of Amazon Connect Managed Services.

Key Responsibilities:
- Identify automation opportunities to streamline Amazon Connect operations.
- Design and implement AI-driven enhancements for call routing, sentiment analysis, and chatbots.
- Integrate third-party automation tools and AI models for intelligent ticketing and incident classification.
- Collaborate with NOC teams and developers to optimize IVR workflows and customer experience.
- Evaluate and recommend automation platforms, RPA (Robotic Process Automation), and AI frameworks.
- Develop custom scripts, serverless automation workflows, and self-healing mechanisms.
- Provide expert guidance on DevOps, CI/CD, and cloud automation best practices.

Required Skills & Qualifications:
- 5+ years of experience in automation, AI/ML, or cloud-based service optimization.
- Hands-on experience with AWS Lambda, AI/ML services, Python, and API integrations.
- Strong understanding of RPA, low-code/no-code automation tools, and workflow orchestration.
- Proficiency in Amazon Connect architecture, IVR scripting, and customer interaction analytics.
- AWS certifications in Machine Learning or DevOps are highly preferred.

Posted 5 days ago

Apply

Exploring AWS Lambda Jobs in India

The job market for AWS Lambda roles in India is currently booming with the increasing adoption of cloud technology by businesses. AWS Lambda is a serverless computing service provided by Amazon Web Services, allowing developers to run code without provisioning or managing servers. This has led to a high demand for professionals with AWS Lambda skills across a variety of industries.
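Concretely, "running code without provisioning servers" means writing a handler function that AWS invokes on demand. A minimal sketch in Python (the handler name and event fields are illustrative, not tied to any particular listing):

```python
import json

def lambda_handler(event, context):
    """Minimal AWS Lambda handler: receives a JSON event, returns a response.

    AWS provisions the execution environment, invokes this function per
    event, and scales the number of concurrent environments automatically.
    """
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"Hello, {name}!"}),
    }
```

Everything outside the handler (servers, scaling, patching) is managed by AWS, which is what drives the hiring demand described above.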

Top Hiring Locations in India

Here are 5 major cities in India actively hiring for AWS Lambda roles: - Bangalore - Hyderabad - Pune - Chennai - Mumbai

Average Salary Range

The estimated salary range for AWS Lambda professionals in India varies based on experience levels. Entry-level positions may start at around INR 6-8 lakhs per annum, while experienced professionals can earn upwards of INR 15-20 lakhs per annum.

Career Path

In the field of AWS Lambda, a typical career path may progress as follows: - Junior Developer - Developer - Senior Developer - Tech Lead - Architect

Related Skills

In addition to AWS Lambda expertise, professionals in this field are often expected to have knowledge of the following related skills: - AWS CloudFormation - AWS API Gateway - AWS S3 - AWS DynamoDB - Python programming
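Several of these skills meet in the common API Gateway-to-Lambda proxy integration, where the handler receives the HTTP request as an event and must return a response in the proxy-integration shape. A hedged sketch (the `/health` route and handler name are assumptions for illustration):

```python
import json

def api_handler(event, context=None):
    """Handle an API Gateway proxy event and return the required response shape."""
    path = event.get("path", "/")
    method = event.get("httpMethod", "GET")
    if method == "GET" and path == "/health":
        code, body = 200, {"status": "ok"}
    else:
        code, body = 404, {"error": "not found"}
    # Proxy integrations require statusCode, headers, and a string body.
    return {
        "statusCode": code,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(body),
    }
```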

Interview Questions

Here are 25 interview questions for AWS Lambda roles:
- How does AWS Lambda differ from traditional server-based computing? (basic)
- What are the benefits of using AWS Lambda? (basic)
- Explain the concept of event sources in AWS Lambda. (medium)
- How can you troubleshoot performance issues in AWS Lambda functions? (medium)
- What is the maximum execution time allowed for a single invocation of an AWS Lambda function? (basic)
- How can you monitor AWS Lambda functions? (medium)
- What is the difference between provisioned concurrency and on-demand concurrency in AWS Lambda? (advanced)
- How does AWS Lambda handle scaling automatically? (medium)
- Explain the concept of cold start in AWS Lambda. (medium)
- How can you optimize the performance of AWS Lambda functions? (advanced)
- What is the maximum memory allocation for an AWS Lambda function? (basic)
- How does AWS Lambda pricing work? (medium)
- What is the difference between synchronous and asynchronous invocation of AWS Lambda functions? (medium)
- How can you secure AWS Lambda functions? (medium)
- What is the AWS Serverless Application Model (SAM)? (medium)
- How can you handle errors in AWS Lambda functions? (medium)
- Explain the concept of environment variables in AWS Lambda. (basic)
- How can you integrate AWS Lambda with other AWS services? (medium)
- What are the different programming languages supported by AWS Lambda? (basic)
- What are the limitations of AWS Lambda? (medium)
- How can you automate deployment of AWS Lambda functions? (medium)
- What is the AWS Lambda Execution Role? (basic)
- How can you monitor AWS Lambda costs? (medium)
- Explain the concept of dead-letter queues in AWS Lambda. (medium)
- How can you test AWS Lambda functions locally? (medium)
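Two of these questions (cold starts and environment variables) come up constantly, and one code pattern illustrates both: do expensive setup at module level so it runs once per execution environment (the cold start), while warm invocations reuse it. A hedged sketch; the `TABLE_NAME` variable and its default are assumptions for illustration:

```python
import os
import time

# Module-level code runs once per execution environment, during the cold
# start; warm invocations reuse these objects (e.g. SDK clients, config).
START = time.time()
TABLE_NAME = os.environ.get("TABLE_NAME", "example-table")  # set via Lambda env vars

def handler(event, context=None):
    # Only lightweight, per-request work happens here on warm invocations.
    return {
        "table": TABLE_NAME,
        "warm_for_seconds": round(time.time() - START, 3),
    }
```

In a real function the module level is also where you would create clients such as a database connection, precisely so they survive across invocations.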

Closing Remark

As the demand for AWS Lambda professionals continues to grow in India, it is essential for job seekers to stay updated with the latest trends and skills in the field. By preparing thoroughly and applying confidently, you can secure exciting opportunities in the thriving AWS Lambda job market. Good luck!
