
281 Lambda Expressions Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

2.0 - 3.0 years

3 - 7 Lacs

Ahmedabad

Work from Office


Responsibilities
- Enterprise-Level Development: Design, develop, and deploy full-stack web applications for enterprise systems, gaming platforms, and social media applications.
- Backend Development: Build and maintain scalable backend services using Node.js, Next.js, Nest.js, and related frameworks.
- Frontend Development: Create dynamic and responsive user interfaces using React.js; familiarity with Next.js and Vue.js is a plus.
- Third-Party Integrations: Integrate APIs, payment gateways, cloud services, and social media platforms effectively.
- Database Management: Design, optimize, and manage both relational (PostgreSQL, MySQL) and NoSQL (MongoDB) databases for large-scale data handling.
- Cloud Infrastructure: Deploy and manage applications on AWS (e.g., EC2, S3, Lambda, RDS) for scalable and reliable performance (a small illustration follows this listing).
- Containerization: Use Docker for application containerization and ensure smooth deployment across environments.
- Team Leadership & Communication: Lead development teams, mentor junior developers, and maintain clear communication with clients to gather requirements and ensure delivery satisfaction.
- Performance Optimization: Improve application speed, scalability, and security to ensure high availability and an excellent user experience.
- Agile Collaboration: Work in Agile teams, participating in sprint planning and reviews and delivering consistent, high-quality results.
- AI Integration: Collaborate on AI features such as chatbots, recommendation systems, sentiment analysis, and content generation using tools like OpenAI, AWS AI/ML, or Google Cloud AI.

Required Skills & Experience
- Node.js: 2+ years of hands-on experience with scalable backend development.
- React.js & Next.js: Strong expertise in building modern frontends, including SSR and SSG.
- Database Expertise: Solid experience with PostgreSQL, MySQL, and MongoDB, including schema design and performance tuning.
- Cloud Platforms: Proficiency in AWS services (EC2, S3, Lambda, RDS) for hosting and scaling applications.
- Docker: Deep understanding of containerization with Docker; familiarity with orchestration tools like Kubernetes.
- AI Tooling: Exposure to AI platforms such as OpenAI, Google Cloud AI, or similar tools.

Why Join Webs Optimization Software Solution
- Working Days: 5 days a week
- Company History: Incorporated in 2013
- Team: An ever-growing team of 80+ highly talented professionals
- Work Schedule: Flexible working hours
- Health Benefits: Medical insurance
- Work Culture: Positive atmosphere and culture promoting personal growth
- Job Satisfaction: Job satisfaction and stability with a suitable leave policy
- Company Activities: Fun company activities
- WFH Policy: Benefit of a work-from-home policy
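
The stack above is Node-centric; purely to illustrate the AWS side of the integration work it describes, here is a minimal Python (boto3) sketch that issues a presigned S3 upload URL so clients can upload assets without proxying the bytes through the backend. The bucket and key names are placeholders, not part of the original posting.

```python
import boto3
from botocore.config import Config

s3 = boto3.client("s3", config=Config(signature_version="s3v4"))

def make_upload_url(bucket: str, key: str, expires: int = 3600) -> str:
    """Return a presigned PUT URL so a browser client can upload a file
    straight to S3 instead of streaming it through the API server."""
    return s3.generate_presigned_url(
        "put_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires,
    )

if __name__ == "__main__":
    # Bucket and key are placeholder names for the example.
    print(make_upload_url("example-app-uploads", "avatars/user-123.png"))
```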

Posted 17 hours ago

Apply

4.0 - 8.0 years

5 - 9 Lacs

Hyderabad

Work from Office


About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com.

About The Position
We are looking for an experienced AWS Developer responsible for making our app more scalable and reliable. You will containerize our application and migrate it to EKS or another AWS service such as ECS or Lambda; at present we run our services on EC2 machines using Auto Scaling Groups. You will be responsible for setting up a monitoring stack, whose metrics will be used for service capacity planning (a small example follows this listing). Additionally, you will update our deployment model to cover automatic rollbacks, short downtime when a new version is deployed to production servers, and similar challenges. Migration to the AWS CI/CD stack will also form part of your responsibilities.

What You'll Do
- Assist in the rollout and training of resources on utilizing AWS data science support tools and the AWS environment for development squads.
- Work within the client's AWS environment to help implement AI/ML model development and data platform architecture.
- Help evaluate, recommend, and assist with the installation of cloud-based tools.
- Wrangle data and host and deploy AI models.

Expertise You'll Bring
- A Bachelor's or Master's degree in science, engineering, mathematics, or equivalent experience
- 5+ years working as a DevOps engineer
- Strong hands-on experience with AWS (Lambda, S3, or similar tools)
- Experience working in an Agile environment
- Object-Oriented Programming (OOP)
- Relational databases (MySQL preferred)
- Proficiency in containerization tools
- Linux shell scripting (Bash, Python)
- Git
- CI/CD using Jenkins
- Containerization: Kubernetes, Pivotal Cloud Foundry, or other similar tools
- Software development process, including architectural styles and design patterns
- Creating CI/CD pipelines using Jenkins, CodeBuild, AWS ECR, and Helm
- Jenkins job-as-code and infrastructure-as-code
- All aspects of provisioning compute resources within the AWS environment

Benefits
- Competitive salary and benefits package
- Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.

Let's unleash your full potential. See Beyond, Rise Above.
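
The posting asks for a monitoring stack whose metrics feed capacity planning. As a hedged illustration of one building block (not Persistent's actual setup), the Python sketch below publishes a custom CloudWatch metric with boto3; the namespace, metric, and dimension names are invented for the example.

```python
import boto3

# Namespace, metric, and dimension names below are invented for the example.
cloudwatch = boto3.client("cloudwatch")

def publish_queue_depth(queue_name: str, depth: int) -> None:
    """Publish a custom metric that a capacity-planning dashboard or
    alarm can consume alongside the built-in AWS/EC2 metrics."""
    cloudwatch.put_metric_data(
        Namespace="MyApp/CapacityPlanning",
        MetricData=[{
            "MetricName": "QueueDepth",
            "Dimensions": [{"Name": "Queue", "Value": queue_name}],
            "Value": float(depth),
            "Unit": "Count",
        }],
    )

publish_queue_depth("orders", 42)
```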

Posted 19 hours ago

Apply

4.0 - 8.0 years

5 - 9 Lacs

Bengaluru

Work from Office


About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 14 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,186M annual revenue (13.2% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,850 people located in 21 countries across the globe. Throughout this market-leading growth, we've maintained strong employee satisfaction: over 94% of our employees approve of the CEO, and 89% recommend working at Persistent to a friend. At Persistent, we embrace diversity to unlock everyone's potential. Our programs empower our workforce by harnessing varied backgrounds for creative, innovative problem-solving. Our inclusive environment fosters belonging, encouraging employees to unleash their full potential. For more details, please visit www.persistent.com.

About The Position
We are looking for an experienced AWS Developer responsible for making our app more scalable and reliable. You will containerize our application and migrate it to EKS or another AWS service such as ECS or Lambda; at present we run our services on EC2 machines using Auto Scaling Groups. You will be responsible for setting up a monitoring stack, whose metrics will be used for service capacity planning. Additionally, you will update our deployment model to cover automatic rollbacks, short downtime when a new version is deployed to production servers, and similar challenges. Migration to the AWS CI/CD stack will also form part of your responsibilities.

What You'll Do
- Assist in the rollout and training of resources on utilizing AWS data science support tools and the AWS environment for development squads.
- Work within the client's AWS environment to help implement AI/ML model development and data platform architecture.
- Help evaluate, recommend, and assist with the installation of cloud-based tools.
- Wrangle data and host and deploy AI models.

Expertise You'll Bring
- A Bachelor's or Master's degree in science, engineering, mathematics, or equivalent experience
- 5+ years working as a DevOps engineer
- Strong hands-on experience with AWS (Lambda, S3, or similar tools)
- Experience working in an Agile environment
- Object-Oriented Programming (OOP)
- Relational databases (MySQL preferred)
- Proficiency in containerization tools
- Linux shell scripting (Bash, Python)
- Git
- CI/CD using Jenkins
- Containerization: Kubernetes, Pivotal Cloud Foundry, or other similar tools
- Software development process, including architectural styles and design patterns
- Creating CI/CD pipelines using Jenkins, CodeBuild, AWS ECR, and Helm
- Jenkins job-as-code and infrastructure-as-code
- All aspects of provisioning compute resources within the AWS environment

Benefits
- Competitive salary and benefits package
- Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.

Let's unleash your full potential. See Beyond, Rise Above.

Posted 19 hours ago

Apply

4.0 - 8.0 years

5 - 9 Lacs

Pune

Work from Office


About Persistent
We are a trusted Digital Engineering and Enterprise Modernization partner, combining deep technical expertise and industry experience to help our clients anticipate what's next. Our offerings and proven solutions create a unique competitive advantage for our clients by giving them the power to see beyond and rise above. We work with many industry-leading organizations across the world, including 12 of the 30 most innovative US companies, 80% of the largest banks in the US and India, and numerous innovators across the healthcare ecosystem. Our growth trajectory continues, as we reported $1,231M annual revenue (16% Y-o-Y). Along with our growth, we've onboarded over 4,900 new employees in the past year, bringing our total employee count to more than 23,500 people located in 19 countries across the globe. Persistent Ltd. is dedicated to fostering diversity and inclusion in the workplace. We invite applications from all qualified individuals, including those with disabilities, and regardless of gender or gender preference. We welcome diverse candidates from all backgrounds. For more details, please visit www.persistent.com.

About The Position
We are looking for an experienced AWS Developer responsible for making our app more scalable and reliable. You will containerize our application and migrate it to EKS or another AWS service such as ECS or Lambda; at present we run our services on EC2 machines using Auto Scaling Groups. You will be responsible for setting up a monitoring stack, whose metrics will be used for service capacity planning. Additionally, you will update our deployment model to cover automatic rollbacks, short downtime when a new version is deployed to production servers, and similar challenges. Migration to the AWS CI/CD stack will also form part of your responsibilities.

What You'll Do
- Assist in the rollout and training of resources on utilizing AWS data science support tools and the AWS environment for development squads.
- Work within the client's AWS environment to help implement AI/ML model development and data platform architecture.
- Help evaluate, recommend, and assist with the installation of cloud-based tools.
- Wrangle data and host and deploy AI models.

Expertise You'll Bring
- A Bachelor's or Master's degree in science, engineering, mathematics, or equivalent experience
- 5+ years working as a DevOps engineer
- Strong hands-on experience with AWS (Lambda, S3, or similar tools)
- Experience working in an Agile environment
- Object-Oriented Programming (OOP)
- Relational databases (MySQL preferred)
- Proficiency in containerization tools
- Linux shell scripting (Bash, Python)
- Git
- CI/CD using Jenkins
- Containerization: Kubernetes, Pivotal Cloud Foundry, or other similar tools
- Software development process, including architectural styles and design patterns
- Creating CI/CD pipelines using Jenkins, CodeBuild, AWS ECR, and Helm
- Jenkins job-as-code and infrastructure-as-code
- All aspects of provisioning compute resources within the AWS environment

Benefits
- Competitive salary and benefits package
- Culture focused on talent development, with quarterly promotion cycles and company-sponsored higher education and certifications
- Opportunity to work with cutting-edge technologies
- Employee engagement initiatives such as project parties, flexible work hours, and Long Service awards
- Annual health check-ups
- Insurance coverage: group term life, personal accident, and Mediclaim hospitalization for self, spouse, two children, and parents

Inclusive Environment
- We offer hybrid work options and flexible working hours to accommodate various needs and preferences.
- Our office is equipped with accessible facilities, including adjustable workstations, ergonomic chairs, and assistive technologies to support employees with physical disabilities.

Let's unleash your full potential. See Beyond, Rise Above.

Posted 19 hours ago

Apply

7.0 - 13.0 years

9 - 15 Lacs

Bengaluru

Work from Office


Location: Bangalore

About LeadSquared
One of the fastest-growing SaaS unicorn companies in the CRM space, LeadSquared empowers organizations with the power of automation. More than 2,000 customers with 2 lakh+ users across the globe use the LeadSquared platform to automate their sales and marketing processes and run high-velocity sales at scale. We are backed by prominent investors such as Westbridge Capital, Stakeboat Capital, and Gaja Capital, to name a few. We are expanding rapidly, and our 1,300+ strong and still growing workforce is spread across India, the US, the Middle East, ASEAN, ANZ, and South Africa.
- Among the Top 50 fastest-growing tech companies in India per the Deloitte Fast 50 program
- Frost & Sullivan's 2019 Marketing Automation Company of the Year award
- Among the Top 100 fastest-growing companies in the FT 1000: High-Growth Companies Asia-Pacific
- Listed as a Top Rated Product on G2 Crowd, GetApp, and TrustRadius

Engineering @ LeadSquared
At LeadSquared, we like staying up to date with the latest technology and using trending tech stacks to build our product. By joining the engineering team, you get to work first-hand with the latest web and mobile technologies and solve the challenges of scale, performance, security, and cost optimization. Our goal is to build the best SaaS platform for sales execution in the industry, and what better place than LeadSquared for an exciting career?

The Role
The LeadSquared platform and product suite are 100% on the cloud, currently all on AWS. The product suite comprises a large number of applications, services, and APIs built on various open-source and AWS-native tech stacks and deployed across multiple AWS accounts. The role involves leading the mission-critical responsibility of ensuring that all our online services are available, reliable, secure, performant, and running at optimal cost. We firmly believe in a code- and automation-driven approach to Site Reliability.

Responsibilities
- Take ownership of release management, with effective build and deployment processes, by collaborating with development teams
- Infrastructure and configuration management of production systems
- Be a stakeholder in product scoping, performance enhancement, cost optimization, and architecture discussions with Engineering leaders
- Automate DevOps functions and take full control of source code repository management with continuous integration
- Build a strong understanding of product functionality, customers' use cases, and architecture
- Prioritize and meet SLAs for incidents and service management; ensure that projects are managed and delivered on time and with quality
- Recommend new technologies and tools that automate manual tasks, improve observability, and speed up troubleshooting (an illustrative sketch follows this listing)
- Ensure the team adheres to compliance and company policies, with regular audits
- Motivate, empower, and improve the team's technical skills

Requirements
- 13+ years' experience in building, deploying, and scaling software applications on the AWS cloud (preferably in SaaS)
- Deep understanding of observability and cost optimization across all major AWS services: EC2, RDS, Elasticsearch, Redis, SQS, API Gateway, Lambda, etc.
- AWS certification is a plus
- Experience in building tools for deployment automation and observability response management for AWS resources; .NET, Python, and CFTs or Terraform are preferred
- Operational experience in deploying, operating, scaling, and troubleshooting large-scale production systems on the cloud
- Strong interpersonal communication skills (including listening, speaking, and writing) and ability to work well in a diverse, team-focused environment with other DevOps and engineering teams
- Ability to function well in a fast-paced, rapidly changing environment
- 5+ years' experience in people management

Why Should You Apply?
- Fast-paced environment
- Accelerated growth and rewards
- Easily approachable management
- Work with the best minds and industry leaders
- Flexible work timings

Interested? If this role sounds like you, then apply with us! You have plenty of room for growth at LeadSquared.
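
The role emphasizes a code-driven approach to cost optimization and observability. As one simplified illustration of such tooling (not LeadSquared's actual scripts), this Python (boto3) sketch flags running EC2 instances whose daily average CPU stayed low for a week; the threshold and look-back window are arbitrary assumptions.

```python
from datetime import datetime, timedelta, timezone

import boto3

ec2 = boto3.client("ec2")
cw = boto3.client("cloudwatch")

def underutilized_instances(threshold_pct: float = 5.0, days: int = 7):
    """Yield IDs of running instances whose daily average CPU stayed
    below threshold_pct over the whole look-back window -- candidates
    for rightsizing or consolidation."""
    now = datetime.now(timezone.utc)
    pages = ec2.get_paginator("describe_instances").paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    )
    for page in pages:
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                stats = cw.get_metric_statistics(
                    Namespace="AWS/EC2",
                    MetricName="CPUUtilization",
                    Dimensions=[{"Name": "InstanceId",
                                 "Value": instance["InstanceId"]}],
                    StartTime=now - timedelta(days=days),
                    EndTime=now,
                    Period=86400,  # one datapoint per day
                    Statistics=["Average"],
                )
                points = stats["Datapoints"]
                if points and max(p["Average"] for p in points) < threshold_pct:
                    yield instance["InstanceId"]

for instance_id in underutilized_instances():
    print("low utilization:", instance_id)
```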

Posted 3 days ago

Apply

2.0 - 6.0 years

3 - 6 Lacs

Kolkata, Mumbai, New Delhi

Work from Office


hackajob is collaborating with LexisNexis Risk Solutions to connect them with exceptional tech professionals for this role.

Rust Developer @ IDVerse
Full time — Remote — AU / US

Join a strong team of passionate engineers and build a world-class platform to fight identity fraud at a global scale. In Rust.

The Position
You will work in close collaboration with our SVP of Architecture, our engineering team, and our product team to:
- Write and re-write hardened, documented, and tested web- and API-based applications
- Build our growing collection of libraries
- Define and enforce best practices for an expanding Rust team
A fair chunk is green-field work (and no, it's not cryptocurrency/blockchain related). Even our front-end applications are written in Rust, using Leptos for the WASM (and Tailwind for CSS). We prefer event-based architectures, cloud (AWS), and serverless. Only the good stuff.

Needed Qualifications
Whilst technical competence is critical, we place great emphasis on passion, communication, and collaboration across the business.
- You have solid experience creating and maintaining web-based and API-based applications (in Rust or not)
- You can demonstrate having built non-trivial Rust projects, ideally web-related
- You are comfortable with JavaScript/TypeScript
- You are able to communicate clearly, both in writing and orally, and collaborate effectively with a remote team
- You understand that documentation is half the battle, and that untested code is broken code
- You know it takes time to build anything correctly, and you also know how to "get things done" when the situation calls for it
- You are autonomous, but also know it's better to ask than to guess
- You are dependable, responsible, and committed

Nice-to-Haves
It would be even more awesome if you have experience:
- Building front-end WebAssembly applications (with Leptos, maybe?)
- Solving problems with machine learning
- Developing for/with AWS serverless technologies (API Gateway, Lambda, DynamoDB...)

Location and Time Zone
Our team is globally distributed and fully remote. The highest concentration is based around the Australian / East Asian time zones. For this role, we'll look at any location, but will favour the American or European time zones.

About Us
IDVerse is a Sydney-based start-up that is a global pioneer in the development of digital identity verification technology. We've built everything from the ground up and have a broad range of blue-chip customers across banking, telecommunications, government, and more. We've perfected the technology locally in Australia and New Zealand, and are quickly expanding into the northern hemisphere. We're still a small team, and take pride in keeping it smart and inclusive. The position is remote and the work week can be flexible. Remuneration will be competitive and based on experience. We encourage people from all backgrounds and genders to apply to this position. As an early member of the team, you will have a great impact on its future shape.

Instructions On How To Apply
Send an email to devjobs@idverse.com with "Rust Up!" in the title (be exact; we have automated filters that will discard anything else. This is your first test!). Write a few lines about yourself and attach your résumé. Add any link you think will help us assess both your soft and hard skills. If you pique our interest, we'll set up a video call and go from there.

Posted 3 days ago

Apply

1.0 - 4.0 years

2 - 5 Lacs

Bengaluru

Work from Office


Role Overview
We are seeking a skilled Backend Developer with expertise in TypeScript and AWS to design and implement scalable, event-driven microservices. The ideal candidate will have a strong background in serverless architectures and backend development.

Key Responsibilities
- Backend Development: Develop and maintain server-side applications using TypeScript and Node.js
- API Design: Create and manage RESTful APIs adhering to OpenAPI specifications
- Serverless Architecture: Implement serverless solutions using AWS Lambda, API Gateway, and DynamoDB
- Event-Driven Systems: Design and build event-driven architectures utilizing AWS SQS and SNS (a sketch of this pattern follows the listing)
- Microservices: Develop microservices that are scalable and maintainable
- Collaboration: Work closely with frontend developers and other stakeholders to integrate APIs and ensure seamless functionality
- Code Quality: Write clean, maintainable code and conduct code reviews
- Continuous Improvement: Stay updated with the latest industry trends and technologies to continuously improve backend systems

Required Skills & Qualifications
- Experience: 7–10 years in backend development with a focus on TypeScript and Node.js
- AWS Expertise: Proficiency in AWS services such as Lambda, API Gateway, DynamoDB, SQS, and SNS
- API Development: Experience in designing and implementing RESTful APIs
- Event-Driven Architecture: Familiarity with building event-driven systems using AWS services
- Microservices: Experience in developing microservices architectures
- Version Control: Proficiency in using Git
- CI/CD: Experience with continuous integration and continuous deployment pipelines
- Collaboration: Strong communication skills and ability to work in a team environment

Preferred Skills
- Infrastructure as Code: Experience with tools like Terraform or AWS CloudFormation
- Containerization: Familiarity with Docker and container orchestration tools
- Monitoring & Logging: Experience with monitoring and logging tools to ensure system reliability
- Agile Methodologies: Experience working in Agile development environments
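
The responsibilities above describe a serverless, event-driven flow (API Gateway, Lambda, SNS/SQS). The listing is TypeScript-focused, but as a language-neutral illustration of the publish side of that pattern, here is a small Python (boto3) Lambda handler; the topic ARN, event names, and payload shape are placeholders, not part of the posting.

```python
import json
import os

import boto3

sns = boto3.client("sns")

# Placeholder ARN; in the pattern above it would be provisioned via IaC
# (Terraform/CloudFormation) and injected through the environment.
TOPIC_ARN = os.environ.get(
    "ORDER_EVENTS_TOPIC_ARN",
    "arn:aws:sns:us-east-1:123456789012:order-events",
)

def handler(event, context):
    """API Gateway -> Lambda entry point: validate minimally, publish a
    domain event, and return 202 while SQS subscribers do the real work."""
    body = json.loads(event["body"])
    sns.publish(
        TopicArn=TOPIC_ARN,
        Message=json.dumps({"type": "OrderPlaced", "payload": body}),
        MessageAttributes={
            "eventType": {"DataType": "String", "StringValue": "OrderPlaced"},
        },
    )
    return {"statusCode": 202, "body": json.dumps({"accepted": True})}
```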

Posted 3 days ago

Apply

4.0 - 7.0 years

9 - 13 Lacs

Kolkata

Work from Office


Join our Team

About this opportunity:
We are looking for skilled Java Developers at all levels (2–8 years) to join our team. The ideal candidate will have strong expertise in Spring Boot, Kafka, AWS, Docker, and Kubernetes, with a passion for building scalable and efficient backend systems. Knowledge of Generative AI (GenAI) would be a big plus! We are open to Noida, Gurgaon, Kolkata, Pune, Bangalore, and Chennai locations.

Key Responsibilities:
- Design, develop, and maintain backend services using Java and Spring Boot
- Implement event-driven architectures using Kafka (see the sketch after this listing)
- Deploy and manage applications on AWS, leveraging cloud-native services
- Containerize applications using Docker and orchestrate deployments with Kubernetes
- Write efficient, scalable, and secure code following best practices
- Collaborate with cross-functional teams, including frontend developers, DevOps, and product teams
- Optimize application performance, troubleshoot issues, and ensure high availability
- Stay updated with emerging technologies, particularly Generative AI trends

Requirements:
- 2–8 years of experience in Java and Spring Boot
- Hands-on experience with Kafka for real-time data streaming
- Knowledge of AWS services (EC2, S3, Lambda, etc.)
- Experience with Docker and Kubernetes for containerized deployments
- Understanding of microservices architecture and distributed systems
- Familiarity with RESTful APIs, database management (SQL/NoSQL), and caching strategies
- Strong problem-solving skills and a passion for writing clean, maintainable code

Preferred Qualifications:
- Knowledge of Generative AI (GenAI) and AI/ML models
- Experience with CI/CD pipelines and DevOps practices
- Familiarity with monitoring and logging tools
- Exposure to Agile methodologies and team collaboration tools
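
For the Kafka item above, a minimal producer sketch follows. The role is Java/Spring Boot, so treat this Python version (using the third-party kafka-python package) purely as an illustration of the event-publishing idea; the broker address and topic name are placeholders.

```python
import json

from kafka import KafkaProducer  # third-party: pip install kafka-python

# Broker address and topic name are placeholders for the example.
producer = KafkaProducer(
    bootstrap_servers=["localhost:9092"],
    value_serializer=lambda value: json.dumps(value).encode("utf-8"),
)

# Publish a payment event; consumers subscribed to the topic react
# asynchronously, which is the essence of the event-driven design above.
producer.send("payment-events", {"orderId": "o-1001", "status": "CAPTURED"})
producer.flush()  # block until the broker acknowledges the batch
```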

Posted 3 days ago

Apply

2.0 - 6.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Location: OnebyZero — Bangalore, India / Ho Chi Minh City, Vietnam / Bangkok, Thailand / Makati, Philippines
Work Set-up: Hybrid

The Role: DevSecOps Engineer
We are looking for a skilled DevSecOps Engineer with over 3 years of experience and expertise in AWS security. This role focuses on ensuring the security of our cloud infrastructure and applications while fostering collaboration between development, operations, and security teams. In addition to security, the role involves managing cloud infrastructure using Terraform and contributing to overall DevOps practices.

What You'll Do
- Cloud Security Design & Implementation: Design, implement, and manage secure AWS cloud infrastructure, ensuring adherence to best practices in security, scalability, and availability
- Infrastructure as Code (IaC): Develop and maintain cloud infrastructure using Terraform, ensuring version control, scalability, and ease of deployment
- Security Automation: Develop and maintain CI/CD pipelines with integrated security checks to enable secure and rapid software delivery (see the sketch after this listing)
- Risk Assessment: Identify vulnerabilities, assess risks, and implement security measures to protect cloud environments
- Compliance Management: Ensure compliance with regulatory standards and internal policies (e.g., GDPR, HIPAA, ISO 27001) across the cloud infrastructure
- Monitoring & Incident Response: Monitor and respond to security incidents using AWS services like CloudTrail, GuardDuty, and Security Hub
- Collaboration & Training: Work with development and operations teams to implement secure coding practices and conduct security training
- DevOps Practices: Collaborate with teams to ensure smooth integration of security into the DevOps pipeline, enabling automated deployments and scaling

Requirements
Basic Qualifications
- 3+ years of hands-on experience in a DevOps, SecOps, or cloud security role
- Extensive experience with AWS services (EC2, S3, VPC, Lambda, RDS, etc.)
- Strong proficiency in Infrastructure as Code (IaC) using Terraform, AWS CDK, or CloudFormation
- Demonstrated expertise in building, managing, and automating CI/CD pipelines (e.g., GitHub Actions, Jenkins)
- Advanced scripting skills in Python and Bash for automation and tool development
- Expertise in Linux system administration (Ubuntu, CentOS, etc.)
- Deep understanding of networking, security practices, and monitoring in cloud environments
- Experience with containerization and orchestration tools such as Docker and Kubernetes
- Knowledge of security testing tools (e.g., OWASP ZAP, Snyk, or Burp Suite)

Skills
- Cloud Platforms: Advanced AWS expertise (EC2, VPC, S3, Lambda, RDS, CloudFront, etc.)
- IaC Tools: Terraform, AWS CDK, CloudFormation
- CI/CD Tools: GitHub Actions, Jenkins
- Scripting Languages: Python, Bash
- Containerization: Docker, Kubernetes
- Operating Systems: Linux (Ubuntu, CentOS, etc.)
- Version Control: Git, GitHub, GitLab
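
As a concrete flavor of the security-automation responsibility above, here is a hedged Python (boto3) sketch that audits S3 buckets for a missing or incomplete public-access block — the kind of check a CI security stage might run. It is an illustration, not the team's actual tooling.

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def buckets_missing_public_access_block():
    """Return bucket names without a fully enabled public-access block --
    one example of a check a CI security stage might run and fail on."""
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            config = s3.get_public_access_block(Bucket=name)[
                "PublicAccessBlockConfiguration"
            ]
            if not all(config.values()):  # any of the four flags is off
                flagged.append(name)
        except ClientError as err:
            code = err.response["Error"]["Code"]
            if code == "NoSuchPublicAccessBlockConfiguration":
                flagged.append(name)  # no block configured at all
            else:
                raise
    return flagged

if __name__ == "__main__":
    for name in buckets_missing_public_access_block():
        print("public access block missing/incomplete:", name)
```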

Posted 3 days ago

Apply

3.0 - 5.0 years

12 - 16 Lacs

Bengaluru

Work from Office


Job Description
We are seeking a highly skilled and motivated Cloud Data Engineer with a strong background in computer science or statistics, coupled with at least 5 years of professional experience. The ideal candidate will possess a deep understanding of cloud computing, particularly in AWS, and should have a proven track record in data engineering, Big Data applications, and AI/ML applications.

Responsibilities
Cloud Expertise:
- Proficient in AWS services such as EC2, VPC, Lambda, DynamoDB, API Gateway, EBS, S3, IAM, and more
- Design, implement, and maintain scalable cloud-based solutions
- Execute efficient and secure cloud infrastructure configurations

Data Engineering:
- Develop, construct, test, and maintain architectures such as databases and processing systems
- Utilize coding skills in Spark and Python for data processing and manipulation (a minimal PySpark sketch follows this listing)
- Administer multiple ETL applications to ensure seamless data flow

Big Data Applications:
- Work on end-to-end Big Data application projects, from conception to deployment
- Optimize and troubleshoot Big Data solutions to ensure high performance

AI/ML Applications:
- Experience in developing and deploying AI/ML applications based on NLP, CV, and GenAI
- Collaborate with data scientists to implement machine learning models into production environments

DevOps and Infrastructure as a Service (IaaS):
- Possess knowledge and experience with DevOps applications for continuous integration and deployment
- Set up and maintain infrastructure as a service, ensuring scalability and reliability

Qualifications
- Bachelor's degree in computer science, statistics, or a related field
- 5+ years of professional experience in cloud computing, data engineering, and related fields
- Proven expertise in AWS services, with a focus on EC2, VPC, Lambda, DynamoDB, API Gateway, EBS, S3, IAM, etc.
- Proficient coding skills in Spark and Python for data processing
- Hands-on experience with Big Data application projects
- Experience in AI/ML applications, particularly in NLP, CV, and GenAI
- Administration experience with multiple ETL applications
- Knowledge and experience with DevOps tools and processes
- Ability to set up and maintain infrastructure as a service

Soft Skills
- Strong analytical and problem-solving skills
- Excellent communication and collaboration abilities
- Ability to work effectively in a fast-paced and dynamic team environment
- Proactive mindset with a commitment to continuous learning and improvement
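
To ground the "coding skills in Spark and Python" requirement, here is a minimal PySpark ETL sketch: read raw JSON, apply a couple of transformations, and write partitioned Parquet. The paths and column names are invented for the example.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

# Paths and column names are invented; in a role like this they would
# point at S3 locations wired into a scheduled (e.g., Glue or EMR) job.
raw = spark.read.json("s3a://example-raw/events/2024-01-01/")

cleaned = (
    raw.filter(F.col("event_type").isNotNull())   # drop malformed rows
       .withColumn("event_date", F.to_date("event_ts"))
       .dropDuplicates(["event_id"])              # idempotent re-runs
)

(cleaned.write
    .mode("overwrite")
    .partitionBy("event_date")
    .parquet("s3a://example-curated/events/"))
```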

Posted 3 days ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Mumbai

Work from Office


Our mission is to make meaningful learning a part of your everyday.

The shelf life of our skills is now less than 5 years. So, if you stopped learning today, your skills would soon be irrelevant. Think that's a big problem? You'd be right. Enter HowNow. Founded in 2019, our Learning and Skills Platform is disrupting the way people learn and upskill through technology. Whether it's finding a quick answer, learning skills or tapping into shared knowledge, we make it easy for people to learn what they need, when they need it. Already used by fast-growing scale-ups and global enterprises, such as Trainline, Depop and Sanofi, we're pushing the boundaries of how people learn.

Hi, I'm Naaz, the People Advisor at HowNow. I'm looking for a Senior DevOps Engineer to join us. Joining us as the first DevOps engineer offers a unique opportunity to shape our culture and practices from the ground up. You'll have the autonomy to drive innovation and make a visible impact on the company's growth and success, setting the foundation for our future. As the company grows, so will your opportunities. You'll be in a prime position to evolve into a leadership role, guiding the DevOps function as it becomes central to our operations. This role is perfect for someone eager to make a lasting impact and grow alongside a dynamic company. Alongside the opportunities to develop and grow your career, we're a fun and friendly bunch.

Day-to-day tasks will include
- Designing and managing scalable and highly available cloud infrastructure on AWS, leveraging services such as EC2, RDS, ElastiCache, OpenSearch, Lambda, and ECS, and implementing EC2 Auto Scaling and load balancing to ensure seamless performance
- Developing, refining, and overseeing continuous integration and deployment pipelines, ensuring rapid and reliable code deployment using Jenkins, Terraform, and CloudFormation
- Driving automation initiatives using advanced scripting in Bash, Python, and PowerShell to improve system deployments, upgrades, and day-to-day operations
- Utilizing AWS CloudWatch and custom monitoring tools to track system performance and health; proactively handling incidents, troubleshooting system issues, and performing root cause analysis to prevent future disruptions
- Working closely with engineering and product teams to define and implement infrastructure solutions that support the development of new features and products

The key things that we will be looking for in applicants
- A minimum of 5 years in DevOps or similar roles, with a strong background in software development and system administration
- A bachelor's degree in Computer Science, Engineering, or a related field, or equivalent practical experience
- Proficiency with AWS, including managing databases such as MongoDB and MySQL on RDS; strong experience in building and managing containerized applications using Docker and Kubernetes
- Excellent analytical and troubleshooting skills, with the capability to work under pressure and manage multiple priorities
- Strong interpersonal and communication skills, capable of working in a team environment and interacting with all levels of management

Nice To Have
- Familiarity with the Ruby on Rails ecosystem
- AWS Certification is highly regarded
- Active contributions to open-source projects

What You'll Get
- Our salaries are calculated using a SaaS benchmarking tool (Figures); happy to disclose upon application. You'll also receive a performance-based bonus on top.
- Hybrid working (in our offices 3x a week, Mon-Wed)
- Wind-down Fridays: no meetings from 3pm onwards on Fridays, for you to wind down for the weekend. Our HowNow'ers use this time to exercise, study, or spend some time with their family and friends.
- Enhanced maternity and paternity policies
- 25 days holiday, plus bank holidays and your birthday day off
- An annual £200 learning and development budget
- Dog-friendly offices — we love our pets!
- Monthly socials and team dinners, which have included Bounce, Mystery Rooms, ITC Maratha, JW Marriott and many more
- Access to the very best learning platform out there (HowNow+) to keep you at the top of your game

What's next?
Once you've applied, we'll get back in touch with you, usually within the next 5 working days. Sometimes it can take slightly longer, but we will get back to you regardless of the outcome of your application. You'll be invited to a 30-minute video call with Naaz, our People Operations Coordinator, to discuss your experiences and the role. You'll then be invited to a 60-minute technical interview, followed by a 60-minute video call with Hardik, our Backend Team Lead, and Ashish, our CTO and Co-Founder.

Posted 3 days ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Mangaluru

Work from Office


Job Summary
As a DevOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability.

Technical Skills
- 3+ years of experience in Data Warehousing and BI
- Experience working with the Snowflake database
- In-depth knowledge of data warehouse concepts
- Experience in designing, developing, testing, and implementing ETL solutions using enterprise ETL tools
- Experience with large or partitioned relational databases (Aurora / MySQL / DB2)
- Very strong SQL and data analysis capabilities
- Familiarity with billing and payment data is a plus
- Agile development (Scrum) experience
- Other preferred experience includes DevOps practices, SaaS, IaaS, code management (CodeCommit, Git), deployment tools (CodeBuild, CodeDeploy, Jenkins, shell scripting), and Continuous Delivery
- Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation

Responsibilities
- Be a key contributor to the design and development of a scalable and cost-effective cloud-based data platform based on a data lake design
- Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties
- Build automated pipelines and data services to validate, catalog, aggregate and transform ingested data (an illustrative sketch follows this listing)
- Build automated data delivery pipelines and services to integrate data from the data lake to internal and external consuming applications and services
- Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models
- Keep knowledge and skills current with the latest cloud services, features and best practices

Who We Are
unifyCX is an emerging Global Business Process Outsourcing company with a strong presence in the U.S., Colombia, Dominican Republic, India, Jamaica, Honduras, and the Philippines. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide. In nearly two decades, unifyCX has grown from a small team to a global organization with staff members all over the world dedicated to supporting our international clientele. At unifyCX, we leverage advanced AI technologies to elevate the customer experience (CX) and drive operational efficiency for our clients. Our commitment to innovation positions us as a trusted partner, enabling businesses across industries to meet the evolving demands of a global market with agility and precision. unifyCX is a certified minority-owned business and an EOE employer that welcomes diversity.
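
As an illustration of the "validate, catalog, aggregate and transform ingested data" responsibility, the sketch below shows an S3-triggered Lambda (Python, boto3) that checks newly landed JSON-lines files against a required-field set and quarantines bad records. The schema and prefixes are assumptions, not unifyCX's actual pipeline.

```python
import json

import boto3

s3 = boto3.client("s3")

# Illustrative schema; a real pipeline would load this from a catalog.
REQUIRED_FIELDS = {"invoice_id", "amount", "currency"}

def handler(event, context):
    """S3-triggered Lambda: validate newly landed JSON-lines files and
    copy offending records to a quarantine prefix for inspection."""
    for record in event["Records"]:
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        body = s3.get_object(Bucket=bucket, Key=key)["Body"].read()
        bad_lines = [
            line for line in body.decode("utf-8").splitlines()
            if line and not REQUIRED_FIELDS.issubset(json.loads(line))
        ]
        if bad_lines:
            s3.put_object(
                Bucket=bucket,
                Key=f"quarantine/{key}",
                Body="\n".join(bad_lines).encode("utf-8"),
            )
```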

Posted 3 days ago

Apply

3.0 - 7.0 years

8 - 12 Lacs

Mangaluru

Work from Office


Job Summary
As a DevOps Engineer specializing in data, you will be responsible for implementing and managing our cloud-based data infrastructure using AWS and Snowflake. You will collaborate with data engineers, data scientists, and other stakeholders to design, deploy, and maintain a robust data ecosystem that supports our analytics and business intelligence initiatives. Your expertise in modern data tech stacks, MLOps methodologies, automation, and information security will be crucial in enhancing our data pipelines and ensuring data integrity and availability.

Technical Skills
- 3+ years of experience in Data Warehousing and BI
- Experience working with the Snowflake database
- In-depth knowledge of data warehouse concepts
- Experience in designing, developing, testing, and implementing ETL solutions using enterprise ETL tools
- Experience with large or partitioned relational databases (Aurora / MySQL / DB2)
- Very strong SQL and data analysis capabilities
- Familiarity with billing and payment data is a plus
- Agile development (Scrum) experience
- Other preferred experience includes DevOps practices, SaaS, IaaS, code management (CodeCommit, Git), deployment tools (CodeBuild, CodeDeploy, Jenkins, shell scripting), and Continuous Delivery
- Primary AWS development skills include S3, IAM, Lambda, RDS, Kinesis, API Gateway, Redshift, EMR, Glue, and CloudFormation

Responsibilities
- Be a key contributor to the design and development of a scalable and cost-effective cloud-based data platform based on a data lake design
- Develop data platform components in a cloud environment to ingest data and events from cloud and on-premises environments as well as third parties
- Build automated pipelines and data services to validate, catalog, aggregate and transform ingested data
- Build automated data delivery pipelines and services to integrate data from the data lake to internal and external consuming applications and services
- Build and deliver cloud-based deployment and monitoring capabilities consistent with DevOps models
- Keep knowledge and skills current with the latest cloud services, features and best practices

Who We Are
unifyCX is an emerging Global Business Process Outsourcing company with a strong presence in the U.S., Colombia, Dominican Republic, India, Jamaica, Honduras, and the Philippines. We provide personalized contact centers, business processing, and technology outsourcing solutions to clients worldwide. In nearly two decades, unifyCX has grown from a small team to a global organization with staff members all over the world dedicated to supporting our international clientele. At unifyCX, we leverage advanced AI technologies to elevate the customer experience (CX) and drive operational efficiency for our clients. Our commitment to innovation positions us as a trusted partner, enabling businesses across industries to meet the evolving demands of a global market with agility and precision. unifyCX is a certified minority-owned business and an EOE employer that welcomes diversity.

Posted 3 days ago

Apply

2.0 - 5.0 years

5 - 8 Lacs

Bengaluru

Work from Office


Job Description
Responsibilities:
- Design, implement, and manage cloud infrastructure using AWS services, including EC2, Lambda, API Gateway, Step Functions, EKS clusters, and Glue
- Develop and maintain Infrastructure as Code (IaC) using Terraform to ensure consistent and reproducible deployments
- Set up and optimize CI/CD pipelines using tools such as Azure Pipelines and AWS pipelines to automate software delivery processes
- Containerize applications using Docker and orchestrate them with Kubernetes for efficient deployment and scaling
- Write and maintain Python scripts to automate tasks, improve system efficiency, and integrate various tools and services
- Develop shell scripts for system administration, automation, and troubleshooting
- Implement and manage monitoring and logging solutions to ensure system health and performance
- Collaborate with development teams to improve application deployment processes and reduce time-to-market
- Ensure high availability, scalability, and security of cloud-based systems
- Troubleshoot and resolve infrastructure and application issues in production environments
- Implement and maintain backup and disaster recovery solutions
- Stay up-to-date with emerging technologies and industry best practices in DevOps and cloud computing
- Document processes, configurations, and system architectures for knowledge sharing and compliance purposes
- Mentor junior team members and contribute to the overall growth of the DevOps practice within the organization

Additional Information
At Tietoevry, we believe in the power of diversity, equity, and inclusion. We encourage applicants of all backgrounds, genders (m/f/d), and walks of life to join our team, as we believe that this fosters an inspiring workplace and fuels innovation. Our commitment to openness, trust, and diversity is at the heart of our mission to create digital futures that benefit businesses, societies, and humanity. Diversity, equity and inclusion (tietoevry.com)

Posted 3 days ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Bengaluru

Work from Office


Join our Team

About this opportunity:
We are looking for an experienced Java Developer or Architect with strong technical expertise to design and lead development of scalable, high-performance Java applications. The ideal candidate should have an in-depth understanding of Java/J2EE technologies, design patterns, microservice architecture, Docker and Kubernetes, and integration frameworks. This role requires design skills, excellent problem-solving skills, and the ability to collaborate with cross-functional teams, including DevOps and front-end developers.

What you will do:
- Architect, design, and implement back-end solutions using Java/J2EE, Spring MVC, Spring Boot, and related frameworks
- Design, develop, and maintain scalable Java components using REST- or SOAP-based web services
- Design and develop enterprise solutions with messaging or streaming frameworks like ActiveMQ, HornetQ, and Kafka
- Work with integration frameworks like Apache Camel, JBoss Fuse, Mule ESB, EAI, or Spring Integration
- Make effective use of caching technologies (like Hazelcast, Redis, Infinispan, EHCache, or Memcached) in applications to handle large data volumes
- Deploy the application in a middleware or app server (like JBoss, WebLogic, or Tomcat)
- Collaborate with the DevOps team to manage builds and CI/CD pipelines using Jira, GitLab, Sonar, and other tools

The skills you bring:
- Strong expertise in Java/J2EE, Spring Boot, and microservices
- Good understanding of core Java concepts (like the Collections Framework and object-oriented design)
- Experience working with multithreading concepts (like thread pools, ExecutorService, FutureTask, the concurrency API, and CountDownLatch)
- Detailed working exposure to Java 8 with the Stream API, lambdas, interfaces, and functional interfaces
- Proficiency in Java web application development using Spring MVC and Spring Boot
- Good knowledge of data access frameworks using ORM (Hibernate and JPA)
- Familiarity with database concepts, with knowledge of RDBMS/SQL
- Good understanding of monolithic and microservice architectures

What happens once you apply?
Click here to find all you need to know about what our typical hiring process looks like. We encourage you to consider applying to jobs where you might not meet all the criteria. We recognize that we all have transferable skills, and we can support you with the skills that you need to develop. Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer.

Primary country and city: India (IN) || Kolkata
Job details: Software Developer
Job Stage: Job Stage 4
Primary Recruiter: Avishek Lama
Hiring Manager: Suranjit Dutta

Posted 3 days ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Navi Mumbai

Work from Office


As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git
- Developed Python and PySpark programs for data analysis
- Good working experience with Python to develop custom frameworks for generating rules (much like a rules engine)
- Developed Python code to gather data from HBase and designed solutions implemented using PySpark
- Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive Context objects to perform read/write operations

Preferred technical and professional experience
- Understanding of DevOps
- Experience in building scalable end-to-end data ingestion and processing solutions
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala

Posted 4 days ago

Apply

4.0 - 9.0 years

6 - 10 Lacs

Bengaluru

Work from Office


Designs, develops, and supports application solutions with a focus on the HANA version of Advanced Business Application Programming (ABAP). This specialty may design, develop, and/or re-engineer highly complex application components and integrate software packages, programs, and reusable objects residing on multiple platforms. This specialty may additionally have working knowledge of SAP HANA Technical Concepts and Architecture, Data Modeling using HANA Studio, ABAP Development Tools (ADT), Code Performance Rules and Guidelines for SAP HANA, ADBC, Native SQL, ABAP Core Data Services, Database Procedures, Text Search, ALV on HANA, and HANA Live model consumption.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- 4–12 years of experience required
- ABAP on HANA application developers should possess knowledge of the following topics and apply them to bring value and innovation to client engagements: SAP HANA Technical Concepts and Architecture, Data Modeling using HANA Studio, ABAP Development Tools (ADT), Code Performance Rules and Guidelines for SAP HANA, ADBC, Native SQL, ABAP Core Data Services, Database Procedures, Text Search, ALV on HANA, and HANA Live model consumption
- Designing and developing data dictionary objects (data elements, domains, structures, views, lock objects, search helps) and formatting the output of SAP documents with multiple options
- Modifying standard layout sets in SAP Scripts, Smart Forms, and Adobe Forms
- Development experience in RICEF (Reports, Interfaces, Conversions, Enhancements, Forms)

Preferred technical and professional experience
- Experience working in implementation, upgrade, maintenance, and post-production support projects would be an advantage
- Understanding of SAP functional requirements and their conversion into technical design and development using the ABAP language for Reports, Interfaces, Conversions, Enhancements, and Forms in implementation or support projects

Posted 4 days ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Mumbai

Work from Office


As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows for source-to-target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise
- Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git
- Developed Python and PySpark programs for data analysis
- Good working experience with Python to develop custom frameworks for generating rules (much like a rules engine)
- Developed Python code to gather data from HBase and designed solutions implemented using PySpark
- Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive Context objects to perform read/write operations

Preferred technical and professional experience
- Understanding of DevOps
- Experience in building scalable end-to-end data ingestion and processing solutions
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala

Posted 4 days ago

Apply

8.0 - 13.0 years

25 - 30 Lacs

Chennai

Work from Office


Join our Team Join our Team Job Description We are looking for and experienced Java Developer with strong technical expertise to design and lead development of scalable, high-performance Java / python applications The ideal candidate should have Indepth understanding of Java / python technologies, Design Pattern, Microservice Architecture, Docker &Kubernetes, Integration Framework This role requires design skills, excellent problem-solving skills, and the ability to collaborate with cross-functional teams, including DevOps, and Front-End developers, What You Will Do Design and implement back-end solutions using Java, Spring Boot and related frameworks, Design, develop and maintain scalable Java components using REST Microservices Design & develop enterprise solution with Messaging or Streaming Framework like ActiveMQ, Hornet Q &Kafka Work experience in AWS/GCP/Azure or Ericsson internal ADP Services Work with Integration Framework like Apache Camel/Jboss Fuse/Mule ESB/EAI/Spring Integration, Make effective use of Caching Technologies (like Hazle cast /Redis /Infini span /EH Cache /Memcached) in application to handle large volume of data set, Deploy the application in Middleware or App Server (like Jboss/WebLogic/tomcat) Collaborate with the DevOps team to manage builds and CI/CD pipelines using Jira, GitLab, Sonar and other tools, Communicate effectively with a diverse set of technical audiences to convey complex concepts, You will bring Strong expertise in Java/J2ee technologies Good understanding of Core Java concepts (like Collections Framework, Object Oriented Design) Experience in working with Multithreading Concepts (like Thread Pool, Executor Service, Future Task, Concurrent API, Countdown Latch) Detailed working exposure of Java8 with Stream API, Lambda, Interface, Functional Interfaces Proficiency in Java Web Application Development using Spring MVC & Spring Boot Good Knowledge about using Data Access Frameworks using ORM (Hibernate & JPA) Familiar with Database concepts with knowledge in NoSQL Good understanding of Microservice Architecture Familiarity in working with Design Pattern like Creational, Behavioral, Structural & Dependency Injection(Spring IoC) Proficiency in exposing & consuming using REST based Web Services Expertise in Messaging or Streaming Framework (any one like Active MQ/Hornet Q/Kafka) Knowledge about Caching Technologies (any one like Hazle cast/Redis/Infini span/EH Cache/Mem Cache) Working experience of Integration Framework (any one like Apache Camel/Jboss Fuse/Mule ESB/EAI/Spring Integration) Hands on experience in any Middleware or App Server (any one like Jboss/tomcat) Knowledge of Java Enterprise Technologies (like Filters, Interceptor) Familiar with JEE Security (like Encryption/Decryption, Spring Security, SSL/TLS) Understanding of high-performance real time and distributed transactional processing Experience in Unit Testing (like Junit/nUnit/Mockito) and code coverage (like JaCoCo/Any Other) Knowledge of Cloud Technologies (like Docker & Kubernetes) Familiar with DevOps Tools like Git, GitLab, bitbucket, SVN etc, Static Code review with SonarQube & Agile process/tools like Jira, Good to have Telecom Domain Knowledge Good to have knowledge in Distributed log management (ELK, Splunk) Good to have scripting proficiency in Unix/Linux platform Good to have understanding on Security (Vulnerability/Privacy/Hardening etc) and their tools, Why join Ericsson At Ericsson, you?ll have an outstanding opportunity The chance to use your skills and imagination to 
push the boundaries of what's possible, to build solutions never seen before to some of the world's toughest problems. You'll be challenged, but you won't be alone. You'll be joining a team of diverse innovators, all driven to go beyond the status quo to craft what comes next.

What happens once you apply
Click Here to find all you need to know about what our typical hiring process looks like.

Encouraging a diverse and inclusive organization is core to our values at Ericsson; that's why we champion it in everything we do. We truly believe that by collaborating with people with different experiences we drive innovation, which is essential for our future growth. We encourage people from all backgrounds to apply and realize their full potential as part of our Ericsson team. Ericsson is proud to be an Equal Opportunity Employer.

Primary country and city: India (IN) || Chennai
Req ID: 765318
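For candidates brushing up on the Java 8 items above, here is a minimal, Java 8-compatible sketch of lambdas with the Stream API; the service names are invented for illustration:

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class StreamLambdaDemo {
    public static void main(String[] args) {
        List<String> services = Arrays.asList("billing", "auth", "billing", "catalog");

        // A lambda passed to filter(), and Collectors.groupingBy with a
        // lambda classifier, counting occurrences of each service name.
        Map<String, Long> counts = services.stream()
                .filter(name -> name.length() > 4)   // drops "auth"
                .collect(Collectors.groupingBy(s -> s, Collectors.counting()));

        System.out.println(counts); // e.g. {billing=2, catalog=1} (map order may vary)
    }
}
```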

Posted 4 days ago

Apply

8.0 - 13.0 years

13 - 18 Lacs

Bengaluru

Work from Office


We are seeking a Senior Snowflake Developer/Architect who will be responsible for designing, developing, and maintaining scalable data solutions that effectively meet the needs of our organization. The role serves as the primary point of accountability for the technical implementation of the data flows, repositories, and data-centric solutions in your area, translating requirements into efficient implementations. The data repositories, data flows, and data-centric solutions you create will support a wide range of reporting, analytics, decision support, and (generative) AI solutions.

Your Role: Implement and manage data modelling techniques, including OLTP, OLAP, and Data Vault 2.0 methodologies. Write optimized SQL queries for data extraction, transformation, and loading. Utilize Python for advanced data processing, automation tasks, and system integration. Act as an advisor with your in-depth knowledge of Snowflake architecture, features, and best practices. Develop and maintain complex data pipelines and ETL processes in Snowflake. Collaborate with data architects, analysts, and stakeholders to design optimal and scalable data solutions. Automate DBT jobs and build CI/CD pipelines using Azure DevOps for seamless deployment of data solutions. Ensure data quality, integrity, and compliance throughout the data lifecycle. Troubleshoot, optimize, and enhance existing data processes and queries for performance improvements. Document data models, processes, and workflows clearly for future reference and knowledge sharing. Build data tests, unit tests, and mock-data frameworks.

Who You Are: Bachelor's or master's degree in computer science, mathematics, or related fields. At least 8 years of experience as a data warehouse expert, data engineer, or data integration specialist. In-depth knowledge of Snowflake components, including security and governance. Proven experience in implementing complex data models (e.g., OLTP, OLAP, Data Vault). A strong understanding of ETL, including end-to-end data flows, from ingestion to data modeling and solution delivery. Proven industry experience with DBT and Jinja scripts. Strong proficiency in SQL, with additional knowledge of Python (i.e., pandas and PySpark) being advantageous. Familiarity with data & analytics solutions such as AWS (especially Glue, Lambda, DMS) is nice to have. Experience working with Azure DevOps and warehouse automation tools (e.g., Coalesce) is a plus. Experience with healthcare R&D is a plus. Excellent English communication skills, with the ability to effectively engage both with R&D scientists and software engineers. Experience working in virtual and agile teams.

Posted 4 days ago

Apply

8.0 - 13.0 years

15 - 30 Lacs

Pune

Work from Office


Job Description
SecurityHQ is a global cybersecurity company. Our specialist teams design, engineer, and manage systems that promote clarity and an inclusive culture of trust, build momentum around improving security posture, and increase the value of cybersecurity investment. Around the clock, 365 days per year, our customers are never alone. We're SecurityHQ. We're focused on engineering cybersecurity, by design.

Responsibilities
Lead response to complex, high-impact security incidents in AWS, including unauthorized access, data breaches, malware infections, DDoS attacks, phishing, APTs, zero-day exploits, and cloud misconfigurations. Perform in-depth analysis of security incidents, including advanced log analysis, digital forensic investigation, and root cause analysis. Develop and implement containment, eradication, and recovery plans for complex security incidents, minimizing disruption and improving security posture. Coordinate with internal and external stakeholders during incident response activities. Document incident details, analysis findings, and remediation actions, including detailed forensic reports and security posture assessments. Identify and recommend security improvements to prevent future incidents and enhance cloud security posture, including: AWS security best practices; security tool implementation and configuration (with a focus on CSPM tools); vulnerability management; security awareness training; threat hunting strategies; security architecture enhancements; and CSPM implementation and optimization. Develop and maintain AWS-specific incident response plans, playbooks, and procedures, emphasizing automation, orchestration, and continuous security posture improvement. Stay current on cloud security, digital forensics, and cloud security posture management. Mentor junior security analysts in incident response and security posture management. Participate in on-call rotation, providing expert-level support and guidance on security posture. Develop and deliver training on incident response, forensic best practices, and cloud security posture management. Conduct proactive threat hunting and security posture assessments. Contribute to the development of security tools and automation to improve incident response efficiency, effectiveness, and security posture.

Essential Skills
Expert-level understanding of AWS services, including EC2, S3, RDS, VPC, Lambda; CloudTrail, CloudWatch, Config, Security Hub, GuardDuty; IAM, KMS; AWS Organizations and AWS Control Tower. Extensive experience with SIEM systems (e.g., Datadog, QRadar, Azure Sentinel) in a cloud environment, with a focus on security posture monitoring. Mastery of log analysis, network analysis, and digital forensic investigation techniques, including experience with specialized forensic tools (e.g., EnCase, FTK, Autopsy, Velociraptor) and CSPM tools. Strong experience with scripting (e.g., Python, PowerShell) for automation, analysis, tool development, and security posture management. Deep familiarity with security tools and technologies, including IDS/IPS, EDR, vulnerability scanners, firewalls, network forensics tools, and CSPM tools. Excellent communication and interpersonal skills, with the ability to convey highly technical information to technical and non-technical audiences, including executive leadership and legal counsel, regarding incident response and security posture.
Exceptional problem-solving and analytical skills; ability to remain calm, focused, and decisive under high-pressure situations, including those involving significant security posture deficiencies. Ability to work independently, lead a team, and collaborate effectively to improve the organization's security posture.

Education Requirements & Experience
Master's degree in Computer Science, Cybersecurity, or a related field. AWS security certifications (e.g., AWS Certified Security - Specialty). Relevant security certifications (e.g., CISSP, GCIH, GCIA, GREM, GNFA, OSCP). Experience leading incident response teams and security posture improvement initiatives. Experience with cloud automation and orchestration (e.g., AWS Systems Manager, Lambda) for incident response and security posture management. Knowledge of DevSecOps principles and practices, including security integration into CI/CD pipelines and infrastructure as code (IaC) security. Experience with container security (e.g., Docker, Kubernetes) in AWS, including forensic analysis and security posture assessment. Experience with reverse engineering and malware analysis, focused on identifying threats that impact cloud security posture. Strong understanding of legal and regulatory issues related to digital forensics, incident response, and cloud security posture (e.g., data privacy, chain of custody, compliance requirements).

Posted 4 days ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office


About The Role
Data Engineer - 1 (Experience: 0-2 years)

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX is the central data org for Kotak Bank, managing the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable, AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and deliver one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be 100+ members, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and be futuristic, building systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; managing a central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building a customer feature repository; building cost-optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, be skilled in sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.
Data Governance
The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include
Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
Bachelor's degree in Computer Science, Engineering, or a related field. Experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.

Posted 5 days ago

Apply

9.0 - 14.0 years

30 - 35 Lacs

Bengaluru

Work from Office


About The Role
Data Engineer - 2 (Experience: 2-5 years)

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX is the central data org for Kotak Bank, managing the entire data experience of Kotak Bank. DEX stands for Kotak's Data Exchange. This org comprises the Data Platform, Data Engineering, and Data Governance charters, and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable, AWS cloud-based platform. The team is being built from the ground up, which provides great opportunities for technology fellows to build things from scratch and deliver one of the best-in-class data lakehouse solutions. The primary skills this team should encompass are software development skills (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org size is expected to be 100+ members, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member in the digital transformation journey of Kotak; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and be futuristic, building systems that can be operated by machines using AI technologies.

The data platform org is divided into 3 key verticals:

Data Platform
This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including concepts of serverless data solutions; managing a central data warehouse for extremely high-concurrency use cases; building connectors for different sources; building a customer feature repository; building cost-optimization solutions like EMR optimizers; performing automations; and building observability capabilities for Kotak's data platform. The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, be skilled in sourcing data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, which cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.
Data Governance
The team will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you.

Your day-to-day role will include
Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data
Bachelor's degree in Computer Science, Engineering, or a related field. Experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.

Posted 5 days ago

Apply

10.0 - 15.0 years

15 - 30 Lacs

Pune, Chennai, Bengaluru

Hybrid


Role & Responsibilities
Design, develop, and maintain backend services using Python and AWS serverless technologies. Implement event-driven architectures to ensure efficient and scalable solutions. Utilize Terraform for infrastructure as code to manage and provision AWS resources. Configure and manage AWS networking components to ensure secure and reliable communication between services. Develop and maintain serverless applications using AWS Lambda functions, DynamoDB, and other AWS services. Collaborate with cross-functional teams to define, design, and ship new features. Write clean, maintainable, and efficient code while following best practices. Troubleshoot and resolve issues in a timely manner. Stay up to date with the latest industry trends and technologies to ensure our solutions remain cutting-edge.

Required Qualifications: 9 to 15 years of experience in backend development with a strong focus on Python. Proven experience with AWS serverless technologies, including Lambda, DynamoDB, and other related services. Strong understanding of event-driven architecture and its implementation. Hands-on experience with Terraform for infrastructure as code. In-depth knowledge of AWS networking components and best practices. Excellent problem-solving skills and attention to detail. Strong communication and collaboration skills.

Preferred Qualifications: AWS certifications (e.g., AWS Certified Developer, AWS Certified Solutions Architect). Experience with other programming languages and frameworks. Familiarity with CI/CD pipelines and DevOps practices.
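The posting itself calls for Python, but the Lambda handler contract looks similar across runtimes. As a rough sketch only, here it is in Java (keeping one language across this page's examples), using the official aws-lambda-java-core interface; the class name and event shape are invented for illustration:

```java
// Requires the com.amazonaws:aws-lambda-java-core dependency.
import com.amazonaws.services.lambda.runtime.Context;
import com.amazonaws.services.lambda.runtime.RequestHandler;

import java.util.Map;

// Hypothetical handler for an event-driven flow; real payloads depend
// on the trigger (SQS, EventBridge, API Gateway, etc.).
public class OrderEventHandler implements RequestHandler<Map<String, Object>, String> {
    @Override
    public String handleRequest(Map<String, Object> event, Context context) {
        Object orderId = event.getOrDefault("orderId", "unknown");
        context.getLogger().log("Processing order " + orderId);
        // A real handler might persist to DynamoDB or emit further events here.
        return "processed:" + orderId;
    }
}
```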

Posted 5 days ago

Apply

5.0 - 10.0 years

30 - 40 Lacs

Pune, Ahmedabad

Work from Office


We are seeking an experienced Sr. Java Developer with expertise in the Java Spring and Spring Boot frameworks, REST APIs, and cloud. The ideal candidate will have 6+ years of hands-on experience in developing scalable and robust applications, and experience with any cloud services (AWS/Azure/GCP).

Job Title: Sr. Java Developer
Location: Ahmedabad/Pune
Experience: 5+ Years
Educational Qualification: UG: BS/MS in Computer Science, or other engineering/technical degree

Key Responsibilities: Responsible for the complete software development life cycle, including requirement analysis, design, development, deployment, and support. Responsible for developing software products for agentic AI security. Write clean, testable, readable, scalable, and maintainable Java code. Design, develop, and implement highly scalable software features and infrastructure on our security platform, ready for cloud-native deployment, from inception to completion. Participate actively and contribute to design and development discussions. Develop a solid understanding of, and be able to explain, advanced cloud computing and cloud security concepts to others. Work cross-functionally with Product Management, SRE, Software, and Quality Engineering teams to deliver new security-as-a-service offerings to the market in a timely fashion with excellent quality. Be able to clearly communicate goals and desired outcomes to internal project teams. Work closely with customer support teams to improve end-customer outcomes.

Required Skills: Strong programming skills in Java, with experience in building distributed systems. 6+ years of experience in software engineering, with a focus on cloud-native application development, at large organizations or innovative startups. 3+ years of experience and a deep understanding of building connectors for low-code/no-code and agentic AI platforms like Microsoft Copilot Studio, Microsoft Power Platform, Salesforce Agentforce, Zapier, CrewAI, Marketo, etc. 5+ years of experience building connectors for SaaS applications like Microsoft O365, Power Apps, Salesforce, ServiceNow, etc. Preferred experience with security products (data and DLP, CASB security, SASE) and integration with third-party APIs and services. 5+ years of experience with running workloads on cloud-based architectures (AWS/GCP experience preferred). 5+ years of experience with cloud technologies like Elasticsearch, Redis, Kafka, MongoDB, and Spring Boot. Experience with Docker and Kubernetes or other container orchestration platforms. Excellent troubleshooting abilities; isolate issues found during testing and verify bug fixes once they are resolved. Experience with backend development (REST APIs, databases, and serverless computing) of distributed cloud applications. Experience with building and delivering services and workflows at scale, leveraging microservices architectures. Experience with the agile process and working with software development teams involved with building out full-stack products deployed on the cloud at scale. Good understanding of public cloud design considerations and limitations in areas of microservice architectures, security, global network infrastructure, distributed systems, and load balancing. Strong understanding of the principles of DevOps and continuous delivery. A can-do attitude and the ability to make trade-off judgements with data-driven decision-making. High energy and the ability to work in a fast-paced environment. Enjoys working with many different teams, with strong collaboration and communication skills.
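As a small, dependency-free sketch of the kind of asynchronous, lambda-driven composition distributed Java services like this rely on (all names invented for the example, not taken from the posting):

```java
import java.util.concurrent.CompletableFuture;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

public class AsyncPipelineDemo {
    public static void main(String[] args) {
        ExecutorService pool = Executors.newFixedThreadPool(2);

        // Each stage is a lambda: fetch, transform, then recover on failure.
        CompletableFuture<String> result = CompletableFuture
                .supplyAsync(() -> "raw-payload", pool)       // stand-in for a REST or Kafka fetch
                .thenApply(payload -> payload.toUpperCase())  // transformation stage
                .exceptionally(ex -> "fallback");             // error-recovery stage

        System.out.println(result.join()); // RAW-PAYLOAD
        pool.shutdown();
    }
}
```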

Posted 5 days ago

Apply

Exploring Lambda Expressions Jobs in India

Lambda expressions have become increasingly popular in the tech industry, with many companies in India actively seeking professionals with this skill set. Job seekers with expertise in lambda expressions can find promising opportunities in various sectors across the country.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The salary range for lambda expressions professionals in India varies based on experience level. Entry-level positions can expect to earn between INR 4-6 lakhs per annum, while experienced professionals can command salaries ranging from INR 10-15 lakhs per annum.

Career Path

In the lambda expressions field, a typical career path may involve starting as a Junior Developer, progressing to a Senior Developer, and eventually moving up to a Tech Lead position. With continuous learning and skill development, professionals can advance their careers in this domain.

Related Skills

Apart from lambda expressions, professionals in this field are often expected to have knowledge of the following related skills (a short composition sketch follows this list):

  • Java programming
  • Functional programming concepts
  • Spring framework
  • Distributed systems
  • Problem-solving abilities
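As a minimal illustration of the functional-programming side, here is how the standard java.util.function types compose with lambdas (the functions themselves are arbitrary examples):

```java
import java.util.function.Function;
import java.util.function.Predicate;

public class CompositionDemo {
    public static void main(String[] args) {
        Function<Integer, Integer> doubler = n -> n * 2;
        Function<Integer, Integer> addTen = n -> n + 10;

        // compose() applies its argument first; andThen() applies it after.
        System.out.println(doubler.andThen(addTen).apply(5));  // (5*2)+10 = 20
        System.out.println(doubler.compose(addTen).apply(5));  // (5+10)*2 = 30

        // Predicates combine the same way with and()/or()/negate().
        Predicate<String> nonEmpty = s -> !s.isEmpty();
        Predicate<String> shortish = s -> s.length() <= 8;
        System.out.println(nonEmpty.and(shortish).test("lambda")); // true
    }
}
```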

Interview Questions
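To make several of the basics below concrete, a short runnable Java sketch follows the list.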

  • What are lambda expressions in Java? (basic)
  • How are lambda expressions different from anonymous classes? (medium)
  • Can lambda expressions be used to replace existing functional interfaces in Java? (medium)
  • What is the syntax for lambda expressions in Java? (basic)
  • Explain the concept of functional interfaces in Java. (basic)
  • How do you capture variables in lambda expressions? (medium)
  • What are the benefits of using lambda expressions in Java? (basic)
  • How do you use lambda expressions to iterate over a list in Java? (medium)
  • Can lambda expressions have return statements? (medium)
  • How do you use method references in Java? (advanced)
  • Explain the target typing of lambda expressions. (advanced)
  • How do you use lambda expressions in streams? (medium)
  • What is the purpose of the @FunctionalInterface annotation in Java? (basic)
  • Compare and contrast lambda expressions and streams in Java. (medium)
  • What is the role of the Predicate interface in lambda expressions? (medium)
  • Explain the concept of capturing "effectively final" variables in lambda expressions. (advanced)
  • How do you handle exceptions in lambda expressions? (advanced)
  • What are the limitations of using lambda expressions in Java? (medium)
  • Can lambda expressions access non-final local variables? (medium)
  • How do you compose functions using lambda expressions in Java? (advanced)
  • What is a method reference and how is it related to lambda expressions? (medium)
  • How do you convert lambda expressions to method references? (advanced)
  • What is type inference in lambda expressions? (medium)
  • How do you use lambda expressions to implement the Comparator interface in Java? (medium)
  • Explain the concept of higher-order functions in the context of lambda expressions. (advanced)
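Here is a compact sketch touching several of the questions above: lambda syntax versus an anonymous class, a custom functional interface, capture of effectively final variables, Comparator lambdas, and method references. All names are invented for illustration.

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

public class InterviewWarmup {

    // A functional interface has exactly one abstract method; the
    // annotation asks the compiler to enforce that.
    @FunctionalInterface
    interface Greeter {
        String greet(String name);
    }

    public static void main(String[] args) {
        // The same contract implemented as an anonymous class and as a lambda.
        Greeter verbose = new Greeter() {
            @Override
            public String greet(String name) { return "Hello, " + name; }
        };
        Greeter concise = name -> "Hello, " + name;
        System.out.println(verbose.greet("anonymous class"));
        System.out.println(concise.greet("lambda"));

        // Captured locals must be effectively final: reassigning
        // prefix after this point would be a compile error.
        String prefix = "Hi, ";
        Greeter capturing = name -> prefix + name;
        System.out.println(capturing.greet("capture"));

        // Iterating with a lambda, sorting with a Comparator lambda,
        // then the equivalent built-in and method-reference forms.
        List<String> names = Arrays.asList("Priya", "Arun", "Kavya");
        names.forEach(n -> System.out.println(n));
        names.sort((a, b) -> a.compareTo(b));
        names.sort(Comparator.naturalOrder()); // same ordering, built in
        names.forEach(System.out::println);    // method reference
    }
}
```

Rehearsing variants of this snippet, for example swapping the Comparator lambda for Comparator.comparing(String::length), is a quick way to prepare for the medium-level questions.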

Closing Remark

As you prepare for lambda expressions job opportunities in India, remember to showcase your expertise and skills confidently during interviews. Keep learning and honing your abilities to stay competitive in this dynamic field. Best of luck in your job search!


Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.
