
922 AWS Lambda Jobs - Page 2

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

1.0 - 6.0 years

9 - 13 Lacs

Bengaluru

Work from Office

We are seeking a highly skilled AI Engineer with a strong foundation in AWS application development to join our growing team. This role is ideal for someone who thrives in a fast-paced environment and is passionate about building scalable, high-performance applications using modern cloud-native technologies.

Key Responsibilities:
- Design, develop, and deploy serverless applications using AWS services such as Lambda, API Gateway, Route 53, and optionally Fargate.
- Build and maintain APIs using FastAPI for high-performance, scalable applications.
- Collaborate with cross-functional teams to integrate AI/ML models and LLMs into production systems.
- Analyze system performance, identify latency bottlenecks, and implement improvements to enhance scalability and reliability.
- Monitor and troubleshoot network infrastructure issues, including timeouts and connectivity problems.

You'd describe yourself as having:
- Hands-on experience with AWS, particularly AWS Lambda, API Gateway, and Route 53 (Fargate is a plus).
- Proficiency in FastAPI for building RESTful APIs.
- A strong understanding of cloud-native architecture and microservices.

Preferred/Bonus Skills:
- Experience with network performance analysis, latency optimization, and scalability improvements.
- Familiarity with AI/ML models, especially Large Language Models (LLMs) and their deployment in production environments.
- Knowledge of containerization and orchestration tools (e.g., Docker, ECS, Kubernetes).

This role is based in Bangalore, but you'll also get to visit other locations in India and around the globe, so you'll need to go where this journey takes you. In return, you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come.
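Purely as an illustration of the stack this listing names (FastAPI behind API Gateway and Lambda), here is a minimal sketch using the Mangum adapter; the service, route, and model names are invented, not taken from the employer.

```python
# Minimal FastAPI app deployable to AWS Lambda behind API Gateway.
# Requires: pip install fastapi mangum
from fastapi import FastAPI
from pydantic import BaseModel
from mangum import Mangum

app = FastAPI(title="inference-api")  # hypothetical service name

class Query(BaseModel):
    text: str

@app.get("/health")
def health() -> dict:
    return {"status": "ok"}

@app.post("/predict")
def predict(query: Query) -> dict:
    # Placeholder for a real model/LLM call.
    return {"echo": query.text, "score": 1.0}

# Mangum translates API Gateway events into ASGI requests,
# so this one object serves as the Lambda handler.
handler = Mangum(app)
```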

Posted 1 day ago

Apply

2.0 - 6.0 years

9 - 14 Lacs

Bengaluru

Work from Office

We are seeking a highly skilled AI Engineer with a strong foundation in AWS application development to join our growing team. This role is ideal for someone who thrives in a fast-paced environment and is passionate about building scalable, high-performance applications using modern cloud-native technologies.

Key Responsibilities:
- Design, develop, and deploy serverless applications using AWS services such as Lambda, API Gateway, Route 53, and optionally Fargate.
- Build and maintain APIs using FastAPI for high-performance, scalable applications.
- Collaborate with cross-functional teams to integrate AI/ML models and LLMs into production systems.
- Analyze system performance, identify latency bottlenecks, and implement improvements to enhance scalability and reliability.
- Monitor and troubleshoot network infrastructure issues, including timeouts and connectivity problems.

You'd describe yourself as having:
- Hands-on experience with AWS, particularly AWS Lambda, API Gateway, and Route 53 (Fargate is a plus).
- Proficiency in FastAPI for building RESTful APIs.
- A strong understanding of cloud-native architecture and microservices.

Preferred/Bonus Skills:
- Experience with network performance analysis, latency optimization, and scalability improvements.
- Familiarity with AI/ML models, especially Large Language Models (LLMs) and their deployment in production environments.
- Knowledge of containerization and orchestration tools (e.g., Docker, ECS, Kubernetes).

This role is based in Bangalore, but you'll also get to visit other locations in India and around the globe, so you'll need to go where this journey takes you. In return, you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come.

Posted 1 day ago

Apply

8.0 - 12.0 years

15 - 20 Lacs

Pune

Work from Office

We are seeking a dynamic and experienced Tech Lead with a strong foundation in Java and Apache Spark to join our team. In this role, you will lead the development and deployment of scalable cloud-based data solutions, leveraging your expertise in AWS and big data technologies.

Key Responsibilities:
- Lead the design, development, and deployment of scalable and reliable data processing solutions on AWS using Java and Spark.
- Architect and implement big data processing pipelines using Apache Spark on AWS EMR.
- Develop and deploy serverless applications using AWS Lambda, integrating with other AWS services.
- Utilize Amazon EKS for container orchestration and microservices management.
- Design and implement workflow orchestration using Apache Airflow for complex data pipelines.
- Collaborate with cross-functional teams to define project requirements and ensure seamless integration of services.
- Mentor and guide team members in Java development best practices, cloud architecture, and data engineering.
- Monitor and optimize the performance and cost of deployed solutions across AWS infrastructure.
- Stay current with emerging technologies and industry trends to drive innovation and maintain a competitive edge.

Required Skills:
- Strong hands-on experience in Java development.
- Proficiency in Apache Spark for distributed data processing.
- Experience with AWS services including EMR, Lambda, EKS, and Airflow.
- Solid understanding of serverless architecture and microservices.
- Proven leadership and mentoring capabilities.
- Excellent problem-solving and communication skills.

Posted 1 day ago

Apply

2.0 - 4.0 years

25 - 35 Lacs

Bengaluru

Work from Office

Technologies: Amazon Bedrock, RAG models, Java, Python, C/C++, AWS Lambda

Responsibilities:
- Develop, deploy, and maintain a Retrieval-Augmented Generation (RAG) model in Amazon Bedrock, our cloud-based platform for building and scaling generative AI applications.
- Design and implement a RAG model that can generate natural language responses, commands, and actions based on user queries and context, using the Anthropic Claude model as the backbone.
- Integrate the RAG model with Amazon Bedrock, our platform that offers a choice of high-performing foundation models from leading AI companies and Amazon via a single API, along with a broad set of capabilities to build generative AI applications with security, privacy, and responsible AI.
- Optimize the RAG model for performance, scalability, and reliability, using best practices and robust engineering methodologies.
- Design, test, and optimize prompts to improve performance, accuracy, and alignment of large language models across diverse use cases.
- Develop and maintain reusable prompt templates, chains, and libraries to support scalable and consistent GenAI applications.

Skills/Qualifications:
- Experience programming in at least one language, such as Java, Python, or C/C++.
- Experience working with generative AI tools, models, and frameworks, such as Anthropic, OpenAI, Hugging Face, TensorFlow, PyTorch, or Jupyter.
- Experience working with RAG models or similar architectures and supporting tools, such as Ragna or Pinecone.
- Experience working with Amazon Bedrock or similar platforms, such as AWS Lambda, Amazon SageMaker, or Amazon Comprehend.
- Ability to design, iterate, and optimize prompts for various LLM use cases (e.g., summarization, classification, translation, Q&A, and agent workflows).
- Deep understanding of prompt engineering techniques (zero-shot, few-shot, chain-of-thought, etc.) and their effect on model behavior.
- Familiarity with prompt evaluation strategies, including manual review, automatic metrics, and A/B testing frameworks.
- Experience building prompt libraries, reusable templates, and structured prompt workflows for scalable GenAI applications.
- Ability to debug and refine prompts to improve accuracy, safety, and alignment with business objectives.
- Awareness of prompt injection risks and experience implementing mitigation strategies.
- Familiarity with prompt tuning, parameter-efficient fine-tuning (PEFT), and prompt chaining methods.
- Familiarity with continuous deployment and DevOps tools preferred; experience with Git preferred.
- Experience working in agile/scrum environments.
- Successful track record of interfacing and communicating effectively across cross-functional teams.
- Good communication, analytical, and presentation skills; strong problem-solving skills and a learning attitude.
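As a rough illustration of the Bedrock-plus-Claude pattern this listing describes, the sketch below stuffs retrieved passages into a prompt and calls a Claude model through the Bedrock runtime API with boto3. The retriever is a stand-in stub, and the region and model ID are assumptions, not details from the listing.

```python
# Minimal RAG-style call to Anthropic Claude on Amazon Bedrock.
# Requires: pip install boto3 (plus AWS credentials with Bedrock access)
import json
import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")  # assumed region

def retrieve(query: str) -> list[str]:
    # Stand-in for a real retriever (e.g., a vector-store lookup).
    return ["Doc snippet 1 ...", "Doc snippet 2 ..."]

def answer(query: str) -> str:
    context = "\n\n".join(retrieve(query))
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{
            "role": "user",
            "content": f"Use only this context to answer.\n\n{context}\n\nQuestion: {query}",
        }],
    }
    resp = bedrock.invoke_model(
        modelId="anthropic.claude-3-sonnet-20240229-v1:0",  # assumed model ID
        contentType="application/json",
        body=json.dumps(body),
    )
    return json.loads(resp["body"].read())["content"][0]["text"]

print(answer("What does the document say about latency?"))
```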

Posted 1 day ago

Apply

7.0 - 12.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Experience: 7+ Years | Location: Hyderabad | Notice Period: Immediate to 30 Days

Required Skills & Qualifications:
- 3+ years of backend development experience.
- 2+ years of working with AWS services, particularly in a serverless environment.
- Proficiency in Node.js, Python, or Java for backend development.
- Solid understanding of microservices design principles and API development.
- Hands-on experience working with SQL and NoSQL databases.
- Experience with AWS Lambda, API Gateway, DynamoDB, S3, SQS, SNS, and CloudWatch.
- Familiarity with CI/CD pipelines and Infrastructure as Code (IaC) using CDK, SAM, or Terraform.
- Knowledge of IoT data and related technologies is a plus.
- Experience with Docker and container orchestration (e.g., ECS, EKS) is a plus.
- Strong understanding of REST, JSON, and asynchronous messaging patterns.
- Knowledge of security best practices in cloud-based environments.
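For context on the serverless trio this role lists (Lambda, API Gateway, DynamoDB), here is a minimal Python handler sketch; the table name and payload shape are hypothetical.

```python
# Minimal AWS Lambda handler for an API Gateway POST, persisting to DynamoDB.
# boto3 is bundled in the Lambda Python runtime.
import json
import uuid
import boto3

table = boto3.resource("dynamodb").Table("orders")  # hypothetical table

def handler(event, context):
    item = json.loads(event["body"])   # API Gateway proxy event body
    item["id"] = str(uuid.uuid4())     # partition key
    table.put_item(Item=item)
    return {
        "statusCode": 201,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"id": item["id"]}),
    }
```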

Posted 1 day ago

Apply

6.0 - 10.0 years

14 - 22 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

We seek a senior-level AWS Data Engineer who shares our passion for innovation and change. This role is critical to helping our business partners evolve and adapt to consumers' personalized expectations in this new technological era. We are looking for a highly skilled and motivated Data Engineer to join our dynamic team. The ideal candidate will have a strong background in designing, developing, and managing data pipelines, working with cloud technologies, and optimizing data workflows. You will play a key role in supporting our data-driven initiatives and ensuring the seamless integration and analysis of large datasets.

What will help you succeed:
- Fluent English.
- Python, PySpark, SparkSQL, and SQL.
- AWS data services, including S3, S3 Tables, Glue, EMR, EC2, Athena, Redshift, Step Functions, and Lambda functions.
- Design scalable data models: develop and maintain conceptual, logical, and physical data models for structured and semi-structured data in AWS environments.
- Optimize data pipelines: work closely with data engineers to align data models with AWS-native data pipeline design and ETL best practices.
- AWS cloud data services: design and implement data solutions leveraging AWS Redshift, Athena, Glue, S3, Lake Formation, and AWS-native ETL workflows.
- Design, develop, and maintain scalable data pipelines and ETL processes using AWS services (Glue, Lambda, Redshift).
- Write efficient, reusable, and maintainable Python and PySpark scripts for data processing and transformation.
- Optimize SQL queries for performance and scalability; expertise in writing and tuning complex SQL queries.
- Monitor, troubleshoot, and improve data pipelines for reliability and performance.
- Focus on ETL automation using Python and PySpark: design, build, and maintain efficient data pipelines, ensuring data quality and integrity for various applications.

This job can be filled in Pune, Bangalore, or Hyderabad.
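As a minimal sketch of the kind of PySpark-on-AWS ETL step this listing describes (bucket paths and column names are made up for illustration):

```python
# Minimal PySpark ETL step: read raw CSV from S3, clean, write partitioned Parquet.
# Requires: pip install pyspark (or run on EMR/Glue where Spark is provided)
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-etl").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-bucket/raw/orders/")  # hypothetical path

cleaned = (
    raw.dropDuplicates(["order_id"])                       # hypothetical key column
       .withColumn("amount", F.col("amount").cast("double"))
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

(cleaned.write.mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-bucket/curated/orders/"))
```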

Posted 2 days ago

Apply

4.0 - 6.0 years

6 - 14 Lacs

Coimbatore

Work from Office

Hiring a Full Stack Engineer (React, React Native, Python, AWS) with 4-6 years' experience for an on-site role at Sense7AI. Work with offshore clients; flexible IST/EST hours. Strong API and AWS serverless skills needed. Immediate joiners preferred. Contact: hr@sense7ai.com. Benefits: health insurance, flexible working, cafeteria, work from home.

Posted 2 days ago

Apply

2.0 - 5.0 years

5 - 12 Lacs

Chennai

Work from Office

We're looking for a hands-on AWS Engineer to join our growing engineering team. In this role, you'll work on event-driven architectures using AWS serverless and container services to build scalable, real-time customer interaction platforms using Amazon Chime and Amazon Connect. You'll collaborate closely with product, design, and infrastructure teams to deliver end-to-end solutions with both backend and frontend responsibilities.

Responsibilities:
- Design and implement event-driven services using AWS Lambda, Step Functions, and ECS Fargate.
- Build features on top of Amazon Connect and Amazon Chime.
- Develop backend APIs and services using Node.js or Python.
- Build and maintain frontend applications in React or Angular, including chat or video integration components.
- Manage relational and NoSQL data models using Amazon RDS and DynamoDB.
- Collaborate with other engineers and stakeholders to design and ship features that solve real-world problems.
- Ensure code quality through testing, reviews, and observability best practices.

Required Skills:
- Strong frontend development skills in React or Angular.
- Solid experience in event-driven architecture using AWS Lambda, Step Functions, and ECS Fargate (or similar container platforms).
- Hands-on experience with both Amazon RDS (PostgreSQL or MySQL preferred) and Amazon DynamoDB.
- Proficiency in at least one programming language, preferably Node.js or Python.
- Experience with Amazon Connect, Amazon Chime, or similar services.
- Comfortable working with AWS IAM roles, policies, and security best practices.

Nice to Have:
- Infrastructure-as-code experience with CDK, CloudFormation, or Terraform.
- Exposure to real-time communication protocols (WebRTC).
- Familiarity with monitoring and tracing tools (CloudWatch, X-Ray).
- Experience with CI/CD pipelines and automated testing.

Candidates with relevant AWS Developer or AWS Professional certifications will be given preference.
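To sketch the event-driven pattern this role describes, here is a minimal Python Lambda that reacts to an SQS event and kicks off a Step Functions workflow; the state machine ARN, environment variable, and event fields are assumptions for illustration.

```python
# Minimal event-driven Lambda: consume SQS messages and start one
# Step Functions execution per message.
import json
import os
import boto3

sfn = boto3.client("stepfunctions")
STATE_MACHINE_ARN = os.environ["STATE_MACHINE_ARN"]  # assumed env var

def handler(event, context):
    for record in event["Records"]:          # standard SQS event shape
        payload = json.loads(record["body"])
        sfn.start_execution(
            stateMachineArn=STATE_MACHINE_ARN,
            input=json.dumps({"contactId": payload.get("contactId"), "source": "sqs"}),
        )
    return {"started": len(event["Records"])}
```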

Posted 2 days ago

Apply

5.0 - 8.0 years

25 - 35 Lacs

Noida, Hyderabad, Bengaluru

Hybrid

Greetings from Encora Innovation Labs (Encora)! Encora is looking for a DataOps Engineer with 5-8 years' experience in data engineering, Spark, AWS services, Python, AWS Glue, Redshift, Lambda, and Airflow. Please find the detailed job description and company profile below.

Position: DataOps Engineer | Experience: 5-8 years | Job Location: Chennai/Bangalore/Pune/Hyderabad/Noida | Position Type: Full time | Qualification: Any graduate

Technical Skills:
- AWS services: AWS Glue, Redshift, Lambda, IAM; IaC: Terraform, CloudFormation; GitHub.
- Programming: Python, SQL, and Spark.
- Data engineering: data pipelines, Airflow orchestration.

Job Summary, Responsibilities and Duties:
- Monitor and react to alerts in real time and triage issues.
- Execute runbook instructions to resolve routine problems and user requests.
- Escalate complex or unresolved issues to L2.
- Document new findings to improve runbooks and the knowledge base.
- Participate in shift handovers to ensure seamless coverage, and in ceremonies to share operational status.
- Build and manage ETL/ELT pipelines using AWS Glue, Lambda, or Apache Spark.
- Design and maintain data lakes, data warehouses, or data marts using Redshift, S3, or Snowflake.
- Work with structured and unstructured data using AWS-native services.
- Collaborate with data scientists, analysts, and product teams to deliver data solutions.
- Strong experience with AWS services (EC2, S3, Lambda, RDS, Glue, Redshift, IAM, etc.).
- Proficiency in scripting languages such as Python, Bash, or shell scripting.
- Experience with infrastructure-as-code (IaC) tools: Terraform, CloudFormation.
- Proficiency in SQL and data manipulation/transformation.

Education and Experience:
- B.E. in Computer Science Engineering, or an equivalent technical degree with strong computer science fundamentals.
- Experience in an Agile software development environment.
- Excellent communication and collaboration skills, with the ability to work in a team-oriented environment.

Communication:
- Facilitates team and stakeholder meetings effectively; holds regular status/scrum meetings.
- Resolves and/or escalates issues in a timely fashion.
- Understands how to communicate difficult or sensitive information tactfully.
- Astute cross-cultural awareness and experience working with international teams (especially in the US).

You should be speaking to us if:
- You are looking for a career that challenges you to bring your knowledge and expertise to bear in designing, implementing, and running a world-class IT organization.
- You like a job that brings a great deal of autonomy and decision-making latitude.
- You like working in an environment that is young, innovative, and well established.
- You like to work in an organization that takes decisions quickly, is non-hierarchical, and where you can make an impact.

Why Encora Innovation Labs? Encora Innovation Labs is a world-class SaaS technology product engineering company focused on transformational outcomes for leading-edge tech companies. Encora partners with fast-growing tech companies who are driving innovation and growth within their industries.

Who We Are: Encora is devoted to making the world a better place for clients, for our communities, and for our people.

What We Do: We drive transformational outcomes for clients through our agile methods, micro-industry vertical expertise, and extraordinary people. We provide hi-tech, differentiated services in next-gen software engineering solutions including Big Data, Analytics, Machine Learning, IoT, Embedded, Mobile, AWS/Azure Cloud, UI/UX, and Test Automation to some of the leading technology companies in the world. Encora specializes in Data Governance, Digital Transformation, and Disruptive Technologies, helping clients capitalize on potential efficiencies. Encora has been an instrumental partner in the digital transformation journey of clients across a broad spectrum of industries: Health Tech, Fin Tech, Hi-Tech, Security, Digital Payments, Education Publication, Travel, Real Estate, Supply Chain and Logistics, and Emerging Technologies. Encora has successfully developed and delivered more than 2,000 products over the last few years and has led the transformation of a number of digital enterprises. Encora has over 25 offices and innovation centers in 20+ countries worldwide. Our international network ensures that clients receive seamless access to the complete range of our services and the expert knowledge and skills of professionals globally. Encora has global delivery centers and offices in the United States, Costa Rica, Mexico, United Kingdom, India, Malaysia, Singapore, Indonesia, Hong Kong, Philippines, Mauritius, and the Cayman Islands. Encora is a Certified Great Place to Work in India. Please visit us at encora.com | LinkedIn: EncoraInc | Facebook: @EncoraInc | Instagram: @EncoraInc
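As an illustration of the Glue-plus-Airflow orchestration this listing names, here is a minimal DAG sketch using the Amazon provider's Glue operator; the DAG name, Glue job name, and region are hypothetical.

```python
# Minimal Airflow DAG that triggers an AWS Glue job daily.
# Requires: pip install apache-airflow apache-airflow-providers-amazon
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

with DAG(
    dag_id="daily_orders_etl",           # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    run_glue_job = GlueJobOperator(
        task_id="run_orders_glue_job",
        job_name="orders-curation",      # hypothetical Glue job
        region_name="ap-south-1",        # assumed region
        wait_for_completion=True,        # block until the Glue run finishes
    )
```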

Posted 2 days ago

Apply

5.0 - 9.0 years

25 - 27 Lacs

Pune

Hybrid

Key Skills: PySpark, AWS, Python, DynamoDB, AWS Lambda, Redshift, NoSQL

Roles and Responsibilities:
- Design, develop, and maintain scalable data pipelines using PySpark and AWS services.
- Implement and manage data storage solutions using AWS Redshift and DynamoDB.
- Utilize AWS Lambda for serverless data processing and automation tasks.
- Ensure data quality and integrity through rigorous testing and validation processes.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Monitor and optimize data workflows for performance and cost efficiency.
- Stay updated with the latest industry trends and technologies related to data engineering and AWS services.

Skills Required:
- Proficiency in PySpark for building data pipelines (must-have).
- Strong experience with AWS services including Lambda, Redshift, and DynamoDB (must-have).
- Solid coding skills in Python (must-have).
- Understanding of data modeling, data validation, and pipeline optimization.
- Knowledge of NoSQL technologies (nice-to-have).
- Familiarity with cloud cost optimization and performance tuning.
- Strong analytical thinking, problem-solving, and team collaboration skills.

Education: B.Sc., B.Com., B.E., B.Tech, B.Tech-M.Tech (Dual), or an equivalent Bachelor's degree.
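The listing stresses data quality and validation in PySpark pipelines; below is a minimal sketch of a fail-fast validation step, where the path, key column, and the 1% duplicate tolerance are all invented for illustration.

```python
# Minimal fail-fast data-quality check for a PySpark pipeline stage.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-check").getOrCreate()
df = spark.read.parquet("s3://example-bucket/curated/orders/")  # hypothetical path

total = df.count()
null_ids = df.filter(F.col("order_id").isNull()).count()   # hypothetical key column
dupes = total - df.dropDuplicates(["order_id"]).count()

# Fail the job before bad data propagates downstream.
if null_ids > 0 or dupes / max(total, 1) > 0.01:           # assumed 1% tolerance
    raise ValueError(f"DQ failed: {null_ids} null keys, {dupes} duplicates of {total} rows")

df.write.mode("overwrite").parquet("s3://example-bucket/validated/orders/")
```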

Posted 2 days ago

Apply

3.0 - 8.0 years

25 - 40 Lacs

Hyderabad

Work from Office

1. Automation of Processes: Automate trading system deployments, configuration, and monitoring to minimize manual errors and ensure rapid, consistent updates across environments. Develop scripts and tools to automate repetitive tasks, such as environment provisioning, software deployments, and database updates, using tools like Ansible, Jenkins, or Terraform.
2. High-Frequency Trading (HFT) System Optimization: Optimize CI/CD pipelines for ultra-low-latency, high-throughput trading systems to support continuous delivery of trading algorithms and infrastructure updates. Ensure that deployment and testing processes do not impact the performance of trading operations.
3. Infrastructure Management and Scalability: Manage cloud and on-premises infrastructures tailored for trading environments, focusing on low latency, high availability, and failover strategies. Use Infrastructure as Code (IaC) to provision scalable and secure environments that can handle the fluctuating loads typical in trading operations.
4. Monitoring and Real-Time Logging: Implement monitoring tools to track system performance, trade execution times, and infrastructure health in real time. Set up sophisticated logging mechanisms for trade data, errors, and performance metrics, ensuring traceability and quick troubleshooting during incidents.
5. Security and Compliance: Integrate security best practices into the DevOps pipeline, including real-time security scans, vulnerability assessments, and access control tailored for financial data protection. Ensure that all systems comply with financial regulations such as GDPR, MiFID II, and SEC rules, including managing audit logs and data retention policies.
6. Disaster Recovery and High Availability: Design and maintain disaster recovery solutions to ensure continuity of trading operations during outages or data breaches. Implement redundancy and failover strategies to maintain trading platform uptime, minimizing the risk of costly downtime.
7. Performance Optimization for Trading Systems: Fine-tune infrastructure and CI/CD pipelines to reduce deployment times and latency, crucial for real-time trading environments. Work on system performance to support the rapid execution of trades, data feeds, and order matching systems.
8. Incident Management and Troubleshooting: Rapidly respond to incidents affecting trading operations, performing root cause analysis and implementing corrective measures to prevent recurrence. Ensure detailed incident reporting and documentation to support regulatory requirements.
9. Configuration Management: Maintain configuration consistency across multiple environments (dev, test, prod) using tools like Puppet, Chef, or SaltStack. Ensure configurations meet the stringent security and performance standards required for trading platforms.
10. Collaboration with Development and Trading Teams: Work closely with developers, quants, and traders to ensure smooth deployment of new trading algorithms and updates to trading platforms. Facilitate communication between development, trading desks, and compliance teams to ensure that changes are in line with business requirements and regulations.
11. Risk Management: Implement risk management controls within the DevOps pipeline to minimize the impact of potential system failures on trading operations. Work with risk and compliance teams to ensure that deployment and infrastructure changes do not expose trading systems to unnecessary risks.
12. Cloud Services and Cost Optimization: Deploy, manage, and scale trading applications on cloud platforms like AWS, Azure, or Google Cloud, with a focus on minimizing costs without compromising performance. Utilize cloud-native services such as AWS Lambda or Azure Functions for event-driven processes in trading workflows.
13. Version Control and Code Management: Manage the versioning of trading algorithms and platform updates using Git or similar tools, ensuring traceability and quick rollback capabilities if issues arise. Establish rigorous code review processes to ensure that changes align with the performance and security standards specific to trading systems.
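To illustrate the real-time monitoring item above, here is a minimal Python sketch that publishes a trade-execution latency metric to CloudWatch; the namespace, metric, dimension, and region names are hypothetical.

```python
# Minimal CloudWatch custom metric: publish trade-execution latency.
# Requires: pip install boto3
import time
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")  # assumed region

def record_execution_latency(order_id: str, start_ns: int, end_ns: int) -> None:
    latency_ms = (end_ns - start_ns) / 1_000_000
    cloudwatch.put_metric_data(
        Namespace="Trading/Execution",            # hypothetical namespace
        MetricData=[{
            "MetricName": "OrderExecutionLatency",
            "Dimensions": [{"Name": "Desk", "Value": "eq-arb"}],  # hypothetical dimension
            "Value": latency_ms,
            "Unit": "Milliseconds",
        }],
    )

start = time.perf_counter_ns()
# ... place and confirm the order here ...
record_execution_latency("ord-123", start, time.perf_counter_ns())
```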

Posted 2 days ago

Apply

5.0 - 8.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Backend Developer Responsibilities & Skills

Position Title: Backend Developer | Position Type: Full-time, permanent | Location: Bengaluru, India

Company Description: Privaini is the pioneer of privacy risk management for companies and their entire business networks. Privaini offers a unique "outside-in" approach, empowering companies to gain a comprehensive understanding of both internal and external privacy risks. It provides actionable insights using a data-driven, systematic, and automated approach to proactively address reputation and legal risks related to data privacy. Privaini generates an AI-powered privacy profile and privacy score for a company from externally observable privacy, corporate, regulatory, historical-event, and security data. Without time-consuming questionnaires or any software to install, Privaini creates standardized privacy views of companies from externally observable information. Privaini then builds a privacy risk posture for every business partner in the company's business network and continuously monitors each one. Our platform provides actionable insights that privacy and risk teams can readily implement. Be part of an exciting team of researchers, developers, and data scientists focused on the mission of building transparency in data privacy risks for companies and their business networks.

Key Responsibilities:
- Strong Python, Flask, REST API, and NoSQL skills; familiarity with Docker is a plus.
- AWS Developer Associate certification is required; AWS Professional certification is preferred.
- Architect, build, and maintain secure, scalable backend services on AWS platforms.
- Utilize core AWS services like Lambda, DynamoDB, API Gateway, and serverless technologies.
- Design and deliver RESTful APIs using the Python Flask framework.
- Leverage NoSQL databases and design efficient data models for large user bases.
- Integrate with web service APIs and external systems.
- Apply AWS SageMaker for machine learning and analytics (optional but preferred).
- Collaborate effectively with diverse teams (business analysts, data scientists, etc.).
- Troubleshoot and resolve technical issues within distributed systems.
- Employ Agile methodologies (JIRA, Git) and adhere to best practices.
- Continuously learn and adapt to new technologies and industry standards.

Qualifications:
- A bachelor's degree in computer science, information technology, or a relevant discipline is required; a master's degree is preferred.
- At least 6 years of development experience, with 5+ years of experience in AWS.
- Demonstrated skills in planning, designing, developing, architecting, and implementing applications.

Additional Information: At Privaini Software India Private Limited, we value diversity and always treat all employees and job applicants based on merit, qualifications, competence, and talent. We do not discriminate on the basis of race, religion, color, national origin, gender, sexual orientation, age, marital status, veteran status, or disability status.
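A minimal sketch of the Flask-style REST endpoint this role calls for, backed by DynamoDB as the NoSQL store; the table name, routes, and fields are hypothetical, not Privaini's actual API.

```python
# Minimal Flask REST API backed by DynamoDB.
# Requires: pip install flask boto3
import uuid
import boto3
from flask import Flask, jsonify, request

app = Flask(__name__)
table = boto3.resource("dynamodb").Table("company-profiles")  # hypothetical table

@app.post("/companies")
def create_company():
    body = request.get_json(force=True)
    item = {"id": str(uuid.uuid4()), "name": body["name"]}
    table.put_item(Item=item)
    return jsonify(item), 201

@app.get("/companies/<company_id>")
def get_company(company_id: str):
    resp = table.get_item(Key={"id": company_id})
    if "Item" not in resp:
        return jsonify({"error": "not found"}), 404
    return jsonify(resp["Item"])

if __name__ == "__main__":
    app.run(debug=True)
```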

Posted 2 days ago

Apply

5.0 - 8.0 years

7 - 11 Lacs

Noida

Work from Office

Full-stack developer with 5-8 years of experience in designing and developing robust, scalable, and maintainable applications applying object-oriented design principles.
- Strong experience in Spring frameworks (Spring Boot, Spring Batch, Spring Data, etc.) and Hibernate/JPA.
- Strong experience in microservices architecture and implementation.
- Strong knowledge of HTML, CSS, JavaScript, and Angular.
- Experience with SOAP web services, REST web services, and the Java Messaging Service (JMS) API.
- Familiarity with designing, developing, and deploying web applications using Amazon Web Services (AWS).
- Good experience with AWS services: S3, Lambda, SQS, SNS, DynamoDB, IAM, API Gateway.
- Hands-on experience in SQL and PL/SQL, with the ability to write complex queries.
- Hands-on experience with REST APIs.
- Experience with version control systems (e.g., Git).
- Knowledge of web standards and accessibility guidelines.
- Knowledge of CI/CD pipelines and experience with tools such as JIRA, Splunk, SONAR, etc.
- Strong analytical and problem-solving abilities.
- Good experience in JUnit testing and mocking techniques.
- Experience in SDLC processes (Waterfall/Agile), Docker, Git, SonarQube.
- Excellent communication and interpersonal skills; ability to work independently and as part of a team.

Mandatory Competencies:
- Programming Language - Java - Core Java (Java 8+)
- Programming Language - Java Full Stack - HTML/CSS, JavaScript, Spring Framework
- Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc.
- Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; AWS S3, S3 Glacier, AWS EBS
- DevOps/Configuration Mgmt - Git, Docker
- Database - Oracle - PL/SQL Packages; SQL Server - SQL Packages
- Development Tools and Management - CI/CD
- User Interface - React
- Middleware - Spring Boot; API Middleware - Microservices, Web Services (REST, SOAP)
- Agile - SCRUM
- Behavioral - Communication and collaboration

Posted 2 days ago

Apply

3.0 - 8.0 years

5 - 12 Lacs

Bengaluru

Hybrid

Job Title: AWS Developer | Experience: 3+ Years | Location: Bangalore (Hybrid)

Job Description: We are seeking an AWS Developer with a minimum of 3 years of experience in designing, developing, and maintaining cloud-native applications on AWS. The ideal candidate will have strong skills in serverless architectures, microservices, and containerized applications, along with hands-on experience with AWS core services and DevOps practices.

Key Responsibilities:
- Design, develop, and maintain cloud-native applications using AWS services such as Lambda, API Gateway, DynamoDB, S3, SQS, SNS, ECS, and EKS.
- Implement serverless architectures, microservices, and containerized solutions on AWS.
- Work with cloud-native principles and ensure adherence to best practices for scalability and security.
- Build and manage CI/CD pipelines for automated deployment and testing.
- Collaborate with product owners, designers, and cross-functional teams to gather requirements and deliver solutions.
- Create and maintain technical documentation for application design, deployment, and configurations.
- Ensure performance, security, and compliance of deployed applications.

Required Skills:
- Minimum 3 years of hands-on experience with AWS services related to compute, databases, storage, networking, and security.
- Strong proficiency in one or more programming languages: .NET, Node.js, Java, Go.
- Experience with DevOps tools and CI/CD pipeline management.
- Knowledge of serverless computing, microservices architecture, and cloud-native design principles.
- AWS certification (AWS Certified Developer) preferred.

Interested candidates or references, please send your CV to abhiram.n@techno-facts.com / 6303953729.

Posted 2 days ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Data Engineer (AWS + Python, Spark, Kafka for ETL)!

Responsibilities:
- Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka.
- Integrate structured and unstructured data from various data sources into data lakes and data warehouses.
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using big data technologies like Apache Hadoop and Apache Spark, with appropriate cloud-based services such as Amazon AWS.
- Build data pipelines by building ETL (Extract-Transform-Load) processes.
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability with improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work unchanged.
- Coordinate with release management and other supporting teams to deploy changes to the production environment.

Qualifications we seek in you!

Minimum Qualifications:
- Experience in designing and implementing data pipelines, building data applications, and data migration on AWS.
- Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift.
- Experience with Databricks is an added advantage.
- Strong experience in Python and SQL.
- Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
- Advanced programming skills in Python for data processing and automation.
- Hands-on experience with Apache Spark for large-scale data processing.
- Experience with Apache Kafka for real-time data streaming and event processing.
- Proficiency in SQL for data querying and transformation.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills:
- Master's degree in Computer Science, Electronics, or Electrical Engineering.
- AWS Data Engineering and Cloud certifications; Databricks certifications.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of Change & Incident Management processes.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
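As a minimal sketch of the Kafka-to-AWS ETL flow this role describes, using the kafka-python client to batch events into S3; the topic, brokers, bucket, and batch size are all made up for illustration.

```python
# Minimal Kafka consumer that batches JSON events and lands them in S3.
# Requires: pip install kafka-python boto3
import json
import time
import boto3
from kafka import KafkaConsumer

s3 = boto3.client("s3")
consumer = KafkaConsumer(
    "orders",                                   # hypothetical topic
    bootstrap_servers=["broker1:9092"],         # hypothetical brokers
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
    auto_offset_reset="earliest",
)

batch = []
for message in consumer:
    batch.append(message.value)
    if len(batch) >= 500:                       # assumed batch size
        key = f"raw/orders/{int(time.time())}.json"
        s3.put_object(
            Bucket="example-data-lake",         # hypothetical bucket
            Key=key,
            Body="\n".join(json.dumps(r) for r in batch).encode(),
        )
        batch.clear()
```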

Posted 2 days ago

Apply

5.0 - 8.0 years

12 - 24 Lacs

Hyderabad

Work from Office

- 5+ years of experience with a focus on API development.
- Expert-level proficiency in Node.js, TypeScript, GraphQL, AWS Lambda, AWS API Gateway, and MongoDB.
- 3+ years of hands-on experience with AWS services in production environments.

Posted 3 days ago

Apply

6.0 - 10.0 years

15 - 25 Lacs

Hyderabad

Hybrid

Role: Lead Full Stack Developer (MERN) | Location: Hyderabad, India (Initially Remote) | Experience: 5 to 8 years | Job Type: Full-Time | Shift: Night (USA Timezone)

**Must have React.js + Node.js/Express.js**

Job Overview: We are seeking a highly skilled and experienced Lead Full Stack Developer specializing in the MERN stack (MongoDB, Express.js, React, Node.js) to join our engineering team. The ideal candidate will play a crucial role in maintaining and improving our existing systems while contributing to the development of new, innovative solutions using modern technologies.

Roles and Responsibilities:
- Design, develop, and maintain scalable web applications using the MERN stack.
- Work with microservices architecture in an AWS environment.
- Maintain and troubleshoot the existing C# codebase while participating in the gradual transition to the MERN stack.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Ensure the performance, quality, and responsiveness of applications.
- Identify and correct bottlenecks and fix bugs.
- Help maintain code quality, organization, and automation.
- Participate in code reviews and contribute to team knowledge sharing.
- Stay up to date with emerging trends and technologies in web development.

Required Skills and Experience:
- 6+ years of experience in full stack development, with a strong focus on the MERN stack.
- Expert-level proficiency in React.js, Node.js, Express.js, and MongoDB.
- Experience designing and implementing RESTful APIs.
- Strong understanding of JavaScript ES6+ and TypeScript.
- Solid understanding of C# for maintaining and fixing issues in the existing codebase.
- Experience working in an AWS environment.
- Experience with CI/CD pipelines, particularly Jenkins and GitHub Actions.
- Familiarity with version control systems (e.g., Git).
- Strong problem-solving skills and attention to detail.
- Excellent written and verbal communication skills.
- Strong team player who contributes positively to company culture; able to work independently with minimal guidance.
- Self-starter mentality with a proactive approach to learning and problem-solving.

Detailed Skillset:
- Frontend: Expert-level proficiency in React.js and its ecosystem (e.g., Redux, React Router, Next.js); strong understanding of JavaScript ES6+, HTML5, and CSS3; experience with responsive design and cross-browser compatibility; familiarity with modern frontend build tools (e.g., Webpack, Babel).
- Backend: Strong proficiency in Node.js and Express.js; experience designing and implementing RESTful APIs; solid understanding of C# and the .NET framework; familiarity with microservices architecture.
- Database: Experience with both SQL and NoSQL databases; proficiency in writing efficient database queries and optimizing performance.
- Cloud & DevOps: Strong experience with AWS services (e.g., EC2, S3, Lambda, ECS); proficiency in setting up and managing CI/CD pipelines using Jenkins and GitHub Actions; experience automating build, test, and deployment processes; experience with containerization technologies (e.g., Docker).
- Version Control: Proficiency in Git and GitHub/GitLab workflows.
- Testing: Experience with unit, integration, and end-to-end testing; familiarity with testing frameworks for both frontend and backend (e.g., Jest, Mocha, Chai).
- Additional Skills: Understanding of Agile methodologies; familiarity with performance optimization techniques; knowledge of security best practices in web development.
- Nice-to-Have: Experience with GraphQL; knowledge of NoSQL databases, particularly MongoDB; familiarity with serverless architecture.

Note:
- Must have a LinkedIn account.
- Moonlighting is strictly prohibited.
- Must be willing to work in US time zones (EST).
- The candidate will undergo an in-depth background investigation.

Posted 3 days ago

Apply

5.0 - 10.0 years

12 - 17 Lacs

Bengaluru

Work from Office

Bengaluru, India | Java Full Stack | Cross Industry Solutions | 12/05/2025

Project description: As a Technical Lead for this project, you will have the opportunity to contribute to the data management architecture of industry-leading software. You will work closely with cross-functional teams and regional experts to design, implement, and support solutions with a focus on data security and global availability to facilitate data-driven decisions for our customers. This is your chance to work on a stable, long-term project with a global client, focusing on digital transformation and change management.

Why Join Us?
- Exciting opportunities: Work in squads under our customer's direction, utilizing Agile methodologies and Scrum.
- Innovative application: Contribute to an application that guides and documents the sales order process, aids in market analysis, and ensures competitive pricing.
- Workflow governance: Be part of a team that integrates digital and human approvals, ensuring seamless integration with a broader ecosystem of applications.
- Global exposure: Collaborate with reputed global clients, delivering top-notch solutions.
- Career growth: Join high-caliber project teams with front-end, back-end, and database developers, offering ample opportunities to learn, grow, and advance your career.

If you have strong technical skills, effective communication abilities, and a commitment to security, we want you on our team! Ready to make an impact? Apply now and be part of our journey to success!

Responsibilities:
1. Design and development: Plan and implement new functionality and features in globally deployed E&P services.
2. Shape enterprise-level user experiences: Model, create, and improve the next generation of user experiences for E&P customers.
3. Data modelling: Work with data architects to create and maintain data models, optimizing storage, retrieval, and analysis.
4. Database administration: Plan, implement, and iterate on databases using the latest technologies to ensure performant, scalable, and secure solutions.
5. Automation: Develop and maintain CI/CD automation pipelines to handle all stages of the software lifecycle.
6. Monitoring and troubleshooting: Monitor customer environments to proactively identify and resolve issues while providing support for incidents.
7. Data security: Implement and consistently improve measures to protect sensitive information and ensure compliance with regulations.
8. Documentation: Write and maintain documentation for processes, configurations, and procedures.
9. Meet SRE & MTTR goals: Lead the team in troubleshooting environment failures within SRE MTTR goals.
10. Collaborate and define: Work closely with stakeholders to define project requirements and deliverables and to understand their needs and challenges.
11. Implement best practices: Ensure the highest standards in coding and security, with a strong emphasis on protecting systems and data.
12. Strategize and plan: Take an active role in defect triage, strategy, and architecture planning.
13. Maintain performance: Ensure database performance and resolve development problems.
14. Deliver quality: Translate requirements into high-quality solutions, adhering to Agile methodologies.
15. Collaborate: Work with application support teams throughout the development, deployment, and support phases.

Skills

Must have:
- 5+ years of experience.
- Technical skills: Strong full-stack development expertise in Angular, Node.js, and Java.
- Experience in development with Kafka, Redis, and WebSocket.
- Database technologies: RDBMS (Postgres preferred).
- Software languages: Node.js, Angular, Java.
- Cloud platforms: AWS, preferably Azure too.
- Cloud managed services: RDS, messaging, serverless computing (AWS Lambda).
- Containerization (Docker).
- DevOps: GitLab pipeline(s).

Qualification and soft skills:
- Bachelor's degree in Computer Science, Software Engineering, or a related field.
- Strong desire to stay up to date with the latest trends and technologies in the field.
- Customer-driven and result-oriented focus.
- Excellent problem-solving and troubleshooting skills.
- Ability to work independently and as part of a team.
- Strong communication and collaboration skills.
- Can independently manage and drive end-to-end solutions, requiring minimal handholding or supervision.
- Solid experience in scalable application design and development.
- Demonstrates clear growth and continuous upskilling.

Nice to have:
- Software languages: Python, C++.
- Knowledge of the E&P domain (well, seismic, and production data types).
- GIS experience is desirable.

Other: Languages: English (C2 Proficient). Seniority: Senior.

Posted 3 days ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Bengaluru

Work from Office

Why this job matters: As a Cloud Native Java Developer, you will individually contribute to and drive the transformation of our existing Java microservices deployed on Amazon Elastic Kubernetes Service (EKS) into serverless AWS Lambda functions. The roles and responsibilities are below.

What you'll be doing

Key Responsibilities:
- Develop and deploy serverless applications using Quarkus/Spring Boot and AWS Lambda.
- Build RESTful APIs and event-driven microservices using cloud-native patterns.
- Optimize cold-start performance using GraalVM native images.
- Integrate with AWS services such as API Gateway, S3, DynamoDB, CloudWatch, and Postgres.
- Implement and manage Lambda authorizers (custom and token-based) for securing APIs.
- Design and configure AWS API Gateway for routing, throttling, and securing endpoints.
- Integrate OAuth 2.0 authentication flows using Azure Active Directory as the identity provider.
- Decent understanding of resilience patterns.
- Write unit and integration tests using JUnit, Mockito, and Quarkus testing tools.
- Collaborate with DevOps teams to automate deployments using AWS SAM, CDK, or Terraform.
- Monitor and troubleshoot production issues using AWS observability tools.

Migration Responsibilities:
- Analyse existing Spring Boot microservices deployed on Kubernetes to identify candidates for serverless migration.
- Refactor services to be stateless, event-driven, and optimized for short-lived execution.
- Replace Kubernetes ingress and service discovery with API Gateway and Lambda triggers.
- Migrate persistent state and configuration to AWS-native services (e.g., DynamoDB, S3, Secrets Manager).
- Redesign CI/CD pipelines to support serverless deployment workflows.
- Ensure performance, cost-efficiency, and scalability in the new architecture.
- Document migration strategies, patterns, and best practices for future reference.

Technical Proficiency:
- Strong industry experience of 4+ years with command of Java 8+, including a deep understanding of functional interfaces (Function, Predicate, Supplier, Consumer), the Streams API, lambda expressions, and Optional.
- Proficiency in Java concurrency, including thread management, ExecutorService, CompletableFuture, and parallel streams; designing thread-safe components and understanding concurrency pitfalls.
- Understanding of AWS EKS (Elastic Kubernetes Service), Docker containers, and Kubernetes fundamentals: experience with resource requests and limits, pod autoscaling, and K8s networking.
- Familiarity with transitioning workloads from EKS to serverless environments.

Posted 3 days ago

Apply

5.0 - 10.0 years

15 - 18 Lacs

Bengaluru

Hybrid

Role & Responsibilities:
- Strong experience in designing and implementing AWS cloud solutions.
- In-depth knowledge of AWS services and architectures, including EC2, S3, RDS, Lambda, VPC, IAM, and others.
- Proficiency in infrastructure-as-code tools like AWS CloudFormation, Terraform, or AWS CDK.
- Familiarity with DevOps practices and tools such as CI/CD pipelines, Git, and Jenkins.
- Understanding of security best practices and experience implementing security controls in AWS.
- Strong problem-solving and troubleshooting skills.
- Excellent communication and collaboration skills to work effectively with clients and cross-functional teams.
- AWS certifications such as AWS Certified Solutions Architect - Associate or Professional are a plus.

Please share the following details: full name; total experience; relevant experience; CTC; expected CTC; notice period; reason for change; current and preferred location; contact number / alternate contact number; current organization / payroll company; offer in hand; skills; date of birth; higher education and university name.
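To illustrate the infrastructure-as-code tooling this role names, here is a minimal AWS CDK (Python) sketch defining a Lambda function fronted by API Gateway; the stack name, asset directory, and handler are hypothetical.

```python
# Minimal AWS CDK v2 stack: one Lambda function behind an API Gateway REST API.
# Requires: pip install aws-cdk-lib constructs
import aws_cdk as cdk
from aws_cdk import aws_apigateway as apigw, aws_lambda as _lambda
from constructs import Construct

class ApiStack(cdk.Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        fn = _lambda.Function(
            self, "HelloFn",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="app.handler",                   # hypothetical module.function
            code=_lambda.Code.from_asset("lambda"),  # hypothetical asset dir
        )
        # LambdaRestApi proxies all routes to the function.
        apigw.LambdaRestApi(self, "HelloApi", handler=fn)

app = cdk.App()
ApiStack(app, "hello-api-stack")
app.synth()
```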

Posted 3 days ago

Apply

2.0 - 7.0 years

13 - 18 Lacs

Pune

Work from Office

The Customer, Sales & Service Practice | Cloud

Job Title: Amazon Connect + Level 11 (Analyst) + Entity (S&C GN)
Management Level: Level 11 - Analyst
Location: Delhi, Mumbai, Bangalore, Gurgaon, Pune, Hyderabad, and Chennai
Must-have skills: AWS contact center, Amazon Connect flows, AWS Lambda and Lex bots, Amazon Connect Contact Center
Good-to-have skills: AWS Lambda and Lex bots, Amazon Connect
Experience: Minimum 2 years of experience is required
Educational Qualification: Engineering degree or MBA from a Tier 1 or Tier 2 institute

Join our team of Customer Sales & Service consultants who solve customer-facing challenges at clients spanning sales, service, and marketing to accelerate business change.

Practice: Customer Sales & Service Sales I | Areas of Work: Cloud AWS Cloud Contact Center Transformation, Analysis and Implementation | Level: Analyst | Years of Exp: 2-5 years

Explore an Exciting Career at Accenture: Are you passionate about scaling businesses using in-depth frameworks and techniques to solve customer-facing challenges? Do you want to design, build, and implement strategies to enhance business performance? Does working in an inclusive and collaborative environment spark your interest? Then this is the right place for you! Welcome to a host of exciting global opportunities within Accenture Strategy & Consulting's Customer, Sales & Service practice.

The Practice, a Brief Sketch: The Customer Sales & Service Consulting practice is aligned to the Capability Network Practice of Accenture and works with clients across their marketing, sales, and service functions. As part of the team, you will work on transformation services driven by key offerings like Living Marketing, Connected Commerce, and Next-Generation Customer Care. These services help our clients become living businesses by optimizing their marketing, sales, and customer service strategy, thereby driving cost reduction, revenue enhancement, and customer satisfaction, and positively impacting front-end business metrics.

You will work closely with our clients as consulting professionals who design, build, and implement initiatives that can help enhance business performance. As part of these, you will drive the following:
- Create business cases for the journey to cloud, cloud strategy, and cloud contact center vendor assessment activities.
- Create a cloud transformation approach for contact center transformations.
- Work with solution architects to architect cloud contact center technology on the AWS platform.
- Enable cloud contact center technology platforms for global clients, specifically on Amazon Connect.
- Work on innovative assets, proofs of concept, and sales demos for the AWS cloud contact center.
- Support AWS offering leads in responding to RFIs and RFPs.

Bring your best skills forward to excel at the role:
- Good understanding of the contact center technology landscape.
- An understanding of the AWS Cloud platform and services, with solution architect skills.
- Deep expertise in AWS contact-center-relevant services.
- Sound experience in developing Amazon Connect flows, AWS Lambda functions, and Lex bots.
- Deep functional and technical understanding of APIs and related integration experience.
- Functional and technical understanding of building API-based integrations with Salesforce, ServiceNow, and bot platforms.
- Ability to understand customer challenges and requirements, and to address them in a differentiated manner.
- Ability to help the team implement, sell, and deliver cloud contact center solutions to clients.
- Excellent communication skills; ability to develop requirements based on leadership input.
- Ability to work effectively in a remote, virtual, global environment; readiness to take on new challenges as a passionate learner.

What's in it for you?
- An opportunity to work on transformative projects with key G2000 clients.
- Potential to co-create with leaders in strategy, industry experts, enterprise function practitioners, and business intelligence professionals to shape and recommend innovative solutions that leverage emerging technologies.
- Ability to embed responsible business into everything, from how you service your clients to how you operate as a responsible professional.
- Personalized training modules to develop your strategy and consulting acumen and to grow your skills, industry knowledge, and capabilities.
- Opportunity to thrive in a culture committed to accelerating equality for all, with boundaryless collaboration across the entire organization.

About Accenture: Accenture is a leading global professional services company, providing a broad range of services and solutions in strategy, consulting, digital, technology, and operations. Combining unmatched experience and specialized skills across more than 40 industries and all business functions, underpinned by the world's largest delivery network, Accenture works at the intersection of business and technology to help clients improve their performance and create sustainable value for their stakeholders. With 569,000 people serving clients in more than 120 countries, Accenture drives innovation to improve the way the world works and lives. Visit us at www.accenture.com

About Accenture Strategy & Consulting: Accenture Strategy shapes our clients' future, combining deep business insight with an understanding of how technology will impact industry and business models. Our focus on issues such as digital disruption, redefining competitiveness, operating and business models, and the workforce of the future helps our clients find future value and growth in a digital world. Today, digital is changing the way organizations engage with their employees, business partners, customers, and communities. This is our unique differentiator. To bring this global perspective to our clients, Accenture Strategy's services include those provided by our Capability Network, a distributed management consulting organization that provides management consulting and strategy expertise across the client lifecycle. Our Capability Network teams complement our in-country teams to deliver cutting-edge expertise and measurable value to clients all around the world. For more information visit https://www.accenture.com/us-en/Careers/capability-network

Accenture Capability Network | Accenture in one word: come and be a part of our team.

Qualification: Your experience counts!
- Bachelor's degree in a related field or equivalent experience; post-graduation in business management would add value.
- Minimum 2 years of experience in delivering software-as-a-service or platform-as-a-service projects related to cloud contact center service providers such as the Amazon Connect contact center cloud solution.
- Hands-on experience working on the design, development, and deployment of contact center solutions at scale.
- Hands-on development experience with cognitive services such as Amazon Connect, Amazon Lex, Lambda, Kinesis, Athena, Pinpoint, Comprehend, and Transcribe.
- Working knowledge of one of the programming/scripting languages such as Node.js, Python, or Java.
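For a concrete feel of the Amazon Connect + Lambda integration this role revolves around, here is a minimal Python Lambda sketch invoked from a Connect contact flow; the VIP lookup and attribute names are hypothetical. Connect passes contact details in the event and expects a flat map of string keys and values back, which the flow can then branch on.

```python
# Minimal AWS Lambda invoked from an Amazon Connect contact flow.
# Connect passes contact details in event["Details"]["ContactData"] and
# expects a flat dict of string key/value pairs in return.

VIP_NUMBERS = {"+911234567890"}  # hypothetical VIP list

def handler(event, context):
    contact = event["Details"]["ContactData"]
    caller = contact["CustomerEndpoint"]["Address"]  # caller's phone number

    # The flow can branch on these returned attributes; values must be strings.
    is_vip = caller in VIP_NUMBERS
    return {
        "isVip": "true" if is_vip else "false",
        "greeting": "Welcome back!" if is_vip else "Hello!",
    }
```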

Posted 3 days ago

Apply

15.0 - 20.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Project Role: Application Developer
Project Role Description: Design, build, and configure applications to meet business process and application requirements.
Must-have skills: Java Full Stack Development
Good-to-have skills: Amazon Web Services (AWS), AWS Architecture, AWS Lambda Administration
Minimum 5 years of experience is required.
Educational Qualification: 15 years of full-time education

Summary: As an Application Developer, you will design, build, and configure applications to meet business process and application requirements. A typical day involves collaborating with cross-functional teams to gather requirements, developing innovative solutions, and ensuring that applications are optimized for performance and user experience. You will also engage in troubleshooting and debugging to enhance application functionality and reliability, while continuously seeking opportunities for improvement and efficiency in the development process.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Provide solutions to problems for the immediate team and across multiple teams.
- Mentor junior team members to enhance their skills and knowledge.
- Conduct regular team meetings to discuss progress and address challenges.

Professional & Technical Skills:
- Must-have skills: Proficiency in Java Full Stack Development.
- Good-to-have skills: Experience with Amazon Web Services (AWS), AWS Architecture, AWS Lambda Administration.
- Strong understanding of front-end technologies such as HTML, CSS, and JavaScript.
- Experience with back-end frameworks and technologies, including Spring and Hibernate.
- Familiarity with database management systems, particularly SQL and NoSQL databases.

Additional Information:
- The candidate should have a minimum of 5 years of experience in Java Full Stack Development.
- This position is based at our Bengaluru office.
- 15 years of full-time education is required.

Posted 3 days ago

Apply

4.0 - 6.0 years

15 - 25 Lacs

chennai

Work from Office

As a Senior Cloud Data Platform (AWS) Specialist at Incedo, you will be responsible for designing, deploying, and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists, and business analysts to understand business requirements and design scalable, reliable, and cost-effective solutions that meet those requirements.
Roles & Responsibilities:
- Designing, developing, and deploying cloud-based data platforms using Amazon Web Services (AWS)
- Integrating and processing large amounts of structured and unstructured data from various sources
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving data platform issues
Technical Skills Requirements:
- In-depth knowledge of AWS services and tools such as AWS Glue, Amazon Redshift, and AWS Lambda
- Experience building scalable and reliable data pipelines using AWS services, Apache Spark, and related big data technologies
- Familiarity with cloud-based infrastructure and deployment, specifically on AWS
- Strong knowledge of programming languages such as Python, Java, and SQL
- Excellent communication skills, including the ability to convey complex technical information to non-technical stakeholders clearly and concisely
- Understanding of, and alignment with, the company's long-term vision
- Leadership: guide and support team members, ensure the successful completion of tasks, and promote a positive, collaborative, and productive work environment, taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
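A minimal sketch of the kind of ETL pipeline this role describes, written as an AWS Glue PySpark job. The catalog database "raw_db", table "orders", and the S3 path are hypothetical; the awsglue libraries are assumed to be available because the script runs inside a Glue job.

import sys
from pyspark.context import SparkContext
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.transforms import ApplyMapping
from awsglue.utils import getResolvedOptions

# Standard Glue job bootstrap: resolve arguments and initialize the job.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (hypothetical database/table).
source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
)

# Light transformation: rename/retype columns.
mapped = ApplyMapping.apply(
    frame=source,
    mappings=[
        ("order_id", "string", "order_id", "string"),
        ("amount", "string", "amount", "double"),
    ],
)

# Write curated Parquet back to S3 (hypothetical bucket/prefix).
glue_context.write_dynamic_frame.from_options(
    frame=mapped,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/orders/"},
    format="parquet",
)
job.commit()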

Posted 3 days ago

Apply

4.0 - 6.0 years

15 - 25 Lacs

gurugram

Work from Office

Role Description: As a Senior Cloud Data Platform (AWS) Specialist at Incedo, you will be responsible for designing, deploying, and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists, and business analysts to understand business requirements and design scalable, reliable, and cost-effective solutions that meet those requirements.
Roles & Responsibilities:
- Designing, developing, and deploying cloud-based data platforms using Amazon Web Services (AWS)
- Integrating and processing large amounts of structured and unstructured data from various sources
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving data platform issues
Technical Skills Requirements:
- In-depth knowledge of AWS services and tools such as AWS Glue, Amazon Redshift, and AWS Lambda
- Experience building scalable and reliable data pipelines using AWS services, Apache Spark, and related big data technologies
- Familiarity with cloud-based infrastructure and deployment, specifically on AWS
- Strong knowledge of programming languages such as Python, Java, and SQL
- Excellent communication skills, including the ability to convey complex technical information to non-technical stakeholders clearly and concisely
- Understanding of, and alignment with, the company's long-term vision
- Leadership: guide and support team members, ensure the successful completion of tasks, and promote a positive, collaborative, and productive work environment, taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
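To illustrate how such pipelines are often wired together event-first, the sketch below shows a Python Lambda handler that starts a Glue job whenever a new object lands in S3. The Glue job name "curate-orders" and the "--source_path" argument are hypothetical.

import boto3

glue = boto3.client("glue")

def lambda_handler(event, context):
    # S3 put notifications carry the bucket and key under Records[].s3.
    record = event["Records"][0]["s3"]
    bucket = record["bucket"]["name"]
    key = record["object"]["key"]

    # Kick off the downstream ETL job, passing the new object as an argument.
    response = glue.start_job_run(
        JobName="curate-orders",
        Arguments={"--source_path": f"s3://{bucket}/{key}"},
    )
    return {"JobRunId": response["JobRunId"]}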

Posted 3 days ago

Apply

4.0 - 6.0 years

12 - 18 Lacs

gurugram

Work from Office

Role Description: As a Senior Cloud Data Platform (AWS) Specialist at Incedo, you will be responsible for designing, deploying, and maintaining cloud-based data platforms on AWS. You will work with data engineers, data scientists, and business analysts to understand business requirements and design scalable, reliable, and cost-effective solutions that meet those requirements.
Roles & Responsibilities:
- Designing, developing, and deploying cloud-based data platforms using Amazon Web Services (AWS)
- Integrating and processing large amounts of structured and unstructured data from various sources
- Implementing and optimizing ETL processes and data pipelines
- Developing and maintaining security and access controls
- Collaborating with other teams to ensure the consistency and integrity of data
- Troubleshooting and resolving data platform issues
Technical Skills Requirements:
- In-depth knowledge of AWS services and tools such as AWS Glue, Amazon Redshift, and AWS Lambda
- Experience building scalable and reliable data pipelines using AWS services, Apache Spark, and related big data technologies
- Familiarity with cloud-based infrastructure and deployment, specifically on AWS
- Strong knowledge of programming languages such as Python, Java, and SQL
- Excellent communication skills, including the ability to convey complex technical information to non-technical stakeholders clearly and concisely
- Understanding of, and alignment with, the company's long-term vision
- Leadership: guide and support team members, ensure the successful completion of tasks, and promote a positive, collaborative, and productive work environment, taking responsibility for the whole team
Qualifications:
- 4-6 years of work experience in a relevant field
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred
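For the security and access-control responsibility, a minimal boto3 sketch: locking down a data-platform S3 bucket with Block Public Access and default SSE-KMS encryption. The bucket name is hypothetical, and credentials are assumed to come from the standard AWS environment.

import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-platform-bucket"  # hypothetical bucket name

# Block all forms of public access at the bucket level.
s3.put_public_access_block(
    Bucket=BUCKET,
    PublicAccessBlockConfiguration={
        "BlockPublicAcls": True,
        "IgnorePublicAcls": True,
        "BlockPublicPolicy": True,
        "RestrictPublicBuckets": True,
    },
)

# Enforce server-side encryption with SSE-KMS by default.
s3.put_bucket_encryption(
    Bucket=BUCKET,
    ServerSideEncryptionConfiguration={
        "Rules": [
            {"ApplyServerSideEncryptionByDefault": {"SSEAlgorithm": "aws:kms"}}
        ]
    },
)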

Posted 3 days ago

Apply