286 Dynamo DB Jobs - Page 5

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

12.0 - 15.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Drupal
Good-to-have skills: NA
Minimum experience: 12 years
Educational Qualification: 15 years of full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with various stakeholders to gather insights, analyzing user needs, and translating them into functional specifications. You will also engage in discussions with team members to ensure that the design aligns with business objectives and technical feasibility, while continuously iterating on your designs based on feedback and testing outcomes. Your role will be pivotal in ensuring that the applications developed are user-friendly, efficient, and meet the highest standards of quality. You must have knowledge of Adobe Analytics; PHP, Laravel, Drupal; HTML, CSS; JavaScript, Vue.js; React; Python; GenAI basics; AWS SAM (Lambda), AWS EC2, AWS S3, AWS RDS, AWS DynamoDB, AWS SNS, AWS SQS, AWS SES; Cloudflare; REST APIs; GitHub; web servers; SQL.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and brainstorming sessions to foster innovative solutions.
- Mentor junior team members to enhance their skills and knowledge.

Professional & Technical Skills:
- Must-have: proficiency in Drupal.
- Strong understanding of application design principles and methodologies.
- Experience with front-end technologies such as HTML, CSS, and JavaScript.
- Familiarity with database management systems and data modeling.
- Ability to create and maintain technical documentation.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Drupal.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

Posted 2 weeks ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Roles and Responsibilities:
- Experience in AWS Glue
- Experience with one or more of the following: Spark, Scala, Python, and/or R
- Experience in API development with NodeJS
- Experience with AWS (S3, EC2) or another cloud provider
- Experience in data virtualization tools like Dremio and Athena is a plus
- Technically proficient in Big Data concepts
- Technically proficient in Hadoop and NoSQL (MongoDB)
- Good communication and documentation skills

Posted 2 weeks ago

Apply

4.0 - 9.0 years

12 - 16 Lacs

Kochi

Work from Office

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Build data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Develop efficient software code for multiple use cases, leveraging the Spark framework with Python or Scala and the Big Data technologies built on the platform.
- Develop streaming pipelines.
- Work with Hadoop / AWS ecosystem components (Apache Spark, Kafka, cloud computing, etc.) to implement scalable solutions that meet ever-increasing data volumes.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Minimum 4+ years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on AWS.
- Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB.
- Good to excellent SQL skills.
- Exposure to streaming solutions and message brokers such as Kafka.

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark certified developers.
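
To make the pipeline work above concrete, here is a minimal PySpark sketch of the ingest-and-transform pattern this role describes. The bucket paths, column names, and aggregation are illustrative assumptions, not details from the posting.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("orders-ingest").getOrCreate()

# Ingest raw JSON files from cloud storage (placeholder path).
raw = spark.read.json("s3://example-bucket/raw/orders/")

# Transform: drop bad records, derive a date column, aggregate per day.
daily_totals = (
    raw.filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("created_at"))
       .groupBy("order_date")
       .agg(F.sum("amount").alias("total_amount"),
            F.count("*").alias("order_count"))
)

# Write the curated result back as partitioned Parquet.
(daily_totals.write.mode("overwrite")
    .partitionBy("order_date")
    .parquet("s3://example-bucket/curated/daily_order_totals/"))
```

The same skeleton extends to streaming use cases by swapping `read` for `readStream` and adding a checkpoint location.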

Posted 2 weeks ago

Apply

4.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

As a Software Developer you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys. Your primary responsibilities include Comprehensive Feature Development and Issue ResolutionWorking on the end to end feature development and solving challenges faced in the implementation. Stakeholder Collaboration and Issue ResolutionCollaborate with key stakeholders, internal and external, to understand the problems, issues with the product and features and solve the issues as per SLAs defined. Continuous Learning and Technology IntegrationBeing eager to learn new technologies and implementing the same in feature development Required education Bachelor's Degree Preferred education Master's Degree Required technical and professional expertise Creative problem-solving skills and superb communication Skill. Container based solutions. Strong experience with Node.js and AWS stack - AWS Lambda, AWS APIGateway, AWS CDK, AWS DynamoDB, AWS SQS. Experience with infrastructure as a code using AWS CDK.Expertise in encryption and decryption techniques for securing APIs, API Authentication and Authorization Primarily more experience is required on Lambda and APIGateway. Candidates having the AWS Certified Cloud Practitioner / AWS Certified Developer Associate certifications will be preferred Preferred technical and professional experience Experience in distributed/scalable systems Knowledge of standard tools for optimizing and testing code Knowledge/Experience of Development/Build/Deploy/Test life cycle
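
The listing is Node.js-centric, but the Lambda-behind-API-Gateway-with-DynamoDB pattern it names is easy to sketch; here is a hedged Python version with a hypothetical `orders` table keyed on `order_id` (not the employer's actual code or schema).

```python
import json
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("orders")  # placeholder table name

def handler(event, context):
    # With API Gateway proxy integration, the HTTP method, body, and path
    # parameters arrive in the event dict.
    if event["httpMethod"] == "POST":
        item = json.loads(event["body"])  # must include the key, e.g. order_id
        table.put_item(Item=item)
        return {"statusCode": 201, "body": json.dumps(item)}

    # GET /orders/{order_id}
    order_id = event["pathParameters"]["order_id"]
    resp = table.get_item(Key={"order_id": order_id})
    item = resp.get("Item")
    return {"statusCode": 200 if item else 404,
            "body": json.dumps(item or {"error": "not found"})}
```

In the CDK setup the ad asks for, the function, table, and REST resource would each be declared as constructs and wired together in a stack.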

Posted 2 weeks ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Pune

Work from Office

The Developer leads cloud application development/deployment for the client based on AWS development methodology, tools, and best practices. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Experience with technologies like Spring Boot and Java.
- Demonstrated technical leadership experience on high-impact, customer-facing projects.
- Experience in building web applications in the Java/J2EE stack.
- Experience in a UI framework such as React JS.
- Working knowledge of a messaging system (Kafka preferred).
- Experience designing and integrating REST APIs using Spring Boot.

Preferred technical and professional experience:
- Strong experience in concurrent design and multi-threading; object-oriented programming (OOP); SQL Server / Oracle / MySQL.
- Working knowledge of Azure or AWS cloud; experience building applications in a container-based environment (Docker/Kubernetes) on AWS preferred.
- Basic knowledge of SQL or NoSQL database (Postgres, MongoDB, DynamoDB preferred) design and queries.

Posted 2 weeks ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Pune

Work from Office

As a Software Developer, you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: work on end-to-end feature development and solve challenges faced in the implementation.
- Stakeholder collaboration and issue resolution: collaborate with key stakeholders, internal and external, to understand problems and issues with the product and features, and solve them within the defined SLAs.
- Continuous learning and technology integration: be eager to learn new technologies and apply them in feature development.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Experience with technologies like Spring Boot and Java.
- Demonstrated technical leadership experience on high-impact, customer-facing projects.
- Experience in building web applications in the Java/J2EE stack.
- Experience in a UI framework such as React JS.
- Working knowledge of a messaging system (Kafka preferred).
- Experience designing and integrating REST APIs using Spring Boot.

Preferred technical and professional experience:
- Strong experience in concurrent design and multi-threading; object-oriented programming (OOP); SQL Server / Oracle / MySQL.
- Working knowledge of Azure or AWS cloud; experience building applications in a container-based environment (Docker/Kubernetes) on AWS preferred.
- Basic knowledge of SQL or NoSQL database (Postgres, MongoDB, DynamoDB preferred) design and queries.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Pune

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating source-to-target pipelines/workflows and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience using Python to develop a custom framework for generating rules (like a rules engine).
- Developed Python code to gather data from HBase and designed solutions implemented with PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.

Preferred technical and professional experience:
- Understanding of DevOps.
- Experience in building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
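
The "custom rules framework" idea above can be sketched briefly: rules kept as data and compiled into Spark column expressions. The rule set and sample rows here are invented for illustration.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("rules-engine-demo").getOrCreate()

df = spark.createDataFrame(
    [(1, 120.0, "IN"), (2, -5.0, "US"), (3, 40.0, "DE")],
    ["txn_id", "amount", "country"],
)

# Each rule is (name, SQL predicate); rows are flagged rather than dropped,
# so downstream jobs can route failures separately.
rules = [
    ("non_negative_amount", "amount >= 0"),
    ("supported_country", "country IN ('IN', 'US')"),
]

for name, predicate in rules:
    df = df.withColumn(f"rule_{name}", F.expr(predicate))

df.show()
```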

Posted 2 weeks ago

Apply

5.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

- Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Experienced in developing efficient software code for multiple use cases, leveraging the Spark framework with Python or Scala and the Big Data technologies built on the platform.
- Experience in developing streaming pipelines.
- Experience working with Hadoop / AWS ecosystem components (Apache Spark, Kafka, cloud computing, etc.) to implement scalable solutions that meet ever-increasing data volumes.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- 5-7+ years of total experience in data management (DW, DL, data platform, lakehouse) and data engineering skills.
- Minimum 4+ years of experience in Big Data technologies, with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on AWS.
- Exposure to streaming solutions and message brokers such as Kafka.
- Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB.
- Good to excellent SQL skills.

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark certified developers.
- AWS S3, Redshift, and EMR for data storage and distributed processing.
- AWS Lambda, AWS Step Functions, and AWS Glue to build serverless, event-driven data workflows and orchestrate ETL processes.

Posted 2 weeks ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Mumbai

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating source-to-target pipelines/workflows and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark, Scala, Python, HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience using Python to develop a custom framework for generating rules (like a rules engine).
- Developed Python code to gather data from HBase and designed solutions implemented with PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.

Preferred technical and professional experience:
- Understanding of DevOps.
- Experience in building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

5 - 9 Lacs

Chennai

Work from Office

Full Stack Engineer: React.js, Node.js, Next, Nest, Express. At least 6 years of relevant experience, and 8-10 years overall, for a Full Stack Engineer (React/Node.js).

Technical skills required:
- Cloud platform: Amazon Web Services (AWS), covering S3, IAM roles and policies, Lambda, API Gateway, Cognito user pools, and CloudWatch
- Programming languages: React.js, Node.js
- Databases: PostgreSQL, MongoDB, AWS DynamoDB
- Scripting languages: JavaScript, TypeScript, HTML, XML
- Application servers: Tomcat 6.0/7.0, Nginx 1.23.2
- Frameworks: Next.js, Nest.js, Express.js
- Version control systems: GitLab

Posted 2 weeks ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Responsibilities: A day in the life of an Infoscion: As part of the Infosys delivery team, your primary role would be to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams towards developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes.

- Design, deployment, and implementation of FSxN storage (multi-AZ) for both SAN and NAS
- Automate the end-to-end migration process
- Data migration tools like Cloud Sync or DataSync
- Well versed in scripting or tools like Python or Terraform (preferred)
- Drive the storage strategy for optimization and modernization in terms of cost and efficiency
- Good understanding of AWS storage services like EFS, S3, EBS, and FSx; able to modernize these services and applications, suggest how to optimize cost given how much these storage services consume (e.g., whether data can be archived), and help integrate storage services in AWS

Technical and professional requirements: Amazon EBS, Amazon EFS and FSx, AWS Application Migration Service (MGN), AWS CloudWatch, AWS Cloud Migration Factory, AWS Step Functions, Amazon EBS Multi-Attach

Preferred skills: Storage Technology -> Backup Administration -> Backup Technologies; Technology -> Cloud Platform -> AWS App Development -> CloudWatch; Technology -> Infrastructure-Storage-Administration -> Cisco-Storage Admin -> Storage

Generic skills: Technology -> Cloud Platform -> AWS Core Services -> Amazon Elastic Compute Cloud (EC2)

Additional responsibilities: Storage Architect with a good understanding of AWS storage services like EFS, S3, EBS, and FSx.

Educational requirements: Master of Engineering, Master of Technology, Bachelor of Comp. Applications, Bachelor of Science, Bachelor of Engineering, Bachelor of Technology

Service line: Cloud & Infrastructure Services

* Location of posting is subject to business requirements

Posted 2 weeks ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Pune

Work from Office

About the company: Our esteemed client is a leading global systems integrator and business transformation consulting organisation. Our client helps companies innovate and transform by leveraging its unique insights, differentiated services, and flexible partnering models. The company is among the top mobile application development companies in India, and also a pioneer in web application development and automation testing. They have gained the trust of more than 300 offshore clients from 30+ countries worldwide and have become a trustworthy software partner.

Job overview: We are seeking a skilled Senior Python Developer with extensive experience in Python development and hands-on expertise in AWS cloud services. The ideal candidate will play a crucial role in developing, maintaining, and deploying backend services and cloud infrastructure. This position is primarily focused on Python development, complemented by AWS tasks.

What you will do:

Python development:
- Design, develop, test, and deploy scalable and efficient backend solutions using Python.
- Write clean, maintainable, and well-documented code following best practices.
- Implement APIs, data processing workflows, and integrations with various services.
- Troubleshoot, debug, and optimize existing Python applications for performance and scalability.
- Collaborate with cross-functional teams, including frontend developers, QA engineers, and product managers, to deliver high-quality software.
- Conduct code reviews and mentor junior developers on best practices.

AWS cloud management:
- Design, configure, and maintain AWS infrastructure components such as EC2, S3, Lambda, RDS, and CloudFormation.
- Implement and manage CI/CD pipelines using AWS services like CodePipeline, CodeBuild, and CodeDeploy.
- Monitor and optimize cloud resource usage, ensuring cost-effective and reliable cloud operations.
- Set up security best practices on AWS, including IAM roles, VPC configurations, and data encryption.
- Troubleshoot cloud-related issues, perform regular updates, and apply patches as needed.

What you will bring to the table:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience in Python development.
- 3+ years of experience working with AWS services including EC2, S3, Lambda, RDS, CloudFormation, and Secrets Manager.
- Experience with modern AWS tools and services including API Gateway, DynamoDB, ECS, Amplify, CloudFront, Shield, and OpenSearch (Elasticsearch).
- Strong knowledge of serverless architecture and deployment using the Serverless Framework.
- Proficiency in web frameworks such as Django, Flask, or FastAPI.
- Strong understanding of RESTful API design and integration via API gateways.
- Solid experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB, MongoDB).
- Familiarity with DevOps practices and CI/CD tools.
- Experience with version control systems, particularly Git.
- Strong problem-solving skills and attention to detail.
- Excellent communication skills and ability to work in a team environment.

Perks and benefits: As per industry standard
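
As a flavor of the Python-plus-AWS work described, here is a hedged sketch of a small FastAPI service pulling a database credential from AWS Secrets Manager with boto3; the secret name and JSON field are assumptions.

```python
import json

import boto3
from fastapi import FastAPI

app = FastAPI()
secrets = boto3.client("secretsmanager")

def load_db_password(secret_id: str = "prod/app/db") -> str:
    # Assumes the secret is stored as a JSON blob like {"password": "..."}.
    resp = secrets.get_secret_value(SecretId=secret_id)
    return json.loads(resp["SecretString"])["password"]

@app.get("/health")
def health() -> dict:
    return {"status": "ok"}

# In a real service, load_db_password would feed a connection pool during a
# startup/lifespan hook instead of being called per request.
```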

Posted 2 weeks ago

Apply

6.0 - 10.0 years

11 - 16 Lacs

Bengaluru

Work from Office

8-10 years of Java/J2EE development experience. Spring Boot and MongoDB exposure preferred. Experience in SQL/PLSQL in any database, with technologies like Sybase, DB2, and DynamoDB. Implemented Docker-based microservices and RESTful API architecture.

Posted 2 weeks ago

Apply

16.0 - 20.0 years

25 - 30 Lacs

Noida, Pune, Bengaluru

Work from Office

Java Architect with 15+ years of experience and the ability to architect and design distributed enterprise systems, create the required architecture diagrams, detailed designs, deployment topologies, and communication mechanisms. Hands-on experience with microservices, CQS/CQRS, EDA, and DDD. Hands-on experience in the design and development of microservices using the Spring framework. Good communication skills; able to help/lead a development scrum team of 5-6 members in terms of providing the design and code reviews. Good to have experience on Azure and related native services. An architect with the ability to design distributed enterprise systems, come up with the detailed design for the platform, and work with the team on the implementation.

- 16+ years of professional experience in software development, with a strong focus on Java, Node.js, TypeScript, and IoT.
- Experience in implementing solutions in AWS and IoT.
- Must have executed at least 2 large engagements as a lead.
- Proven experience in architecting and building complex, scalable Node.js applications.
- Experience implementing AWS cloud solutions.
- Good to have: experience with DynamoDB and Elasticsearch.
- Expertise in Node.js, JavaScript, and related frameworks (e.g., Express.js, Nest.js).
- Deep understanding of microservices architecture, API design, and integration patterns.
- Experience with databases (e.g., MongoDB, PostgreSQL) and caching systems.
- Experience in implementing micro-frontend architecture.
- Strong knowledge of front-end technologies (e.g., React, Angular, Vue.js) is a plus.
- Strong communication and interpersonal skills.
- Experience working in an Agile/Scrum development environment.

Key responsibilities: Provide technical leadership and expertise in designing and implementing.

Posted 2 weeks ago

Apply

12.0 - 16.0 years

20 - 25 Lacs

Pune

Work from Office

Overall 13-14 years of experience.
- Fluency in software architecture, software development, and systems testing
- Technical guidance and decision-making skills
- Ability to shape the solution and enforce development practices
- Quality gates: code reviews, pair programming, team review meetings
- Complementary tech skills / relevant development experience is a must
- Understanding of code management and release approaches (must have)
- Understanding of CI/CD pipelines, GitFlow and GitHub, GitOps (Flux, ArgoCD) (must have; Flux is good to have)
- Good understanding of functional programming (Python primary / Golang secondary, as used in the IaC platform)
- Understanding of ABAC / RBAC / JWT / SAML / AAD / OIDC authorization and authentication (hands-on and direction)
- NoSQL databases, i.e., DynamoDB (SCC heavy)
- Event-driven architecture: queues, streams, batches, pub/sub
- Understanding of functional programming: list / map / reduce / compose (familiarity with monads is needed)

Posted 2 weeks ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

8-10 years of Java/J2EE development experience. Spring Boot and MongoDB exposure preferred. Experience in SQL/PLSQL in any database, with technologies like Sybase, DB2, and DynamoDB. Implemented Docker-based microservices and RESTful API architecture.

Posted 2 weeks ago

Apply

10.0 - 15.0 years

13 - 17 Lacs

Pune

Work from Office

Communication and leadership: supervise team members, delegate tasks, give feedback, evaluate risks, and resolve conflicts. Project and crisis management. Problem solving and innovation. Ownership and vision.

Tech skills:
- Fluency in software architecture, software development, and systems testing
- Technical guidance and decision-making skills
- Ability to shape the solution and enforce development practices
- Quality gates: code reviews, pair programming, team review meetings
- Complementary tech skills / relevant development experience is a must
- Understanding of code management and release approaches (must have)
- Understanding of CI/CD pipelines, GitFlow and GitHub, GitOps (Flux, ArgoCD) (must have; Flux is good to have)
- Good understanding of functional programming (Python primary / Golang secondary, as used in the IaC platform)
- Understanding of ABAC / RBAC / JWT / SAML / AAD / OIDC authorization and authentication (hands-on and direction)
- NoSQL databases, i.e., DynamoDB (SCC heavy)
- Event-driven architecture: queues, streams, batches, pub/sub
- Understanding of functional programming: list / map / reduce / compose (familiarity with monads is needed; illustrated in the sketch below)
- Fluent in operating Kubernetes clusters from a dev perspective; creating custom CRDs, operators, controllers
- Experience in creating serverless solutions on AWS and Azure (both needed)
- Monorepo / multirepo; understanding of code management approaches
- Understanding of scalability and concurrency
- Understanding of networks, Direct Connect connectivity, proxies
- Deep knowledge of AWS cloud (org / networks / security / IAM); basic understanding of Azure cloud
- Understanding of SDLC and development principles: DRY, KISS, SOLID
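
Since the listing calls out the list/map/reduce/compose vocabulary, here is a small plain-Python illustration; the `compose` helper is hand-rolled, not a standard-library function.

```python
from functools import reduce

def compose(*fns):
    """Right-to-left composition: compose(f, g)(x) == f(g(x))."""
    return reduce(lambda f, g: lambda x: f(g(x)), fns)

double = lambda x: x * 2
increment = lambda x: x + 1

print(compose(increment, double)(10))      # increment(double(10)) -> 21

values = [1, 2, 3, 4]
print(list(map(double, values)))           # [2, 4, 6, 8]
print(reduce(lambda a, b: a + b, values))  # 10
```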

Posted 2 weeks ago

Apply

10.0 - 15.0 years

9 - 14 Lacs

Noida, Bhubaneswar, Bengaluru

Work from Office

Flexible to adopt different technologies and solutions. Proficiency in application design and development.

Tech stack:
- Angular, Java, Spring Boot, MySQL
- Microservices and event-driven architecture
- Front-end integration experience
- AWS Cloud proficiency

Good to have:
- NoSQL knowledge: MongoDB, DynamoDB
- Node.js for serverless programming

Posted 2 weeks ago

Apply

4.0 - 9.0 years

9 - 14 Lacs

Noida, Bhubaneswar, Pune

Work from Office

- 4+ years of experience as an IoT developer
- Must have experience on AWS Cloud: IoT Core, Kinesis, DynamoDB, API Gateway
- Expertise in creating applications by integrating with various AWS services
- Must have worked on at least one IoT implementation on AWS
- Ability to work in Agile delivery

Skills: Java, AWS Certified (Developer), MQTT, AWS IoT Core, Node.js
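
A minimal sketch of publishing device telemetry to AWS IoT Core with boto3 (in Python rather than the Java/Node.js the ad lists); the topic name and payload shape are hypothetical.

```python
import json

import boto3

# The "iot-data" client talks to the IoT Core data plane; in practice you may
# need to point it at your account's IoT endpoint via endpoint_url.
client = boto3.client("iot-data")

def publish_reading(device_id: str, temperature: float) -> None:
    client.publish(
        topic=f"devices/{device_id}/telemetry",  # placeholder topic
        qos=1,
        payload=json.dumps({"device_id": device_id,
                            "temperature": temperature}),
    )

publish_reading("sensor-42", 23.5)
```

From there, an IoT rule would typically fan messages out to Kinesis or DynamoDB, matching the services named in the ad.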

Posted 2 weeks ago

Apply

7.0 - 10.0 years

10 - 14 Lacs

Gurugram, Bengaluru

Work from Office

We are looking for an experienced Senior Big Data Developer to join our team and help build and optimize high-performance, scalable, and resilient data processing systems. You will work in a fast-paced startup environment, handling highly loaded systems and developing data pipelines that process billions of records in real time. As a key member of the Big Data team, you will be responsible for architecting and optimizing distributed systems, leveraging modern cloud-native technologies, and ensuring high availability and fault tolerance in our data infrastructure.

Primary responsibilities:
- Design, develop, and maintain real-time and batch processing pipelines using Apache Spark, Kafka, and Kubernetes.
- Architect high-throughput distributed systems that handle large-scale data ingestion and processing.
- Work extensively with AWS services, including Kinesis, DynamoDB, ECS, S3, and Lambda.
- Manage and optimize containerized workloads using Kubernetes (EKS) and ECS.
- Implement Kafka-based event-driven architectures to support scalable, low-latency applications.
- Ensure high availability, fault tolerance, and resilience of data pipelines.
- Work with MySQL, Elasticsearch, Aerospike, Redis, and DynamoDB to store and retrieve massive datasets efficiently.
- Automate infrastructure provisioning and deployment using Terraform, Helm, or CloudFormation.
- Optimize system performance, monitor production issues, and ensure efficient resource utilization.
- Collaborate with data scientists, backend engineers, and DevOps teams to support advanced analytics and machine learning initiatives.
- Continuously improve and modernize the data architecture to support growing business needs.

Required skills:
- 7-10+ years of experience in big data engineering or distributed systems development.
- Expert-level proficiency in Scala, Java, or Python.
- Deep understanding of Kafka, Spark, and Kubernetes in large-scale environments.
- Strong hands-on experience with AWS (Kinesis, DynamoDB, ECS, S3, etc.).
- Proven experience working with highly loaded, low-latency distributed systems.
- Experience with Kafka, Kinesis, Flink, or other streaming technologies for event-driven architectures.
- Expertise in SQL and database optimization for MySQL, Elasticsearch, and NoSQL stores.
- Strong experience in automating infrastructure using Terraform, Helm, or CloudFormation.
- Experience managing production-grade Kubernetes clusters (EKS).
- Deep knowledge of performance tuning, caching strategies, and data consistency models.
- Experience working in a startup environment, adapting to rapid changes and building scalable solutions from scratch.

Nice to have:
- Experience with machine learning pipelines and AI-driven analytics.
- Knowledge of workflow orchestration tools such as Apache Airflow.
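
A skeletal example of the Kafka-to-Spark Structured Streaming pipeline described above, assuming a placeholder broker, topic, schema, and S3 paths; running it also requires the spark-sql-kafka connector on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import DoubleType, StringType, StructField, StructType

spark = SparkSession.builder.appName("events-stream").getOrCreate()

schema = StructType([
    StructField("event_id", StringType()),
    StructField("value", DoubleType()),
])

# Consume a Kafka topic and parse the JSON payload (all names are placeholders).
events = (
    spark.readStream.format("kafka")
        .option("kafka.bootstrap.servers", "broker:9092")
        .option("subscribe", "events")
        .load()
        .select(F.from_json(F.col("value").cast("string"), schema).alias("e"))
        .select("e.*")
)

# Land the stream as Parquet; the checkpoint enables recovery after failure.
query = (
    events.writeStream.format("parquet")
        .option("path", "s3://example-bucket/streams/events/")
        .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
        .start()
)
query.awaitTermination()
```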

Posted 2 weeks ago

Apply

1.0 - 6.0 years

5 - 9 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Join our dynamic team, where we innovate IT automation and visibility solutions to empower businesses. We're looking for a Full-Stack Developer to work across the stack, building scalable, high-performance applications and crafting delightful user experiences.

Responsibilities:
- Design, develop, and maintain scalable web applications across the front-end and back-end.
- Collaborate with product managers and designers to deliver user-friendly interfaces and seamless user experiences.
- Build, test, and optimize RESTful APIs and GraphQL endpoints.
- Maintain and optimize database systems (SQL/NoSQL).
- Write clean, maintainable, and testable code to ensure long-term reliability.
- Monitor performance and debug issues across the stack.

Qualifications:
- Proficiency in JavaScript/TypeScript and frameworks like React, Vue, or Angular.
- Expertise in backend development using Node.js, Python, or Ruby.
- Solid understanding of database systems like PostgreSQL, MongoDB, or DynamoDB.
- Experience with cloud platforms (AWS, Azure, or GCP) and version control systems (Git).
- Familiarity with CI/CD pipelines and containerization tools like Docker.
- Strong problem-solving skills and a team-oriented mindset.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

25 - 40 Lacs

Pune

Remote

10+ years of experience in software development with a focus on AWS solutions architecture. Experience in architecting microservices-based applications using EKS. Design, develop, and implement microservices applications on AWS using Java. AWS Certified Solutions Architect certification is a must.

Posted 2 weeks ago

Apply

4.0 - 9.0 years

12 - 22 Lacs

Gurugram

Work from Office

To apply, submit details via Google Form: https://forms.gle/8SUxUV2cikzjvKzD9

As a Senior Consultant in our Consulting team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.

Role & responsibilities:
1. Design and implement scalable, high-performance data pipelines using AWS services
2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda
3. Build and maintain data lakes using S3 and Delta Lake
4. Create and manage analytics solutions using Amazon Athena and Redshift
5. Design and implement database solutions using Aurora, RDS, and DynamoDB
6. Develop serverless workflows using AWS Step Functions
7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL
8. Ensure data quality, security, and compliance with industry standards
9. Collaborate with data scientists and analysts to support their data needs
10. Optimize data architecture for performance and cost-efficiency
11. Troubleshoot and resolve data pipeline and infrastructure issues

Preferred candidate profile:
1. Bachelor's degree in Computer Science, Information Technology, or a related field
2. Relevant years of experience as a Data Engineer, with at least 60% of experience focusing on AWS
3. Strong proficiency in AWS data services: Glue, EMR, Lambda, Athena, Redshift, S3
4. Experience with data lake technologies, particularly Delta Lake
5. Expertise in database systems: Aurora, RDS, DynamoDB, PostgreSQL
6. Proficiency in Python and PySpark programming
7. Strong SQL skills and experience with PostgreSQL
8. Experience with AWS Step Functions for workflow orchestration

Technical skills:
- AWS services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions
- Big data: Hadoop, Spark, Delta Lake
- Programming: Python, PySpark
- Databases: SQL, PostgreSQL, NoSQL
- Data warehousing and analytics
- ETL/ELT processes
- Data lake architectures
- Version control: Git
- Agile methodologies
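
For reference, the standard skeleton of an AWS Glue PySpark job like those in point 2 of the responsibilities; the catalog database, table, and output path are placeholders, and the awsglue modules are only available inside the Glue runtime.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Glue passes job parameters (at minimum JOB_NAME) on the command line.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext())
spark = glue_context.spark_session
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog, filter, and write curated Parquet to S3.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders")  # placeholder catalog names
df = dyf.toDF().filter("amount > 0")
df.write.mode("overwrite").parquet("s3://example-bucket/curated/orders/")

job.commit()
```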

Posted 2 weeks ago

Apply

4.0 - 9.0 years

12 - 22 Lacs

Gurugram, Bengaluru

Work from Office

To apply, submit details via Google Form: https://forms.gle/8SUxUV2cikzjvKzD9

As a Senior Consultant in our Consulting team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.

Role & responsibilities:
1. Design and implement scalable, high-performance data pipelines using AWS services
2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda
3. Build and maintain data lakes using S3 and Delta Lake
4. Create and manage analytics solutions using Amazon Athena and Redshift
5. Design and implement database solutions using Aurora, RDS, and DynamoDB
6. Develop serverless workflows using AWS Step Functions
7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL
8. Ensure data quality, security, and compliance with industry standards
9. Collaborate with data scientists and analysts to support their data needs
10. Optimize data architecture for performance and cost-efficiency
11. Troubleshoot and resolve data pipeline and infrastructure issues

Preferred candidate profile:
1. Bachelor's degree in Computer Science, Information Technology, or a related field
2. Relevant years of experience as a Data Engineer, with at least 60% of experience focusing on AWS
3. Strong proficiency in AWS data services: Glue, EMR, Lambda, Athena, Redshift, S3
4. Experience with data lake technologies, particularly Delta Lake
5. Expertise in database systems: Aurora, RDS, DynamoDB, PostgreSQL
6. Proficiency in Python and PySpark programming
7. Strong SQL skills and experience with PostgreSQL
8. Experience with AWS Step Functions for workflow orchestration

Technical skills:
- AWS services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions
- Big data: Hadoop, Spark, Delta Lake
- Programming: Python, PySpark
- Databases: SQL, PostgreSQL, NoSQL
- Data warehousing and analytics
- ETL/ELT processes
- Data lake architectures
- Version control: Git
- Agile methodologies

Posted 2 weeks ago

Apply

6.0 - 10.0 years

11 - 12 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled Java Full Stack Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using Java and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in Java programming, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 6 to 10+ years of experience in full-stack development, with a strong focus on Java.

Roles & responsibilities:
- Develop scalable web applications using Java (Spring Boot) for the backend and React/Angular for the frontend.
- Implement RESTful APIs to facilitate communication between frontend and backend.
- Design and manage databases using MySQL, PostgreSQL, Oracle, or MongoDB.
- Write complex SQL queries and procedures, and perform database optimization.
- Build responsive, user-friendly interfaces using HTML, CSS, and JavaScript, with frameworks like Bootstrap, React, and Angular, plus Node.js and Python integration.
- Integrate APIs with frontend components.
- Participate in designing microservices and modular architecture.
- Apply design patterns and object-oriented programming (OOP) concepts.
- Write unit and integration tests using JUnit, Mockito, Selenium, or Cypress.
- Debug and fix bugs across full-stack components.
- Use Git, Jenkins, Docker, and Kubernetes for version control, continuous integration, and deployment.
- Participate in code reviews, automation, and monitoring.
- Deploy applications on AWS, Azure, or Google Cloud platforms; use Elastic Beanstalk, EC2, S3, or Cloud Run for backend hosting.
- Work in Agile/Scrum teams; attend daily stand-ups, sprints, and retrospectives; deliver iterative enhancements.
- Document code, APIs, and configurations.
- Collaborate with QA, DevOps, product owners, and other stakeholders.

Must-have skills:
- Java programming: deep knowledge of the Java language, its ecosystem, and best practices.
- Frontend technologies: proficiency in HTML, CSS, JavaScript, and modern frontend frameworks like React or Angular.
- Backend development: expertise in developing and maintaining backend services using Java, Spring, and related technologies.
- Full-stack development: experience in both frontend and backend development, with the ability to work across the entire application stack.

Soft skills:
- Problem-solving: ability to analyze complex problems and develop effective solutions.
- Communication: strong verbal and written communication skills to collaborate effectively with cross-functional teams.
- Analytical thinking: ability to think critically and analytically to solve technical challenges.
- Time management: capable of managing multiple tasks and deadlines in a fast-paced environment.
- Adaptability: ability to quickly learn and adapt to new technologies and methodologies.

Hard skills:
- Java programming: expert-level knowledge of Java and its application in full-stack development.
- Frontend technologies: proficient in HTML, CSS, JavaScript, and frameworks like React or Angular.
- Backend development: skilled in Java, Spring, Hibernate, and RESTful services.
- Full-stack development: experience in developing end-to-end solutions, from the user interface to backend services and databases.

Interview mode: F2F for candidates residing in Hyderabad / Zoom for other states
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2-4 pm

Posted 2 weeks ago

Apply

Exploring Dynamo DB Jobs in India

With the increasing demand for cloud-based solutions, Dynamo DB jobs in India are on the rise. Companies are looking for skilled professionals who can manage and optimize their NoSQL databases effectively. If you are a job seeker interested in pursuing a career in Dynamo DB, here is a guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

These cities are known for their thriving tech industries and have a high demand for Dynamo DB professionals.

Average Salary Range

The average salary range for Dynamo DB professionals in India varies based on experience levels. Entry-level positions can expect a salary of INR 4-6 lakhs per annum, while experienced professionals can earn upwards of INR 15 lakhs per annum.

Career Path

A typical career progression in Dynamo DB may look like this:

  1. Junior Developer
  2. Developer
  3. Senior Developer
  4. Tech Lead

As you gain experience and expertise in Dynamo DB, you can move up the ladder to more senior roles with increased responsibilities.

Related Skills

In addition to Dynamo DB proficiency, employers often look for candidates with the following skills:

  • AWS (Amazon Web Services)
  • NoSQL databases
  • Data modeling
  • Database optimization
  • Query optimization

Having a combination of these skills can make you a more competitive candidate in the job market.

Interview Questions

  • What is Dynamo DB? (basic; several of these concepts are shown hands-on in the code sketch after this list)
  • Explain the difference between SQL and NoSQL databases. (basic)
  • How does Dynamo DB ensure high availability and durability? (medium)
  • What is the partition key in Dynamo DB? (basic)
  • How does Dynamo DB handle data consistency? (medium)
  • Describe the read and write capacity units in Dynamo DB. (medium)
  • What is the difference between Provisioned and On-Demand capacity modes in Dynamo DB? (medium)
  • How can you optimize Dynamo DB performance? (medium)
  • Explain the concepts of item, attribute, and table in Dynamo DB. (basic)
  • How does secondary indexing work in Dynamo DB? (medium)
  • What is the importance of scaling in Dynamo DB? (basic)
  • How does Dynamo DB handle data replication and backups? (medium)
  • Explain the concept of Dynamo DB Streams. (medium)
  • How can you secure your data in Dynamo DB? (medium)
  • What are the different data types supported by Dynamo DB? (basic)
  • Describe the pricing model of Dynamo DB. (medium)
  • How can you monitor and troubleshoot performance issues in Dynamo DB? (medium)
  • What are the best practices for designing Dynamo DB tables? (medium)
  • How does Dynamo DB handle schema changes? (medium)
  • Explain the concept of eventual consistency in Dynamo DB. (medium)
  • How can you automate tasks in Dynamo DB using AWS SDK? (advanced)
  • Describe the differences between Dynamo DB and other NoSQL databases like MongoDB. (medium)
  • What are the limitations of Dynamo DB? (basic)
  • How does Dynamo DB handle partition keys with high request rates? (medium)
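
Several of the questions above (partition and sort keys, capacity modes, consistency) come together in one short boto3 sketch; the table and attribute names are illustrative only.

```python
import boto3

dynamodb = boto3.resource("dynamodb")

# Composite primary key: partition (HASH) key plus sort (RANGE) key, created
# in on-demand mode; the alternative BillingMode is "PROVISIONED" with
# explicit read/write capacity units.
table = dynamodb.create_table(
    TableName="Orders",
    KeySchema=[
        {"AttributeName": "customer_id", "KeyType": "HASH"},
        {"AttributeName": "order_id", "KeyType": "RANGE"},
    ],
    AttributeDefinitions=[
        {"AttributeName": "customer_id", "AttributeType": "S"},
        {"AttributeName": "order_id", "AttributeType": "S"},
    ],
    BillingMode="PAY_PER_REQUEST",
)
table.wait_until_exists()

table.put_item(Item={"customer_id": "c1", "order_id": "o1", "total": 250})

# Reads are eventually consistent by default; ConsistentRead opts in to a
# strongly consistent read at a higher read-capacity cost.
resp = table.get_item(Key={"customer_id": "c1", "order_id": "o1"},
                      ConsistentRead=True)
print(resp["Item"])
```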

Closing Remark

As you prepare for Dynamo DB job interviews, make sure to brush up on your technical skills and be ready to showcase your understanding of NoSQL databases and cloud computing. With the right preparation and confidence, you can land a rewarding job in the thriving tech industry in India. Good luck!
