
762 Dynamo Db Jobs - Page 19

JobPe aggregates listings for easy access, but applications are submitted directly on the original job portal.

4.0 - 9.0 years

35 - 50 Lacs

Bengaluru

Work from Office

Skill: AWS / Amazon Connect Developer / Lead
Experience: 3 - 10 years
Location: PAN India
Job Description:
- Strong experience in contact center development
- Experience creating Amazon Connect (AC) flows, Lex chatbots, and Lambda functions
- Java / Node.js architect with knowledge of the AWS environment; design and develop APIs (REST and SOAP services)
- Knowledge of AWS Lambda services and familiarity with the AWS environment and ecosystem
- Knowledge of Spring, Maven, Hibernate
- Knowledge of database technologies such as MySQL, SQL Server, DB2, or RDS
- Application development experience in any of Java, C#, Node.js, Python, PHP

Posted 2 months ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Drupal
Good-to-have skills: NA
Minimum 12 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. A typical day involves collaborating with various stakeholders to gather insights, analyzing user needs, and translating them into functional specifications. You will engage in discussions with team members to brainstorm innovative solutions and ensure that the applications align with business objectives. Your role will also include reviewing design documents and providing feedback to enhance application performance and user experience, all while maintaining a focus on quality and efficiency. You must have knowledge of Adobe Analytics; PHP, Laravel, Drupal; HTML, CSS; JavaScript, Stencil.js, Vue.js; React; Python; Auth0, Terraform; Azure, Azure ChatGPT; GenAI basics; AWS SAM (Lambda), AWS EC2, AWS S3, AWS RDS, AWS DynamoDB, AWS SNS, AWS SQS, AWS SES; Cloudflare, Cloudflare Workers; REST APIs; GitHub; web servers; SQL.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate knowledge-sharing sessions to enhance team capabilities.
- Monitor project progress and ensure alignment with business goals.

Professional & Technical Skills:
- Must-have skills: proficiency in Drupal.
- Strong understanding of web development principles and best practices.
- Experience with content management systems and their implementation.
- Familiarity with front-end technologies such as HTML, CSS, and JavaScript.
- Ability to troubleshoot and resolve application issues efficiently.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Drupal.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

Posted 2 months ago

Apply

12.0 - 15.0 years

5 - 9 Lacs

Navi Mumbai

Work from Office

Project Role: Application Designer
Project Role Description: Assist in defining requirements and designing applications to meet business process and application requirements.
Must-have skills: Drupal
Good-to-have skills: NA
Minimum 12 years of experience is required
Educational Qualification: 15 years of full-time education

Summary: As an Application Designer, you will assist in defining requirements and designing applications to meet business process and application requirements. Your typical day will involve collaborating with various stakeholders to gather insights, analyzing user needs, and translating them into functional specifications. You will also engage in discussions with team members to ensure that the design aligns with business objectives and technical feasibility, while continuously iterating on your designs based on feedback and testing outcomes. Your role will be pivotal in ensuring that the applications developed are user-friendly, efficient, and meet the highest standards of quality. You must have knowledge of Adobe Analytics; PHP, Laravel, Drupal; HTML, CSS; JavaScript, Vue.js; React; Python; GenAI basics; AWS SAM (Lambda), AWS EC2, AWS S3, AWS RDS, AWS DynamoDB, AWS SNS, AWS SQS, AWS SES; Cloudflare; REST APIs; GitHub; web servers; SQL.

Roles & Responsibilities:
- Expected to be an SME.
- Collaborate with and manage the team to perform.
- Responsible for team decisions.
- Engage with multiple teams and contribute to key decisions.
- Expected to provide solutions to problems that apply across multiple teams.
- Facilitate workshops and brainstorming sessions to foster innovative solutions.
- Mentor junior team members to enhance their skills and knowledge.

Professional & Technical Skills:
- Must-have skills: proficiency in Drupal.
- Strong understanding of application design principles and methodologies.
- Experience with front-end technologies such as HTML, CSS, and JavaScript.
- Familiarity with database management systems and data modeling.
- Ability to create and maintain technical documentation.

Additional Information:
- The candidate should have a minimum of 12 years of experience in Drupal.
- This position is based at our Mumbai office.
- 15 years of full-time education is required.

Posted 2 months ago

Apply

5.0 - 8.0 years

3 - 7 Lacs

Bengaluru

Work from Office

Roles and Responsibilities:
- Experience in AWS Glue
- Experience with one or more of the following: Spark, Scala, Python, and/or R
- Experience in API development with Node.js
- Experience with AWS (S3, EC2) or another cloud provider
- Experience with data virtualization tools such as Dremio and Athena is a plus
- Technically proficient in big data concepts
- Technically proficient in Hadoop and NoSQL (MongoDB)
- Good communication and documentation skills

Posted 2 months ago

Apply

4.0 - 9.0 years

12 - 16 Lacs

Kochi

Work from Office

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies.
- Experience in developing streaming pipelines.
- Experience working with Hadoop / AWS ecosystem components to implement scalable solutions for ever-increasing data volumes, using big data / cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Minimum 4+ years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on AWS.
- Experience with AWS EMR, AWS Glue, Databricks, AWS Redshift, DynamoDB.
- Good to excellent SQL skills.
- Exposure to streaming solutions and message brokers such as Kafka.

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark Certified Developer.
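Listings like this describe the same ingest -> process -> load pipeline shape. A minimal sketch of that shape in plain Python generators (standing in for Spark stages; all function and field names here are illustrative, not from the posting):

```python
def ingest(lines):
    """Parse raw CSV-like lines into field lists (stand-in for reading files/streams)."""
    for line in lines:
        yield line.strip().split(",")

def transform(rows):
    """Type and reshape each record (stand-in for a Spark map/withColumn stage)."""
    for name, qty in rows:
        yield {"name": name, "qty": int(qty)}

def load(records):
    """Materialize the result (stand-in for writing to a table or HDFS)."""
    return list(records)

result = load(transform(ingest(["a,1", "b,2"])))
```

Because each stage is a generator, records flow through lazily, which is the same streaming-friendly structure the real frameworks impose.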

Posted 2 months ago

Apply

4.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

As a Software Developer, you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation.
- Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and solving them per defined SLAs.
- Continuous learning and technology integration: being eager to learn new technologies and applying them in feature development.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Creative problem-solving skills and superb communication skills.
- Container-based solutions.
- Strong experience with Node.js and the AWS stack: AWS Lambda, AWS API Gateway, AWS CDK, AWS DynamoDB, AWS SQS.
- Experience with infrastructure as code using AWS CDK.
- Expertise in encryption and decryption techniques for securing APIs, and in API authentication and authorization.
- Experience with Lambda and API Gateway is the primary requirement.
- Candidates holding the AWS Certified Cloud Practitioner or AWS Certified Developer - Associate certification will be preferred.

Preferred technical and professional experience:
- Experience in distributed/scalable systems.
- Knowledge of standard tools for optimizing and testing code.
- Knowledge/experience of the development/build/deploy/test lifecycle.
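For the Lambda + API Gateway + DynamoDB stack this role centers on, a handler is conceptually a function from an API Gateway event to a response. A minimal sketch (in Python rather than the posting's Node.js; the `table` parameter is injected here purely for testability, whereas a real handler takes only `(event, context)` and obtains a boto3 Table resource itself; all names are illustrative):

```python
import json

def create_item_handler(event, context, table):
    """Lambda-style handler: store the POSTed JSON body as a DynamoDB item.
    `table` mimics a boto3 DynamoDB Table resource (put_item keyword API)."""
    body = json.loads(event["body"])
    table.put_item(Item={"pk": body["id"], "payload": body})
    return {"statusCode": 201, "body": json.dumps({"id": body["id"]})}

class FakeTable:
    """In-memory stand-in for a DynamoDB table, keyed on the partition key."""
    def __init__(self):
        self.items = {}

    def put_item(self, Item):
        self.items[Item["pk"]] = Item

table = FakeTable()
event = {"body": json.dumps({"id": "42", "name": "widget"})}
response = create_item_handler(event, None, table)
```

Injecting the table dependency is also how such handlers are commonly unit-tested without touching AWS.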

Posted 2 months ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Pune

Work from Office

The Developer leads cloud application development/deployment for the client based on AWS development methodology, tools, and best practices. The developer's responsibility is to lead the execution of a project by working with a senior-level resource on assigned development/deployment activities, and to design, build, and maintain cloud environments focusing on uptime, access, control, and network security using automation and configuration management tools.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Experience with technologies such as Spring Boot and Java.
- Demonstrated technical leadership experience on high-impact, customer-facing projects.
- Experience in building web applications on the Java/J2EE stack.
- Experience with UI frameworks such as React.
- Working knowledge of a messaging system (Kafka preferred).
- Experience designing and integrating REST APIs using Spring Boot.

Preferred technical and professional experience:
- Strong experience in concurrent design and multi-threading; object-oriented programming (OOP); SQL Server / Oracle / MySQL.
- Working knowledge of Azure or AWS cloud.
- Preferred: experience building applications in a container-based environment (Docker/Kubernetes) on AWS Cloud.
- Basic knowledge of SQL or NoSQL database (Postgres, MongoDB, DynamoDB preferred) design and queries.

Posted 2 months ago

Apply

3.0 - 7.0 years

10 - 14 Lacs

Pune

Work from Office

As a Software Developer, you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation.
- Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and features, and solving them per defined SLAs.
- Continuous learning and technology integration: being eager to learn new technologies and applying them in feature development.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Experience with technologies such as Spring Boot and Java.
- Demonstrated technical leadership experience on high-impact, customer-facing projects.
- Experience in building web applications on the Java/J2EE stack.
- Experience with UI frameworks such as React.
- Working knowledge of a messaging system (Kafka preferred).
- Experience designing and integrating REST APIs using Spring Boot.

Preferred technical and professional experience:
- Strong experience in concurrent design and multi-threading; object-oriented programming (OOP); SQL Server / Oracle / MySQL.
- Working knowledge of Azure or AWS cloud.
- Preferred: experience building applications in a container-based environment (Docker/Kubernetes) on AWS Cloud.
- Basic knowledge of SQL or NoSQL database (Postgres, MongoDB, DynamoDB preferred) design and queries.

Posted 2 months ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Pune

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Must have 5+ years of experience in big data: Hadoop, Spark, Scala, Python, HBase, Hive.
- Good to have AWS: S3, Athena, DynamoDB, Lambda, Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience with Python to develop a custom framework for generating rules (like a rules engine).
- Developed Python code to gather data from HBase and designed solutions implemented with PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.

Preferred technical and professional experience:
- Understanding of DevOps.
- Experience in building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
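The "custom framework for generating rules (like a rules engine)" mentioned here is, at its core, a set of named predicates applied to every record. A toy pure-Python sketch of that idea (in the posting's context this would run inside PySpark transformations; all rule names and fields are illustrative):

```python
# Each rule is a (name, predicate) pair; a record fails a rule when the predicate is False.
RULES = [
    ("non_null_id", lambda rec: rec.get("id") is not None),
    ("positive_amount", lambda rec: rec.get("amount", 0) > 0),
]

def apply_rules(records, rules):
    """Annotate every record with the names of the rules it violates."""
    return [
        {**rec, "failed_rules": [name for name, pred in rules if not pred(rec)]}
        for rec in records
    ]

checked = apply_rules(
    [{"id": 1, "amount": 10}, {"id": None, "amount": -5}],
    RULES,
)
```

Keeping rules as data (name + predicate) is what makes such a framework extensible: new checks are added to the list without touching the engine.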

Posted 2 months ago

Apply

5.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office

As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

- Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on cloud data platforms (AWS) or HDFS.
- Experienced in developing efficient software code for multiple use cases leveraging the Spark framework with Python or Scala and big data technologies.
- Experience in developing streaming pipelines.
- Experience working with Hadoop / AWS ecosystem components to implement scalable solutions for ever-increasing data volumes, using big data / cloud technologies such as Apache Spark and Kafka.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Total 5-7+ years of experience in data management (DW, DL, data platform, lakehouse) and data engineering.
- Minimum 4+ years of experience in big data technologies with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on cloud data platforms on AWS.
- Exposure to streaming solutions and message brokers such as Kafka.
- Experience with AWS EMR, AWS Glue, Databricks, AWS Redshift, DynamoDB.
- Good to excellent SQL skills.

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark Certified Developer.
- AWS S3, Redshift, and EMR for data storage and distributed processing.
- AWS Lambda, AWS Step Functions, and AWS Glue to build serverless, event-driven data workflows and orchestrate ETL processes.

Posted 2 months ago

Apply

5.0 - 10.0 years

14 - 17 Lacs

Mumbai

Work from Office

As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities like creating pipelines/workflows from source to target and implementing solutions that tackle the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing, data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's degree
Preferred education: Master's degree

Required technical and professional expertise:
- Must have 5+ years of experience in big data: Hadoop, Spark, Scala, Python, HBase, Hive.
- Good to have AWS: S3, Athena, DynamoDB, Lambda, Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience with Python to develop a custom framework for generating rules (like a rules engine).
- Developed Python code to gather data from HBase and designed solutions implemented with PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and Hive context objects to perform read/write operations.

Preferred technical and professional experience:
- Understanding of DevOps.
- Experience in building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.

Posted 2 months ago

Apply

6.0 - 11.0 years

5 - 9 Lacs

Chennai

Work from Office

Full Stack Engineer (React.js, Node.js, Next, Nest, Express). At least 6 years of relevant experience, and 8-10 years in total, for a Full Stack Engineer (React/Node.js).

Technical skills required:
- Cloud platform: Amazon Web Services (AWS) - S3, IAM roles and policies, Lambda, API Gateway, Cognito user pools, CloudWatch
- Programming languages: React.js, Node.js
- Databases: PostgreSQL, MongoDB, AWS DynamoDB
- Scripting languages: JavaScript, TypeScript, HTML, XML
- Application servers: Tomcat 6.0/7.0, Nginx 1.23.2
- Frameworks: Next.js, Nest.js, Express.js
- Version control systems: GitLab

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Bengaluru

Work from Office

Responsibilities: A day in the life of an Infoscion. As part of the Infosys delivery team, your primary role is to interface with the client for quality assurance, issue resolution, and ensuring high customer satisfaction. You will understand requirements, create and review designs, validate the architecture, and ensure high levels of service offerings to clients in the technology domain. You will participate in project estimation, provide inputs for solution delivery, conduct technical risk planning, and perform code reviews and unit test plan reviews. You will lead and guide your teams toward developing optimized, high-quality code deliverables, continual knowledge management, and adherence to organizational guidelines and processes.

- Design, deployment, and implementation of FSxN storage (Multi-AZ) for both SAN and NAS.
- Automate the end-to-end migration process using data migration tools like Cloud Sync or DataSync.
- Well versed in scripting or tools like Python or Terraform (preferred).
- Drive the storage strategy for optimization and modernization in terms of cost and efficiency.
- Good understanding of AWS storage services like EFS, S3, EBS, and FSx; able to modernize these services and applications, suggest how to optimize cost (these storage services account for significant spend), advise whether a solution can be archived, and help integrate storage services in AWS.

Technical and Professional Requirements: Amazon EBS, Amazon EFS and FSx, AWS Application Migration Service (MGN), AWS CloudWatch, AWS Cloud Migration Factory, AWS Step Functions, Amazon EBS Multi-Attach

Preferred Skills: Storage Technology->Backup Administration->Backup Technologies; Technology->Cloud Platform->AWS App Development->Cloudwatch; Technology->Infrastructure-Storage-Administration->Cisco-Storage Admin->storage

Generic Skills: Technology->Cloud Platform->AWS Core services->Amazon Elastic Compute Cloud (EC2)

Additional Responsibilities: Storage Architect with a good understanding of AWS storage services like EFS, S3, EBS, FSx.

Educational Requirements: Master of Engineering, Master of Technology, Bachelor of Comp. Applications, Bachelor of Science, Bachelor of Engineering, Bachelor of Technology

Service Line: Cloud & Infrastructure Services

* Location of posting is subject to business requirements

Posted 2 months ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Pune

Work from Office

About the company: Our esteemed client is a leading global systems integrator and business transformation consulting organisation that helps companies innovate and transform by leveraging its unique insights, differentiated services, and flexible partnering models. The company is among the top mobile application development companies in India and a pioneer in web application development and automation testing. It has gained the trust of more than 300 offshore clients from 30+ countries worldwide and has become a trustworthy software partner.

Job Overview: We are seeking a skilled Senior Python Developer with extensive experience in Python development and hands-on expertise in AWS cloud services. The ideal candidate will play a crucial role in developing, maintaining, and deploying backend services and cloud infrastructure. This position is primarily focused on Python development, complemented by AWS tasks.

What you will do:

Python development:
- Design, develop, test, and deploy scalable and efficient backend solutions using Python.
- Write clean, maintainable, and well-documented code following best practices.
- Implement APIs, data processing workflows, and integrations with various services.
- Troubleshoot, debug, and optimize existing Python applications for performance and scalability.
- Collaborate with cross-functional teams, including frontend developers, QA engineers, and product managers, to deliver high-quality software.
- Conduct code reviews and mentor junior developers on best practices.

AWS cloud management:
- Design, configure, and maintain AWS infrastructure components such as EC2, S3, Lambda, RDS, and CloudFormation.
- Implement and manage CI/CD pipelines using AWS services like CodePipeline, CodeBuild, and CodeDeploy.
- Monitor and optimize cloud resource usage, ensuring cost-effective and reliable cloud operations.
- Set up security best practices on AWS, including IAM roles, VPC configurations, and data encryption.
- Troubleshoot cloud-related issues, perform regular updates, and apply patches as needed.

What you will bring to the table:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 5+ years of hands-on experience in Python development.
- 3+ years of experience working with AWS services including EC2, S3, Lambda, RDS, CloudFormation, and Secrets Manager.
- Experience with modern AWS tools and services including API Gateway, DynamoDB, ECS, Amplify, CloudFront, Shield, and OpenSearch (Elasticsearch).
- Strong knowledge of serverless architecture and deployment using the Serverless Framework.
- Proficiency in web frameworks such as Django, Flask, or FastAPI.
- Strong understanding of RESTful API design and integration via API gateways.
- Solid experience with relational databases (e.g., PostgreSQL, MySQL) and NoSQL databases (e.g., DynamoDB, MongoDB).
- Familiarity with DevOps practices and CI/CD tools.
- Experience with version control systems, particularly Git.
- Strong problem-solving skills and attention to detail.
- Excellent communication skills and ability to work in a team environment.

Perks and benefits: As per industry standard.

Posted 2 months ago

Apply

6.0 - 10.0 years

11 - 16 Lacs

Bengaluru

Work from Office

8-10 years of Java/J2EE development experience. Spring Boot and MongoDB exposure preferred. Experience in SQL/PLSQL in any of the listed database technologies: Sybase, DB2, DynamoDB. Has implemented Docker-based microservices and RESTful API architecture.

Posted 2 months ago

Apply

16.0 - 20.0 years

25 - 30 Lacs

Noida, Pune, Bengaluru

Work from Office

Java Architect with 15+ years of experience and the ability to architect and design distributed enterprise systems, and to create the required architecture diagrams, detailed designs, deployment topologies, and communication mechanisms.
- Hands-on experience with microservices, CQS/CQRS, EDA, and DDD.
- Hands-on experience in the design and development of microservices using the Spring framework.
- Good communication skills; help/lead a development scrum team of 5-6 members by providing the design and code reviews.
- Good to have experience with Azure and related native services.
- Architect with the ability to design distributed enterprise systems, come up with a detailed design for the platform, and work with the team on the implementation.
- 16+ years of professional experience in software development, with a strong focus on Java, Node.js, TypeScript, and IoT.
- Experience in implementing solutions in AWS and IoT.
- Must have executed at least 2 large engagements as a lead.
- Proven experience in architecting and building complex, scalable Node.js applications.
- Experience implementing AWS cloud solutions.
- Good to have: experience with DynamoDB and Elasticsearch.
- Expertise in Node.js, JavaScript, and related frameworks (e.g., Express.js, Nest.js).
- Deep understanding of microservices architecture, API design, and integration patterns.
- Experience with databases (e.g., MongoDB, PostgreSQL) and caching systems.
- Experience in implementing micro-frontend architecture.
- Strong knowledge of front-end technologies (e.g., React, Angular, Vue.js) is a plus.
- Strong communication and interpersonal skills.
- Experience working in an Agile/Scrum development environment.

Key Responsibilities: Provide technical leadership and expertise in designing, implementing

Posted 2 months ago

Apply

12.0 - 16.0 years

20 - 25 Lacs

Pune

Work from Office

Overall 13-14 years of experience.
- Fluency in software architecture, software development, and systems testing.
- Technical guidance and decision-making skills.
- Ability to shape the solution and enforce development practices.
- Quality gates: code reviews, pair programming, team review meetings.
- Complementary tech skills / relevant development experience is a must.
- Understanding of code management and release approaches (must have).
- Understanding of CI/CD pipelines, GitFlow and GitHub, GitOps (Flux, ArgoCD) (must have; Flux is good to have).
- Good understanding of functional programming (Python primary / Golang secondary, used in the IaC platform).
- Understanding of ABAC / RBAC / JWT / SAML / AAD / OIDC authorization and authentication (hands-on and direction).
- NoSQL databases, i.e., DynamoDB (SCC heavy).
- Event-driven architecture: queues, streams, batches, pub/sub.
- Understanding of functional programming: list / map / reduce / compose; familiarity with monads where needed.
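The functional-programming line items above (list / map / reduce / compose) can be made concrete in a few lines of Python; this is a generic illustration of the technique, not code from the role:

```python
from functools import reduce

def compose(*fns):
    """Right-to-left function composition: compose(f, g)(x) == f(g(x))."""
    return lambda x: reduce(lambda acc, fn: fn(acc), reversed(fns), x)

# Build a pipeline from small single-purpose functions: keep evens, square, sum.
pipeline = compose(
    sum,
    lambda xs: [x * x for x in xs],
    lambda xs: [x for x in xs if x % 2 == 0],
)
result = pipeline([1, 2, 3, 4])  # [2, 4] -> [4, 16] -> 20
```

The same composition pattern is what interviewers usually probe with these keywords: each stage is pure, so the pipeline is trivially testable and reorderable.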

Posted 2 months ago

Apply

5.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

8-10 years of Java/J2EE development experience. Spring Boot and MongoDB exposure preferred. Experience in SQL/PLSQL in any of the listed database technologies: Sybase, DB2, DynamoDB. Has implemented Docker-based microservices and RESTful API architecture.

Posted 2 months ago

Apply

10.0 - 15.0 years

13 - 17 Lacs

Pune

Work from Office

Communication and leadership: supervise team members, delegate tasks, give feedback, evaluate risks, and resolve conflicts. Project and crisis management. Problem solving and innovation. Ownership and vision.

Tech skills:
- Fluency in software architecture, software development, and systems testing.
- Technical guidance and decision-making skills.
- Ability to shape the solution and enforce development practices.
- Quality gates: code reviews, pair programming, team review meetings.
- Complementary tech skills / relevant development experience is a must.
- Understanding of code management and release approaches (must have).
- Understanding of CI/CD pipelines, GitFlow and GitHub, GitOps (Flux, ArgoCD) (must have; Flux is good to have).
- Good understanding of functional programming (Python primary / Golang secondary, used in the IaC platform).
- Understanding of ABAC / RBAC / JWT / SAML / AAD / OIDC authorization and authentication (hands-on and direction).
- NoSQL databases, i.e., DynamoDB (SCC heavy).
- Event-driven architecture: queues, streams, batches, pub/sub.
- Understanding of functional programming: list / map / reduce / compose; familiarity with monads where needed.
- Fluent in operating Kubernetes clusters from a dev perspective: creating custom CRDs, operators, controllers.
- Experience in creating serverless solutions on AWS and Azure (both needed).
- Monorepo / multirepo: understanding of code management approaches.
- Understanding of scalability and concurrency.
- Understanding of networking, Direct Connect connectivity, proxies.
- Deep knowledge of AWS cloud (org / networks / security / IAM); basic understanding of Azure cloud.
- Understanding of SDLC and development principles: DRY, KISS, SOLID.

Posted 2 months ago

Apply

10.0 - 15.0 years

9 - 14 Lacs

Noida, Bhubaneswar, Bengaluru

Work from Office

Flexible to adopt different technologies and solutions. Proficiency in application design and development.
Tech stack: Angular, Java, Spring Boot, MySQL. Microservices and event-driven architecture. Front-end integration experience. AWS Cloud proficiency.
Good to have: NoSQL knowledge (MongoDB, DynamoDB); Node.js for serverless programming.

Posted 2 months ago

Apply

4.0 - 9.0 years

9 - 14 Lacs

Noida, Bhubaneswar, Pune

Work from Office

- 4+ years of experience as an IoT developer.
- Must have experience on AWS Cloud: IoT Core, Kinesis, DynamoDB, API Gateway.
- Expertise in creating applications by integrating various AWS services.
- Must have worked on at least one IoT implementation on AWS.
- Ability to work in Agile delivery.
Skills: Java, AWS Certified (Developer), MQTT, AWS IoT Core, Node.js
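Since this role calls for MQTT and AWS IoT Core, the central routing concept is topic-filter matching: `+` matches exactly one topic level, and `#` (allowed only as the last segment) matches all remaining levels. A small sketch of that standard matching rule (illustrative, not AWS SDK code):

```python
def topic_matches(topic_filter, topic):
    """Return True if an MQTT topic filter matches a concrete topic.
    '+' matches exactly one level; '#' (last segment) matches the rest."""
    f_parts = topic_filter.split("/")
    t_parts = topic.split("/")
    for i, seg in enumerate(f_parts):
        if seg == "#":          # multi-level wildcard: everything from here matches
            return True
        if i >= len(t_parts):   # filter is longer than the topic
            return False
        if seg != "+" and seg != t_parts[i]:  # literal segment must match exactly
            return False
    return len(f_parts) == len(t_parts)  # no trailing unmatched topic levels
```

For example, `devices/+/telemetry` matches `devices/d1/telemetry` but not `devices/d1/shadow`, while `devices/#` matches both.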

Posted 2 months ago

Apply

7.0 - 10.0 years

10 - 14 Lacs

Gurugram, Bengaluru

Work from Office

We are looking for an experienced Senior Big Data Developer to join our team and help build and optimize high-performance, scalable, and resilient data processing systems. You will work in a fast-paced startup environment, handling highly loaded systems and developing data pipelines that process billions of records in real time. As a key member of the Big Data team, you will be responsible for architecting and optimizing distributed systems, leveraging modern cloud-native technologies, and ensuring high availability and fault tolerance in our data infrastructure.

Primary Responsibilities:
- Design, develop, and maintain real-time and batch processing pipelines using Apache Spark, Kafka, and Kubernetes.
- Architect high-throughput distributed systems that handle large-scale data ingestion and processing.
- Work extensively with AWS services, including Kinesis, DynamoDB, ECS, S3, and Lambda.
- Manage and optimize containerized workloads using Kubernetes (EKS) and ECS.
- Implement Kafka-based event-driven architectures to support scalable, low-latency applications.
- Ensure high availability, fault tolerance, and resilience of data pipelines.
- Work with MySQL, Elasticsearch, Aerospike, Redis, and DynamoDB to store and retrieve massive datasets efficiently.
- Automate infrastructure provisioning and deployment using Terraform, Helm, or CloudFormation.
- Optimize system performance, monitor production issues, and ensure efficient resource utilization.
- Collaborate with data scientists, backend engineers, and DevOps teams to support advanced analytics and machine learning initiatives.
- Continuously improve and modernize the data architecture to support growing business needs.

Required Skills:
- 7-10+ years of experience in big data engineering or distributed systems development.
- Expert-level proficiency in Scala, Java, or Python.
- Deep understanding of Kafka, Spark, and Kubernetes in large-scale environments.
- Strong hands-on experience with AWS (Kinesis, DynamoDB, ECS, S3, etc.).
- Proven experience working with highly loaded, low-latency distributed systems.
- Experience with Kafka, Kinesis, Flink, or other streaming technologies for event-driven architectures.
- Expertise in SQL and database optimizations for MySQL, Elasticsearch, and NoSQL stores.
- Strong experience in automating infrastructure using Terraform, Helm, or CloudFormation.
- Experience managing production-grade Kubernetes clusters (EKS).
- Deep knowledge of performance tuning, caching strategies, and data consistency models.
- Experience working in a startup environment, adapting to rapid changes and building scalable solutions from scratch.

Nice to Have:
- Experience with machine learning pipelines and AI-driven analytics.
- Knowledge of workflow orchestration tools such as Apache Airflow.
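The Spark/Kafka stack this role centers on is far beyond a snippet, but the core real-time aggregation pattern it describes, grouping a stream of timestamped events into fixed tumbling windows, can be sketched in plain Python. All names below are illustrative; this is a toy stand-in for what Spark Structured Streaming does with `groupBy(window(...))`, not any listed framework's API:

```python
from collections import defaultdict

def tumbling_window_counts(events, window_sec):
    """Count events per (window, key) over fixed, non-overlapping windows.

    Each (timestamp, key) event lands in exactly one window
    [start, start + window_sec), where start = ts // window_sec * window_sec.
    """
    counts = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_sec) * window_sec
        counts[(window_start, key)] += 1
    return dict(counts)

# Events at t=0, 3, 5 fall into the window starting at 0; t=12 starts a new one.
events = [(0, "click"), (3, "click"), (5, "view"), (12, "click")]
result = tumbling_window_counts(events, window_sec=10)
```

A production pipeline adds what this sketch omits: out-of-order events, watermarking, and fault-tolerant state, which is exactly why engines like Spark and Flink exist.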

Posted 2 months ago

Apply

1.0 - 6.0 years

5 - 9 Lacs

Kolkata, Mumbai, New Delhi

Work from Office

Join our dynamic team, where we innovate IT automation and visibility solutions to empower businesses. We're looking for a Full-Stack Developer to work across the stack, building scalable, high-performance applications and crafting delightful user experiences.

Responsibilities:
- Design, develop, and maintain scalable web applications across the front-end and back-end.
- Collaborate with product managers and designers to deliver user-friendly interfaces and seamless user experiences.
- Build, test, and optimize RESTful APIs and GraphQL endpoints.
- Maintain and optimize database systems (SQL/NoSQL).
- Write clean, maintainable, and testable code to ensure long-term reliability.
- Monitor performance and debug issues across the stack.

Qualifications:
- Proficiency in JavaScript/TypeScript and frameworks like React, Vue, or Angular.
- Expertise in backend development using Node.js, Python, or Ruby.
- Solid understanding of database systems like PostgreSQL, MongoDB, or DynamoDB.
- Experience with cloud platforms (AWS, Azure, or GCP) and version control systems (Git).
- Familiarity with CI/CD pipelines and containerization tools like Docker.
- Strong problem-solving skills and a team-oriented mindset.
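The "build and test RESTful APIs" responsibility above can be illustrated end to end with nothing but the Python standard library. This is a deliberately minimal sketch (a real service would use a framework such as Express or FastAPI); the `/api/health` route and handler names are invented for the example:

```python
import json
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, HTTPServer

class HealthHandler(BaseHTTPRequestHandler):
    """Tiny REST-style handler exposing a single JSON endpoint."""

    def do_GET(self):
        if self.path == "/api/health":
            body = json.dumps({"status": "ok"}).encode()
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.send_header("Content-Length", str(len(body)))
            self.end_headers()
            self.wfile.write(body)
        else:
            self.send_error(404)

    def log_message(self, *args):  # keep test output quiet
        pass

# Bind to port 0 so the OS picks a free port, serve on a daemon thread,
# then exercise the endpoint the way an integration test would.
server = HTTPServer(("127.0.0.1", 0), HealthHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_port}/api/health"
with urllib.request.urlopen(url) as resp:
    payload = json.loads(resp.read())
server.shutdown()
```

The same request/response shape carries over directly to Node.js or any framework listed in the qualifications; only the routing syntax changes.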

Posted 2 months ago

Apply

8.0 - 13.0 years

25 - 40 Lacs

Pune

Remote

10+ years of experience in software development with a focus on AWS solutions architecture. Experience in architecting microservices-based applications using EKS. Design, develop, and implement microservices applications on AWS using Java. AWS Certified Solutions Architect certification is a must.

Posted 2 months ago

Apply

4.0 - 9.0 years

12 - 22 Lacs

Gurugram

Work from Office

To Apply - Submit Details via Google Form - https://forms.gle/8SUxUV2cikzjvKzD9

As a Senior Consultant in our Consulting team, you'll build and nurture positive working relationships with teams and clients with the intention to exceed client expectations. We are seeking experienced AWS Data Engineers to design, implement, and maintain robust data pipelines and analytics solutions using AWS services. The ideal candidate will have a strong background in AWS data services, big data technologies, and programming languages.

Role & responsibilities:
1. Design and implement scalable, high-performance data pipelines using AWS services
2. Develop and optimize ETL processes using AWS Glue, EMR, and Lambda
3. Build and maintain data lakes using S3 and Delta Lake
4. Create and manage analytics solutions using Amazon Athena and Redshift
5. Design and implement database solutions using Aurora, RDS, and DynamoDB
6. Develop serverless workflows using AWS Step Functions
7. Write efficient and maintainable code using Python/PySpark and SQL/PostgreSQL
8. Ensure data quality, security, and compliance with industry standards
9. Collaborate with data scientists and analysts to support their data needs
10. Optimize data architecture for performance and cost-efficiency
11. Troubleshoot and resolve data pipeline and infrastructure issues

Preferred candidate profile:
1. Bachelor's degree in Computer Science, Information Technology, or a related field
2. Relevant years of experience as a Data Engineer, with at least 60% of experience focusing on AWS
3. Strong proficiency in AWS data services: Glue, EMR, Lambda, Athena, Redshift, S3
4. Experience with data lake technologies, particularly Delta Lake
5. Expertise in database systems: Aurora, RDS, DynamoDB, PostgreSQL
6. Proficiency in Python and PySpark programming
7. Strong SQL skills and experience with PostgreSQL
8. Experience with AWS Step Functions for workflow orchestration

Technical Skills:
- AWS Services: Glue, EMR, Lambda, Athena, Redshift, S3, Aurora, RDS, DynamoDB, Step Functions
- Big Data: Hadoop, Spark, Delta Lake
- Programming: Python, PySpark
- Databases: SQL, PostgreSQL, NoSQL
- Data Warehousing and Analytics
- ETL/ELT processes
- Data Lake architectures
- Version control: Git
- Agile methodologies
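The "data quality" half of the ETL responsibilities above often reduces to validating and normalizing raw records before loading, splitting clean rows from rejects. A minimal pure-Python sketch of that transform step (the AWS Glue/EMR machinery is out of scope here, and the `order_id`/`amount` fields are invented for the example):

```python
import csv
import io

def transform(raw_csv):
    """Validate and normalize raw CSV rows; drop rows that fail checks.

    Returns (clean_rows, rejected_count) -- the kind of split a Glue job
    might write to separate S3 prefixes for loading vs. quarantine.
    """
    clean, rejected = [], 0
    for row in csv.DictReader(io.StringIO(raw_csv)):
        try:
            amount = float(row["amount"])
        except (KeyError, ValueError):
            rejected += 1  # missing or non-numeric amount
            continue
        if amount < 0:
            rejected += 1  # negative amounts are invalid here
            continue
        clean.append({"order_id": row["order_id"].strip(),
                      "amount": round(amount, 2)})
    return clean, rejected

raw = "order_id,amount\nA1,19.99\nA2,-5\nA3,oops\nA4,7\n"
rows, bad = transform(raw)
# A2 (negative) and A3 (non-numeric) are rejected; A1 and A4 pass through.
```

At scale the same logic is expressed as PySpark DataFrame filters so it runs distributed on EMR or Glue, but the validation rules themselves are unchanged.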

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies