271 S3 Jobs - Page 2

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

7.0 - 9.0 years

15 - 22 Lacs

Chennai

Work from Office

Key Responsibilities
• Application Development: Design, develop, and maintain dealership applications using Java, Spring Boot, and Microservices architecture.
• Cloud Deployment: Build and manage cloud-native applications on AWS, leveraging services such as Lambda, ECS, RDS, S3, DynamoDB, and API Gateway.
• System Architecture: Develop and maintain scalable, high-availability architectures to support large-scale dealership operations.
• Database Management: Work with relational and NoSQL databases such as PostgreSQL and DynamoDB for efficient data storage and retrieval.
• Integration & APIs: Develop and manage RESTful APIs and integrate with third-party services, including OEMs and dealership management systems (DMS).
• DevOps & CI/CD: Implement CI/CD pipelines using tools like Jenkins, GitHub Actions, or AWS CodePipeline for automated testing and deployments.
• Security & Compliance: Ensure application security, data privacy, and compliance with industry regulations.
• Collaboration & Mentorship: Work closely with product managers, designers, and other engineers. Mentor junior developers to maintain best coding practices.
Technical Skills Required
• Programming Languages: Java (8+), Spring Boot, Hibernate
• Cloud Platforms: AWS (Lambda, S3, EC2, ECS, RDS, DynamoDB, CloudFormation)
• Microservices & API Development: RESTful APIs, API Gateway
• Database Management: PostgreSQL, DynamoDB
• DevOps & Automation: Docker, Kubernetes, Terraform, CI/CD pipelines
• Testing & Monitoring: JUnit, Mockito, Prometheus, CloudWatch
• Version Control & Collaboration: Git, GitHub, Jira, Confluence

Nice-to-Have Skills
• Experience with serverless architectures (AWS Lambda, Step Functions)
• Exposure to event-driven architectures (Kafka, SNS, SQS)

Qualifications
• Bachelor's or master's degree in Computer Science, Software Engineering, or a related field
• 5+ years of hands-on experience in Java-based backend development
• Strong problem-solving skills with a focus on scalability and performance

Posted 5 days ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Chennai

Remote

Who We Are
For 20 years, we have been working with organizations large and small to help solve business challenges through technology. We bring a unique combination of engineering and strategy to Make Data Work for organizations. Our clients range from the travel and leisure industry to publishing, retail, and banking. The common thread between our clients is their commitment to making data work, as seen through their investment in those efforts. In our quest to solve data challenges for our clients, we work with large enterprise, cloud-based, and marketing technology suites. We have a deep understanding of these solutions, so we can help our clients make the most of their investment and run a data-driven business efficiently. Softcrylic has now joined forces with Hexaware to Make Data Work in bigger ways!

Why Work at Softcrylic?
Softcrylic provides an engaging, team-focused, and rewarding work environment where people are excited about the work they do and passionate about delivering creative solutions to our clients.

Work Timing: 12:30 pm to 9:30 pm (flexible)

How to approach the interview: all technical interview rounds will be conducted virtually. The final round will be a face-to-face interview with HR in Chennai, which includes a 15-minute in-person technical assessment/discussion. Make sure to prepare for both the virtual and in-person components.

Job Description

Key Responsibilities:
1. Data Pipeline Development: Design, develop, and maintain large-scale data pipelines using Databricks, Apache Spark, and AWS.
2. Data Integration: Integrate data from various sources into a unified data platform using dbt and Apache Spark.
3. Graph Database Management: Design, implement, and manage graph databases to support complex data relationships and queries.
4. Data Processing: Develop and optimize data processing workflows using Python, Apache Spark, and Databricks.
5. Data Quality: Ensure data quality, integrity, and security across all data pipelines and systems.
6. Team Management: Lead and manage a team of data engineers, providing guidance, mentorship, and support.
7. Agile Scrum: Work with product owners, product managers, and stakeholders to create product roadmaps, schedule and estimate tasks in sprints, and ensure successful project delivery.

Mandatory Skills:
1. Databricks: Experience with the Databricks platform, including data processing, analytics, and machine learning.
2. AWS: Experience with AWS services, including S3, Glue, and other relevant services.
3. Python: Proficiency in Python for data processing, analysis, and automation.
4. Graph Databases: Experience with graph databases such as Neo4j, Amazon Neptune, or similar.
5. Apache Spark: Experience with Apache Spark for large-scale data processing and analytics.
6. dbt (Data Build Tool): Experience with dbt for data transformation, modeling, and analytics.
7. Agile Scrum: Experience with Agile Scrum methodologies, including sprint planning, task estimation, and backlog management.

Optional Skills:
1. dlt (Data Load Tool): Experience with data load tools for efficiently loading data into target systems.
2. Kubernetes: Experience with Kubernetes for container orchestration and management.
3. Bash Scripting: Proficiency in Bash scripting for automation and task management.
4. Linux: Experience with the Linux operating system, including the command line and system administration.
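The dbt and orchestration work described above ultimately reduces to running transformations in dependency order. A minimal sketch of that idea using Python's standard-library `graphlib`; the model names are made up for illustration (dbt builds this graph automatically from `ref()` calls):

```python
from graphlib import TopologicalSorter

# Hypothetical model dependency graph: each model maps to the models it reads from.
deps = {
    "stg_orders": set(),
    "stg_customers": set(),
    "dim_customers": {"stg_customers"},
    "fct_orders": {"stg_orders", "dim_customers"},
}

# static_order() yields models so that every dependency runs before its dependents.
order = list(TopologicalSorter(deps).static_order())
```

An orchestrator would then execute `order` left to right (or in parallel within each ready batch).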

Posted 5 days ago

Apply

5.0 - 10.0 years

20 - 30 Lacs

Pune, Chennai, Bengaluru

Work from Office

Mandatory key skills: Athena, Step Functions, Spark/PySpark, ETL fundamentals, SQL (basic + advanced), Glue, Python, Lambda, data warehousing, EBS/EFS, AWS EC2, Lake Formation, Aurora, S3, modern data platform fundamentals, PL/SQL, CloudFront

We are looking for an experienced AWS Data Engineer to design, build, and manage robust, scalable, and high-performance data pipelines and data platforms on AWS. The ideal candidate will have a strong foundation in ETL fundamentals, data modeling, and modern data architecture, with hands-on expertise across a broad spectrum of AWS services including Athena, Glue, Step Functions, Lambda, S3, and Lake Formation.

Key Responsibilities:
• Design and implement scalable ETL/ELT pipelines using AWS Glue, Spark (PySpark), and Step Functions.
• Work with structured and semi-structured data using Athena, S3, and Lake Formation to enable efficient querying and access control.
• Develop and deploy serverless data processing solutions using AWS Lambda and integrate them into pipeline orchestration.
• Perform advanced SQL and PL/SQL development for data transformation, analysis, and performance tuning.
• Build data lakes and data warehouses using S3, Aurora, and Athena.
• Implement data governance, security, and access control strategies using AWS tools including Lake Formation, CloudFront, EBS/EFS, and IAM.
• Develop and maintain metadata, lineage, and data cataloging capabilities.
• Participate in data modeling exercises for both OLTP and OLAP environments.
• Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable insights.
• Monitor, debug, and optimize data pipelines for reliability and performance.

Required Skills & Experience:
• Strong experience with AWS data services: Glue, Athena, Step Functions, Lambda, Lake Formation, S3, EC2, Aurora, EBS/EFS, CloudFront.
• Proficient in PySpark, Python, SQL (basic and advanced), and PL/SQL.
• Solid understanding of ETL/ELT processes and data warehousing concepts.
• Familiarity with modern data platform fundamentals and distributed data processing.
• Experience in data modeling (conceptual, logical, physical) for analytical and operational use cases.
• Experience with orchestration and workflow management tools within AWS.
• Strong debugging and performance tuning skills across the data stack.
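For the data-lake work above, a common convention is Hive-style partitioned S3 key paths, which Athena and Glue can discover for partition pruning. A small illustrative helper; the table name, layout, and file naming are hypothetical:

```python
from datetime import date

def partition_key(table: str, dt: date, part: int) -> str:
    """Build a Hive-style partitioned object key, e.g. table/dt=YYYY-MM-DD/part-NNNNN.parquet."""
    return f"{table}/dt={dt.isoformat()}/part-{part:05d}.parquet"

print(partition_key("sales", date(2024, 1, 15), 3))
# sales/dt=2024-01-15/part-00003.parquet
```

Queries filtered on the `dt` partition column then scan only the matching prefixes instead of the whole table.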

Posted 5 days ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Role: AWS Data Engineer
Location: Hyderabad, Bangalore, Chennai, Mumbai, Pune, Kolkata, Gurgaon
Experience: 4-8 years
Work Mode: Hybrid
Engagement: Contract to hire

Job Description: We are seeking a skilled AWS Data Engineer to design, develop, and support large-scale data solutions on AWS. The ideal candidate will have hands-on expertise in data engineering, automation, and cloud technologies, enabling data-driven decision-making and operational excellence.

Key Responsibilities:
• Design, develop, and deploy data pipelines and solutions using AWS services such as S3, Glue, Lambda, API Gateway, and SQS.
• Write clean, efficient code using Python, PySpark, and SQL to process and transform data.
• Implement batch job scheduling, manage data dependencies, and ensure reliable data processing workflows.
• Develop and maintain Spark and Airflow jobs for large-scale data processing and orchestration.
• Automate repetitive tasks and build reusable frameworks to enhance efficiency and reduce manual intervention.
• Provide Run/DevOps support, monitor pipelines, and manage the ongoing operation of data services on AWS.
• Ensure high standards for data quality, reliability, and performance.
• Collaborate with data scientists, analysts, and other engineers to support business initiatives.

Must-Have Skills:
• Strong hands-on experience with AWS services: S3, Lambda, Glue, API Gateway, SQS
• Proficiency in Python, PySpark, and SQL
• Experience with batch job scheduling and managing data dependencies
• Strong knowledge of Spark and Airflow for data processing and orchestration
• Solid understanding of DevOps practices and operational support for cloud data services

Good to Have:
• Experience with containerization (Docker, Kubernetes)
• Exposure to monitoring/logging tools (CloudWatch, Datadog, etc.)
• AWS certifications (e.g., Solutions Architect, Data Analytics Specialty)
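"Reliable data processing workflows" in roles like this usually implies retrying transient failures with backoff. A minimal retry-with-exponential-backoff-and-jitter sketch; the attempt counts and delays are illustrative, not from any particular framework:

```python
import random
import time

def retry(fn, attempts=4, base=0.5, cap=8.0, sleep=time.sleep):
    """Call fn(); on failure wait up to base * 2**n seconds (capped, with
    full jitter) before the next try. Re-raises after the final attempt."""
    for n in range(attempts):
        try:
            return fn()
        except Exception:
            if n == attempts - 1:
                raise
            sleep(random.uniform(0, min(cap, base * 2 ** n)))
```

The `sleep` parameter is injected so tests (or async runtimes) can replace the real delay.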

Posted 5 days ago

Apply

10.0 - 12.0 years

7 - 12 Lacs

Mumbai

Work from Office

About the Role:
We are seeking an experienced and highly skilled Senior AWS Engineer with over 10 years of professional experience to join our dynamic and growing team. This is a fully remote position requiring strong expertise in serverless architectures, AWS services, and infrastructure as code. You will play a pivotal role in designing, implementing, and maintaining robust, scalable, and secure cloud solutions.

Key Responsibilities:
- Design & Implementation: Lead the design and implementation of highly scalable, resilient, and cost-effective cloud-native applications leveraging a wide array of AWS services, with a strong focus on serverless architecture and event-driven design.
- AWS Services Expertise: Architect and develop solutions using core AWS services including AWS Lambda, API Gateway, S3, DynamoDB, Step Functions, SQS, AppSync, Amazon Pinpoint, and Cognito.
- Infrastructure as Code (IaC): Develop, maintain, and optimize infrastructure using the AWS CDK (Cloud Development Kit) to ensure consistent, repeatable, and version-controlled deployments. Drive the adoption of CodePipeline for automated CI/CD.
- Serverless & Event-Driven Design: Champion serverless patterns and event-driven architectures to build highly efficient, decoupled systems.
- Cloud Monitoring & Observability: Implement comprehensive monitoring and observability using CloudWatch Logs, X-Ray, and custom metrics to proactively identify and resolve issues, ensuring optimal application performance and health.
- Security & Compliance: Enforce stringent security best practices, including robust IAM roles and permission boundaries, PHI/PII tagging, secure configurations with Cognito and KMS, and adherence to HIPAA standards. Implement isolation patterns and fine-grained access control.
- Cost Optimization: Proactively identify and implement AWS cost-optimization strategies, including S3 lifecycle policies, serverless tiers, and strategic service selection (e.g., evaluating Amazon Pinpoint vs. SES on cost-effectiveness).
- Scalability & Resilience: Design and implement highly scalable, resilient systems incorporating auto-scaling, dead-letter queues (DLQs), retry/backoff mechanisms, and circuit breakers to ensure high availability and fault tolerance.
- CI/CD Pipeline: Contribute to the design and evolution of CI/CD pipelines, ensuring automated, efficient, and reliable software delivery.
- Documentation & Workflow Design: Create clear, concise, and comprehensive technical documentation for architectures, workflows, and operational procedures.
- Cross-Functional Collaboration: Collaborate effectively with cross-functional teams, including developers, QA, and product managers, to deliver high-quality solutions.
- AWS Best Practices: Advocate for and ensure adherence to AWS best practices across all development and operational activities.

Required Skills & Experience:
- 10+ years of hands-on experience as an AWS Engineer or in a similar role.
- Deep expertise in AWS services: Lambda, API Gateway, S3, DynamoDB, Step Functions, SQS, AppSync, CloudWatch Logs, X-Ray, EventBridge, Amazon Pinpoint, Cognito, KMS.
- Proficiency in Infrastructure as Code (IaC) with the AWS CDK; experience with CodePipeline is a significant plus.
- Extensive experience with serverless architecture and event-driven design.
- Strong understanding of cloud monitoring and observability tools: CloudWatch Logs, X-Ray, custom metrics.
- Proven ability to implement and enforce security and compliance measures, including IAM role boundaries, PHI/PII tagging, Cognito, KMS, HIPAA standards, isolation patterns, and access control.
- Demonstrated experience with cost-optimization techniques (S3 lifecycle policies, serverless tiers, service selection).
- Expertise in designing and implementing scalability and resilience patterns (auto-scaling, DLQs, retry/backoff, circuit breakers).
- Familiarity with CI/CD pipeline concepts.
- Excellent documentation and workflow design skills.
- Exceptional cross-functional collaboration abilities.
- Commitment to implementing AWS best practices.
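Among the resilience patterns this listing names, the circuit breaker is the least obvious: after repeated failures it stops calling a struggling dependency, then allows a probe call after a cooldown. A minimal sketch; the threshold and reset values are arbitrary examples, and the clock is injected so behavior is testable:

```python
import time

class CircuitBreaker:
    """Open after `threshold` consecutive failures; allow a probe after `reset_after` s."""

    def __init__(self, threshold=3, reset_after=30.0, clock=time.monotonic):
        self.threshold, self.reset_after, self.clock = threshold, reset_after, clock
        self.failures, self.opened_at = 0, None

    def call(self, fn):
        if self.opened_at is not None:
            if self.clock() - self.opened_at < self.reset_after:
                raise RuntimeError("circuit open")   # fail fast, spare the dependency
            self.opened_at = None                    # half-open: allow one probe call
        try:
            result = fn()
        except Exception:
            self.failures += 1
            if self.failures >= self.threshold:
                self.opened_at = self.clock()        # trip the breaker
            raise
        self.failures = 0                            # success closes the circuit
        return result
```

Production systems layer this with the retry/backoff and DLQ patterns also listed above.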

Posted 5 days ago

Apply

10.0 - 15.0 years

15 - 30 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Senior Full-Stack Developer (Node, React, AWS, and Gen AI/LLM)
Location: Chennai / Hyderabad / Bangalore
Experience: 10 years
Work Type: Onsite
Budget: As per market standards

Primary Skills
• NodeJS: 6+ years of hands-on backend development
• JavaScript, HTML, CSS: Strong frontend development capabilities
• ReactJS/VueJS: Working knowledge or project experience preferred
• AWS Serverless Architecture: Mandatory (Lambda, API Gateway, S3)
• LLM Integration / AI Development: Experience with OpenAI and Anthropic APIs
• Prompt Engineering: Context management and token optimization
• SQL/NoSQL Databases: Solid experience with relational and non-relational DBs
• End-to-End Deployment: Deploy, debug, and manage full-stack apps
• Clean Code: Writes clean, maintainable, production-ready code

Secondary Skills
• Amazon Bedrock: Familiarity is a strong plus
• Web Servers: Experience with Nginx/Apache configuration
• RAG Patterns, Vector DBs, AI Agents: Bonus experience
• Software Engineering Best Practices: Strong design and architecture skills
• CI/CD and DevOps Exposure: Beneficial for full pipeline integration

Expectations
• Own frontend and backend development
• Collaborate closely with engineering and client teams
• Build scalable, secure, and intelligent systems
• Influence architecture and tech stack decisions
• Stay up to date with AI trends and serverless best practices
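"Context management and token optimization" in LLM work usually means keeping conversation history inside the model's token budget. A rough sketch that drops the oldest messages first; it uses a crude 4-characters-per-token heuristic, so real systems should count with the model's actual tokenizer instead:

```python
def trim_history(messages, max_tokens=1000, chars_per_token=4):
    """Keep the most recent messages that fit a rough token budget.
    `messages` is a list of {"content": str} dicts, oldest first."""
    kept, budget = [], max_tokens
    for msg in reversed(messages):                       # newest first
        cost = max(1, len(msg["content"]) // chars_per_token)
        if cost > budget:
            break                                        # everything older is dropped too
        kept.append(msg)
        budget -= cost
    return list(reversed(kept))                          # restore chronological order
```

A fuller implementation would also pin the system prompt and summarize, rather than drop, the discarded turns.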

Posted 5 days ago

Apply

5.0 - 10.0 years

22 - 37 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Experience and compensation: 5-8 years (Lead, up to 23 LPA), 8-10 years (Senior Lead, up to 35 LPA), 10+ years (Architect, up to 42 LPA)
Location: Bangalore (first preference); Hyderabad, Chennai, Pune, and Gurgaon also possible
Notice: Immediate to max 15 days
Mode of Work: Hybrid

Skills: Athena, Step Functions, Spark/PySpark, ETL fundamentals, SQL (basic + advanced), Glue, Python, Lambda, data warehousing, EBS/EFS, AWS EC2, Lake Formation, Aurora, S3, modern data platform fundamentals, PL/SQL, CloudFront

We are looking for an experienced AWS Data Engineer to design, build, and manage robust, scalable, and high-performance data pipelines and data platforms on AWS. The ideal candidate will have a strong foundation in ETL fundamentals, data modeling, and modern data architecture, with hands-on expertise across a broad spectrum of AWS services including Athena, Glue, Step Functions, Lambda, S3, and Lake Formation.

Key Responsibilities:
• Design and implement scalable ETL/ELT pipelines using AWS Glue, Spark (PySpark), and Step Functions.
• Work with structured and semi-structured data using Athena, S3, and Lake Formation to enable efficient querying and access control.
• Develop and deploy serverless data processing solutions using AWS Lambda and integrate them into pipeline orchestration.
• Perform advanced SQL and PL/SQL development for data transformation, analysis, and performance tuning.
• Build data lakes and data warehouses using S3, Aurora, and Athena.
• Implement data governance, security, and access control strategies using AWS tools including Lake Formation, CloudFront, EBS/EFS, and IAM.
• Develop and maintain metadata, lineage, and data cataloging capabilities.
• Participate in data modeling exercises for both OLTP and OLAP environments.
• Work closely with data scientists, analysts, and business stakeholders to understand data requirements and deliver actionable insights.
• Monitor, debug, and optimize data pipelines for reliability and performance.

Required Skills & Experience:
• Strong experience with AWS data services: Glue, Athena, Step Functions, Lambda, Lake Formation, S3, EC2, Aurora, EBS/EFS, CloudFront.
• Proficient in PySpark, Python, SQL (basic and advanced), and PL/SQL.
• Solid understanding of ETL/ELT processes and data warehousing concepts.
• Familiarity with modern data platform fundamentals and distributed data processing.
• Experience in data modeling (conceptual, logical, physical) for analytical and operational use cases.
• Experience with orchestration and workflow management tools within AWS.
• Strong debugging and performance tuning skills across the data stack.

Posted 6 days ago

Apply

5.0 - 10.0 years

15 - 30 Lacs

Hyderabad

Hybrid

Dear Applicants,

Astrosoft Technologies is back with an exciting job opportunity: join our growing team as a Sr. AWS Data Engineer. Hyderabad (hybrid), 3 tech rounds, early joiners preferred. Apply by email to Karthik.Jangam@astrosofttech.com with your updated resume and the following details:
• Total experience / experience with AWS DE services
• Current location
• Current company
• Current CTC / expected CTC
• Offer in hand (Y/N)
• Notice period (max 20 days)
• Ready to relocate to Hyderabad (Y/N)

Company: AstroSoft Technologies (https://www.astrosofttech.com/)
Astrosoft is an award-winning company specializing in Data, Analytics, Cloud, AI/ML, Innovation, and Digital. We have a customer-first mindset and take extreme ownership in delivering solutions and projects for our customers, and we have consistently been recognized by our clients as the premium partner to work with. We bring top-tier talent, a robust and structured project execution framework, and significant experience over the years, with an impeccable record of delivering solutions and projects for our clients. Founded in 2004; headquartered in FL, USA; corporate office in Hyderabad, India.

Benefits:
• H1B sponsorship (depends on project and performance)
• Lunch and dinner (every day)
• Group health insurance coverage
• Industry-standard leave policy
• Skill enhancement certification
• Hybrid mode

Job Summary:
• Strong experience and understanding of streaming architecture and development practices using Kafka, Spark, Flink, etc.
• Strong AWS development experience using S3, SNS, SQS, MWAA (Airflow), Glue, DMS, and EMR
• Strong knowledge of one or more programming languages: Python/Java/Scala (ideally Python)
• Experience using Terraform to build IaC components in AWS
• Strong experience with ETL tools in AWS; ODI experience is a plus
• Strong experience with database platforms: Oracle, AWS Redshift
• Strong experience in SQL tuning, tuning ETL solutions, and physical optimization of databases
• Very familiar with SRE concepts, including evaluating and implementing monitoring and observability tools such as Splunk, Datadog, and CloudWatch, plus job, log, and dashboard tooling for customer support and application health checks
• Ability to collaborate with business partners to understand and implement their requirements
• Excellent interpersonal skills; able to build consensus across teams
• Strong critical thinking and out-of-the-box thinking
• Self-motivated and able to perform under pressure
• AWS certified (preferred)

Thanks & Regards,
Karthik Kumar (HR TAG Lead - India)
Astrosoft Technologies India Private Limited
Contact: +91-8712229084
Email: karthik.jangam@astrosofttech.com
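The SRE-style application health checks mentioned in the job summary can be as simple as tracking the error rate over a sliding window of recent requests. A minimal, dashboard-agnostic illustration; the window size and threshold are arbitrary examples:

```python
from collections import deque

class ErrorRateMonitor:
    """Flag unhealthy when the error rate over the last `window` events exceeds `threshold`."""

    def __init__(self, window=100, threshold=0.05):
        self.events = deque(maxlen=window)   # oldest events fall off automatically
        self.threshold = threshold

    def record(self, ok: bool) -> None:
        self.events.append(ok)

    @property
    def error_rate(self) -> float:
        if not self.events:
            return 0.0
        return self.events.count(False) / len(self.events)

    def healthy(self) -> bool:
        return self.error_rate <= self.threshold
```

Tools like Datadog or CloudWatch compute the same ratio server-side from emitted metrics; the logic is the point here, not the transport.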

Posted 6 days ago

Apply

4.0 - 7.0 years

15 - 25 Lacs

Gandhinagar, Pune

Hybrid

Job Description: Ruby on Rails Developer
Experience: 4 to 7 years
Location: Pune / Gandhinagar
Work Mode: Hybrid

Must-Have Skills
• Ruby on Rails (RoR)
• AWS: S3, Lambda, EC2, RDS, ElastiCache, Facial Recognition, SNS, SES
• Postgres
• Node.js
• HTML, CSS, jQuery
• React (beginner level)
• Capistrano and Sidekiq experience

Roles & Responsibilities
• Development, unit testing, and code review
• Client collaboration and communication
• Problem solving and technical troubleshooting
• Technical documentation
• Work in Agile sprint methodology

Shift Timings: 12:00 PM to 9:00 PM IST; availability required until 9:30 PM IST.

Posted 6 days ago

Apply

4.0 - 9.0 years

0 - 3 Lacs

Hyderabad, Pune, Bengaluru

Hybrid

Dear Candidate,

Season's greetings! Interested candidates: please share your updated resume along with a photo, PAN card, and PF member service history to tuppari.pradeep@firstmeridianglobal.com.

Role: Python Data Engineer
Work Nature: Contract to hire (3rd-party payroll)
Work Location: Pan India
Total Experience: 4+ years
Immediate joiners only

Job Description:
• 4+ years of experience in backend development with Python
• Strong experience with AWS services and cloud architecture
• Proficiency in developing RESTful APIs and microservices
• Experience with database technologies such as SQL, PostgreSQL, and NoSQL databases
• Familiarity with containerization and orchestration tools like Docker and Kubernetes
• Knowledge of CI/CD pipelines and tools such as Jenkins, GitLab CI, or AWS CodePipeline
• Excellent problem-solving skills and attention to detail
• Strong communication and collaboration skills

Details to include with your resume:
• Full name as per government proofs
• Contact number / alternate contact number
• Email ID
• Date of birth
• Father's name
• Total experience / relevant experience
• PAN card number
• Current CTC / expected CTC
• Current work location / preferred location
• Open for relocation (Yes/No)
• Current company name
• Notice period
• Mode of employment (contract/permanent); if contract, please provide the payroll company

If you know anyone who could be interested in this profile, feel free to forward this email.

Regards,
Pradeep
tuppari.pradeep@firstmeridianglobal.com

Posted 6 days ago

Apply

4.0 - 9.0 years

0 - 3 Lacs

Hyderabad

Hybrid

Urgent hiring: AWS Data Engineer for client Deloitte (contract-to-hire position)

Role: AWS Data Engineer
Work Nature: Contract to hire (3rd-party payroll)
Work Location: Hyderabad
Total Experience: 4+ years
Immediate joiners only

Job Description:
• Hands-on experience with AWS services including S3, Lambda, Glue, API Gateway, and SQL
• Strong skills in data engineering on AWS, with proficiency in Python, PySpark, and SQL
• Experience with batch job scheduling and managing data dependencies
• Knowledge of data processing tools like Spark and Airflow
• Automate repetitive tasks and build reusable frameworks to improve efficiency
• Provide Run/DevOps support and manage the ongoing operation of data services

Please share your updated resume along with the below details to Mounika.t@affluentgs.com (7661922227):
• Full name as per government proofs
• Contact number / alternate contact number
• Email ID
• Date of birth
• Father's name
• Total experience / relevant experience
• PAN card number
• Current CTC / expected CTC
• Current work location / preferred location
• Open for relocation (Yes/No)
• Current company name
• Notice period
• Mode of employment (contract/permanent); if contract, please provide the payroll company

If you know anyone who could be interested in this profile, feel free to forward this email.

Posted 1 week ago

Apply

5.0 - 10.0 years

13 - 16 Lacs

Mumbai, Delhi / NCR, Bengaluru

Work from Office

Job Title: Sr. Full Stack Engineer (React / Node.js / Nest.js / AWS / Fabric.js)
Location: Remote (CET time zone overlap required)
Notice Period: Immediate
Type: Full-time, long-term

iSource Services is hiring for one of their USA-based clients for the position of Sr. Full Stack Engineer (React / Node.js / Nest.js / AWS / Fabric.js).

About the Role: We are seeking an experienced Senior Full Stack Engineer for a long-term engagement on a functioning SaaS product in the manufacturing domain. The ideal candidate will have a strong foundation in both frontend and backend development and be comfortable working independently in a remote setup. This position offers an exciting opportunity to contribute to real-world AI and computer vision applications on a funded product that has already built a solid customer base.

Project Overview
• A functional SaaS product operating in the manufacturing industry
• The platform has onboarded customers and secured initial funding
• Involvement in computer vision tasks integrated into real-world use cases
• Opportunity to work on a full-stack architecture and gain exposure to AI challenges

Required Skills & Qualifications
• Strong expertise in React.js (minimum 3 years in production-level projects)
• Advanced knowledge of Node.js and Nest.js (backend frameworks)
• Practical experience with AWS (EC2, S3, Lambda, etc.)
• Relevant hands-on experience using Fabric.js for canvas-based UI
• Exposure to Python for computer vision (e.g., OpenCV, image processing)
• Strong version control practices using GitHub (GitHub profile required)

Communication
• Good command of English, both written and verbal
• Ability to communicate clearly in a remote environment

Skills: Full Stack Engineer, React.js, Node.js, Nest.js, AWS (EC2, S3, Lambda, etc.), Fabric.js (canvas-based UI), Python, computer vision, AI applications, SaaS product experience, frontend and backend development, GitHub.
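The computer-vision exposure asked for above often starts with simple preprocessing such as thresholding an image before contour detection or OCR. A dependency-free sketch on a nested-list "image" (OpenCV's `cv2.threshold` performs the same operation on real arrays; the threshold value here is arbitrary):

```python
def binarize(image, threshold=128):
    """Binarize a grayscale image given as nested lists of 0-255 ints:
    pixels at or above `threshold` become 255 (white), the rest 0 (black)."""
    return [[255 if px >= threshold else 0 for px in row] for row in image]
```

In a real pipeline the threshold is often chosen adaptively (e.g., Otsu's method) rather than fixed.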

Posted 1 week ago

Apply

5.0 - 8.0 years

20 - 25 Lacs

Pune

Hybrid

Software Engineer — Baner, Pune, Maharashtra
Department: Software & Automation
Employee Type: Permanent
Experience Range: 5-8 years
Qualification: Bachelor's or master's degree in computer science, IT, or a related field

Roles & Responsibilities:
• Architect and build scalable data pipelines using AWS and Databricks.
• Integrate data from sensors (cameras, lidars, radars).
• Deliver proof-of-concepts and support system improvements.
• Ensure data quality and scalable design in solutions.
• Strong Python, Databricks (SQL, PySpark, Workflows), and AWS skills.
• Solid leadership and mentoring ability.
• Agile development experience.

Good to Have:
• AWS/Databricks certifications
• Experience with Infrastructure as Code (Terraform/CDK)
• Exposure to machine learning data workflows

Software Skills: Python; Databricks (SQL, PySpark, Workflows); AWS (S3, EC2, Glue); Terraform/CDK (good to have)
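The "ensure data quality" responsibility above typically means validating sensor records before they enter the pipeline. A minimal record-level check; the field names and value range are invented for the example, not from any actual sensor schema:

```python
def validate_record(rec, required=("sensor_id", "timestamp", "value"),
                    value_range=(-100.0, 100.0)):
    """Return a list of data-quality issues for one sensor record (empty = clean)."""
    issues = [f"missing:{f}" for f in required if f not in rec]
    if "value" in rec:
        lo, hi = value_range
        if not (lo <= rec["value"] <= hi):
            issues.append("out_of_range:value")
    return issues
```

At scale the same checks run as Spark/Databricks expectations over whole partitions, with failing rows quarantined rather than dropped silently.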

Posted 1 week ago

Apply

6.0 - 8.0 years

18 - 20 Lacs

Pune

Work from Office

Roles and Responsibilities
• 6+ years of experience as a MongoDB DBA.
• Expertise in MongoDB with in-depth understanding: setting up MongoDB clusters for high availability and replication, MongoDB distributed Ops Manager, backups and restores (Linux environment), setting up listeners, resolving connectivity issues, understanding firewall setup needs, configuring encryption certificates, AD-based authenticated access, and other DBA tasks.
• Experience with migration to AWS is strongly preferred; understanding of other databases such as SQL Server with AOAG, MySQL, etc. is an added advantage.

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai, Bengaluru

Work from Office

Please share profiles.

Key Responsibilities
• Application Development: Design, develop, and maintain dealership applications using Java, Spring Boot, and Microservices architecture.
• Cloud Deployment: Build and manage cloud-native applications on AWS, leveraging services such as Lambda, ECS, RDS, S3, DynamoDB, and API Gateway.
• System Architecture: Develop and maintain scalable, high-availability architectures to support large-scale dealership operations.
• Database Management: Work with relational and NoSQL databases such as PostgreSQL and DynamoDB for efficient data storage and retrieval.
• Integration & APIs: Develop and manage RESTful APIs and integrate with third-party services, including OEMs and dealership management systems (DMS).
• DevOps & CI/CD: Implement CI/CD pipelines using tools like Jenkins, GitHub Actions, or AWS CodePipeline for automated testing and deployments.
• Security & Compliance: Ensure application security, data privacy, and compliance with industry regulations.
• Collaboration & Mentorship: Work closely with product managers, designers, and other engineers. Mentor junior developers to maintain best coding practices.

Technical Skills Required
• Programming Languages: Java (8+), Spring Boot, Hibernate
• Cloud Platforms: AWS (Lambda, S3, EC2, ECS, RDS, DynamoDB, CloudFormation)
• Microservices & API Development: RESTful APIs, API Gateway
• Database Management: PostgreSQL, DynamoDB
• DevOps & Automation: Docker, Kubernetes, Terraform, CI/CD pipelines
• Testing & Monitoring: JUnit, Mockito, Prometheus, CloudWatch
• Version Control & Collaboration: Git, GitHub, Jira, Confluence

Nice-to-Have Skills
• Experience with serverless architectures (AWS Lambda, Step Functions)
• Exposure to event-driven architectures (Kafka, SNS, SQS)

Qualifications
• Bachelor's or master's degree in Computer Science, Software Engineering, or a related field
• 5+ years of hands-on experience in Java-based backend development
• Strong problem-solving skills with a focus on scalability and performance

Posted 1 week ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Chennai

Work from Office

Naukri logo

Notice period: Immediate to 15 days
Profile source: Anywhere in India
Timings: 1:00pm-10:00pm
Work Mode: WFO (Mon-Fri)
Job Summary: We are looking for an experienced and highly skilled Senior Data Engineer to lead the design and development of our data infrastructure and pipelines. As a key member of the Data & Analytics team, you will play a pivotal role in scaling our data ecosystem, driving data engineering best practices, and mentoring junior engineers. This role is ideal for someone who thrives on solving complex data challenges and building systems that power business intelligence, analytics, and advanced data products.
Key Responsibilities:
• Design and build robust, scalable, and secure data pipelines.
• Lead the complete lifecycle of ETL/ELT processes, encompassing data intake, transformation, and storage, including Slowly Changing Dimensions (SCD Type 2).
• Collaborate with data scientists, analysts, backend and product teams to define data requirements and deliver impactful data solutions.
• Maintain and oversee the data infrastructure, including cloud storage, processing frameworks, and orchestration tools.
• Build logical and physical data models using a data modeling tool.
• Champion data governance practices, focusing on data quality, lineage tracking, and cataloging.
• Guarantee adherence of data systems to privacy regulations and organizational policies.
• Guide junior engineers, conduct code reviews, and foster knowledge sharing and technical best practices within the team.
Required Skills & Qualifications:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• Minimum of 5 years of practical experience in a data engineering or comparable role.
• Demonstrated expertise in SQL and Python (or similar languages such as Scala/Java).
• Extensive experience with data pipeline orchestration tools (e.g., Airflow, dbt).
• Proficiency in cloud data platforms, including AWS (Redshift, S3, Glue), GCP (BigQuery, Dataflow), or Azure (Data Factory, Synapse).
• Familiarity with big data technologies (e.g., Spark, Kafka, Hive) and other data tools.
• Solid grasp of data warehousing principles, data modeling techniques, and performance tuning, with experience in data modeling tools (e.g., Erwin Data Modeler, MySQL Workbench).
• Exceptional problem-solving abilities coupled with a proactive and team-oriented approach.

Posted 1 week ago

Apply

5.0 - 8.0 years

5 - 15 Lacs

Chennai

Work from Office


Notice period: Immediate to 15 days
Profile source: Tamil Nadu
Timings: 1:00pm-10:00pm (IST)
Work Mode: WFO (Mon-Fri)
About the Role: We are looking for an experienced and highly skilled Senior Data Engineer to lead the design and development of our data infrastructure and pipelines. As a key member of the Data & Analytics team, you will play a pivotal role in scaling our data ecosystem, driving data engineering best practices, and mentoring junior engineers. This role is ideal for someone who thrives on solving complex data challenges and building systems that power business intelligence, analytics, and advanced data products.
Key Responsibilities
• Design and build robust, scalable, and secure data pipelines.
• Lead the complete lifecycle of ETL/ELT processes, encompassing data intake, transformation, and storage.
• Collaborate with data scientists, analysts, and product teams to define data requirements and deliver impactful data solutions.
• Maintain and oversee the data infrastructure, including cloud storage, processing frameworks, and orchestration tools.
• Champion data governance practices, focusing on data quality, lineage tracking, and cataloging.
• Guarantee adherence of data systems to privacy regulations and organizational policies.
• Guide junior engineers, conduct code reviews, and foster knowledge sharing and technical best practices within the team.
Required Qualifications & Skills:
• Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
• Minimum of 5 years of practical experience in a data engineering or comparable role.
• Demonstrated expertise in SQL and Python (or similar languages such as Scala/Java).
• Extensive experience with data pipeline orchestration tools (e.g., Airflow, dbt, Prefect).
• Proficiency in cloud data platforms, including AWS (Redshift, S3, Glue), GCP (BigQuery, Dataflow), or Azure (Data Factory, Synapse).
• Familiarity with big data technologies (e.g., Spark, Kafka, Hive) and the contemporary data stack.
• Solid grasp of data warehousing principles, data modeling techniques, and performance tuning.
• Exceptional problem-solving abilities coupled with a proactive and team-oriented approach.
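The ETL/ELT lifecycle named above (intake, transformation, storage) fits in a few lines once stripped of tooling. A hedged stdlib-only sketch using `csv` and an in-memory SQLite database; column names and the cleaning rules are invented for the example:

```python
import csv
import io
import sqlite3

def run_pipeline(raw_csv, conn):
    """Tiny ETL sketch: extract CSV, transform (clean + cast), load into SQLite."""
    # Extract: parse the raw feed into dict rows.
    rows = list(csv.DictReader(io.StringIO(raw_csv)))
    # Transform: trim/normalise city names, cast amounts, drop rows missing an amount.
    cleaned = [(r["order_id"], r["city"].strip().title(), int(r["amount"]))
               for r in rows if r["amount"]]
    # Load: INSERT OR REPLACE on the primary key keeps re-runs idempotent.
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(order_id TEXT PRIMARY KEY, city TEXT, amount INT)")
    conn.executemany("INSERT OR REPLACE INTO orders VALUES (?, ?, ?)", cleaned)
    return conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()

conn = sqlite3.connect(":memory:")
raw = "order_id,city,amount\nA1, chennai ,500\nA2,Pune,700\nA3,Delhi,\n"
print(run_pipeline(raw, conn))  # → (2, 1200)
```

An orchestrator such as Airflow would schedule each of these three stages as separate tasks; the transformation logic itself stays this shape.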

Posted 1 week ago

Apply

4.0 - 7.0 years

0 - 1 Lacs

Bengaluru

Hybrid


Job Requirements
Job Description: AWS Developer
Quest Global, a leading global technology and engineering services company, is seeking an experienced AWS Developer to join our team. As an AWS Developer, you will play a key role in designing, developing, and maintaining cloud-based applications using Amazon Web Services (AWS) and Java development skills.
Responsibilities:
- Designing, developing, and deploying scalable and reliable cloud-based applications on the AWS platform.
- Collaborating with cross-functional teams to gather requirements and translate them into technical solutions.
- Writing clean, efficient, and maintainable code using the Java programming language.
- Implementing best practices for security, scalability, and performance optimization.
- Troubleshooting and resolving issues related to AWS infrastructure and applications.
- Conducting code reviews and providing constructive feedback to ensure code quality.
- Keeping up to date with the latest AWS services, tools, and best practices.
Join our dynamic team at Quest Global and contribute to the development of cutting-edge cloud-based applications using AWS and Java. Apply now and take your career to new heights!
Note: This job description is intended to provide a general overview of the position and does not encompass all the tasks and responsibilities that may be assigned to the role.
Work Experience Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Minimum 5 years of experience as an AWS Developer or in a similar role.
- Strong proficiency in the Java programming language.
- In-depth knowledge of AWS services such as EC2, S3, Lambda, RDS, DynamoDB, etc.
- Experience with cloud-based application development and deployment.
- Familiarity with DevOps practices and tools.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 13 Lacs

Chennai, Bengaluru

Work from Office


Job Description
Role: Snowflake DevOps Engineer
Visit our website bmwtechworks.in to know more. Follow us on LinkedIn | Instagram | Facebook | X for exciting updates.
Location: Bangalore/Chennai
Experience: 2 to 8 years
Number of openings: 4
What awaits you / Job Profile
• Supporting BMW-side Snowflake use-case customers
• Consulting for use-case onboarding
• Monitoring of data synchronization jobs (reconciliation between Cloud Data Hub and Snowflake)
• Cost monitoring reports to use cases
• Further technical implementations, such as M2M authentication for applications and data traffic over VPC
• Integrating Snowflake into the use-case application process within Data Portal (automated use-case setup triggered by Data Portal)
• Technical documentation
• Executing standard service requests (service user lifecycle, etc.)
• Compiling user and operational manuals
• Organizing and documenting knowledge regarding incidents/customer cases in a knowledge base
• Enhancing and editing process documentation
• Ability and willingness to coach and train fellow colleagues and users when needed
• Ability to resolve 2nd-level incidents within the Data Analytics Platform (could entail basic code changes)
• Close collaboration with 3rd-level support/development and SaaS vendor teams
• Implementation of new development changes; assisting and contributing to development needs
What should you bring along
• Strong understanding of and experience with Python and AWS IAM, S3, KMS, Glue, CloudWatch
• GitHub
• Understanding of APIs
• Understanding of software development and a background in Business Intelligence
• SQL (queries, DDL, materialized views, tasks, procedures, optimization)
• Any Data Portal or Cloud Data Hub experience
• A technical background in operating and supporting IT platforms
• IT Service Management (according to ITIL), 2nd-level support
• Strong understanding of Problem, Incident, and Change processes
• High customer orientation
• Working in a highly complex environment (many stakeholders, multi-platform/product environment, mission-critical use cases, high business exposure, complex ticket routing)
• Flexible communication on multiple support channels (ITSM, Teams, email)
• Precise and diligent execution of ops processes
• Working on-call (standby)
• Mindset of continuous learning (highly complex software stack with changing features)
• Proactive communication
Must-have technical skills: Snowflake, Python, Lambda, IAM, S3, KMS, Glue, CloudWatch, Terraform, Scrum
Good-to-have technical skills: AWS VPC, Route53, EventBridge, SNS, CloudTrail, Confluence, Jira
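The reconciliation duty above (comparing Cloud Data Hub against Snowflake after synchronization) reduces, at its simplest, to diffing per-table row counts and flagging drift or missing tables. An illustrative sketch, with the table names and dict shapes invented for the example:

```python
def reconcile(source_counts, target_counts, tolerance=0):
    """Compare per-table row counts between a source (e.g. Cloud Data Hub)
    and a target (e.g. Snowflake); flag missing tables and count drift."""
    mismatches = {}
    for table in sorted(source_counts.keys() | target_counts.keys()):
        src, tgt = source_counts.get(table), target_counts.get(table)
        # A table absent on either side, or a count gap beyond tolerance, needs attention.
        if src is None or tgt is None or abs(src - tgt) > tolerance:
            mismatches[table] = {"source": src, "target": tgt}
    return mismatches

hub = {"vehicles": 1000, "dealers": 250}
snowflake = {"vehicles": 998, "dealers": 250, "orphan_tmp": 5}
print(sorted(reconcile(hub, snowflake)))  # → ['orphan_tmp', 'vehicles']
```

In practice the counts would come from `SELECT COUNT(*)` queries on each platform, and a monitoring job would alert on a non-empty result.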

Posted 1 week ago

Apply

7.0 - 10.0 years

45 - 50 Lacs

Pune

Work from Office


Requirements: Our client is seeking a highly skilled Technical Project Manager (TPM) with strong hands-on experience in full-stack development and cloud infrastructure to lead the successful planning, execution, and delivery of technical projects. The ideal candidate will have a strong background in React, Java, Spring Boot, Python, and AWS, and will work closely with cross-functional teams including developers, QA, DevOps, and product stakeholders. As a TPM, you will play a critical role in bridging technical and business objectives, ensuring timelines, quality, and scalability across complex software projects.
Responsibilities:
- Own and drive the end-to-end lifecycle of technical projects, from initiation to deployment and post-launch support.
- Collaborate with development teams and stakeholders to define project scope, goals, deliverables, and timelines.
- Act as a hands-on contributor when needed, with the ability to guide and review code and architecture decisions.
- Coordinate cross-functional teams across front-end (React), back-end (Java/Spring Boot, Python), and AWS cloud infrastructure.
- Manage risk, change, and issue resolution in a fast-paced agile environment.
- Ensure projects follow best practices around version control, CI/CD, testing, deployment, and monitoring.
- Deliver detailed status updates, sprint reports, and retrospectives to leadership and stakeholders.
Required Qualifications:
- IIT/NIT graduate with 5+ years of experience in software engineering, with at least 2 years in a technical project management role.
- Hands-on expertise in: React, Java & Spring Boot, Python, AWS (EC2, S3, Lambda, CloudWatch, etc.)
- Experience leading agile/Scrum teams with a strong understanding of software development lifecycles.
- Excellent communication, organizational, and interpersonal skills.
Desired Profile:
- Experience designing and managing microservices architectures.
- Familiarity with Kafka or other messaging systems.
- Knowledge of CI/CD pipelines, deployment strategies, and application monitoring tools (e.g., Prometheus, Grafana, CloudWatch).
- Experience with containerization tools like Docker and orchestration platforms like Kubernetes.

Posted 1 week ago

Apply

2.0 - 5.0 years

2 - 7 Lacs

Bengaluru

Work from Office


o Deploy applications on AWS using services such as EC2, ECS, S3, RDS, or Lambda
o Implement CI/CD pipelines using GitHub Actions, Jenkins, or CodePipeline
o Apply DevSecOps best practices, including container security (Docker, ECR), infrastructure as code (Terraform), and runtime monitoring
Team Collaboration & Agility
o Participate in Agile ceremonies (stand-ups, sprint planning, retros)
o Work closely with product, design, and AI engineers to build secure and intelligent systems
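CI/CD pipelines like the ones listed above commonly gate a deployment on a health check before routing traffic to the new version. A minimal, hedged sketch of that polling loop (in a real pipeline the check would hit the service's health endpoint, and `sleep` would be `time.sleep`):

```python
def wait_until_healthy(check, max_polls=10, delay=2.0, sleep=lambda s: None):
    """Poll a post-deploy health check; return True as soon as it passes,
    False if it never passes within max_polls attempts."""
    for _ in range(max_polls):
        if check():
            return True
        sleep(delay)  # wait between polls while the new version warms up
    return False

# Simulated rollout: the service starts reporting healthy on the third poll.
polls = {"n": 0}
def health_check():
    polls["n"] += 1
    return polls["n"] >= 3

print(wait_until_healthy(health_check))  # → True
```

A pipeline step would fail the deploy (and trigger a rollback) when this gate returns False.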

Posted 1 week ago

Apply

8.0 - 12.0 years

15 - 30 Lacs

Hyderabad

Work from Office


Duration: Full Time
Job Description: We are seeking a highly experienced and hands-on PHP Developer with leadership or managerial experience to join our growing team. The ideal candidate will be proficient in Laravel, CodeIgniter, React.js, Ajax, AWS, and SQL, with a proven track record of leading development teams and delivering robust, scalable web applications.
Key Responsibilities:
• Lead and manage a team of developers, ensuring timely and quality delivery of projects.
• Architect, design, and develop high-performance web applications using PHP frameworks (Laravel & CodeIgniter).
• Integrate and manage front-end components using React.js.
• Work with Ajax for seamless asynchronous user experiences.
• Design and maintain SQL databases for high availability and performance.
• Deploy, manage, and troubleshoot applications hosted on AWS.
• Ensure coding standards, best practices, and secure programming techniques.
• Collaborate with cross-functional teams, including product managers and designers.
• Perform code reviews, mentorship, and performance evaluations.
Required Skills & Experience:
• 8+ years of experience in PHP development.
• Strong hands-on experience with the Laravel and CodeIgniter frameworks.
• Proficiency with React.js for front-end integration.
• Experience with Ajax for dynamic web functionality.
• Solid understanding of AWS services like EC2, S3, RDS, etc.
• Proficiency in MySQL/SQL database design and optimization.
• Previous experience leading a team or managing developers (must-have).
• Strong problem-solving, debugging, and analytical skills.
• Excellent communication and leadership skills.
Preferred Qualifications:
• Familiarity with CI/CD pipelines and DevOps practices.
• Experience with RESTful APIs and third-party integrations.
• Knowledge of version control tools like Git.
• Bachelor's/Master's degree in Computer Science or a related field.

Posted 1 week ago

Apply

7.0 - 12.0 years

20 - 35 Lacs

Pune

Work from Office


7+ years in software engineering, with 4+ years using AWS.
Programming languages: C# and Python, along with SQL and Spark.
The engineering position requires a minimum three-hour overlap with team members in the US-Pacific time zone.
Strong experience with some (or all) of the following: Lambda and Step Functions, API Gateway, Fargate, ECS, S3, SQS, Kinesis, Firehose, DynamoDB, RDS, Athena, and Glue.
Solid foundation in data structures and algorithms, and in-depth knowledge of (and passion for) coding standards and proven design patterns; RESTful and GraphQL APIs are examples.
You might also have:
- DevOps experience (GitHub, GitHub Actions, Docker) is a plus.
- Experience building CI/CD and server/deployment automation solutions, and with container orchestration technologies.
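The Lambda and API Gateway experience above rests on a simple contract: the handler receives an event dict and returns a response with a status code and body. A hedged sketch of that shape in plain Python (no AWS SDK required; the field names follow the API Gateway proxy convention, and the validation logic is invented for the example):

```python
import json

def handler(event, context=None):
    """Minimal AWS-Lambda-style handler sketch: validate the request body
    and return an API-Gateway-proxy-shaped response."""
    try:
        body = json.loads(event.get("body") or "{}")
        name = body["name"]  # required field for this toy endpoint
    except (json.JSONDecodeError, KeyError):
        return {"statusCode": 400, "body": json.dumps({"error": "name required"})}
    return {"statusCode": 200, "body": json.dumps({"greeting": f"Hello, {name}"})}

print(handler({"body": '{"name": "Asha"}'})["statusCode"])  # → 200
print(handler({"body": "{}"})["statusCode"])  # → 400
```

Keeping the handler a pure function of its event makes it unit-testable without deploying, which is where the coding-standards emphasis above pays off.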

Posted 1 week ago

Apply

4.0 - 9.0 years

10 - 20 Lacs

Bengaluru

Work from Office


Qualifications:
• 4+ years of software development experience, including work with cloud technologies.
• Bachelor's or Master's degree in Computer Science, Engineering, or equivalent experience.
• Proficiency in one or more modern programming languages (e.g., Python, Java, Go, NodeJS).
• Experience with cloud platforms such as AWS, Azure, or Google Cloud.
• Experience with microservices architecture, distributed systems, and event-driven design.
• Expertise in designing and consuming RESTful APIs, and familiarity with GraphQL.
• Hands-on experience with CI/CD pipelines, infrastructure as code (e.g., Terraform, CloudFormation), and automated deployments.
• Strong understanding of relational and NoSQL databases.
• Knowledge of SaaS-specific security practices (e.g., OWASP, data encryption, identity management).
• Strong understanding of software development methodologies and tools.
• Familiarity with containerization (Docker) and orchestration (Kubernetes).
• Knowledge of monitoring and logging tools.
• Experience with distributed systems and data-intensive applications.
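Event-driven design, listed above, is easiest to see with a toy in-process bus: publishers emit to a topic and never know who, if anyone, consumes. An illustrative sketch (a production system would put Kafka, SNS, or SQS where the in-memory lists are):

```python
from collections import defaultdict

class EventBus:
    """Toy in-process event bus: topics decouple publishers from subscribers."""
    def __init__(self):
        self._subs = defaultdict(list)

    def subscribe(self, topic, fn):
        self._subs[topic].append(fn)

    def publish(self, topic, payload):
        # Deliver to every subscriber of this topic; unknown topics are no-ops.
        for fn in self._subs[topic]:
            fn(payload)

bus = EventBus()
seen = []
bus.subscribe("order.created", seen.append)
bus.publish("order.created", {"id": 42})
bus.publish("order.cancelled", {"id": 43})  # no subscriber: silently dropped
print(seen)  # → [{'id': 42}]
```

The design choice worth noting: adding a new consumer is a `subscribe` call, with no change to any publisher, which is the property event-driven architectures trade delivery guarantees to get.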

Posted 1 week ago

Apply

6.0 - 11.0 years

25 - 37 Lacs

Hyderabad, Bengaluru, Delhi / NCR

Work from Office


Azure Expertise: Proven experience with Azure cloud services, especially Azure Data Factory, Azure SQL Database & Azure Databricks. Expert in PySpark data processing & analytics. Strong background in building and optimizing data pipelines and workflows.
Required Candidate Profile: Solid experience with data modeling, ETL processes & data warehousing. Performance tuning: the ability to optimize data pipelines & jobs to ensure scalability & performance, troubleshooting & resolving performance issues.

Posted 1 week ago

Apply