
31 AWS Technologies Jobs - Page 2

JobPe aggregates results for easy access, but applications are submitted directly on the original job portal.

4.0 - 9.0 years

14 - 20 Lacs

Hyderabad

Work from Office


Job Area: Information Technology Group > IT Software Developer

General Summary:
The Qualcomm OneIT team is looking for a talented senior full-stack developer to join our dynamic team and contribute to exciting projects. The ideal candidate will have a strong understanding of Java, Spring Boot, Angular/React, and AWS technologies, as well as experience in designing, managing, and deploying applications to the cloud.

Key Responsibilities:
- Design, develop, and maintain web applications using Java, Spring Boot, and Angular/React.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, maintainable, and efficient code.
- Ensure the performance, quality, and responsiveness of applications.
- Identify and correct bottlenecks and fix bugs.
- Help maintain code quality, organization, and automation.
- Stay up to date with the latest industry trends and technologies.

Minimum Qualifications:
- 3+ years of IT-relevant work experience with a Bachelor's degree in a technical field (e.g., Computer Engineering, Computer Science, Information Systems), OR 5+ years of IT-relevant work experience without a Bachelor's degree.
- 3+ years of any combination of academic or work experience with full-stack application development (e.g., Java, Python, JavaScript).
- 1+ year of any combination of academic or work experience with data structures, algorithms, and data stores.

The candidate should have:
- A Bachelor's or Master's degree in Computer Science, Information Technology, Engineering, or a related field.
- 5-7 years of experience, with a minimum of 3 years as a full-stack developer using Java, Spring Boot, and Angular/React.
- Strong proficiency in Java and Spring Boot.
- Experience with front-end frameworks such as Angular or React.
- Familiarity with RESTful APIs and web services.
- Knowledge of database systems such as Oracle, MySQL, PostgreSQL, or MongoDB.
- Experience with AWS services such as EC2, S3, RDS, Lambda, and API Gateway.
- Understanding of version control systems, preferably Git.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.
- Experience with another programming language, such as C# or Python.
- Knowledge of containerization technologies like Docker and Kubernetes.
- Familiarity with CI/CD pipelines and DevOps practices.
- Experience with Agile/Scrum/SAFe methodologies.

Applicants: Qualcomm is an equal opportunity employer. If you are an individual with a disability and need an accommodation during the application/hiring process, rest assured that Qualcomm is committed to providing an accessible process. You may e-mail disability-accomodations@qualcomm.com or call Qualcomm's toll-free number found here. Upon request, Qualcomm will provide reasonable accommodations to support individuals with disabilities to be able to participate in the hiring process. Qualcomm is also committed to making our workplace accessible for individuals with disabilities. (Keep in mind that this email address is used to provide reasonable accommodations for individuals with disabilities. We will not respond here to requests for updates on applications or resume inquiries.)

Qualcomm expects its employees to abide by all applicable policies and procedures, including but not limited to security and other requirements regarding protection of Company confidential information and other confidential and/or proprietary information, to the extent those requirements are permissible under applicable law.
To all Staffing and Recruiting Agencies: Please do not forward resumes to our jobs alias, Qualcomm employees, or any other company location. Qualcomm is not responsible for any fees related to unsolicited resumes/applications. If you would like more information about this role, please contact Qualcomm Careers.
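The posting above lists hands-on AWS services (EC2, S3, RDS, Lambda, API Gateway). As a hedged illustration of the kind of cloud integration work implied, here is a minimal Python/boto3 sketch that uploads a file to S3 and invokes a Lambda function; the bucket and function names are hypothetical placeholders, not from the posting, and credentials come from the standard AWS credential chain.

```python
# Minimal boto3 sketch: upload a report to S3, then trigger a Lambda.
# Bucket and function names are hypothetical placeholders.
import json
import boto3

s3 = boto3.client("s3")
lam = boto3.client("lambda")

# Upload a local file to a (hypothetical) application bucket.
s3.upload_file("report.csv", "example-app-bucket", "reports/report.csv")

# Invoke a (hypothetical) Lambda synchronously with a small JSON payload.
resp = lam.invoke(
    FunctionName="process-report",
    InvocationType="RequestResponse",
    Payload=json.dumps({"key": "reports/report.csv"}).encode("utf-8"),
)
print(resp["Payload"].read().decode("utf-8"))
```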

Posted 1 month ago

Apply

3 - 7 years

4 - 7 Lacs

Gurugram

Work from Office


We are seeking an experienced AWS Migration Engineer to lead the migration of on-premises infrastructure and applications to AWS Cloud. The ideal candidate will have hands-on expertise in AWS services, tools, and frameworks, as well as the ability to modernize applications and design target-state architectures. This role requires deep technical knowledge, problem-solving skills, and proficiency in automation and scripting.

Roles and Responsibilities:
- Discovery and Assessment: Perform detailed analysis and assessment of existing infrastructure and applications using AWS tools such as AWS Migration Evaluator, AWS Application Discovery Service, AWS Migration Hub, AWS Application Migration Service, and AWS Database Migration Service (DMS). Provide insights and migration strategies based on the findings.
- AWS Services Implementation: Design and implement cloud solutions using AWS services, including EC2, VPC, RDS, EBS/EFS, S3, Lambda, CloudWatch, and Transit Gateway. Ensure optimal configuration for performance, reliability, and cost-effectiveness.
- Networking Setup: Configure secure and reliable network connectivity between AWS and on-premises environments using AWS Direct Connect and Site-to-Site VPN. Troubleshoot networking issues and ensure seamless integration.
- Automation and Infrastructure as Code: Leverage tools like Terraform and Ansible to automate deployments and configurations. Use scripting languages such as Bash, PowerShell, and Python to streamline migration workflows and optimize processes.
- Application Modernization: Transform legacy applications from a monolithic architecture to a microservices-based architecture using cloud-native technologies and approaches. Enhance scalability and maintainability.
- Cloud Architecture Design: Develop and implement the target-state architecture for applications and infrastructure on AWS. Ensure that solutions are scalable, secure, and aligned with best practices for cloud environments.

Qualifications:
- Proven experience migrating on-premises infrastructure and applications to AWS Cloud.
- Strong expertise with AWS tools and services for migration and cloud architecture.
- Hands-on experience with Terraform, Ansible, Bash/PowerShell scripting, and Python.
- Knowledge of networking concepts, including AWS Direct Connect and Site-to-Site VPN.
- Ability to modernize applications and transition them from monolithic to microservices architectures.
- Proficiency in designing scalable and secure target-state architectures on AWS.
- Strong analytical and problem-solving skills.
- Ability to drive customer engagement.
- BE/BTech in Computer Science or an equivalent degree.
- AWS certifications (e.g., AWS Solutions Architect Associate, AWS SysOps Administrator, AWS DevOps Engineer Professional).
- Experience working with 3-tier applications, serverless applications, and containers.
- Experience with DevOps practices and CI/CD pipelines.
- Experience working in Agile environments.
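The discovery-and-assessment work described above typically begins with an inventory of what is currently running. As a hedged illustration only (the region is a placeholder and this is not the posting's prescribed tooling), a minimal Python/boto3 script that lists running EC2 instances and reports DMS replication task status might look like this:

```python
# Minimal boto3 inventory sketch for migration assessment (illustrative only).
# Assumes default AWS credentials; the region is a hypothetical placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="ap-south-1")
dms = boto3.client("dms", region_name="ap-south-1")

# List running EC2 instances with their type and Name tag, paginating properly.
paginator = ec2.get_paginator("describe_instances")
for page in paginator.paginate(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
):
    for reservation in page["Reservations"]:
        for inst in reservation["Instances"]:
            name = next(
                (t["Value"] for t in inst.get("Tags", []) if t["Key"] == "Name"),
                "<unnamed>",
            )
            print(inst["InstanceId"], inst["InstanceType"], name)

# Report the status of any DMS replication tasks in flight.
for task in dms.describe_replication_tasks()["ReplicationTasks"]:
    print(task["ReplicationTaskIdentifier"], task["Status"])
```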

Posted 1 month ago

Apply

2 - 6 years

12 - 16 Lacs

Pune

Work from Office


Design, develop, and manage our data infrastructure on AWS, with a focus on data warehousing solutions. Write efficient, complex SQL queries for data extraction, transformation, and loading. Utilize DBT for data modelling and transformation. Use Python for data engineering tasks, demonstrating strong work experience in this area. Implement scheduling tools like Airflow, Control-M, or shell scripting to automate data processes and workflows. Participate in an Agile environment, adapting quickly to changing priorities and requirements.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Proven expertise in AWS technologies, with a strong understanding of AWS services; experience with Redshift is optional.
- Experience in data warehousing, with a solid grasp of SQL, including the ability to write complex queries.
- Proficiency in Python, with good work experience in data engineering tasks.
- Familiarity with scheduling tools like Airflow, Control-M, or shell scripting.
- Excellent communication skills and a willing attitude towards learning.

Preferred technical and professional experience:
- Knowledge of DBT for data modelling and transformation is a plus.
- Experience with PySpark or Spark is highly desirable.
- Familiarity with DevOps, CI/CD, and Airflow is beneficial.
- Experience in Agile environments is a nice-to-have.
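The posting above asks for Airflow-style scheduling of data workflows. As a hedged sketch (the DAG id, task names, and function bodies are hypothetical placeholders, not from the posting; the `schedule` argument assumes Airflow 2.4+), a minimal DAG wiring an extract step into a transform step might look like this:

```python
# Minimal Airflow DAG sketch: one extract task feeding one transform task.
# All names and function bodies are hypothetical placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract():
    # Placeholder: pull rows from a source system into staging.
    print("extracting source data")


def transform():
    # Placeholder: run SQL/DBT-style transformations on staged data.
    print("transforming staged data")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # run once per day (Airflow 2.4+ keyword)
    catchup=False,      # do not backfill missed runs
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task  # extract must finish before transform
```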

Posted 1 month ago

Apply

5 - 10 years

30 - 35 Lacs

Bengaluru

Work from Office


About The Role

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists great opportunities to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, be an early member in Kotak's digital transformation journey, learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way, and build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center for data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3-5 years of experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark.
- Experience with data modeling and data quality best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager:
- 10+ years of engineering experience, most of it in the data domain.
- 5+ years of engineering team management experience.
- 10+ years of experience planning, designing, developing, and delivering consumer software.
- Experience partnering with product or program management teams.
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists.
- Experience designing or architecting new and existing systems (design patterns, reliability, and scaling).
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment.
- Strong understanding of Data Platform, Data Engineering, and Data Governance.
- Experience designing and developing large-scale, high-traffic applications.

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering and best practices for the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for handling large-volume data processing.
- Strong presentation and communication skills.
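The role above centers on Spark-based ETL over S3 (typically on EMR). As a hedged illustration (the S3 paths and column names are hypothetical placeholders, not from the posting), a minimal PySpark job that reads raw data, cleans it, and writes a partitioned curated dataset might look like this:

```python
# Minimal PySpark ETL sketch (illustrative; S3 paths and columns are hypothetical).
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read raw transactions from a (hypothetical) S3 landing zone.
raw = spark.read.parquet("s3://example-landing/transactions/")

# Basic cleanup plus a derived partition column.
clean = (
    raw.dropDuplicates(["txn_id"])
       .filter(F.col("amount") > 0)
       .withColumn("txn_date", F.to_date("txn_ts"))
)

# Write the curated dataset back to S3, partitioned by date.
(clean.write
      .mode("overwrite")
      .partitionBy("txn_date")
      .parquet("s3://example-curated/transactions/"))

spark.stop()
```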

Posted 1 month ago

Apply

8 - 13 years

30 - 32 Lacs

Bengaluru

Work from Office


About The Role: Data Engineer - 2 (Experience: 2-5 years)

What we offer: Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists great opportunities to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based out of Bangalore, comprising ~10 sub-teams independently driving their charters. As a member of this team, you get the opportunity to learn the fintech space, be an early member in Kotak's digital transformation journey, learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way, and build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center for data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you.

Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark.
- Experience with data modeling and data quality best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

PREFERRED QUALIFICATIONS:
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering and best practices for the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for handling large-volume data processing.
- Strong presentation and communication skills.
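This variant of the role also manages Glue and MWAA resources. As a hedged sketch (the job name and region are hypothetical placeholders, not from the posting), starting an AWS Glue job run and polling it to completion with boto3 could look like this:

```python
# Minimal boto3 sketch: start an AWS Glue job and poll until it finishes.
# The job name and region are hypothetical; assumes default credentials.
import time

import boto3

glue = boto3.client("glue", region_name="ap-south-1")

# Kick off a run of a (hypothetical) Glue job.
run = glue.start_job_run(JobName="example-curation-job")
run_id = run["JobRunId"]

# Poll the run state every 30 seconds until it reaches a terminal state.
while True:
    state = glue.get_job_run(JobName="example-curation-job", RunId=run_id)[
        "JobRun"
    ]["JobRunState"]
    print("state:", state)
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT", "ERROR"):
        break
    time.sleep(30)
```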

Posted 1 month ago

Apply

2 - 6 years

12 - 16 Lacs

Bengaluru

Work from Office


Design, develop, and manage our data infrastructure on AWS, with a focus on data warehousing solutions. Write efficient, complex SQL queries for data extraction, transformation, and loading. Utilize DBT for data modelling and transformation. Use Python for data engineering tasks, demonstrating strong work experience in this area. Implement scheduling tools like Airflow, Control-M, or shell scripting to automate data processes and workflows. Participate in an Agile environment, adapting quickly to changing priorities and requirements.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Proven expertise in AWS technologies, with a strong understanding of AWS services; experience with Redshift is optional.
- Experience in data warehousing, with a solid grasp of SQL, including the ability to write complex queries.
- Proficiency in Python, with good work experience in data engineering tasks.
- Familiarity with scheduling tools like Airflow, Control-M, or shell scripting.
- Excellent communication skills and a willing attitude towards learning.

Preferred technical and professional experience:
- Knowledge of DBT for data modelling and transformation is a plus.
- Experience with PySpark or Spark is highly desirable.
- Familiarity with DevOps, CI/CD, and Airflow is beneficial.
- Experience in Agile environments is a nice-to-have.
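The warehousing side of this role involves loading data into an AWS warehouse such as Redshift. As a hedged sketch only (the connection details, table, S3 path, and IAM role are hypothetical placeholders, and psycopg2 is one common client choice rather than anything the posting names), a Python load step could issue a Redshift COPY like this:

```python
# Minimal sketch: load a Parquet dataset from S3 into Redshift via COPY.
# Host, credentials, table, S3 path, and IAM role are hypothetical placeholders.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",
    port=5439,
    dbname="analytics",
    user="etl_user",
    password="REPLACE_ME",  # in practice, fetch from a secrets manager
)

copy_sql = """
    COPY analytics.daily_sales
    FROM 's3://example-curated/daily_sales/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy'
    FORMAT AS PARQUET;
"""

with conn, conn.cursor() as cur:
    cur.execute(copy_sql)  # Redshift pulls the files directly from S3

conn.close()
```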

Posted 1 month ago

Apply