12 - 16 years
40 - 45 Lacs
Pune
Work from Office
We are seeking an experienced AWS Architect to join our dynamic team at Tech Mahindra. The AWS Architect will be responsible for designing, implementing, and managing cloud solutions on the AWS platform. The ideal candidate will have a strong background in AWS services, cloud architecture, and enterprise-level implementations.
Key Responsibilities:
- Design and implement scalable, highly available, and fault-tolerant systems on AWS.
- Develop and manage cloud architecture and strategy, including cost management and optimization.
- Collaborate with clients and internal teams to understand business requirements and translate them into technical solutions.
- Provide architectural guidance and best practices for cloud deployment and operations.
- Lead the development of cloud solutions, including design, testing, and deployment.
- Monitor and manage cloud infrastructure to ensure optimal performance, security, and compliance.
- Troubleshoot and resolve issues related to cloud architecture and deployment.
- Stay updated on the latest AWS services and industry trends so that Tech Mahindra's solutions remain cutting-edge and competitive.
Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related field; advanced degrees or certifications are a plus.
- AWS Certified Solutions Architect (Associate or Professional) or equivalent certification.
- Proven experience designing and implementing AWS cloud solutions.
- Strong knowledge of AWS services including EC2, S3, RDS, Lambda, VPC, CloudFormation, and IAM.
- Experience with cloud security best practices and compliance requirements.
- Proficiency in scripting languages such as Python, Bash, or PowerShell.
- Familiarity with DevOps practices and tools like Terraform, Jenkins, and Docker.
- Excellent problem-solving skills and the ability to work in a fast-paced environment.
- Strong communication skills with the ability to interact effectively with clients and stakeholders.
Preferred Skills:
- Experience with multi-cloud environments (e.g., AWS, Azure, Google Cloud).
- Knowledge of container orchestration platforms such as Kubernetes.
- Familiarity with Agile methodologies and project management tools.
- Experience migrating on-premises applications to the cloud.
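The role above pairs scripting proficiency (Python, Bash, PowerShell) with cost management and optimization. As a minimal illustrative sketch of that combination, the following checks resource records for the cost-allocation tags an architect might enforce; the resource records and required tag keys are hypothetical, not taken from the posting.

```python
# Illustrative tagging-compliance check for cost allocation.
# REQUIRED_TAGS and the resource records are assumed examples.

REQUIRED_TAGS = {"CostCenter", "Environment", "Owner"}

def untagged_resources(resources):
    """Return IDs of resources missing any required cost-allocation tag."""
    return [
        r["id"]
        for r in resources
        if not REQUIRED_TAGS.issubset(r.get("tags", {}))
    ]

resources = [
    {"id": "i-0aaa", "tags": {"CostCenter": "42", "Environment": "prod", "Owner": "dba"}},
    {"id": "i-0bbb", "tags": {"Environment": "dev"}},  # missing CostCenter, Owner
]
print(untagged_resources(resources))  # ['i-0bbb']
```

In practice the resource list would come from an inventory API rather than a literal, but the compliance rule itself stays a plain function like this, which keeps it easy to unit-test.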
Posted 3 months ago
12 - 16 years
40 - 45 Lacs
Chennai
Work from Office
Posted 3 months ago
12 - 16 years
40 - 45 Lacs
Mumbai
Work from Office
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Chandigarh
Work from Office
As an AWS Data Engineer at our organization, you will play a crucial role in the design, development, and maintenance of our data infrastructure. Your work will empower data-driven decision-making and contribute to the success of our data-driven initiatives. You will design and maintain scalable data pipelines using AWS data analytics services, enabling efficient data processing and analytics.
Key Responsibilities:
- Highly experienced in developing ETL pipelines using AWS Glue and EMR with PySpark/Scala.
- Utilize AWS services (S3, Glue, Lambda, EMR, Step Functions) for data solutions.
- Design scalable data models for analytics and reporting.
- Implement data validation, quality, and governance practices.
- Optimize Spark jobs for cost and performance efficiency.
- Automate ETL workflows with AWS Step Functions and Lambda.
- Collaborate with data scientists and analysts on data needs.
- Maintain documentation for data architecture and pipelines.
- Experience with open-source big data table formats such as Iceberg, Delta Lake, or Hudi.
- Experience provisioning AWS data analytics resources with Terraform is desirable.
Must-Have Skills: AWS (S3, Glue, Lambda, EMR), PySpark or Scala, SQL, ETL development.
Good-to-Have Skills: Snowflake, Cloudera Hadoop (HDFS, Hive, Impala), Iceberg.
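The "data validation, quality, and governance" responsibility above usually means a pipeline step that quarantines bad rows before they reach analytics tables. A minimal plain-Python sketch of that pattern follows; in a real Glue/EMR job the same logic would typically run inside a PySpark transform. The field names (`order_id`, `amount`) and rules are assumptions for illustration only.

```python
# Sketch of a data-quality gate: validate each record, then split the
# batch into clean rows and quarantined rows. Schema is hypothetical.

def validate_record(record):
    """Return a list of quality issues found in one record."""
    issues = []
    if not record.get("order_id"):
        issues.append("missing order_id")
    amount = record.get("amount")
    if not isinstance(amount, (int, float)) or amount < 0:
        issues.append("invalid amount")
    return issues

def split_good_bad(records):
    """Partition records into clean rows and quarantined rows."""
    good, bad = [], []
    for r in records:
        (bad if validate_record(r) else good).append(r)
    return good, bad

good, bad = split_good_bad([
    {"order_id": "A1", "amount": 10.5},
    {"order_id": "", "amount": -3},
])
# good keeps the A1 row; the empty-id, negative-amount row is quarantined
```

Keeping validation rules as small pure functions like `validate_record` makes them easy to unit-test independently of the Spark or Glue runtime.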
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Vadodara
Work from Office
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Visakhapatnam
Work from Office
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Thiruvananthapuram
Work from Office
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Coimbatore
Work from Office
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Hyderabad
Work from Office
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Nagpur
Work from Office
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Jaipur
Work from Office
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Lucknow
Work from Office
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Kanpur
Work from Office
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Pune
Work from Office
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Ahmedabad
Work from Office
Posted 3 months ago
12 - 16 years
35 - 37 Lacs
Surat
Work from Office
Posted 3 months ago
4 - 9 years
10 - 20 Lacs
Bangalore Rural, Bengaluru
Hybrid
We are looking for a skilled and detail-oriented Data Engineer to join our growing data team. You will be responsible for building and maintaining scalable data pipelines, optimizing data systems, and ensuring data is clean, reliable, and ready for analysis.
Mandatory Skills: Python, AWS (Glue & Lambda), SQL, PySpark, any other cloud.
Key Responsibilities:
- Design, develop, and maintain robust ETL/ELT pipelines.
- Work with structured and unstructured data from multiple sources.
- Build and maintain data warehouse/data lake infrastructure.
- Ensure data quality, integrity, and governance practices.
- Collaborate with data scientists, analysts, and other engineers to deliver data solutions.
- Optimize data workflows for performance and scalability.
- Monitor and troubleshoot data pipeline issues in real time.
Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- Strong experience with SQL and relational databases (e.g., PostgreSQL, MySQL).
- Proficient in Python, PySpark, and at least one cloud (Azure or AWS).
- Experience with cloud platforms (e.g., AWS, GCP, Azure).
- Familiarity with data warehousing solutions (e.g., Snowflake, Redshift, BigQuery).
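A recurring part of "ensuring data is clean and reliable," as described above, is deduplicating source rows so downstream tables hold one current row per entity. A hedged stdlib-only sketch follows; the `id`/`updated_at` schema is hypothetical, and in a Glue or PySpark pipeline the same keep-latest logic would usually be a window function over a partition by `id`.

```python
# Keep-latest deduplication: one row per id, chosen by newest updated_at.
# The row schema is an assumed example.

from datetime import datetime

def latest_by_id(rows):
    """Keep one row per id: the one with the newest updated_at timestamp."""
    latest = {}
    for row in rows:
        ts = datetime.fromisoformat(row["updated_at"])
        key = row["id"]
        if key not in latest or ts > latest[key][0]:
            latest[key] = (ts, row)
    return [row for _, row in latest.values()]

rows = [
    {"id": 1, "updated_at": "2024-01-01T00:00:00", "status": "new"},
    {"id": 1, "updated_at": "2024-02-01T00:00:00", "status": "shipped"},
    {"id": 2, "updated_at": "2024-01-15T00:00:00", "status": "new"},
]
deduped = latest_by_id(rows)  # two rows survive; id 1 keeps "shipped"
```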
Posted 3 months ago
8 - 13 years
15 - 25 Lacs
Pune
Work from Office
Experience: 8+ years
Job Location: Pune
Notice Period: 30 days
Job Description: Cloud Application Developer
- 8+ years of experience in software development with a focus on AWS solutions architecture.
- Proven experience architecting microservices-based applications using EKS.
- Relevant AWS certifications - AWS Certified Solutions Architect.
Roles & Responsibilities:
- Design, develop, and implement robust microservices-based applications on AWS using Java.
- Lead the architecture and design of EKS-based solutions, ensuring seamless deployment and scalability.
- Collaborate with cross-functional teams to gather and analyze functional requirements, translating them into technical specifications.
- Define and enforce best practices for software development, including coding standards, code reviews, and documentation.
- Identify non-functional requirements such as performance, scalability, security, and reliability; ensure these are met throughout the development lifecycle.
- Conduct architectural assessments and provide recommendations for improvements to existing systems.
- Mentor and guide junior developers in best practices and architectural principles.
- Proficiency in the Java programming language, with experience in frameworks such as Spring Boot.
- Strong understanding of RESTful APIs and microservices architecture.
- Experience with AWS services, especially EKS, Lambda, S3, RDS, DynamoDB, and CloudFormation.
- Familiarity with CI/CD pipelines and tools like Jenkins or GitLab CI.
- Ability to design data models for relational and NoSQL databases.
- Experience designing applications for high availability, fault tolerance, and disaster recovery.
- Knowledge of security best practices in cloud environments.
- Strong analytical skills to troubleshoot performance issues and optimize system efficiency.
- Excellent communication skills to articulate complex concepts to technical and non-technical stakeholders.
Posted 3 months ago
9 - 14 years
27 - 42 Lacs
Gurugram
Hybrid
About this opportunity: We are looking for a Java Software Developer to strengthen the core development capacity of the Ericsson Mobile Wallet platform. You have the chance to be part of a growing and dynamic development organization that will develop new features using the latest technology and work on improving the current architecture. You will work on a product that improves the lives of millions of daily users; by building our financial platform you will help bring financial freedom to many people around the world!
Ericsson Digital Services provides solutions consisting of software and services in the areas of Digital Business Support Systems (BSS), Operational Support Systems (OSS), Cloud Communication, Cloud Core, and Cloud Infrastructure. The portfolio is focused on 5G-ready, cloud-native, automated, and industrialized solutions to secure a smooth digitalization journey towards 5G.
You will work on software development by converting incoming business requirements from product management. As a software developer you will work in all phases of the product's life cycle, including design, implementation, verification, maintenance, and operations of our products. Our ways of working are based on agile DevOps principles, where Continuous Integration is a cornerstone of our development methodology. Working on continuous improvement and product maintenance is also part of the role.
To be successful in the role you must have:
- 9-14 years of documented professional experience in software development.
- BE/B.Tech/MCA or higher equivalent education in Computer Science.
- Solid object-oriented Java (Enterprise) programming skills, with exposure to multi-threading, collections, and design patterns.
- Experience with software development environments and tools like Git/Gerrit, Jenkins, etc.
- Experience working with Lean & Agile principles, and enjoyment of a flexible team environment.
- Great interpersonal skills, flexibility, and willingness to adapt and respond to change.
- A quality mindset and good experience in unit and function testing, preferably with JUnit.
- Curiosity and eagerness to learn new things.
- A positive and inspiring approach in your everyday work.
- English proficiency, both written and spoken.
Good to have:
- Working experience in mobile money/wallet or a similar product area will be an added advantage.
- Keen interest and familiarity in nurturing a product from the feature-development stage to successful customer deployment.
- Exposure to working in a multicultural setup is a definite advantage.
Posted 3 months ago
3 - 8 years
8 - 12 Lacs
Greater Noida
Work from Office
Sound experience in developing Python applications using FastAPI or Flask (FastAPI is preferable). Proficient in OOP, design patterns, and functional programming. Hands-on experience with MySQL or MongoDB, and able to manage complex queries. Good experience with the Git versioning tool. Should have worked with serverless architecture and RESTful systems. Experience of API development in Python. Hands-on experience with AWS services: Lambda, SQS, S3, ECS, etc. Experience in using Python classes with inheritance, overloading, and polymorphism. Experience building serverless applications in AWS using API Gateway and Lambda.
Experience in insurance projects is preferable.
Note: We are not looking for candidates from the ML (Machine Learning) and Data Science domains. This opening is only for Web/API development in Python and its frameworks.
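The serverless pattern this posting centers on (API Gateway or SQS in front of Lambda) boils down to a Python handler function that receives an event dict. A minimal sketch follows, invoked locally here; in AWS the Lambda runtime supplies the event and context, and the SQS message body schema (`id` field) is an assumption for illustration.

```python
# Sketch of an AWS Lambda-style handler consuming an SQS batch event.
# The message body schema is hypothetical; business logic is elided.

import json

def handler(event, context=None):
    """Process each record in an SQS batch event; return processed ids."""
    processed = []
    for record in event.get("Records", []):
        body = json.loads(record["body"])
        # ... business logic would go here (e.g., persist to a datastore) ...
        processed.append(body["id"])
    return {"processed": processed}

# Local invocation with a hand-built event, as one might do in a unit test:
event = {"Records": [{"body": json.dumps({"id": "msg-1"})}]}
result = handler(event)  # {'processed': ['msg-1']}
```

Because the handler is just a function of a dict, it can be unit-tested without any AWS infrastructure, which is one reason this shape is idiomatic for serverless Python.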
Posted 3 months ago
5 - 10 years
10 - 15 Lacs
Bengaluru
Work from Office
Wissen Infotech Pvt Ltd is hiring for AWS Cloud Developer with NodeJS Experience:5+Years Location:Bangalore Notice:Immediate Job Description As part of the End-to-End Digital Customer Relationships, installed base tracking has become one of the key pillars to grow drastically Services revenues: both field services and digital services. Schneider Electric is seeking a highly skilled and experienced AWS Developer with strong AWS Cloud skills, strong development skills in NodeJs, proficiency in GitHub to join our Installed Base Team. The ideal candidate will have a robust understanding and hands-on experience with building, deploying, and maintaining microservices in AWS such as Lambda, DynamoDB, API Gateway, etc. You will play a crucial role in developing services, ensuring our infrastructure is maintained and secure. Experience with IaC/DevOps tools such as Terraform and CloudFormation is a plus. Schneider Installed base (IB) is the central data platform where all the product asset data that we track is collected, qualified, consolidated and exposed. Under IB AWS lead, The IB AWS Cloud Developer is a key team member of installed base custom development team, its role is to: - Work with IB AWS lead on the design of the architectures for the new capabilities hosted in our AWS platform. - Design and develop microservices using NodeJs. - Previous experience with Python following OOP is a plus. - Evaluate product requirements for operational feasibility and create detailed specifications based on user stories. - Write clean, efficient, high quality, secure, testable, maintainable code based on specifications. - Coordinate with stakeholders (Product Owner, Scrum Master, Architect, Quality and DevOps teams) to ensure successful execution of the project. - Troubleshoot and resolve issues related to the infrastructure. - Ensure best practices are followed in cloud services, focusing on scalability, maintainability, and security. 
- Keep abreast of the latest advancements in AWS cloud technologies and trends to recommend process improvements and technology upgrades.
- Mentor and guide junior team members, fostering a culture of continuous learning and innovation.
- Participate in architecture review board meetings and make strategic recommendations on the choice of services.
Qualifications
- 3+ years' experience working with NodeJS and AWS Cloud services, or in a similar role.
- Master's degree in Computer Science with a focus on Cloud/Data, or equivalent (or a Bachelor's degree with additional years of experience).
- Comprehensive knowledge of, and hands-on experience with, NodeJS and AWS Cloud services, especially Lambda, serverless, API Gateway, SQS, SNS, SES, DynamoDB, and CloudWatch.
- Knowledge of best practices in Python is a plus.
- Knowledge of branching and version control systems such as Git (mandatory).
- Experience with IaC tools such as Terraform and/or CloudFormation is a plus.
- Proficient in data structures and algorithms.
- Excellent collaboration skills.
- A desire for continuous learning and staying updated with emerging technologies.
Skills
- As this position sits on a global team, fluent English communication skills (written and spoken) are required.
- Strong interpersonal skills, with the ability to communicate and convince at various levels of the organization and in a multicultural environment.
- Ability to effectively multi-task and manage priorities.
- Strong analytical and synthesis skills.
- Initiative to uncover and solve problems proactively.
- Ability to understand complex software development environments.
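The Lambda/API Gateway microservices this role builds can be sketched minimally as below. The posting asks for NodeJS, but the handler shape is the same across runtimes; this sketch uses Python for illustration, and the handler name, route parameter, and record fields are assumptions, not details from the posting.

```python
import json

def lambda_handler(event, context):
    """Illustrative AWS Lambda handler behind API Gateway: return an
    asset record for the requested id (fields are hypothetical)."""
    asset_id = (event.get("pathParameters") or {}).get("assetId")
    if not asset_id:
        # API Gateway proxy integrations expect statusCode + string body
        return {"statusCode": 400,
                "body": json.dumps({"error": "assetId is required"})}
    # A real service would query DynamoDB here (e.g. via boto3);
    # a canned record keeps the sketch self-contained.
    record = {"assetId": asset_id, "status": "tracked"}
    return {"statusCode": 200, "body": json.dumps(record)}
```

In the proxy-integration model, API Gateway passes the HTTP request as the `event` dict and relays the returned `statusCode`/`body` back to the client.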
Posted 3 months ago
1 - 4 years
12 - 18 Lacs
Bengaluru
Work from Office
Building and deploying AI-native backend applications. You'll love working with us if you: 1. Love solving hard engineering problems 2. Have been coding since you were about 12 years old 3. Love building systems or breaking servers
Required Candidate profile
We value: a. Raw intelligence b. Ability to work in and lead teams c. High ownership
Must have: 1. Two years' work experience 2. Worked at a startup before 3. Plus if you've worked in e-commerce
Posted 3 months ago
12 - 15 years
35 - 45 Lacs
Pune, Bengaluru, Mumbai (All Areas)
Hybrid
Strong frontend development experience with ReactJS and JavaScript or TypeScript. Proficiency in HTML5, CSS3, and responsive-design best practices. Hands-on experience with AWS Cloud services, specifically designing systems with SNS, SQS, EC2, Lambda, and S3. Required Candidate profile: Expert-level experience in backend development using .NET Core, C#, and EF Core. Strong expertise in PostgreSQL and efficient database design. Proficient in building and maintaining RESTful APIs at scale.
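Designing systems with SNS and SQS, as the posting mentions, typically means the fan-out pattern: one topic delivers a copy of each message to every subscribed queue, decoupling producers from consumers. A minimal in-memory sketch (pure Python stand-ins, not the AWS SDK; the topic and queue names are hypothetical):

```python
class FakeTopic:
    """In-memory stand-in for an SNS topic with SQS queue subscribers."""
    def __init__(self):
        self.queues = []

    def subscribe(self, queue):
        self.queues.append(queue)

    def publish(self, message):
        # SNS fan-out: every subscribed queue receives its own copy
        for q in self.queues:
            q.append(message)

# Two downstream services consume the same order events independently.
orders = FakeTopic()
billing, shipping = [], []
orders.subscribe(billing)
orders.subscribe(shipping)
orders.publish({"orderId": 42})
```

The design benefit is that adding a new consumer is just another subscription; the publisher never changes.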
Posted 3 months ago
5 - 8 years
4 - 8 Lacs
Bengaluru
Remote
We are seeking a skilled and motivated AWS Cloud Engineer to manage and optimize our cloud infrastructure. You will be responsible for designing, implementing, and maintaining scalable, secure, and cost-effective AWS environments that support our fintech products and services.
Key Responsibilities:
- Design, deploy, and maintain cloud infrastructure on AWS.
- Automate provisioning, configuration, and scaling using Infrastructure as Code (IaC) tools such as Terraform or CloudFormation.
- Monitor system performance, troubleshoot issues, and optimize cloud resources for performance and cost.
- Implement security best practices, including IAM roles, security groups, and encryption.
- Collaborate with development, QA, and DevOps teams to support CI/CD pipelines.
- Ensure high availability, backup, and disaster recovery plans are in place and tested.
- Maintain compliance with security, governance, and regulatory standards.
Key Skills:
- Deep knowledge of Amazon Web Services (AWS): EC2, S3, RDS, Lambda, VPC, CloudWatch, etc.
- Experience with Infrastructure as Code: Terraform or AWS CloudFormation.
- Strong scripting skills (Bash, Python, etc.).
- Knowledge of CI/CD tools: Jenkins, GitHub Actions, GitLab CI.
- Experience with monitoring/logging tools: CloudWatch, ELK Stack, Prometheus, Grafana.
- Understanding of cloud security best practices and networking concepts.
- Familiarity with containerization (Docker) and orchestration (Kubernetes optional).
- Experience with Linux-based server environments.
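As an illustration of the IAM security duties above, here is a hedged sketch of generating a least-privilege, read-only S3 policy document from a script. The bucket name and chosen actions are assumptions for the example, not requirements from the posting; the policy grammar (`Version`, `Statement`, ARN format) follows AWS's documented JSON policy language.

```python
import json

def read_only_s3_policy(bucket: str) -> str:
    """Return an IAM policy JSON granting read-only access to one bucket.

    Least-privilege principle: allow only the two actions the consumer
    needs, scoped to the bucket ARN and its objects, nothing else.
    """
    policy = {
        "Version": "2012-10-17",  # current IAM policy language version
        "Statement": [{
            "Effect": "Allow",
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{bucket}",      # the bucket itself (list)
                f"arn:aws:s3:::{bucket}/*",    # objects within it (get)
            ],
        }],
    }
    return json.dumps(policy, indent=2)
```

A script like this would typically feed the generated document into Terraform, CloudFormation, or an `aws iam put-role-policy` call rather than being attached by hand.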
Posted 3 months ago
3 - 5 years
10 - 14 Lacs
Bengaluru
Work from Office
An experienced consulting professional who understands solutions, industry best practices, and multiple business processes or technology designs within a product/technology family. Operates independently to deliver quality work products to an engagement. Performs varied and complex duties and tasks that require independent judgment in order to implement Oracle products and technology to meet customer needs. Applies Oracle methodology, company procedures, and leading practices.
3 to 5+ years of relevant IT experience, with 5+ years in Oracle VBCS, OIC, PL/SQL, and PCS-based implementations as a technical lead and senior developer. This is an individual-contributor role; being hands-on is a critical requirement.
Must have:
- Experience in solution design for customer engagements in the UI and Integration (OIC) space.
- Experience on at least 5 projects developing SaaS extensions using VBCS, OIC, and ORDS.
- Understanding of the inherent tools and technologies of SaaS applications (FBDI, BIP, ADFDI, Applications Composer, Page Integration, etc.).
- Expertise in Oracle Visual Builder Studio; good experience with build and release, systems integration, Agile, and estimation/planning.
- Experience configuring SSO for PaaS extensions with Fusion SaaS.
- Ability to drive detailed design from customer requirements.
- Good understanding and usage of OCI architecture, serverless functions, API Gateway, and object storage.
- Conduct design reviews to provide guidance and quality assurance around standard methodologies and frameworks.
- Experience in PCS is an added advantage; SOA/OSB/ODI/BPM skills are good to have.
- Experience building at least one project from scratch, and rolling out three big projects (multiple phased releases or country rollouts) to production.
Career Level: IC2. Responsibilities: Standard assignments are accomplished without assistance by exercising independent judgment, within defined policies and processes, to deliver functional and technical solutions on moderately complex customer engagements.
Posted 3 months ago