5.0 - 10.0 years
8 - 18 Lacs
Noida, Gurugram
Work from Office
5+ years of overall technical experience, with a minimum of 2 years in Node.js and at least 1 year of relevant team-management experience. Experience with Angular/React and Lambda services; PHP is an added advantage. Experience working with MySQL, MongoDB, DynamoDB, and AWS services. Hands-on experience in object-oriented JavaScript, ES6, and TypeScript.
Posted 1 day ago
5.0 - 8.0 years
7 - 10 Lacs
Chennai
Work from Office
We are looking for a skilled AWS Developer with 5 to 10 years of experience. The position is based in Chennai and requires an immediate or 15-day notice period.
Roles and Responsibilities: Design, develop, and deploy scalable and efficient software applications on AWS. Collaborate with cross-functional teams to identify and prioritize project requirements. Develop and maintain high-quality code that meets industry standards and best practices. Troubleshoot and resolve technical issues efficiently. Participate in code reviews and contribute to improving overall code quality. Stay updated with the latest trends and technologies in AWS development.
Job Requirements: Strong proficiency in AWS services such as EC2, S3, Lambda, etc. Experience with cloud-based technologies and platforms. Excellent problem-solving skills and attention to detail. Strong communication and teamwork skills. Ability to work in a fast-paced environment and meet deadlines. Familiarity with agile development methodologies and version control systems.
Skills: AWS Developer
Posted 1 day ago
11.0 - 13.0 years
35 - 50 Lacs
Bengaluru
Work from Office
Principal AWS Data Engineer
Location: Bangalore
Experience: 9 - 12 years
Job Summary: In this key leadership role, you will lead the development of foundational components for a Lakehouse architecture on AWS and drive the migration of existing data processing workflows to the new Lakehouse solution. You will work across the Data Engineering organisation to design and implement scalable data infrastructure and processes using technologies such as Python, PySpark, EMR Serverless, Iceberg, Glue, and the Glue Data Catalog. The main goal of this position is to ensure successful migration and establish robust data quality governance across the new platform, enabling reliable and efficient data processing. Success in this role requires deep technical expertise, exceptional problem-solving skills, and the ability to lead and mentor within an agile team.
Must Have Tech Skills: Prior Principal Engineer experience, leading team best practices in design, development, and implementation, mentoring team members, and fostering a culture of continuous learning and innovation. Extensive experience in software architecture and solution design, including microservices, distributed systems, and cloud-native architectures. Expertise in Python and Spark, with a deep focus on ETL data processing and data engineering practices. Deep technical knowledge of AWS data services and engineering practices, with demonstrable experience implementing data pipelines using tools like EMR, AWS Glue, AWS Lambda, AWS Step Functions, API Gateway, and Athena. Experience delivering Lakehouse solutions/architectures.
Nice To Have Tech Skills: Knowledge of additional programming languages and development tools to provide flexibility and adaptability across varied data engineering projects. A master's degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics) is advantageous.
Key Accountabilities: Lead complex projects autonomously, fostering an inclusive and open culture within development teams. Mentor team members and lead technical discussions. Provide strategic guidance on best practices in design, development, and implementation. Lead the development of high-quality, efficient code and develop the tools and applications needed to address complex business needs. Collaborate closely with architects, Product Owners, and dev team members to decompose solutions into Epics, leading the design and planning of these components. Drive the migration of existing data processing workflows to a Lakehouse architecture, leveraging Iceberg capabilities. Serve as an internal subject matter expert in software development, advising stakeholders on best practices in design, development, and implementation.
Key Skills: Deep technical knowledge of data engineering solutions and practices. Expertise in AWS services and cloud solutions, particularly as they pertain to data engineering practices. Extensive experience in software architecture and solution design. Specialized expertise in Python and Spark. Ability to provide technical direction, set high standards for code quality, and optimize performance in data-intensive environments. Skilled in leveraging automation tools and Continuous Integration/Continuous Deployment (CI/CD) pipelines to streamline development, testing, and deployment. Exceptional communicator who can translate complex technical concepts for diverse stakeholders, including engineers, product managers, and senior executives.
Provides thought leadership within the engineering team, setting high standards for quality, efficiency, and collaboration. Experienced in mentoring engineers, guiding them in advanced coding practices, architecture, and strategic problem-solving to enhance team capabilities.
Educational Background: A bachelor's degree in Computer Science, Software Engineering, or a related field is essential.
Bonus Skills: Financial Services expertise preferred, including work with Equity and Fixed Income asset classes and a working knowledge of Indices.
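To make the Lakehouse stack named above concrete, here is a minimal sketch of what a PySpark job writing to an Iceberg table through the Glue Data Catalog might look like. It assumes the Iceberg Spark runtime and AWS bundle jars are on the Spark classpath; the catalog name, warehouse bucket, source path, and table are all hypothetical.

```python
from pyspark.sql import SparkSession

# Minimal sketch: a Spark session wired to an Iceberg catalog backed by
# the AWS Glue Data Catalog. All names below are placeholders.
spark = (
    SparkSession.builder.appName("lakehouse-migration-sketch")
    .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
    .config("spark.sql.catalog.lake.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
    .config("spark.sql.catalog.lake.warehouse", "s3://example-warehouse/")
    .getOrCreate()
)

# Read an existing Parquet dataset and append it to an Iceberg table,
# the basic move when migrating a workflow onto the Lakehouse.
df = spark.read.parquet("s3://example-raw/trades/")
df.writeTo("lake.finance.trades").append()
```

The same session can then query the table with time travel and schema evolution, which is what makes Iceberg attractive for this kind of migration.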
Posted 1 day ago
8.0 - 10.0 years
27 - 42 Lacs
Bengaluru
Work from Office
Job Summary:
Experience: 4 - 8 years
Location: Bangalore
The Data Engineer will contribute to building state-of-the-art data Lakehouse platforms in AWS, leveraging Python and Spark. You will be part of a dynamic team, building innovative and scalable data solutions in a supportive and hybrid work environment. You will design, implement, and optimize workflows using Python and Spark, contributing to our robust data Lakehouse architecture on AWS. Success in this role requires previous experience building data products using AWS services, familiarity with Python and Spark, problem-solving skills, and the ability to collaborate effectively within an agile team.
Must Have Tech Skills: Demonstrable previous experience as a data engineer. Technical knowledge of data engineering solutions and practices. Implementation of data pipelines using tools like EMR, AWS Glue, AWS Lambda, AWS Step Functions, API Gateway, and Athena. Proficient in Python and Spark, with a focus on ETL data processing and data engineering practices.
Nice To Have Tech Skills: Familiarity with data services in a Lakehouse architecture. Familiarity with technical design practices, allowing for the creation of scalable, reliable data products that meet both technical and business requirements. A master's degree or relevant certifications (e.g., AWS Certified Solutions Architect, Certified Data Analytics) is advantageous.
Key Accountabilities: Writes high-quality code, ensuring solutions meet business requirements and technical standards. Works with architects, Product Owners, and development leads to decompose solutions into Epics, assisting the design and planning of these components. Creates clear, comprehensive technical documentation that supports knowledge sharing and compliance. Experience decomposing solutions into components (Epics, stories) to streamline development. Actively contributes to technical discussions, supporting a culture of continuous learning and innovation.
Key Skills: Proficient in Python and familiar with a variety of development technologies. Previous experience implementing data pipelines, including use of ETL tools to streamline data ingestion, transformation, and loading. Solid understanding of AWS services and cloud solutions, particularly as they pertain to data engineering practices. Familiar with AWS solutions including IAM, Step Functions, Glue, Lambda, RDS, SQS, API Gateway, and Athena. Proficient in quality assurance practices, including code reviews, automated testing, and best practices for data validation. Experienced in Agile development, including sprint planning, reviews, and retrospectives.
Educational Background: A bachelor's degree in Computer Science, Software Engineering, or a related field is essential.
Bonus Skills: Financial Services expertise preferred, including work with Equity and Fixed Income asset classes and a working knowledge of Indices. Familiar with implementing and optimizing CI/CD pipelines. Understands the processes that enable rapid, reliable releases, minimizing manual effort and supporting agile development cycles.
Posted 1 day ago
3.0 - 7.0 years
5 - 9 Lacs
Pune
Work from Office
Snowflake Data Engineer 1: We're looking for a candidate with a strong background in data technologies such as SQL Server, Snowflake, and similar platforms. In addition, they should bring experience in at least one other programming language, with proficiency in Python being a key requirement. The ideal candidate should also have exposure to DevOps pipelines within a data engineering context, at least a high-level understanding of AWS services and how they fit into modern data architectures, and a proactive mindset: someone who is motivated to take initiative and contribute beyond assigned tasks.
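For roles like this, the Snowflake-plus-Python pairing usually means the Snowflake Connector for Python. A minimal connectivity sketch follows, assuming the snowflake-connector-python package is installed; every connection parameter below is a placeholder, and in practice credentials would come from a secrets manager rather than being hard-coded.

```python
import snowflake.connector

# Hypothetical connection parameters; replace with values from your
# account and a proper secrets store.
conn = snowflake.connector.connect(
    account="example_account",
    user="example_user",
    password="example_password",
    warehouse="COMPUTE_WH",
    database="ANALYTICS",
    schema="PUBLIC",
)

try:
    cur = conn.cursor()
    cur.execute("SELECT CURRENT_VERSION()")  # simple smoke test
    print(cur.fetchone()[0])
finally:
    conn.close()
```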
Posted Just now
3.0 - 7.0 years
5 - 9 Lacs
Hyderabad, Pune
Work from Office
Snowflake Data Engineer 1: As mentioned earlier, for this role we're looking for a candidate with a strong background in data technologies such as SQL Server, Snowflake, and similar platforms. In addition, they should bring experience in at least one other programming language, with proficiency in Python being a key requirement. The ideal candidate should also have exposure to DevOps pipelines within a data engineering context, at least a high-level understanding of AWS services and how they fit into modern data architectures, and a proactive mindset: someone who is motivated to take initiative and contribute beyond assigned tasks.
Posted Just now
5.0 - 8.0 years
7 - 12 Lacs
Pune
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly changing world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. We are looking for an Angular Developer. This position is for the Pune location.
You'll make a difference by: Having expert-level knowledge of Angular 12+, component-driven development, and application architecture. Having strong proficiency in RxJS, Observables, Subjects, Operators, and reactive programming patterns. Having hands-on experience with HTML5, JavaScript, CSS3, SCSS/SASS, responsive design, Flexbox, and CSS Grid. Having a deep understanding of theming, design systems, and Angular Material customization. Having proven experience in developing accessible (a11y) web applications, aligned with WCAG 2.1 and ARIA standards. Good understanding of integrating and working with RESTful APIs. Familiarity with AWS services for cloud deployments and infrastructure integration. Proficient with unit testing (Jasmine, Karma) and E2E testing (Cypress) frameworks. Experience with CI/CD pipelines and DevOps integration.
You'll win us over by: Having an engineering degree (B.E/B.Tech/MCA/M.Tech/M.Sc) with a good academic record. 5-8 years of demonstrable experience in software development, especially with the latest versions. Having knowledge in web application development. Having the flexibility to work on backend development with Node.js or Golang. Ability to work on multiple technologies/tools and handle complex topics. Being a good team player.
We'll support you with: Hybrid working opportunities. A diverse and inclusive culture. A great variety of learning & development opportunities.
Join us and be yourself! We value your unique identity and perspective, recognizing that our strength comes from the diverse backgrounds, experiences, and thoughts of our team members. We are fully committed to providing equitable opportunities and building a workplace that reflects the diversity of society. We also support you in your personal and professional journey by providing resources to help you thrive. Come bring your authentic self and create a better tomorrow with us. Make your mark in our exciting world at Siemens. This role is based in Pune and is an individual contributor role. You might be required to visit other locations within India and outside. In return, you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come. Find out more about Siemens careers at
Posted Just now
3.0 - 5.0 years
13 - 17 Lacs
Mumbai
Work from Office
At Siemens Energy, we can. Our technology is key, but our people make the difference. Brilliant minds innovate. They connect, create, and keep us on track towards changing the world's energy systems. Their spirit fuels our mission. Software Developer - Data Integration Platform - Mumbai or Pune, Siemens Energy, Full Time. Looking for a challenging role? If you really want to make a difference, make it with us. We make real what matters.
About the role
Technical Skills (Mandatory): Python (Data Ingestion Pipelines): proficiency in building and maintaining data ingestion pipelines using Python. Blazegraph: experience with Blazegraph technology. Neptune: familiarity with Amazon Neptune, a fully managed graph database service. Knowledge Graph (RDF, Triple): understanding of RDF (Resource Description Framework) and triple stores for knowledge graph management. AWS Environment (S3): experience working with AWS services, particularly S3 for storage solutions. Git: proficiency in using Git for version control.
Optional and good-to-have skills: Azure DevOps (optional): experience with Azure DevOps for CI/CD pipelines and project management (optional but preferred). Metaphactory by Metaphacts (very optional): familiarity with Metaphactory, a platform for knowledge graph management. LLM / machine learning experience: experience with Large Language Models (LLMs) and machine learning techniques. Big data solutions (optional): experience with big data solutions is a plus. SnapLogic / Alteryx / ETL know-how (optional): familiarity with ETL tools like SnapLogic or Alteryx is optional but beneficial.
We don't need superheroes, just super minds. A degree in Computer Science, Engineering, or a related field is preferred. Professional software development: demonstrated experience in professional software development practices. Years of experience: 3-5 years of relevant experience in software development and related technologies. Soft skills: strong problem-solving skills; excellent communication and teamwork abilities; ability to work in a fast-paced and dynamic environment; strong attention to detail and commitment to quality; fluent in English (spoken and written).
We've got quite a lot to offer. How about you? This role is based in Pune or Mumbai, where you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come. We're Siemens. A collection of over 379,000 minds building the future, one day at a time, in over 200 countries. We're dedicated to equality, and we welcome applications that reflect the diversity of the communities we work in. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and imagination and help us shape tomorrow. Find out more about Siemens careers at
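To ground the knowledge-graph requirement, here is a minimal sketch of building RDF triples in Python with the rdflib package, the kind of subject-predicate-object structure a triple store such as Blazegraph or Neptune would then ingest. The namespace and asset names are hypothetical.

```python
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF, RDFS

# Hypothetical namespace for illustration only.
EX = Namespace("http://example.org/energy/")

g = Graph()
turbine = URIRef(EX["turbine-42"])

# Each g.add() call stores one (subject, predicate, object) triple.
g.add((turbine, RDF.type, EX.GasTurbine))
g.add((turbine, RDFS.label, Literal("Gas turbine 42")))
g.add((turbine, EX.locatedIn, EX.MumbaiPlant))

# Serialize to Turtle, a common interchange format for triple stores.
print(g.serialize(format="turtle"))
```

An ingestion pipeline would typically build such a graph from source records and push it to the store over its SPARQL or bulk-load endpoint.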
Posted Just now
3.0 - 8.0 years
1 - 5 Lacs
Pune
Work from Office
Hello Visionary! We know that the only way a business thrives is if our people are growing. That's why we always put our people first. Our global, diverse team would be happy to support you and challenge you to grow in new ways. Who knows where our shared journey will take you? We are looking for an AWS L2 Support Engineer (remote, with on-call duties). This will be remote work. Timings: 6.30 PM to 3.30 AM (IST). We are seeking a highly motivated and technically skilled Support Engineer (L2/L3) to join our dynamic team providing world-class support for our financial application hosted in AWS and data analytics. The ideal candidate will have strong troubleshooting skills, experience with various AWS services, and the ability to assist stakeholders in resolving technical issues effectively. You'll make a difference through the key responsibilities below.
Advanced troubleshooting: Diagnose and resolve complex issues related to AWS services (e.g., EC2, S3, RDS, Secrets Manager, Systems Manager, CloudWatch, and IAM misconfigurations).
Incident management: Provide L1/L2 support by monitoring, triaging, and resolving customer issues related to the platform.
Customer interaction: Communicate with stakeholders through email, the incident management tool, and phone to understand and address their concerns, ensuring excellent customer satisfaction.
Escalation: Collaborate with the development and engineering teams on unresolved issues, ensuring a smooth handoff with detailed issue documentation.
Documentation: Create and update knowledge base articles, FAQs, and customer-facing documentation based on resolved issues. Maintain detailed records of customer interactions and issue resolutions in the ticketing system.
System monitoring and maintenance: Proactively monitor environments for alerts and potential issues. Perform routine health checks and provide recommendations for optimization, providing timely status to customers and management.
Required skills and qualifications:
Technical skills: Strong understanding of various AWS services such as EC2, RDS, OpenSearch Service, S3, RDS failures, Secrets Manager, Systems Manager, CloudWatch, IAM, etc. Experience with Linux, basic commands, and shell scripting. Basic knowledge of networking fundamentals (e.g., DNS, IP routing, firewalls). Exposure to monitoring tools (e.g., Prometheus, Grafana) and logging systems (ELK). Knowledge of database systems with basic SQL scripting. Excellent verbal and written communication skills for clear and professional interaction with stakeholders. Strong analytical and problem-solving abilities. Ability to prioritize and manage multiple support tickets efficiently. Customer-focused mindset with a dedication to providing timely and effective support.
You'll win us over by: Holding a graduate B.Tech/B.E. in Information Technology or Computers. 3+ years in a technical support or operations role on AWS-hosted applications. Experience in the finance domain is highly preferred. AWS Certified Cloud Practitioner and AWS Certified Solutions Architect - Associate are a plus.
Remote work: Must have a reliable internet connection and a dedicated workspace to support uninterrupted work. Be available during core business hours and flexible for after-hours meetings if required. Must perform remote duties while residing in the same city as the company's base location.
On-call expectations: Participate in a weekly rotational on-call schedule (e.g., 1 week on-call every 2 weeks). Respond to critical incidents within 15 minutes of notification.
Work closely with the team to ensure smooth handovers after the on-call period. Create a better #TomorrowWithUs! This role, based in Pune, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. We are dedicated to equality and welcome applications that reflect the diversity of the communities we serve. All employment decisions at Siemens are based on qualifications, merit, and business need. Bring your curiosity and imagination, and help us shape tomorrow. Find out more about Siemens careers at
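As a concrete flavor of the routine health checks this support role describes, here is a minimal boto3 sketch that flags EC2 instances whose status checks are not passing; the region is illustrative and real checks would cover RDS, OpenSearch, and the rest of the stack too.

```python
import boto3

# Minimal health-check sketch: list EC2 instances with failing status
# checks. Region and scope are illustrative.
ec2 = boto3.client("ec2", region_name="ap-south-1")

statuses = ec2.describe_instance_status(IncludeAllInstances=True)
for s in statuses["InstanceStatuses"]:
    if s["InstanceStatus"]["Status"] != "ok":
        # In a real rotation this would raise an alert or open a ticket.
        print(s["InstanceId"], s["InstanceStatus"]["Status"])
```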
Posted Just now
3.0 - 5.0 years
7 - 11 Lacs
Pune
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly changing world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. Siemens founded the new business unit Siemens Foundational Technologies (formerly known as Siemens IoT Services) on April 1, 2019, with its headquarters in Munich, Germany. It has been crafted to unlock the digital future of its clients by offering end-to-end support on their outstanding digitalization journey. Siemens Foundational Technologies is a strategic advisor and a trusted implementation partner in digital transformation and industrial IoT, with a global network of more than 8,000 employees in 10 countries and 21 offices. Highly skilled and experienced specialists offer services ranging from consulting to craft & prototyping to solution & implementation and operation, everything out of one hand. We are looking for a Senior DevOps Engineer.
You'll make a difference by:
Key Responsibilities: Design, implement, and maintain CI/CD pipelines using GitLab, including configuring GitLab Runners. Build, manage, and scale containerized applications using Docker, Kubernetes, and Helm. Automate infrastructure provisioning and management with Terraform. Manage and optimize cloud-based environments, especially AWS. Administer and optimize Kafka clusters for data streaming and processing. Oversee the performance and reliability of databases and Linux environments. Monitor and enhance system health using tools like Prometheus and Grafana. Collaborate with cross-functional teams to implement DevOps best practices. Ensure system security, scalability, and disaster recovery readiness. Troubleshoot and resolve technical issues across the infrastructure.
Required Skills & Qualifications: 3 - 5 years of experience in DevOps, system administration, or a related role. Expertise in CI/CD tools and workflows, especially GitLab Pipelines and GitLab Runners. Proficient in containerization and orchestration tools like Docker, Kubernetes, and Helm. Strong hands-on experience with Docker Swarm, including creating and managing Docker clusters. Proficiency in packaging Docker images for deployment. Strong hands-on experience with Kubernetes, including managing clusters and deploying applications. Strong hands-on experience with Terraform for Infrastructure as Code (IaC). In-depth knowledge of AWS services, including EC2, S3, IAM, EKS, MSK, Route 53, and VPC. Solid experience in managing and maintaining Kafka ecosystems. Strong Linux system administration skills. Proficiency in database management, optimization, and troubleshooting. Experience with monitoring tools like Prometheus and Grafana. Excellent scripting skills in languages like Bash and Python. Strong problem-solving skills and the ability to work in a fast-paced environment. Excellent communication skills and a collaborative mindset.
Good to Have Skills: Experience with Keycloak for identity and access management. Familiarity with Nginx or Traefik for reverse proxying and load balancing. Hands-on experience in PostgreSQL maintenance, including backups, tuning, and troubleshooting. Knowledge of the railway domain, including industry-specific challenges and standards. Experience in implementing and managing high-availability architectures. Exposure to distributed systems and microservices architecture.
Desired Skills: 3-5 years of experience is required. Great communication skills. Analytical and problem-solving skills. Join us and be yourself! Make your mark in our exciting world at Siemens. This role is based in Pune and is an individual contributor role. You might be required to visit other locations within India and outside. In return, you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come. Find out more about Siemens careers at: & more about mobility at https://new.siemens.com/global/en/products/mobility.html
Posted Just now
8.0 - 13.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Hello Talented Techie! We provide support in Project Services and Transformation, Digital Solutions and Delivery Management. We offer joint operations and digitalization services for Global Business Services and work closely alongside the entire Shared Services organization. We make optimal use of the possibilities of new technologies such as Business Process Management (BPM) and Robotics as enablers for efficient and effective processes. We are looking for a Sr. AWS Cloud Architect.
Architect and Design: Develop scalable and efficient data solutions using AWS services such as AWS Glue, Amazon Redshift, S3, Kinesis (Apache Kafka), DynamoDB, Lambda, AWS Glue Streaming ETL, and EMR.
Integration: Integrate real-time data from various Siemens organizations into our data lake, ensuring seamless data flow and processing.
Data Lake Management: Design and manage a large-scale data lake using AWS services like S3, Glue, and Lake Formation.
Data Transformation: Apply various data transformations to prepare data for analysis and reporting, ensuring data quality and consistency.
Snowflake Integration: Implement and manage data pipelines to load data into Snowflake, utilizing Iceberg tables for optimal performance and flexibility.
Performance Optimization: Optimize data processing pipelines for performance, scalability, and cost-efficiency.
Security and Compliance: Ensure that all solutions adhere to security best practices and compliance requirements.
Collaboration: Work closely with cross-functional teams, including data engineers, data scientists, and application developers, to deliver end-to-end solutions.
Monitoring and Troubleshooting: Implement monitoring solutions to ensure the reliability and performance of data pipelines; troubleshoot and resolve any issues that arise.
You'd describe yourself as having:
Experience: 8+ years of experience in data engineering or cloud solutioning, with a focus on AWS services.
Technical Skills: Proficiency in AWS services such as AWS APIs, AWS Glue, Amazon Redshift, S3, Apache Kafka, and Lake Formation. Experience with real-time data processing and streaming architectures.
Big Data Querying Tools: Strong knowledge of big data querying tools (e.g., Hive, PySpark).
Programming: Strong programming skills in languages such as Python, Java, or Scala for building and maintaining scalable systems.
Problem-Solving: Excellent problem-solving skills and the ability to troubleshoot complex issues.
Communication: Strong communication skills, with the ability to work effectively with both technical and non-technical stakeholders.
Certifications: AWS certifications are a plus.
Create a better #TomorrowWithUs! This role, based in Bangalore, is an individual contributor position. You may be required to visit other locations within India and internationally. In return, you'll have the opportunity to work with teams shaping the future. At Siemens, we are a collection of over 312,000 minds building the future, one day at a time, worldwide. Find out more about Siemens careers at
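To illustrate the real-time ingestion path this role describes (streaming records into the data lake via Kinesis), here is a minimal boto3 sketch that publishes one record to a Kinesis stream; the stream name, region, and payload are hypothetical.

```python
import json

import boto3

# Sketch of real-time ingestion into a data lake's landing stream.
# Stream name and payload are placeholders.
kinesis = boto3.client("kinesis", region_name="ap-south-1")

event = {"org": "example-unit-a", "metric": "orders", "value": 17}
kinesis.put_record(
    StreamName="example-landing-stream",
    Data=json.dumps(event).encode("utf-8"),
    PartitionKey=event["org"],  # records with the same key share a shard
)
```

Downstream, a Glue streaming ETL job or Lambda consumer would read the stream and land the data in S3 for the lake.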
Posted Just now
6.0 - 8.0 years
8 - 13 Lacs
Pune
Work from Office
Hello Visionary! We empower our people to stay resilient and relevant in a constantly changing world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team. Siemens founded the new business unit Siemens Foundational Technologies (formerly known as Siemens IoT Services) on April 1, 2019, with its headquarters in Munich, Germany. It has been crafted to unlock the digital future of its clients by offering end-to-end support on their outstanding digitalization journey. Siemens Foundational Technologies is a strategic advisor and a trusted implementation partner in digital transformation and industrial IoT, with a global network of more than 8,000 employees in 10 countries and 21 offices. Highly skilled and experienced specialists offer services ranging from consulting to craft & prototyping to solution & implementation and operation, everything out of one hand. We are looking for a Senior DevOps Engineer.
You'll make a difference by:
Key Responsibilities: Design, implement, and maintain CI/CD pipelines using GitLab, including configuring GitLab Runners. Build, manage, and scale containerized applications using Docker, Kubernetes, and Helm. Automate infrastructure provisioning and management with Terraform. Manage and optimize cloud-based environments, especially AWS. Administer and optimize Kafka clusters for data streaming and processing. Oversee the performance and reliability of databases and Linux environments. Monitor and enhance system health using tools like Prometheus and Grafana. Collaborate with cross-functional teams to implement DevOps best practices. Ensure system security, scalability, and disaster recovery readiness. Troubleshoot and resolve technical issues across the infrastructure.
Required Skills & Qualifications: 6 - 8 years of experience in DevOps, system administration, or a related role. Expertise in CI/CD tools and workflows, especially GitLab Pipelines and GitLab Runners. Proficient in containerization and orchestration tools like Docker, Kubernetes, and Helm. Strong hands-on experience with Docker Swarm, including creating and managing Docker clusters. Proficiency in packaging Docker images for deployment. Strong hands-on experience with Kubernetes, including managing clusters and deploying applications. Strong hands-on experience with Terraform for Infrastructure as Code (IaC). In-depth knowledge of AWS services, including EC2, S3, IAM, EKS, MSK, Route 53, and VPC. Solid experience in managing and maintaining Kafka ecosystems. Strong Linux system administration skills. Proficiency in database management, optimization, and troubleshooting. Experience with monitoring tools like Prometheus and Grafana. Excellent scripting skills in languages like Bash and Python. Strong problem-solving skills and the ability to work in a fast-paced environment. Excellent communication skills and a collaborative mindset.
Good to Have Skills: Experience with Keycloak for identity and access management. Familiarity with Nginx or Traefik for reverse proxying and load balancing. Hands-on experience in PostgreSQL maintenance, including backups, tuning, and troubleshooting. Knowledge of the railway domain, including industry-specific challenges and standards. Experience in implementing and managing high-availability architectures. Exposure to distributed systems and microservices architecture.
Desired Skills: 5-8 years of experience is required. Great communication skills. Analytical and problem-solving skills. Join us and be yourself! Make your mark in our exciting world at Siemens. This role is based in Pune and is an individual contributor role. You might be required to visit other locations within India and outside. In return, you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come. Find out more about Siemens careers at: & more about mobility at https://new.siemens.com/global/en/products/mobility.html
Posted Just now
9.0 - 14.0 years
11 - 16 Lacs
Bengaluru
Work from Office
Educational qualifications: BCA, Master of Technology, Bachelor of Technology, Bachelor of Engineering, B.Tech, Bachelor of Science, Master of Engineering.
Service Line: Cloud & Infrastructure Services.
Roles and Responsibilities: Responsible for the design, development, implementation, operation, improvement, and debugging of cloud environments in AWS and of Cloud Management Platform and orchestration tools. Performs engineering design evaluations for new environment builds. Architects, implements, and improves possible automations for cloud environments. Recommends alterations to development and design to improve the quality of products and/or procedures. Implements industry-standard security practices during implementation and maintains them throughout the lifecycle. Advises and engages with customer executives on their cloud strategy roadmap, improvements, and alignment by bringing in industry best practices and trends, and works on further improvements with the required business case analysis and presentations. Creates business cases for transformation and modernization, including analysis of both total cost of ownership and potential cost and revenue impacts of the transformation. Performs process analysis and design, with a focus on identifying technology-driven improvements to core enterprise processes. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!
Additional Responsibilities: Good knowledge of software configuration management systems. Strong business acumen, strategy, and cross-industry thought leadership. Awareness of the latest technologies and industry trends. Proven experience assessing clients' workloads and technology landscape for cloud suitability, and developing business cases and cloud adoption roadmaps. Proven knowledge of leading Cloud Management Platforms and orchestration tools. Proven knowledge of evaluating AWS/Azure hosting consumption charges and optimizing those charges. Experience in defining new architectures and the ability to drive a project from an architecture standpoint. Ability to quickly establish credibility and trustworthiness with key executive stakeholders in the client organization. Excellent verbal, written, and presentation skills; ability to quickly produce PowerPoint slides that are content-rich, succinct, and visually appealing.
Technical and Professional Requirements: Strong hands-on experience in AWS cloud infrastructure. Excellent understanding of AWS services/components, with experience across multiple projects. Strong Terraform scripting skills. Creating CI/CD pipelines using GitLab. Good hands-on experience provisioning containers in AWS Container Instances, AKS, etc.
Preferred Skills: Technology - Cloud Platform - AWS Core Services; Technology - Cloud Platform - Amazon Web Services Architecture; Technology - Cloud Platform - Amazon Web Services DevOps.
Posted Just now
10.0 - 15.0 years
15 - 19 Lacs
Bengaluru
Work from Office
Experience: 8+ years of experience in data engineering, specifically in cloud environments like AWS. Proficiency in PySpark for distributed data processing and transformation. Solid experience with AWS Glue for ETL jobs and managing data workflows. Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration. Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2.
Technical Skills: Proficiency in Python and PySpark for data processing and transformation tasks. Deep understanding of ETL concepts and best practices. Familiarity with AWS Glue (ETL jobs, Data Catalog, and Crawlers). Experience building and maintaining data pipelines with AWS Data Pipeline or similar orchestration tools. Familiarity with AWS S3 for data storage and management, including file formats (CSV, Parquet, Avro). Strong knowledge of SQL for querying and manipulating relational and semi-structured data. Experience with Data Warehousing and Big Data technologies, specifically within AWS.
Additional Skills: Experience with AWS Lambda for serverless data processing and orchestration. Understanding of AWS Redshift for data warehousing and analytics. Familiarity with Data Lakes, Amazon EMR, and Kinesis for streaming data processing. Knowledge of data governance practices, including data lineage and auditing. Familiarity with CI/CD pipelines and Git for version control. Experience with Docker and containerization for building and deploying applications.
Design and Build Data Pipelines: Design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes.
ETL Development: Develop and maintain Extract, Transform, and Load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets.
Data Workflow Automation: Build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs.
Data Integration: Work with different AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms.
Optimization and Scaling: Optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services like Lambda, S3, and EC2.
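A typical shape for the Glue-plus-PySpark ETL work described above is the standard Glue job skeleton below: read from the Glue Data Catalog, transform, and write Parquet to S3. The catalog database, table, and output path are hypothetical.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard AWS Glue PySpark job boilerplate: resolve job arguments and
# initialize the Glue job wrapper around a Spark context.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Glue Data Catalog (names are placeholders).
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="example_db", table_name="raw_events"
)

# Write the frame back out to S3 as Parquet, the usual curated format.
glue_context.write_dynamic_frame.from_options(
    frame=dyf,
    connection_type="s3",
    connection_options={"path": "s3://example-curated/events/"},
    format="parquet",
)
job.commit()
```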
Posted 1 hour ago
12.0 - 17.0 years
16 - 20 Lacs
Hyderabad
Work from Office
Excellent understanding of AWS services/components, with experience in multiple projects. Strong Terraform scripting skills. Creating CI/CD pipelines. Good hands-on experience provisioning containers in AWS Container Instances, AKS, etc. AWS ECS, Postgres, Lambda, S3, Route 53, SNS, SQS. Python (for Lambda functions). Strong Java knowledge required.
People skills: Ability to quickly absorb knowledge as it relates to our application (existing recorded KT sessions will be provided, and the core SME team will be available for any questions or additional guidance). Good communication and partnership with others. Motivated in what they are working on; if anything is unclear, raise it immediately. Ability to problem-solve issues that arise.
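Since this stack calls out Python specifically for Lambda functions alongside SQS/SNS, here is a minimal sketch of a Lambda handler consuming queue messages; the event shape assumes an SQS trigger, and the payload fields are hypothetical.

```python
import json

def lambda_handler(event, context):
    """Minimal sketch of a Python Lambda behind an SQS trigger."""
    # SQS delivers a batch of messages under the "Records" key;
    # each record carries the original message in "body".
    for record in event.get("Records", []):
        body = json.loads(record["body"])
        print(f"processing message: {body}")  # real logic goes here
    return {"statusCode": 200, "body": json.dumps({"processed": True})}
```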
Posted 1 hour ago
14.0 - 19.0 years
11 - 16 Lacs
Hyderabad
Work from Office
10 years of hands-on experience in Thought Machine Vault, Kubernetes, Terraform, GCP/AWS, PostgreSQL, CI/CD, REST APIs, Docker, Kubernetes, and microservices. Architect and manage enterprise-level databases with 24/7 availability. Lead efforts on optimization, backup, and disaster recovery planning. Ensure compliance; implement monitoring and automation. Guide developers on schema design and query optimization. Conduct DB health audits and capacity planning. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Bachelor's or Master's degree in Computer Science or a related field. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.
Posted 1 hour ago
8.0 - 13.0 years
8 - 12 Lacs
Hyderabad
Work from Office
Overall 8+ years of experience designing, developing, testing, and deploying scalable and resilient microservices using Java and Spring Boot. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Should be a Java full-stack developer. Bachelor's or Master's degree in Computer Science or a related field. 6+ years of hands-on experience in Java development, with a focus on microservices architecture. Proficiency in Spring Boot and other Spring Framework components. Extensive experience in designing and developing RESTful APIs. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Strong communication and collaboration skills.
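These postings all pair microservices with Kafka for event-driven integration. The roles themselves are Java/Spring Boot, but the Kafka event flow is easy to sketch language-neutrally; here is a minimal illustration in Python using the kafka-python package, with a hypothetical broker address, topic, and message shape.

```python
import json

from kafka import KafkaConsumer, KafkaProducer

# One service publishes a domain event instead of calling peers directly.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)
producer.send("order-events", {"orderId": "A-1001", "status": "CREATED"})
producer.flush()

# Another service independently consumes and reacts to the same event.
consumer = KafkaConsumer(
    "order-events",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
)
for message in consumer:
    print(message.value)  # services react to events rather than RPC calls
    break
```

In the Spring Boot world the equivalent pieces are KafkaTemplate and @KafkaListener; the decoupling idea is the same.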
Posted 1 hour ago
9.0 - 14.0 years
11 - 16 Lacs
Hyderabad
Work from Office
8 years of hands-on experience in Thought Machine Vault, Kubernetes, Terraform, GCP/AWS, PostgreSQL, CI/CD, REST APIs, Docker, Kubernetes, and microservices. Architect and manage enterprise-level databases with 24/7 availability. Lead efforts on optimization, backup, and disaster recovery planning. Ensure compliance; implement monitoring and automation. Guide developers on schema design and query optimization. Conduct DB health audits and capacity planning. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Bachelor's or Master's degree in Computer Science or a related field. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.
Posted 1 hour ago
8.0 - 13.0 years
12 - 16 Lacs
Hyderabad
Work from Office
8 years of hands-on experience in Thought Machine Vault, Kubernetes, Terraform, GCP/AWS, PostgreSQL, CI/CD, REST APIs, Docker, Kubernetes, and microservices. Architect and manage enterprise-level databases with 24/7 availability. Lead efforts on optimization, backup, and disaster recovery planning. Design and manage scalable CI/CD pipelines for cloud-native apps. Automate infrastructure using Terraform/CloudFormation. Implement container orchestration using Kubernetes and ECS. Ensure cloud security, compliance, and cost optimization. Monitor performance and implement high-availability setups. Collaborate with dev, QA, and security teams; drive architecture decisions. Mentor team members and contribute to DevOps best practices. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Bachelor's or Master's degree in Computer Science or a related field. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.
Posted 1 hour ago
10.0 - 15.0 years
10 - 15 Lacs
Hyderabad
Work from Office
Design, develop, test, and deploy scalable and resilient microservices using Java and Spring Boot. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Should be a Java full-stack developer. Bachelor's or Master's degree in Computer Science or a related field. 8+ years of hands-on experience in Java full-stack development with Java Spring Boot: Java 11+, Spring Boot, Angular/React, REST APIs, Docker, Kubernetes, and microservices. Proficiency in Spring Boot and other Spring Framework components. Extensive experience in designing and developing RESTful APIs. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.
Posted 1 hour ago
10.0 - 15.0 years
10 - 14 Lacs
Bengaluru
Work from Office
Required Skills: Good experience with GitLab, CI/CD pipelines, GitLab Runners, security, and compliance frameworks. Familiarity with Docker, Kubernetes, and container orchestration tools. Strong proficiency in Python for scripting, automation, and troubleshooting. Basic to intermediate knowledge of logging and monitoring tools like Splunk and DX APM. Ability to identify and resolve issues across applications, infrastructure, and pipelines. Proven experience working effectively with cross-functional teams in a collaborative environment. Strong written and verbal communication skills. Ability to work with various stakeholders to manage expectations and drive tasks to completion. High level of accountability; ability to take ownership of tasks and drive them to completion autonomously. Write modular, reusable, and efficient code following best design practices to ensure the codebase is easy to maintain and scale. Ensure clear and concise documentation of code and processes being implemented.
Desirable: Experience with HashiCorp Vault. AWS: familiarity with key AWS services and infrastructure management.
Posted 1 hour ago
7.0 - 12.0 years
5 - 9 Lacs
Hyderabad
Work from Office
4 years of hands-on experience in Java full-stack development with Java Spring Boot. Design, develop, test, and deploy scalable and resilient microservices using Java and Spring Boot. Collaborate with cross-functional teams to define, design, and ship new features. Work on the entire software development lifecycle, from concept and design to testing and deployment. Implement and maintain AWS cloud-based solutions, ensuring high performance, security, and scalability. Integrate microservices with Kafka for real-time data streaming and event-driven architecture. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Should be a Java full-stack developer. Bachelor's or Master's degree in Computer Science or a related field. Proficiency in Spring Boot and other Spring Framework components. Extensive experience in designing and developing RESTful APIs. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Strong communication and collaboration skills.
Posted 1 hour ago
8.0 - 13.0 years
5 - 10 Lacs
Hyderabad
Work from Office
6+ years of experience with Java Spark. Strong understanding of distributed computing, big data principles, and batch/stream processing. Proficiency in working with AWS services such as S3, EMR, Glue, Lambda, and Athena. Experience with Data Lake architectures and handling large volumes of structured and unstructured data. Familiarity with various data formats. Strong problem-solving and analytical skills. Excellent communication and collaboration abilities. Design, develop, and optimize large-scale data processing pipelines using Java Spark. Build scalable solutions to manage data ingestion, transformation, and storage in AWS-based Data Lake environments. Collaborate with data architects and analysts to implement data models and workflows aligned with business requirements. Ensure performance tuning, fault tolerance, and reliability of distributed data processing systems.
Posted 1 hour ago
10.0 - 15.0 years
14 - 18 Lacs
Hyderabad
Work from Office
8 years of hands-on experience in AWS, PostgreSQL, Oracle, MySQL, MongoDB, performance tuning, backup, replication, REST APIs, Docker, Kubernetes, and microservices. Architect and manage enterprise-level databases with 24/7 availability. Lead efforts on optimization, backup, and disaster recovery planning. Design and manage scalable CI/CD pipelines for cloud-native apps. Automate infrastructure using Terraform/CloudFormation. Implement container orchestration using Kubernetes and ECS. Ensure cloud security, compliance, and cost optimization. Monitor performance and implement high-availability setups. Collaborate with dev, QA, and security teams; drive architecture decisions. Troubleshoot and resolve issues in a timely manner, ensuring optimal system performance. Keep up to date with industry trends and advancements, incorporating best practices into our development processes. Bachelor's or Master's degree in Computer Science or a related field. Solid understanding of AWS services, including but not limited to EC2, Lambda, S3, and RDS. Experience with Kafka for building event-driven architectures. Strong database skills, including SQL and NoSQL databases. Familiarity with containerization and orchestration tools (Docker, Kubernetes). Excellent problem-solving and troubleshooting skills. Good to have: TM Vault core banking knowledge. Strong communication and collaboration skills.
Posted 1 hour ago
8.0 - 13.0 years
5 - 9 Lacs
Pune
Work from Office
Responsibilities / Qualifications: Candidates must have 5-6 years of IT working experience; at least 3 years of experience in an AWS Cloud environment is preferred. Ability to understand the existing system architecture and work towards the target architecture. Experience with data profiling activities, discovering data quality challenges, and documenting them. Experience with development and implementation of a large-scale Data Lake and data analytics platform on the AWS Cloud. Develop and unit test data pipeline architecture for data ingestion processes using AWS native services. Experience with development on AWS Cloud using AWS data stores such as Redshift, RDS, S3, Glue Data Catalog, Lake Formation, Apache Airflow, Lambda, etc. Experience with development of a data governance framework, including the management of data, operating model, data policies, and standards. Experience with orchestration of workflows in an enterprise environment. Working experience with Agile methodology. Experience working with source code management tools such as AWS CodeCommit or GitHub. Experience working with Jenkins or any CI/CD pipelines using AWS services. Experience working with an on-shore/off-shore model and collaborating on deliverables. Good communication skills to interact with the onshore team.
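Since this role calls out Apache Airflow for workflow orchestration alongside Glue, here is a minimal sketch of an Airflow DAG that triggers a Glue job on a daily schedule. It assumes Airflow 2.x with the Amazon provider package installed; the DAG id, Glue job name, and region are hypothetical.

```python
from datetime import datetime

from airflow import DAG
from airflow.providers.amazon.aws.operators.glue import GlueJobOperator

# Hypothetical daily DAG that runs a single Glue ingestion job.
with DAG(
    dag_id="daily_lake_ingest",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    ingest = GlueJobOperator(
        task_id="run_glue_ingest",
        job_name="example-ingest-job",  # must exist in Glue already
        region_name="ap-south-1",
    )
```

In a real pipeline, further tasks (data quality checks, Redshift loads) would be chained after `ingest` with the `>>` operator.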
Posted 1 hour ago
With the increasing adoption of cloud services in India, there is a growing demand for professionals skilled in Amazon Web Services Identity and Access Management (AWS IAM). AWS IAM jobs in India offer lucrative opportunities for individuals with expertise in managing user permissions, roles, and policies within the AWS environment.
Entry-level AWS IAM professionals can expect a salary ranging from INR 4-6 lakhs per annum, while experienced professionals with advanced skills and certifications can earn up to INR 15-20 lakhs per annum.
The career progression in AWS IAM typically follows a path from Junior AWS IAM Engineer to AWS IAM Specialist, AWS IAM Architect, and finally AWS IAM Consultant or Manager.
In addition to AWS IAM expertise, professionals in this field are often expected to have knowledge and experience in cloud computing, security best practices, scripting languages (such as Python or shell scripting), and an understanding of networking concepts.
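To make that scripting expectation concrete, here is a hedged sketch of a routine IAM task in Python with boto3: creating a role that AWS Lambda can assume and attaching a managed policy. The role name is hypothetical; the policy ARN is AWS's standard basic execution policy for Lambda.

```python
import json

import boto3

iam = boto3.client("iam")

# Trust policy allowing the Lambda service to assume this role.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [{
        "Effect": "Allow",
        "Principal": {"Service": "lambda.amazonaws.com"},
        "Action": "sts:AssumeRole",
    }],
}

# Role name below is a placeholder for illustration.
iam.create_role(
    RoleName="example-lambda-role",
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Attach AWS's managed policy granting CloudWatch Logs permissions.
iam.attach_role_policy(
    RoleName="example-lambda-role",
    PolicyArn="arn:aws:iam::aws:policy/service-role/AWSLambdaBasicExecutionRole",
)
```

Day-to-day IAM work is largely variations on this theme: defining trust relationships, scoping policies to least privilege, and auditing what is attached where.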
As you prepare for AWS IAM job interviews in India, make sure to brush up on your technical skills, understand the best practices in IAM management, and showcase your hands-on experience with AWS services. With the right preparation and confidence, you can land a rewarding career in the field of AWS IAM. Good luck!