5.0 - 9.0 years
0 Lacs
maharashtra
On-site
This is an exciting opportunity to steer your career in a new direction and lead multiple teams to success at a premier financial institution. As the Manager of Software Engineering at JPMorgan Chase in the Consumer and Community Banking sector, you will be responsible for overseeing multiple teams and coordinating day-to-day implementation activities. Your role will involve identifying and escalating issues and ensuring that your teams' work aligns with compliance standards, business requirements, and tactical best practices.

You will provide guidance to your team of software engineers on their daily tasks and activities, setting expectations for their output, practices, and collaborative efforts. Anticipating dependencies with other teams to meet business requirements will be crucial, along with managing stakeholder relationships in compliance with standards and business needs. Fostering a culture of diversity, equity, and inclusion within your team and prioritizing diverse representation will also be key aspects of your role.

The ideal candidate should have formal training or certification in AWS, Kafka, Java, and J2EE concepts, with at least 5 years of applied experience. Experience in leading technology projects, managing technologists, proficiency in automation and continuous delivery methods, and a strong understanding of the Software Development Life Cycle are essential. Advanced knowledge of agile methodologies, financial services industry IT systems, system design, analysis, development, and people management, along with exposure to Machine Learning and Artificial Intelligence, is required. Additionally, practical experience with AWS technologies such as MSK, EKS, ECS, S3, DynamoDB, and cloud-native applications will be advantageous.

Preferred qualifications include hands-on experience working at the code level, a background in Computer Science, Engineering, Mathematics, or related fields, and expertise in various technology disciplines. If you are ready to take on this challenging and rewarding role, apply now and be a part of shaping the future of technology in the financial sector.
Posted 3 days ago
4.0 - 8.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Platform developer at Barclays, you will play a crucial role in shaping the digital landscape and enhancing customer experiences. Leveraging cutting-edge technology, you will work alongside a team of engineers, business analysts, and stakeholders to deliver high-quality solutions that meet business requirements. Your responsibilities will include tackling complex technical challenges, building efficient data pipelines, and staying updated on the latest technologies to continuously enhance your skills.

To excel in this role, you should have hands-on coding experience in Python, along with a strong understanding of and practical experience in AWS development. Experience with tools such as Lambda, Glue, Step Functions, IAM roles, and various other AWS services will be essential. Additionally, your expertise in building data pipelines using Apache Spark and AWS services will be highly valued. Strong analytical skills, troubleshooting abilities, and a proactive approach to learning new technologies are key attributes for success in this role. Furthermore, experience in designing and developing enterprise-level software solutions, knowledge of file formats such as JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, Kinesis, and Glue Streaming will be advantageous. Effective communication and collaboration skills are essential for interacting with cross-functional teams and documenting best practices.

Your role will involve developing and delivering high-quality software solutions, collaborating with various stakeholders to define requirements, promoting a culture of code quality, and staying updated on industry trends. Adherence to secure coding practices, implementation of effective unit testing, and continuous improvement are integral parts of your responsibilities. As a Data Platform developer, you will be expected to lead and supervise a team, guide professional development, and ensure the delivery of work to a consistently high standard. Your impact will extend to related teams within the organization, and you will be responsible for managing risks, strengthening controls, and contributing to the achievement of organizational objectives.

Ultimately, you will be part of a team that upholds Barclays' values of Respect, Integrity, Service, Excellence, and Stewardship, while embodying the Barclays Mindset of Empower, Challenge, and Drive in your daily interactions and work ethic.
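The pipeline work described above typically combines PySpark with AWS services such as Glue and S3. As a rough illustration of the kind of task involved, here is a minimal PySpark sketch that reads raw JSON from S3, applies a simple cleanup, and writes partitioned Parquet back out; the bucket names, paths, and column names are hypothetical.

```python
# Minimal PySpark sketch: JSON in S3 -> cleaned, partitioned Parquet.
# Bucket names, paths, and columns are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("raw-to-curated").getOrCreate()

# Read raw JSON events from a (hypothetical) landing bucket.
raw = spark.read.json("s3://example-landing-bucket/events/")

# Basic cleanup: drop malformed rows, normalize a timestamp, derive a date column.
curated = (
    raw.filter(F.col("event_id").isNotNull())
       .withColumn("event_ts", F.to_timestamp("event_ts"))
       .withColumn("event_date", F.to_date("event_ts"))
)

# Write Parquet partitioned by date for efficient downstream queries.
(curated.write
        .mode("overwrite")
        .partitionBy("event_date")
        .parquet("s3://example-curated-bucket/events/"))

spark.stop()
```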
Posted 1 week ago
7.0 - 12.0 years
25 - 32 Lacs
Bengaluru
Remote
Role & responsibilities

Job Title: Senior Data Engineer
Company: V2Soft India
Location: [Remote/BLR]
Work Mode: [Remote]
Experience: 7+ Years
Employment Type: Full-Time

About the Role:
V2Soft India is looking for a highly skilled and motivated Senior Data Engineer to join our growing team. You will play a critical role in designing, building, and maintaining scalable, secure, and high-performance data platforms to support cutting-edge data products and real-time streaming systems. This is a great opportunity for someone who thrives on solving complex data challenges and wants to contribute to high-impact initiatives.

Key Responsibilities:
- Design and develop scalable, low-latency data pipelines to ingest, process, and stream massive amounts of structured and unstructured data.
- Collaborate cross-functionally to clean, curate, and transform data to meet business needs.
- Integrate privacy and security controls into CI/CD pipelines for all data flows.
- Embed operational excellence practices, including error handling, monitoring, logging, and alerting.
- Continuously improve reliability, scalability, and performance of data systems while ensuring high data quality.
- Own KPIs related to platform performance, data delivery, and operational efficiency.

Required Skills & Experience:
- 5+ years of hands-on experience in cloud-native, real-time data systems with a strong emphasis on streaming, scalability, and reliability.
- Proficiency in real-time data technologies such as Apache Spark, Apache Flink, AWS Kinesis, Kafka, AWS Lambda, EMR/EKS, and Lakehouse platforms like Delta.io/Databricks.
- Strong expertise in AWS architecture, including infrastructure automation, CI/CD, and security best practices.
- Solid understanding of SQL, NoSQL, and relational databases, along with SQL tuning.
- Proficient in Spark-Scala, PySpark, Python, and/or Java.
- Experience in containerized deployments using Docker, Kubernetes, and Helm.
- Familiarity with monitoring systems for data loss detection and data quality assurance.
- Deep knowledge of data structures, algorithms, and data engineering design patterns.
- Passionate about continuous learning and delivering reliable, high-quality solutions.

Nice to Have:
- Certifications in AWS or Big Data technologies
- Experience with data governance and compliance frameworks
- Exposure to ML pipelines or AI data workflows

Why Join V2Soft?
- Work with cutting-edge technologies in a fast-paced and collaborative environment
- Opportunity to contribute to innovative, high-impact data initiatives
- Supportive team culture and career growth opportunities

How to Apply:
Submit your updated resume to [mbalaram@v2soft.com].
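To make the streaming-pipeline responsibility above concrete, here is a minimal sketch of a Spark Structured Streaming job that consumes events from Kafka, parses the JSON payload, and writes the result with checkpointing; the broker address, topic name, schema, and paths are hypothetical.

```python
# Minimal Spark Structured Streaming sketch: Kafka -> parsed events -> files.
# Broker, topic, schema, and paths are illustrative assumptions.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import StructType, StructField, StringType, DoubleType

spark = SparkSession.builder.appName("kafka-stream-ingest").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("user_id", StringType()),
    StructField("amount", DoubleType()),
])

# Subscribe to a (hypothetical) Kafka topic.
stream = (spark.readStream
               .format("kafka")
               .option("kafka.bootstrap.servers", "broker-1:9092")
               .option("subscribe", "payments.events")
               .load())

# Kafka delivers bytes; decode and parse the JSON payload.
events = (stream.selectExpr("CAST(value AS STRING) AS json")
                .select(F.from_json("json", event_schema).alias("e"))
                .select("e.*"))

# Write out with checkpointing so the job can recover after failures.
query = (events.writeStream
               .format("parquet")
               .option("path", "s3://example-bucket/events/")
               .option("checkpointLocation", "s3://example-bucket/checkpoints/events/")
               .start())
query.awaitTermination()
```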
Posted 1 week ago
8.0 - 12.0 years
0 Lacs
karnataka
On-site
As a Principal Data Engineer (Associate Director) at Fidelity in Bangalore, you will be an integral part of the ISS Data Platform Team. This team plays a crucial role in building and maintaining the platform that supports the ISS business operations. You will have the opportunity to lead a team of senior and junior developers, providing mentorship and guidance, while taking ownership of delivering a subsection of the wider data platform. Your role will involve designing, developing, and maintaining scalable data pipelines and architectures to facilitate data ingestion, integration, and analytics.

Collaboration will be a key aspect of your responsibilities as you work closely with enterprise architects, business analysts, and stakeholders to understand data requirements, validate designs, and communicate progress. Your innovative mindset will drive technical advancements within the department, focusing on enhancing code reusability, quality, and developer productivity. By challenging the status quo and incorporating the latest data engineering practices and techniques, you will contribute to the continuous improvement of the data platform.

Your expertise in leveraging cloud-based data platforms, particularly Snowflake and Databricks, will be essential in creating an enterprise lakehouse. Additionally, your advanced proficiency in the AWS ecosystem and experience with core AWS data services like Lambda, EMR, and S3 will be highly valuable. Experience in designing event-based or streaming data architectures using Kafka, along with strong skills in Python and SQL, will be crucial for success in this role.

Furthermore, your role will involve implementing data access controls to ensure data security and performance optimization in compliance with regulatory requirements. Proficiency in CI/CD pipelines for deploying infrastructure and pipelines, experience with RDBMS and NoSQL offerings, and familiarity with orchestration tools like Airflow will be beneficial. Your soft skills, including problem-solving, strategic communication, and project management, will be key in leading problem-solving efforts, engaging with stakeholders, and overseeing project lifecycles.

By joining our team at Fidelity, you will not only receive a comprehensive benefits package but also support for your wellbeing and professional development. We are committed to creating a flexible work environment that prioritizes work-life balance and motivates you to contribute effectively to our team. To explore more about our work culture and opportunities for growth, visit careers.fidelityinternational.com.
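Since the role calls for event-based architectures built on Kafka together with strong Python skills, here is a minimal sketch of publishing an event to a topic with the kafka-python client; the broker address, topic name, and payload fields are hypothetical.

```python
# Minimal event-publishing sketch using kafka-python.
# Broker address, topic name, and payload are illustrative assumptions.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker-1:9092",
    # Serialize dict payloads as UTF-8 JSON bytes.
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {"event_type": "position_updated", "portfolio_id": "P-123", "qty": 250}

# send() is asynchronous; flush() blocks until buffered records are delivered.
producer.send("portfolio.events", value=event)
producer.flush()
producer.close()
```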
Posted 1 week ago
5.0 - 9.0 years
0 Lacs
pune, maharashtra
On-site
As a Data Platform Engineer Lead at Barclays, your role is crucial in building and maintaining systems that collect, store, process, and analyze data, including data pipelines, data warehouses, and data lakes. Your responsibility includes ensuring the accuracy, accessibility, and security of all data.

To excel in this role, you should have hands-on coding experience in Java or Python and a strong understanding of AWS development, encompassing services such as Lambda, Glue, Step Functions, IAM roles, and more. Proficiency in building efficient data pipelines using Apache Spark and AWS services is essential. You are expected to possess strong technical acumen, troubleshoot complex systems, and apply sound engineering principles to problem-solving. Continuous learning and staying updated with new technologies are key attributes for success in this role. Design experience in diverse projects where you have led the technical development is advantageous, especially in the Big Data/Data Warehouse domain within financial services. Additional skills in enterprise-level software solutions development, knowledge of file formats such as JSON, Iceberg, and Avro, and familiarity with streaming services such as Kafka, MSK, and Kinesis are highly valued. Effective communication, collaboration with cross-functional teams, documentation skills, and experience in mentoring team members are also important aspects of this role.

Your accountabilities will include the construction and maintenance of data architectures and pipelines, designing and implementing data warehouses and data lakes, developing processing and analysis algorithms, and collaborating with data scientists to deploy machine learning models. You will also be expected to contribute to strategy, drive requirements for change, manage resources and policies, deliver continuous improvements, and demonstrate leadership behaviors if in a leadership role.

Ultimately, as a Data Platform Engineer Lead at Barclays in Pune, you will play a pivotal role in ensuring data accuracy, accessibility, and security while leveraging your technical expertise and collaborative skills to drive innovation and excellence in data management.
Posted 1 week ago
6.0 - 10.0 years
0 Lacs
karnataka
On-site
As an AWS Data Engineer at Quest Global, you will be responsible for designing, developing, and maintaining data pipelines while ensuring data quality and integrity within the MedTech industry. Your key responsibilities will include designing scalable data solutions on the AWS cloud platform, developing data pipelines using Databricks and PySpark, collaborating with cross-functional teams to understand data requirements, optimizing data workflows for improved performance, and ensuring data quality through validation and testing processes.

To be successful in this role, you should have a Bachelor's degree in Computer Science, Engineering, or a related field, along with at least 6 years of experience as a Data Engineer with expertise in AWS, Databricks, PySpark, and S3. You should possess a strong understanding of data architecture, data modeling, and data warehousing concepts, as well as experience with ETL processes, data integration, and data transformation. Excellent problem-solving skills and the ability to work in a fast-paced environment are also essential.

In terms of required skills and experience, you should have experience implementing cloud-based analytics solutions in Databricks (AWS) and S3, scripting experience building data processing pipelines with PySpark, and knowledge of Data Platform and Cloud (AWS) ecosystems. Working experience with AWS native services such as DynamoDB, Glue, MSK, S3, Athena, CloudWatch, Lambda, and IAM is important, as is expertise in ETL development, analytics applications development, and data migration. Exposure to all stages of the SDLC, strong SQL development skills, and proficiency in Python and PySpark development are also desired. Additionally, experience writing unit test cases using PyTest or similar tools would be beneficial.

If you are a talented AWS Data Engineer looking to make a significant impact in the MedTech industry, we invite you to apply for this exciting opportunity at Quest Global.
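The posting pairs PySpark development with unit testing in PyTest. As a rough sketch of how the two combine, here is a hypothetical test for a small filtering transformation; the function under test and its column names are assumptions for illustration.

```python
# Minimal PyTest sketch for a PySpark transformation.
# The transformation and column names are illustrative assumptions.
import pytest
from pyspark.sql import SparkSession
from pyspark.sql import functions as F


def keep_valid_readings(df):
    """Hypothetical transformation: keep rows with a positive reading."""
    return df.filter(F.col("reading") > 0)


@pytest.fixture(scope="session")
def spark():
    # A local SparkSession shared across the test session.
    session = SparkSession.builder.master("local[1]").appName("tests").getOrCreate()
    yield session
    session.stop()


def test_keep_valid_readings_drops_nonpositive(spark):
    df = spark.createDataFrame(
        [("a", 1.5), ("b", 0.0), ("c", -2.0)],
        ["device_id", "reading"],
    )
    result = keep_valid_readings(df).collect()
    assert [row.device_id for row in result] == ["a"]
```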
Posted 2 weeks ago
0.0 - 2.0 years
2 - 3 Lacs
Mumbai, Mumbai Suburban, Mumbai (All Areas)
Work from Office
We are looking for a qualified physiotherapist to join our team and take on the role of facilitating the treatment and therapy of patients who suffer from physical challenges. To be successful as a physiotherapist, you should have strong interpersonal skills, good administration skills, and strong knowledge of physiotherapy techniques. Ultimately, you should have a passion for helping people, strong organizational skills, and the ability to work well within a team.

Clinic: Physiobox - The Clinical Studio
Instagram handle: physiobox_india

Physiotherapist Responsibilities:
- Making assessments of patients' physical conditions.
- Formulating treatment plans to address the conditions and needs of patients on advanced machinery such as the Pilates Reformer and Wunda Chair.
- Conducting complex mobilization techniques.
- Assisting trauma patients with learning how to walk again.
- Educating patients, family members, and the community on how to prevent injuries and live a healthy lifestyle.
- Planning and organizing physiotherapy and fitness programs.

Physiotherapist Requirements:
- Degree in physiotherapy, i.e., BPT.
- Experience working as a physiotherapist (preferred).
- Good interpersonal skills.
- The ability to build and maintain rapport with patients.
- Administration skills.
- Tolerance and patience.
Posted 3 weeks ago
7.0 - 12.0 years
25 - 37 Lacs
Bengaluru
Work from Office
Job Summary:
We are seeking a skilled Senior NodeJS Developer with experience in software development to join our team in Bangalore. The ideal candidate will have a strong background in developing and maintaining scalable and secure microservices using TypeScript (Node.js) and supporting cloud-native services on AWS. This role is vital in delivering high-quality solutions that meet compliance, performance, and security requirements in the financial services industry.

Software Requirements:

Required Proficiency:
- TypeScript (Node.js) for developing RESTful APIs and microservices.
- AWS services including Lambda, Aurora PostgreSQL, and the Serverless Framework.
- CI/CD processes using GitHub Actions.
- Docker for containerization.

Preferred Proficiency:
- Experience with Kubernetes for container orchestration.
- Familiarity with Kafka (MSK) for event-driven architectures.

Overall Responsibilities:
- Develop and maintain scalable and secure microservices using TypeScript (Node.js).
- Support the implementation of cloud-native services on AWS.
- Translate technical and business requirements into well-structured, maintainable code adhering to best practices.
- Contribute to CI/CD workflows, ensuring clean code and comprehensive testing.
- Collaborate with cross-functional teams to deliver high-quality software solutions.
- Ensure that all work aligns with compliance, performance, and security requirements specific to the financial services sector.
Posted 4 weeks ago
10.0 - 12.0 years
32 Lacs
Hyderabad, Telangana, India
On-site
Job Description

Experian is seeking a seasoned Software Engineering Manager to lead a team of talented cloud-native Java and Node.js engineers supporting our enterprise-grade, consumer-permissioned data platform. This role is pivotal in driving the development and delivery of scalable, secure, and high-performance services in a cloud-native environment. You will collaborate closely with cross-functional teams based in the U.S., including Engineering, Quality Assurance, Product Management, and Project Management, to ensure alignment on requirements, timelines, and deliverables. This role's primary responsibility is managing the team, but the ideal candidate should also be capable of contributing to the codebase as time permits using Java, Spring, and Node.js in an AWS environment.

Qualifications
- 10+ years of hands-on experience as a software engineer, with strong proficiency in Java and Node.js.
- Experience building and scaling enterprise data platforms.
- Diligently observe and help maintain standards for regulatory compliance and information security.
- Familiarity with data privacy and security best practices preferred.
- 5+ years of experience managing software development teams.
- Lead, mentor, and grow a team of software engineers working on cloud-native applications.
- Oversee the delivery of well-tested, robust, and efficient software while following software development best practices.
- Ensure high-quality software development practices, including code reviews, testing, and CI/CD.
- Collaborate with U.S.-based stakeholders to define technical requirements, project scope, and delivery timelines.
- Solid understanding of Agile/Scrum methodologies.
- Excellent communication, collaboration, and mentoring skills.
- Own deliverables from ideation to production operationalization.
- Experience working with distributed teams across time zones preferred.
- Proven experience working in cloud environments, preferably AWS.
- Strong understanding of AWS services including ECS Fargate, S3, RDS, Lambda, SQS, and MSK (or Kafka).
- Experience with NATS.io is a plus.
- Proven experience integrating with third-party HTTP APIs, typically leveraging JSON payloads.
- Java engineers should have strong experience with the Spring and Spring Cloud frameworks.
- Proficiency with development and monitoring tools such as GitHub, Splunk, DataDog, and Jira.
- Contribute to the codebase as needed, providing hands-on support and technical guidance.
- Foster a culture of continuous improvement, innovation, and accountability.
- Drive adoption of best practices in cloud architecture, microservices, and DevOps.
- Troubleshoot system functionality and performance using tools like Splunk and DataDog.

Additional Information

Our uniqueness is that we celebrate yours. Experian's culture and people are important differentiators. We take our people agenda very seriously and focus on what matters: DEI, work/life balance, development, authenticity, collaboration, wellness, reward & recognition, volunteering... the list goes on. Experian's people-first approach is award-winning: World's Best Workplaces 2024 (Fortune Top 25), Great Place To Work in 24 countries, and Glassdoor Best Places to Work 2024, to name a few. Check out Experian Life on social or our Careers Site to understand why. Experian is proud to be an Equal Opportunity and Affirmative Action employer.
Innovation is an important part of Experian's DNA and practices, and our diverse workforce drives our success. Everyone can succeed at Experian and bring their whole self to work, irrespective of their gender, ethnicity, religion, colour, sexuality, physical ability or age. If you have a disability or special need that requires accommodation, please let us know at the earliest opportunity. Experian Careers - Creating a better tomorrow together
Posted 1 month ago
5.0 - 8.0 years
1 - 2 Lacs
Kochi
Remote
Job Description:

Job Title: AWS Cloud Engineer
Location: Kochi, India (Remote Option Available)

We are looking for a highly skilled AWS Cloud Engineer with a minimum of 5 years of hands-on experience in AWS cloud technologies. The ideal candidate will have strong expertise in AWS services such as S3, EC2, MSK, Glue, DMS, and SageMaker, along with solid development experience in Python and Docker. This role involves troubleshooting issues, reviewing solution designs, and coding high-quality implementations.

Key Responsibilities:
- Work extensively with AWS services including S3, EC2, MSK, Glue, DMS, and SageMaker
- Develop, containerize, and deploy applications using Python and Docker
- Design and review system architecture and cloud-based solutions
- Troubleshoot and resolve issues in AWS infrastructure and application layers
- Collaborate with development and DevOps teams to build scalable and secure applications

Preferred candidate profile

Requirements:
- Minimum 5 years of hands-on experience in AWS Cloud
- Proficiency in Python and containerization using Docker
- Strong understanding of AWS data and streaming services
- Experience with AWS Glue, DMS, and SageMaker
- Ability to troubleshoot issues, analyze root causes, and implement effective solutions
- Strong communication and problem-solving skills

Preferred Qualifications:
- AWS Certification (Associate or Professional level) is a plus
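As a small illustration of the Python-on-AWS work described above, here is a minimal boto3 sketch that inspects an S3 bucket and an EC2 instance, the kind of check that comes up when troubleshooting infrastructure; the bucket name, instance ID, and region are hypothetical.

```python
# Minimal boto3 sketch: inspect S3 and EC2 resources.
# Bucket name, instance ID, and region are illustrative assumptions.
import boto3

session = boto3.Session(region_name="ap-south-1")

# List objects under a prefix in a (hypothetical) bucket.
s3 = session.client("s3")
response = s3.list_objects_v2(Bucket="example-data-bucket", Prefix="raw/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"])

# Check the state of a (hypothetical) EC2 instance.
ec2 = session.client("ec2")
reservations = ec2.describe_instances(InstanceIds=["i-0123456789abcdef0"])
for reservation in reservations["Reservations"]:
    for instance in reservation["Instances"]:
        print(instance["InstanceId"], instance["State"]["Name"])
```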
Posted 1 month ago
5.0 - 8.0 years
10 - 15 Lacs
Kochi
Remote
We are looking for a skilled AWS Cloud Engineer with a minimum of 5 years of hands-on experience in managing and implementing cloud-based solutions on AWS. The ideal candidate will have expertise in AWS core services such as S3, EC2, MSK, Glue, DMS, and SageMaker, along with strong programming and containerization skills using Python and Docker.

- Design, implement, and manage scalable AWS cloud infrastructure solutions.
- Hands-on experience with AWS services: S3, EC2, MSK, Glue, DMS, and SageMaker.
- Develop, deploy, and maintain Python-based applications in cloud environments.
- Containerize applications using Docker and manage deployment pipelines.
- Troubleshoot infrastructure and application issues, review designs, and code solutions.
- Ensure high availability, performance, and security of cloud resources.
- Collaborate with cross-functional teams to deliver reliable and scalable solutions.
Posted 1 month ago
10.0 - 15.0 years
12 - 17 Lacs
Bengaluru
Work from Office
Grade: 7

Purpose of your role
This role sits within the ISS Data Platform Team. The Data Platform team is responsible for building and maintaining the platform that enables the ISS business to operate. This role is appropriate for a Lead Data Engineer capable of taking ownership of, and delivering, a subsection of the wider data platform.

Key Responsibilities
- Design, develop and maintain scalable data pipelines and architectures to support data ingestion, integration and analytics.
- Be accountable for technical delivery and take ownership of solutions.
- Lead a team of senior and junior developers, providing mentorship and guidance.
- Collaborate with enterprise architects, business analysts and stakeholders to understand data requirements, validate designs and communicate progress.
- Drive technical innovation within the department to increase code reusability, code quality and developer productivity.
- Challenge the status quo by bringing the very latest data engineering practices and techniques.

Essential Skills and Experience

Core Technical Skills
- Expert in leveraging cloud-based data platform (Snowflake, Databricks) capabilities to create an enterprise lakehouse.
- Advanced expertise with the AWS ecosystem and experience using a variety of core AWS data services, such as Lambda, EMR, MSK, Glue and S3.
- Experience designing event-based or streaming data architectures using Kafka.
- Advanced expertise in Python and SQL. Open to expertise in Java/Scala, but enterprise experience of Python is required.
- Expert in designing, building and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation.
- Data Security & Performance Optimization: Experience implementing data access controls to meet regulatory requirements.
- Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (Dynamo, OpenSearch, Redis) offerings.
- Experience implementing CDC ingestion.
- Experience using orchestration tools (Airflow, Control-M, etc.); a sketch of a typical orchestration setup follows this posting.

Bonus Technical Skills
- Strong experience in containerisation and experience deploying applications to Kubernetes.
- Strong experience in API development using Python-based frameworks like FastAPI.

Key Soft Skills
- Problem-Solving: Leadership experience in problem-solving and technical decision-making.
- Communication: Strong in strategic communication and stakeholder engagement.
- Project Management: Experienced in overseeing project lifecycles, working with Project Managers to manage resources.
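As referenced in the orchestration bullet above, pipelines like these are commonly scheduled with a tool such as Airflow. Here is a minimal, hypothetical Airflow 2-style DAG sketch that chains an ingest step and a transform step; the task logic, DAG ID, and schedule are assumptions for illustration.

```python
# Minimal Airflow DAG sketch: ingest -> transform, run daily.
# Task bodies, DAG ID, and schedule are illustrative assumptions.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def ingest():
    # Placeholder for pulling raw data into the landing zone.
    print("ingesting raw data")


def transform():
    # Placeholder for curating the landed data.
    print("transforming landed data")


with DAG(
    dag_id="example_daily_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    ingest_task = PythonOperator(task_id="ingest", python_callable=ingest)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    # Transform runs only after ingest succeeds.
    ingest_task >> transform_task
```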
Posted 2 months ago
6.0 - 8.0 years
8 - 10 Lacs
Pune
Work from Office
Hello Visionary!

We empower our people to stay resilient and relevant in a constantly changing world. We're looking for people who are always searching for creative ways to grow and learn, people who want to make a real impact, now and in the future. Does that sound like you? Then it seems like you'd make a great addition to our vibrant team.

Siemens founded the new business unit Siemens Foundational Technologies (formerly known as Siemens IoT Services) on April 1, 2019, with its headquarters in Munich, Germany. It has been crafted to unlock the digital future of its clients by offering end-to-end support on their outstanding digitalization journey. Siemens Foundational Technologies is a strategic advisor and a trusted implementation partner in digital transformation and industrial IoT, with a global network of more than 8,000 employees in 10 countries and 21 offices. Highly skilled and experienced specialists offer services ranging from consulting to craft & prototyping to solution & implementation and operation, everything out of one hand.

We are looking for a Senior DevOps Engineer.

You'll make a difference by:

Key Responsibilities:
- Design, implement, and maintain CI/CD pipelines using GitLab, including configuring GitLab Runners.
- Build, manage, and scale containerized applications using Docker, Kubernetes, and Helm.
- Automate infrastructure provisioning and management with Terraform.
- Manage and optimize cloud-based environments, especially AWS.
- Administer and optimize Kafka clusters for data streaming and processing.
- Oversee the performance and reliability of databases and Linux environments.
- Monitor and enhance system health using tools like Prometheus and Grafana.
- Collaborate with cross-functional teams to implement DevOps best practices.
- Ensure system security, scalability, and disaster recovery readiness.
- Troubleshoot and resolve technical issues across the infrastructure.

Required Skills & Qualifications:
- 6-8 years of experience in DevOps, system administration, or a related role.
- Expertise in CI/CD tools and workflows, especially GitLab Pipelines and GitLab Runners.
- Proficient in containerization and orchestration tools like Docker, Kubernetes, and Helm.
- Strong hands-on experience with Docker Swarm, including creating and managing Docker clusters.
- Proficiency in packaging Docker images for deployment.
- Strong hands-on experience with Kubernetes, including managing clusters and deploying applications.
- Strong hands-on experience with Terraform for Infrastructure as Code (IaC).
- In-depth knowledge of AWS services, including EC2, S3, IAM, EKS, MSK, Route53, and VPC.
- Solid experience in managing and maintaining Kafka ecosystems.
- Strong Linux system administration skills.
- Proficiency in database management, optimization, and troubleshooting.
- Experience with monitoring tools like Prometheus and Grafana.
- Excellent scripting skills in languages like Bash and Python.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication skills and a collaborative mindset.

Good to Have Skills:
- Experience with Keycloak for identity and access management.
- Familiarity with Nginx or Traefik for reverse proxy and load balancing.
- Hands-on experience in PostgreSQL maintenance, including backups, tuning, and troubleshooting.
- Knowledge of the railway domain, including industry-specific challenges and standards.
- Experience in implementing and managing high-availability architectures.
- Exposure to distributed systems and microservices architecture.
Desired Skills:
- 5-8 years of experience is required.
- Great communication skills.
- Analytical and problem-solving skills.

This role is based in Pune and is an individual contributor role. You might be required to visit other locations within India and outside. In return, you'll get the chance to work with teams impacting entire cities, countries, and the shape of things to come.
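The monitoring and scripting requirements in the posting above often meet in a small custom exporter. Here is a minimal sketch using the prometheus_client Python library that exposes a counter and a gauge for Prometheus to scrape; the metric names and the simulated workload are hypothetical.

```python
# Minimal Prometheus exporter sketch using prometheus_client.
# Metric names and the simulated workload are illustrative assumptions.
import random
import time

from prometheus_client import Counter, Gauge, start_http_server

JOBS_PROCESSED = Counter("jobs_processed_total", "Number of jobs processed")
QUEUE_DEPTH = Gauge("job_queue_depth", "Current number of queued jobs")

if __name__ == "__main__":
    # Expose metrics at http://localhost:8000/metrics for Prometheus to scrape.
    start_http_server(8000)
    while True:
        # Simulate work and update the metrics.
        JOBS_PROCESSED.inc()
        QUEUE_DEPTH.set(random.randint(0, 50))
        time.sleep(5)
```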
Posted 2 months ago
3.0 - 5.0 years
5 - 7 Lacs
Pune
Work from Office
You'll make a difference by:

Key Responsibilities:
- Design, implement, and maintain CI/CD pipelines using GitLab, including configuring GitLab Runners.
- Build, manage, and scale containerized applications using Docker, Kubernetes, and Helm.
- Automate infrastructure provisioning and management with Terraform.
- Manage and optimize cloud-based environments, especially AWS.
- Administer and optimize Kafka clusters for data streaming and processing.
- Oversee the performance and reliability of databases and Linux environments.
- Monitor and enhance system health using tools like Prometheus and Grafana.
- Collaborate with cross-functional teams to implement DevOps best practices.
- Ensure system security, scalability, and disaster recovery readiness.
- Troubleshoot and resolve technical issues across the infrastructure.

Required Skills & Qualifications:
- 3-5 years of experience in DevOps, system administration, or a related role.
- Expertise in CI/CD tools and workflows, especially GitLab Pipelines and GitLab Runners.
- Proficient in containerization and orchestration tools like Docker, Kubernetes, and Helm.
- Strong hands-on experience with Docker Swarm, including creating and managing Docker clusters.
- Proficiency in packaging Docker images for deployment.
- Strong hands-on experience with Kubernetes, including managing clusters and deploying applications.
- Strong hands-on experience with Terraform for Infrastructure as Code (IaC).
- In-depth knowledge of AWS services, including EC2, S3, IAM, EKS, MSK, Route53, and VPC.
- Solid experience in managing and maintaining Kafka ecosystems.
- Strong Linux system administration skills.
- Proficiency in database management, optimization, and troubleshooting.
- Experience with monitoring tools like Prometheus and Grafana.
- Excellent scripting skills in languages like Bash and Python.
- Strong problem-solving skills and the ability to work in a fast-paced environment.
- Excellent communication skills and a collaborative mindset.

Good to Have Skills:
- Experience with Keycloak for identity and access management.
- Familiarity with Nginx or Traefik for reverse proxy and load balancing.
- Hands-on experience in PostgreSQL maintenance, including backups, tuning, and troubleshooting.
- Knowledge of the railway domain, including industry-specific challenges and standards.
- Experience in implementing and managing high-availability architectures.
- Exposure to distributed systems and microservices architecture.

Desired Skills:
- 3-5 years of experience is required.
- Great communication skills.
- Analytical and problem-solving skills.

Find out more about Siemens careers at: & more about mobility at
Posted 2 months ago