Jobs
Interviews

563 S3 Jobs - Page 4

JobPe aggregates results for easy application access, but you actually apply on the job portal directly.

6.0 - 8.0 years

15 - 27 Lacs

Bengaluru

Work from Office

Job Summary: We are seeking a Senior Data Engineer to join our growing data team, where you will help build and scale the data infrastructure powering analytics, machine learning, and product innovation. As a Senior Data Engineer, you will be responsible for designing, building, and optimizing robust, scalable, and secure data pipelines and platforms. You will work closely with data scientists, software engineers, and product teams to deliver clean, reliable data for critical business and clinical applications.

Key Responsibilities:
- Design, implement, and optimize complex data pipelines using advanced SQL, ETL tools, and integration technologies.
- Collaborate with cross-functional teams to implement optimal data solutions for advanced analytics and data science initiatives.
- Spearhead process improvements, including automation, data delivery optimization, and infrastructure redesign for scalability.
- Evaluate and recommend emerging data technologies to build comprehensive data integration strategies.
- Lead technical discovery processes, defining complex requirements and mapping out detailed scenarios.
- Develop and maintain data governance policies and procedures.

What You'll Need to Be Successful (Required Skills):
- 5-7 years of experience in data engineering or related roles.
- Advanced proficiency in multiple programming languages (e.g., Python, Java, Scala) and expert-level SQL knowledge.
- Extensive experience with big data technologies (Hadoop ecosystem, Spark, Kafka) and cloud-based environments (Azure, AWS, or GCP).
- Proven experience in designing and implementing large-scale data warehousing solutions.
- Deep understanding of data modeling techniques and enterprise-grade ETL tools.
- Demonstrated ability to solve complex analytical problems.

Education/Certifications:
- Bachelor's degree in Computer Science, Information Management, or a related field.

Preferred Skills:
- Experience in the healthcare industry, including clinical, financial, and operational data.
- Knowledge of machine learning and AI technologies and their data requirements.
- Familiarity with data visualization tools and real-time data processing.
- Understanding of data privacy regulations and experience implementing compliant solutions.

Note: We work 5 days from the office, India regular shift. Netsmart, India has set up our new Global Capability Centre (GCC) at Godrej Centre, Byatarayanapura (Hebbal area) - https://maps.app.goo.gl/RviymAeGSvKZESSo6
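The data-quality and pipeline responsibilities described above can be sketched in Python. This is a minimal, hypothetical example of the kind of validation gate a pipeline stage might apply before loading clinical records downstream; the field names (`patient_id`, `visit_date`) and the rules are illustrative assumptions, not taken from the posting.

```python
# Illustrative data-quality gate for a pipeline stage (hypothetical schema and rules).
from dataclasses import dataclass, field


@dataclass
class QualityReport:
    total: int = 0
    failures: list = field(default_factory=list)  # (row index, missing fields)


def validate_records(records, required_fields=("patient_id", "visit_date")):
    """Split records into clean rows and a report of rows missing required values."""
    report = QualityReport(total=len(records))
    clean = []
    for i, rec in enumerate(records):
        missing = [f for f in required_fields if not rec.get(f)]
        if missing:
            report.failures.append((i, missing))
        else:
            clean.append(rec)
    return clean, report


rows = [
    {"patient_id": "P1", "visit_date": "2024-01-05"},
    {"patient_id": "", "visit_date": "2024-01-06"},  # fails: empty patient_id
]
clean, report = validate_records(rows)
# clean keeps row 0; report.failures records row 1 with its missing field
```

In a real pipeline the report would feed a monitoring or alerting step rather than being discarded.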

Posted 6 days ago

Apply

3.0 - 5.0 years

12 - 18 Lacs

Pune

Work from Office

Job Overview: We are seeking a skilled Java Developer with a strong DevOps mindset to join our team. This role offers a balanced blend of backend development and hands-on deployment responsibilities. The ideal candidate will be proficient in Core Java, Spring, and Hibernate, with solid experience in AWS infrastructure, CI/CD pipelines, and Linux systems.

Key Responsibilities:

Development (40%):
- Design, develop, and enhance Java-based backend applications using Spring and Hibernate.
- Write clean, maintainable, and well-documented code.
- Build and manage database interactions using Oracle/SQL.
- Collaborate with business and QA teams to translate requirements into scalable solutions.

Deployment & DevOps (60%):
- Manage application deployment and infrastructure on AWS (EC2, RDS, S3).
- Develop and maintain CI/CD pipelines using GitLab/Git and Jenkins.
- Automate deployment tasks using tools like Ansible and Docker (good to have).
- Monitor system health, troubleshoot issues, and implement fixes in a timely manner.
- Ensure high availability, scalability, and security of applications in production environments.

Mandatory Skills:
- Core Java, Spring Framework, Hibernate.
- Strong experience with Oracle/SQL databases.
- Hands-on experience with Linux environments.
- Working knowledge of AWS services (EC2, RDS, S3).
- Proficiency with Git/GitLab version control systems.
- Experience in setting up and maintaining Jenkins pipelines.

Good to Have:
- Experience with Ansible and Docker.
- Exposure to Agile/Scrum development practices.
- Familiarity with containerization and infrastructure as code (IaC).

Preferred Attributes:
- Ability to shift seamlessly between development and deployment responsibilities.
- Strong analytical and troubleshooting skills.
- Effective communicator and a proactive team player.

Posted 6 days ago

Apply


3.0 - 7.0 years

0 Lacs

Kochi, Kerala

On-site

You should have experience in designing and building serverless data lake solutions using a layered component architecture, with expertise across the ingestion, storage, processing, security & governance, data cataloguing & search, and consumption layers. Proficiency in AWS serverless technologies such as Lake Formation, Glue, Glue Python, Glue Workflows, Step Functions, S3, Redshift, QuickSight, Athena, AWS Lambda, and Kinesis is required; hands-on experience with Glue is essential. You must be skilled in designing, building, orchestrating, and deploying multi-step data processing pipelines using Python and Java. Experience in managing source data access security, configuring authentication and authorization, and enforcing data policies and standards is also necessary, as is familiarity with AWS environment setup and configuration. A minimum of 6 years of relevant experience, with at least 3 years building solutions on AWS, is mandatory. The ability to work under pressure and a commitment to meeting customer expectations are essential traits for this role. If you meet these requirements and are ready to take on this challenging opportunity, please reach out to hr@Stanratech.com.
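Multi-step pipelines like those described above are commonly orchestrated with Step Functions, whose workflows are defined in Amazon States Language. Below is a minimal sketch that assembles such a definition as a Python dict; the job and function names are hypothetical, while the `Resource` ARNs follow the documented Step Functions service-integration patterns for Glue and Lambda.

```python
import json

# Hypothetical two-step pipeline: run a Glue job, then publish to the consumption layer.
definition = {
    "Comment": "Sketch of a multi-step data processing pipeline (Amazon States Language)",
    "StartAt": "RunGlueJob",
    "States": {
        "RunGlueJob": {
            "Type": "Task",
            # Documented optimized integration: wait for the Glue job run to finish.
            "Resource": "arn:aws:states:::glue:startJobRun.sync",
            "Parameters": {"JobName": "curate-raw-zone"},  # hypothetical job name
            "Next": "PublishToConsumptionLayer",
        },
        "PublishToConsumptionLayer": {
            "Type": "Task",
            "Resource": "arn:aws:states:::lambda:invoke",
            "Parameters": {"FunctionName": "publish-curated"},  # hypothetical function
            "End": True,
        },
    },
}

# The JSON form is what you would pass when creating the state machine.
print(json.dumps(definition, indent=2))
```

In practice this JSON would be deployed via CloudFormation, Terraform, or `boto3`, with retry and error-catch clauses added per state.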

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Indore, Madhya Pradesh

On-site

At InfoBeans, we believe in making other people's lives better through our work and everyday interactions. We are currently looking for a Java Full Stack with Angular professional to join our team in Indore/Pune. With a focus on web application development and over 10 years of experience, you will play a crucial role in developing microservices using Spring/AWS technologies and deploying them on the AWS platform.

Your responsibilities will include supporting Java Angular enterprise applications with multi-region setups, performing unit and system testing of application code, and executing implementation activities. You will be involved in designing, building, and testing Java EE and Angular full stack applications.

In this role, you will work in an open workspace with smart and pragmatic team members, with ever-growing opportunities for professional and personal growth in a learning culture that encourages teamwork, collaboration, and diversity. Excellence, compassion, openness, and ownership are highly valued and rewarded in our environment.

To excel in this role, we expect you to have in-depth knowledge of popular Java frameworks such as Spring Boot and Spring, experience with Object-Oriented Design (OOD), and proficiency in Spring, Spring Boot, relational databases, MySQL, and ORM technologies (JPA2, Hibernate). Experience working in Agile (Scrum/Lean) with a DevSecOps focus is essential, along with familiarity with AWS, Kubernetes, Docker containers, and AWS component usage, configuration, and deployment, including Elasticsearch, EC2, S3, SNS, SQS, API Gateway, and Kinesis. An AWS certification would be advantageous, and any knowledge of health and related technologies will be beneficial in this role.

If you are looking for a challenging yet rewarding opportunity to contribute to cutting-edge projects in a supportive and dynamic environment, we encourage you to apply for this position.

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

The role involves providing production support for trading applications and requires the candidate to be comfortable working in a rotational shift (7 AM - 4 PM / 11 AM - 8 PM / 1 PM - 10 PM). The applications have transitioned from on-premises to AWS cloud, necessitating strong experience in AWS services such as EC2, S3, and Kubernetes. Monitoring overnight batch jobs is also a key responsibility.

Key Requirements:
- Proficiency in AWS services like EC2, S3, Kubernetes, CloudWatch, etc.
- Familiarity with monitoring tools like Datadog, Grafana, Prometheus.

Good to have:
- Basic understanding of SQL.
- Experience in utilizing Control-M/Autosys schedulers.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Navi Mumbai, Maharashtra

On-site

Seekify Global is looking for an experienced and driven Data Catalog Engineer to join the Data Engineering team. The ideal candidate should have a strong background in designing and implementing metadata and data catalog solutions, specifically in AWS-centric data lake and data warehouse environments. As a Data Catalog Engineer, you will play a crucial role in improving data discoverability, governance, and lineage across the organization's data assets.

Your key responsibilities will include leading the end-to-end implementation of a data cataloging solution within AWS, establishing and managing metadata frameworks for diverse data assets, integrating the data catalog with AWS-based storage solutions, collaborating with various project teams to define metadata standards and processes, developing automation scripts for metadata management, working closely with other data professionals to ensure data accuracy, and implementing access controls to comply with data privacy standards.

The ideal candidate should possess at least 7-8 years of experience in data engineering or metadata management roles, with proven expertise in implementing data catalog solutions within AWS environments. Strong knowledge of AWS services such as Glue, S3, Athena, Redshift, EMR, Data Catalog, and Lake Formation is essential. Proficiency in Python, SQL, and automation scripting for metadata pipelines is required, along with familiarity with data governance and compliance standards. Experience with BI tools and third-party catalog tools is a plus.

Preferred qualifications include AWS certifications, experience with data catalog tools like Alation, Collibra, or Informatica EDC, exposure to data quality frameworks and stewardship practices, and knowledge of data migration processes.

This is a full-time position that requires in-person work.
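The metadata-framework and automation-scripting work this posting describes can be illustrated with a small Python helper that assembles a catalog entry for an S3 data asset. Everything here is a hypothetical sketch: the record schema, the naive PII heuristic, and the field names are assumptions for illustration, not the API of any real catalog tool.

```python
from datetime import datetime, timezone

# Naive illustrative heuristic; real catalogs use classifiers, not a fixed set.
PII_COLUMNS = {"email", "ssn", "phone"}


def build_catalog_entry(s3_path, columns, owner, classification="internal"):
    """Assemble a minimal metadata record for a data-catalog ingest (hypothetical schema)."""
    bucket, _, key = s3_path.removeprefix("s3://").partition("/")
    return {
        "asset_id": f"{bucket}:{key}",
        "location": s3_path,
        "columns": [{"name": c, "pii": c in PII_COLUMNS} for c in columns],
        "owner": owner,
        "classification": classification,
        "registered_at": datetime.now(timezone.utc).isoformat(),
    }


entry = build_catalog_entry(
    "s3://lake/curated/orders/", ["order_id", "email"], owner="data-eng"
)
# entry flags the "email" column as PII and keys the asset by bucket:prefix
```

An automation script would emit such records for each new S3 prefix and push them to the catalog's ingest API.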

Posted 1 week ago

Apply

5.0 - 13.0 years

0 Lacs

Pune, Maharashtra

On-site

We are seeking talented and experienced individuals to join our engineering team in the roles of Staff Development Engineer and Senior Software Development Engineer (SDE 3). As a member of our team, you will take ownership of complex projects, designing and constructing high-performance, scalable systems. In the role of SDE 3, you will play a crucial part in ensuring that the solutions we develop are not only robust but also efficient. This is a hands-on position that requires you to lead projects from concept to deployment, ensuring the delivery of top-notch, production-ready code. Given the fast-paced environment, strong problem-solving skills and a dedication to crafting exceptional software are indispensable.

Your responsibilities will include:
- Developing high-quality, secure, and scalable enterprise-grade backend components in alignment with technical requirements specifications and design artifacts, within the expected time and budget.
- Demonstrating a proficient understanding of the choice of technology and its application, supported by thorough research.
- Identifying, troubleshooting, and ensuring the timely resolution of software defects.
- Participating in functional specification, design, and code reviews.
- Adhering to established practices for the development and upkeep of application code.
- Taking an active role in reducing the technical debt across our various codebases.

We are looking for candidates with the following qualifications:
- Proficiency in Python programming and frameworks such as Flask/FastAPI.
- Prior experience in constructing REST API-based microservices.
- Excellent knowledge and hands-on experience with RDBMS (e.g., MySQL, PostgreSQL), message brokers, caching, and queueing systems.
- Experience with NoSQL databases is preferred.
- An aptitude for research and development, exploring new topics and use cases.
- Hands-on experience with AWS services like EC2, SQS, Fargate, Lambda, and S3.
- Knowledge of Docker for application containerization.
- Cybersecurity knowledge is considered advantageous.
- Strong technical background with the ability to swiftly adapt to emerging technologies.
- Desired experience: 5-13 years in software engineering for Staff or SDE 3 roles.

Working Conditions: This role requires full-time office-based work; remote work arrangements are not available.

Company Culture: At Fortinet, we uphold a culture of innovation, collaboration, and continuous learning. We are dedicated to fostering an inclusive environment where every employee is valued and respected. We encourage applications from individuals of all backgrounds and identities. Our competitive Total Rewards package is designed to assist you in managing your overall health and financial well-being. We also offer flexible work arrangements and a supportive work environment. If you are looking for a challenging, fulfilling, and rewarding career journey, we invite you to consider joining us and contributing solutions that have a meaningful and enduring impact on our 660,000+ global customers.
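Since the posting calls out caching and queueing systems alongside REST microservices, here is a minimal, illustrative in-process cache with per-key TTL in Python. Production services would normally reach for Redis or a similar store; this sketch only shows the expiry pattern itself, and all names are assumptions.

```python
import time


class TTLCache:
    """Minimal in-process cache with per-key expiry (sketch of the pattern only)."""

    def __init__(self, ttl_seconds=60.0):
        self.ttl = ttl_seconds
        self._store = {}  # key -> (value, expiry timestamp)

    def set(self, key, value):
        self._store[key] = (value, time.monotonic() + self.ttl)

    def get(self, key, default=None):
        item = self._store.get(key)
        if item is None:
            return default
        value, expires = item
        if time.monotonic() >= expires:
            del self._store[key]  # lazily evict on read
            return default
        return value


cache = TTLCache(ttl_seconds=0.05)
cache.set("user:1", {"name": "A"})
assert cache.get("user:1") == {"name": "A"}
time.sleep(0.06)
assert cache.get("user:1") is None  # entry has expired
```

Using `time.monotonic()` rather than `time.time()` keeps expiry correct even if the wall clock is adjusted.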

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

Pune, Maharashtra

On-site

Join us as a Cloud Data Engineer at Barclays, where you'll spearhead the evolution of the digital landscape, driving innovation and excellence. You'll harness cutting-edge technology to revolutionize digital offerings, ensuring unparalleled customer experiences. You may be assessed on key critical skills relevant for success in the role, such as risk and control, change and transformations, business acumen, strategic thinking, and digital technology, as well as job-specific skill sets.

To be successful as a Cloud Data Engineer, you should have:
- Experience with AWS Cloud technology for data processing and a good understanding of AWS architecture.
- Experience with compute services such as EC2, Lambda, Auto Scaling, and VPC.
- Experience with storage and container services such as ECS, S3, DynamoDB, and RDS.
- Experience with management & governance services: KMS, IAM, CloudFormation, CloudWatch, CloudTrail.
- Experience with analytics services such as Glue, Athena, Crawler, Lake Formation, and Redshift.
- Experience with solution delivery for data processing components in larger end-to-end projects.

Desirable skill sets/good to have:
- AWS Certified professional.
- Experience in data processing on Databricks and Unity Catalog.
- Ability to drive projects technically, with right-first-time deliveries within schedule and budget.
- Ability to collaborate across teams to deliver complex systems and components and manage stakeholders' expectations well.
- Understanding of different project methodologies, project lifecycles, major phases, dependencies and milestones within a project, and the required documentation needs.
- Experience with planning, estimating, organizing, and working on multiple projects.

This role will be based out of Pune.

Purpose of the role: To build and maintain systems that collect, store, process, and analyze data, such as data pipelines, data warehouses, and data lakes, ensuring that all data is accurate, accessible, and secure.

Accountabilities:
- Build and maintain data architecture pipelines that enable the transfer and processing of durable, complete, and consistent data.
- Design and implement data warehouses and data lakes that manage appropriate data volumes and velocity and adhere to required security measures.
- Develop processing and analysis algorithms fit for the intended data complexity and volumes.
- Collaborate with data scientists to build and deploy machine learning models.

Analyst Expectations:
- Will have an impact on the work of related teams within the area.
- Partner with other functions and business areas.
- Take responsibility for the end results of a team's operational processing and activities.
- Escalate breaches of policies/procedures appropriately.
- Take responsibility for embedding new policies/procedures adopted due to risk mitigation.
- Advise and influence decision making within own area of expertise.
- Take ownership for managing risk and strengthening controls in relation to the work you own or contribute to.
- Deliver your work and areas of responsibility in line with relevant rules, regulations, and codes of conduct.
- Maintain and continually build an understanding of how your own sub-function integrates with the function, alongside knowledge of the organization's products, services, and processes within the function.
- Demonstrate understanding of how areas coordinate and contribute to the achievement of the objectives of the organization's sub-function.
- Resolve problems by identifying and selecting solutions through the application of acquired technical experience, guided by precedents.
- Guide and persuade team members and communicate complex/sensitive information.
- Act as a contact point for stakeholders outside of the immediate function, while building a network of contacts outside the team and external to the organization.

All colleagues will be expected to demonstrate the Barclays Values of Respect, Integrity, Service, Excellence, and Stewardship, our moral compass, helping us do what we believe is right. They will also be expected to demonstrate the Barclays Mindset to Empower, Challenge, and Drive, the operating manual for how we behave.

Posted 1 week ago

Apply

5.0 - 9.0 years

0 Lacs

Pune, Maharashtra

On-site

You have an opportunity to impact your career and embark on an adventure where you can push the limits of what's possible. As a Manager of Software Engineering - Cloud at JPMorgan Chase, you will lead a team of cloud engineers to develop and implement scalable, reliable, and secure cloud-based solutions. Your role will be pivotal in shaping the cloud strategy and architecture, ensuring alignment with business goals and technical requirements. Your leadership will drive innovation and operational excellence in cloud technologies, fostering a collaborative environment to achieve project objectives.

You will be responsible for leading and mentoring a team of cloud engineers, fostering a culture of innovation and continuous improvement. Collaboration with technical teams and business stakeholders to propose and implement cloud solutions that meet current and future needs will be a key aspect of your role. You will define and drive the technical target state of cloud products, ensuring alignment with strategic goals, and participate in architecture governance bodies to ensure compliance with best practices and standards.

Your expertise will be crucial in evaluating and providing feedback on new cloud technologies and recommending solutions for the future state architecture. You will oversee the design, development, and deployment of cloud-based solutions on AWS, utilizing services such as EC2, S3, Lambda, and RDS. Your responsibilities include integrating DevOps practices, such as Infrastructure as Code (IaC) using tools like Terraform and AWS CloudFormation and configuration management with Ansible or Chef, as well as establishing and maintaining Continuous Integration/Continuous Deployment (CI/CD) pipelines using Jenkins, GitLab CI, or AWS CodePipeline. Identifying opportunities to automate remediation of recurring issues to improve operational stability of cloud applications and systems will be essential. You will also lead evaluation sessions with external vendors, startups, and internal teams to assess architectural designs and technical credentials.

Required Qualifications, Capabilities, and Skills:
- Formal training or certification in cloud engineering concepts with 5+ years of applied experience.
- Proven experience in leading cloud engineering teams and delivering cloud solutions.
- Advanced proficiency in one or more programming languages.
- Expertise in automation and continuous delivery methods.
- Proficient in all aspects of the Software Development Life Cycle, with a focus on cloud technologies.
- Advanced understanding of agile methodologies such as CI/CD, application resiliency, and security.
- Demonstrated proficiency in cloud applications and technical processes within a technical discipline (e.g., cloud, artificial intelligence, machine learning, mobile).
- Practical cloud-native experience, particularly with AWS services and architecture, including VPC, IAM, and CloudWatch.

Preferred Qualifications, Capabilities, and Skills:
- In-depth knowledge of the financial services industry and its IT systems.
- Advanced knowledge of cloud software, applications, and architecture disciplines.
- Ability to evaluate current and emerging cloud technologies to recommend the best solutions for the future state architecture.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

Pune, Maharashtra

On-site

As a DataOps Engineer, you will play a crucial role within our data engineering team, operating in the realm that merges software engineering, DevOps, and data analytics. Your primary responsibility will be creating and managing secure, scalable, and production-ready data pipelines and infrastructure that support advanced analytics, machine learning, and real-time decision-making capabilities for our clientele.

Your key duties will include designing, developing, and overseeing the implementation of robust, scalable, and efficient ETL/ELT pipelines, leveraging Python and contemporary DataOps methodologies. You will also incorporate data quality checks, pipeline monitoring, and error-handling mechanisms, and construct data solutions using cloud-native services on AWS such as S3, ECS, Lambda, and CloudWatch.

Furthermore, your role will entail containerizing applications using Docker and orchestrating them via Kubernetes to facilitate scalable deployments. You will work with infrastructure-as-code tools and CI/CD pipelines to automate deployments effectively. Additionally, you will design and optimize data models using PostgreSQL, Redis, and PGVector, ensuring high-performance storage and retrieval while supporting feature stores and vector-based storage for AI/ML applications.

In addition to your technical responsibilities, you will drive Agile ceremonies such as daily stand-ups, sprint planning, and retrospectives to ensure successful sprint delivery. You will also review pull requests (PRs), conduct code reviews, and uphold security and performance standards. Your collaboration with product owners, analysts, and architects will be essential in refining user stories and technical requirements.

To excel in this role, you are required to have at least 10 years of experience in Data Engineering, DevOps, or Software Engineering roles with a focus on data products. Proficiency in Python, Docker, Kubernetes, and AWS (specifically S3 and ECS) is essential. Strong knowledge of relational and NoSQL databases like PostgreSQL and Redis, along with experience with PGVector, will be advantageous. A deep understanding of CI/CD pipelines, GitHub workflows, and modern source control practices is crucial, as is experience working in Agile/Scrum environments with excellent collaboration and communication skills.

Moreover, a passion for developing clean, well-documented, and scalable code in a collaborative setting, along with familiarity with DataOps principles encompassing automation, testing, monitoring, and deployment of data pipelines, will help you excel in this role.
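The error-handling and monitoring duties described above often start with retry logic around flaky extract or load steps. The sketch below is an illustrative Python retry decorator with exponential backoff; the decorated "extract" function and its failure mode are hypothetical, not from any specific pipeline.

```python
import time
from functools import wraps


def retry(times=3, delay=0.01, backoff=2.0, exceptions=(Exception,)):
    """Retry decorator with exponential backoff for flaky pipeline steps (sketch)."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            wait = delay
            for attempt in range(1, times + 1):
                try:
                    return fn(*args, **kwargs)
                except exceptions:
                    if attempt == times:
                        raise  # out of attempts: surface the error for monitoring
                    time.sleep(wait)
                    wait *= backoff
        return wrapper
    return decorator


calls = {"n": 0}


@retry(times=3)
def flaky_extract():
    """Hypothetical source read that fails twice before succeeding."""
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("transient source failure")
    return ["row1", "row2"]


assert flaky_extract() == ["row1", "row2"]
assert calls["n"] == 3  # two failures, one success
```

In a production pipeline the `except` branch would also emit a metric or log line so monitoring can distinguish transient from persistent failures.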

Posted 1 week ago

Apply

1.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

As a Lead Developer, you will report to the Head of Technology based in London. Your core responsibilities will include leading the development of new features and platform improvements; managing, mentoring, and motivating team members; liaising with UK-based stakeholders to ensure team alignment with business and technical objectives; planning, coordinating, and delivering technical projects to the agreed schedule; championing and enforcing technical standards; ensuring the Software Development Life Cycle (SDLC) is followed within the team; and assisting with hiring, onboarding, and developing the team.

In this role, you will work with an exciting and modern tech stack built for scale, reliability, and productivity. To succeed, you should have solid experience with tools such as Python (SQLAlchemy, Flask, NumPy, Pandas), MySQL, AWS (ECS, S3, Lambda, RDS), RabbitMQ, Docker, Linux, GitLab, and generative AI tools.

The required qualifications and experience for this role include a university degree in a STEM subject from a reputable institution; at least 8 years of professional software development experience, with at least 2 years in a lead/management role; proven experience liaising with remote stakeholders; familiarity with the tech stack or equivalent technologies; and a basic understanding of financial markets and derivative products. You should possess excellent teamwork skills, professional fluency in English (both written and spoken), excellent interpersonal and communication skills to collaborate effectively across global teams and time zones, a strong understanding of distributed software systems, an analytical and inquisitive mindset, and a desire to take on responsibility and make a difference.

In terms of benefits, the company offers a competitive compensation package, including a salary based on experience and role fit, an annual/performance bonus, health insurance, life insurance, meal benefits, learning & development opportunities relevant to your role and career growth, an enhanced leave policy, and a transport budget for roles requiring a commute outside of business hours. This is a full-time position that requires in-person work.

Posted 1 week ago

Apply

8.0 - 12.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

You should have experience working with BigID or Collibra, along with knowledge of data classification and data products. It is important to have an understanding of data loss and personal information security. Exposure to platforms such as Snowflake, S3, Redshift, SharePoint, and Box is required, as is knowledge of connecting to various source systems. A deep understanding and practical knowledge of IDEs like Eclipse, PyCharm, or any workflow designer is essential. Experience with one or more of the following languages is preferred: Java, JavaScript, Groovy, Python. Hands-on experience with CI/CD processes and tooling such as GitHub is necessary, along with working experience in DevOps teams based on Kubernetes tools. Proficiency in database concepts and a basic understanding of data classification, lineage, and storage would be advantageous. Excellent written and spoken English, interpersonal skills, and a collaborative approach to delivery are essential.

Desirable Skills and Experience:
- A total of 8 to 12 years of overall IT experience
- A technical degree to support your experience
- Deep technical expertise
- Demonstrated understanding of the required technology and problem-solving skills
- Analytical, focused, and capable of working independently with minimal supervision
- Good collaborator management and a team player
- Exposure to platforms like Talend Data Catalog, BigID, or Snowflake is beneficial
- Basic knowledge of AWS is a plus
- Knowledge and experience with integration technologies such as MuleSoft and SnapLogic
- Proficiency in Jira, including the ability to quickly generate JQL queries and save them for reference
- Proficient in creating documentation in Confluence
- Experience with Agile practices, preferably having been part of an Agile team for several years

Posted 1 week ago

Apply

7.0 - 11.0 years

0 Lacs

Noida, Uttar Pradesh

On-site

You are an experienced .NET Developer with at least 7 years of experience in the design, coding, deployment, and development of web-based applications using the latest .NET technologies such as .NET Core, ASP.NET, C#, MVC, LINQ, and Web APIs. You are responsible for building, designing, and architecting .NET web-based applications, ensuring that the code written improves product quality. You have hands-on experience with the latest versions of .NET technologies and AWS architecture, including working with MVC architecture, JavaScript, jQuery, and creating RESTful APIs/Web APIs.

Your role involves writing industry-standard code for microservices-based products and utilizing .NET Core 3.0 or above, microservices, .NET Core Entity Framework, Gulp/Webpack, NodeJS, Python (optional), unit/integration testing, and message queuing tools such as Kafka/RabbitMQ. You also have experience with MS SQL databases, including writing queries, stored procedures, tables, views, triggers, functions, and SQL jobs. Familiarity with SQS, SNS, S3, EC2, API Gateway, and AWS Aurora is a must.

As an individual contributor, you are willing to contribute your views and opinions, questioning and probing when necessary to ensure thorough and robust solutions. Strong analytical and problem-solving skills are essential for this role, along with excellent interpersonal communication skills. Experience working on large, complex products, very large scalable applications, or websites with huge data sizes is considered a plus.

This position is based in Noida, Sector 68, with a 5-day work week and fixed days off on Saturdays and Sundays. There are two working modes available: one opening for work from home with onsite visits, and another for 100% onsite work from day one. The shift timing is the morning shift, from 4:30 am to 12:30 pm. The ideal candidate should be able to join immediately or within 15-30 days.

The interview process consists of two rounds: the first is a core technical round, and the second is a departmental HR round with a mix of technical and non-technical questions. Interviews will be conducted virtually.

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

kolkata, west bengal

On-site

As a DevOps Engineer with AWS certification, you will be responsible for implementing, maintaining, monitoring, and supporting the IT infrastructure. Your role will involve developing custom scripts to support Continuous Integration & Deployment processes and integrating various tools for automation based on the target architecture. You will create packaging and deployment documentation and scripts for production builds, and assist agile development teams with builds and releases.

Your key responsibilities will include implementing release automation solutions and branching & merging strategies, and providing guidance to the team on build & deployment automation issues. You will design and implement release orchestration solutions for medium and large-sized projects, ensuring efficient and effective deployment processes.

To be successful in this role, you must have a minimum of 2 years of experience with AWS, a background in Linux/Unix administration, and proficiency with a variety of open-source tools. Hands-on experience with AWS services like RDS, EC2, ELB, EBS, S3, SQS, CodeDeploy, and CloudWatch is essential. Strong skills in managing SQL and MySQL databases, as well as experience with web servers like Apache, Nginx, Lighttpd, and Tomcat, will be valued. You should be proficient in Docker & Kubernetes deployment scripts, Git, Jenkins, and cluster setup. Experience in environment setup, connectivity, support levels, system security compliance, and data security is required. Additionally, you should have expertise in application and infrastructure planning, testing, and development, as well as centralized configuration management, log management, and dashboards, to ensure smooth operations.

If you are a proactive and skilled DevOps Engineer with a passion for automation and infrastructure management, and you meet the above requirements, we invite you to join our team in Kolkata for this full-time on-site position.
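Release-automation duties like those above often reduce to small scripts around monitoring data. Below is a minimal Python sketch of a post-deploy rollback check; the metric shape mirrors CloudWatch's `GetMetricStatistics` output, but the function name, threshold, and sample data are illustrative assumptions, not part of this posting:

```python
# Decide whether a deployment should be rolled back based on
# CloudWatch-style metric datapoints gathered after a release.
# Threshold and datapoint shape are illustrative assumptions.

def should_roll_back(datapoints, error_rate_threshold=0.05, min_samples=3):
    """Return True if the average post-deploy error rate breaches the threshold.

    `datapoints` is a list of dicts shaped like CloudWatch statistics:
    [{"Timestamp": ..., "Average": 0.02}, ...].
    """
    samples = [d["Average"] for d in datapoints if "Average" in d]
    if len(samples) < min_samples:
        return False  # not enough evidence yet; keep watching
    return sum(samples) / len(samples) > error_rate_threshold

# Three samples averaging an 8% error rate -> roll back
post_deploy = [{"Average": 0.10}, {"Average": 0.07}, {"Average": 0.07}]
print(should_roll_back(post_deploy))  # True
```

In practice the datapoints would come from a `boto3` CloudWatch client and the decision would trigger the release tool's rollback step.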

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

surat, gujarat

On-site

At devx, we specialize in helping some of India's most innovative brands unlock growth through AI-powered and cloud-native solutions built in collaboration with AWS. As a rapidly growing consultancy, we are dedicated to addressing real-world business challenges with cutting-edge technology.

We are currently seeking a proactive and customer-focused AWS Solutions Architect to join our team. In this position, you will work directly with clients to craft scalable, secure, and cost-effective cloud architectures that tackle significant business obstacles. Your role will bridge the gap between business requirements and technical implementation, establishing you as a trusted advisor to our clients.

Key Responsibilities:
- Engage with clients to understand their business goals and translate them into cloud-based architectural solutions.
- Design, deploy, and document AWS architectures, emphasizing scalability, security, and performance.
- Develop solution blueprints and work closely with engineering teams to ensure successful execution.
- Conduct workshops, presentations, and in-depth technical discussions with client teams.
- Stay informed about the latest AWS offerings and best practices, integrating them into solution designs.
- Collaborate with sales, product, and engineering teams to provide comprehensive solutions.

We are looking for individuals with the following qualifications:
- Minimum of 2 years of experience designing and implementing solutions on AWS.
- Proficiency in fundamental AWS services such as EC2, S3, Lambda, RDS, API Gateway, IAM, VPC, etc.
- Proficiency in fundamental AI/ML and data services such as Bedrock, SageMaker, Glue, Athena, Kinesis, etc.
- Proficiency in fundamental DevOps services such as ECS, EKS, CI/CD pipelines, Fargate, Lambda, etc.
- Excellent written and verbal communication skills in English.
- Comfortable in client-facing capacities, with the ability to lead technical dialogues and establish credibility with stakeholders.
- Ability to balance technical detail with business context, effectively communicating value to decision-makers.

Location: Surat, Gujarat

Note: This is an on-site role in Surat, Gujarat. Please apply only if you are willing to relocate.

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

maharashtra

On-site

As an AWS DataOps Lead at Birlasoft, you will be responsible for configuring, deploying, monitoring, and managing AWS data platforms. Your role will involve managing data flows and dispositions in S3, Snowflake, and Postgres. You will also be in charge of user access and authentication on AWS, ensuring proper resource provisioning, security, and compliance. Your experience with GitHub integration will be valuable in this role, and familiarity with AWS native tools like Glue, the Glue Catalog, CloudWatch, and CloudFormation (or Terraform) is essential. You will also play a key part in backup and disaster recovery processes.

Join our team and be a part of Birlasoft's commitment to leveraging Cloud, AI, and Digital technologies to empower societies worldwide and enhance business efficiency and productivity. With over 12,000 professionals and a rich heritage spanning 170 years, we are dedicated to building sustainable communities and driving innovation through our consultative and design-thinking approach.
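Managing S3 data dispositions of the kind described above is usually codified as a bucket lifecycle configuration. A minimal Python sketch follows; the prefix, rule ID scheme, and day counts are assumptions for illustration, and applying the result would go through `boto3.client("s3").put_bucket_lifecycle_configuration(...)`:

```python
# Build an S3 lifecycle configuration that transitions aged objects to
# Glacier and expires them later. Prefix and day counts are
# illustrative assumptions.

def lifecycle_rule(prefix, glacier_after_days, expire_after_days):
    return {
        "ID": f"archive-{prefix.strip('/') or 'all'}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [
            {"Days": glacier_after_days, "StorageClass": "GLACIER"}
        ],
        "Expiration": {"Days": expire_after_days},
    }

# Archive objects under raw/ after 30 days, expire after a year
config = {"Rules": [lifecycle_rule("raw/", 30, 365)]}
print(config["Rules"][0]["ID"])  # archive-raw
```

Keeping the rule builder as a plain function makes the disposition policy reviewable and unit-testable before it ever touches a bucket.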

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 13 Lacs

Pune

Work from Office

Role & Responsibilities

Experience: 3 to 5 years

Senior Backend Software Engineer

Job Summary: This Backend Developer position involves designing, building, and maintaining scalable backend systems on AWS cloud services while following the full software development life cycle. Development activities include requirement specification, design, implementation, testing, manufacturing support, and investigation of field-reported issues.

Responsibilities:
1. Design and develop scalable backend services and APIs using modern programming languages
2. Build and maintain a microservices architecture on the AWS cloud platform
3. Develop serverless applications using AWS Lambda, API Gateway, and other managed services
4. Design and optimize database schemas for both SQL and NoSQL databases
5. Deploy and manage applications using AWS services including EC2, ECS, EKS, and Lambda
6. Manage containerized applications with Docker and Kubernetes on EKS
7. Develop software design specifications that are traceable to the requirement specification in accordance with the development process
8. Perform required design testing, including unit testing, integration testing, performance testing, and reliability testing
9. Implement logging strategies and troubleshoot production issues
10. Optimize application performance and scalability based on metrics and user feedback

Minimum Qualifications:
• Degree in Electrical or Computer Engineering or Computer Science, or a technology diploma with relevant industry experience in full-stack software development
• Works well individually and in a team environment
• Ability to work in a fast-paced, agile development environment with measurable results
• Effective written and verbal communication skills
• Effective problem-solving skills
• 4-5 years of experience in two or more of the following areas:
  - Excellent proficiency in Java programming
  - Hands-on experience with core AWS services, including compute (EC2, Lambda, ECS/EKS), storage (S3, EBS, EFS), database (RDS, DynamoDB), networking (VPC, CloudFront, Route 53), and monitoring (CloudWatch)
  - Experience with both relational (MySQL) and NoSQL (DynamoDB, Redis) databases
  - Experience with containerization technologies (Docker, Kubernetes)
  - Understanding of CI/CD principles and tools
  - Familiarity with message queues and event-driven architectures (SQS, SNS, EventBridge)
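Event-driven designs like the SQS/SNS items above have small sharp edges; one is that SQS's `SendMessageBatch` accepts at most 10 entries per call, so producers chunk their messages. A hedged Python sketch (the message shape and helper name are illustrative assumptions; the posting itself centers on Java):

```python
# SQS SendMessageBatch takes at most 10 entries per call, so producers
# usually split outgoing messages into legal batches before sending.

def chunk_for_sqs(messages, batch_size=10):
    """Split messages into batches acceptable to SQS SendMessageBatch."""
    return [messages[i:i + batch_size] for i in range(0, len(messages), batch_size)]

# 23 queued events become batches of 10, 10, and 3
entries = [{"Id": str(i), "MessageBody": f"event-{i}"} for i in range(23)]
batches = chunk_for_sqs(entries)
print(len(batches), [len(b) for b in batches])  # 3 [10, 10, 3]
```

Each batch would then be handed to the queue client (for example `boto3`'s `send_message_batch`), with failed entries from the response retried individually.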

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 8 Lacs

Ahmedabad

Work from Office

We are seeking a certified and experienced AWS & Linux Administrator to support the infrastructure of SAP ECC systems running on Oracle databases in AWS. The role demands expertise in AWS services and enterprise Linux (RHEL/SLES), plus experience supporting SAP ECC and Oracle in mission-critical environments. This position works a shift aligned with US Pacific Time, which is 12:30 hours behind India time.

Key Responsibilities:
- Deploy, configure, and maintain Linux servers (RHEL/SLES) on AWS EC2 for SAP ECC and Oracle.
- Administer and monitor SAP ECC infrastructure and the Oracle DB back end, ensuring high availability and performance.
- Design and manage AWS infrastructure using EC2, EBS, VPC, IAM, S3, CloudWatch, and Backup services.
- Collaborate with SAP Basis and Oracle DBA teams to manage system patching, tuning, and upgrades.
- Implement backup and disaster recovery strategies for SAP and Oracle in AWS.
- Automate routine tasks using shell scripts, Ansible, or AWS Systems Manager.
- Ensure security, compliance, and system hardening of the SAP ECC and Oracle landscape.
- Support system refreshes, migrations, and environment cloning.
- Troubleshoot infrastructure-related incidents affecting SAP or Oracle availability.

Minimum Qualifications:
- AWS Certified SysOps Administrator - Associate (or a higher AWS certification).
- Linux certification: Red Hat RHCSA/RHCE or SUSE Certified Administrator.
- 5+ years of experience managing Linux systems in enterprise or cloud environments.
- 3+ years of hands-on AWS infrastructure administration.
- Solid understanding of Oracle DB administration basics in SAP contexts (e.g., listener setup, tablespaces, logs).

Preferred Skills:
- Knowledge of SAP ECC on Oracle deployment architecture.
- Experience managing Oracle on AWS using EC2.
- Familiarity with SAP Notes, SAP EarlyWatch reports, and SAP/Oracle performance tuning.
- Understanding of hybrid connectivity, such as VPN/Direct Connect between on-premises and AWS.
- Hands-on experience with AWS CloudFormation, Terraform, or automation pipelines for infrastructure deployment.

Soft Skills:
- Analytical thinking with attention to root-cause analysis.
- Strong communication and documentation skills.
- Ability to coordinate across SAP, DBA, and DevOps teams.
- Flexibility to provide off-hours support as required.
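The backup and DR automation described above typically includes pruning old EBS snapshots against a retention window. A minimal Python sketch of that decision logic; the retention length and snapshot shape are illustrative assumptions, and actual deletion would go through `boto3.client("ec2").delete_snapshot`:

```python
# Select EBS snapshots older than a retention window for deletion.
# Snapshot dicts mirror the shape of describe_snapshots() output;
# the 14-day retention is an assumed policy.
from datetime import datetime, timedelta, timezone

def snapshots_to_delete(snapshots, retention_days=14, now=None):
    now = now or datetime.now(timezone.utc)
    cutoff = now - timedelta(days=retention_days)
    return [s["SnapshotId"] for s in snapshots if s["StartTime"] < cutoff]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
snaps = [
    {"SnapshotId": "snap-old", "StartTime": now - timedelta(days=30)},
    {"SnapshotId": "snap-new", "StartTime": now - timedelta(days=2)},
]
print(snapshots_to_delete(snaps, retention_days=14, now=now))  # ['snap-old']
```

Passing `now` explicitly keeps the pruning logic deterministic and testable, which matters when a bug here can delete the wrong backups.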

Posted 1 week ago

Apply

2.0 - 5.0 years

0 - 0 Lacs

Nagpur

Remote

Key Responsibilities:
- Provision and manage GPU-based EC2 instances for training and inference workloads.
- Configure and maintain EBS volumes and Amazon S3 buckets (versioning, lifecycle policies, multipart uploads) to handle large video and image datasets.
- Build, containerize, and deploy ML workloads using Docker, and push images to ECR.
- Manage container deployment using Lambda, ECS, or AWS Batch for video inference jobs.
- Monitor and optimize cloud infrastructure using CloudWatch, Auto Scaling Groups, and Spot Instances to ensure cost efficiency.
- Set up and enforce IAM roles and permissions for secure access control across services.
- Collaborate with the AI/ML, annotation, and backend teams to streamline cloud-to-model pipelines.
- Automate cloud workflows and deployment pipelines using GitHub Actions, Jenkins, or similar CI/CD tools.
- Maintain logs, alerts, and system metrics for performance tuning and auditing.

Required Skills:

Cloud & Infrastructure:
- AWS services: EC2 (GPU), S3, EBS, ECR, Lambda, Batch, CloudWatch, IAM
- Data management: large file transfer, S3 multipart uploads, storage lifecycle configuration, archive policies (Glacier/IA)
- Security & access: IAM policies, roles, access keys, VPC (preferred)

DevOps & Automation:
- Tools: Docker, GitHub Actions, Jenkins, Terraform (bonus)
- Scripting: Python and shell scripting for automation & monitoring
- CI/CD: experience building and managing pipelines for model and API deployments

ML/AI Environment Understanding:
- Familiarity with GPU-based ML workloads
- Knowledge of model training and inference architecture (batch and real-time)
- Experience with containerized ML model execution is a plus

Preferred Qualifications:
- 2-5 years of experience in DevOps or cloud infrastructure roles
- AWS Associate/Professional certification (DevOps/Architect) is a plus
- Experience managing data-heavy pipelines, such as drone, surveillance, or video AI systems
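The S3 multipart uploads mentioned above have two hard limits that matter for large video files: at most 10,000 parts per upload, and a 5 MiB minimum part size (except the last part). A minimal Python sketch of the part-size calculation; in practice `boto3`'s `TransferConfig` handles this for you:

```python
# Pick an S3 multipart part size that satisfies both the 5 MiB
# minimum part size and the 10,000-part maximum.
import math

MIN_PART = 5 * 1024 * 1024  # 5 MiB minimum part size
MAX_PARTS = 10_000          # maximum parts per multipart upload

def choose_part_size(file_size):
    # Grow the part size past the minimum whenever the file would
    # otherwise need more than 10,000 parts.
    return max(MIN_PART, math.ceil(file_size / MAX_PARTS))

# A 100 GiB video: 5 MiB parts would exceed 10,000, so the
# part size grows to keep the count legal.
size = 100 * 1024**3
part = choose_part_size(size)
assert math.ceil(size / part) <= MAX_PARTS
print(part > MIN_PART)  # True
```

Small files simply get the 5 MiB floor; only once a file passes roughly 48.8 GiB does the part size need to grow.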

Posted 1 week ago

Apply

8.0 - 10.0 years

14 - 18 Lacs

Jaipur

Work from Office

About the Role

We are looking for a highly skilled and experienced frontend JavaScript developer for the position of Principal Software Engineer who can lead the development and design of high-performance frontend architectures. In this role, you will take ownership of frontend systems, establish scalable components and design patterns, and collaborate across teams to ensure cohesive, secure, and performant product delivery. The ideal candidate can architect complex frontend systems; has a deep understanding of browser rendering, code bundling, optimization strategies, and modern state management; and can guide junior developers to grow in both technical and collaborative aspects. Candidates with exposure to backend fundamentals using Node.js, Express.js, and WebSocket-based real-time communication are highly preferred for seamless cross-functional collaboration.

About Auriga IT

Auriga IT is a digital solutions company founded in 2010 by an IIT Roorkee alumnus and based in Jaipur, India. It serves as a digital solutions partner for startups, corporates, government entities, and unicorns, focusing on design, technology, and data capabilities to help organizations launch new businesses and drive digital transformation.
Key Responsibilities:
- Lead the architectural design and implementation of scalable, reusable frontend applications using React.js and Next.js
- Define and implement frontend architecture flows, maintainable design systems, and component libraries
- Establish and enforce coding standards, performance budgets, and testing strategies
- Optimize applications for high performance and scalability, focusing on Core Web Vitals, bundle-size reduction, and runtime performance
- Integrate secure practices: CSP, secure token flows, input validation, XSS/CSRF protections
- Guide the use of state management libraries (Redux Toolkit, Zustand, React Query) based on use-case suitability
- Collaborate with DevOps and backend teams on API integrations, WebSocket implementation (e.g., Socket.io), deployment, and system health
- Drive CI/CD processes using tools like GitHub Actions, Jenkins, Docker, and Vite/Webpack/Grunt/Gulp
- Participate in code reviews and mentor junior developers to build both technical and product understanding
- Conduct root-cause analysis and production-level debugging for critical issues across environments
- Coordinate with cross-functional teams, including QA, backend, DevOps, product, and design

Required Skills and Qualifications:
- Strong command of React.js and Next.js; JavaScript (ES6+) and TypeScript; HTML5 and CSS3; and Tailwind CSS, Styled Components, or Bootstrap
- Proven experience in:
  - Designing modular component-based architecture
  - SSR, ISR, and SSG patterns in Next.js
  - Modern state management (Redux Toolkit, Zustand, React Query)
  - RESTful API consumption and error handling
  - Application security best practices (OAuth2, JWT, XSS/CSRF protection)
  - Performance optimization (code splitting, dynamic imports, lazy loading, etc.)
  - Dev tooling: Chrome DevTools, Lighthouse, Web Vitals, source-map analysis
- Hands-on exposure to CI/CD (GitHub Actions, Jenkins), Webpack/Vite bundling, Git branching, GitHub PRs, and version-control standards
- Testing frameworks: Jest/Cypress
- Strong foundation in debugging production issues, analyzing frontend logs, and resolving performance bottlenecks
- Experience building or maintaining design systems using tools like Storybook
- Ability to translate product vision into long-term frontend architecture plans
- Experience working in Agile teams and leading sprint activities
- Ensuring accessibility compliance (a11y), semantic HTML, and SEO optimization for frontend delivery
- Familiarity with AWS tools such as S3, CloudFront, Lambda, Load Balancing, and EC2
- Knowledge of GraphQL, design patterns, and caching layers

Good to Have:
- Working knowledge of backend tools and APIs using Node.js and Express.js
- Exposure to Vue.js, SvelteKit, or other modern JS frameworks
- Understanding of micro frontends and federated module architecture
- Familiarity with infrastructure as code (Terraform, Pulumi - optional)
- Awareness of observability and monitoring tools like Sentry, LogRocket, or Datadog
- Working knowledge of Docker-based local environments
- Contributions to documentation, technical blogs, or internal tooling
- Experience with feature flags, A/B testing tools, or experiment-driven development

Posted 1 week ago

Apply

4.0 - 9.0 years

8 - 12 Lacs

Mumbai

Work from Office

We are looking for:

Role: AWS Infrastructure Engineer
Experience: 4+ yrs
Job Location: Bavdhan, Pune
Work Mode: Remote

Job Description: We are seeking a skilled AWS Infrastructure Engineer with expertise in AWS services, Linux, and Windows systems. The ideal candidate will design, deploy, and manage scalable, secure cloud infrastructure while supporting a hybrid environment.

Key Responsibilities:
- Hands-on experience with multi-cloud environments (e.g., Azure, AWS, GCP)
- Design and maintain AWS infrastructure (EC2, S3, VPC, RDS, IAM, Lambda, and other AWS services)
- Implement security best practices (IAM, GuardDuty, Security Hub, WAF)
- Configure and troubleshoot AWS networking, hybrid connectivity, and URL-filtering solutions (VPC, TGW, Route 53, VPNs, Direct Connect)
- Experience managing physical firewalls (Palo Alto, Cisco, etc.)
- Manage, troubleshoot, configure, and optimize services like Apache, NGINX, and MySQL/PostgreSQL on Linux/Windows
- Ensure Linux/Windows server compliance with patch management and security updates
- Provide L2/L3 support for Linux and Windows systems, ensuring minimal downtime and quick resolution of incidents
- Collaborate with DevOps, application, and database teams to ensure seamless integration of infrastructure solutions
- Automate tasks using Terraform, CloudFormation, or scripting (Bash, Python)
- Monitor and optimize cloud resources using CloudWatch, Trusted Advisor, and Cost Explorer

Requirements:
- 4+ years of AWS experience and system administration in Linux & Windows
- Proficiency in AWS networking, security, and automation tools
- Certifications: AWS Solutions Architect Associate (required), RHCSA/MCSE (preferred)
- Strong communication and problem-solving skills
- Web servers: Apache2, NGINX, IIS; OS: Ubuntu, Windows Server

Contact: Mugdha Vanjari, 7822804824, mugdha.vanjari@sunbrilotechnologies.com
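Automation tasks of the kind listed above often start as small compliance sweeps over API output. Below is a minimal Python sketch that flags EC2 instances missing required tags; the required-tag policy and sample data are illustrative assumptions, and real input would come from `boto3.client("ec2").describe_instances()`:

```python
# Find EC2 instances missing required tags. Input mirrors the
# Reservations structure of describe_instances(); the tag policy
# here is an assumed example.

REQUIRED_TAGS = {"Owner", "Environment"}  # assumed tagging policy

def untagged_instances(reservations):
    missing = []
    for res in reservations:
        for inst in res.get("Instances", []):
            tags = {t["Key"] for t in inst.get("Tags", [])}
            if not REQUIRED_TAGS <= tags:  # policy tags not all present
                missing.append(inst["InstanceId"])
    return missing

sample = [{"Instances": [
    {"InstanceId": "i-ok", "Tags": [{"Key": "Owner", "Value": "infra"},
                                    {"Key": "Environment", "Value": "prod"}]},
    {"InstanceId": "i-bad", "Tags": [{"Key": "Owner", "Value": "infra"}]},
]}]
print(untagged_instances(sample))  # ['i-bad']
```

The same pattern extends to other resources (volumes, snapshots, security groups) and feeds naturally into a CloudWatch alarm or a scheduled report.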

Posted 1 week ago

Apply

2.0 - 6.0 years

0 Lacs

pune, maharashtra

On-site

As a member of Zendesk's engineering team in Australia, your main objective is to improve the customer experience by developing products that cater to over 170,000 global brands. These brands, including Discord, Calm, and Skyscanner, rely on Zendesk's solutions to ensure customer satisfaction on a daily basis. Working within a highly innovative and fast-paced environment, you will have the opportunity to collaborate with a diverse group of individuals from around the world, contributing to the success of some of Zendesk's most beloved products. This position is a hybrid role that combines remote work with on-site requirements, necessitating three days in the office and relocation to Pune. You will be part of a dynamic team focused on creating distributed, high-scale, and data-intensive integrations that enhance Zendesk's core SaaS product. Collaborating with other SaaS providers and cloud vendors such as Slack, Atlassian, and AWS, you will be involved in incorporating cutting-edge technologies and features to deliver top-notch solutions to customers. Your daily responsibilities will include designing, leading, and implementing customer-facing software projects, emphasizing the importance of good software practices and timely project delivery. Excellent communication skills, attention to detail, and the ability to influence others diplomatically are essential qualities for this role. Additionally, you will be expected to demonstrate leadership qualities, mentor team members, and consistently apply best practices throughout the development cycle. To excel in this role, you should have a solid background in Golang for high-volume applications, at least 2 years of experience in frontend development using JavaScript and React, and a strong focus on long-term solution viability. Experience in identifying and resolving reliability issues at scale, effective time management, and building integrations with popular SaaS products are also key requirements. 
The tech stack you will be working with includes Golang, JavaScript/TypeScript, React, Redux, React Testing Library, Cypress, Jest, AWS, Spinnaker, Kubernetes, Aurora/MySQL, DynamoDB, and S3. Please note that candidates must be physically located in, and willing to work from, Karnataka or Maharashtra. In this hybrid role, you will have the opportunity to work both remotely and onsite, fostering connection, collaboration, and learning while maintaining a healthy work-life balance. Zendesk is committed to providing an inclusive and fulfilling environment for its employees, enabling them to thrive in a diverse and supportive workplace.

Posted 1 week ago

Apply

10.0 - 14.0 years

0 Lacs

maharashtra

On-site

As a Senior Python Engineer at JPMorgan Chase within the AM Research Technology team, you will play a crucial role in an agile team dedicated to enhancing, building, and delivering cutting-edge technology products in a secure, stable, and scalable manner. Your contributions will drive significant business impact and leverage your deep technical expertise to address a wide range of challenges across various technologies and applications, particularly focusing on cloud-based systems and AI-driven solutions. You will be responsible for executing software solutions, designing, developing, and troubleshooting technical issues with a forward-thinking approach to problem-solving. Your role will involve creating secure and high-quality production code, maintaining algorithms that operate seamlessly with different systems, and producing architecture and design artifacts for complex applications while ensuring adherence to design constraints. Additionally, you will analyze and synthesize large, diverse data sets to enhance software applications and systems continuously. Furthermore, your role will involve proactively identifying hidden problems and patterns in data to drive improvements in coding practices and system architecture. You will actively contribute to software engineering communities of practice and participate in events focused on exploring new and emerging technologies while fostering a team culture centered on diversity, equity, inclusion, and respect. To qualify for this position, you should possess formal training or certification in software engineering concepts along with a minimum of 10 years of applied experience. You must have hands-on practical experience in Java and Python development, as well as proficiency in Java frameworks like Spring and Python frameworks like Flask and SQL Alchemy. Experience in using AI/ML frameworks such as Langchain, Langchain4j, and Spring AI is essential, along with practical knowledge of AWS Services like ECS, EKS, and S3. 
Moreover, you should have a strong background in infrastructure provisioning using Terraform; practical experience with RDBMSs like Postgres and NoSQL databases like DynamoDB and ElastiCache; and expertise in system design, application development, testing, and operational stability. Proficiency in coding in multiple languages, along with extensive experience developing, debugging, and maintaining code in a large corporate environment, is required. A solid understanding of agile methodologies, including CI/CD, application resiliency, and security, is crucial, as is demonstrated knowledge of software applications and technical processes within technical disciplines such as cloud computing, artificial intelligence, machine learning, and mobile technologies.
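Resiliency of the kind this role emphasizes often comes down to small, well-tested building blocks such as retry with exponential backoff around remote calls. A minimal Python sketch (this specific helper is an illustrative pattern, not the team's implementation):

```python
# Retry a callable with exponential backoff, a common resiliency
# wrapper around AWS or service calls. Delays double each attempt.
import time

def with_retries(fn, attempts=4, base_delay=0.01, sleep=time.sleep):
    for attempt in range(attempts):
        try:
            return fn()
        except Exception:
            if attempt == attempts - 1:
                raise  # out of attempts; surface the error
            sleep(base_delay * (2 ** attempt))  # 0.01s, 0.02s, 0.04s, ...

# A call that fails twice before succeeding
calls = {"n": 0}
def flaky():
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient")
    return "ok"

print(with_retries(flaky, sleep=lambda s: None))  # ok
```

Injecting `sleep` keeps the helper unit-testable without real delays; production code would typically also add jitter and restrict which exception types are retried.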

Posted 1 week ago

Apply

3.0 - 7.0 years

0 Lacs

pune, maharashtra

On-site

As an AWS Cloud Engineer at Talent Worx, you will be responsible for designing, deploying, and managing AWS cloud solutions to meet our organizational objectives. Your expertise in AWS technologies will be crucial in building scalable, secure, and robust cloud architectures that ensure optimal performance and efficiency. Key Responsibilities: - Design, implement, and manage AWS cloud infrastructure following best practices - Develop cloud-based solutions utilizing AWS services such as EC2, S3, Lambda, RDS, and VPC - Automate deployment, scaling, and management of cloud applications using Infrastructure as Code (IaC) tools like AWS CloudFormation and Terraform - Implement security measures and best practices, including IAM, VPC security, and data protection - Monitor system performance, troubleshoot issues, and optimize cloud resources for cost and performance - Collaborate with development teams to set up CI/CD pipelines for streamlined deployment workflows - Conduct cloud cost analysis and optimization to drive efficiency - Stay updated on AWS features and industry trends for continuous innovation and improvement Required Skills and Qualifications: - 3+ years of experience in cloud engineering or related field, focusing on AWS technologies - Proficiency in AWS services like EC2, S3, EBS, RDS, Lambda, and CloudFormation - Experience with scripting and programming languages (e.g., Python, Bash) for automation - Strong understanding of networking concepts (VPC, subnetting, NAT, VPN) - Familiarity with containerization technologies such as Docker and orchestration tools like Kubernetes - Knowledge of DevOps principles, CI/CD tools, and practices - AWS certification (e.g., AWS Certified Solutions Architect, AWS Certified Developer) preferred - Analytical skills, attention to detail, and excellent communication abilities - Bachelor's degree in Computer Science, Information Technology, or related field Join Talent Worx and enjoy benefits like global exposure, accelerated 
career growth, diverse learning opportunities, a collaborative culture, cross-functional mobility, access to cutting-edge tools, a focus on purpose and impact, and mentorship for leadership development.
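The cost-analysis responsibility above usually starts by aggregating Cost Explorer output per service. A minimal Python sketch; the response shape mirrors `GetCostAndUsage` grouped by service, but the sample figures are illustrative assumptions, and real data would come from `boto3.client("ce").get_cost_and_usage(...)`:

```python
# Sum unblended spend per AWS service from Cost Explorer-shaped
# ResultsByTime entries grouped by SERVICE.

def spend_by_service(results_by_time):
    totals = {}
    for period in results_by_time:
        for group in period.get("Groups", []):
            service = group["Keys"][0]
            amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
            totals[service] = totals.get(service, 0.0) + amount
    return totals

sample = [{"Groups": [
    {"Keys": ["AmazonEC2"], "Metrics": {"UnblendedCost": {"Amount": "120.5"}}},
    {"Keys": ["AmazonS3"], "Metrics": {"UnblendedCost": {"Amount": "30.0"}}},
]}]
print(spend_by_service(sample))  # {'AmazonEC2': 120.5, 'AmazonS3': 30.0}
```

Sorting the resulting totals gives the usual "top spenders" report that drives rightsizing and reserved-capacity decisions.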

Posted 1 week ago

Apply