215 RDS Jobs - Page 3


3.0 - 7.0 years

0 Lacs

Kochi, Kerala

On-site

You will be responsible for planning, implementing, and growing the AWS cloud infrastructure. Your role will involve building, releasing, and managing the configuration of all production systems, and managing a continuous integration and deployment methodology for server-based technologies. You will collaborate with architecture and engineering teams to design and implement scalable software services, and ensure system security through best-in-class cloud security solutions. Staying up to date with new technology options and vendor products is important, and you will be expected to evaluate which ones would suit the company. Implementing continuous integration/continuous delivery (CI/CD) pipelines when needed will also fall under your purview. You will have the opportunity to recommend process and architecture improvements, troubleshoot the system, and resolve problems across all platform and application domains. Overseeing pre-production acceptance testing to maintain the high quality of the company's services and products will also be part of your duties.

Experience with Terraform, Ansible, Git, and CloudFormation will be beneficial for this role. A solid background in Linux/Unix and Windows server system administration is required. Configuring AWS CloudWatch monitoring, creating and modifying scripts, and hands-on experience with MySQL are also essential skills. You should have experience in designing and building web environments on AWS, including working with services such as EC2, ELB, RDS, and S3.

This is a full-time position with benefits such as Provident Fund and a yearly bonus. The work schedule is a day shift, and the preferred experience level for AWS is 3 years. The work location is in person.
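The CloudWatch monitoring mentioned above boils down to evaluating metric datapoints against a threshold over consecutive periods. A minimal, self-contained sketch of that evaluation logic (the sample values and the 80% CPU threshold are illustrative, not from the posting):

```python
# Illustrative sketch of a CloudWatch-style alarm condition: fire only when
# the last `periods` datapoints all breach the threshold.

def should_alarm(datapoints, threshold=80.0, periods=3):
    """Return True if the last `periods` datapoints all exceed `threshold`."""
    if len(datapoints) < periods:
        return False
    return all(v > threshold for v in datapoints[-periods:])

cpu_samples = [42.0, 55.1, 83.2, 91.7, 88.4]  # hypothetical CPU % values
print(should_alarm(cpu_samples))  # three consecutive breaches -> True
```

Requiring several consecutive breaches, as real alarms do, avoids paging on a single transient spike.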

Posted 1 week ago

Apply

4.0 - 8.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

This is a full-time on-site role for a PHP Laravel Developer based in Chennai. In this position, you will play a key role in developing and maintaining web applications utilizing the Laravel framework. Your responsibilities will include coding, debugging, testing, and deploying new features. Additionally, you will collaborate with cross-functional teams to create efficient and scalable solutions. To excel in this role, you must possess a strong proficiency in PHP and have hands-on experience with the Laravel framework. Familiarity with frontend technologies like HTML, CSS, and JavaScript is essential. Moreover, knowledge of database management systems, particularly MySQL, is required. Understanding RESTful APIs, integrating third-party services, and using version control systems like Git are also important aspects of this position. Candidates should have practical experience in schema design, query optimization, REST API, and AWS services such as EC2, S3, RDS, Lambda, and Redis. Proficiency in designing scalable and secure web applications, expertise in automated testing frameworks, and a solid grasp of web security practices are crucial for success in this role. The ideal candidate will be able to prioritize tasks effectively and work both independently and collaboratively as part of a team. Strong problem-solving and troubleshooting skills are essential, as is clear communication and the ability to work with others. A Bachelor's degree in computer science or a related field, or equivalent experience, is required. 
Requirements:
- Strong proficiency in PHP with the Laravel framework
- Experience with HTML, CSS, and JavaScript
- Knowledge of MySQL and RESTful APIs
- Familiarity with Git and version control systems
- Hands-on experience with schema design, query optimization, and REST APIs
- Profound knowledge of AWS services
- Demonstrated experience in designing scalable and secure web applications
- Expertise in automated testing frameworks
- Strong understanding of web security practices
- Ability to prioritize tasks and work independently or as part of a team
- Excellent problem-solving and troubleshooting skills
- Good communication and collaboration skills
- Bachelor's degree or equivalent experience in computer science or a related field

Experience: 4+ years
Location: Chennai/Madurai
Interested candidates can share their CV at anushya.a@extendotech.com / 6374472538
Job Type: Full-time
Benefits: Health insurance, Provident Fund
Location Type: In-person
Schedule: Morning shift
Work Location: In person
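The schema design and query optimization skills the posting asks for can be shown in a small, self-contained example; the `orders` table and index are hypothetical stand-ins, and SQLite's EXPLAIN QUERY PLAN is used here in place of MySQL's EXPLAIN:

```python
import sqlite3

# Hypothetical schema: an index on the filter column turns a full-table
# scan into an index search, which is the core of query optimization.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER PRIMARY KEY, customer_id INTEGER, total REAL)")
conn.execute("CREATE INDEX idx_orders_customer ON orders (customer_id)")
conn.executemany("INSERT INTO orders (customer_id, total) VALUES (?, ?)",
                 [(1, 120.0), (2, 75.5), (1, 30.0)])

# EXPLAIN QUERY PLAN confirms the lookup uses the index, not a scan.
plan = conn.execute(
    "EXPLAIN QUERY PLAN SELECT SUM(total) FROM orders WHERE customer_id = ?", (1,)
).fetchall()
print(plan)  # plan text mentions idx_orders_customer

total = conn.execute(
    "SELECT SUM(total) FROM orders WHERE customer_id = ?", (1,)
).fetchone()[0]
print(total)  # 150.0
```

The same discipline applies in MySQL behind Laravel's Eloquent: index the columns your WHERE clauses filter on, and check the plan rather than guessing.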

Posted 1 week ago

Apply

10.0 - 17.0 years

0 Lacs

Hyderabad, Telangana

On-site

We have an exciting opportunity for an ETL Data Architect position with an AI-ML driven SaaS Solution Product Company in Hyderabad. As an ETL Data Architect, you will play a crucial role in designing and implementing a robust Data Access Layer to provide consistent data access to the underlying heterogeneous storage layer. You will also be responsible for developing and enforcing data governance policies to ensure data security, quality, and compliance across all systems.

In this role, you will lead the architecture and design of data solutions that leverage the latest tech stack and AWS cloud services. Collaboration with product managers, tech leads, and cross-functional teams will be essential to align data strategy with business objectives. Additionally, you will oversee data performance optimization, scalability, and reliability of data systems while guiding and mentoring team members on data architecture, design, and problem-solving.

The ideal candidate should have at least 10 years of experience in data-related roles, with a minimum of 5 years in a senior leadership position overseeing data architecture and infrastructure. A deep background in designing and implementing enterprise-level data infrastructure, preferably in a SaaS environment, is required. Extensive knowledge of data architecture principles, data governance frameworks, security protocols, and performance optimization techniques is essential. Hands-on experience with AWS services such as RDS, Redshift, S3, Glue, and DocumentDB, as well as other services like MongoDB, Snowflake, etc., is highly desirable. Familiarity with big data technologies (e.g., Hadoop, Spark) and modern data warehousing solutions is a plus. Proficiency in at least one programming language (e.g., Node.js, Java, Golang, Python) is a must. Excellent communication skills are crucial in this role, with the ability to translate complex technical concepts to non-technical stakeholders.

Proven leadership experience, including team management and cross-functional collaboration, is also required. A Bachelor's degree in Computer Science, Information Systems, or a related field is necessary, with a Master's degree preferred. Preferred qualifications include experience with Generative AI and Large Language Models (LLMs) and their applications in data solutions, as well as familiarity with financial back-office operations and the FinTech domain. Staying updated on emerging trends in data technology, particularly in AI/ML applications for finance, is expected. Industry: IT Services and IT Consulting
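A Data Access Layer of the kind described above gives callers one interface over heterogeneous storage. A minimal sketch, with in-memory dictionaries standing in for the real RDS/DocumentDB/MongoDB backends (all names here are illustrative):

```python
# Hedged sketch of a Data Access Layer: callers ask for (dataset, key)
# and the layer routes the read to whichever backend holds that dataset.

class Backend:
    def get(self, key):
        raise NotImplementedError

class DictBackend(Backend):
    """In-memory stand-in for a real storage service."""
    def __init__(self, data):
        self._data = data
    def get(self, key):
        return self._data.get(key)

class DataAccessLayer:
    """Routes reads to the backend registered for each dataset."""
    def __init__(self):
        self._routes = {}
    def register(self, dataset, backend):
        self._routes[dataset] = backend
    def get(self, dataset, key):
        return self._routes[dataset].get(key)

dal = DataAccessLayer()
dal.register("customers", DictBackend({"c1": "Acme"}))
dal.register("invoices", DictBackend({"i9": 250}))
print(dal.get("customers", "c1"))  # Acme
print(dal.get("invoices", "i9"))   # 250
```

The value of the indirection is that a dataset can move between storage engines without callers changing, which is exactly what makes the heterogeneous layer manageable.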

Posted 2 weeks ago

Apply

7.0 - 12.0 years

10 - 15 Lacs

Bengaluru

Hybrid

Hiring an AWS Data Engineer for a 6-month hybrid contractual role based in Bellandur, Bengaluru. The ideal candidate will have 7+ years of experience in data engineering, with strong expertise in AWS services (S3, EC2, RDS, Lambda, EKS), PostgreSQL, Redis, Apache Iceberg, and Graph/Vector Databases. Proficiency in Python or Golang is essential. Responsibilities include designing and optimizing data pipelines on AWS, managing structured and in-memory data, implementing advanced analytics with vector/graph databases, and collaborating with cross-functional teams. Prior experience with CI/CD and containerization (Docker/Kubernetes) is a plus.
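The vector-database work mentioned above centres on nearest-neighbour lookup by similarity. A toy sketch of that idea in pure Python, with a tiny in-memory index standing in for a real vector store (the document names and vectors are illustrative):

```python
import math

# Illustrative nearest-neighbour lookup by cosine similarity, the core
# operation a vector database performs at scale with specialised indexes.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

def nearest(index, query):
    """Return the key whose vector is most similar to `query`."""
    return max(index, key=lambda k: cosine(index[k], query))

index = {"doc_a": [1.0, 0.0], "doc_b": [0.0, 1.0], "doc_c": [0.7, 0.7]}
print(nearest(index, [0.9, 0.1]))  # doc_a
```

Real stores avoid this linear scan with approximate-nearest-neighbour indexes, but the similarity contract is the same.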

Posted 2 weeks ago

Apply

3.0 - 8.0 years

2 - 7 Lacs

Nagpur

Work from Office

Role & responsibilities:
- Provide advanced L2 support for server, virtualization, and desktop infrastructure.
- Design, create, and optimize Group Policies and global IT policies across multi-domain environments.
- Hands-on knowledge of and experience with Windows OS (client and server editions).
- Hands-on knowledge of and experience with Active Directory, Azure administration, and O365 administration.
- Manage and support Windows Servers, including installation, configuration, and maintenance.
- Work extensively with virtualization platforms such as Hyper-V and VMware.
- Configure and manage VDI solutions, especially using VMware Horizon.
- Set up and maintain Remote Desktop Services, including complex multi-user environments.
- Perform image refresh, deployment, and configuration of thin clients.
- Manage backup and disaster recovery solutions using Veeam.
- Collaborate with internal and external teams to support IT infrastructure for multiple clients.
- Maintain documentation for configurations, procedures, and changes.

Required Skills:
- Proven experience in L2 IT infrastructure support
- Hands-on expertise in MSP environments and multi-client infrastructure management
- Strong understanding of Group Policy, Active Directory, DNS, and DHCP
- Proficiency with VMware, Hyper-V, Horizon View, and virtualization technologies overall
- Experience with Veeam Backup & Replication
- In-depth knowledge of Remote Desktop Services and thin client configuration and image deployment
- Excellent communication, problem-solving, and documentation skills

Posted 2 weeks ago

Apply

6.0 - 11.0 years

12 - 22 Lacs

Bengaluru

Work from Office

Our engineering team is looking for a Data Engineer who is highly proficient in Python, has a very good understanding of AWS cloud computing and ETL pipelines, and demonstrates proficiency with SQL and relational database concepts. In this role you will be a mid- to senior-level individual contributor guiding our migration efforts, serving as a senior data engineer working closely with the data architects to evaluate best-fit solutions and processes for our team. You will work with the rest of the team as we move away from legacy tech and introduce new tools and ETL pipeline solutions. You will collaborate with subject matter experts, data architects, informaticists, and data scientists to evolve our current cloud-based ETL to the next generation.

Responsibilities:
- Independently prototypes/develops data solutions of high complexity to meet the needs of the organization and business customers.
- Designs proof-of-concept solutions utilizing an advanced understanding of multiple coding languages to meet technical and business requirements, with an ability to perform iterative solution testing to ensure specifications are met.
- Designs and develops data solutions that enable effective self-service data consumption, and can describe their value to the customer.
- Collaborates with stakeholders in defining metrics that are impactful to the business; prioritizes efforts based on customer value.
- Has an in-depth understanding of Agile techniques; can set expectations for deliverables of high complexity.
- Can assist in the creation of roadmaps for data solutions and turn vague ideas or problems into data product solutions.
- Influences strategic thinking across the team and the broader organization.
- Maintains proof-of-concept and prototype data solutions, and manages any assessment of their viability and scalability, with own team or in partnership with IT.
- Working with IT, assists in building robust systems focusing on long-term and ongoing maintenance and support.
- Ensures data solutions include deliverables required to achieve high-quality data.
- Displays a strong understanding of complex multi-tier, multi-platform systems, and applies principles of metadata, lineage, business definitions, compliance, and data security to project work.
- Has an in-depth understanding of Business Intelligence tools, including visualization and user experience techniques.
- Works with IT to help scale prototypes.
- Demonstrates a comprehensive understanding of new technologies as needed to progress initiatives.

Requirements:
- Expertise in Python programming, with demonstrated real-world experience building out data tools in a Python environment.
- Expertise in AWS services, with demonstrated real-world experience building out data tools on AWS.
- Bachelor's degree in Computer Science, Computer Engineering, or a related discipline preferred; a Master's in the same or a related discipline strongly preferred.
- 3+ years of experience coding for data management, data warehousing, or other data environments, including, but not limited to, Python and Spark. Experience with SAS is preferred.
- 3+ years of experience as a developer working in an AWS cloud computing environment.
- 3+ years of experience using Git or Bitbucket.
- Experience with Redshift, RDS, and DynamoDB is preferred.
- Strong written and oral communication skills.
- Experience in the healthcare industry with healthcare data analytics products.
- Experience with healthcare vocabulary and data standards (OMOP, FHIR) is a plus.
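An ETL pipeline like the ones described in this posting follows the same extract-transform-load shape at any scale. A minimal sketch using an in-memory CSV payload and SQLite as stand-ins for S3 and Redshift/RDS (the patient data and quality rule are illustrative):

```python
import csv
import io
import sqlite3

# Illustrative ETL: extract rows from a CSV payload, drop rows that fail
# a basic quality check, and load the rest into a relational table.

raw = "patient_id,weight_kg\np1,70\np2,not_recorded\np3,82"

def extract(text):
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    # Keep only rows whose weight parses as a number, then cast it.
    return [(r["patient_id"], float(r["weight_kg"]))
            for r in rows if r["weight_kg"].replace(".", "", 1).isdigit()]

def load(rows, conn):
    conn.execute("CREATE TABLE IF NOT EXISTS weights (patient_id TEXT, weight_kg REAL)")
    conn.executemany("INSERT INTO weights VALUES (?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(extract(raw)), conn)
print(conn.execute("SELECT COUNT(*) FROM weights").fetchone()[0])  # 2
```

Keeping extract, transform, and load as separate functions is what lets each stage be swapped (S3 reader, Spark transform, Redshift writer) without rewriting the pipeline.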

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 14 Lacs

Bengaluru

Work from Office

Key Skills: Core Java, Spring/Spring Boot, AWS services (EC2, S3, Lambda, RDS, API Gateway, CloudWatch), RESTful APIs, microservices architecture, Docker, Git/version control, CI/CD tools, SQL/RDBMS, Agile methodologies, unit testing (JUnit, Mockito), Maven/Gradle, API documentation.

Roles and Responsibilities:
- Design, develop, and maintain scalable Java-based applications.
- Implement and manage services on AWS cloud infrastructure.
- Collaborate with cross-functional teams to gather requirements and deliver solutions.
- Develop RESTful APIs and integrate them with front-end components.
- Ensure high performance, reliability, and scalability of the application.
- Monitor, troubleshoot, and optimize application performance.
- Write unit and integration tests to ensure code quality.
- Participate in code reviews and follow best practices in software development.

Experience Requirement:
- Strong proficiency in Java development.
- 5-8 years of experience with AWS services such as EC2, S3, Lambda, RDS, and API Gateway.
- Familiarity with microservices architecture and containerization tools like Docker.
- Experience with version control systems such as Git.
- Good understanding of CI/CD pipelines and DevOps practices.
- Strong analytical and problem-solving skills.
- Minimum years of relevant experience.

Education: B.E., B.Tech, B.Sc.
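Calling RESTful services reliably usually involves a retry-with-backoff pattern regardless of language; a hedged sketch in Python (a Spring application would use an equivalent retry mechanism), with `flaky_call` simulating a service that fails twice before succeeding:

```python
# Illustrative retry-with-exponential-backoff for unreliable service calls.
# `sleeper` is injectable so the example runs instantly; real code passes
# time.sleep.

def retry(fn, attempts=3, base_delay=0.0, sleeper=lambda s: None):
    """Call `fn`, retrying up to `attempts` times, doubling the delay each try."""
    for i in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if i == attempts - 1:
                raise  # out of attempts: surface the failure
            sleeper(base_delay * 2 ** i)

calls = {"n": 0}
def flaky_call():
    calls["n"] += 1
    if calls["n"] < 3:
        raise ConnectionError("service unavailable")
    return "200 OK"

print(retry(flaky_call))  # 200 OK
```

Backoff with a cap (and usually jitter) keeps a fleet of clients from hammering a recovering service in lockstep.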

Posted 2 weeks ago

Apply

8.0 - 12.0 years

22 - 35 Lacs

Bengaluru

Hybrid

Role & responsibilities: As a Senior Data Engineer and database specialist, you will be designing, creating, and managing the cloud databases and data pipelines that underpin our decoupled cloud architecture and API-first approach. You have proven expertise in database design, data ingestion, transformation, data writing, scheduling, and query management within a cloud environment. You will have proven experience working with AWS cloud infrastructure engineers, software/API developers, and architects to design, develop, deploy, and operate data services and solutions that underpin a cloud ecosystem. You will take ownership and accountability of functional and non-functional design and work within a team of engineers to create innovative solutions that unlock value and modernise technology designs. You will role-model a continuous improvement mindset in the team, and in your project interactions, by taking technical ownership of key assets, including roadmaps and the technical direction of data services running on our AWS environments.

See yourself in our team: The Business Banking Technology Domain works in an Agile methodology with our business banking business to plan, prioritise, and deliver on high-value technology objectives with key results that meet our regulatory obligations and protect the community. You will work within the VRM Crew, which is working on initiatives such as a Gen AI-based cash flow coach, to provide relevant data to our regulators. To achieve our objectives, you will use your deep understanding of data modelling and data quality and your extensive experience with SQL to access relational databases such as Oracle and Postgres to identify, transform, and validate data required for complex business reporting requirements. You will use your experience in designing and building reliable and efficient data pipelines, preferably using modern cloud services on AWS such as S3, Lambda, Redshift, Glue, etc., to process large volumes of data efficiently. Experience with data-centric frameworks such as Spark, with programming knowledge in Scala or Python, is highly advantageous, as is experience working on Linux with shell and automation frameworks to manage code and infrastructure in a well-structured and reliable manner. Experience with Pega workflow software as a source or target for data integration is also highly regarded.

We're interested in hearing from people who:
- Can design and implement databases for data integration in the enterprise
- Can performance-tune applications from a database code and design perspective
- Can automate data ingestion and transformation processes using scheduling tools, and monitor and troubleshoot data pipelines to ensure reliability and performance
- Have experience working through performance and scaling via horizontal scaling designs versus database tuning
- Can design application logical database requirements and implement physical solutions
- Can collaborate with business and technical teams in order to design and build critical databases and data pipelines
- Can advise business owners on strategic database direction and application solution design

Tech skills: We use a broad range of tools, languages, and frameworks. We don't expect you to know them all, but having significant experience and exposure with some of these (or equivalents) will set you up for success in this team:
- AWS data products such as AWS Glue and AWS EMR
- Oracle and AWS Aurora RDS, such as PostgreSQL
- AWS S3 ingestion, transformation, and writing to databases
- Proficiency in programming languages like Python, Scala, or Java for developing data ingestion and transformation scripts
- Strong knowledge of SQL for writing, optimizing, and debugging queries
- Familiarity with database design, indexing, and normalization principles
- Understanding of data formats (JSON, CSV, XML) and techniques for converting between them; ability to handle data validation, cleaning, and transformation
- Proficiency in automation tools and scripting (e.g., bash scripting, cron jobs) for scheduling and monitoring data processes
- Experience with version control systems (e.g., Git) for managing code and collaboration

Working with us: Whether you're passionate about customer service, driven by data, or called by creativity, a career with CommBank is for you. Our people bring their diverse backgrounds and unique perspectives to build a respectful, inclusive, and flexible workplace with flexible work locations. One where we're driven by our values and supported to share ideas, initiatives, and energy. One where making a positive impact for customers, communities, and each other is part of our every day. Here, you'll thrive. You'll be supported when faced with challenges and empowered to tackle new opportunities. We're hiring engineers from across all of Australia and have opened technology hubs in Melbourne and Perth. We really love working here, and we think you will too. We support our people with the flexibility to balance where work is done, with at least half their time each month connecting in office. We also have many other flexible working options available, including changing start and finish times, part-time arrangements, and job share, to name a few. Talk to us about how these arrangements might work in the role you're interested in. If this sounds like the role for you, we would love to hear from you. Apply today!

If you are interested in this job, please share your details with an updated CV at Krishankant@thinkpeople.in
Total Exp.- Rel Exp.- Current Company- CTC- ECTC- Notice Period- DOB- Edu.-
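The data validation and format-conversion skills the posting lists (JSON, CSV, cleaning) can be sketched in a few lines with the standard library; the record fields and the validation rule are illustrative:

```python
import csv
import io
import json

# Illustrative JSON-to-CSV conversion with a validation step that drops
# records with a missing amount before they reach the CSV output.

records = json.loads('[{"id": 1, "amount": "150.25"}, {"id": 2, "amount": ""}]')

def validate(rec):
    """Basic data-quality gate: reject records with an empty amount."""
    return rec["amount"] != ""

buf = io.StringIO()
writer = csv.DictWriter(buf, fieldnames=["id", "amount"])
writer.writeheader()
for rec in filter(validate, records):
    writer.writerow(rec)

print(buf.getvalue().strip().splitlines())  # ['id,amount', '1,150.25']
```

In a real pipeline the rejected records would be routed to a quarantine location rather than silently dropped, so data-quality issues stay visible.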

Posted 2 weeks ago

Apply

3.0 - 7.0 years

8 - 12 Lacs

Mumbai, Navi Mumbai, Mumbai (All Areas)

Work from Office

Cloud DevOps Engineer (AWS), 3 to 5 years' experience. Location: Mumbai (WFO).

Core Technical Skills:
1. AWS services: AWS EKS/ECS, RDS, NACLs, route tables, security services configuration
2. CI/CD & infrastructure as code: Jenkins, Terraform, DevSecOps, end-to-end DevOps
3. Containerization: Docker and Kubernetes
4. Familiarity with tools: ELK, Redis, WAF, firewall, VPN, CloudFront/CDN
5. Linux OS

Bonus Skills:
1. AWS certification (e.g., AWS Certified DevOps Engineer / AWS Solutions Architect)
2. Knowledge of cost optimization techniques
3. Linux administration: strong in managing Linux and automation via scripts

Posted 2 weeks ago

Apply

6.0 - 10.0 years

22 - 25 Lacs

Bengaluru

Work from Office

Our engineering team is looking for a Data Engineer who is highly proficient in Python, has a very good understanding of AWS cloud computing and ETL pipelines, and demonstrates proficiency with SQL and relational database concepts. In this role you will be a mid- to senior-level individual contributor guiding our migration efforts, serving as a senior data engineer working closely with the data architects to evaluate best-fit solutions and processes for our team. You will work with the rest of the team as we move away from legacy tech and introduce new tools and ETL pipeline solutions. You will collaborate with subject matter experts, data architects, informaticists, and data scientists to evolve our current cloud-based ETL to the next generation.

Responsibilities:
- Independently prototypes/develops data solutions of high complexity to meet the needs of the organization and business customers.
- Designs proof-of-concept solutions utilizing an advanced understanding of multiple coding languages to meet technical and business requirements, with an ability to perform iterative solution testing to ensure specifications are met.
- Designs and develops data solutions that enable effective self-service data consumption, and can describe their value to the customer.
- Collaborates with stakeholders in defining metrics that are impactful to the business; prioritizes efforts based on customer value.
- Has an in-depth understanding of Agile techniques; can set expectations for deliverables of high complexity.
- Can assist in the creation of roadmaps for data solutions and turn vague ideas or problems into data product solutions.
- Influences strategic thinking across the team and the broader organization.
- Maintains proof-of-concept and prototype data solutions, and manages any assessment of their viability and scalability, with own team or in partnership with IT.
- Working with IT, assists in building robust systems focusing on long-term and ongoing maintenance and support.
- Ensures data solutions include deliverables required to achieve high-quality data.
- Displays a strong understanding of complex multi-tier, multi-platform systems, and applies principles of metadata, lineage, business definitions, compliance, and data security to project work.
- Has an in-depth understanding of Business Intelligence tools, including visualization and user experience techniques.
- Works with IT to help scale prototypes.
- Demonstrates a comprehensive understanding of new technologies as needed to progress initiatives.

Requirements:
- Expertise in Python programming, with demonstrated real-world experience building out data tools in a Python environment.
- Expertise in AWS services, with demonstrated real-world experience building out data tools on AWS.
- Bachelor's degree in Computer Science, Computer Engineering, or a related discipline preferred; a Master's in the same or a related discipline strongly preferred.
- 3+ years of experience coding for data management, data warehousing, or other data environments, including, but not limited to, Python and Spark. Experience with SAS is preferred.
- 3+ years of experience as a developer working in an AWS cloud computing environment.
- 3+ years of experience using Git or Bitbucket.
- Experience with Redshift, RDS, and DynamoDB is preferred.
- Strong written and oral communication skills.
- Experience in the healthcare industry with healthcare data analytics products.
- Experience with healthcare vocabulary and data standards (OMOP, FHIR) is a plus.

Posted 2 weeks ago

Apply

6.0 - 9.0 years

20 - 25 Lacs

Bengaluru

Work from Office

Our engineering team is looking for a Data Engineer who is highly proficient in Python, has a very good understanding of AWS cloud computing and ETL pipelines, and demonstrates proficiency with SQL and relational database concepts. In this role you will be a mid- to senior-level individual contributor guiding our migration efforts, serving as a senior data engineer working closely with the data architects to evaluate best-fit solutions and processes for our team. You will work with the rest of the team as we move away from legacy tech and introduce new tools and ETL pipeline solutions. You will collaborate with subject matter experts, data architects, informaticists, and data scientists to evolve our current cloud-based ETL to the next generation.

Responsibilities:
- Independently prototypes/develops data solutions of high complexity to meet the needs of the organization and business customers.
- Designs proof-of-concept solutions utilizing an advanced understanding of multiple coding languages to meet technical and business requirements, with an ability to perform iterative solution testing to ensure specifications are met.
- Designs and develops data solutions that enable effective self-service data consumption, and can describe their value to the customer.
- Collaborates with stakeholders in defining metrics that are impactful to the business; prioritizes efforts based on customer value.
- Has an in-depth understanding of Agile techniques; can set expectations for deliverables of high complexity.
- Can assist in the creation of roadmaps for data solutions and turn vague ideas or problems into data product solutions.
- Influences strategic thinking across the team and the broader organization.
- Maintains proof-of-concept and prototype data solutions, and manages any assessment of their viability and scalability, with own team or in partnership with IT.
- Working with IT, assists in building robust systems focusing on long-term and ongoing maintenance and support.
- Ensures data solutions include deliverables required to achieve high-quality data.
- Displays a strong understanding of complex multi-tier, multi-platform systems, and applies principles of metadata, lineage, business definitions, compliance, and data security to project work.
- Has an in-depth understanding of Business Intelligence tools, including visualization and user experience techniques.
- Works with IT to help scale prototypes.
- Demonstrates a comprehensive understanding of new technologies as needed to progress initiatives.

Requirements:
- Expertise in Python programming, with demonstrated real-world experience building out data tools in a Python environment.
- Expertise in AWS services, with demonstrated real-world experience building out data tools on AWS.
- Bachelor's degree in Computer Science, Computer Engineering, or a related discipline preferred; a Master's in the same or a related discipline strongly preferred.
- 3+ years of experience coding for data management, data warehousing, or other data environments, including, but not limited to, Python and Spark. Experience with SAS is preferred.
- 3+ years of experience as a developer working in an AWS cloud computing environment.
- 3+ years of experience using Git or Bitbucket.
- Experience with Redshift, RDS, and DynamoDB is preferred.
- Strong written and oral communication skills.
- Experience in the healthcare industry with healthcare data analytics products.
- Experience with healthcare vocabulary and data standards (OMOP, FHIR) is a plus.

Posted 2 weeks ago

Apply

6.0 - 11.0 years

10 - 20 Lacs

Bengaluru

Work from Office

We are looking for an energetic, self-motivated, and exceptional Data Engineer to work on enterprise products built on AI and big-data engineering, leveraging the AWS/Databricks tech stack. You will work with a strong team of architects, data scientists/AI specialists, data engineers, and integration specialists.

Skills and Qualifications:
- 5+ years of experience in the DWH/ETL domain; Databricks/AWS tech stack
- 2+ years of experience building data pipelines with Databricks/PySpark/SQL
- Experience writing and interpreting SQL queries and designing data models and data standards
- Experience with SQL Server, Oracle, and/or cloud databases
- Experience with data warehouses and data marts, star and snowflake models
- Experience loading data into databases from other databases and from files
- Experience analyzing and drawing design conclusions from data-profiling results
- Understanding of business processes and the relationships between systems and applications; must be comfortable conversing with end users

Job Description: Our engineering team is looking for a Data Engineer who is highly proficient in Python, has a solid understanding of AWS cloud computing and ETL pipelines, and demonstrates proficiency with SQL and relational database concepts. In this role you will be a mid-to-senior-level individual contributor guiding our migration efforts, serving as a senior data engineer and working closely with the data architects to evaluate best-fit solutions and processes for our team. You will work with the rest of the team as we move away from legacy tech and introduce new tools and ETL pipeline solutions. You will collaborate with subject-matter experts, data architects, informaticists, and data scientists to evolve our current cloud-based ETL to the next generation.

Responsibilities:
- Independently prototype and develop data solutions of high complexity to meet the needs of the organization and business customers.
- Design proof-of-concept solutions utilizing an advanced understanding of multiple coding languages to meet technical and business requirements, with the ability to perform iterative solution testing to ensure specifications are met.
- Design and develop data solutions that enable effective self-service data consumption, and describe their value to the customer.
- Collaborate with stakeholders to define metrics that are impactful to the business; prioritize efforts based on customer value.
- Apply an in-depth understanding of Agile techniques; set expectations for deliverables of high complexity; assist in creating roadmaps for data solutions; turn vague ideas or problems into data product solutions.
- Influence strategic thinking across the team and the broader organization.
- Maintain proof-of-concept and prototype data solutions, and manage any assessment of their viability and scalability, with your own team or in partnership with IT.
- Working with IT, assist in building robust systems focused on long-term, ongoing maintenance and support; ensure data solutions include the deliverables required to achieve high-quality data.
- Display a strong understanding of complex multi-tier, multi-platform systems, and apply principles of metadata, lineage, business definitions, compliance, and data security to project work.
- Apply an in-depth understanding of business-intelligence tools, including visualization and user-experience techniques; work with IT to help scale prototypes.
- Demonstrate a comprehensive understanding of new technologies as needed to progress initiatives.

Requirements:
- Expertise in Python programming, with demonstrated real-world experience building data tools in a Python environment.
- Expertise in AWS services, with demonstrated real-world experience building data tools on AWS.
- Bachelor's degree in Computer Science, Computer Engineering, or a related discipline preferred; Master's in the same or a related discipline strongly preferred.
- 3+ years of experience coding for data management, data warehousing, or other data environments, including, but not limited to, Python and Spark. Experience with SAS is preferred.
- 3+ years of experience as a developer working in an AWS cloud computing environment.
- 3+ years of experience using Git or Bitbucket.
- Experience with Redshift, RDS, and DynamoDB is preferred.
- Strong written and oral communication skills.
- Experience in the healthcare industry with healthcare data-analytics products.
- Experience with healthcare vocabularies and data standards (OMOP, FHIR) is a plus.
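To make the star-model work mentioned above concrete, here is a minimal, hypothetical sketch in plain Python (no Spark): raw sales records are split into a deduplicated, surrogate-keyed product dimension and a fact table that references it. All names are illustrative, not part of the posting.

```python
# Hypothetical illustration of a star-schema load: raw records become a
# product dimension (deduplicated, surrogate-keyed) plus a fact table that
# references the dimension by surrogate key.

def build_star(raw_rows):
    dim_product = {}          # natural key -> surrogate key
    dim_rows, fact_rows = [], []
    for row in raw_rows:
        nk = row["product_code"]
        if nk not in dim_product:
            sk = len(dim_product) + 1
            dim_product[nk] = sk
            dim_rows.append({"product_sk": sk, "product_code": nk,
                             "product_name": row["product_name"]})
        fact_rows.append({"product_sk": dim_product[nk],
                          "qty": row["qty"],
                          "amount": row["qty"] * row["unit_price"]})
    return dim_rows, fact_rows

raw = [
    {"product_code": "P1", "product_name": "Widget", "qty": 2, "unit_price": 5.0},
    {"product_code": "P2", "product_name": "Gadget", "qty": 1, "unit_price": 9.0},
    {"product_code": "P1", "product_name": "Widget", "qty": 3, "unit_price": 5.0},
]
dims, facts = build_star(raw)
```

In a Databricks/PySpark pipeline the same split would typically be expressed as joins and window functions over DataFrames rather than a Python loop.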

Posted 2 weeks ago

Apply

6.0 - 11.0 years

5 - 8 Lacs

Bengaluru

Work from Office

Experience: 6-12 years. Notice: Immediate to 30 days.

Requirements:

Full Stack Development:
- Build and maintain web applications using React.js and Node.js.
- Develop back-end services and APIs using Node.js and, selectively, Python FastAPI.
- Create RESTful and GraphQL APIs; integrate with internal and external systems.
- Optimize frontend performance and backend scalability.

Cloud & DevOps (AWS):
- Deploy and manage services on AWS (EC2, S3, Lambda, RDS, API Gateway, etc.).
- Set up CI/CD pipelines for automated build, test, and deployment.
- Monitor cloud environments for reliability, cost-efficiency, and performance.
- Implement security best practices (IAM policies, encryption, WAF, etc.).

Tech Stack:
- Frontend: React.js, Redux, HTML5/CSS3, Next.js (optional)
- Backend: Node.js (Express.js), Python (FastAPI)
- Database: MongoDB, PostgreSQL/MySQL (optional)
- Cloud: AWS (EC2, S3, Lambda, API Gateway, IAM, RDS, etc.)
- DevOps: Docker, CI/CD, GitHub/GitLab Actions
- Others: REST APIs, GraphQL, JWT/OAuth, WebSockets, and microservices
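The JWT/OAuth item in the stack above refers to stateless signed tokens; here is a minimal, hypothetical HS256-style sketch using only the standard library (shown in Python for brevity; a real service would use a vetted library such as PyJWT and include expiry claims):

```python
import base64, hashlib, hmac, json

# Minimal sketch of an HS256-style signed token -- the idea behind the
# JWT auth listed in the stack. Illustration only, not production code.

def b64url(data: bytes) -> str:
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def sign(payload: dict, secret: bytes) -> str:
    header = b64url(json.dumps({"alg": "HS256", "typ": "JWT"}).encode())
    body = b64url(json.dumps(payload).encode())
    mac = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return f"{header}.{body}.{b64url(mac)}"

def verify(token: str, secret: bytes) -> bool:
    header, body, sig = token.rsplit(".", 2)
    mac = hmac.new(secret, f"{header}.{body}".encode(), hashlib.sha256).digest()
    return hmac.compare_digest(b64url(mac), sig)  # constant-time compare

token = sign({"sub": "user-42", "role": "admin"}, b"demo-secret")
```

The same header.payload.signature layout is what an Express.js or FastAPI middleware validates on each request.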

Posted 2 weeks ago

Apply

4.0 - 8.0 years

6 - 10 Lacs

Noida

Work from Office

Job Summary: We are seeking an experienced and results-driven Senior Java Developer to join our team in Noida. The ideal candidate should have strong hands-on experience in Java, the Spring Framework, JPA, Hibernate, Kubernetes, AWS, and microservices architecture. You will play a critical role in the design, development, and deployment of scalable, high-performance applications.

Key Responsibilities:
- Design, develop, and implement robust, scalable Java-based applications.
- Develop microservices using Spring Boot.
- Apply hands-on experience with Docker-based containerization and Kubernetes for application deployment.
- Work with JPA and Hibernate for effective data persistence and database operations.
- Deploy and manage services on AWS cloud infrastructure.
- Collaborate with architects, DevOps engineers, QA, and other developers to deliver enterprise-grade solutions.
- Optimize application performance and ensure responsiveness to front-end requests.
- Ensure code quality and maintainability through code reviews and best practices.
- Participate in the full software development life cycle: requirement analysis, design, development, testing, and deployment.

Required Skills & Qualifications:
- 4 to 8 years of strong Java development experience.
- Hands-on experience with Spring Boot, Spring Core, and other Spring modules.
- Strong knowledge of the JPA and Hibernate ORM frameworks.
- Experience with Kubernetes for container orchestration and microservices management.
- Working knowledge of AWS services (EC2, S3, RDS, ECS, etc.).
- Strong understanding of RESTful APIs and microservices architecture.
- Familiarity with CI/CD tools and version control systems (e.g., Git, Jenkins).
- Solid problem-solving skills and a strong sense of ownership.
- Bachelor's or Master's degree in Computer Science, Engineering, or a related discipline.

Preferred Skills:
- Experience with Docker and containerization.
- Exposure to monitoring tools like Prometheus and Grafana.
- Knowledge of Agile/Scrum methodologies.
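One recurring concern in a microservices setup like the one described is retrying flaky calls to a downstream service. A minimal, language-agnostic sketch of exponential backoff with jitter (shown in Python for brevity; in the Spring stack above this would typically be Spring Retry or Resilience4j, and all names here are hypothetical):

```python
import random

# Minimal sketch of exponential backoff with full jitter for retrying a
# flaky downstream call. 'sleep' is injectable so the demo runs instantly.

def retry(call, attempts=4, base_delay=0.1, sleep=lambda s: None):
    for attempt in range(attempts):
        try:
            return call()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of attempts: surface the failure
            # Full jitter: wait a random amount up to base * 2^attempt.
            sleep(random.uniform(0, base_delay * (2 ** attempt)))

# Simulated downstream service that fails twice, then succeeds.
state = {"calls": 0}
def flaky():
    state["calls"] += 1
    if state["calls"] < 3:
        raise ConnectionError("503 from downstream")
    return "ok"

result = retry(flaky)
```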

Posted 2 weeks ago

Apply

5.0 - 8.0 years

8 - 18 Lacs

Gurugram

Remote

Job Title: Part-Time DevOps Engineer (AWS) Contract
Location: Remote
Engagement Type: Part-Time Contract (Hourly/Monthly Block)
Experience Required: 5+ Years in DevOps and AWS

Overview: We are seeking an experienced DevOps Engineer with a strong background in AWS cloud infrastructure for a part-time, remote contract role. This is not a full-time opportunity; we are looking to engage a professional on an hourly or monthly block-of-time basis to support ongoing infrastructure, CI/CD, automation, and cloud-optimization needs.

Key Responsibilities:
- Manage and optimize AWS infrastructure (EC2, S3, IAM, VPC, Lambda, RDS, etc.)
- Build and maintain CI/CD pipelines using tools such as GitHub Actions, Jenkins, or similar
- Implement Infrastructure as Code (IaC) using Terraform or CloudFormation
- Monitor and troubleshoot system performance, scalability, and security issues
- Automate repetitive tasks and deploy updates with zero downtime
- Collaborate with development and product teams to align DevOps practices
- Provide on-demand support and availability within agreed working hours

Required Skills & Qualifications:
- 5+ years of hands-on experience in DevOps roles
- Strong expertise in AWS cloud services and cost-optimization strategies
- Proficiency with CI/CD, Docker, Kubernetes, Git, and scripting (Bash/Python)
- Experience with monitoring/logging tools (CloudWatch, ELK, Prometheus, etc.)
- Solid understanding of security, networking, and cloud architecture
- Excellent problem-solving and communication skills
- Prior experience working in remote/contract-based roles preferred
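"Deploy updates with zero downtime" typically means a rolling or canary release. A minimal, hypothetical sketch of shifting traffic from a "blue" (old) to a "green" (new) version in fixed steps, rolling back if a health check fails at any step (in practice this logic lives in an ALB weighted target group or a deployment tool, not hand-written code):

```python
# Hypothetical sketch of a weighted canary rollout: traffic moves from the
# old ("blue") version to the new ("green") in steps; any failed health
# check aborts the rollout and restores all traffic to blue.

def canary_rollout(healthy, steps=(10, 50, 100)):
    weights = {"blue": 100, "green": 0}
    history = []
    for pct in steps:
        weights = {"blue": 100 - pct, "green": pct}
        history.append(dict(weights))
        if not healthy(pct):                         # health check per step
            weights = {"blue": 100, "green": 0}      # roll back
            return weights, history, False
    return weights, history, True

# Green stays healthy at every step -> full cutover to green.
final, hist, ok = canary_rollout(lambda pct: True)
```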

Posted 2 weeks ago

Apply

4.0 - 9.0 years

5 - 12 Lacs

Silchar, Goalpara, Dimapur

Work from Office

Role & Responsibilities:
- Deliver volume & revenue sales targets for all products by executing the distribution strategy at the channel-partner (RDS) level
- Monitor quality of distribution through the RDS sales team
- Strengthen relationships with key retail customers
- Track and report competition schemes & programs
- Ensure availability of stock at RDS and retail while adhering to the norms
- Execute promotional activities for channel partners to drive sales and build market credibility
- Distribution expansion and extraction: achieve retail (MBO) expansion targets through an increase in the number of outlets in existing and new geographies

RDS Sales Executive (RDS SE) Management: target setting for RDS SEs; RDS SE beat-plan adherence; systems/formats at the RDS SE level; managing in-store promoters; imparting product knowledge to sellers; driving delivery of distribution KPIs.

RDS Management: monitoring RDS infra / SE availability; monitoring stock holding & market credit; day-to-day performance reviews & discussions; problem solving; systems/formats at the RDS point; compliance with company policies.

Critical Success Factors: continuous learning & empowering talent; building team commitment; decision making & delivering results; building strategic relationships & organizational agility; analytical thinking.

Core Competencies: products, services & technology knowledge; consumer negotiation; working with partners; solving problems; sales planning & forecasting.

Formal Qualifications: University degree in Business, Marketing, or Engineering/ICT (or similar/equivalent); a higher degree such as an MBA is considered a merit. Three to five years of experience in distribution planning and channel implementation. Understanding of general retail-management best practices and customer relationship management. Hardworking, persistent, and dependable; positive and enthusiastic.

Financial: accountability for revenue targets for the distribution channel across all products.
Non-Financial: monitoring of distributors' sales force and retailers; resolution of channel-specific issues within timelines.

Key Performance Indicators: achievement of key targets in the distribution network (sales, revenue) in the territory; achievement of retail outlet (MBO) expansion targets; performance management of channel partners and sales force; delivery of distribution metrics.

Interested candidates, kindly share your updated resume to amrita.singh@manpower.co.in

Posted 3 weeks ago

Apply

6.0 - 9.0 years

14 - 24 Lacs

Hyderabad, Bengaluru

Hybrid

Hiring for .NET Full Stack with Cloud experience, 6-9 years. Level: Assistant Manager, but an individual-contributor (IC) role. Skills and locations: .NET Core with Angular and Azure services (not Azure DevOps) for Bangalore/Hyderabad; .NET Core with Angular and AWS for Hyderabad.

Posted 3 weeks ago

Apply

6.0 - 11.0 years

11 - 12 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled DevOps Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using DevOps practices and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in DevOps, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. 6 to 11+ years of experience in full-stack development, with a strong focus on DevOps.

DevOps with AWS - Roles & Responsibilities:
- Use AWS services such as EC2, VPC, S3, IAM, RDS, and Route 53.
- Automate infrastructure using Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation.
- Build and maintain CI/CD pipelines using tools such as AWS CodePipeline, Jenkins, or GitLab CI/CD.
- Automate build, test, and deployment processes for Java applications.
- Use Ansible, Chef, or AWS Systems Manager for managing configurations across environments.
- Containerize Java apps using Docker; deploy and manage containers using Amazon ECS, EKS (Kubernetes), or Fargate.
- Monitoring & logging using Amazon CloudWatch, Prometheus + Grafana, the ELK Stack (Elasticsearch, Logstash, Kibana), and AWS X-Ray for distributed tracing.
- Manage access with IAM roles/policies; use AWS Secrets Manager / Parameter Store for managing credentials.
- Enforce security best practices, encryption, and audits.
- Automate backups for databases and services using AWS Backup, RDS snapshots, and S3 lifecycle rules; implement disaster recovery (DR) strategies.
- Work closely with development teams to integrate DevOps practices; document pipelines, architecture, and troubleshooting runbooks.
- Monitor and optimize AWS resource usage; use AWS Cost Explorer, Budgets, and Savings Plans.

Must-Have Skills:
- Experience working on Linux-based infrastructure.
- Excellent understanding of Ruby, Python, Perl, and Java.
- Configuring and managing databases such as MySQL and MongoDB.
- Excellent troubleshooting skills.
- Selecting and deploying appropriate CI/CD tools.
- Working knowledge of various tools, open-source technologies, and cloud services.
- Awareness of critical concepts in DevOps and Agile principles.
- Managing stakeholders and external interfaces; setting up tools and required infrastructure.
- Defining and setting development, testing, release, update, and support processes for DevOps operations.
- The technical skills to review, verify, and validate the software code developed in the project.

Interview Mode: F2F for candidates residing in Hyderabad; Zoom for other states.
Location: 43/A, MLA Colony, Road No 12, Banjara Hills, 500034
Time: 2-4 pm
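The backup-automation duties listed above always sit on top of a retention policy. A minimal, hypothetical sketch of "keep the last N daily snapshots plus month-end copies" (a simple grandfather-father-son scheme, similar in spirit to what an RDS snapshot retention setting or S3 lifecycle rule expresses declaratively):

```python
from datetime import date, timedelta

# Hypothetical retention policy: keep the last `keep_daily` daily snapshots,
# plus any snapshot taken on the last day of its month. Returns the
# snapshot dates to retain; everything else would be pruned.

def retain(snapshots, today, keep_daily=7):
    keep = set()
    recent_cutoff = today - timedelta(days=keep_daily)
    for d in snapshots:
        if d > recent_cutoff:
            keep.add(d)                              # recent dailies
        elif (d + timedelta(days=1)).month != d.month:
            keep.add(d)                              # month-end copies
    return sorted(keep)

snaps = [date(2024, 5, 31), date(2024, 6, 15), date(2024, 6, 30),
         date(2024, 7, 8), date(2024, 7, 9), date(2024, 7, 10)]
kept = retain(snaps, today=date(2024, 7, 10))
```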

Posted 3 weeks ago

Apply

4.0 - 6.0 years

6 - 8 Lacs

Bengaluru, Bellandur

Hybrid

Hiring an AWS Data Engineer for a 6-month hybrid contractual role based in Bellandur, Bengaluru. The ideal candidate will have 4-6 years of experience in data engineering, with strong expertise in AWS services (S3, EC2, RDS, Lambda, EKS), PostgreSQL, Redis, Apache Iceberg, and Graph/Vector Databases. Proficiency in Python or Golang is essential. Responsibilities include designing and optimizing data pipelines on AWS, managing structured and in-memory data, implementing advanced analytics with vector/graph databases, and collaborating with cross-functional teams. Prior experience with CI/CD and containerization (Docker/Kubernetes) is a plus.
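The "vector database" work mentioned above reduces to nearest-neighbor search over embeddings. A minimal, hypothetical brute-force cosine-similarity sketch (real systems replace this linear scan with an approximate index such as HNSW in pgvector or a dedicated vector store):

```python
import math

# Hypothetical brute-force vector search: rank stored embeddings by cosine
# similarity to a query vector and return the top-k document ids.

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.hypot(*a) * math.hypot(*b))

def top_k(store, query, k=2):
    scored = sorted(store.items(), key=lambda kv: cosine(kv[1], query),
                    reverse=True)
    return [doc_id for doc_id, _ in scored[:k]]

# Toy 2-D embeddings; production embeddings have hundreds of dimensions.
store = {"doc_a": (1.0, 0.0), "doc_b": (0.7, 0.7), "doc_c": (0.0, 1.0)}
result = top_k(store, query=(1.0, 0.1))
```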

Posted 3 weeks ago

Apply

2.0 - 7.0 years

4 - 6 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Develop scalable microservices using Java Spring Boot. Design and implement REST APIs and integrate them with frontend and external services. Deploy and manage services using AWS services such as EC2, S3, Lambda, RDS, and ECS.

Required Candidate Profile: Use CI/CD pipelines for automated builds and deployments (e.g., Jenkins, GitHub Actions). Collaborate with frontend, QA, DevOps, and business teams. Write unit and integration tests to ensure code quality.

Perks and Benefits
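A framework-free sketch of the REST routing idea behind the API work above (shown in Python for brevity; the posting's stack would express this as Spring Boot `@GetMapping`/`@PostMapping` handlers, and all names here are hypothetical):

```python
# Hypothetical REST-style routing: map (method, path) pairs to handlers
# over an in-memory resource store, returning (status, body) tuples.

STORE = {1: {"id": 1, "name": "widget"}}

def get_item(item_id):
    item = STORE.get(item_id)
    return (200, item) if item else (404, {"error": "not found"})

def create_item(body):
    new_id = max(STORE, default=0) + 1
    STORE[new_id] = {"id": new_id, **body}
    return 201, STORE[new_id]

ROUTES = {("GET", "/items/{id}"): get_item,
          ("POST", "/items"): create_item}

def dispatch(method, path, body=None, item_id=None):
    if item_id is not None:
        return ROUTES[(method, "/items/{id}")](item_id)
    return ROUTES[(method, path)](body)

status, created = dispatch("POST", "/items", body={"name": "gadget"})
```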

Posted 3 weeks ago

Apply

4.0 - 9.0 years

15 - 25 Lacs

Hyderabad

Work from Office

Python, with experience performing ETL and applying data-engineering concepts (PySpark, NumPy, Pandas, AWS Glue, and Airflow). SQL, with hands-on Oracle work experience. SQL profilers / query analyzers. AWS cloud services (S3, RDS, Redshift).
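A minimal, hypothetical illustration of the extract-transform-load pattern this role centers on, using only the standard library (a production pipeline would use PySpark or Glue as listed; the data and names are invented):

```python
import csv, io
from collections import defaultdict

# Hypothetical ETL over in-memory CSV data: parse rows (extract), cast
# amounts and drop refunds (transform), aggregate revenue per region (load).

RAW = """region,amount
south,100.0
north,250.5
south,-20.0
north,99.5
"""

def etl(raw_csv):
    totals = defaultdict(float)
    for row in csv.DictReader(io.StringIO(raw_csv)):   # extract
        amount = float(row["amount"])                  # transform: cast
        if amount < 0:                                 # transform: filter refunds
            continue
        totals[row["region"]] += amount                # load: aggregate
    return dict(totals)

result = etl(RAW)
```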

Posted 3 weeks ago

Apply

4.0 - 9.0 years

6 - 11 Lacs

Silchar, Goalpara, Dimapur

Work from Office

Role & Responsibilities:
- Deliver volume & revenue sales targets for all products by executing the distribution strategy at the channel-partner (RDS) level
- Monitor quality of distribution through the RDS sales team
- Strengthen relationships with key retail customers
- Track and report competition schemes & programs
- Ensure availability of stock at RDS and retail while adhering to the norms
- Execute promotional activities for channel partners to drive sales and build market credibility
- Distribution expansion and extraction: achieve retail (MBO) expansion targets through an increase in the number of outlets in existing and new geographies

RDS Sales Executive (RDS SE) Management: target setting for RDS SEs; RDS SE beat-plan adherence; systems/formats at the RDS SE level; managing in-store promoters; imparting product knowledge to sellers; driving delivery of distribution KPIs.

RDS Management: monitoring RDS infra / SE availability; monitoring stock holding & market credit; day-to-day performance reviews & discussions; problem solving; systems/formats at the RDS point; compliance with company policies.

Critical Success Factors: continuous learning & empowering talent; building team commitment; decision making & delivering results; building strategic relationships & organizational agility; analytical thinking.

Core Competencies: products, services & technology knowledge; consumer negotiation; working with partners; solving problems; sales planning & forecasting.

Formal Qualifications: University degree in Business, Marketing, or Engineering/ICT (or similar/equivalent); a higher degree such as an MBA is considered a merit. Three to five years of experience in distribution planning and channel implementation. Understanding of general retail-management best practices and customer relationship management. Hardworking, persistent, and dependable; positive and enthusiastic.

Financial: accountability for revenue targets for the distribution channel across all products.
Non-Financial: monitoring of distributors' sales force and retailers; resolution of channel-specific issues within timelines.

Key Performance Indicators: achievement of key targets in the distribution network (sales, revenue) in the territory; achievement of retail outlet (MBO) expansion targets; performance management of channel partners and sales force; delivery of distribution metrics.

Interested candidates, kindly share your updated resume to amrita.singh@manpower.co.in

Posted 3 weeks ago

Apply

7.0 - 12.0 years

11 - 12 Lacs

Hyderabad

Work from Office

We are seeking a highly skilled DevOps Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using DevOps practices and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in DevOps, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements: Bachelor's degree in Computer Science, Engineering, or a related field. 7 to 12+ years of experience in full-stack development, with a strong focus on DevOps.

DevOps with AWS - Roles & Responsibilities:
- Use AWS services such as EC2, VPC, S3, IAM, RDS, and Route 53.
- Automate infrastructure using Infrastructure as Code (IaC) tools like Terraform or AWS CloudFormation.
- Build and maintain CI/CD pipelines using tools such as AWS CodePipeline, Jenkins, or GitLab CI/CD.
- Automate build, test, and deployment processes for Java applications.
- Use Ansible, Chef, or AWS Systems Manager for managing configurations across environments.
- Containerize Java apps using Docker; deploy and manage containers using Amazon ECS, EKS (Kubernetes), or Fargate.
- Monitoring & logging using Amazon CloudWatch, Prometheus + Grafana, the ELK Stack (Elasticsearch, Logstash, Kibana), and AWS X-Ray for distributed tracing.
- Manage access with IAM roles/policies; use AWS Secrets Manager / Parameter Store for managing credentials.
- Enforce security best practices, encryption, and audits.
- Automate backups for databases and services using AWS Backup, RDS snapshots, and S3 lifecycle rules; implement disaster recovery (DR) strategies.
- Work closely with development teams to integrate DevOps practices; document pipelines, architecture, and troubleshooting runbooks.
- Monitor and optimize AWS resource usage; use AWS Cost Explorer, Budgets, and Savings Plans.

Must-Have Skills:
- Experience working on Linux-based infrastructure.
- Excellent understanding of Ruby, Python, Perl, and Java.
- Configuring and managing databases such as MySQL and MongoDB.
- Excellent troubleshooting skills.
- Selecting and deploying appropriate CI/CD tools.
- Working knowledge of various tools, open-source technologies, and cloud services.
- Awareness of critical concepts in DevOps and Agile principles.
- Managing stakeholders and external interfaces; setting up tools and required infrastructure.
- Defining and setting development, testing, release, update, and support processes for DevOps operations.
- The technical skills to review, verify, and validate the software code developed in the project.

Interview Mode: F2F for candidates residing in Hyderabad; Zoom for other states.
Location: 43/A, MLA Colony, Road No 12, Banjara Hills, 500034
Time: 2-4 pm

Posted 3 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

As a Senior Software Engineer I, you will be a critical member of our technology team, responsible for designing, developing, and deploying scalable software solutions. You will leverage your expertise in Java, ReactJS, AWS, and emerging AI tools to deliver innovative products and services that enhance healthcare outcomes and streamline operations.

Primary Responsibilities:
- Design, develop, test, deploy, and maintain full-stack software solutions leveraging Java, ReactJS, and AWS cloud services
- Collaborate closely with cross-functional teams, including product managers, designers, data scientists, and DevOps engineers, to translate business requirements into technical solutions
- Implement responsive UI/UX designs using ReactJS, ensuring optimal performance and scalability
- Develop robust backend services and APIs using Java and related frameworks (e.g., Spring Boot)
- Leverage AWS cloud services (e.g., EC2, S3, Postgres/DynamoDB, ECS, EKS, CloudFormation) to build scalable, secure, and highly available solutions
- Incorporate AI/ML tools and APIs (such as OpenAI, Claude, Gemini, and Amazon AI services) into existing and new solutions to enhance product capabilities
- Conduct code reviews and adhere to software-engineering best practices to ensure quality, security, maintainability, and performance
- Actively participate in agile methodologies, sprint planning, backlog grooming, retrospectives, and continuous-improvement processes
- Troubleshoot, debug, and resolve complex technical issues and identify root causes to ensure system reliability and performance
- Document technical solutions, system designs, and code effectively for knowledge sharing and future reference
- Mentor junior team members, fostering technical growth and engineering excellence
- Comply with the terms and conditions of the employment contract, company policies and procedures, and any and all directives (such as, but not limited to, transfer and/or re-assignment to different work locations, change in teams and/or work shifts, policies regarding flexibility of work benefits and/or work environment, alternative work arrangements, and other decisions that may arise due to the changing business environment). The Company may adopt, vary, or rescind these policies and directives in its absolute discretion and without any limitation (implied or otherwise) on its ability to do so.

Required Qualifications:
- Bachelor's degree or higher in Computer Science, Software Engineering, or a related technical discipline
- 6+ years of hands-on software development experience across the full stack
- Solid experience developing front-end applications using ReactJS, TypeScript/JavaScript, HTML5, and CSS3
- Familiarity with AI/ML tools and APIs (such as OpenAI, Claude, Gemini, AWS AI/ML services) and experience integrating them into software solutions
- Experience with relational and NoSQL databases, along with solid SQL skills
- Experience in agile development methodologies and CI/CD pipelines
- Experience with monitoring tools like Splunk, Datadog, or Dynatrace
- Solid analytical and problem-solving skills, with the ability to troubleshoot complex technical issues independently
- Solid proficiency in Java, J2EE, Spring/Spring Boot, and RESTful API design
- Demonstrable experience deploying and managing applications on AWS (e.g., EC2, S3, Postgres/DynamoDB, RDS, ECS, EKS, CloudFormation)
- Proven excellent written, verbal communication, and interpersonal skills

Preferred Qualifications:
- Experience in the healthcare domain and understanding of healthcare data and workflows
- Hands-on experience with containerization technologies (Docker, Kubernetes)
- Experience with performance optimization, monitoring, and logging tools
- Familiarity with DevOps practices, Infrastructure as Code, and tools like Jenkins, Terraform, Git, and GitHub Actions
- Exposure to modern architectural patterns such as microservices, serverless computing, and event-driven architecture

Posted 3 weeks ago

Apply

10.0 - 14.0 years

12 - 16 Lacs

Bengaluru

Work from Office

Responsibilities: We are seeking a highly skilled and experienced Senior DevOps Engineer to join our dynamic team. As a Senior DevOps Engineer, you will be responsible for designing, implementing, and maintaining scalable infrastructure, CI/CD pipelines, and automation processes. You will work closely with development teams, system administrators, and other engineers to improve the overall development lifecycle and optimize system reliability, performance, and security.

Key Responsibilities:
- Design, implement, and maintain cloud infrastructure (AWS, Azure, GCP) and automation tools to streamline deployment processes.
- Collaborate with development teams to build and improve Continuous Integration/Continuous Delivery (CI/CD) pipelines.
- Automate system configuration, deployment, monitoring, and scaling using tools such as Ansible, Chef, Puppet, or Terraform.
- Maintain and improve monitoring and alerting systems to ensure high availability and quick issue resolution.
- Troubleshoot complex issues across infrastructure, networking, and application layers.
- Ensure security best practices are followed in both infrastructure and code development, including implementing automated security scans and patches.
- Perform system updates, patches, and backups to maintain the integrity and reliability of infrastructure.
- Optimize resource utilization and cost efficiency across cloud platforms.
- Lead efforts in the design, development, and implementation of disaster-recovery strategies.
- Mentor junior engineers and help improve team skills and productivity through code reviews, knowledge sharing, and training.
- Stay updated with emerging technologies, best practices, and industry trends related to DevOps, cloud infrastructure, and CI/CD.

Must-Have Skills:
- At least 10 years of overall experience in the role.
- More than 6 years of hands-on experience with Azure Cloud, Ansible, and Terraform scripting.
- At least 4 years of hands-on experience deploying, managing, and upgrading Kubernetes.
- Cloud infrastructure: in-depth experience with Azure is mandatory (AWS and GCP are a plus). Familiarity with cloud services like EC2, S3, Lambda, RDS, Kubernetes, VPC, IAM, etc. Expertise in cloud networking, security, and cost-optimization strategies.
- CI/CD & automation: strong experience with CI/CD tools such as Jenkins, GitLab CI, CircleCI, or Travis CI; ability to implement and maintain end-to-end automation for build, test, and deployment pipelines; experience with infrastructure-as-code tools (e.g., Terraform, AWS CloudFormation, or Azure Resource Manager); familiarity with containerization (Docker) and container orchestration (Kubernetes, OpenShift).
- Version control and scripting: advanced proficiency in Git and Git workflows (branching, pull requests, rebasing); experience with scripting languages like Python, Bash, Ruby, or PowerShell; ability to write modular, reusable scripts to automate tasks.
- Monitoring & logging: experience with monitoring tools like Prometheus, Grafana, Nagios, Datadog, or New Relic; knowledge of log-aggregation tools (e.g., the ELK stack, Splunk, Fluentd); strong understanding of setting up alerting and proactive issue detection.
- Containerization and orchestration: expertise in Docker and managing containerized applications; experience with Kubernetes or other orchestration platforms (e.g., OpenShift, ECS, EKS); familiarity with Helm charts for Kubernetes deployments.
- Infrastructure management & configuration: proficiency in Ansible, Chef, Puppet, SaltStack, or similar configuration-management tools; experience managing servers, clusters, and networking configurations.
- Security: knowledge of security best practices, vulnerability management, and compliance (e.g., SOC 2, HIPAA, PCI-DSS); experience with IAM, encryption, key management, and security auditing; ability to implement secure deployment practices (e.g., secrets management with Vault or AWS Secrets Manager).
- Networking & performance: strong understanding of networking fundamentals (DNS, load balancing, HTTP/S, VPNs, proxies); experience with performance tuning, network optimization, and troubleshooting.
- Database management & backup: experience managing and automating database backups and restores; familiarity with MySQL, PostgreSQL, MongoDB, and cloud-native databases (e.g., RDS, Cosmos DB).
- Collaboration & communication: strong interpersonal and communication skills; experience working in Agile/Scrum teams and collaborating with cross-functional teams; ability to document processes, procedures, and architecture designs clearly.
- Soft skills: strong problem-solving and troubleshooting abilities; self-motivated and able to work independently with minimal supervision; excellent time management and ability to prioritize tasks effectively.

Nice to Have: experience with other clouds, support experience, Python, React, and strong experience in Agile and Scrum.
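Infrastructure-as-code tools like the Terraform and Azure Resource Manager workflows above work by diffing desired state against deployed state and emitting a change plan. A minimal, hypothetical sketch of that core idea (resource names and attributes are invented):

```python
# Hypothetical sketch of the "plan" step at the heart of IaC tools:
# compare desired resource definitions against currently deployed state
# and classify each resource as create, update, delete, or no-op.

def plan(desired, actual):
    actions = {}
    for name, spec in desired.items():
        if name not in actual:
            actions[name] = "create"
        elif actual[name] != spec:
            actions[name] = "update"
        else:
            actions[name] = "no-op"
    for name in actual:
        if name not in desired:
            actions[name] = "delete"
    return actions

desired = {"web": {"size": "t3.small"}, "db": {"engine": "postgres"}}
actual = {"web": {"size": "t3.micro"}, "cache": {"engine": "redis"}}
changes = plan(desired, actual)
```

Real tools add dependency ordering and in-place vs. replace decisions on top of this diff, but the plan/apply split is the same.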

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies