Jobs
Interviews

18 Boto3 Jobs

Set up a Job Alert
JobPe aggregates listings for easy access, but you apply directly on the original job portal.

0.0 - 5.0 years

4 - 9 Lacs

Chennai

Remote

Coordinating with development teams to determine application requirements. Writing scalable code using the Python programming language. Testing and debugging applications. Developing back-end components. Required Candidate profile: Knowledge of Python and related frameworks including Django and Flask. A deep understanding of multi-process architecture and the threading limitations of Python. Perks and benefits: Flexible Work Arrangements.

Posted 3 days ago

Apply

5.0 - 8.0 years

14 - 22 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Hiring for a top IT company. Designation: Python Developer. Skills: Python + PySpark. Location: Bangalore/Mumbai. Experience: 5-8 yrs. Best CTC. Contact: 9783460933, 9549198246, 9982845569, 7665831761, 6377522517, 7240017049. Team Converse

Posted 3 days ago

Apply

4.0 - 8.0 years

15 - 25 Lacs

Bengaluru

Work from Office

Role: DevOps/SRE Engineer with Python

We are looking for a talented and experienced DevOps/Site Reliability Engineer (SRE) with strong proficiency in Python to join our team at Cloud Raptor. The ideal candidate will be responsible for optimizing our company's production environment and ensuring the reliability and stability of our systems.

Key Responsibilities:
1. Collaborate with development teams to design, develop, and maintain infrastructure for our highly available and scalable applications.
2. Automate processes using Python scripting to streamline the deployment and monitoring of our applications.
3. Monitor and manage cloud infrastructure on AWS, including EC2, S3, RDS, and Lambda.
4. Implement and manage CI/CD pipelines for automated testing and deployment of applications.
5. Troubleshoot and resolve production issues, ensuring high availability and performance of our systems.
6. Collaborate with cross-functional teams to ensure security, scalability, and reliability of our infrastructure.
7. Develop and maintain documentation for system configurations, processes, and procedures.

Key Requirements:
1. Bachelor's degree in Computer Science, Engineering, or a related field.
2. 3+ years of experience in a DevOps/SRE role, with a strong focus on automation and infrastructure as code.
3. Proficiency in Python scripting for automation and infrastructure management.
4. Hands-on experience with containerization technologies such as Docker and Kubernetes.
5. Strong knowledge of cloud platforms such as AWS, including infrastructure provisioning and management.
6. Experience with monitoring and logging tools such as Prometheus, Grafana, and the ELK stack.
7. Knowledge of CI/CD tools like Jenkins or GitHub Actions.
8. Familiarity with configuration management tools such as Ansible, Puppet, or Chef.
9. Strong problem-solving and troubleshooting skills, with an ability to work in a fast-paced and dynamic environment.
10. Excellent communication and collaboration skills to work effectively with cross-functional teams.
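The monitoring and automation duties in this role map naturally onto Boto3. Below is a minimal sketch, not a prescribed implementation: the function name is invented, and the client is injected so it can be exercised without an AWS account (in production you would pass `boto3.client("ec2")`).

```python
def instance_state_summary(ec2_client):
    """Count EC2 instances by state via DescribeInstances pagination.

    `ec2_client` is injected for testability; in a real run, pass
    boto3.client("ec2", region_name=...) with credentials from the environment.
    """
    counts = {}
    paginator = ec2_client.get_paginator("describe_instances")
    for page in paginator.paginate():
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                state = instance["State"]["Name"]
                counts[state] = counts.get(state, 0) + 1
    return counts

# Against a real account (illustrative region):
#   import boto3
#   print(instance_state_summary(boto3.client("ec2", region_name="us-east-1")))
```

Injecting the client rather than constructing it inside the function keeps the pagination logic unit-testable with a stub.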

Posted 1 week ago

Apply

10.0 - 15.0 years

12 - 16 Lacs

Hyderabad

Work from Office

JD for Data Engineering Lead - Python: Data Engineering Lead with at least 7 to 10 years of experience in Python and the following AWS services: AWS SQS, AWS MSK, AWS RDS Aurora DB, Boto3, API Gateway, and CloudWatch. Provides architectural guidance to the offshore team, reviews code, and troubleshoots errors. Very strong SQL knowledge is a must; should be able to understand and build complex queries. Familiar with GitLab (repos and CI/CD pipelines). He/she should work closely with the Virtusa onshore team as well as the enterprise architect and other client teams at onsite as needed. Experience in API development using Python is a plus. Experience in building MDM solutions is a plus.
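SQS work with Boto3 usually boils down to small utilities like the sketch below. The queue URL and event shape are illustrative assumptions; the 10-entries-per-call cap on SendMessageBatch is a real SQS limit.

```python
import json


def batch_entries(events, size=10):
    """Yield SendMessageBatch entry lists; SQS allows at most 10 per call."""
    for start in range(0, len(events), size):
        chunk = events[start:start + size]
        yield [
            {"Id": str(start + i), "MessageBody": json.dumps(evt)}
            for i, evt in enumerate(chunk)
        ]


def send_events(sqs_client, queue_url, events):
    """Send events to SQS in batches; returns the number of entries sent."""
    sent = 0
    for entries in batch_entries(events):
        sqs_client.send_message_batch(QueueUrl=queue_url, Entries=entries)
        sent += len(entries)
    return sent

# In production (illustrative URL):
#   sqs = boto3.client("sqs")
#   send_events(sqs, "https://sqs.us-east-1.amazonaws.com/123456789012/events", events)
```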

Posted 2 weeks ago

Apply

6.0 - 11.0 years

8 - 13 Lacs

Hyderabad

Work from Office

JD for Data Engineer - Python: At least 5 to 8 years of experience in AWS Python programming; able to design, build, test, and deploy code. Candidate should have worked on Lambda-based API development. Should have experience using the following AWS services: AWS SQS, AWS MSK, AWS RDS Aurora DB, Boto3. Very strong SQL knowledge is a must; should be able to understand and build complex queries. He/she should work closely with the enterprise architect and other client teams at onsite as needed. Experience building solutions using Kafka would be a good value addition (optional).
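A minimal sketch of the Lambda-based API development this JD describes, using the API Gateway proxy-integration event shape. The `claimId` resource and response fields are invented for illustration; a real handler would read from Aurora or DynamoDB via Boto3.

```python
import json


def handler(event, context):
    """Lambda handler behind API Gateway (proxy integration, sketch only)."""
    claim_id = (event.get("pathParameters") or {}).get("claimId")
    if not claim_id:
        return {
            "statusCode": 400,
            "body": json.dumps({"error": "claimId required"}),
        }
    # A real deployment would look the claim up in a database here.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"claimId": claim_id, "status": "RECEIVED"}),
    }
```

API Gateway's proxy integration passes path parameters under `pathParameters` and expects a dict with `statusCode` and a string `body` back, which is what the sketch returns.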

Posted 2 weeks ago

Apply

10.0 - 15.0 years

6 - 14 Lacs

Bengaluru

Work from Office

AsyncIO, FastAPI, Tornado, Flask, gRPC, Netmiko, Boto3, Pandas, Celery, Pytest

Posted 2 weeks ago

Apply

4.0 - 7.0 years

12 - 16 Lacs

Bengaluru

Work from Office

As a Senior Cloud Platform Back-End Engineer with a strong background in AWS tools and services, you will join the Data & AI Solutions - Engineering team in our Healthcare R&D business. Your expertise will enhance the development and continuous improvement of a critical AWS-Cloud-based analytics platform, supporting our R&D efforts in drug discovery. This role involves implementing the technical roadmap and maintaining existing functionalities. You will adapt to evolving technologies, manage infrastructure and security, design and implement new features, and oversee seamless deployment of updates. Additionally, you will implement strategies for data archival and optimize data lifecycle processes for efficient storage management in compliance with regulations. Join a multicultural team working in agile methodologies with high autonomy. The role requires office presence at our Bangalore location.

Who You Are:
- University degree in Computer Science, Engineering, or a related field
- Proficiency in Python, especially the boto3 library for interacting with AWS services programmatically, and infrastructure as code with AWS CDK and AWS Lambdas
- Experience with API development and management: designing, developing, and managing APIs using AWS API Gateway and other relevant API frameworks
- Strong understanding of AWS security best practices, IAM policies, encryption, auditing, and regulatory compliance (e.g. GDPR)
- Experience with application performance monitoring and tracing solutions such as AWS CloudWatch, X-Ray, and OpenTelemetry
- Proficiency in navigating and utilizing various AWS tools and services
- System design skills in a cloud environment
- Experience with SQL and data integration into Snowflake
- Familiarity with Microsoft Entra ID for identity and access management
- Willingness to work in a multinational environment and cross-functional teams distributed between the US, Europe (mostly Germany), and India
- Sense of accountability and ownership; fast learner
- Fluency in English and excellent communication skills
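The data-archival and lifecycle duties this posting mentions are typically expressed as S3 lifecycle rules, which Boto3 can apply programmatically. A sketch under stated assumptions: the prefix, retention periods, and bucket name are illustrative, and real retention values would come from the applicable regulations.

```python
def archival_lifecycle_rule(prefix, glacier_after_days=90, expire_after_days=3650):
    """Build one S3 lifecycle rule: transition to Glacier, then expire.

    The day counts are placeholders, not a compliance recommendation.
    """
    return {
        "ID": f"archive-{prefix.strip('/')}",
        "Filter": {"Prefix": prefix},
        "Status": "Enabled",
        "Transitions": [{"Days": glacier_after_days, "StorageClass": "GLACIER"}],
        "Expiration": {"Days": expire_after_days},
    }


def apply_lifecycle(s3_client, bucket, rules):
    """Apply lifecycle rules to a bucket; s3_client is boto3.client("s3")."""
    s3_client.put_bucket_lifecycle_configuration(
        Bucket=bucket,
        LifecycleConfiguration={"Rules": rules},
    )

# In production (illustrative bucket name):
#   apply_lifecycle(boto3.client("s3"), "analytics-raw",
#                   [archival_lifecycle_rule("raw/")])
```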

Posted 3 weeks ago

Apply

5.0 - 7.0 years

27 - 30 Lacs

Hyderabad, Chennai

Work from Office

Experience required: 7+ years

Core Generative AI & LLM Skills:
* 5+ years in Software Engineering, 1+ year in Generative AI.
* Strong understanding of LLMs, prompt engineering, and RAG.
* Experience with multi-agent system design (planning, delegation, feedback).
* Hands-on with LangChain (tools, memory, callbacks) and LangGraph (multi-agent orchestration).
* Proficient in using vector DBs (OpenSearch, Pinecone, FAISS, Weaviate).
* Skilled in Amazon Bedrock and integrating LLMs like Claude, Titan, Llama.
* Strong Python (LangChain, LangGraph, FastAPI, boto3).
* Experience building MCP servers/tools.
* Designed robust APIs, integrated external tools with agents.
* AWS proficiency: Lambda, API Gateway, DynamoDB, S3, Neptune, Bedrock Agents.
* Knowledge of data privacy, output filtering, audit logging.
* Familiar with AWS IAM, VPCs, and KMS encryption.

Desired Skills:
* Integration with Confluence, CRMs, knowledge bases, etc.
* Observability with Langfuse, OpenTelemetry, Prompt Catalog.
* Understanding of model alignment & bias mitigation.
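Integrating a Bedrock-hosted Claude model through boto3 looks roughly like the sketch below. The model id and token limit are assumptions, the request body follows the Anthropic messages format that Bedrock expects, and the runtime client is injected (in production, `boto3.client("bedrock-runtime")`).

```python
import json


def ask_model(bedrock_runtime, prompt,
              model_id="anthropic.claude-3-haiku-20240307-v1:0"):
    """Send one user message to a Claude model on Bedrock (model id assumed)."""
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",  # required by Anthropic models
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    })
    resp = bedrock_runtime.invoke_model(modelId=model_id, body=body)
    payload = json.loads(resp["body"].read())  # "body" is a streaming object
    return payload["content"][0]["text"]

# In production:
#   ask_model(boto3.client("bedrock-runtime"), "Summarize this ticket: ...")
```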

Posted 3 weeks ago

Apply

6.0 - 9.0 years

14 - 22 Lacs

Pune, Chennai

Work from Office

Hiring for a top IT company. Designation: Python Developer. Skills: AWS SDK + AI services integration. Location: Pune/Chennai. Experience: 6-8 yrs. Best CTC. Contact: Surbhi 9887580624, Anchal 9772061749, Gitika 8696868124, Shivani 7375861257. Team Converse

Posted 3 weeks ago

Apply

2.0 - 6.0 years

2 - 6 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site

Roles and Responsibilities: The engineer is expected to build and support solutions that pre-process paper image claims to extract data, build pipelines using serverless solutions, and invoke AI/ML processes to populate claim data from the submitted claims. The engineer will also work on building metrics, monitoring, and operational dashboards.

Required Skills: Strong hands-on experience with Python, Boto3, and test-driven development techniques such as unit testing and gameday testing. Hands-on experience in writing unit tests with Python. Hands-on experience with common AWS services such as Lambda, Step Functions, DynamoDB, S3, and CloudWatch. Experience in deploying applications to development and test environments. Enters an existing team and learns rapidly about the overall goals of the solution. Collaborates with the rest of the team to explore paths toward the overall goals. Participates in peer reviews and deployments. Executes; understands that work is not complete until it is implemented. When analysis is complete and decisions have been made, the work has only just begun. Embraces an agile mindset to adjust to best achieve the overall goals; is not locked into initial decisions. At the same time, develops plans in advance to find a healthy balance of preparedness and flexibility, as appropriate for each situation's needs. Rapidly raises defects, and reflects on where prior judgment was incorrect in the spirit of growth. Good news travels fast, bad news faster. Addresses the mistakes of others in the spirit of learning and growth. Models these behaviors in the team retrospective.

Posted 1 month ago

Apply

8.0 - 13.0 years

9 - 14 Lacs

Bengaluru

Work from Office

8+ years of combined experience in backend and data platform engineering roles. Worked on large-scale distributed systems. 5+ years of experience building data platforms with (one of) Apache Spark, Flink, or similar frameworks. 7+ years of experience programming with Java. Experience building large-scale data/event pipelines. Experience with relational SQL and NoSQL databases, including Postgres/MySQL, Cassandra, MongoDB. Demonstrated experience with EKS, EMR, S3, IAM, KDA, Athena, Lambda, Networking, ElastiCache, and other AWS services.

Posted 1 month ago

Apply

8.0 - 11.0 years

7 - 11 Lacs

Hyderabad

Work from Office

HIH - Software Engineering Associate Advisor

Position Overview: The successful candidate will be a member of our US Medical Integration Solutions ETL team. They will play a major role in the design and development of the ETL application in support of various portfolio projects.

Responsibilities: Analyze business requirements and translate them into ETL architecture and data rules. Serve as advisor and subject matter expert on project teams. Manage both employees and consultants on multiple ETL projects. Oversee and review all design and coding from developers to ensure they follow company standards and best practices, as well as architectural direction. Assist in data analysis and metadata management. Test planning and execution. Effectively operate within a team of technical and business professionals. Assess new talent and mentor direct reports on best practices. Review all designs and code from developers.

Qualifications - Desired Skills & Experience: 8-11 years of experience in Java and Python, PySpark to support new development as well as existing systems. 7+ years of experience with cloud technologies, specifically AWS. Experience in AWS services such as Lambda, Glue, S3, MWAA, API Gateway and Route 53, DynamoDB, RDS MySQL, SQS, CloudWatch, Secrets Manager, KMS, IAM, EC2 and Auto Scaling Groups, VPC and Security Groups. Experience with Boto3, Pandas, and Terraform for building infrastructure as code. Experience with the IBM DataStage ETL tool. Experience with CI/CD methodologies and processes and the development of these processes. DevOps experience. Knowledge of writing SQL. Data mapping: source to target, target to multiple formats. Experience in the development of data extraction and load processes in a parallel framework. Understanding of normalized and de-normalized data repositories. Ability to define ETL standards and processes. SQL standards/processes/tools: mapping of data sources; ETL development, monitoring, reporting, and metrics; a focus on data quality. Experience with DB2/z/OS, Oracle, SQL Server, Teradata, and other database environments. Unix experience. Excellent problem-solving and organizational skills. Strong teamwork and interpersonal skills and ability to communicate with all management levels. Leads others toward technical accomplishments and collaborative project team efforts. Very strong communication skills, both verbal and written, including technical writing. Strong analytical and conceptual skills.

About Evernorth Health Services: Evernorth Health Services, a division of The Cigna Group, creates pharmacy, care and benefit solutions to improve health and increase vitality. We relentlessly innovate to make the prediction, prevention and treatment of illness and disease more accessible to millions of people. Join us in driving growth and improving lives.

Posted 1 month ago

Apply

4.0 - 8.0 years

9 - 14 Lacs

Bengaluru

Work from Office

The Cigna International Health unit uses Amazon Web Services (AWS) services and custom, proprietary solutions implemented in AWS to pre-process paper health care claims from around the world. The daily volume is expected to grow significantly, as expansion of the initiative is one of the top priorities for International Health. The engineer is expected to build and support solutions that pre-process the paper image claims to extract data, build pipelines using serverless solutions, and invoke AI/ML processes to populate claim data from the submitted claims. The engineer will also work on building metrics, monitoring, and operational dashboards.

Required Skills: Strong hands-on experience with Python, Boto3, and test-driven development techniques such as unit testing and gameday testing. Hands-on experience in writing unit tests with Python. Hands-on experience with common AWS services such as Lambda, Step Functions, DynamoDB, S3, and CloudWatch. Experience in deploying applications to development and test environments. Enters an existing team and learns rapidly about the overall goals of the solution. Collaborates with the rest of the team to explore paths toward the overall goals. Participates in peer reviews and deployments. Executes; understands that work is not complete until it is implemented. When analysis is complete and decisions have been made, the work has only just begun. Embraces an agile mindset to adjust to best achieve the overall goals; is not locked into initial decisions. At the same time, develops plans in advance to find a healthy balance of preparedness and flexibility, as appropriate for each situation's needs. Rapidly raises defects, and reflects on where prior judgment was incorrect in the spirit of growth. Good news travels fast, bad news faster. Addresses the mistakes of others in the spirit of learning and growth. Models these behaviors in the team retrospective.

About The Cigna Group: Cigna Healthcare, a division of The Cigna Group, is an advocate for better health through every stage of life. We guide our customers through the health care system, empowering them with the information and insight they need to make the best choices for improving their health and vitality. Join us in driving growth and improving lives.
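A serverless claim-intake pipeline like the one described is commonly wired as S3 upload event → Lambda → Step Functions. A sketch with an injected Step Functions client; the state-machine ARN and event layout are the standard S3 notification shape, while the claim naming is illustrative.

```python
import json


def start_claim_pipeline(sfn_client, state_machine_arn, event):
    """Start one Step Functions execution per uploaded claim image.

    `event` is an S3 ObjectCreated notification; `sfn_client` is
    boto3.client("stepfunctions") in production. Returns execution ARNs.
    """
    started = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = record["s3"]["object"]["key"]
        resp = sfn_client.start_execution(
            stateMachineArn=state_machine_arn,
            input=json.dumps({"bucket": bucket, "key": key}),
        )
        started.append(resp["executionArn"])
    return started
```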

Posted 1 month ago

Apply

1.0 - 5.0 years

9 - 13 Lacs

Bengaluru

Work from Office

Minimum Qualifications:
- BA/BSc/B.E./BTech degree from a Tier I or II college in Computer Science, Statistics, Mathematics, Economics, or related fields
- 1 to 4 years of experience working with data and conducting statistical and/or numerical analysis
- Strong understanding of how data can be stored and accessed in different structures
- Experience with writing computer programs to solve problems
- Strong understanding of data operations such as sub-setting, sorting, merging, aggregating, and CRUD operations
- Ability to write SQL code and familiarity with R/Python, Linux shell commands
- Willing and able to quickly learn about new businesses, database technologies, and analysis techniques
- Ability to tell a good story and support it with numbers and visuals
- Strong oral and written communication

Preferred Qualifications:
- Experience working with large datasets
- Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3)
- Experience building analytics applications leveraging R, Python, Tableau, Looker, or other tools
- Experience in geo-spatial analysis with PostGIS, QGIS
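The AWS analytics stack listed here (S3, Athena, Boto3) is usually driven by a start-and-poll loop, since Athena queries are asynchronous. A sketch; the database, output location, and function name are assumptions, and the client is injected for testability.

```python
import time


def run_athena_query(athena_client, sql, database, output_s3, poll_seconds=2):
    """Start an Athena query and poll until it reaches a terminal state.

    Returns (query_execution_id, final_state). `athena_client` is
    boto3.client("athena") in production; results land in `output_s3`.
    """
    qid = athena_client.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output_s3},
    )["QueryExecutionId"]
    while True:
        state = athena_client.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return qid, state
        time.sleep(poll_seconds)

# In production (illustrative locations):
#   run_athena_query(boto3.client("athena"),
#                    "SELECT count(*) FROM trips", "analytics",
#                    "s3://my-athena-results/")
```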

Posted 1 month ago

Apply

4.0 - 9.0 years

10 - 14 Lacs

Bengaluru

Work from Office

Minimum Qualifications:
- BA/BSc/B.E./BTech degree from a Tier I or II college in Computer Science, Statistics, Mathematics, Economics, or related fields
- 1 to 4 years of experience working with data and conducting statistical and/or numerical analysis
- Strong understanding of how data can be stored and accessed in different structures
- Experience with writing computer programs to solve problems
- Strong understanding of data operations such as sub-setting, sorting, merging, aggregating, and CRUD operations
- Ability to write SQL code and familiarity with R/Python, Linux shell commands
- Willing and able to quickly learn about new businesses, database technologies, and analysis techniques
- Ability to tell a good story and support it with numbers and visuals
- Strong oral and written communication

Preferred Qualifications:
- Experience working with large datasets
- Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3)
- Experience building analytics applications leveraging R, Python, Tableau, Looker, or other tools
- Experience in geo-spatial analysis with PostGIS, QGIS

Posted 1 month ago

Apply

3.0 - 8.0 years

9 - 18 Lacs

Hyderabad

Hybrid

Data Engineer with Python development experience
Experience: 3+ years
Mode: Hybrid (2-3 days/week)
Location: Hyderabad

Key Responsibilities: Develop, test, and deploy data processing pipelines using AWS serverless technologies such as AWS Lambda, Step Functions, DynamoDB, and S3. Implement ETL processes to transform and process structured and unstructured data efficiently. Collaborate with business analysts and other developers to understand requirements and deliver solutions that meet business needs. Write clean, maintainable, and well-documented code following best practices. Monitor and optimize the performance and cost of serverless applications. Ensure high availability and reliability of the pipeline through proper design and error-handling mechanisms. Troubleshoot and debug issues in serverless applications and data workflows. Stay up-to-date with emerging technologies in the AWS and serverless ecosystem to recommend improvements.

Required Skills and Experience: 3-5 years of hands-on Python development experience, including experience with libraries like boto3, Pandas, or similar tools for data processing. Strong knowledge of AWS services, especially Lambda, S3, DynamoDB, Step Functions, SNS, SQS, and API Gateway. Experience building data pipelines or workflows to process and transform large datasets. Familiarity with serverless architecture and event-driven programming. Knowledge of best practices for designing secure and scalable serverless applications. Proficiency in version control systems (e.g., Git) and collaboration tools. Understanding of CI/CD pipelines and DevOps practices. Strong debugging and problem-solving skills. Familiarity with database systems, both SQL (e.g., RDS) and NoSQL (e.g., DynamoDB).

Preferred Qualifications: AWS certifications (e.g., AWS Certified Developer Associate or AWS Certified Solutions Architect Associate). Familiarity with testing frameworks (e.g., pytest) and ensuring test coverage for Python applications. Experience with Infrastructure as Code (IaC) tools such as AWS CDK, CloudFormation. Knowledge of monitoring and logging tools.
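The Lambda → DynamoDB leg of such a serverless ETL pipeline often looks like the sketch below, built around the Table resource's `batch_writer`, which batches writes automatically. The item schema (`pk`, `payload`) and the skip-malformed-records policy are illustrative assumptions.

```python
def load_records(table, raw_records):
    """Transform raw events and load them into DynamoDB.

    `table` is a boto3 DynamoDB Table resource, e.g.
    boto3.resource("dynamodb").Table("events") in production.
    Returns the number of records written; records without an "id" are
    skipped rather than failing the whole batch (an assumed policy).
    """
    loaded = 0
    with table.batch_writer() as batch:  # flushes in chunks of up to 25
        for rec in raw_records:
            if "id" not in rec:
                continue
            batch.put_item(Item={"pk": rec["id"], "payload": rec})
            loaded += 1
    return loaded
```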

Posted 1 month ago

Apply

5 - 8 years

8 - 10 Lacs

Bengaluru

Work from Office

Mandatory Skill Set: Devops, AWS, Python Scripting, Kubernetes, CI/CD Pipelines, Automation, Ad-hoc & Post Production Support, Boto3

Posted 2 months ago

Apply

4 - 9 years

12 - 16 Lacs

Hyderabad, Pune, Bengaluru

Work from Office

Role & responsibilities: Urgent hiring for a reputed MNC. Immediate joiners; female candidates only. Experience: 4-9 years. Bangalore/Hyderabad/Pune.

As a Python Developer with AWS, you will be responsible for developing cloud-based applications, building data pipelines, and integrating with various AWS services. You will work closely with DevOps, Data Engineering, and Product teams to design and deploy solutions that are scalable, resilient, and efficient in an AWS cloud environment.

Key Responsibilities:
Python Development: Design, develop, and maintain applications and services using Python in a cloud environment.
AWS Cloud Services: Leverage AWS services such as EC2, S3, Lambda, RDS, DynamoDB, and API Gateway to build scalable solutions.
Data Pipelines: Develop and maintain data pipelines, including integrating data from various sources into AWS-based storage solutions.
API Integration: Design and integrate RESTful APIs for application communication and data exchange.
Cloud Optimization: Monitor and optimize cloud resources for cost efficiency, performance, and security.
Automation: Automate workflows and deployment processes using AWS Lambda, CloudFormation, and other automation tools.
Security & Compliance: Implement security best practices (e.g., IAM roles, encryption) to protect data and maintain compliance within the cloud environment.
Collaboration: Work with DevOps, Cloud Engineers, and other developers to ensure seamless deployment and integration of applications.
Continuous Improvement: Participate in the continuous improvement of development processes and deployment practices.

Required Qualifications:
Python Expertise: Strong experience in Python programming, including using libraries like Pandas, NumPy, Boto3 (the AWS SDK for Python), and frameworks like Flask or Django.
AWS Knowledge: Hands-on experience with AWS services such as S3, EC2, Lambda, RDS, DynamoDB, CloudFormation, and API Gateway.
Cloud Infrastructure: Experience in designing, deploying, and maintaining cloud-based applications using AWS.
API Development: Experience in designing and developing RESTful APIs, integrating with external services, and managing data exchanges.
Automation & Scripting: Experience with automation tools and scripts (e.g., using AWS Lambda, Boto3, CloudFormation).
Version Control: Proficiency with version control tools such as Git.
CI/CD Pipelines: Experience building and maintaining CI/CD pipelines for cloud-based applications.

Preferred candidate profile:
Familiarity with serverless architectures using AWS Lambda and other AWS serverless services.
AWS certification (e.g., AWS Certified Developer Associate, AWS Certified Solutions Architect Associate) is a plus.
Knowledge of containerization tools like Docker and orchestration platforms such as Kubernetes.
Experience with Infrastructure as Code (IaC) tools such as Terraform or AWS CloudFormation.
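The S3 integration work such roles involve often reduces to upload-then-share helpers like this sketch. The bucket, key, and expiry are illustrative assumptions; the client is injected so the call flow can be verified without AWS access (pass `boto3.client("s3")` in production).

```python
def upload_and_share(s3_client, bucket, key, local_path, expires_in=3600):
    """Upload a local file to S3 and return a time-limited download URL.

    `expires_in` is in seconds; presigned URLs inherit the permissions of
    the credentials that signed them.
    """
    s3_client.upload_file(local_path, bucket, key)
    return s3_client.generate_presigned_url(
        "get_object",
        Params={"Bucket": bucket, "Key": key},
        ExpiresIn=expires_in,
    )

# In production (illustrative names):
#   url = upload_and_share(boto3.client("s3"), "reports", "q1.pdf", "/tmp/q1.pdf")
```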

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot


Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.
