271 S3 Jobs - Page 3

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

6.0 - 9.0 years

14 - 22 Lacs

Pune, Chennai

Work from Office


Hiring for a top IT company. Designation: Python Developer. Skills: AWS SDK + AI services integration. Location: Pune/Chennai. Experience: 6-8 years. Best CTC. Contacts: Surbhi: 9887580624, Anchal: 9772061749, Gitika: 8696868124, Shivani: 7375861257. Team Converse.
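The role pairs Python with the AWS SDK and AI-service integration. As a rough, hedged illustration of that combination (the service choice, bucket, key, and region are assumptions, not details from the ad), a boto3 sketch chaining S3 and Amazon Comprehend might look like this:

```python
# Hypothetical sketch: read a document from S3 and run it through an AWS
# AI service (Amazon Comprehend) via the boto3 SDK. The bucket, key, and
# region are placeholders, not details from the posting.
import boto3

s3 = boto3.client("s3", region_name="ap-south-1")
comprehend = boto3.client("comprehend", region_name="ap-south-1")

def analyze_document(bucket: str, key: str) -> dict:
    """Fetch a UTF-8 text object from S3 and return its sentiment scores."""
    body = s3.get_object(Bucket=bucket, Key=key)["Body"].read().decode("utf-8")
    # Comprehend's synchronous API limits input size (about 5 KB of text),
    # so truncate for the demo.
    return comprehend.detect_sentiment(Text=body[:5000], LanguageCode="en")

if __name__ == "__main__":
    print(analyze_document("example-bucket", "reviews/sample.txt"))
```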

Posted 1 week ago

Apply

9.0 - 13.0 years

32 - 40 Lacs

Ahmedabad

Remote


About the Role: We are looking for a hands-on AWS Data Architect or Lead Engineer to design and implement scalable, secure, and high-performing data solutions. This is an individual contributor role where you will work closely with data engineers, analysts, and stakeholders to build modern, cloud-native data architectures across real-time and batch pipelines.
Experience: 7-15 Years
Location: Fully Remote
Company: Armakuni India
Key Responsibilities:
Data Architecture Design: Develop and maintain a comprehensive data architecture strategy that aligns with the business objectives and technology landscape.
Data Modeling: Create and manage logical, physical, and conceptual data models to support various business applications and analytics.
Database Design: Design and implement database solutions, including data warehouses, data lakes, and operational databases.
Data Integration: Oversee the integration of data from disparate sources into unified, accessible systems using ETL/ELT processes.
Data Governance: Implement and enforce data governance policies and procedures to ensure data quality, consistency, and security.
Technology Evaluation: Evaluate and recommend data management tools, technologies, and best practices to improve data infrastructure and processes.
Collaboration: Work closely with data engineers, data scientists, business analysts, and other stakeholders to understand data requirements and deliver effective solutions.
Documentation: Create and maintain documentation related to data architecture, data flows, data dictionaries, and system interfaces.
Performance Tuning: Optimize database performance through tuning, indexing, and query optimization.
Security: Ensure data security and privacy by implementing best practices for data encryption, access controls, and compliance with relevant regulations (e.g., GDPR, CCPA).
Required Skills:
Helping project teams with solutions architecture, troubleshooting, and technical implementation assistance.
Proficiency in SQL and database management systems (e.g., MySQL, PostgreSQL, Oracle, SQL Server).
Minimum 7 to 15 years of experience in data architecture or related roles.
Experience with big data technologies (e.g., Hadoop, Spark, Kafka, Airflow).
Expertise with cloud platforms (e.g., AWS, Azure, Google Cloud) and their data services.
Knowledge of data integration tools (e.g., Informatica, Talend, Fivetran, Meltano).
Understanding of data warehousing concepts and tools (e.g., Snowflake, Redshift, Synapse, BigQuery).
Experience with data governance frameworks and tools.
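The responsibilities above centre on landing data into partitioned lakes for warehouse-style querying. A minimal sketch of one such step, assuming the AWS SDK for pandas (awswrangler) and invented bucket, database, and table names:

```python
# Illustrative only: landing a batch extract into a partitioned S3 data
# lake with the AWS SDK for pandas (awswrangler). Paths, database, and
# table names are invented for the example.
import awswrangler as wr
import pandas as pd

df = pd.DataFrame(
    {"order_id": [1, 2], "amount": [120.5, 80.0], "dt": ["2025-06-01", "2025-06-02"]}
)

# dataset=True writes Hive-style dt=... partitions and can register the
# table in the Glue Data Catalog for Athena/Redshift Spectrum queries.
wr.s3.to_parquet(
    df=df,
    path="s3://example-lake/sales/orders/",
    dataset=True,
    partition_cols=["dt"],
    database="analytics",  # assumes this Glue database already exists
    table="orders",
)
```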

Posted 1 week ago

Apply

6.0 - 10.0 years

25 - 30 Lacs

Noida, Remote

Work from Office


Job Title: Full Stack Software Developer
Experience Required: 6+ Years
Location: [Noida / Remote]
Employment Type: Full-Time
Job Summary: We are seeking a talented and motivated Full Stack Software Developer with 6+ years of experience to join our dynamic team. The ideal candidate is highly skilled in React and Node.js; a solid grasp of GraphQL and AWS is a significant advantage. You will be instrumental in designing, developing, and maintaining scalable, efficient, and user-centric applications across the entire technology stack.
Key Responsibilities:
Design & Development: Build, deploy, and maintain robust front-end and back-end applications using React and Node.js.
API Integration: Create and consume RESTful and GraphQL APIs to support dynamic client-server interactions.
System Architecture: Contribute to the design of scalable and maintainable software systems.
Cloud Integration: Leverage AWS services (e.g., Lambda, S3, EC2) to host and scale applications efficiently.
Collaboration: Work closely with cross-functional teams including product managers, designers, and other developers.
Code Quality: Maintain clean, testable, and maintainable code following best practices.
Troubleshooting: Diagnose and resolve issues across the stack to ensure high performance and reliability.
Skills and Qualifications
Required:
Strong proficiency in JavaScript/TypeScript, React, and Node.js.
Solid understanding of front-end development concepts (state management, component lifecycle, performance tuning).
Experience working with REST and/or GraphQL APIs.
Familiarity with relational databases like PostgreSQL or similar.
Excellent problem-solving abilities and experience in Agile development environments.
Preferred:
Hands-on experience with GraphQL and tools like Apollo.
Working knowledge of AWS services such as EC2, S3, Lambda, API Gateway, and DynamoDB.
Experience with CI/CD tools (e.g., GitHub Actions, Jenkins).
Understanding of automated testing using frameworks like Jest and Cypress.

Posted 1 week ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Chennai, Bengaluru

Work from Office


Role & responsibilities
• Application Development: Design, develop, and maintain dealership applications using Java, Spring Boot, and Microservices architecture.
• Cloud Deployment: Build and manage cloud-native applications on AWS, leveraging services such as Lambda, ECS, RDS, S3, DynamoDB, and API Gateway.
• System Architecture: Develop and maintain scalable, high-availability architectures to support large-scale dealership operations.
• Database Management: Work with relational and NoSQL databases like PostgreSQL and DynamoDB for efficient data storage and retrieval.
• Integration & APIs: Develop and manage RESTful APIs and integrate with third-party services, including OEMs and dealership management systems (DMS).
• DevOps & CI/CD: Implement CI/CD pipelines using tools like Jenkins, GitHub Actions, or AWS CodePipeline for automated testing and deployments.
• Security & Compliance: Ensure application security, data privacy, and compliance with industry regulations.
• Collaboration & Mentorship: Work closely with product managers, designers, and other engineers. Mentor junior developers to maintain best coding practices.
Technical Skills Required
• Programming Languages: Java (8+), Spring Boot, Hibernate
• Cloud Platforms: AWS (Lambda, S3, EC2, ECS, RDS, DynamoDB, CloudFormation)
• Microservices & API Development: RESTful APIs, API Gateway
• Database Management: PostgreSQL, DynamoDB
• DevOps & Automation: Docker, Kubernetes, Terraform, CI/CD pipelines
• Testing & Monitoring: JUnit, Mockito, Prometheus, CloudWatch
• Version Control & Collaboration: Git, GitHub, Jira, Confluence
Nice-to-Have Skills
• Experience with serverless architectures (AWS Lambda, Step Functions)
• Exposure to event-driven architectures (Kafka, SNS, SQS)
Qualifications
• Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field
• 5+ years of hands-on experience in Java-based backend development
• Strong problem-solving skills with a focus on scalability and performance.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office


The Core AI, BI & Data Platforms Team has been established to create, operate, and run the Enterprise AI, BI, and Data platforms that reduce time to market for reporting, analytics, and data science teams to run experiments, train models, and generate insights, as well as to evolve and run the CoCounsel application and its shared capability, the CoCounsel AI Assistant. The Enterprise Data Platform aims to provide self-service capabilities for fast and secure ingestion and consumption of data across TR. At Thomson Reuters, we are recruiting a team of motivated Cloud professionals to transform how we build, manage, and leverage our data assets. The Data Platform team in Bangalore is seeking an experienced Software Engineer with a passion for engineering cloud-based data platform systems. Join our dynamic team as a Software Engineer and take a pivotal role in shaping the future of our Enterprise Data Platform. You will develop and implement data processing applications and frameworks on cloud-based infrastructure, ensuring the efficiency, scalability, and reliability of our systems.
About the Role
In this opportunity as the Software Engineer, you will:
Develop data processing applications and frameworks on cloud-based infrastructure in partnership with Data Analysts and Architects, with guidance from the Lead Software Engineer.
Innovate with new approaches to meet data management requirements.
Make recommendations about platform adoption, including technology integrations, application servers, libraries, and AWS frameworks, documentation, and usability by stakeholders.
Contribute to improving the customer experience.
Participate in code reviews to maintain a high-quality codebase.
Collaborate with cross-functional teams to define, design, and ship new features.
Work closely with product owners, designers, and other developers to understand requirements and deliver solutions.
Effectively communicate and liaise across the data platform and management teams.
Stay updated on emerging trends and technologies in cloud computing.
About You
You're a fit for the role of Software Engineer if you meet all or most of these criteria:
Bachelor's degree in Computer Science, Engineering, or a related field.
3+ years of relevant experience in implementation of data lakes and data management technologies for large-scale organizations.
Experience in building and maintaining data pipelines with excellent run-time characteristics such as low latency, fault tolerance, and high availability.
Proficiency in the Python programming language.
Experience in AWS services and management, including serverless, container, queueing, and monitoring services such as Lambda, ECS, API Gateway, RDS, DynamoDB, Glue, S3, IAM, Step Functions, CloudWatch, SQS, and SNS.
Good knowledge of consuming and building APIs.
Business Intelligence tools such as Power BI.
Fluency in querying languages such as SQL.
Solid understanding of software development practices such as version control via Git, CI/CD, and release management.
Agile development cadence.
Good critical thinking, communication, documentation, troubleshooting, and collaborative skills.
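Among the services this posting lists, Lambda with S3 is a common pairing. A minimal sketch of a Python Lambda handler for S3 object-created events (the downstream processing is an assumption, not something stated in the ad):

```python
# Minimal sketch of the serverless pattern the posting describes: an AWS
# Lambda handler that reacts to S3 object-created events. The bucket
# wiring and what happens downstream are assumptions.
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")

def lambda_handler(event, context):
    """Log each newly created object and return how many were processed."""
    results = []
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        # Object keys arrive URL-encoded in S3 event notifications.
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])
        head = s3.head_object(Bucket=bucket, Key=key)
        results.append({"key": key, "bytes": head["ContentLength"]})
    print(json.dumps(results))
    return {"statusCode": 200, "processed": len(results)}
```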

Posted 1 week ago

Apply

4.0 - 6.0 years

7 - 9 Lacs

Bengaluru

Hybrid


We are looking for an experienced Software Engineer - Informatica with 4 to 6 years of hands-on expertise in designing, developing, and optimizing large-scale ETL solutions using Informatica PowerCenter. The ideal candidate will lead ETL projects, mentor junior developers, and ensure high-performance data integration across enterprise systems.
About The Role
In this role as Software Engineer, you will:
- Analyze business and functional requirements to design and implement scalable data integration solutions
- Understand and interpret High-Level Design (HLD) documents and convert them into detailed Low-Level Design (LLD)
- Develop robust, reusable, and optimized Informatica mappings, sessions, and workflows
- Apply mapping optimization and performance tuning techniques to ensure efficient ETL processes
- Conduct peer code reviews and suggest improvements for reliability and performance
- Prepare and execute comprehensive unit test cases and support system/integration testing
- Maintain detailed technical documentation, including LLDs, data flow diagrams, and test cases
- Build data pipelines and transformation logic in Snowflake, ensuring performance and scalability (see the sketch below)
- Develop and manage Unix shell scripts for automation, scheduling, and monitoring of ETL jobs
- Collaborate with cross-functional teams to support UAT, deployments, and production issues.
About You
You are a fit for this position if your background includes:
- 4-6 years of strong hands-on experience with Informatica PowerCenter
- Proficiency in developing and optimizing ETL mappings, workflows, and sessions
- Solid experience with performance tuning techniques and best practices in ETL processes
- Hands-on experience with Snowflake for data loading, SQL transformations, and optimization
- Strong skills in Unix/Linux scripting for job automation
- Experience in converting HLDs into LLDs and defining unit test cases
- Knowledge of data warehousing concepts, data modelling, and data quality frameworks
Good to Have
- Knowledge of the Salesforce data model and integration (via Informatica or API-based solutions)
- Exposure to AWS cloud services like S3, Glue, Redshift, Lambda, etc.
- Familiarity with relational databases such as SQL Server and PostgreSQL
- Experience with job schedulers like Control-M, ESP, or equivalent
- Agile methodology experience and tools such as JIRA, Confluence, and Git
- Knowledge of DBT (Data Build Tool) for data transformation and orchestration
- Experience with Python scripting for data manipulation, automation, or integration tasks.
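For the Snowflake bullet above, a hedged sketch using snowflake-connector-python; the account, credentials, stage, and table names are placeholders:

```python
# Hedged sketch of the Snowflake piece of this role: running a COPY INTO
# from an external stage with snowflake-connector-python. Account,
# credentials, and object names are placeholders.
import snowflake.connector

conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="ETL_USER",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="STAGING",
)
try:
    cur = conn.cursor()
    # Assumes a stage @RAW_STAGE pointing at the landing S3 prefix.
    cur.execute(
        "COPY INTO STAGING.ORDERS FROM @RAW_STAGE/orders/ "
        "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1) ON_ERROR = 'ABORT_STATEMENT'"
    )
    print(cur.fetchall())  # per-file load results
finally:
    conn.close()
```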

Posted 1 week ago

Apply

8.0 - 10.0 years

10 - 15 Lacs

Gurugram

Work from Office


The Team: Financial Risk Analytics at S&P Global provides products and solutions to financial institutions to measure and manage their counterparty credit risk, market risk, regulatory risk capital, and derivative valuation adjustments. Using the latest analytics and technology, such as a fully vectorized pricing library, Machine Learning, and a Big Data stack for scalability, our products and solutions are used by everyone from the largest tier-one banks to smaller niche firms. Our products are available deployed, in the cloud, or run as a service. We have a need for an enthusiastic and skilled Senior Python Developer who is interested in learning about quantitative analytics and perhaps looking to make a career at the intersection of Financial Analytics, Big Data, and Mathematics!
The Impact: You will be working on a strategic component that allows clients to extract, on demand, the data required for pricing and risk calculations. This is an essential entry point to a risk calculation, which requires speed to market and good design to drive efficient and robust workflows.
What's in it for you: The successful candidate will gain exposure to risk analytics and the latest trending technology, allowing you to grow into a hybrid role specializing in both financial markets and technology: a highly rewarding, challenging, and marketable position to gain skills in.
Responsibilities:
The successful candidate will work on the Market Risk solution with a best-of-breed technology stack involving Python 3.10+, Airflow, Pandas, NumPy, and ECS (AWS). You will join a fast-paced, dynamic team environment, building commercial products that are at the heart of the business and contributing directly to revenue generation.
Design and implement end-to-end applications in Python with an emphasis on efficiently writing functions on large datasets.
Interpret and analyse business use-cases and feature requests into technical designs and development tasks.
Participate in regular design and code review meetings.
Be a responsive team player in system architecture and design discussions.
Be proud of the high quality of your own work. Always follow quality standards (unit tests, integration tests, and documented code).
Happy to coach and mentor junior engineers.
Be delivery focused, have a passion for technology, and enjoy offering new ideas and approaches.
Demonstrable technical capacity in understanding technical deliveries and dependencies.
Strong experience working on software engineering projects in an Agile manner.
What We're Looking For:
Bachelor's degree in Computer Science, Engineering, or a related discipline, or equivalent experience.
Computer Science and Software Engineering: strong software development experience.
Minimum 8 years' experience in developing applications using Python, including experience with Python 3.10+.
Core Python with rich knowledge of OO methodologies and design.
Experience writing Python code that is scalable and performant.
Experience/exposure to complex data types when designing and anticipating issues that impact performance (under ETL processes), generating metrics using industry-adopted profiling tools during development.
Experience working on AWS: ECS, S3, and ideally MWAA (hosted Airflow on AWS).
Experience working in data engineering/orchestration and scalable, efficient flow design.
Experience in developing data pipelines using Airflow (see the sketch below).
Good working competency in Docker, Git, and Linux.
Good working knowledge of Pandas and NumPy.
Understanding of CI/CD pipelines and test frameworks.
Agile and XP (Scrum, Kanban, TDD).
Experience with cloud-based infrastructures, preferably AWS.
Fluent in English.
A passionate individual who thrives on development and data and is hands-on.
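For the Airflow and Pandas items above, a minimal DAG sketch using the Airflow 2.4+ TaskFlow API; the schedule and file path are assumptions for illustration:

```python
# Sketch of the Airflow + Pandas stack the posting names: a tiny DAG with
# one task that reads a dataset and computes a summary.
from datetime import datetime

import pandas as pd
from airflow.decorators import dag, task

@dag(schedule="@daily", start_date=datetime(2025, 1, 1), catchup=False)
def market_data_summary():
    @task
    def summarize(path: str = "/tmp/prices.csv") -> dict:
        df = pd.read_csv(path)
        # Vectorized aggregation rather than row-by-row loops.
        return df.groupby("ticker")["price"].mean().to_dict()

    summarize()

market_data_summary()
```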

Posted 1 week ago

Apply

6.0 - 11.0 years

8 - 15 Lacs

Hyderabad

Work from Office


The Team: This team is part of the global Application Operations and Infrastructure group that provides production support to Ratings Applications. These applications are critical for the Analysts who drive the business through their actions. The team is responsible for the high availability and resiliency of these applications.
The Impact: As part of a global team of engineers, provide production support for Tier-1 business-critical applications. Troubleshoot application-related issues and work with the infrastructure and database teams to triage Major Incidents. Contribute to the delivery of innovative and highly reliable technology services. Maintain a strong focus on developing shared integration services with automation and cloud enablement, and guide the team in designing technical solutions. Become an integral part of a high-performing global network of engineers working from India, Denver, New York, and London to help advance our technology.
What's in it for you:
Working with a team of highly skilled, ambitious, and result-oriented professionals.
An ever-challenging environment to hone your existing skills in automation, performance, service-layer testing, SQL scripting, etc.
Plenty of skill-building, knowledge-sharing, and innovation opportunities.
Building a fulfilling career with a global financial technology company.
Ability to lead and build a world-class production support group.
A highly technical, hands-on role which will help enhance team skills.
Work on Tier-1 applications that are in the critical path for the business.
Ability to work on cutting-edge technologies such as AWS, Oracle, and Ansible.
Ability to grow within an organization that is part of the global team.
Responsibilities:
This role requires extensive skills in operating within the AWS cloud platform, along with deep expertise in database engineering, performance tuning, backup and recovery solutions (such as Cohesity), cloud database technologies, and the auditing and security of database systems.
Hands-on experience working with the AWS cloud service provider, encompassing key services such as IAM (Identity and Access Management), Compute, Storage, Elastic Load Balancing, RDS (Relational Database Service), VPC (Virtual Private Cloud), TGW (Transit Gateway), Route 53, ACM, serverless computing, containerization, account administration, CloudWatch, CloudTrail, etc. Additional experience with other cloud providers is advantageous.
Proficiency in working with configuration management tools such as Ansible.
Solid understanding of CI/CD pipelines, utilizing tools such as Azure DevOps and GitHub for seamless integration and deployment.
Proficiency in scripting languages such as PowerShell, Bash, and Python.
Demonstrated ability to learn new technologies quickly and integrate them into existing systems.
Collaborate with cross-functional teams to ensure the stability, security, and efficiency of our database environment.
Ability to support and resolve infrastructure-related issues across different business applications.
As part of a global team of engineers, deliver innovative and highly reliable technology services.
Ability to communicate well and manage multiple initiatives with multiple engineers, potentially across multiple time zones.
Participate in on-call and a weekly rotating shift schedule.
Involvement in architecture and development design reviews for new implementation and integration projects.
Troubleshoot application-related issues and work with the infrastructure team to triage Major Incidents.
Work with business users to understand needs and issues, develop root cause analyses, and work with the team on the development of solutions and enhancements.
Manage the error budgets to measure risk and balance availability against feature development (see the sketch below).
Drive automation to reduce manual toil.
Measure, track, and report the SLOs.
Create and manage the systems and process documentation.
Analyse and conduct post-incident reviews and drive the resulting actions.
What we're looking for:
Basic Qualifications:
6+ years of IT experience.
Bachelor's or Master's degree in Computer Science, Engineering, or a related subject.
Ability to architect high-availability applications and servers on the cloud, adhering to best practices.
Hands-on experience using automation tooling like Shell, Python, Ansible, and Terraform.
Hands-on experience with DevOps tools like ADO, Jenkins, Ansible Tower, and Docker.
Hands-on experience integrating AWS services like VPC, EC2, Route 53, and S3 to create scalable application environments.
Experience performing root cause analyses and automating solutions to address underlying issues.
Exposure to database technologies like Oracle, PostgreSQL, SQL Server, MongoDB, etc. is desirable.
A team player capable of high performance and flexibility in a dynamic working environment.
Skill and ability to train others on technical and procedural topics.
Ability to support and resolve infrastructure-related issues as required.
Preferred Qualifications:
Bachelor's degree in Computer Science, Engineering, or a related technical discipline.
Proven working experience in AWS Cloud Platform Engineering.
Expert knowledge of observability tools like Splunk and OpenTelemetry.
Expert knowledge of automating the building and deployment of containerized applications.
Expertise in infrastructure-as-code automation.
Certification in AWS Cloud Technologies or DevOps preferred.
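The error-budget duties above reduce to simple arithmetic. An illustrative helper (the SLO figures are examples, not from the posting):

```python
# Back-of-the-envelope helper for the error-budget work mentioned above:
# given an SLO target and a window, how much downtime budget remains.
def error_budget_minutes(slo: float, window_days: int = 30) -> float:
    """Total allowed downtime for the window, in minutes."""
    return (1.0 - slo) * window_days * 24 * 60

def budget_remaining(slo: float, downtime_minutes: float, window_days: int = 30) -> float:
    """Fraction of the error budget still unspent (negative = SLO breached)."""
    budget = error_budget_minutes(slo, window_days)
    return (budget - downtime_minutes) / budget

# A 99.9% SLO gives 43.2 minutes/month; 10 minutes of downtime leaves ~77%.
print(error_budget_minutes(0.999))            # 43.2
print(round(budget_remaining(0.999, 10), 2))  # 0.77
```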

Posted 1 week ago

Apply


5.0 - 8.0 years

1 - 2 Lacs

Kochi

Remote


Job Title: AWS Cloud Engineer
Location: Kochi, India (Remote Option Available)
We are looking for a highly skilled AWS Cloud Engineer with a minimum of 5 years of hands-on experience in AWS cloud technologies. The ideal candidate will have strong expertise in AWS services such as S3, EC2, MSK, Glue, DMS, and SageMaker, along with solid development experience in Python and Docker. This role involves troubleshooting issues, reviewing solution designs, and coding high-quality implementations.
Key Responsibilities:
Work extensively with AWS services including S3, EC2, MSK, Glue, DMS, and SageMaker
Develop, containerize, and deploy applications using Python and Docker
Design and review system architecture and cloud-based solutions
Troubleshoot and resolve issues in AWS infrastructure and application layers
Collaborate with development and DevOps teams to build scalable and secure applications
Requirements:
Minimum 5 years of hands-on experience in AWS Cloud
Proficiency in Python and containerization using Docker
Strong understanding of AWS data and streaming services
Experience with AWS Glue, DMS, and SageMaker
Ability to troubleshoot issues, analyze root causes, and implement effective solutions
Strong communication and problem-solving skills
Preferred Qualifications:
AWS Certification (Associate or Professional level) is a plus

Posted 1 week ago

Apply

6.0 - 9.0 years

30 - 32 Lacs

Hyderabad, Coimbatore, Bengaluru

Work from Office


Job Summary: We are seeking a skilled Cloud Migration Consultant with hands-on experience in assessing and migrating complex applications from AWS to Azure. The ideal candidate will work closely with Microsoft business units, participating in application intake, assessment, and migration planning. This role includes creating migration artifacts, leading client interactions, and supporting application modernization initiatives on Azure, with occasional AWS exposure.
Key Responsibilities:
Assess application readiness by documenting architecture, dependencies, and migration strategies.
Conduct interviews with stakeholders and generate discovery insights using tools like Azure Migrate, Cloudockit, and PowerShell.
Develop architecture diagrams and migration playbooks, and manage Azure DevOps boards.
Set up and configure applications in on-premises and cloud environments, primarily Azure.
Support proof-of-concepts (PoCs) and provide expert advice on migration and modernization options.
Collaborate with application, database, and infrastructure teams to ensure a smooth transition to migration factory teams.
Track project progress, identify blockers and risks, and report timely status updates to leadership.
Required Skills and Qualifications:
Minimum 4 years of experience in cloud migration and application assessment.
Strong expertise in Azure IaaS and PaaS services (e.g., VMs, App Services, Azure Data Factory).
Familiarity with AWS IaaS and PaaS components (e.g., EC2, RDS, Glue, S3).
Proficient in programming languages and frameworks including Java (Spring Boot), C#, .NET, Python, Angular, React.js, and REST APIs.
Working knowledge of Kafka, Docker, Kubernetes, and Azure DevOps.
Solid understanding of network infrastructure including VNets, NSGs, Firewalls, and WAFs.
Experience with IAM concepts and technologies such as OAuth, SAML, Okta, and SiteMinder.
Exposure to Big Data technologies like Databricks, Hadoop, Oracle, and DocumentDB.
Preferred Qualifications:
Azure or AWS cloud certifications.
Prior experience with enterprise-scale cloud migration projects, especially within the Microsoft ecosystem.
Excellent communication skills and proven ability to manage stakeholder relationships effectively.
Location: Hyderabad / Bangalore / Coimbatore / Pune

Posted 1 week ago

Apply

3.0 - 5.0 years

4 - 7 Lacs

Hyderabad

Work from Office


Data Analysis: Conduct in-depth analysis of data to identify trends, anomalies, and opportunities, utilizing SQL, AWS, and Python to extract and manipulate data.
Business Transformation: Translate existing SQL queries into business transformation logic, enabling the conversion of raw data into actionable insights to drive strategic decision-making.
Requirements Gathering: Collaborate with business stakeholders to gather and document clear and concise business requirements, ensuring a thorough understanding of data needs.
Documentation: Develop and maintain documentation related to data analysis, transformation, and reporting processes, ensuring knowledge transfer and continuity.
AWS Integration: Leverage AWS services to facilitate data extraction, storage, and analysis, making data readily available for the business.
Quality Assurance: Implement data quality checks and validation processes to ensure the accuracy and integrity of data used in analyses.
Qualifications:
Bachelor's degree in Business, Computer Science, or a related field.
Proven experience as a Business Analyst with a strong focus on data analysis and transformation.
Proficiency in SQL for querying and manipulating relational databases.
Awareness of AWS services such as Redshift, S3, Athena, Lambda, Step Functions, and AWS Batch.
Proficiency in Python for data analysis and scripting.
Experience in converting SQL queries into actionable business transformation logic.
Strong problem-solving and critical-thinking skills.
Excellent communication and interpersonal skills to work effectively with cross-functional teams and stakeholders.
Attention to detail and a commitment to data accuracy and quality.
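Since the role combines SQL, Python, and Athena, here is a hedged sketch of running a transformation query through Athena with boto3; the database, query, and results bucket are placeholders:

```python
# Hedged sketch: executing one of the SQL transformations this role
# describes via Amazon Athena with boto3.
import time

import boto3

athena = boto3.client("athena", region_name="ap-south-1")

def run_query(sql: str, database: str, output: str) -> str:
    qid = athena.start_query_execution(
        QueryString=sql,
        QueryExecutionContext={"Database": database},
        ResultConfiguration={"OutputLocation": output},
    )["QueryExecutionId"]
    while True:  # simple poll; production code would add a timeout
        state = athena.get_query_execution(QueryExecutionId=qid)[
            "QueryExecution"]["Status"]["State"]
        if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
            return state
        time.sleep(2)

print(run_query(
    "SELECT region, SUM(amount) AS revenue FROM sales GROUP BY region",
    database="analytics",
    output="s3://example-athena-results/",
))
```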

Posted 1 week ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad

Work from Office


Design, develop, and maintain scalable microservices using Java, Kotlin, and Spring Boot. Build and optimize data models and queries in MongoDB. Integrate and manage Apache Kafka for real-time data streaming and messaging. Implement CI/CD pipelines using Jenkins.
Required Candidate Profile: 6+ years of experience in backend development with Java/Spring Boot. Experience with Jenkins for CI/CD automation. Familiarity with AWS services (EC2, S3, Lambda, RDS, etc.) or OpenShift for container orchestration.

Posted 1 week ago

Apply

8.0 - 12.0 years

25 - 40 Lacs

Chennai, Bengaluru, Delhi / NCR

Work from Office


AWS administration experience involving design, Landing Zone deployment, migration, and optimization. Design and develop AWS cloud solutions. Create architectural blueprints, diagrams, and documentation. Hands-on experience with AWS Terraform and CloudFormation automation.

Posted 1 week ago

Apply

3.0 - 8.0 years

5 - 15 Lacs

Hyderabad

Work from Office


Dear Candidates,
We are conducting a Walk-In Interview in Hyderabad for the position of Data Engineering on 20th/21st/22nd June 2025.
Position: Data Engineering
Job description:
Expert knowledge in AWS Data Lake implementation and support (S3, Glue, DMS, Athena, Lambda, API Gateway, Redshift)
Handling of data-related activities such as data parsing, cleansing, quality definition, data pipelines, storage, and ETL scripts
Experience in programming languages: Python/PySpark/SQL
Hands-on experience with data migration
Experience consuming REST APIs using various authentication options within AWS
Lambda architecture: orchestrate triggers, debug, and schedule batch jobs using AWS Glue, Lambda, and Step Functions
Understanding of AWS security features such as IAM roles and policies
Exposure to DevOps tools
AWS certification is highly preferred
Mandatory skills for Data Engineer: Python/PySpark, AWS Glue, Lambda, Redshift.
Date: 20th June 2025 to 22nd June 2025
Time: 9.00 AM to 6.00 PM
Eligibility: Any Graduate
Experience: 2-10 Years
Gender: Any
Interested candidates can walk in directly. For any queries, please contact us at +91 7349369478 / 8555079906.
Interview Venue Details:
Selectify Analytics
Address: Capital Park (Jain Sadguru Capital Park), Ayyappa Society, Silicon Valley, Madhapur, Hyderabad, Telangana 500081
Contact Person: Mr. Deepak/Saqeeb/Ravi Kumar
Interview Time: 9.00 AM to 6.00 PM
Contact Number: +91 7349369478 / 8555079906
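For the Glue/PySpark skills screened here, a rough AWS Glue job skeleton; the database, table, and output path are invented for illustration:

```python
# Rough skeleton of an AWS Glue PySpark job of the kind this walk-in
# screens for: catalog table in, cleaned Parquet out.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Assumes the raw_zone.orders table is already registered in the catalog.
dyf = glue_context.create_dynamic_frame.from_catalog(
    database="raw_zone", table_name="orders"
)
cleaned = dyf.toDF().dropDuplicates(["order_id"]).filter("amount > 0")

cleaned.write.mode("overwrite").parquet("s3://example-curated/orders/")
job.commit()
```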

Posted 1 week ago

Apply

7.0 - 12.0 years

6 - 16 Lacs

Noida, Gurugram, Delhi / NCR

Work from Office


Role & responsibilities
Total Experience: 7 yrs
Relevant experience: 6 yrs
Mandatory skills:
Experience in Big Data technologies like Hadoop, Apache Spark, and Hive.
Experience in AWS cloud services including S3, Redshift, EMR, etc.
Strong expertise in RDBMS and SQL.
Nice to have:
Good experience in Linux and shell scripting.
Experience in data pipelines using Apache Airflow / Control-M.
Practical experience in Core Java, Python, or Scala.
Experience with SDLC tools (e.g., Bamboo, JIRA, Git, Confluence, Bitbucket).
Experience in data modelling, data quality, and load assurance.
Ability to communicate problems and solutions effectively with both business and technical stakeholders (written and verbal).
Added advantages:
Scheduling tools: Control-M, Airflow
Monitoring tools: Sumo Logic, Splunk
Process familiarity with Incident Management
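As a hedged illustration of the Spark/Hive/S3 stack above (the bucket and table names are assumptions):

```python
# Illustrative PySpark snippet for the Spark/Hive/S3 stack listed above.
from pyspark.sql import SparkSession, functions as F

spark = (
    SparkSession.builder.appName("s3-to-hive")
    .enableHiveSupport()
    .getOrCreate()
)

events = spark.read.parquet("s3a://example-raw/events/")  # s3a:// on EMR/Hadoop
daily = events.groupBy(F.to_date("event_ts").alias("dt")).count()

# Persist as a partitioned Hive table for downstream Hive/Presto queries.
daily.write.mode("overwrite").partitionBy("dt").saveAsTable("analytics.daily_events")
```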

Posted 1 week ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Bengaluru

Work from Office


Responsibilities
Design and develop new features to meet evolving business and technical needs
Maintain and enhance existing functionality to ensure reliability and performance
Collaborate directly with customers to gather requirements and understand business objectives
Stay up to date with the latest technologies and apply them to influence project decisions and outcomes
Requirements
1+ year of experience in developing commercial applications on .NET
Good understanding of the Software Development Lifecycle
Understanding of C#, including .NET 6/8 and .NET Framework 4.8
Good knowledge and experience with Azure (Azure Functions, VMs, Cosmos DB, Azure SQL) or AWS (EC2, Lambda, S3, DynamoDB)
Skills in front-end web development (React, Angular, TypeScript)
Substantial knowledge of relational and non-relational databases
Good knowledge of Event-Driven Architecture (CQRS & Event Sourcing), Domain-Driven Design, and Microservices
Experience working with CI/CD (Azure DevOps, AWS CodePipeline)
Experience with testing tools and techniques
Good spoken English (at least B1 level according to CEFR)

Posted 1 week ago

Apply

8.0 - 13.0 years

15 - 25 Lacs

Gurugram

Remote


Minimum 6 years of hands-on experience deploying, enhancing, and troubleshooting foundational AWS services (EC2, S3, RDS, VPC, CloudTrail, CloudFront, Lambda, EKS, ECS, etc.)
• 3+ years of experience with serverless technologies, services, and container technologies (Docker, Kubernetes, etc.)
o Manage Kubernetes charts using Helm.
o Manage production application deployments in a Kubernetes cluster using kubectl.
o Expertise in deploying distributed apps with containers (Docker) and orchestration (Kubernetes, EKS).
o Experience with infrastructure-as-code tools for provisioning and managing Kubernetes infrastructure.
o (Preferred) Certification in container orchestration systems and/or Certified Kubernetes Administrator.
o Experience with log management and analytics tools such as Splunk / ELK
• 3+ years of experience writing, debugging, and enhancing Terraform to create infrastructure-as-code scripts for EKS, EC2, S3, and other AWS services.
o Expertise with key Terraform features such as infrastructure as code, execution plans, resource graphs, and change automation.
o Implemented cluster services using Kubernetes and Docker, managing local deployments by building self-hosted Kubernetes clusters using Terraform.
o Managed provisioning of AWS infrastructure using Terraform.
o Develop and maintain infrastructure-as-code solutions using Terraform.
• Ability to write scripts in JavaScript, Bash, Python, TypeScript, or similar languages.
• Able to work independently and in a team to architect and implement new solutions and technologies.
• Very strong written and verbal communication skills; the ability to communicate verbally and in writing with all levels of employees and management, capable of successful formal and informal communication, speaking and writing clearly and understandably at the right level.
• Ability to identify, evaluate, learn, and PoC new technologies for implementation.
• Experience in designing and implementing highly resilient AWS solutions.
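Adjacent to the EKS work above, a small hedged boto3 example that inventories clusters and node groups; the region and account contents are assumptions:

```python
# Hedged example: inventory EKS clusters and node groups with boto3.
import boto3

eks = boto3.client("eks", region_name="ap-south-1")

for cluster in eks.list_clusters()["clusters"]:
    detail = eks.describe_cluster(name=cluster)["cluster"]
    print(cluster, detail["version"], detail["status"])
    for ng in eks.list_nodegroups(clusterName=cluster)["nodegroups"]:
        ng_detail = eks.describe_nodegroup(clusterName=cluster, nodegroupName=ng)
        print("  nodegroup:", ng, ng_detail["nodegroup"]["scalingConfig"])
```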

Posted 1 week ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Bengaluru

Work from Office


Responsibilities
Develop key features and enhance the platform's performance, scalability, and security
Design and implement secure, efficient APIs to support communication and data flow
Optimize database performance and adhere to data standards
Collaborate with team members to align on technical and business requirements
Contribute to the maintenance of CI/CD processes and proactive monitoring
Stay updated on advancements in PHP, Laravel, AWS, and healthcare technology
Requirements
Experience with PHP frameworks, particularly Laravel, and a good understanding of modern development practices
Knowledge of optimizing database performance, including query optimization and indexing
Experience in designing and implementing RESTful APIs with authentication standards such as OAuth 2.0
Understanding of software architecture patterns, including MVC and microservices, to ensure maintainable and scalable code
CI/CD and Monitoring Skills
Familiarity with CI/CD processes and tools, with experience in automating testing and deployment
Knowledge of monitoring tools for system health and performance
Core Infrastructure Skills
Practical experience in deploying, managing, and scaling applications using AWS services like EC2, RDS, and S3
Basic experience managing cloud-based architectures, including EC2 instances and load balancing
Familiarity with containerization technologies such as Docker for application deployment
Nice to have
Experience with front-end development
Basic knowledge of front-end technologies (HTML, CSS, JavaScript) for collaborative development
Integration Skills
Experience integrating third-party APIs, such as EMR systems or communication tools like Twilio
Regulatory Awareness
Understanding of regulatory standards impacting healthcare technology, such as HIPAA compliance
Communication and Problem-Solving
Strong communication skills to work effectively with team members and stakeholders
Ability to analyze and troubleshoot technical issues to improve application performance

Posted 1 week ago

Apply

12.0 - 18.0 years

35 - 45 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office


Job Summary: We are seeking an experienced Amazon Connect Architect with 12 to 15 years of experience to design, develop, and implement scalable and reliable cloud-based contact center solutions using Amazon Connect and AWS ecosystem services. You will play a key role in translating business needs into technical solutions and lead implementation across clients or business units.
Key Responsibilities:
Architect and design contact center solutions using Amazon Connect and AWS services like Lambda, Lex, DynamoDB, S3, and CloudWatch
Lead the end-to-end implementation and configuration of Amazon Connect
Integrate Amazon Connect with CRMs (Salesforce, ServiceNow, etc.), ticketing systems, and third-party tools
Define call flows, IVR designs, routing profiles, and queue configurations
Implement Contact Lens, real-time metrics, and historical reporting
Collaborate with cross-functional teams (developers, business analysts, project managers)
Create technical documentation, diagrams, and handoff materials
Stay updated on AWS best practices and new Amazon Connect features
Provide technical leadership and mentorship to development and support teams
Required Skills:
Proven experience designing and deploying Amazon Connect solutions
Strong hands-on knowledge of AWS Lambda, IAM, S3, DynamoDB, Kinesis, and CloudFormation
Experience with Amazon Lex and AI/ML for voice bots
Proficiency in programming/scripting (JavaScript, Node.js)
Familiarity with CRM integrations, especially Salesforce Service Cloud Voice
Understanding of telephony concepts (SIP, DID, ACD, IVR, CTI)
Experience with CI/CD pipelines and version control (Git)
Strong documentation and communication skills
Preferred Skills:
AWS Certified Solutions Architect or Amazon Connect accreditation
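A hedged sketch of the kind of plumbing such an architect might script: enumerating contact flows on a Connect instance with boto3. The instance ID is a placeholder, and paginator support for this API is assumed:

```python
# Hedged sketch: list contact flows on an Amazon Connect instance.
import boto3

connect = boto3.client("connect", region_name="ap-south-1")
INSTANCE_ID = "11111111-2222-3333-4444-555555555555"  # hypothetical

paginator = connect.get_paginator("list_contact_flows")
for page in paginator.paginate(InstanceId=INSTANCE_ID):
    for flow in page["ContactFlowSummaryList"]:
        print(flow["Name"], flow["ContactFlowType"])
```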

Posted 2 weeks ago

Apply

8.0 - 13.0 years

22 - 30 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office


LOCATION: PAN INDIA
Experience: 8+ years
Support Model: 24x7 rotational
Role Overview:
Handle service delivery and ensure performance across all Amazon Connect support areas.
Oversee overall support operations, enhancements, and system updates.
Act as the primary escalation point for incidents.
Manage SLAs and ensure service standards are met.
Identify process gaps and implement improvements.
Lead and mentor junior engineers.
Maintain relationships with internal and external stakeholders.
Skills Required:
Deep hands-on experience with Amazon Connect
Strong knowledge of AWS Lambda, DynamoDB, and S3
In-depth understanding of contact flows, queues, routing profiles, quick connects, telephony configuration, and call routing
Strong troubleshooting skills in WebRTC and voice issues
Experience with CloudWatch, Connect metrics, and CI/CD pipelines
Experience integrating with Salesforce (Service Cloud Voice)
Good documentation and process improvement capability
Strong leadership and communication skills

Posted 2 weeks ago

Apply

3.0 - 8.0 years

15 - 25 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Work from Office


LOCATION: PAN INDIA
L1 Support Engineer
Experience: 3-5 years
Support Model: 24x7 rotational
Role Overview:
Provide support on Amazon Connect-related incidents and user issues.
Handle basic troubleshooting of voice, call routing, and UI-based configurations.
Support change announcements and basic deployment activities.
Coordinate with L2/L3 engineers for escalation.
Maintain documentation and update the knowledge base.
Skills Required:
Hands-on experience with Amazon Connect (basic flows, routing, and settings)
Exposure to AWS Lambda, S3, and DynamoDB
Basic understanding of WebRTC and voice troubleshooting
Familiarity with CloudWatch and Connect metrics
Willingness to learn Salesforce integration (Service Cloud Voice)
Strong willingness to work in a support model and take ownership
L2 Support Engineer
Experience: 5-8 years
Support Model: 24x7 rotational
Role Overview:
Provide L2-level support for Amazon Connect and associated AWS services.
Address incidents and troubleshoot system or telephony-related issues.
Support service delivery and ensure announced changes are implemented.
Maintain SLAs and escalate where required.
Contribute to documentation and improvement plans.
Support deployment through the CI/CD pipeline.
Skills Required:
Strong hands-on experience with Amazon Connect
Working knowledge of Lambda, DynamoDB, and S3
Good understanding of call flows, routing, and WebRTC troubleshooting
Familiarity with CloudWatch, Connect metrics, and CI/CD
Exposure to Salesforce integration helpful (Service Cloud Voice)
Ability to work independently on issue resolution
Good communication and support handling

Posted 2 weeks ago

Apply

3.0 - 5.0 years

1 - 3 Lacs

Chennai

Work from Office


**AWS Infrastructure Management:**
Design, implement, and maintain scalable, secure cloud infrastructure using AWS services (EC2, Lambda, S3, RDS, CloudFormation/Terraform, etc.)
Monitor and optimize cloud resource usage and costs
**CI/CD Pipeline Automation:**
Set up and maintain robust CI/CD pipelines using tools such as GitHub Actions, GitLab CI, Jenkins, or AWS CodePipeline
Ensure smooth deployment processes for staging and production environments
**Git Workflow Management:**
Implement and enforce best practices for version control and branching strategies (Gitflow, trunk-based development, etc.)
Support development teams in resolving Git issues and improving workflows
**Twilio Integration & Support:**
Manage and maintain Twilio-based communication systems (SMS, Voice, WhatsApp, Programmable Messaging)
Develop and deploy Twilio Functions and Studio Flows for customer engagement (see the sketch below)
Monitor communication systems and troubleshoot delivery or quality issues
**Infrastructure as Code & Automation:**
Use tools like Terraform, CloudFormation, or Pulumi for reproducible infrastructure
Create scripts and automation tools to streamline routine DevOps tasks
**Monitoring, Logging & Security:**
Implement and maintain monitoring/logging tools (CloudWatch, Datadog, ELK, etc.)
Ensure adherence to best practices around IAM, secrets management, and compliance
**Requirements**
3-5+ years of experience in DevOps or a similar role
Expert-level experience with **Amazon Web Services (AWS)**
Strong command of **Git** and Git-based CI/CD practices
Experience building and supporting solutions using **Twilio APIs** (SMS, Voice, Programmable Messaging, etc.)
Proficiency in scripting languages (Bash, Python, etc.)
Hands-on experience with containerization (Docker) and orchestration tools (ECS, EKS, Kubernetes)
Familiarity with Agile/Scrum workflows and collaborative development environments
**Preferred Qualifications**
AWS Certifications (e.g., Solutions Architect, DevOps Engineer)
Experience with serverless frameworks and event-driven architectures
Previous work with other communication platforms (e.g., SendGrid, Nexmo) a plus
Knowledge of RESTful API development and integration
Experience working in high-availability, production-grade systems
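For the Twilio messaging duties above, a minimal sketch using the official twilio package; the credentials and phone numbers are placeholders:

```python
# Minimal sketch of sending an SMS via Twilio's Python SDK.
import os

from twilio.rest import Client

client = Client(os.environ["TWILIO_ACCOUNT_SID"], os.environ["TWILIO_AUTH_TOKEN"])

message = client.messages.create(
    to="+911234567890",    # hypothetical recipient
    from_="+15005550006",  # a Twilio-provisioned number
    body="Your appointment is confirmed for tomorrow at 10:00.",
)
print(message.sid, message.status)
```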

Posted 2 weeks ago

Apply

6.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office


We are looking for a skilled and proactive AWS Operational Support Analyst to join our cloud infrastructure team. The ideal candidate will be responsible for monitoring, maintaining, and improving the performance, security, and reliability of AWS-hosted environments. This role is essential in ensuring uninterrupted cloud operations and supporting DevOps, development, and business teams with cloud-related issues.
Key Responsibilities
Monitor AWS cloud infrastructure for performance, availability, and operational issues.
Manage incident response, root cause analysis, and resolution of infrastructure-related issues.
Execute daily operational tasks including backups, system patching, and performance tuning.
Collaborate with DevOps and engineering teams to ensure smooth CI/CD operations.
Maintain system documentation and the support knowledge base.
Automate routine tasks using shell scripts or AWS tools (e.g., Lambda, Systems Manager).
Manage AWS services such as EC2, RDS, S3, CloudWatch, IAM, and VPC.
Implement cloud cost-optimization practices and security compliance controls.
Perform health checks, generate reports, and suggest performance improvements.
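A hedged example of a routine health check this role might automate with boto3 (the region is an assumption):

```python
# Hedged example: report CloudWatch alarms in ALARM state and any
# EC2 instances failing their status checks.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")
ec2 = boto3.client("ec2", region_name="ap-south-1")

for alarm in cloudwatch.describe_alarms(StateValue="ALARM")["MetricAlarms"]:
    print("ALARM:", alarm["AlarmName"], alarm["StateReason"])

statuses = ec2.describe_instance_status(IncludeAllInstances=False)["InstanceStatuses"]
for s in statuses:
    if s["InstanceStatus"]["Status"] != "ok" or s["SystemStatus"]["Status"] != "ok":
        print("Impaired instance:", s["InstanceId"])
```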

Posted 2 weeks ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office


Qualification & Experience: Minimum of 8 years of experience as a Data Scientist/Engineer with demonstrated expertise in data engineering and cloud computing technologies.
Technical Responsibilities:
Excellent proficiency in Python, with a strong focus on developing advanced skills.
Extensive exposure to NLP and image processing concepts.
Proficient in version control systems like Git.
In-depth understanding of Azure deployments.
Expertise in OCR, ML model training, and transfer learning.
Experience working with unstructured data formats such as PDFs, DOCX, and images.
Strong familiarity with data science best practices and the ML lifecycle.
Strong experience with data pipeline development, ETL processes, and data engineering tools such as Apache Airflow, PySpark, or Databricks.
Familiarity with cloud computing platforms like Azure, AWS, or GCP, including services like Azure Data Factory, S3, Lambda, and BigQuery.
Tool Exposure: Advanced understanding and hands-on experience with Git, Azure, Python, R programming, and data engineering tools such as Snowflake, Databricks, or PySpark.
Data mining, cleaning, and engineering: Leading the identification and merging of relevant data sources, ensuring data quality, and resolving data inconsistencies.
Cloud Solutions Architecture: Designing and deploying scalable data engineering workflows on cloud platforms such as Azure, AWS, or GCP.
Data Analysis: Executing complex analyses against business requirements using appropriate tools and technologies.
Software Development: Leading the development of reusable, version-controlled code under minimal supervision.
Big Data Processing: Developing solutions to handle large-scale data processing using tools like Hadoop, Spark, or Databricks.
Principal Duties & Key Responsibilities:
Leading data extraction from multiple sources, including PDFs, images, databases, and APIs.
Driving optical character recognition (OCR) processes to digitize data from images (see the sketch below).
Applying advanced natural language processing (NLP) techniques to understand complex data.
Developing and implementing highly accurate statistical models and data engineering pipelines to support critical business decisions, and continuously monitoring their performance.
Designing and managing scalable cloud-based data architectures using Azure, AWS, or GCP services.
Collaborating closely with business domain experts to identify and drive key business value drivers.
Documenting model design choices, algorithm selection processes, and dependencies.
Effectively collaborating in cross-functional teams within the CoE and across the organization.
Proactively seeking opportunities to contribute beyond assigned tasks.
Required Competencies:
Exceptional communication and interpersonal skills.
Proficiency in Microsoft Office 365 applications.
Ability to work independently, demonstrate initiative, and provide strategic guidance.
Strong networking, communication, and people skills.
Outstanding organizational skills with the ability to work independently and as part of a team.
Excellent technical writing skills.
Effective problem-solving abilities.
Flexibility and adaptability to work flexible hours as required.
Key Competencies / Values:
Client Focus: Tailoring skills and understanding client needs to deliver exceptional results.
Excellence: Striving for excellence defined by clients, delivering high-quality work.
Trust: Building and retaining trust with clients, colleagues, and partners.
Teamwork: Collaborating effectively to achieve collective success.
Responsibility: Taking ownership of performance and safety, ensuring accountability.
People: Creating an inclusive environment that fosters individual growth and development.
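For the OCR duty above, an illustrative step using pytesseract and Pillow; the file path is a placeholder, and a local Tesseract install is assumed:

```python
# Illustrative OCR step for the unstructured-document work above.
from PIL import Image
import pytesseract

def extract_text(image_path: str) -> str:
    """Digitize a scanned page into plain text for downstream NLP."""
    return pytesseract.image_to_string(Image.open(image_path), lang="eng")

print(extract_text("scans/invoice_001.png")[:500])
```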

Posted 2 weeks ago

Apply