
270 S3 Jobs

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

4.0 - 9.0 years

10 - 18 Lacs

Noida

Work from Office

Naukri logo

Precognitas Health Pvt. Ltd., a fully owned subsidiary of Foresight Health Solutions LLC, is seeking a Data Engineer to build and optimize the data pipelines, processing frameworks, and analytics infrastructure that power critical healthcare insights. Are you a bright, energetic, and skilled data engineer who wants to make a meaningful impact in a dynamic environment? Do you enjoy designing and implementing scalable data architectures and ML pipelines, automating ETL workflows, and working with cloud-native solutions to process large datasets efficiently? Are you passionate about transforming raw data into actionable insights that drive better healthcare outcomes? If so, join us! You'll play a crucial role in shaping our data strategy, optimizing data ingestion, and ensuring seamless data flow across our systems while leveraging the latest cloud and big data technologies.

Required Skills & Experience:
- 4+ years of experience in data engineering, data pipelines, and ETL/ELT workflows.
- Strong Python programming skills, with expertise in NumPy, Pandas, and data manipulation techniques.
- Hands-on experience with orchestration tools like Prefect, Apache Airflow, or AWS Step Functions for managing complex workflows.
- Proficiency in AWS services, including AWS Glue, AWS Batch, S3, Lambda, RDS, Athena, and Redshift.
- Experience with Docker containerization and Kubernetes for scalable and efficient data processing.
- Strong understanding of data processing layers, batch and streaming data architectures, and analytics frameworks.
- Expertise in SQL and NoSQL databases, query optimization, and data modeling for structured and unstructured data.
- Familiarity with big data technologies like Apache Spark, Hadoop, or similar frameworks.
- Experience implementing data validation, quality checks, and observability for robust data pipelines.
- Strong knowledge of Infrastructure as Code (IaC) using Terraform or AWS CDK for managing cloud-based data infrastructure.
- Ability to work with distributed systems, event-driven architectures (Kafka, Kinesis), and scalable data storage solutions.
- Experience with CI/CD for data workflows, including version control (Git), automated testing, and deployment pipelines.
- Knowledge of data security, encryption, and access control best practices in cloud environments.
- Strong problem-solving skills and the ability to collaborate with cross-functional teams, including data scientists and software engineers.

Compensation will be commensurate with experience. If you are interested, please send your application to jobs@precognitas.com. For more information about our work, visit www.caliper.care
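The data validation and quality checks called for above can be sketched framework-free; in a real pipeline such a gate would sit between ingestion and load, and all field names here are hypothetical:

```python
from typing import Any

def validate_record(record: dict[str, Any]) -> list[str]:
    """Return a list of data-quality violations for one record."""
    errors = []
    if not record.get("patient_id"):
        errors.append("missing patient_id")
    age = record.get("age")
    if age is not None and not (0 <= age <= 130):
        errors.append(f"age out of range: {age}")
    return errors

def run_quality_gate(records: list[dict]) -> tuple[list[dict], list[dict]]:
    """Split a batch into clean rows and quarantined rows."""
    clean, quarantined = [], []
    for rec in records:
        (quarantined if validate_record(rec) else clean).append(rec)
    return clean, quarantined
```

Quarantined rows would typically be written to a dead-letter location (e.g., an S3 prefix) with the violation list attached, so observability tooling can alert on quarantine volume.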

Posted 23 hours ago

Apply

6.0 - 10.0 years

30 - 35 Lacs

Kochi, Hyderabad, Coimbatore

Work from Office


1. The resource should have knowledge of Data Warehouse and Data Lake concepts.
2. Should be able to build data pipelines using PySpark.
3. Should have strong SQL skills.
4. Should have exposure to the AWS environment and services such as S3, EC2, EMR, Athena, Redshift, etc.
5. Good to have: programming skills in Python.
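The SQL skills listed above are the kind exercised by aggregation queries like the following sketch (sqlite3 stands in for Athena/Redshift purely for illustration; the table and column names are hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (region TEXT, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)",
                 [("north", 100.0), ("north", 50.0), ("south", 75.0)])

# Aggregate revenue per region, largest first -- the shape of query
# that would run against Athena or Redshift in the actual stack.
rows = conn.execute(
    "SELECT region, SUM(amount) AS revenue "
    "FROM orders GROUP BY region ORDER BY revenue DESC"
).fetchall()
print(rows)  # [('north', 150.0), ('south', 75.0)]
```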

Posted 1 day ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Mumbai, Maharashtra

Work from Office


About the Role: Grade Level (for internal use): 10. The Role: Development Engineer, Python Full Stack. S&P Dow Jones Indices, a global leader in providing investable and benchmark indices to the financial markets, is looking for a Development Engineer with full stack experience to join our technology team. This is mostly a back-end development role but will also support UI development work.

The Team: You will be part of a global technology team comprising Dev, QA, and BA teams, and will be responsible for analysis, design, development, and testing.

Responsibilities and Impact: You will be working on one of the key systems responsible for calculating re-balancing weights and asset selections for S&P indices. Ultimately, the output of this team is used to maintain some of the most recognized and important investable assets globally.
- Development of RESTful web services and databases; supporting UI development requirements.
- Interfacing with various AWS infrastructure and services, deploying to a Docker environment.
- Coding, documentation, testing, debugging, and tier-3 support.
- Work directly with stakeholders and the technical architect to formalize/document requirements, both for supporting the existing application and for new initiatives.
- Perform application and system performance tuning and troubleshoot performance issues.
- Coordinate closely with the QA team and the scrum master to optimize team velocity and task flow.
- Help establish and maintain technical standards via code reviews and pull requests.

What's in it for you: This is an opportunity to work on a team of highly talented and motivated engineers at a highly respected company. You will work on new development as well as enhancements to existing functionality.

What We're Looking For: Basic Qualifications
- 7-10 years of IT experience in application development and support, primarily in back-end API and database development roles, with at least some UI development experience.
- Bachelor's degree in Computer Science, Information Systems, or Engineering, or, in lieu, a demonstrated equivalence in work experience.
- Proficiency in modern Python 3.10+ (minimum 4 years of dedicated, recent Python experience).
- AWS services experience, including API Gateway, ECS/Docker, DynamoDB, S3, Kafka, and SQS.
- SQL database experience, with at least 1 year of Postgres.
- Python libraries experience including Pydantic, SQLAlchemy, and at least one of Flask, FastAPI, or Sanic, focusing on creating RESTful endpoints for data services.
- JavaScript/TypeScript experience and at least one of Vue 3, React, or Angular.
- Strong unit testing skills with PyTest or UnitTest, and API testing using Postman or Bruno.
- CI/CD build process experience using Jenkins.
- Experience with software testing (unit testing, integration testing, test-driven development).
- Strong work ethic and good communication skills.

Additional Preferred Qualifications:
- Basic understanding of financial markets (stocks, funds, indices, etc.).
- Experience working in mission-critical enterprise organizations.
- A passion for creating high-quality code and broad unit test coverage.
- Ability to understand complex business problems, break them into smaller executable parts, and delegate.

About S&P Dow Jones Indices: At S&P Dow Jones Indices, we provide iconic and innovative index solutions backed by unparalleled expertise across the asset-class spectrum. By bringing transparency to the global capital markets, we empower investors everywhere to make decisions with conviction. We're the largest global resource for index-based concepts, data, and research, and home to iconic financial market indicators such as the S&P 500 and the Dow Jones Industrial Average. More assets are invested in products based upon our indices than any other index provider in the world. With over USD 4 trillion in passively managed assets linked to our indices and over USD 13 trillion benchmarked to our indices, our solutions are widely considered indispensable in tracking market performance, evaluating portfolios, and developing investment strategies. S&P Dow Jones Indices is a division of S&P Global (NYSE: SPGI). S&P Global is the world's foremost provider of credit ratings, benchmarks, analytics, and workflow solutions in the global capital, commodity, and automotive markets. With every one of our offerings, we help many of the world's leading organizations navigate the economic landscape so they can plan for tomorrow, today.

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People, Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
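The unit-testing requirement (PyTest or UnitTest) might look like this minimal unittest sketch; the index-weighting helper under test is hypothetical, not S&P's actual rebalancing logic:

```python
import unittest

def rebalance_weights(market_caps: dict[str, float]) -> dict[str, float]:
    """Hypothetical helper: cap-weighted index weights summing to 1.0."""
    total = sum(market_caps.values())
    if total <= 0:
        raise ValueError("market caps must sum to a positive value")
    return {ticker: cap / total for ticker, cap in market_caps.items()}

class RebalanceTests(unittest.TestCase):
    def test_weights_sum_to_one(self):
        weights = rebalance_weights({"AAA": 300.0, "BBB": 100.0})
        self.assertAlmostEqual(sum(weights.values()), 1.0)
        self.assertAlmostEqual(weights["AAA"], 0.75)

    def test_rejects_empty_caps(self):
        with self.assertRaises(ValueError):
            rebalance_weights({})
```

Run with `python -m unittest` against the containing module; PyTest discovers the same `TestCase` classes unchanged.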

Posted 1 day ago

Apply

10.0 - 17.0 years

9 - 15 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Dear Candidate, please find the job description below.

Role: MLOps + ML Engineer

Role Overview: We are looking for a highly experienced MLOps and ML Engineer to lead the design, deployment, and optimization of machine learning systems at scale. This role requires deep expertise in MLOps practices, CI/CD automation, and AWS SageMaker, with a strong foundation in machine learning engineering and cloud-native development.

Key Responsibilities:
- Architect and implement robust MLOps pipelines for model development, deployment, monitoring, and governance.
- Lead the operationalization of ML models using AWS SageMaker and other AWS services.
- Build and maintain CI/CD pipelines for ML workflows using tools like GitHub Actions, Jenkins, or AWS CodePipeline.
- Automate model lifecycle management, including retraining, versioning, and rollback.
- Collaborate with data scientists, ML engineers, and DevOps teams to ensure seamless integration and scalability.
- Monitor production models for performance, drift, and reliability.
- Establish best practices for reproducibility, security, and compliance in ML systems.

Required Skills:
- 10+ years of experience in ML Engineering, MLOps, or related fields.
- Deep hands-on experience with AWS SageMaker, Lambda, S3, CloudWatch, and related AWS services.
- Strong programming skills in Python and experience with Docker, Kubernetes, and Terraform.
- Expertise in CI/CD tools and infrastructure-as-code.
- Familiarity with model monitoring tools (e.g., Evidently, Prometheus, Grafana).
- Solid understanding of ML algorithms, data pipelines, and production-grade systems.

Preferred Qualifications:
- AWS Certified Machine Learning Specialty or DevOps Engineer certification.
- Experience with feature stores, model registries, and real-time inference systems.
- Leadership experience in cross-functional ML/AI teams.

Primary Skills: MLOps, ML Engineering, AWS services (SageMaker/S3/CloudWatch)

Regards,
Divya Grover
+91 8448403677
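Monitoring production models for drift, one of the responsibilities above, can be sketched without any framework; in practice a tool like Evidently would compute richer statistics, and the threshold here is a hypothetical choice:

```python
import statistics

def mean_shift_drift(baseline: list[float], live: list[float],
                     threshold: float = 3.0) -> bool:
    """Flag drift when the live mean moves more than `threshold`
    baseline standard deviations away from the baseline mean."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    if sigma == 0:
        return statistics.mean(live) != mu
    z = abs(statistics.mean(live) - mu) / sigma
    return z > threshold

baseline = [10.0, 11.0, 9.0, 10.5, 9.5]
print(mean_shift_drift(baseline, [10.2, 9.8, 10.1]))   # False: close to baseline
print(mean_shift_drift(baseline, [25.0, 26.0, 24.0]))  # True: large shift
```

A production check would compare full distributions (e.g., PSI or KS tests) per feature rather than a single mean, but the alerting shape is the same.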

Posted 1 day ago

Apply

10.0 - 15.0 years

15 - 25 Lacs

Kolkata, Hyderabad, Bengaluru

Hybrid


Experience: 10+ Years

Role Overview: We are seeking an experienced AWS Data & Analytics Architect with a strong delivery background and excellent communication skills. The ideal candidate will have over 10 years of experience and a proven track record in managing teams and client relationships. You will be responsible for leading data modernization and transformation projects using AWS services.

Key Responsibilities:
- Lead and architect data modernization/transformation projects using AWS services.
- Manage and mentor a team of data engineers and analysts.
- Build and maintain strong client relationships, ensuring successful project delivery.
- Design and implement scalable data architectures and solutions.
- Oversee the migration of large datasets to AWS, ensuring data integrity and security.
- Collaborate with stakeholders to understand business requirements and translate them into technical solutions.
- Ensure best practices in data management and governance are followed.

Required Skills and Experience:
- 10+ years of experience in data architecture and analytics.
- Hands-on experience with AWS services such as Redshift, S3, Glue, Lambda, RDS, and others.
- Proven experience delivering 1-2 large data migration/modernization projects using AWS.
- Strong leadership and team management skills.
- Excellent communication and interpersonal skills.
- Deep understanding of data modeling, ETL processes, and data warehousing.
- Experience with data governance and security best practices.
- Ability to work in a fast-paced, dynamic environment.

Preferred Qualifications:
- AWS Certified Solutions Architect Professional or AWS Certified Big Data Specialty.
- Experience with other cloud platforms (e.g., Azure, GCP) is a plus.
- Familiarity with machine learning and AI technologies.

Posted 1 day ago

Apply

7.0 - 12.0 years

30 - 45 Lacs

Noida, Pune, Gurugram

Hybrid


Role: Lead Data Engineer
Experience: 7-12 years

Must-Have:
- 7+ years of relevant experience in Data Engineering and delivery.
- 7+ years of relevant work experience in Big Data concepts.
- Worked on cloud implementations.
- Experience in Snowflake, SQL, and AWS (Glue, EMR, S3, Aurora, RDS, AWS architecture).
- Good experience with AWS cloud and microservices: AWS Glue, S3, Python, and PySpark.
- Good aptitude, strong problem-solving abilities, analytical skills, and the ability to take ownership as appropriate.
- Able to code, debug, tune performance, and deploy apps to the production environment.
- Experience working in Agile methodology.
- Ability to learn new technologies quickly and help the team do the same.
- Excellent communication and coordination skills.

Good to Have:
- Experience with DevOps tools (Jenkins, Git, etc.) and practices, continuous integration, and delivery (CI/CD) pipelines.
- Spark, Python, SQL (exposure to Snowflake), Big Data concepts, AWS Glue.
- Worked on cloud implementations (migration, development, etc.).

Role & Responsibilities:
- Be accountable for the delivery of the project within the defined timelines with good quality.
- Work with clients and offshore leads to understand requirements, come up with high-level designs, and complete development and unit-testing activities.
- Keep all stakeholders updated about task and project status/risks/issues, if any.
- Work closely with management wherever and whenever required to ensure smooth execution and delivery of the project.
- Guide the team technically and give the team direction on how to plan, design, implement, and deliver projects.

Education: BE/B.Tech from a reputed institute.

Posted 1 day ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Bengaluru

Work from Office


About this role: Wells Fargo is seeking a Lead Software Engineer (Lead Data Engineer).

In this role, you will:
- Lead complex technology initiatives, including companywide initiatives with broad impact.
- Act as a key participant in developing standards and companywide best practices for engineering complex and large-scale technology solutions.
- Design, code, test, debug, and document for projects and programs.
- Review and analyze complex, large-scale technology solutions for tactical and strategic business objectives that require in-depth evaluation of multiple factors, including intangibles or unprecedented technical factors.
- Make decisions in developing standards and companywide best practices for engineering and technology solutions, influencing and leading the technology team to meet deliverables and drive new initiatives.
- Collaborate and consult with key technical experts, the senior technology team, and external industry groups to resolve complex technical issues and achieve goals.
- Lead projects and teams, or serve as a peer mentor.

Required Qualifications:
- 5+ years of Software Engineering experience, or equivalent demonstrated through one or a combination of the following: work experience, training, military experience, education.

Desired Qualifications:
- 5+ years of experience in Data Engineering.
- 5+ years of overall software development experience.
- 5+ years of Python development experience, including 3+ years with the Spark framework.
- 5+ years of Oracle or SQL Server experience designing, coding, and delivering database applications.
- Expert knowledge and considerable development experience with at least two of the following: Kafka, ETL, Big Data, NoSQL databases, S3 or other object stores.
- Strong understanding of data-flow design and how to implement your designs in Python.
- Experience writing and debugging complex PL/SQL or T-SQL stored procedures.
- Excellent troubleshooting and debugging skills.
- Ability to analyze a feature story, design a robust solution for it, and create specs for complex business rules and calculations.
- Ability to understand business problems and articulate a corresponding solution.
- Excellent verbal, written, and interpersonal communication skills.

Job Expectations:
- Strong knowledge and understanding of the Dremio framework.
- Database query design and optimization.
- Strong experience using the development ecosystem of applications (JIRA, ALM, GitHub, uDeploy (Urban Code Deploy), Jenkins, Artifactory, SVN, etc.).
- Knowledge and understanding of multiple source code version control systems, working with branches, tags, and labels.

Posted 1 day ago

Apply

3.0 - 5.0 years

0 - 1 Lacs

Pune

Work from Office


Company: Covalensedigital
Location: Pune
Experience: 3.5 to 5 Years
Hiring For: Consultant

About the Role: Covalensedigital is seeking a passionate and experienced Full Stack Developer to join our technology team. The ideal candidate will have strong backend development expertise in Python (Flask or Django), along with experience in MySQL, frontend development, and data visualization. You will be responsible for building and maintaining scalable internal tools and dashboards that support business operations and insights.

Key Responsibilities:
- Backend Development: Develop RESTful APIs using Flask or Django; implement authentication, authorization, and business logic; write scalable and modular Python code.
- Database Management: Design and optimize MySQL schemas; perform migrations, indexing, and tuning for performance.
- Frontend Development: Build intuitive user interfaces with HTML, CSS, and JavaScript; integrate frontend components with backend APIs.
- Data Visualization: Use Pandas, NumPy, Plotly, and Matplotlib for analysis and dashboarding; deliver real-time insights through dynamic visualizations.
- Deployment & DevOps: Deploy applications on AWS (EC2, RDS, S3); manage environment variables, backups, and basic CI/CD pipelines.

Required Skills:
- Minimum 2 years of hands-on experience in Flask or Django.
- Proficiency in MySQL (queries, joins, procedures).
- Strong understanding of REST APIs and JSON.
- Experience in frontend development with HTML, CSS, and JavaScript.
- Data analysis and visualization using Pandas, NumPy, Plotly, and Matplotlib.
- AWS deployment experience.
- Familiarity with Git and collaborative workflows.

Nice to Have:
- Exposure to React or Vue.js.
- Experience with Docker or containerized deployments.
- Familiarity with background jobs (Celery, cron).
- Understanding of secure coding practices.

Immediate joiners preferred. Work Location: Pune (onsite/hybrid as per project needs).

Send your updated resume to: kalaivanan.balasubramaniam@covalensedigital.com

Thanks,
Kalai
8015302990
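The REST API and JSON emphasis can be illustrated with a framework-free sketch of shaping a dashboard response; in the actual role this logic would live inside a Flask or Django view, and the field names are hypothetical:

```python
import json
from datetime import date

def dashboard_payload(rows: list[dict]) -> str:
    """Serialize dashboard rows into the JSON envelope an API might return."""
    body = {
        "generated_on": date.today().isoformat(),
        "count": len(rows),
        "results": rows,
    }
    return json.dumps(body)

payload = dashboard_payload([{"metric": "signups", "value": 42}])
data = json.loads(payload)
print(data["count"], data["results"][0]["metric"])  # 1 signups
```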

Posted 1 day ago

Apply

7.0 - 12.0 years

15 - 25 Lacs

Hyderabad, Bengaluru, Mumbai (All Areas)

Hybrid


Job Summary: We are seeking a highly motivated and experienced Senior Data Engineer to join our team. This role requires a deep curiosity about our business and a passion for technology and innovation. You will be responsible for designing and developing robust, scalable data engineering solutions that drive our business intelligence and data-driven decision-making processes. If you thrive in a dynamic environment and have a strong desire to deliver top-notch data solutions, we want to hear from you.

Key Responsibilities:
- Collaborate with agile teams to design and develop cutting-edge data engineering solutions.
- Build and maintain distributed, low-latency, and reliable data pipelines, ensuring high availability and timely delivery of data.
- Design and implement optimized data engineering solutions for Big Data workloads to handle increasing data volumes and complexities.
- Develop high-performance real-time data ingestion solutions for streaming workloads.
- Adhere to best practices and established design patterns across all data engineering initiatives.
- Ensure code quality through elegant design, efficient coding, and performance optimization.
- Focus on data quality and consistency by implementing monitoring processes and systems.
- Produce detailed design and test documentation, including Data Flow Diagrams, Technical Design Specs, and Source-to-Target Mapping documents.
- Perform data analysis to troubleshoot and resolve data-related issues.
- Automate data engineering pipelines and data validation processes to eliminate manual intervention.
- Implement data security and privacy measures, including access controls, key management, and encryption techniques.
- Stay updated on technology trends, experiment with new tools, and educate team members.
- Collaborate with analytics and business teams to improve data models and enhance data accessibility.
- Communicate effectively with both technical and non-technical stakeholders.

Qualifications:
- Education: Bachelor's degree in Computer Science, Computer Engineering, or a related field.
- Experience: Minimum of 5 years in architecting, designing, and building data engineering solutions and data platforms.
- Proven experience building Lakehouses or Data Warehouses on platforms like Databricks or Snowflake.
- Expertise in designing and building highly optimized batch/streaming data pipelines using Databricks.
- Proficiency with data acquisition and transformation tools such as Fivetran and dbt.
- Strong experience building efficient data engineering pipelines using Python and PySpark.
- Experience with distributed data processing frameworks such as Apache Hadoop, Apache Spark, or Flink.
- Familiarity with real-time data stream processing using tools like Apache Kafka, Kinesis, or Spark Structured Streaming.
- Experience with various AWS services, including S3, EC2, EMR, Lambda, RDS, DynamoDB, Redshift, and Glue Catalog.
- Expertise in advanced SQL programming and performance tuning.

Key Skills:
- Strong problem-solving abilities and perseverance in the face of ambiguity.
- Excellent emotional intelligence and interpersonal skills.
- Ability to build and maintain productive relationships with internal and external stakeholders.
- A self-starter mentality with a focus on growth and quick learning.
- Passion for operational products and creating outstanding employee experiences.
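The real-time ingestion work described above can be sketched as a tumbling-window aggregation in plain Python; in production this would run on Spark Structured Streaming or Kinesis, and the event shape is hypothetical:

```python
from collections import defaultdict

def tumbling_window_counts(events: list[tuple[int, str]],
                           window_s: int = 60) -> dict:
    """Count events per (window_start, key), as a streaming job would
    when bucketing an unbounded event stream into fixed windows."""
    counts: dict[tuple[int, str], int] = defaultdict(int)
    for ts, key in events:
        window_start = (ts // window_s) * window_s
        counts[(window_start, key)] += 1
    return dict(counts)

events = [(5, "click"), (30, "click"), (65, "click"), (70, "view")]
print(tumbling_window_counts(events))
# {(0, 'click'): 2, (60, 'click'): 1, (60, 'view'): 1}
```

A real streaming engine adds the parts deliberately elided here: watermarks for late events, state checkpointing, and incremental emission of window results.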

Posted 2 days ago

Apply

5.0 - 10.0 years

16 - 20 Lacs

Mumbai, Goregaon

Work from Office


Role Overview: We are seeking a highly skilled Engineering Manager with deep expertise in the MERN stack (MongoDB, Express, React, Node.js), AWS infrastructure, and DevOps practices. This role requires both hands-on technical leadership and strong people management to lead a team of engineers building scalable, high-performance applications.

Key Responsibilities:
- Lead, mentor, and manage a team of full-stack developers working primarily with MERN.
- Own architecture decisions, code quality, and engineering practices across multiple microservices.
- Collaborate with Product, Design, and QA teams to define and deliver on product roadmaps.
- Implement CI/CD pipelines, infrastructure as code, and automated testing strategies.
- Ensure system scalability, security, and performance optimization across services.
- Drive sprint planning, code reviews, and technical documentation standards.
- Work closely with DevOps to maintain uptime and operational excellence.

Required Skills:
- 6+ years of experience with full-stack JavaScript development (MERN stack).
- 2+ years in a leadership/managerial role.
- Strong understanding of Node.js backend and API development.
- Hands-on with React.js, component design, and front-end state management.
- Proficient in MongoDB and designing scalable NoSQL schemas.
- Experience with AWS services (EC2, S3, RDS, Lambda, CloudWatch, IAM).
- Working knowledge of Docker, GitHub Actions, or similar CI/CD tools.
- Familiarity with monitoring tools like New Relic, Datadog, or Prometheus.
- Solid experience managing agile workflows and team velocity.

Posted 2 days ago

Apply

0.0 - 1.0 years

2 - 3 Lacs

Bengaluru

Work from Office


Responsibilities: * Collaborate with cross-functional teams on project delivery. * Develop backend solutions using Python, FastAPI & AWS. * Optimize performance through Redis DB & Nginx.
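The Redis-based performance optimization mentioned above typically follows a cache-aside pattern; here is a minimal sketch with an in-process dict standing in for Redis (the TTL value is hypothetical):

```python
import time

class TTLCache:
    """Minimal cache-aside store; in production this would be Redis."""
    def __init__(self, ttl_s: float = 30.0):
        self.ttl_s = ttl_s
        self._store: dict[str, tuple[float, object]] = {}

    def get(self, key):
        entry = self._store.get(key)
        if entry is None:
            return None
        expires_at, value = entry
        if time.monotonic() > expires_at:
            del self._store[key]  # lazily evict the stale entry
            return None
        return value

    def set(self, key, value):
        self._store[key] = (time.monotonic() + self.ttl_s, value)

cache = TTLCache(ttl_s=0.05)
cache.set("user:1", {"name": "A"})
print(cache.get("user:1"))  # {'name': 'A'}
```

With real Redis the same flow is `GET` on a miss-prone key, then `SETEX` after a database fetch; the TTL bounds how stale a served value can be.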

Posted 2 days ago

Apply

3.0 - 6.0 years

40 - 45 Lacs

Kochi, Kolkata, Bhubaneswar

Work from Office


We are seeking experienced Data Engineers with over 3 years of experience to join our team at Intuit, through Cognizant. The selected candidates will be responsible for developing and maintaining scalable data pipelines, managing data warehousing solutions, and working with advanced cloud environments. The role requires strong technical proficiency and the ability to work onsite in Bangalore.

Key Responsibilities:
- Design, build, and maintain data pipelines to ingest, process, and analyze large datasets using PySpark.
- Work on Data Warehouse and Data Lake solutions to manage structured and unstructured data.
- Develop and optimize complex SQL queries for data extraction and reporting.
- Leverage AWS cloud services such as S3, EC2, EMR, Athena, and Redshift for data storage, processing, and analytics.
- Collaborate with cross-functional teams to ensure the successful delivery of data solutions that meet business needs.
- Monitor data pipelines and troubleshoot any issues related to data integrity or system performance.

Required Skills:
- 3+ years of experience in data engineering or related fields.
- In-depth knowledge of Data Warehouses and Data Lakes.
- Proven experience building data pipelines using PySpark.
- Strong expertise in SQL for data manipulation and extraction.
- Familiarity with AWS cloud services, including S3, EC2, EMR, Athena, Redshift, and other cloud computing platforms.

Preferred Skills:
- Python programming experience is a plus.
- Experience working in Agile environments with tools like JIRA and GitHub.

Posted 2 days ago

Apply

8.0 - 10.0 years

12 - 20 Lacs

Chennai

Hybrid


We are looking for a skilled Solutions Engineer with 8-10 years of experience in technical leadership, software architecture, and design. In this role, you'll have the opportunity to design and develop scalable, reliable software solutions, utilizing a range of AWS services and open-source technologies. If you're passionate about creating impactful solutions and bridging technical and business needs, we'd love to hear from you!

Key Responsibilities:
- Lead the architecture and design of complex, scalable software solutions.
- Develop and implement robust applications using AWS Lambda, Python, EC2, S3, PHP, Laravel, serverless microservices, and open-source software. (Experience with Drupal is a plus.)
- Apply Agile methodologies and DevOps principles to ensure efficient development cycles.
- Optimize AWS cloud solutions to balance performance and cost.
- Collaborate with non-technical stakeholders to explain technical concepts effectively.

Qualifications:
- Bachelor's degree in Computer Science, Information Technology, or a related technical field.
- 8-10 years of experience in a technical leadership role focused on software architecture and design.
- Proficiency in software development, with strong experience in AWS cloud services and programming languages.
- Excellent problem-solving and critical-thinking skills.
- Effective communication and interpersonal skills.
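The AWS Lambda development mentioned above reduces locally to a plain handler function, which keeps it unit-testable; the event below mimics the API Gateway proxy request shape, and the field names are hypothetical:

```python
import json

def lambda_handler(event: dict, context=None) -> dict:
    """Hypothetical handler: return a greeting for an API Gateway request."""
    try:
        body = json.loads(event.get("body") or "{}")
        name = body.get("name", "world")
        return {"statusCode": 200,
                "body": json.dumps({"message": f"hello {name}"})}
    except json.JSONDecodeError:
        return {"statusCode": 400,
                "body": json.dumps({"error": "invalid JSON"})}

resp = lambda_handler({"body": json.dumps({"name": "dev"})})
print(resp["statusCode"], json.loads(resp["body"])["message"])  # 200 hello dev
```

Because the handler is just a function of a dict, it can be exercised in CI without any AWS resources; deployment wiring (IAM role, API Gateway route) stays in infrastructure code.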

Posted 2 days ago

Apply

5.0 - 10.0 years

14 - 18 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office


- 5+ years of working experience in Python.
- 4+ years of hands-on experience with AWS development: PySpark, Lambdas, CloudWatch (alerts), SNS, SQS, CloudFormation, Docker, ECS, Fargate, and ECR.
- Very strong hands-on knowledge of using Python for integrations between systems through different data formats.
- Expert in deploying and maintaining applications in AWS; hands-on experience with Kinesis streams and auto-scaling.
- Team player with very good written and communication skills.
- Strong problem-solving and decision-making skills.
- Ability to solve complex software system issues.
- Collaborate with business and other teams to understand business requirements and work on project deliverables.
- Participate in requirements gathering and understanding.
- Design solutions based on the available framework and code.
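Using Python for integrations between systems through different data formats, as the posting puts it, can be sketched with a stdlib CSV-to-JSON bridge (the column names are hypothetical):

```python
import csv
import io
import json

def csv_to_json(csv_text: str) -> str:
    """Convert a CSV export from one system into JSON for another."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    return json.dumps(rows)

csv_text = "id,status\n1,ok\n2,failed\n"
print(csv_to_json(csv_text))
# [{"id": "1", "status": "ok"}, {"id": "2", "status": "failed"}]
```

In an AWS setting the same transform would typically read the CSV from S3 and publish each JSON record to SNS or SQS; the parsing core is unchanged.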

Posted 2 days ago

Apply

4.0 - 9.0 years

0 - 3 Lacs

Hyderabad, Pune, Bengaluru

Hybrid


Dear Candidate, Season's Greetings!

Role: Python Data Engineer
Work Nature: Contract to hire (3rd-party payroll)
Work Location: Pan India
Total Experience: 4+ years; immediate joiners only
Email: Mounika.t@firstmeridianglobal.com

Job Description:
- 4+ years of experience in backend development with Python.
- Strong experience with AWS services and cloud architecture.
- Proficiency in developing RESTful APIs and microservices.
- Experience with database technologies such as SQL, PostgreSQL, and NoSQL databases.
- Familiarity with containerization and orchestration tools like Docker and Kubernetes.
- Knowledge of CI/CD pipelines and tools such as Jenkins, GitLab CI, or AWS CodePipeline.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Interested candidates: please share your updated resume, along with a photo, PAN card, and PF member service history, to the email above, including the following details:
- Full Name (as per government proofs)
- Contact Number / Alternate Contact Number
- Email ID
- Date of Birth
- Father's Name
- Total Experience / Relevant Experience
- PAN Card Number
- Current CTC / Expected CTC
- Current Work Location / Preferred Location
- Open for Relocation (Yes/No)
- Current Company Name
- Notice Period
- Mode of Employment (Contract/Permanent); if contract, please provide the payroll company

Do you know anyone who could be interested in this profile? Feel free to forward this email; you might make their day.

Regards,
Mounika
Mounika.t@firstmeridianglobal.com

Posted 2 days ago

Apply

11.0 - 20.0 years

25 - 40 Lacs

Hyderabad, Chennai, Greater Noida

Hybrid


Primary Skills:
- Proficiency in AWS services: deep knowledge of EC2, S3, RDS, Lambda, VPC, IAM, AWS EventBridge, AWS B2Bi (EDI Generator), CloudFormation, and more.
- Cloud architecture design: ability to design scalable, resilient, and cost-optimized architectures.
- Networking & connectivity: understanding of VPC peering, Direct Connect, Route 53, and load balancing.
- Security & compliance: implementing IAM policies, encryption, KMS, and compliance frameworks like HIPAA or GDPR.
- Infrastructure as Code (IaC): using tools like AWS CloudFormation or Terraform to automate deployments.
- DevOps integration: familiarity with CI/CD pipelines, AWS CodePipeline, and container orchestration (ECS, EKS).
- Cloud migration: planning and executing lift-and-shift or re-architecting strategies for cloud adoption.
- Monitoring & optimization: using CloudWatch, X-Ray, and Trusted Advisor for performance tuning and cost control.

Secondary Skills:
- Programming skills: Python, Java, or Node.js for scripting and automation.
- Serverless architecture: designing with Lambda, API Gateway, and Step Functions.
- Cost management: understanding pricing models (On-Demand, Reserved, Spot) and using Cost Explorer.
- Disaster recovery & high availability: Multi-AZ deployments, backups, and failover strategies.
- Soft skills: communication, stakeholder management, and documentation.
- Team collaboration: working with DevOps, security, and development teams to align cloud goals.
- Certifications: AWS Certified Solutions Architect Associate/Professional, and optionally DevOps Engineer or Security Specialty.

Posted 3 days ago

Apply

5.0 - 10.0 years

10 - 20 Lacs

Hyderabad, Bengaluru

Work from Office


Role & responsibilities Hands-on experience with AWS services including S3, Glue, API Gateway, and SQS. Strong skills in data engineering on AWS, with proficiency in Python, PySpark, and SQL. Experience with batch job scheduling and managing data dependencies. Knowledge of data processing tools like Spark and Airflow. Automate repetitive tasks and build reusable frameworks to improve efficiency. Provide Run/DevOps support and manage the ongoing operation of data services.
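"Batch job scheduling and managing data dependencies" is, at its core, a topological-ordering problem: a job runs only after every dataset it depends on has been produced. A minimal sketch using Python's standard-library `graphlib` (the job names are hypothetical; orchestrators like Airflow build the same DAG idea into a full scheduler):

```python
from graphlib import TopologicalSorter

# Map each job to the set of jobs it depends on.
# Names are hypothetical examples of a small ETL pipeline.
dependencies = {
    "extract_orders": set(),
    "extract_customers": set(),
    "transform_sales": {"extract_orders", "extract_customers"},
    "load_warehouse": {"transform_sales"},
}

# static_order() yields a valid execution order, dependencies first;
# it raises CycleError if the graph contains a cycle.
run_order = list(TopologicalSorter(dependencies).static_order())
print(run_order)
```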

Posted 3 days ago

Apply

6.0 - 8.0 years

8 - 12 Lacs

Mumbai, Bengaluru, Delhi / NCR

Work from Office


Skills Required Experience in designing and building a serverless data lake solution using a layered component architecture covering ingestion, storage, processing, security & governance, data cataloguing & search, and the consumption layer. Hands-on experience with AWS serverless technologies such as Lake Formation, Glue, Glue Python, Glue Workflows, Step Functions, S3, Redshift, QuickSight, Athena, AWS Lambda, and Kinesis. Must have experience in Glue. Experience designing, building, orchestrating, and deploying multi-step data processing pipelines using Python and Java. Experience in managing source data access security, configuring authentication and authorisation, and enforcing data policies and standards. Experience in AWS environment setup and configuration. Minimum 6 years of relevant experience, with at least 3 years building solutions using AWS. Ability to work under pressure and commitment to meet customer expectations. Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad.
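Multi-step serverless pipelines of the kind described here are commonly orchestrated as a Step Functions state machine. As a hedged sketch, the dictionary below follows the Amazon States Language shape that `CreateStateMachine` accepts; the state names, job name, and function name are hypothetical placeholders:

```python
import json

# Hypothetical two-step pipeline: run a Glue job, then notify consumers
# via a Lambda invocation. The ".sync" service integration makes Step
# Functions wait for the Glue job run to finish before moving on.
state_machine = {
    "Comment": "Minimal serverless data-lake pipeline sketch",
    "StartAt": "RunGlueJob",
    "States": {
        "RunGlueJob": {
            "Type": "Task",
            "Resource": "arn:aws:states:::glue:startJobRun.sync",
            "Parameters": {"JobName": "example-curate-job"},
            "Next": "PublishToConsumers",
        },
        "PublishToConsumers": {
            "Type": "Task",
            "Resource": "arn:aws:states:::lambda:invoke",
            "Parameters": {"FunctionName": "example-publish-fn"},
            "End": True,
        },
    },
}

# CreateStateMachine takes the definition as a JSON string.
definition_json = json.dumps(state_machine)
```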

Posted 3 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Hyderabad, Bengaluru

Work from Office


About our team DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technology fellows great opportunities to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills this team should encompass are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way; and be futuristic in building systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals:

Data Platform This vertical is responsible for building the data platform, which includes optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions like EMR optimizers; automations; and observability capabilities for Kotak's data platform.
The team will also be the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based, programmatic way, and think big to build one of the most leveraged data models for financial orgs. This team will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple systems, then this is the team for you. Your day-to-day role will include: Drive business decisions with technical input and lead the team. Design, implement, and support a data infrastructure from scratch. Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA. Extract, transform, and load data from various sources using SQL and AWS big data technologies. Explore and learn the latest AWS technologies to enhance capabilities and efficiency. Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis. Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers. Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer Bachelor's degree in Computer Science, Engineering, or a related field. 3-5 years of experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering and best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills. For Managers: customer centricity and obsession for the customer; ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working; ability to structure and organize teams and streamline communication; prior work experience executing large-scale Data Engineering projects.

Posted 3 days ago

Apply

10.0 - 11.0 years

20 - 22 Lacs

Pune

Work from Office


Strong programming knowledge in Python, Java, and JavaScript. Experience with AWS services including Lambda, S3, API Gateway, and IAM. Hands-on experience with Infrastructure as Code using Terraform. Experience integrating backend services with Amazon API Gateway.

Posted 3 days ago

Apply

5.0 - 8.0 years

7 - 10 Lacs

Bengaluru

Work from Office


We are seeking a highly skilled Senior Data Engineer to join our dynamic team in Bangalore. You will design, develop, and maintain scalable data ingestion frameworks and ELT pipelines using tools such as DBT, Apache Airflow, and Prefect. The ideal candidate will have deep technical expertise in cloud platforms (especially AWS), data architecture, and orchestration tools. You will work with modern cloud data warehouses like Snowflake, Redshift, or Databricks and integrate pipelines with AWS services such as S3, Lambda, Step Functions, and Glue. A strong background in SQL, scripting, and CI/CD practices is essential. Experience with data systems in manufacturing is a plus.

Posted 4 days ago

Apply

5.0 - 7.0 years

9 - 12 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid


Hiring Data Engineers with 3+ yrs in Databricks, PySpark, Delta Lake, and AWS (S3, Glue, Redshift, Lambda, EMR). Must have strong SQL/Python, CI/CD, and data pipeline experience. Only Tier-1 company backgrounds are considered.

Posted 4 days ago

Apply

5.0 - 10.0 years

7 - 12 Lacs

Chennai

Remote


Location: 100% Remote Employment Type: Full-Time Must have own laptop and internet connection Work hours: 11 AM to 8 PM IST Position Summary: We are looking for a highly skilled and self-driven Full Stack Developer with deep expertise in React.js, Node.js, and AWS cloud services. The ideal candidate will play a critical role in designing, developing, and deploying full-stack web applications in a secure and scalable cloud environment. Key Responsibilities: Design and develop scalable front-end applications using React.js and modern JavaScript/TypeScript frameworks. Build and maintain robust backend services using Node.js, Express, and RESTful APIs. Architect and deploy full-stack solutions on AWS using services such as Lambda, API Gateway, ECS, RDS, S3, CloudFormation, CloudWatch, and DynamoDB. Ensure application performance, security, scalability, and maintainability. Work collaboratively in Agile/Scrum environments and participate in sprint planning, code reviews, and daily standups. Integrate CI/CD pipelines and automate testing and deployment workflows using AWS-native tools or services like Jenkins, CodeBuild, or GitHub Actions. Troubleshoot production issues, optimize system performance, and implement monitoring and alerting solutions. Maintain clean, well-documented, and reusable code and technical documentation. Required Qualifications: 5+ years of professional experience as a full-stack developer. Strong expertise in React.js (Hooks, Context, Redux, etc.). Advanced backend development experience with Node.js and related frameworks. Proven hands-on experience designing and deploying applications on AWS Cloud. Solid understanding of RESTful services, microservices architecture, and cloud-native design. Experience working with relational and NoSQL databases (PostgreSQL, MySQL, DynamoDB). Proficient in Git and modern DevOps practices (CI/CD, Infrastructure as Code, etc.). Strong communication skills and ability to collaborate in distributed teams.

Posted 4 days ago

Apply

7.0 - 9.0 years

15 - 22 Lacs

Chennai

Work from Office


Key Responsibilities
• Application Development: Design, develop, and maintain dealership applications using Java, Spring Boot, and Microservices architecture.
• Cloud Deployment: Build and manage cloud-native applications on AWS, leveraging services such as Lambda, ECS, RDS, S3, DynamoDB, and API Gateway.
• System Architecture: Develop and maintain scalable, high-availability architectures to support large-scale dealership operations.
• Database Management: Work with relational and NoSQL databases like PostgreSQL and DynamoDB for efficient data storage and retrieval.
• Integration & APIs: Develop and manage RESTful APIs and integrate with third-party services, including OEMs and dealership management systems (DMS).
• DevOps & CI/CD: Implement CI/CD pipelines using tools like Jenkins, GitHub Actions, or AWS CodePipeline for automated testing and deployments.
• Security & Compliance: Ensure application security, data privacy, and compliance with industry regulations.
• Collaboration & Mentorship: Work closely with product managers, designers, and other engineers. Mentor junior developers to maintain best coding practices.

Technical Skills Required
• Programming Languages: Java (8+), Spring Boot, Hibernate
• Cloud Platforms: AWS (Lambda, S3, EC2, ECS, RDS, DynamoDB, CloudFormation)
• Microservices & API Development: RESTful APIs, API Gateway
• Database Management: PostgreSQL, DynamoDB
• DevOps & Automation: Docker, Kubernetes, Terraform, CI/CD pipelines
• Testing & Monitoring: JUnit, Mockito, Prometheus, CloudWatch
• Version Control & Collaboration: Git, GitHub, Jira, Confluence

Nice-to-Have Skills
• Experience with serverless architectures (AWS Lambda, Step Functions)
• Exposure to event-driven architectures (Kafka, SNS, SQS)

Qualifications
• Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field
• 5+ years of hands-on experience in Java-based backend development
• Strong problem-solving skills with a focus on scalability and performance.

Posted 4 days ago

Apply


Exploring S3 Jobs in India

The job market for S3 professionals in India is growing rapidly with the increasing demand for cloud computing services. Companies are looking for skilled individuals who can effectively manage and optimize their S3 storage solutions. If you are considering a career in S3, here is a detailed guide to help you navigate the job market in India.

Top Hiring Locations in India

  1. Bangalore
  2. Pune
  3. Hyderabad
  4. Chennai
  5. Mumbai

Average Salary Range

The average salary range for S3 professionals in India varies based on experience level:

  • Entry-level: INR 4-6 lakhs per annum
  • Mid-level: INR 8-12 lakhs per annum
  • Experienced: INR 15-20 lakhs per annum

Career Path

A typical career path in S3 may include the following progression:

  1. Junior S3 Engineer
  2. S3 Developer
  3. S3 Architect
  4. S3 Specialist
  5. S3 Consultant

Related Skills

In addition to expertise in S3, employers often look for professionals with the following skills:

  • Cloud computing knowledge
  • AWS certification
  • Data management skills
  • Programming skills (e.g., Python, Java)
  • Problem-solving abilities

Interview Questions

  • What is Amazon S3 and how does it work? (basic)
  • How do you secure S3 buckets? (medium)
  • Explain the difference between S3 and EBS. (medium)
  • How can you improve the performance of S3? (medium)
  • What are the different storage classes in S3? (basic)
  • What is the maximum size of an S3 object? (basic)
  • How would you troubleshoot slow S3 performance? (medium)
  • What is versioning in S3 and why is it useful? (basic)
  • Explain the significance of object lifecycle policies in S3. (advanced)
  • How do you monitor S3 storage usage and performance? (medium)
  • Describe the process of transferring data to and from S3. (basic)
  • What is cross-region replication in S3? (medium)
  • How do you handle encryption in S3? (medium)
  • What are the limitations of S3? (medium)
  • How do you handle data consistency in S3? (advanced)
  • Explain the concept of event notifications in S3. (medium)
  • How do you manage permissions in S3? (basic)
  • What is the difference between S3 and EFS? (medium)
  • How can you optimize costs in S3 storage? (medium)
  • Describe the process of hosting a static website on S3. (medium)
  • What is the significance of multipart uploads in S3? (medium)
  • How do you handle versioning conflicts in S3? (advanced)
  • Explain the concept of pre-signed URLs in S3. (advanced)
  • How do you handle data archiving in S3? (medium)
  • What are the best practices for S3 bucket naming conventions? (basic)
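Several of these questions reward knowing S3's published limits concretely. For the multipart-upload question, for instance: per AWS documentation, each part (except the last) must be at least 5 MiB, an upload can have at most 10,000 parts, and an object can be up to 5 TiB. A back-of-the-envelope helper for picking a part size, in pure Python with no AWS calls:

```python
import math

# S3 multipart-upload limits, per AWS's published documentation:
MIN_PART_SIZE = 5 * 1024**2    # 5 MiB minimum part size (except the last part)
MAX_PARTS = 10_000             # at most 10,000 parts per upload
MAX_OBJECT_SIZE = 5 * 1024**4  # 5 TiB maximum object size

def choose_part_size(object_size: int) -> int:
    """Pick the smallest part size that fits the object into at most
    MAX_PARTS parts while respecting the 5 MiB minimum."""
    if not 0 < object_size <= MAX_OBJECT_SIZE:
        raise ValueError("object size out of S3's supported range")
    return max(MIN_PART_SIZE, math.ceil(object_size / MAX_PARTS))

# At the 5 MiB minimum, a 100 GiB object would need
# 100 GiB / 5 MiB = 20,480 parts, which exceeds the 10,000-part
# cap, so the part size must grow:
part_size = choose_part_size(100 * 1024**3)
print(part_size, math.ceil(100 * 1024**3 / part_size))
```

The same limits explain *why* multipart matters: a single PUT tops out at 5 GiB, so anything larger must be uploaded in parts, and parts can be retried or uploaded in parallel.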

Closing Remark

As you explore opportunities in the S3 job market in India, remember to showcase your expertise, skills, and knowledge confidently during interviews. With the right preparation and a positive attitude, you can excel in S3 roles and contribute effectively to the growing field of cloud computing. Good luck in your job search!

cta

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies