1.0 - 5.0 years
9 - 13 Lacs
Bengaluru
Work from Office
Minimum Qualifications:
- BA/BSc/B.E./BTech degree from a Tier I/II college in Computer Science, Statistics, Mathematics, Economics, or a related field
- 1 to 4 years of experience working with data and conducting statistical and/or numerical analysis
- Strong understanding of how data can be stored and accessed in different structures
- Experience writing computer programs to solve problems
- Strong understanding of data operations such as sub-setting, sorting, merging, aggregating, and CRUD operations
- Ability to write SQL code and familiarity with R/Python and Linux shell commands
- Willing and able to quickly learn about new businesses, database technologies, and analysis techniques
- Ability to tell a good story and support it with numbers and visuals
- Strong oral and written communication skills
Preferred Qualifications:
- Experience working with large datasets
- Experience with AWS analytics infrastructure (Redshift, S3, Athena, Boto3)
- Experience building analytics applications leveraging R, Python, Tableau, Looker, or other tools
- Experience in geospatial analysis with PostGIS and QGIS
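For illustration, the core data operations named above (sub-setting, sorting, merging, aggregating) map directly onto pandas primitives. A minimal sketch; the table and column names are invented:

    import pandas as pd

    # Hypothetical order and customer tables; names are illustrative only.
    orders = pd.DataFrame({"order_id": [1, 2, 3], "cust_id": [10, 10, 20], "amount": [250.0, 90.5, 40.0]})
    customers = pd.DataFrame({"cust_id": [10, 20], "city": ["Bengaluru", "Pune"]})

    subset = orders[orders["amount"] > 50]                      # sub-setting
    merged = subset.merge(customers, on="cust_id", how="left")  # merging
    summary = (merged.groupby("city")["amount"]                 # aggregating
                     .agg(total="sum", orders="count")
                     .sort_values("total", ascending=False))    # sorting
    print(summary)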
Posted 1 month ago
0.0 - 1.0 years
4 - 8 Lacs
Gurugram
Work from Office
Job Description: Good understanding of the Python / Django / Flask tech stack with exposure to RDBMS. Understanding of OOPs and programming fundamentals. Should be able to write efficient algorithms to solve business problems, be flexible enough to cut across programming languages to solve a problem end to end, and work with a cross-stack dev team. Should be ready to work on high-availability, complex business systems, with readiness to learn and contribute each day.
Experience: 0 to 2 years
Location: Gurgaon
Qualification: BE / BTech / MCA / MTech in Computer Science or a related stream
Competencies: Drive for results, very high on aptitude, analytically sharp, and eager to learn new technologies.
Job Responsibilities:
- Passionate about programming; ready to solve real-world challenges with efficient coding using the open-source stack.
- Ready to work in a challenging environment where technology is no bar.
- Learn and improvise on the fly, as every day brings new challenges.
Who you are:
- Understand project requirements as provided in the design documents and develop the application modules to meet them.
- Work with developers and architects to ensure bug-free and timely delivery.
- Follow coding best practices and guidelines.
- Support live systems with enhancements, maintenance, and/or bug fixes.
- Conduct unit testing / implement unit test cases.
- Passionate about work and delivering quality results.
- Strong programming and problem-solving skills.
- Good understanding of OOPs / Python / Django and/or Flask.
- Knowledge of the AWS serverless stack (Lambda, DynamoDB, SQS, S3) would be a value add.
- Knowledge of REST/JSON APIs and/or SOAP/XML web services.
- Experience with GitHub and advanced GitHub features (good to have).
*Should be available to join within 30 days from the date of offer.
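As a flavour of the stack this role works in, a minimal Flask REST/JSON endpoint sketch; the resource name, payload shape, and in-memory store are invented for illustration:

    from flask import Flask, jsonify, request

    app = Flask(__name__)
    ORDERS = {}  # in-memory store for the sketch; a real service would use an RDBMS

    @app.route("/orders", methods=["POST"])
    def create_order():
        payload = request.get_json(force=True)
        order_id = len(ORDERS) + 1
        ORDERS[order_id] = payload
        return jsonify({"id": order_id, **payload}), 201

    @app.route("/orders/<int:order_id>", methods=["GET"])
    def get_order(order_id):
        order = ORDERS.get(order_id)
        return (jsonify(order), 200) if order else (jsonify({"error": "not found"}), 404)

    if __name__ == "__main__":
        app.run(debug=True)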
Posted 1 month ago
3.0 - 8.0 years
10 - 18 Lacs
Kolkata, Hyderabad, Pune
Work from Office
JD is below:
- Design, develop, and deploy generative AI based applications using AWS Bedrock.
- Proficiency in prompt engineering and RAG pipelines.
- Experience building agentic generative AI applications.
- Fine-tune and optimize foundation models from AWS Bedrock for various use cases.
- Integrate generative AI capabilities into enterprise applications and workflows.
- Collaborate with cross-functional teams, including data scientists, ML engineers, and software developers, to implement AI-powered solutions.
- Utilize AWS services (S3, Lambda, SageMaker, etc.) to build scalable AI solutions.
- Develop APIs and interfaces to enable seamless interaction with AI models.
- Monitor model performance, conduct A/B testing, and enhance AI-driven products.
- Ensure compliance with AI ethics, governance, and security best practices.
- Stay up to date with advancements in generative AI and AWS cloud technologies.
Required Skills & Qualifications:
- Bachelor's or Master's degree in Computer Science, AI, Machine Learning, or a related field.
- 3+ years of experience in AI/ML development, with a focus on generative AI.
- Hands-on experience with AWS Bedrock and foundation models.
- Proficiency in Python and ML frameworks.
- Experience with AWS services such as SageMaker, Lambda, API Gateway, DynamoDB, and S3.
- Experience with prompt engineering, model fine-tuning, and inference optimization.
- Familiarity with MLOps practices and CI/CD pipelines for AI deployment.
- Ability to work with large-scale datasets and optimize AI models for performance.
- Excellent problem-solving skills and ability to work in an agile environment.
Preferred Qualifications:
- AWS Certified Machine Learning – Specialty or equivalent certification.
- Experience in LLMOps and model lifecycle management.
- Knowledge of multi-modal AI models (text, image, video generation).
- Hands-on experience with other cloud AI platforms (Google Vertex AI, Azure OpenAI).
- Strong understanding of ethical AI principles and bias mitigation techniques.
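By way of illustration, invoking a Bedrock foundation model from Python typically looks like the sketch below via boto3's Converse API. The region, model ID, and prompt are placeholders, and it assumes model access has been enabled in the account:

    import boto3

    # bedrock-runtime is the data-plane client used for model inference
    client = boto3.client("bedrock-runtime", region_name="us-east-1")

    response = client.converse(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # illustrative model ID
        messages=[{"role": "user", "content": [{"text": "Summarise RAG in one sentence."}]}],
        inferenceConfig={"maxTokens": 200, "temperature": 0.2},
    )
    print(response["output"]["message"]["content"][0]["text"])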
Posted 1 month ago
3.0 - 5.0 years
7 - 9 Lacs
Bengaluru
Work from Office
We are looking for a skilled Senior Associate to join our team in Bengaluru, with 3-5 years of experience in AWS infrastructure solutions architecture. The ideal candidate will have a strong background in designing and implementing scalable cloud-based systems.
Roles and Responsibility:
- Design and implement secure, scalable, and highly available cloud-based systems using AWS services such as EC2, S3, EBS, and Lambda.
- Collaborate with cross-functional teams to identify business requirements and develop technical solutions that meet those needs.
- Develop and maintain technical documentation for cloud-based systems, including design documents and implementation guides.
- Troubleshoot and resolve complex technical issues related to cloud-based systems, ensuring minimal downtime and optimal system performance.
- Participate in code reviews to ensure high-quality code standards and adherence to best practices.
- Stay up to date with the latest trends and technologies in cloud computing, applying this knowledge to improve existing systems and processes.
Job Requirements:
- Strong understanding of cloud computing concepts, including IaaS, PaaS, and SaaS.
- Proficiency in programming languages such as Python, Java, or C++ is desirable.
- Experience with containerization using Docker and orchestration using Kubernetes is preferred.
- Knowledge of Agile methodologies and version control systems like Git is beneficial.
- Excellent problem-solving skills, with the ability to analyze complex technical issues and develop creative solutions.
- Strong communication and collaboration skills, with the ability to work effectively with cross-functional teams.
- At least 3 years of hands-on AWS cloud IaaS and PaaS experience.
- A seasoned candidate who can manage client requirements end to end (discovery, planning, design, implementation, and transition).
- Plan, develop, and configure AWS infrastructure from conceptualization through stabilization using various AWS tools, methodologies, and design best practices.
- Plan for data backup, disaster recovery, data privacy, and security requirements to ensure the solution remains secure and compliant with security standards and frameworks.
- Monitor, troubleshoot, and resolve infrastructure issues in the AWS cloud.
- Experience keeping cloud environments secure and proactively preventing downtime.
- Good knowledge of determining associated security risks and mitigation techniques.
- Ability to work both independently and in a multi-disciplinary team environment.
- Own the design documentation of the solution implemented, i.e., High-Level and Low-Level Design documents.
- Perform routine infrastructure analysis and evaluation of resource requirements necessary to maintain and/or improve SLAs.
- Strong problem-solving, customer service, and people skills.
- Excellent command of the English language (both verbal and written).
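A small example of the proactive security work described above: a boto3 sweep that flags S3 buckets with no default encryption configured. A sketch only, assuming credentials with the relevant s3 read permissions:

    import boto3
    from botocore.exceptions import ClientError

    s3 = boto3.client("s3")

    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            s3.get_bucket_encryption(Bucket=name)  # raises if no default encryption is set
        except ClientError as err:
            if err.response["Error"]["Code"] == "ServerSideEncryptionConfigurationNotFoundError":
                print(f"Bucket {name} has no default encryption configured")
            else:
                raise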
Posted 1 month ago
6.0 - 10.0 years
8 - 12 Lacs
Bengaluru
Work from Office
Notice Period: Immediate to 30 days
Mandatory Skills: Big Data, Python, SQL, Spark/PySpark, AWS Cloud
JD and required Skills & Responsibilities:
- Actively participate in all phases of the software development lifecycle, including requirements gathering, functional and technical design, development, testing, roll-out, and support.
- Solve complex business problems by utilizing a disciplined development methodology.
- Produce scalable, flexible, efficient, and supportable solutions using appropriate technologies.
- Analyse the source and target system data; map the transformations that meet the requirements.
- Interact with the client and onsite coordinators during different phases of a project.
- Design and implement product features in collaboration with business and technology stakeholders.
- Anticipate, identify, and solve issues concerning data management to improve data quality.
- Clean, prepare, and optimize data at scale for ingestion and consumption.
- Support the implementation of new data management projects and the restructuring of the current data architecture.
- Implement automated workflows and routines using workflow scheduling tools.
- Understand and use continuous integration, test-driven development, and production deployment frameworks.
- Participate in design, code, test plan, and dataset implementation reviews performed by other data engineers, in support of maintaining data engineering standards.
- Analyze and profile data for the purpose of designing scalable solutions.
- Troubleshoot straightforward data issues and perform root cause analysis to proactively resolve product issues.
Required Skills:
- 5+ years of relevant experience developing data and analytics solutions.
- Experience building data lake solutions leveraging one or more of the following: AWS, EMR, S3, Hive & PySpark.
- Experience with relational SQL.
- Experience with scripting languages such as Python.
- Experience with source control tools such as GitHub and the related dev process.
- Experience with workflow scheduling tools such as Airflow.
- In-depth knowledge of the AWS Cloud (S3, EMR, Databricks).
- A passion for data solutions.
- A strong problem-solving and analytical mindset.
- Working experience in the design, development, and testing of data pipelines.
- Experience working with Agile teams.
- Able to influence and communicate effectively, both verbally and in writing, with team members and business stakeholders.
- Able to quickly pick up new programming languages, technologies, and frameworks.
- Bachelor's degree in Computer Science.
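For context, a representative PySpark pipeline of the sort described: read raw data from an S3 data lake, filter and aggregate, and write a curated layer back. The bucket paths, schema, and column names are placeholders:

    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    spark = SparkSession.builder.appName("daily-aggregation").getOrCreate()

    # Paths and columns below are illustrative placeholders
    events = spark.read.parquet("s3://example-raw-bucket/events/")
    daily = (events
             .filter(F.col("event_type") == "purchase")
             .groupBy("event_date", "country")
             .agg(F.count("*").alias("purchases"), F.sum("amount").alias("revenue")))
    daily.write.mode("overwrite").partitionBy("event_date").parquet(
        "s3://example-curated-bucket/daily_purchases/")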
Posted 1 month ago
5.0 - 10.0 years
20 - 27 Lacs
Hyderabad
Work from Office
Position: Experienced Data Engineer
We are seeking a skilled and experienced Data Engineer to join our fast-paced and innovative Data Science team. This role involves building and maintaining data pipelines across multiple cloud-based data platforms.
Requirements: A minimum of 5 years of total experience, with at least 3-4 years specifically in data engineering on a cloud platform.
Key Skills & Experience:
- Proficiency with AWS services such as Glue, Redshift, S3, Lambda, RDS, Amazon Aurora, DynamoDB, EMR, Athena, Data Pipeline, and Batch.
- Strong expertise in SQL and Python; dbt and Snowflake; OpenSearch, Apache NiFi, and Apache Kafka.
- In-depth knowledge of ETL data patterns and Spark-based ETL pipelines.
- Advanced skills in infrastructure provisioning using Terraform and other Infrastructure-as-Code (IaC) tools.
- Hands-on experience with cloud-native delivery models, including PaaS, IaaS, and SaaS.
- Proficiency in Kubernetes, container orchestration, and CI/CD pipelines.
- Familiarity with GitHub Actions, GitLab, and other leading DevOps and CI/CD solutions.
- Experience with orchestration tools such as Apache Airflow and serverless/FaaS services.
- Exposure to NoSQL databases is a plus.
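Since Apache Kafka is among the listed skills, here is a minimal Python producer using the kafka-python package. The broker address, topic, and payload are placeholders:

    import json
    from kafka import KafkaProducer

    producer = KafkaProducer(
        bootstrap_servers="broker.example.com:9092",  # placeholder broker address
        value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    )

    producer.send("orders", {"order_id": 42, "status": "CREATED"})  # placeholder topic/payload
    producer.flush()  # block until pending messages are acknowledged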
Posted 1 month ago
10.0 - 14.0 years
12 - 17 Lacs
Hyderabad
Work from Office
Overview
We are seeking a highly skilled and motivated Associate Manager, AWS Site Reliability Engineer (SRE), to join our team. As an Associate Manager AWS SRE, you will play a critical role in designing, managing, and optimizing our cloud infrastructure to ensure high availability, reliability, and scalability of our services. You will collaborate with cross-functional teams to implement best practices, automate processes, and drive continuous improvements in our cloud environment.
Responsibilities
- Design and Implement Cloud Infrastructure: Architect, deploy, and maintain AWS infrastructure using Infrastructure-as-Code (IaC) tools such as Terraform or CloudFormation.
- Monitor and Optimize Performance: Develop and implement monitoring, alerting, and logging solutions to ensure the performance and reliability of our systems.
- Ensure High Availability: Design and implement strategies for achieving high availability and disaster recovery, including backup and failover mechanisms.
- Automate Processes: Automate repetitive tasks and processes to improve efficiency and reduce human error using tools such as AWS Lambda, Jenkins, and Ansible.
- Incident Response: Lead and participate in incident response activities, troubleshoot issues, and perform root cause analysis to prevent future occurrences.
- Security and Compliance: Implement and maintain security best practices and ensure compliance with industry standards and regulations.
- Collaborate with Development Teams: Work closely with software development teams to ensure smooth deployment and operation of applications in the cloud environment.
- Capacity Planning: Perform capacity planning and scalability assessments to ensure our infrastructure can handle growth and increased demand.
- Continuous Improvement: Drive continuous improvement initiatives by identifying and implementing new tools, technologies, and processes.
Qualifications
- Experience: 10+ years of overall experience, with a minimum of 5 years in a Site Reliability Engineer (SRE) or DevOps role focused on AWS cloud infrastructure.
- Technical Skills: Proficiency in AWS services such as EC2, S3, RDS, VPC, Lambda, CloudFormation, and CloudWatch.
- Automation Tools: Experience with Infrastructure-as-Code (IaC) tools such as Terraform or CloudFormation, and configuration management tools like Ansible or Chef.
- Scripting: Strong scripting skills in languages such as Python, Bash, or PowerShell.
- Monitoring and Logging: Experience with monitoring and logging tools such as Prometheus, Grafana, the ELK Stack, or CloudWatch.
- Problem-Solving: Excellent troubleshooting and problem-solving skills, with a proactive and analytical approach.
- Communication: Strong communication and collaboration skills, with the ability to work effectively in a team-oriented environment.
- Certifications: AWS certifications such as AWS Certified Solutions Architect, AWS Certified DevOps Engineer, or AWS Certified SysOps Administrator are highly desirable.
- Education: Bachelor's degree in Computer Science, Engineering, or a related field, or equivalent work experience.
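To illustrate the monitoring-and-alerting work described, a boto3 sketch that creates a CPU alarm on an EC2 instance. The instance ID, thresholds, and SNS topic ARN are placeholders:

    import boto3

    cloudwatch = boto3.client("cloudwatch")

    cloudwatch.put_metric_alarm(
        AlarmName="high-cpu-web-01",
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # placeholder
        Statistic="Average",
        Period=300,                 # 5-minute datapoints
        EvaluationPeriods=3,        # alarm after 15 minutes above threshold
        Threshold=80.0,
        ComparisonOperator="GreaterThanThreshold",
        AlarmActions=["arn:aws:sns:us-east-1:123456789012:oncall-alerts"],  # placeholder ARN
    )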
Posted 1 month ago
4.0 - 9.0 years
11 - 21 Lacs
Bengaluru
Hybrid
Java, Spring Boot, AWS, GraphQL, RDBMS (PostgreSQL), REST APIs; AWS services including EC2, EKS, S3, CloudWatch, Lambda, SNS, and SQS; JUnit/Jest; and AI tools like GitHub Copilot. Desirable: Node.js and Hasura frameworks.
Posted 1 month ago
5.0 - 10.0 years
10 - 20 Lacs
Bengaluru, Mumbai (All Areas)
Work from Office
Key Responsibilities:
- Design, develop, and optimize data pipelines using Python and AWS services such as Glue, Lambda, S3, EMR, Redshift, Athena, and Kinesis.
- Implement ETL/ELT processes to extract, transform, and load data from various sources into centralized repositories (e.g., data lakes or data warehouses).
- Collaborate with cross-functional teams to understand business requirements and translate them into scalable data solutions.
- Monitor, troubleshoot, and enhance data workflows for performance and cost optimization.
- Ensure data quality and consistency by implementing validation and governance practices.
- Work on data security best practices in compliance with organizational policies and regulations.
- Automate repetitive data engineering tasks using Python scripts and frameworks.
- Leverage CI/CD pipelines for deployment of data workflows on AWS.
Required Skills and Qualifications:
- Professional Experience: 5+ years of experience in data engineering or a related field.
- Programming: Strong proficiency in Python, with experience in libraries like pandas, PySpark, or boto3.
- AWS Expertise: Hands-on experience with core AWS services for data engineering, such as AWS Glue for ETL/ELT; S3 for storage; Redshift or Athena for data warehousing and querying; Lambda for serverless compute; Kinesis or SNS/SQS for data streaming; and IAM roles for security.
- Databases: Proficiency in SQL and experience with relational (e.g., PostgreSQL, MySQL) and NoSQL (e.g., DynamoDB) databases.
- Data Processing: Knowledge of big data frameworks (e.g., Hadoop, Spark) is a plus.
- DevOps: Familiarity with CI/CD pipelines and tools like Jenkins, Git, and CodePipeline.
- Version Control: Proficient with Git-based workflows.
- Problem Solving: Excellent analytical and debugging skills.
Optional Skills:
- Knowledge of data modeling and data warehouse design principles.
- Experience with data visualization tools (e.g., Tableau, Power BI).
- Familiarity with containerization (e.g., Docker) and orchestration (e.g., Kubernetes).
- Exposure to other programming languages like Scala or Java.
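To ground the serverless streaming side of this stack, a sketch of a Lambda handler that consumes Kinesis records and lands them in S3 as JSON Lines. The bucket name, key scheme, and record shape are assumptions:

    import base64
    import json
    import boto3

    s3 = boto3.client("s3")

    def handler(event, context):
        # Kinesis delivers base64-encoded payloads inside the event records
        rows = [json.loads(base64.b64decode(r["kinesis"]["data"])) for r in event["Records"]]
        body = "\n".join(json.dumps(row) for row in rows)
        key = f"landing/{context.aws_request_id}.jsonl"  # illustrative key scheme
        s3.put_object(Bucket="example-data-lake", Key=key, Body=body.encode("utf-8"))
        return {"records_written": len(rows)}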
Posted 1 month ago
5.0 - 10.0 years
0 - 3 Lacs
Hyderabad
Hybrid
Dear Candidate,
Warm greetings from SAIS IT Services! We are hiring a Java Developer for our client. Interested candidates can share their CV at Jyoti.r@saisservices.com; for any queries, kindly reach me on 8360298749 with the details below.
Please fill in the below details: Total Exp, CTC, ECTC, Notice Period, Current Location, Comfortable with Work from Office.
Job Description:
We're Hiring: Java Developer
Location: Hyderabad
Experience: 5+ years
Work Mode: Hybrid (3 days WFO)
Strong hands-on experience in Java + AWS (Lambda mandatory; S3, EC2), with at least 3 years in AWS.
Regards,
Jyoti Rani
8360298749
Jyoti.r@saisservices.com
Posted 1 month ago
7.0 - 12.0 years
10 - 20 Lacs
Hyderabad
Remote
Job Title: Senior Data Engineer
Location: Remote
Job Type: Full-time
Experience Level: 7+ years
About the Role: We are seeking a highly skilled Senior Data Engineer to join our team in building a modern data platform on AWS. You will play a key role in transitioning from legacy systems to a scalable, cloud-native architecture using technologies like Apache Iceberg, AWS Glue, Redshift, and Atlan for governance. This role requires hands-on experience across both legacy (e.g., Siebel, Talend, Informatica) and modern data stacks.
Responsibilities:
- Design, develop, and optimize data pipelines and ETL/ELT workflows on AWS.
- Migrate legacy data solutions (Siebel, Talend, Informatica) to modern AWS-native services.
- Implement and manage a data lake architecture using Apache Iceberg and AWS Glue.
- Work with Redshift for data warehousing solutions, including performance tuning and modelling.
- Apply data quality and observability practices using Soda or similar tools.
- Ensure data governance and metadata management using Atlan (or other tools like Collibra and Alation).
- Collaborate with data architects, analysts, and business stakeholders to deliver robust data solutions.
- Build scalable, secure, and high-performing data platforms supporting both batch and real-time use cases.
- Participate in defining and enforcing data engineering best practices.
Required Qualifications:
- 7+ years of experience in data engineering and data pipeline development.
- Strong expertise with AWS services, especially Redshift, Glue, S3, and Athena.
- Proven experience with Apache Iceberg or similar open table formats (like Delta Lake or Hudi).
- Experience with legacy tools like Siebel, Talend, and Informatica.
- Knowledge of data governance tools like Atlan, Collibra, or Alation.
- Experience implementing data quality checks using Soda or equivalent.
- Strong SQL and Python skills; familiarity with Spark is a plus.
- Solid understanding of data modeling, data warehousing, and big data architectures.
- Strong problem-solving skills and the ability to work in an Agile environment.
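For the Apache Iceberg piece, writes typically go through a Spark catalog configured for Iceberg. A hedged sketch of a Glue-catalog setup; the catalog name, warehouse path, and table are placeholders, and it assumes the Iceberg Spark runtime jars are on the classpath:

    from pyspark.sql import SparkSession

    spark = (SparkSession.builder
             .appName("iceberg-demo")
             .config("spark.sql.catalog.lake", "org.apache.iceberg.spark.SparkCatalog")
             .config("spark.sql.catalog.lake.catalog-impl", "org.apache.iceberg.aws.glue.GlueCatalog")
             .config("spark.sql.catalog.lake.warehouse", "s3://example-warehouse/")  # placeholder path
             .getOrCreate())

    # Assumes a namespace `db` already exists in the Glue catalog
    spark.sql("CREATE TABLE IF NOT EXISTS lake.db.events (id bigint, ts timestamp) USING iceberg")
    spark.sql("INSERT INTO lake.db.events VALUES (1, current_timestamp())")
    spark.table("lake.db.events").show()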
Posted 1 month ago
9.0 - 14.0 years
20 - 30 Lacs
Kochi, Bengaluru
Work from Office
Senior Data Engineer, AWS (Glue, Data Warehousing, Optimization & Security)
We are looking for an experienced Senior Data Engineer (6+ years) with deep expertise in AWS cloud data services, particularly AWS Glue, to design, build, and optimize scalable data solutions. The ideal candidate will drive end-to-end data engineering initiatives, from ingestion to consumption, with a strong focus on data warehousing, performance optimization, self-service enablement, and data security. The candidate needs experience in consulting and troubleshooting exercises to design best-fit solutions.
Key Responsibilities:
- Consult with business and technology stakeholders to understand data requirements, troubleshoot, and advise on best-fit AWS data solutions.
- Design and implement scalable ETL pipelines using AWS Glue, handling structured and semi-structured data.
- Architect and manage modern cloud data warehouses (e.g., Amazon Redshift, Snowflake, or equivalent).
- Optimize data pipelines and queries for performance, cost-efficiency, and scalability.
- Develop solutions that enable self-service analytics for business and data science teams.
- Implement data security, governance, and access controls.
- Collaborate with data scientists, analysts, and business stakeholders to understand data needs.
- Monitor, troubleshoot, and improve existing data solutions, ensuring high availability and reliability.
Required Skills & Experience:
- 8+ years of experience in data engineering on the AWS platform.
- Strong hands-on experience with AWS Glue, Lambda, S3, Athena, Redshift, and IAM.
- Proven expertise in data modelling, data warehousing concepts, and SQL optimization.
- Experience designing self-service data platforms for business users.
- Solid understanding of data security, encryption, and access management.
- Proficiency in Python.
- Familiarity with DevOps practices and CI/CD.
- Strong problem-solving skills.
- Exposure to BI tools (e.g., QuickSight, Power BI, Tableau) for self-service enablement.
Preferred Qualifications: AWS Certified Data Analytics – Specialty or Solutions Architect – Associate.
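A skeletal AWS Glue ETL job of the kind this role would own end to end. The database, table, and target path are placeholders; the awsglue modules are provided inside the Glue job runtime rather than installed locally:

    import sys
    from awsglue.utils import getResolvedOptions
    from awsglue.context import GlueContext
    from awsglue.job import Job
    from pyspark.context import SparkContext

    args = getResolvedOptions(sys.argv, ["JOB_NAME"])
    glue_context = GlueContext(SparkContext.getOrCreate())
    job = Job(glue_context)
    job.init(args["JOB_NAME"], args)

    # Read via the Glue Data Catalog, drop records with null keys, write curated Parquet to S3
    src = glue_context.create_dynamic_frame.from_catalog(database="example_db", table_name="raw_orders")
    clean = src.filter(lambda rec: rec["order_id"] is not None)
    glue_context.write_dynamic_frame.from_options(
        frame=clean,
        connection_type="s3",
        connection_options={"path": "s3://example-curated/orders/"},  # placeholder path
        format="parquet",
    )
    job.commit()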
Posted 1 month ago
7.0 - 12.0 years
22 - 27 Lacs
Hyderabad
Work from Office
Key Responsibilities:
- Data Pipeline Development: Design, develop, and optimize robust data pipelines to efficiently collect, process, and store large-scale datasets for AI/ML applications.
- ETL Processes: Develop and maintain Extract, Transform, and Load (ETL) processes to ensure accurate and timely data delivery for machine learning models.
- Data Integration: Integrate diverse data sources (structured, unstructured, and semi-structured data) into a unified and scalable data architecture.
- Data Warehousing & Management: Design and manage data warehouses to store processed and raw data in a highly structured, accessible format for analytics and AI/ML models.
- AI/ML Model Development: Collaborate with data scientists to build, fine-tune, and deploy machine learning models into production environments, with a focus on model optimization, scalability, and operationalization.
- Automation: Implement automation techniques to support model retraining, monitoring, and reporting.
- Cloud & Distributed Systems: Work with cloud platforms (AWS, Azure, GCP) and distributed systems to store and process data efficiently, ensuring that AI/ML models are scalable and maintainable in the cloud environment.
- Data Quality & Governance: Implement data quality checks, monitoring, and governance frameworks to ensure the integrity and security of the data being used for AI/ML models.
- Collaboration: Work cross-functionally with Data Science, Business Intelligence, and other engineering teams to meet organizational data needs and ensure seamless integration with analytics platforms.
Required Skills and Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, Data Science, or a related field.
- Strong proficiency in Python for AI/ML and data engineering tasks.
- Experience with AI/ML frameworks such as TensorFlow, PyTorch, scikit-learn, and Keras.
- Proficient in SQL and working with relational databases (e.g., MySQL, PostgreSQL, SQL Server).
- Strong experience with ETL pipelines and data wrangling on large datasets.
- Familiarity with cloud-based data engineering tools and services (e.g., AWS (S3, Lambda, Redshift), Azure, GCP).
- Solid understanding of big data technologies like Hadoop, Spark, and Kafka for data processing at scale.
- Experience in managing and processing both structured and unstructured data.
- Knowledge of version control systems (e.g., Git) and agile development methodologies.
- Experience with data containers and orchestration tools such as Docker and Kubernetes.
- Strong communication skills to collaborate effectively with cross-functional teams.
Preferred Skills:
- Experience with data warehouses (e.g., Amazon Redshift, Google BigQuery, Snowflake).
- Familiarity with CI/CD pipelines for ML model deployment and automation.
- Familiarity with machine learning model monitoring and performance optimization.
- Experience with data visualization tools like Tableau, Power BI, or Plotly.
- Knowledge of deep learning models and frameworks.
- DevOps or MLOps experience for automating model deployment.
- An advanced statistics or math background for improving model performance and accuracy.
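On the model-development side, a compact scikit-learn example of the train-evaluate-persist loop this role operationalises. The dataset is synthetic and the artefact name is arbitrary:

    import joblib
    from sklearn.datasets import make_classification
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import accuracy_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in for a real feature table
    X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    model = RandomForestClassifier(n_estimators=200, random_state=42)
    model.fit(X_train, y_train)
    print("accuracy:", accuracy_score(y_test, model.predict(X_test)))

    joblib.dump(model, "model.joblib")  # artefact to version and ship, e.g. to S3/SageMaker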
Posted 1 month ago
5.0 - 8.0 years
3 - 7 Lacs
Hyderabad, Bengaluru
Work from Office
Key Responsibilities:
- Design, implement, and maintain cloud-based infrastructure on AWS.
- Manage and monitor AWS services, including EC2, S3, Lambda, RDS, CloudFormation, VPC, etc.
- Develop automation scripts for deployment, monitoring, and scaling using AWS services.
- Collaborate with DevOps teams to automate build, test, and deployment pipelines.
- Ensure the security and compliance of cloud environments using AWS security best practices.
- Optimize cloud resource usage to reduce costs while maintaining high performance.
- Troubleshoot issues related to cloud infrastructure and services.
- Participate in capacity planning and disaster recovery strategies.
- Monitor application performance and make necessary adjustments to ensure optimal performance.
- Stay current with new AWS features and tools and evaluate their applicability for the organization.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as an AWS Engineer or in a similar cloud infrastructure role.
- In-depth knowledge of AWS services, including EC2, S3, RDS, Lambda, VPC, CloudWatch, etc.
- Proficiency in scripting languages such as Python, Shell, or Bash.
- Experience with infrastructure-as-code tools like Terraform or AWS CloudFormation.
- Strong understanding of networking concepts, cloud security, and best practices.
- Familiarity with containerization technologies (e.g., Docker, Kubernetes) is a plus.
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication skills, both written and verbal.
- AWS certifications (AWS Certified Solutions Architect, AWS Certified DevOps Engineer, etc.) are preferred.
Preferred Skills:
- Experience with serverless architectures and services.
- Knowledge of CI/CD pipelines and DevOps methodologies.
- Experience with monitoring and logging tools like CloudWatch, Datadog, or Prometheus.
- Knowledge of AWS FinOps.
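A taste of the cost-optimisation automation mentioned above: a boto3 script that stops running non-production instances by tag. The env=dev tagging convention is an assumption for illustration:

    import boto3

    ec2 = boto3.client("ec2")

    # Find running instances tagged env=dev (illustrative tagging convention)
    reservations = ec2.describe_instances(
        Filters=[
            {"Name": "tag:env", "Values": ["dev"]},
            {"Name": "instance-state-name", "Values": ["running"]},
        ]
    )["Reservations"]

    ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
    if ids:
        ec2.stop_instances(InstanceIds=ids)
        print(f"Stopped {len(ids)} dev instances")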
Posted 1 month ago
5.0 - 10.0 years
15 - 25 Lacs
Bengaluru, Mumbai (All Areas)
Work from Office
Hello Candidates,
We are hiring for a Java Developer. Please find the job description below.
Position: Java Developer
Experience: 5+ years
Location: Mumbai / Bengaluru
Skills: Java, SQL, AWS
JD for the position:
- Proven experience in Java development, with a strong understanding of object-oriented programming principles.
- Experience with AWS services, including ECS, S3, RDS, ElastiCache, and CloudFormation.
- Experience with microservices architecture and RESTful API design.
- Strong problem-solving skills and attention to detail.
- Experience in the financial services industry, particularly in trading or risk management, is a plus.
- Excellent communication and collaboration skills.
Important checkpoints for all requirements: the candidate should have all necessary documents, as there is a very strict background verification.
1) Documents from all the companies the candidate has worked for to date (offer letters, experience letters, and relieving letters)
2) PF, UAN number, Form 16 & Form 26AS - mandatory
3) Educational documents - marksheets & degree certificates
Kindly revert with your acknowledgement and share your updated CV, along with: Total Experience, Relevant Experience, Current Salary, Expected Salary, Current Company / Last Company, Notice Period, Last Working Date, Reason for Job Change, Current Location, Preferred Location, Have you applied to Mphasis before (Yes/No), Alternate Mail ID, Alternate Phone No., PAN Card No.
NOTE: Interested candidates can share their resume at shrutia.talentsketchers@gmail.com
Regards,
Shruti TS
Posted 1 month ago
6.0 - 11.0 years
15 - 30 Lacs
Hyderabad, Pune, Bengaluru
Hybrid
Warm greetings from SP Staffing Services Private Limited!!
We have an urgent opening with our CMMI Level 5 client for the below position. Please send your updated profile if you are interested.
Relevant Experience: 6 - 15 yrs
Location: Pan India
Job Description: Primarily looking for a data engineer with expertise in processing data pipelines using Databricks Spark SQL on Hadoop distributions like AWS EMR, Databricks, Cloudera, etc. Other ideal qualifications include:
- Very proficient in large-scale data operations using Databricks and overall very comfortable using Python.
- Familiarity with AWS compute, storage, and IAM concepts.
- Experience working with an S3 data lake as the storage tier.
- Any ETL background (Talend, AWS Glue, etc.) is a plus but not required.
- Cloud warehouse experience (Snowflake, etc.) is a huge plus.
- Carefully evaluates alternative risks and solutions before taking action; optimizes the use of all available resources.
- Develops solutions to meet business needs that reflect a clear understanding of the objectives, practices, and procedures of the corporation, department, and business unit.
Skills:
- Hands-on experience with Databricks Spark SQL and the AWS Cloud platform, especially S3, EMR, Databricks, Cloudera, etc.
- Experience in shell scripting.
- Exceptionally strong analytical and problem-solving skills.
- Relevant experience with ETL methods and with retrieving data from dimensional data models and data warehouses.
- Strong experience with relational databases and data access methods, especially SQL.
- Excellent collaboration and cross-functional leadership skills.
- Excellent communication skills, both written and verbal.
- Ability to manage multiple initiatives and priorities in a fast-paced, collaborative environment.
- Ability to leverage data assets to respond to complex questions that require timely answers.
- Working knowledge of migrating relational and dimensional databases to the AWS Cloud platform.
Interested candidates can share their resume at sankarspstaffings@gmail.com with the following details inline: Overall Exp, Relevant Exp, Current CTC, Expected CTC, Notice Period.
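As a brief illustration of the Databricks Spark SQL work described, registering a temp view over lake data and querying it. The path, view, and columns are placeholders; in a Databricks notebook the `spark` session is provided automatically:

    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # pre-created as `spark` in Databricks notebooks

    spark.read.parquet("s3://example-lake/transactions/").createOrReplaceTempView("transactions")
    top_merchants = spark.sql("""
        SELECT merchant, SUM(amount) AS total
        FROM transactions
        GROUP BY merchant
        ORDER BY total DESC
        LIMIT 10
    """)
    top_merchants.show()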
Posted 2 months ago
5.0 - 10.0 years
25 - 30 Lacs
Pune
Remote
- Passionate about TDD (test-first development).
- At least 7 to 10 years of development experience with Python.
- At least 2 to 3 years of experience with React.
- Document key business workflows and software designs.
Required Candidate Profile:
- Has built complex applications with AWS serverless technologies (AppSync, DynamoDB, DynamoDB Streams, Lambda, Cognito, S3, CloudFront, Route 53, Amplify).
- Strong knowledge of GraphQL.
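In the test-first spirit the post leads with, the test is written before the implementation. A minimal pytest illustration; the module and function names are invented:

    # test_pricing.py -- written first; fails until pricing.py exists
    import pytest
    from pricing import apply_discount

    def test_discount_is_applied():
        assert apply_discount(100.0, 0.1) == 90.0

    def test_negative_discount_rejected():
        with pytest.raises(ValueError):
            apply_discount(100.0, -0.1)

    # pricing.py -- the simplest implementation that makes both tests pass
    def apply_discount(price: float, rate: float) -> float:
        if rate < 0:
            raise ValueError("discount rate must be non-negative")
        return price * (1 - rate)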
Posted 2 months ago
8.0 - 10.0 years
1 - 2 Lacs
Hyderabad, Chennai, Bengaluru
Work from Office
Urgent Hiring: Senior Software Engineer (Java, AWS)
Location: Bangalore - Domlur (Work from Office, 4 days/week)
Type: Contract (6 months, extendable)
Notice Period: Immediate to 15 days
Open Positions: 2
Experience: 5+ years
Required Skills:
- Java (8/11/17), Spring Boot, Microservices Architecture
- Experience with Kafka or Apache Camel
- Minimum 2 years of hands-on AWS experience with EC2, ECS, S3, SQS, SNS, Lambda, DynamoDB, and CloudFormation
Posted 2 months ago
5.0 - 8.0 years
5 - 8 Lacs
Bengaluru
Work from Office
Develop, test, and maintain applications using Java and Spring Boot. Design and implement microservices architecture. Work with databases to ensure data integrity and performance. Collaborate with cross-functional teams to define and design solutions.
Required Candidate Profile: Proficiency in Java programming. Experience with the Spring Boot framework. Knowledge of microservices architecture. Familiarity with databases (SQL/NoSQL). Basic understanding of Kafka and S3.
Posted 2 months ago
3.0 - 8.0 years
5 - 15 Lacs
Pune
Work from Office
P2/P3 Java Full Stack (Angular), 4-6 yrs, DL - 20 Java web developers.
Design, develop, and maintain REST-based microservices using Java. Develop intuitive and responsive user interfaces using modern front-end technologies such as Angular, React, and HTML5. Build and optimize robust back-end services, ensuring seamless integration with databases (SQL). Deploy and manage cloud-native applications on AWS infrastructure. Collaborate with cross-functional teams, including UI/UX designers, DevOps, and product owners, to deliver end-to-end solutions. Ensure the application's performance, scalability, and reliability. Write clean, maintainable, and efficient code while following best practices, including unit testing and code reviews. Troubleshoot, debug, and optimize application code and infrastructure. Stay up to date with emerging technologies and industry trends to drive continuous improvement.
Required: We are seeking highly skilled software engineers with expertise in full-stack development. The ideal candidate will have experience building scalable, cloud-native applications and a strong understanding of modern software development practices.
- Microservice Development: Hands-on experience developing and deploying REST-based microservices using Java frameworks (e.g., Spring and Hibernate).
- Full-Stack Development: Front-end: proficiency in Angular, React, and HTML5 for building interactive UIs. Back-end: expertise in Java for business logic and APIs. Database: strong understanding of SQL and experience with relational databases.
- Cloud Experience: Hands-on experience with AWS services (e.g., EC2, S3, Lambda, RDS); familiarity with cloud-native architecture and deployment practices.
- Experience with CI/CD tools (Jenkins, GitHub, etc.) and containerization technologies (Docker, Kubernetes).
- Solid understanding of software development principles, including design patterns, clean code, system design, software architecture, and agile methodologies.
- Experience with Advertising, AdTech, Ad Servers (SSP/DSP), OTT, or Media Streaming is preferred.
Work Experience: P2: 3 to 5 yrs; P3: 5 to 8 yrs; P4: 8 to 12 yrs.
Job Location: Pan India
Posted 2 months ago
8.0 - 12.0 years
20 - 30 Lacs
Hyderabad
Work from Office
Design and development of cloud-hosted web applications for the insurance industry, from high-level architecture and network infrastructure to low-level creation of site layout, user experience, database schema, data structures, workflows, graphics, unit testing, end-to-end integration testing, etc.
- Working from static application mock-ups and wireframes, develop front-end user interfaces and page templates in HTML5, CSS, Sass, LESS, TypeScript, Bootstrap, Angular, and third-party controls like Kendo UI/Infragistics.
- Proficiency in AWS services like Lambda, EC2, S3, and IAM for deploying and managing applications.
- Excellent programming skills in Python, with the ability to develop, maintain, and debug Python-based applications.
- Develop, maintain, and debug applications using .NET Core and C#.
- Stay up to date with the latest industry trends and technologies related to PostgreSQL, AWS, and Python.
- Design and implement risk management business functionality and in-database analytics.
- Identify complex data problems and review related information to develop and evaluate options and design and implement solutions.
- Design and develop functional and responsive web applications by collaborating with other engineers in the Agile team.
- Develop REST APIs and understand WCF services.
- Prepare documentation and specifications.
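Bridging the Python and PostgreSQL requirements, a minimal psycopg2 connection to an RDS instance. The endpoint, database, table, and credentials are placeholders; production code would fetch secrets from a vault such as AWS Secrets Manager:

    import psycopg2

    conn = psycopg2.connect(
        host="example-db.abc123.us-east-1.rds.amazonaws.com",  # placeholder endpoint
        dbname="policies",
        user="app_user",
        password="********",  # fetch from AWS Secrets Manager in practice
        connect_timeout=5,
    )

    # `with conn` wraps the work in a transaction; parameters are passed safely via %s
    with conn, conn.cursor() as cur:
        cur.execute("SELECT policy_id, premium FROM policy WHERE status = %s", ("ACTIVE",))
        for policy_id, premium in cur.fetchall():
            print(policy_id, premium)
    conn.close()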
Posted 2 months ago
4.0 - 9.0 years
0 - 3 Lacs
Pune
Work from Office
We are seeking a highly skilled and motivated full-stack Node.js developer to join our dynamic engineering team. The ideal candidate will be responsible for designing, developing, and maintaining robust and scalable backend services, APIs, and integrations, as well as contributing to the development of our user interfaces. This role requires strong expertise in Node.js, PostgreSQL, and a solid understanding of various AWS services, including S3, Athena, RDS, and EC2. Experience with Stripe integration for payment processing and a proven ability to both write and consume APIs are essential, along with proficiency in front-end technologies like HTML and CSS.
Key Responsibilities:
- Design, develop, and maintain high-performance, scalable, and secure backend services using Node.js.
- Develop and implement RESTful APIs for various internal and external applications, ensuring high availability and performance.
- Integrate with third-party APIs, including payment gateways like Stripe, and other external services.
- Manage and optimize PostgreSQL databases, including schema design, query optimization, and data migration.
- Work extensively with AWS services, specifically: Amazon S3 (store and manage application data, backups, and other static assets); AWS Athena (develop and execute analytical queries on data stored in S3 for reporting and insights); Amazon RDS for PostgreSQL (configure, manage, and optimize PostgreSQL instances); and Amazon EC2 (deploy, manage, and scale Node.js applications).
- Develop responsive and engaging user interfaces using HTML and CSS.
- Implement and maintain secure coding practices, including data encryption, authentication, and authorization mechanisms.
- Collaborate with the client and team to define requirements and deliver high-quality software solutions.
- Participate in code reviews, ensuring code quality, maintainability, and adherence to best practices.
- Troubleshoot and debug production issues, providing timely resolutions.
- Contribute to the continuous improvement of our development processes and tools.
Qualifications:
Technical Skills:
- Proven experience as a Node.js developer, with a strong understanding of its asynchronous nature, event loop, and best practices.
- Expertise in database design, development, and optimization with PostgreSQL.
- Hands-on experience with AWS services, including S3 (object storage and management), Athena (serverless query service for S3 data), RDS for PostgreSQL (managed relational database service), and EC2 (virtual servers for deploying applications).
- Proficiency in designing, building, and consuming RESTful APIs.
- Experience integrating with payment processing platforms, specifically Stripe.
- Strong proficiency in HTML5 and CSS3, including responsive design principles.
- Familiarity with version control systems (Git).
- Understanding of the software development lifecycle (SDLC) and agile methodologies.
- Experience with the Redis server for caching, session management, and task scheduling.
Experience:
- 5+ years of experience in full-stack development with Node.js.
- 3+ years of experience working with PostgreSQL.
- 5+ years of experience with AWS cloud services.
Nice to have:
- Familiarity with other AWS services (e.g., Lambda, SQS, SNS).
- Experience with microservices architecture.
- Familiarity with JavaScript frameworks/libraries (e.g., React, Angular, Vue.js).
Soft Skills:
- Excellent problem-solving and analytical skills.
- Strong communication and interpersonal abilities.
- Ability to work independently and as part of a team.
- Proactive and eager to learn new technologies.
Posted 2 months ago
5.0 - 7.0 years
3 - 7 Lacs
Hyderabad, Bengaluru
Work from Office
Key Responsibilities:
- Design, implement, and maintain cloud-based infrastructure on AWS.
- Manage and monitor AWS services, including EC2, S3, Lambda, RDS, CloudFormation, VPC, etc.
- Develop automation scripts for deployment, monitoring, and scaling using AWS services.
- Collaborate with DevOps teams to automate build, test, and deployment pipelines.
- Ensure the security and compliance of cloud environments using AWS security best practices.
- Optimize cloud resource usage to reduce costs while maintaining high performance.
- Troubleshoot issues related to cloud infrastructure and services.
- Participate in capacity planning and disaster recovery strategies.
- Monitor application performance and make necessary adjustments to ensure optimal performance.
- Stay current with new AWS features and tools and evaluate their applicability for the organization.
Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Proven experience as an AWS Engineer or in a similar cloud infrastructure role.
- In-depth knowledge of AWS services, including EC2, S3, RDS, Lambda, VPC, CloudWatch, etc.
- Proficiency in scripting languages such as Python, Shell, or Bash.
- Experience with infrastructure-as-code tools like Terraform or AWS CloudFormation.
- Strong understanding of networking concepts, cloud security, and best practices.
- Familiarity with containerization technologies (e.g., Docker, Kubernetes) is a plus.
- Excellent problem-solving, analytical, and troubleshooting skills.
- Strong communication skills, both written and verbal.
- AWS certifications (AWS Certified Solutions Architect, AWS Certified DevOps Engineer, etc.) are preferred.
Preferred Skills:
- Experience with serverless architectures and services.
- Knowledge of CI/CD pipelines and DevOps methodologies.
- Experience with monitoring and logging tools like CloudWatch, Datadog, or Prometheus.
- Knowledge of AWS FinOps.
Posted 2 months ago
10.0 - 15.0 years
12 - 17 Lacs
Bengaluru
Work from Office
Grade: 7
Purpose of your role
This role sits within the ISS Data Platform Team. The Data Platform team is responsible for building and maintaining the platform that enables the ISS business to operate. This role is appropriate for a Lead Data Engineer capable of taking ownership of, and delivering, a subsection of the wider data platform.
Key Responsibilities:
- Design, develop, and maintain scalable data pipelines and architectures to support data ingestion, integration, and analytics.
- Be accountable for technical delivery and take ownership of solutions.
- Lead a team of senior and junior developers, providing mentorship and guidance.
- Collaborate with enterprise architects, business analysts, and stakeholders to understand data requirements, validate designs, and communicate progress.
- Drive technical innovation within the department to increase code reusability, code quality, and developer productivity.
- Challenge the status quo by bringing the very latest data engineering practices and techniques.
Essential Skills and Experience
Core Technical Skills:
- Expert in leveraging cloud-based data platform capabilities (Snowflake, Databricks) to create an enterprise lakehouse.
- Advanced expertise with the AWS ecosystem and experience using a variety of core AWS data services, like Lambda, EMR, MSK, Glue, and S3.
- Experience designing event-based or streaming data architectures using Kafka.
- Advanced expertise in Python and SQL. Open to expertise in Java/Scala, but enterprise experience of Python is required.
- Expert in designing, building, and using CI/CD pipelines to deploy infrastructure (Terraform) and pipelines with test automation.
- Data Security & Performance Optimization: Experience implementing data access controls to meet regulatory requirements.
- Experience using both RDBMS (Oracle, Postgres, MSSQL) and NoSQL (Dynamo, OpenSearch, Redis) offerings.
- Experience implementing CDC ingestion.
- Experience using orchestration tools (Airflow, Control-M, etc.).
Bonus Technical Skills:
- Strong experience in containerisation and deploying applications to Kubernetes.
- Strong experience in API development using Python-based frameworks like FastAPI.
Key Soft Skills:
- Problem-Solving: Leadership experience in problem-solving and technical decision-making.
- Communication: Strong in strategic communication and stakeholder engagement.
- Project Management: Experienced in overseeing project lifecycles, working with project managers to manage resources.
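For the FastAPI skill named under the bonus skills, a minimal service sketch in the style described. The route, model, and in-memory registry are illustrative assumptions:

    from fastapi import FastAPI, HTTPException
    from pydantic import BaseModel

    app = FastAPI(title="example-data-api")

    class Dataset(BaseModel):
        name: str
        rows: int

    CATALOG = {"prices": Dataset(name="prices", rows=1_000_000)}  # illustrative registry

    @app.get("/datasets/{name}", response_model=Dataset)
    def get_dataset(name: str):
        if name not in CATALOG:
            raise HTTPException(status_code=404, detail="dataset not found")
        return CATALOG[name]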
Posted 2 months ago