
1207 AWS Cloud Jobs - Page 21

JobPe aggregates job listings for easy access; applications are submitted directly on the original job portal.

2.0 - 7.0 years

13 - 17 Lacs

Chennai

Work from Office

Job Area: Engineering Group > Software Engineering.

General Summary: As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Software Engineer, you will design, develop, create, modify, and validate embedded and cloud-edge software, applications, and/or specialized utility programs that launch cutting-edge, world-class products that meet and exceed customer needs. Qualcomm Software Engineers collaborate with systems, hardware, architecture, and test engineers, and with other teams, to design system-level software solutions and obtain information on performance requirements and interfaces.

Minimum Qualifications: Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field and 2+ years of software engineering or related work experience; OR Master's degree in Engineering, Information Systems, Computer Science, or a related field and 1+ year of software engineering or related work experience; OR PhD in Engineering, Information Systems, Computer Science, or a related field. 2+ years of academic or work experience with a programming language such as C, C++, Java, Python, etc.

Job Title: MLOps Engineer - ML Platform. Hiring Title: flexible based on candidate experience; around Staff Engineer level preferred.

We are seeking a highly skilled and experienced MLOps Engineer to join our team and contribute to the development and maintenance of our ML platform, both on premises and on AWS Cloud. As an MLOps Engineer, you will be responsible for architecting, deploying, and optimizing the ML and data platform that supports training of machine learning models on NVIDIA DGX clusters and Kubernetes, using technologies such as Helm, ArgoCD, Argo Workflows, Prometheus, and Grafana. Your expertise in AWS services such as EKS, EC2, VPC, IAM, S3, and EFS will be crucial to the smooth operation and scalability of our ML infrastructure. You will work closely with cross-functional teams, including data scientists, software engineers, and infrastructure specialists, and your expertise in MLOps, DevOps, and GPU clusters will be vital in enabling efficient training and deployment of ML models.

Responsibilities will include: architecting, developing, and maintaining the ML platform to support training and inference of ML models; designing and implementing scalable, reliable infrastructure for NVIDIA clusters both on premises and on AWS; collaborating with data scientists and software engineers to define requirements and ensure seamless integration of ML and data workflows into the platform; optimizing the platform's performance and scalability, considering GPU resource utilization, data ingestion, model training, and deployment; monitoring and troubleshooting system performance, identifying and resolving issues to ensure the availability and reliability of the ML platform; implementing and maintaining CI/CD pipelines for automated model training, evaluation, and deployment using ArgoCD and Argo Workflows; implementing and maintaining a monitoring stack with Prometheus and Grafana to ensure the health and performance of the platform; managing AWS services including EKS, EC2, VPC, IAM, S3, and EFS; implementing logging and monitoring solutions using AWS CloudWatch and other relevant tools; and staying current with advancements in MLOps, distributed computing, and GPU acceleration, proactively proposing improvements to the ML platform.

What are we looking for: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proven experience as an MLOps Engineer or in a similar role, with a focus on large-scale ML and/or data infrastructure and GPU clusters. Strong expertise in configuring and optimizing NVIDIA DGX clusters for deep learning workloads. Proficiency with Kubernetes and related tooling such as Helm, ArgoCD, Argo Workflows, Prometheus, and Grafana. Solid programming skills in languages such as Python and Go, and experience with relevant ML frameworks (e.g., TensorFlow, PyTorch). In-depth understanding of distributed computing, parallel computing, and GPU acceleration techniques. Familiarity with containerization technologies such as Docker and orchestration tools. Experience with CI/CD pipelines and automation tools for ML workflows (e.g., Jenkins, GitHub, ArgoCD). Experience with AWS services such as EKS, EC2, VPC, IAM, S3, and EFS, and with AWS logging and monitoring tools. Strong problem-solving skills and the ability to troubleshoot complex technical issues. Excellent communication and collaboration skills for working within a cross-functional team.

We would love to see: experience training and deploying models; knowledge of ML model optimization techniques and GPU memory management; familiarity with ML-specific data storage and retrieval systems; and an understanding of security and compliance requirements in ML infrastructure.
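
For illustration only (not part of the posting): a minimal sketch of the Prometheus/Grafana monitoring work this role describes, exposing a custom training metric with the prometheus_client library. The metric name, port, and GPU count are assumptions, not details from the listing.

```python
# Illustrative sketch: expose a hypothetical GPU-utilization gauge that a
# Prometheus server could scrape and Grafana could chart. Values are placeholders.
import random
import time

from prometheus_client import Gauge, start_http_server

gpu_util = Gauge(
    "training_gpu_utilization_percent",
    "GPU utilization observed during model training",
    ["gpu"],
)

if __name__ == "__main__":
    start_http_server(9100)          # Prometheus scrapes http://<host>:9100/metrics
    while True:
        for gpu_id in range(8):      # assumes one DGX-style node with 8 GPUs
            gpu_util.labels(gpu=str(gpu_id)).set(random.uniform(60, 100))
        time.sleep(15)
```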

Posted 3 weeks ago

Apply

1.0 - 5.0 years

12 - 16 Lacs

Chennai

Work from Office

Job Area: Engineering Group > Software Engineering.

General Summary: As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Software Engineer, you will design, develop, create, modify, and validate embedded and cloud-edge software, applications, and/or specialized utility programs that launch cutting-edge, world-class products that meet and exceed customer needs. Qualcomm Software Engineers collaborate with systems, hardware, architecture, and test engineers, and with other teams, to design system-level software solutions and obtain information on performance requirements and interfaces.

Minimum Qualifications: Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field.

Job Title: MLOps Engineer - ML Platform. Hiring Title: flexible based on candidate experience; around Staff Engineer level preferred.

We are seeking a highly skilled and experienced MLOps Engineer to join our team and contribute to the development and maintenance of our ML platform, both on premises and on AWS Cloud. As an MLOps Engineer, you will be responsible for architecting, deploying, and optimizing the ML and data platform that supports training of machine learning models on NVIDIA DGX clusters and Kubernetes, using technologies such as Helm, ArgoCD, Argo Workflows, Prometheus, and Grafana. Your expertise in AWS services such as EKS, EC2, VPC, IAM, S3, and EFS will be crucial to the smooth operation and scalability of our ML infrastructure. You will work closely with cross-functional teams, including data scientists, software engineers, and infrastructure specialists, and your expertise in MLOps, DevOps, and GPU clusters will be vital in enabling efficient training and deployment of ML models.

Responsibilities will include: architecting, developing, and maintaining the ML platform to support training and inference of ML models; designing and implementing scalable, reliable infrastructure for NVIDIA clusters both on premises and on AWS; collaborating with data scientists and software engineers to define requirements and ensure seamless integration of ML and data workflows into the platform; optimizing the platform's performance and scalability, considering GPU resource utilization, data ingestion, model training, and deployment; monitoring and troubleshooting system performance, identifying and resolving issues to ensure the availability and reliability of the ML platform; implementing and maintaining CI/CD pipelines for automated model training, evaluation, and deployment using ArgoCD and Argo Workflows; implementing and maintaining a monitoring stack with Prometheus and Grafana to ensure the health and performance of the platform; managing AWS services including EKS, EC2, VPC, IAM, S3, and EFS; implementing logging and monitoring solutions using AWS CloudWatch and other relevant tools; and staying current with advancements in MLOps, distributed computing, and GPU acceleration, proactively proposing improvements to the ML platform.

What are we looking for: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proven experience as an MLOps Engineer or in a similar role, with a focus on large-scale ML and/or data infrastructure and GPU clusters. Strong expertise in configuring and optimizing NVIDIA DGX clusters for deep learning workloads. Proficiency with Kubernetes and related tooling such as Helm, ArgoCD, Argo Workflows, Prometheus, and Grafana. Solid programming skills in languages such as Python and Go, and experience with relevant ML frameworks (e.g., TensorFlow, PyTorch). In-depth understanding of distributed computing, parallel computing, and GPU acceleration techniques. Familiarity with containerization technologies such as Docker and orchestration tools. Experience with CI/CD pipelines and automation tools for ML workflows (e.g., Jenkins, GitHub, ArgoCD). Experience with AWS services such as EKS, EC2, VPC, IAM, S3, and EFS, and with AWS logging and monitoring tools. Strong problem-solving skills and the ability to troubleshoot complex technical issues. Excellent communication and collaboration skills for working within a cross-functional team.

We would love to see: experience training and deploying models; knowledge of ML model optimization techniques and GPU memory management; familiarity with ML-specific data storage and retrieval systems; and an understanding of security and compliance requirements in ML infrastructure.
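
For illustration only: a small boto3 sketch of the AWS side of this role, checking that an EKS cluster is healthy and listing model artifacts in S3. The cluster name, bucket, prefix, and region are hypothetical.

```python
# Illustrative sketch: verify an EKS cluster and enumerate S3 model artifacts.
import boto3

eks = boto3.client("eks", region_name="us-east-1")
s3 = boto3.client("s3", region_name="us-east-1")

cluster = eks.describe_cluster(name="ml-platform-cluster")["cluster"]
print("EKS status:", cluster["status"], "| Kubernetes version:", cluster["version"])

resp = s3.list_objects_v2(Bucket="ml-platform-artifacts", Prefix="models/")
for obj in resp.get("Contents", []):
    print(obj["Key"], obj["Size"])
```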

Posted 3 weeks ago

Apply

4.0 - 9.0 years

12 - 17 Lacs

Chennai

Work from Office

Job Area: Engineering Group > Software Engineering.

General Summary: As a leading technology innovator, Qualcomm pushes the boundaries of what's possible to enable next-generation experiences and drives digital transformation to help create a smarter, connected future for all. As a Qualcomm Software Engineer, you will design, develop, create, modify, and validate embedded and cloud-edge software, applications, and/or specialized utility programs that launch cutting-edge, world-class products that meet and exceed customer needs. Qualcomm Software Engineers collaborate with systems, hardware, architecture, and test engineers, and with other teams, to design system-level software solutions and obtain information on performance requirements and interfaces.

Minimum Qualifications: Bachelor's degree in Engineering, Information Systems, Computer Science, or a related field and 4+ years of software engineering or related work experience; OR Master's degree in Engineering, Information Systems, Computer Science, or a related field and 3+ years of software engineering or related work experience; OR PhD in Engineering, Information Systems, Computer Science, or a related field and 2+ years of software engineering or related work experience. 2+ years of work experience with a programming language such as C, C++, Java, Python, etc.

Job Title: MLOps Engineer - ML Platform. Hiring Title: flexible based on candidate experience; around Staff Engineer level preferred.

We are seeking a highly skilled and experienced MLOps Engineer to join our team and contribute to the development and maintenance of our ML platform, both on premises and on AWS Cloud. As an MLOps Engineer, you will be responsible for architecting, deploying, and optimizing the ML and data platform that supports training of machine learning models on NVIDIA DGX clusters and Kubernetes, using technologies such as Helm, ArgoCD, Argo Workflows, Prometheus, and Grafana. Your expertise in AWS services such as EKS, EC2, VPC, IAM, S3, and EFS will be crucial to the smooth operation and scalability of our ML infrastructure. You will work closely with cross-functional teams, including data scientists, software engineers, and infrastructure specialists, and your expertise in MLOps, DevOps, and GPU clusters will be vital in enabling efficient training and deployment of ML models.

Responsibilities will include: architecting, developing, and maintaining the ML platform to support training and inference of ML models; designing and implementing scalable, reliable infrastructure for NVIDIA clusters both on premises and on AWS; collaborating with data scientists and software engineers to define requirements and ensure seamless integration of ML and data workflows into the platform; optimizing the platform's performance and scalability, considering GPU resource utilization, data ingestion, model training, and deployment; monitoring and troubleshooting system performance, identifying and resolving issues to ensure the availability and reliability of the ML platform; implementing and maintaining CI/CD pipelines for automated model training, evaluation, and deployment using ArgoCD and Argo Workflows; implementing and maintaining a monitoring stack with Prometheus and Grafana to ensure the health and performance of the platform; managing AWS services including EKS, EC2, VPC, IAM, S3, and EFS; implementing logging and monitoring solutions using AWS CloudWatch and other relevant tools; and staying current with advancements in MLOps, distributed computing, and GPU acceleration, proactively proposing improvements to the ML platform.

What are we looking for: Bachelor's or Master's degree in Computer Science, Engineering, or a related field. Proven experience as an MLOps Engineer or in a similar role, with a focus on large-scale ML and/or data infrastructure and GPU clusters. Strong expertise in configuring and optimizing NVIDIA DGX clusters for deep learning workloads. Proficiency with Kubernetes and related tooling such as Helm, ArgoCD, Argo Workflows, Prometheus, and Grafana. Solid programming skills in languages such as Python and Go, and experience with relevant ML frameworks (e.g., TensorFlow, PyTorch). In-depth understanding of distributed computing, parallel computing, and GPU acceleration techniques. Familiarity with containerization technologies such as Docker and orchestration tools. Experience with CI/CD pipelines and automation tools for ML workflows (e.g., Jenkins, GitHub, ArgoCD). Experience with AWS services such as EKS, EC2, VPC, IAM, S3, and EFS, and with AWS logging and monitoring tools. Strong problem-solving skills and the ability to troubleshoot complex technical issues. Excellent communication and collaboration skills for working within a cross-functional team.

We would love to see: experience training and deploying models; knowledge of ML model optimization techniques and GPU memory management; familiarity with ML-specific data storage and retrieval systems; and an understanding of security and compliance requirements in ML infrastructure.
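
For illustration only: a minimal sketch of the CloudWatch logging/monitoring integration mentioned above, publishing a custom metric with boto3. The namespace, metric name, and dimension values are assumptions.

```python
# Illustrative sketch: push a custom platform-health metric to CloudWatch.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_data(
    Namespace="MLPlatform/Training",          # hypothetical namespace
    MetricData=[
        {
            "MetricName": "FailedTrainingJobs",
            "Value": 1.0,
            "Unit": "Count",
            "Dimensions": [{"Name": "Cluster", "Value": "dgx-onprem"}],
        }
    ],
)
```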

Posted 3 weeks ago

Apply

13.0 - 18.0 years

14 - 18 Lacs

Gurugram

Work from Office

Who are we? In one sentence: We are seeking a Java Full Stack Architect & People Manager with strong technical depth and leadership capabilities to lead our Java modernization projects. The ideal candidate will possess a robust understanding of Java full stack, databases, and cloud-based solution delivery, combined with proven experience in managing high-performing technical teams. This role requires a visionary who can translate business challenges into scalable distributed solutions while nurturing talent and fostering innovation.

What will your job look like? Lead the design and implementation of Java full-stack solutions covering frontend, backend, batch processes, and interface integrations across business use cases. Translate business requirements into technical architectures using Azure/AWS cloud platforms. Manage and mentor a multidisciplinary team of engineers, leads, and specialists. Drive adoption of Databricks and Python, in addition to Java-based frameworks, within solution development. Collaborate closely with product owners, data engineering teams, and customer IT and business stakeholders. Ensure high standards in code quality, system performance, and model governance. Track industry trends and continuously improve the technology stack, adopting newer trends and showcasing productization, automation, and innovative ideas. Oversee the end-to-end lifecycle: use case identification, PoC, MVP, production deployment, and support. Define and monitor KPIs to measure team performance and project impact.

All you need is: 13+ years of overall IT experience with a strong background in the telecom domain (preferred). Proven hands-on experience with Java full-stack technologies and cloud databases. Strong understanding of design principles and patterns for distributed applications, on premises as well as in the cloud. Demonstrated experience in building and deploying on Azure or AWS via CI/CD practices. Strong expertise in Java, databases, Python, Kafka, and Linux scripting. In-depth understanding of cloud-native architecture, microservices, and data pipelines. Solid people-management experience: team building, mentoring, performance reviews. Strong analytical thinking and communication skills. Ability to stay hands-on with coding and reviews during development and production support.

Good-to-have skills: familiarity with Databricks and PySpark; familiarity with Snowflake.

Why you will love this job: You will be challenged with leading and mentoring several development teams and projects. You will join a strong team with many activities, technologies, business challenges, and a progression path. You will have the opportunity to work with the industry's most advanced technologies.
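
For illustration only: a tiny Kafka producer of the kind a batch or interface integration in this stack might use. It is shown in Python (kafka-python) rather than Java purely for consistency with the other sketches on this page; the topic name, broker address, and event shape are hypothetical.

```python
# Illustrative sketch: publish an integration event to a Kafka topic.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="broker:9092",                              # assumed broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

producer.send("billing-events", {"accountId": "A-1001", "event": "INVOICE_READY"})
producer.flush()
```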

Posted 3 weeks ago

Apply

6.0 - 10.0 years

19 - 27 Lacs

Bengaluru

Work from Office

Strong Python, Flask, REST API, and NoSQL skills. AWS Developer Associate certification is required. You will architect, build, and maintain secure, scalable backend services on AWS, utilizing core AWS services and serverless technologies. Benefits include provident fund.
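
For illustration only: a minimal sketch of the kind of Flask REST endpoint this posting implies. The route, payload shape, and in-memory store are assumptions; in a real deployment the store would typically be a NoSQL service such as DynamoDB and the app would run behind an AWS load balancer or API Gateway.

```python
# Illustrative sketch: a tiny Flask REST API with create/read endpoints.
from flask import Flask, jsonify, request

app = Flask(__name__)
ORDERS = {}  # stand-in for a NoSQL store (e.g., DynamoDB)

@app.post("/orders")
def create_order():
    payload = request.get_json(force=True)
    order_id = str(len(ORDERS) + 1)
    ORDERS[order_id] = payload
    return jsonify({"id": order_id, **payload}), 201

@app.get("/orders/<order_id>")
def get_order(order_id):
    order = ORDERS.get(order_id)
    if order is None:
        return jsonify({"error": "not found"}), 404
    return jsonify(order), 200

if __name__ == "__main__":
    app.run(port=8080)
```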

Posted 3 weeks ago

Apply

0.0 - 1.0 years

0 - 2 Lacs

Ahmedabad

Work from Office

The key qualifications we are looking for include: strong problem-solving and communication skills; experience with virtualization (VMware, Nutanix, Hyper-V); hands-on experience with server hardware (Dell, HP, Cisco, Supermicro); proficiency in networking protocols and security best practices; prior experience as an infrastructure engineer or in a similar role; relevant certifications (MCSE, AWS Solutions Architect) are a plus; and the ability to work collaboratively in a team-oriented environment. Candidates should have an average of at least 70% in their 10th and 12th standard examinations.

Posted 3 weeks ago

Apply

3.0 - 5.0 years

27 - 32 Lacs

Bengaluru

Work from Office

Job Title: Data Engineer (DE) / SDE - Data. Location: Bangalore. Experience range: 3-15 years.

What we offer: Our mission is simple - building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with the aim of enhancing customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX, Kotak's Data Exchange, is the central data organization for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists the opportunity to build things from scratch and create a best-in-class data lakehouse solution. The primary skills for this team are software development, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising roughly 10 sub-teams that independently drive their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and to help build systems that can eventually be operated by machines using AI technologies.

The data platform org is divided into three key verticals.

Data Platform: This vertical is responsible for building the data platform, including optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving training and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for a financial organization. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and branch managers, and by all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: driving business decisions with technical input and leading the team; designing, implementing, and supporting a data infrastructure from scratch; managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extracting, transforming, and loading data from various sources using SQL and AWS big data technologies; exploring and learning the latest AWS technologies to enhance capabilities and efficiency; collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis; improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers; and building data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. 3-5 years of experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.

For managers: customer centricity and obsession for the customer; the ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and to coach agile ways of working; the ability to structure and organize teams and streamline communication; and prior experience executing large-scale data engineering projects.
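
For illustration only: a small PySpark job of the type this role describes, reading raw data from S3, aggregating it, and writing partitioned Parquet. The bucket paths and column names are assumptions.

```python
# Illustrative sketch: a simple read-transform-write Spark ETL step.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("txn-daily-etl").getOrCreate()

txns = spark.read.parquet("s3://raw-zone/transactions/")          # hypothetical path

daily = (
    txns.filter(F.col("status") == "SUCCESS")
        .withColumn("txn_date", F.to_date("txn_ts"))
        .groupBy("txn_date", "branch_id")
        .agg(F.sum("amount").alias("total_amount"),
             F.count("*").alias("txn_count"))
)

daily.write.mode("overwrite").partitionBy("txn_date").parquet(
    "s3://curated-zone/daily_txn_summary/"
)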

Posted 3 weeks ago

Apply

0.0 - 2.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Data Engineer - 1 (Experience: 0-2 years)

What we offer: Our mission is simple - building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with the aim of enhancing customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX, Kotak's Data Exchange, is the central data organization for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists the opportunity to build things from scratch and create a best-in-class data lakehouse solution. The primary skills for this team are software development, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising roughly 10 sub-teams that independently drive their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and to help build systems that can eventually be operated by machines using AI technologies.

The data platform org is divided into three key verticals.

Data Platform: This vertical is responsible for building the data platform, including optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving training and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for a financial organization. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and branch managers, and by all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: driving business decisions with technical input and leading the team; designing, implementing, and supporting a data infrastructure from scratch; managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extracting, transforming, and loading data from various sources using SQL and AWS big data technologies; exploring and learning the latest AWS technologies to enhance capabilities and efficiency; collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis; improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers; and building data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. Experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
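
For illustration only: a minimal Airflow DAG (the posting lists Airflow/MWAA) chaining an extract task and a load task. The DAG id, schedule, and task bodies are assumptions.

```python
# Illustrative sketch: a two-task daily pipeline in Airflow 2.x style.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull data from a source system")   # placeholder logic

def load():
    print("write curated data to S3/Redshift")  # placeholder logic

with DAG(
    dag_id="daily_ingest",
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load)
    t_extract >> t_load
```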

Posted 3 weeks ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Bengaluru

Work from Office

What we offer: Our mission is simple - building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with the aim of enhancing customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX, Kotak's Data Exchange, is the central data organization for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists the opportunity to build things from scratch and create a best-in-class data lakehouse solution. The primary skills for this team are software development, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising roughly 10 sub-teams that independently drive their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and to help build systems that can eventually be operated by machines using AI technologies.

The data platform org is divided into three key verticals.

Data Platform: This vertical is responsible for building the data platform, including optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving training and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for a financial organization. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and branch managers, and by all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: driving business decisions with technical input and leading the team; designing, implementing, and supporting a data infrastructure from scratch; managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extracting, transforming, and loading data from various sources using SQL and AWS big data technologies; exploring and learning the latest AWS technologies to enhance capabilities and efficiency; collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis; improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers; and building data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. 3-5 years of experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager: 10+ years of engineering experience, most of it in the data domain. 5+ years of engineering team management experience. 10+ years of planning, designing, developing, and delivering consumer software. Experience partnering with product or program management teams. 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists. Experience designing or architecting new and existing systems (design patterns, reliability, scaling). Experience managing multiple concurrent programs, projects, and development teams in an Agile environment. Strong understanding of Data Platform, Data Engineering, and Data Governance. Experience designing and developing large-scale, high-traffic applications.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
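
For illustration only: an "advanced SQL" window-function query run against a Redshift cluster with psycopg2, touching two of the skills listed above. The connection details, schema, and columns are assumptions; in practice credentials would come from Secrets Manager or IAM rather than being hard-coded.

```python
# Illustrative sketch: a running-total window query against Redshift.
import psycopg2

conn = psycopg2.connect(
    host="example-cluster.abc123.ap-south-1.redshift.amazonaws.com",  # hypothetical
    port=5439, dbname="analytics", user="etl_user", password="***",
)

QUERY = """
SELECT customer_id,
       txn_date,
       amount,
       SUM(amount) OVER (PARTITION BY customer_id ORDER BY txn_date
                         ROWS UNBOUNDED PRECEDING) AS running_total
FROM curated.transactions
WHERE txn_date >= DATEADD(day, -30, CURRENT_DATE);
"""

with conn, conn.cursor() as cur:
    cur.execute(QUERY)
    for row in cur.fetchmany(10):
        print(row)
```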

Posted 3 weeks ago

Apply

2.0 - 5.0 years

30 - 32 Lacs

Bengaluru

Work from Office

Data Engineer - 2 (Experience: 2-5 years)

What we offer: Our mission is simple - building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with the aim of enhancing customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX, Kotak's Data Exchange, is the central data organization for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists the opportunity to build things from scratch and create a best-in-class data lakehouse solution. The primary skills for this team are software development, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising roughly 10 sub-teams that independently drive their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and to help build systems that can eventually be operated by machines using AI technologies.

The data platform org is divided into three key verticals.

Data Platform: This vertical is responsible for building the data platform, including optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving training and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for a financial organization. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and branch managers, and by all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: driving business decisions with technical input and leading the team; designing, implementing, and supporting a data infrastructure from scratch; managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extracting, transforming, and loading data from various sources using SQL and AWS big data technologies; exploring and learning the latest AWS technologies to enhance capabilities and efficiency; collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis; improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers; and building data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. Experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
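
For illustration only: triggering and polling an AWS Glue ETL job with boto3, one small piece of the orchestration work the posting describes. The job name, argument, and region are assumptions.

```python
# Illustrative sketch: start a Glue job run and wait for a terminal state.
import time

import boto3

glue = boto3.client("glue", region_name="ap-south-1")

run_id = glue.start_job_run(
    JobName="curate-transactions",                 # hypothetical job name
    Arguments={"--run_date": "2024-01-01"},
)["JobRunId"]

while True:
    state = glue.get_job_run(JobName="curate-transactions", RunId=run_id)["JobRun"]["JobRunState"]
    print("Glue job state:", state)
    if state in ("SUCCEEDED", "FAILED", "STOPPED", "TIMEOUT"):
        break
    time.sleep(30)
```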

Posted 3 weeks ago

Apply

3.0 - 5.0 years

30 - 35 Lacs

Bengaluru

Work from Office

What we offer: Our mission is simple - building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with the aim of enhancing customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX, Kotak's Data Exchange, is the central data organization for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists the opportunity to build things from scratch and create a best-in-class data lakehouse solution. The primary skills for this team are software development, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising roughly 10 sub-teams that independently drive their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and to help build systems that can eventually be operated by machines using AI technologies.

The data platform org is divided into three key verticals.

Data Platform: This vertical is responsible for building the data platform, including optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving training and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for a financial organization. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and branch managers, and by all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: driving business decisions with technical input and leading the team; designing, implementing, and supporting a data infrastructure from scratch; managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extracting, transforming, and loading data from various sources using SQL and AWS big data technologies; exploring and learning the latest AWS technologies to enhance capabilities and efficiency; collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis; improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers; and building data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. 3-5 years of experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager: 10+ years of engineering experience, most of it in the data domain. 5+ years of engineering team management experience. 10+ years of planning, designing, developing, and delivering consumer software. Experience partnering with product or program management teams. 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists. Experience designing or architecting new and existing systems (design patterns, reliability, scaling). Experience managing multiple concurrent programs, projects, and development teams in an Agile environment. Strong understanding of Data Platform, Data Engineering, and Data Governance. Experience designing and developing large-scale, high-traffic applications.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
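
For illustration only: a sketch of the "config-based and programmatic" data-model building the posting mentions, where a small config registry drives which datasets get built. The config keys, paths, and builder function are assumptions; a real implementation would plug a Spark read-transform-write into the builder.

```python
# Illustrative sketch: a config-driven dataset builder.
from typing import Callable, Dict

PIPELINE_CONFIG: Dict[str, Dict[str, str]] = {
    "customer_dim": {"source": "s3://raw-zone/customers/",    "target": "s3://curated-zone/customer_dim/"},
    "txn_fact":     {"source": "s3://raw-zone/transactions/", "target": "s3://curated-zone/txn_fact/"},
}

def build_dataset(name: str, cfg: Dict[str, str]) -> None:
    # placeholder for a Spark read-transform-write keyed off the config entry
    print(f"building {name}: {cfg['source']} -> {cfg['target']}")

def run_all(config: Dict[str, Dict[str, str]],
            builder: Callable[[str, Dict[str, str]], None]) -> None:
    for dataset, cfg in config.items():
        builder(dataset, cfg)

if __name__ == "__main__":
    run_all(PIPELINE_CONFIG, build_dataset)
```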

Posted 3 weeks ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

What we offer: Our mission is simple - building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking by embracing a technology-first approach in everything we do, with the aim of enhancing customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team: DEX, Kotak's Data Exchange, is the central data organization for Kotak Bank and manages the bank's entire data experience. The org comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform from on-premises solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists the opportunity to build things from scratch and create a best-in-class data lakehouse solution. The primary skills for this team are software development, preferably Python, for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising roughly 10 sub-teams that independently drive their charters. As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; to be an early member of Kotak's digital transformation journey; to learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and to help build systems that can eventually be operated by machines using AI technologies.

The data platform org is divided into three key verticals.

Data Platform: This vertical is responsible for building the data platform, including optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving training and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will learn and build data models in a config-based and programmatic way, and think big to build one of the most leveraged data models for a financial organization. This team will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, and branch managers, and by all analytics use cases.

Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include: driving business decisions with technical input and leading the team; designing, implementing, and supporting a data infrastructure from scratch; managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA; extracting, transforming, and loading data from various sources using SQL and AWS big data technologies; exploring and learning the latest AWS technologies to enhance capabilities and efficiency; collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis; improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers; and building data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS for Data Engineer / SDE in Data: Bachelor's degree in Computer Science, Engineering, or a related field. 3-5 years of experience in data engineering. Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR. Experience with data pipeline tools such as Airflow and Spark. Experience with data modeling and data quality best practices. Excellent problem-solving and analytical skills. Strong communication and teamwork skills. Experience in at least one modern scripting or programming language, such as Python, Java, or Scala. Strong advanced SQL skills.

BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager: 10+ years of engineering experience, most of it in the data domain. 5+ years of engineering team management experience. 10+ years of planning, designing, developing, and delivering consumer software. Experience partnering with product or program management teams. 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists. Experience designing or architecting new and existing systems (design patterns, reliability, scaling). Experience managing multiple concurrent programs, projects, and development teams in an Agile environment. Strong understanding of Data Platform, Data Engineering, and Data Governance. Experience designing and developing large-scale, high-traffic applications.

PREFERRED QUALIFICATIONS: AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow. Prior experience in the Indian banking segment and/or fintech is desired. Experience with non-relational databases and data stores. Building and operating highly available, distributed data processing systems for large datasets. Professional software engineering best practices for the full software development life cycle. Designing, developing, and implementing different types of data warehousing layers. Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions. Building scalable data infrastructure and understanding distributed systems concepts. SQL, ETL, and data modelling. Ensuring the accuracy and availability of data to customers. Proficiency in at least one scripting or programming language for handling large-volume data processing. Strong presentation and communication skills.
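
For illustration only: simple data-quality checks (null counts and duplicate keys) of the kind "data quality best practices" implies, written in PySpark. The dataset path and key column are assumptions.

```python
# Illustrative sketch: count nulls per column and duplicate primary keys.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-checks").getOrCreate()
df = spark.read.parquet("s3://curated-zone/txn_fact/")   # hypothetical path

null_counts = df.select(
    [F.sum(F.col(c).isNull().cast("int")).alias(c) for c in df.columns]
)
null_counts.show(truncate=False)

dupes = df.groupBy("txn_id").count().filter(F.col("count") > 1)
print("duplicate keys:", dupes.count())
```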

Posted 3 weeks ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Pune, Ahmedabad

Work from Office

We are seeking a skilled and motivated Google / AWS Cloud DevOps Engineer with over 3 years of hands-on experience in building and maintaining scalable, reliable, and secure cloud infrastructure. You will be part of a dynamic team that focuses on delivering robust DevOps solutions using Google Cloud Platform (GCP) and AWS, helping to streamline CI/CD pipelines, automate infrastructure provisioning, and optimize cloud-based deployments. Key Responsibilities: Design, implement, and manage scalable and secure infrastructure on Google Cloud Platform / AWS. Develop and maintain CI/CD pipelines using tools such as Cloud Build, Jenkins, GitLab CI/CD, or similar. Implement infrastructure as code (IaC) using Terraform or Pulumi. Monitor system health and performance using GCP's operations suite (formerly Stackdriver) or the corresponding AWS monitoring services. Automate manual processes to improve system reliability and deployment frequency. Collaborate with software engineers to ensure best DevOps practices are followed in application development and deployment. Handle incident response and root cause analysis for production issues. Ensure compliance with security and governance policies on AWS / GCP. Optimize cost and resource utilization across cloud services. Required Qualifications: 3+ years of hands-on experience with DevOps tools and practices in a cloud environment. Strong experience with Google Cloud Platform (GCP) / AWS services (Compute Engine, Kubernetes Engine, Cloud Functions, Cloud Storage, VPC, etc.). A Google Professional Cloud DevOps Engineer or equivalent AWS DevOps certification is mandatory. Proficiency with CI/CD tools and version control systems (e.g., Git, GitHub/GitLab, Cloud Build). Solid scripting skills in Bash, Python, or similar languages. Experience with Docker and Kubernetes. Familiarity with monitoring/logging tools such as Prometheus, Grafana, and Cloud Monitoring. Knowledge of networking, security best practices, and IAM on GCP / AWS. Preferred Qualifications: Experience with multi-cloud or hybrid cloud environments. Familiarity with Agile and DevOps culture and practices. Experience with serverless architectures and event-driven design patterns. Knowledge of cost optimization and GCP/AWS billing.
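As a rough illustration of the infrastructure-as-code work described above, here is a minimal Pulumi sketch in Python that provisions a GCP storage bucket; the resource names and location are hypothetical, and the same pattern applies to AWS resources via pulumi_aws.

```python
# Minimal Pulumi (Python) sketch: provision a GCS bucket and export its URL.
# Assumes `pulumi` and `pulumi-gcp` are installed and a Pulumi stack is configured.
import pulumi
import pulumi_gcp as gcp

# Hypothetical bucket for build artifacts; the name and location are illustrative only.
artifact_bucket = gcp.storage.Bucket(
    "ci-artifacts",
    location="ASIA-SOUTH1",
    uniform_bucket_level_access=True,
)

# Export the bucket URL so other stacks or pipelines can reference it.
pulumi.export("artifact_bucket_url", artifact_bucket.url)
```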

Posted 3 weeks ago

Apply

4.0 - 9.0 years

8 - 18 Lacs

Hyderabad, Chennai

Work from Office

About the Role: We are looking for a highly skilled and experienced Machine Learning / AI Engineer to join our team at Zenardy. The ideal candidate needs to have a proven track record of building, deploying, and optimizing machine learning models in real-world applications. You will be responsible for designing scalable ML systems, collaborating with cross-functional teams, and driving innovation through AI-powered solutions. Location: Chennai, Hyderabad. Key Responsibilities: Design, develop, and deploy machine learning models to solve complex business problems. Work across the full ML lifecycle: data collection, preprocessing, model training, evaluation, deployment, and monitoring. Collaborate with data engineers, product managers, and software engineers to integrate ML models into production systems. Conduct research and stay up-to-date with the latest ML/AI advancements, applying them where appropriate. Optimize models for performance, scalability, and robustness. Document methodologies, experiments, and findings clearly for both technical and non-technical audiences. Mentor junior ML engineers or data scientists as needed. Required Qualifications: Bachelor's or Master's degree in Computer Science, Machine Learning, Data Science, or a related field (Ph.D. is a plus). Minimum of 5 hands-on ML/AI projects, preferably in production or with real-world datasets. Proficiency in Python and ML libraries/frameworks like TensorFlow, PyTorch, Scikit-learn, XGBoost. Solid understanding of core ML concepts: supervised/unsupervised learning, neural networks, NLP, computer vision, etc. Experience with model deployment using APIs, containers (Docker), and cloud platforms (AWS/GCP/Azure). Strong data manipulation and analysis skills using Pandas, NumPy, and SQL. Knowledge of software engineering best practices: version control (Git), CI/CD, unit testing. Preferred Skills: Experience with MLOps tools (MLflow, Kubeflow, SageMaker, etc.). Familiarity with big data technologies like Spark, Hadoop, or distributed training frameworks. Experience working in fintech environments is a plus. Strong problem-solving mindset with excellent communication skills. Experience working with vector databases. Understanding of RAG vs. fine-tuning vs. prompt engineering. Why Join Us: Work on impactful, real-world AI challenges. Collaborate with a passionate and innovative team. Opportunities for career advancement and learning. Flexible work environment (remote/hybrid options). Competitive compensation and benefits.
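Since the role emphasizes deploying models behind APIs, here is a minimal, illustrative FastAPI sketch that serves predictions from a pre-trained scikit-learn model; the model file name and feature layout are assumptions made for the example, not part of this posting.

```python
# Minimal sketch: serving a pre-trained scikit-learn model over a REST API.
# Assumes FastAPI, uvicorn, scikit-learn and joblib are installed, and that
# "model.joblib" (a fitted estimator) already exists -- illustrative assumptions.
from fastapi import FastAPI
from pydantic import BaseModel
import joblib

app = FastAPI()
model = joblib.load("model.joblib")  # hypothetical artifact produced by a training job


class PredictRequest(BaseModel):
    features: list[float]  # flat feature vector; real schemas are usually richer


@app.post("/predict")
def predict(req: PredictRequest):
    # scikit-learn expects a 2D array: one row per sample.
    prediction = model.predict([req.features])
    return {"prediction": prediction.tolist()}

# Run locally with: uvicorn app:app --reload
```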

Posted 3 weeks ago

Apply

15.0 - 20.0 years

15 - 19 Lacs

Ahmedabad

Work from Office

Project Role: Technology Architect. Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software and security. Must-have skills: Amazon Web Services (AWS). Good-to-have skills: Java Full Stack Development. Minimum 7.5 year(s) of experience is required. Educational Qualification: 15 years of full-time education.

Key Responsibilities: 1. Experience designing multiple cloud-native application architectures. 2. Experience developing and deploying cloud-native applications, including serverless environments like Lambda. 3. Optimize applications for the AWS environment. 4. Design, build and configure applications on AWS to meet business process and application requirements. 5. Understanding of security, performance and cost optimizations for AWS. 6. Understanding of AWS Well-Architected best practices.

Technical Experience: 1. 8/15 years of experience in the industry, with at least 5 years in AWS. 2. Strong development background with exposure to the majority of AWS services. 3. AWS Certified Developer / Professional and/or AWS specialty-level certification (DevOps/Security). 4. Application development skills on the AWS platform with either the Java SDK, Python SDK, or ReactJS. 5. Strong coding skills in any of Python/Node.js/Java/.NET and an understanding of AWS architectures across containerization, microservices and serverless. 6. Preferred knowledge of Cost Explorer, budgeting and tagging in AWS. 7. Experience with DevOps tools, including AWS-native DevOps tools like CodeDeploy.

Professional Attributes: a. Ability to harvest solutions and promote reusability across implementations. b. Self-motivated experts who can work under their own direction with the right design-thinking expertise. c. Proven interpersonal skills while contributing to team effort by accomplishing related results as needed. Educational Qualification: 15 years of full-time education.

Additional Info: 1. Application development skills on the AWS platform with either the Java SDK, Python SDK, Node.js, or ReactJS. 2. AWS services: Lambda, AWS Amplify, AWS App Runner, AWS CodePipeline, AWS Cloud9, EBS, Fargate. Additional comments: Only Bangalore, No Location Flex and No Level Flex. Qualification: 15 years of full-time education.
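The serverless responsibilities above centre on AWS Lambda; as a simple, non-authoritative illustration, here is the shape of a Python Lambda handler behind an API Gateway-style event. The event fields follow the common proxy-integration pattern and the business logic is hypothetical.

```python
# Minimal sketch of an AWS Lambda handler (Python runtime), assuming an
# API Gateway proxy integration event; the payload fields are illustrative.
import json


def lambda_handler(event, context):
    # API Gateway proxy integrations pass the request body as a JSON string.
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")

    # Return the proxy-integration response shape expected by API Gateway.
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"Hello, {name}"}),
    }
```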

Posted 3 weeks ago

Apply

9.0 - 12.0 years

27 - 42 Lacs

Chennai

Work from Office

We are looking for a Sr. Software Engineer to analyze large amounts of raw information to find patterns and build data products that extract valuable business insights. Java developer roles and responsibilities include managing Java/Java EE application development while providing expertise across the full software development lifecycle, from concept and design to testing. Mandatory • Database knowledge (e.g. MSQL, Oracle, NoSQL) • Advanced Java (mainly the Spring Boot framework, web development, networking, and some familiarity with specific tools like Maven) Nice to have • Power BI/Tableau • Python • Azure/AWS cloud skills and any Azure AI skills Non-Technical skills: • Analytical mind and business acumen • Strong math skills (e.g. statistics, algebra) • Problem-solving aptitude • Excellent communication and presentation skills

Posted 3 weeks ago

Apply

2.0 - 5.0 years

18 - 21 Lacs

Hyderabad

Work from Office

Overview Annalect is currently seeking a data engineer to join our technology team. In this role you will build Annalect products which sit atop cloud-based data infrastructure. We are looking for people who have a shared passion for technology, design & development, data, and fusing these disciplines together to build cool things. In this role, you will work on one or more software and data products in the Annalect Engineering Team. You will participate in technical architecture, design, and development of software products as well as research and evaluation of new technical solutions. Responsibilities Design, build, test and deploy scalable and reusable systems that handle large amounts of data. Collaborate with product owners and data scientists to build new data products. Ensure data quality and reliability. Qualifications Experience designing and managing data flows. Experience designing systems and APIs to integrate data into applications. 4+ years of Linux, Bash, Python, and SQL experience. 2+ years using Spark and other frameworks to process large volumes of data. 2+ years using Parquet, ORC, or other columnar file formats. 2+ years using AWS cloud services, especially services used for data processing, e.g. Glue, Dataflow, Data Factory, EMR, Dataproc, HDInsight, Athena, Redshift, BigQuery, etc. Passion for Technology: Excitement for new technology, bleeding-edge applications, and a positive attitude towards solving real-world challenges.
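As a small illustration of the AWS data-processing services listed above, here is a hedged boto3 sketch that runs an Amazon Athena query over data in S3; the database, table, and bucket names are hypothetical.

```python
# Minimal sketch: run an Amazon Athena query from Python with boto3 and poll for the result.
# Database/table/bucket names are illustrative; assumes AWS credentials are configured.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

query = "SELECT campaign_id, COUNT(*) AS impressions FROM events GROUP BY campaign_id LIMIT 10"

execution = athena.start_query_execution(
    QueryString=query,
    QueryExecutionContext={"Database": "example_analytics"},                   # hypothetical
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},    # hypothetical
)
query_id = execution["QueryExecutionId"]

# Poll until the query finishes (a production pipeline would add timeouts/backoff).
while True:
    state = athena.get_query_execution(QueryExecutionId=query_id)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=query_id)["ResultSet"]["Rows"]
    print(rows[:3])
```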

Posted 3 weeks ago

Apply

0.0 - 2.0 years

1 - 2 Lacs

Kolkata

Work from Office

Job Title: Cloud Presales Support Executive (Kolkata Based Candidates are preferred) Location: Kolkata Job Type: Full-time Work from Office Experience: 6 months - 2 years Job Summary: We are seeking a skilled and dynamic Cloud Presales Support Executive to bridge the gap between cloud technical teams and business stakeholders. The ideal candidate will support cloud proposals, pricing, and customer engagement providing both technical insights and commercial value propositions. You will work closely with sales, engineering, and product teams to deliver cloud solutions that meet client needs while aligning with business goals. Role & responsibilities Assist in tracking client inquiries and coordinate with the sales and technical teams to ensure timely and accurate responses. Support cloud usage analysis by generating basic reports using tools like AWS Cost Explorer, Azure Cost Management. Maintain and regularly update cloud solution templates, pricing sheets, and client presentation decks to ensure accuracy and relevance. Benchmark cloud service pricing and features across providers (AWS, Azure, GWS) to support solution comparisons and recommendations. Create and manage a centralized repository of reusable cloud solution assets such as case studies, proposal templates, and SLAs. Monitor industry news, vendor updates, and promotional offers to inform the team about new opportunities or price changes. Participate in internal brainstorming sessions to contribute to the development of customized and cost-effective client cloud solutions. Schedule and coordinate meetings, demos, and follow-ups related to pre-sales and commercial discussions. Assist in the creation of customer-facing documentation, including FAQs, solution diagrams, service overviews, and how-to guides. Work under the guidance of senior cloud engineers to identify and suggest cost optimization strategies based on client usage patterns. Preferred candidate profile Good understanding of cloud services provided by AWS, Azure and GCP. Strong interest in a hybrid career role involving both technology and business. Good communication and presentation skills. Ability to work collaboratively with both technical and sales teams. Certification in AWS Cloud Practitioner or Azure Fundamentals (AZ-900) is an added advantage. Kolkata-based candidates will be preferred.
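For the cloud-usage reporting mentioned above, a minimal sketch of pulling monthly cost data from AWS Cost Explorer with boto3 might look like the following; the date range is illustrative and assumes Cost Explorer is enabled on the account.

```python
# Minimal sketch: fetch month-by-month unblended cost per service via AWS Cost Explorer.
# Assumes AWS credentials with Cost Explorer access; the date range is illustrative.
import boto3

ce = boto3.client("ce", region_name="us-east-1")

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2025-04-01", "End": "2025-07-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for period in response["ResultsByTime"]:
    print(period["TimePeriod"]["Start"])
    for group in period["Groups"]:
        service = group["Keys"][0]
        amount = group["Metrics"]["UnblendedCost"]["Amount"]
        print(f"  {service}: {float(amount):.2f} USD")
```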

Posted 3 weeks ago

Apply

2.0 - 4.0 years

2 - 7 Lacs

Chennai, Bengaluru

Work from Office

Role & responsibilities: Provision and manage Azure resources such as Virtual Machines, Storage Accounts, Networking components, Databricks workspaces, and Azure SQL Managed Instance Deploy and configure Azure Databricks privately, including setting up clusters, managing workspaces, libraries, and permissions. Configured and debugged private endpoint setups and firewall rules for secure access to Azure Databricks. Optimized cluster sizing and autoscaling configurations based on workload characteristics and cost considerations. Analyzed job run logs, and cluster event logs to identify and remediate root causes of failures. Troubleshoot and resolve errors and service interruptions across the Azure ecosystem, especially issues related to Databricks, Azure Data Factory, and APIs . Monitor health and performance of services using Azure Monitor, Log Analytics, and Application Insights . Ensure optimal configuration of Azure Networking , Loadbalacer, Application Gateway,including VNets, NSGs, Firewalls, and ExpressRoute or VPNs if required. Implement and maintain RBAC, IAM policies, and resource tagging for access control and cost management. Coordinate with engineering and data teams to support infrastructure needs and resolve platform-level issues. Maintain proper backup, disaster recovery, and patch management across services. Work with Azure DevOps for resource deployment automation and release management. Designed and implemented CI/CD pipelines using GitHub Actions for automated build, test, and deployment workflows across multiple environments. Keep documentation updated for architecture, configurations, and operational processes. Designed, deployed, and managed scalable Kubernetes clusters using Azure Kubernetes Service (AKS). Configured node pools, autoscaling, and workload balancing in AKS. Implemented and maintained AKS cluster upgrades and versioning strategies. Integrated AKS with Azure services including Azure Monitor, Azure Key Vault, and Azure Container Registry (ACR). Managed network configurations including VNETs, subnets, NSGs, and private cluster setups for AKS. Designed and implemented CI/CD pipelines using GitHub Actions for automated build, test, and deployment workflows across multiple environments. Required Skills & Experience: 2-6 years of experience as an Azure Administrator or similar role. Strong hands-on experience in managing: Azure Databricks (clusters, workspaces, permissions) Azure VMs, Networking, and Storage,Backup,AKS,Keyvault,Private Endpoint Azure Monitor and Diagnostics Azure Resource Manager (ARM) templates or Bicep Proficient in identifying and resolving Azure connectivity errors and performance issues , especially in Databricks pipelines and integrations . Working knowledge of PowerShell, CLI , and portal-based operations . Familiarity with Azure Data Factory , APIM , and SQL MI is a plus. Strong troubleshooting and communication skills to work across teams.

Posted 3 weeks ago

Apply

2.0 - 5.0 years

7 - 12 Lacs

Pune

Work from Office

We're looking for a skilled Python Developer with strong hands-on experience integrating modern LLMs such as OpenAI, Claude, Amazon Titan, Gemini, etc. The ideal candidate will have deep experience in building and fine-tuning LLM-powered features.
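As a rough sketch of the kind of LLM integration this role involves, here is a minimal call to the OpenAI chat completions API in Python; the model name and prompt are illustrative, and equivalent SDKs exist for Claude, Titan (via Amazon Bedrock), and Gemini.

```python
# Minimal sketch: call an LLM (OpenAI here) from Python.
# Assumes the openai package is installed and OPENAI_API_KEY is set; the model name is illustrative.
from openai import OpenAI

client = OpenAI()  # reads the API key from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "Summarize what a vector database is in one sentence."},
    ],
)

print(response.choices[0].message.content)
```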

Posted 3 weeks ago

Apply

6.0 - 10.0 years

14 - 24 Lacs

Bengaluru

Work from Office

Job Title: Java + AWS OR Java + Azure. Experience: Minimum 4 years. Location: Bengaluru. Preferred: Immediate joiners (only candidates able to join by July 2025). Interested candidates, share your CV at Mohini.sharma@adecco.com. Job Description: We are looking for a highly motivated and experienced Java Full Stack Developer with a strong command of Java coding, SQL, and Data Structures & Algorithms (DSA). The ideal candidate will have at least 4 years of hands-on experience in full stack development and a solid understanding of software design principles. Key Responsibilities: Design and develop robust, scalable applications using Java, Spring Boot, and SQL. Write efficient code leveraging data structures and algorithms for performance optimization. Develop responsive UI components using ReactJS or Angular. Integrate RESTful services and manage data formats like JSON and XML. Collaborate with cross-functional teams in Agile development environments. Implement unit testing using JUnit/Mockito and participate in CI/CD pipelines. Required Skills: Bachelor's degree in Computer Science or a related field. Minimum 5 years of hands-on experience in Java Full Stack development. Proficiency in Core Java, Spring Boot, and microservices architecture. Strong SQL expertise with databases such as Oracle, MySQL, PostgreSQL. Solid understanding of data structures and algorithms. Experience with version control (Git) and build tools (Maven/Gradle). Familiarity with CI/CD, automated testing, and Agile/Scrum practices. Exposure to cloud platforms such as AWS, Azure, or Pivotal Cloud is a plus. Excellent communication and problem-solving skills. Preferred Skills (Good to Have): Frontend experience with ReactJS or Angular. Experience in writing and consuming REST/SOAP web services. Exposure to containerization tools like Docker and orchestration with Kubernetes. Location: Bengaluru. Work Mode: WFO/Hybrid (depending on project requirements). Notice Period: Immediate joiners or up to 30 days preferred.

Posted 3 weeks ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

Coimbatore, Bengaluru

Work from Office

Role & responsibilities The ideal candidate will have a strong background in understanding business problem statements, mapping them to available data, and deriving insights through thorough data analysis using SQL and Python. The role involves handling large datasets, writing complex database queries, and leveraging automation to streamline processes. Experience in Python and automation is a plus. Perform advanced business analysis to identify trends and insights in large healthcare datasets using SQL and Python Write and optimize SQL queries to extract, join, and analyse data from various systems Collaborate with business users to understand needs and translate them into actionable data models Interpret operational data to support process optimization and revenue improvement Analyze and interpret complex healthcare data from various sources to provide insights and support strategic decision making. Work closely with product, engineering, and client-facing teams to define data requirements and specifications. Produce and maintain data flow diagrams, data catalogs, and data dictionaries ensuring they are concise, up-to-date, and scalable. Uphold data architecture standards and best practices to improve data quality, interoperability, and portability. Stay informed on healthcare industry trends, regulations, and standard clinical metrics and taxonomies. Prepare and present detailed data analysis reports for both technical and non-technical stakeholders. Coordinate with clients, partners, and internal teams to understand their data needs and requirements and provide appropriate solutions. Qualifications: Business Data Analyst with strong SQL expertise and 5+years of experience preferably in a healthtech company. Proven ability to perform comprehensive business analysis and data interpretation Proficiency in writing complex SQL queries and optimizing database performance Familiarity with business simulation tools to test proposed improvements Hands-on experience in handling databases and working with data extraction and transformation. Experience in the healthcare or RCM domain is a big plus Proficiency in SQL, Python , and data visualization platforms (e.g., Power BI, Tableau ). Exposure to cloud platforms (e.g., AWS, Azure, GCP) Strong knowledge of Excel and PowerPoint for reporting. Some experience or interest with data from payer, hospital, and clinic support systems such as Electronic Health Records (EHR) systems (Epic, Cerner, Allscripts, Meditech, etc.), Payor claims processing, EDI, EHR, HIE, PBM as well as standard clinical metrics and taxonomies (HEDIS, STARS, HCC, CCS, etc). Understanding of the US healthcare system. Proven experience collaborating with business and engineering teams. Excellent communication skills, with the ability to effectively lead meetings and reach consensus through collaboration. Proficiency with data and analytics tools, including statistical software and databases. Experience with data visualization tools and techniques is a plus. Demonstrated ability to translate complex data into clear, actionable insights.

Posted 3 weeks ago

Apply

11.0 - 15.0 years

30 - 40 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Job Summary: As a Cloud Solution Lead, you will drive cloud strategy, architecture, and implementation for business-critical systems. You will provide technical leadership while managing a team of cloud engineers, ensuring efficient deployment, monitoring, and optimization of cloud solutions. Your expertise in cloud technologies, stakeholder collaboration, and people management will be essential in fostering innovation and operational excellence. Key Responsibilities: Lead the design, development, and deployment of scalable cloud solutions. Oversee cloud security, compliance, and governance in alignment with industry standards. Provide technical direction and mentoring to cloud engineering teams. Manage relationships with vendors and internal stakeholders. Optimize cloud costs and performance using industry best practices. Ensure seamless cloud integration with existing IT infrastructure and applications. Lead incident resolution and continuous improvement strategies. Drive automation and DevOps adoption to enhance operational efficiency. Required Skills & Experience: Strong expertise in Azure/AWS/GCP cloud environments. Proven experience in leading teams and managing people in cloud solution delivery. Proficiency in IaC tools (Terraform, ARM, CloudFormation). Strong grasp of CI/CD pipelines and automation techniques. Familiarity with cloud security frameworks and risk management. Ability to collaborate cross-functionally with IT and business teams. Excellent communication, leadership, and stakeholder management skills. Preferred Qualifications: Cloud certifications ( AWS Solutions Architect, Azure Expert, GCP Professional Architect ) preferred. Experience in multi-cloud strategies and hybrid cloud integration .

Posted 3 weeks ago

Apply

2.0 - 3.0 years

10 - 19 Lacs

Panchkula

Work from Office

We are seeking a highly skilled and motivated DevOps Engineer to join our team. You will play a key role in designing, implementing, and maintaining scalable infrastructure and deployment pipelines. The ideal candidate should have hands-on experience with cloud environments, automation tools, and container orchestration. Key Responsibilities: Develop and maintain CI/CD pipelines for automated testing and deployment. Manage and monitor cloud infrastructure (AWS/Azure/GCP). Configure and maintain Docker containers and Kubernetes clusters. Automate infrastructure using Terraform , Ansible , or similar tools. Improve system reliability and performance through monitoring and alerting (Prometheus, Grafana, ELK stack). Collaborate with development, QA, and product teams to ensure seamless deployments and high system availability. Maintain security and compliance standards across environments. Manage source code and version control tools such as Git. Required Skills and Qualifications: Bachelors degree in Computer Science, Engineering, or a related field. 3–4 years of hands-on experience in DevOps or System Administration. Experience with containerization tools (Docker) and orchestration platforms (Kubernetes). Proficiency in cloud services (AWS, Azure, or GCP). Experience with scripting languages (Bash, Python, or Shell). Hands-on experience with CI/CD tools such as Jenkins, GitLab CI/CD, CircleCI, etc. Strong understanding of system/network administration and troubleshooting. Familiar with infrastructure as code tools like Terraform, CloudFormation, or Ansible.
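For the monitoring and alerting stack mentioned above, a minimal sketch of exposing custom application metrics to Prometheus from Python might look like this; the metric names, port, and simulated work are arbitrary examples, with Grafana sitting on top of the Prometheus server that scrapes the endpoint.

```python
# Minimal sketch: expose custom metrics for Prometheus to scrape (prometheus_client package).
# Metric names, port, and the simulated work are illustrative only.
import random
import time

from prometheus_client import Counter, Histogram, start_http_server

REQUESTS_TOTAL = Counter("app_requests_total", "Total requests handled")
REQUEST_LATENCY = Histogram("app_request_latency_seconds", "Request latency in seconds")


def handle_request():
    # Simulated request handling; real code would wrap actual work.
    with REQUEST_LATENCY.time():
        time.sleep(random.uniform(0.01, 0.1))
    REQUESTS_TOTAL.inc()


if __name__ == "__main__":
    start_http_server(8000)  # metrics served at http://localhost:8000/metrics
    while True:
        handle_request()
```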

Posted 3 weeks ago

Apply

8.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Hybrid

We are seeking a highly skilled and experienced Cloud Data Engineer to join our dynamic team. You will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure on GCP/AWS/Azure, ensuring data is accessible, reliable, and available for business use. Key Responsibilities: Data Pipeline Development: Design, develop, and maintain data pipelines using GCP/AWS/Azure services such as Dataflow, Dataproc, BigQuery, and Cloud Storage. Data Integration: Work on integrating data from various sources (structured, semi-structured, and unstructured) into GCP/AWS/Azure environments. Data Modeling: Develop and maintain efficient data models in BigQuery to support analytics and reporting needs. Data Warehousing: Implement data warehousing solutions on GCP, optimizing performance and scalability. ETL/ELT Processes: Build and manage ETL/ELT processes using tools like Apache Airflow, Data Fusion, and Python. Data Quality & Governance: Implement data quality checks, data lineage, and data governance best practices to ensure high data integrity. Automation: Automate data pipelines and workflows to reduce manual effort and improve efficiency. Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver data solutions that meet business needs. Optimization: Continuously monitor and optimize the performance of data pipelines and queries for cost and efficiency. Security: Ensure data security and compliance with industry standards and best practices. Required Skills & Qualifications: Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field. Experience: 8+ years of experience in data engineering, with at least 2 years working with GCP/Azure/AWS. Technical Skills: Strong programming skills in Python, SQL, and PySpark, and familiarity with Java/Scala. Experience with orchestration tools like Apache Airflow. Knowledge of ETL/ELT processes and tools. Experience with data modeling and designing data warehouses in BigQuery. Familiarity with CI/CD pipelines and version control systems like Git. Understanding of data governance, security, and compliance. Soft Skills: Excellent problem-solving and analytical skills. Strong communication and collaboration abilities. Ability to work in a fast-paced environment and manage multiple priorities. Preferred Qualifications: Certifications: GCP Professional Data Engineer or GCP Professional Cloud Architect certification. Domain Knowledge: Experience in the finance, e-commerce, or healthcare domain is a plus.
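Because the role leans heavily on Apache Airflow for orchestration, here is a minimal, illustrative DAG sketch (assuming Airflow 2.4+); the DAG id, schedule, and task logic are placeholders rather than a real pipeline.

```python
# Minimal sketch of an Apache Airflow DAG with a daily extract -> transform chain.
# DAG id, schedule, and task logic are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull data from a source system into a staging area.
    print("extracting raw data")


def transform(**context):
    # Placeholder: clean and model the staged data for the warehouse.
    print("transforming staged data")


with DAG(
    dag_id="example_daily_elt",
    start_date=datetime(2025, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)

    extract_task >> transform_task
```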

Posted 3 weeks ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies