
852 AWS Cloud Jobs - Page 7

JobPe aggregates listings for easy access; applications are submitted directly on the original job portal.

0.0 - 1.0 years

0 - 2 Lacs

Ahmedabad

Work from Office

Source: Naukri

The key qualifications we are looking for include:
- Strong problem-solving and communication skills.
- Experience with virtualization (VMware, Nutanix, Hyper-V).
- Hands-on experience with server hardware (Dell, HP, Cisco, Supermicro).
- Proficiency in networking protocols and security best practices.
- Prior experience as an infrastructure engineer or in a similar role.
- Relevant certifications (MCSE, AWS Solutions Architect) are a plus.
- Ability to work collaboratively in a team-oriented environment.
- An average of 70% in 10th and 12th standard is required.

Posted 4 days ago

Apply

3.0 - 5.0 years

27 - 32 Lacs

Bengaluru

Work from Office

Source: Naukri

Job Title: Data Engineer (DE) / SDE - Data
Location: Bangalore
Experience range: 3-15 years

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking with a technology-first approach in everything we do, aiming to enhance customer experience through superior banking services. We welcome the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which offers a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, with ~10 sub-teams independently driving their charters. As a member of this team, you will learn the fintech space, join Kotak's digital transformation journey early, build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way, and help build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: builds the data platform, including optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. This team is also the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: owns data pipelines for thousands of datasets, sources data from 100+ source systems, and enables data consumption for 30+ data analytics products. The team builds data models in a config-based, programmatic way, aiming for one of the most leveraged data models among financial orgs, and enables centralized reporting for Kotak Bank across multiple products and dimensions. The data built by this team is consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, this is the team for you. Your day-to-day role will include:
- Driving business decisions with technical input and leading the team.
- Designing, implementing, and supporting a data infrastructure from scratch.
- Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies (see the sketch below).
- Exploring and learning the latest AWS technologies to enhance capabilities and efficiency.
- Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Building data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS (Data Engineer / SDE in Data)
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3-5 years of experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark.
- Experience with data modeling and data quality best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering best practices for the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for handling large-volume data processing.
- Strong presentation and communication skills.
- For managers: customer centricity and obsession for the customer; ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working; ability to structure and organize teams and streamline communication; prior experience executing large-scale data engineering projects.
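To illustrate the PySpark/Spark SQL ETL work this role centers on, here is a minimal sketch; the bucket, paths, table, and column names are invented for the example and are not part of the posting.

```python
# Hypothetical PySpark ETL sketch: read raw CSV, aggregate via Spark SQL,
# and write partitioned Parquet to S3. All paths and names are invented.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Extract: raw transactions landed in S3 (hypothetical path)
raw = spark.read.option("header", True).csv("s3://example-bucket/raw/transactions/")
raw.createOrReplaceTempView("transactions")

# Transform: daily aggregates expressed in Spark SQL
daily = spark.sql("""
    SELECT account_id,
           to_date(txn_ts) AS txn_date,
           SUM(amount)     AS total_amount,
           COUNT(*)        AS txn_count
    FROM transactions
    GROUP BY account_id, to_date(txn_ts)
""")

# Load: partitioned Parquet back into the lake (hypothetical path)
daily.write.mode("overwrite").partitionBy("txn_date") \
    .parquet("s3://example-bucket/curated/daily_transactions/")

spark.stop()
```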

Posted 4 days ago

Apply

0.0 - 2.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

Data Engineer - 1 (Experience: 0-2 years)

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking with a technology-first approach in everything we do, aiming to enhance customer experience through superior banking services. We welcome the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which offers a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, with ~10 sub-teams independently driving their charters. As a member of this team, you will learn the fintech space, join Kotak's digital transformation journey early, build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way, and help build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: builds the data platform, including optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. This team is also the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: owns data pipelines for thousands of datasets, sources data from 100+ source systems, and enables data consumption for 30+ data analytics products. The team builds data models in a config-based, programmatic way, aiming for one of the most leveraged data models among financial orgs, and enables centralized reporting for Kotak Bank across multiple products and dimensions. The data built by this team is consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, this is the team for you. Your day-to-day role will include:
- Driving business decisions with technical input and leading the team.
- Designing, implementing, and supporting a data infrastructure from scratch.
- Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies.
- Exploring and learning the latest AWS technologies to enhance capabilities and efficiency.
- Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Building data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS (Data Engineer / SDE in Data)
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark (see the DAG sketch below).
- Experience with data modeling and data quality best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering best practices for the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for handling large-volume data processing.
- Strong presentation and communication skills.
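Since the posting names Airflow among its pipeline tools, a minimal, hypothetical DAG sketch follows; the DAG id, schedule, and task bodies are invented, and it assumes Airflow 2.4+ for the `schedule` argument.

```python
# Minimal, hypothetical Airflow DAG for a daily extract -> load job.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder: pull rows from a source system (hypothetical).
    print("extracting source rows...")


def load(**context):
    # Placeholder: write transformed rows to the warehouse (hypothetical).
    print("loading into warehouse...")


with DAG(
    dag_id="example_daily_etl",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # load runs only after extract succeeds
```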

Posted 4 days ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Bengaluru

Work from Office

Source: Naukri

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking with a technology-first approach in everything we do, aiming to enhance customer experience through superior banking services. We welcome the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which offers a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, with ~10 sub-teams independently driving their charters. As a member of this team, you will learn the fintech space, join Kotak's digital transformation journey early, build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way, and help build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: builds the data platform, including optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. This team is also the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: owns data pipelines for thousands of datasets, sources data from 100+ source systems, and enables data consumption for 30+ data analytics products. The team builds data models in a config-based, programmatic way, aiming for one of the most leveraged data models among financial orgs, and enables centralized reporting for Kotak Bank across multiple products and dimensions. The data built by this team is consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, this is the team for you. Your day-to-day role will include:
- Driving business decisions with technical input and leading the team.
- Designing, implementing, and supporting a data infrastructure from scratch.
- Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies.
- Exploring and learning the latest AWS technologies to enhance capabilities and efficiency.
- Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Building data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS (Data Engineer / SDE in Data)
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3-5 years of experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark.
- Experience with data modeling and data quality best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

BASIC QUALIFICATIONS (Data Engineering Manager / Software Development Manager)
- 10+ years of engineering experience, most of it in the data domain.
- 5+ years of engineering team management experience.
- 10+ years of experience planning, designing, developing, and delivering consumer software.
- Experience partnering with product or program management teams.
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists.
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems.
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment.
- Strong understanding of Data Platform, Data Engineering, and Data Governance.
- Experience designing and developing large-scale, high-traffic applications.

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering best practices for the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for handling large-volume data processing.
- Strong presentation and communication skills.

Posted 4 days ago

Apply

2.0 - 5.0 years

30 - 32 Lacs

Bengaluru

Work from Office

Source: Naukri

Data Engineer - 2 (Experience: 2-5 years)

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking with a technology-first approach in everything we do, aiming to enhance customer experience through superior banking services. We welcome the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which offers a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, with ~10 sub-teams independently driving their charters. As a member of this team, you will learn the fintech space, join Kotak's digital transformation journey early, build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way, and help build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: builds the data platform, including optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. This team is also the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: owns data pipelines for thousands of datasets, sources data from 100+ source systems, and enables data consumption for 30+ data analytics products. The team builds data models in a config-based, programmatic way, aiming for one of the most leveraged data models among financial orgs, and enables centralized reporting for Kotak Bank across multiple products and dimensions. The data built by this team is consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, this is the team for you. Your day-to-day role will include:
- Driving business decisions with technical input and leading the team.
- Designing, implementing, and supporting a data infrastructure from scratch.
- Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies.
- Exploring and learning the latest AWS technologies to enhance capabilities and efficiency.
- Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Building data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS (Data Engineer / SDE in Data)
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark.
- Experience with data modeling and data quality best practices (see the sketch below).
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering best practices for the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for handling large-volume data processing.
- Strong presentation and communication skills.
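As a sketch of the data-quality practices the qualifications mention, here is a hypothetical PySpark check that gates a pipeline on null keys and duplicates; the dataset path and key columns are invented.

```python
# Hypothetical data-quality gate in PySpark: fail fast if key columns
# contain nulls or duplicate rows. Path and column names are invented.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-check").getOrCreate()
df = spark.read.parquet("s3://example-bucket/curated/daily_transactions/")

total = df.count()
null_keys = df.filter(F.col("account_id").isNull()).count()
dupes = total - df.dropDuplicates(["account_id", "txn_date"]).count()

if null_keys or dupes:
    raise ValueError(f"DQ failure: {null_keys} null keys, {dupes} duplicate rows")
print(f"DQ passed for {total} rows")
```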

Posted 4 days ago

Apply

3.0 - 5.0 years

30 - 35 Lacs

Bengaluru

Work from Office

Source: Naukri

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking with a technology-first approach in everything we do, aiming to enhance customer experience through superior banking services. We welcome the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which offers a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, with ~10 sub-teams independently driving their charters. As a member of this team, you will learn the fintech space, join Kotak's digital transformation journey early, build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way, and help build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: builds the data platform, including optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. This team is also the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: owns data pipelines for thousands of datasets, sources data from 100+ source systems, and enables data consumption for 30+ data analytics products. The team builds data models in a config-based, programmatic way, aiming for one of the most leveraged data models among financial orgs, and enables centralized reporting for Kotak Bank across multiple products and dimensions. The data built by this team is consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, this is the team for you. Your day-to-day role will include:
- Driving business decisions with technical input and leading the team.
- Designing, implementing, and supporting a data infrastructure from scratch.
- Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA (see the Glue sketch below).
- Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies.
- Exploring and learning the latest AWS technologies to enhance capabilities and efficiency.
- Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Building data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS (Data Engineer / SDE in Data)
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3-5 years of experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark.
- Experience with data modeling and data quality best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

BASIC QUALIFICATIONS (Data Engineering Manager / Software Development Manager)
- 10+ years of engineering experience, most of it in the data domain.
- 5+ years of engineering team management experience.
- 10+ years of experience planning, designing, developing, and delivering consumer software.
- Experience partnering with product or program management teams.
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists.
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems.
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment.
- Strong understanding of Data Platform, Data Engineering, and Data Governance.
- Experience designing and developing large-scale, high-traffic applications.

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering best practices for the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for handling large-volume data processing.
- Strong presentation and communication skills.
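The responsibilities include managing AWS resources such as Glue; a minimal boto3 sketch of triggering and polling a Glue job follows. The job name and region are invented, and AWS credentials are assumed to be configured.

```python
# Hypothetical boto3 sketch: start an AWS Glue job run and poll its status.
import time

import boto3

glue = boto3.client("glue", region_name="ap-south-1")

run = glue.start_job_run(JobName="example-curation-job")  # invented job name
run_id = run["JobRunId"]

while True:
    state = glue.get_job_run(JobName="example-curation-job", RunId=run_id)["JobRun"]["JobRunState"]
    if state in ("SUCCEEDED", "FAILED", "STOPPED"):
        print(f"Glue run {run_id} finished: {state}")
        break
    time.sleep(30)  # Glue runs are long-lived; poll sparingly
```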

Posted 4 days ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office

Source: Naukri

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking with a technology-first approach in everything we do, aiming to enhance customer experience through superior banking services. We welcome the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to revamp the entire data platform, moving from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which offers a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, with ~10 sub-teams independently driving their charters. As a member of this team, you will learn the fintech space, join Kotak's digital transformation journey early, build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way, and help build forward-looking systems that can be operated by machines using AI technologies.

The data platform org is divided into three key verticals:

Data Platform: builds the data platform, including optimized storage for the entire bank and a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. This team is also the center of Data Engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.

Data Engineering: owns data pipelines for thousands of datasets, sources data from 100+ source systems, and enables data consumption for 30+ data analytics products. The team builds data models in a config-based, programmatic way, aiming for one of the most leveraged data models among financial orgs, and enables centralized reporting for Kotak Bank across multiple products and dimensions. The data built by this team is consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.

Data Governance: the central data governance team for Kotak Bank, managing metadata platforms and the Data Privacy, Data Security, Data Stewardship, and Data Quality platforms.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, this is the team for you. Your day-to-day role will include:
- Driving business decisions with technical input and leading the team.
- Designing, implementing, and supporting a data infrastructure from scratch.
- Managing AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extracting, transforming, and loading data from various sources using SQL and AWS big data technologies.
- Exploring and learning the latest AWS technologies to enhance capabilities and efficiency.
- Collaborating with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improving ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Building data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS (Data Engineer / SDE in Data)
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3-5 years of experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark.
- Experience with data modeling and data quality best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

BASIC QUALIFICATIONS (Data Engineering Manager / Software Development Manager)
- 10+ years of engineering experience, most of it in the data domain.
- 5+ years of engineering team management experience.
- 10+ years of experience planning, designing, developing, and delivering consumer software.
- Experience partnering with product or program management teams.
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists.
- Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems.
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment.
- Strong understanding of Data Platform, Data Engineering, and Data Governance.
- Experience designing and developing large-scale, high-traffic applications.

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech is desired.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering best practices for the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for handling large-volume data processing.
- Strong presentation and communication skills.

Posted 4 days ago

Apply

3.0 - 5.0 years

5 - 7 Lacs

Pune, Ahmedabad

Work from Office

Source: Naukri

We are seeking a skilled and motivated Google / AWS Cloud DevOps Engineer with over 3 years of hands-on experience building and maintaining scalable, reliable, and secure cloud infrastructure. You will be part of a dynamic team delivering robust DevOps solutions on Google Cloud Platform (GCP) and AWS, helping streamline CI/CD pipelines, automate infrastructure provisioning, and optimize cloud-based deployments.

Key Responsibilities:
- Design, implement, and manage scalable and secure infrastructure on GCP / AWS.
- Develop and maintain CI/CD pipelines using tools such as Cloud Build, Jenkins, GitLab CI/CD, or similar.
- Implement infrastructure as code (IaC) using Terraform or Pulumi (see the sketch below).
- Monitor system health and performance using AWS monitoring tools or GCP's operations suite (formerly Stackdriver).
- Automate manual processes to improve system reliability and deployment frequency.
- Collaborate with software engineers to ensure DevOps best practices are followed in application development and deployment.
- Handle incident response and root cause analysis for production issues.
- Ensure compliance with security and governance policies on AWS / GCP.
- Optimize cost and resource utilization across cloud services.

Required Qualifications:
- 3+ years of hands-on experience with DevOps tools and practices in a cloud environment.
- Strong experience with GCP / AWS services (Compute Engine, Kubernetes Engine, Cloud Functions, Cloud Storage, VPC, etc.).
- Google / AWS Professional Cloud DevOps Engineer certification is mandatory.
- Proficiency with CI/CD tools and version control systems (e.g., Git, GitHub/GitLab, Cloud Build).
- Solid scripting skills in Bash, Python, or similar languages.
- Experience with Docker and Kubernetes.
- Familiarity with monitoring/logging tools such as Prometheus, Grafana, and Cloud Monitoring.
- Knowledge of networking, security best practices, and IAM on GCP / AWS.

Preferred Qualifications:
- Experience with multi-cloud or hybrid cloud environments.
- Familiarity with Agile and DevOps culture and practices.
- Experience with serverless architectures and event-driven design patterns.
- Knowledge of cost optimization and GCP/AWS billing.
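The posting lists Terraform or Pulumi for IaC; here is a minimal Pulumi (Python) sketch under that assumption. The resource name and tags are invented for illustration, and a configured Pulumi stack with AWS credentials is assumed.

```python
# Minimal, hypothetical Pulumi (Python) program: provision an S3 bucket
# tagged for cost tracking. Run via `pulumi up` inside a configured stack.
import pulumi
import pulumi_aws as aws

artifacts = aws.s3.Bucket(
    "ci-artifacts",  # invented logical name; Pulumi appends a suffix
    tags={"team": "devops", "env": "staging"},
)

# Stack output other tooling (e.g., a CI pipeline) could consume
pulumi.export("artifacts_bucket", artifacts.id)
```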

Posted 4 days ago

Apply

4.0 - 9.0 years

8 - 18 Lacs

Hyderabad, Chennai

Work from Office

Source: Naukri

About the Role: We are looking for a highly skilled and experienced Machine Learning / AI Engineer to join our team at Zenardy. The ideal candidate will have a proven track record of building, deploying, and optimizing machine learning models in real-world applications. You will be responsible for designing scalable ML systems, collaborating with cross-functional teams, and driving innovation through AI-powered solutions.

Location: Chennai, Hyderabad

Key Responsibilities:
- Design, develop, and deploy machine learning models to solve complex business problems.
- Work across the full ML lifecycle: data collection, preprocessing, model training, evaluation, deployment, and monitoring (a small sketch follows this listing).
- Collaborate with data engineers, product managers, and software engineers to integrate ML models into production systems.
- Conduct research and stay up to date with the latest ML/AI advancements, applying them where appropriate.
- Optimize models for performance, scalability, and robustness.
- Document methodologies, experiments, and findings clearly for both technical and non-technical audiences.
- Mentor junior ML engineers or data scientists as needed.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Machine Learning, Data Science, or a related field (Ph.D. is a plus).
- Minimum of 5 hands-on ML/AI projects, preferably in production or with real-world datasets.
- Proficiency in Python and ML libraries/frameworks such as TensorFlow, PyTorch, scikit-learn, and XGBoost.
- Solid understanding of core ML concepts: supervised/unsupervised learning, neural networks, NLP, computer vision, etc.
- Experience with model deployment using APIs, containers (Docker), and cloud platforms (AWS/GCP/Azure).
- Strong data manipulation and analysis skills using Pandas, NumPy, and SQL.
- Knowledge of software engineering best practices: version control (Git), CI/CD, unit testing.

Preferred Skills:
- Experience with MLOps tools (MLflow, Kubeflow, SageMaker, etc.).
- Familiarity with big data technologies like Spark and Hadoop, or distributed training frameworks.
- Experience working in fintech environments is a plus.
- Strong problem-solving mindset with excellent communication skills.
- Experience working with vector databases.
- Understanding of RAG vs. fine-tuning vs. prompt engineering.

Why Join Us:
- Work on impactful, real-world AI challenges.
- Collaborate with a passionate and innovative team.
- Opportunities for career advancement and learning.
- Flexible work environment (remote/hybrid options).
- Competitive compensation and benefits.
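As a small illustration of the train/evaluate/persist loop in the ML lifecycle described above, here is a hedged scikit-learn sketch; it uses a bundled toy dataset rather than real project data, and the model choice is arbitrary.

```python
# Hypothetical end-to-end model sketch: train, evaluate, and persist a
# classifier. The saved artifact is what a serving API/container would load.
from joblib import dump
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)  # toy dataset for illustration
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = GradientBoostingClassifier(random_state=42)
model.fit(X_train, y_train)

print(f"accuracy: {accuracy_score(y_test, model.predict(X_test)):.3f}")
dump(model, "model.joblib")  # persist for deployment
```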

Posted 4 days ago

Apply

15.0 - 20.0 years

15 - 19 Lacs

Ahmedabad

Work from Office

Source: Naukri

Project Role: Technology Architect
Project Role Description: Review and integrate all application requirements, including functional, security, integration, performance, quality, and operations requirements. Review and integrate the technical architecture requirements. Provide input into final decisions regarding hardware, network products, system software, and security.
Must-have skills: Amazon Web Services (AWS)
Good-to-have skills: Java Full Stack Development
Minimum 7.5 years of experience is required.
Educational Qualification: 15 years of full-time education.

Key Responsibilities:
1. Experience designing multiple cloud-native application architectures.
2. Experience developing and deploying cloud-native applications, including serverless environments such as Lambda (see the handler sketch below).
3. Optimize applications for the AWS environment.
4. Design, build, and configure applications on AWS to meet business process and application requirements.
5. Understanding of security, performance, and cost optimization on AWS.
6. Understanding of AWS Well-Architected best practices.

Technical Experience:
1. 8-15 years of experience in the industry, with at least 5 years in AWS.
2. Strong development background with exposure to the majority of AWS services.
3. AWS Certified Developer - Professional and/or an AWS specialty-level certification (DevOps / Security).
4. Application development skills on the AWS platform with the Java SDK, Python SDK, or ReactJS.
5. Strong coding skills in a language such as Python, Node.js, Java, or .NET; understanding of AWS architectures across containerization, microservices, and serverless.
6. Preferred: knowledge of Cost Explorer, budgeting, and tagging in AWS.
7. Experience with DevOps tools, including AWS-native tools such as CodeDeploy.

Professional Attributes:
a. Ability to harvest solutions and promote reusability across implementations.
b. Self-motivated expert who can work under their own direction with the right design-thinking expertise.
c. Proven interpersonal skills, contributing to team effort by accomplishing related results as needed.

Additional Info: Application development skills on the AWS platform with the Java SDK, Python SDK, Node.js, or ReactJS. AWS services: Lambda, AWS Amplify, AWS App Runner, AWS CodePipeline, AWS Cloud9, EBS, Fargate.
Additional comments: Only Bangalore; no location flex and no level flex.
Qualification: 15 years of full-time education.
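Since the role emphasizes serverless development on Lambda, a minimal, hypothetical Python handler sketch follows; it assumes the API Gateway proxy integration event shape, which is not stated in the posting.

```python
# Minimal, hypothetical AWS Lambda handler behind API Gateway (proxy
# integration assumed): parse the JSON body and return a JSON response.
import json


def lambda_handler(event, context):
    body = json.loads(event.get("body") or "{}")
    name = body.get("name", "world")  # invented field for illustration
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": f"hello, {name}"}),
    }
```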

Posted 4 days ago

Apply

9.0 - 12.0 years

27 - 42 Lacs

Chennai

Work from Office

Source: Naukri

We are looking for a Sr. Software Engineer to analyze large amounts of raw information to find patterns and build data products that extract valuable business insights. The Java developer's responsibilities include managing Java/Java EE application development and providing expertise across the full software development lifecycle, from concept and design to testing.

Mandatory:
- Database knowledge (e.g., MySQL, Oracle, NoSQL).
- Advanced Java (mainly the Spring Boot framework, web development, networking, and some familiarity with specific tools like Maven).

Nice to have:
- Power BI / Tableau.
- Python.
- Azure/AWS cloud skills and any Azure AI skills.

Non-technical skills:
- Analytical mind and business acumen.
- Strong math skills (e.g., statistics, algebra).
- Problem-solving aptitude.
- Excellent communication and presentation skills.

Posted 4 days ago

Apply

2.0 - 5.0 years

18 - 21 Lacs

Hyderabad

Work from Office

Source: Naukri

Overview: Annalect is currently seeking a data engineer to join our technology team. In this role you will build Annalect products that sit atop cloud-based data infrastructure. We are looking for people with a shared passion for technology, design and development, and data, and for fusing these disciplines together to build cool things. You will work on one or more software and data products in the Annalect Engineering Team, participating in technical architecture, design, and development of software products as well as research and evaluation of new technical solutions.

Responsibilities:
- Design, build, test, and deploy scalable and reusable systems that handle large amounts of data.
- Collaborate with product owners and data scientists to build new data products.
- Ensure data quality and reliability.

Qualifications:
- Experience designing and managing data flows.
- Experience designing systems and APIs to integrate data into applications.
- 4+ years of Linux, Bash, Python, and SQL experience.
- 2+ years using Spark and other frameworks to process large volumes of data.
- 2+ years using Parquet, ORC, or other columnar file formats.
- 2+ years using AWS cloud services, especially data-processing services such as Glue, Dataflow, Data Factory, EMR, Dataproc, HDInsight, Athena, Redshift, and BigQuery (see the Athena sketch below).
- Passion for technology: excitement for new technology, bleeding-edge applications, and a positive attitude toward solving real-world challenges.
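Given the posting's emphasis on Athena, columnar formats, and AWS data services, here is a hypothetical boto3 sketch that runs an Athena query and polls for completion; the database, table, buckets, and region are invented, and credentials are assumed to be configured.

```python
# Hypothetical boto3 sketch: run an Athena query over data in S3 and poll
# until it completes. All names are invented for illustration.
import time

import boto3

athena = boto3.client("athena", region_name="us-east-1")

qid = athena.start_query_execution(
    QueryString="SELECT campaign_id, COUNT(*) FROM impressions GROUP BY campaign_id",
    QueryExecutionContext={"Database": "example_db"},
    ResultConfiguration={"OutputLocation": "s3://example-bucket/athena-results/"},
)["QueryExecutionId"]

while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(2)

print(f"query {qid}: {state}")  # results land in the S3 output location
```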

Posted 4 days ago

Apply

0.0 - 2.0 years

1 - 2 Lacs

Kolkata

Work from Office

Source: Naukri

Job Title: Cloud Presales Support Executive (Kolkata-based candidates preferred)
Location: Kolkata
Job Type: Full-time, Work from Office
Experience: 6 months - 2 years

Job Summary: We are seeking a skilled and dynamic Cloud Presales Support Executive to bridge the gap between cloud technical teams and business stakeholders. The ideal candidate will support cloud proposals, pricing, and customer engagement, providing both technical insights and commercial value propositions. You will work closely with sales, engineering, and product teams to deliver cloud solutions that meet client needs while aligning with business goals.

Role & responsibilities:
- Assist in tracking client inquiries and coordinate with the sales and technical teams to ensure timely and accurate responses.
- Support cloud usage analysis by generating basic reports using tools like AWS Cost Explorer and Azure Cost Management (see the sketch below).
- Maintain and regularly update cloud solution templates, pricing sheets, and client presentation decks to ensure accuracy and relevance.
- Benchmark cloud service pricing and features across providers (AWS, Azure, GWS) to support solution comparisons and recommendations.
- Create and manage a centralized repository of reusable cloud solution assets such as case studies, proposal templates, and SLAs.
- Monitor industry news, vendor updates, and promotional offers to inform the team about new opportunities or price changes.
- Participate in internal brainstorming sessions to contribute to the development of customized and cost-effective client cloud solutions.
- Schedule and coordinate meetings, demos, and follow-ups related to pre-sales and commercial discussions.
- Assist in the creation of customer-facing documentation, including FAQs, solution diagrams, service overviews, and how-to guides.
- Work under the guidance of senior cloud engineers to identify and suggest cost optimization strategies based on client usage patterns.

Preferred candidate profile:
- Good understanding of cloud services provided by AWS, Azure, and GCP.
- Strong interest in a hybrid career role involving both technology and business.
- Good communication and presentation skills.
- Ability to work collaboratively with both technical and sales teams.
- Certification in AWS Cloud Practitioner or Azure Fundamentals (AZ-900) is an added advantage.
- Kolkata-based candidates will be preferred.
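For the cloud usage analysis mentioned above, a minimal boto3 sketch against the AWS Cost Explorer API follows; the date range is illustrative and AWS credentials are assumed to be configured.

```python
# Hypothetical boto3 sketch: pull one month's cost per AWS service via
# Cost Explorer, the kind of basic report the posting refers to.
import boto3

ce = boto3.client("ce")  # Cost Explorer client

resp = ce.get_cost_and_usage(
    TimePeriod={"Start": "2025-05-01", "End": "2025-06-01"},  # illustrative
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for group in resp["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = group["Metrics"]["UnblendedCost"]["Amount"]
    print(f"{service}: ${float(amount):.2f}")
```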

Posted 4 days ago

Apply

2.0 - 4.0 years

2 - 7 Lacs

Chennai, Bengaluru

Work from Office

Source: Naukri

Role & responsibilities:
- Provision and manage Azure resources such as Virtual Machines, Storage Accounts, networking components, Databricks workspaces, and Azure SQL Managed Instance.
- Deploy and configure Azure Databricks privately, including setting up clusters and managing workspaces, libraries, and permissions.
- Configure and debug private endpoint setups and firewall rules for secure access to Azure Databricks.
- Optimize cluster sizing and autoscaling configurations based on workload characteristics and cost considerations.
- Analyze job run logs and cluster event logs to identify and remediate root causes of failures.
- Troubleshoot and resolve errors and service interruptions across the Azure ecosystem, especially issues related to Databricks, Azure Data Factory, and APIs.
- Monitor the health and performance of services using Azure Monitor, Log Analytics, and Application Insights.
- Ensure optimal configuration of Azure networking, including Load Balancer, Application Gateway, VNets, NSGs, firewalls, and ExpressRoute or VPNs if required.
- Implement and maintain RBAC, IAM policies, and resource tagging for access control and cost management.
- Coordinate with engineering and data teams to support infrastructure needs and resolve platform-level issues.
- Maintain proper backup, disaster recovery, and patch management across services.
- Work with Azure DevOps for resource deployment automation and release management.
- Design and implement CI/CD pipelines using GitHub Actions for automated build, test, and deployment workflows across multiple environments.
- Keep documentation updated for architecture, configurations, and operational processes.
- Design, deploy, and manage scalable Kubernetes clusters using Azure Kubernetes Service (AKS), including node pools, autoscaling, workload balancing, cluster upgrades, and versioning strategies.
- Integrate AKS with Azure services including Azure Monitor, Azure Key Vault, and Azure Container Registry (ACR).
- Manage network configurations including VNets, subnets, NSGs, and private cluster setups for AKS.

Required Skills & Experience:
- 2-6 years of experience as an Azure Administrator or in a similar role.
- Strong hands-on experience managing Azure Databricks (clusters, workspaces, permissions); Azure VMs, networking, storage, backup, AKS, Key Vault, and private endpoints; Azure Monitor and diagnostics; and Azure Resource Manager (ARM) templates or Bicep.
- Proficiency in identifying and resolving Azure connectivity errors and performance issues, especially in Databricks pipelines and integrations.
- Working knowledge of PowerShell, the CLI, and portal-based operations (see the SDK sketch below).
- Familiarity with Azure Data Factory, APIM, and SQL MI is a plus.
- Strong troubleshooting and communication skills for working across teams.
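As a small illustration of programmatic Azure administration alongside PowerShell/CLI work, here is a hedged sketch using the Azure SDK for Python; the subscription ID is a placeholder, and the azure-identity and azure-mgmt-compute packages are assumed to be installed.

```python
# Hypothetical Azure SDK (Python) sketch: list VMs in a subscription and
# report provisioning state and region. Subscription ID is a placeholder.
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

credential = DefaultAzureCredential()  # uses CLI, managed identity, or env auth
compute = ComputeManagementClient(credential, "<subscription-id>")

for vm in compute.virtual_machines.list_all():
    print(f"{vm.name}: {vm.provisioning_state} ({vm.location})")
```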

Posted 4 days ago

Apply

2.0 - 5.0 years

7 - 12 Lacs

Pune

Work from Office

Source: Naukri

We're looking for a skilled Python Developer with strong hands-on experience integrating modern LLMs such as OpenAI, Claude, Amazon Titan, and Gemini. The ideal candidate will have deep experience building and fine-tuning LLM-powered features.
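As a sketch of the kind of LLM integration described, here is a minimal example using the OpenAI Python SDK (v1.x); the model choice and prompts are illustrative, not a statement of the team's actual stack.

```python
# Minimal, hypothetical LLM integration via the OpenAI Python SDK (v1.x).
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model choice
    messages=[
        {"role": "system", "content": "You summarize support tickets."},
        {"role": "user", "content": "Customer cannot reset their password."},
    ],
    temperature=0.2,  # keep summaries deterministic-ish
)

print(response.choices[0].message.content)
```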

Posted 4 days ago

Apply

6.0 - 10.0 years

14 - 24 Lacs

Bengaluru

Work from Office

Job Title: Java + AWS or Java + Azure
Experience: Minimum 4 years
Location: Bengaluru
Preferred: Immediate joiners (only candidates able to join by July 2025)
Interested candidates may share their CV at Mohini.sharma@adecco.com

Job Description: We are looking for a highly motivated and experienced Java Full Stack Developer with a strong command of Java coding, SQL, and Data Structures & Algorithms (DSA). The ideal candidate will have at least 4 years of hands-on experience in full stack development and a solid understanding of software design principles.

Key Responsibilities:
- Design and develop robust, scalable applications using Java, Spring Boot, and SQL.
- Write efficient code leveraging Data Structures and Algorithms for performance optimization.
- Develop responsive UI components using ReactJS or Angular.
- Integrate RESTful services and manage data formats like JSON and XML.
- Collaborate with cross-functional teams in Agile development environments.
- Implement unit testing using JUnit/Mockito and participate in CI/CD pipelines.

Required Skills:
- Bachelor's degree in Computer Science or a related field.
- Minimum 5 years of hands-on experience in Java Full Stack development.
- Proficiency in Core Java, Spring Boot, and Microservices architecture.
- Strong SQL expertise with databases such as Oracle, MySQL, and PostgreSQL.
- Solid understanding of Data Structures and Algorithms.
- Experience with version control (Git) and build tools (Maven/Gradle).
- Familiarity with CI/CD, automated testing, and Agile/Scrum practices.
- Exposure to cloud platforms such as AWS, Azure, or Pivotal Cloud is a plus.
- Excellent communication and problem-solving skills.

Preferred Skills (Good to Have):
- Frontend experience with ReactJS or Angular.
- Experience in writing and consuming REST/SOAP web services.
- Exposure to containerization tools like Docker and orchestration with Kubernetes.

Location: Bengaluru
Work Mode: WFO/Hybrid (depending on project requirements)
Notice Period: Immediate joiners or up to 30 days preferred

Posted 4 days ago

Apply

4.0 - 9.0 years

5 - 15 Lacs

Coimbatore, Bengaluru

Work from Office

Role & responsibilities
The ideal candidate will have a strong background in understanding business problem statements, mapping them to available data, and deriving insights through thorough data analysis using SQL and Python. The role involves handling large datasets, writing complex database queries, and leveraging automation to streamline processes. Experience in Python and automation is a plus.
- Perform advanced business analysis to identify trends and insights in large healthcare datasets using SQL and Python.
- Write and optimize SQL queries to extract, join, and analyze data from various systems (an illustrative sketch follows this listing).
- Collaborate with business users to understand needs and translate them into actionable data models.
- Interpret operational data to support process optimization and revenue improvement.
- Analyze and interpret complex healthcare data from various sources to provide insights and support strategic decision-making.
- Work closely with product, engineering, and client-facing teams to define data requirements and specifications.
- Produce and maintain data flow diagrams, data catalogs, and data dictionaries, ensuring they are concise, up to date, and scalable.
- Uphold data architecture standards and best practices to improve data quality, interoperability, and portability.
- Stay informed on healthcare industry trends, regulations, and standard clinical metrics and taxonomies.
- Prepare and present detailed data analysis reports for both technical and non-technical stakeholders.
- Coordinate with clients, partners, and internal teams to understand their data needs and requirements and provide appropriate solutions.

Qualifications:
- Business Data Analyst with strong SQL expertise and 5+ years of experience, preferably in a healthtech company.
- Proven ability to perform comprehensive business analysis and data interpretation.
- Proficiency in writing complex SQL queries and optimizing database performance.
- Familiarity with business simulation tools to test proposed improvements.
- Hands-on experience in handling databases and working with data extraction and transformation.
- Experience in the healthcare or RCM domain is a big plus.
- Proficiency in SQL, Python, and data visualization platforms (e.g., Power BI, Tableau).
- Exposure to cloud platforms (e.g., AWS, Azure, GCP).
- Strong knowledge of Excel and PowerPoint for reporting.
- Some experience or interest in data from payer, hospital, and clinic support systems such as Electronic Health Records (EHR) systems (Epic, Cerner, Allscripts, Meditech, etc.), payor claims processing, EDI, HIE, and PBM, as well as standard clinical metrics and taxonomies (HEDIS, STARS, HCC, CCS, etc.).
- Understanding of the US healthcare system.
- Proven experience collaborating with business and engineering teams.
- Excellent communication skills, with the ability to effectively lead meetings and reach consensus through collaboration.
- Proficiency with data and analytics tools, including statistical software and databases.
- Experience with data visualization tools and techniques is a plus.
- Demonstrated ability to translate complex data into clear, actionable insights.
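To ground the SQL-plus-Python workflow above, a small hypothetical sketch; the claims table and its columns are invented for illustration and are not taken from the posting:

    import sqlite3  # stand-in for the production database engine

    # Hypothetical schema: claims(claim_id, provider, billed_amount, paid_amount, service_date)
    conn = sqlite3.connect("claims.db")
    query = """
        SELECT provider,
               COUNT(*) AS claim_count,
               SUM(billed_amount - paid_amount) AS revenue_gap
        FROM claims
        WHERE service_date >= '2025-01-01'
        GROUP BY provider
        ORDER BY revenue_gap DESC
        LIMIT 10;
    """
    for row in conn.execute(query):
        print(row)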

Posted 5 days ago

Apply

11.0 - 15.0 years

30 - 40 Lacs

Hyderabad, Chennai, Bengaluru

Hybrid

Job Summary: As a Cloud Solution Lead, you will drive cloud strategy, architecture, and implementation for business-critical systems. You will provide technical leadership while managing a team of cloud engineers, ensuring efficient deployment, monitoring, and optimization of cloud solutions. Your expertise in cloud technologies, stakeholder collaboration, and people management will be essential in fostering innovation and operational excellence.

Key Responsibilities:
- Lead the design, development, and deployment of scalable cloud solutions.
- Oversee cloud security, compliance, and governance in alignment with industry standards.
- Provide technical direction and mentoring to cloud engineering teams.
- Manage relationships with vendors and internal stakeholders.
- Optimize cloud costs and performance using industry best practices.
- Ensure seamless cloud integration with existing IT infrastructure and applications.
- Lead incident resolution and continuous improvement strategies.
- Drive automation and DevOps adoption to enhance operational efficiency.

Required Skills & Experience:
- Strong expertise in Azure/AWS/GCP cloud environments.
- Proven experience in leading teams and managing people in cloud solution delivery.
- Proficiency in IaC tools (Terraform, ARM, CloudFormation).
- Strong grasp of CI/CD pipelines and automation techniques.
- Familiarity with cloud security frameworks and risk management.
- Ability to collaborate cross-functionally with IT and business teams.
- Excellent communication, leadership, and stakeholder management skills.

Preferred Qualifications:
- Cloud certifications (AWS Solutions Architect, Azure Expert, GCP Professional Architect) preferred.
- Experience with multi-cloud strategies and hybrid cloud integration.

Posted 5 days ago

Apply

2.0 - 3.0 years

10 - 19 Lacs

Panchkula

Work from Office

We are seeking a highly skilled and motivated DevOps Engineer to join our team. You will play a key role in designing, implementing, and maintaining scalable infrastructure and deployment pipelines. The ideal candidate should have hands-on experience with cloud environments, automation tools, and container orchestration.

Key Responsibilities:
- Develop and maintain CI/CD pipelines for automated testing and deployment.
- Manage and monitor cloud infrastructure (AWS/Azure/GCP).
- Configure and maintain Docker containers and Kubernetes clusters.
- Automate infrastructure using Terraform, Ansible, or similar tools.
- Improve system reliability and performance through monitoring and alerting (Prometheus, Grafana, ELK stack); an illustrative metrics sketch follows this listing.
- Collaborate with development, QA, and product teams to ensure seamless deployments and high system availability.
- Maintain security and compliance standards across environments.
- Manage source code and version control tools such as Git.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3-4 years of hands-on experience in DevOps or system administration.
- Experience with containerization tools (Docker) and orchestration platforms (Kubernetes).
- Proficiency in cloud services (AWS, Azure, or GCP).
- Experience with scripting languages (Bash, Python, or Shell).
- Hands-on experience with CI/CD tools such as Jenkins, GitLab CI/CD, CircleCI, etc.
- Strong understanding of system/network administration and troubleshooting.
- Familiarity with infrastructure-as-code tools like Terraform, CloudFormation, or Ansible.
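For the monitoring and alerting responsibility, a minimal sketch that exposes a custom metric with the Prometheus Python client; the metric name and the probe it stands in for are illustrative assumptions:

    import random
    import time
    from prometheus_client import Gauge, start_http_server

    # Hypothetical gauge a Prometheus server would scrape and alert on.
    queue_depth = Gauge("app_queue_depth", "Pending jobs in the deploy queue")

    start_http_server(8000)  # metrics served at http://localhost:8000/metrics
    while True:
        queue_depth.set(random.randint(0, 50))  # stand-in for a real probe
        time.sleep(15)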

Posted 5 days ago

Apply

8.0 - 12.0 years

20 - 25 Lacs

Bengaluru

Hybrid

We are seeking a highly skilled and experienced Cloud Data Engineer to join our dynamic team. You will be responsible for designing, building, and maintaining scalable data pipelines and infrastructure on GCP/AWS/Azure, ensuring data is accessible, reliable, and available for business use.

Key Responsibilities:
- Data Pipeline Development: Design, develop, and maintain data pipelines using GCP/AWS/Azure services such as Dataflow, Dataproc, BigQuery, and Cloud Storage.
- Data Integration: Integrate data from various sources (structured, semi-structured, and unstructured) into GCP/AWS/Azure environments.
- Data Modeling: Develop and maintain efficient data models in BigQuery to support analytics and reporting needs.
- Data Warehousing: Implement data warehousing solutions on GCP, optimizing performance and scalability.
- ETL/ELT Processes: Build and manage ETL/ELT processes using tools like Apache Airflow, Data Fusion, and Python (an illustrative DAG sketch follows this listing).
- Data Quality & Governance: Implement data quality checks, data lineage, and data governance best practices to ensure high data integrity.
- Automation: Automate data pipelines and workflows to reduce manual effort and improve efficiency.
- Collaboration: Work closely with data scientists, analysts, and other stakeholders to understand data requirements and deliver data solutions that meet business needs.
- Optimization: Continuously monitor and optimize the performance of data pipelines and queries for cost and efficiency.
- Security: Ensure data security and compliance with industry standards and best practices.

Required Skills & Qualifications:
- Education: Bachelor's degree in Computer Science, Information Technology, Engineering, or a related field.
- Experience: 8+ years of experience in data engineering, with at least 2 years working with GCP/Azure/AWS.
- Technical Skills: Strong programming skills in Python, SQL, and PySpark, with familiarity with Java/Scala. Experience with orchestration tools like Apache Airflow. Knowledge of ETL/ELT processes and tools. Experience with data modeling and designing data warehouses in BigQuery. Familiarity with CI/CD pipelines and version control systems like Git. Understanding of data governance, security, and compliance.
- Soft Skills: Excellent problem-solving and analytical skills. Strong communication and collaboration abilities. Ability to work in a fast-paced environment and manage multiple priorities.

Preferred Qualifications:
- Certifications: GCP Professional Data Engineer or GCP Professional Cloud Architect certification.
- Domain Knowledge: Experience in the finance, e-commerce, or healthcare domains is a plus.
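A minimal sketch of the orchestration piece, written as an Apache Airflow 2.x DAG; the DAG name, schedule, and task bodies are illustrative assumptions:

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw files from cloud storage")  # placeholder step

    def load():
        print("load curated tables into the warehouse")  # placeholder step

    with DAG(
        dag_id="daily_curated_load",   # illustrative name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",             # Airflow 2.4+ keyword
        catchup=False,
    ) as dag:
        t1 = PythonOperator(task_id="extract", python_callable=extract)
        t2 = PythonOperator(task_id="load", python_callable=load)
        t1 >> t2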

Posted 5 days ago

Apply

12.0 - 18.0 years

35 - 60 Lacs

Hyderabad

Hybrid

Senior Manager, Site Reliability Engineering - Hyderabad
Shift Timings: 1.00 PM - 10.00 PM

People Leader Responsibility: The position will manage 5 to 10 engineers, both directly and indirectly, including Site Reliability Engineers, Observability Engineers, Performance Engineers, DevSecOps Engineers, and others, ranging from entry-level to senior titles.

Responsibilities:
- Lead and manage a team of Site Reliability Engineers, providing mentorship, guidance, and support to ensure the team's success.
- Develop and implement strategies for improving system reliability, scalability, and performance.
- Establish and enforce SRE best practices, including monitoring, alerting, error budget tracking, and post-incident reviews (a sketch of the error-budget math follows this listing).
- Collaborate with software engineering teams to design and implement reliable, scalable, and efficient systems.
- Implement and maintain monitoring and alerting systems to proactively identify and address issues before they impact customers.
- Implement performance engineering processes to ensure the reliability of products, services, and platforms.
- Drive automation and tooling efforts to streamline operations and improve efficiency.
- Continuously evaluate and improve our infrastructure, processes, and practices to ensure reliability and scalability.
- Provide technical leadership and guidance on complex engineering projects and initiatives.
- Stay up to date with industry trends and emerging technologies in site reliability engineering and cloud computing.
- Other duties as assigned.

Required Work Experience:
- 10+ years of experience in site reliability engineering or a related field.
- 5+ years of experience in a leadership or management role, managing a team of engineers.
- 5+ years of hands-on working experience with Dynatrace (administration, deployment, etc.).
- Strong understanding of DevSecOps principles.
- Strong understanding of cloud computing principles and technologies, preferably AWS, Azure, or GCP.
- Strong communication and interpersonal skills, with the ability to collaborate effectively with cross-functional teams.
- Proven track record of driving projects to successful completion in a fast-paced, dynamic environment.
- Experience driving cultural change in technical excellence, quality, and efficiency.
- Experience managing and growing technical leaders and teams.
- Ability to construct, interpret, and apply metrics to your work and decision-making, identify correlations between drivers and results, and use that information to drive prioritization and action.

Preferred Work Experience:
- Proficiency in programming/scripting languages such as Python, Go, or Bash.
- Experience with infrastructure-as-code tools such as Terraform or CloudFormation.
- Deep understanding of Linux systems administration and networking principles.
- Experience with containerization and orchestration technologies such as Docker and Kubernetes.
- Experience or familiarity with IIS, HTML, Java, and JBoss.

Knowledge: Site Reliability Engineering principles; DevSecOps principles; Agile (SAFe); healthcare industry; ITIL; ServiceNow; Jira/Confluence.

Skills: Strong communication skills; leadership; programming languages (see above); project management; mentorship; continuous learning.
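The error-budget tracking named in the responsibilities reduces to simple arithmetic; a short sketch with invented numbers:

    # Error-budget math for a 99.9% availability SLO over a 30-day window.
    SLO = 0.999
    window_minutes = 30 * 24 * 60                # 43,200 minutes
    budget_minutes = (1 - SLO) * window_minutes  # 43.2 minutes of allowed downtime

    observed_downtime = 12.5                     # invented figure for illustration
    remaining = budget_minutes - observed_downtime
    print(f"Budget: {budget_minutes:.1f} min, remaining: {remaining:.1f} min "
          f"({remaining / budget_minutes:.0%} of budget left)")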

Posted 5 days ago

Apply

10.0 - 15.0 years

25 - 40 Lacs

Pune, Gurugram, Bengaluru

Hybrid

Our client is a leading global IT services and consulting organization.

Java Architect
Experience: 10-14 years
Work location: Mumbai/Pune/Bangalore/Gurgaon/Noida/Hyderabad/Chennai
Notice period: Immediate to 30 days

Preferred candidate profile
- 10+ years of experience in building large-scale, high-volume, low-latency, high-availability, and complex distributed services.
Architecture and Design
- Ability to identify and showcase potential solutions and recommend the best solution based on requirements.
- Manage stakeholders to drive key decisions on tools, technologies, user journeys, and overall governance.
- Experience with object-oriented, SOLID, and DRY principles, the reactive programming model, microservices, and event-driven solutions.
- Delivered solutions on alternative architecture patterns to meet business requirements; understands enterprise security, compliance, and data security at the network and application layers.
Language & Frameworks and Database
- Worked extensively with the Java language (Java 8 and above), having used concurrency, multithreaded models, blocking/non-blocking IO, lambdas, streams, generics, advanced libraries, algorithms, and data structures.
- Executed database DDL, DML, and modeling; managed transactional scenarios and isolation levels; experienced with both NoSQL and SQL-based DBs.
- Extensively used Spring Boot/Spring Cloud or similar frameworks to deliver complex, scalable solutions.
- Worked extensively on API-based digital journeys and enabled DBT and alternative technologies to achieve the desired outcomes.

Posted 5 days ago

Apply

3.0 - 6.0 years

5 - 15 Lacs

Pune

Work from Office

Hi, we are hiring an Automation Engineer (Python, SQL, PLC, Ignition).
Primary Skills: Python, SQL, Ignition, PLC
Secondary Skills: Kepware/OPC, NLP, AWS Cloud, MQTT, CVML, Networking & Machine Connectivity, IE/ME skill sets, LabVIEW, TestStand

Job Summary
We are seeking an experienced Automation Engineer with a strong background in Python, SQL, Ignition SCADA, and PLC programming. The ideal candidate will be responsible for developing, deploying, and maintaining automation solutions in industrial or manufacturing settings, integrating various hardware and software platforms, and ensuring robust machine connectivity and data flow.

Roles & Responsibilities
Ignition & SCADA Development
- Design, develop, and maintain Ignition applications, dashboards, and HMIs for industrial automation.
- Configure and manage Ignition Gateways, projects, and tag structures.
- Integrate Ignition with PLCs, Kepware/OPC servers, and other industrial devices using protocols such as OPC-UA and MQTT (an illustrative MQTT sketch follows this listing).
- Implement scripting and automation logic using Python within Ignition environments.
Database & Data Integration
- Design and optimize SQL database schemas, tables, views, and indexes to support automation solutions.
- Develop, test, and maintain complex SQL queries, stored procedures, and triggers.
- Integrate data from Ignition and other sources with SQL databases for analytics and reporting.
- Implement ETL processes and data integration solutions as required.
Machine Connectivity & Networking
- Configure and maintain machine connectivity using Kepware, OPC, and networking protocols.
- Analyze machine communication capabilities and connectivity constraints.
- Troubleshoot and optimize network and machine data flows for reliability and performance.
Cloud & Advanced Technologies
- Integrate automation systems with AWS Cloud and other cloud storage solutions as needed.
- Apply knowledge of MQTT, CVML, NLP, and other advanced technologies to enhance automation capabilities.
Testing & Validation
- Develop and execute test plans for automation solutions, including integration, load, and user acceptance testing.
- Validate data accuracy, system reliability, and error-handling mechanisms.
Collaboration & Documentation
- Work closely with cross-functional teams including engineers, IT, and operations to deliver robust solutions.
- Provide training and technical support for deployed systems.
- Ensure thorough documentation of all development activities, configurations, and user manuals.
Maintenance & Optimization
- Monitor system and database performance, implementing improvements as needed.
- Perform regular updates, ensure data security compliance, and troubleshoot issues promptly.
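As a flavor of the MQTT connectivity work described, a minimal subscriber sketch with paho-mqtt 2.x; the broker host and topic hierarchy are placeholder assumptions:

    import paho.mqtt.client as mqtt

    def on_message(client, userdata, msg):
        # In a real deployment this would write to the SQL historian.
        print(msg.topic, msg.payload.decode())

    client = mqtt.Client(mqtt.CallbackAPIVersion.VERSION2)  # paho-mqtt >= 2.0 API
    client.on_message = on_message
    client.connect("broker.local", 1883)            # hypothetical plant broker
    client.subscribe("plant/line1/+/temperature")   # hypothetical topic filter
    client.loop_forever()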

Posted 5 days ago

Apply

8.0 - 13.0 years

25 - 35 Lacs

Chennai

Work from Office

Responsibilities
- Design, develop, and maintain web-based SaaS applications to enhance the performance and reliability of our current applications, and participate in the development of new industry-leading products, leveraging technologies such as Spring, Hibernate, MS SQL, and RabbitMQ.
- Develop high-performance distributed systems using Java and open-source technologies.
- Work with Product Managers, analysts, team members, and stakeholders to understand customer needs, document software requirements, and ensure applications deliver successful customer outcomes.
- Troubleshoot problems, whether due to data or software, and work to rapidly implement repairs.
- Provide guidance, mentorship, and technical leadership to a team of software developers. Set clear goals and expectations, and ensure the team is motivated and working efficiently.
- Demonstrate strong programming skills and in-depth knowledge of software development methodologies, languages, and frameworks. Act as a subject matter expert, assisting team members in problem-solving and code reviews.
- Ensure that software developed meets high-quality standards through continuous testing, debugging, and code refactoring. Implement best practices and maintain code integrity.
- Develop a deep understanding of the business domain in which the software operates. Collaborate closely with stakeholders to comprehend business needs, objectives, and challenges. Translate these requirements into technical solutions that align with business goals and contribute to the overall success of the organization.
- Identify areas for continuous improvement and work with team members and engineering leadership to advance processes, platforms, and tools.

Work Location Type: Chennai, Hybrid

Required Education and Experience
- Bachelor's degree in Computer Science, Computer Engineering, Software Engineering, or a similar technology-related field is required; a Master's degree is highly preferred.
- A minimum of five years of full-time experience developing software solutions.
- Ability to manage changing direction and adapt to a fast-paced environment.
- Ability to work with a variety of personalities and skill levels in a matrixed environment to accomplish deliverables and goals.

Required Skills
- Core Java and J2EE/enterprise web application development expertise (5+ years).
- Strong object-oriented coding practices (UML, design patterns).
- Spring Framework MVC experience highly preferred.
- Experience operating in an Agile environment and a solid understanding of Agile methodology, patterns, and practices.
- Advanced SQL knowledge (MS SQL preferred) with experience in DB and query optimization.
- Event-driven architectures.
- API development experience.
- Cloud-native development experience (AWS preferred).
- Experience with test automation frameworks and an understanding of automation best practices.
- Experience with observability tools and application instrumentation.
- Full software development life cycle experience.

Additional Desired Skills
- SaaS application development.
- Experience practicing continuous delivery and configuring deployment pipelines.
- Experience building containerized applications (Docker, Kubernetes, Infrastructure as Code).
- Message queues: AMQP, RabbitMQ, or equivalent.
- Enterprise architecture experience.
- Front-end development with JavaScript, HTML, and CSS.
- Healthcare domain knowledge a plus.

Posted 5 days ago

Apply

5.0 - 10.0 years

5 - 10 Lacs

Vadodara

Work from Office

Experienced IT Manager with expertise in SAP support, cloud migration, IT infrastructure, cybersecurity, vendor & budget management, project delivery, data governance, and risk mitigation. Skilled in AWS, Azure, and IT operations optimization.

Posted 5 days ago

Apply