
31 AWS Technologies Jobs

JobPe aggregates listings for easy access, but applications are submitted directly on the originating job portal.

3.0 - 8.0 years

14 - 20 Lacs

Hyderabad

Work from Office


Job Area: Information Technology Group > IT Software Developer

General Summary:
The Qualcomm OneIT team is looking for a talented senior full-stack developer to join our dynamic team and contribute to our exciting projects. The ideal candidate will have a strong understanding of Java, Spring Boot, Angular/React, and AWS technologies, as well as experience designing, managing, and deploying applications to the cloud.

Key Responsibilities:
- Design, develop, and maintain web applications using Java, Spring Boot, and Angular/React.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, maintainable, and efficient code.
- Ensure the performance, quality, and responsiveness of applications.
- Identify and correct bottlenecks and fix bugs.
- Help maintain code quality, organization, and automation.
- Stay up to date with the latest industry trends and technologies.

Minimum Qualifications:
- 3+ years of IT-relevant work experience with a Bachelor's degree in a technical field (e.g., Computer Engineering, Computer Science, Information Systems), OR 5+ years of IT-relevant work experience without a Bachelor's degree.
- 3+ years of combined academic or work experience in full-stack application development (e.g., Java, Python, JavaScript).
- 1+ year of combined academic or work experience with data structures, algorithms, and data stores.

The candidate should have:
- A Bachelor's degree in Computer Science, Engineering, or a related field.
- 5-7 years of experience, with a minimum of 3 years as a full-stack developer using Java, Spring Boot, and Angular/React.
- Strong proficiency in Java and Spring Boot.
- Experience with front-end frameworks such as Angular or React.
- Familiarity with RESTful APIs and web services.
- Knowledge of database systems such as Oracle, MySQL, PostgreSQL, or MongoDB.
- Experience with AWS services such as EC2, S3, RDS, Lambda, and API Gateway.
- Understanding of version control systems, preferably Git.
- Excellent problem-solving skills and attention to detail.
- Strong communication and teamwork abilities.

Preferred:
- Experience with another programming language, such as C# or Python.
- Knowledge of containerization technologies such as Docker and Kubernetes.
- Familiarity with CI/CD pipelines and DevOps practices.
- Experience with Agile/Scrum/SAFe methodologies.
- Bachelor's or Master's degree in Information Technology, Computer Science, or equivalent.

Posted 1 day ago

Apply

5.0 - 7.0 years

7 - 11 Lacs

Bengaluru

Work from Office


- Design, develop, and maintain system integration solutions that connect various applications and platforms using APIs, middleware, or other integration tools.
- Collaborate with business analysts, architects, and development teams to gather requirements and implement robust integration workflows.
- Monitor and troubleshoot integration processes, ensuring data consistency, accuracy, and performance.
- Create technical documentation, perform testing, and resolve integration-related issues.
- Ensure compliance with security and data governance standards while optimizing system connectivity and scalability.
- Stay up to date with integration trends and tools to enhance system interoperability.

Posted 1 day ago

Apply

3.0 - 5.0 years

27 - 32 Lacs

Bengaluru

Work from Office


Job Title: Data Engineer (DE) / SDE – Data
Location: Bangalore
Experience range: 3-15 years

What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That is why we at Kotak Group are dedicated to transforming banking with a technology-first approach in everything we do, aiming to enhance customer experience through superior banking services. We welcome the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.

About our team
DEX (Kotak's Data Exchange) is the central data organization for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and works closely with the Analytics organization. DEX is primarily working on a greenfield project to move the entire data platform from on-premise solutions to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists the opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; data engineering with Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The organization is expected to grow to a 100+ member team, primarily based in Bangalore, comprising roughly 10 sub-teams independently driving their charters.

As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions (real-time, micro-batch, batch, and analytics) in a programmatic way; and look ahead to systems that can be operated by machines using AI technologies.

The data platform organization is divided into three key verticals:

Data Platform
This vertical is responsible for building the data platform: optimized storage for the entire bank and a centralized data lake; a managed compute and orchestration framework, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving training and knowledge-sharing sessions with the large data-consumer base within Kotak.

Data Engineering
This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a configuration-driven, programmatic way and think big to build one of the most leveraged data models for financial organizations. It will also enable centralized reporting for Kotak Bank, cutting across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, relationship managers, branch managers, and all analytics use cases.

Data Governance
This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform.

If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency, multi-system environments, this is the team for you. Your day-to-day role will include:
- Drive business decisions with technical input and lead the team.
- Design, implement, and support data infrastructure from scratch.
- Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
- Extract, transform, and load data from various sources using SQL and AWS big-data technologies.
- Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
- Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
- Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
- Build data platforms, data pipelines, and data management and governance tools.

BASIC QUALIFICATIONS (Data Engineer / SDE in Data)
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3-5 years of experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark.
- Experience with data modeling and data quality best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering and best practices across the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for large-volume data processing.
- Strong presentation and communication skills.

For managers:
- Customer centricity and obsession for the customer.
- Ability to manage stakeholders (product owners, business stakeholders, cross-functional teams) and coach agile ways of working.
- Ability to structure and organize teams and streamline communication.
- Prior experience executing large-scale data engineering projects.

Posted 1 day ago

Apply

0.0 - 2.0 years

4 - 8 Lacs

Bengaluru

Work from Office


Data Engineer - 1 (Experience: 0-2 years)

BASIC QUALIFICATIONS (Data Engineer / SDE in Data)
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark.
- Experience with data modeling and data quality best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering and best practices across the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for large-volume data processing.
- Strong presentation and communication skills.

Posted 1 day ago

Apply

3.0 - 5.0 years

3 - 5 Lacs

Bengaluru

Work from Office


BASIC QUALIFICATIONS (Data Engineer / SDE in Data)
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3-5 years of experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark.
- Experience with data modeling and data quality best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

BASIC QUALIFICATIONS (Data Engineering Manager / Software Development Manager)
- 10+ years of engineering experience, most of it in the data domain.
- 5+ years of engineering team management experience.
- 10+ years of experience planning, designing, developing, and delivering consumer software.
- Experience partnering with product or program management teams.
- 5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists.
- Experience designing or architecting (design patterns, reliability and scaling) new and existing systems.
- Experience managing multiple concurrent programs, projects, and development teams in an Agile environment.
- Strong understanding of data platform, data engineering, and data governance.
- Experience designing and developing large-scale, high-traffic applications.

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering and best practices across the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for large-volume data processing.
- Strong presentation and communication skills.

Posted 1 day ago

Apply

2.0 - 5.0 years

30 - 32 Lacs

Bengaluru

Work from Office


Data Engineer - 2 (Experience: 2-5 years)

BASIC QUALIFICATIONS (Data Engineer / SDE in Data)
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Experience in data engineering.
- Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR.
- Experience with data pipeline tools such as Airflow and Spark.
- Experience with data modeling and data quality best practices.
- Excellent problem-solving and analytical skills.
- Strong communication and teamwork skills.
- Experience in at least one modern scripting or programming language, such as Python, Java, or Scala.
- Strong advanced SQL skills.

PREFERRED QUALIFICATIONS
- AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow.
- Prior experience in the Indian banking segment and/or fintech.
- Experience with non-relational databases and data stores.
- Building and operating highly available, distributed data processing systems for large datasets.
- Professional software engineering and best practices across the full software development life cycle.
- Designing, developing, and implementing different types of data warehousing layers.
- Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions.
- Building scalable data infrastructure and understanding distributed systems concepts.
- SQL, ETL, and data modelling.
- Ensuring the accuracy and availability of data to customers.
- Proficiency in at least one scripting or programming language for large-volume data processing.
- Strong presentation and communication skills.

Posted 1 day ago

Apply

3.0 - 5.0 years

30 - 35 Lacs

Bengaluru

Work from Office


What we offer Our mission is simple – Building trust. Our customer's trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That’s why, we at Kotak Group are dedicated to transforming banking by imbibing a technology-first approach in everything we do, with an aim to enhance customer experience by providing superior banking services. We welcome and invite the best technological minds in the country to come join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many. About our team DEX is a central data org for Kotak Bank which manages entire data experience of Kotak Bank. DEX stands for Kotak’s Data Exchange. This org comprises of Data Platform, Data Engineering and Data Governance charter. The org sits closely with Analytics org. DEX is primarily working on greenfield project to revamp entire data platform which is on premise solutions to scalable AWS cloud-based platform. The team is being built ground up which provides great opportunities to technology fellows to build things from scratch and build one of the best-in-class data lake house solutions. The primary skills this team should encompass are Software development skills preferably Python for platform building on AWS; Data engineering Spark (pyspark, sparksql, scala) for ETL development, Advanced SQL and Data modelling for Analytics. The org size is expected to be around 100+ member team primarily based out of Bangalore comprising of ~10 sub teams independently driving their charter. 
As a member of this team, you get opportunity to learn fintech space which is most sought-after domain in current world, be a early member in digital transformation journey of Kotak, learn and leverage technology to build complex data data platform solutions including, real time, micro batch, batch and analytics solutions in a programmatic way and also be futuristic to build systems which can be operated by machines using AI technologies. The data platform org is divided into 3 key verticals Data Platform This Vertical is responsible for building data platform which includes optimized storage for entire bank and building centralized data lake, managed compute and orchestrations framework including concepts of serverless data solutions, managing central data warehouse for extremely high concurrency use cases, building connectors for different sources, building customer feature repository, build cost optimization solutions like EMR optimizers, perform automations and build observability capabilities for Kotak’s data platform. The team will also be center for Data Engineering excellence driving trainings and knowledge sharing sessions with large data consumer base within Kotak. Data Engineering This team will own data pipelines for thousands of datasets, be skilled to source data from 100+ source systems and enable data consumptions for 30+ data analytics products. The team will learn and built data models in a config based and programmatic and think big to build one of the most leveraged data model for financial orgs. This team will also enable centralized reporting for Kotak Bank which cuts across multiple products and dimensions. Additionally, the data build by this team will be consumed by 20K + branch consumers, RMs, Branch Managers and all analytics usecases. Data Governance The team will be central data governance team for Kotak bank managing Metadata platforms, Data Privacy, Data Security, Data Stewardship and Data Quality platform. 
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
Drive business decisions with technical input and lead the team.
Design, implement, and support a data infrastructure from scratch.
Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
Extract, transform, and load data from various sources using SQL and AWS big data technologies.
Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data
Bachelor's degree in Computer Science, Engineering, or a related field
3-5 years of experience in data engineering
Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
Experience with data pipeline tools such as Airflow and Spark
Experience with data modeling and data quality best practices
Excellent problem-solving and analytical skills
Strong communication and teamwork skills
Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
Strong advanced SQL skills
BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
10+ years of engineering experience, most of it in the data domain
5+ years of engineering team management experience
10+ years of experience planning, designing, developing, and delivering consumer software
Experience partnering with product or program management teams
5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
Strong understanding of Data Platform, Data Engineering, and Data Governance
Experience designing and developing large-scale, high-traffic applications
PREFERRED QUALIFICATIONS
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
Prior experience in the Indian banking segment and/or fintech is desired.
Experience with non-relational databases and data stores
Building and operating highly available, distributed data processing systems for large datasets
Professional software engineering best practices across the full software development life cycle
Designing, developing, and implementing different types of data warehousing layers
Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
Building scalable data infrastructure and understanding distributed systems concepts
SQL, ETL, and data modelling
Ensuring the accuracy and availability of data to customers
Proficiency in at least one scripting or programming language for handling large-volume data processing
Strong presentation and communication skills.
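To give a flavour of the "extract, transform, and load data from various sources using SQL" responsibility above, here is a minimal, illustrative sketch. The role describes Spark/Redshift-scale ETL; this example uses Python's built-in sqlite3 purely so it is self-contained, and the table and column names (raw_txns, daily_branch_totals) are hypothetical, not from the listing.

```python
import sqlite3

def run_etl(conn: sqlite3.Connection) -> list:
    cur = conn.cursor()
    # Extract: a raw transactions table, as it might land from a source system.
    cur.execute("CREATE TABLE raw_txns (branch TEXT, amount REAL, status TEXT)")
    cur.executemany(
        "INSERT INTO raw_txns VALUES (?, ?, ?)",
        [("BLR", 120.0, "OK"), ("BLR", 80.0, "FAILED"), ("MUM", 50.0, "OK")],
    )
    # Transform + load: aggregate successful transactions per branch into a
    # reporting table, expressed purely in SQL (the pattern the role describes,
    # scaled down from Spark SQL / Redshift to sqlite).
    cur.execute(
        """CREATE TABLE daily_branch_totals AS
           SELECT branch, SUM(amount) AS total
           FROM raw_txns
           WHERE status = 'OK'
           GROUP BY branch
           ORDER BY branch"""
    )
    return cur.execute("SELECT branch, total FROM daily_branch_totals").fetchall()

totals = run_etl(sqlite3.connect(":memory:"))
print(totals)  # [('BLR', 120.0), ('MUM', 50.0)]
```

In a production pipeline the same SELECT would typically run as a Spark SQL job or a Redshift CTAS, orchestrated by Airflow (MWAA), with the failed-row filter replaced by explicit data-quality gates.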

Posted 1 day ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Bengaluru

Work from Office


What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.
About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to migrate the bank's on-premise data platform to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising roughly 10 sub-teams that independently drive their charters. 
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and look ahead to building systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals:
Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.
Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform. 
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
Drive business decisions with technical input and lead the team.
Design, implement, and support a data infrastructure from scratch.
Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
Extract, transform, and load data from various sources using SQL and AWS big data technologies.
Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data
Bachelor's degree in Computer Science, Engineering, or a related field
3-5 years of experience in data engineering
Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
Experience with data pipeline tools such as Airflow and Spark
Experience with data modeling and data quality best practices
Excellent problem-solving and analytical skills
Strong communication and teamwork skills
Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
Strong advanced SQL skills
BASIC QUALIFICATIONS for Data Engineering Manager / Software Development Manager
10+ years of engineering experience, most of it in the data domain
5+ years of engineering team management experience
10+ years of experience planning, designing, developing, and delivering consumer software
Experience partnering with product or program management teams
5+ years of experience managing data engineers, business intelligence engineers, and/or data scientists
Experience designing or architecting (design patterns, reliability, and scaling) new and existing systems
Experience managing multiple concurrent programs, projects, and development teams in an Agile environment
Strong understanding of Data Platform, Data Engineering, and Data Governance
Experience designing and developing large-scale, high-traffic applications
PREFERRED QUALIFICATIONS
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
Prior experience in the Indian banking segment and/or fintech is desired.
Experience with non-relational databases and data stores
Building and operating highly available, distributed data processing systems for large datasets
Professional software engineering best practices across the full software development life cycle
Designing, developing, and implementing different types of data warehousing layers
Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
Building scalable data infrastructure and understanding distributed systems concepts
SQL, ETL, and data modelling
Ensuring the accuracy and availability of data to customers
Proficiency in at least one scripting or programming language for handling large-volume data processing
Strong presentation and communication skills.

Posted 1 day ago

Apply

8.0 - 13.0 years

30 - 35 Lacs

Bengaluru

Work from Office


About The Role: Data Engineer - 1 (Experience: 0-2 years)
What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.
About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to migrate the bank's on-premise data platform to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising roughly 10 sub-teams that independently drive their charters. 
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and look ahead to building systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals:
Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.
Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform. 
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
Drive business decisions with technical input and lead the team.
Design, implement, and support a data infrastructure from scratch.
Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
Extract, transform, and load data from various sources using SQL and AWS big data technologies.
Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data
Bachelor's degree in Computer Science, Engineering, or a related field
Experience in data engineering
Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
Experience with data pipeline tools such as Airflow and Spark
Experience with data modeling and data quality best practices
Excellent problem-solving and analytical skills
Strong communication and teamwork skills
Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
Strong advanced SQL skills
PREFERRED QUALIFICATIONS
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
Prior experience in the Indian banking segment and/or fintech is desired.
Experience with non-relational databases and data stores
Building and operating highly available, distributed data processing systems for large datasets
Professional software engineering best practices across the full software development life cycle
Designing, developing, and implementing different types of data warehousing layers
Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
Building scalable data infrastructure and understanding distributed systems concepts
SQL, ETL, and data modelling
Ensuring the accuracy and availability of data to customers
Proficiency in at least one scripting or programming language for handling large-volume data processing
Strong presentation and communication skills.
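The "data modeling and data quality best practices" requirement above usually means running declarative checks (non-null keys, uniqueness, value ranges) on a dataset before publishing it. A minimal sketch in plain Python follows; the rule names and the sample account schema are hypothetical, and a real pipeline would run equivalent checks inside Spark or a dedicated data-quality framework.

```python
# Declarative data-quality checks: each entry in the report maps a named rule
# to whether the dataset passed it. Schema (account_id, balance) is invented
# for illustration only.
def check_quality(rows):
    ids = [r["account_id"] for r in rows]
    return {
        "no_null_ids": all(i is not None for i in ids),   # key completeness
        "unique_ids": len(ids) == len(set(ids)),          # key uniqueness
        "valid_balance": all(r["balance"] >= 0 for r in rows),  # range check
    }

sample = [
    {"account_id": "A1", "balance": 100.0},
    {"account_id": "A2", "balance": 0.0},
]
report = check_quality(sample)
print(report)  # all three checks pass on this sample
```

Failing checks would typically block the load or quarantine the offending rows rather than silently publishing bad data.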

Posted 1 week ago

Apply

9.0 - 14.0 years

30 - 35 Lacs

Bengaluru

Work from Office


About The Role: Data Engineer - 2 (Experience: 2-5 years)
What we offer
Our mission is simple: building trust. Our customers' trust in us is not merely about the safety of their assets but also about how dependable our digital offerings are. That's why we at Kotak Group are dedicated to transforming banking by taking a technology-first approach in everything we do, with the aim of enhancing customer experience through superior banking services. We welcome and invite the best technological minds in the country to join us in our mission to make banking seamless and swift. Here, we promise you meaningful work that positively impacts the lives of many.
About our team
DEX (Kotak's Data Exchange) is the central data org for Kotak Bank and manages the bank's entire data experience. It comprises the Data Platform, Data Engineering, and Data Governance charters and sits closely with the Analytics org. DEX is primarily working on a greenfield project to migrate the bank's on-premise data platform to a scalable AWS cloud-based platform. The team is being built from the ground up, which gives technologists a great opportunity to build things from scratch and deliver a best-in-class data lakehouse solution. The primary skills for this team are software development (preferably Python) for platform building on AWS; Spark (PySpark, Spark SQL, Scala) for ETL development; and advanced SQL and data modelling for analytics. The org is expected to grow to a 100+ member team, primarily based in Bangalore, comprising roughly 10 sub-teams that independently drive their charters. 
As a member of this team, you get the opportunity to learn the fintech space, one of the most sought-after domains today; be an early member of Kotak's digital transformation journey; learn and leverage technology to build complex data platform solutions, including real-time, micro-batch, batch, and analytics solutions, in a programmatic way; and look ahead to building systems that can be operated by machines using AI technologies. The data platform org is divided into three key verticals:
Data Platform: This vertical is responsible for building the data platform, which includes optimized storage for the entire bank; a centralized data lake; managed compute and orchestration frameworks, including serverless data solutions; a central data warehouse for extremely high-concurrency use cases; connectors for different sources; a customer feature repository; cost-optimization solutions such as EMR optimizers; automation; and observability capabilities for Kotak's data platform. The team will also be the center of data engineering excellence, driving trainings and knowledge-sharing sessions with the large data consumer base within Kotak.
Data Engineering: This team will own data pipelines for thousands of datasets, source data from 100+ source systems, and enable data consumption for 30+ data analytics products. The team will build data models in a config-based, programmatic way and think big to build one of the most leveraged data models for financial orgs. It will also enable centralized reporting for Kotak Bank that cuts across multiple products and dimensions. Additionally, the data built by this team will be consumed by 20K+ branch consumers, RMs, branch managers, and all analytics use cases.
Data Governance: This will be the central data governance team for Kotak Bank, managing metadata platforms, data privacy, data security, data stewardship, and the data quality platform. 
If you have the right data skills and are ready to build data lake solutions from scratch for high-concurrency systems involving multiple source systems, then this is the team for you. Your day-to-day role will include:
Drive business decisions with technical input and lead the team.
Design, implement, and support a data infrastructure from scratch.
Manage AWS resources, including EC2, EMR, S3, Glue, Redshift, and MWAA.
Extract, transform, and load data from various sources using SQL and AWS big data technologies.
Explore and learn the latest AWS technologies to enhance capabilities and efficiency.
Collaborate with data scientists and BI engineers to adopt best practices in reporting and analysis.
Improve ongoing reporting and analysis processes, automating or simplifying self-service support for customers.
Build data platforms, data pipelines, or data management and governance tools.
BASIC QUALIFICATIONS for Data Engineer / SDE in Data
Bachelor's degree in Computer Science, Engineering, or a related field
Experience in data engineering
Strong understanding of AWS technologies, including S3, Redshift, Glue, and EMR
Experience with data pipeline tools such as Airflow and Spark
Experience with data modeling and data quality best practices
Excellent problem-solving and analytical skills
Strong communication and teamwork skills
Experience in at least one modern scripting or programming language, such as Python, Java, or Scala
Strong advanced SQL skills
PREFERRED QUALIFICATIONS
AWS cloud technologies: Redshift, S3, Glue, EMR, Kinesis, Firehose, Lambda, IAM, Airflow
Prior experience in the Indian banking segment and/or fintech is desired.
Experience with non-relational databases and data stores
Building and operating highly available, distributed data processing systems for large datasets
Professional software engineering best practices across the full software development life cycle
Designing, developing, and implementing different types of data warehousing layers
Leading the design, implementation, and successful delivery of large-scale, critical, or complex data solutions
Building scalable data infrastructure and understanding distributed systems concepts
SQL, ETL, and data modelling
Ensuring the accuracy and availability of data to customers
Proficiency in at least one scripting or programming language for handling large-volume data processing
Strong presentation and communication skills.

Posted 1 week ago

Apply

3.0 - 5.0 years

0 Lacs

Chennai, Tamil Nadu, India

On-site


For years, the transportation industry used technology that wasn't designed with the future in mind. This limited fleets and drivers to proprietary systems, creating obstacles to innovating the optimal driver experience. Since launching in 2015, our client has spent thousands of hours working with fleets to make transportation smart. From the back office to the driver's seat, they provide solutions at every level to help fleets future-proof their operations, connecting fleets across the nation with the tools they need to take control of their technology. As part of their ongoing mission to transform transportation, they have made it easy for fleets to develop, deploy, and manage their commercial vehicles' mobile devices and applications on a single platform. Through their partnerships with industry leaders at every stage of the supply chain, they empower fleets with endless opportunities to innovate and create solutions that evolve as their businesses grow.
Software Engineer 2
Target Capabilities and Skills:
3-5 years of total experience
Strong coding skills in Java
Excellent knowledge of the J2EE framework, web services (SOAP & REST), Spring, and Spring Boot
Experience developing microservices applications
Experience transforming an existing monolith into a microservice architecture
Experience with a modern JavaScript framework (Angular)
Strong SQL knowledge
Experience with AWS technologies is a plus
Experience with NoSQL technologies is a plus
Experience with Agile software development methodologies
Other Requirements:
Excellent oral and written communication skills.
Passion and willingness to learn new technologies.
Strong analytical and problem-solving skills.
Self-starter; able to work well in a small team with good communication skills. 
You are a strong team player but are able to work independently You are dedicated to delivering high quality and performant solutions You are eager to learn new things, you take ownership and want to get things done

Posted 1 week ago

Apply

3.0 - 5.0 years

7 - 11 Lacs

Hyderabad, Gurugram, Ahmedabad

Work from Office


About the Role:
Grade Level (for internal use): 09
S&P Global Market Intelligence
The Role: Software Developer II
Grade (relevant for internal applicants only): 09
The Location: Ahmedabad, Gurgaon, Hyderabad
The Team: S&P Global Market Intelligence, a best-in-class sector-focused news and financial information provider, is looking for a Software Developer to join our Software Development team in our India offices. This is an opportunity to work on a self-managed team to maintain, update, and implement processes utilized by other teams; coordinate with stakeholders to design innovative functionality in existing and future applications; and work across teams to enhance the flow of our data.
What's in it for you
This is the place to hone your existing skills while being exposed to fresh and divergent technologies. Work on the latest, cutting-edge technologies in the full-stack ecosystem. Opportunity to grow personally and professionally. Exposure to working on AWS Cloud solutions will be an added advantage.
Responsibilities
Identify, prioritize, and execute tasks in an Agile software development environment.
Develop solutions to support key business needs.
Engineer components and common services based on standard development models, languages, and tools.
Produce system design documents and participate actively in technical walkthroughs.
Demonstrate a strong sense of ownership and responsibility for release goals. This includes understanding requirements, technical specifications, design, architecture, implementation, unit testing, builds/deployments, and code management.
Build and maintain the environment for speed, accuracy, consistency, and uptime.
Collaborate with team members across the globe.
Interface with users, business analysts, quality assurance testers, and other teams as needed.
What We're Looking For
Basic Qualifications:
Bachelor's/Master's degree in Computer Science, Information Systems, or equivalent.
3-5 years of experience. 
Solid experience building processes; debugging, refactoring, and enhancing existing code, with an understanding of performance and scalability.
Competency in C#, .NET, and .NET Core.
Experience with DevOps practices and modern CI/CD deployment models using Jenkins.
Experience supporting production environments.
Knowledge of T-SQL and MS SQL Server.
Exposure to Python/Scala/AWS technologies is a plus.
Exposure to React/Angular is a plus.
Preferred Qualifications:
Exposure to DevOps practices and CI/CD pipelines such as Azure DevOps or GitHub Actions.
Familiarity with automated unit testing is advantageous.
Exposure to working on AWS Cloud solutions will be an added advantage.
About S&P Global Market Intelligence
At S&P Global Market Intelligence, a division of S&P Global, we understand the importance of accurate, deep, and insightful information. Our team of experts delivers unrivaled insights and leading data and technology solutions, partnering with customers to expand their perspective, operate with confidence, and make decisions with conviction. For more information, visit www.spglobal.com/marketintelligence.
What's In It For You
Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.
Our People / Our Values: Integrity, Discovery, Partnership
At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. 
We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.
Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
Health & Wellness: Health care coverage designed for the mind and body.
Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country visit https://spgbenefits.com/benefit-summaries
Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.
-----------------------------------------------------------
Equal Opportunity Employer
S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. 
Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person. US Candidates Only The EEO is the Law Poster http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision - https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf ----------------------------------------------------------- 20 - Professional (EEO-2 Job Categories-United States of America), IFTECH202.1 - Middle Professional Tier I (EEO Job Group), SWP Priority Ratings - (Strategic Workforce Planning)

Posted 1 week ago

Apply

5.0 - 7.0 years

5 - 7 Lacs

Bengaluru / Bangalore, Karnataka, India

On-site


We're on the lookout for a talented and experienced Java Fullstack Developer to become a key player on our dynamic team. If you have a solid background in Core Java, Spring Boot, Microservices, React.js, Redux, and AWS technologies, along with at least 5 years of hands-on experience, we want to hear from you! You'll be pivotal in designing, developing, and maintaining the complex software solutions that power our company's success.
Key Responsibilities
Full Stack Development: Design, develop, and maintain robust and scalable software solutions across both front-end and back-end systems.
Back-end Expertise: Work extensively with Core Java and Spring Boot, and build microservices architectures.
Front-end Development: Create engaging and responsive user interfaces using React.js and Redux.
Cloud Integration: Leverage AWS technologies for cloud-native development, deployment, and management of applications.
Problem Solving: Play a crucial role in analyzing requirements, designing technical solutions, and resolving complex software issues.
Collaboration: Work closely with cross-functional teams to deliver high-quality software products.
Required Skills & Experience
Minimum of 5 years of hands-on experience as a Java Fullstack Developer.
Strong background in Core Java.
Proven experience with Spring Boot for building robust applications.
Solid understanding and practical experience with microservices architecture.
Proficiency in React.js and Redux for front-end development.
Hands-on experience with AWS technologies.
Preferred Qualifications
AWS certification.
Experience with containerization and orchestration tools like Docker and Kubernetes.
Familiarity with agile development methodologies.

Posted 1 week ago

Apply

2.0 - 6.0 years

2 - 6 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Foundit logo

Systechcorp Inc is looking for an AWS Developer to join our dynamic team and embark on a rewarding career journey.
- Design and develop scalable, high-performance applications using AWS technologies.
- Implement AWS security best practices and ensure that applications are secure and compliant with industry standards.
- Develop, deploy, and manage cloud-based applications and services, including the design of custom Amazon Machine Images (AMIs) and the automation of application deployment.
- Collaborate with cross-functional teams, including development, quality assurance, and project management, to ensure the delivery of high-quality software products.
- Troubleshoot and resolve issues related to AWS infrastructure and applications.
- Continuously evaluate and improve the AWS infrastructure, including the implementation of new services and the optimization of existing services.
- Provide technical guidance and support to other team members and stakeholders.
- Excellent problem-solving and analytical skills.
- Excellent written and verbal communication skills.
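The "AWS security best practices" bullet above usually starts with least-privilege IAM policies. As a hedged illustration (shown in Python for brevity; the bucket name, prefix, and chosen action are hypothetical, not from the posting), a scoped policy document can be generated and sanity-checked before deployment:

```python
import json

def least_privilege_s3_policy(bucket: str, prefix: str) -> dict:
    """Build a read-only IAM policy limited to one S3 prefix.

    Uses the standard IAM JSON policy structure; bucket and prefix
    are caller-supplied values.
    """
    return {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "ReadOnlyScopedAccess",
                "Effect": "Allow",
                "Action": ["s3:GetObject"],
                "Resource": [f"arn:aws:s3:::{bucket}/{prefix}/*"],
            }
        ],
    }

policy = least_privilege_s3_policy("example-app-data", "reports")
print(json.dumps(policy, indent=2))
```

Granting only the actions and resources a workload needs, as above, is far easier to audit than a broad `s3:*` grant.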

Posted 1 week ago

Apply

3.0 - 5.0 years

4 - 8 Lacs

Ahmedabad

Work from Office

Naukri logo

About the Role: Grade Level (for internal use): 09 S&P Global Market Intelligence The Role: Software Developer II (.Net Backend Developer) Grade (relevant for internal applicants only): 09 The Location: Ahmedabad, Gurgaon, Hyderabad

The Team: S&P Global Market Intelligence, a best-in-class sector-focused news and financial information provider, is looking for a Software Developer to join our Software Development team in our India offices. This is an opportunity to work on a self-managed team to maintain, update, and implement processes utilized by other teams; coordinate with stakeholders to design innovative functionality in existing and future applications; and work across teams to enhance the flow of our data.

What's in it for you: This is the place to hone your existing skills while having the chance to be exposed to fresh and divergent technologies. Exposure to the latest, cutting-edge technologies in the full-stack ecosystem. Opportunity to grow personally and professionally. Exposure to working on AWS Cloud solutions will be an added advantage.

Responsibilities
- Identify, prioritize, and execute tasks in an Agile software development environment.
- Develop solutions to support key business needs.
- Engineer components and common services based on standard development models, languages, and tools.
- Produce system design documents and participate actively in technical walkthroughs.
- Demonstrate a strong sense of ownership and responsibility with release goals. This includes understanding requirements, technical specifications, design, architecture, implementation, unit testing, builds/deployments, and code management.
- Build and maintain the environment for speed, accuracy, consistency, and uptime.
- Collaborate with team members across the globe.
- Interface with users, business analysts, quality assurance testers, and other teams as needed.

What We're Looking For
Basic Qualifications:
- Bachelor's/Master's degree in Computer Science, Information Systems, or equivalent.
- 3-5 years of experience.
- Solid experience building processes; debugging, refactoring, and enhancing existing code, with an understanding of performance and scalability.
- Competency in C#, .NET, .NET Core.
- Experience with DevOps practices and modern CI/CD deployment models using Jenkins.
- Experience supporting production environments.
- Knowledge of T-SQL and MS SQL Server.
- Exposure to Python/Scala/AWS technologies is a plus.
- Exposure to React/Angular is a plus.

Preferred Qualifications:
- Exposure to DevOps practices and CI/CD pipelines such as Azure DevOps or GitHub Actions.
- Familiarity with automated unit testing is advantageous.
- Exposure to working on AWS Cloud solutions will be an added advantage.

Posted 2 weeks ago

Apply

8.0 - 13.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Responsibilities
- Design and build solutions for complex business workflows.
- Understand the user persona and deliver a slick experience.
- Take end-to-end ownership of components and be responsible for the subsystems you work on, from design, code, testing, and integration through deployment and enhancements.
- Write high-quality code and take responsibility for your tasks.
- Solve performance bottlenecks.
- Mentor junior engineers.
- Communicate and collaborate with management, product, QA, and UI/UX teams.
- Deliver with quality, on time, in a fast-paced start-up environment.

Requirements
- Bachelor's/Master's in computer science or relevant fields.
- 8+ years of relevant experience.
- Strong sense of ownership.
- Excellent Java and object-oriented development skills.
- Experience in building and scaling microservices.
- Strong problem-solving, technical troubleshooting, and diagnostic skills.
- Expected to be a role model for young engineers, with a strong sense of code quality and the ability to enforce code quality within the team.
- Strong knowledge of RDBMS and NoSQL technologies.
- Experience developing backends for enterprise systems such as eCommerce, manufacturing, or supply chain.
- Excellent understanding of debugging, performance, and optimisation techniques.
- Experience with Java, Mongo, MySQL, AWS technologies, the ELK stack, Spring Boot, and Kafka.
- Experience developing large-scale systems.
- Experience with cloud technologies.
- Demonstrated ability to deliver in a fast-paced environment.
- Good communication skills.

Add-ons:
- Experience in Big Data technologies and data warehousing.
- AI/ML experience.
- Experience architecting large-scale SaaS products/platforms such as eCommerce, manufacturing, and supply chain.
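"Solve performance bottlenecks" in a Java/MySQL/Mongo backend very often means caching hot reads in front of the database. A minimal sketch of the underlying idea, an LRU cache (written in Python for brevity; the class and key names are illustrative, not from the posting):

```python
from collections import OrderedDict

class LRUCache:
    """Evict the least-recently-used entry once capacity is exceeded."""

    def __init__(self, capacity: int):
        self.capacity = capacity
        self._data: OrderedDict = OrderedDict()

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.capacity:
            self._data.popitem(last=False)  # drop the LRU entry

cache = LRUCache(capacity=2)
cache.put("user:1", {"name": "A"})
cache.put("user:2", {"name": "B"})
cache.get("user:1")                  # touch user:1, so user:2 becomes LRU
cache.put("user:3", {"name": "C"})   # exceeds capacity: evicts user:2
```

In production this role would typically reach for Redis or Caffeine rather than a hand-rolled structure, but the eviction logic is the same.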

Posted 2 weeks ago

Apply

5.0 - 8.0 years

5 - 8 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Foundit logo

In this vital role, you'll be a key part of the Operations Generative AI (GenAI) Product team, delivering cutting-edge, innovative GenAI solutions across various Process Development functions (Drug Substance, Drug Product, Attribute Sciences & Combination Products) within Operations. This role involves: Developing and implementing GenAI solutions and strategies. Collaborating with cross-functional teams. Ensuring the scalability, reliability, and performance of AI solutions, adhering to Amgen's Enterprise AI strategy and roadmap. Role Description: Senior Manager, Information Systems As a Senior Manager, Information Systems, you will lead a dynamic team to deliver innovative GenAI solutions across Process Development functions (Drug Substance, Drug Product, Attribute Sciences & Combination Products) in the Operations area. Your responsibilities include: Leading a team of technical product owners, data engineers, AI & software engineers, business analysts, test/validation engineers, and scrum masters. Developing and implementing cutting-edge GenAI solutions and strategies. Collaborating with cross-functional teams. Ensuring the scalability, reliability, and performance of AI solutions, adhering to Amgen's Enterprise AI strategy and roadmap. Serving as the reporting manager for your team, responsible for their coaching and development. Providing leadership to own and refine the vision, feature prioritization, partner alignment, and leading solution delivery. The ideal candidate will have: A consistent track record of leadership in technology-driven environments. A passion for fostering innovation and excellence in the biotechnology industry. Deep expertise in managing the end-to-end development and delivery of customer-facing digital product capabilities and platforms leveraging generative AI-based digital solutions. Experience leading and effectively working with large, diverse, and globally dispersed teams within a matrixed organization. 
A strong background in the end-to-end software development lifecycle, technical product ownership, business analysis, and being a Scaled Agile practitioner. Leadership and transformation experience. The ability to drive and deliver against key organizational critical initiatives. The ability to develop a collaborative environment and deliver high-quality results in a matrixed organizational structure. Roles & Responsibilities Maintain strategic relationships and strong communication with the leadership team to ensure all stakeholders feel informed and engaged. Oversee the software development lifecycle, ensuring best practices in development, testing, and deployment across the product teams. Lead and manage large teams with varied strengths within a matrixed organization, collaborating with geographically dispersed teams, including those in the US and international locations. Develop and implement strategic roadmaps and plans for technology and workforce growth, including recruiting top talent and building a robust team in India. Develop talent, motivate the team, delegate effectively, champion diversity within the team, and act as a role model of servant leadership. Ensure global ways of working are embedded in the local organization. Develop a culture of collaboration, innovation, and continuous improvement, driving talent development, motivation, and effective delegation. Foster best practice sharing and alignment with business goals. Collaborate with Platform Owners, Product Owners, Service Owners, and delivery teams to ensure delivery matches commitments, acting as a critical escalation point and facilitating communication when service commitments are unmet. Participate in team member and leadership meetings, working with other parts of the organization and functional groups to ensure successful delivery and alignment with strategy, compliance, and regulatory requirements. 
Remain accountable for ensuring overall organizational security and compliance with quality and GxP in technology services. Monitor emerging AI tools and technologies and trends to find opportunities for platform growth and expansion. Ensure ongoing alignment with strategy, compliance, and regulatory requirements for technology investments and services. What We Expect of You We are all different, yet we all use our unique contributions to serve patients. Basic Qualifications Education and Experience: Doctorate Degree and 2 years of experience in Engineering, IT, or a related field; OR Master's degree with 8 - 10 years of experience in Engineering, IT, or a related field; OR Bachelor's degree with 10 - 14 years of experience in Engineering, IT, or a related field; OR Diploma with 14 - 18 years of experience in Engineering, IT, or a related field. Role Background: Background as a Technical Product Owner (TPO) , people manager, and business analyst, ensuring the ability to oversee and guide a team to translate business needs into the definition and delivery of technical solutions, guiding development teams, prioritizing features, and ultimate delivery and management of the digital products. Leadership Skills: Proven leadership skills with the ability to lead large matrixed teams, demonstrated experience in leading and developing a high-performing team of technology professionals, building a culture of innovation and continuous improvement within the team to deliver powerful solutions and platform improvements. Technical Experience: Experience in implementing innovative web applications, Data Engineering and integration, Enterprise data fabric concepts, methodologies, and technologies (e.g., React.js, Python, Django, Fast API, AWS technologies, Databricks, DevOps CI/CD). 
GenAI Experience: Experience in implementing innovative scalable GenAI solutions using Retrieval-augmented generation (RAG), AI Agents, Vector stores, AI/ML platforms, embedding models (e.g., OpenAI, Langchain, Redis, pgvector). Methodology Experience: Experience in implementing a strategic roadmap and driving transformation initiatives using Scaled Agile methodology . Collaboration & Communication: Strong skills in collaborating and communicating with cross-functional teams, business stakeholders, and executives to ensure alignment of platform initiatives with business outcomes, managing expectations, and ensuring successful delivery of projects. Degree: Degree in Computer Science, Information Systems, Engineering, or Life Sciences. Preferred Qualifications At least 5-8 years of domain knowledge in health and/or life sciences combined with Information Technology. Understanding, and preferably applied experience and knowledge, in data management and CTD document drafting. Leadership experience within a highly regulated pharmaceutical or technology organization, with the ability to ensure compliance with industry regulations and best practices for GxP software validation. Experience driving a collaborative culture that values technical depth, accountability, and customer service. Strong problem-solving and analytical skills. Demonstrated ability to work effectively in a fast-paced, dynamic environment. Understanding of ITIL processes and implementation. Experience managing vendor relationships and working with external partners or consultants to ensure optimal performance, support, and development of digital products. Soft Skills Excellent leadership and team management skills. Strong transformation and change management experience. Exceptional collaboration and communication skills. High degree of initiative and self-motivation. Ability to manage multiple priorities successfully. Team-oriented with a focus on achieving team goals. 
Strong presentation and public speaking skills. Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global, virtual teams.
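The RAG requirement above reduces to embedding documents and retrieving the nearest ones for a query before generation. A toy sketch of the retrieval step using bag-of-words vectors and cosine similarity (purely illustrative: a real system would use an embedding model and a vector store such as pgvector, as the posting notes, and the sample documents here are invented):

```python
import math
from collections import Counter

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse term vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    """Return the k documents most similar to the query."""
    q = embed(query)
    ranked = sorted(docs, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "drug substance process development",
    "combination products quality review",
    "cafeteria menu for friday",
]
top = retrieve("process development for drug substance", docs)
```

The retrieved passages would then be injected into the LLM prompt, which is what makes the generation "retrieval-augmented".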

Posted 2 weeks ago

Apply

5.0 - 8.0 years

7 - 12 Lacs

Hyderabad

Work from Office

Naukri logo

Senior Manager Information Systems What you will do: In this vital role you will play a key role as part of the Operations Generative AI (GenAI) Product team to deliver cutting-edge, innovative GenAI solutions across various Process Development functions (Drug Substance, Drug Product, Attribute Sciences & Combination Products) under Operations. The role involves developing and implementing GenAI solutions and strategies, collaborating with cross-functional teams, and ensuring the scalability, reliability, and performance of AI solutions, adhering to Amgen's Enterprise AI strategy and roadmap. Role Description: In this vital role as Senior Manager Information Systems, you will lead a dynamic team of technical product owners, data engineers, AI & software engineers, business analysts, test/validation engineers, and scrum masters to deliver cutting-edge, innovative GenAI solutions across various Process Development functions (Drug Substance, Drug Product, Attribute Sciences & Combination Products) in the Operations area. The role involves developing and implementing cutting-edge GenAI solutions and strategies, collaborating with cross-functional teams, and ensuring the scalability, reliability, and performance of AI solutions, adhering to Amgen's Enterprise AI strategy and roadmap. You will also be the reporting manager for this team and will be responsible for the coaching and development of these resources. The team will rely on your leadership to own and refine the vision, feature prioritization, and partner alignment, and on your experience leading solution delivery while building this ground-breaking new capability for Amgen. The ideal candidate will have a consistent track record of leadership in technology-driven environments with a passion for fostering innovation and excellence in the biotechnology industry.
This role requires deep expertise in handling the end-to-end development and delivery of customer-facing digital product capabilities and platforms leveraging generative AI-based digital solutions. They should also have experience leading and effectively working with large, diverse, and globally dispersed teams within a matrixed organization. Extensive collaboration with global teams is required to ensure seamless integration and operational excellence. The ideal candidate will have a strong background in the end-to-end software development lifecycle, technical product ownership, and business analysis, and be a Scaled Agile practitioner, coupled with leadership and transformation experience. This role demands the ability to drive and deliver against key organizational critical initiatives, develop a collaborative environment, and deliver high-quality results in a matrixed organizational structure.

Roles & Responsibilities:
- Maintain strategic relationships and strong communication with the leadership team to ensure all collaborators feel informed and engaged.
- Oversee the software development lifecycle, ensuring standard methodologies in development, testing, and deployment across the product teams.
- Lead and manage large teams with varied strengths within a matrixed organization, collaborating with geographically dispersed teams, including those in the US and international locations.
- Develop and implement strategic roadmaps and plans for technology and workforce growth, including recruiting top talent and building a robust team in India.
- Develop talent, motivate the team, delegate effectively, champion diversity within the team, and act as a role model of servant leadership.
- Ensure global ways of working are embedded in the local organization.
- Develop a culture of collaboration, innovation, and continuous improvement, driving talent development, motivation, and effective delegation.
- Foster standard-methodology sharing and alignment with business goals.
- Collaborate with Platform Owners, Product Owners, Service Owners, and delivery teams to ensure delivery matches commitments, acting as a critical escalation point and facilitating communication when service commitments are unmet.
- Participate in team member and leadership meetings, working with other parts of the organization and functional groups to ensure successful delivery and alignment with strategy, compliance, and regulatory requirements.
- Remain accountable for ensuring overall organizational security and compliance with quality and GxP in technology services.
- Monitor emerging AI tools, technologies, and trends to find opportunities for platform growth and expansion.
- Ensure ongoing alignment with strategy, compliance, and regulatory requirements for technology investments and services.

What we expect of you: We are all different, yet we all use our unique contributions to serve patients.

Basic Qualifications:
- Doctorate degree and 2 years of experience in Engineering, IT, or a related field; OR Master's degree with 8-10 years of experience in Engineering, IT, or a related field; OR Bachelor's degree with 10-14 years of experience in Engineering, IT, or a related field; OR Diploma with 14-18 years of experience in Engineering, IT, or a related field.
- Background as a Technical Product Owner (TPO), people manager, and business analyst, ensuring the ability to oversee and guide a team to translate business needs into the definition and delivery of technical solutions, guiding development teams, prioritizing features, and ultimately delivering and managing the digital products.
- Proven leadership skills with the ability to lead large matrixed teams; demonstrated experience in leading and developing a hard-working team of technology professionals, building a culture of innovation and continuous improvement within the team to deliver powerful solutions and platform improvements.
- Experience in implementing innovative web applications, data engineering and integration, and enterprise data fabric concepts, methodologies, and technologies (e.g., React.js, Python, Django, FastAPI, AWS technologies, Databricks, DevOps CI/CD).
- Experience in implementing innovative, scalable GenAI solutions using Retrieval-Augmented Generation (RAG), AI agents, vector stores, AI/ML platforms, and embedding models (e.g., OpenAI, LangChain, Redis, pgvector).
- Experience in implementing a strategic roadmap and driving transformation initiatives using Scaled Agile methodology.
- Strong skills in collaborating and communicating with cross-functional teams, business collaborators, and executives to ensure alignment of platform initiatives with business outcomes, handling expectations, and ensuring successful delivery of projects.
- Degree in Computer Science, Information Systems, Engineering, or Life Sciences.

Preferred Qualifications:
- At least 5-8 years of domain knowledge in health and/or life sciences combined with Information Technology.
- Understanding, and preferably applied experience and knowledge, in data management and CTD document drafting.
- Leadership experience within a highly regulated pharmaceutical or technology organization, with the ability to ensure compliance with industry regulations and standard methodologies for GxP software validation.
- Experience driving a collaborative culture that values technical depth, accountability, and customer service.
- Strong problem-solving and analytical skills.
- Demonstrated ability to work effectively in a fast-paced, dynamic environment.
- Understanding of ITIL processes and implementation.
- Experience handling vendor relationships and working with external partners or consultants to ensure optimal performance, support, and development of digital products.

Soft Skills:
- Excellent leadership and team management skills.
- Strong transformation and change management experience.
- Exceptional collaboration and communication skills.
- High degree of initiative and self-motivation.
- Ability to manage multiple priorities successfully.
- Team-oriented with a focus on achieving team goals.
- Strong presentation and public speaking skills.
- Excellent analytical and troubleshooting skills.
- Strong verbal and written communication skills.
- Ability to work effectively with global, virtual teams.

Posted 2 weeks ago

Apply

8.0 - 10.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Naukri logo

Job Information: Job Opening ID: ZR_2174_JOB | Date Opened: 10/04/2024 | Industry: Technology | Job Type: | Work Experience: 8-10 years | Job Title: Lead - Backend Engineer | City: Bangalore | Province: Karnataka | Country: India | Postal Code: 560029 | Number of Positions: 4

Responsibilities
- Design and build solutions for complex business workflows.
- Understand the user persona and deliver a slick experience.
- Take end-to-end ownership of components and be responsible for the subsystems you work on, from design, code, testing, and integration through deployment and enhancements.
- Write high-quality code and take responsibility for your tasks.
- Solve performance bottlenecks.
- Mentor junior engineers.
- Communicate and collaborate with management, product, QA, and UI/UX teams.
- Deliver with quality, on time, in a fast-paced start-up environment.

Requirements
- Bachelor's/Master's in computer science or relevant fields.
- 8+ years of relevant experience.
- Strong sense of ownership.
- Excellent Java and object-oriented development skills.
- Experience in building and scaling microservices.
- Strong problem-solving, technical troubleshooting, and diagnostic skills.
- Expected to be a role model for young engineers, with a strong sense of code quality and the ability to enforce code quality within the team.
- Strong knowledge of RDBMS and NoSQL technologies.
- Experience developing backends for enterprise systems such as eCommerce, manufacturing, or supply chain.
- Excellent understanding of debugging, performance, and optimisation techniques.
- Experience with Java, Mongo, MySQL, AWS technologies, the ELK stack, Spring Boot, and Kafka.
- Experience developing large-scale systems.
- Experience with cloud technologies.
- Demonstrated ability to deliver in a fast-paced environment.
- Good communication skills.

Add-ons:
- Experience in Big Data technologies and data warehousing.
- AI/ML experience.
- Experience architecting large-scale SaaS products/platforms such as eCommerce, manufacturing, and supply chain.

Posted 2 weeks ago

Apply

10.0 - 13.0 years

35 - 55 Lacs

Hyderabad, Chennai, Bengaluru

Work from Office

Naukri logo

We are seeking a highly experienced and innovative Solution Architect with expertise in Java and AWS technologies to lead the design and implementation of scalable, high-performance solutions. This role offers an exciting opportunity to work on cutting-edge projects in cloud-native and microservices architectures while collaborating directly with clients and cross-functional teams.

Responsibilities
- Translate business requirements into scalable, secure, and cost-effective solutions aligned with enterprise standards.
- Define architecture vision across application, integration, and infrastructure layers, ensuring alignment with business goals.
- Apply industry best practices and patterns, including microservices, serverless, and event-driven architectures.
- Create and maintain detailed architecture documentation such as solution design diagrams, narratives, and work breakdown structures.
- Facilitate technical brainstorming sessions, architecture workshops, and whiteboarding exercises with teams and stakeholders.
- Conduct technical reviews, design validations, and walkthroughs of solution proposals with customers.
- Lead decision-making processes for architecture spanning the data, security, scalability, and integration domains.
- Drive modernization initiatives by utilizing modern technology stacks and approaches for greenfield projects.
- Oversee alignment with non-functional requirements (NFRs) such as performance, security, scalability, and high availability.
- Collaborate in the adoption of architectural quality frameworks like Utility Trees and Quality Attribute Workshops.
- Manage architectural complexities in cloud-native and serverless applications to ensure robust design principles are adhered to.

Requirements
- 13–19 years of professional experience with a strong understanding of enterprise-scale solutions.
- Proven expertise in Java 11/17+, Spring Boot, Spring Cloud, RESTful APIs, and microservices architecture.
- Strong implementation knowledge of AWS services including Lambda, EC2, S3, RDS, API Gateway, Step Functions, and SNS/SQS.
- Completion of at least 3 solution architecture projects over the past 5 years.
- Background in designing and building cloud-native applications and serverless solutions.
- A record of driving architectural decisions across complex domains like data, security, and integration.
- Experience in customer-facing technical discussions, architecture workshops, and reviews.
- Familiarity with non-functional requirements such as security, scalability, high availability, and performance optimization.
- Understanding of architectural quality frameworks including Utility Trees and Quality Attribute Workshops.
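Event-driven designs built on SNS/SQS, as called for above, deliver messages at-least-once, so consumers are normally made idempotent. A minimal sketch of the pattern (the message shape is hypothetical, and a production handler would persist processed IDs in a database rather than an in-memory set):

```python
class IdempotentHandler:
    """Process each message ID at most once despite redelivery."""

    def __init__(self):
        self._seen: set[str] = set()
        self.total = 0

    def handle(self, message: dict) -> bool:
        """Return True if processed, False if skipped as a duplicate."""
        msg_id = message["id"]
        if msg_id in self._seen:
            return False  # duplicate delivery: safely ignore
        self._seen.add(msg_id)
        self.total += message["amount"]  # the actual side effect
        return True

handler = IdempotentHandler()
handler.handle({"id": "m-1", "amount": 10})
handler.handle({"id": "m-1", "amount": 10})  # redelivered duplicate
handler.handle({"id": "m-2", "amount": 5})
```

Because the side effect runs once per unique ID, the system tolerates redelivery without double-counting, which is the core of "robust design principles" for at-least-once transports.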

Posted 3 weeks ago

Apply

4.0 - 7.0 years

15 - 25 Lacs

Chennai

Work from Office

Naukri logo

Job Summary: We are seeking a skilled Cloud Developer with 4 to 7 years of experience to join our team. The ideal candidate will have extensive experience with AWS Cloud services and a strong background in the Process Manufacturing industry. This hybrid role requires a deep understanding of various AWS tools and services to optimize our cloud infrastructure. The position operates during the day shift and does not require travel.

Responsibilities
- Develop and maintain cloud-based solutions using AWS services to support business objectives.
- Implement and manage AWS CloudWatch for monitoring and logging of cloud resources.
- Design and deploy infrastructure using AWS CloudFormation templates.
- Configure and manage AWS Virtual Private Network for secure connections.
- Implement and manage AWS Backup solutions to ensure data integrity and availability.
- Utilize Amazon EBS for scalable storage solutions.
- Manage secrets and keys using AWS Secrets Manager and AWS Key Management Service.
- Deploy and manage virtual servers using AWS EC2.
- Design and manage virtual private clouds using AWS VPC.
- Implement and manage relational databases using AWS RDS.
- Optimize content delivery using AWS CloudFront.
- Configure and manage DNS settings using AWS Route 53.
- Implement scalable storage solutions using Amazon S3.
- Configure and manage load balancing using AWS Elastic Load Balancing.
- Manage user access and permissions using AWS IAM.
- Collaborate with cross-functional teams to ensure cloud solutions align with business needs.
- Provide technical support and troubleshooting for cloud-based applications.
- Stay updated with the latest AWS services and best practices to ensure optimal performance.
- Ensure compliance with industry standards and regulations in the Process Manufacturing industry.
- Document cloud infrastructure and processes for future reference.
- Participate in code reviews and provide constructive feedback to peers.
- Continuously improve cloud infrastructure to enhance performance and reduce costs.
- Conduct regular security assessments to identify and mitigate potential risks.
- Provide training and support to team members on AWS services and best practices.
- Contribute to the company's overall cloud strategy and roadmap.
- Ensure high availability and disaster recovery plans are in place and tested regularly.
- Collaborate with stakeholders to gather requirements and deliver cloud solutions that meet their needs.
- Drive innovation by exploring new AWS services and technologies to improve business outcomes.

Qualifications
- Must have experience with AWS Cloud services: AWS CloudWatch, AWS CloudFormation, AWS Virtual Private Network, AWS Backup, Amazon EBS, AWS Secrets Manager, AWS Key Management Service, AWS EC2, AWS VPC, AWS RDS, AWS CloudFront, AWS Route 53, Amazon S3, AWS Elastic Load Balancing, and AWS IAM.
- Must have domain experience in the Process Manufacturing industry.
- Strong problem-solving skills and the ability to troubleshoot complex issues.
- Excellent communication and collaboration skills.
- A proactive approach to learning and staying updated with the latest industry trends.
- Experience working in a hybrid work model.
- Ability to work independently and as part of a team.

Certifications Required
- AWS Certified Solutions Architect
- AWS Certified DevOps Engineer
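Working against this many AWS APIs means handling throttling, which is conventionally done with exponential backoff between retries. A sketch of the delay schedule (the base and cap values are illustrative; the AWS SDKs implement this internally, usually with added random jitter):

```python
def backoff_delays(attempts: int, base: float = 0.5, cap: float = 8.0) -> list[float]:
    """Delay in seconds before each retry: base * 2^n, capped at `cap`."""
    return [min(cap, base * (2 ** n)) for n in range(attempts)]

# Schedule for six retries: doubles each time, then flattens at the cap.
delays = backoff_delays(6)
```

Capping the delay keeps worst-case latency bounded, while the doubling quickly backs a misbehaving client off a throttled API.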

Posted 3 weeks ago

Apply

5.0 - 10.0 years

6 - 12 Lacs

Hyderabad / Secunderabad, Telangana, Telangana, India

On-site

Foundit logo

Responsibilities:
- Analyze stories written by the Product Owner and estimate their complexity
- Work on all aspects of the software development life cycle following agile methodologies
- Gather information and feedback from end users to understand and develop project requirements
- Conduct project design sessions and design solutions that meet current project requirements while remaining flexible enough to accommodate future needs
- Perform reviews and integration testing to assure the quality of project development efforts
- Ensure project tasks are assigned and completed in a timely manner and project milestone dates are met
- Set up a strategy to implement the stories
- Deliver with the best quality
- Provide support and maintenance
- Collaborate effectively with technical and non-technical stakeholders
- Follow Agile best practices
- Raise concerns about incomplete or poor requirements
- Attend all Scrum ceremonies

What We're Looking For:
- Bachelor's degree in Computer Science, Engineering, or a related discipline
- 5+ years of in-depth development experience with C#/.NET, object-oriented design, and building backend applications with REST API services using .NET Core
- Full-stack application development experience
- Proven ability to lead the design and development of API or data-integration applications
- Expertise in web services, REST, WCF, and WebAPI
- Experience developing APIs for data access using WebAPI with OData
- Experience in database design and advanced query techniques (DML and performance tuning)
- Proficiency with software development lifecycle (SDLC) methodologies such as Agile and test-driven development
- Experience with DBMSs such as Oracle, SQL Server, and PostgreSQL
- Experience with DevOps practices and modern CI/CD deployment models using Jenkins/Ansible
- Experience leading a team of engineers in the design, development, and maintenance of software code for business applications
- Ability to provide technical guidance and coaching to less experienced team members
- Strong written and verbal communication skills

Nice to Have:
- Knowledge of AWS technologies (e.g., EC2, RDS, ALB, Auto Scaling, S3, IAM, CloudWatch) to develop and maintain an Amazon AWS-based cloud solution
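For context on the WebAPI-with-OData requirement above: OData standardizes query options such as $filter, $orderby, and $top directly in the request URL. The sketch below (in Python for brevity; the endpoint and field names are hypothetical, not from the posting) shows how such a query string is composed:

```python
from urllib.parse import urlencode

def build_odata_url(base, filter_expr=None, orderby=None, top=None):
    """Compose an OData query URL from the standard query options."""
    params = {}
    if filter_expr:
        params["$filter"] = filter_expr   # e.g. "Price gt 10"
    if orderby:
        params["$orderby"] = orderby
    if top is not None:
        params["$top"] = str(top)
    # urlencode percent-encodes "$" as %24, which OData servers accept
    return f"{base}?{urlencode(params)}" if params else base

url = build_odata_url("https://example.com/odata/Products",
                      filter_expr="Price gt 10", orderby="Name", top=5)
```

An OData-enabled WebAPI controller translates these options into database queries server-side, so clients page and filter without bespoke endpoints.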

Posted 3 weeks ago

Apply

2.0 - 4.0 years

4 - 6 Lacs

Chennai

Work from Office


The AWS Products (Amazon Web Services (AWS) Cloud Computing) role involves working with relevant technologies, ensuring smooth operations, and contributing to business objectives. Responsibilities include analysis, development, implementation, and troubleshooting within the AWS Products domain.

Posted 3 weeks ago

Apply

6.0 - 8.0 years

15 - 25 Lacs

Hyderabad

Remote


Job Title: Data Engineer II
Experience: 6+ Years
Location: Remote (India)
Job Type: Full-time

Job Description: We are looking for a highly skilled Data Engineer II with 6+ years of experience, including at least 4 years in data engineering or software development. The ideal candidate will be well versed in building scalable data solutions using modern data ecosystems and cloud platforms.

Key Responsibilities:
- Design, build, and optimize scalable ETL pipelines
- Work extensively with Big Data technologies such as Snowflake and Databricks
- Write and optimize complex SQL queries for large datasets
- Define and manage SLAs, performance benchmarks, and monitoring systems
- Develop data solutions using the AWS data ecosystem, including S3, Lambda, and more
- Handle both relational (e.g., PostgreSQL) and NoSQL databases
- Work with programming languages such as Python, Java, and/or Scala
- Use Linux command-line tools for system and data operations
- Implement best practices in data lineage, data quality, data observability, and data discoverability

Preferred (Nice to Have):
- Experience with data mesh architecture or building distributed data products
- Prior exposure to data governance frameworks
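The pipeline work this role describes follows a common extract-transform-load shape: ingest raw records, apply a data-quality gate, load, then aggregate. A minimal illustrative sketch using Python's stdlib sqlite3 as a stand-in for a warehouse such as Snowflake (table and column names are hypothetical, not from the posting):

```python
import sqlite3

def run_pipeline(rows):
    """Load raw (id, amount) rows, reject bad records, and aggregate."""
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE sales (id INTEGER, amount REAL)")
    # Transform: a simple data-quality gate dropping null/negative amounts
    clean = [(i, a) for i, a in rows if a is not None and a >= 0]
    conn.executemany("INSERT INTO sales VALUES (?, ?)", clean)
    # Aggregate, as a downstream warehouse job might
    total, n = conn.execute("SELECT SUM(amount), COUNT(*) FROM sales").fetchone()
    conn.close()
    return {"loaded": n, "rejected": len(rows) - n, "total": total}

result = run_pipeline([(1, 10.0), (2, -5.0), (3, None), (4, 2.5)])
```

In production the reject count would feed the monitoring and SLA systems the posting mentions, rather than being silently dropped.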

Posted 3 weeks ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office


Duration: 8 Months
Job Type: Contract
Work Type: Onsite

The top 3 responsibilities of this role:
- Work closely with senior engineers to implement and deliver high-quality technology solutions
- Own development in multiple layers of the stack, including distributed workflows hosted in a native AWS architecture
- Bring operational rigor to a rapidly growing tech stack

Leadership Principles: Learn and Be Curious, Ownership, Dive Deep, Bias for Action
Mandatory requirements: Java, problem solving, data structures and algorithms (DSA), Spring Framework
Preferred skills: AWS tools such as DynamoDB, CDK, Lambda, S3
Education or certification requirements: Minimum of a Bachelor's degree in Computer Science
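Postings that list problem solving and DSA as mandatory typically screen with exercises like the classic two-sum lookup, where a hash map turns a quadratic scan into a single pass. An illustrative example (in Python for brevity, though this role itself is Java-centric):

```python
def two_sum(nums, target):
    """Return indices of the two numbers summing to target, or None."""
    seen = {}  # value -> index of where we saw it
    for i, x in enumerate(nums):
        if target - x in seen:       # complement already seen: done
            return seen[target - x], i
        seen[x] = i
    return None

pair = two_sum([2, 7, 11, 15], 9)  # nums[0] + nums[1] == 9
```

The single pass with O(1)-average lookups is the kind of time/space trade-off reasoning such screens probe.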

Posted 1 month ago

Apply