
852 AWS Cloud Jobs - Page 34

JobPe aggregates listings for easy access, but you apply directly on the original job portal.

2 - 5 years

6 - 10 Lacs

Bengaluru

Work from Office


12+ years of overall IT experience, including 5+ years of cloud implementation experience (AWS S3, Terraform, Docker, Kubernetes). Expert in troubleshooting cloud implementation projects and in cloud-native technologies, with good working knowledge of Terraform and Quarkus.

Must-have skills: AWS cloud knowledge (S3, load balancers, VPC/VPC peering/private and public subnets, EKS, SQS, Lambda, Docker/container services, Terraform or other IaC technologies for standard deployments), Quarkus, PostgreSQL, Flyway, Kubernetes, OpenID flows, OpenSearch/Elasticsearch, OpenAPI/Swagger, Java. Optional: Kafka, Python. #LI-INPAS

Job Segment: Developer, Java, Technology
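As a minimal illustrative sketch of the S3/SQS/Lambda integration work this stack implies (hypothetical bucket name; not taken from the posting), a Python Lambda handler that archives incoming SQS messages to S3 might look like this:

```python
import json
import boto3

s3 = boto3.client("s3")
ARCHIVE_BUCKET = "example-archive-bucket"  # hypothetical bucket name


def handler(event, context):
    """Lambda handler triggered by SQS: archive each message body to S3."""
    records = event.get("Records", [])
    for record in records:
        body = json.loads(record["body"])
        key = f"events/{record['messageId']}.json"
        s3.put_object(
            Bucket=ARCHIVE_BUCKET,
            Key=key,
            Body=json.dumps(body).encode("utf-8"),
            ContentType="application/json",
        )
    return {"archived": len(records)}
```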

Posted 1 month ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office


NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Title: Lead Data Architect (Streaming)

Required Skills and Qualifications: Overall 10+ years of IT experience, of which 7+ years in data architecture and engineering. Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS. Strong experience with Confluent and Kafka. Solid understanding of data streaming architectures and best practices. Strong problem-solving skills and ability to think critically. Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders. Knowledge of Apache Airflow for data orchestration. Bachelor's degree in Computer Science, Engineering, or a related field.

Preferred Qualifications: An understanding of cloud networking patterns and practices. Experience working on a library or other long-term product. Knowledge of the Flink ecosystem. Experience with Terraform. Deep experience with CI/CD pipelines. Strong understanding of the JVM language family. Understanding of GDPR and the correct handling of PII. Expertise with technical interface design. Use of Docker.

Key Responsibilities: Architect end-to-end data solutions using AWS services (including Lambda, SNS, S3, and EKS), Kafka and Confluent, all within a larger, overarching programme ecosystem. Architect data processing applications using Python, Kafka, Confluent Cloud and AWS. Develop data ingestion, processing, and storage solutions using Python, AWS Lambda, Confluent and Kafka. Ensure data security and compliance throughout the architecture. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions. Optimize data flows for performance, cost-efficiency, and scalability. Implement data governance and quality control measures. Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams. Provide technical leadership and mentorship to development teams and lead engineers. Stay current with emerging technologies and industry trends. Collaborate with data scientists and analysts to enable efficient data access and analysis. Evaluate and recommend new technologies to improve data architecture.

Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity.
We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: Developer, Computer Science, Consulting, Technology
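Streaming architectures like the one this role describes typically include consumers that read from Confluent/Kafka topics and forward events to AWS services. A minimal Python sketch, assuming a hypothetical topic name and a local broker configuration (a real deployment would point at Confluent Cloud with SASL credentials):

```python
import json
from confluent_kafka import Consumer

# Hypothetical local configuration for illustration only.
consumer = Consumer({
    "bootstrap.servers": "localhost:9092",
    "group.id": "orders-archiver",
    "auto.offset.reset": "earliest",
})
consumer.subscribe(["orders"])  # hypothetical topic name

try:
    while True:
        msg = consumer.poll(timeout=1.0)
        if msg is None:
            continue
        if msg.error():
            print(f"Consumer error: {msg.error()}")
            continue
        event = json.loads(msg.value().decode("utf-8"))
        # In a role like this, events would be validated and forwarded to S3,
        # Lambda, or other downstream processing instead of printed.
        print(event)
finally:
    consumer.close()
```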

Posted 1 month ago

Apply

2 - 6 years

5 - 9 Lacs

Hyderabad

Work from Office


AWS Data Engineer: ***************** As an AWS Data Engineer, you will contribute to our client and will have the below responsibilities: Work with the technical development team and team lead to understand desired application capabilities. The candidate will develop using application development lifecycles and continuous integration/deployment practices. Work to integrate open-source components into data-analytic solutions. Willingness to continuously learn and share learnings with others.

Required: 5+ years of directly applicable experience with a key focus on Glue and Python, AWS, and data pipeline creation. Develop code using Python, such as: developing data pipelines from various external data sources to internal data; using Glue to extract data from the design database; developing Python APIs as needed. Minimum 3 years of hands-on experience in Amazon Web Services including EC2, VPC, S3, EBS, ELB, CloudFront, IAM, RDS, CloudWatch. Able to interpret business requirements, and analyze, design and develop applications on AWS Cloud and ETL technologies. Able to design and architect serverless applications using AWS Lambda, EMR, and DynamoDB. Ability to leverage AWS data migration tools and technologies including Storage Gateway, Database Migration and Import/Export services. Understands relational database design, stored procedures, triggers, user-defined functions, and SQL jobs. Familiar with CI/CD tools (e.g., Jenkins, UCD) for automated application deployments. Understanding of OLAP, OLTP, Star Schema, Snowflake Schema, and Logical/Physical/Dimensional Data Modeling. Ability to extract data from multiple operational sources and load it into staging, data warehouses, data marts, etc. using SCD (Type 1/Type 2/Type 3/hybrid) loads. Familiar with Software Development Life Cycle (SDLC) stages in Waterfall and Agile environments.

Nice to have: Familiarity with source control management tools for branching, merging, labeling/tagging and integration, such as Git and SVN. Experience working with UNIX/Linux environments. Hands-on experience with IDEs such as Jupyter Notebook.

Education & Certification: University degree or diploma and applicable years of experience.

Job Segment: Developer, Open Source, Data Warehouse, Cloud, Database, Technology
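For orientation only, a skeleton AWS Glue job in Python of the sort this role would build; the catalog database, table name, and output path below are hypothetical placeholders, not details from the posting:

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job boilerplate: the job name arrives via job arguments.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read a table registered in the Glue Data Catalog (hypothetical database/table).
source = glue_context.create_dynamic_frame.from_catalog(
    database="sales_raw", table_name="orders"
)

# Trivial transform: drop rows missing a primary key before loading downstream.
clean = source.filter(lambda row: row["order_id"] is not None)

# Write the result to S3 as Parquet (hypothetical output path).
glue_context.write_dynamic_frame.from_options(
    frame=clean,
    connection_type="s3",
    connection_options={"path": "s3://example-curated/orders/"},
    format="parquet",
)

job.commit()
```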

Posted 1 month ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office


NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Title: Lead Data Architect (Warehousing)

Required Skills and Qualifications: Overall 10+ years of IT experience, of which 7+ years in data architecture and engineering. Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS. Proficiency in Python. Solid understanding of data warehousing architectures and best practices. Strong Snowflake and data warehouse skills. Strong problem-solving skills and ability to think critically. Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders. Experience of data cataloguing. Knowledge of Apache Airflow for data orchestration. Experience modelling, transforming and testing data in DBT. Bachelor's degree in Computer Science, Engineering, or a related field.

Preferred Qualifications: Familiarity with Atlan for data catalog and metadata management. Experience integrating with IBM MQ. Familiarity with SonarQube for code quality analysis. AWS certifications (e.g., AWS Certified Solutions Architect). Experience with data modeling and database design. Knowledge of data privacy regulations and compliance requirements. An understanding of lakehouses and Apache Iceberg tables. SnowPro Core certification.

Key Responsibilities: Architect end-to-end data solutions using AWS services (including Lambda, SNS, S3, and EKS), as well as Snowflake, DBT and Apache Airflow, all within a larger, overarching programme ecosystem. Develop data ingestion, processing, and storage solutions using Python, AWS Lambda and Snowflake. Architect data processing applications using Python. Ensure data security and compliance throughout the architecture. Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions. Optimize data flows for performance, cost-efficiency, and scalability. Implement data governance and quality control measures. Ensure delivery of CI, CD and IaC for NTT tooling, and as templates for downstream teams. Provide technical leadership and mentorship to development teams and lead engineers. Stay current with emerging technologies and industry trends. Ensure data security and implement best practices using tools like Snyk. Collaborate with data scientists and analysts to enable efficient data access and analysis. Evaluate and recommend new technologies to improve data architecture.

Position Overview: We are seeking a highly skilled and experienced Data Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success.
As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: Developer, Solution Architect, Data Warehouse, Computer Science, Database, Technology
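As a hedged illustration of the Airflow orchestration this role mentions, a small DAG that stages data in S3 and then triggers a Snowflake load might look like the sketch below; the DAG id, schedule, and task bodies are hypothetical placeholders:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract_to_s3(**context):
    # Placeholder: pull data from a source system and write it to an S3 staging prefix.
    print("extracted batch to s3://example-staging/ (hypothetical)")


def load_into_snowflake(**context):
    # Placeholder: run a COPY INTO or DBT job against Snowflake.
    print("loaded staged files into Snowflake (hypothetical)")


with DAG(
    dag_id="daily_warehouse_load",       # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract_to_s3", python_callable=extract_to_s3)
    load = PythonOperator(task_id="load_into_snowflake", python_callable=load_into_snowflake)
    extract >> load
```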

Posted 1 month ago

Apply

2 - 6 years

8 - 12 Lacs

Bengaluru

Work from Office


Req ID: 306668. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Engineering Sr. Staff Engineer to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Job Title: Lead Data Engineer (Warehouse)

Required Skills and Qualifications:
- 7+ years of experience in data engineering, of which at least 3+ years leading or managing a data engineering team of 5+ engineers
- Experience with AWS cloud services
- Expertise with Python and SQL
- Experience using Git/GitHub for source control management
- Experience with Snowflake
- Strong understanding of lakehouse architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders
- Strong use of version control and proven ability to govern a team in the best-practice use of version control
- Strong understanding of Agile and proven ability to govern a team in the best-practice use of Agile methodologies

Preferred Skills and Qualifications:
- An understanding of lakehouses
- An understanding of Apache Iceberg tables
- An understanding of data cataloguing
- Knowledge of Apache Airflow for data orchestration
- An understanding of DBT
- SnowPro Core certification
- Bachelor's degree in Computer Science, Engineering, or related field

Key Responsibilities: Lead and direct a small team of engineers engaged in:
- Engineering end-to-end data solutions using AWS services, including Lambda, S3, Snowflake, DBT and Apache Airflow
- Cataloguing data
- Collaborating with cross-functional teams to understand business requirements and translate them into technical solutions
- Providing best-in-class documentation for downstream teams to develop, test and run data products built using our tools
- Testing our tooling, and providing a framework for downstream teams to test their utilisation of our products
- Helping to deliver CI, CD and IaC for both our own tooling, and as templates for downstream teams
- Using DBT projects to define re-usable pipelines

Position Overview: We are seeking a highly skilled and experienced Lead Data Engineer to join our dynamic team. The ideal candidate will have a strong background in implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies, and in leading teams and directing engineering workloads. This role requires a deep understanding of data engineering, cloud services, and the ability to implement high-quality solutions.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future.
Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: Computer Science, Database, SQL, Consulting, Technology
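Purely as a sketch of the kind of warehouse check a lead data engineer might automate, the snippet below queries Snowflake from Python with the official connector; the connection parameters, warehouse, and table are hypothetical placeholders:

```python
import os
import snowflake.connector

# Hypothetical credentials supplied via environment variables.
conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="ANALYTICS_WH",
    database="ANALYTICS",
    schema="STAGING",
)

try:
    cur = conn.cursor()
    # Example data-quality check: row counts per load date in a staged table.
    cur.execute(
        "SELECT load_date, COUNT(*) FROM staged_orders GROUP BY load_date ORDER BY load_date"
    )
    for load_date, row_count in cur.fetchall():
        print(load_date, row_count)
finally:
    conn.close()
```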

Posted 1 month ago

Apply

4 - 9 years

16 - 20 Lacs

Bengaluru

Work from Office


Req ID: 301930. NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Digital Solution Architect Lead Advisor to join our team in Bangalore, Karnataka (IN-KA), India (IN).

Title: Data Solution Architect

Position Overview: We are seeking a highly skilled and experienced Data Solution Architect to join our dynamic team. The ideal candidate will have a strong background in designing and implementing data solutions using AWS infrastructure and a variety of core and supplementary technologies. This role requires a deep understanding of data architecture, cloud services, and the ability to drive innovative solutions to meet business needs.

Required Skills and Qualifications:
- Bachelor's degree in Computer Science, Engineering, or related field
- 7+ years of experience in data architecture and engineering
- Strong expertise in AWS cloud services, particularly Lambda, SNS, S3, and EKS
- Proficiency in Kafka/Confluent Kafka and Python
- Experience with Snyk for security scanning and vulnerability management
- Solid understanding of data streaming architectures and best practices
- Strong problem-solving skills and ability to think critically
- Excellent communication skills to convey complex technical concepts to both technical and non-technical stakeholders

Preferred Qualifications:
- Experience with Kafka Connect and Confluent Schema Registry
- Familiarity with Atlan for data catalog and metadata management
- Knowledge of Apache Flink for stream processing
- Experience integrating with IBM MQ
- Familiarity with SonarQube for code quality analysis
- AWS certifications (e.g., AWS Certified Solutions Architect)
- Experience with data modeling and database design
- Knowledge of data privacy regulations and compliance requirements

Key Responsibilities:
- Design and implement scalable data architectures using AWS services and Kafka
- Develop data ingestion, processing, and storage solutions using Python and AWS Lambda
- Ensure data security and implement best practices using tools like Snyk
- Optimize data pipelines for performance and cost-efficiency
- Collaborate with data scientists and analysts to enable efficient data access and analysis
- Implement data governance policies and procedures
- Provide technical guidance and mentorship to junior team members
- Evaluate and recommend new technologies to improve data architecture
- Architect end-to-end data solutions using AWS services, including Lambda, SNS, S3, and EKS
- Design and implement data streaming pipelines using Kafka/Confluent Kafka
- Develop data processing applications using Python
- Ensure data security and compliance throughout the architecture
- Collaborate with cross-functional teams to understand business requirements and translate them into technical solutions
- Optimize data flows for performance, cost-efficiency, and scalability
- Implement data governance and quality control measures
- Provide technical leadership and mentorship to development teams
- Stay current with emerging technologies and industry trends

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success.
As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA is an equal opportunity employer and considers all applicants without regard to race, color, religion, citizenship, national origin, ancestry, age, sex, sexual orientation, gender identity, genetic information, physical or mental disability, veteran or marital status, or any other characteristic protected by law. We are committed to creating a diverse and inclusive environment for all employees. If you need assistance or an accommodation due to a disability, please inform your recruiter so that we may connect you with the appropriate team.

Job Segment: Solution Architect, Consulting, Database, Computer Science, Technology
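As a brief illustrative sketch of the Lambda/SNS fan-out pattern this architecture references, the handler below validates an incoming event and republishes it to an SNS topic; the topic ARN, event field, and attribute value are hypothetical, not details from the posting:

```python
import json
import os
import boto3

sns = boto3.client("sns")
# Hypothetical topic ARN, normally injected via environment configuration.
TOPIC_ARN = os.environ.get(
    "EVENTS_TOPIC_ARN", "arn:aws:sns:us-east-1:123456789012:example-events"
)


def handler(event, context):
    """Validate an incoming record and fan it out to SNS subscribers."""
    if "order_id" not in event:
        raise ValueError("event is missing required field 'order_id'")
    sns.publish(
        TopicArn=TOPIC_ARN,
        Message=json.dumps(event),
        MessageAttributes={
            "eventType": {"DataType": "String", "StringValue": "order.created"}
        },
    )
    return {"published": True}
```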

Posted 1 month ago

Apply

2 - 5 years

6 - 10 Lacs

Mumbai

Work from Office


NTT DATA strives to hire exceptional, innovative and passionate individuals who want to grow with us. If you want to be part of an inclusive, adaptable, and forward-thinking organization, apply now. We are currently seeking a Systems Integration Specialist to join our team in Mumbai, Maharashtra (IN-MH), India (IN).

Cloud Security Lead:
a. Experience: 6-8 years of experience in cloud/security.
b. Skills: Actively and aggressively fix security findings on AWS cloud (Security Hub, GuardDuty, Detective, Inspector, etc.). Experience working with Check Point and Prisma Cloud will be an advantage. Should work to definitive targets to improve the security posture.

About NTT DATA: NTT DATA is a $30 billion trusted global innovator of business and technology services. We serve 75% of the Fortune Global 100 and are committed to helping clients innovate, optimize and transform for long-term success. As a Global Top Employer, we have diverse experts in more than 50 countries and a robust partner ecosystem of established and start-up companies. Our services include business and technology consulting, data and artificial intelligence, industry solutions, as well as the development, implementation and management of applications, infrastructure and connectivity. We are one of the leading providers of digital and AI infrastructure in the world. NTT DATA is a part of NTT Group, which invests over $3.6 billion each year in R&D to help organizations and society move confidently and sustainably into the digital future. Visit us at us.nttdata.com.

NTT DATA endeavors to make https://us.nttdata.com accessible to any and all users. If you would like to contact us regarding the accessibility of our website or need assistance completing the application process, please contact us at https://us.nttdata.com/en/contact-us. This contact information is for accommodation requests only and cannot be used to inquire about the status of applications. NTT DATA is an equal opportunity employer. Qualified applicants will receive consideration for employment without regard to race, color, religion, sex, sexual orientation, gender identity, national origin, disability or protected veteran status. For our EEO Policy Statement, please click here. If you'd like more information on your EEO rights under the law, please click here. For Pay Transparency information, please click here.

Job Segment: Consulting, Technology
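A minimal illustrative sketch of the triage work a Cloud Security Lead might automate: pulling active, high-severity Security Hub findings with boto3. The region and severity labels are assumptions, not requirements from the listing:

```python
import boto3

securityhub = boto3.client("securityhub", region_name="ap-south-1")  # assumed region

# Fetch active findings labelled HIGH or CRITICAL for triage.
response = securityhub.get_findings(
    Filters={
        "RecordState": [{"Value": "ACTIVE", "Comparison": "EQUALS"}],
        "SeverityLabel": [
            {"Value": "HIGH", "Comparison": "EQUALS"},
            {"Value": "CRITICAL", "Comparison": "EQUALS"},
        ],
    },
    MaxResults=50,
)

for finding in response["Findings"]:
    print(finding["Severity"]["Label"], finding["Title"])
```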

Posted 1 month ago

Apply

3 - 6 years

2 - 6 Lacs

Hyderabad

Work from Office


ABOUT AMGEN: Amgen harnesses the best of biology and technology to fight the world's toughest diseases, and make people's lives easier, fuller and longer. We discover, develop, manufacture and deliver innovative medicines to help millions of patients. Amgen helped establish the biotechnology industry more than 40 years ago and remains on the cutting edge of innovation, using technology and human genetic data to push beyond what's known today.

ABOUT THE ROLE - Role Description: We are seeking an experienced MDM Manager with 10-14 years of experience to lead strategic development and operations of our Master Data Management (MDM) platforms, with hands-on experience in Informatica or Reltio. This role will involve managing a team of data engineers, architects, and quality experts to deliver high-performance, scalable, and governed MDM solutions that align with enterprise data strategy. To succeed in this role, the candidate must have strong MDM experience along with Data Governance, DQ, and Data Cataloging implementation knowledge; hence candidates must have a minimum of 6-8 years of core MDM technical experience for this role (within a total experience in the range of 10-14 years).

Roles & Responsibilities: Lead the implementation and optimization of MDM solutions using Informatica or Reltio platforms. Define and drive enterprise-wide MDM architecture, including IDQ, data stewardship, and metadata workflows. Match/merge and survivorship strategy and implementation experience. Design and deliver MDM processes and data integrations using Unix, Python, and SQL. Collaborate with the backend data engineering team and the frontend custom UI team for strong integrations and a seamless, enhanced user experience. Manage cloud-based infrastructure using AWS and Databricks to ensure scalability and performance. Coordinate with business and IT stakeholders to align MDM capabilities with organizational goals. Establish data quality metrics and monitor compliance using automated profiling and validation tools. Promote data governance and contribute to enterprise data modeling and approval workflows (DCRs). Ensure data integrity, lineage, and traceability across MDM pipelines and solutions. Provide mentorship and technical leadership to junior team members and ensure project delivery timelines. Lead custom UI design for a better user experience in data stewardship.

Basic Qualifications and Experience: Master's degree with 8-10 years of experience in Business, Engineering, IT or a related field; OR Bachelor's degree with 10-14 years of experience in Business, Engineering, IT or a related field; OR Diploma with 14-16 years of experience in Business, Engineering, IT or a related field.

Functional Skills - Must-Have Skills: Deep knowledge of MDM tools (Informatica, Reltio) and data quality frameworks (IDQ), from configuring data assets to building end-to-end data pipelines and integrations for data mastering and orchestration of ETL pipelines. Very good understanding of reference data, hierarchy, and its integration with MDM. Hands-on experience with custom workflows (AVOS, Eclipse, etc.). Strong experience with external data enrichment services such as D&B, Address Doctor, etc. Strong experience with match/merge and survivorship rules strategy and implementation. Strong experience with group fields, cross-reference data and UUIDs. Strong understanding of AWS cloud services and Databricks architecture. Proficiency in Python, SQL, and Unix for data processing and orchestration.
Experience with data modeling, governance, and DCR lifecycle management. Proven leadership and project management in large-scale MDM implementations. Able to implement end-to-end integrations, including API-based, batch, and flat-file-based integrations. Must have worked on at least 3 end-to-end implementations of MDM.

Good-to-Have Skills: Experience with Tableau or Power BI for reporting MDM insights. Exposure to Agile practices and tools (JIRA, Confluence). Prior experience in Pharma/Life Sciences. Understanding of compliance and regulatory considerations in master data.

Professional Certifications: Any MDM certification (e.g., Informatica, Reltio). Any data analysis certification (SQL). Any cloud certification (AWS or Azure).

Soft Skills: Strong analytical abilities to assess and improve master data processes and solutions. Excellent verbal and written communication skills, with the ability to convey complex data concepts clearly to technical and non-technical stakeholders. Effective problem-solving skills to address data-related issues and implement scalable solutions. Ability to work effectively with global, virtual teams.

EQUAL OPPORTUNITY STATEMENT: Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status. We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.
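To illustrate the match/merge and survivorship concepts named above (a generic sketch only, not Informatica or Reltio code), the pure-Python example below merges duplicate records by keeping the most recently updated non-empty value for each attribute; the record structure and rule are hypothetical:

```python
from datetime import date

# Hypothetical duplicates already grouped by a matching rule (e.g., same email).
duplicates = [
    {"name": "A. Sharma", "phone": "", "city": "Pune", "updated": date(2023, 5, 1)},
    {"name": "Anita Sharma", "phone": "+91-98xxxx", "city": "", "updated": date(2024, 2, 10)},
]


def survivorship(records):
    """Most-recent-non-empty wins: a simple survivorship rule producing a golden record."""
    golden = {}
    for record in sorted(records, key=lambda r: r["updated"]):
        for field, value in record.items():
            if field == "updated":
                continue
            if value:  # later, non-empty values overwrite earlier ones
                golden[field] = value
    golden["updated"] = max(r["updated"] for r in records)
    return golden


print(survivorship(duplicates))
# {'name': 'Anita Sharma', 'phone': '+91-98xxxx', 'city': 'Pune', 'updated': datetime.date(2024, 2, 10)}
```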

Posted 1 month ago

Apply

6 - 10 years

8 - 12 Lacs

Hyderabad

Work from Office


Join Amgen's Mission to Serve Patients. If you feel like you're part of something bigger, it's because you are. At Amgen, our shared mission, to serve patients, drives all that we do. It is key to our becoming one of the world's leading biotechnology companies. We are global collaborators who achieve together: researching, manufacturing, and delivering ever-better products that reach over 10 million patients worldwide. It's time for a career you can be proud of.

Digital Product Manager/Content Curator

Live - What you will do: Let's do this. Let's change the world. In this vital role, we are seeking a detail-oriented and research-savvy Content Curator to support our enterprise Search Program within the pharmaceutical sector. This role is critical to improving how scientists, researchers, clinicians, and business teams discover relevant, accurate, and well-structured information across vast internal and external data sources. You will curate, classify, and optimize content to ensure it is accessible, contextual, and aligned with regulatory standards.

Responsibilities: Curate scientific, clinical, regulatory, and commercial content for use within internal search platforms. Source and aggregate relevant content across various platforms. Ensure high-value content is properly tagged, described, and categorized using standard metadata and taxonomies. Identify and fill content gaps based on user needs and search behavior. Organize and schedule content publication to maintain consistency. Analyze content performance and make data-driven decisions to optimize engagement. Provide feedback and input on synonym lists, controlled vocabularies, and NLP enrichment tools. Apply and help maintain consistent metadata standards, ontologies, and classification schemes (e.g., MeSH, SNOMED, MedDRA). Work with taxonomy and knowledge management teams to evolve tagging strategies and improve content discoverability. Capture and highlight the best content from a wide range of topics. Stay up to date on best practices and make recommendations for content strategy. Edit and optimize content for search engine optimization. Perform quality assurance checks on all content before publication. Identify and track metrics to measure the success of content curation efforts. Review and curate content from a wide variety of categories with a defined focus.

Skills: Understanding of fundamental data structures and algorithms. Understanding of how to optimize content for search engines, which is important for visibility. Experience in identifying, organizing, and sharing content. Ability to clearly and concisely communicate complex information. Ability to analyze data and track the performance of content. Ability to quickly adapt to changing information landscapes and find new resources.
A deep understanding of Google Cloud Platform services and technologies is crucial and will be an added advantage. Check and update digital assets regularly and, if needed, modify their accessibility and security settings. Investigate, secure, and properly document permission clearance to publish data, graphics, videos, and other media. Develop and manage a system for storing and organizing digital material. Convert collected assets to a different digital format and discard material that is no longer relevant or needed. Investigate new trends and tools connected with the generation and curation of digital material.

Basic Qualifications: Degree in Data Management, Mass Communication and Computer Science & Engineering preferred, with 9-12 years of software development experience. 5+ years of experience in (digital) content curation or a related position. Excellent organizational and time-management skills. Ability to analyze data and derive insights for content optimization. Familiarity with metadata standards, taxonomy tools, and content management systems. Ability to interpret scientific or clinical content and structure it for digital platforms. Exceptional written and verbal communication skills. Experience with Content Management Systems (CMS), SEO, Google Analytics, GxP Search Engine/Solr Search, enterprise search platforms, and Databricks. Strong problem-solving and analytical skills; ability to learn quickly; excellent communication and interpersonal skills.

Preferred Qualifications: Experience with enterprise search platforms (e.g., Lucene, Elasticsearch, Coveo, Sinequa). Experience with GCP Cloud/AWS Cloud/Azure Cloud. Experience with GxP Search Engine/Solr Search. Experience with PostgreSQL/MongoDB databases, vector databases for large language models, Databricks or RDS, DynamoDB, S3. Experience in Agile software development methodologies.

Good to Have Skills: Willingness to work on AI applications. Experience with popular large language models. Experience with the LangChain or LlamaIndex frameworks for language models. Experience with prompt engineering and model fine-tuning. Knowledge of NLP techniques for text analysis and sentiment analysis.

Soft Skills: Excellent analytical and troubleshooting skills. Strong verbal and written communication skills. Ability to work effectively with global teams. High degree of initiative and self-motivation. Team-oriented, with a focus on achieving team goals. Strong presentation and public speaking skills.

Thrive - What you can expect from us: As we work to develop treatments that take care of others, we also work to care for our teammates' professional and personal growth and well-being. In addition to the base salary, Amgen offers competitive and comprehensive Total Rewards Plans that are aligned with local industry standards.

Apply now for a career that defies imagination. In our quest to serve patients above all else, Amgen is the first to imagine, and the last to doubt. Join us. careers.amgen.com

Amgen is an Equal Opportunity employer and will consider you without regard to your race, color, religion, sex, sexual orientation, gender identity, national origin, protected veteran status, or disability status.
We will ensure that individuals with disabilities are provided with reasonable accommodation to participate in the job application or interview process, to perform essential job functions, and to receive other benefits and privileges of employment. Please contact us to request accommodation.

Posted 1 month ago

Apply

14 - 20 years

40 - 45 Lacs

Bengaluru

Work from Office


About the Company: Ness is a full lifecycle digital engineering firm offering digital advisory through scaled engineering services. With 20+ years of specialization in product and platform engineering, Ness is a global leader in digital transformation. We design, build, and integrate digital platforms and enterprise software that help organizations engage with customers, differentiate their brands, and drive profitable growth. Our experience designers, software engineers, data experts, and business consultants partner with clients to develop roadmaps that identify ongoing opportunities to increase the value of their digital solutions and enterprise systems. The exciting work happens through 11 innovation hubs with 5000+ Nessians located across the globe. Please visit our website www.ness.com and learn about our wonderful work.

We are inviting applications for Engineering Manager. In this role, you will work towards developing the organization's strategy for using technological resources, ensuring technologies are used efficiently, profitably, and securely, and evaluating and implementing new solutions.

Roles & Responsibilities:
• Should have 14+ years of experience working in product development organizations, with proven experience developing enterprise-scale products in a highly Agile/Scrum environment.
• Should be able to manage releases for products having multiple live versions and multiple releases through the year.
• Ability to manage delivery with 20+ engineers, including architecture, design, code reviews and people's career management, with good exposure to engineering processes, product delivery, playbooks, frameworks, etc.
• Specific responsibilities include driving the team on innovation and implementation, creating and reviewing architectural designs, mentoring the team, and honing its engineering skills.
• Strong knowledge of the Java-Spring technical stack, databases (SQL Server, Oracle), modern JS frameworks like React, AWS cloud, and design and architectural patterns and frameworks.
• Good understanding of application security, performance and quality, and DevOps processes.
• Very good knowledge of software development tools, patterns and processes (Agile principles, Scrum, SAFe).
• Collaborate with architects, product management, and engineering teams to create solutions that increase the platform's value.
• Create technical specifications, prototypes, and presentations to communicate your ideas.
• Well-versed in emerging industry technologies and trends, with the ability to communicate that knowledge to the team and influence product direction.
• Own progress of the product through the development life cycle, identifying risks and opportunities, and ensuring visibility to senior leadership.
• Partner with product management to define and refine our product roadmap, user experience, priorities, and schedule.
• Excellent critical thinking, analytical, problem-solving and solutioning skills with a customer-first mindset.

Good to have:
• Highly motivated, with the ability to convert vague and ill-defined problems into well-defined problems, take initiative and encourage consensus building in the team.
• Strong written and verbal communication and articulation skills.
• Demonstrable project management, stakeholder management and organizational skills.
• Proven ability to lead in a matrix environment.
• Strong interpersonal and talent management skills, including the ability to identify and develop product management talent.

Posted 1 month ago

Apply

8 - 13 years

25 - 40 Lacs

Pune

Work from Office


Role & responsibilities: Design and develop infrastructure solutions to support business functions, processes, and applications. Responsible for designing and building AWS cloud infrastructure solutions and an automation framework. Participate in the creation of new infrastructure and modification of existing infrastructure environments in the cloud. Communicate with stakeholders and development teams to assist in coordinating the successful delivery of tools and software. Continuously improve the existing infrastructure by identifying gaps and ticket trends. Develop, evaluate, and make recommendations for alternative infrastructure solutions. Stay current with new technology options and cloud infrastructure solutions. Build Infrastructure as Code services, e.g., CloudFormation and/or Terraform. Set up and drive a standard enterprise process for cloud infrastructure setup and deployment. Help other development and engineering teams resolve application-to-platform integration issues for Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) services. Work closely with stakeholders and product teams to gather technical and non-technical requirements and translate them into well-architected solutions. Stay up to date with the latest developments in cloud services and bring that knowledge and best practices to the team.

Preferred candidate profile: Hands-on experience in designing, implementing, and managing cloud-based solutions, ensuring scalability, reliability, security, compliance and performance across various AWS services. Collaborate with development, operations, and architecture teams to integrate with AWS services. Optimize cost management by setting up budgets, analysing cost usage using tools like Cost Explorer, and providing recommendations for cost-saving measures. Develop migration strategies and lead execution of migration activities. Implement security controls and align the cloud environment with relevant industry regulations (e.g., GDPR, HIPAA). Conduct security assessments and set up control towers. Develop recovery strategies: design disaster recovery (DR) and business continuity (BC) plans. Develop and maintain cloud governance frameworks, policies, and procedures. Develop team proficiency in infrastructure automation tools like Terraform, AWS CloudFormation, and Azure Resource Manager (ARM) templates. Should be proficient in scripting languages like PowerShell, infrastructure automation frameworks, and CI/CD pipelines to automate configuration management, provisioning, and deployment processes.
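As an illustrative sketch of the Infrastructure as Code workflow described above (the stack name, template, and region are hypothetical), provisioning a CloudFormation stack can be driven from Python with boto3:

```python
import boto3

cloudformation = boto3.client("cloudformation", region_name="ap-south-1")  # assumed region

# Hypothetical minimal template: a single S3 bucket managed as code.
TEMPLATE_BODY = """
AWSTemplateFormatVersion: '2010-09-09'
Resources:
  ArtifactBucket:
    Type: AWS::S3::Bucket
"""

# For an existing stack you would use update_stack or change sets instead.
response = cloudformation.create_stack(
    StackName="example-artifact-stack",  # hypothetical stack name
    TemplateBody=TEMPLATE_BODY,
)
print("Stack creation started:", response["StackId"])

# Block until the stack reaches CREATE_COMPLETE (or the waiter raises on failure).
waiter = cloudformation.get_waiter("stack_create_complete")
waiter.wait(StackName="example-artifact-stack")
```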

Posted 1 month ago

Apply

8 - 13 years

30 - 35 Lacs

Pune

Hybrid


We are seeking a highly skilled and experienced Cloud Solution Architect to lead the design, implementation, and management of cloud-based solutions across Azure and AWS. The ideal candidate will have deep expertise in cloud architecture, infrastructure, security, and automation while demonstrating strong capabilities in proposal writing, RFPs, architecture diagrams, and technical presentations.

Key Responsibilities: Cloud Architecture & Design: Design and implement scalable, secure, and cost-effective cloud architectures across Azure and AWS. Pre-Sales & Solutioning: Engage with clients to understand business requirements and develop cloud solutions, including responding to RFPs and crafting technical proposals. Technical Documentation & Diagrams: Create high-level and low-level architecture diagrams, design documents, and best practices for cloud adoption. Presentations & Stakeholder Communication: Deliver technical presentations, PPTs, and cloud strategy discussions to clients and internal teams. Migration & Modernization: Assess and execute cloud migration strategies, including lift-and-shift, re-platforming, and re-architecting applications. Security & Compliance: Ensure cloud solutions comply with industry standards such as SOC 2, ISO 27001, NIST, and CIS benchmarks. Automation & Optimization: Utilize Infrastructure-as-Code (IaC) tools like Terraform, ARM Templates, or CloudFormation for automation. Collaboration & Leadership: Work closely with DevOps, engineering, and security teams to drive cloud adoption and best practices.

Required Skills & Experience: 15+ years of experience in IT, with 10+ years in cloud architecture (Azure, AWS). Expertise in Azure services (VMs, AKS, AAD, Networking, Security, Storage, etc.) and AWS services (EC2, RDS, Lambda, VPC, IAM, etc.). Strong experience with architecture frameworks like TOGAF and the Well-Architected Framework (Azure & AWS). Hands-on experience in Infrastructure as Code (Terraform, ARM, CloudFormation) and automation using PowerShell, Python, or Bash. Knowledge of cloud security, identity & access management, and compliance frameworks. Experience working on RFPs, proposals, and pre-sales activities. Strong communication and presentation skills with the ability to create and deliver PPTs, whitepapers, and technical documentation. Experience with hybrid cloud solutions, multi-cloud strategies, and cloud governance. Understanding of networking, VPNs, firewalls, load balancing, and DNS in a cloud environment. Certifications such as Azure Solutions Architect Expert, AWS Certified Solutions Architect.

Posted 1 month ago

Apply

5 - 8 years

4 - 8 Lacs

Bengaluru

Remote


We are seeking a skilled and motivated AWS Cloud Engineer to manage and optimize our cloud infrastructure. You will be responsible for designing, implementing, and maintaining scalable, secure, and cost-effective AWS environments that support our fintech products and services.

Key Responsibilities: Design, deploy, and maintain cloud infrastructure on AWS. Automate provisioning, configuration, and scaling using Infrastructure as Code (IaC) tools such as Terraform or CloudFormation. Monitor system performance, troubleshoot issues, and optimize cloud resources for performance and cost. Implement security best practices including IAM roles, security groups, and encryption. Collaborate with development, QA, and DevOps teams to support CI/CD pipelines. Ensure high availability, backup, and disaster recovery plans are in place and tested. Maintain compliance with security, governance, and regulatory standards.

Key Skills: Deep knowledge of Amazon Web Services (AWS): EC2, S3, RDS, Lambda, VPC, CloudWatch, etc. Experience with Infrastructure as Code: Terraform or AWS CloudFormation. Strong scripting skills (Bash, Python, etc.). Knowledge of CI/CD tools: Jenkins, GitHub Actions, GitLab CI. Experience with monitoring/logging tools: CloudWatch, ELK Stack, Prometheus, Grafana. Understanding of cloud security best practices and networking concepts. Familiarity with containerization (Docker) and orchestration (Kubernetes, optional). Experience with Linux-based server environments.
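A small hedged example of the monitoring work described above: creating a CPU-utilization alarm with boto3. The instance ID, SNS topic, region, and thresholds are hypothetical choices, not values from the listing:

```python
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="ap-south-1")  # assumed region

cloudwatch.put_metric_alarm(
    AlarmName="example-high-cpu",                       # hypothetical alarm name
    AlarmDescription="Alert when average CPU exceeds 80% for 10 minutes",
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "InstanceId", "Value": "i-0123456789abcdef0"}],  # hypothetical instance
    Statistic="Average",
    Period=300,
    EvaluationPeriods=2,
    Threshold=80.0,
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:ap-south-1:123456789012:example-alerts"],  # hypothetical topic
)
```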

Posted 1 month ago

Apply

8 - 13 years

40 - 50 Lacs

Bengaluru

Work from Office


Role: Infrastructure Engineer. Location: Bangalore. Duration: Permanent. Experience: 8+ years.

About the role: We are seeking an experienced Infrastructure Engineer to join our team at, a leader in blockchain technology and solutions. The ideal candidate will have a strong background in infrastructure management and a deep understanding of blockchain ecosystems. You will be responsible for designing, implementing, and maintaining the foundational infrastructure that supports our blockchain platforms, ensuring high availability, scalability, and security. Your expertise in AWS cloud technologies and database management, particularly with RDS, PostgreSQL, and Aurora, will be essential to our success.

Responsibilities: Design & Deployment: Develop, deploy, and manage the infrastructure for blockchain nodes, databases, and network systems. Automation & Optimization: Automate infrastructure provisioning and maintenance tasks to enhance efficiency and reduce downtime; optimize performance, reliability, and scalability across our blockchain systems. Monitoring & Troubleshooting: Set up monitoring and alerting systems to proactively manage infrastructure health; quickly identify, troubleshoot, and resolve issues in production environments. Security Management: Implement robust security protocols, firewalls, and encryption to protect infrastructure and data from breaches and vulnerabilities; should be well versed in VPC (Virtual Private Cloud). Collaboration: Work closely with development, DevOps, and security teams to ensure seamless integration and support of blockchain applications; support cross-functional teams in achieving network reliability and efficient resource management. Documentation: Maintain comprehensive documentation of infrastructure configurations, processes, and recovery plans. Continuous Improvement: Research and implement new tools and practices to improve infrastructure resiliency, performance, and cost-efficiency; stay updated on blockchain infrastructure trends and industry best practices. Incident Management: Incident dashboard management; integrate dashboards using different power tools.

Requirements: Educational Background: Bachelor's degree in Computer Science, Information Technology, or a related field. Experience: Minimum of 7 years of experience in AWS infrastructure engineering using Terraform, Terragrunt, and Atlantis, with incident management and resolution using automation (infrastructure as code) and AWS cloud infrastructure provisioning; should be aware of VPC (Virtual Private Cloud). Technical Skills: Terraform and automation; AWS CloudWatch; hands-on experience with monitoring tools (e.g., Prometheus, Grafana); DevOps with CI/CD pipelines; incident management, resolution and reporting; proficiency in cloud platforms (e.g., AWS, GCP, Azure) and container orchestration (e.g., Docker, Kubernetes); strong knowledge of Linux/Unix system administration; understanding of networking protocols, VPNs, and firewalls; participation in on-call rotations to provide 24/7 support for critical systems. Security Knowledge: Strong understanding of security best practices, especially within blockchain environments. Soft Skills: Excellent problem-solving abilities, attention to detail, strong communication skills, and a proactive, team-oriented mindset. Experience working with consensus protocols and node architecture.
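As a hedged illustration of the Prometheus/Grafana monitoring this role calls for, a tiny Python exporter using prometheus_client that publishes a node-health gauge for Prometheus to scrape; the metric name, port, and health probe are hypothetical:

```python
import random
import time

from prometheus_client import Gauge, start_http_server

# Hypothetical metric: 1 when the blockchain node answers a health probe, else 0.
NODE_UP = Gauge("blockchain_node_up", "Whether the node health probe succeeded")


def probe_node() -> bool:
    # Placeholder probe; a real exporter would call the node's RPC/health endpoint.
    return random.random() > 0.05


if __name__ == "__main__":
    start_http_server(8000)  # Prometheus scrapes http://host:8000/metrics
    while True:
        NODE_UP.set(1 if probe_node() else 0)
        time.sleep(15)
```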

Posted 1 month ago

Apply

12 - 15 years

10 - 15 Lacs

Bengaluru

Work from Office


ROLE AND RESPONSIBILITIES: Must be a Subject Matter Expert in one of the following technologies.

1. AWS IaaS: Experience in key AWS services: EC2, Simple Storage Service (S3), Virtual Private Cloud (VPC), Auto Scaling, security groups, public/private subnets, Route 53, CloudFront, snapshots, Direct Connect, NACL, Elastic Block Store (EBS), Elastic Load Balancer (ELB), Internet Gateway (IG), Transit Gateway (TG), NAT, SNS, SQS, etc. Experience in security services: AWS Identity and Access Management (IAM), CloudWatch, AWS Secrets Manager, Web Application Firewall (WAF), GuardDuty, AWS Config, CloudTrail, Amazon Inspector, AWS Shield, AWS Security Hub, Trusted Advisor, KMS, etc. Implement AWS Landing Zone using AWS Control Tower, and manage multiple AWS accounts, users and resources using Landing Zone.

2. Azure IaaS: Azure AD, Resource Groups, Virtual Machines, Containers, Virtual Networks, Storage, VNet Gateway, Site-to-Site VPN, Availability Sets, Recovery Services Vaults, Load Balancers, NSG, Azure Security Center, Azure PowerShell, ExpressRoute, Azure Monitor, Azure Advisor, Azure DNS, App Services, Policy, Blueprint, Automation Account, Log Analytics Workspace, etc.

Key skills required: Containerization such as Docker and Kubernetes. Infrastructure as Code tooling such as Terraform. Windows PowerShell, Unix or Linux bash scripting. At least one Continuous Integration and Continuous Deployment (CI/CD) pipeline tool such as GitHub or Cloud Build. Serverless technologies such as AWS Lambda or Azure Functions.

QUALIFICATIONS AND CERTIFICATION REQUIREMENTS: Work experience and educational background that a candidate should have when applying for the position: 12-15 years of experience. AWS Certified Solutions Architect, AWS Certified Professional, Azure Certified Solutions Architect, Google Certified Professional (GCP). Looking for immediate joiners.

Posted 1 month ago

Apply

2 - 5 years

4 - 8 Lacs

Pune

Work from Office


About The Role: The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and also focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive, and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate team vision and clear objectives.

Process Manager - Roles and responsibilities: Designing and implementing scalable, reliable, and maintainable data architectures on AWS. Developing data pipelines to extract, transform, and load (ETL) data from various sources into AWS environments. Creating and optimizing data models and schemas for performance and scalability using AWS services like Redshift, Glue, Athena, etc. Integrating AWS data solutions with existing systems and third-party services. Monitoring and optimizing the performance of AWS data solutions, ensuring efficient query execution and data retrieval. Implementing data security and encryption best practices in AWS environments. Documenting data engineering processes, maintaining data pipeline infrastructure, and providing support as needed. Working closely with cross-functional teams including data scientists, analysts, and stakeholders to understand data requirements and deliver solutions.

Technical and Functional Skills: Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments. Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc. Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java. Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift. Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines. Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts. Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS. Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform). Ability to analyze complex technical problems and propose effective solutions. Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
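For illustration only, querying data lake tables through Athena, one of the AWS services this role lists, can be driven from Python with boto3; the database, table, region, and result bucket below are hypothetical placeholders:

```python
import time
import boto3

athena = boto3.client("athena", region_name="ap-south-1")  # assumed region

# Hypothetical database, table, and query-result bucket.
query = athena.start_query_execution(
    QueryString="SELECT event_date, COUNT(*) AS events FROM raw_events GROUP BY event_date",
    QueryExecutionContext={"Database": "analytics_lake"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
execution_id = query["QueryExecutionId"]

# Poll until the query finishes.
while True:
    status = athena.get_query_execution(QueryExecutionId=execution_id)
    state = status["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=execution_id)["ResultSet"]["Rows"]
    for row in rows:
        print([col.get("VarCharValue") for col in row["Data"]])
```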

Posted 1 month ago

Apply

1 - 4 years

2 - 6 Lacs

Pune

Work from Office


About The Role: The candidate must possess knowledge relevant to the functional area, act as a subject matter expert in providing advice in the area of expertise, and also focus on continuous improvement for maximum efficiency. It is vital to focus on a high standard of delivery excellence, provide top-notch service quality and develop successful long-term business partnerships with internal/external customers by identifying and fulfilling customer needs. He/she should be able to break down complex problems into logical and manageable parts in a systematic way, generate and compare multiple options, and set priorities to resolve problems. The ideal candidate must be proactive, and go beyond expectations to achieve job results and create new opportunities. He/she must positively influence the team, motivate high performance, promote a friendly climate, give constructive feedback, provide development opportunities, and manage career aspirations of direct reports. Communication skills are key here, to explain organizational objectives, assignments, and the big picture to the team, and to articulate team vision and clear objectives.

Process Manager - Roles and responsibilities: Designing and implementing scalable, reliable, and maintainable data architectures on AWS. Developing data pipelines to extract, transform, and load (ETL) data from various sources into AWS environments. Creating and optimizing data models and schemas for performance and scalability using AWS services like Redshift, Glue, Athena, etc. Integrating AWS data solutions with existing systems and third-party services. Monitoring and optimizing the performance of AWS data solutions, ensuring efficient query execution and data retrieval. Implementing data security and encryption best practices in AWS environments. Documenting data engineering processes, maintaining data pipeline infrastructure, and providing support as needed. Working closely with cross-functional teams including data scientists, analysts, and stakeholders to understand data requirements and deliver solutions.

Technical and Functional Skills: Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments. Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc. Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java. Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake/Amazon Redshift. Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines. Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts. Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS. Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform). Ability to analyze complex technical problems and propose effective solutions. Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.

Posted 1 month ago

Apply

2 - 5 years

4 - 8 Lacs

Pune

Work from Office

Naukri logo

About The Role
Process Manager - AWS Data Engineer
Mumbai/Pune | Full-time (FT) | Technology Services
Shift Timings - EMEA (1pm-9pm) | Management Level - PM | Travel Requirements - NA

The ideal candidate must possess in-depth functional knowledge of the process area and apply it to operational scenarios to provide effective solutions. The role enables them to identify discrepancies and propose optimal solutions by using a logical, systematic, and sequential methodology. It is vital to be open-minded towards inputs and views from team members and to effectively lead, control, and motivate groups towards company objectives. Additionally, the candidate must be self-directed, proactive, and seize every opportunity to meet internal and external customer needs and achieve customer satisfaction by effectively auditing processes, implementing best practices and process improvements, and utilizing the frameworks and tools available. Goals and thoughts must be clearly and concisely articulated and conveyed, verbally and in writing, to clients, colleagues, subordinates, and supervisors.

Process Manager - Roles and responsibilities:
Understand client requirements and provide effective and efficient solutions in AWS using Snowflake.
Assemble large, complex sets of data that meet non-functional and functional business requirements.
Architect and design data pipelines using Snowflake / Redshift and consolidate data on the data lake and data warehouse.
Demonstrated strength and experience in data modeling, ETL development, and data warehousing concepts.
Understand data pipelines and modern ways of automating data pipelines using cloud-based tooling.
Test and clearly document implementations, so others can easily understand the requirements, implementation, and test conditions.
Perform data quality testing and assurance as a part of designing, building, and implementing scalable data solutions in SQL.

Technical and Functional Skills:
AWS Services: Strong experience with AWS data services such as S3, EC2, Redshift, Glue, Athena, EMR, etc.
Programming Languages: Proficiency in programming languages commonly used in data engineering such as Python, SQL, Scala, or Java.
Data Warehousing: Experience in designing, implementing, and optimizing data warehouse solutions on Snowflake / Amazon Redshift.
ETL Tools: Familiarity with ETL tools and frameworks (e.g., Apache Airflow, AWS Glue) for building and managing data pipelines.
Database Management: Knowledge of database management systems (e.g., PostgreSQL, MySQL, Amazon Redshift) and data lake concepts.
Big Data Technologies: Understanding of big data technologies such as Hadoop, Spark, Kafka, etc., and their integration with AWS.
Version Control: Proficiency in version control tools like Git for managing code and infrastructure as code (e.g., CloudFormation, Terraform).
Problem-solving Skills: Ability to analyze complex technical problems and propose effective solutions.
Communication Skills: Strong verbal and written communication skills for documenting processes and collaborating with team members and stakeholders.
Education and Experience: Typically, a bachelor's degree in Computer Science, Engineering, or a related field is required, along with 5+ years of experience in data engineering and AWS cloud environments.

About eClerx
eClerx is a global leader in productized services, bringing together people, technology and domain expertise to amplify business results. Our mission is to set the benchmark for client service and success in our industry. Our vision is to be the innovation partner of choice for technology, data analytics and process management services. Since our inception in 2000, we've partnered with top companies across various industries, including financial services, telecommunications, retail, and high-tech. Our innovative solutions and domain expertise help businesses optimize operations, improve efficiency, and drive growth. With over 18,000 employees worldwide, eClerx is dedicated to delivering excellence through smart automation and data-driven insights. At eClerx, we believe in nurturing talent and providing hands-on experience.

About eClerx Technology
eClerx's Technology Group collaboratively delivers Analytics, RPA, AI, and Machine Learning digital technologies that enable our consultants to help businesses thrive in a connected world. Our consultants and specialists partner with our global clients and colleagues to build and implement digital solutions through a broad spectrum of activities. To know more about us, visit https://eclerx.com

eClerx is an Equal Opportunity Employer. All qualified applicants will receive consideration for employment without regard to race, color, religion, sex, national origin, disability or protected veteran status, or any other legally protected basis, per applicable law.
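To illustrate the kind of S3-to-Redshift consolidation step referenced above, here is a hedged sketch using the Amazon Redshift Data API via boto3 to run a COPY statement; the cluster, database, table, bucket, and IAM role names are all hypothetical placeholders.

"""Illustrative sketch: load S3 data into Redshift with a COPY statement via the
Redshift Data API. All identifiers below are hypothetical placeholders."""
import time
import boto3

rsd = boto3.client("redshift-data", region_name="ap-south-1")

COPY_SQL = """
    COPY sales.orders
    FROM 's3://example-curated-bucket/sales/orders/'
    IAM_ROLE 'arn:aws:iam::123456789012:role/example-redshift-copy-role'
    FORMAT AS PARQUET;
"""

def run_statement(sql: str) -> str:
    """Submit a SQL statement and poll until it reaches a terminal status."""
    stmt_id = rsd.execute_statement(
        ClusterIdentifier="example-cluster",  # hypothetical cluster
        Database="analytics",                 # hypothetical database
        DbUser="etl_user",                    # hypothetical database user
        Sql=sql,
    )["Id"]
    while True:
        status = rsd.describe_statement(Id=stmt_id)["Status"]
        if status in ("FINISHED", "FAILED", "ABORTED"):
            return status
        time.sleep(5)

if __name__ == "__main__":
    print("COPY finished with status:", run_statement(COPY_SQL))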

Posted 1 month ago

Apply

12 - 18 years

20 - 30 Lacs

Bengaluru

Hybrid

Naukri logo

Key Skills: AWS Cloud, Architect

Key Responsibilities:
Design and implement end-to-end AWS cloud infrastructure solutions to meet business and technical requirements.
Lead cloud strategy, planning, and architecture design for complex enterprise environments.
Migrate and modernize legacy applications and infrastructure to AWS using best practices.
Develop Infrastructure as Code (IaC) using tools like CloudFormation, Terraform, or AWS CDK.
Build and manage CI/CD pipelines, automating deployments using tools like AWS CodePipeline, Jenkins, GitLab CI/CD, etc.
Implement cloud security, monitoring, alerting, and logging solutions using AWS-native tools (CloudTrail, CloudWatch, GuardDuty, etc.).
Optimize cloud cost, resource utilization, and performance.
Provide technical leadership, mentor junior engineers, and drive adoption of DevOps and cloud best practices.
Troubleshoot and resolve issues related to infrastructure, networking, application deployment, and performance.
Collaborate with cross-functional teams including developers, architects, security teams, and project managers.

Experience Requirements:
12+ years of IT experience with a minimum of 5+ years in AWS cloud environments.
Expert-level understanding of core AWS services: EC2, VPC, S3, RDS, Lambda, IAM, CloudFront, ELB, ECS/EKS, etc.
Strong hands-on experience with Infrastructure as Code (Terraform, AWS CloudFormation).
Expertise in Linux/Unix system administration and scripting (Bash, Python, PowerShell).
Proficiency in setting up and managing CI/CD pipelines and automation tools.
Deep understanding of networking, security, IAM policies, encryption, and compliance in cloud environments.
Strong knowledge of DevOps principles and tools.
Experience with monitoring tools like CloudWatch, Datadog, Prometheus, or Grafana.
Familiarity with multi-account AWS environments, AWS Organizations, and landing zone architectures.
Exposure to containerization and orchestration: Docker, Kubernetes, ECS/EKS.

Preferred Certifications:
AWS Certified Solutions Architect - Professional.
AWS Certified DevOps Engineer - Professional.
AWS Certified Security - Specialty (optional but highly valued).

Soft Skills:
Excellent communication and stakeholder management skills.
Strong leadership and mentoring capabilities.
Strategic thinking with hands-on execution ability.
Ability to work in a fast-paced, dynamic environment.

Qualifications: B.Tech in Computer Science.
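As an illustration of the Infrastructure as Code work described above, here is a minimal AWS CDK v2 (Python) sketch that provisions a VPC and a hardened S3 bucket; the stack and construct names are hypothetical, and Terraform or CloudFormation would be equally valid choices.

"""Illustrative AWS CDK v2 (Python) sketch: one VPC and one hardened S3 bucket.
Stack and construct IDs are hypothetical placeholders."""
from aws_cdk import App, Stack, RemovalPolicy
from aws_cdk import aws_ec2 as ec2, aws_s3 as s3
from constructs import Construct

class ExampleDataPlatformStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Two-AZ VPC with public and private subnets (CDK defaults).
        ec2.Vpc(self, "PlatformVpc", max_azs=2)

        # Versioned, SSE-S3 encrypted bucket with all public access blocked.
        s3.Bucket(
            self, "DataBucket",
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
            block_public_access=s3.BlockPublicAccess.BLOCK_ALL,
            removal_policy=RemovalPolicy.RETAIN,  # keep data if the stack is deleted
        )

app = App()
ExampleDataPlatformStack(app, "ExampleDataPlatformStack")
app.synth()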

Posted 1 month ago

Apply

3 - 6 years

2 - 6 Lacs

Noida, Mohali

Work from Office

Naukri logo

We're looking for a skilled Cloud Presales Engineer with 3–6 years of experience in designing and presenting cloud solutions on Azure and AWS. You’ll support the sales team by understanding client needs, proposing the right cloud architecture, and showcasing technical value.

Key Responsibilities:
Collaborate with sales to understand client requirements.
Gather requirements from clients and understand their pain points.
Design and present Azure & AWS cloud solutions as per the client's requirements.
Create BoMs, proposals, demos, and technical documentation.
Advise clients on how to optimize cloud costs.
Work closely with the delivery team to implement the solution for the client.
Support RFP/bid responses and cost estimation.
Advise on best practices, architecture, and cloud strategy.

Requirements:
3–6 years in cloud presales or solutions engineering.
Strong expertise in Azure and AWS.
Solid understanding of IaaS, PaaS, security, and networking.
Good understanding of operating systems and database licenses.
Excellent communication and presentation skills.
Relevant cloud certifications (preferred).
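For the cloud cost-optimization conversations mentioned above, here is a hedged sketch using the AWS Cost Explorer API via boto3 to break down a month's spend by service; the date range is a hypothetical example, and the Azure Cost Management APIs offer an equivalent view on the Azure side.

"""Illustrative sketch: summarise AWS spend by service with Cost Explorer.
The date range is a hypothetical example; Cost Explorer must be enabled on the account."""
import boto3

ce = boto3.client("ce", region_name="us-east-1")  # Cost Explorer is served from us-east-1

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-04-01", "End": "2024-05-01"},  # hypothetical month
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

# Print only services that actually incurred cost in the period.
for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    if amount > 0:
        print(f"{service}: ${amount:,.2f}")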

Posted 1 month ago

Apply

4 - 9 years

11 - 15 Lacs

Bengaluru

Work from Office

Naukri logo

Wissen Is Hiring
Experience: 4+ Years | Notice: Immediate | Location: Bangalore

Job Description
As part of the End-to-End Digital Customer Relationships, installed base tracking has become one of the key pillars to drastically grow Services revenues: both field services and digital services. Schneider Electric is seeking a highly skilled and experienced AWS Developer with strong AWS Cloud skills, strong development skills in NodeJs, and proficiency in GitHub to join our Installed Base Team. The ideal candidate will have a robust understanding and hands-on experience with building, deploying, and maintaining microservices in AWS such as Lambda, DynamoDB, API Gateway, etc. You will play a crucial role in developing services and ensuring our infrastructure is maintained and secure. Experience with IaC/DevOps tools such as Terraform and CloudFormation is a plus.

Schneider Installed Base (IB) is the central data platform where all the product asset data that we track is collected, qualified, consolidated and exposed. Under the IB AWS lead, the IB AWS Cloud Developer is a key member of the Installed Base custom development team, whose role is to:
- Work with the IB AWS lead on the design of the architectures for the new capabilities hosted in our AWS platform.
- Design and develop microservices using NodeJs.
- Previous experience with Python following OOP is a plus.
- Evaluate product requirements for operational feasibility and create detailed specifications based on user stories.
- Write clean, efficient, high-quality, secure, testable, maintainable code based on specifications.
- Coordinate with stakeholders (Product Owner, Scrum Master, Architect, Quality and DevOps teams) to ensure successful execution of the project.
- Troubleshoot and resolve issues related to the infrastructure.
- Ensure best practices are followed in cloud services, focusing on scalability, maintainability, and security.
- Keep abreast of the latest advancements in AWS cloud technologies and trends to recommend process improvements and technology upgrades.
- Mentor and provide guidance to junior team members, fostering a culture of continuous learning and innovation.
- Participate in architecture review board meetings and make strategic recommendations for the choice of services.

Qualifications
- 3+ years of experience working with NodeJs along with AWS Cloud Services, or in a similar role.
- Master's degree in Computer Science with a focus on Cloud/Data or equivalent (or a Bachelor's with more years of experience).
- Comprehensive knowledge and hands-on experience with NodeJs and AWS Cloud services, especially Lambda, serverless, API Gateway, SQS, SNS, SES, DynamoDB, and CloudWatch.
- Knowledge of best practices in Python is a plus.
- Knowledge of branching and version control systems like Git (mandatory).
- Experience with IaC tools such as Terraform and/or CloudFormation is a plus.
- Proficient in data structures and algorithms.
- Excellent collaboration skills.
- A desire for continuous learning and staying updated with emerging technologies.

Skills
- Due to the nature of this position sitting on a global team, fluent English communication skills (written & spoken) are required.
- Strong interpersonal skills, with the ability to communicate and convince at various levels of the organization, and in a multicultural environment.
- Ability to effectively multi-task and manage priorities.
- Strong analytical and synthesis skills.
- Initiative to uncover and solve problems proactively.
- Ability to understand complex software development environments.
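To illustrate the kind of serverless microservice described above (the posting is NodeJs-first, so treat this Python version as a language-agnostic sketch), here is a minimal Lambda handler that persists an asset record to DynamoDB; the table name and event shape are hypothetical placeholders.

"""Illustrative sketch: a Lambda handler that writes an installed-base asset
record to DynamoDB. Table name and event shape are hypothetical placeholders."""
import json
import os
import boto3

# Table name injected via environment variable in a real deployment (hypothetical default here).
TABLE_NAME = os.environ.get("ASSETS_TABLE", "example-installed-base-assets")
table = boto3.resource("dynamodb").Table(TABLE_NAME)

def handler(event, context):
    """Expects an API Gateway proxy event whose body holds an asset payload."""
    asset = json.loads(event.get("body") or "{}")
    if "assetId" not in asset:
        return {"statusCode": 400, "body": json.dumps({"error": "assetId is required"})}

    # Upsert the record; assumes assetId is the table's partition key (hypothetical schema).
    table.put_item(Item=asset)
    return {"statusCode": 201, "body": json.dumps({"assetId": asset["assetId"]})}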

Posted 1 month ago

Apply

5 - 7 years

20 - 30 Lacs

Pune

Work from Office

Naukri logo

Role & Responsibility: Azure Cloud Migration Expert
An Azure Cloud Migration Expert is responsible for planning, designing, and executing the migration of on-premises applications and infrastructure, or workloads hosted with other public/private cloud providers, to the Azure cloud. They ensure seamless transitions, optimization, and integrity, and adhere to the Azure Well-Architected Framework during and after the migration process.

Key Responsibilities:
Assessment and Planning: Evaluate existing systems (on-premises, AWS, GCP, etc.) and associated enabling capabilities (identity, security, HA/DR, monitoring, backup/restore, reporting, integrations, etc.). Design and develop comprehensive migration strategies and plans. Evaluate, recommend, and implement the 7 Rs cloud migration strategies - rehost, replatform, refactor, repurchase, retire, retain, and relocate.
Migration Execution: Manage and execute the migration process, ensuring minimal downtime and data integrity, using tools like Azure Migrate.
Cloud Infrastructure Management: Configure, optimize, and monitor Azure resources, including but not limited to virtual machines, AKS, storage, networking, and other services.
Technical Expertise: Provide technical guidance to project teams, troubleshoot issues, and ensure compliance with cloud security best practices.
Technical Leadership: Develop, train, and build internal teams with Azure skills and build a practice/Center of Excellence.
Post-Migration Support: Provide documentation, training, and ongoing support to internal teams and clients.
Optimization and Cost Efficiency: Continuously monitor and optimize cloud infrastructure performance and cost-efficiency.
Collaboration: Work with cross-functional teams (developers, IT, security, compliance) to ensure seamless integration and alignment.

Required Skills:
Azure expertise: Proficiency in Azure services, architecture, and best practices.
AWS/Public Cloud awareness: Good working understanding of AWS or other public cloud providers.
Cloud Architecture and Design: Good understanding of architecting cloud solutions: cloud-native design, microservices frameworks.
Cloud Native Skills: In-depth knowledge and experience with technologies like Docker, Kubernetes, Packer.
Cloud migration tools: Experience with Azure Migrate, Site Recovery, and other relevant tools.
Networking and security: Strong understanding of cloud networking, security protocols, and compliance.
Scripting and automation: Proficiency in scripting languages (PowerShell, Python) for automating tasks and infrastructure management. Experience in Azure Automation and Azure DevOps.
Problem-solving and analytical skills: Ability to diagnose issues, develop solutions, and analyze data.
Communication and collaboration: Excellent communication skills for interacting with stakeholders and cross-functional teams.

Experience:
Minimum 2-3 years of experience in cloud migration projects with Azure or other cloud providers; overall, 5-7 years of experience.
Experience with cloud architecture and services, Azure migration, automation and DevOps tools.
Experience in security and compliance, observability, monitoring, SIEM, SOAR, SRE.
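To illustrate the assessment and planning step described above, here is a hedged Python sketch using the Azure SDK to inventory the virtual machines in a subscription (the same data could equally come from Azure Migrate or PowerShell); the subscription ID is a hypothetical placeholder.

"""Illustrative sketch: inventory VMs in an Azure subscription as a migration
assessment input. The subscription ID is a hypothetical placeholder."""
from azure.identity import DefaultAzureCredential
from azure.mgmt.compute import ComputeManagementClient

SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"  # hypothetical subscription

credential = DefaultAzureCredential()  # picks up env vars, managed identity, or az login
compute = ComputeManagementClient(credential, SUBSCRIPTION_ID)

# List every VM with its region and size as a starting point for a migration inventory.
for vm in compute.virtual_machines.list_all():
    size = vm.hardware_profile.vm_size if vm.hardware_profile else "unknown"
    print(f"{vm.name}\t{vm.location}\t{size}")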

Posted 1 month ago

Apply

7 - 10 years

0 - 0 Lacs

Bengaluru

Work from Office

Naukri logo

Job Purpose
We are seeking a Senior Lead Database Engineer to lead the design, development and maintenance of our company's databases. The Senior Lead Database Engineer will be responsible for ensuring the performance, security and integrity of our databases, as well as providing guidance and mentorship to junior team members.

Job Responsibilities
Lead the design, development and maintenance of databases.
Ensure the performance, security and integrity of databases.
Provide guidance and mentorship to junior team members.
Collaborate with other departments to gather requirements and provide solutions.
Develop and implement backup and recovery procedures.
Monitor and optimize database performance.
Identify and resolve database-related issues.
Keep up to date with new technologies and industry trends.

Total years of experience: 10-12
Education: B.Tech in Computer Science or a related discipline preferred.

Key Skills
Strong knowledge of SQL and database management systems such as MySQL, PostgreSQL, or Oracle.
Experience with database security, backup and recovery procedures.
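To illustrate the backup-and-recovery responsibilities above, here is a hedged Python sketch that wraps pg_dump to produce a date-stamped, custom-format PostgreSQL backup; the connection details and paths are hypothetical, and managed-database snapshots are often preferred in practice.

"""Illustrative sketch: date-stamped PostgreSQL backup via pg_dump.
Connection details and paths are hypothetical placeholders."""
import datetime
import pathlib
import subprocess

BACKUP_DIR = pathlib.Path("/var/backups/postgres")  # hypothetical backup location

def backup_database(dbname: str, host: str = "localhost", user: str = "backup_user") -> pathlib.Path:
    """Dump one database to a timestamped file and return the file path."""
    BACKUP_DIR.mkdir(parents=True, exist_ok=True)
    stamp = datetime.datetime.now().strftime("%Y%m%d_%H%M%S")
    target = BACKUP_DIR / f"{dbname}_{stamp}.dump"
    # Custom-format dump (-Fc) supports selective restore with pg_restore.
    subprocess.run(
        ["pg_dump", "-h", host, "-U", user, "-Fc", "-f", str(target), dbname],
        check=True,  # raise if pg_dump exits non-zero
    )
    return target

if __name__ == "__main__":
    print("Backup written to", backup_database("appdb"))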

Posted 1 month ago

Apply

8 - 10 years

25 - 30 Lacs

Pune

Work from Office

Naukri logo

Oracle ADF, Java, Spring Boot, PL/SQL, AWS Cloud
Design, develop, and implement Oracle Application Development Framework (ADF) applications.
Customize ADF components based on business needs and requirements.
Test, debug, and optimize ADF applications for performance and functionality.
Collaborate with teams to integrate ADF solutions with other systems.
Document development processes and ensure adherence to best practices.

Posted 1 month ago

Apply

5 - 10 years

10 - 14 Lacs

Pune, Mumbai (All Areas)

Hybrid

Naukri logo

ETL QA with 5+ years of relevant experience; knowledge of ETL testing along with AWS experience will be preferred.
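To illustrate the kind of ETL validation implied above, here is a hedged, self-contained sketch of the reconciliation checks (row counts and mandatory-column nulls) an ETL QA might automate; the sample records and column names are hypothetical, and real checks would query the source system and the AWS target instead.

"""Illustrative sketch: simple source-vs-target ETL reconciliation checks.
Sample records and column names are hypothetical; real checks would query the
source system and the AWS target (e.g., via Athena or Redshift) instead."""

def row_count_matches(source_rows, target_rows) -> bool:
    """Row counts must be identical after a full load."""
    return len(source_rows) == len(target_rows)

def no_nulls_in(rows, column: str) -> bool:
    """A mandatory column must never be null/missing in the target."""
    return all(row.get(column) is not None for row in rows)

if __name__ == "__main__":
    source = [{"order_id": 1, "amount": 100.0}, {"order_id": 2, "amount": 55.5}]
    target = [{"order_id": 1, "amount": 100.0}, {"order_id": 2, "amount": 55.5}]

    assert row_count_matches(source, target), "row count mismatch between source and target"
    assert no_nulls_in(target, "order_id"), "null order_id found in target"
    print("ETL reconciliation checks passed")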

Posted 1 month ago

Apply