
924 AWS Lambda Jobs - Page 7

JobPe aggregates job listings for easy access, but you apply directly on the original job portal.

5.0 - 9.0 years

0 Lacs

Hyderabad, Telangana

On-site

You must be an energetic self-starter with a desire to learn new things quickly. In this position, you will work with a team of talented engineers to innovate, implement, and support IDaptive Application Services. Duties will primarily revolve around building software by writing code, as well as modifying software to fix errors, adapt it to new hardware, improve its performance, or upgrade interfaces.

Responsibilities:
- Help guide and contribute to feature design and implementation to bring the product to the next level.
- Participate in continuous and iterative engineering cycles with emphasis on code quality, supportability, scalability, and performance.
- Develop and review unit test cases to ensure comprehensive unit testing.
- Diagnose and fix product issues found internally or in the field.
- Interface with Support to handle customer escalations.
- Mentor junior members of the team in their assigned tasks and their technical skills development.

Requirements:
- 5-8 years of enterprise-scale, hands-on software development experience, with the most recent experience preferably in a cloud/SaaS environment.
- BS in Computer Science or an equivalent combination of technical education and work experience.
- Expertise with C# and ASP.NET MVC.
- Expertise and hands-on experience with web services (e.g., REST, SOAP).
- Experience with JavaScript, CSS, and HTML.
- Familiarity with the software release lifecycle, source code management, and defect management methodologies.
- Proficient understanding of SQL and relational databases.
- Ability to design, develop, and optimize scalable cloud services (AWS Lambda, serverless).
- Working experience with Active Directory and/or LDAP a plus.
- Knowledge of authentication standards such as SAML, WS-Fed, OpenID, or OAuth a plus.
- Solid understanding of security and networking implementation and best practices.
- Ability to complete highly detailed tasks with strict attention to detail, quality, and timeliness.
- Ability to lead technical discussions and guide junior engineers.
- Strong organizational and self-management skills.
- Excellent analytical and troubleshooting skills.
- Excellent oral and written communication skills.

Posted 3 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Lucknow, Uttar Pradesh

On-site

As a member of our team, you will be responsible for building APIs that can efficiently handle thousands of requests per second, ensuring reliability and performance. Your role will involve delivering full features end to end, from CMS data modeling to production, as well as participating in technical design and code reviews. You will also maintain and enhance internal tooling and approach complex technical challenges with a solution-driven mindset.

We are looking for a solution-oriented, outcome-focused engineer with experience in TypeScript, Rust, JavaScript, or similar modern languages. A solid understanding of cloud-native engineering practices such as continuous integration and delivery, walking-skeleton architecture, and pipeline automation is essential. You should be capable of owning the entire lifecycle from development to deployment and possess working knowledge of MySQL and MSSQL databases. Bonus skills include backend development with TypeScript, exposure to Rust and WebAssembly, experience with RESTful APIs and messaging technologies, AWS-native solution building, CI/CD automation, Docker, Kubernetes, and OpenSearch.

Joining our team will allow you to solve real-world, high-impact problems using an advanced tech stack. You will collaborate with a skilled and supportive engineering team in a flexible work culture that accommodates both onsite and remote work. You will be valued for your innovation, ownership, and pursuit of excellence.

Posted 3 weeks ago

Apply

6.0 - 10.0 years

0 Lacs

Punjab

On-site

As a Python + AWS Developer at Orion eSolutions, you will be part of a custom software development and DevOps consulting company in Canada that focuses on enabling businesses to operate through digital transformation. With a dedicated team of 160+ custom software and app developers, as well as certified cloud and DevOps engineers, we are committed to helping our customers transform their vision into scalable and efficient products. Over the last decade, Orion has collaborated with numerous well-funded start-ups and Fortune 500 companies in building their digital products.

We are currently seeking a full-time Online Bidder with experience in selling IT services, particularly fixed-price projects, to small and medium businesses in North America. In this role, you will be provided with actual case studies and marketing materials to align with our strategies. Your primary responsibility will be to meet the annual sales goals set for the company by identifying new prospects, nurturing sales leads, and converting them into paid users or customers, in collaboration with the company's leadership team in Toronto.

Key Responsibilities:
- Develop applications using Python, AWS Lambda, SNS, and the AWS Aurora database (a minimal sketch follows this listing)
- Demonstrate a deep understanding of application development through advanced Python programming and object-oriented concepts
- Utilize ORM tools, with a preference for SQLAlchemy
- Proficiency in GraphQL, API authentication, performance management, and internationalization
- Work within a Scrum team, utilizing Git and branching mechanisms effectively
- Possess 6+ years of experience in the field

If you are passionate about application development, have a strong grasp of Python and AWS technologies, and thrive in a collaborative team environment, we encourage you to apply for the Python + AWS Developer position at Orion eSolutions.
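As context for the stack this posting names (Python, AWS Lambda, SNS), here is a minimal hedged sketch of a Lambda handler publishing to an SNS topic; the topic ARN, environment variable, and event fields are placeholder assumptions, not Orion's actual code.

```python
import json
import os

import boto3

sns = boto3.client("sns")
# Hypothetical topic; real deployments would inject this via configuration.
TOPIC_ARN = os.environ.get("ORDER_TOPIC_ARN", "arn:aws:sns:us-east-1:123456789012:orders")


def handler(event, context):
    # API Gateway proxy integration delivers the request body as a JSON string.
    order = json.loads(event.get("body") or "{}")
    if "order_id" not in order:
        return {"statusCode": 400, "body": json.dumps({"error": "order_id required"})}

    # Fan the event out to downstream consumers via SNS.
    sns.publish(
        TopicArn=TOPIC_ARN,
        Message=json.dumps(order),
        Subject=f"Order {order['order_id']} received",
    )
    return {"statusCode": 202, "body": json.dumps({"status": "queued"})}
```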

Posted 3 weeks ago

Apply

5.0 - 9.0 years

0 Lacs

Haryana

On-site

As a Lead Cloud App Developer at Wipro Limited, a leading technology services and consulting company specializing in innovative solutions for complex digital transformation needs, you will join a global organization spanning over 65 countries with more than 230,000 employees and partners committed to helping clients, colleagues, and communities thrive in a dynamic world.

You will need expertise in Terraform, AWS, and DevOps, and should hold certifications such as AWS Certified Solutions Architect - Associate and AWS Certified DevOps Engineer - Professional. Drawing on more than 6 years of IT experience, you will set up and maintain ECS solutions and design AWS solutions with services such as VPC, EC2, WAF, ECS, ALB, IAM, and KMS.

You are also expected to have experience with AWS services such as SNS, SQS, EventBridge, RDS, Aurora, Postgres, DynamoDB, Redis, AWS Glue jobs, and AWS Lambda; CI/CD using Azure DevOps; GitHub for source code management; and building cloud-native applications. Your responsibilities will include working with container technologies such as Docker, configuring logging and monitoring solutions such as CloudWatch and OpenSearch, and managing system configurations using Terraform and Terragrunt.

In addition to technical skills, you should possess strong communication and collaboration abilities, be a team player, have excellent analytical and problem-solving skills, and understand Agile methodologies. Your role will also involve training others in procedural and technical topics, recommending process and architecture improvements, and troubleshooting distributed systems.

Join us at Wipro to be a part of our journey to reinvent our business and industry. We are looking for individuals who are inspired by reinvention and committed to evolving themselves, their careers, and their skills. Be a part of a purpose-driven business that empowers you to shape your reinvention. Realize your ambitions at Wipro, where applications from individuals with disabilities are warmly welcomed.

Experience Required: 5-8 Years

To learn more about Wipro Limited, visit www.wipro.com.

Posted 3 weeks ago

Apply

2.0 - 6.0 years

0 Lacs

Kolkata, West Bengal

On-site

You are a passionate and customer-obsessed AWS Solutions Architect looking to join Workmates, the fastest-growing partner to the world's major cloud provider, AWS. Your role will involve driving innovation, building differentiated solutions, and defining new customer experiences to help customers maximize their AWS potential in their cloud journey. Working alongside industry specialist organizations and technology groups, you will play a key role in leading our customers towards native cloud transformation.

Choosing Workmates and the AWS Practice will enable you to elevate your AWS experience and skills in an innovative and collaborative environment, at the forefront of cloud advancements. People are our biggest asset at Workmates, and together we aim to achieve best-in-class cloud-native operations. Be part of our mission to drive innovation across Cloud Management, Media, DevOps, Automation, IoT, Security, and more, where independence and ownership are valued, allowing you to thrive and contribute your best.

Responsibilities:
- Building and maintaining cloud infrastructure environments
- Ensuring availability, performance, security, and scalability of production systems
- Collaborating with application teams to implement DevOps practices
- Creating solution prototypes and conducting proofs of concept for new tools
- Designing repeatable, automated, and scalable processes to enhance efficiency
- Automating and streamlining operations and processes
- Troubleshooting and diagnosing issues/outages and providing operational support
- Engaging in incident handling and supporting a culture of post-mortems and knowledge sharing

Requirements:
- 2+ years of hands-on experience in building and supporting large-scale environments
- Strong architecting and implementation experience with AWS Cloud
- Proficiency in AWS CloudFormation and Terraform
- Experience with Docker containers and container environment deployment
- Good understanding of and work experience with Kubernetes and EKS
- Sysadmin and infrastructure background (Linux internals, filesystems, networking)
- Proficiency in scripting, particularly writing Bash scripts
- Familiarity with CI/CD pipeline build and release
- Experience with CI/CD tools such as Jenkins, GitLab, or TravisCI
- Hands-on experience with AWS developer tools such as CodePipeline, CodeBuild, CodeDeploy, AWS Lambda, and AWS Step Functions
- Experience with log management solutions (ELK/EFK or similar)
- Experience with configuration management tools such as Ansible
- Proficiency in modern monitoring and alerting tools such as CloudWatch, Prometheus, Grafana, and Opsgenie
- Strong passion for automating routine tasks and solving production issues
- Experience in automation testing, script generation, and integration with CI/CD
- Familiarity with AWS security features (IAM, Security Groups, KMS, etc.)
- Good to have: experience with database technologies (MongoDB, MySQL, etc.)

Desired Skills:
- AWS Professional certifications
- CKA/CKAD certifications
- Knowledge of Python/Go
- Experience with service mesh and distributed tracing
- Familiarity with Scrum/Agile methodology

Join Workmates and be part of a team that values innovation, collaboration, and continuous improvement in the cloud technology landscape. Your expertise and skills will play a crucial role in driving customer success and shaping the future of cloud solutions.

Posted 3 weeks ago

Apply

3.0 - 6.0 years

0 - 1 Lacs

Hyderabad

Remote

Role & responsibilities:
- Work with product owners and engineering managers to understand the product roadmap.
- Contribute to technical specification artefacts, documentation, and diagrams (HLD, LLD, TRD), and provide technical and functional recommendations accordingly.
- Design and develop massively scalable, event-driven, microservices-based, high-performance cloud-native applications, making the right tradeoffs for risk and long-term maintainability.
- Collaborate with your engineering manager, cross-functional stakeholders, and team members to solve complex technical problems.
- Follow good coding practices, agile engineering processes, and the DevSecOps/SRE toolchain, complying with existing quality standards.
- Set high quality standards for production code by performing diligent code reviews and ensuring rigorous test coverage.
- Ensure feature KPIs/metrics and release objectives are met by delivering high-quality code and products.

Preferred candidate profile:
- Minimum 3+ years of experience with deep technical knowledge and hands-on coding skills.
- A computer science, engineering, or equivalent degree from IIT/IIM/NIT/REC or similar.
- Expert knowledge of computer science, with strong competencies in OOP concepts, data structures, algorithms, multi-threading, concurrent systems, memory management, microservices, and full-stack software design.
- Extensive work in the Golang ecosystem and allied languages/tools.
- High proficiency and experience in several of: AWS (or other) cloud, React, Redis, Postman, Kafka, Node, Golang, Postgres, GraphQL, containerization (Docker, Kubernetes), agile methodologies, and DevOps tools (preferably Azure DevOps and the Git ecosystem).
- Strong understanding of end-to-end architectures and development frameworks; knowledge across tiers in a multi-tier cloud environment (AWS preferred).
- Demonstrated history of firm-wide impact with self-sufficiency, as well as planning and leading project delivery.
- Must have hands-on experience designing and developing with AI tools.

Posted 3 weeks ago

Apply

10.0 - 14.0 years

25 - 40 Lacs

Hyderabad

Work from Office

Face-to-face interview on 2nd August 2025 in Hyderabad. Apply here: https://careers.ey.com/job-invite/1604461/

Experience Required: Minimum 8 years

Job Summary: We are seeking a skilled Data Engineer with a strong background in data ingestion, processing, and storage. The ideal candidate will have experience working with various data sources and technologies, particularly in a cloud environment. You will be responsible for designing and implementing data pipelines, ensuring data quality, and optimizing data storage solutions.

Key Responsibilities:
- Design, develop, and maintain scalable data pipelines for data ingestion and processing using Python, Spark, and AWS services (see the sketch following this listing).
- Work with on-prem Oracle databases, batch files, and Confluent Kafka for data sourcing.
- Implement and manage ETL processes using AWS Glue and EMR for batch and streaming data.
- Develop and maintain data storage solutions using the Medallion Architecture in S3, Redshift, and Oracle.
- Collaborate with cross-functional teams to understand data requirements and deliver solutions that meet business needs.
- Monitor and optimize data workflows using Airflow and other orchestration tools.
- Ensure data quality and integrity throughout the data lifecycle.
- Implement CI/CD practices for data pipeline deployment using Terraform and other tools.
- Utilize monitoring and logging tools such as CloudWatch, Datadog, and Splunk to ensure system reliability and performance.
- Communicate effectively with stakeholders to gather requirements and provide updates on project status.

Technical Skills Required:
- Proficient in Python for data processing and automation.
- Strong experience with Apache Spark for large-scale data processing.
- Familiarity with AWS S3 for data storage and management.
- Experience with Kafka for real-time data streaming.
- Knowledge of Redshift for data warehousing solutions.
- Proficient in Oracle databases for data management.
- Experience with AWS Glue for ETL processes.
- Familiarity with Apache Airflow for workflow orchestration.
- Experience with EMR for big data processing.

Mandatory: Strong AWS data engineering skills.
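To illustrate the pipeline work described above, here is a minimal hedged sketch of a PySpark batch job of the kind such a role involves: read raw files from S3, standardize, and write partitioned Parquet to a curated (Medallion-style) layer. Bucket paths and column names are assumptions.

```python
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("claims-ingest").getOrCreate()

# Read a raw landing area (placeholder bucket and layout).
raw = spark.read.option("header", True).csv("s3://raw-bucket/claims/2025/07/")

# Bronze -> silver style cleanup: de-duplicate, tag ingest date, drop bad rows.
curated = (
    raw.dropDuplicates(["claim_id"])
       .withColumn("ingest_date", F.current_date())
       .filter(F.col("claim_id").isNotNull())
)

# Land partitioned Parquet in the curated layer.
curated.write.mode("overwrite").partitionBy("ingest_date").parquet(
    "s3://curated-bucket/claims/"
)
```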

Posted 4 weeks ago

Apply

0.0 - 1.0 years

1 - 1 Lacs

Hyderabad

Work from Office

Responsibilities:
- Develop scalable web apps with Django REST Framework and FastAPI
- Implement REST APIs using Python and AWS Lambda (see the sketch below)
- Collaborate on CI/CD pipelines with a DevOps mindset
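As an illustration of the second bullet, a minimal sketch of running a FastAPI REST API on AWS Lambda via the Mangum adapter; the route and model are invented for the example.

```python
from fastapi import FastAPI
from mangum import Mangum
from pydantic import BaseModel

app = FastAPI()


class Item(BaseModel):
    name: str
    price: float


@app.post("/items")
def create_item(item: Item):
    # Persistence is omitted; a real handler would write to a database.
    return {"created": item.name, "price": item.price}


# Lambda entry point: Mangum translates API Gateway events into ASGI requests.
handler = Mangum(app)
```

Deployed behind API Gateway, the `handler` object is the Lambda entry point; the same app runs unchanged under a local ASGI server during development.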

Posted 4 weeks ago

Apply

4.0 - 8.0 years

0 Lacs

Karnataka

On-site

We empower individuals to stay resilient and relevant in a constantly changing world, and we are seeking people who continuously explore creative ways to grow and learn, and who aspire to make a tangible impact both now and in the future. As a Software Developer (Java based) with 4 to 6 years of experience in Core Java, Spring Boot, AWS Lambda, and Node.js, you play a crucial role in designing software solutions based on requirements and within the constraints of architectural and design guidelines. Your responsibilities include deriving software requirements and functional specifications, validating software requirements, providing software feasibility analysis, and estimating software effort.

Your role involves the accurate translation of software architecture into design and code, guiding Scrum team members on design topics and ensuring implementation consistency with the design and architecture. You will actively participate in coding features and bug-fixing, delivering solutions that adhere to coding and quality guidelines for the components you own. Additionally, you will guide the team in test automation design and support its implementation.

Key Requirements:
- Proficiency in testing frameworks
- Strong knowledge of SQL, Git, and cloud computing
- Familiarity with various AWS services, the Spring Framework, and REST services
- Experience working with Git/Bitbucket
- Good to have: serverless development

This position is based in Bangalore, with opportunities to travel to other locations in India and beyond. Joining the Smart Grids and Infrastructure team as a Power System Engineer, you will contribute to creating technology that will revolutionize entire industries, cities, and countries. Siemens, with over 379,000 minds in over 200 countries, is dedicated to equality and welcomes diverse applications that reflect the communities it serves. Employment decisions at Siemens are based on qualifications, merit, and business needs. Embrace the opportunity to work with teams that are shaping the future. Discover more about Siemens careers at www.siemens.com/careers.

Benefits:
- Hybrid working opportunities
- Inclusive and diverse culture
- An array of learning and development prospects
- Competitive compensation package

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Haryana

On-site

As a Software Engineer specializing in AI/ML/LLM/Data Science at Entra Solutions, a FinTech company in the mortgage industry, you will play a crucial role in designing, developing, and deploying AI-driven solutions using technologies such as machine learning, NLP, and Large Language Models (LLMs). Your primary focus will be on building and optimizing retrieval-augmented generation (RAG) systems, LLM fine-tuning, and vector search technologies using Python. You will be responsible for developing scalable AI pipelines that ensure high performance and seamless integration with both cloud and on-premises environments. The role also involves implementing MLOps best practices, optimizing AI model performance, and deploying intelligent applications.

In this role, you will:
- Develop, fine-tune, and deploy AI/ML models and LLM-based applications for real-world use cases.
- Build and optimize retrieval-augmented generation (RAG) systems using vector databases such as ChromaDB, Pinecone, and FAISS (a sketch of the retrieval step follows this listing).
- Work on LLM fine-tuning, embeddings, and prompt engineering to enhance model performance.
- Create end-to-end AI solutions with APIs using frameworks like FastAPI, Flask, or similar technologies.
- Establish and maintain scalable data pipelines for training and inference.
- Deploy and manage models using MLOps best practices on cloud platforms like AWS or Azure.
- Optimize AI model performance for low-latency inference and scalability.
- Collaborate with cross-functional teams including Product, Engineering, and Data Science to integrate AI capabilities into applications.

Qualifications:

Must have:
- Proficiency in Python
- Strong hands-on experience with AI/ML frameworks such as TensorFlow, PyTorch, Hugging Face, LangChain, and OpenAI APIs

Good to have:
- Experience with LLM fine-tuning, embeddings, and transformers
- Knowledge of NLP and vector search technologies (ChromaDB, Pinecone, FAISS, Milvus)
- Experience building scalable AI models and data pipelines with Spark, Kafka, or Dask
- Familiarity with MLOps tools like Docker, Kubernetes, and CI/CD for AI models
- Hands-on experience with cloud-based AI deployment using platforms like AWS Lambda, SageMaker, GCP Vertex AI, or Azure ML
- Knowledge of prompt engineering, GPT models, or knowledge graphs

What's in it for you:
- Competitive salary and full benefits package
- PTO and medical insurance
- Exposure to cutting-edge AI/LLM projects in an innovative environment
- Career growth opportunities in AI/ML leadership
- A collaborative, AI-driven work culture

Entra Solutions is an equal employment opportunity employer, and we welcome applicants from diverse backgrounds. Join us and be a part of our dynamic team driving innovation in the FinTech industry.
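To make the RAG retrieval step concrete, here is a hedged sketch using FAISS: index document embeddings, then fetch the nearest chunks for a query. Random vectors stand in for a real embedding model, and the dimension is an assumption.

```python
import faiss
import numpy as np

dim = 384  # a common sentence-embedding width (assumption)
doc_vectors = np.random.rand(1000, dim).astype("float32")  # stand-in embeddings

index = faiss.IndexFlatL2(dim)  # exact L2 search; fine at this scale
index.add(doc_vectors)

query = np.random.rand(1, dim).astype("float32")  # stand-in query embedding
distances, ids = index.search(query, 5)  # top-5 nearest chunks
print(ids[0])  # indices of the chunks to feed the LLM as context
```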

Posted 4 weeks ago

Apply

8.0 - 12.0 years

12 - 22 Lacs

Hyderabad

Remote

Tech stack: MongoDB, S3, Postgres; strong experience with data pipelines and mapping; React, Node, Python; AWS, Lambda.

About the job

Summary: We are seeking a detail-oriented and proactive Data Analyst to lead our file and data operations, with a primary focus on managing data intake from our clients and ensuring data integrity throughout the pipeline. This role is vital to our operational success and will work cross-functionally to support data ingestion, transformation, validation, and secure delivery. The ideal candidate must have hands-on experience with healthcare datasets, especially medical claims data, and be proficient in managing ETL processes and data operations at scale.

Responsibilities:

File Intake & Management
- Serve as the primary point of contact for receiving files from clients, ensuring all incoming data is tracked, validated, and securely stored.
- Monitor and automate data file ingestion using tools such as AWS S3, AWS Glue, or equivalent technologies.
- Troubleshoot and resolve issues related to missing or malformed files and ensure timely communication with internal and external stakeholders.

Data Operations & ETL
- Develop, manage, and optimize ETL pipelines for processing large volumes of structured and unstructured healthcare data.
- Perform data quality checks, validation routines, and anomaly detection across datasets (see the sketch after this listing).
- Ensure consistency and integrity of healthcare data (e.g., EHR, medical claims, ICD/CPT/LOINC codes) during transformations and downstream consumption.

Data Analysis & Reporting
- Collaborate with data science and analytics teams to deliver operational insights and performance metrics.
- Build dashboards and visualizations using Power BI or Tableau to monitor data flow, error rates, and SLA compliance.
- Generate summary reports and audit trails to ensure HIPAA-compliant data handling practices.

Process Optimization
- Identify opportunities for automation and efficiency in file handling and ETL processes.
- Document procedures, workflows, and data dictionaries to standardize operations.

Required Qualifications:
- Bachelor's or Master's degree in Health Informatics, Data Analytics, Computer Science, or a related field.
- 5+ years of experience in a data operations or analyst role with a strong focus on healthcare data.
- Demonstrated expertise in working with medical claims data, EHR systems, and healthcare coding standards (e.g., ICD, CPT, LOINC, SNOMED, RxNorm).
- Strong programming and scripting skills in Python and SQL for data manipulation and automation.
- Hands-on experience with AWS (Redshift, RDS, S3) and data visualization tools such as Power BI or Tableau.
- Familiarity with HIPAA compliance and best practices in handling protected health information (PHI).
- Excellent problem-solving skills, attention to detail, and communication abilities.
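As a sketch of the validation routines mentioned above, here is a small pandas data-quality pass over an incoming claims file; the column names and rules are assumptions, not the employer's actual schema.

```python
import pandas as pd

REQUIRED = ["claim_id", "member_id", "icd_code", "service_date", "billed_amount"]


def validate_claims(path: str) -> pd.DataFrame:
    """Return a frame of problem rows, each tagged with the rule it failed."""
    df = pd.read_csv(path, dtype=str)

    missing_cols = [c for c in REQUIRED if c not in df.columns]
    if missing_cols:
        raise ValueError(f"file missing columns: {missing_cols}")

    problems = [
        df[df["claim_id"].isna()].assign(issue="missing claim_id"),
        df[df.duplicated("claim_id")].assign(issue="duplicate claim_id"),
        df[pd.to_datetime(df["service_date"], errors="coerce").isna()].assign(
            issue="unparseable service_date"
        ),
    ]
    return pd.concat(problems, ignore_index=True)
```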

Posted 4 weeks ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Pune

Hybrid

Greetings from Intelliswift, an LTTS Company.

Role: Fullstack Developer
Work Location: Pune
Experience: 5 to 8 years

Job Summary: As a Fullstack Developer specializing in generative AI and cloud technologies, you will design, build, and maintain end-to-end applications on AWS. You'll leverage services such as Bedrock, SageMaker, LangChain, and Amplify to integrate AI/ML capabilities, architect scalable infrastructure, and deliver seamless front-end experiences using React. You'll partner with UX/UI designers, ML engineers, DevOps teams, and product stakeholders to take features from concept through production deployment.

Job Description:
- 5+ years of professional experience as a Fullstack Developer building scalable web applications.
- Proficiency in Python and/or JavaScript/TypeScript; strong command of modern frameworks (React, Node.js).
- Hands-on AWS expertise: Bedrock, SageMaker, Amplify, Lambda, API Gateway, DynamoDB/RDS, CloudWatch, IAM, VPC.
- Architect and develop full-stack solutions using React for the front end, Python/Node.js for the back end, and AWS Lambda/API Gateway or containers for serverless services.
- Integrate generative AI capabilities leveraging AWS Bedrock, LangChain retrieval-augmented pipelines, and custom prompt engineering to power intelligent assistants and data-driven insights (see the sketch following this description).
- Design and manage AWS infrastructure using CDK/CloudFormation for VPCs, IAM policies, S3, DynamoDB/RDS, and ECS/EKS, and implement DevOps/MLOps workflows: CI/CD pipelines (CodePipeline, CodeBuild, Jenkins), containerization (Docker), automated testing, and rollout strategies.
- Develop interactive UIs in React: translate Figma/Sketch designs into responsive components, integrate with backend APIs, and harness AWS Amplify for accelerated feature delivery.
- Solid understanding of AI/ML concepts, including prompt engineering, generative AI frameworks (LangChain), and model deployment patterns.
- Experience designing and consuming APIs: RESTful and GraphQL.
- DevOps/MLOps skills: CI/CD pipeline creation, containerization (Docker), orchestration (ECS/EKS), infrastructure as code.
- Cloud architecture know-how: security groups, network segmentation, high-availability patterns, cost optimization.
- Excellent problem-solving ability and strong communication skills to collaborate effectively across distributed teams.

Share your updated profile at shakambnari.nayak@intelliswift.com with details.
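For the Bedrock integration bullet, a hedged sketch of invoking a Bedrock-hosted Anthropic model from Python; the model ID and request schema follow the Bedrock Messages format but should be verified against current AWS documentation.

```python
import json

import boto3

bedrock = boto3.client("bedrock-runtime", region_name="us-east-1")


def ask_model(prompt: str) -> str:
    # Request body in the Anthropic Messages format used on Bedrock (assumption:
    # verify the version string and model ID against current AWS docs).
    body = {
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 512,
        "messages": [{"role": "user", "content": prompt}],
    }
    resp = bedrock.invoke_model(
        modelId="anthropic.claude-3-haiku-20240307-v1:0",  # assumed model ID
        body=json.dumps(body),
    )
    payload = json.loads(resp["body"].read())
    return payload["content"][0]["text"]
```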

Posted 4 weeks ago

Apply

5.0 - 10.0 years

15 - 25 Lacs

Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR

Hybrid

Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Data Engineer (AWS + Python, Spark, Kafka for ETL)!

Responsibilities:
- Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka (see the sketch following this listing).
- Integrate structured and unstructured data from various sources into data lakes and data warehouses.
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step Functions, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using Big Data technologies like Apache Hadoop and Apache Spark with appropriate cloud services such as Amazon AWS.
- Build data pipelines through ETL (Extract-Transform-Load) processes.
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Analyse business and functional requirements, which involves reviewing existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyse requirements/user stories in business meetings, strategize the impact of requirements on different platforms/applications, and convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability with improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work as before.
- Coordinate with release management and other supporting teams to deploy changes to the production environment.

Qualifications we seek in you!

Minimum Qualifications:
- Experience in designing and implementing data pipelines, building data applications, and data migration on AWS.
- Strong experience implementing data lakes using AWS services like Glue, Lambda, Step Functions, and Redshift.
- Experience with Databricks is an added advantage.
- Strong experience in Python and SQL.
- Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
- Advanced programming skills in Python for data processing and automation.
- Hands-on experience with Apache Spark for large-scale data processing.
- Experience with Apache Kafka for real-time data streaming and event processing.
- Proficiency in SQL for data querying and transformation.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications/Skills:
- Master's degree in Computer Science, Electronics, or Electrical Engineering.
- AWS Data Engineering & Cloud certifications; Databricks certifications.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of Change & Incident Management processes.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
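As a concrete instance of the Kafka-plus-Spark ETL named in this posting, a minimal Structured Streaming sketch that reads a topic and lands Parquet on S3; broker, topic, and paths are placeholders, and the job assumes the spark-sql-kafka connector package is available to the cluster.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka-etl").getOrCreate()

events = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # placeholder broker
         .option("subscribe", "orders")                     # placeholder topic
         .load()
         .select(col("key").cast("string"), col("value").cast("string"))
)

# Land the stream as Parquet; the checkpoint directory makes restarts safe.
query = (
    events.writeStream.format("parquet")
          .option("path", "s3://data-lake/raw/orders/")
          .option("checkpointLocation", "s3://data-lake/checkpoints/orders/")
          .start()
)
query.awaitTermination()
```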

Posted 4 weeks ago

Apply

21.0 - 31.0 years

35 - 55 Lacs

Bengaluru

Work from Office

What we're looking for

The Infrastructure team is looking for a Software Engineer with deep experience in and passion for cloud systems and software security. Our team works with teams across the company to help ensure that SurveyMonkey's cloud infrastructure and security standards and practices are world-class. You'll have the opportunity to develop infrastructure tools and help teams implement them.

What you'll be working on
- Maintaining and operating AWS environments with well-established best practices
- Partnering with teams across the company, including Application Engineering, Security Operations, and Product, to provide the technical foundations of our cloud infrastructure and security systems
- Analyzing our existing systems, and designing and implementing tools to maintain internal security guardrails (a small sketch follows this listing)
- Providing both technical guidance and mentorship to developers in adopting good security practices in their feature development
- Developing tools to help SurveyMonkey comply with privacy regulations across the globe

We'd love to hear from people with
- 8+ years of experience with application development
- 4+ years of experience with AWS, including heavy experience with IAM, ALB, S3, AWS Lambda, CloudWatch, Transit Gateway, VPC, CloudFront, Route 53, Cloud Map, and VPN
- Proficiency in AWS security technologies including GuardDuty, ACM, Shield, WAF, and Firewall Manager
- Demonstrable experience leveraging configuration management tooling such as Terraform, and proven strategies for maintaining large infrastructure-as-code deployments
- A strong understanding of web application security risks and controls (e.g., OWASP Top 10)
- Proficiency in Python or equivalent; Node.js experience a plus
- Familiarity with privacy regulations like GDPR, CCPA, etc. is a plus

This opportunity is hybrid and requires you to work from the SurveyMonkey office in Bengaluru 3 days per week. #LI-HYBRID
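To give a flavor of the security-guardrail tooling described above, a small hedged boto3 sketch that flags S3 buckets lacking a full public-access block; it assumes credentials with the relevant s3:Get* permissions and is a starting point, not a complete audit.

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")


def buckets_missing_public_block():
    """List buckets where the public-access block is absent or incomplete."""
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            cfg = s3.get_public_access_block(Bucket=name)[
                "PublicAccessBlockConfiguration"
            ]
            if not all(cfg.values()):  # any of the four flags disabled
                flagged.append(name)
        except ClientError:
            # No configuration at all: definitely flag it.
            flagged.append(name)
    return flagged


print(buckets_missing_public_block())
```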

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Kochi, Kerala

On-site

You should have experience in designing and building serverless data lake solutions using a layered component architecture, including expertise in the ingestion, storage, processing, security and governance, data cataloguing and search, and consumption layers. Proficiency in AWS serverless technologies such as Lake Formation, Glue, Glue Python, Glue Workflows, Step Functions, S3, Redshift, QuickSight, Athena, AWS Lambda, and Kinesis is required, and hands-on experience with Glue is essential. You must be skilled in designing, building, orchestrating, and deploying multi-step data processing pipelines using Python and Java. Experience in managing source data access security, configuring authentication and authorization, and enforcing data policies and standards is also necessary, along with familiarity with AWS environment setup and configuration.

A minimum of 6 years of relevant experience, with at least 3 years building solutions on AWS, is mandatory. The ability to work under pressure and a commitment to meeting customer expectations are essential for this role. If you meet these requirements and are ready to take on this challenging opportunity, please reach out to hr@Stanratech.com.
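As one hedged illustration of orchestrating such a multi-step pipeline, a boto3 snippet that starts a Step Functions execution (for example, crawl, transform, then validate); the state machine ARN and input shape are assumptions.

```python
import json

import boto3

sfn = boto3.client("stepfunctions")

# Placeholder ARN and input; a real pipeline would parameterize both.
response = sfn.start_execution(
    stateMachineArn="arn:aws:states:us-east-1:123456789012:stateMachine:lake-ingest",
    input=json.dumps({"source_prefix": "s3://landing/claims/2025-07-01/"}),
)
print(response["executionArn"])
```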

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

As an AWS Developer at Marlabs Innovations Pvt Ltd, your main responsibility will be to design and implement a secure and scalable API Gateway on AWS. This API Gateway will serve as the integration point between a Salesforce Force.com application and an LLM (Large Language Model) AI endpoint service. You should have hands-on experience creating serverless architectures, securing APIs, and connecting cloud-native services with third-party applications and AI/ML platforms.

Your key responsibilities will include designing and developing a secure API Gateway on AWS to enable seamless communication between Salesforce and an AI endpoint. You will need to implement Lambda functions, IAM roles, and various authentication mechanisms such as OAuth, API keys, and Cognito. Ensuring secure, low-latency, and scalable message flow between the Force.com backend and external LLM APIs will be crucial. Additionally, you will be responsible for integrating with Salesforce via REST APIs, managing authentication tokens, and optimizing API performance while handling error retries, logging, and monitoring through CloudWatch. You will also need to ensure a fault-tolerant, highly available architecture using services such as API Gateway, Lambda, S3, and DynamoDB. Collaborating with AI teams to consume LLM endpoints from platforms like OpenAI, Anthropic, or custom-hosted models will also be part of your role, as will following best practices in DevOps and Infrastructure as Code (IaC) using tools like CloudFormation or Terraform.

To be successful in this role, you should have strong hands-on experience with AWS API Gateway, AWS Lambda, and IAM. Proficiency in Python or Node.js for Lambda development is required, as is experience integrating with Salesforce REST APIs and authentication workflows. A good understanding of LLM APIs, AI service integration, secure API development practices, event-driven architectures, and serverless frameworks is essential. Experience with CI/CD pipelines, CloudFormation, or Terraform, along with strong troubleshooting and debugging skills in cloud environments, will be beneficial.

Preferred qualifications include the AWS Certified Developer - Associate certification or equivalent, prior experience integrating Salesforce with external cloud services, knowledge of AI/ML pipelines and REST-based AI model interactions, and familiarity with API monitoring and analytics tools.
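To sketch the integration this role describes, here is a hypothetical Python Lambda behind API Gateway that relays a Salesforce request to an LLM endpoint; the endpoint URL, header names, and payload shape are all assumptions.

```python
import json
import os
import urllib.request

# Hypothetical endpoint and credentials, injected via Lambda environment config.
LLM_URL = os.environ.get("LLM_ENDPOINT", "https://example.com/v1/complete")
API_KEY = os.environ["LLM_API_KEY"]


def handler(event, context):
    # API Gateway proxy integration: the Salesforce request body arrives as JSON.
    payload = json.loads(event.get("body") or "{}")
    req = urllib.request.Request(
        LLM_URL,
        data=json.dumps({"prompt": payload.get("prompt", "")}).encode(),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )
    try:
        with urllib.request.urlopen(req, timeout=20) as resp:
            body = resp.read().decode()
        return {"statusCode": 200, "body": body}
    except Exception as exc:  # surface upstream failures to the caller
        return {"statusCode": 502, "body": json.dumps({"error": str(exc)})}
```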

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Chennai, Tamil Nadu

On-site

The Data Scientist - Clinical Data Extraction & AI Integration role on our healthcare technology team requires an experienced individual with 3-6 years of experience, focused primarily on medical document processing and data extraction systems. You will work with advanced AI technologies to create solutions that extract crucial information from clinical documents, improving healthcare data workflows and patient care outcomes.

Your key responsibilities will include designing and implementing statistical models for medical data quality assessment, and developing predictive algorithms for encounter classification and validation. You will build machine learning pipelines for document pattern recognition, create data-driven insights from clinical document structures, and implement feature engineering for medical terminology extraction. Furthermore, you will apply natural language processing (NLP) techniques to clinical text, develop statistical validation frameworks for extracted medical data, and build anomaly detection systems for medical document processing. You will also create predictive models for discharge date estimation and encounter duration, and implement clustering algorithms for provider and encounter classification.

For AI & LLM integration, you will integrate and optimize Large Language Models via AWS Bedrock and API services, design and refine AI prompts for clinical content extraction with high accuracy, and implement fallback logic and error handling for AI-powered extraction systems. You will develop pattern-matching algorithms for medical terminology and create validation layers for AI-extracted medical information.

Expertise in the healthcare domain is crucial for this role. You will work closely with medical document structures, implement healthcare-specific validation rules, handle medical terminology extraction, and conduct clinical context analysis. Ensuring HIPAA compliance and adhering to data security best practices are also part of your responsibilities.

Required skills include proficiency in Python 3.8+, R, SQL, and JSON, along with data science tools such as pandas, numpy, scipy, scikit-learn, spaCy, and NLTK. Experience with ML frameworks (TensorFlow, PyTorch, transformers, Hugging Face) and visualization tools (matplotlib, seaborn, plotly, Tableau, Power BI) is desirable. Knowledge of AI platforms such as AWS Bedrock, Anthropic Claude, and the OpenAI APIs, and experience with cloud services such as AWS SageMaker, S3, Lambda, and Bedrock, are advantageous, as is familiarity with research tools like Jupyter notebooks, Git, Docker, and MLflow.
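As a simplified illustration of the pattern-matching layer mentioned above, a regex-based sketch that pulls ICD-10-style codes out of clinical text and validates them against a code set; the regex is an approximation, not a complete ICD-10 grammar, and the code table is a stand-in.

```python
import re

# Rough shape of an ICD-10 code: letter, digit, alphanumeric, optional dot part.
ICD10_PATTERN = re.compile(r"\b[A-TV-Z][0-9][0-9A-Z](?:\.[0-9A-Z]{1,4})?\b")
KNOWN_CODES = {"E11.9", "I10", "J45.909"}  # stand-in for a real code table


def extract_icd_codes(note: str):
    candidates = ICD10_PATTERN.findall(note)
    # Keep only candidates present in the reference code set.
    return [c for c in candidates if c in KNOWN_CODES]


note = "Dx: E11.9 (type 2 diabetes), I10 hypertension; r/o J45.909."
print(extract_icd_codes(note))  # ['E11.9', 'I10', 'J45.909']
```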

Posted 4 weeks ago

Apply

3.0 - 7.0 years

0 Lacs

Karnataka

On-site

A career at HARMAN Automotive offers you the opportunity to be part of a global, multi-disciplinary team dedicated to harnessing the power of technology to shape the future. We empower you to accelerate your professional growth and make a difference by:
- Engineering cutting-edge audio systems and integrated technology platforms that enhance the driving experience.
- Fostering innovation through collaborative efforts that combine in-depth research, design excellence, and engineering prowess.
- Driving advancements in in-vehicle infotainment, safety, efficiency, and overall enjoyment for users.

About the Role: We are looking for a skilled Python Backend Developer with 3 to 6 years of experience building scalable and secure backend systems using AWS services. In this role, you will be instrumental in designing and implementing microservices architecture and cloud-native solutions, and in integrating diverse data sources into a unified system that ensures data consistency and security.

What You Will Do:
- Backend Development: Create scalable backend systems using Python and frameworks like Flask or Django.
- Microservices Architecture: Develop and deploy microservices-based systems with AWS services like SQS, Step Functions, and API Gateway (see the sketch after this posting).
- Cloud-Native Solutions: Build cloud-native solutions utilizing AWS services such as Lambda, CloudFront, and IAM.
- Data Integration: Integrate multiple data sources into a single system while maintaining data integrity.
- API Development: Design and implement RESTful/SOAP APIs using API Gateway and AWS Lambda.

What You Need To Be Successful:
- Technical Skills: Proficiency in Python backend development and JSON data handling, and familiarity with AWS services.
- AWS Services: Knowledge of various AWS services including SQS, Step Functions, IAM, CloudFront, and API Gateway.
- Security and Authentication: Understanding of identity management and authentication protocols like OAuth 2.0 and OIDC.
- Data Management: Experience with ORM frameworks like SQLAlchemy or Django ORM.
- Collaboration and Testing: Ability to collaborate effectively and work independently when needed, along with familiarity with testing tools.

Bonus Points if You Have: Additional experience with AWS ECS, VPC, serverless computing, and DevOps practices.

What Makes You Eligible: We are looking for individuals with relevant backend development experience, strong technical expertise, problem-solving abilities, and effective collaboration and communication skills.

What We Offer: A competitive salary and benefits package, opportunities for professional growth, a dynamic work environment, access to cutting-edge technologies, recognition for outstanding performance, and the chance to collaborate with a renowned German OEM.

You Belong Here: At HARMAN, we value diversity, inclusivity, and empowerment. We encourage you to share your ideas, voice your perspective, and be yourself in a supportive culture that celebrates uniqueness. We are committed to your ongoing learning and development, providing training and education opportunities for you to thrive in your career.

About HARMAN: With a legacy of innovation dating back to the 1920s, HARMAN continues to redefine technology across automotive, lifestyle, and digital transformation solutions. Our portfolio of iconic brands delivers exceptional experiences, setting new standards in engineering and design for our customers and partners worldwide. If you are ready to drive innovation and create lasting impact, we invite you to join our talent community at HARMAN Automotive.
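To illustrate the SQS-driven microservice pattern in this stack, a minimal hedged sketch that long-polls a queue, processes each message, and deletes it on success; the queue URL and message shape are placeholders.

```python
import json

import boto3

sqs = boto3.client("sqs")
# Placeholder queue; real services read this from configuration.
QUEUE_URL = "https://sqs.us-east-1.amazonaws.com/123456789012/vehicle-events"


def poll_once():
    resp = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=20
    )
    for msg in resp.get("Messages", []):
        event = json.loads(msg["Body"])
        print("processing", event.get("vehicle_id"))  # real work goes here
        # Delete only after successful processing so failures are retried.
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=msg["ReceiptHandle"])


if __name__ == "__main__":
    poll_once()
```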

Posted 4 weeks ago

Apply

4.0 - 8.0 years

16 - 20 Lacs

Gurugram, Chennai, Bengaluru

Work from Office

Job Summary: We are looking for a talented and experienced Java Developer with strong AWS expertise to join our growing team. The ideal candidate will have a solid background in Java development and hands-on experience with AWS services, capable of building scalable, secure, and high-performance cloud-native applications.

Key Responsibilities:
- Design, develop, and maintain Java-based applications with a focus on cloud-native architecture.
- Leverage AWS services such as EC2, Lambda, S3, RDS, DynamoDB, and API Gateway in application development.
- Implement RESTful APIs and integrate with third-party services.
- Collaborate with DevOps teams to ensure smooth CI/CD pipelines and infrastructure automation.
- Write clean, efficient, and well-documented code following best practices.
- Participate in code reviews, unit testing, and performance tuning.
- Troubleshoot and resolve technical issues in development and production environments.

Required Skills & Qualifications:
- 4-6 years of hands-on experience in Java development.
- Strong experience with Spring Boot, Spring Cloud, and RESTful APIs.
- Proficiency in AWS services such as EC2, Lambda, S3, RDS, CloudWatch, and IAM.
- Experience with CI/CD tools like Jenkins, Git, Maven, or Gradle.
- Familiarity with containerization tools like Docker and orchestration with Kubernetes (preferred).
- Good understanding of cloud security, scalability, and performance optimization.
- Strong analytical and problem-solving skills.

Preferred Qualifications:
- AWS certification (e.g., AWS Certified Developer - Associate).
- Experience with Infrastructure as Code (IaC) tools like Terraform or CloudFormation.
- Exposure to Agile/Scrum methodologies.

Posted 4 weeks ago

Apply

5.0 - 7.0 years

8 - 12 Lacs

Bengaluru

Work from Office

Educational Requirements: Master of Science (Technology), Master of Computer Applications, Master of Engineering, Master of Technology, Bachelor of Computer Applications, Bachelor of Science (Tech), Bachelor of Engineering, Bachelor of Technology

Service Line: Application Development and Maintenance

Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to get to the heart of customer issues, diagnose problem areas, design innovative solutions, and facilitate deployment resulting in client delight. You will develop proposals by owning parts of the proposal document and giving inputs on solution design based on your areas of expertise. You will plan configuration activities, configure the product as per the design, conduct conference room pilots, and assist in resolving any queries related to requirements and solution design. You will conduct solution/product demonstrations and POC/Proof of Technology workshops, and prepare effort estimates that suit customer budgetary requirements and are in line with the organization's financial guidelines. You will actively lead small projects and contribute to unit-level and organizational initiatives with the objective of providing high-quality, value-adding solutions to customers. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Ability to develop value-creating strategies and models that enable clients to innovate, drive growth, and increase business profitability
- Good knowledge of software configuration management systems
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills along with an ability to collaborate
- Understanding of financial processes for various project types and the available pricing models
- Ability to assess current processes, identify improvement areas, and suggest technology solutions
- Knowledge of one or two industry domains
- Client interfacing skills
- Project and team management

Technical and Professional Requirements (primary skills): Big Data (Big Table), Cloud Integration (Azure Data Factory), Data on Cloud - Platform (AWS)

Preferred Skills: Data on Cloud - Platform (AWS), Cloud Integration (Azure Data Factory), Google Big Data (GCP)

Posted 1 month ago

Apply

3.0 - 5.0 years

5 - 9 Lacs

Bengaluru

Work from Office

Educational Requirements: Master of Computer Applications, Master of Engineering, Master of Science, Master of Technology, Bachelor of Computer Applications, Bachelor of Science, Bachelor of Engineering, Bachelor of Technology

Service Line: Engineering Services

Responsibilities: A day in the life of an Infoscion - As part of the Infosys consulting team, your primary role would be to lead the engagement effort of providing high-quality, value-adding consulting solutions to customers at different stages, from problem definition to diagnosis to solution design, development, and deployment. You will review proposals prepared by consultants, provide guidance, and analyze the solutions defined for client business problems to identify potential risks and issues. You will identify change management requirements and propose a structured approach to the client for managing change using multiple communication mechanisms. You will also coach the team and create a vision for it, provide subject matter training in your focus areas, and motivate and inspire team members through effective and timely feedback and recognition of high performance. You would be a key contributor to unit-level and organizational initiatives, providing high-quality, value-adding consulting solutions while adhering to the organization's guidelines and processes. If you think you fit right in to help our clients navigate their next in their digital transformation journey, this is the place for you!

Additional Responsibilities:
- Good knowledge of software configuration management systems
- Strong business acumen, strategy, and cross-industry thought leadership
- Awareness of the latest technologies and industry trends
- Logical thinking and problem-solving skills along with an ability to collaborate
- Knowledge of two or three industry domains
- Understanding of financial processes for various project types and the available pricing models
- Client interfacing skills
- Knowledge of SDLC and agile methodologies
- Project and team management

Technical and Professional Requirements: Cloud Platform (AWS databases), Container Platform (Docker), Container Platform (Kubernetes)

Preferred Skills: Cloud Platform (AWS databases), Container Platform (Docker), Cloud Platform (Power Platform)

Posted 1 month ago

Apply

7.0 - 10.0 years

15 - 20 Lacs

Pune, Chennai, Bengaluru

Hybrid

Job Summary:
We are seeking a highly skilled Senior .NET Developer with over 6 years of experience in backend development and solution design. The ideal candidate will have strong expertise in .NET/.NET Core, C#, and .NET 6, along with hands-on experience with AWS Lambda and MongoDB. You will be responsible for designing scalable modules and implementing business logic for enterprise-grade applications in a cloud-native environment.

Key Responsibilities:
- Design, develop, and maintain backend components and services using .NET 6/.NET Core and C#.
- Architect solutions and design modules that are scalable, maintainable, and aligned with business goals.
- Build and deploy serverless applications using AWS Lambda and related AWS services.
- Develop and manage NoSQL databases with MongoDB, including schema design and performance tuning.
- Collaborate with front-end developers, DevOps, and other stakeholders to ensure seamless integration.
- Write clean, testable, and efficient code following best practices and coding standards.
- Participate in code reviews, mentoring, and knowledge sharing within the team.
- Optimize application performance and scalability through profiling and tuning.
- Ensure the security, quality, and compliance of all delivered solutions.

Must-Have Skills:
- Strong proficiency in .NET/.NET Core/.NET 6 and C#
- Experience with AWS Lambda and serverless architecture
- Proficiency in MongoDB, including query optimization and schema design
- Experience with RESTful API development and integration
- Solid understanding of object-oriented design and software engineering principles

Why Join Us?
- Opportunity to work on modern cloud-native applications
- Collaborative and growth-oriented team environment
- Flexible working arrangements and supportive leadership
- Exposure to cutting-edge technologies and projects
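The posting above centers on serverless backends that read from MongoDB. As a minimal illustration of that pattern, here is a hedged sketch of an AWS Lambda handler, written in Python for brevity (the role itself targets C#/.NET 6, where the same structure applies); the environment variable, database, collection, and route names are hypothetical.

```python
# Minimal sketch of a Lambda handler backed by MongoDB.
# Assumed names: MONGODB_URI env var, orders_db database, orders collection,
# and an API Gateway route with an {orderId} path parameter.
import json
import os

from pymongo import MongoClient

# Create the client once per container, outside the handler, so warm
# invocations reuse the connection instead of reconnecting on every call.
client = MongoClient(os.environ["MONGODB_URI"])
collection = client["orders_db"]["orders"]

def handler(event, context):
    # API Gateway proxy integration passes the order id as a path parameter.
    order_id = event["pathParameters"]["orderId"]
    doc = collection.find_one({"_id": order_id}, {"_id": 0})
    if doc is None:
        return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
    return {"statusCode": 200, "body": json.dumps(doc)}
```

Initializing the client at module scope matters in serverless designs because MongoDB enforces connection limits, and a new connection per invocation would exhaust them under load.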

Posted 1 month ago

Apply

4.0 - 8.0 years

9 - 13 Lacs

Bengaluru

Work from Office

As a Software Developer, you'll participate in many aspects of the software development lifecycle, such as design, code implementation, testing, and support. You will create software that enables your clients' hybrid-cloud and AI journeys.

Your primary responsibilities include:
- Comprehensive feature development and issue resolution: working on end-to-end feature development and solving challenges faced in the implementation.
- Stakeholder collaboration and issue resolution: collaborating with key stakeholders, internal and external, to understand problems and issues with the product and its features, and resolving them within the defined SLAs.
- Continuous learning and technology integration: being eager to learn new technologies and applying them in feature development.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Creative problem-solving skills and superb communication skills
- Container-based solutions
- Strong experience with Node.js and the AWS stack: AWS Lambda, AWS API Gateway, AWS CDK, AWS DynamoDB, AWS SQS
- Experience with infrastructure as code using AWS CDK
- Expertise in encryption and decryption techniques for securing APIs, and in API authentication and authorization
- Experience with Lambda and API Gateway is the primary requirement
- Candidates holding the AWS Certified Cloud Practitioner or AWS Certified Developer - Associate certification will be preferred

Preferred technical and professional experience:
- Experience in distributed/scalable systems
- Knowledge of standard tools for optimizing and testing code
- Knowledge/experience of the development/build/deploy/test lifecycle
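The expertise list above pairs infrastructure as code via AWS CDK with Lambda and API Gateway. As a rough sketch of what that combination looks like, here is a minimal CDK v2 stack in Python (CDK also supports TypeScript, which pairs naturally with the Node.js stack named in the posting); the stack and construct names and the lambda/ asset directory are illustrative assumptions.

```python
# Minimal CDK v2 app: one Lambda function fronted by an API Gateway REST API.
from aws_cdk import App, Stack, aws_apigateway as apigw, aws_lambda as _lambda
from constructs import Construct

class ApiStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # The handler code is assumed to live in a local lambda/ directory
        # containing index.py with a handler(event, context) function.
        fn = _lambda.Function(
            self, "Handler",
            runtime=_lambda.Runtime.PYTHON_3_11,
            handler="index.handler",
            code=_lambda.Code.from_asset("lambda"),
        )

        # LambdaRestApi proxies every route on the REST API to the function.
        apigw.LambdaRestApi(self, "Api", handler=fn)

app = App()
ApiStack(app, "ApiStack")
app.synth()
```

Running `cdk deploy` against an app like this synthesizes a CloudFormation template and provisions the function, the API, and the permissions wiring between them.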

Posted 1 month ago

Apply

6.0 - 11.0 years

9 - 19 Lacs

Noida

Work from Office

We are looking for a skilled Machine Learning Engineer with strong expertise in Natural Language Processing (NLP) and AWS cloud services to design, develop, and deploy scalable ML models and pipelines. You will play a key role in building innovative NLP solutions for classification, forecasting, and recommendation systems, leveraging cutting-edge technologies to drive data-driven decision-making in the US healthcare domain.

Key Responsibilities:
- Design and deploy scalable machine learning models focused on NLP tasks, classification, forecasting, and recommender systems.
- Build robust, end-to-end ML pipelines encompassing data ingestion, feature engineering, model training, validation, and production deployment.
- Apply advanced NLP techniques, including sentiment analysis, named entity recognition (NER), embeddings, and document parsing, to extract actionable insights from healthcare data.
- Utilize AWS services such as SageMaker, Lambda, Comprehend, and Bedrock for model training, deployment, monitoring, and optimization.
- Collaborate effectively with cross-functional teams, including data scientists, software engineers, and product managers, to integrate ML solutions into existing products and workflows.
- Implement MLOps best practices for model versioning, automated evaluation, CI/CD pipelines, and continuous improvement of deployed models.
- Leverage Python and ML/NLP libraries, including scikit-learn, PyTorch, Hugging Face Transformers, and spaCy, for daily development tasks.
- Research and explore advanced NLP/ML techniques such as Retrieval-Augmented Generation (RAG) pipelines, foundation model fine-tuning, and vector search methods for next-generation solutions.

Required Qualifications:
- Bachelor's or Master's degree in Computer Science, Engineering, or a related technical field.
- 6+ years of professional experience in machine learning, with a strong focus on NLP and AWS cloud services.
- Hands-on experience in designing and deploying production-grade ML models and pipelines.
- Strong programming skills in Python and familiarity with ML/NLP frameworks such as PyTorch, Hugging Face, spaCy, and scikit-learn.
- Proven experience with the AWS ML ecosystem: SageMaker, Lambda, Comprehend, Bedrock, and related services.
- Solid understanding of MLOps principles, including version control, model monitoring, and automated deployment.
- Experience working in the US healthcare domain is a plus.
- Excellent problem-solving skills and the ability to work collaboratively in an agile environment.

Preferred Skills:
- Familiarity with advanced NLP techniques such as RAG pipelines and foundation model tuning.
- Knowledge of vector databases and semantic search technologies.
- Experience with containerization (Docker, Kubernetes) and cloud infrastructure automation.
- Strong communication skills, with the ability to translate complex technical concepts for non-technical stakeholders.
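To make two of the NLP tasks above concrete, here is a small hedged sketch using the Hugging Face Transformers and spaCy libraries the posting names; the models are library defaults and common choices rather than anything mandated by the role, and the sample sentence is invented.

```python
# Sentiment analysis via a Hugging Face pipeline and named entity
# recognition via spaCy, the two libraries called out in the posting.
import spacy
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")  # downloads a default model on first use
nlp = spacy.load("en_core_web_sm")          # assumes the small English model is installed

note = "Patient reports improved mobility after physical therapy at Mercy Hospital."

print(sentiment(note))          # e.g. [{'label': 'POSITIVE', 'score': 0.99...}]
for ent in nlp(note).ents:
    print(ent.text, ent.label_)  # e.g. "Mercy Hospital" ORG
```

In a production pipeline of the kind described, components like these would typically be packaged behind a SageMaker endpoint or a Lambda function rather than called inline.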

Posted 1 month ago

Apply