
560 Lambda Jobs - Page 15

Set up a job alert
JobPe aggregates listings for easy access; applications are completed directly on the original job portal.

4.0 - 9.0 years

10 - 20 Lacs

Bengaluru

Work from Office

Qualifications:
- 4+ years of software development experience, including work with cloud technologies.
- Bachelor's or Master's degree in Computer Science, Engineering, or equivalent experience.
- Proficiency in one or more modern programming languages (e.g., Python, Java, Go, NodeJS).
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with microservices architecture, distributed systems, and event-driven design.
- Expertise in designing and consuming RESTful APIs, and familiarity with GraphQL.
- Hands-on experience with CI/CD pipelines, infrastructure as code (e.g., Terraform, CloudFormation), and automated deployments.
- Strong understanding of relational and NoSQL databases.
- Knowledge of SaaS-specific security practices (e.g., OWASP, data encryption, identity management).
- Strong understanding of software development methodologies and tools.
- Familiarity with containerization (Docker) and orchestration (Kubernetes).
- Knowledge of monitoring and logging tools.
- Experience with distributed systems and data-intensive applications.
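As a concrete touchstone for the RESTful API skill above, here is a minimal sketch in Python, assuming FastAPI (the posting names Python but no specific framework); the `Order` model and routes are invented for illustration.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI()

class Order(BaseModel):
    # Illustrative fields; a real schema would come from the employer's domain.
    id: int
    item: str
    quantity: int

ORDERS: dict[int, Order] = {}  # in-memory store, just for the sketch

@app.post("/orders", status_code=201)
def create_order(order: Order) -> Order:
    ORDERS[order.id] = order
    return order

@app.get("/orders/{order_id}")
def read_order(order_id: int) -> Order:
    return ORDERS[order_id]
```

Run with `uvicorn app:app` to expose the two endpoints locally.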

Posted 1 month ago

Apply

6.0 - 9.0 years

14 - 22 Lacs

Pune, Chennai

Work from Office

Hiring for a top IT company. Designation: Python Developer. Skills: AWS SDK + AI services integration. Location: Pune/Chennai. Experience: 6-8 years. Best CTC. Contacts: Surbhi: 9887580624, Anchal: 9772061749, Gitika: 8696868124, Shivani: 7375861257. Team Converse.
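For a flavour of the "AWS SDK + AI services integration" skill, a minimal boto3 sketch calling Amazon Comprehend; the posting names no specific AI service, so Comprehend and the region are assumptions.

```python
import boto3

# Service and region are illustrative choices; credentials come from the environment.
comprehend = boto3.client("comprehend", region_name="ap-south-1")

def detect_sentiment(text: str) -> str:
    """Return the dominant sentiment label for a piece of text."""
    response = comprehend.detect_sentiment(Text=text, LanguageCode="en")
    return response["Sentiment"]

if __name__ == "__main__":
    print(detect_sentiment("The delivery was quick and the product works well."))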

Posted 1 month ago

Apply

4.0 - 8.0 years

10 - 20 Lacs

Hyderabad, Chennai

Work from Office

Roles & Responsibilities:
• We are looking for a strong Senior Data Engineer who will be primarily responsible for designing, building, and maintaining ETL/ELT pipelines.
• Integrate data from multiple sources or vendors to provide holistic insights from data.
• Build and manage Data Lake and Data Warehouse solutions, design data models, create ETL processes, and implement data quality mechanisms.
• Perform EDA (exploratory data analysis) to troubleshoot data-related issues and assist in their resolution.
• Experience in client interaction, both oral and written.
• Experience in mentoring juniors and providing guidance to the team.
Required Technical Skills:
• Extensive experience in languages such as Python, PySpark, and SQL (basic and advanced).
• Strong experience in Data Warehouse, ETL, Data Modelling, building ETL pipelines, and Data Architecture.
• Must be proficient in Redshift, Azure Data Factory, Snowflake, etc.
• Hands-on experience in cloud services like AWS S3, Glue, Lambda, CloudWatch, Athena, etc.
• Good to have: knowledge of Dataiku and Big Data technologies; basic knowledge of BI tools like Power BI, Tableau, etc. is a plus.
• Sound knowledge of data management, data operations, data quality, and data governance.
• Knowledge of SFDC and Waterfall/Agile methodology.
• Strong knowledge of the Pharma domain / life sciences commercial data operations.
Qualifications:
• Bachelor's or Master's in Engineering/MCA or equivalent degree.
• 4-6 years of relevant industry experience as a Data Engineer.
• Experience working on Pharma syndicated data such as IQVIA, Veeva, Symphony; Claims, CRM, Sales, Open Data, etc.
• High motivation, good work ethic, maturity, self-organization, and personal initiative.
• Ability to work collaboratively and provide support to the team.
• Excellent written and verbal communication skills.
• Strong analytical and problem-solving skills.
Location:
• Preferably Hyderabad/Chennai, India.
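As a small illustration of the PySpark ETL work described above, a minimal extract-transform-load sketch; the bucket paths and column names are made up for the example.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("sales-etl").getOrCreate()

# Extract: path is a placeholder for whatever vendor feed the pipeline ingests.
raw = spark.read.csv("s3://example-bucket/raw/sales/", header=True, inferSchema=True)

# Transform: basic cleansing and a derived column, standing in for real business rules.
clean = (
    raw.dropDuplicates(["order_id"])
       .filter(F.col("amount") > 0)
       .withColumn("order_date", F.to_date("order_ts"))
)

# Load: write partitioned Parquet into the curated zone of the data lake.
clean.write.mode("overwrite").partitionBy("order_date").parquet(
    "s3://example-bucket/curated/sales/"
)
```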

Posted 1 month ago

Apply

6.0 - 10.0 years

25 - 30 Lacs

Noida, and Remote

Work from Office

Job Title: Full Stack Software Developer
Experience Required: 6+ Years
Location: Noida / Remote
Employment Type: Full-Time
Job Summary: We are seeking a talented and motivated Full Stack Software Developer with 6+ years of experience to join our dynamic team. The ideal candidate is highly skilled in React and Node.js, with a solid grasp of GraphQL and AWS being a significant advantage. You will be instrumental in designing, developing, and maintaining scalable, efficient, and user-centric applications across the entire technology stack.
Key Responsibilities:
Design & Development: Build, deploy, and maintain robust front-end and back-end applications using React and Node.js.
API Integration: Create and consume RESTful and GraphQL APIs to support dynamic client-server interactions.
System Architecture: Contribute to the design of scalable and maintainable software systems.
Cloud Integration: Leverage AWS services (e.g., Lambda, S3, EC2) to host and scale applications efficiently.
Collaboration: Work closely with cross-functional teams including product managers, designers, and other developers.
Code Quality: Maintain clean, testable, and maintainable code following best practices.
Troubleshooting: Diagnose and resolve issues across the stack to ensure high performance and reliability.
Skills and Qualifications
Required:
Strong proficiency in JavaScript/TypeScript, React, and Node.js.
Solid understanding of front-end development concepts (state management, component lifecycle, performance tuning).
Experience working with REST and/or GraphQL APIs.
Familiarity with relational databases like PostgreSQL or similar.
Excellent problem-solving abilities and experience in Agile development environments.
Preferred:
Hands-on experience with GraphQL and tools like Apollo.
Working knowledge of AWS services such as EC2, S3, Lambda, API Gateway, and DynamoDB.
Experience with CI/CD tools (e.g., GitHub Actions, Jenkins).
Understanding of automated testing using frameworks like Jest and Cypress.
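The posting's stack is JavaScript, but to keep this page's examples in a single language, here is the shape of the GraphQL client-server exchange it mentions, sketched in Python with `requests`; the endpoint, query, and field names are hypothetical.

```python
import requests

# Hypothetical endpoint and schema; a real service defines its own.
GRAPHQL_URL = "https://api.example.com/graphql"

query = """
query GetUser($id: ID!) {
  user(id: $id) {
    name
    email
  }
}
"""

response = requests.post(
    GRAPHQL_URL,
    json={"query": query, "variables": {"id": "42"}},
    timeout=10,
)
response.raise_for_status()
print(response.json()["data"]["user"])
```

The same JSON body shape (`query` plus `variables`) applies regardless of client language, which is the point of the sketch.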

Posted 1 month ago

Apply

3.0 - 8.0 years

5 - 10 Lacs

Bengaluru

Work from Office

The Core AI, BI & Data Platforms team has been established to create, operate, and run the Enterprise AI, BI, and Data platforms that reduce time to market for reporting, analytics, and data science teams to run experiments, train models, and generate insights, as well as to evolve and run the CoCounsel application and its shared capability, the CoCounsel AI Assistant. The Enterprise Data Platform aims to provide self-service capabilities for fast and secure ingestion and consumption of data across TR. At Thomson Reuters, we are recruiting a team of motivated Cloud professionals to transform how we build, manage, and leverage our data assets. The Data Platform team in Bangalore is seeking an experienced Software Engineer with a passion for engineering cloud-based data platform systems. Join our dynamic team as a Software Engineer and take a pivotal role in shaping the future of our Enterprise Data Platform. You will develop and implement data processing applications and frameworks on cloud-based infrastructure, ensuring the efficiency, scalability, and reliability of our systems.
About the Role. In this opportunity as the Software Engineer, you will:
Develop data processing applications and frameworks on cloud-based infrastructure in partnership with Data Analysts and Architects, with guidance from the Lead Software Engineer.
Innovate with new approaches to meet data management requirements.
Make recommendations about platform adoption, including technology integrations, application servers, libraries, AWS frameworks, documentation, and usability by stakeholders.
Contribute to improving the customer experience.
Participate in code reviews to maintain a high-quality codebase.
Collaborate with cross-functional teams to define, design, and ship new features.
Work closely with product owners, designers, and other developers to understand requirements and deliver solutions.
Effectively communicate and liaise across the data platform and management teams.
Stay updated on emerging trends and technologies in cloud computing.
About You. You're a fit for the role of Software Engineer if you meet all or most of these criteria:
Bachelor's degree in Computer Science, Engineering, or a related field.
3+ years of relevant experience in the implementation of data lakes and data management technologies for large-scale organizations.
Experience in building and maintaining data pipelines with excellent run-time characteristics such as low latency, fault tolerance, and high availability.
Proficiency in the Python programming language.
Experience in AWS services and management, including serverless, container, queueing, and monitoring services like Lambda, ECS, API Gateway, RDS, DynamoDB, Glue, S3, IAM, Step Functions, CloudWatch, SQS, and SNS.
Good knowledge of consuming and building APIs.
Business Intelligence tools like Power BI.
Fluency in querying languages such as SQL.
Solid understanding of software development practices such as version control via Git, CI/CD, and release management.
Agile development cadence.
Good critical thinking, communication, documentation, troubleshooting, and collaboration skills.
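A minimal sketch of the serverless data-processing pattern the role describes, assuming a Python Lambda behind an SQS trigger writing to S3; the bucket name and key layout are invented for the example.

```python
import json
import boto3

s3 = boto3.client("s3")
BUCKET = "example-curated-bucket"  # placeholder name

def handler(event, context):
    """Persist each SQS record to S3; `event` follows the standard SQS trigger shape."""
    records = event.get("Records", [])
    for record in records:
        body = json.loads(record["body"])
        key = f"events/{record['messageId']}.json"
        s3.put_object(Bucket=BUCKET, Key=key, Body=json.dumps(body).encode("utf-8"))
    return {"processed": len(records)}
```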

Posted 1 month ago

Apply

4.0 - 6.0 years

7 - 9 Lacs

Bengaluru

Hybrid

We are looking for an experienced Software Engineer - Informatica with 4 to 6 years of hands-on expertise in designing, developing, and optimizing large-scale ETL solutions using Informatica PowerCenter. The ideal candidate will lead ETL projects, mentor junior developers, and ensure high-performance data integration across enterprise systems.
About the Role. In this role as Software Engineer, you will:
- Analyze business and functional requirements to design and implement scalable data integration solutions
- Understand and interpret High-Level Design (HLD) documents and convert them into detailed Low-Level Designs (LLD)
- Develop robust, reusable, and optimized Informatica mappings, sessions, and workflows
- Apply mapping optimization and performance tuning techniques to ensure efficient ETL processes
- Conduct peer code reviews and suggest improvements for reliability and performance
- Prepare and execute comprehensive unit test cases and support system/integration testing
- Maintain detailed technical documentation, including LLDs, data flow diagrams, and test cases
- Build data pipelines and transformation logic in Snowflake, ensuring performance and scalability
- Develop and manage Unix shell scripts for automation, scheduling, and monitoring of ETL jobs
- Collaborate with cross-functional teams to support UAT, deployments, and production issues
About You. You are a fit for this position if your background includes:
- 4-6 years of strong hands-on experience with Informatica PowerCenter
- Proficiency in developing and optimizing ETL mappings, workflows, and sessions
- Solid experience with performance tuning techniques and best practices in ETL processes
- Hands-on experience with Snowflake for data loading, SQL transformations, and optimization
- Strong skills in Unix/Linux scripting for job automation
- Experience in converting HLDs into LLDs and defining unit test cases
- Knowledge of data warehousing concepts, data modelling, and data quality frameworks
Good to Have:
- Knowledge of the Salesforce data model and integration (via Informatica or API-based solutions)
- Exposure to AWS cloud services like S3, Glue, Redshift, Lambda, etc.
- Familiarity with relational databases such as SQL Server and PostgreSQL
- Experience with job schedulers like Control-M, ESP, or equivalent
- Agile methodology experience and tools such as JIRA, Confluence, and Git
- Knowledge of DBT (Data Build Tool) for data transformation and orchestration
- Experience with Python scripting for data manipulation, automation, or integration tasks
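Informatica PowerCenter itself is a GUI tool, but the Snowflake pipeline side of the role can be sketched in Python with the official Snowflake connector; the account, credentials, stage, and table names below are all placeholders.

```python
import snowflake.connector  # pip install snowflake-connector-python

# All connection values are placeholders for illustration.
conn = snowflake.connector.connect(
    account="xy12345.ap-south-1",
    user="ETL_USER",
    password="***",
    warehouse="ETL_WH",
    database="ANALYTICS",
    schema="STAGING",
)

with conn.cursor() as cur:
    # Typical bulk-load step; assumes files were already uploaded to the named stage.
    cur.execute("COPY INTO staging_orders FROM @orders_stage FILE_FORMAT = (TYPE = CSV)")
    cur.execute("SELECT COUNT(*) FROM staging_orders")
    print(cur.fetchone()[0])
conn.close()
```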

Posted 1 month ago

Apply

10.0 - 15.0 years

11 - 20 Lacs

Bengaluru

Work from Office

About Client: Hiring for one of the most prestigious multinational corporations.
Job Title: Senior AWS Engineer
Experience: 10 to 15 years
Key Responsibilities:
Design and implement AWS-based infrastructure solutions for scalability, performance, and security.
Lead cloud architecture discussions, guiding development and operations teams in best practices.
Automate infrastructure provisioning using tools like Terraform, CloudFormation, or AWS CDK.
Implement and manage CI/CD pipelines (e.g., Jenkins, CodePipeline, GitHub Actions).
Ensure cost optimization, monitoring, and governance for AWS accounts.
Collaborate with security teams to enforce compliance and governance policies across cloud environments.
Handle migration of on-premise workloads to AWS cloud (rehost, replatform, refactor).
Provide mentorship to junior engineers and participate in code reviews and design sessions.
Maintain high availability, disaster recovery, and backup strategies.
Stay updated with the latest AWS services and architecture trends.
Technical Skills:
Strong hands-on experience with core AWS services: EC2, S3, RDS, Lambda, IAM, VPC, CloudWatch, CloudTrail, ECS/EKS, etc.
Expert in Infrastructure as Code (IaC) using Terraform, CloudFormation, or AWS CDK.
Strong scripting and automation skills in Python, Bash, or Shell.
Experience with containerization and orchestration tools (Docker, Kubernetes/EKS).
Solid understanding of networking, load balancing, and security concepts in the cloud.
Experience with monitoring/logging tools like CloudWatch, Prometheus, Grafana, or the ELK stack.
Knowledge of DevOps and CI/CD tools (Jenkins, GitLab CI, AWS CodePipeline, etc.).
Familiarity with Agile/Scrum methodologies.
Notice period: only immediate and 15-day joiners.
Location: Bangalore
Mode of Work: WFO (Work From Office)
Thanks & Regards,
SWETHA
Black and White Business Solutions Pvt. Ltd.
Bangalore, Karnataka, INDIA.
Contact Number: 8067432433
rathy@blackwhite.in | www.blackwhite.in
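Since the role lists AWS CDK among the IaC options, a minimal CDK-in-Python sketch of a single stack; the stack name, logical IDs, and bucket settings are illustrative, not from the posting.

```python
from aws_cdk import App, Stack, aws_s3 as s3
from constructs import Construct

class BaselineStack(Stack):
    """Illustrative stack: one versioned, encrypted bucket for logs."""

    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)
        s3.Bucket(
            self,
            "LogArchive",  # logical ID is illustrative
            versioned=True,
            encryption=s3.BucketEncryption.S3_MANAGED,
        )

app = App()
BaselineStack(app, "BaselineStack")
app.synth()
```

`cdk deploy` against this app would synthesize and apply the CloudFormation template, the same execution-plan-then-apply flow the posting expects from Terraform users.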

Posted 1 month ago

Apply

3.0 - 5.0 years

4 - 7 Lacs

Hyderabad

Work from Office

Data Analysis: Conduct in-depth analysis of data to identify trends, anomalies, and opportunities, utilizing SQL, AWS, and Python to extract and manipulate data.
Business Transformation: Translate existing SQL queries into business transformation logic, enabling the conversion of raw data into actionable insights to drive strategic decision-making.
Requirements Gathering: Collaborate with business stakeholders to gather and document clear and concise business requirements, ensuring a thorough understanding of data needs.
Documentation: Develop and maintain documentation related to data analysis, transformation, and reporting processes, ensuring knowledge transfer and continuity.
AWS Integration: Leverage AWS services to facilitate data extraction, storage, and analysis, making data readily available for the business.
Quality Assurance: Implement data quality checks and validation processes to ensure the accuracy and integrity of data used in analyses.
Qualifications:
Bachelor's degree in Business, Computer Science, or a related field.
Proven experience as a Business Analyst with a strong focus on data analysis and transformation.
Proficiency in SQL for querying and manipulating relational databases.
Awareness of AWS services such as Redshift, S3, Athena, Lambda, Step Functions, and AWS Batch.
Proficiency in Python for data analysis and scripting.
Experience in converting SQL queries into actionable business transformation logic.
Strong problem-solving and critical-thinking skills.
Excellent communication and interpersonal skills to work effectively with cross-functional teams and stakeholders.
Attention to detail and a commitment to data accuracy and quality.
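To make the SQL-to-Python analysis loop above concrete, a self-contained sketch that uses sqlite3 as a stand-in for the real warehouse; the table and columns are invented.

```python
import sqlite3
import pandas as pd

# Self-contained stand-in for a real Redshift/Athena connection.
conn = sqlite3.connect(":memory:")
conn.executescript(
    """
    CREATE TABLE sales (region TEXT, amount REAL);
    INSERT INTO sales VALUES ('North', 120.0), ('North', 80.0), ('South', 200.0);
    """
)

# Transformation logic expressed once in SQL, then consumed in Python for analysis.
df = pd.read_sql_query(
    "SELECT region, SUM(amount) AS total FROM sales GROUP BY region", conn
)
df["share"] = df["total"] / df["total"].sum()
print(df)
```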

Posted 1 month ago

Apply

6.0 - 11.0 years

15 - 30 Lacs

Hyderabad

Work from Office

Design, develop, and maintain scalable microservices using Java, Kotlin, and Spring Boot. Build and optimize data models and queries in MongoDB. Integrate and manage Apache Kafka for real-time data streaming and messaging. Implement CI/CD pipelines using Jenkins.
Required Candidate Profile: 6+ years of experience in backend development with Java/Spring Boot. Experience with Jenkins for CI/CD automation. Familiarity with AWS services (EC2, S3, Lambda, RDS, etc.) or OpenShift for container orchestration.
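The posting's stack is Java/Kotlin, but the Kafka publish step it describes looks like the following when sketched in Python with the kafka-python client; the broker address, topic, and payload are placeholders.

```python
import json
from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # placeholder broker
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

# Publish one event to a hypothetical topic; a consumer service would pick it up.
producer.send("orders.events", {"order_id": 42, "status": "CREATED"})
producer.flush()
```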

Posted 1 month ago

Apply

8.0 - 12.0 years

25 - 40 Lacs

Chennai, Bengaluru, Delhi / NCR

Work from Office

AWS administration experience involving design, Landing Zone deployment, migration, and optimization. Design and develop AWS cloud solutions. Create architectural blueprints, diagrams, and documentation. Hands-on experience with AWS Terraform and CloudFormation automation.

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Pune, Chennai, Bengaluru

Hybrid

5-8 years of experience in backend development with a strong focus on Python. Develop and maintain serverless applications using AWS Lambda functions, DynamoDB, and other AWS services. Hands-on experience with Terraform.
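A minimal sketch of the Lambda-plus-DynamoDB pattern this role centres on, assuming a Python handler and a hypothetical `orders` table.

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("orders")  # placeholder table name

def handler(event, context):
    """Minimal write path: store the incoming payload keyed by its id."""
    table.put_item(
        Item={"order_id": event["order_id"], "status": event.get("status", "NEW")}
    )
    return {"ok": True}
```

The function, table, and IAM permissions would typically be provisioned together via Terraform, which is the other skill the posting names.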

Posted 1 month ago

Apply

5.0 - 10.0 years

15 - 22 Lacs

Pune, Chennai, Bengaluru

Hybrid

Experience: 5-12 years. Notice period: immediate, or 30-45 days if serving. Role: Python Developer + AWS (Lambda) + DynamoDB. 5-8 years of experience in backend development with a strong focus on Python. Proven experience with AWS serverless technologies, including Lambda and DynamoDB.

Posted 1 month ago

Apply

6.0 - 11.0 years

1 - 2 Lacs

Pune

Work from Office

Role & Responsibilities:
Proficiency in Python and PySpark for data processing and transformation tasks.
Solid experience with AWS Glue for ETL jobs and managing data workflows.
Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration.
Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2.
Technical Skills:
Deep understanding of ETL concepts and best practices.
Strong knowledge of SQL for querying and manipulating relational and semi-structured data.
Experience with Data Warehousing and Big Data technologies, specifically within AWS.
Additional Skills:
Experience with AWS Lambda for serverless data processing and orchestration.
Understanding of AWS Redshift for data warehousing and analytics.
Familiarity with Data Lakes, Amazon EMR, and Kinesis for streaming data processing.
Knowledge of data governance practices, including data lineage and auditing.
Familiarity with CI/CD pipelines and Git for version control.
Experience with Docker and containerization for building and deploying applications.
Design and Build Data Pipelines: Design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes.
ETL Development: Develop and maintain Extract, Transform, and Load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets.
Data Workflow Automation: Build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs.
Data Integration: Work with different AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms.
Optimization and Scaling: Optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services like Lambda, S3, and EC2.
Preferred Candidate Profile:
Skill/tech stack: AWS Data Engineer, Python, PySpark, SQL, Data Pipeline, AWS, AWS Glue, Lambda.
Experience: 6-8 years. Location: Pune. Notice period: immediate to 1-week joiners only.
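For orientation, a minimal AWS Glue job skeleton in Python of the kind this role would own; the S3 paths and formats are placeholders.

```python
import sys
from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Source/target paths are placeholders for the datasets a real job would use.
frame = glue_context.create_dynamic_frame.from_options(
    connection_type="s3",
    connection_options={"paths": ["s3://example-bucket/raw/"]},
    format="json",
)
glue_context.write_dynamic_frame.from_options(
    frame=frame,
    connection_type="s3",
    connection_options={"path": "s3://example-bucket/curated/"},
    format="parquet",
)
job.commit()
```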

Posted 1 month ago

Apply

3.0 - 6.0 years

3 - 7 Lacs

Hyderabad, Bengaluru

Hybrid

Locations: Hyderabad & Bangalore. Work Mode: Hybrid. Interview Mode: Virtual (2 rounds). Type: Contract-to-Hire (C2H).
Key Skills & Responsibilities:
Hands-on experience with AWS services: S3, Lambda, Glue, API Gateway, and SQS.
Strong data engineering expertise on AWS, with proficiency in Python, PySpark, and SQL.
Experience in batch job scheduling and managing data dependencies across pipelines.
Familiarity with data processing tools such as Apache Spark and Airflow.
Ability to automate repetitive tasks and build reusable frameworks for improved efficiency.
Provide RunOps/DevOps support, and manage the ongoing operation and monitoring of data services.
Ensure high performance, scalability, and reliability of data workflows in cloud environments.
Skills: AWS, S3, Lambda, Glue, API Gateway, SQS, Apache Spark, Airflow, SQL, PySpark, Python, DevOps support.
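One of the operational patterns implied above (SQS-driven plumbing) sketched minimally in Python with boto3; the queue URL is a placeholder.

```python
import boto3

sqs = boto3.client("sqs")
QUEUE_URL = "https://sqs.ap-south-1.amazonaws.com/123456789012/example-queue"  # placeholder

def drain_once() -> int:
    """Receive up to 10 messages, process, and delete them; returns count handled."""
    response = sqs.receive_message(
        QueueUrl=QUEUE_URL, MaxNumberOfMessages=10, WaitTimeSeconds=10
    )
    messages = response.get("Messages", [])
    for message in messages:
        print("processing", message["Body"])  # real transformation logic would go here
        sqs.delete_message(QueueUrl=QUEUE_URL, ReceiptHandle=message["ReceiptHandle"])
    return len(messages)
```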

Posted 1 month ago

Apply

1.0 - 6.0 years

3 - 8 Lacs

Bengaluru

Work from Office

Responsibilities:
Design and develop new features to meet evolving business and technical needs.
Maintain and enhance existing functionality to ensure reliability and performance.
Collaborate directly with customers to gather requirements and understand business objectives.
Stay up to date with the latest technologies and apply them to influence project decisions and outcomes.
Requirements:
1+ years of experience in developing commercial applications on .NET.
Good understanding of the Software Development Lifecycle.
Understanding of C#, including .NET 6/8 and .NET Framework 4.8.
Good knowledge of and experience with Azure (Azure Functions, VMs, Cosmos DB, Azure SQL) or AWS (EC2, Lambda, S3, DynamoDB).
Skills in front-end web development (React, Angular, TypeScript).
Substantial knowledge of relational and non-relational databases.
Good knowledge of Event-Driven Architecture (CQRS & Event Sourcing), Domain-Driven Design, and Microservices.
Experience working with CI/CD (Azure DevOps, AWS CodePipeline).
Experience with testing tools and techniques.
Good spoken English (at least B1 level according to the CEFR).

Posted 1 month ago

Apply

8.0 - 13.0 years

15 - 25 Lacs

Gurugram

Remote

Minimum 6 years of hands-on experience deploying, enhancing, and troubleshooting foundational AWS services (EC2, S3, RDS, VPC, CloudTrail, CloudFront, Lambda, EKS, ECS, etc.)
• 3+ years of experience with serverless technologies, services, and container technologies (Docker, Kubernetes, etc.)
  o Manage Kubernetes charts using Helm.
  o Manage production application deployments in Kubernetes clusters using kubectl.
  o Expertise in deploying distributed apps with containers (Docker) and orchestration (Kubernetes/EKS).
  o Experience with infrastructure-as-code tools for provisioning and managing Kubernetes infrastructure.
  o (Preferred) Certification in container orchestration systems and/or Certified Kubernetes Administrator.
  o Experience with log management and analytics tools such as Splunk or ELK.
• 3+ years of experience writing, debugging, and enhancing Terraform as infrastructure as code, creating scripts for EKS, EC2, S3, and other AWS services.
  o Expertise with key Terraform features such as infrastructure as code, execution plans, resource graphs, and change automation.
  o Implemented cluster services using Kubernetes and Docker, managing local deployments by building self-hosted Kubernetes clusters with Terraform.
  o Managed provisioning of AWS infrastructure using Terraform.
  o Develop and maintain infrastructure-as-code solutions using Terraform.
• Ability to write scripts in JavaScript, Bash, Python, TypeScript, or similar languages.
• Able to work independently and as part of a team to architect and implement new solutions and technologies.
• Very strong written and verbal communication skills; able to communicate with all levels of employees and management, capable of successful formal and informal communication, speaking and writing clearly and understandably at the right level.
• Ability to identify, evaluate, learn, and POC new technologies for implementation.
• Experience in designing and implementing highly resilient AWS solutions.
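As a small taste of the Kubernetes-facing automation described above, a Python sketch using the official kubernetes client; it assumes a kubeconfig already points at the EKS cluster (e.g., via `aws eks update-kubeconfig`).

```python
from kubernetes import client, config  # pip install kubernetes

# Assumes an existing kubeconfig context for the target cluster.
config.load_kube_config()
v1 = client.CoreV1Api()

# Quick operational snapshot: list every pod and its phase across namespaces.
for pod in v1.list_pod_for_all_namespaces(watch=False).items:
    print(f"{pod.metadata.namespace}/{pod.metadata.name}: {pod.status.phase}")
```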

Posted 1 month ago

Apply

3.0 - 5.0 years

1 - 3 Lacs

Chennai

Work from Office

AWS Infrastructure Management: Design, implement, and maintain scalable, secure cloud infrastructure using AWS services (EC2, Lambda, S3, RDS, CloudFormation/Terraform, etc.). Monitor and optimize cloud resource usage and costs.
CI/CD Pipeline Automation: Set up and maintain robust CI/CD pipelines using tools such as GitHub Actions, GitLab CI, Jenkins, or AWS CodePipeline. Ensure smooth deployment processes for staging and production environments.
Git Workflow Management: Implement and enforce best practices for version control and branching strategies (Gitflow, trunk-based development, etc.). Support development teams in resolving Git issues and improving workflows.
Twilio Integration & Support: Manage and maintain Twilio-based communication systems (SMS, Voice, WhatsApp, Programmable Messaging). Develop and deploy Twilio Functions and Studio Flows for customer engagement. Monitor communication systems and troubleshoot delivery or quality issues.
Infrastructure as Code & Automation: Use tools like Terraform, CloudFormation, or Pulumi for reproducible infrastructure. Create scripts and automation tools to streamline routine DevOps tasks.
Monitoring, Logging & Security: Implement and maintain monitoring/logging tools (CloudWatch, Datadog, ELK, etc.). Ensure adherence to best practices around IAM, secrets management, and compliance.
Requirements:
3-5+ years of experience in DevOps or a similar role.
Expert-level experience with Amazon Web Services (AWS).
Strong command of Git and Git-based CI/CD practices.
Experience building and supporting solutions using Twilio APIs (SMS, Voice, Programmable Messaging, etc.).
Proficiency in scripting languages (Bash, Python, etc.).
Hands-on experience with containerization (Docker) and orchestration tools (ECS, EKS, Kubernetes).
Familiarity with Agile/Scrum workflows and collaborative development environments.
Preferred Qualifications:
AWS certifications (e.g., Solutions Architect, DevOps Engineer).
Experience with serverless frameworks and event-driven architectures.
Previous work with other communication platforms (e.g., SendGrid, Nexmo) a plus.
Knowledge of RESTful API development and integration.
Experience working in high-availability, production-grade systems.
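For the Twilio integration work above, a minimal Python sketch with the official twilio client; credentials are read from the environment, and the phone numbers are placeholders.

```python
import os
from twilio.rest import Client  # pip install twilio

# SID/token come from the environment; never hard-code real credentials.
client = Client(os.environ["TWILIO_ACCOUNT_SID"], os.environ["TWILIO_AUTH_TOKEN"])

message = client.messages.create(
    body="Your order has shipped.",
    from_="+15005550006",  # Twilio-style sender; placeholder value
    to="+15551234567",     # placeholder recipient
)
print(message.sid)
```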

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 17 Lacs

Hyderabad, Pune, Chennai

Work from Office

Airflow Data Engineer on the AWS platform. Job Title: Apache Airflow Data Engineer ("ROLE" as per the TCS Role Master).
• 4-8 years of experience in AWS, Apache Airflow (on the Astronomer platform), Python, PySpark, and SQL.
• Good hands-on knowledge of SQL and the Data Warehousing life cycle is an absolute requirement.
• Experience in creating data pipelines and orchestrating them using Apache Airflow.
• Significant experience with data migrations and development of Operational Data Stores, Enterprise Data Warehouses, Data Lakes, and Data Marts.
• Good to have: experience with cloud ETL and ELT in a tool like DBT, Glue, EMR, or Matillion, or any other ELT tool.
• Excellent communication skills to liaise with Business & IT stakeholders.
• Expertise in planning the execution of a project and effort estimation.
• Exposure to Agile ways of working.
Candidates for this position will be offered with TAIC or TCSL as the entity.
Keywords: data warehousing, PySpark, GitHub, AWS data platform, Glue, EMR, Redshift, Databricks, Data Marts, DBT/Glue/EMR or Matillion, data engineering, data modelling, data consumption.
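A minimal Airflow DAG sketch of the orchestration pattern this role calls for; the DAG id, schedule, and task bodies are illustrative.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract():
    print("pull from source")  # stand-in for a real extraction step

def load():
    print("write to warehouse")  # stand-in for a real load step

with DAG(
    dag_id="example_daily_pipeline",  # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task  # dependency: extract runs before load
```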

Posted 2 months ago

Apply

6.0 - 10.0 years

20 - 25 Lacs

Bengaluru

Work from Office

We are looking for a skilled and proactive AWS Operational Support Analyst to join our cloud infrastructure team. The ideal candidate will be responsible for monitoring, maintaining, and improving the performance, security, and reliability of AWS-hosted environments. This role is essential in ensuring uninterrupted cloud operations and supporting DevOps, development, and business teams with cloud-related issues.
Key Responsibilities:
Monitor AWS cloud infrastructure for performance, availability, and operational issues.
Manage incident response, root cause analysis, and resolution of infrastructure-related issues.
Execute daily operational tasks including backups, system patching, and performance tuning.
Collaborate with DevOps and engineering teams to ensure smooth CI/CD operations.
Maintain system documentation and the support knowledge base.
Automate routine tasks using shell scripts or AWS tools (e.g., Lambda, Systems Manager).
Manage AWS services such as EC2, RDS, S3, CloudWatch, IAM, and VPC.
Implement cloud cost-optimization practices and security compliance controls.
Perform health checks, generate reports, and suggest performance improvements.
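The health-check side of this role might look like the following boto3 sketch pulling an EC2 CPU metric from CloudWatch; the instance id is a placeholder.

```python
from datetime import datetime, timedelta

import boto3

cloudwatch = boto3.client("cloudwatch")

def average_cpu(instance_id: str, hours: int = 1) -> float:
    """Average EC2 CPU utilization over the last N hours."""
    now = datetime.utcnow()
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        StartTime=now - timedelta(hours=hours),
        EndTime=now,
        Period=300,
        Statistics=["Average"],
    )
    points = stats["Datapoints"]
    return sum(p["Average"] for p in points) / len(points) if points else 0.0

print(average_cpu("i-0123456789abcdef0"))  # placeholder instance id
```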

Posted 2 months ago

Apply

8.0 - 13.0 years

10 - 15 Lacs

Bengaluru

Work from Office

Qualification & Experience: Minimum of 8 years of experience as a Data Scientist/Engineer with demonstrated expertise in data engineering and cloud computing technologies.
Technical Responsibilities:
Excellent proficiency in Python, with a strong focus on developing advanced skills.
Extensive exposure to NLP and image processing concepts.
Proficient in version control systems like Git.
In-depth understanding of Azure deployments.
Expertise in OCR, ML model training, and transfer learning.
Experience working with unstructured data formats such as PDFs, DOCX, and images.
Strong familiarity with data science best practices and the ML lifecycle.
Strong experience with data pipeline development, ETL processes, and data engineering tools such as Apache Airflow, PySpark, or Databricks.
Familiarity with cloud computing platforms like Azure, AWS, or GCP, including services like Azure Data Factory, S3, Lambda, and BigQuery.
Tool Exposure: Advanced understanding and hands-on experience with Git, Azure, Python, R programming, and data engineering tools such as Snowflake, Databricks, or PySpark.
Data Mining, Cleaning, and Engineering: Leading the identification and merging of relevant data sources, ensuring data quality, and resolving data inconsistencies.
Cloud Solutions Architecture: Designing and deploying scalable data engineering workflows on cloud platforms such as Azure, AWS, or GCP.
Data Analysis: Executing complex analyses against business requirements using appropriate tools and technologies.
Software Development: Leading the development of reusable, version-controlled code under minimal supervision.
Big Data Processing: Developing solutions to handle large-scale data processing using tools like Hadoop, Spark, or Databricks.
Principal Duties & Key Responsibilities:
Leading data extraction from multiple sources, including PDFs, images, databases, and APIs.
Driving optical character recognition (OCR) processes to digitize data from images.
Applying advanced natural language processing (NLP) techniques to understand complex data.
Developing and implementing highly accurate statistical models and data engineering pipelines to support critical business decisions, and continuously monitoring their performance.
Designing and managing scalable cloud-based data architectures using Azure, AWS, or GCP services.
Collaborating closely with business domain experts to identify and drive key business value drivers.
Documenting model design choices, algorithm selection processes, and dependencies.
Effectively collaborating in cross-functional teams within the CoE and across the organization.
Proactively seeking opportunities to contribute beyond assigned tasks.
Required Competencies:
Exceptional communication and interpersonal skills.
Proficiency in Microsoft Office 365 applications.
Ability to work independently, demonstrate initiative, and provide strategic guidance.
Strong networking, communication, and people skills.
Outstanding organizational skills with the ability to work independently and as part of a team.
Excellent technical writing skills.
Effective problem-solving abilities.
Flexibility and adaptability to work flexible hours as required.
Key Competencies / Values:
Client Focus: Tailoring skills and understanding client needs to deliver exceptional results.
Excellence: Striving for excellence defined by clients, delivering high-quality work.
Trust: Building and retaining trust with clients, colleagues, and partners.
Teamwork: Collaborating effectively to achieve collective success.
Responsibility: Taking ownership of performance and safety, ensuring accountability.
People: Creating an inclusive environment that fosters individual growth and development.
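The OCR duty above, sketched minimally in Python, assuming pytesseract as the engine (the posting names OCR but no specific library); the image path is a placeholder.

```python
from PIL import Image   # pip install pillow
import pytesseract      # pip install pytesseract; also needs the tesseract binary installed

def extract_text(image_path: str) -> str:
    """Digitize text from a scanned page; the path is a placeholder."""
    return pytesseract.image_to_string(Image.open(image_path))

if __name__ == "__main__":
    print(extract_text("scanned_invoice.png"))  # hypothetical input file
```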

Posted 2 months ago

Apply

8.0 - 10.0 years

8 - 18 Lacs

Gurugram

Work from Office

Oracle PL/SQL. Key Responsibilities:
Develop and maintain complex PL/SQL procedures, packages, triggers, functions, and views in Oracle.
Hands-on experience in PostgreSQL and AWS is a must.
Migrate and refactor PL/SQL logic and data from Oracle to PostgreSQL.
Optimize SQL queries and ensure high-performance database access across platforms.
Design and implement data models, schemas, and stored procedures in PostgreSQL.
Work closely with application developers, data architects, and DevOps teams to ensure seamless database integration with applications.
Develop scripts and utilities to automate database tasks, backups, and monitoring using AWS services.
Leverage AWS cloud services such as RDS, S3, Lambda, and Glue for data processing and storage.
Participate in code reviews, performance tuning, and troubleshooting of database-related issues.
Ensure data integrity, consistency, and security across environments.
Interested candidates, share your resume (madhumithak@sightspectrum.in).
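On the PostgreSQL side of the migration work above, an Oracle PL/SQL procedure often becomes a PostgreSQL function. A self-contained Python sketch with psycopg2 (connection details, table, and function are all placeholders):

```python
import psycopg2  # pip install psycopg2-binary

# Connection values are placeholders; real ones would point at the RDS instance.
conn = psycopg2.connect(host="localhost", dbname="appdb", user="app", password="secret")

with conn, conn.cursor() as cur:
    cur.execute("CREATE TABLE IF NOT EXISTS order_lines (order_id int, amount numeric)")
    cur.execute("INSERT INTO order_lines VALUES (42, 10.5), (42, 4.5)")
    # A SQL-language function standing in for migrated PL/SQL logic.
    cur.execute(
        """
        CREATE OR REPLACE FUNCTION order_total(p_order_id integer)
        RETURNS numeric LANGUAGE sql AS
        $$ SELECT COALESCE(SUM(amount), 0) FROM order_lines WHERE order_id = p_order_id $$;
        """
    )
    cur.execute("SELECT order_total(%s)", (42,))
    print(cur.fetchone()[0])  # 15.0
```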

Posted 2 months ago

Apply

5.0 - 10.0 years

20 - 25 Lacs

Bengaluru, Mumbai (All Areas)

Work from Office

Designation: Python + AWS
Experience: 5+ Years
Work Location: Bangalore / Mumbai
Notice Period: Immediate Joiners / Serving Notice Period
Job Description. Mandatory Skills:
Python data structures: pandas, numpy.
Data operations: DataFrames, dicts, JSON, lists, tuples, strings.
OOP & APIs (Flask/FastAPI).
AWS services (IAM, EC2, Lambda, S3, DynamoDB, etc.).
Sincerely, Sonia TS
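The data-operations round trip this posting asks about, sketched in a few lines; the records and column names are invented.

```python
import json

import numpy as np
import pandas as pd

# list of dicts -> DataFrame -> derived column -> back to plain dicts/JSON.
records = [
    {"city": "Bangalore", "temp_c": 24.5},
    {"city": "Mumbai", "temp_c": 30.1},
]
df = pd.DataFrame(records)
df["temp_f"] = df["temp_c"] * 9 / 5 + 32            # vectorized, numpy-backed math
hottest = df.loc[df["temp_c"].idxmax(), "city"]      # label-based selection
payload = json.loads(df.to_json(orient="records"))   # back to dicts, e.g. for an API response
print(hottest, payload)
```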

Posted 2 months ago

Apply

5.0 - 8.0 years

12 - 22 Lacs

Bengaluru

Work from Office

Role & Responsibilities:
- Manage and monitor AWS cloud infrastructure, including EC2, S3, VPC, RDS, Lambda, and more.
- Implement and maintain Ubuntu Linux servers and applications.
- Monitor system performance, conduct backups, and address potential issues.
- Set up and maintain MySQL databases, optimizing performance and ensuring data integrity.
- Collaborate with development teams to design, develop, and deploy secure cloud-based applications.
- Implement and maintain cloud security best practices.
- Provide technical support and guidance on cloud infrastructure and related technologies.
- Stay updated on industry trends and best practices.
Preferred Candidate Profile:
- Bachelor's degree in Computer Science, IT, or a related field.
- 5-8 years of overall experience, with a minimum of 3 years in AWS cloud services.
- Strong Ubuntu Linux administration skills.
- Familiarity with AWS services and cloud security best practices.
- Strong problem-solving skills and the ability to work independently and in a team.
- Excellent communication skills.
- Basic understanding of MySQL database administration is a plus.
- Relevant AWS certifications are a plus.
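The backup duty above could be automated along these lines with boto3, snapshotting every EBS volume attached to an instance; the instance id is a placeholder.

```python
import boto3

ec2 = boto3.client("ec2")

def snapshot_instance_volumes(instance_id: str) -> list[str]:
    """Create a snapshot of each EBS volume attached to the given instance."""
    reservations = ec2.describe_instances(InstanceIds=[instance_id])["Reservations"]
    snapshot_ids = []
    for reservation in reservations:
        for instance in reservation["Instances"]:
            for mapping in instance.get("BlockDeviceMappings", []):
                volume_id = mapping["Ebs"]["VolumeId"]
                snap = ec2.create_snapshot(
                    VolumeId=volume_id, Description=f"backup of {instance_id}"
                )
                snapshot_ids.append(snap["SnapshotId"])
    return snapshot_ids

print(snapshot_instance_volumes("i-0123456789abcdef0"))  # placeholder id
```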

Posted 2 months ago

Apply

5.0 - 10.0 years

7 - 14 Lacs

Chennai, Bengaluru

Work from Office

Job Summary: Synechron is seeking a skilled Full Stack Developer to join our innovative technology team. This position focuses on designing, developing, and maintaining high-performance, scalable web applications using Next.js and related modern technologies. As a key contributor, you will collaborate with cross-disciplinary teams to deliver responsive and user-centric solutions that support the organization's digital growth and strategic objectives. Your expertise will help ensure the delivery of seamless, secure, and efficient web experiences for our clients and stakeholders.
Software Requirements. Required Skills and Experience:
Proficiency in Next.js, React, and modern JavaScript/TypeScript frameworks.
Strong experience with .NET Core, C#, and building scalable web APIs.
Hands-on experience designing and consuming GraphQL APIs.
Practical knowledge of AWS services such as EC2, S3, Lambda, and RDS.
Familiarity with version control systems, particularly Git.
Experience with CI/CD pipelines and automation tools like Jenkins or TeamCity.
Working knowledge of Agile frameworks and tools such as Jira and Confluence.
Preferred Skills:
Containerization skills with Docker and Kubernetes.
Knowledge of testing frameworks for unit and integration testing.
Understanding of security best practices and data protection regulations.
Overall Responsibilities:
Develop, enhance, and maintain web applications leveraging Next.js for front-end and .NET for back-end components.
Build, optimize, and consume RESTful and GraphQL APIs to enable efficient data exchange.
Deploy, monitor, and scale cloud-based applications using AWS services, ensuring high availability and performance standards.
Collaborate actively with UX/UI designers, product managers, and fellow developers to deliver high-quality solutions.
Participate in code reviews, pair programming, and the adoption of best coding practices.
Continuously evaluate emerging technologies and recommend improvements for application architecture and performance.
Contribute to project planning, documentation, and technical decision-making for application features and integrations.
Technical Skills (by Category):
Programming Languages. Required: JavaScript (including TypeScript), C#. Preferred: additional JavaScript frameworks/libraries, such as Redux or MobX.
Databases / Data Management. Required: experience with relational databases such as MSSQL or Oracle, and NoSQL solutions like MongoDB.
Cloud Technologies. Required: AWS (EC2, S3, Lambda, RDS). Preferred: Azure cloud platform expertise.
Frameworks and Libraries. Required: Next.js, React. Preferred: state management libraries, testing frameworks like Jest or Mocha.
Development Tools and Methodologies. Required: Git, CI/CD tools (Jenkins, TeamCity), version control practices. Preferred: containerization with Docker, orchestration with Kubernetes.
Other: familiarity with Agile/Scrum processes using Jira and Confluence.
Security & Compliance: understanding of secure coding practices, data privacy, and compliance regulations relevant to web development.
Experience Requirements:
5 to 12 years of experience in full-stack web development, with demonstrable expertise in Next.js and .NET technologies.
Proven track record in developing scalable, production-grade web applications.
Experience working within Agile environments, participating in sprint planning and continuous delivery.
Industry experience in fintech, e-commerce, or enterprise solutions is a plus but not mandatory.
Prior leadership or mentoring experience is advantageous.
Day-to-Day Activities:
Architect, develop, and maintain feature-rich, responsive web applications.
Collaborate with cross-functional teams on feature design, implementation, and testing.
Develop and optimize APIs and facilitate data integration across systems.
Conduct code reviews, unit testing, and performance tuning to ensure code quality.
Manage deployment processes and monitor application health in cloud environments.
Engage in regular stand-ups, planning sessions, and technical discussions.
Identify, troubleshoot, and resolve software defects and performance issues promptly.
Qualifications:
Bachelor's or Master's degree in Computer Science, Software Engineering, Information Technology, or a related field.
Certifications in cloud technologies (e.g., AWS Certified Solutions Architect) or web development are a plus.
Evidence of continuous learning through industry certifications, courses, or self-driven projects.
Strong portfolio demonstrating previous work with Next.js, React, and cloud-based application deployment.
Professional Competencies:
Strong analytical and problem-solving skills to address complex technical challenges.
Effective communication and stakeholder management abilities.
Leadership qualities in mentoring team members and driving technical discussions.
Ability to adapt quickly to changing project requirements and technological advances.
Innovation-driven mindset to explore new tools, frameworks, and best practices.
Strong organizational skills for managing multiple tasks and meeting deadlines.

Posted 2 months ago

Apply

Start Your Job Search Today

Browse through a variety of job opportunities tailored to your skills and preferences. Filter by location, experience, salary, and more to find your perfect fit.

Job Application AI Bot

Apply to 20+ Portals in one click

Download Now

Download the Mobile App

Instantly access job listings, apply easily, and track applications.

Featured Companies