5.0 - 10.0 years
12 - 22 Lacs
Pune, Chennai, Bengaluru
Hybrid
Role: Gen AI Developer / AI-ML / ML Operations / Data Science
Experience: 4 to 11 years
Locations: Bangalore/Chennai/Pune/Kolkata
Notice Period: Immediate to 30 days
Mandatory Skills: Gen AI, LLMs, RAG, LangChain, Mistral, Llama, vector DBs, Azure/GCP/Lambda, Python, TensorFlow, PyTorch
Preferred Skills: GPT-4, NumPy, Pandas, Keras, Databricks, Pinecone/Chroma/Weaviate, Scale/Labelbox

Job Description / Roles & Responsibilities: We are looking for a strong Python developer with knowledge of machine learning and deep learning frameworks. Own the entire prompt life cycle: prompt design, prompt template creation, and prompt tuning/optimization for various GenAI base models. Design and develop prompts suited to project needs. Manage stakeholders across business and domains as required by the projects. Evaluate base models and benchmark performance. Implement prompt guardrails to prevent attacks such as prompt injection, jailbreaking, and prompt leaking (a minimal sketch follows). Develop, deploy, and maintain auto-prompt solutions. Design and implement minimum design standards for every use case involving prompt engineering. Train machine learning and deep learning models. Write reusable, testable, and efficient Python code. Design and implement low-latency, high-availability, performant applications. Implement security and data protection. Integrate data storage solutions and API gateways. Handle production change deployment and related support. Interested candidates can share their updated CV to pravallika@wrootsglobal.in
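For illustration, here is a minimal sketch of the kind of prompt template and prompt-injection guardrail this role describes. The deny-list patterns, template, and function names are hypothetical and not tied to any specific framework; a production guardrail would combine classifiers and policy checks, not a regex deny-list alone.

```python
import re

# Hypothetical deny-list of phrasings associated with prompt-injection attempts.
INJECTION_PATTERNS = [
    r"ignore (all )?previous instructions",
    r"reveal your system prompt",
    r"you are now (in )?developer mode",
]

# Illustrative prompt template for a RAG-style question-answering flow.
PROMPT_TEMPLATE = (
    "You are a helpful assistant. Answer using only the context below.\n"
    "Context:\n{context}\n\nQuestion: {question}"
)

def is_suspicious(user_input: str) -> bool:
    """Flag inputs matching known injection phrasings."""
    lowered = user_input.lower()
    return any(re.search(p, lowered) for p in INJECTION_PATTERNS)

def build_prompt(context: str, question: str) -> str:
    """Render the template, refusing inputs that look like injection attempts."""
    if is_suspicious(question):
        raise ValueError("Potential prompt-injection attempt blocked")
    return PROMPT_TEMPLATE.format(context=context, question=question)

if __name__ == "__main__":
    print(build_prompt("LangChain provides prompt templates.", "What does LangChain provide?"))
```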
Posted 13 hours ago
3.0 - 6.0 years
20 - 30 Lacs
Bengaluru
Work from Office
Job Title: Data Engineer II (Python, SQL)
Experience: 3 to 6 years
Location: Bangalore, Karnataka (work from office, 5 days a week)

As a Data Engineer II, you will design, build, and maintain scalable data pipelines. You'll collaborate across data analytics, marketing, data science, and product teams to drive insights and AI/ML integration on robust, efficient data infrastructure.

Key Responsibilities:
- Design, develop, and maintain end-to-end data pipelines (ETL/ELT).
- Ingest, clean, transform, and curate data for analytics and ML usage.
- Work with orchestration tools like Airflow to schedule and manage workflows (see the sketch below).
- Implement data extraction using batch, CDC, and real-time tools (e.g., Debezium, Kafka Connect).
- Build data models and enable real-time and batch processing using Spark and AWS services.
- Collaborate with DevOps and architects on system scalability and performance.
- Optimize Redshift-based data solutions for performance and reliability.

Must-Have Skills & Experience:
- 3+ years in data engineering or data science with strong ETL and pipeline experience.
- Expertise in Python and SQL.
- Strong experience in data warehousing, data lakes, data modeling, and ingestion.
- Working knowledge of Airflow or similar orchestration tools.
- Hands-on data extraction experience (CDC and batch) using Debezium, Kafka Connect, or AWS DMS.
- Experience with AWS services: Glue, Redshift, Lambda, EMR, Athena, MWAA, SQS, etc.
- Knowledge of Spark or similar distributed systems.
- Experience with queuing/messaging systems like SQS, Kinesis, RabbitMQ.
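As a rough sketch of the Airflow-based orchestration this role calls for, the following Airflow 2.x-style DAG wires hypothetical extract/transform/load steps into a daily schedule; the task callables are placeholders, not a real pipeline.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

# Hypothetical stand-ins for real extract/transform/load logic.
def extract():
    print("pull raw records from the source system")

def transform():
    print("clean and model the records")

def load():
    print("write curated data to the warehouse")

with DAG(
    dag_id="etl_pipeline_sketch",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",  # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> transform_task >> load_task
```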
Posted 14 hours ago
6.0 - 10.0 years
8 - 18 Lacs
Zirakpur
Work from Office
• AWS services (Lambda, Glue, S3, DynamoDB, EventBridge, AppSync, OpenSearch)
• Terraform
• Python
• React/Vite
• Unit testing (Jest, Pytest)
• Software development lifecycle
Posted 17 hours ago
5.0 - 10.0 years
15 - 30 Lacs
Hyderabad
Hybrid
Dear Applicants,

Astrosoft Technologies is back with an exciting job opportunity. Join our growing team. We are hiring a Sr. AWS Data Engineer: Hyderabad (hybrid), 3 tech rounds, early joiners preferred. Apply by email to Karthik.Jangam@astrosofttech.com with your updated resume and the following details: total experience, AWS DE services experience, current location, current company, current CTC, expected CTC, offer in hand (Y/N), notice period (max 20 days), and readiness to relocate to Hyderabad (Y/N).

Company: AstroSoft Technologies (https://www.astrosofttech.com/). Astrosoft is an award-winning company specializing in Data, Analytics, Cloud, AI/ML, Innovation, and Digital. We have a customer-first mindset, take extreme ownership in delivering solutions and projects for our customers, and have consistently been recognized by our clients as a premium partner. We bring top-tier talent, a robust and structured project execution framework, and significant experience built over the years, with an impeccable record of delivering solutions and projects for our clients. Founded in 2004; headquarters in FL, USA; corporate office in Hyderabad, India.

Benefits Program: H1B sponsorship (depends on project and performance), lunch and dinner every day, group health insurance coverage, industry-standard leave policy, skill enhancement certification, hybrid mode.

Job Summary:
- Strong experience with and understanding of streaming architecture and development practices using Kafka, Spark, Flink, etc. (see the sketch below).
- Strong AWS development experience using S3, SNS, SQS, MWAA (Airflow), Glue, DMS, and EMR.
- Strong knowledge of one or more programming languages: Python/Java/Scala (ideally Python).
- Experience using Terraform to build IaC components in AWS.
- Strong experience with ETL tools in AWS; ODI experience is a plus.
- Strong experience with database platforms: Oracle, AWS Redshift.
- Strong experience in SQL tuning, tuning ETL solutions, and physical optimization of databases.
- Very familiar with SRE concepts, including evaluating and implementing monitoring and observability tools like Splunk, Datadog, and CloudWatch, plus job, log, and dashboard concepts for customer support and application health checks.
- Ability to collaborate with our business partners to understand and implement their requirements.
- Excellent interpersonal skills and the ability to build consensus across teams.
- Strong critical thinking and out-of-the-box problem solving.
- Self-motivated and able to perform under pressure.
- AWS certified (preferred).

Thanks & Regards,
Karthik Kumar (HR TAG Lead - India)
Astrosoft Technologies India Private Limited
Contact: +91-8712229084
Email: karthik.jangam@astrosofttech.com
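As a minimal sketch of the Kafka/Spark streaming stack mentioned in the job summary, the following PySpark Structured Streaming job subscribes to a Kafka topic and echoes records to the console. The broker address and topic are placeholders, and a real job would add authentication, checkpointing, and the spark-sql-kafka connector on the classpath.

```python
from pyspark.sql import SparkSession
from pyspark.sql.functions import col

spark = SparkSession.builder.appName("kafka_stream_sketch").getOrCreate()

# Placeholder broker and topic; requires the spark-sql-kafka-0-10 package at runtime.
events = (
    spark.readStream.format("kafka")
    .option("kafka.bootstrap.servers", "broker:9092")
    .option("subscribe", "orders")
    .load()
    .select(col("key").cast("string"), col("value").cast("string"))
)

# Echo the stream to the console; a production job would write to S3/Redshift instead.
query = events.writeStream.format("console").outputMode("append").start()
query.awaitTermination()
```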
Posted 18 hours ago
5.0 - 10.0 years
13 - 16 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
Job Title: Sr. Full Stack Engineer (React / Node.js / Nest.js / AWS / Fabric.js)
Location: Remote (CET time zone overlap required)
Notice Period: Immediate
Type: Full-time, long-term

iSource Services is hiring for one of their USA-based clients for the position of Sr. Full Stack Engineer (React / Node.js / Nest.js / AWS / Fabric.js).

About the Role: We are seeking an experienced Senior Full Stack Engineer for a long-term engagement on a functioning SaaS product in the manufacturing domain. The ideal candidate will have a strong foundation in both frontend and backend development and be comfortable working independently in a remote setup. This position offers an exciting opportunity to contribute to real-world AI and computer vision applications on a funded product that has already onboarded a solid customer base.

Project Overview:
- A functional SaaS product operating in the manufacturing industry.
- The platform has onboarded customers and secured initial funding.
- Involvement in computer vision tasks integrated into real-world use cases.
- Opportunity to work on a full-stack architecture and gain exposure to AI challenges.

Required Skills & Qualifications:
- Strong expertise in React.js (minimum 3 years in production-level projects).
- Advanced knowledge of Node.js and Nest.js (back-end frameworks).
- Practical experience with AWS (such as EC2, S3, Lambda).
- Relevant hands-on experience using Fabric.js for canvas-based UI.
- Exposure to Python for computer vision (e.g., OpenCV, image processing).
- Strong version control practices using GitHub (GitHub profile required).

Communication:
- Good command of English, both written and verbal.
- Ability to communicate clearly in a remote environment.

Skills: Full Stack Engineer, React.js, Node.js, Nest.js, AWS (EC2, S3, Lambda, etc.), Fabric.js (canvas-based UI), Python, GitHub, frontend and backend development, SaaS product experience, AI applications, computer vision.
Posted 3 days ago
3.0 - 4.0 years
20 - 25 Lacs
Mumbai, Delhi / NCR, Bengaluru
Work from Office
3-4 years of hands-on experience with AWS services, ideally SaaS in the cloud. Experience developing solutions with code/scripting languages; Python experience is a must (e.g., Python, Node.js). Experience creating and configuring AWS resources such as API Gateway, CloudWatch, CloudFormation, EC2, Lambda, Amazon Connect, SNS, Athena, Glue, VPC, etc. (a minimal boto3 sketch follows). Sourcing and screening US profiles. Location: Delhi NCR, Bangalore, Chennai, Pune, Kolkata, Ahmedabad, Mumbai, Hyderabad, Remote.
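For context, configuring and exercising AWS resources from Python typically goes through boto3. Below is a minimal, hypothetical sketch that invokes a Lambda function synchronously; the function name and region are placeholders, and AWS credentials are assumed to be configured in the environment.

```python
import json

import boto3

# Placeholder region and function name; credentials come from the environment.
client = boto3.client("lambda", region_name="ap-south-1")

response = client.invoke(
    FunctionName="my-demo-function",       # hypothetical function
    InvocationType="RequestResponse",      # synchronous invocation
    Payload=json.dumps({"ping": "pong"}).encode(),
)
print(json.loads(response["Payload"].read()))
```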
Posted 3 days ago
5.0 - 10.0 years
10 - 20 Lacs
Chennai, Bengaluru
Work from Office
Please share profiles.

Key Responsibilities:
• Application Development: Design, develop, and maintain dealership applications using Java, Spring Boot, and microservices architecture.
• Cloud Deployment: Build and manage cloud-native applications on AWS, leveraging services such as Lambda, ECS, RDS, S3, DynamoDB, and API Gateway.
• System Architecture: Develop and maintain scalable, high-availability architectures to support large-scale dealership operations.
• Database Management: Work with relational and NoSQL databases like PostgreSQL and DynamoDB for efficient data storage and retrieval.
• Integration & APIs: Develop and manage RESTful APIs and integrate with third-party services, including OEMs and dealership management systems (DMS).
• DevOps & CI/CD: Implement CI/CD pipelines using tools like Jenkins, GitHub Actions, or AWS CodePipeline for automated testing and deployments.
• Security & Compliance: Ensure application security, data privacy, and compliance with industry regulations.
• Collaboration & Mentorship: Work closely with product managers, designers, and other engineers. Mentor junior developers to maintain best coding practices.

Technical Skills Required:
• Programming Languages: Java (8+), Spring Boot, Hibernate
• Cloud Platforms: AWS (Lambda, S3, EC2, ECS, RDS, DynamoDB, CloudFormation)
• Microservices & API Development: RESTful APIs, API Gateway
• Database Management: PostgreSQL, DynamoDB
• DevOps & Automation: Docker, Kubernetes, Terraform, CI/CD pipelines
• Testing & Monitoring: JUnit, Mockito, Prometheus, CloudWatch
• Version Control & Collaboration: Git, GitHub, Jira, Confluence

Nice-to-Have Skills:
• Experience with serverless architectures (AWS Lambda, Step Functions)
• Exposure to event-driven architectures (Kafka, SNS, SQS)

Qualifications:
• Bachelor's or Master's degree in Computer Science, Software Engineering, or a related field
• 5+ years of hands-on experience in Java-based backend development
• Strong problem-solving skills with a focus on scalability and performance
Posted 3 days ago
4.0 - 7.0 years
0 - 1 Lacs
Bengaluru
Hybrid
Job Requirements

Job Description: AWS Developer

Quest Global, a leading global technology and engineering services company, is seeking an experienced AWS Developer to join our team. As an AWS Developer, you will play a key role in designing, developing, and maintaining cloud-based applications using Amazon Web Services (AWS) and Java development skills.

Responsibilities:
- Designing, developing, and deploying scalable and reliable cloud-based applications on the AWS platform.
- Collaborating with cross-functional teams to gather requirements and translate them into technical solutions.
- Writing clean, efficient, and maintainable code using the Java programming language.
- Implementing best practices for security, scalability, and performance optimization.
- Troubleshooting and resolving issues related to AWS infrastructure and applications.
- Conducting code reviews and providing constructive feedback to ensure code quality.
- Keeping up to date with the latest AWS services, tools, and best practices.

Join our dynamic team at Quest Global and contribute to the development of cutting-edge cloud-based applications using AWS and Java. Apply now and take your career to new heights!

Note: This job description is intended to provide a general overview of the position and does not encompass all the tasks and responsibilities that may be assigned to the role.

Work Experience Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- Minimum 5 years of experience as an AWS Developer or in a similar role.
- Strong proficiency in the Java programming language.
- In-depth knowledge of AWS services such as EC2, S3, Lambda, RDS, DynamoDB, etc.
- Experience with cloud-based application development and deployment.
- Familiarity with DevOps practices and tools.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
Posted 4 days ago
3.0 - 8.0 years
5 - 13 Lacs
Chennai, Bengaluru
Work from Office
Job Description

Role: Snowflake DevOps Engineer. Visit our website bmwtechworks.in to know more. Follow us on LinkedIn, Instagram, Facebook, and X for exciting updates.

Location: Bangalore/Chennai
Experience: 2 to 8 years
Number of openings: 4

What awaits you / Job Profile:
- Supporting Snowflake use-case customers on the BMW side.
- Consulting for use-case onboarding.
- Monitoring data synchronization jobs (reconciliation between Cloud Data Hub and Snowflake); see the sketch below.
- Cost monitoring reports for use cases.
- Further technical implementations, such as M2M authentication for applications and data traffic over VPC.
- Integrating Snowflake into the use-case application process within Data Portal (automated use-case setup triggered by Data Portal).
- Technical documentation.
- Executing standard service requests (service user lifecycle, etc.).
- Compiling user and operational manuals.
- Organizing and documenting knowledge about incidents/customer cases in a knowledge base.
- Enhancing and editing process documentation.
- Ability and willingness to coach and train fellow colleagues and users when needed.
- Ability to resolve 2nd-level incidents within the Data Analytics Platform (could entail basic code changes).
- Close collaboration with 3rd-level support/development and SaaS vendor teams.
- Implementing new development changes and contributing to development needs.

What should you bring along:
- Strong understanding of and experience with Python; AWS IAM, S3, KMS, Glue, and CloudWatch; GitHub.
- Understanding of APIs.
- Understanding of software development and a background in business intelligence.
- SQL (queries, DDL, materialized views, tasks, procedures, optimization).
- Any Data Portal or Cloud Data Hub experience.
- A technical background in operating and supporting IT platforms.
- IT service management (according to ITIL), 2nd-level support, and a strong understanding of problem, incident, and change processes.
- High customer orientation.
- Experience working in a highly complex environment (many stakeholders, multi-platform/product environment, mission-critical use cases, high business exposure, complex ticket routing).
- Flexible communication on multiple support channels (ITSM, Teams, email).
- Precise and diligent execution of ops processes.
- Willingness to work on call (standby).
- A continuous-learning mindset (highly complex software stack with changing features).
- Proactive communication.

Must-have technical skills: Snowflake, Python, Lambda, IAM, S3, KMS, Glue, CloudWatch, Terraform, Scrum
Good-to-have technical skills: AWS VPC, Route53, EventBridge, SNS, CloudTrail, Confluence, Jira
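As an illustration of the monitoring duties above, a reconciliation check against Snowflake from Python might look like the following sketch using the snowflake-connector-python package. The connection parameters, schema, and query are all placeholders; a production service would authenticate via key pairs or OAuth (the M2M authentication the posting mentions) rather than a password.

```python
import snowflake.connector

# Placeholder credentials and warehouse; use key-pair/OAuth auth in production.
conn = snowflake.connector.connect(
    account="my_account",
    user="svc_monitoring",
    password="***",
    warehouse="MONITOR_WH",
)
try:
    cur = conn.cursor()
    # Hypothetical check: row count loaded today in a synchronized table,
    # to reconcile against the Cloud Data Hub side.
    cur.execute("SELECT COUNT(*) FROM curated.orders WHERE load_date = CURRENT_DATE")
    print("rows loaded today:", cur.fetchone()[0])
finally:
    conn.close()
```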
Posted 4 days ago
7.0 - 10.0 years
45 - 50 Lacs
Pune
Work from Office
Requirements: Our client is seeking a highly skilled Technical Project Manager (TPM) with strong hands-on experience in full-stack development and cloud infrastructure to lead the successful planning, execution, and delivery of technical projects. The ideal candidate will have a strong background in React, Java, Spring Boot, Python, and AWS, and will work closely with cross-functional teams including developers, QA, DevOps, and product stakeholders. As a TPM, you will play a critical role in bridging technical and business objectives, ensuring timelines, quality, and scalability across complex software projects.

Responsibilities:
- Own and drive the end-to-end lifecycle of technical projects, from initiation to deployment and post-launch support.
- Collaborate with development teams and stakeholders to define project scope, goals, deliverables, and timelines.
- Act as a hands-on contributor when needed, with the ability to guide and review code and architecture decisions.
- Coordinate cross-functional teams across front-end (React), back-end (Java/Spring Boot, Python), and AWS cloud infrastructure.
- Manage risk, change, and issue resolution in a fast-paced agile environment.
- Ensure projects follow best practices for version control, CI/CD, testing, deployment, and monitoring.
- Deliver detailed status updates, sprint reports, and retrospectives to leadership and stakeholders.

Required Qualifications:
- IIT/NIT graduate with 5+ years of experience in software engineering, including at least 2 years in a technical project management role.
- Hands-on expertise in React, Java & Spring Boot, Python, and AWS (EC2, S3, Lambda, CloudWatch, etc.).
- Experience leading agile/Scrum teams with a strong understanding of software development lifecycles.
- Excellent communication, organizational, and interpersonal skills.

Desired Profile:
- Experience designing and managing microservices architectures.
- Familiarity with Kafka or other messaging systems.
- Knowledge of CI/CD pipelines, deployment strategies, and application monitoring tools (e.g., Prometheus, Grafana, CloudWatch).
- Experience with containerization tools like Docker and orchestration platforms like Kubernetes.
Posted 4 days ago
2.0 - 5.0 years
2 - 7 Lacs
Bengaluru
Work from Office
- Deploy applications on AWS using services such as EC2, ECS, S3, RDS, or Lambda.
- Implement CI/CD pipelines using GitHub Actions, Jenkins, or CodePipeline.
- Apply DevSecOps best practices, including container security (Docker, ECR), infrastructure as code (Terraform), and runtime monitoring.

Team Collaboration & Agility:
- Participate in Agile ceremonies (stand-ups, sprint planning, retros).
- Work closely with product, design, and AI engineers to build secure and intelligent systems.
Posted 4 days ago
4.0 - 9.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Qualifications:
- 4+ years of software development experience, including work with cloud technologies.
- Bachelor's or Master's degree in Computer Science, Engineering, or equivalent experience.
- Proficiency in one or more modern programming languages (e.g., Python, Java, Go, Node.js).
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Experience with microservices architecture, distributed systems, and event-driven design.
- Expertise in designing and consuming RESTful APIs and familiarity with GraphQL.
- Hands-on experience with CI/CD pipelines, infrastructure as code (e.g., Terraform, CloudFormation), and automated deployments.
- Strong understanding of relational and NoSQL databases.
- Knowledge of SaaS-specific security practices (e.g., OWASP, data encryption, identity management).
- Strong understanding of software development methodologies and tools.
- Familiarity with containerization (Docker) and orchestration (Kubernetes).
- Knowledge of monitoring and logging tools.
- Experience with distributed systems and data-intensive applications.
Posted 4 days ago
6.0 - 9.0 years
14 - 22 Lacs
Pune, Chennai
Work from Office
Hiring for a top IT company.
Designation: Python Developer
Skills: AWS SDK + AI services integration
Location: Pune/Chennai
Experience: 6-8 years
Best CTC
Surbhi: 9887580624 | Anchal: 9772061749 | Gitika: 8696868124 | Shivani: 7375861257
Team Converse
Posted 5 days ago
4.0 - 8.0 years
10 - 20 Lacs
Hyderabad, Chennai
Work from Office
Roles & Responsibilities:
• We are looking for a strong Senior Data Engineer who will be primarily responsible for designing, building, and maintaining ETL/ELT pipelines.
• Integrate data from multiple sources or vendors to provide holistic insights from data.
• Build and manage data lake and data warehouse solutions, design data models, create ETL processes, and implement data quality mechanisms.
• Perform exploratory data analysis (EDA) to troubleshoot data-related issues and assist in their resolution.
• Experience in client interaction, both oral and written.
• Experience mentoring juniors and providing guidance to the team.

Required Technical Skills:
• Extensive experience in languages such as Python, PySpark, and SQL (basic and advanced).
• Strong experience in data warehousing, ETL, data modeling, building ETL pipelines, and data architecture.
• Proficiency in Redshift, Azure Data Factory, Snowflake, etc.
• Hands-on experience with cloud services like AWS S3, Glue, Lambda, CloudWatch, Athena, etc.
• Knowledge of Dataiku and big data technologies is good to have; basic knowledge of BI tools like Power BI or Tableau is a plus.
• Sound knowledge of data management, data operations, data quality, and data governance.
• Knowledge of SFDC and waterfall/agile methodology.
• Strong knowledge of pharma domain / life sciences commercial data operations.

Qualifications:
• Bachelor's or Master's in Engineering/MCA or an equivalent degree.
• 4-6 years of relevant industry experience as a data engineer.
• Experience working with pharma syndicated data such as IQVIA, Veeva, Symphony; claims, CRM, sales, open data, etc.
• High motivation, good work ethic, maturity, self-organization, and personal initiative.
• Ability to work collaboratively and support the team.
• Excellent written and verbal communication skills.
• Strong analytical and problem-solving skills.

Location: Preferably Hyderabad/Chennai, India
Posted 5 days ago
6.0 - 10.0 years
25 - 30 Lacs
Noida, and Remote
Work from Office
Job Title: Full Stack Software Developer
Experience Required: 6+ years
Location: Noida / Remote
Employment Type: Full-time

Job Summary: We are seeking a talented and motivated Full Stack Software Developer with 6+ years of experience to join our dynamic team. The ideal candidate is highly skilled in React and Node.js; a solid grasp of GraphQL and AWS is a significant advantage. You will be instrumental in designing, developing, and maintaining scalable, efficient, and user-centric applications across the entire technology stack.

Key Responsibilities:
- Design & Development: Build, deploy, and maintain robust front-end and back-end applications using React and Node.js.
- API Integration: Create and consume RESTful and GraphQL APIs to support dynamic client-server interactions.
- System Architecture: Contribute to the design of scalable and maintainable software systems.
- Cloud Integration: Leverage AWS services (e.g., Lambda, S3, EC2) to host and scale applications efficiently.
- Collaboration: Work closely with cross-functional teams including product managers, designers, and other developers.
- Code Quality: Maintain clean, testable, and maintainable code following best practices.
- Troubleshooting: Diagnose and resolve issues across the stack to ensure high performance and reliability.

Skills and Qualifications

Required:
- Strong proficiency in JavaScript/TypeScript, React, and Node.js.
- Solid understanding of front-end development concepts (state management, component lifecycle, performance tuning).
- Experience working with REST and/or GraphQL APIs.
- Familiarity with relational databases like PostgreSQL or similar.
- Excellent problem-solving abilities and experience in Agile development environments.

Preferred:
- Hands-on experience with GraphQL and tools like Apollo.
- Working knowledge of AWS services such as EC2, S3, Lambda, API Gateway, and DynamoDB.
- Experience with CI/CD tools (e.g., GitHub Actions, Jenkins).
- Understanding of automated testing using frameworks like Jest, Cypress, etc.
Posted 5 days ago
3.0 - 8.0 years
5 - 10 Lacs
Bengaluru
Work from Office
The Core AI, BI & Data Platforms team has been established to create, operate, and run the enterprise AI, BI, and data platforms that reduce time to market for reporting, analytics, and data science teams to run experiments, train models, and generate insights, as well as to evolve and run the CoCounsel application and its shared capability, the CoCounsel AI Assistant. The Enterprise Data Platform aims to provide self-service capabilities for fast and secure ingestion and consumption of data across TR.

At Thomson Reuters, we are recruiting a team of motivated cloud professionals to transform how we build, manage, and leverage our data assets. The Data Platform team in Bangalore is seeking an experienced Software Engineer with a passion for engineering cloud-based data platform systems. Join our dynamic team as a Software Engineer and take a pivotal role in shaping the future of our Enterprise Data Platform. You will develop and implement data processing applications and frameworks on cloud-based infrastructure, ensuring the efficiency, scalability, and reliability of our systems.

About the Role. In this opportunity as a Software Engineer, you will:
- Develop data processing applications and frameworks on cloud-based infrastructure in partnership with data analysts and architects, with guidance from the Lead Software Engineer (a minimal serverless sketch follows).
- Innovate with new approaches to meet data management requirements.
- Make recommendations about platform adoption, including technology integrations, application servers, libraries, AWS frameworks, documentation, and usability by stakeholders.
- Contribute to improving the customer experience.
- Participate in code reviews to maintain a high-quality codebase.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Work closely with product owners, designers, and other developers to understand requirements and deliver solutions.
- Communicate effectively and liaise across the data platform and management teams.
- Stay updated on emerging trends and technologies in cloud computing.

About You. You're a fit for the role of Software Engineer if you meet all or most of these criteria:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 3+ years of relevant experience implementing data lakes and data management technologies for large-scale organizations.
- Experience building and maintaining data pipelines with excellent run-time characteristics such as low latency, fault tolerance, and high availability.
- Proficiency in the Python programming language.
- Experience with AWS services and management, including serverless, container, queueing, and monitoring services such as Lambda, ECS, API Gateway, RDS, DynamoDB, Glue, S3, IAM, Step Functions, CloudWatch, SQS, and SNS.
- Good knowledge of consuming and building APIs.
- Business intelligence tools like Power BI.
- Fluency in querying languages such as SQL.
- Solid understanding of software development practices such as version control via Git, CI/CD, and release management.
- An agile development cadence.
- Good critical thinking, communication, documentation, troubleshooting, and collaborative skills.
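As a minimal sketch of the serverless/queueing combination listed above (Lambda plus SQS), the handler below processes records delivered by an SQS trigger. The payload shape and field names are hypothetical.

```python
import json

def lambda_handler(event, context):
    """Process records delivered by an SQS trigger (illustrative payload shape)."""
    processed = 0
    for record in event.get("Records", []):
        body = json.loads(record["body"])
        # Stand-in for real ingestion/transformation logic.
        print("ingesting record id:", body.get("id"))
        processed += 1
    return {"processed": processed}
```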
Posted 5 days ago
4.0 - 6.0 years
7 - 9 Lacs
Bengaluru
Hybrid
We are looking for an experienced Software Engineer - Informatica with 4 to 6 years of hands-on expertise in designing, developing, and optimizing large-scale ETL solutions using Informatica PowerCenter. The ideal candidate will lead ETL projects, mentor junior developers, and ensure high-performance data integration across enterprise systems.

About the Role. In this role as Software Engineer, you will:
- Analyze business and functional requirements to design and implement scalable data integration solutions.
- Understand and interpret High-Level Design (HLD) documents and convert them into detailed Low-Level Design (LLD).
- Develop robust, reusable, and optimized Informatica mappings, sessions, and workflows.
- Apply mapping optimization and performance tuning techniques to ensure efficient ETL processes.
- Conduct peer code reviews and suggest improvements for reliability and performance.
- Prepare and execute comprehensive unit test cases and support system/integration testing.
- Maintain detailed technical documentation, including LLDs, data flow diagrams, and test cases.
- Build data pipelines and transformation logic in Snowflake, ensuring performance and scalability.
- Develop and manage Unix shell scripts for automation, scheduling, and monitoring of ETL jobs.
- Collaborate with cross-functional teams to support UAT, deployments, and production issues.

About You. You are a fit for this position if your background includes:
- 4-6 years of strong hands-on experience with Informatica PowerCenter.
- Proficiency in developing and optimizing ETL mappings, workflows, and sessions.
- Solid experience with performance tuning techniques and ETL best practices.
- Hands-on experience with Snowflake for data loading, SQL transformations, and optimization.
- Strong Unix/Linux scripting skills for job automation.
- Experience converting HLDs into LLDs and defining unit test cases.
- Knowledge of data warehousing concepts, data modelling, and data quality frameworks.

Good to Have:
- Knowledge of the Salesforce data model and integration (via Informatica or API-based solutions).
- Exposure to AWS cloud services like S3, Glue, Redshift, Lambda, etc.
- Familiarity with relational databases such as SQL Server and PostgreSQL.
- Experience with job schedulers like Control-M, ESP, or equivalent.
- Agile methodology experience and tools such as JIRA, Confluence, and Git.
- Knowledge of DBT (Data Build Tool) for data transformation and orchestration.
- Experience with Python scripting for data manipulation, automation, or integration tasks.
Posted 5 days ago
10.0 - 15.0 years
11 - 20 Lacs
Bengaluru
Work from Office
About the Client: Hiring for one of the most prestigious multinational corporations.

Job Title: Senior AWS Engineer
Experience: 10 to 15 years

Key Responsibilities:
- Design and implement AWS-based infrastructure solutions for scalability, performance, and security.
- Lead cloud architecture discussions, guiding development and operations teams in best practices.
- Automate infrastructure provisioning using tools like Terraform, CloudFormation, or the AWS CDK.
- Implement and manage CI/CD pipelines (e.g., Jenkins, CodePipeline, GitHub Actions).
- Ensure cost optimization, monitoring, and governance for AWS accounts.
- Collaborate with security teams to enforce compliance and governance policies across cloud environments.
- Handle migration of on-premise workloads to the AWS cloud (rehost, replatform, refactor).
- Mentor junior engineers and participate in code reviews and design sessions.
- Maintain high availability, disaster recovery, and backup strategies.
- Stay updated with the latest AWS services and architecture trends.

Technical Skills:
- Strong hands-on experience with core AWS services: EC2, S3, RDS, Lambda, IAM, VPC, CloudWatch, CloudTrail, ECS/EKS, etc.
- Expert in infrastructure as code (IaC) using Terraform, CloudFormation, or the AWS CDK.
- Strong scripting and automation skills in Python, Bash, or shell.
- Experience with containerization and orchestration tools (Docker, Kubernetes/EKS).
- Solid understanding of networking, load balancing, and security concepts in the cloud.
- Experience with monitoring/logging tools like CloudWatch, Prometheus, Grafana, or the ELK stack.
- Knowledge of DevOps and CI/CD tools (Jenkins, GitLab CI, AWS CodePipeline, etc.).
- Familiarity with Agile/Scrum methodologies.

Notice period: only immediate and 15-day joiners.
Location: Bangalore
Mode of Work: WFO (work from office)

Thanks & Regards,
SWETHA
Black and White Business Solutions Pvt. Ltd.
Bangalore, Karnataka, INDIA
Contact Number: 8067432433
rathy@blackwhite.in | www.blackwhite.in
Posted 5 days ago
3.0 - 5.0 years
4 - 7 Lacs
Hyderabad
Work from Office
Data Analysis: Conduct in-depth analysis of data to identify trends, anomalies, and opportunities, utilizing SQL, AWS, and Python to extract and manipulate data.
Business Transformation: Translate existing SQL queries into business transformation logic, enabling the conversion of raw data into actionable insights to drive strategic decision-making (see the Athena sketch below).
Requirements Gathering: Collaborate with business stakeholders to gather and document clear and concise business requirements, ensuring a thorough understanding of data needs.
Documentation: Develop and maintain documentation related to data analysis, transformation, and reporting processes, ensuring knowledge transfer and continuity.
AWS Integration: Leverage AWS services to facilitate data extraction, storage, and analysis, making data readily available for the business.
Quality Assurance: Implement data quality checks and validation processes to ensure the accuracy and integrity of data used in analyses.

Qualifications:
- Bachelor's degree in Business, Computer Science, or a related field.
- Proven experience as a business analyst with a strong focus on data analysis and transformation.
- Proficiency in SQL for querying and manipulating relational databases.
- Awareness of AWS services such as Redshift, S3, Athena, Lambda, Step Functions, and AWS Batch.
- Proficiency in Python for data analysis and scripting.
- Experience converting SQL queries into actionable business transformation logic.
- Strong problem-solving and critical-thinking skills.
- Excellent communication and interpersonal skills to work effectively with cross-functional teams and stakeholders.
- Attention to detail and a commitment to data accuracy and quality.
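For illustration, translating a SQL query into a repeatable transformation step on AWS often means running it through Athena from Python. The sketch below uses boto3's Athena client; the database, query, and S3 output location are placeholders, and the polling loop is simplified (production code would back off and time out).

```python
import time

import boto3

athena = boto3.client("athena", region_name="ap-south-1")

# Hypothetical aggregation standing in for a real business transformation.
qid = athena.start_query_execution(
    QueryString="SELECT region, SUM(amount) AS revenue FROM sales GROUP BY region",
    QueryExecutionContext={"Database": "analytics"},
    ResultConfiguration={"OutputLocation": "s3://my-athena-results/"},
)["QueryExecutionId"]

# Poll until the query reaches a terminal state.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    for row in rows:
        print([c.get("VarCharValue") for c in row["Data"]])
```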
Posted 6 days ago
6.0 - 11.0 years
15 - 30 Lacs
Hyderabad
Work from Office
Design, develop, and maintain scalable microservices using Java, Kotlin, and Spring Boot. Build and optimize data models and queries in MongoDB. Integrate and manage Apache Kafka for real-time data streaming and messaging. Implement CI/CD pipelines using Jenkins.

Required Candidate Profile: 6+ years of experience in backend development with Java/Spring Boot. Experience with Jenkins for CI/CD automation. Familiarity with AWS services (EC2, S3, Lambda, RDS, etc.) or OpenShift for container orchestration.
Posted 6 days ago
8.0 - 12.0 years
25 - 40 Lacs
Chennai, Bengaluru, Delhi / NCR
Work from Office
AWS admin experience involving design, Landing Zone deployment, migration, and optimization. Design and develop AWS cloud solutions. Create architectural blueprints, diagrams, and documentation. Hands-on experience with AWS Terraform and CloudFormation automation.
Posted 6 days ago
5.0 - 10.0 years
15 - 22 Lacs
Pune, Chennai, Bengaluru
Hybrid
5-8 years of experience in backend development with a strong focus on Python. Develop and maintain serverless applications using AWS Lambda functions, DynamoDB, and other AWS services. Hands-on experience with Terraform
Posted 6 days ago
5.0 - 10.0 years
15 - 22 Lacs
Pune, Chennai, Bengaluru
Hybrid
Experience: 5-12 years. Notice Period: Immediate to 30-45 days if serving. Python Developer + AWS (Lambda) + DynamoDB. 5-8 years of experience in backend development with a strong focus on Python. Proven experience with AWS serverless technologies, including Lambda and DynamoDB (a minimal handler sketch follows).
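As a minimal sketch of the Lambda-plus-DynamoDB pattern this role centers on, the handler below persists an incoming event to a table via boto3. The table name and payload shape are hypothetical, and the Lambda execution role is assumed to grant dynamodb:PutItem.

```python
import boto3

# Hypothetical table; the execution role must allow dynamodb:PutItem.
table = boto3.resource("dynamodb").Table("orders")

def lambda_handler(event, context):
    """Persist an incoming order event (illustrative payload shape)."""
    table.put_item(
        Item={
            "order_id": event["order_id"],
            "status": event.get("status", "NEW"),
        }
    )
    return {"ok": True}
```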
Posted 6 days ago
6.0 - 11.0 years
1 - 2 Lacs
Pune
Work from Office
Role & Responsibilities:
- Proficiency in Python and PySpark for data processing and transformation tasks.
- Solid experience with AWS Glue for ETL jobs and managing data workflows (a minimal Glue job sketch follows).
- Hands-on experience with AWS Data Pipeline (DPL) for workflow orchestration.
- Strong experience with AWS services such as S3, Lambda, Redshift, RDS, and EC2.

Technical Skills:
- Deep understanding of ETL concepts and best practices.
- Strong knowledge of SQL for querying and manipulating relational and semi-structured data.
- Experience with data warehousing and big data technologies, specifically within AWS.

Additional Skills:
- Experience with AWS Lambda for serverless data processing and orchestration.
- Understanding of AWS Redshift for data warehousing and analytics.
- Familiarity with data lakes, Amazon EMR, and Kinesis for streaming data processing.
- Knowledge of data governance practices, including data lineage and auditing.
- Familiarity with CI/CD pipelines and Git for version control.
- Experience with Docker and containerization for building and deploying applications.

Key Responsibilities:
- Design and build data pipelines: design, implement, and optimize data pipelines on AWS using PySpark, AWS Glue, and AWS Data Pipeline to automate data integration, transformation, and storage processes.
- ETL development: develop and maintain extract, transform, load (ETL) processes using AWS Glue and PySpark to efficiently process large datasets.
- Data workflow automation: build and manage automated data workflows using AWS Data Pipeline, ensuring seamless scheduling, monitoring, and management of data jobs.
- Data integration: work with different AWS data storage services (e.g., S3, Redshift, RDS) to ensure smooth integration and movement of data across platforms.
- Optimization and scaling: optimize and scale data pipelines for high performance and cost efficiency, utilizing AWS services like Lambda, S3, and EC2.

Preferred Candidate Profile:
Tech stack: AWS Data Engineer, Python, PySpark, SQL, data pipelines, AWS, AWS Glue, Lambda
Experience: 6-8 years
Location: Pune
Notice period: immediate to 1-week joiners only
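For reference, a minimal AWS Glue job script in PySpark follows the shape below: standard Glue boilerplate, a read from the Glue Data Catalog, a simple cleanup, and a Parquet write to S3. The catalog database, table, field, and bucket names are placeholders.

```python
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext

# Standard Glue job boilerplate.
args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

# Read from the Data Catalog (placeholder database/table), drop records
# with a null key, and write Parquet to S3 (placeholder bucket).
dyf = glue_context.create_dynamic_frame.from_catalog(database="raw", table_name="events")
cleaned = dyf.filter(lambda record: record["event_id"] is not None)
glue_context.write_dynamic_frame.from_options(
    frame=cleaned,
    connection_type="s3",
    connection_options={"path": "s3://my-bucket/curated/events/"},
    format="parquet",
)
job.commit()
```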
Posted 1 week ago
Lambda jobs in India are on the rise, with many companies leveraging serverless computing for their applications. As more businesses adopt this technology, the demand for professionals with expertise in Lambda functions is increasing. Job seekers looking to break into this field have a wealth of opportunities in India.
The average salary range for Lambda professionals in India varies by experience level:
- Entry-level: INR 4-6 lakhs per annum
- Mid-level: INR 8-12 lakhs per annum
- Experienced: INR 15-20 lakhs per annum
In the Lambda job market, a typical career path may include:
- Junior Developer
- Developer
- Senior Developer
- Tech Lead
In addition to Lambda expertise, professionals in this field are often expected to have skills in:
- AWS (Amazon Web Services)
- Python
- Serverless computing
- Microservices architecture
- DevOps
As the demand for Lambda professionals continues to grow in India, now is the perfect time to hone your skills and apply for exciting opportunities in this field. Prepare thoroughly for interviews, showcase your expertise, and apply with confidence to land your dream Lambda job. Good luck!