5.0 - 10.0 years
15 - 22 Lacs
Pune, Chennai, Bengaluru
Hybrid
Python Developer + AWS (Lambda) + DynamoDB. Experience: 5-12 years. Notice period: immediate, or 30-45 days if currently serving. Requires 5-8 years of experience in backend development with a strong focus on Python, and proven experience with AWS serverless technologies, including Lambda and DynamoDB.
Posted 1 week ago
5.0 - 10.0 years
15 - 25 Lacs
Noida, Chennai, Bengaluru
Hybrid
Job Title: Senior Python Backend Developer with AWS Serverless Expertise
Location: Offshore
Experience: 5-10 years

Key Responsibilities:
- Design, develop, and maintain backend services using Python and AWS serverless technologies.
- Implement event-driven architectures to ensure efficient and scalable solutions.
- Utilize Terraform for infrastructure as code to manage and provision AWS resources.
- Configure and manage AWS networking components to ensure secure and reliable communication between services.
- Develop and maintain serverless applications using AWS Lambda functions, DynamoDB, and other AWS services.
- Collaborate with cross-functional teams to define, design, and ship new features.
- Write clean, maintainable, and efficient code while following best practices.
- Troubleshoot and resolve issues in a timely manner.
- Stay up to date with the latest industry trends and technologies to keep our solutions cutting-edge.

Required Qualifications:
- 5-8 years of experience in backend development with a strong focus on Python.
- Proven experience with AWS serverless technologies, including Lambda, DynamoDB, and other related services.
- Strong understanding of event-driven architecture and its implementation.
- Hands-on experience with Terraform for infrastructure as code.
- In-depth knowledge of AWS networking components and best practices.
- Excellent problem-solving skills and attention to detail.
- Strong communication and collaboration skills.

Preferred Qualifications:
- AWS certifications (e.g., AWS Certified Developer, AWS Certified Solutions Architect).
- Experience with other programming languages and frameworks.
- Familiarity with CI/CD pipelines and DevOps practices.
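To illustrate the Lambda-and-DynamoDB stack this posting describes, here is a minimal, hypothetical sketch of an event-driven handler; the table name, the TABLE_NAME environment variable, and the payload shape are assumptions for illustration, not details from the posting.

```python
# Hypothetical sketch: an event-driven Lambda handler that persists an
# incoming API Gateway request body to DynamoDB. Table name ("orders"),
# the TABLE_NAME environment variable, and attribute keys are assumed.
import json
import os
import uuid

import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table(os.environ.get("TABLE_NAME", "orders"))


def handler(event, context):
    """Store the request body as a single DynamoDB item and return its id."""
    body = json.loads(event.get("body") or "{}")
    item = {"pk": str(uuid.uuid4()), "payload": json.dumps(body)}
    table.put_item(Item=item)
    return {"statusCode": 201, "body": json.dumps({"id": item["pk"]})}
```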
Posted 1 week ago
6.0 - 7.0 years
11 - 12 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Java Full Stack Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using Java and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in Java programming, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 6 to 7+ years of experience in full-stack development, with a strong focus on Java.

Java Full Stack Developer Roles & Responsibilities:
- Develop scalable web applications using Java (Spring Boot) for the backend and React/Angular for the frontend.
- Implement RESTful APIs to facilitate communication between frontend and backend.
- Design and manage databases using MySQL, PostgreSQL, Oracle, or MongoDB.
- Write complex SQL queries and procedures, and perform database optimization.
- Build responsive, user-friendly interfaces using HTML, CSS, and JavaScript, with frameworks such as Bootstrap, React, or Angular, plus Node.js and Python integration.
- Integrate APIs with frontend components.
- Participate in designing microservices and modular architecture.
- Apply design patterns and object-oriented programming (OOP) concepts.
- Write unit and integration tests using JUnit, Mockito, Selenium, or Cypress.
- Debug and fix bugs across full-stack components.
- Use Git, Jenkins, Docker, and Kubernetes for version control, continuous integration, and deployment.
- Participate in code reviews, automation, and monitoring.
- Deploy applications on AWS, Azure, or Google Cloud platforms.
- Use Elastic Beanstalk, EC2, S3, or Cloud Run for backend hosting.
- Work in Agile/Scrum teams; attend daily stand-ups, sprints, and retrospectives, and deliver iterative enhancements.
- Document code, APIs, and configurations.
- Collaborate with QA, DevOps, Product Owners, and other stakeholders.

Must-Have Skills:
- Java Programming: Deep knowledge of the Java language, its ecosystem, and best practices.
- Frontend Technologies: Proficiency in HTML, CSS, JavaScript, and modern frontend frameworks such as React or Angular.
- Backend Development: Expertise in developing and maintaining backend services using Java, Spring, and related technologies.
- Full Stack Development: Experience in both frontend and backend development, with the ability to work across the entire application stack.

Soft Skills:
- Problem-Solving: Ability to analyze complex problems and develop effective solutions.
- Communication Skills: Strong verbal and written communication skills to collaborate effectively with cross-functional teams.
- Analytical Thinking: Ability to think critically and analytically to solve technical challenges.
- Time Management: Capable of managing multiple tasks and deadlines in a fast-paced environment.
- Adaptability: Ability to quickly learn and adapt to new technologies and methodologies.

Interview Mode: F2F for candidates residing in Hyderabad; Zoom for other states
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2 - 4 pm
Posted 1 week ago
5.0 - 10.0 years
10 - 20 Lacs
Pune, Bengaluru
Hybrid
- PySpark coding skills
- Proficient in AWS data engineering services
- Experience in designing data pipelines and data lakes
- Good communication skills; capable of leading and mentoring a team

Role & responsibilities: Share profiles to afreen.banu@in.experis.com
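As a rough illustration of the pipeline work this listing asks for, the sketch below reads raw CSV from S3 with PySpark, applies a simple transformation, and writes partitioned Parquet to a data-lake prefix; the bucket names and columns are placeholders, not details from the posting.

```python
# Illustrative PySpark pipeline sketch: read raw CSV from S3, clean it,
# and write partitioned Parquet to a curated data-lake prefix.
# Buckets and column names are assumed for the example.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-daily-load").getOrCreate()

raw = spark.read.option("header", True).csv("s3://example-raw-bucket/orders/")

cleaned = (
    raw.withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .filter(F.col("amount").cast("double") > 0)
)

(cleaned.write.mode("overwrite")
        .partitionBy("order_date")
        .parquet("s3://example-curated-bucket/orders/"))
```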
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Pune
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating pipelines/workflows for source-to-target movement and implementing solutions that address the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience with Python to develop a custom framework for generating rules (similar to a rules engine).
- Developed Python code to gather data from HBase and designed the solution to implement it using PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive context objects to perform read/write operations.

Preferred technical and professional experience:
- Understanding of DevOps.
- Experience in building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
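A minimal sketch of the Spark DataFrame and Hive read/write pattern mentioned above, using a Hive-enabled SparkSession (the modern replacement for HiveContext); the table names and the high-value rule are assumptions for illustration only.

```python
# Minimal sketch (assumed table names): apply a business transformation with
# Spark DataFrames and read/write Hive tables via a Hive-enabled SparkSession.
from pyspark.sql import SparkSession, functions as F

spark = (SparkSession.builder
         .appName("rules-engine-demo")
         .enableHiveSupport()
         .getOrCreate())

txns = spark.table("staging.transactions")  # assumed Hive table

# Example business rule: flag transactions above an assumed threshold.
flagged = txns.withColumn(
    "high_value",
    F.when(F.col("amount") > 10000, F.lit(True)).otherwise(F.lit(False)),
)

flagged.write.mode("overwrite").saveAsTable("curated.transactions_flagged")
```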
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Navi Mumbai
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating pipelines/workflows for source-to-target movement and implementing solutions that address the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience with Python to develop a custom framework for generating rules (similar to a rules engine).
- Developed Python code to gather data from HBase and designed the solution to implement it using PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive context objects to perform read/write operations.

Preferred technical and professional experience:
- Understanding of DevOps.
- Experience in building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
Posted 1 week ago
5.0 - 10.0 years
7 - 12 Lacs
Mumbai
Work from Office
As a Big Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in data engineering activities such as creating pipelines/workflows for source-to-target movement and implementing solutions that address the client's needs.

Your primary responsibilities include:
- Design, build, optimize, and support new and existing data models and ETL processes based on our clients' business requirements.
- Build, deploy, and manage data infrastructure that can adequately handle the needs of a rapidly growing data-driven organization.
- Coordinate data access and security to enable data scientists and analysts to easily access data whenever they need to.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Must have 5+ years of experience in Big Data: Hadoop, Spark (Scala, Python), HBase, Hive.
- Good to have: AWS (S3, Athena, DynamoDB, Lambda), Jenkins, Git.
- Developed Python and PySpark programs for data analysis.
- Good working experience with Python to develop a custom framework for generating rules (similar to a rules engine).
- Developed Python code to gather data from HBase and designed the solution to implement it using PySpark.
- Used Apache Spark DataFrames/RDDs to apply business transformations and utilized Hive context objects to perform read/write operations.

Preferred technical and professional experience:
- Understanding of DevOps.
- Experience in building scalable end-to-end data ingestion and processing solutions.
- Experience with object-oriented and/or functional programming languages, such as Python, Java, and Scala.
Posted 1 week ago
4.0 - 9.0 years
6 - 11 Lacs
Kochi
Work from Office
As a Data Engineer, you will develop, maintain, evaluate, and test big data solutions. You will be involved in the development of data solutions using the Spark framework with Python or Scala on Hadoop and the AWS Cloud Data Platform.

Responsibilities:
- Experienced in building data pipelines to ingest, process, and transform data from files, streams, and databases.
- Process data with Spark, Python, PySpark, Scala, and Hive, HBase, or other NoSQL databases on Cloud Data Platforms (AWS) or HDFS.
- Experienced in developing efficient software code for multiple use cases, leveraging the Spark framework with Python or Scala and Big Data technologies for various use cases built on the platform.
- Experience in developing streaming pipelines.
- Experience working with Hadoop/AWS ecosystem components to implement scalable solutions that meet ever-increasing data volumes, using big data and cloud technologies such as Apache Spark, Kafka, and cloud computing.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on Cloud Data Platforms on AWS.
- Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB.
- Good to excellent SQL skills.
- Exposure to streaming solutions and message brokers like Kafka.

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark Certified Developer.
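For the streaming-pipeline experience this role calls for, here is a hedged sketch that consumes a Kafka topic with Spark Structured Streaming and lands it on S3; the broker address, topic, and paths are placeholders, and the job assumes the spark-sql-kafka connector package is on the classpath.

```python
# Hedged sketch of a streaming ingest pipeline: consume a Kafka topic with
# Spark Structured Streaming and write Parquet to S3. Broker, topic, and
# paths are placeholders; requires the spark-sql-kafka connector.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("kafka-ingest").getOrCreate()

events = (spark.readStream.format("kafka")
          .option("kafka.bootstrap.servers", "broker:9092")
          .option("subscribe", "clickstream")
          .load()
          .select(F.col("value").cast("string").alias("payload"),
                  F.col("timestamp")))

query = (events.writeStream
         .format("parquet")
         .option("path", "s3://example-lake/clickstream/")
         .option("checkpointLocation", "s3://example-lake/_checkpoints/clickstream/")
         .trigger(processingTime="1 minute")
         .start())

query.awaitTermination()
```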
Posted 1 week ago
5.0 - 7.0 years
7 - 9 Lacs
Bengaluru
Work from Office
As a senior SAP Consultant, you will serve as a client-facing practitioner working collaboratively with clients to deliver high-quality solutions, and be a trusted business advisor with a deep understanding of the SAP Accelerate delivery methodology (or equivalent) and associated work products. You will work on projects that assist clients in integrating strategy, process, technology, and information to enhance effectiveness, reduce costs, and improve profit and shareholder value. There are opportunities for you to acquire new skills, work across different disciplines, take on new challenges, and develop a comprehensive understanding of various industries.

Your primary responsibilities include:
- Strategic SAP Solution Focus: Working across technical design, development, and implementation of SAP solutions for simplicity, amplification, and maintainability that meet client needs.
- Comprehensive Solution Delivery: Involvement in strategy development and solution implementation, leveraging your knowledge of SAP and working with the latest technologies.

Required education: Bachelor's Degree
Preferred education: Master's Degree

Required technical and professional expertise:
- Total 5-7+ years of experience in Data Management (DW, DL, Data Platform, Lakehouse) and data engineering skills.
- Minimum 4+ years of experience in Big Data technologies with extensive data engineering experience in Spark with Python or Scala.
- Minimum 3 years of experience on Cloud Data Platforms on AWS.
- Exposure to streaming solutions and message brokers like Kafka.
- Experience in AWS EMR / AWS Glue / Databricks, AWS Redshift, DynamoDB.
- Good to excellent SQL skills.

Preferred technical and professional experience:
- Certification in AWS and Databricks, or Cloudera Spark Certified Developer.
Posted 1 week ago
4.0 - 7.0 years
5 - 15 Lacs
Mumbai
Hybrid
Role: Sr Python FastAPI Developer
Location: Mumbai
Experience: 4 to 7 years
Technologies / Skills: Python (FastAPI), Advanced SQL, Postgres, DynamoDB, Docker

Responsibilities:
- Build high-performance REST APIs and WebSockets to power web applications.
- Design, develop, and maintain scalable and efficient backend services using FastAPI for web applications.
- Coordinate with development teams to determine application requirements and integration points.
- Understand the fundamental design principles behind a scalable application and write scalable code.
- Implement security best practices to safeguard sensitive data and ensure compliance with privacy regulations.
- Own and manage all phases of the software development lifecycle: planning, design, implementation, deployment, and support.
- Build reusable, high-quality code and libraries for future use that are high-performance and can be used across multiple projects.
- Conduct code reviews and provide constructive feedback to team members.
- Stay up to date with emerging technologies and trends in Python development and the FastAPI framework.
- Ensure the reliability and correctness of FastAPI applications using Pytest.
- Define and document business requirements for complex system development or testing.
- Comfortable working with Agile / Scrum / Kanban.
- Willingness to join a distributed team operating across different time zones.

Required qualifications for Sr Python FastAPI Developer:
- Bachelor's degree in IT, computer science, computer engineering, or similar.
- Minimum 3+ years of experience in Python (FastAPI) development.
- Strong understanding of asynchronous programming and background tasks.
- Knowledge of Pydantic, CRON job schedulers, and Swagger UI for endpoints.
- Proficiency in database management systems (e.g., DynamoDB, PostgreSQL).
- Familiarity with containerization technologies such as Docker.
- Excellent verbal and written communication skills.
- Experience with version control systems (e.g., Git, GitHub Actions) is a plus.
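A minimal sketch matching the stack in this posting: an async FastAPI endpoint with Pydantic validation and a Pytest check via TestClient. The model fields and route are illustrative, not taken from the role.

```python
# Illustrative FastAPI sketch: an async REST endpoint with Pydantic
# validation, plus a Pytest check using TestClient. Route and fields assumed.
from fastapi import FastAPI
from fastapi.testclient import TestClient
from pydantic import BaseModel

app = FastAPI()


class Order(BaseModel):
    sku: str
    quantity: int = 1


@app.post("/orders", status_code=201)
async def create_order(order: Order) -> dict:
    # A real service would persist this to Postgres or DynamoDB.
    return {"sku": order.sku, "quantity": order.quantity}


def test_create_order():
    client = TestClient(app)
    resp = client.post("/orders", json={"sku": "ABC-1", "quantity": 2})
    assert resp.status_code == 201
    assert resp.json()["quantity"] == 2
```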
Posted 1 week ago
6.0 - 8.0 years
11 - 12 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Java Full Stack Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using Java and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in Java programming, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 6 to 8+ years of experience in full-stack development, with a strong focus on Java.

Java Full Stack Developer Roles & Responsibilities:
- Develop scalable web applications using Java (Spring Boot) for the backend and React/Angular for the frontend.
- Implement RESTful APIs to facilitate communication between frontend and backend.
- Design and manage databases using MySQL, PostgreSQL, Oracle, or MongoDB.
- Write complex SQL queries and procedures, and perform database optimization.
- Build responsive, user-friendly interfaces using HTML, CSS, and JavaScript, with frameworks such as Bootstrap, React, or Angular, plus Node.js and Python integration.
- Integrate APIs with frontend components.
- Participate in designing microservices and modular architecture.
- Apply design patterns and object-oriented programming (OOP) concepts.
- Write unit and integration tests using JUnit, Mockito, Selenium, or Cypress.
- Debug and fix bugs across full-stack components.
- Use Git, Jenkins, Docker, and Kubernetes for version control, continuous integration, and deployment.
- Participate in code reviews, automation, and monitoring.
- Deploy applications on AWS, Azure, or Google Cloud platforms.
- Use Elastic Beanstalk, EC2, S3, or Cloud Run for backend hosting.
- Work in Agile/Scrum teams; attend daily stand-ups, sprints, and retrospectives, and deliver iterative enhancements.
- Document code, APIs, and configurations.
- Collaborate with QA, DevOps, Product Owners, and other stakeholders.

Must-Have Skills:
- Java Programming: Deep knowledge of the Java language, its ecosystem, and best practices.
- Frontend Technologies: Proficiency in HTML, CSS, JavaScript, and modern frontend frameworks such as React or Angular.
- Backend Development: Expertise in developing and maintaining backend services using Java, Spring, and related technologies.
- Full Stack Development: Experience in both frontend and backend development, with the ability to work across the entire application stack.

Soft Skills:
- Problem-Solving: Ability to analyze complex problems and develop effective solutions.
- Communication Skills: Strong verbal and written communication skills to collaborate effectively with cross-functional teams.
- Analytical Thinking: Ability to think critically and analytically to solve technical challenges.
- Time Management: Capable of managing multiple tasks and deadlines in a fast-paced environment.
- Adaptability: Ability to quickly learn and adapt to new technologies and methodologies.

Interview Mode: F2F for candidates residing in Hyderabad; Zoom for other states
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2 - 4 pm
Posted 1 week ago
3.0 - 5.0 years
3 - 8 Lacs
Noida
Work from Office
Roles & Responsibilities:
- Proficient in Python, including GitHub and Git commands.
- Develop code based on functional specifications through an understanding of project code.
- Test code to verify it meets the technical specifications and is working as intended, before submitting it to code review.
- Experience in writing tests in Python using Pytest.
- Follow prescribed standards and processes as applicable to the software development methodology, including planning, work estimation, solution demos, and reviews.
- Read and understand basic software requirements.
- Assist with the implementation of a delivery pipeline, including test automation, security, and performance.
- Assist in troubleshooting and responding to production issues to ensure the stability of the application.

Must-Have and Mandatory:
- Very good experience in Python Flask, SQLAlchemy, Pytest.
- Knowledge of cloud services such as AWS Lambda, S3, DynamoDB.
- Database: PostgreSQL, MySQL, or any relational database.
- Can provide suggestions for performance improvements, strategy, etc.
- Expertise in object-oriented design and multi-threaded programming.

Total Experience Expected: 4-6 years
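An illustrative sketch only, showing the Flask-plus-Pytest combination the posting lists: a tiny health endpoint exercised through Flask's test client. The route and payload are hypothetical, and the @app.get shortcut assumes Flask 2.x.

```python
# Hypothetical sketch: a minimal Flask endpoint and the kind of Pytest-style
# check the posting asks for, using Flask's built-in test client.
from flask import Flask, jsonify

app = Flask(__name__)


@app.get("/health")
def health():
    return jsonify(status="ok")


def test_health():
    client = app.test_client()
    resp = client.get("/health")
    assert resp.status_code == 200
    assert resp.get_json() == {"status": "ok"}
```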
Posted 1 week ago
12.0 - 15.0 years
35 - 60 Lacs
Chennai
Work from Office
AWS Solution Architect:
- Experience in driving the enterprise architecture for large commercial customers.
- Experience in healthcare enterprise transformation.
- Prior experience in architecting cloud-first applications.
- Experience leading a customer through a migration journey and proposing competing views to drive a mutual solution.
- Knowledge of cloud architecture concepts.
- Knowledge of application deployment and data migration.
- Ability to design high-availability applications on AWS across availability zones and availability regions.
- Ability to design applications on AWS taking advantage of disaster recovery design guidelines.
- Design, implement, and maintain streaming solutions using AWS Managed Streaming for Apache Kafka (MSK).
- Monitor and manage Kafka clusters to ensure optimal performance, scalability, and uptime.
- Configure and fine-tune MSK clusters, including partitioning strategies, replication, and retention policies.
- Analyze and optimize the performance of Kafka clusters and streaming pipelines to meet high-throughput and low-latency requirements.
- Design and implement data integration solutions to stream data between various sources and targets using MSK.
- Lead data transformation and enrichment processes to ensure data quality and consistency in streaming applications.

Mandatory Technical Skillset:
- AWS architectural concepts: designs, implements, and manages cloud infrastructure.
- AWS services (EC2, S3, VPC, Lambda, ELB, Route 53, Glue, RDS, DynamoDB, Postgres, Aurora, API Gateway, CloudFormation, etc.).
- Kafka / Amazon MSK.

Domain Experience:
- Healthcare domain experience is required.
- Blues experience is preferred.

Location: Pan India
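As a small example of the MSK streaming work described above, the sketch below produces JSON events to a Kafka topic with the kafka-python client; the bootstrap endpoint and topic name are placeholders, and real MSK clusters typically also need TLS or IAM authentication settings that are omitted here.

```python
# Hedged example: producing JSON events to an MSK-hosted Kafka topic with
# the kafka-python client. Endpoint and topic are placeholders; production
# MSK clusters usually require TLS/IAM configuration not shown here.
import json

from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers=["b-1.example-msk.amazonaws.com:9092"],
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

producer.send("claims-events", {"claim_id": "C-123", "status": "RECEIVED"})
producer.flush()
```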
Posted 1 week ago
12.0 - 18.0 years
35 - 45 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
Job Summary
We are seeking an experienced Amazon Connect Architect with 12 to 15 years of experience to design, develop, and implement scalable and reliable cloud-based contact center solutions using Amazon Connect and AWS ecosystem services. You will play a key role in translating business needs into technical solutions and lead implementation across clients or business units.

Key Responsibilities
- Architect and design contact center solutions using Amazon Connect and AWS services such as Lambda, Lex, DynamoDB, S3, and CloudWatch.
- Lead the end-to-end implementation and configuration of Amazon Connect.
- Integrate Amazon Connect with CRMs (Salesforce, ServiceNow, etc.), ticketing systems, and third-party tools.
- Define call flows, IVR designs, routing profiles, and queue configurations.
- Implement Contact Lens, real-time metrics, and historical reporting.
- Collaborate with cross-functional teams (developers, business analysts, project managers).
- Create technical documentation, diagrams, and handoff materials.
- Stay updated on AWS best practices and new Amazon Connect features.
- Provide technical leadership and mentorship to development and support teams.

Required Skills
- Proven experience designing and deploying Amazon Connect solutions.
- Strong hands-on knowledge of AWS Lambda, IAM, S3, DynamoDB, Kinesis, and CloudFormation.
- Experience with Amazon Lex and AI/ML for voice bots.
- Proficiency in programming/scripting (JavaScript, Node.js).
- Familiarity with CRM integrations, especially Salesforce Service Cloud Voice.
- Understanding of telephony concepts: SIP, DID, ACD, IVR, CTI.
- Experience with CI/CD pipelines and version control (Git).
- Strong documentation and communication skills.

Preferred Skills
- AWS Certified Solutions Architect or Amazon Connect accreditation.
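To make the Lambda integration concrete, here is a hedged sketch of a handler invoked from an Amazon Connect contact flow: Connect passes caller details under event["Details"], and the returned dictionary must be a flat map of string attributes the flow can branch on. The DynamoDB table and its key are assumptions for illustration.

```python
# Sketch of a Lambda handler invoked from an Amazon Connect contact flow.
# Connect supplies caller details under event["Details"]; the return value
# must be a flat map of string attributes. The table name/key are assumed.
import boto3

table = boto3.resource("dynamodb").Table("customer-profiles")  # assumed table


def handler(event, context):
    caller = event["Details"]["ContactData"]["CustomerEndpoint"]["Address"]
    resp = table.get_item(Key={"phone_number": caller})
    profile = resp.get("Item") or {}
    return {
        "customerKnown": "true" if profile else "false",
        "preferredQueue": profile.get("preferred_queue", "general"),
    }
```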
Posted 1 week ago
8.0 - 13.0 years
22 - 30 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
LOCATION: PAN INDIA
Experience: 8+ years
Support Model: 24x7 rotational

Role Overview:
- Handle service delivery and ensure performance across all Amazon Connect support areas.
- Oversee overall support operations, enhancements, and system updates.
- Act as the primary escalation point for incidents.
- Manage SLAs and ensure service standards are met.
- Identify process gaps and implement improvements.
- Lead and mentor junior engineers.
- Maintain relationships with internal and external stakeholders.

Skills Required:
- Deep hands-on experience with Amazon Connect.
- Strong knowledge of AWS Lambda, DynamoDB, S3.
- In-depth understanding of contact flows, queues, routing profiles, quick connects, telephony configuration, and call routing.
- Strong troubleshooting skills for WebRTC and voice issues.
- Experience with CloudWatch, Connect metrics, and CI/CD pipelines.
- Experience integrating with Salesforce (Service Cloud Voice).
- Good documentation and process improvement capability.
- Strong leadership and communication skills.
Posted 1 week ago
3.0 - 8.0 years
15 - 25 Lacs
Hyderabad, Bengaluru, Mumbai (All Areas)
Work from Office
LOCATION: PAN INDIA

Role 1
Experience: 3-5 years
Support Model: 24x7 rotational

Role Overview:
- Provide support on Amazon Connect-related incidents and user issues.
- Handle basic troubleshooting of voice, call routing, and UI-based configurations.
- Support change announcements and basic deployment activities.
- Coordinate with L2/L3 engineers for escalation.
- Maintain documentation and update the knowledge base.

Skills Required:
- Hands-on experience with Amazon Connect (basic flows, routing, and settings).
- Exposure to AWS Lambda, S3, DynamoDB.
- Basic understanding of WebRTC and voice troubleshooting.
- Familiarity with CloudWatch and Connect metrics.
- Willingness to learn Salesforce integration (Service Cloud Voice).
- Strong willingness to work in a support model and take ownership.

Role 2 (L2 Support)
Experience: 5-8 years
Support Model: 24x7 rotational

Role Overview:
- Provide L2-level support for Amazon Connect and associated AWS services.
- Address incidents and troubleshoot system or telephony-related issues.
- Support service delivery and ensure announced changes are implemented.
- Maintain SLAs and escalate where required.
- Contribute to documentation and improvement plans.
- Support deployment through the CI/CD pipeline.

Skills Required:
- Strong hands-on experience with Amazon Connect.
- Working knowledge of Lambda, DynamoDB, S3.
- Good understanding of call flows, routing, and WebRTC troubleshooting.
- Familiarity with CloudWatch, Connect metrics, CI/CD.
- Exposure to Salesforce integration (Service Cloud Voice) is helpful.
- Ability to work independently on issue resolution.
- Good communication and support handling.
Posted 1 week ago
5.0 - 10.0 years
15 - 25 Lacs
Hyderabad/Secunderabad, Bangalore/Bengaluru, Delhi / NCR
Hybrid
Genpact (NYSE: G) is a global professional services and solutions firm delivering outcomes that shape the future. Our 125,000+ people across 30+ countries are driven by our innate curiosity, entrepreneurial agility, and desire to create lasting value for clients. Powered by our purpose, the relentless pursuit of a world that works better for people, we serve and transform leading enterprises, including the Fortune Global 500, with our deep business and industry knowledge, digital operations services, and expertise in data, technology, and AI.

Inviting applications for the role of Lead Consultant - Data Engineer (AWS + Python, Spark, Kafka for ETL)!

Responsibilities:
- Develop, deploy, and manage ETL pipelines using AWS services, Python, Spark, and Kafka.
- Integrate structured and unstructured data from various data sources into data lakes and data warehouses.
- Design and deploy scalable, highly available, and fault-tolerant AWS data processes using AWS data services (Glue, Lambda, Step, Redshift).
- Monitor and optimize the performance of cloud resources to ensure efficient utilization and cost-effectiveness.
- Implement and maintain security measures to protect data and systems within the AWS environment, including IAM policies, security groups, and encryption mechanisms.
- Migrate application data from legacy databases to cloud-based solutions (Redshift, DynamoDB, etc.) for high availability at low cost.
- Develop application programs using Big Data technologies such as Apache Hadoop and Apache Spark, with appropriate cloud-based services such as Amazon AWS.
- Build data pipelines by building ETL (Extract-Transform-Load) processes.
- Implement backup, disaster recovery, and business continuity strategies for cloud-based applications and data.
- Responsible for analysing business and functional requirements, which involves a review of existing system configurations and operating methodologies as well as understanding evolving business needs.
- Analyse requirements/user stories in business meetings and strategize the impact of requirements on different platforms/applications; convert business requirements into technical requirements.
- Participate in design reviews to provide input on functional requirements, product designs, schedules, and/or potential problems.
- Understand the current application infrastructure and suggest cloud-based solutions that reduce operational cost and require minimal maintenance while providing high availability with improved security.
- Perform unit testing on modified software to ensure that new functionality works as expected while existing functionality continues to work in the same way.
- Coordinate with release management and other supporting teams to deploy changes to the production environment.

Qualifications we seek in you!

Minimum Qualifications:
- Experience in designing and implementing data pipelines, building data applications, and data migration on AWS.
- Strong experience implementing data lakes using AWS services like Glue, Lambda, Step, Redshift.
- Experience with Databricks will be an added advantage.
- Strong experience in Python and SQL.
- Proven expertise in AWS services such as S3, Lambda, Glue, EMR, and Redshift.
- Advanced programming skills in Python for data processing and automation.
- Hands-on experience with Apache Spark for large-scale data processing.
- Experience with Apache Kafka for real-time data streaming and event processing.
- Proficiency in SQL for data querying and transformation.
- Strong understanding of security principles and best practices for cloud-based environments.
- Experience with monitoring tools and implementing proactive measures to ensure system availability and performance.
- Excellent problem-solving skills and the ability to troubleshoot complex issues in a distributed, cloud-based environment.
- Strong communication and collaboration skills to work effectively with cross-functional teams.

Preferred Qualifications / Skills:
- Master's degree in Computer Science, Electronics, or Electrical Engineering.
- AWS Data Engineering & Cloud certifications, Databricks certifications.
- Experience with multiple data integration technologies and cloud platforms.
- Knowledge of the Change & Incident Management process.

Genpact is an Equal Opportunity Employer and considers applicants for all positions without regard to race, color, religion or belief, sex, age, national origin, citizenship status, marital status, military/veteran status, genetic information, sexual orientation, gender identity, physical or mental disability, or any other characteristic protected by applicable laws. Genpact is committed to creating a dynamic work environment that values diversity and inclusion, respect and integrity, customer focus, and innovation. Get to know us at genpact.com and on LinkedIn, X, YouTube, and Facebook. Furthermore, please note that Genpact does not charge fees to process job applications, and applicants are not required to pay to participate in our hiring process in any other way. Examples of such scams include purchasing a 'starter kit,' paying to apply, or purchasing equipment or training.
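A rough AWS Glue job sketch for the Glue-based ETL flow this role describes: read a catalogued source, transform with Spark, and write curated Parquet to the lake. The database, table, and bucket names are placeholders, and the awsglue imports only resolve inside the Glue runtime.

```python
# Hedged AWS Glue job sketch: read a Data Catalog table, deduplicate, and
# write partitioned Parquet to a curated S3 prefix. Names are placeholders;
# this script is meant to run inside the AWS Glue job runtime.
import sys

from awsglue.context import GlueContext
from awsglue.job import Job
from awsglue.utils import getResolvedOptions
from pyspark.context import SparkContext
from pyspark.sql import functions as F

args = getResolvedOptions(sys.argv, ["JOB_NAME"])
glue_context = GlueContext(SparkContext.getOrCreate())
job = Job(glue_context)
job.init(args["JOB_NAME"], args)

source = glue_context.create_dynamic_frame.from_catalog(
    database="raw_db", table_name="orders"
).toDF()

curated = source.dropDuplicates(["order_id"]).withColumn("load_date", F.current_date())

curated.write.mode("append").partitionBy("load_date").parquet(
    "s3://example-curated-bucket/orders/"
)

job.commit()
```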
Posted 1 week ago
7.0 - 8.0 years
11 - 12 Lacs
Hyderabad
Work from Office
We are seeking a highly skilled Java Full Stack Engineer to join our dynamic development team. In this role, you will be responsible for designing, developing, and maintaining both frontend and backend components of our applications using Java and associated technologies. You will collaborate with cross-functional teams to deliver robust, scalable, and high-performing software solutions that meet our business needs. The ideal candidate will have a strong background in Java programming, experience with modern frontend frameworks, and a passion for full-stack development.

Requirements:
- Bachelor's degree in Computer Science, Engineering, or a related field.
- 7 to 8+ years of experience in full-stack development, with a strong focus on Java.

Java Full Stack Developer Roles & Responsibilities:
- Develop scalable web applications using Java (Spring Boot) for the backend and React/Angular for the frontend.
- Implement RESTful APIs to facilitate communication between frontend and backend.
- Design and manage databases using MySQL, PostgreSQL, Oracle, or MongoDB.
- Write complex SQL queries and procedures, and perform database optimization.
- Build responsive, user-friendly interfaces using HTML, CSS, and JavaScript, with frameworks such as Bootstrap, React, or Angular, plus Node.js and Python integration.
- Integrate APIs with frontend components.
- Participate in designing microservices and modular architecture.
- Apply design patterns and object-oriented programming (OOP) concepts.
- Write unit and integration tests using JUnit, Mockito, Selenium, or Cypress.
- Debug and fix bugs across full-stack components.
- Use Git, Jenkins, Docker, and Kubernetes for version control, continuous integration, and deployment.
- Participate in code reviews, automation, and monitoring.
- Deploy applications on AWS, Azure, or Google Cloud platforms.
- Use Elastic Beanstalk, EC2, S3, or Cloud Run for backend hosting.
- Work in Agile/Scrum teams; attend daily stand-ups, sprints, and retrospectives, and deliver iterative enhancements.
- Document code, APIs, and configurations.
- Collaborate with QA, DevOps, Product Owners, and other stakeholders.

Must-Have Skills:
- Java Programming: Deep knowledge of the Java language, its ecosystem, and best practices.
- Frontend Technologies: Proficiency in HTML, CSS, JavaScript, and modern frontend frameworks such as React or Angular.
- Backend Development: Expertise in developing and maintaining backend services using Java, Spring, and related technologies.
- Full Stack Development: Experience in both frontend and backend development, with the ability to work across the entire application stack.

Soft Skills:
- Problem-Solving: Ability to analyze complex problems and develop effective solutions.
- Communication Skills: Strong verbal and written communication skills to collaborate effectively with cross-functional teams.
- Analytical Thinking: Ability to think critically and analytically to solve technical challenges.
- Time Management: Capable of managing multiple tasks and deadlines in a fast-paced environment.
- Adaptability: Ability to quickly learn and adapt to new technologies and methodologies.

Interview Mode: F2F for candidates residing in Hyderabad; Zoom for other states
Location: 43/A, MLA Colony, Road No. 12, Banjara Hills, 500034
Time: 2 - 4 pm
Posted 1 week ago
2.0 - 5.0 years
6 - 17 Lacs
Noida
Work from Office
Responsibilities:
* Design, develop, test, and maintain Vue.js applications using TypeScript, Node.js, GraphQL, and AWS services.
* Collaborate with cross-functional teams on API Gateway integration and DynamoDB data management.
Food allowance
Posted 1 week ago
3.0 - 6.0 years
7 - 12 Lacs
Bengaluru
Work from Office
Job Description: Intellimind is seeking a PL/SQL Developer with 3-6 years of hands-on experience to join our expanding IT team. The ideal candidate will possess robust technical knowledge of Oracle databases, be skilled in creating and configuring application pages using Oracle objects, and have actively contributed to both the design and development phases. This role involves collaborating with a team of skilled professionals to drive our database initiatives forward.

Key Responsibilities:
- Participate in the full lifecycle of Oracle PL/SQL development.
- Develop, test, and implement Oracle PL/SQL packages, tables, views, procedures, functions, and triggers.
- Create and configure application pages using Oracle or other database objects based on the product training provided by us.
- Assist in the analysis and gathering of business requirements.
- Improve query performance and optimize queries using both optimization tools and manual methods.
- Collaborate with other team members to integrate databases with other applications.
- Maintain data integrity and security using quality procedures and Oracle features.
- Follow assigned tasks and resolve issues using ticketing tools.
- Conduct root-cause analysis of technical and application issues.
- Create documentation for technical specifications and user manuals.
- Prepare development time estimates for requirements.
- Participate in regular team meetings to discuss ongoing projects and potential roadblocks with project managers and the operations team.
- Stay updated with the latest Oracle features, technologies, and best practices.
- Work closely with the business and operations teams, collaborating effectively to ensure alignment on project goals and priorities.

Qualifications & Skills:
- Bachelor's degree in Computer Science, Information Technology, or a related field.
- 4-5 years of hands-on experience with Oracle PL/SQL development.
- Proficient in Oracle Database SQL and PL/SQL with a strong working knowledge of DDL, DML, DQL, DCL, and TCL commands.
- Strong working knowledge of IDE tools like Toad for Oracle and optimization tools.
- Working experience with AWS RDS databases and NoSQL databases.
- Familiarity with database design principles and normalization.
- Strong problem-solving abilities and attention to detail.
- Ability to work both independently and as part of a collaborative team.
- Effective communication skills, both written and verbal.
- Knowledge of XML batch configuration and integration with Oracle.
- Experience with version control tools like GitHub.
- Ensure close collaboration with the Reporting Manager, internal teams, and stakeholders.

Working Hours: The shift timings for this role are flexible and may vary between 9:00 AM to 6:00 PM and 1:00 PM to 10:00 PM. Candidates should be available for both shifts as required.
Posted 1 week ago
4.0 - 9.0 years
7 - 17 Lacs
Pune, Chennai, Mumbai (All Areas)
Hybrid
Client name: Atos Syntel. Payroll company: Compunnel Inc. in Noida, which has completed six months of direct collaboration with Atos.
Job Location: Pune/Mumbai/Chennai
Experience Required: 4+ years
Mode of Work: Hybrid; 3 days work from the office and 2 days work from home
Job Title: AWS Java Lead

- Should have a minimum of 4+ years of experience in the software industry.
- Must have experience in Java, data structures, algorithms, Spring Boot, microservices, REST APIs, design patterns, and problem solving, plus knowledge of any cloud.
- 4+ years of experience with AWS (S3, Lambda, DynamoDB, API Gateway, etc.).
- Hands-on with engineering excellence, AWS CloudFormation, the AWS DevOps toolchain, and related practices.
- Excellent problem-solving and critical thinking.
- Independent, with strong ownership of business problems and technical solutions.
- Strong communication and interpersonal skills.
- Experience with open source (Apache projects, Spring, Maven, etc.).
- Expert knowledge of the Java language, platform, ecosystem, and underlying concepts and constructs.
- Knowledge of common design patterns and design principles.
- Good knowledge of networking and security constructs specific to AWS.
- An AWS Associate or Professional Solutions Architect certification will carry more weight.

Please fill in all the essential details given below, attach your updated resume, and send it to ralish.sharma@compunnel.com
1. Total Experience:
2. Relevant Experience in Java:
3. Relevant Experience in AWS:
4. Experience in S3/Lambda/DynamoDB/API Gateway:
5. Experience in Design Patterns:
6. Experience in Spring Boot:
7. Experience in Microservices:
8. Current Company:
9. Current Designation:
10. Highest Education:
11. Notice Period:
12. Current CTC:
13. Expected CTC:
14. Current Location:
15. Preferred Location:
16. Hometown:
17. Contact No:
18. If you have an offer from another company, please mention the offer amount and offer location:
19. Reason for looking for a change:
20. PAN card:

If the job description is suitable for you, please get in touch with me at the number below: 9910044363.
Posted 1 week ago
2.0 - 7.0 years
10 - 20 Lacs
Bengaluru
Work from Office
Apply Here (mandatory to consider your profile): https://forms.gle/8UigBWPhwYFM6Uuu5

Hiring for a FAANG company.
Note: This position is part of a program designed to support women professionals returning to the workforce after a career break (9+ months career gap).

The role is within a dynamic and fast-evolving team focused on developing innovative technology solutions to convert offline customers into online customers. The team builds scalable, strategic, and sustainable products tailored to market needs, often deploying solutions that can be adapted globally. In this position, you will lead end-to-end development of solutions, leveraging a wide range of technologies, including cloud computing platforms, big data, machine learning, mobile platforms, APIs, and modern frontend frameworks. Your responsibilities will include maintaining high code-quality standards, optimizing development processes, and mentoring junior engineers. This role thrives in a fast-paced environment where delivering impactful features and products is key.

The ideal candidate is a skilled software engineer with experience building and launching distributed systems at scale. You are adaptable, thrive in dynamic and entrepreneurial environments, and are eager to mentor others. You excel at managing competing priorities, navigating ambiguity, and making data-driven decisions. Strong communication skills and the ability to influence and lead are critical for success in this role.

Key Responsibilities:
- Collaborate with senior engineers to design and deliver high-quality technology solutions.
- Contribute to the development of distributed workflows hosted in a cloud-native architecture.
- Maintain operational excellence for a rapidly scaling technology stack.
- Drive innovation through patents, technical presentations, and ideation sessions.
- Play a key role in hiring and nurturing technical talent.
- Define and measure success metrics to guide the evolution of technology products.

Basic Qualifications:
- 2-8 years of professional software development experience (excluding internships).
- Proficiency in at least one software programming language.

Preferred Qualifications:
- Bachelor's degree in Computer Science or a related field (or equivalent experience).
- Coding efficiency.

Key Skills: Arrays, Graphs, User Acceptance Testing (UAT), MySQL, Oracle, DynamoDB, DNS, VPN, Unix, Java, C++, C#, Python, SQL, Kotlin, TypeScript, Greedy Algorithms, Backtracking, Infrastructure as Code (IaC), Cloud Platform Management, Serverless Computing, Continuous Integration (CI), IDEs, Jenkins, Docker, Web Design

Note: Work with FAANG companies, top MNCs, and fast-growing startups. Lorvensoft connects you to premier tech roles via our recruitment partners like JCurve.
Posted 1 week ago