6.0 - 14.0 years
0 Lacs
Hyderabad, Telangana
On-site
We have a job opportunity for a Back End Engineer position requiring 6-14 years of experience with Java, Spring Boot, microservices, Python, AWS or cloud-native deployment, EventBridge, API Gateway, DynamoDB, and CloudWatch. The ideal candidate has at least 7 years of experience with these technologies and is comfortable working with complex code and requirements. The tech stack for this position includes Java, Spring Boot, microservices, Python, AWS, EventBridge, API Gateway, DynamoDB, and CloudWatch.

Qualifications include expertise in Spring Boot (annotations, autowiring with reflection, Spring starters, auto-configuration vs. configuration), CI/CD tools, Gradle or Maven, Docker and containers (scale-up and scale-down, health checks), distributed tracing, exception handling in microservices, lambda expressions, threads, and streams. Knowledge of GraphQL, prior experience on projects handling large amounts of PII data, or experience in the financial services industry is preferred.

The job offers the opportunity to work on bleeding-edge projects and collaborate with a highly motivated team, a competitive salary, a flexible schedule, a benefits package (medical insurance, sports, corporate social events), professional development opportunities, and a well-equipped office.

Grid Dynamics (NASDAQ: GDYN) is the company offering this opportunity. They are a leading provider of technology consulting, platform and product engineering, AI, and advanced analytics services. With a focus on solving technical challenges and enabling positive business outcomes for enterprise companies undergoing business transformation, Grid Dynamics has expertise in enterprise AI, data, analytics, cloud & DevOps, application modernization, and customer experience. Founded in 2006, Grid Dynamics is headquartered in Silicon Valley with offices across the Americas, Europe, and India.
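The qualifications above call out threads and exception handling in microservices. As a minimal sketch of that pattern, shown in Python for brevity even though this role is Java-focused, here is a fan-out of concurrent downstream calls where one failing service does not abort the whole aggregation. The service names and the `call_service` stand-in are illustrative, not part of the posting:

```python
from concurrent.futures import ThreadPoolExecutor, as_completed

def call_service(name):
    # Illustrative stand-in for an HTTP call to a downstream microservice.
    if name == "billing":
        raise TimeoutError(f"{name} did not respond")
    return {"service": name, "status": "ok"}

def fan_out(services):
    # Fan calls out on a thread pool; collect successes and per-call
    # failures separately instead of letting one exception stop everything.
    results, errors = {}, {}
    with ThreadPoolExecutor(max_workers=4) as pool:
        futures = {pool.submit(call_service, s): s for s in services}
        for fut in as_completed(futures):
            name = futures[fut]
            try:
                results[name] = fut.result()
            except Exception as exc:
                errors[name] = str(exc)
    return results, errors
```

In a Java service the same shape would typically use an `ExecutorService` with per-future exception handling.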
Posted 1 month ago
3.0 - 7.0 years
5 - 9 Lacs
Gurugram
Work from Office
Job Description: As an L2 AWS Support Engineer, you will provide advanced technical support for AWS-based solutions, troubleshooting and resolving complex issues related to networking, security, and automation.

Key Responsibilities:
- Advanced Troubleshooting: Investigate and resolve issues related to networking (VPC, subnets, security groups) and storage. Analyze and fix application performance issues on AWS infrastructure.
- Automation: Develop and maintain scripts for routine tasks using Python, Bash, or the AWS CLI. Implement Infrastructure as Code (IaC) using tools like AWS CloudFormation or Terraform. Automate common Kubernetes tasks, including writing and maintaining Terraform modules for provisioning EKS clusters and associated resources.
- Cluster Management: Create and manage EKS clusters using the AWS Management Console, AWS CLI, or Terraform. Manage Kubernetes resources such as namespaces, deployments, and services.
- Backup & Recovery: Configure and verify backups, snapshots, and disaster recovery plans. Perform DR drills as per defined procedures.
- Optimization: Monitor and optimize AWS resource utilization and costs. Suggest improvements for operational efficiency.
- Support Escalations: Address Level 2 support tickets and provide resolutions for moderately complex issues.
- Collaboration: Assist Level 1 engineers with escalations and mentor them as required. Work closely with application teams to deploy and manage services effectively.

Required Skills and Qualifications:
- Technical Skills: Advanced understanding of AWS core services (EC2, S3, VPC, IAM, Lambda, etc.). Strong knowledge of AWS automation, scripting (Bash, Python, PowerShell), and the CLI. Experience with AWS CloudFormation and Terraform. Understanding of AWS security best practices and identity and access management.
- Migration and Modernization: Assist with migrating workloads to AWS and modernizing existing infrastructure.
- Performance Optimization: Analyze AWS resource usage and identify optimization opportunities.
- Cost Optimization: Implement cost-saving measures, such as rightsizing instances and using reserved instances.
- Soft Skills: Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Ability to work independently and as part of a team. Customer-focused approach.

Certifications (Preferred): AWS Certified Solutions Architect - Associate; AWS Certified DevOps Engineer - Professional
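The cost-optimization bullet mentions rightsizing and reserved instances. A tiny back-of-the-envelope helper for the underlying arithmetic, comparing on-demand and reserved hourly pricing; the prices used below are placeholders, not real AWS rates:

```python
def reserved_savings(on_demand_hourly, reserved_hourly, hours_per_month=730):
    # Compare the monthly cost of running an instance on-demand versus on a
    # reserved-instance rate; return (monthly_savings, percent_saved).
    # 730 approximates the hours in an average month.
    on_demand = on_demand_hourly * hours_per_month
    reserved = reserved_hourly * hours_per_month
    savings = on_demand - reserved
    return savings, round(savings / on_demand * 100, 1)
```

For example, an instance at $0.10/hour on demand versus $0.06/hour reserved saves about 40% per month.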
Posted 1 month ago
4.0 - 6.0 years
6 - 10 Lacs
Tamil Nadu
Work from Office
Introduction to the Role: Are you passionate about unlocking the power of data to drive innovation and transform business outcomes? Join our cutting-edge Data Engineering team and be a key player in delivering scalable, secure, and high-performing data solutions across the enterprise. As a Data Engineer, you will play a central role in designing and developing modern data pipelines and platforms that support data-driven decision-making and AI-powered products. With a focus on Python, SQL, AWS, PySpark, and Databricks, you'll enable the transformation of raw data into valuable insights by applying engineering best practices in a cloud-first environment. We are looking for a highly motivated professional who can work across teams to build and manage robust, efficient, and secure data ecosystems that support both analytical and operational workloads.

Accountabilities:
- Design, build, and optimize scalable data pipelines using PySpark, Databricks, and SQL on AWS cloud platforms.
- Collaborate with data analysts, data scientists, and business users to understand data requirements and ensure reliable, high-quality data delivery.
- Implement batch and streaming data ingestion frameworks from a variety of sources (structured, semi-structured, and unstructured data).
- Develop reusable, parameterized ETL/ELT components and data ingestion frameworks.
- Perform data transformation, cleansing, validation, and enrichment using Python and PySpark.
- Build and maintain data models, data marts, and logical/physical data structures that support BI, analytics, and AI initiatives.
- Apply best practices in software engineering, version control (Git), code reviews, and agile development processes.
- Ensure data pipelines are well tested, monitored, and robust, with proper logging and alerting mechanisms.
- Optimize performance of distributed data processing workflows and large datasets.
- Leverage AWS services (such as S3, Glue, Lambda, EMR, Redshift, Athena) for data orchestration and lakehouse architecture design.
- Participate in data governance practices and ensure compliance with data privacy, security, and quality standards.
- Contribute to documentation of processes, workflows, metadata, and lineage using tools such as data catalogs or Collibra (if applicable).
- Drive continuous improvement in engineering practices, tools, and automation to increase productivity and delivery quality.

Essential Skills / Experience:
- 4 to 6 years of professional experience in Data Engineering or a related field.
- Strong programming experience with Python, including data wrangling, pipeline automation, and scripting.
- Deep expertise in writing complex, optimized SQL queries on large-scale datasets.
- Solid hands-on experience with PySpark and distributed data processing frameworks.
- Expertise working with Databricks for developing and orchestrating data pipelines.
- Experience with AWS cloud services such as S3, Glue, EMR, Athena, Redshift, and Lambda.
- Practical understanding of ETL/ELT development patterns and data modeling principles (star/snowflake schemas).
- Experience with job orchestration tools like Airflow, Databricks Jobs, or AWS Step Functions.
- Understanding of data lake, lakehouse, and data warehouse architectures.
- Familiarity with DevOps and CI/CD tools for code deployment (e.g., Git, Jenkins, GitHub Actions).
- Strong troubleshooting and performance optimization skills in large-scale data processing environments.
- Excellent communication and collaboration skills, with the ability to work in cross-functional agile teams.

Desirable Skills / Experience:
- AWS or Databricks certifications (e.g., AWS Certified Data Analytics, Databricks Data Engineer Associate/Professional).
- Exposure to data observability, monitoring, and alerting frameworks (e.g., Monte Carlo, Datadog, CloudWatch).
- Experience working in healthcare, life sciences, finance, or another regulated industry.
- Familiarity with data governance and compliance standards (GDPR, HIPAA, etc.).
- Knowledge of modern data architectures (Data Mesh, Data Fabric).
- Exposure to streaming data tools like Kafka, Kinesis, or Spark Structured Streaming.
- Experience with data visualization tools such as Power BI, Tableau, or QuickSight.
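The accountabilities above include data cleansing, validation, and enrichment. A minimal sketch of that kind of transformation in plain Python; in production the same logic would typically be expressed as PySpark DataFrame operations, and the field names here are illustrative assumptions:

```python
def cleanse(records):
    # Normalize and validate raw records, dropping rows that fail
    # validation and enriching the rest with a default country value.
    cleaned = []
    for r in records:
        email = (r.get("email") or "").strip().lower()
        if "@" not in email:
            continue  # validation: require a plausible email address
        cleaned.append({
            "email": email,                                     # normalized
            "country": (r.get("country") or "unknown").strip().upper(),
        })
    return cleaned
```

The same drop/normalize/enrich steps map one-to-one onto `filter`, `withColumn`, and `fillna` calls in PySpark.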
Posted 1 month ago
9.0 - 12.0 years
16 - 20 Lacs
Hyderabad
Work from Office
Roles and Responsibilities
1. Architect and design scalable, maintainable, and high-performance backend systems using Python.
2. Lead the development of clean, modular, and reusable code components and services across various domains.
3. Own the technical roadmap for Python-based services, including refactoring strategies, modernization efforts, and integration patterns.
4. Provide expert guidance on system design, code quality, performance tuning, and observability.
5. Collaborate with DevOps teams to build CI/CD pipelines, containerization strategies, and robust cloud-native deployment patterns.
6. Mentor and support software engineers by enforcing strong engineering principles, design best practices, and performance debugging techniques.
7. Evaluate and recommend new technologies or frameworks where appropriate, particularly in the areas of AI/ML and GenAI integration.

Skills and Experience Required (Core Python & Architectural Skills):
Strong hands-on experience in advanced Python programming (7+ years), including:
1. Language internals (e.g., decorators, metaclasses, descriptors)
2. Concurrency (asyncio, multiprocessing, threading)
3. Performance optimization and profiling (e.g., cProfile, py-spy)
4. Strong testing discipline (pytest, mocking, coverage analysis)
Proven track record in designing scalable, distributed systems:
1. Event-driven architectures, service-oriented and microservice-based systems.
2. Experience with REST/gRPC APIs, async queues, caching strategies, and database modeling.
Proficiency in building and deploying cloud-native applications:
1. Strong AWS exposure (EC2, Lambda, S3, IAM, etc.) and Infrastructure-as-Code (Terraform/CDK)
2. CI/CD pipelines, Docker, Kubernetes, GitOps
Deep understanding of software architecture patterns (e.g., hexagonal, layered, DDD). Excellent debugging, tracing, and observability skills with tools like OpenTelemetry, Prometheus, and Grafana.

Desirable (Good-to-Have) GenAI / AI/ML Skills:
1. Exposure to Large Language Models (LLMs) and prompt engineering.
2. Basic familiarity with Retrieval-Augmented Generation (RAG) and vector databases (FAISS, Pinecone, Weaviate).
3. Understanding of model fine-tuning concepts (LoRA, QLoRA, PEFT).
4. Experience using or integrating LangChain, LlamaIndex, or Hugging Face Transformers.
5. Familiarity with AWS AI/ML services like Bedrock and SageMaker.

Technology Stack:
- Languages: Python (primary), Bash, YAML/JSON
- Web Frameworks: FastAPI, Flask, gRPC
- Databases: PostgreSQL, Redis, MongoDB, DynamoDB
- Cloud Platform: AWS (must), GCP/Azure (bonus)
- DevOps & Deployment: Docker, Kubernetes, Terraform, GitHub Actions, ArgoCD
- Observability: OpenTelemetry, Prometheus, Grafana, Loki
- GenAI Tools (Optional): Bedrock, SageMaker, LangChain, Hugging Face

Preferred Profile:
- 8+ years of hands-on software development experience
- 3+ years in a solution/technical architect or lead engineer role
- Strong problem-solving skills and architectural thinking
- Experience collaborating across teams and mentoring engineers
- Passion for building clean, scalable systems and openness to learning emerging technologies like GenAI
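The "language internals" bullet above calls out decorators. A small sketch of a memoizing decorator, illustrating closures, `functools.wraps`, and attaching state to the wrapper; `fib` is just a demo workload, not part of the posting:

```python
import functools

def memoize(func):
    # Cache results keyed by positional arguments. functools.wraps copies
    # __name__, __doc__, etc. from the wrapped function onto the wrapper.
    cache = {}
    @functools.wraps(func)
    def wrapper(*args):
        if args not in cache:
            cache[args] = func(*args)
        return cache[args]
    wrapper.cache = cache  # expose the cache for inspection
    return wrapper

@memoize
def fib(n):
    # Naive recursion becomes linear-time once memoized, because the
    # recursive calls go through the caching wrapper.
    return n if n < 2 else fib(n - 1) + fib(n - 2)
```

Without the decorator, `fib(20)` makes thousands of redundant calls; with it, each value is computed once.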
Posted 1 month ago
3.0 - 8.0 years
10 - 15 Lacs
Gurugram
Work from Office
As an L2 AWS Support Engineer, you will provide advanced technical support for AWS-based solutions, troubleshooting and resolving complex issues related to networking, security, and automation.

Key Responsibilities:
- Develop, manage, and optimize CI/CD pipelines using tools like Jenkins and Opsera.
- Automate infrastructure provisioning using Terraform and CloudFormation.
- Administer and optimize key AWS services, including EC2, S3, RDS, Lambda, and IAM.
- Strengthen security by implementing best practices for IAM, encryption, and network security (VPC, Security Groups, WAF, NACLs, etc.).
- Design, configure, and maintain AWS networking components such as VPCs, Subnets, Route 53, Transit Gateway, and Security Groups.
- Advanced Troubleshooting: Investigate and resolve issues related to networking (VPC, subnets, security groups) and storage. Analyze and fix application performance issues on AWS infrastructure.
- Cluster Management: Create and manage EKS clusters using the AWS Management Console, AWS CLI, or Terraform. Manage Kubernetes resources such as namespaces, deployments, and services.
- Backup & Recovery: Configure and verify backups, snapshots, and disaster recovery plans. Perform DR drills as per defined procedures.
- Optimization: Monitor and optimize AWS resource utilization and costs. Suggest improvements for operational efficiency.

Required Skills:
- Technical Skills: Advanced understanding of AWS core services (EC2, S3, VPC, IAM, Lambda, etc.). Strong knowledge of AWS automation, scripting (Bash, Python, PowerShell), and the CLI. Experience with AWS CloudFormation and Terraform. Understanding of AWS security best practices and identity and access management.
- Migration and Modernization: Assist with migrating workloads to AWS and modernizing existing infrastructure.
- Performance Optimization: Analyze AWS resource usage and identify optimization opportunities.
- Cost Optimization: Implement cost-saving measures, such as rightsizing instances and using reserved instances.
- Soft Skills: Excellent problem-solving and analytical skills. Strong communication and interpersonal skills. Ability to work independently and as part of a team. Customer-focused approach.

Certifications (Preferred): AWS Certified Solutions Architect - Associate; AWS Certified DevOps Engineer - Professional
Posted 1 month ago
2.0 - 7.0 years
3 - 7 Lacs
Noida
Work from Office
We are looking for a skilled Java + AWS Developer with 2 to 7 years of experience to join our team at VAYUZ Technologies. The ideal candidate will have expertise in developing scalable and efficient software systems using Java and AWS.

Roles and Responsibilities:
- Design, develop, and deploy high-quality Java-based applications on AWS cloud platforms.
- Collaborate with cross-functional teams to identify and prioritize project requirements.
- Develop and maintain large-scale data processing pipelines using Java and AWS services.
- Ensure the scalability, security, and performance of developed applications.
- Troubleshoot and resolve technical issues efficiently.
- Participate in code reviews and contribute to improving overall code quality.

Job Requirements:
- Proficiency in the Java programming language and its ecosystem.
- Experience with AWS cloud platforms, including EC2, S3, Lambda, and DynamoDB.
- Strong understanding of software development principles, patterns, and practices.
- Excellent problem-solving skills and attention to detail.
- Ability to work collaboratively in a team environment.
- Strong communication and interpersonal skills.
Posted 1 month ago
4.0 - 7.0 years
20 - 22 Lacs
Noida, Hyderabad
Work from Office
- 4-5 years of experience with AWS, with the Cloud Practitioner certification
- Experience working with CloudFormation to create AWS components
- Experience working with Terraform to create cloud components
- Working experience creating Lambdas using Java and Python
- Working experience with AWS Batch using Java and Python

Good to have:
- Experience with AppFlow and EventBridge
- Experience integrating with external applications like Salesforce
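Since writing Lambdas in Python comes up in this and several other listings here, a minimal sketch of the Python Lambda handler shape, invoked locally with a fake event; the event fields are illustrative, and the CloudFormation/Terraform wiring the listing mentions is out of scope:

```python
import json

def handler(event, context):
    # A Lambda handler takes an event dict and a context object and returns
    # a JSON-serializable result. This response shape matches what an HTTP
    # front end (e.g., API Gateway) expects; for other triggers any
    # serializable dict works.
    name = event.get("name", "world")
    return {
        "statusCode": 200,
        "body": json.dumps({"message": f"hello {name}"}),
    }
```

Locally you can exercise it by calling `handler({"name": "batch"}, None)`; in AWS the runtime supplies both arguments.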
Posted 1 month ago
4.0 - 7.0 years
6 - 10 Lacs
Hyderabad, Gurugram, Ahmedabad
Work from Office
About the Role: Grade Level (for internal use): 10

The Role: Senior Software Developer

The Team: Do you love to collaborate and provide solutions? This team comes together across multiple different locations every single day to craft enterprise-grade applications that serve a large customer base with growing demand and usage. You will use a wide range of technologies and cultivate a collaborative environment with other internal teams.

The Impact: We focus primarily on developing, enhancing, and delivering required pieces of information and functionality to internal and external clients. You will have a highly visible role where even small changes have very wide impact.

What's in it for you:
- Opportunities for innovation and learning new state-of-the-art technologies
- To work in pure agile and scrum methodology

Responsibilities:
- Design and implement .NET-related projects.
- Perform analyses and articulate solutions.
- Design underlying engineering for use in multiple product offerings supporting a large volume of end users.
- Develop project plans with task breakdowns and estimates.
- Manage and improve existing solutions.
- Solve a variety of complex problems and figure out possible solutions, weighing the costs and benefits.

What we're Looking For:
Basic Qualifications:
- Bachelor's degree in computer science or equivalent
- 7+ years related experience
- Passionate, smart, and articulate developer
- Strong C#, .NET, and SQL skills
- Experience implementing REST APIs
- Able to demonstrate strong OOP skills
- Able to demonstrate a strong understanding of SOLID principles and design patterns
- Understanding of working with container platforms and container orchestration systems
- Experience working with AWS services such as Lambda, SQS, S3, API Gateway, etc.
- Able to work well individually and with a team
- Strong problem-solving skills
- Good work ethic, self-starter, and results-oriented
- Agile/Scrum experience

Preferred:
- Understanding of the use of AI, Copilot, and agents
- Understanding of event-driven architecture to create scalable components
- TDD (Test-Driven Development)
- Knowledge of Python will be a plus

Our Purpose: Progress is not a self-starter. It requires a catalyst to be set in motion. Information, imagination, people, technology: the right combination can unlock possibility and change the world. Our world is in transition and getting more complex by the day. We push past expected observations and seek out new levels of understanding so that we can help companies, governments, and individuals make an impact on tomorrow. At S&P Global we transform data into Essential Intelligence, pinpointing risks and opening possibilities. We Accelerate Progress.

Our People and Our Values: Integrity, Discovery, Partnership. At S&P Global, we focus on Powering Global Markets. Throughout our history, the world's leading organizations have relied on us for the Essential Intelligence they need to make confident decisions about the road ahead. We start with a foundation of integrity in all we do, bring a spirit of discovery to our work, and collaborate in close partnership with each other and our customers to achieve shared goals.

Benefits: We take care of you, so you can take care of business. We care about our people. That's why we provide everything you and your career need to thrive at S&P Global.
- Health & Wellness: Health care coverage designed for the mind and body.
- Continuous Learning: Access a wealth of resources to grow your career and learn valuable new skills.
- Invest in Your Future: Secure your financial future through competitive pay, retirement planning, a continuing education program with a company-matched student loan contribution, and financial wellness programs.
- Family Friendly Perks: It's not just about you. S&P Global has perks for your partners and little ones, too, with some best-in-class benefits for families.
- Beyond the Basics: From retail discounts to referral incentive awards, small perks can make a big difference.
For more information on benefits by country, visit https://spgbenefits.com/benefit-summaries

Global Hiring and Opportunity at S&P Global: At S&P Global, we are committed to fostering a connected and engaged workplace where all individuals have access to opportunities based on their skills, experience, and contributions. Our hiring practices emphasize fairness, transparency, and merit, ensuring that we attract and retain top talent. By valuing different perspectives and promoting a culture of respect and collaboration, we drive innovation and power global markets.

Recruitment Fraud Alert: If you receive an email from a spglobalind.com domain or any other regionally based domains, it is a scam and should be reported to reportfraud@spglobal.com. S&P Global never requires any candidate to pay money for job applications, interviews, offer letters, pre-employment training, or for equipment/delivery of equipment. Stay informed and protect yourself from recruitment fraud by reviewing our guidelines, fraudulent domains, and how to report suspicious activity here.

Equal Opportunity Employer: S&P Global is an equal opportunity employer and all qualified candidates will receive consideration for employment without regard to race/ethnicity, color, religion, sex, sexual orientation, gender identity, national origin, age, disability, marital status, military veteran status, unemployment status, or any other status protected by law. Only electronic job submissions will be considered for employment. If you need an accommodation during the application process due to a disability, please send an email to EEO.Compliance@spglobal.com and your request will be forwarded to the appropriate person.

US Candidates Only: The EEO is the Law Poster (http://www.dol.gov/ofccp/regs/compliance/posters/pdf/eeopost.pdf) describes discrimination protections under federal law. Pay Transparency Nondiscrimination Provision: https://www.dol.gov/sites/dolgov/files/ofccp/pdf/pay-transp_%20English_formattedESQA508c.pdf
Posted 1 month ago
8.0 - 10.0 years
5 - 9 Lacs
Hyderabad
Work from Office
Enterprise Development: coding, database integration, API development, back-end and front-end implementation, and unit testing. Stack: Node.js (Express.js, Nest.js), React, AWS (API Gateway, Lambda, ECS).
Posted 1 month ago
1.0 - 3.0 years
8 - 12 Lacs
Bengaluru
Work from Office
About The Role: We're looking for a backend developer to join our early engineering team and drive our product development. If you're someone who thrives on high ownership, can figure stuff out on your own, and wants to be part of the zero-to-one journey, this might be for you.

What You'll Do:
- Looking out for cool new technologies and implementing them for required internal as well as business use cases.
- Designing scalable architectures for backend systems.
- Optimising performance of applications for full-scale production deployments.
- Implementing business logic and developing APIs and services.
- Conceptualising and implementing scalable databases across various services.
- You'll also be hiring and mentoring junior engineers.

What makes you a good fit: If you can write code that works, we should be good, but read on (disclaimer: most of what follows is not a hard requirement):
- You have 1-3 years of experience building and scaling backend systems from scratch.
- You've built serverless backend architectures using AWS platforms and resources like Lambda, DynamoDB, Aurora, etc. (brownie points if you've worked with NodeJS, Redis, and PostgreSQL).
- You have managed deployment at scale using CI/CD integrations and implemented error management systems like Sentry.
Posted 1 month ago
3.0 - 6.0 years
5 - 8 Lacs
Nagpur
Work from Office
We are looking for a highly motivated and skilled individual to join our team as an AWS Training Program professional. The ideal candidate will have a strong background in IT Services & Consulting, with experience working on AWS projects.

Roles and Responsibilities:
- Collaborate with cross-functional teams to design and implement comprehensive training programs.
- Develop and deliver high-quality training materials, including presentations, handouts, and assessments.
- Conduct workshops and seminars to educate employees on AWS best practices and features.
- Evaluate the effectiveness of training programs and recommend improvements.
- Stay up to date with the latest AWS developments and incorporate new technologies into training programs.
- Provide coaching and mentoring to junior team members to enhance their skills and knowledge.

Job Requirements:
- Strong understanding of AWS services, including EC2, S3, Lambda, and CloudFormation.
- Experience with cloud-based technologies and platforms is desirable.
- Excellent communication and presentation skills are required.
- Ability to work collaboratively in a team environment and build strong relationships with stakeholders.
- Strong analytical and problem-solving skills, with attention to detail and the ability to meet deadlines.
- Familiarity with adult learning principles and instructional design methodologies is preferred.
Posted 1 month ago
3.0 - 8.0 years
15 - 25 Lacs
Bengaluru
Work from Office
DevOps - AWS/Azure - Terraform/Kubernetes:
- Hands-on experience with Terraform and Docker containerization.
- Hands-on experience with AWS services like S3, API Gateway, Lambda, ECS, and EKS.
- Hands-on experience in different scripting languages like Groovy, Python, Shell scripting, etc.
- Implement security controls and best practices throughout the CI/CD pipeline, including vulnerability scanning, static code analysis, and dependency management using tools such as SonarQube and Software Composition Analysis (SCA).

Primary Skills: AWS, Terraform, Docker, Jenkins, GitHub, Groovy script, and Python.
Secondary Skills: Ansible, SonarQube, GitHub Actions
Posted 1 month ago
5.0 - 10.0 years
7 - 11 Lacs
Noida
Work from Office
- Expert in Python (5+ years)
- Expert in Django or any similar framework (5+ years)
- Experience (2+ years) in TypeScript, JavaScript, and JS frameworks (Angular > 2 with Angular Material)
- Good knowledge of RDBMS (preferably Postgres)
- Experience and sound knowledge of AWS services (ECS, Lambda, deployment pipelines, etc.)
- Excellent written and verbal communication skills.
- Very good analytical and problem-solving skills.
- Ability to pick up new technologies.
- Write clean, maintainable, and efficient code.
- Willingness to learn and understand the business domain.

Mandatory Competencies: Programming Language - Python - Django; User Interface - HTML - HTML/CSS; Beh - Communication
Posted 1 month ago
5.0 - 7.0 years
15 - 18 Lacs
Pune
Hybrid
So, what’s the role all about? We are seeking a skilled and experienced DevOps Engineer to design, produce, and test high-quality software that meets specified functional and non-functional requirements within the time and resource constraints given.

How will you make an impact?
- Design, implement, and maintain CI/CD pipelines using Jenkins to support automated builds, testing, and deployments.
- Manage and optimize AWS infrastructure for scalability, reliability, and cost-effectiveness.
- Develop automation scripts and tools using shell scripting and other programming languages to streamline operational workflows.
- Collaborate with cross-functional teams (Development, QA, Operations) to ensure seamless software delivery and deployment.
- Monitor and troubleshoot infrastructure, build failures, and deployment issues to ensure high availability and performance.
- Implement and maintain robust configuration management practices and infrastructure-as-code principles.
- Document processes, systems, and configurations to ensure knowledge sharing and maintain operational consistency.
- Perform ongoing maintenance and upgrades (production and non-production).
- Occasional weekend or after-hours work as needed.

Have you got what it takes?
- Experience: 5-8 years in DevOps or a similar role.
- Cloud Expertise: Proficient in AWS services such as EC2, S3, RDS, Lambda, IAM, CloudFormation, or similar.
- CI/CD Tools: Hands-on experience with Jenkins pipelines (declarative and scripted).
- Scripting Skills: Proficiency in either shell scripting or PowerShell. Important: scripting/programming is integral to this role and will be a key focus in the interview process.
- Programming Knowledge: Familiarity with at least one programming language (e.g., Python, Java, or Go).
- Version Control: Experience with Git and Git-based workflows.
- Monitoring Tools: Familiarity with tools like CloudWatch, Prometheus, or similar.
- Problem-solving: Strong analytical and troubleshooting skills in a fast-paced environment.
- CDK knowledge in AWS DevOps.

You will have an advantage if you also have:
- Prior experience in development or automation (a significant advantage).
- Windows system administration (a significant advantage).
- Experience with monitoring and log analysis tools.
- Jenkins pipeline knowledge.

What’s in it for you? Join an ever-growing, market-disrupting, global company where the teams, comprised of the best of the best, work in a fast-paced, collaborative, and creative environment! As the market leader, every day at NiCE is a chance to learn and grow, and there are endless internal career opportunities across multiple roles, disciplines, domains, and locations. If you are passionate, innovative, and excited to constantly raise the bar, you may just be our next NiCEr!

Enjoy NiCE-FLEX! At NiCE, we work according to the NiCE-FLEX hybrid model, which enables maximum flexibility: 2 days working from the office and 3 days of remote work, each week. Naturally, office days focus on face-to-face meetings, where teamwork and collaborative thinking generate innovation, new ideas, and a vibrant, interactive atmosphere.

Requisition ID: 7876
Reporting into: Tech Manager
Role Type: Individual Contributor
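The monitoring and troubleshooting bullets above cover build failures. A small sketch of the kind of automation script such a role involves: scanning a CI build log for failure markers and producing a summary. The markers are illustrative assumptions; real Jenkins output varies by plugin and build tool:

```python
def summarize_build_log(log_text):
    # Collect lines containing common failure markers and return a short
    # summary suitable for attaching to a support ticket or alert.
    markers = ("ERROR", "FAILURE", "Traceback")
    hits = [line.strip() for line in log_text.splitlines()
            if any(m in line for m in markers)]
    return {
        "failed": bool(hits),
        "first_error": hits[0] if hits else None,
        "error_lines": len(hits),
    }
```

In practice a wrapper would feed this from `jenkins_url/consoleText` or a CloudWatch log stream and post the summary to a ticketing system.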
Posted 1 month ago
4.0 - 6.0 years
6 - 10 Lacs
Gurugram
Work from Office
About the Role: We are looking for a skilled and experienced Backend Developer with 5-6 years of hands-on experience to join our growing technology team. The ideal candidate will be responsible for designing, developing, and maintaining robust backend systems, with a strong focus on scalable microservices and cloud-native applications.

Key Responsibilities:
- Design, develop, and deploy backend services and APIs (REST & GraphQL) with a focus on performance, scalability, and reliability.
- Build and maintain serverless applications using AWS Chalice, FastAPI, or Flask and other AWS services.
- Strong experience with AWS services (Lambda, API Gateway, S3, DynamoDB, etc.).
- Collaborate with cross-functional teams including front-end developers, product managers, and DevOps engineers.
- Write clean, maintainable, and well-tested code in Python.
- Hands-on experience with Git commands.
- Familiarity with database systems (SQL and NoSQL).
- Monitor application performance and troubleshoot production issues.
- Participate in code reviews and ensure adherence to best practices and coding standards.

Good to Have:
- Develop microservices architecture and contribute to continuous integration and delivery (CI/CD) pipelines.
- Exposure to containerization tools like Docker and orchestration tools like Kubernetes.
- Knowledge of security best practices in backend development.
- Experience with monitoring and logging tools (e.g., CloudWatch, ELK stack).

Qualifications:
- 4-6 years of work experience in a relevant field.
- B.Tech/B.E/M.Tech or MCA degree from a reputed university; a computer science background is preferred.
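Several of these serverless backend roles involve returning responses in the shape API Gateway's Lambda proxy integration expects. A stdlib-only sketch of response shaping plus simple input validation; the route and field names (`get_user`, `pathParameters["id"]`) are illustrative assumptions, not from the posting:

```python
import json

def api_response(status, payload):
    # API Gateway's Lambda proxy integration expects statusCode, headers,
    # and a JSON *string* body (not a dict).
    return {
        "statusCode": status,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps(payload),
    }

def get_user(event):
    # Illustrative route handler: validate the path parameter before
    # doing any work, returning 400 on bad input.
    user_id = (event.get("pathParameters") or {}).get("id")
    if not user_id:
        return api_response(400, {"error": "missing id"})
    return api_response(200, {"id": user_id})
```

Frameworks like Chalice or FastAPI (via an adapter) generate this envelope for you; the sketch shows what they produce under the hood.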
Posted 1 month ago
4.0 - 6.0 years
6 - 10 Lacs
Gurugram
Work from Office
Role Description: As a Senior Software Engineer - AWS Python at Incedo, you will be responsible for developing and maintaining applications on the Amazon Web Services (AWS) platform. You will be expected to have a strong understanding of Python and AWS technologies, including EC2, S3, RDS, and Lambda. Roles & Responsibilities: Write high-quality code, participate in code reviews, design systems of varying complexity and scope, and create high-quality documents substantiating the architecture. Engage with clients, understand their technical requirements, and plan and liaise with other team members to develop the technical design and approach to deliver end-to-end solutions. Mentor and guide junior team members, review their code, establish quality gates, build and deploy code using CI/CD pipelines, apply secure coding practices, adopt unit-testing frameworks, provide better coverage, etc. Responsible for the team's growth. Technical Skills: Must have: Python, FastAPI, Uvicorn, SQLAlchemy, boto3, Lambda serverless, PyMySQL. Nice to have: AWS Lambda, Step Functions, ECR, ECS, S3, SNS, SQS, Docker, CI/CD. Proficiency in the Python programming language. Experience in developing and deploying applications on AWS. Knowledge of serverless computing and AWS Lambda. Must have excellent communication skills and be able to communicate complex technical information to non-technical stakeholders in a clear and concise manner. Must understand the company's long-term vision and align with it. Provide leadership, guidance, and support to team members, ensuring the successful completion of tasks, and promote a positive work environment that fosters collaboration and productivity, taking responsibility for the whole team. Qualifications: 4-6 years of work experience in a relevant field. B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
Posted 1 month ago
5.0 - 8.0 years
8 - 12 Lacs
Bengaluru
Work from Office
We are looking for Software Engineers who are passionate about building high-impact, scalable systems in a collaborative, fast-paced environment. At Exxat, you'll work on cutting-edge microservices-based web solutions that serve thousands of clinical students and healthcare practitioners across the US. This is your chance to contribute to mission-critical products that directly shape the future of healthcare education. If you thrive on solving complex problems and creating meaningful technology, we'd love to hear from you. KEY RESPONSIBILITIES: Solve complex programming problems, with an understanding of how to build HLDs and LLDs. Create high-level and low-level designs for the problem statement/feature. Hands-on expertise in high-performance backend languages like Go, C#/.NET, or Java, or in JavaScript/TypeScript along with frameworks such as React or Angular. Full-stack understanding of web/mobile/API/database development concepts and design patterns. Strive for quality, performance, usability, reliability, maintainability and extensibility. Design and develop high-quality microservices and features, working with architects and dev managers. Participate in and contribute to continuously improving Agile software development processes. Deliver clean code with automated unit tests. Work under the guidance of development managers and Product Owners to analyse, modify and implement various common business needs. Get involved in tech POCs to provide recommendations and apply the outcome to realisation. Able to mentor the team to scale up on best design, coding and code review processes. Implement unit, integration and other automated tests. Understanding of DevOps, automation testing, test-driven development, behaviour-driven development, serverless or microservices. DESIRED EXPERIENCE: 5+ years of experience building enterprise-grade software. Background in product-based or B2B SaaS environments, particularly involving scalable product development, is a strong advantage.
We look for good programmers (not good technologists) who bring a deep understanding of algorithms, data structures, and SOLID principles to the table. While we operate on a .NET stack, we are not hung up on specific technologies or programming languages. Understanding of Agile Scrum and SDLC principles is a plus. NICE TO HAVE SKILLS: Experience with AWS or Azure cloud technologies (such as Azure Functions, Lambda or DynamoDB)
Posted 1 month ago
2.0 - 4.0 years
8 - 12 Lacs
Bengaluru
Work from Office
About the role: The expectations from this role are two-fold: a Backend Developer who can perform data analysis and create outputs for Gen AI-related tasks while supporting standard data analysis tasks, and a Gen AI expert who can effectively understand and translate business requirements and provide a Gen AI-powered output independently. The person should be able to: Analyse data as needed from the data tables to generate data summaries and insights that are used for Gen AI and non-Gen AI work. Collaborate effectively with cross-functional teams (engineering, product, business) to ensure alignment and understanding. Create and use AI assistants to solve business problems. Be comfortable providing and advocating recommendations for a better user experience. Support product development teams, enabling them to create and manage APIs that interact with the Gen AI backend data and create a next-gen experience for our clients. Visualize and create data flow diagrams and materials required for effective coordination with DevOps teams. Manage the deployment of related APIs in Kubernetes or other relevant spaces. Provide technical guidance to the UI development and data analyst teams on Gen AI best practices. Coordinate with business teams to ensure the outputs are aligned with expectations. Continuously integrate new developments in the Gen AI space into our solutions and provide product and non-product implementation ideas to fully leverage the potential of Gen AI. The person should have: Proven experience as a data analyst, with a strong track record of delivering impactful insights and recommendations. Strong working knowledge of OpenAI, Gemini or other Gen AI platforms, and prior experience in creating and optimizing Gen AI models. Familiarity with API and application deployment, data pipelines and workflow automation. A high-agency mindset with strong critical thinking skills. Strong business acumen to proactively identify what is right for the business.
Excellent communication and collaboration skills. Technical Skills: Python, SQL, AWS services (Lambda, EKS), Apache Airflow, CI/CD (Serverless Framework), Git, Jira / Trello. It will be great to have: Good understanding of the marketing/advertising product industry. At least one Gen AI project in production. Strong programming skills in Python or similar languages. Prior experience working as a DevOps engineer, or having worked closely with DevOps. Strong background in data management.
Posted 1 month ago
5.0 - 10.0 years
15 - 20 Lacs
Chennai, Bengaluru
Work from Office
Job Description: Job Title: Data Engineer Experience: 5-8 Years Location: Chennai, Bangalore Employment Type: Full Time Job Type: Work from Office (Monday - Friday) Shift Timing: 12:30 PM to 9:30 PM Required Skills: 5-8 years' experience as a back-end data engineer. Strong experience in SQL. Strong knowledge of and experience in Python and PySpark. Experience in AWS. Experience in Docker and OpenShift. Hands-on experience with REST concepts. Design and develop business solutions on the data front. Experience in implementing new enhancements and handling defect triage. The candidate must have strong analytical abilities. Additionally Preferred Skills/Competencies: Jira, Bitbucket. Experience with Kafka. Experience with Snowflake. Domain knowledge in Banking. Analytical skills. Excellent communication skills. Working knowledge of Agile. Thanks & Regards, Suresh Kumar Raja, CGI.
Posted 1 month ago
4.0 - 7.0 years
8 - 12 Lacs
Noida
Work from Office
Hands-on individual responsible for producing excellent quality of code, adhering to expected coding standards and industry best practices. Must have strong experience in Java 8, multithreading, Spring Boot, and Oracle/PostgreSQL. Must have good knowledge of Hibernate, caching frameworks, and memory management. AWS - deployment (Docker and Kubernetes) plus common services (mainly S3, Lambda, CloudFront, API Gateway, CloudFormation and ALBs). Kafka, building event-driven microservices and streaming applications. Good to have: MongoDB and Elasticsearch knowledge. Excellent problem-solving and troubleshooting skills. High levels of ownership and commitment on deliverables. Strong communication skills - should be able to interact with clients and stakeholders comfortably to probe a technical problem, provide a clear progress update, or clarify requirement specifications. Mandatory Competencies: Programming Language - Java - Core Java (Java 8+); Fundamental Technical Skills - Spring Framework/Hibernate/JUnit etc.; Beh - Communication and collaboration; DevOps/Configuration Mgmt - Containerization (Docker, Kubernetes); Middleware - Message Oriented Middleware - Messaging (JMS, ActiveMQ, RabbitMQ, Kafka, SQS, ASB etc.); Cloud - AWS - AWS S3, S3 Glacier, AWS EBS
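Event-driven microservices of the kind this posting describes usually consume from brokers (such as Kafka) that guarantee at-least-once delivery, so handlers must tolerate redelivery. As a language-neutral sketch of the common idempotent-consumer pattern (shown in Python rather than the posting's Java stack, with invented event and field names):

```python
from dataclasses import dataclass, field

@dataclass
class OrderEvent:
    """Illustrative event payload; field names are hypothetical."""
    event_id: str
    order_id: str
    amount: float

@dataclass
class IdempotentConsumer:
    """Apply each event at most once, even if the broker redelivers it."""
    processed_ids: set = field(default_factory=set)
    totals: dict = field(default_factory=dict)

    def handle(self, event: OrderEvent) -> bool:
        # Deduplicate by event id: at-least-once delivery means the same
        # event may arrive more than once, and the side effect below
        # (accumulating an order total) must not be applied twice.
        if event.event_id in self.processed_ids:
            return False
        self.totals[event.order_id] = (
            self.totals.get(event.order_id, 0.0) + event.amount)
        self.processed_ids.add(event.event_id)
        return True
```

In production the processed-id set would live in a durable store (a database table or cache) rather than in memory, so deduplication survives restarts.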
Posted 1 month ago
5.0 - 8.0 years
8 - 12 Lacs
Noida
Work from Office
1. Design and manage cloud-based systems on AWS. 2. Develop and maintain backend services and APIs using Java. 3. Basic knowledge of SQL and the ability to write SQL queries. 4. Good hands-on experience with Dockerfiles and multi-stage Docker builds. 5. Implement containerization using Docker and orchestration with ECS/Kubernetes. 6. Monitor and troubleshoot cloud infrastructure and application performance. 7. Collaborate with cross-functional teams to integrate systems seamlessly. 8. Document system architecture, configurations, and operational procedures. Need strong hands-on knowledge of: ECS, ECR, NLB, ALB, ACM, IAM, S3, Lambda, RDS, KMS, API Gateway, Cognito, CloudFormation. Good to have: Experience with AWS CDK for infrastructure as code. AWS certifications (e.g., AWS Certified Solutions Architect, AWS Certified Developer). Python. Mandatory Competencies: Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Beh - Communication; Cloud - AWS - ECS; Database - SQL Server - SQL Packages
Posted 1 month ago
5.0 - 10.0 years
7 - 11 Lacs
Noida
Work from Office
Expert in Python (5+ years). Expert in Django or similar frameworks (5+ years). Experience (2+ years) in TypeScript, JavaScript and JS frameworks (Angular > 2 with Angular Material). Good knowledge of RDBMS (preferably Postgres). Experience and sound knowledge of AWS services (ECS, Lambda, deployment pipelines etc.). Excellent written and verbal communication skills. Very good analytical and problem-solving skills. Ability to pick up new technologies. Write clean, maintainable and efficient code. Willingness to learn and understand the business domain. Mandatory Competencies: Programming Language - Python - Django; Beh - Communication; User Interface - TypeScript; User Interface - JavaScript; Cloud - AWS - ECS; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate
Posted 1 month ago
6.0 - 10.0 years
10 - 15 Lacs
Noida
Work from Office
Must-Have Skills: Expertise in AWS CDK, AWS services (Lambda, ECS, S3), Python, and PostgreSQL database management. Strong understanding of serverless architecture and event-driven design (SNS, SQS). Nice to have: Knowledge of multi-account AWS setups and security best practices (IAM, VPC, etc.). Experience with cost-optimization strategies in AWS. Mandatory Competencies: Beh - Communication and collaboration; Cloud - AWS - ECS; Cloud - AWS - AWS Lambda, AWS EventBridge, AWS Fargate; Data Science and Machine Learning - Python
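One concrete detail behind the SNS/SQS event-driven design mentioned above: SQS's `SendMessageBatch` accepts at most 10 entries per call, so producers must split larger message lists before sending. A small sketch of that batching step (the actual boto3 call is left as a comment, since it needs real AWS credentials and a hypothetical queue URL):

```python
import json

def to_sqs_batches(messages, batch_size=10):
    """Group messages into SendMessageBatch-sized entry lists.

    SQS allows at most 10 entries per SendMessageBatch call, so a larger
    message list must be split into several calls.
    """
    batches = []
    for start in range(0, len(messages), batch_size):
        chunk = messages[start:start + batch_size]
        # Each entry needs an Id unique within its batch; the running
        # index also keeps Ids unique across batches here.
        entries = [{"Id": str(start + i), "MessageBody": json.dumps(msg)}
                   for i, msg in enumerate(chunk)]
        batches.append(entries)
    return batches

# A real sender would then loop over the batches, e.g.:
# sqs = boto3.client("sqs")
# for entries in to_sqs_batches(messages):
#     sqs.send_message_batch(QueueUrl=queue_url, Entries=entries)
```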
Posted 1 month ago
3.0 - 8.0 years
6 - 10 Lacs
Gurugram
Work from Office
Understands the process flow and the impact on the project module outcome. Works on coding assignments for specific technologies based on the project requirements and available documentation. Debugs basic software components and identifies code defects. Focuses on building depth in project-specific technologies. Expected to develop domain knowledge along with technical skills. Effectively communicates with team members, project managers and clients, as required. A proven high performer and team player, with the ability to take the lead on projects. Design and create S3 buckets and folder structures (raw, cleansed_data, output, script, temp-dir, spark-ui). Develop AWS Lambda functions (Python/Boto3) to download Bhav Copy via REST API and ingest it into S3. Author and maintain AWS Glue Spark jobs to: partition data by scrip, year and month; convert CSV to Parquet with Snappy compression. Configure and run AWS Glue Crawlers to populate the Glue Data Catalog. Write and optimize AWS Athena SQL queries to generate business-ready datasets. Monitor, troubleshoot and tune data workflows for cost and performance. Document architecture, code and operational runbooks. Collaborate with analytics and downstream teams to understand requirements and deliver SLAs. Technical Skills: 3+ years of hands-on experience with AWS data services (S3, Lambda, Glue, Athena). PostgreSQL basics. Proficient in SQL and data-partitioning strategies. Experience with Parquet file formats and compression techniques (Snappy). Ability to configure Glue Crawlers and manage the AWS Glue Data Catalog. Understanding of serverless architecture and best practices in security, encryption and cost control. Good documentation, communication and problem-solving skills. Qualifications: 3-5 years of work experience in a relevant field. B.Tech/B.E/M.Tech or MCA degree from a reputed university. A computer science background is preferred.
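The partitioning scheme this posting describes (by scrip, year and month) is typically laid out as Hive-style key prefixes in S3, which is what lets Glue Crawlers register the partitions for Athena. A minimal sketch of that key construction; the `cleansed_data` prefix follows the posting's folder structure, while the file name is illustrative:

```python
def partitioned_s3_key(scrip: str, year: int, month: int,
                       filename: str = "data.snappy.parquet") -> str:
    """Build a Hive-style partition prefix under the cleansed_data folder.

    Keys of the form scrip=.../year=.../month=... let a Glue Crawler
    register scrip, year and month as partition columns, so Athena can
    prune partitions in WHERE clauses instead of scanning everything.
    """
    return (f"cleansed_data/scrip={scrip}/year={year}/"
            f"month={month:02d}/{filename}")
```

Inside the Glue Spark job itself the equivalent is `df.write.partitionBy("scrip", "year", "month").parquet(path)`, and Spark writes Parquet with Snappy compression by default, which matches the CSV-to-Parquet conversion described above.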
Posted 1 month ago
2.0 - 3.0 years
5 - 9 Lacs
Kochi, Coimbatore, Thiruvananthapuram
Work from Office
Location: Kochi, Coimbatore, Trivandrum. Must-have skills: Python/Scala, PySpark/PyTorch. Good-to-have skills: Redshift. Job Summary: You'll capture user requirements and translate them into business and digitally enabled solutions across a range of industries. Roles and Responsibilities: Designing, developing, optimizing, and maintaining data pipelines that adhere to ETL principles and business goals. Solving complex data problems to deliver insights that help the business achieve its goals. Sourcing data (structured and unstructured) from various touchpoints, and formatting and organizing it into an analyzable format. Creating data products for analytics team members to improve productivity. Calling AI services like vision, translation, etc. to generate an outcome that can be used in further steps along the pipeline. Fostering a culture of sharing, re-use, design and operational efficiency of data and analytical solutions. Preparing data to create a unified database and building tracking solutions ensuring data quality. Creating production-grade analytical assets deployed using the guiding principles of CI/CD. Professional and Technical Skills: Expert in at least two of Python, Scala, PySpark, PyTorch, and JavaScript. Extensive experience in data analysis (big data - Apache Spark environments), data libraries (e.g. Pandas, SciPy, TensorFlow, Keras etc.), and SQL, with 2-3 years of hands-on experience working with these technologies. Experience in one of the many BI tools such as Tableau, Power BI, or Looker. Good working knowledge of key concepts in data analytics, such as dimensional modeling, ETL, reporting/dashboarding, data governance, dealing with structured and unstructured data, and corresponding infrastructure needs. Worked extensively in Microsoft Azure (ADF, Function Apps, ADLS, Azure SQL), AWS (Lambda, Glue, S3), Databricks analytical platforms/tools, and Snowflake Cloud Data Warehouse.
Additional Information: Experience working in cloud data warehouses like Redshift or Synapse. Certification in any one of the following or equivalent: AWS - AWS Certified Data Analytics - Specialty; Azure - Microsoft Certified Azure Data Scientist Associate; Snowflake - SnowPro Core; Databricks Data Engineering. Qualification Experience: 3.5-5 years of experience is required.
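As a toy illustration of the "format and organize into an analyzable format" step the data-engineer posting above describes, here is a small pure-Python normalization pass. In the stacks listed there this would normally be done with PySpark or Pandas, and the field names (`id`, `source`, `amount`) are invented for the example:

```python
def normalize_records(raw_records):
    """Clean heterogeneous source rows into a uniform, analyzable shape.

    Drops rows without an id, trims and lowercases string fields, and
    coerces the amount field to float (0.0 when missing or unparseable),
    so downstream aggregation can rely on consistent types.
    """
    cleaned = []
    for row in raw_records:
        if not row.get("id"):
            continue  # unusable without a key
        try:
            amount = float(row.get("amount", 0.0))
        except (TypeError, ValueError):
            amount = 0.0
        cleaned.append({
            "id": str(row["id"]).strip(),
            "source": str(row.get("source", "unknown")).strip().lower(),
            "amount": amount,
        })
    return cleaned
```

The same shape translates almost line-for-line into a PySpark job (filters plus column casts), which is the production-grade version of this sketch.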
Posted 1 month ago